Apr 22 13:20:54.090121 ip-10-0-136-73 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.599720 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605629 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605645 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605649 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605652 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605655 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605658 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605662 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605665 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 13:20:54.614191 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605668 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605671 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605674 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605677 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605680 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605682 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605686 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605689 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605692 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605694 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605697 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605700 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605702 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605705 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605708 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605710 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605719 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605723 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605726 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605729 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 13:20:54.615477 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605733 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605735 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605738 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605742 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605745 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605748 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605750 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605755 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605758 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605760 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605765 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605769 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605773 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605776 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605779 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605782 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605785 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605787 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605790 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605792 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 13:20:54.617229 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605796 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605799 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605802 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605804 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605807 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605809 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605825 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605828 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605831 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605833 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605836 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605838 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605842 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605845 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605848 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605850 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605853 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605856 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605858 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 13:20:54.618139 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605861 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605864 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605866 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605870 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605874 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605878 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605880 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605883 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605886 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605888 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605891 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605894 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605896 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605899 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605902 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605905 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605908 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605910 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.605913 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 13:20:54.618764 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606325 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606331 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606334 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606338 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606340 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606343 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606346 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606349 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606351 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606354 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606357 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606360 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606362 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606365 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606368 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606370 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606373 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606376 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606378 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606381 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 13:20:54.619571 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606383 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606386 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606389 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606391 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606394 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606396 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606399 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606404 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606408 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606411 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606413 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606416 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606418 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606421 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606424 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606426 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606429 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606431 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606434 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606436 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 13:20:54.620172 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606439 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606442 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606444 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606447 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606450 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606452 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606455 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606458 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606460 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606464 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606466 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606468 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606471 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606475 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606478 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606481 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606485 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606488 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606491 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 13:20:54.620771 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606494 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606498 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606501 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606504 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606506 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606509 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606512 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606514 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606517 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606520 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606523 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606525 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606529 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606531 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606534 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606536 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606542 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606544 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606547 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606550 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 13:20:54.621414 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606553 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606555 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606558 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606560 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606563 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606565 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.606568 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606649 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606667 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606674 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606678 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606683 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606687 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606691 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606696 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606699 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606702 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606706 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606709 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606712 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606716 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606718 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606721 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 22 13:20:54.621981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606724 2575 flags.go:64] FLAG: --cloud-config=""
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606727 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.606730 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607876 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607880 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607885 2575 flags.go:64] FLAG: --config-dir=""
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607888 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607892 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607897 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607900 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607903 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607907 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607910 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607913 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607916 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607920 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607923 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607928 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607931 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607934 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607937 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607941 2575 flags.go:64] FLAG: --enable-server="true"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607944 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607948 2575 flags.go:64] FLAG: --event-burst="100"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607952 2575 flags.go:64] FLAG: --event-qps="50"
Apr 22 13:20:54.807862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607955 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607959 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607962 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607966 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607969 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607972 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607975 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607978 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607981 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607984 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607987 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607990 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607994 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.607997 2575 flags.go:64] FLAG: --feature-gates=""
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608001 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608004 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608008 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 13:20:54.822970
ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608011 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608015 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608018 2575 flags.go:64] FLAG: --help="false" Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608021 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-136-73.ec2.internal" Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608024 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608027 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 13:20:54.822970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608030 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 13:20:54.710032 ip-10-0-136-73 systemd[1]: Started Kubernetes Kubelet. 
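The flag dump above logs every effective kubelet command-line value in a fixed shape, `flags.go:64] FLAG: --name="value"`, one entry per flag. When auditing a capture like this one, it can help to collapse those entries into a lookup table. The sketch below assumes only that format (the regex and helper name are illustrative, not part of any kubelet tooling):

```python
import re

# Each startup entry logs one flag as: flags.go:64] FLAG: --name="value".
# Collect them into a dict for inspection; values stay strings, exactly as logged.
FLAG_RE = re.compile(r'FLAG: --([\w-]+)="([^"]*)"')

def parse_flags(log_text: str) -> dict:
    """Return {flag-name: value} for every FLAG entry in a journal capture."""
    return {name: value for name, value in FLAG_RE.findall(log_text)}

# Two entries copied from the dump above, concatenated as journald renders them.
sample = (
    'I0422 13:20:54.608021 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-136-73.ec2.internal" '
    'I0422 13:20:54.608127 2575 flags.go:64] FLAG: --max-pods="110"'
)
flags = parse_flags(sample)
print(flags["max-pods"])  # "110"
```

Feeding the whole journal text through `parse_flags` yields one entry per distinct flag, which makes it easy to diff the running configuration against the deprecation warnings logged earlier in the boot.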
Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608033 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608037 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608043 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608046 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608049 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608053 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608056 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608059 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608063 2575 flags.go:64] FLAG: --kube-reserved="" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608066 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608069 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608072 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608075 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608078 2575 flags.go:64] FLAG: --lock-file="" Apr 22 13:20:54.825363 
ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608081 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608084 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608087 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608092 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608095 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608098 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608103 2575 flags.go:64] FLAG: --logging-format="text" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608106 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608109 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 13:20:54.825363 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608112 2575 flags.go:64] FLAG: --manifest-url="" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608115 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608120 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608123 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608127 2575 flags.go:64] FLAG: --max-pods="110" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608130 2575 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608133 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608136 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608139 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608142 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608146 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608150 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608158 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608161 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608164 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608168 2575 flags.go:64] FLAG: --pod-cidr="" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608171 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608176 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608179 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: 
I0422 13:20:54.608183 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608186 2575 flags.go:64] FLAG: --port="10250" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608189 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608192 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-092f2883ae5816215" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608195 2575 flags.go:64] FLAG: --qos-reserved="" Apr 22 13:20:54.826130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608198 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608201 2575 flags.go:64] FLAG: --register-node="true" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608204 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608207 2575 flags.go:64] FLAG: --register-with-taints="" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608211 2575 flags.go:64] FLAG: --registry-burst="10" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608215 2575 flags.go:64] FLAG: --registry-qps="5" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608218 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608221 2575 flags.go:64] FLAG: --reserved-memory="" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608225 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608228 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608231 2575 
flags.go:64] FLAG: --rotate-certificates="false" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608234 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608237 2575 flags.go:64] FLAG: --runonce="false" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608240 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608243 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608246 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608249 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608252 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608256 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608259 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608264 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608267 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608270 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608273 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608276 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 
13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608279 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 13:20:55.035210 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608282 2575 flags.go:64] FLAG: --system-cgroups="" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608285 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608291 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608293 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608296 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608301 2575 flags.go:64] FLAG: --tls-min-version="" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608304 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608307 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608310 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608313 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608315 2575 flags.go:64] FLAG: --v="2" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608321 2575 flags.go:64] FLAG: --version="false" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608325 2575 flags.go:64] FLAG: --vmodule="" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608329 2575 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608332 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608441 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608445 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608449 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608452 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608455 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608457 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608460 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608463 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 13:20:55.036030 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608466 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608469 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608471 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608476 2575 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608478 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608481 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608484 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608486 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608489 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608492 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608494 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608497 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608500 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608503 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608505 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608508 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 13:20:55.036686 ip-10-0-136-73 
kubenswrapper[2575]: W0422 13:20:54.608511 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608514 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608516 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608519 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 13:20:55.036686 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608522 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608526 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608528 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608531 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608534 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608536 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608539 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608541 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608544 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 
22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608547 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608549 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608552 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608554 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608557 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608559 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608563 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608566 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608568 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608571 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608573 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 13:20:55.037529 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608576 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608578 2575 feature_gate.go:328] unrecognized 
feature gate: ImageModeStatusReporting Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608581 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608584 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608587 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608589 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608592 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608595 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608597 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608600 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608603 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608605 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608609 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608611 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608614 
2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608617 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608619 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608622 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608624 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608627 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 13:20:55.038154 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608630 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608633 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608638 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608640 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608643 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608645 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608648 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608652 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608654 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608658 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608661 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608665 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608668 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608671 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608674 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608676 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608679 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 13:20:55.038876 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.608683 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.608687 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.618921 2575 server.go:530] "Kubelet 
version" kubeletVersion="v1.33.9" Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.618940 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.618991 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619002 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619005 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619009 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619013 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619016 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619019 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619022 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619025 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619028 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619030 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619034 2575 feature_gate.go:349] Setting deprecated feature gate 
KMSv1=true. It will be removed in a future release. Apr 22 13:20:55.039454 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619039 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619042 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619045 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619047 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619050 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619053 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619056 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619059 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619061 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619065 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619067 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619070 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 
13:20:54.619073 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619076 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619079 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619081 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619084 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619087 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619089 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619092 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 13:20:55.040007 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619095 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619098 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619100 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619103 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619106 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 
13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619109 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619111 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619114 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619116 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619119 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619122 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619125 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619127 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619130 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619133 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619135 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619138 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619141 2575 feature_gate.go:328] unrecognized 
feature gate: MultiArchInstallAzure Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619143 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619146 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 13:20:55.040745 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619149 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619153 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619156 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619159 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619162 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619165 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619169 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619172 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619175 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619178 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 13:20:55.041350 
ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619180 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619183 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619186 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619188 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619191 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619193 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619195 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619198 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619201 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 13:20:55.041350 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619204 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619206 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619209 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619211 2575 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619214 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619218 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619220 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619223 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619226 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619229 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619231 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619234 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619236 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619239 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619242 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 13:20:55.041920 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.619247 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false 
EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619344 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619348 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619352 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619354 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619357 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619360 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619363 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619365 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619368 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619371 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 13:20:55.042374 
ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619374 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619376 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619379 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619382 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619384 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619387 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619390 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619392 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619395 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 13:20:55.042374 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619398 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619400 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619403 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619406 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController 
Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619408 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619414 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619417 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619419 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619422 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619424 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619427 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619429 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619432 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619435 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619437 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619440 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619442 2575 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619445 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619447 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619450 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 13:20:55.042908 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619452 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619455 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619457 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619460 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619462 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619465 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619468 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619470 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619473 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 
13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619476 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619478 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619481 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619483 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619486 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619488 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619491 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619494 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619499 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619502 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 13:20:55.043401 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619504 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619507 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619509 2575 feature_gate.go:328] unrecognized feature 
gate: AdminNetworkPolicy Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619512 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619515 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619518 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619520 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619524 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619527 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619530 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619533 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619536 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619540 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619542 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619545 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 13:20:55.043921 ip-10-0-136-73 
kubenswrapper[2575]: W0422 13:20:54.619548 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619551 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619554 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619557 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 13:20:55.043921 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619559 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619562 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619565 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619569 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619571 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619575 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619577 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619581 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:54.619583 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.619589 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.619728 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.624043 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: I0422 
13:20:54.625016 2575 server.go:1019] "Starting client certificate rotation" Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.625113 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 13:20:55.044373 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.625149 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 13:20:55.044718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.649179 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 13:20:55.044718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.652098 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 13:20:55.044718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.665916 2575 log.go:25] "Validated CRI v1 runtime API" Apr 22 13:20:55.044718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.674104 2575 log.go:25] "Validated CRI v1 image API" Apr 22 13:20:55.044718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.675891 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 13:20:55.044718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.680788 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7e78efe9-7583-4a40-8354-c1c242960454:/dev/nvme0n1p3 d032fa2a-e717-4afd-baec-48cfc072ceb7:/dev/nvme0n1p4] Apr 22 13:20:55.044718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.680805 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp 
major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 13:20:55.044718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.682662 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.688608 2575 manager.go:217] Machine: {Timestamp:2026-04-22 13:20:54.686619179 +0000 UTC m=+0.465305968 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3142159 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c428ceef1d6363743e5035d3c3422 SystemUUID:ec2c428c-eef1-d636-3743-e5035d3c3422 BootID:1107a9ed-538f-4804-906e-790001fe23bf Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e5:91:65:7c:95 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e5:91:65:7c:95 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:f4:45:62:1a:20 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.688715 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.688808 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.689906 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.689930 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-73.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.690079 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.690087 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.690101 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.690839 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.692143 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.692295 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.694881 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.694895 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.694907 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.694917 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.694927 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.696091 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 13:20:55.044960 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.696105 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.701534 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.703333 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704721 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704735 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704741 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704747 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704753 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704760 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704765 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704771 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704779 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704784 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704798 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.704806 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.705592 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.705600 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.708793 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-73.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.708810 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-73.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.708944 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.709163 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.709197 2575 server.go:1295] "Started kubelet"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.709272 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.709291 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.709348 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.711066 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.711362 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.717233 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.717677 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.718399 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.718421 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.718662 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.719803 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.719882 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.719888 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.720284 2575 factory.go:55] Registering systemd factory
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.720300 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.720633 2575 factory.go:153] Registering CRI-O factory
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.720642 2575 factory.go:223] Registration of the crio container factory successfully
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.720683 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.720709 2575 factory.go:103] Registering Raw factory
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.720724 2575 manager.go:1196] Started watching for new ooms in manager
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.721442 2575 manager.go:319] Starting recovery of all containers
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.725541 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-73.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.725662 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 13:20:55.045446 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.725710 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.725357 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-73.ec2.internal.18a8b0718b19b7e4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-73.ec2.internal,UID:ip-10-0-136-73.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-73.ec2.internal,},FirstTimestamp:2026-04-22 13:20:54.709172196 +0000 UTC m=+0.487858977,LastTimestamp:2026-04-22 13:20:54.709172196 +0000 UTC m=+0.487858977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-73.ec2.internal,}"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.731055 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kpp9t"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.734703 2575 manager.go:324] Recovery completed
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.738025 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kpp9t"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.738699 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.740948 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.741076 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.741088 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.741543 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.741556 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.741571 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.743255 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-73.ec2.internal.18a8b0718d004c69 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-73.ec2.internal,UID:ip-10-0-136-73.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-73.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-73.ec2.internal,},FirstTimestamp:2026-04-22 13:20:54.741060713 +0000 UTC m=+0.519747482,LastTimestamp:2026-04-22 13:20:54.741060713 +0000 UTC m=+0.519747482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-73.ec2.internal,}"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.743807 2575 policy_none.go:49] "None policy: Start"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.743834 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.743844 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.783104 2575 manager.go:341] "Starting Device Plugin manager"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.783142 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.783153 2575 server.go:85] "Starting device plugin registration server"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.783391 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.783404 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.783495 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.783586 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.783595 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.784238 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.784275 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.848242 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.849465 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.849495 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.849518 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.849527 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.849566 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.853515 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.883610 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.884469 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.884495 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:20:55.046612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.884506 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.884530 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.930946 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-73.ec2.internal\" not found" node="ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.950044 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal"]
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.950110 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.951452 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.951472 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.951481 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.952905 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.953094 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.953119 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.953549 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.953574 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.953593 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.953602 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.953578 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.953657 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.954658 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.954679 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.955284 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.955308 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.955320 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeHasSufficientPID"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:54.957359 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.957375 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-73.ec2.internal\": node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.970581 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.981506 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-73.ec2.internal\" not found" node="ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:54.990140 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-73.ec2.internal\" not found" node="ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.021087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a1a5a7d927cecab18576aa6fbed00027-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal\" (UID: \"a1a5a7d927cecab18576aa6fbed00027\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.021112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a5a7d927cecab18576aa6fbed00027-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal\" (UID: \"a1a5a7d927cecab18576aa6fbed00027\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.047597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.021128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e1839749775b093843d832935f4dc52-config\") pod \"kube-apiserver-proxy-ip-10-0-136-73.ec2.internal\" (UID: \"5e1839749775b093843d832935f4dc52\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.071529 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.071494 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.121938 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.121909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a5a7d927cecab18576aa6fbed00027-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal\" (UID: \"a1a5a7d927cecab18576aa6fbed00027\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.122049 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.121954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e1839749775b093843d832935f4dc52-config\") pod \"kube-apiserver-proxy-ip-10-0-136-73.ec2.internal\" (UID: \"5e1839749775b093843d832935f4dc52\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.122049 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.121980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a1a5a7d927cecab18576aa6fbed00027-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal\" (UID: \"a1a5a7d927cecab18576aa6fbed00027\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.122049 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.122002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a5a7d927cecab18576aa6fbed00027-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal\" (UID: \"a1a5a7d927cecab18576aa6fbed00027\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.122049 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.122014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e1839749775b093843d832935f4dc52-config\") pod \"kube-apiserver-proxy-ip-10-0-136-73.ec2.internal\" (UID: \"5e1839749775b093843d832935f4dc52\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.122049 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.122032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a1a5a7d927cecab18576aa6fbed00027-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal\" (UID: \"a1a5a7d927cecab18576aa6fbed00027\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.172198 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.172171 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.272885 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.272855 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.282974 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.282951 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.292448 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.292427 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.373462 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.373421 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.473979 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.473947 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.574492 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.574414 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-73.ec2.internal\" not found"
Apr 22 13:20:55.593886 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.593866 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 13:20:55.602888 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.602869 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 13:20:55.619735 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.619709 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.624965 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.624952 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 13:20:55.625061 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.625046 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 13:20:55.625116 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.625094 2575 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a4fdceb89f7ba46fb885d43a2000a723-e8fad4f9fd20d4a5.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.136.73:45972->44.216.167.228:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.625116 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.625101 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 13:20:55.625187 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.625102 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 13:20:55.625187 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.625111 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal"
Apr 22 13:20:55.646687 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.646667 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 13:20:55.695216 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.695188 2575 apiserver.go:52] "Watching apiserver"
Apr 22 13:20:55.703827 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.703789 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 13:20:55.704145 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.704123 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kf5cf","kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b","openshift-cluster-node-tuning-operator/tuned-l8mkn","openshift-image-registry/node-ca-2x5k7","openshift-multus/multus-additional-cni-plugins-82s2m","openshift-multus/multus-g5ctl","kube-system/konnectivity-agent-sclfb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal","openshift-multus/network-metrics-daemon-jgxt9","openshift-network-diagnostics/network-check-target-ggdrt","openshift-network-operator/iptables-alerter-lld46"]
Apr 22 13:20:55.705515 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.705495 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lld46"
Apr 22 13:20:55.706684 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.706665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.707551 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.707532 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 13:20:55.707631 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.707537 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 13:20:55.707631 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.707600 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xpx5q\""
Apr 22 13:20:55.707891 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.707876 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.708180 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.708156 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 13:20:55.708765 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.708746 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 13:20:55.708878 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.708858 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 13:20:55.709010 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.708960 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.709126 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.709065 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 13:20:55.709255 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.709225 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 13:20:55.709320 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.709264 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 13:20:55.709370 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.709343 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 13:20:55.709370 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.709343 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-m7dq6\"" Apr 22 13:20:55.709802 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.709774 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 13:20:55.709915 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.709898 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 13:20:55.709998 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.709978 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jgsg7\"" Apr 22 13:20:55.710048 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.710013 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 13:20:55.710091 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.710053 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:55.710914 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.710898 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tk9vn\"" Apr 22 13:20:55.711123 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.711109 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 13:20:55.711248 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.711228 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.711651 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.711619 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 13:20:55.712013 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.711997 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 13:20:55.712094 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.712056 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 13:20:55.712157 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.712145 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 13:20:55.712234 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.712220 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-64hkt\"" Apr 22 13:20:55.712478 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.712463 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.713034 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.713019 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 13:20:55.713106 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.713045 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 13:20:55.713277 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.713264 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 13:20:55.713277 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.713271 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 13:20:55.713446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.713294 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dqq7h\"" Apr 22 13:20:55.713446 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.713385 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 13:20:55.713541 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.713477 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:20:55.714275 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.714259 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 13:20:55.714684 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.714669 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:55.714857 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.714733 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:20:55.715018 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.715003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ks7c6\"" Apr 22 13:20:55.715526 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.715501 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 13:20:55.715619 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.715507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-r7jpz\"" Apr 22 13:20:55.715619 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.715566 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 13:20:55.715789 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.715773 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:20:55.715872 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.715853 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:20:55.717996 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.717983 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 13:20:55.721574 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.721559 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 13:20:55.725350 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-system-cni-dir\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.725443 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-cni-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.725443 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/544999a6-323d-481d-b6ad-d24f5da3e82f-cni-binary-copy\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.725544 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725454 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx4zw\" (UniqueName: 
\"kubernetes.io/projected/544999a6-323d-481d-b6ad-d24f5da3e82f-kube-api-access-sx4zw\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.725544 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-device-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.725544 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-kubelet\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.725684 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-cni-bin\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.725684 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.725684 ip-10-0-136-73 
kubenswrapper[2575]: I0422 13:20:55.725661 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-system-cni-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.725853 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-cnibin\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.725853 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e919041-6f50-4989-b55f-057c690de2ab-iptables-alerter-script\") pod \"iptables-alerter-lld46\" (UID: \"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46" Apr 22 13:20:55.725853 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.725853 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a28f28cc-e746-49c4-bf70-a476e379f760-ovn-node-metrics-cert\") pod \"ovnkube-node-kf5cf\" (UID: 
\"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.725853 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.726087 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysctl-conf\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.726087 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-ovn\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.726087 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-log-socket\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.726087 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.725986 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-registration-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.726087 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-systemd\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.726087 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-systemd\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.726087 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-etc-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-os-release\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 
13:20:55.726150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhlrh\" (UniqueName: \"kubernetes.io/projected/9e919041-6f50-4989-b55f-057c690de2ab-kube-api-access-zhlrh\") pod \"iptables-alerter-lld46\" (UID: \"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-socket-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726250 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-k8s-cni-cncf-io\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-cni-multus\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-etc-kubernetes\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:20:55.726392 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l849x\" (UniqueName: \"kubernetes.io/projected/15e591d3-68e8-48e6-854d-1b459e3bf1c1-kube-api-access-l849x\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726412 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-cni-netd\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 
13:20:55.726441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-etc-selinux\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysconfig\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmsm\" (UniqueName: \"kubernetes.io/projected/4f52168a-3467-4e13-b154-1feaf9796063-kube-api-access-dnmsm\") pod \"node-ca-2x5k7\" (UID: \"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f-konnectivity-ca\") pod \"konnectivity-agent-sclfb\" (UID: \"02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f\") " pod="kube-system/konnectivity-agent-sclfb"
Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x79xf\" (UniqueName: \"kubernetes.io/projected/8b92597f-aa17-456d-bc65-ee5880d70a69-kube-api-access-x79xf\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9"
Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f52168a-3467-4e13-b154-1feaf9796063-serviceca\") pod \"node-ca-2x5k7\" (UID: \"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7"
Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-run-netns\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-os-release\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.726794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-var-lib-kubelet\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-tmp\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-var-lib-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-socket-dir-parent\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-kubelet\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.726960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-daemon-config\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e919041-6f50-4989-b55f-057c690de2ab-host-slash\") pod \"iptables-alerter-lld46\" (UID: \"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-modprobe-d\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysctl-d\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.727345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-node-log\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.727690 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s92p\" (UniqueName: \"kubernetes.io/projected/d5dedb94-cce8-40ef-8b20-152362aec6dc-kube-api-access-2s92p\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.727690 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-sys\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.727796 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-run\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.727849 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-run-ovn-kubernetes\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.727899 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-ovnkube-config\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.727943 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-ovnkube-script-lib\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.727993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.727965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44d8x\" (UniqueName: \"kubernetes.io/projected/a28f28cc-e746-49c4-bf70-a476e379f760-kube-api-access-44d8x\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.728302 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-netns\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.728348 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-host\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.728348 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728344 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f52168a-3467-4e13-b154-1feaf9796063-host\") pod \"node-ca-2x5k7\" (UID: \"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7"
Apr 22 13:20:55.728420 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.728420 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-env-overrides\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.728483 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-cnibin\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.728483 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-slash\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.728483 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-hostroot\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.728599 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-conf-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.728599 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-lib-modules\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.728599 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-tuned\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.728705 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-kubernetes\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.728705 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-cni-bin\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.728705 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f-agent-certs\") pod \"konnectivity-agent-sclfb\" (UID: \"02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f\") " pod="kube-system/konnectivity-agent-sclfb"
Apr 22 13:20:55.728705 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9"
Apr 22 13:20:55.728809 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4l2\" (UniqueName: \"kubernetes.io/projected/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-kube-api-access-hn4l2\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.728809 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-sys-fs\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.728809 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-systemd-units\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.728809 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.728789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-multus-certs\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.730198 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.730181 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 13:20:55.739423 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.739401 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 13:15:54 +0000 UTC" deadline="2027-09-24 13:44:58.031068936 +0000 UTC"
Apr 22 13:20:55.739423 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.739422 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12480h24m2.291649148s"
Apr 22 13:20:55.757804 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.757682 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-l5qjq"
Apr 22 13:20:55.765243 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.765223 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-l5qjq"
Apr 22 13:20:55.825017 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.824864 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 13:20:55.829891 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.829867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/544999a6-323d-481d-b6ad-d24f5da3e82f-cni-binary-copy\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.830007 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.829906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sx4zw\" (UniqueName: \"kubernetes.io/projected/544999a6-323d-481d-b6ad-d24f5da3e82f-kube-api-access-sx4zw\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.830007 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.829926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-device-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.830007 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.829941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-kubelet\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830007 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.829956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-cni-bin\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830007 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.829972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830007 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.829993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-system-cni-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-cnibin\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e919041-6f50-4989-b55f-057c690de2ab-iptables-alerter-script\") pod \"iptables-alerter-lld46\" (UID: \"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-cni-bin\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-kubelet\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a28f28cc-e746-49c4-bf70-a476e379f760-ovn-node-metrics-cert\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-device-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysctl-conf\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-system-cni-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-ovn\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-log-socket\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-registration-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-cnibin\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.830291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-systemd\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-systemd\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-log-socket\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-ovn\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysctl-conf\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-systemd\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/544999a6-323d-481d-b6ad-d24f5da3e82f-cni-binary-copy\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-etc-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-registration-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-os-release\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-etc-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-systemd\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830583 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e919041-6f50-4989-b55f-057c690de2ab-iptables-alerter-script\") pod \"iptables-alerter-lld46\" (UID: \"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-os-release\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhlrh\" (UniqueName: \"kubernetes.io/projected/9e919041-6f50-4989-b55f-057c690de2ab-kube-api-access-zhlrh\") pod \"iptables-alerter-lld46\" (UID: \"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46"
Apr 22 13:20:55.831051 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-cni-binary-copy\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-socket-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-k8s-cni-cncf-io\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-cni-multus\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-etc-kubernetes\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-socket-dir\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l849x\" (UniqueName: \"kubernetes.io/projected/15e591d3-68e8-48e6-854d-1b459e3bf1c1-kube-api-access-l849x\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-cni-multus\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-k8s-cni-cncf-io\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-cni-netd\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-etc-kubernetes\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-etc-selinux\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysconfig\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.830979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-cni-netd\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422
13:20:55.830997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmsm\" (UniqueName: \"kubernetes.io/projected/4f52168a-3467-4e13-b154-1feaf9796063-kube-api-access-dnmsm\") pod \"node-ca-2x5k7\" (UID: \"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:55.831736 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysconfig\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f-konnectivity-ca\") pod \"konnectivity-agent-sclfb\" (UID: \"02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f\") " pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x79xf\" (UniqueName: \"kubernetes.io/projected/8b92597f-aa17-456d-bc65-ee5880d70a69-kube-api-access-x79xf\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-etc-selinux\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f52168a-3467-4e13-b154-1feaf9796063-serviceca\") pod \"node-ca-2x5k7\" (UID: \"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-run-netns\") pod \"ovnkube-node-kf5cf\" (UID: 
\"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-run-netns\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-os-release\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-var-lib-kubelet\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-tmp\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-os-release\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " 
pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-var-lib-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-var-lib-kubelet\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-socket-dir-parent\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.832488 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d5dedb94-cce8-40ef-8b20-152362aec6dc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f52168a-3467-4e13-b154-1feaf9796063-serviceca\") pod \"node-ca-2x5k7\" (UID: 
\"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-kubelet\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f-konnectivity-ca\") pod \"konnectivity-agent-sclfb\" (UID: \"02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f\") " pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-daemon-config\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-var-lib-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-kubelet\") pod \"multus-g5ctl\" (UID: 
\"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-socket-dir-parent\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e919041-6f50-4989-b55f-057c690de2ab-host-slash\") pod \"iptables-alerter-lld46\" (UID: \"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-modprobe-d\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysctl-d\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e919041-6f50-4989-b55f-057c690de2ab-host-slash\") pod \"iptables-alerter-lld46\" (UID: 
\"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-node-log\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-node-log\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2s92p\" (UniqueName: \"kubernetes.io/projected/d5dedb94-cce8-40ef-8b20-152362aec6dc-kube-api-access-2s92p\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-sys\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-run\") pod \"tuned-l8mkn\" (UID: 
\"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-modprobe-d\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.833282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-sysctl-d\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-run-ovn-kubernetes\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-sys\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-ovnkube-config\") pod \"ovnkube-node-kf5cf\" (UID: 
\"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-ovnkube-script-lib\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831859 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-run-ovn-kubernetes\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44d8x\" (UniqueName: \"kubernetes.io/projected/a28f28cc-e746-49c4-bf70-a476e379f760-kube-api-access-44d8x\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-run\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-netns\") pod \"multus-g5ctl\" 
(UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-host\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f52168a-3467-4e13-b154-1feaf9796063-host\") pod \"node-ca-2x5k7\" (UID: \"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.831969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-netns\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-daemon-config\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-env-overrides\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f52168a-3467-4e13-b154-1feaf9796063-host\") pod \"node-ca-2x5k7\" (UID: \"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-cnibin\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-host\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834081 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-slash\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 
22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-hostroot\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-conf-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-lib-modules\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-tuned\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-kubernetes\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: 
I0422 13:20:55.832378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-cni-bin\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f-agent-certs\") pod \"konnectivity-agent-sclfb\" (UID: \"02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f\") " pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832422 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-hostroot\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-ovnkube-config\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-ovnkube-script-lib\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832479 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-cnibin\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-run-openvswitch\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4l2\" (UniqueName: \"kubernetes.io/projected/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-kube-api-access-hn4l2\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a28f28cc-e746-49c4-bf70-a476e379f760-env-overrides\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.832565 2575 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-var-lib-cni-bin\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.834857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-sys-fs\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.832651 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs podName:8b92597f-aa17-456d-bc65-ee5880d70a69 nodeName:}" failed. No retries permitted until 2026-04-22 13:20:56.332620295 +0000 UTC m=+2.111307074 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs") pod "network-metrics-daemon-jgxt9" (UID: "8b92597f-aa17-456d-bc65-ee5880d70a69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/15e591d3-68e8-48e6-854d-1b459e3bf1c1-sys-fs\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-host-slash\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-systemd-units\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832763 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a28f28cc-e746-49c4-bf70-a476e379f760-systemd-units\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832766 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-multus-certs\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-system-cni-dir\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-host-run-multus-certs\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-kubernetes\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-conf-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832922 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-cni-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5dedb94-cce8-40ef-8b20-152362aec6dc-system-cni-dir\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.832929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-lib-modules\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.833025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/544999a6-323d-481d-b6ad-d24f5da3e82f-multus-cni-dir\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.834697 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-etc-tuned\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.834774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-tmp\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.835393 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.834926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a28f28cc-e746-49c4-bf70-a476e379f760-ovn-node-metrics-cert\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.835885 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.834998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f-agent-certs\") pod \"konnectivity-agent-sclfb\" (UID: \"02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f\") " pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:20:55.839366 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.839344 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:20:55.839485 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.839371 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:20:55.839485 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.839386 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xlgxm for pod openshift-network-diagnostics/network-check-target-ggdrt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:20:55.839485 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:55.839451 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm podName:767888b5-4be3-4a3e-ac92-a5c0cd2708fe nodeName:}" failed. No retries permitted until 2026-04-22 13:20:56.339434275 +0000 UTC m=+2.118121033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xlgxm" (UniqueName: "kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm") pod "network-check-target-ggdrt" (UID: "767888b5-4be3-4a3e-ac92-a5c0cd2708fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:20:55.839758 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.839735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx4zw\" (UniqueName: \"kubernetes.io/projected/544999a6-323d-481d-b6ad-d24f5da3e82f-kube-api-access-sx4zw\") pod \"multus-g5ctl\" (UID: \"544999a6-323d-481d-b6ad-d24f5da3e82f\") " pod="openshift-multus/multus-g5ctl" Apr 22 13:20:55.840631 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.840603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhlrh\" (UniqueName: \"kubernetes.io/projected/9e919041-6f50-4989-b55f-057c690de2ab-kube-api-access-zhlrh\") pod \"iptables-alerter-lld46\" (UID: \"9e919041-6f50-4989-b55f-057c690de2ab\") " pod="openshift-network-operator/iptables-alerter-lld46" Apr 22 13:20:55.840884 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.840863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l849x\" (UniqueName: \"kubernetes.io/projected/15e591d3-68e8-48e6-854d-1b459e3bf1c1-kube-api-access-l849x\") pod \"aws-ebs-csi-driver-node-7sk7b\" (UID: \"15e591d3-68e8-48e6-854d-1b459e3bf1c1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:55.841784 
ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.841764 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x79xf\" (UniqueName: \"kubernetes.io/projected/8b92597f-aa17-456d-bc65-ee5880d70a69-kube-api-access-x79xf\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:55.844588 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.844565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmsm\" (UniqueName: \"kubernetes.io/projected/4f52168a-3467-4e13-b154-1feaf9796063-kube-api-access-dnmsm\") pod \"node-ca-2x5k7\" (UID: \"4f52168a-3467-4e13-b154-1feaf9796063\") " pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:55.847659 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.847628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s92p\" (UniqueName: \"kubernetes.io/projected/d5dedb94-cce8-40ef-8b20-152362aec6dc-kube-api-access-2s92p\") pod \"multus-additional-cni-plugins-82s2m\" (UID: \"d5dedb94-cce8-40ef-8b20-152362aec6dc\") " pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:55.847940 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.847925 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44d8x\" (UniqueName: \"kubernetes.io/projected/a28f28cc-e746-49c4-bf70-a476e379f760-kube-api-access-44d8x\") pod \"ovnkube-node-kf5cf\" (UID: \"a28f28cc-e746-49c4-bf70-a476e379f760\") " pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:55.847988 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.847942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4l2\" (UniqueName: \"kubernetes.io/projected/55d1184b-6b1c-43fd-9fdf-a5cbe05174b6-kube-api-access-hn4l2\") pod \"tuned-l8mkn\" (UID: \"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:55.900416 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:55.900386 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a5a7d927cecab18576aa6fbed00027.slice/crio-ea270b558829c24b4a25b4988c3cb22a02b8f5d26580bc765171cf113cb1f187 WatchSource:0}: Error finding container ea270b558829c24b4a25b4988c3cb22a02b8f5d26580bc765171cf113cb1f187: Status 404 returned error can't find the container with id ea270b558829c24b4a25b4988c3cb22a02b8f5d26580bc765171cf113cb1f187 Apr 22 13:20:55.900855 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:55.900839 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e1839749775b093843d832935f4dc52.slice/crio-cd1cd385a62a756d1d53a88eae612a578529cf86f1d8b2a39969e26cf9989012 WatchSource:0}: Error finding container cd1cd385a62a756d1d53a88eae612a578529cf86f1d8b2a39969e26cf9989012: Status 404 returned error can't find the container with id cd1cd385a62a756d1d53a88eae612a578529cf86f1d8b2a39969e26cf9989012 Apr 22 13:20:55.904800 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:55.904783 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 13:20:56.024956 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.024922 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-lld46" Apr 22 13:20:56.031966 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:56.031941 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e919041_6f50_4989_b55f_057c690de2ab.slice/crio-ee74c0c3773ed4042bd4e7d0e9cab1e451a3862f503ef9952d75cf83a518b0b8 WatchSource:0}: Error finding container ee74c0c3773ed4042bd4e7d0e9cab1e451a3862f503ef9952d75cf83a518b0b8: Status 404 returned error can't find the container with id ee74c0c3773ed4042bd4e7d0e9cab1e451a3862f503ef9952d75cf83a518b0b8 Apr 22 13:20:56.038526 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.038506 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:20:56.043913 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:56.043895 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda28f28cc_e746_49c4_bf70_a476e379f760.slice/crio-c12721def7a6c59b31198970c82e3df7255822c9b2c8ef61f50443e70c9f7b36 WatchSource:0}: Error finding container c12721def7a6c59b31198970c82e3df7255822c9b2c8ef61f50443e70c9f7b36: Status 404 returned error can't find the container with id c12721def7a6c59b31198970c82e3df7255822c9b2c8ef61f50443e70c9f7b36 Apr 22 13:20:56.057440 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.057413 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" Apr 22 13:20:56.063184 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:56.063163 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15e591d3_68e8_48e6_854d_1b459e3bf1c1.slice/crio-c1bcc0b539772cce0e5bfaecc59fb2098e9a4d34da28ebf5cdd8e3157bbe96ca WatchSource:0}: Error finding container c1bcc0b539772cce0e5bfaecc59fb2098e9a4d34da28ebf5cdd8e3157bbe96ca: Status 404 returned error can't find the container with id c1bcc0b539772cce0e5bfaecc59fb2098e9a4d34da28ebf5cdd8e3157bbe96ca Apr 22 13:20:56.079424 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.079376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" Apr 22 13:20:56.084748 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:56.084727 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d1184b_6b1c_43fd_9fdf_a5cbe05174b6.slice/crio-b0e6aeec9c46e60486748ba3b4b38592d274eb2afa8c3b1e5a5b6e3ac5256877 WatchSource:0}: Error finding container b0e6aeec9c46e60486748ba3b4b38592d274eb2afa8c3b1e5a5b6e3ac5256877: Status 404 returned error can't find the container with id b0e6aeec9c46e60486748ba3b4b38592d274eb2afa8c3b1e5a5b6e3ac5256877 Apr 22 13:20:56.089483 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.089467 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2x5k7" Apr 22 13:20:56.097789 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:56.097768 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f52168a_3467_4e13_b154_1feaf9796063.slice/crio-8d376fdddfcaf52806e9ff73c4ff15e8d6613c937e206138c95250473658b44b WatchSource:0}: Error finding container 8d376fdddfcaf52806e9ff73c4ff15e8d6613c937e206138c95250473658b44b: Status 404 returned error can't find the container with id 8d376fdddfcaf52806e9ff73c4ff15e8d6613c937e206138c95250473658b44b Apr 22 13:20:56.107042 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.107024 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-82s2m" Apr 22 13:20:56.112252 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:56.112231 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5dedb94_cce8_40ef_8b20_152362aec6dc.slice/crio-02bff40e9a2375d40c272536813b1c0120a7fce613a369ade5140d5a582a9b23 WatchSource:0}: Error finding container 02bff40e9a2375d40c272536813b1c0120a7fce613a369ade5140d5a582a9b23: Status 404 returned error can't find the container with id 02bff40e9a2375d40c272536813b1c0120a7fce613a369ade5140d5a582a9b23 Apr 22 13:20:56.122319 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.122304 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g5ctl" Apr 22 13:20:56.127492 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.127474 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:20:56.127773 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:56.127748 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod544999a6_323d_481d_b6ad_d24f5da3e82f.slice/crio-2fa88fd3d2175c2e91680838205e09f6a231a1f35cf11bcb0ce66de35116e4a9 WatchSource:0}: Error finding container 2fa88fd3d2175c2e91680838205e09f6a231a1f35cf11bcb0ce66de35116e4a9: Status 404 returned error can't find the container with id 2fa88fd3d2175c2e91680838205e09f6a231a1f35cf11bcb0ce66de35116e4a9 Apr 22 13:20:56.133993 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:20:56.133973 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a0d9d1_db8d_479c_9fe1_ca3a2cfd049f.slice/crio-cbc1c37b4f6dddbcc0aaf94779b27ce0d5f7e379c48ffc3c024e0da07e886744 WatchSource:0}: Error finding container cbc1c37b4f6dddbcc0aaf94779b27ce0d5f7e379c48ffc3c024e0da07e886744: Status 404 returned error can't find the container with id cbc1c37b4f6dddbcc0aaf94779b27ce0d5f7e379c48ffc3c024e0da07e886744 Apr 22 13:20:56.336604 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.336524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:56.336741 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:56.336643 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:20:56.336741 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:56.336722 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs podName:8b92597f-aa17-456d-bc65-ee5880d70a69 nodeName:}" failed. No retries permitted until 2026-04-22 13:20:57.336693588 +0000 UTC m=+3.115380344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs") pod "network-metrics-daemon-jgxt9" (UID: "8b92597f-aa17-456d-bc65-ee5880d70a69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:20:56.436957 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.436919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:20:56.437138 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:56.437098 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:20:56.437138 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:56.437119 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:20:56.437138 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:56.437132 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xlgxm for pod openshift-network-diagnostics/network-check-target-ggdrt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:20:56.437285 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:56.437189 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm podName:767888b5-4be3-4a3e-ac92-a5c0cd2708fe nodeName:}" failed. No retries permitted until 2026-04-22 13:20:57.437170646 +0000 UTC m=+3.215857416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xlgxm" (UniqueName: "kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm") pod "network-check-target-ggdrt" (UID: "767888b5-4be3-4a3e-ac92-a5c0cd2708fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:20:56.766663 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.766564 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 13:15:55 +0000 UTC" deadline="2027-10-27 20:05:44.040613847 +0000 UTC" Apr 22 13:20:56.766663 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.766602 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13278h44m47.274016264s" Apr 22 13:20:56.853376 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.852837 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:56.853376 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:56.852971 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:20:56.863976 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.863917 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sclfb" event={"ID":"02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f","Type":"ContainerStarted","Data":"cbc1c37b4f6dddbcc0aaf94779b27ce0d5f7e379c48ffc3c024e0da07e886744"} Apr 22 13:20:56.870109 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.870075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82s2m" event={"ID":"d5dedb94-cce8-40ef-8b20-152362aec6dc","Type":"ContainerStarted","Data":"02bff40e9a2375d40c272536813b1c0120a7fce613a369ade5140d5a582a9b23"} Apr 22 13:20:56.877718 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.877677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2x5k7" event={"ID":"4f52168a-3467-4e13-b154-1feaf9796063","Type":"ContainerStarted","Data":"8d376fdddfcaf52806e9ff73c4ff15e8d6613c937e206138c95250473658b44b"} Apr 22 13:20:56.881934 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.879961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"c12721def7a6c59b31198970c82e3df7255822c9b2c8ef61f50443e70c9f7b36"} Apr 22 13:20:56.886637 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.886609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g5ctl" event={"ID":"544999a6-323d-481d-b6ad-d24f5da3e82f","Type":"ContainerStarted","Data":"2fa88fd3d2175c2e91680838205e09f6a231a1f35cf11bcb0ce66de35116e4a9"} Apr 22 13:20:56.896219 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.896193 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 13:20:56.918047 
ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.918013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" event={"ID":"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6","Type":"ContainerStarted","Data":"b0e6aeec9c46e60486748ba3b4b38592d274eb2afa8c3b1e5a5b6e3ac5256877"} Apr 22 13:20:56.925371 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.925313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" event={"ID":"15e591d3-68e8-48e6-854d-1b459e3bf1c1","Type":"ContainerStarted","Data":"c1bcc0b539772cce0e5bfaecc59fb2098e9a4d34da28ebf5cdd8e3157bbe96ca"} Apr 22 13:20:56.932117 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.932083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lld46" event={"ID":"9e919041-6f50-4989-b55f-057c690de2ab","Type":"ContainerStarted","Data":"ee74c0c3773ed4042bd4e7d0e9cab1e451a3862f503ef9952d75cf83a518b0b8"} Apr 22 13:20:56.938643 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.938614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal" event={"ID":"a1a5a7d927cecab18576aa6fbed00027","Type":"ContainerStarted","Data":"ea270b558829c24b4a25b4988c3cb22a02b8f5d26580bc765171cf113cb1f187"} Apr 22 13:20:56.944616 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:56.944587 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal" event={"ID":"5e1839749775b093843d832935f4dc52","Type":"ContainerStarted","Data":"cd1cd385a62a756d1d53a88eae612a578529cf86f1d8b2a39969e26cf9989012"} Apr 22 13:20:57.345328 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:57.345293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:57.345509 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:57.345474 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:20:57.345574 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:57.345534 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs podName:8b92597f-aa17-456d-bc65-ee5880d70a69 nodeName:}" failed. No retries permitted until 2026-04-22 13:20:59.345516049 +0000 UTC m=+5.124202807 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs") pod "network-metrics-daemon-jgxt9" (UID: "8b92597f-aa17-456d-bc65-ee5880d70a69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:20:57.446386 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:57.446339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:20:57.446571 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:57.446496 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:20:57.446571 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:57.446515 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:20:57.446571 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:57.446528 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xlgxm for pod openshift-network-diagnostics/network-check-target-ggdrt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:20:57.446738 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:57.446585 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm podName:767888b5-4be3-4a3e-ac92-a5c0cd2708fe nodeName:}" failed. No retries permitted until 2026-04-22 13:20:59.446566745 +0000 UTC m=+5.225253508 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xlgxm" (UniqueName: "kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm") pod "network-check-target-ggdrt" (UID: "767888b5-4be3-4a3e-ac92-a5c0cd2708fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:20:57.751040 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:57.750961 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 13:20:57.767193 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:57.767153 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 13:15:55 +0000 UTC" deadline="2027-12-21 06:38:43.448403126 +0000 UTC" Apr 22 13:20:57.767193 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:57.767190 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14585h17m45.681216981s" Apr 22 
13:20:57.849868 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:57.849835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:20:57.850046 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:57.849975 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:20:58.710872 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:58.710839 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 13:20:58.851477 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:58.851447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:58.851838 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:58.851594 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:20:59.360636 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:59.360058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:20:59.360636 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:59.360234 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:20:59.360636 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:59.360295 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs podName:8b92597f-aa17-456d-bc65-ee5880d70a69 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:03.360276093 +0000 UTC m=+9.138962855 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs") pod "network-metrics-daemon-jgxt9" (UID: "8b92597f-aa17-456d-bc65-ee5880d70a69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:20:59.461526 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:59.460937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:20:59.461526 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:59.461090 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:20:59.461526 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:59.461108 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:20:59.461526 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:59.461121 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xlgxm for pod openshift-network-diagnostics/network-check-target-ggdrt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:20:59.461526 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:59.461173 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm podName:767888b5-4be3-4a3e-ac92-a5c0cd2708fe nodeName:}" failed. 
No retries permitted until 2026-04-22 13:21:03.461155074 +0000 UTC m=+9.239841834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xlgxm" (UniqueName: "kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm") pod "network-check-target-ggdrt" (UID: "767888b5-4be3-4a3e-ac92-a5c0cd2708fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:20:59.850834 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:20:59.850728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:20:59.850994 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:20:59.850879 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:00.850207 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:00.850149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:00.850638 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:00.850299 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:01.850506 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:01.850471 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:01.851003 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:01.850610 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:02.850895 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:02.850616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:02.850895 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:02.850761 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:03.394938 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:03.394895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:03.395122 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:03.395058 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:03.395192 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:03.395131 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs podName:8b92597f-aa17-456d-bc65-ee5880d70a69 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:11.395110186 +0000 UTC m=+17.173796943 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs") pod "network-metrics-daemon-jgxt9" (UID: "8b92597f-aa17-456d-bc65-ee5880d70a69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:03.495616 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:03.495567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:03.495796 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:03.495724 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:21:03.495796 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:03.495744 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:21:03.495796 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:03.495757 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xlgxm for pod openshift-network-diagnostics/network-check-target-ggdrt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:03.496002 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:03.495838 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm podName:767888b5-4be3-4a3e-ac92-a5c0cd2708fe nodeName:}" failed. 
No retries permitted until 2026-04-22 13:21:11.49580566 +0000 UTC m=+17.274492419 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xlgxm" (UniqueName: "kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm") pod "network-check-target-ggdrt" (UID: "767888b5-4be3-4a3e-ac92-a5c0cd2708fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:03.850553 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:03.850470 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:03.850714 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:03.850601 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:04.850683 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:04.850624 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:04.851158 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:04.850746 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:05.849763 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:05.849728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:05.849925 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:05.849883 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:06.850284 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:06.850242 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:06.850701 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:06.850369 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:07.850121 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:07.850081 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:07.850305 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:07.850221 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:08.850006 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:08.849968 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:08.850442 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:08.850124 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:09.850194 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:09.850162 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:09.850578 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:09.850276 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:10.850135 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:10.850097 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:10.850297 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:10.850206 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:11.450297 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:11.450077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:11.450485 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:11.450236 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:11.450485 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:11.450422 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs podName:8b92597f-aa17-456d-bc65-ee5880d70a69 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.450403117 +0000 UTC m=+33.229089891 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs") pod "network-metrics-daemon-jgxt9" (UID: "8b92597f-aa17-456d-bc65-ee5880d70a69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:11.551353 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:11.551310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:11.551515 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:11.551444 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 13:21:11.551515 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:11.551461 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 13:21:11.551515 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:11.551474 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xlgxm for pod openshift-network-diagnostics/network-check-target-ggdrt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:11.551655 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:11.551542 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm podName:767888b5-4be3-4a3e-ac92-a5c0cd2708fe nodeName:}" failed. 
No retries permitted until 2026-04-22 13:21:27.55152297 +0000 UTC m=+33.330209727 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xlgxm" (UniqueName: "kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm") pod "network-check-target-ggdrt" (UID: "767888b5-4be3-4a3e-ac92-a5c0cd2708fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 13:21:11.850501 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:11.850430 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:11.850926 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:11.850550 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:12.850417 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:12.850386 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:12.850599 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:12.850508 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:13.850125 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:13.849941 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:13.850125 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:13.850080 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:14.851149 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.850921 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:14.851782 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:14.851250 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:14.976761 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.976674 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sclfb" event={"ID":"02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f","Type":"ContainerStarted","Data":"312bda3f4cbe7b095d58bc26f57c9a92d03c66166101c0b4a53caf2564283e1a"} Apr 22 13:21:14.978037 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.978005 2575 generic.go:358] "Generic (PLEG): container finished" podID="d5dedb94-cce8-40ef-8b20-152362aec6dc" containerID="fc5be2f8225f6eb0f36ea399d722e262cf81022540e6bd97cbbe45aa431a7de3" exitCode=0 Apr 22 13:21:14.978170 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.978096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82s2m" event={"ID":"d5dedb94-cce8-40ef-8b20-152362aec6dc","Type":"ContainerDied","Data":"fc5be2f8225f6eb0f36ea399d722e262cf81022540e6bd97cbbe45aa431a7de3"} Apr 22 13:21:14.979340 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.979255 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2x5k7" event={"ID":"4f52168a-3467-4e13-b154-1feaf9796063","Type":"ContainerStarted","Data":"bd101d94faa3aca94486716a03fc57aa7c93d6a8cb12056b75f78af4e1d2a246"} Apr 22 13:21:14.984217 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.984198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:21:14.984501 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.984482 2575 generic.go:358] "Generic (PLEG): container finished" podID="a28f28cc-e746-49c4-bf70-a476e379f760" containerID="9620ff07b1b7a97b2eec0a3683b3dd276688eaae46ccb55525a3cdca8f8137fe" exitCode=1 Apr 22 13:21:14.984560 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.984541 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"d10aa500ffcaf7a4b265d1c209d70a769e7be496aa2a3ef95cae30d026b0b46d"} Apr 22 13:21:14.984603 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.984569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"921de0a5d40f81f8ca8c2a6ff1d788e8a594f555921b0c2b6abf701c75564c40"} Apr 22 13:21:14.984603 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.984579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"e3f77de449fb0fbe38fd9f6df9256b18e53f104022b68d6c82e6db213d493144"} Apr 22 13:21:14.984603 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.984588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"c60e6673098fff3dee3e933fe6ce9ef1cdbaa12692a23b65ce069e0ce867c6ba"} Apr 22 13:21:14.984603 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.984601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerDied","Data":"9620ff07b1b7a97b2eec0a3683b3dd276688eaae46ccb55525a3cdca8f8137fe"} Apr 22 13:21:14.984738 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.984617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"e5e2cd3ccf80a5004737b1069ad2d61efc011f32f56a818a56bbef69a9f449f1"} Apr 22 13:21:14.985675 ip-10-0-136-73 kubenswrapper[2575]: 
I0422 13:21:14.985649 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g5ctl" event={"ID":"544999a6-323d-481d-b6ad-d24f5da3e82f","Type":"ContainerStarted","Data":"c5a07704a3969e7c7e1e7b35a97ab2febc79abd72ca1dd176a77039473d8501d"} Apr 22 13:21:14.986752 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.986729 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" event={"ID":"55d1184b-6b1c-43fd-9fdf-a5cbe05174b6","Type":"ContainerStarted","Data":"917fb1864712f3d7d821a1c8ee6f1b35c64d28dfb7a66df4b968b8fe577a39f7"} Apr 22 13:21:14.987905 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.987887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" event={"ID":"15e591d3-68e8-48e6-854d-1b459e3bf1c1","Type":"ContainerStarted","Data":"43fbc13c2c25a5666660aa71eaea5723f0eac97bc74a04863f43956a787a1e84"} Apr 22 13:21:14.988995 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.988975 2575 generic.go:358] "Generic (PLEG): container finished" podID="a1a5a7d927cecab18576aa6fbed00027" containerID="ca935bd5bf84adec7abce4fd3c99eab9336f494bea21b4d441a5f012b37f8fb4" exitCode=0 Apr 22 13:21:14.989069 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.989035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal" event={"ID":"a1a5a7d927cecab18576aa6fbed00027","Type":"ContainerDied","Data":"ca935bd5bf84adec7abce4fd3c99eab9336f494bea21b4d441a5f012b37f8fb4"} Apr 22 13:21:14.990137 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:14.990119 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal" event={"ID":"5e1839749775b093843d832935f4dc52","Type":"ContainerStarted","Data":"67946ae589adfbfa190a3a121c2bc2df15cde96e534fe6cd6720d9558c199f8c"} Apr 22 13:21:14.992213 ip-10-0-136-73 
kubenswrapper[2575]: I0422 13:21:14.992177 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-sclfb" podStartSLOduration=2.282856405 podStartE2EDuration="19.992167771s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:20:56.135527722 +0000 UTC m=+1.914214478" lastFinishedPulling="2026-04-22 13:21:13.844839081 +0000 UTC m=+19.623525844" observedRunningTime="2026-04-22 13:21:14.991943961 +0000 UTC m=+20.770630740" watchObservedRunningTime="2026-04-22 13:21:14.992167771 +0000 UTC m=+20.770854549" Apr 22 13:21:15.018857 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.018640 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-l8mkn" podStartSLOduration=2.256632743 podStartE2EDuration="20.018623649s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:20:56.086111522 +0000 UTC m=+1.864798278" lastFinishedPulling="2026-04-22 13:21:13.848102414 +0000 UTC m=+19.626789184" observedRunningTime="2026-04-22 13:21:15.006364783 +0000 UTC m=+20.785051561" watchObservedRunningTime="2026-04-22 13:21:15.018623649 +0000 UTC m=+20.797310427" Apr 22 13:21:15.019039 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.019014 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g5ctl" podStartSLOduration=2.109633886 podStartE2EDuration="20.019004846s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:20:56.130258464 +0000 UTC m=+1.908945224" lastFinishedPulling="2026-04-22 13:21:14.039629426 +0000 UTC m=+19.818316184" observedRunningTime="2026-04-22 13:21:15.018177727 +0000 UTC m=+20.796864518" watchObservedRunningTime="2026-04-22 13:21:15.019004846 +0000 UTC m=+20.797691624" Apr 22 13:21:15.053846 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.053784 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-73.ec2.internal" podStartSLOduration=20.053768486 podStartE2EDuration="20.053768486s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:21:15.053523281 +0000 UTC m=+20.832210064" watchObservedRunningTime="2026-04-22 13:21:15.053768486 +0000 UTC m=+20.832455264" Apr 22 13:21:15.068483 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.068437 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2x5k7" podStartSLOduration=2.323064843 podStartE2EDuration="20.068424206s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:20:56.099484887 +0000 UTC m=+1.878171644" lastFinishedPulling="2026-04-22 13:21:13.844844238 +0000 UTC m=+19.623531007" observedRunningTime="2026-04-22 13:21:15.068330372 +0000 UTC m=+20.847017149" watchObservedRunningTime="2026-04-22 13:21:15.068424206 +0000 UTC m=+20.847110983" Apr 22 13:21:15.582970 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.582946 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 13:21:15.795297 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.795146 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T13:21:15.582964698Z","UUID":"210e53af-bd00-4afa-b7ea-dda7d316b78f","Handler":null,"Name":"","Endpoint":""} Apr 22 13:21:15.797266 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.797240 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 
13:21:15.797266 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.797271 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 13:21:15.850381 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.850346 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:15.850536 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:15.850482 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:15.994362 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.994323 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" event={"ID":"15e591d3-68e8-48e6-854d-1b459e3bf1c1","Type":"ContainerStarted","Data":"28c4a83706eacb50fc9d039d7a730216e24dbf407c91b6c03ff7d7156ec2c175"} Apr 22 13:21:15.996243 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.996213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lld46" event={"ID":"9e919041-6f50-4989-b55f-057c690de2ab","Type":"ContainerStarted","Data":"5fedb55f4a7a21a2aa55a25a3b1207be24b1e522d8946394e52c72befcdd9caf"} Apr 22 13:21:15.998464 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:15.998438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal" 
event={"ID":"a1a5a7d927cecab18576aa6fbed00027","Type":"ContainerStarted","Data":"79e22d4bf32cf2f4bbf107640946ba95e3f6f9904e14536e94022c5d42d4babe"} Apr 22 13:21:16.012167 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:16.010068 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lld46" podStartSLOduration=4.198657666 podStartE2EDuration="22.010053388s" podCreationTimestamp="2026-04-22 13:20:54 +0000 UTC" firstStartedPulling="2026-04-22 13:20:56.033475472 +0000 UTC m=+1.812162228" lastFinishedPulling="2026-04-22 13:21:13.84487118 +0000 UTC m=+19.623557950" observedRunningTime="2026-04-22 13:21:16.009249701 +0000 UTC m=+21.787936472" watchObservedRunningTime="2026-04-22 13:21:16.010053388 +0000 UTC m=+21.788740168" Apr 22 13:21:16.021831 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:16.021768 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-73.ec2.internal" podStartSLOduration=21.021751893 podStartE2EDuration="21.021751893s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:21:16.021605814 +0000 UTC m=+21.800292594" watchObservedRunningTime="2026-04-22 13:21:16.021751893 +0000 UTC m=+21.800438672" Apr 22 13:21:16.850129 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:16.850049 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:16.850315 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:16.850189 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:16.958685 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:16.958653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:21:16.959360 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:16.959336 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:21:17.002101 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:17.002054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" event={"ID":"15e591d3-68e8-48e6-854d-1b459e3bf1c1","Type":"ContainerStarted","Data":"1882587fba4578020c25a973181ce472136dd374b733a9677fc16aefb4a28b3e"} Apr 22 13:21:17.005054 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:17.005028 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:21:17.005458 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:17.005430 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"e46f59499b3c16fc2dd78b6d82f0c1453fea3d8422de4e6813399901b2aa59f0"} Apr 22 13:21:17.006068 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:17.005833 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:21:17.006198 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:17.006179 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-sclfb" Apr 22 13:21:17.020209 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:17.020163 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sk7b" podStartSLOduration=1.6897169509999999 podStartE2EDuration="22.020148094s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:20:56.06446169 +0000 UTC m=+1.843148445" lastFinishedPulling="2026-04-22 13:21:16.394892816 +0000 UTC m=+22.173579588" observedRunningTime="2026-04-22 13:21:17.019850703 +0000 UTC m=+22.798537481" watchObservedRunningTime="2026-04-22 13:21:17.020148094 +0000 UTC m=+22.798834872" Apr 22 13:21:17.850057 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:17.850025 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:17.850222 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:17.850145 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:18.850112 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:18.850040 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:18.850537 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:18.850159 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:19.850397 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:19.850173 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:19.850904 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:19.850428 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:20.012016 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:20.011982 2575 generic.go:358] "Generic (PLEG): container finished" podID="d5dedb94-cce8-40ef-8b20-152362aec6dc" containerID="3421bf9d611e4b3be6ea39e78035deb31f0256b343ce850f2ee2302692a1c581" exitCode=0 Apr 22 13:21:20.012152 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:20.012054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82s2m" event={"ID":"d5dedb94-cce8-40ef-8b20-152362aec6dc","Type":"ContainerDied","Data":"3421bf9d611e4b3be6ea39e78035deb31f0256b343ce850f2ee2302692a1c581"} Apr 22 13:21:20.015140 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:20.015119 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:21:20.015433 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:20.015415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" 
event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"0e0187a0676517c283400130f6e725046d9d163183f8f3fa661ab0b8bcf73b9a"} Apr 22 13:21:20.015671 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:20.015657 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:21:20.015883 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:20.015866 2575 scope.go:117] "RemoveContainer" containerID="9620ff07b1b7a97b2eec0a3683b3dd276688eaae46ccb55525a3cdca8f8137fe" Apr 22 13:21:20.030672 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:20.030645 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:21:20.850454 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:20.850337 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:20.851166 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:20.850489 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:21.019209 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.019123 2575 generic.go:358] "Generic (PLEG): container finished" podID="d5dedb94-cce8-40ef-8b20-152362aec6dc" containerID="ebdc5412eb4d91efe79b8eed546d0d3b45d2d77c634dc3030017199be9a4e008" exitCode=0 Apr 22 13:21:21.019337 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.019209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82s2m" event={"ID":"d5dedb94-cce8-40ef-8b20-152362aec6dc","Type":"ContainerDied","Data":"ebdc5412eb4d91efe79b8eed546d0d3b45d2d77c634dc3030017199be9a4e008"} Apr 22 13:21:21.022429 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.022413 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:21:21.022719 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.022693 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" event={"ID":"a28f28cc-e746-49c4-bf70-a476e379f760","Type":"ContainerStarted","Data":"29754c059727ff540d9c1d2369b0c2db1977681c523eda90995069b8e6928da1"} Apr 22 13:21:21.022854 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.022842 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 13:21:21.022935 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.022923 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:21:21.038438 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.038411 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:21:21.042389 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.042369 2575 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-multus/network-metrics-daemon-jgxt9"] Apr 22 13:21:21.042612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.042592 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:21.042710 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:21.042689 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:21.044989 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.044968 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ggdrt"] Apr 22 13:21:21.045062 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.045049 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:21.045144 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:21.045127 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:21.053076 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:21.053023 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" podStartSLOduration=8.210680313 podStartE2EDuration="26.053010288s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:20:56.045343866 +0000 UTC m=+1.824030622" lastFinishedPulling="2026-04-22 13:21:13.887673841 +0000 UTC m=+19.666360597" observedRunningTime="2026-04-22 13:21:21.052895828 +0000 UTC m=+26.831582639" watchObservedRunningTime="2026-04-22 13:21:21.053010288 +0000 UTC m=+26.831697071" Apr 22 13:21:22.026552 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.026308 2575 generic.go:358] "Generic (PLEG): container finished" podID="d5dedb94-cce8-40ef-8b20-152362aec6dc" containerID="0cd364eb63501012bbe2c9ee8e647e440042ccf34a5f7f850112ac3319f04748" exitCode=0 Apr 22 13:21:22.026552 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.026389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82s2m" event={"ID":"d5dedb94-cce8-40ef-8b20-152362aec6dc","Type":"ContainerDied","Data":"0cd364eb63501012bbe2c9ee8e647e440042ccf34a5f7f850112ac3319f04748"} Apr 22 13:21:22.027039 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.026654 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 13:21:22.774761 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.774721 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cmksh"] Apr 22 13:21:22.777547 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.777524 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.777656 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:22.777605 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cmksh" podUID="7b8dea64-f9f6-45b5-b139-340bac72fa46" Apr 22 13:21:22.781007 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.780982 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" Apr 22 13:21:22.784465 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.784443 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cmksh"] Apr 22 13:21:22.836140 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.836112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b8dea64-f9f6-45b5-b139-340bac72fa46-dbus\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.836285 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.836174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b8dea64-f9f6-45b5-b139-340bac72fa46-kubelet-config\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.836285 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.836200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.850240 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.850213 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:22.850379 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.850240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:22.850379 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:22.850332 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:22.850570 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:22.850541 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:22.937478 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.937370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b8dea64-f9f6-45b5-b139-340bac72fa46-dbus\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.937478 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.937454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b8dea64-f9f6-45b5-b139-340bac72fa46-kubelet-config\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.937658 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.937482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.937658 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.937578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b8dea64-f9f6-45b5-b139-340bac72fa46-kubelet-config\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.937658 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:22.937600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b8dea64-f9f6-45b5-b139-340bac72fa46-dbus\") pod 
\"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:22.937761 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:22.937666 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 13:21:22.937761 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:22.937735 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret podName:7b8dea64-f9f6-45b5-b139-340bac72fa46 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:23.437716664 +0000 UTC m=+29.216403422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret") pod "global-pull-secret-syncer-cmksh" (UID: "7b8dea64-f9f6-45b5-b139-340bac72fa46") : object "kube-system"/"original-pull-secret" not registered Apr 22 13:21:23.028320 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:23.028281 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:23.028827 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:23.028405 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cmksh" podUID="7b8dea64-f9f6-45b5-b139-340bac72fa46" Apr 22 13:21:23.442315 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:23.442280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:23.442476 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:23.442449 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 13:21:23.442535 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:23.442523 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret podName:7b8dea64-f9f6-45b5-b139-340bac72fa46 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:24.442504496 +0000 UTC m=+30.221191256 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret") pod "global-pull-secret-syncer-cmksh" (UID: "7b8dea64-f9f6-45b5-b139-340bac72fa46") : object "kube-system"/"original-pull-secret" not registered Apr 22 13:21:24.043860 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:24.043783 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf" podUID="a28f28cc-e746-49c4-bf70-a476e379f760" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 13:21:24.450532 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:24.450437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:24.450682 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:24.450611 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 13:21:24.450682 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:24.450669 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret podName:7b8dea64-f9f6-45b5-b139-340bac72fa46 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:26.450650988 +0000 UTC m=+32.229337756 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret") pod "global-pull-secret-syncer-cmksh" (UID: "7b8dea64-f9f6-45b5-b139-340bac72fa46") : object "kube-system"/"original-pull-secret" not registered Apr 22 13:21:24.851141 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:24.851059 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:24.851311 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:24.851145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:24.851311 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:24.851178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:24.851311 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:24.851201 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cmksh" podUID="7b8dea64-f9f6-45b5-b139-340bac72fa46" Apr 22 13:21:24.851311 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:24.851248 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:24.851484 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:24.851344 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69" Apr 22 13:21:26.464100 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:26.464061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:26.464498 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:26.464200 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 13:21:26.464498 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:26.464260 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret podName:7b8dea64-f9f6-45b5-b139-340bac72fa46 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:30.464242796 +0000 UTC m=+36.242929556 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret") pod "global-pull-secret-syncer-cmksh" (UID: "7b8dea64-f9f6-45b5-b139-340bac72fa46") : object "kube-system"/"original-pull-secret" not registered Apr 22 13:21:26.850920 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:26.850804 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:26.851086 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:26.850804 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:26.851086 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:26.850961 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ggdrt" podUID="767888b5-4be3-4a3e-ac92-a5c0cd2708fe" Apr 22 13:21:26.851086 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:26.850979 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:26.851086 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:26.851069 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cmksh" podUID="7b8dea64-f9f6-45b5-b139-340bac72fa46"
Apr 22 13:21:26.851279 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:26.851189 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jgxt9" podUID="8b92597f-aa17-456d-bc65-ee5880d70a69"
Apr 22 13:21:26.986355 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:26.986320 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-73.ec2.internal" event="NodeReady"
Apr 22 13:21:26.986525 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:26.986480 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 13:21:27.038390 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.038332 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-c7lmh"]
Apr 22 13:21:27.067497 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.067455 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6784d5bdf4-cp9jk"]
Apr 22 13:21:27.067653 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.067593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.070493 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.070466 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 13:21:27.070493 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.070484 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gff5c\""
Apr 22 13:21:27.071807 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.070957 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.071807 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.071156 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.071807 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.071436 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 13:21:27.082149 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.080761 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"]
Apr 22 13:21:27.083728 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.083702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.083860 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.083736 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 13:21:27.087302 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.087280 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.087302 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.087316 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 13:21:27.087474 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.087350 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 13:21:27.087551 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.087523 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lx8tg\""
Apr 22 13:21:27.087704 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.087688 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.088159 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.088145 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 13:21:27.091187 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.090883 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 13:21:27.105783 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.105716 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"]
Apr 22 13:21:27.105924 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.105882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:27.108445 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.108415 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 13:21:27.108602 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.108579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bmt5j\""
Apr 22 13:21:27.108693 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.108541 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.108779 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.108529 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.117879 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.117859 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b8d7bc487-ggkqp"]
Apr 22 13:21:27.118015 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.117999 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:27.121484 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.121465 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 13:21:27.121713 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.121699 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.121984 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.121928 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hfczk\""
Apr 22 13:21:27.121984 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.121978 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.122462 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.122447 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 13:21:27.131855 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.131835 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tkzkh"]
Apr 22 13:21:27.131935 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.131919 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:27.136895 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.136681 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 13:21:27.136895 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.136788 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 13:21:27.137162 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.137113 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7w6d8\""
Apr 22 13:21:27.137665 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.137626 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 13:21:27.143101 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.143077 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 13:21:27.145532 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.145512 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc"]
Apr 22 13:21:27.145666 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.145645 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh"
Apr 22 13:21:27.148469 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.148451 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 13:21:27.148653 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.148503 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.148723 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.148560 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-xlktb\""
Apr 22 13:21:27.149035 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.149017 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 13:21:27.149152 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.149132 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.157924 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.156186 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 13:21:27.158574 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.158555 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-c7lmh"]
Apr 22 13:21:27.158661 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.158582 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb"]
Apr 22 13:21:27.158731 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.158717 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc"
Apr 22 13:21:27.161653 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.161603 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-qhz7c\""
Apr 22 13:21:27.162031 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.161968 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.162031 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.161984 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.168312 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-tmp\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.168406 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.168406 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5fv\" (UniqueName: \"kubernetes.io/projected/31773dcc-6b07-4788-8a81-d7978b0c63fc-kube-api-access-8w5fv\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.168503 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.168503 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-stats-auth\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.168503 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-snapshots\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.168636 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcl5j\" (UniqueName: \"kubernetes.io/projected/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-kube-api-access-lcl5j\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:27.168636 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-service-ca-bundle\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.168636 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168610 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-serving-cert\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.168768 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.168768 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-default-certificate\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.168768 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:27.168768 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.168725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4htj\" (UniqueName: \"kubernetes.io/projected/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-kube-api-access-n4htj\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.176983 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.176957 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844"]
Apr 22 13:21:27.177097 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.177079 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb"
Apr 22 13:21:27.182727 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.182704 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 13:21:27.182950 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.182911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hckdg\""
Apr 22 13:21:27.183114 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.183050 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.183114 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.183073 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.183924 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.183904 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 13:21:27.193478 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.193441 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"]
Apr 22 13:21:27.193660 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.193639 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844"
Apr 22 13:21:27.196108 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.196084 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.196203 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.196087 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.196272 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.196109 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-zxsb8\""
Apr 22 13:21:27.196635 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.196618 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 22 13:21:27.196754 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.196735 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 22 13:21:27.208522 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.208502 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc"]
Apr 22 13:21:27.208675 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.208654 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"
Apr 22 13:21:27.211153 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.211063 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 22 13:21:27.211153 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.211073 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 22 13:21:27.211291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.211155 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-65l8g\""
Apr 22 13:21:27.229486 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.229465 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"]
Apr 22 13:21:27.229610 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.229591 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc"
Apr 22 13:21:27.232342 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.232325 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-d47l2\""
Apr 22 13:21:27.232342 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.232335 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.232509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.232430 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.242530 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.242513 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"]
Apr 22 13:21:27.242687 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.242671 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.245203 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.245187 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 13:21:27.245379 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.245302 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 13:21:27.245379 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.245243 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 13:21:27.245516 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.245384 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 13:21:27.245516 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.245394 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 13:21:27.245711 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.245692 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 13:21:27.245993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.245975 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 13:21:27.255113 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.255096 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6"]
Apr 22 13:21:27.255284 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.255269 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"
Apr 22 13:21:27.257467 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.257448 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 13:21:27.267546 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267530 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6784d5bdf4-cp9jk"]
Apr 22 13:21:27.267627 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267553 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc"]
Apr 22 13:21:27.267627 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267567 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"]
Apr 22 13:21:27.267627 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267591 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"]
Apr 22 13:21:27.267627 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267604 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b8d7bc487-ggkqp"]
Apr 22 13:21:27.267627 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267616 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb"]
Apr 22 13:21:27.267790 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267630 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"]
Apr 22 13:21:27.267790 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267640 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"]
Apr 22 13:21:27.267790 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267648 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"]
Apr 22 13:21:27.267790 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267659 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-28xhg"]
Apr 22 13:21:27.267790 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.267677 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6"
Apr 22 13:21:27.269743 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"
Apr 22 13:21:27.269840 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325a1de2-a59a-4875-9ae5-6279a61a3d7c-config\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb"
Apr 22 13:21:27.269840 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269778 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 13:21:27.269840 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-stats-auth\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.270019 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314bb891-5872-4d07-b293-eb6ba8a1c926-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844"
Apr 22 13:21:27.270019 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-image-registry-private-configuration\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:27.270019 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-snapshots\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.270019 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcl5j\" (UniqueName: \"kubernetes.io/projected/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-kube-api-access-lcl5j\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:27.270019 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.269997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-serving-cert\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.270226 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270048 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6690926-2579-440b-9233-f4d551be735b-trusted-ca\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh"
Apr 22 13:21:27.270226 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c9e05d-a355-4746-b30f-56fb43b54267-ca-trust-extracted\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:27.270226 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:27.270226 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270158 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6690926-2579-440b-9233-f4d551be735b-serving-cert\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh"
Apr 22 13:21:27.270405 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjxx\" (UniqueName: \"kubernetes.io/projected/e6690926-2579-440b-9233-f4d551be735b-kube-api-access-9mjxx\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh"
Apr 22 13:21:27.270405 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-default-certificate\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.270405 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:27.270405 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-registry-certificates\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:27.270405 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4htj\" (UniqueName: \"kubernetes.io/projected/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-kube-api-access-n4htj\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.270642 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.270411 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 13:21:27.270642 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-bound-sa-token\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:27.270642 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.270511 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls podName:26fb6a86-2731-45ec-bf1d-5a84dbd6e4de nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.770484769 +0000 UTC m=+33.549171743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5b69k" (UID: "26fb6a86-2731-45ec-bf1d-5a84dbd6e4de") : secret "samples-operator-tls" not found
Apr 22 13:21:27.270642 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-snapshots\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh"
Apr 22 13:21:27.270642 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6690926-2579-440b-9233-f4d551be735b-config\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh"
Apr 22 13:21:27.270642 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnzvm\" (UniqueName: \"kubernetes.io/projected/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-kube-api-access-mnzvm\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:27.270968 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/325a1de2-a59a-4875-9ae5-6279a61a3d7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" Apr 22 13:21:27.270968 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5fv\" (UniqueName: \"kubernetes.io/projected/31773dcc-6b07-4788-8a81-d7978b0c63fc-kube-api-access-8w5fv\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:27.270968 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfgc2\" (UniqueName: \"kubernetes.io/projected/314bb891-5872-4d07-b293-eb6ba8a1c926-kube-api-access-tfgc2\") pod \"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" Apr 22 13:21:27.270968 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.270968 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: 
\"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:27.270968 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" Apr 22 13:21:27.270968 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-service-ca-bundle\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.270984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-installation-pull-secrets\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmzt7\" (UniqueName: \"kubernetes.io/projected/325a1de2-a59a-4875-9ae5-6279a61a3d7c-kube-api-access-qmzt7\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 
13:21:27.271041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-trusted-ca\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.271114 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-tmp\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.271156 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.771142076 +0000 UTC m=+33.549828836 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : secret "router-metrics-certs-default" not found Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314bb891-5872-4d07-b293-eb6ba8a1c926-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4vss\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-kube-api-access-z4vss\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.271257 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-telemetry-config\") pod 
\"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" Apr 22 13:21:27.271732 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxws\" (UniqueName: \"kubernetes.io/projected/8c86436f-31ed-4303-a368-025b9fb5a7ed-kube-api-access-tpxws\") pod \"volume-data-source-validator-7c6cbb6c87-plfpc\" (UID: \"8c86436f-31ed-4303-a368-025b9fb5a7ed\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc" Apr 22 13:21:27.271732 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-tmp\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.271732 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.271519 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.771505636 +0000 UTC m=+33.550192414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : configmap references non-existent config key: service-ca.crt Apr 22 13:21:27.271911 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.271804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-service-ca-bundle\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.274915 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.274887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-stats-auth\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:27.275001 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.274890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-default-certificate\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:27.277549 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.277526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-serving-cert\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.277950 
ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.277929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.282312 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.282288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5fv\" (UniqueName: \"kubernetes.io/projected/31773dcc-6b07-4788-8a81-d7978b0c63fc-kube-api-access-8w5fv\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:27.282435 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.282378 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcl5j\" (UniqueName: \"kubernetes.io/projected/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-kube-api-access-lcl5j\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k" Apr 22 13:21:27.283876 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.283747 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc"] Apr 22 13:21:27.283876 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.283777 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6"] Apr 22 13:21:27.283876 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.283792 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tkzkh"] Apr 22 13:21:27.283876 ip-10-0-136-73 kubenswrapper[2575]: 
I0422 13:21:27.283804 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844"] Apr 22 13:21:27.283876 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.283877 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-28xhg"] Apr 22 13:21:27.284147 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.283911 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qlspg"] Apr 22 13:21:27.284415 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.284394 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-28xhg" Apr 22 13:21:27.285492 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.285470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4htj\" (UniqueName: \"kubernetes.io/projected/6a9a5a71-f594-4aa2-a20c-e3b81689cb97-kube-api-access-n4htj\") pod \"insights-operator-585dfdc468-c7lmh\" (UID: \"6a9a5a71-f594-4aa2-a20c-e3b81689cb97\") " pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.286444 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.286427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 13:21:27.286613 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.286585 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6bcgp\"" Apr 22 13:21:27.286745 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.286719 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 13:21:27.286884 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.286858 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 13:21:27.298530 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.298489 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qlspg"] Apr 22 13:21:27.298657 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.298644 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.301243 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.300995 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 13:21:27.301243 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.301061 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 13:21:27.301243 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.301086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 13:21:27.301243 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.301189 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rrhbx\"" Apr 22 13:21:27.301490 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.301301 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 13:21:27.372128 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnzvm\" (UniqueName: \"kubernetes.io/projected/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-kube-api-access-mnzvm\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" Apr 22 13:21:27.372128 ip-10-0-136-73 kubenswrapper[2575]: I0422 
13:21:27.372096 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvb7s\" (UniqueName: \"kubernetes.io/projected/1e878e40-107f-4e8f-a28a-a31fd05dab63-kube-api-access-dvb7s\") pod \"managed-serviceaccount-addon-agent-7bb549c677-jfxn6\" (UID: \"1e878e40-107f-4e8f-a28a-a31fd05dab63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6" Apr 22 13:21:27.372336 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325a1de2-a59a-4875-9ae5-6279a61a3d7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" Apr 22 13:21:27.372336 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krllr\" (UniqueName: \"kubernetes.io/projected/bd517d51-a6c8-4afc-a72e-d5715a54d32b-kube-api-access-krllr\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.372336 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfgc2\" (UniqueName: \"kubernetes.io/projected/314bb891-5872-4d07-b293-eb6ba8a1c926-kube-api-access-tfgc2\") pod \"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" Apr 22 13:21:27.372336 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372232 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1e878e40-107f-4e8f-a28a-a31fd05dab63-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bb549c677-jfxn6\" (UID: \"1e878e40-107f-4e8f-a28a-a31fd05dab63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6" Apr 22 13:21:27.372336 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.372336 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85tb2\" (UniqueName: \"kubernetes.io/projected/71d8f771-bd2a-4877-b621-ee39745c59d3-kube-api-access-85tb2\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg" Apr 22 13:21:27.372336 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" Apr 22 13:21:27.372659 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.372659 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-installation-pull-secrets\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.372659 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.372499 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 13:21:27.372659 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.372517 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b8d7bc487-ggkqp: secret "image-registry-tls" not found Apr 22 13:21:27.372659 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.372578 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls podName:40c9e05d-a355-4746-b30f-56fb43b54267 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.872561407 +0000 UTC m=+33.651248164 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls") pod "image-registry-6b8d7bc487-ggkqp" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267") : secret "image-registry-tls" not found Apr 22 13:21:27.372659 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmzt7\" (UniqueName: \"kubernetes.io/projected/325a1de2-a59a-4875-9ae5-6279a61a3d7c-kube-api-access-qmzt7\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxxn\" (UniqueName: \"kubernetes.io/projected/af2a53dd-540c-49c6-b29d-228576b6c6ef-kube-api-access-6cxxn\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f0d009dd-f884-49c7-9c71-35b224f451dd-klusterlet-config\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af2a53dd-540c-49c6-b29d-228576b6c6ef-tmp-dir\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-trusted-ca\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af2a53dd-540c-49c6-b29d-228576b6c6ef-config-volume\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314bb891-5872-4d07-b293-eb6ba8a1c926-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" Apr 22 13:21:27.372993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.372984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k5kj\" (UniqueName: \"kubernetes.io/projected/f0d009dd-f884-49c7-9c71-35b224f451dd-kube-api-access-7k5kj\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4" Apr 22 13:21:27.373451 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4vss\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-kube-api-access-z4vss\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.373451 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" Apr 22 13:21:27.373451 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxws\" (UniqueName: \"kubernetes.io/projected/8c86436f-31ed-4303-a368-025b9fb5a7ed-kube-api-access-tpxws\") pod \"volume-data-source-validator-7c6cbb6c87-plfpc\" (UID: \"8c86436f-31ed-4303-a368-025b9fb5a7ed\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc" Apr 22 13:21:27.373451 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373151 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.373451 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" Apr 22 13:21:27.373699 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" Apr 22 13:21:27.373699 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325a1de2-a59a-4875-9ae5-6279a61a3d7c-config\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" Apr 22 13:21:27.373699 ip-10-0-136-73 kubenswrapper[2575]: E0422 
13:21:27.373607 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 13:21:27.373699 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314bb891-5872-4d07-b293-eb6ba8a1c926-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" Apr 22 13:21:27.373699 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.373667 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert podName:e52e91fc-36b4-42db-9861-4ef4a07e7ec7 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.873650621 +0000 UTC m=+33.652337394 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7wmrw" (UID: "e52e91fc-36b4-42db-9861-4ef4a07e7ec7") : secret "networking-console-plugin-cert" not found Apr 22 13:21:27.373959 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q74xx\" (UniqueName: \"kubernetes.io/projected/519ff98a-672a-4535-ba51-0ecaffb33bc5-kube-api-access-q74xx\") pod \"network-check-source-8894fc9bd-9jhgc\" (UID: \"519ff98a-672a-4535-ba51-0ecaffb33bc5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc" Apr 22 13:21:27.373959 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314bb891-5872-4d07-b293-eb6ba8a1c926-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" Apr 22 13:21:27.373959 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-image-registry-private-configuration\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.373959 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-hub\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.373959 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6690926-2579-440b-9233-f4d551be735b-trusted-ca\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.373959 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0d009dd-f884-49c7-9c71-35b224f451dd-tmp\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4" Apr 22 13:21:27.373959 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c9e05d-a355-4746-b30f-56fb43b54267-ca-trust-extracted\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.373959 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6690926-2579-440b-9233-f4d551be735b-serving-cert\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.373994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjxx\" (UniqueName: \"kubernetes.io/projected/e6690926-2579-440b-9233-f4d551be735b-kube-api-access-9mjxx\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.374039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-registry-certificates\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.374066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-ca\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.374104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-bound-sa-token\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.374127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325a1de2-a59a-4875-9ae5-6279a61a3d7c-config\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.374132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/bd517d51-a6c8-4afc-a72e-d5715a54d32b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.374173 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.374181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6690926-2579-440b-9233-f4d551be735b-config\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.374234 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls podName:39b46330-ba0e-4d59-adf9-d5dae3eff9a5 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.87421829 +0000 UTC m=+33.652905052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m67zs" (UID: "39b46330-ba0e-4d59-adf9-d5dae3eff9a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 13:21:27.374330 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.374286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" Apr 22 13:21:27.374877 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.374767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-registry-certificates\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.375056 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.375030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325a1de2-a59a-4875-9ae5-6279a61a3d7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" Apr 22 13:21:27.375230 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.375210 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-trusted-ca\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.375332 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.375214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-installation-pull-secrets\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.375332 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.375243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6690926-2579-440b-9233-f4d551be735b-trusted-ca\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.375544 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.375517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6690926-2579-440b-9233-f4d551be735b-config\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.375632 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.375593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c9e05d-a355-4746-b30f-56fb43b54267-ca-trust-extracted\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " 
pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.377004 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.376966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-image-registry-private-configuration\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.377089 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.377047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6690926-2579-440b-9233-f4d551be735b-serving-cert\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.377401 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.377383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314bb891-5872-4d07-b293-eb6ba8a1c926-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" Apr 22 13:21:27.382252 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.382231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnzvm\" (UniqueName: \"kubernetes.io/projected/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-kube-api-access-mnzvm\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" Apr 22 13:21:27.382796 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.382743 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qmzt7\" (UniqueName: \"kubernetes.io/projected/325a1de2-a59a-4875-9ae5-6279a61a3d7c-kube-api-access-qmzt7\") pod \"service-ca-operator-d6fc45fc5-qkdgb\" (UID: \"325a1de2-a59a-4875-9ae5-6279a61a3d7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" Apr 22 13:21:27.383061 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.383036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4vss\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-kube-api-access-z4vss\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.383623 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.383600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-bound-sa-token\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:27.384085 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.384064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfgc2\" (UniqueName: \"kubernetes.io/projected/314bb891-5872-4d07-b293-eb6ba8a1c926-kube-api-access-tfgc2\") pod \"kube-storage-version-migrator-operator-6769c5d45-9s844\" (UID: \"314bb891-5872-4d07-b293-eb6ba8a1c926\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" Apr 22 13:21:27.385241 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.385223 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxws\" (UniqueName: \"kubernetes.io/projected/8c86436f-31ed-4303-a368-025b9fb5a7ed-kube-api-access-tpxws\") pod 
\"volume-data-source-validator-7c6cbb6c87-plfpc\" (UID: \"8c86436f-31ed-4303-a368-025b9fb5a7ed\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc" Apr 22 13:21:27.385421 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.385399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjxx\" (UniqueName: \"kubernetes.io/projected/e6690926-2579-440b-9233-f4d551be735b-kube-api-access-9mjxx\") pod \"console-operator-9d4b6777b-tkzkh\" (UID: \"e6690926-2579-440b-9233-f4d551be735b\") " pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.386385 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.386369 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-c7lmh" Apr 22 13:21:27.460860 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.460784 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:27.469548 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.469519 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc" Apr 22 13:21:27.475364 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krllr\" (UniqueName: \"kubernetes.io/projected/bd517d51-a6c8-4afc-a72e-d5715a54d32b-kube-api-access-krllr\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.475455 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1e878e40-107f-4e8f-a28a-a31fd05dab63-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bb549c677-jfxn6\" (UID: \"1e878e40-107f-4e8f-a28a-a31fd05dab63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6" Apr 22 13:21:27.475455 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85tb2\" (UniqueName: \"kubernetes.io/projected/71d8f771-bd2a-4877-b621-ee39745c59d3-kube-api-access-85tb2\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg" Apr 22 13:21:27.475578 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.475638 ip-10-0-136-73 kubenswrapper[2575]: 
I0422 13:21:27.475603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:27.475690 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg" Apr 22 13:21:27.475690 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxxn\" (UniqueName: \"kubernetes.io/projected/af2a53dd-540c-49c6-b29d-228576b6c6ef-kube-api-access-6cxxn\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.475789 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f0d009dd-f884-49c7-9c71-35b224f451dd-klusterlet-config\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4" Apr 22 13:21:27.475789 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.475720 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:27.475789 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.475755 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 22 13:21:27.475926 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.475926 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.475842 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert podName:71d8f771-bd2a-4877-b621-ee39745c59d3 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.97580183 +0000 UTC m=+33.754488599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert") pod "ingress-canary-28xhg" (UID: "71d8f771-bd2a-4877-b621-ee39745c59d3") : secret "canary-serving-cert" not found Apr 22 13:21:27.475926 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.475878 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs podName:8b92597f-aa17-456d-bc65-ee5880d70a69 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:59.475868693 +0000 UTC m=+65.254555464 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs") pod "network-metrics-daemon-jgxt9" (UID: "8b92597f-aa17-456d-bc65-ee5880d70a69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 13:21:27.475926 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.475890 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 13:21:27.475926 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af2a53dd-540c-49c6-b29d-228576b6c6ef-tmp-dir\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.476184 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.475935 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls podName:af2a53dd-540c-49c6-b29d-228576b6c6ef nodeName:}" failed. No retries permitted until 2026-04-22 13:21:27.975920162 +0000 UTC m=+33.754606929 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls") pod "dns-default-qlspg" (UID: "af2a53dd-540c-49c6-b29d-228576b6c6ef") : secret "dns-default-metrics-tls" not found Apr 22 13:21:27.476184 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.475955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af2a53dd-540c-49c6-b29d-228576b6c6ef-config-volume\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:27.476184 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k5kj\" (UniqueName: \"kubernetes.io/projected/f0d009dd-f884-49c7-9c71-35b224f451dd-kube-api-access-7k5kj\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4" Apr 22 13:21:27.476184 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:21:27.476184 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q74xx\" (UniqueName: \"kubernetes.io/projected/519ff98a-672a-4535-ba51-0ecaffb33bc5-kube-api-access-q74xx\") pod \"network-check-source-8894fc9bd-9jhgc\" (UID: \"519ff98a-672a-4535-ba51-0ecaffb33bc5\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc"
Apr 22 13:21:27.476184 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-hub\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.476184 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af2a53dd-540c-49c6-b29d-228576b6c6ef-tmp-dir\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:27.476569 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0d009dd-f884-49c7-9c71-35b224f451dd-tmp\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"
Apr 22 13:21:27.476569 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-ca\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.476569 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/bd517d51-a6c8-4afc-a72e-d5715a54d32b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.476569 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvb7s\" (UniqueName: \"kubernetes.io/projected/1e878e40-107f-4e8f-a28a-a31fd05dab63-kube-api-access-dvb7s\") pod \"managed-serviceaccount-addon-agent-7bb549c677-jfxn6\" (UID: \"1e878e40-107f-4e8f-a28a-a31fd05dab63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6"
Apr 22 13:21:27.476569 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.476473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af2a53dd-540c-49c6-b29d-228576b6c6ef-config-volume\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:27.477080 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.477057 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/bd517d51-a6c8-4afc-a72e-d5715a54d32b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.478234 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.478207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1e878e40-107f-4e8f-a28a-a31fd05dab63-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bb549c677-jfxn6\" (UID: \"1e878e40-107f-4e8f-a28a-a31fd05dab63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6"
Apr 22 13:21:27.478469 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.478451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.478528 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.478501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.481388 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.481369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0d009dd-f884-49c7-9c71-35b224f451dd-tmp\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"
Apr 22 13:21:27.483229 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.483205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f0d009dd-f884-49c7-9c71-35b224f451dd-klusterlet-config\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"
Apr 22 13:21:27.485056 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.484701 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-ca\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.485317 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.485297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/bd517d51-a6c8-4afc-a72e-d5715a54d32b-hub\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.485391 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.485350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k5kj\" (UniqueName: \"kubernetes.io/projected/f0d009dd-f884-49c7-9c71-35b224f451dd-kube-api-access-7k5kj\") pod \"klusterlet-addon-workmgr-7db4c585c4-mpcl4\" (UID: \"f0d009dd-f884-49c7-9c71-35b224f451dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"
Apr 22 13:21:27.485391 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.485378 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krllr\" (UniqueName: \"kubernetes.io/projected/bd517d51-a6c8-4afc-a72e-d5715a54d32b-kube-api-access-krllr\") pod \"cluster-proxy-proxy-agent-5d69c7685b-fdvpd\" (UID: \"bd517d51-a6c8-4afc-a72e-d5715a54d32b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.485541 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.485524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q74xx\" (UniqueName: \"kubernetes.io/projected/519ff98a-672a-4535-ba51-0ecaffb33bc5-kube-api-access-q74xx\") pod \"network-check-source-8894fc9bd-9jhgc\" (UID: \"519ff98a-672a-4535-ba51-0ecaffb33bc5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc"
Apr 22 13:21:27.485699 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.485682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxxn\" (UniqueName: \"kubernetes.io/projected/af2a53dd-540c-49c6-b29d-228576b6c6ef-kube-api-access-6cxxn\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:27.485699 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.485692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvb7s\" (UniqueName: \"kubernetes.io/projected/1e878e40-107f-4e8f-a28a-a31fd05dab63-kube-api-access-dvb7s\") pod \"managed-serviceaccount-addon-agent-7bb549c677-jfxn6\" (UID: \"1e878e40-107f-4e8f-a28a-a31fd05dab63\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6"
Apr 22 13:21:27.485845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.485808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85tb2\" (UniqueName: \"kubernetes.io/projected/71d8f771-bd2a-4877-b621-ee39745c59d3-kube-api-access-85tb2\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg"
Apr 22 13:21:27.487634 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.487617 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb"
Apr 22 13:21:27.504892 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.504869 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844"
Apr 22 13:21:27.521097 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.521072 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-g8tbz"]
Apr 22 13:21:27.536705 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.536477 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.538910 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.538605 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc"
Apr 22 13:21:27.539326 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.539112 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bwhcj\""
Apr 22 13:21:27.574177 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.574067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"
Apr 22 13:21:27.577566 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.577388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt"
Apr 22 13:21:27.583006 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.582952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgxm\" (UniqueName: \"kubernetes.io/projected/767888b5-4be3-4a3e-ac92-a5c0cd2708fe-kube-api-access-xlgxm\") pod \"network-check-target-ggdrt\" (UID: \"767888b5-4be3-4a3e-ac92-a5c0cd2708fe\") " pod="openshift-network-diagnostics/network-check-target-ggdrt"
Apr 22 13:21:27.605450 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.604775 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"
Apr 22 13:21:27.624142 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.621883 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6"
Apr 22 13:21:27.679296 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.679101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc02fe1b-6157-4e28-a646-7be5ed635282-hosts-file\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.679296 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.679211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bc02fe1b-6157-4e28-a646-7be5ed635282-tmp-dir\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.679511 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.679350 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76sw8\" (UniqueName: \"kubernetes.io/projected/bc02fe1b-6157-4e28-a646-7be5ed635282-kube-api-access-76sw8\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.780687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bc02fe1b-6157-4e28-a646-7be5ed635282-tmp-dir\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.780803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.780928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.780966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76sw8\" (UniqueName: \"kubernetes.io/projected/bc02fe1b-6157-4e28-a646-7be5ed635282-kube-api-access-76sw8\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.781029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.781071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc02fe1b-6157-4e28-a646-7be5ed635282-hosts-file\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.781242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc02fe1b-6157-4e28-a646-7be5ed635282-hosts-file\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.781633 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:28.781611239 +0000 UTC m=+34.560298017 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : configmap references non-existent config key: service-ca.crt
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.781730 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.781770 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls podName:26fb6a86-2731-45ec-bf1d-5a84dbd6e4de nodeName:}" failed. No retries permitted until 2026-04-22 13:21:28.78175702 +0000 UTC m=+34.560443791 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5b69k" (UID: "26fb6a86-2731-45ec-bf1d-5a84dbd6e4de") : secret "samples-operator-tls" not found
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.782266 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 13:21:27.782509 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.782315 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:28.782301428 +0000 UTC m=+34.560988198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : secret "router-metrics-certs-default" not found
Apr 22 13:21:27.790014 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.789939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bc02fe1b-6157-4e28-a646-7be5ed635282-tmp-dir\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.790460 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.790401 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc"]
Apr 22 13:21:27.804263 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.796128 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tkzkh"]
Apr 22 13:21:27.804263 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.801222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76sw8\" (UniqueName: \"kubernetes.io/projected/bc02fe1b-6157-4e28-a646-7be5ed635282-kube-api-access-76sw8\") pod \"node-resolver-g8tbz\" (UID: \"bc02fe1b-6157-4e28-a646-7be5ed635282\") " pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.808428 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.807195 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb"]
Apr 22 13:21:27.815923 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.815902 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-c7lmh"]
Apr 22 13:21:27.817780 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.817738 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844"]
Apr 22 13:21:27.832582 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.828353 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc"]
Apr 22 13:21:27.832582 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.832366 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd"]
Apr 22 13:21:27.842912 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:27.842858 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c86436f_31ed_4303_a368_025b9fb5a7ed.slice/crio-3aa685aa3a0cc2f911f910b3f3f5941d6e0e37d89ad10b64defc8a2135626ace WatchSource:0}: Error finding container 3aa685aa3a0cc2f911f910b3f3f5941d6e0e37d89ad10b64defc8a2135626ace: Status 404 returned error can't find the container with id 3aa685aa3a0cc2f911f910b3f3f5941d6e0e37d89ad10b64defc8a2135626ace
Apr 22 13:21:27.854092 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:27.853801 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod314bb891_5872_4d07_b293_eb6ba8a1c926.slice/crio-88293d643a1568c99dce9d542eb862e939039fd0abc27dbc6d37a588fc8a38f8 WatchSource:0}: Error finding container 88293d643a1568c99dce9d542eb862e939039fd0abc27dbc6d37a588fc8a38f8: Status 404 returned error can't find the container with id 88293d643a1568c99dce9d542eb862e939039fd0abc27dbc6d37a588fc8a38f8
Apr 22 13:21:27.854291 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:27.854264 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519ff98a_672a_4535_ba51_0ecaffb33bc5.slice/crio-f35c2da63e08d48037fb1a5eb4ad980146ad5b32adfd5525bc39e7df8a1dc17a WatchSource:0}: Error finding container f35c2da63e08d48037fb1a5eb4ad980146ad5b32adfd5525bc39e7df8a1dc17a: Status 404 returned error can't find the container with id f35c2da63e08d48037fb1a5eb4ad980146ad5b32adfd5525bc39e7df8a1dc17a
Apr 22 13:21:27.854948 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:27.854925 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd517d51_a6c8_4afc_a72e_d5715a54d32b.slice/crio-0af65737d67daf2e041cddd6e8bc24912430afc434c5c234eb3e8999e7e0d810 WatchSource:0}: Error finding container 0af65737d67daf2e041cddd6e8bc24912430afc434c5c234eb3e8999e7e0d810: Status 404 returned error can't find the container with id 0af65737d67daf2e041cddd6e8bc24912430afc434c5c234eb3e8999e7e0d810
Apr 22 13:21:27.857930 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.857880 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g8tbz"
Apr 22 13:21:27.876491 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.876433 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6"]
Apr 22 13:21:27.878984 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:27.878946 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e878e40_107f_4e8f_a28a_a31fd05dab63.slice/crio-bae51f5c3735c0b2ae8cbe89eaadea8d50784a33bd6303321e38031eed844bed WatchSource:0}: Error finding container bae51f5c3735c0b2ae8cbe89eaadea8d50784a33bd6303321e38031eed844bed: Status 404 returned error can't find the container with id bae51f5c3735c0b2ae8cbe89eaadea8d50784a33bd6303321e38031eed844bed
Apr 22 13:21:27.881800 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.881779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"
Apr 22 13:21:27.881943 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.881925 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 13:21:27.881994 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.881984 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert podName:e52e91fc-36b4-42db-9861-4ef4a07e7ec7 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:28.881968681 +0000 UTC m=+34.660655437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7wmrw" (UID: "e52e91fc-36b4-42db-9861-4ef4a07e7ec7") : secret "networking-console-plugin-cert" not found
Apr 22 13:21:27.882033 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.881928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:27.882033 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.882003 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 13:21:27.882093 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.882054 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls podName:39b46330-ba0e-4d59-adf9-d5dae3eff9a5 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:28.882039045 +0000 UTC m=+34.660725802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m67zs" (UID: "39b46330-ba0e-4d59-adf9-d5dae3eff9a5") : secret "cluster-monitoring-operator-tls" not found
Apr 22 13:21:27.882134 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.882093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:27.882235 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.882221 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 13:21:27.882273 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.882239 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b8d7bc487-ggkqp: secret "image-registry-tls" not found
Apr 22 13:21:27.882273 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.882269 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls podName:40c9e05d-a355-4746-b30f-56fb43b54267 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:28.882261214 +0000 UTC m=+34.660947970 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls") pod "image-registry-6b8d7bc487-ggkqp" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267") : secret "image-registry-tls" not found
Apr 22 13:21:27.885552 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.885526 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"]
Apr 22 13:21:27.889160 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:27.889052 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d009dd_f884_49c7_9c71_35b224f451dd.slice/crio-25f86c4525d8b4d14494ef1bcfe7efb00c156155cb459cdb556fbb852a16f881 WatchSource:0}: Error finding container 25f86c4525d8b4d14494ef1bcfe7efb00c156155cb459cdb556fbb852a16f881: Status 404 returned error can't find the container with id 25f86c4525d8b4d14494ef1bcfe7efb00c156155cb459cdb556fbb852a16f881
Apr 22 13:21:27.893353 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:27.893335 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc02fe1b_6157_4e28_a646_7be5ed635282.slice/crio-e3f766245ebf33bff5a08e6ee2087b776d41a9ad6e9cff2711a1f9f38085b55b WatchSource:0}: Error finding container e3f766245ebf33bff5a08e6ee2087b776d41a9ad6e9cff2711a1f9f38085b55b: Status 404 returned error can't find the container with id e3f766245ebf33bff5a08e6ee2087b776d41a9ad6e9cff2711a1f9f38085b55b
Apr 22 13:21:27.982984 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.982949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg"
Apr 22 13:21:27.983137 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:27.983016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:27.983137 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.983109 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 13:21:27.983243 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.983182 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert podName:71d8f771-bd2a-4877-b621-ee39745c59d3 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:28.983162146 +0000 UTC m=+34.761848926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert") pod "ingress-canary-28xhg" (UID: "71d8f771-bd2a-4877-b621-ee39745c59d3") : secret "canary-serving-cert" not found
Apr 22 13:21:27.983306 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.983246 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 13:21:27.983306 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:27.983299 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls podName:af2a53dd-540c-49c6-b29d-228576b6c6ef nodeName:}" failed. No retries permitted until 2026-04-22 13:21:28.983284004 +0000 UTC m=+34.761970759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls") pod "dns-default-qlspg" (UID: "af2a53dd-540c-49c6-b29d-228576b6c6ef") : secret "dns-default-metrics-tls" not found
Apr 22 13:21:28.040412 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.040249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" event={"ID":"314bb891-5872-4d07-b293-eb6ba8a1c926","Type":"ContainerStarted","Data":"88293d643a1568c99dce9d542eb862e939039fd0abc27dbc6d37a588fc8a38f8"}
Apr 22 13:21:28.041362 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.041330 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc" event={"ID":"8c86436f-31ed-4303-a368-025b9fb5a7ed","Type":"ContainerStarted","Data":"3aa685aa3a0cc2f911f910b3f3f5941d6e0e37d89ad10b64defc8a2135626ace"}
Apr 22 13:21:28.042279 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.042256 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-c7lmh" event={"ID":"6a9a5a71-f594-4aa2-a20c-e3b81689cb97","Type":"ContainerStarted","Data":"2399ce60906179065983710ccc9654909464776fe5f6ec2cff3ebda02b106430"}
Apr 22 13:21:28.043379 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.043359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g8tbz" event={"ID":"bc02fe1b-6157-4e28-a646-7be5ed635282","Type":"ContainerStarted","Data":"e3f766245ebf33bff5a08e6ee2087b776d41a9ad6e9cff2711a1f9f38085b55b"}
Apr 22 13:21:28.044140 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.044117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" event={"ID":"325a1de2-a59a-4875-9ae5-6279a61a3d7c","Type":"ContainerStarted","Data":"c9253b86f11ca75bc7df6cd72852b021a2701b7eced8e46096f3e41c45a0857e"}
Apr 22 13:21:28.044933 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.044914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6" event={"ID":"1e878e40-107f-4e8f-a28a-a31fd05dab63","Type":"ContainerStarted","Data":"bae51f5c3735c0b2ae8cbe89eaadea8d50784a33bd6303321e38031eed844bed"}
Apr 22 13:21:28.045729 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.045712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" event={"ID":"e6690926-2579-440b-9233-f4d551be735b","Type":"ContainerStarted","Data":"5f9173c4df9c7df2f8a073ebf0483d2e9dd0da615771e508592427906e0ff94b"}
Apr 22 13:21:28.046601 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.046584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4" event={"ID":"f0d009dd-f884-49c7-9c71-35b224f451dd","Type":"ContainerStarted","Data":"25f86c4525d8b4d14494ef1bcfe7efb00c156155cb459cdb556fbb852a16f881"}
Apr 22 13:21:28.047433 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.047414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" event={"ID":"bd517d51-a6c8-4afc-a72e-d5715a54d32b","Type":"ContainerStarted","Data":"0af65737d67daf2e041cddd6e8bc24912430afc434c5c234eb3e8999e7e0d810"}
Apr 22 13:21:28.048203 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.048188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc" event={"ID":"519ff98a-672a-4535-ba51-0ecaffb33bc5","Type":"ContainerStarted","Data":"f35c2da63e08d48037fb1a5eb4ad980146ad5b32adfd5525bc39e7df8a1dc17a"}
Apr 22 13:21:28.791880 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.791797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:28.792791 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.791923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:28.792791 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.791989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:28.792791 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.792213 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 13:21:28.792791 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.792276 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:30.792257147 +0000 UTC m=+36.570943909 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : secret "router-metrics-certs-default" not found
Apr 22 13:21:28.792791 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.792698 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 13:21:28.792791 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.792755 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls podName:26fb6a86-2731-45ec-bf1d-5a84dbd6e4de nodeName:}" failed. No retries permitted until 2026-04-22 13:21:30.792739094 +0000 UTC m=+36.571425855 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5b69k" (UID: "26fb6a86-2731-45ec-bf1d-5a84dbd6e4de") : secret "samples-operator-tls" not found
Apr 22 13:21:28.793131 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.792841 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:30.792811454 +0000 UTC m=+36.571498215 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : configmap references non-existent config key: service-ca.crt Apr 22 13:21:28.852143 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.852108 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9" Apr 22 13:21:28.854691 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.852947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:28.854691 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.853401 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:28.855454 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.854977 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bxmrr\"" Apr 22 13:21:28.855454 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.855189 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 13:21:28.856893 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.856299 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 13:21:28.856893 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.856490 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8cjvt\"" Apr 22 13:21:28.880300 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.880272 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:21:28.893509 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.893479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" Apr 22 13:21:28.893620 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.893549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" Apr 22 13:21:28.893620 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.893613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:28.893777 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.893618 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 13:21:28.893777 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.893676 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert podName:e52e91fc-36b4-42db-9861-4ef4a07e7ec7 nodeName:}" failed. 
No retries permitted until 2026-04-22 13:21:30.893658197 +0000 UTC m=+36.672344964 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7wmrw" (UID: "e52e91fc-36b4-42db-9861-4ef4a07e7ec7") : secret "networking-console-plugin-cert" not found Apr 22 13:21:28.893777 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.893721 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 13:21:28.893777 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.893734 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b8d7bc487-ggkqp: secret "image-registry-tls" not found Apr 22 13:21:28.893777 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.893766 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls podName:40c9e05d-a355-4746-b30f-56fb43b54267 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:30.893755356 +0000 UTC m=+36.672442115 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls") pod "image-registry-6b8d7bc487-ggkqp" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267") : secret "image-registry-tls" not found Apr 22 13:21:28.894059 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.893836 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 13:21:28.894059 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.893869 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls podName:39b46330-ba0e-4d59-adf9-d5dae3eff9a5 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:30.89385871 +0000 UTC m=+36.672545466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m67zs" (UID: "39b46330-ba0e-4d59-adf9-d5dae3eff9a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 13:21:28.994344 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.994304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg" Apr 22 13:21:28.994528 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:28.994371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " 
pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:28.995404 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.995376 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 13:21:28.995535 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.995446 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert podName:71d8f771-bd2a-4877-b621-ee39745c59d3 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:30.995425902 +0000 UTC m=+36.774112665 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert") pod "ingress-canary-28xhg" (UID: "71d8f771-bd2a-4877-b621-ee39745c59d3") : secret "canary-serving-cert" not found Apr 22 13:21:28.995613 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.995603 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 13:21:28.995670 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:28.995645 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls podName:af2a53dd-540c-49c6-b29d-228576b6c6ef nodeName:}" failed. No retries permitted until 2026-04-22 13:21:30.995632489 +0000 UTC m=+36.774319251 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls") pod "dns-default-qlspg" (UID: "af2a53dd-540c-49c6-b29d-228576b6c6ef") : secret "dns-default-metrics-tls" not found Apr 22 13:21:29.071427 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:29.070237 2575 generic.go:358] "Generic (PLEG): container finished" podID="d5dedb94-cce8-40ef-8b20-152362aec6dc" containerID="4c1bfb88fc1c371c959ebf1c8dbe5853d9fd122226c792d5b166feda4490de13" exitCode=0 Apr 22 13:21:29.071427 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:29.070334 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82s2m" event={"ID":"d5dedb94-cce8-40ef-8b20-152362aec6dc","Type":"ContainerDied","Data":"4c1bfb88fc1c371c959ebf1c8dbe5853d9fd122226c792d5b166feda4490de13"} Apr 22 13:21:29.076745 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:29.076026 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g8tbz" event={"ID":"bc02fe1b-6157-4e28-a646-7be5ed635282","Type":"ContainerStarted","Data":"c48b37c8c914d046e646d34f4f90ea9eaae6cf28298e4655fc201ea98da10e41"} Apr 22 13:21:29.104013 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:29.103962 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g8tbz" podStartSLOduration=2.103943558 podStartE2EDuration="2.103943558s" podCreationTimestamp="2026-04-22 13:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:21:29.102077437 +0000 UTC m=+34.880764216" watchObservedRunningTime="2026-04-22 13:21:29.103943558 +0000 UTC m=+34.882630338" Apr 22 13:21:29.104513 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:29.104218 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ggdrt"] Apr 22 
13:21:29.113328 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:29.113293 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767888b5_4be3_4a3e_ac92_a5c0cd2708fe.slice/crio-2969129f46d2dffe371ebda747d3b37eea2cc4dcf179f021bbac8d97780c20f9 WatchSource:0}: Error finding container 2969129f46d2dffe371ebda747d3b37eea2cc4dcf179f021bbac8d97780c20f9: Status 404 returned error can't find the container with id 2969129f46d2dffe371ebda747d3b37eea2cc4dcf179f021bbac8d97780c20f9 Apr 22 13:21:30.123266 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.123226 2575 generic.go:358] "Generic (PLEG): container finished" podID="d5dedb94-cce8-40ef-8b20-152362aec6dc" containerID="6f389d1b9c03c943df3e4d0aa2878620dec4cd29b9725014e278f1ddc9ddd551" exitCode=0 Apr 22 13:21:30.123835 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.123319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82s2m" event={"ID":"d5dedb94-cce8-40ef-8b20-152362aec6dc","Type":"ContainerDied","Data":"6f389d1b9c03c943df3e4d0aa2878620dec4cd29b9725014e278f1ddc9ddd551"} Apr 22 13:21:30.144952 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.144894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ggdrt" event={"ID":"767888b5-4be3-4a3e-ac92-a5c0cd2708fe","Type":"ContainerStarted","Data":"2969129f46d2dffe371ebda747d3b37eea2cc4dcf179f021bbac8d97780c20f9"} Apr 22 13:21:30.516745 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.516519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:30.569972 ip-10-0-136-73 
kubenswrapper[2575]: I0422 13:21:30.569902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b8dea64-f9f6-45b5-b139-340bac72fa46-original-pull-secret\") pod \"global-pull-secret-syncer-cmksh\" (UID: \"7b8dea64-f9f6-45b5-b139-340bac72fa46\") " pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:30.689135 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.689102 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmksh" Apr 22 13:21:30.819976 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.819895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:30.820131 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.820039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k" Apr 22 13:21:30.820131 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.820114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:30.820328 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.820281 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:34.82026101 +0000 UTC m=+40.598947767 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : configmap references non-existent config key: service-ca.crt Apr 22 13:21:30.820732 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.820711 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 13:21:30.820833 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.820772 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:34.820756102 +0000 UTC m=+40.599442862 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : secret "router-metrics-certs-default" not found Apr 22 13:21:30.820890 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.820846 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 13:21:30.820890 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.820879 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls podName:26fb6a86-2731-45ec-bf1d-5a84dbd6e4de nodeName:}" failed. No retries permitted until 2026-04-22 13:21:34.820868314 +0000 UTC m=+40.599555075 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5b69k" (UID: "26fb6a86-2731-45ec-bf1d-5a84dbd6e4de") : secret "samples-operator-tls" not found Apr 22 13:21:30.921145 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.921098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" Apr 22 13:21:30.921345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.921175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" Apr 22 13:21:30.921345 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:30.921292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:21:30.921573 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.921553 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 13:21:30.921624 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.921579 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b8d7bc487-ggkqp: secret "image-registry-tls" not found Apr 22 13:21:30.921673 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.921659 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls podName:40c9e05d-a355-4746-b30f-56fb43b54267 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:34.921621872 +0000 UTC m=+40.700308633 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls") pod "image-registry-6b8d7bc487-ggkqp" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267") : secret "image-registry-tls" not found Apr 22 13:21:30.922126 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.922105 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 13:21:30.922233 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.922162 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert podName:e52e91fc-36b4-42db-9861-4ef4a07e7ec7 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:34.922147641 +0000 UTC m=+40.700834423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7wmrw" (UID: "e52e91fc-36b4-42db-9861-4ef4a07e7ec7") : secret "networking-console-plugin-cert" not found Apr 22 13:21:30.922233 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.922219 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 13:21:30.922331 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:30.922252 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls podName:39b46330-ba0e-4d59-adf9-d5dae3eff9a5 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:34.92224192 +0000 UTC m=+40.700928682 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m67zs" (UID: "39b46330-ba0e-4d59-adf9-d5dae3eff9a5") : secret "cluster-monitoring-operator-tls" not found Apr 22 13:21:31.022981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:31.022795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg" Apr 22 13:21:31.022981 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:31.022879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg" Apr 22 13:21:31.023245 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:31.023017 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 13:21:31.023245 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:31.023049 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 13:21:31.023245 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:31.023089 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert podName:71d8f771-bd2a-4877-b621-ee39745c59d3 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:35.023069522 +0000 UTC m=+40.801756284 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert") pod "ingress-canary-28xhg" (UID: "71d8f771-bd2a-4877-b621-ee39745c59d3") : secret "canary-serving-cert" not found Apr 22 13:21:31.023245 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:31.023111 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls podName:af2a53dd-540c-49c6-b29d-228576b6c6ef nodeName:}" failed. No retries permitted until 2026-04-22 13:21:35.023098438 +0000 UTC m=+40.801785195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls") pod "dns-default-qlspg" (UID: "af2a53dd-540c-49c6-b29d-228576b6c6ef") : secret "dns-default-metrics-tls" not found Apr 22 13:21:31.205599 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:31.205565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82s2m" event={"ID":"d5dedb94-cce8-40ef-8b20-152362aec6dc","Type":"ContainerStarted","Data":"b320495f5182fb5a06d6bb6c1f10168c4e51d0a7463fd9c6f23af7334d8c0531"} Apr 22 13:21:31.228478 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:31.228422 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-82s2m" podStartSLOduration=4.437652742 podStartE2EDuration="36.22840319s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:20:56.113859106 +0000 UTC m=+1.892545862" lastFinishedPulling="2026-04-22 13:21:27.904609554 +0000 UTC m=+33.683296310" observedRunningTime="2026-04-22 13:21:31.226095275 +0000 UTC m=+37.004782065" watchObservedRunningTime="2026-04-22 13:21:31.22840319 +0000 UTC m=+37.007089968" Apr 22 13:21:34.864333 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:34.864303 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k" Apr 22 13:21:34.864938 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:34.864378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" Apr 22 13:21:34.864938 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.864425 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 13:21:34.864938 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.864491 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls podName:26fb6a86-2731-45ec-bf1d-5a84dbd6e4de nodeName:}" failed. No retries permitted until 2026-04-22 13:21:42.864473402 +0000 UTC m=+48.643160166 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5b69k" (UID: "26fb6a86-2731-45ec-bf1d-5a84dbd6e4de") : secret "samples-operator-tls" not found Apr 22 13:21:34.864938 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.864537 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. 
No retries permitted until 2026-04-22 13:21:42.864517881 +0000 UTC m=+48.643204686 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : configmap references non-existent config key: service-ca.crt
Apr 22 13:21:34.864938 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:34.864428 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:34.864938 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.864564 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 13:21:34.864938 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.864625 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:42.864610431 +0000 UTC m=+48.643297190 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : secret "router-metrics-certs-default" not found
Apr 22 13:21:34.965517 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:34.965481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"
Apr 22 13:21:34.965697 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:34.965554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:34.965697 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:34.965615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:34.965697 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.965655 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 13:21:34.965868 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.965697 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 13:21:34.965868 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.965729 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert podName:e52e91fc-36b4-42db-9861-4ef4a07e7ec7 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:42.96571008 +0000 UTC m=+48.744396840 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7wmrw" (UID: "e52e91fc-36b4-42db-9861-4ef4a07e7ec7") : secret "networking-console-plugin-cert" not found
Apr 22 13:21:34.965868 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.965733 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 13:21:34.965868 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.965750 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b8d7bc487-ggkqp: secret "image-registry-tls" not found
Apr 22 13:21:34.965868 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.965752 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls podName:39b46330-ba0e-4d59-adf9-d5dae3eff9a5 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:42.965735852 +0000 UTC m=+48.744422623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m67zs" (UID: "39b46330-ba0e-4d59-adf9-d5dae3eff9a5") : secret "cluster-monitoring-operator-tls" not found
Apr 22 13:21:34.965868 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:34.965793 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls podName:40c9e05d-a355-4746-b30f-56fb43b54267 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:42.965779965 +0000 UTC m=+48.744466726 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls") pod "image-registry-6b8d7bc487-ggkqp" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267") : secret "image-registry-tls" not found
Apr 22 13:21:35.067089 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:35.067052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg"
Apr 22 13:21:35.067291 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:35.067109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:35.067291 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:35.067221 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 13:21:35.067291 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:35.067235 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 13:21:35.067291 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:35.067285 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls podName:af2a53dd-540c-49c6-b29d-228576b6c6ef nodeName:}" failed. No retries permitted until 2026-04-22 13:21:43.067267827 +0000 UTC m=+48.845954589 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls") pod "dns-default-qlspg" (UID: "af2a53dd-540c-49c6-b29d-228576b6c6ef") : secret "dns-default-metrics-tls" not found
Apr 22 13:21:35.067502 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:35.067303 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert podName:71d8f771-bd2a-4877-b621-ee39745c59d3 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:43.067294894 +0000 UTC m=+48.845981653 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert") pod "ingress-canary-28xhg" (UID: "71d8f771-bd2a-4877-b621-ee39745c59d3") : secret "canary-serving-cert" not found
Apr 22 13:21:40.433593 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:40.433554 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cmksh"]
Apr 22 13:21:40.436373 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:40.436342 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b8dea64_f9f6_45b5_b139_340bac72fa46.slice/crio-d4ee86f38d4452cf3c2e39643f015e1961e31a8fd355742c05b7c7e30592870f WatchSource:0}: Error finding container d4ee86f38d4452cf3c2e39643f015e1961e31a8fd355742c05b7c7e30592870f: Status 404 returned error can't find the container with id d4ee86f38d4452cf3c2e39643f015e1961e31a8fd355742c05b7c7e30592870f
Apr 22 13:21:41.228351 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.228313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" event={"ID":"325a1de2-a59a-4875-9ae5-6279a61a3d7c","Type":"ContainerStarted","Data":"bd04efea2d08f0ee47737957ebce5fec4a082e07a445836ae7b4970ddaa466bc"}
Apr 22 13:21:41.230930 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.230835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ggdrt" event={"ID":"767888b5-4be3-4a3e-ac92-a5c0cd2708fe","Type":"ContainerStarted","Data":"be57f3bfe2f821a9c6c91189327583a9fceb6f2549e909b0e696c933a6768859"}
Apr 22 13:21:41.231139 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.231106 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ggdrt"
Apr 22 13:21:41.233249 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.233218 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6" event={"ID":"1e878e40-107f-4e8f-a28a-a31fd05dab63","Type":"ContainerStarted","Data":"f32a67ed7fc0fb24b1350b9c4203f0327443121744ac46610d63ee8ddd4eaad5"}
Apr 22 13:21:41.235193 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.235067 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/0.log"
Apr 22 13:21:41.235193 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.235099 2575 generic.go:358] "Generic (PLEG): container finished" podID="e6690926-2579-440b-9233-f4d551be735b" containerID="cd0e378961fc71cdebd74222061988307f8808c449305cb0ecc0d397aeb2669f" exitCode=255
Apr 22 13:21:41.235193 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.235161 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" event={"ID":"e6690926-2579-440b-9233-f4d551be735b","Type":"ContainerDied","Data":"cd0e378961fc71cdebd74222061988307f8808c449305cb0ecc0d397aeb2669f"}
Apr 22 13:21:41.235435 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.235418 2575 scope.go:117] "RemoveContainer" containerID="cd0e378961fc71cdebd74222061988307f8808c449305cb0ecc0d397aeb2669f"
Apr 22 13:21:41.237282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.237259 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cmksh" event={"ID":"7b8dea64-f9f6-45b5-b139-340bac72fa46","Type":"ContainerStarted","Data":"d4ee86f38d4452cf3c2e39643f015e1961e31a8fd355742c05b7c7e30592870f"}
Apr 22 13:21:41.239399 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.239361 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4" event={"ID":"f0d009dd-f884-49c7-9c71-35b224f451dd","Type":"ContainerStarted","Data":"03168995721eb143c56fa1f515c4c9f8755537de8bedf9e0e30c68de6e1dc93c"}
Apr 22 13:21:41.239771 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.239733 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"
Apr 22 13:21:41.241539 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.241488 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4"
Apr 22 13:21:41.241972 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.241951 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" event={"ID":"bd517d51-a6c8-4afc-a72e-d5715a54d32b","Type":"ContainerStarted","Data":"d62218225568475be649827b2f65540ef7d3758cdcdc25aec25996c1eb74bb9d"}
Apr 22 13:21:41.243616 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.243582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc" event={"ID":"519ff98a-672a-4535-ba51-0ecaffb33bc5","Type":"ContainerStarted","Data":"98057eeb3067e2611896e0bdf8666ee39622ca5a9bc07cce849a9d9d74fd9754"}
Apr 22 13:21:41.246201 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.245755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" event={"ID":"314bb891-5872-4d07-b293-eb6ba8a1c926","Type":"ContainerStarted","Data":"3c27af33debf06d499f28a79969a41ceaf586e92d6c79f7f151dc55f5f234b51"}
Apr 22 13:21:41.249903 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.247867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc" event={"ID":"8c86436f-31ed-4303-a368-025b9fb5a7ed","Type":"ContainerStarted","Data":"6749160dd2207162b94791196f90411cdae09fb55986e6698f2ddd23773ebb2b"}
Apr 22 13:21:41.249903 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.249676 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" podStartSLOduration=33.842530554 podStartE2EDuration="44.249661439s" podCreationTimestamp="2026-04-22 13:20:57 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.848086231 +0000 UTC m=+33.626773001" lastFinishedPulling="2026-04-22 13:21:38.255217129 +0000 UTC m=+44.033903886" observedRunningTime="2026-04-22 13:21:41.249344135 +0000 UTC m=+47.028030911" watchObservedRunningTime="2026-04-22 13:21:41.249661439 +0000 UTC m=+47.028348219"
Apr 22 13:21:41.253013 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.252655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-c7lmh" event={"ID":"6a9a5a71-f594-4aa2-a20c-e3b81689cb97","Type":"ContainerStarted","Data":"3532e31ab625913970044f0ee7157929171b3ce770aad14fa67731bb8363e9ca"}
Apr 22 13:21:41.295849 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.295727 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" podStartSLOduration=31.861722627 podStartE2EDuration="44.295709624s" podCreationTimestamp="2026-04-22 13:20:57 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.878554089 +0000 UTC m=+33.657240857" lastFinishedPulling="2026-04-22 13:21:40.312541094 +0000 UTC m=+46.091227854" observedRunningTime="2026-04-22 13:21:41.295060721 +0000 UTC m=+47.073747512" watchObservedRunningTime="2026-04-22 13:21:41.295709624 +0000 UTC m=+47.074396403"
Apr 22 13:21:41.346619 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.345160 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ggdrt" podStartSLOduration=35.118204748 podStartE2EDuration="46.345141313s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:21:29.125945519 +0000 UTC m=+34.904632291" lastFinishedPulling="2026-04-22 13:21:40.352882091 +0000 UTC m=+46.131568856" observedRunningTime="2026-04-22 13:21:41.344001395 +0000 UTC m=+47.122688179" watchObservedRunningTime="2026-04-22 13:21:41.345141313 +0000 UTC m=+47.123828089"
Apr 22 13:21:41.346619 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.345864 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-plfpc" podStartSLOduration=33.937560834 podStartE2EDuration="44.345856909s" podCreationTimestamp="2026-04-22 13:20:57 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.846919943 +0000 UTC m=+33.625606705" lastFinishedPulling="2026-04-22 13:21:38.255216025 +0000 UTC m=+44.033902780" observedRunningTime="2026-04-22 13:21:41.31868142 +0000 UTC m=+47.097368198" watchObservedRunningTime="2026-04-22 13:21:41.345856909 +0000 UTC m=+47.124543687"
Apr 22 13:21:41.375619 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.371448 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bb549c677-jfxn6" podStartSLOduration=16.905726059 podStartE2EDuration="29.371427362s" podCreationTimestamp="2026-04-22 13:21:12 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.887789228 +0000 UTC m=+33.666475985" lastFinishedPulling="2026-04-22 13:21:40.35349053 +0000 UTC m=+46.132177288" observedRunningTime="2026-04-22 13:21:41.367670713 +0000 UTC m=+47.146357492" watchObservedRunningTime="2026-04-22 13:21:41.371427362 +0000 UTC m=+47.150114141"
Apr 22 13:21:41.393604 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.392666 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7db4c585c4-mpcl4" podStartSLOduration=16.912231463 podStartE2EDuration="29.39264583s" podCreationTimestamp="2026-04-22 13:21:12 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.890967577 +0000 UTC m=+33.669654336" lastFinishedPulling="2026-04-22 13:21:40.371381946 +0000 UTC m=+46.150068703" observedRunningTime="2026-04-22 13:21:41.391265211 +0000 UTC m=+47.169951994" watchObservedRunningTime="2026-04-22 13:21:41.39264583 +0000 UTC m=+47.171332609"
Apr 22 13:21:41.411979 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.411223 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9jhgc" podStartSLOduration=29.978016349 podStartE2EDuration="42.411205417s" podCreationTimestamp="2026-04-22 13:20:59 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.878541018 +0000 UTC m=+33.657227785" lastFinishedPulling="2026-04-22 13:21:40.311730085 +0000 UTC m=+46.090416853" observedRunningTime="2026-04-22 13:21:41.409522129 +0000 UTC m=+47.188208908" watchObservedRunningTime="2026-04-22 13:21:41.411205417 +0000 UTC m=+47.189892197"
Apr 22 13:21:41.438884 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:41.437982 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-c7lmh" podStartSLOduration=31.974230836 podStartE2EDuration="44.437962278s" podCreationTimestamp="2026-04-22 13:20:57 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.848810635 +0000 UTC m=+33.627497391" lastFinishedPulling="2026-04-22 13:21:40.31254207 +0000 UTC m=+46.091228833" observedRunningTime="2026-04-22 13:21:41.436694945 +0000 UTC m=+47.215381724" watchObservedRunningTime="2026-04-22 13:21:41.437962278 +0000 UTC m=+47.216649059"
Apr 22 13:21:42.259004 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.258974 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log"
Apr 22 13:21:42.261290 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.259437 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/0.log"
Apr 22 13:21:42.261290 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.259474 2575 generic.go:358] "Generic (PLEG): container finished" podID="e6690926-2579-440b-9233-f4d551be735b" containerID="f72156ffb54123dadfc105123771e3d55e21fa9aa90b12f246221f09a62c5e89" exitCode=255
Apr 22 13:21:42.261290 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.260356 2575 scope.go:117] "RemoveContainer" containerID="f72156ffb54123dadfc105123771e3d55e21fa9aa90b12f246221f09a62c5e89"
Apr 22 13:21:42.261290 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:42.260528 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tkzkh_openshift-console-operator(e6690926-2579-440b-9233-f4d551be735b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" podUID="e6690926-2579-440b-9233-f4d551be735b"
Apr 22 13:21:42.261290 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.260837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" event={"ID":"e6690926-2579-440b-9233-f4d551be735b","Type":"ContainerDied","Data":"f72156ffb54123dadfc105123771e3d55e21fa9aa90b12f246221f09a62c5e89"}
Apr 22 13:21:42.261290 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.260880 2575 scope.go:117] "RemoveContainer" containerID="cd0e378961fc71cdebd74222061988307f8808c449305cb0ecc0d397aeb2669f"
Apr 22 13:21:42.870642 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.870429 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm"]
Apr 22 13:21:42.883026 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.882975 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm"]
Apr 22 13:21:42.883026 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.883002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm"
Apr 22 13:21:42.885678 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.885493 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 13:21:42.886465 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.886112 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-lvgxv\""
Apr 22 13:21:42.886465 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.886352 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 13:21:42.942802 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.942751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:42.943001 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.942878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:42.943001 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:42.942945 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 13:21:42.943116 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:42.943002 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 13:21:42.943116 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.942948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:42.943116 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:42.943025 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls podName:26fb6a86-2731-45ec-bf1d-5a84dbd6e4de nodeName:}" failed. No retries permitted until 2026-04-22 13:21:58.943006136 +0000 UTC m=+64.721692918 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5b69k" (UID: "26fb6a86-2731-45ec-bf1d-5a84dbd6e4de") : secret "samples-operator-tls" not found
Apr 22 13:21:42.943116 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:42.943053 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:58.943043004 +0000 UTC m=+64.721729766 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : secret "router-metrics-certs-default" not found
Apr 22 13:21:42.943116 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:42.943100 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle podName:31773dcc-6b07-4788-8a81-d7978b0c63fc nodeName:}" failed. No retries permitted until 2026-04-22 13:21:58.943077838 +0000 UTC m=+64.721764604 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle") pod "router-default-6784d5bdf4-cp9jk" (UID: "31773dcc-6b07-4788-8a81-d7978b0c63fc") : configmap references non-existent config key: service-ca.crt
Apr 22 13:21:42.943334 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:42.943155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2znv\" (UniqueName: \"kubernetes.io/projected/a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30-kube-api-access-v2znv\") pod \"migrator-74bb7799d9-vb8xm\" (UID: \"a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm"
Apr 22 13:21:43.044155 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.044113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"
Apr 22 13:21:43.044343 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.044203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:43.044343 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.044252 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 13:21:43.044343 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.044327 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 13:21:43.044343 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.044259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:43.044343 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.044339 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b8d7bc487-ggkqp: secret "image-registry-tls" not found
Apr 22 13:21:43.044343 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.044330 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert podName:e52e91fc-36b4-42db-9861-4ef4a07e7ec7 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:59.044309459 +0000 UTC m=+64.822996234 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7wmrw" (UID: "e52e91fc-36b4-42db-9861-4ef4a07e7ec7") : secret "networking-console-plugin-cert" not found
Apr 22 13:21:43.044668 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.044327 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 13:21:43.044668 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.044391 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls podName:40c9e05d-a355-4746-b30f-56fb43b54267 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:59.044380079 +0000 UTC m=+64.823066856 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls") pod "image-registry-6b8d7bc487-ggkqp" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267") : secret "image-registry-tls" not found
Apr 22 13:21:43.044668 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.044417 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls podName:39b46330-ba0e-4d59-adf9-d5dae3eff9a5 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:59.044398147 +0000 UTC m=+64.823084912 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-m67zs" (UID: "39b46330-ba0e-4d59-adf9-d5dae3eff9a5") : secret "cluster-monitoring-operator-tls" not found
Apr 22 13:21:43.044668 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.044516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2znv\" (UniqueName: \"kubernetes.io/projected/a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30-kube-api-access-v2znv\") pod \"migrator-74bb7799d9-vb8xm\" (UID: \"a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm"
Apr 22 13:21:43.053556 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.053506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2znv\" (UniqueName: \"kubernetes.io/projected/a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30-kube-api-access-v2znv\") pod \"migrator-74bb7799d9-vb8xm\" (UID: \"a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm"
Apr 22 13:21:43.145094 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.145010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg"
Apr 22 13:21:43.145094 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.145075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:43.145641 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.145328 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 13:21:43.145641 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.145425 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls podName:af2a53dd-540c-49c6-b29d-228576b6c6ef nodeName:}" failed. No retries permitted until 2026-04-22 13:21:59.145408215 +0000 UTC m=+64.924094977 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls") pod "dns-default-qlspg" (UID: "af2a53dd-540c-49c6-b29d-228576b6c6ef") : secret "dns-default-metrics-tls" not found
Apr 22 13:21:43.145641 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.145438 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 13:21:43.145641 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.145507 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert podName:71d8f771-bd2a-4877-b621-ee39745c59d3 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:59.145489443 +0000 UTC m=+64.924176211 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert") pod "ingress-canary-28xhg" (UID: "71d8f771-bd2a-4877-b621-ee39745c59d3") : secret "canary-serving-cert" not found
Apr 22 13:21:43.199046 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.198929 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm"
Apr 22 13:21:43.264461 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.264429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log"
Apr 22 13:21:43.265001 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.264976 2575 scope.go:117] "RemoveContainer" containerID="f72156ffb54123dadfc105123771e3d55e21fa9aa90b12f246221f09a62c5e89"
Apr 22 13:21:43.265153 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:43.265132 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tkzkh_openshift-console-operator(e6690926-2579-440b-9233-f4d551be735b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" podUID="e6690926-2579-440b-9233-f4d551be735b"
Apr 22 13:21:43.467929 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.467891 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm"]
Apr 22 13:21:43.471806 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:43.471763 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda582ca34_4c37_42fd_9ba2_cb3ff6d5ca30.slice/crio-53b4c9553679daf23dc9f6292753f7c69b88e481584515cd686dc0fd6ba44a07 WatchSource:0}: Error finding container 53b4c9553679daf23dc9f6292753f7c69b88e481584515cd686dc0fd6ba44a07: Status 404 returned error can't find the container with id 53b4c9553679daf23dc9f6292753f7c69b88e481584515cd686dc0fd6ba44a07
Apr 22 13:21:43.887612 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.887573 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5wkct"]
Apr 22 13:21:43.900593 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.900557 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5wkct"]
Apr 22 13:21:43.900721 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.900707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5wkct"
Apr 22 13:21:43.903235 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.903164 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 13:21:43.903235 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.903171 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jvn9v\""
Apr 22 13:21:43.903420 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.903179 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 13:21:43.953228 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.953158 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/db237155-08f3-4006-8e49-5c56556feb45-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct"
Apr 22 13:21:43.953374 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.953273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/db237155-08f3-4006-8e49-5c56556feb45-crio-socket\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") "
pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:43.953374 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.953314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/db237155-08f3-4006-8e49-5c56556feb45-data-volume\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:43.953478 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.953437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9vs\" (UniqueName: \"kubernetes.io/projected/db237155-08f3-4006-8e49-5c56556feb45-kube-api-access-2w9vs\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:43.953554 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:43.953539 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.054873 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.054834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/db237155-08f3-4006-8e49-5c56556feb45-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.055098 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.054934 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/db237155-08f3-4006-8e49-5c56556feb45-crio-socket\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.055098 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.054960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/db237155-08f3-4006-8e49-5c56556feb45-data-volume\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.055098 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.055082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9vs\" (UniqueName: \"kubernetes.io/projected/db237155-08f3-4006-8e49-5c56556feb45-kube-api-access-2w9vs\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.055252 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.055173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/db237155-08f3-4006-8e49-5c56556feb45-crio-socket\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.055288 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.055277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 
22 13:21:44.055414 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.055394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/db237155-08f3-4006-8e49-5c56556feb45-data-volume\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.055473 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:44.055412 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 13:21:44.055529 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:44.055474 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls podName:db237155-08f3-4006-8e49-5c56556feb45 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:44.555456418 +0000 UTC m=+50.334143190 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5wkct" (UID: "db237155-08f3-4006-8e49-5c56556feb45") : secret "insights-runtime-extractor-tls" not found Apr 22 13:21:44.055529 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.055501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/db237155-08f3-4006-8e49-5c56556feb45-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.066306 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.066278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9vs\" (UniqueName: \"kubernetes.io/projected/db237155-08f3-4006-8e49-5c56556feb45-kube-api-access-2w9vs\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.107625 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.107593 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-rzt66"] Apr 22 13:21:44.118993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.118967 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g8tbz_bc02fe1b-6157-4e28-a646-7be5ed635282/dns-node-resolver/0.log" Apr 22 13:21:44.134794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.134766 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-rzt66"] Apr 22 13:21:44.134965 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.134929 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.137481 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.137344 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 13:21:44.137481 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.137349 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 13:21:44.137481 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.137429 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 13:21:44.137481 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.137463 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 13:21:44.137752 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.137530 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-q2ntv\"" Apr 22 13:21:44.257547 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.257458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2c113ea9-ff52-47d4-aa14-73f60e288cb4-signing-cabundle\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.257547 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.257507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2c113ea9-ff52-47d4-aa14-73f60e288cb4-signing-key\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " pod="openshift-service-ca/service-ca-865cb79987-rzt66" 
Apr 22 13:21:44.257760 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.257612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr5qg\" (UniqueName: \"kubernetes.io/projected/2c113ea9-ff52-47d4-aa14-73f60e288cb4-kube-api-access-nr5qg\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.269447 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.269407 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" event={"ID":"bd517d51-a6c8-4afc-a72e-d5715a54d32b","Type":"ContainerStarted","Data":"1f413cddcb81dbd978c0d42f0323173f728810e7368ed2c6977271277761b367"} Apr 22 13:21:44.274711 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.274672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm" event={"ID":"a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30","Type":"ContainerStarted","Data":"53b4c9553679daf23dc9f6292753f7c69b88e481584515cd686dc0fd6ba44a07"} Apr 22 13:21:44.358633 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.358594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2c113ea9-ff52-47d4-aa14-73f60e288cb4-signing-cabundle\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.358833 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.358658 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2c113ea9-ff52-47d4-aa14-73f60e288cb4-signing-key\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " 
pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.358833 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.358727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5qg\" (UniqueName: \"kubernetes.io/projected/2c113ea9-ff52-47d4-aa14-73f60e288cb4-kube-api-access-nr5qg\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.359351 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.359314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2c113ea9-ff52-47d4-aa14-73f60e288cb4-signing-cabundle\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.361596 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.361563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2c113ea9-ff52-47d4-aa14-73f60e288cb4-signing-key\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.366703 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.366654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5qg\" (UniqueName: \"kubernetes.io/projected/2c113ea9-ff52-47d4-aa14-73f60e288cb4-kube-api-access-nr5qg\") pod \"service-ca-865cb79987-rzt66\" (UID: \"2c113ea9-ff52-47d4-aa14-73f60e288cb4\") " pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.445706 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.445666 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-rzt66" Apr 22 13:21:44.560805 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.560729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:44.560976 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:44.560900 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 13:21:44.560976 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:44.560965 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls podName:db237155-08f3-4006-8e49-5c56556feb45 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:45.560946084 +0000 UTC m=+51.339632845 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5wkct" (UID: "db237155-08f3-4006-8e49-5c56556feb45") : secret "insights-runtime-extractor-tls" not found Apr 22 13:21:44.916633 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:44.916558 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2x5k7_4f52168a-3467-4e13-b154-1feaf9796063/node-ca/0.log" Apr 22 13:21:45.308785 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:45.308764 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-rzt66"] Apr 22 13:21:45.311547 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:45.311517 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c113ea9_ff52_47d4_aa14_73f60e288cb4.slice/crio-511865c55523d8d47e25c561da6613b480b98c3c5d8252fc5e3e16e01c223af5 WatchSource:0}: Error finding container 511865c55523d8d47e25c561da6613b480b98c3c5d8252fc5e3e16e01c223af5: Status 404 returned error can't find the container with id 511865c55523d8d47e25c561da6613b480b98c3c5d8252fc5e3e16e01c223af5 Apr 22 13:21:45.571938 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:45.571895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:45.572203 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:45.572040 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 
13:21:45.572203 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:45.572117 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls podName:db237155-08f3-4006-8e49-5c56556feb45 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:47.572098107 +0000 UTC m=+53.350784879 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5wkct" (UID: "db237155-08f3-4006-8e49-5c56556feb45") : secret "insights-runtime-extractor-tls" not found Apr 22 13:21:46.282034 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:46.282003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cmksh" event={"ID":"7b8dea64-f9f6-45b5-b139-340bac72fa46","Type":"ContainerStarted","Data":"79ba3897b2443561daf1490e97c6619d0f4aeb378867914f0d1e13e22416ef68"} Apr 22 13:21:46.284063 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:46.284020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" event={"ID":"bd517d51-a6c8-4afc-a72e-d5715a54d32b","Type":"ContainerStarted","Data":"b00b14033a62af47045fcb1bc094b77ce983792ae8a65176f0e789d52e6b8ba9"} Apr 22 13:21:46.285370 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:46.285341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm" event={"ID":"a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30","Type":"ContainerStarted","Data":"1abde2ee04a13c57a1ac143cb256dbdc4d64608720fa0b6d46cbc7c678b95040"} Apr 22 13:21:46.286732 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:46.286700 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-rzt66" 
event={"ID":"2c113ea9-ff52-47d4-aa14-73f60e288cb4","Type":"ContainerStarted","Data":"4b6a24046061bf2fb0d01cf71a91195d41ea49f82eb1570c7bfc2ff73cf1199a"} Apr 22 13:21:46.286808 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:46.286734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-rzt66" event={"ID":"2c113ea9-ff52-47d4-aa14-73f60e288cb4","Type":"ContainerStarted","Data":"511865c55523d8d47e25c561da6613b480b98c3c5d8252fc5e3e16e01c223af5"} Apr 22 13:21:46.296642 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:46.296605 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cmksh" podStartSLOduration=19.132606984 podStartE2EDuration="24.296592269s" podCreationTimestamp="2026-04-22 13:21:22 +0000 UTC" firstStartedPulling="2026-04-22 13:21:40.438238086 +0000 UTC m=+46.216924842" lastFinishedPulling="2026-04-22 13:21:45.602223371 +0000 UTC m=+51.380910127" observedRunningTime="2026-04-22 13:21:46.295877828 +0000 UTC m=+52.074564606" watchObservedRunningTime="2026-04-22 13:21:46.296592269 +0000 UTC m=+52.075279046" Apr 22 13:21:46.314273 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:46.314233 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" podStartSLOduration=18.498989856 podStartE2EDuration="34.314219904s" podCreationTimestamp="2026-04-22 13:21:12 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.878624546 +0000 UTC m=+33.657311311" lastFinishedPulling="2026-04-22 13:21:43.693854592 +0000 UTC m=+49.472541359" observedRunningTime="2026-04-22 13:21:46.313191946 +0000 UTC m=+52.091878734" watchObservedRunningTime="2026-04-22 13:21:46.314219904 +0000 UTC m=+52.092906681" Apr 22 13:21:46.328135 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:46.328097 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca/service-ca-865cb79987-rzt66" podStartSLOduration=2.328086705 podStartE2EDuration="2.328086705s" podCreationTimestamp="2026-04-22 13:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:21:46.326969428 +0000 UTC m=+52.105656386" watchObservedRunningTime="2026-04-22 13:21:46.328086705 +0000 UTC m=+52.106773482" Apr 22 13:21:47.293121 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:47.293078 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm" event={"ID":"a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30","Type":"ContainerStarted","Data":"2ab4baa6536ca908f6fc3e6d043c2d96a641dacbbd6afda8afeb27d89029d50c"} Apr 22 13:21:47.319635 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:47.319582 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vb8xm" podStartSLOduration=2.617382748 podStartE2EDuration="5.319549345s" podCreationTimestamp="2026-04-22 13:21:42 +0000 UTC" firstStartedPulling="2026-04-22 13:21:43.473282806 +0000 UTC m=+49.251969565" lastFinishedPulling="2026-04-22 13:21:46.175449402 +0000 UTC m=+51.954136162" observedRunningTime="2026-04-22 13:21:47.318910317 +0000 UTC m=+53.097597096" watchObservedRunningTime="2026-04-22 13:21:47.319549345 +0000 UTC m=+53.098236127" Apr 22 13:21:47.461495 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:47.461456 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:47.461495 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:47.461505 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" Apr 22 13:21:47.461982 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:47.461963 2575 
scope.go:117] "RemoveContainer" containerID="f72156ffb54123dadfc105123771e3d55e21fa9aa90b12f246221f09a62c5e89" Apr 22 13:21:47.462200 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:47.462179 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tkzkh_openshift-console-operator(e6690926-2579-440b-9233-f4d551be735b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" podUID="e6690926-2579-440b-9233-f4d551be735b" Apr 22 13:21:47.577242 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:47.577144 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" podUID="bd517d51-a6c8-4afc-a72e-d5715a54d32b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 13:21:47.594037 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:47.594004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:47.594635 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:47.594613 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 13:21:47.594840 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:47.594798 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls podName:db237155-08f3-4006-8e49-5c56556feb45 nodeName:}" failed. 
No retries permitted until 2026-04-22 13:21:51.594773644 +0000 UTC m=+57.373460414 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5wkct" (UID: "db237155-08f3-4006-8e49-5c56556feb45") : secret "insights-runtime-extractor-tls" not found Apr 22 13:21:51.636461 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:51.636418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct" Apr 22 13:21:51.637089 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:51.636583 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 13:21:51.637089 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:21:51.636651 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls podName:db237155-08f3-4006-8e49-5c56556feb45 nodeName:}" failed. No retries permitted until 2026-04-22 13:21:59.636635037 +0000 UTC m=+65.415321793 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls") pod "insights-runtime-extractor-5wkct" (UID: "db237155-08f3-4006-8e49-5c56556feb45") : secret "insights-runtime-extractor-tls" not found
Apr 22 13:21:54.040993 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:54.040962 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kf5cf"
Apr 22 13:21:57.576104 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:57.576050 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" podUID="bd517d51-a6c8-4afc-a72e-d5715a54d32b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 13:21:59.002046 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.002010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:59.002405 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.002099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:59.002405 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.002164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:59.002678 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.002649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31773dcc-6b07-4788-8a81-d7978b0c63fc-service-ca-bundle\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:59.008457 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.004706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31773dcc-6b07-4788-8a81-d7978b0c63fc-metrics-certs\") pod \"router-default-6784d5bdf4-cp9jk\" (UID: \"31773dcc-6b07-4788-8a81-d7978b0c63fc\") " pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:59.008457 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.004885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26fb6a86-2731-45ec-bf1d-5a84dbd6e4de-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5b69k\" (UID: \"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:59.102990 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.102940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"
Apr 22 13:21:59.103159 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.103013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:59.103159 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.103055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:59.105434 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.105392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e52e91fc-36b4-42db-9861-4ef4a07e7ec7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7wmrw\" (UID: \"e52e91fc-36b4-42db-9861-4ef4a07e7ec7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"
Apr 22 13:21:59.105557 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.105481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"image-registry-6b8d7bc487-ggkqp\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:59.105557 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.105509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b46330-ba0e-4d59-adf9-d5dae3eff9a5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-m67zs\" (UID: \"39b46330-ba0e-4d59-adf9-d5dae3eff9a5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:59.197055 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.197021 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lx8tg\""
Apr 22 13:21:59.204161 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.204136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg"
Apr 22 13:21:59.204251 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.204171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:59.204877 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.204855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:21:59.206989 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.206952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af2a53dd-540c-49c6-b29d-228576b6c6ef-metrics-tls\") pod \"dns-default-qlspg\" (UID: \"af2a53dd-540c-49c6-b29d-228576b6c6ef\") " pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:59.207076 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.207036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71d8f771-bd2a-4877-b621-ee39745c59d3-cert\") pod \"ingress-canary-28xhg\" (UID: \"71d8f771-bd2a-4877-b621-ee39745c59d3\") " pod="openshift-ingress-canary/ingress-canary-28xhg"
Apr 22 13:21:59.218149 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.218125 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bmt5j\""
Apr 22 13:21:59.226159 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.226138 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"
Apr 22 13:21:59.229836 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.229800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hfczk\""
Apr 22 13:21:59.238700 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.238677 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"
Apr 22 13:21:59.254872 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.254842 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7w6d8\""
Apr 22 13:21:59.263088 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.263061 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:21:59.321800 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.321632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-65l8g\""
Apr 22 13:21:59.332299 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.331275 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"
Apr 22 13:21:59.377791 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.377741 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6784d5bdf4-cp9jk"]
Apr 22 13:21:59.406399 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.406181 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k"]
Apr 22 13:21:59.428498 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.428468 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs"]
Apr 22 13:21:59.430178 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.430154 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6bcgp\""
Apr 22 13:21:59.435722 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.435622 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rrhbx\""
Apr 22 13:21:59.438682 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.438660 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-28xhg"
Apr 22 13:21:59.444586 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.444546 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qlspg"
Apr 22 13:21:59.459867 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.459655 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b8d7bc487-ggkqp"]
Apr 22 13:21:59.463720 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:59.463417 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40c9e05d_a355_4746_b30f_56fb43b54267.slice/crio-c82c971a08d733c1eb2268d3a3767d20fda2254548eb70195491ee5dfc8c2f53 WatchSource:0}: Error finding container c82c971a08d733c1eb2268d3a3767d20fda2254548eb70195491ee5dfc8c2f53: Status 404 returned error can't find the container with id c82c971a08d733c1eb2268d3a3767d20fda2254548eb70195491ee5dfc8c2f53
Apr 22 13:21:59.507275 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.506787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9"
Apr 22 13:21:59.509388 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.509081 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 13:21:59.518999 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.518947 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw"]
Apr 22 13:21:59.521661 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.521637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b92597f-aa17-456d-bc65-ee5880d70a69-metrics-certs\") pod \"network-metrics-daemon-jgxt9\" (UID: \"8b92597f-aa17-456d-bc65-ee5880d70a69\") " pod="openshift-multus/network-metrics-daemon-jgxt9"
Apr 22 13:21:59.523783 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:59.523716 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52e91fc_36b4_42db_9861_4ef4a07e7ec7.slice/crio-73d8beae60ba31cfc93954c43c747f66282b7e39165e8d830921e6beace37ebb WatchSource:0}: Error finding container 73d8beae60ba31cfc93954c43c747f66282b7e39165e8d830921e6beace37ebb: Status 404 returned error can't find the container with id 73d8beae60ba31cfc93954c43c747f66282b7e39165e8d830921e6beace37ebb
Apr 22 13:21:59.586942 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.586911 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-28xhg"]
Apr 22 13:21:59.590466 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:59.590434 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71d8f771_bd2a_4877_b621_ee39745c59d3.slice/crio-29da4bc08059e9bbe777a546ca505a6222147e21073bf84a5306701bb5de0532 WatchSource:0}: Error finding container 29da4bc08059e9bbe777a546ca505a6222147e21073bf84a5306701bb5de0532: Status 404 returned error can't find the container with id 29da4bc08059e9bbe777a546ca505a6222147e21073bf84a5306701bb5de0532
Apr 22 13:21:59.607611 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.607584 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qlspg"]
Apr 22 13:21:59.610932 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:59.610910 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2a53dd_540c_49c6_b29d_228576b6c6ef.slice/crio-41b8e0f1445825afb863927622ec502669826e59525e539a0140a3ddab940fed WatchSource:0}: Error finding container 41b8e0f1445825afb863927622ec502669826e59525e539a0140a3ddab940fed: Status 404 returned error can't find the container with id 41b8e0f1445825afb863927622ec502669826e59525e539a0140a3ddab940fed
Apr 22 13:21:59.708185 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.708152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct"
Apr 22 13:21:59.710322 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.710300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/db237155-08f3-4006-8e49-5c56556feb45-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5wkct\" (UID: \"db237155-08f3-4006-8e49-5c56556feb45\") " pod="openshift-insights/insights-runtime-extractor-5wkct"
Apr 22 13:21:59.775950 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.775877 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bxmrr\""
Apr 22 13:21:59.783679 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.783655 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jgxt9"
Apr 22 13:21:59.814616 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.814580 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jvn9v\""
Apr 22 13:21:59.823479 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.823454 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5wkct"
Apr 22 13:21:59.917152 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.917010 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jgxt9"]
Apr 22 13:21:59.921320 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:59.921275 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b92597f_aa17_456d_bc65_ee5880d70a69.slice/crio-224a10532a2dce73bf45ae4704ea2bd5c448b0fd7b3db744be95f5ad17bb7591 WatchSource:0}: Error finding container 224a10532a2dce73bf45ae4704ea2bd5c448b0fd7b3db744be95f5ad17bb7591: Status 404 returned error can't find the container with id 224a10532a2dce73bf45ae4704ea2bd5c448b0fd7b3db744be95f5ad17bb7591
Apr 22 13:21:59.964004 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:21:59.963972 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5wkct"]
Apr 22 13:21:59.966779 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:21:59.966741 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb237155_08f3_4006_8e49_5c56556feb45.slice/crio-9999d382fe46f7e466aac97485d6d4f9e60d6e2da1d004c1768676b8133833c1 WatchSource:0}: Error finding container 9999d382fe46f7e466aac97485d6d4f9e60d6e2da1d004c1768676b8133833c1: Status 404 returned error can't find the container with id 9999d382fe46f7e466aac97485d6d4f9e60d6e2da1d004c1768676b8133833c1
Apr 22 13:22:00.332930 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.332672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jgxt9" event={"ID":"8b92597f-aa17-456d-bc65-ee5880d70a69","Type":"ContainerStarted","Data":"224a10532a2dce73bf45ae4704ea2bd5c448b0fd7b3db744be95f5ad17bb7591"}
Apr 22 13:22:00.334640 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.334575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k" event={"ID":"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de","Type":"ContainerStarted","Data":"e78f2e39c6d246f1353712ec1d01d64b69f5e616c77162a65db24a774a84c9a2"}
Apr 22 13:22:00.338620 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.338590 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5wkct" event={"ID":"db237155-08f3-4006-8e49-5c56556feb45","Type":"ContainerStarted","Data":"509dfddfd041f5a1f32e4272ebac26e31845d06f8ffa18c68da900cd8136613e"}
Apr 22 13:22:00.338800 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.338626 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5wkct" event={"ID":"db237155-08f3-4006-8e49-5c56556feb45","Type":"ContainerStarted","Data":"9999d382fe46f7e466aac97485d6d4f9e60d6e2da1d004c1768676b8133833c1"}
Apr 22 13:22:00.343292 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.343265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-28xhg" event={"ID":"71d8f771-bd2a-4877-b621-ee39745c59d3","Type":"ContainerStarted","Data":"29da4bc08059e9bbe777a546ca505a6222147e21073bf84a5306701bb5de0532"}
Apr 22 13:22:00.344686 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.344644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" event={"ID":"39b46330-ba0e-4d59-adf9-d5dae3eff9a5","Type":"ContainerStarted","Data":"b801d24edaa8f000eee3766d609c5bd19938c73e4193fa15aa3dc883a6642f0a"}
Apr 22 13:22:00.346989 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.346958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" event={"ID":"31773dcc-6b07-4788-8a81-d7978b0c63fc","Type":"ContainerStarted","Data":"d3355e89049e30e27cf58c6665deec75ba3338b12c1db17b146faeedfaf4d1a3"}
Apr 22 13:22:00.347093 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.346998 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" event={"ID":"31773dcc-6b07-4788-8a81-d7978b0c63fc","Type":"ContainerStarted","Data":"f970b6342ae2ff9180e54ba3a0ff2a7e9e88fd5db6360c8c30f62653e591d481"}
Apr 22 13:22:00.349867 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.349726 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" event={"ID":"40c9e05d-a355-4746-b30f-56fb43b54267","Type":"ContainerStarted","Data":"20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22"}
Apr 22 13:22:00.349867 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.349756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" event={"ID":"40c9e05d-a355-4746-b30f-56fb43b54267","Type":"ContainerStarted","Data":"c82c971a08d733c1eb2268d3a3767d20fda2254548eb70195491ee5dfc8c2f53"}
Apr 22 13:22:00.351550 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.350487 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp"
Apr 22 13:22:00.352122 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.352092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qlspg" event={"ID":"af2a53dd-540c-49c6-b29d-228576b6c6ef","Type":"ContainerStarted","Data":"41b8e0f1445825afb863927622ec502669826e59525e539a0140a3ddab940fed"}
Apr 22 13:22:00.354752 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.354713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" event={"ID":"e52e91fc-36b4-42db-9861-4ef4a07e7ec7","Type":"ContainerStarted","Data":"73d8beae60ba31cfc93954c43c747f66282b7e39165e8d830921e6beace37ebb"}
Apr 22 13:22:00.389702 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.389648 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" podStartSLOduration=65.389629859 podStartE2EDuration="1m5.389629859s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:22:00.389558672 +0000 UTC m=+66.168245452" watchObservedRunningTime="2026-04-22 13:22:00.389629859 +0000 UTC m=+66.168316637"
Apr 22 13:22:00.390208 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.390167 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk" podStartSLOduration=63.390155308 podStartE2EDuration="1m3.390155308s" podCreationTimestamp="2026-04-22 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 13:22:00.370499975 +0000 UTC m=+66.149186754" watchObservedRunningTime="2026-04-22 13:22:00.390155308 +0000 UTC m=+66.168842087"
Apr 22 13:22:00.851597 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:00.851240 2575 scope.go:117] "RemoveContainer" containerID="f72156ffb54123dadfc105123771e3d55e21fa9aa90b12f246221f09a62c5e89"
Apr 22 13:22:01.205794 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:01.205758 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:22:01.208620 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:01.208598 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:22:01.358419 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:01.358379 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:22:01.359890 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:01.359867 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6784d5bdf4-cp9jk"
Apr 22 13:22:05.393347 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.391371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jgxt9" event={"ID":"8b92597f-aa17-456d-bc65-ee5880d70a69","Type":"ContainerStarted","Data":"3a5c3269262da891b6d2060373efb3417af70f5e31c15daf8693ad2d76656f2d"}
Apr 22 13:22:05.393802 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.393400 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k" event={"ID":"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de","Type":"ContainerStarted","Data":"b5ece5b845ff52468338d9bc266c770e95b707847b03e7d3757f49a5857f5d56"}
Apr 22 13:22:05.393802 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.393441 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k" event={"ID":"26fb6a86-2731-45ec-bf1d-5a84dbd6e4de","Type":"ContainerStarted","Data":"dc580e97fbcf65a39c1da389127803ebc901baac36d42e4140257fb5ae233d35"}
Apr 22 13:22:05.396169 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.396140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5wkct" event={"ID":"db237155-08f3-4006-8e49-5c56556feb45","Type":"ContainerStarted","Data":"e5de2619511ce2f752f9f8e1b206b084b73eb90ef8c9bb2bc51e21ac9171649c"}
Apr 22 13:22:05.399480 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.398803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-28xhg" event={"ID":"71d8f771-bd2a-4877-b621-ee39745c59d3","Type":"ContainerStarted","Data":"a9e9ea2b69b88944728e46169fadd7f97e878e308990256fea2aed43936368b0"}
Apr 22 13:22:05.404435 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.404405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" event={"ID":"39b46330-ba0e-4d59-adf9-d5dae3eff9a5","Type":"ContainerStarted","Data":"b69e1e1ef841837c887854942c60109fd70686405f323c2484365e06edfd9409"}
Apr 22 13:22:05.409680 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.409645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log"
Apr 22 13:22:05.409785 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.409725 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" event={"ID":"e6690926-2579-440b-9233-f4d551be735b","Type":"ContainerStarted","Data":"cf9b566e17de89a172e430621db42b07b71692143ba18f00b6fc2f0c317a9601"}
Apr 22 13:22:05.410513 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.410488 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh"
Apr 22 13:22:05.412337 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.412262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qlspg" event={"ID":"af2a53dd-540c-49c6-b29d-228576b6c6ef","Type":"ContainerStarted","Data":"39329e277e95a4cfc05a970dd01942fa6a0982b5e7c961b2035a72f648904496"}
Apr 22 13:22:05.415493 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.414021 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5b69k" podStartSLOduration=63.301530352 podStartE2EDuration="1m8.414007104s" podCreationTimestamp="2026-04-22 13:20:57 +0000 UTC" firstStartedPulling="2026-04-22 13:21:59.469734066 +0000 UTC m=+65.248420822" lastFinishedPulling="2026-04-22 13:22:04.582210818 +0000 UTC m=+70.360897574" observedRunningTime="2026-04-22 13:22:05.413779886 +0000 UTC m=+71.192466665" watchObservedRunningTime="2026-04-22 13:22:05.414007104 +0000 UTC m=+71.192693880"
Apr 22 13:22:05.415493 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.414949 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" event={"ID":"e52e91fc-36b4-42db-9861-4ef4a07e7ec7","Type":"ContainerStarted","Data":"e53b8a70c04e7186b99e322b5e1e8722d172c45c3e53f42e53307ca2f7b16abf"}
Apr 22 13:22:05.438269 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.438197 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh" podStartSLOduration=55.972549819 podStartE2EDuration="1m8.438181361s" podCreationTimestamp="2026-04-22 13:20:57 +0000 UTC" firstStartedPulling="2026-04-22 13:21:27.846911947 +0000 UTC m=+33.625598703" lastFinishedPulling="2026-04-22 13:21:40.312543485 +0000 UTC m=+46.091230245" observedRunningTime="2026-04-22 13:22:05.436324438 +0000 UTC m=+71.215011214" watchObservedRunningTime="2026-04-22 13:22:05.438181361 +0000 UTC m=+71.216868139"
Apr 22 13:22:05.461752 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.461569 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-28xhg" podStartSLOduration=33.473908477 podStartE2EDuration="38.461548082s" podCreationTimestamp="2026-04-22 13:21:27 +0000 UTC" firstStartedPulling="2026-04-22 13:21:59.594272603 +0000 UTC m=+65.372959366" lastFinishedPulling="2026-04-22 13:22:04.581912205 +0000 UTC m=+70.360598971" observedRunningTime="2026-04-22 13:22:05.459916539 +0000 UTC m=+71.238603318" watchObservedRunningTime="2026-04-22 13:22:05.461548082 +0000 UTC m=+71.240234861"
Apr 22 13:22:05.506697 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.506649 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-m67zs" podStartSLOduration=62.892283151 podStartE2EDuration="1m8.506633175s" podCreationTimestamp="2026-04-22 13:20:57 +0000 UTC" firstStartedPulling="2026-04-22 13:21:59.437075757 +0000 UTC m=+65.215762522" lastFinishedPulling="2026-04-22 13:22:05.051425782 +0000 UTC m=+70.830112546" observedRunningTime="2026-04-22 13:22:05.50457096 +0000 UTC m=+71.283257738" watchObservedRunningTime="2026-04-22 13:22:05.506633175 +0000 UTC m=+71.285319953"
Apr 22 13:22:05.555986 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.555889 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7wmrw" podStartSLOduration=56.506140002 podStartE2EDuration="1m1.555869164s" podCreationTimestamp="2026-04-22 13:21:04 +0000 UTC" firstStartedPulling="2026-04-22 13:21:59.527486618 +0000 UTC m=+65.306173377" lastFinishedPulling="2026-04-22 13:22:04.577215781 +0000 UTC m=+70.355902539" observedRunningTime="2026-04-22 13:22:05.547746788 +0000 UTC m=+71.326433590" watchObservedRunningTime="2026-04-22 13:22:05.555869164 +0000 UTC m=+71.334555943"
Apr 22 13:22:05.750491 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.750450 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd"]
Apr 22 13:22:05.770274 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.769714 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd"]
Apr 22 13:22:05.770274 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.769337 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd"
Apr 22 13:22:05.772110 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.772079 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 13:22:05.773126 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.773103 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-96f55\""
Apr 22 13:22:05.868219 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.868064 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/931be1b4-3093-4331-93ef-e0c2f780d193-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kgdnd\" (UID: \"931be1b4-3093-4331-93ef-e0c2f780d193\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd"
Apr 22 13:22:05.969019 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.968976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/931be1b4-3093-4331-93ef-e0c2f780d193-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kgdnd\" (UID: \"931be1b4-3093-4331-93ef-e0c2f780d193\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd"
Apr 22 13:22:05.972054 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:05.972017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/931be1b4-3093-4331-93ef-e0c2f780d193-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kgdnd\" (UID: \"931be1b4-3093-4331-93ef-e0c2f780d193\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd"
Apr 22 13:22:06.081019 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.080961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd"
Apr 22 13:22:06.205276 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.205237 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd"]
Apr 22 13:22:06.208408 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:22:06.208367 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931be1b4_3093_4331_93ef_e0c2f780d193.slice/crio-4125d8f0d561546c7f39afc095fa263313d56f7caf0e7ccfa11033a2f31fef99 WatchSource:0}: Error finding container 4125d8f0d561546c7f39afc095fa263313d56f7caf0e7ccfa11033a2f31fef99: Status 404 returned error can't find the container with id 4125d8f0d561546c7f39afc095fa263313d56f7caf0e7ccfa11033a2f31fef99
Apr 22 13:22:06.256299 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.256272 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-tkzkh"
Apr 22 13:22:06.420723 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.420631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd" event={"ID":"931be1b4-3093-4331-93ef-e0c2f780d193","Type":"ContainerStarted","Data":"4125d8f0d561546c7f39afc095fa263313d56f7caf0e7ccfa11033a2f31fef99"}
Apr 22 13:22:06.422511 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.422480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jgxt9" event={"ID":"8b92597f-aa17-456d-bc65-ee5880d70a69","Type":"ContainerStarted","Data":"6dbdf4db975b91a06aa6283af141da132ca34e2d0842770959d5947a9029f59e"}
Apr 22 13:22:06.424783 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.424425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qlspg" event={"ID":"af2a53dd-540c-49c6-b29d-228576b6c6ef","Type":"ContainerStarted","Data":"23ef2598b45ea2940d2a6a0bf4c1d321c9df7e4b542c980adc5bde204dca47d1"}
Apr 22 13:22:06.425001 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.424897 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qlspg"
Apr 22 13:22:06.434563 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.434545 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-kq2hp"]
Apr 22 13:22:06.443338 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.443301 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jgxt9" podStartSLOduration=66.785139975 podStartE2EDuration="1m11.443289261s" podCreationTimestamp="2026-04-22 13:20:55 +0000 UTC" firstStartedPulling="2026-04-22 13:21:59.923842806 +0000 UTC m=+65.702529577" lastFinishedPulling="2026-04-22 13:22:04.581992103 +0000 UTC m=+70.360678863" observedRunningTime="2026-04-22 13:22:06.44197056 +0000 UTC m=+72.220657338" watchObservedRunningTime="2026-04-22 13:22:06.443289261 +0000 UTC m=+72.221976038"
Apr 22 13:22:06.461170 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.461120 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-dns/dns-default-qlspg" podStartSLOduration=34.491758276 podStartE2EDuration="39.461102668s" podCreationTimestamp="2026-04-22 13:21:27 +0000 UTC" firstStartedPulling="2026-04-22 13:21:59.612511162 +0000 UTC m=+65.391197923" lastFinishedPulling="2026-04-22 13:22:04.581855552 +0000 UTC m=+70.360542315" observedRunningTime="2026-04-22 13:22:06.459737003 +0000 UTC m=+72.238423781" watchObservedRunningTime="2026-04-22 13:22:06.461102668 +0000 UTC m=+72.239789447" Apr 22 13:22:06.473408 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.473382 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kq2hp"] Apr 22 13:22:06.473522 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.473435 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kq2hp" Apr 22 13:22:06.475694 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.475674 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 13:22:06.475812 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.475717 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-vb4cb\"" Apr 22 13:22:06.475812 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.475732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 13:22:06.575255 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.575208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbjtb\" (UniqueName: \"kubernetes.io/projected/fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948-kube-api-access-xbjtb\") pod \"downloads-6bcc868b7-kq2hp\" (UID: \"fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948\") " pod="openshift-console/downloads-6bcc868b7-kq2hp" Apr 22 13:22:06.677300 ip-10-0-136-73 
kubenswrapper[2575]: I0422 13:22:06.677209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbjtb\" (UniqueName: \"kubernetes.io/projected/fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948-kube-api-access-xbjtb\") pod \"downloads-6bcc868b7-kq2hp\" (UID: \"fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948\") " pod="openshift-console/downloads-6bcc868b7-kq2hp" Apr 22 13:22:06.685877 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.685845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbjtb\" (UniqueName: \"kubernetes.io/projected/fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948-kube-api-access-xbjtb\") pod \"downloads-6bcc868b7-kq2hp\" (UID: \"fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948\") " pod="openshift-console/downloads-6bcc868b7-kq2hp" Apr 22 13:22:06.783972 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:06.783931 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kq2hp" Apr 22 13:22:07.224601 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:07.224565 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kq2hp"] Apr 22 13:22:07.227334 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:22:07.227302 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf99c44_f5fc_46e4_bf6b_f8aeeff9a948.slice/crio-bee02a83b9359a1aecaf2d92a3b9abab8fbdc0b7a003e21c85c1703f0b41919f WatchSource:0}: Error finding container bee02a83b9359a1aecaf2d92a3b9abab8fbdc0b7a003e21c85c1703f0b41919f: Status 404 returned error can't find the container with id bee02a83b9359a1aecaf2d92a3b9abab8fbdc0b7a003e21c85c1703f0b41919f Apr 22 13:22:07.431173 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:07.431093 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kq2hp" 
event={"ID":"fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948","Type":"ContainerStarted","Data":"bee02a83b9359a1aecaf2d92a3b9abab8fbdc0b7a003e21c85c1703f0b41919f"} Apr 22 13:22:07.575371 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:07.575338 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" podUID="bd517d51-a6c8-4afc-a72e-d5715a54d32b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 13:22:07.575480 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:07.575408 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" Apr 22 13:22:07.575953 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:07.575922 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"b00b14033a62af47045fcb1bc094b77ce983792ae8a65176f0e789d52e6b8ba9"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 13:22:07.576005 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:07.575989 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" podUID="bd517d51-a6c8-4afc-a72e-d5715a54d32b" containerName="service-proxy" containerID="cri-o://b00b14033a62af47045fcb1bc094b77ce983792ae8a65176f0e789d52e6b8ba9" gracePeriod=30 Apr 22 13:22:08.438396 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:08.438359 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd517d51-a6c8-4afc-a72e-d5715a54d32b" containerID="b00b14033a62af47045fcb1bc094b77ce983792ae8a65176f0e789d52e6b8ba9" exitCode=2 Apr 22 13:22:08.438800 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:08.438431 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" event={"ID":"bd517d51-a6c8-4afc-a72e-d5715a54d32b","Type":"ContainerDied","Data":"b00b14033a62af47045fcb1bc094b77ce983792ae8a65176f0e789d52e6b8ba9"} Apr 22 13:22:08.438800 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:08.438473 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d69c7685b-fdvpd" event={"ID":"bd517d51-a6c8-4afc-a72e-d5715a54d32b","Type":"ContainerStarted","Data":"1946c5e35b2b182ad8286a97c8ee650364e2b476a683160ad05a3c6897b6c398"} Apr 22 13:22:08.440484 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:08.440453 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5wkct" event={"ID":"db237155-08f3-4006-8e49-5c56556feb45","Type":"ContainerStarted","Data":"2f6abd5d3fca5240c2ca8f8265b653891cc3fcb3bc60adb96d13ecc84558c152"} Apr 22 13:22:08.479170 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:08.479114 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5wkct" podStartSLOduration=18.113621372 podStartE2EDuration="25.479098601s" podCreationTimestamp="2026-04-22 13:21:43 +0000 UTC" firstStartedPulling="2026-04-22 13:22:00.109640752 +0000 UTC m=+65.888327511" lastFinishedPulling="2026-04-22 13:22:07.475117981 +0000 UTC m=+73.253804740" observedRunningTime="2026-04-22 13:22:08.478269422 +0000 UTC m=+74.256956201" watchObservedRunningTime="2026-04-22 13:22:08.479098601 +0000 UTC m=+74.257785378" Apr 22 13:22:09.445388 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:09.445351 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd" 
event={"ID":"931be1b4-3093-4331-93ef-e0c2f780d193","Type":"ContainerStarted","Data":"66a945c5d020c6f68ac596eb134e793331a53262025e36e5a1d912395d78180a"} Apr 22 13:22:09.445756 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:09.445693 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd" Apr 22 13:22:09.450211 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:09.450190 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd" Apr 22 13:22:09.462296 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:09.462252 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kgdnd" podStartSLOduration=1.988137534 podStartE2EDuration="4.462239604s" podCreationTimestamp="2026-04-22 13:22:05 +0000 UTC" firstStartedPulling="2026-04-22 13:22:06.210737183 +0000 UTC m=+71.989423942" lastFinishedPulling="2026-04-22 13:22:08.684839253 +0000 UTC m=+74.463526012" observedRunningTime="2026-04-22 13:22:09.461167678 +0000 UTC m=+75.239854457" watchObservedRunningTime="2026-04-22 13:22:09.462239604 +0000 UTC m=+75.240926431" Apr 22 13:22:12.264303 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:12.264270 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ggdrt" Apr 22 13:22:16.218282 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.217759 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qwm58"] Apr 22 13:22:16.223719 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.223678 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.227069 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.227045 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 13:22:16.228055 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.227279 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 13:22:16.228055 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.227450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x2x6b\"" Apr 22 13:22:16.228055 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.227642 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 13:22:16.228055 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.227995 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.258876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-textfile\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.258927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-accelerators-collector-config\") pod \"node-exporter-qwm58\" (UID: 
\"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.258964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-sys\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.259040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-root\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.259076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-wtmp\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.259125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-tls\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.259172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.259225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44be4ae3-1c0b-4336-b5e5-5573a6075967-metrics-client-ca\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.259716 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.259255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q95xk\" (UniqueName: \"kubernetes.io/projected/44be4ae3-1c0b-4336-b5e5-5573a6075967-kube-api-access-q95xk\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-textfile\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-accelerators-collector-config\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 
kubenswrapper[2575]: I0422 13:22:16.360322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-sys\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-root\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-wtmp\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-tls\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 
ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44be4ae3-1c0b-4336-b5e5-5573a6075967-metrics-client-ca\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q95xk\" (UniqueName: \"kubernetes.io/projected/44be4ae3-1c0b-4336-b5e5-5573a6075967-kube-api-access-q95xk\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-textfile\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.360845 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.360786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-wtmp\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.361534 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.361209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-sys\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.361534 
ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.361265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44be4ae3-1c0b-4336-b5e5-5573a6075967-root\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.361534 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.361308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-accelerators-collector-config\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.361761 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:22:16.361719 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 13:22:16.361761 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.361745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44be4ae3-1c0b-4336-b5e5-5573a6075967-metrics-client-ca\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.361892 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:22:16.361793 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-tls podName:44be4ae3-1c0b-4336-b5e5-5573a6075967 nodeName:}" failed. No retries permitted until 2026-04-22 13:22:16.861773822 +0000 UTC m=+82.640460589 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-tls") pod "node-exporter-qwm58" (UID: "44be4ae3-1c0b-4336-b5e5-5573a6075967") : secret "node-exporter-tls" not found Apr 22 13:22:16.365463 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.365337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.372498 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.372447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q95xk\" (UniqueName: \"kubernetes.io/projected/44be4ae3-1c0b-4336-b5e5-5573a6075967-kube-api-access-q95xk\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.865563 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.865526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-tls\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:16.868289 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:16.868262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44be4ae3-1c0b-4336-b5e5-5573a6075967-node-exporter-tls\") pod \"node-exporter-qwm58\" (UID: \"44be4ae3-1c0b-4336-b5e5-5573a6075967\") " pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:17.137384 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:17.137307 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qwm58" Apr 22 13:22:17.150979 ip-10-0-136-73 kubenswrapper[2575]: W0422 13:22:17.150920 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44be4ae3_1c0b_4336_b5e5_5573a6075967.slice/crio-bec37c9f67ec28fc3230bf9918e9eaa32be97ea9b6a26d59d161a25afd2a07cf WatchSource:0}: Error finding container bec37c9f67ec28fc3230bf9918e9eaa32be97ea9b6a26d59d161a25afd2a07cf: Status 404 returned error can't find the container with id bec37c9f67ec28fc3230bf9918e9eaa32be97ea9b6a26d59d161a25afd2a07cf Apr 22 13:22:17.443769 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:17.443736 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qlspg" Apr 22 13:22:17.475533 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:17.475498 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qwm58" event={"ID":"44be4ae3-1c0b-4336-b5e5-5573a6075967","Type":"ContainerStarted","Data":"bec37c9f67ec28fc3230bf9918e9eaa32be97ea9b6a26d59d161a25afd2a07cf"} Apr 22 13:22:18.481146 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:18.481110 2575 generic.go:358] "Generic (PLEG): container finished" podID="44be4ae3-1c0b-4336-b5e5-5573a6075967" containerID="456075aea73691a9df8ded18b74d899f646d01911e60382918b479be38362c6f" exitCode=0 Apr 22 13:22:18.481146 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:18.481157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qwm58" event={"ID":"44be4ae3-1c0b-4336-b5e5-5573a6075967","Type":"ContainerDied","Data":"456075aea73691a9df8ded18b74d899f646d01911e60382918b479be38362c6f"} Apr 22 13:22:19.267862 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:19.267789 2575 patch_prober.go:28] interesting pod/image-registry-6b8d7bc487-ggkqp container/registry 
namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 13:22:19.268065 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:19.267885 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" podUID="40c9e05d-a355-4746-b30f-56fb43b54267" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 13:22:22.366091 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:22.366058 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:22:25.509539 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:25.509450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qwm58" event={"ID":"44be4ae3-1c0b-4336-b5e5-5573a6075967","Type":"ContainerStarted","Data":"f7b54d7e6b2a99613dc0355df6efbd0eb3f2d42ea7f069e5ea6a2add864fe24f"} Apr 22 13:22:25.509539 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:25.509502 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qwm58" event={"ID":"44be4ae3-1c0b-4336-b5e5-5573a6075967","Type":"ContainerStarted","Data":"162c3ca429334dd3a3f092971a817c779de1ddd7cf44178d609da89e837e126e"} Apr 22 13:22:25.511117 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:25.511083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kq2hp" event={"ID":"fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948","Type":"ContainerStarted","Data":"f4337291ad202a71e4f8deef51b5f678f0b077864cc52f23d1d94976463e02ac"} Apr 22 13:22:25.511318 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:25.511300 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/downloads-6bcc868b7-kq2hp" Apr 22 13:22:25.512844 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:25.512802 2575 patch_prober.go:28] interesting pod/downloads-6bcc868b7-kq2hp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.26:8080/\": dial tcp 10.132.0.26:8080: connect: connection refused" start-of-body= Apr 22 13:22:25.512947 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:25.512862 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-kq2hp" podUID="fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.26:8080/\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 13:22:25.529399 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:25.529352 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qwm58" podStartSLOduration=8.656746727 podStartE2EDuration="9.529337327s" podCreationTimestamp="2026-04-22 13:22:16 +0000 UTC" firstStartedPulling="2026-04-22 13:22:17.153861335 +0000 UTC m=+82.932548092" lastFinishedPulling="2026-04-22 13:22:18.026451922 +0000 UTC m=+83.805138692" observedRunningTime="2026-04-22 13:22:25.528296933 +0000 UTC m=+91.306983711" watchObservedRunningTime="2026-04-22 13:22:25.529337327 +0000 UTC m=+91.308024105" Apr 22 13:22:25.544269 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:25.544213 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-kq2hp" podStartSLOduration=1.472888579 podStartE2EDuration="19.544195711s" podCreationTimestamp="2026-04-22 13:22:06 +0000 UTC" firstStartedPulling="2026-04-22 13:22:07.229191418 +0000 UTC m=+73.007878174" lastFinishedPulling="2026-04-22 13:22:25.300498545 +0000 UTC m=+91.079185306" observedRunningTime="2026-04-22 13:22:25.543464461 +0000 UTC m=+91.322151240" 
watchObservedRunningTime="2026-04-22 13:22:25.544195711 +0000 UTC m=+91.322882490" Apr 22 13:22:26.531276 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:26.531239 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-kq2hp" Apr 22 13:22:30.825485 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:30.825447 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b8d7bc487-ggkqp"] Apr 22 13:22:47.584971 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:47.584936 2575 generic.go:358] "Generic (PLEG): container finished" podID="314bb891-5872-4d07-b293-eb6ba8a1c926" containerID="3c27af33debf06d499f28a79969a41ceaf586e92d6c79f7f151dc55f5f234b51" exitCode=0 Apr 22 13:22:47.585378 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:47.585013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" event={"ID":"314bb891-5872-4d07-b293-eb6ba8a1c926","Type":"ContainerDied","Data":"3c27af33debf06d499f28a79969a41ceaf586e92d6c79f7f151dc55f5f234b51"} Apr 22 13:22:47.585378 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:47.585367 2575 scope.go:117] "RemoveContainer" containerID="3c27af33debf06d499f28a79969a41ceaf586e92d6c79f7f151dc55f5f234b51" Apr 22 13:22:48.594624 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:48.594589 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9s844" event={"ID":"314bb891-5872-4d07-b293-eb6ba8a1c926","Type":"ContainerStarted","Data":"e5b394b8d51c3d34cb9be4fea5d876b2bb8bc52e5cc83c3a8e8ad5180b38bb73"} Apr 22 13:22:55.847838 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:55.847767 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" 
podUID="40c9e05d-a355-4746-b30f-56fb43b54267" containerName="registry" containerID="cri-o://20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22" gracePeriod=30 Apr 22 13:22:56.104309 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.104256 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:22:56.125576 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.125551 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-trusted-ca\") pod \"40c9e05d-a355-4746-b30f-56fb43b54267\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " Apr 22 13:22:56.125696 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.125591 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4vss\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-kube-api-access-z4vss\") pod \"40c9e05d-a355-4746-b30f-56fb43b54267\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " Apr 22 13:22:56.125696 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.125616 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-image-registry-private-configuration\") pod \"40c9e05d-a355-4746-b30f-56fb43b54267\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " Apr 22 13:22:56.125696 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.125632 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-bound-sa-token\") pod \"40c9e05d-a355-4746-b30f-56fb43b54267\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " Apr 22 13:22:56.125696 ip-10-0-136-73 kubenswrapper[2575]: I0422 
13:22:56.125658 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c9e05d-a355-4746-b30f-56fb43b54267-ca-trust-extracted\") pod \"40c9e05d-a355-4746-b30f-56fb43b54267\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " Apr 22 13:22:56.125696 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.125674 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-registry-certificates\") pod \"40c9e05d-a355-4746-b30f-56fb43b54267\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " Apr 22 13:22:56.125967 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.125715 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-installation-pull-secrets\") pod \"40c9e05d-a355-4746-b30f-56fb43b54267\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " Apr 22 13:22:56.125967 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.125748 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") pod \"40c9e05d-a355-4746-b30f-56fb43b54267\" (UID: \"40c9e05d-a355-4746-b30f-56fb43b54267\") " Apr 22 13:22:56.126193 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.126162 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "40c9e05d-a355-4746-b30f-56fb43b54267" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:22:56.126252 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.126169 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "40c9e05d-a355-4746-b30f-56fb43b54267" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 13:22:56.128621 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.128566 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "40c9e05d-a355-4746-b30f-56fb43b54267" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 13:22:56.129062 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.129034 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "40c9e05d-a355-4746-b30f-56fb43b54267" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 13:22:56.129158 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.129043 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-kube-api-access-z4vss" (OuterVolumeSpecName: "kube-api-access-z4vss") pod "40c9e05d-a355-4746-b30f-56fb43b54267" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267"). InnerVolumeSpecName "kube-api-access-z4vss". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:22:56.129158 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.129118 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "40c9e05d-a355-4746-b30f-56fb43b54267" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:22:56.129253 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.129151 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "40c9e05d-a355-4746-b30f-56fb43b54267" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 13:22:56.137471 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.137439 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c9e05d-a355-4746-b30f-56fb43b54267-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "40c9e05d-a355-4746-b30f-56fb43b54267" (UID: "40c9e05d-a355-4746-b30f-56fb43b54267"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 13:22:56.227078 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.227041 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4vss\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-kube-api-access-z4vss\") on node \"ip-10-0-136-73.ec2.internal\" DevicePath \"\"" Apr 22 13:22:56.227078 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.227073 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-image-registry-private-configuration\") on node \"ip-10-0-136-73.ec2.internal\" DevicePath \"\"" Apr 22 13:22:56.227078 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.227084 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-bound-sa-token\") on node \"ip-10-0-136-73.ec2.internal\" DevicePath \"\"" Apr 22 13:22:56.227318 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.227094 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c9e05d-a355-4746-b30f-56fb43b54267-ca-trust-extracted\") on node \"ip-10-0-136-73.ec2.internal\" DevicePath \"\"" Apr 22 13:22:56.227318 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.227103 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-registry-certificates\") on node \"ip-10-0-136-73.ec2.internal\" DevicePath \"\"" Apr 22 13:22:56.227318 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.227112 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c9e05d-a355-4746-b30f-56fb43b54267-installation-pull-secrets\") on node \"ip-10-0-136-73.ec2.internal\" 
DevicePath \"\"" Apr 22 13:22:56.227318 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.227121 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c9e05d-a355-4746-b30f-56fb43b54267-registry-tls\") on node \"ip-10-0-136-73.ec2.internal\" DevicePath \"\"" Apr 22 13:22:56.227318 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.227129 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c9e05d-a355-4746-b30f-56fb43b54267-trusted-ca\") on node \"ip-10-0-136-73.ec2.internal\" DevicePath \"\"" Apr 22 13:22:56.620621 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.620587 2575 generic.go:358] "Generic (PLEG): container finished" podID="40c9e05d-a355-4746-b30f-56fb43b54267" containerID="20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22" exitCode=0 Apr 22 13:22:56.620621 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.620625 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" event={"ID":"40c9e05d-a355-4746-b30f-56fb43b54267","Type":"ContainerDied","Data":"20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22"} Apr 22 13:22:56.620896 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.620647 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" event={"ID":"40c9e05d-a355-4746-b30f-56fb43b54267","Type":"ContainerDied","Data":"c82c971a08d733c1eb2268d3a3767d20fda2254548eb70195491ee5dfc8c2f53"} Apr 22 13:22:56.620896 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.620648 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b8d7bc487-ggkqp" Apr 22 13:22:56.620896 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.620668 2575 scope.go:117] "RemoveContainer" containerID="20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22" Apr 22 13:22:56.635589 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.635574 2575 scope.go:117] "RemoveContainer" containerID="20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22" Apr 22 13:22:56.635898 ip-10-0-136-73 kubenswrapper[2575]: E0422 13:22:56.635872 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22\": container with ID starting with 20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22 not found: ID does not exist" containerID="20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22" Apr 22 13:22:56.635952 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.635911 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22"} err="failed to get container status \"20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22\": rpc error: code = NotFound desc = could not find container \"20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22\": container with ID starting with 20019a9506f7830427f0d0b3920d16b9d4e9b6bc05ecc6a486a5b03a6953ac22 not found: ID does not exist" Apr 22 13:22:56.643984 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.643957 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b8d7bc487-ggkqp"] Apr 22 13:22:56.647130 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.647107 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6b8d7bc487-ggkqp"] Apr 22 13:22:56.855329 
ip-10-0-136-73 kubenswrapper[2575]: I0422 13:22:56.855295 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c9e05d-a355-4746-b30f-56fb43b54267" path="/var/lib/kubelet/pods/40c9e05d-a355-4746-b30f-56fb43b54267/volumes" Apr 22 13:23:11.670563 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:23:11.670528 2575 generic.go:358] "Generic (PLEG): container finished" podID="6a9a5a71-f594-4aa2-a20c-e3b81689cb97" containerID="3532e31ab625913970044f0ee7157929171b3ce770aad14fa67731bb8363e9ca" exitCode=0 Apr 22 13:23:11.671061 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:23:11.670601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-c7lmh" event={"ID":"6a9a5a71-f594-4aa2-a20c-e3b81689cb97","Type":"ContainerDied","Data":"3532e31ab625913970044f0ee7157929171b3ce770aad14fa67731bb8363e9ca"} Apr 22 13:23:11.671061 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:23:11.670953 2575 scope.go:117] "RemoveContainer" containerID="3532e31ab625913970044f0ee7157929171b3ce770aad14fa67731bb8363e9ca" Apr 22 13:23:11.672007 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:23:11.671988 2575 generic.go:358] "Generic (PLEG): container finished" podID="325a1de2-a59a-4875-9ae5-6279a61a3d7c" containerID="bd04efea2d08f0ee47737957ebce5fec4a082e07a445836ae7b4970ddaa466bc" exitCode=0 Apr 22 13:23:11.672113 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:23:11.672088 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" event={"ID":"325a1de2-a59a-4875-9ae5-6279a61a3d7c","Type":"ContainerDied","Data":"bd04efea2d08f0ee47737957ebce5fec4a082e07a445836ae7b4970ddaa466bc"} Apr 22 13:23:11.672387 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:23:11.672374 2575 scope.go:117] "RemoveContainer" containerID="bd04efea2d08f0ee47737957ebce5fec4a082e07a445836ae7b4970ddaa466bc" Apr 22 13:23:12.677587 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:23:12.677554 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-c7lmh" event={"ID":"6a9a5a71-f594-4aa2-a20c-e3b81689cb97","Type":"ContainerStarted","Data":"2404a7d1e5ce64669f5260e77beef82f72cbb9fc94a47f32e2c29fd9e476e4b3"} Apr 22 13:23:12.679176 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:23:12.679156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qkdgb" event={"ID":"325a1de2-a59a-4875-9ae5-6279a61a3d7c","Type":"ContainerStarted","Data":"44b8c4d12675fdc9d608f237f58363242981d2f6bea714974e4a5233d4eabaf1"} Apr 22 13:25:54.738808 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:25:54.738769 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:25:54.740074 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:25:54.740049 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:25:54.744943 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:25:54.744921 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:25:54.746253 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:25:54.746230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:25:54.749481 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:25:54.749464 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 13:30:54.759835 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:30:54.759789 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:30:54.762473 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:30:54.762452 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:30:54.765706 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:30:54.765683 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:30:54.768404 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:30:54.768386 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:35:54.780047 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:35:54.779966 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:35:54.792558 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:35:54.792463 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:35:54.794395 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:35:54.794374 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:35:54.799147 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:35:54.799128 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:40:54.810379 ip-10-0-136-73 kubenswrapper[2575]: I0422 
13:40:54.810347 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:40:54.814684 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:40:54.814662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:40:54.816004 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:40:54.815976 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:40:54.820430 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:40:54.820413 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:45:54.830445 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:45:54.830416 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:45:54.836263 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:45:54.836235 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:45:54.836398 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:45:54.836348 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:45:54.841909 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:45:54.841891 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 
22 13:50:54.852255 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:50:54.852140 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:50:54.858680 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:50:54.858659 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:50:54.858803 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:50:54.858735 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:50:54.864142 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:50:54.864122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:55:54.873043 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:55:54.872942 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:55:54.878496 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:55:54.878469 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 13:55:54.879552 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:55:54.879531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 13:55:54.888839 ip-10-0-136-73 kubenswrapper[2575]: I0422 13:55:54.885748 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 14:00:54.892571 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:00:54.892469 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 14:00:54.900963 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:00:54.900943 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 14:00:54.908035 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:00:54.908016 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 14:00:54.913595 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:00:54.913576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 14:05:54.915363 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:05:54.915258 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 14:05:54.921319 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:05:54.921293 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 14:05:54.929304 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:05:54.929283 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 14:05:54.935379 ip-10-0-136-73 kubenswrapper[2575]: I0422 
14:05:54.935348 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log" Apr 22 14:09:23.217423 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:23.217396 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cmksh_7b8dea64-f9f6-45b5-b139-340bac72fa46/global-pull-secret-syncer/0.log" Apr 22 14:09:23.303303 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:23.303263 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-sclfb_02a0d9d1-db8d-479c-9fe1-ca3a2cfd049f/konnectivity-agent/0.log" Apr 22 14:09:23.385449 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:23.385416 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-73.ec2.internal_5e1839749775b093843d832935f4dc52/haproxy/0.log" Apr 22 14:09:26.487505 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:26.487476 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-m67zs_39b46330-ba0e-4d59-adf9-d5dae3eff9a5/cluster-monitoring-operator/0.log" Apr 22 14:09:26.773055 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:26.772977 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qwm58_44be4ae3-1c0b-4336-b5e5-5573a6075967/node-exporter/0.log" Apr 22 14:09:26.795032 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:26.795002 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qwm58_44be4ae3-1c0b-4336-b5e5-5573a6075967/kube-rbac-proxy/0.log" Apr 22 14:09:26.815917 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:26.815898 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qwm58_44be4ae3-1c0b-4336-b5e5-5573a6075967/init-textfile/0.log" Apr 22 14:09:27.113275 ip-10-0-136-73 
kubenswrapper[2575]: I0422 14:09:27.113187 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-kgdnd_931be1b4-3093-4331-93ef-e0c2f780d193/prometheus-operator-admission-webhook/0.log" Apr 22 14:09:28.509182 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:28.509154 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-7wmrw_e52e91fc-36b4-42db-9861-4ef4a07e7ec7/networking-console-plugin/0.log" Apr 22 14:09:28.927301 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:28.927195 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/1.log" Apr 22 14:09:28.935229 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:28.935198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tkzkh_e6690926-2579-440b-9233-f4d551be735b/console-operator/2.log" Apr 22 14:09:29.334911 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:29.334878 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-kq2hp_fdf99c44-f5fc-46e4-bf6b-f8aeeff9a948/download-server/0.log" Apr 22 14:09:29.725141 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:29.725104 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-plfpc_8c86436f-31ed-4303-a368-025b9fb5a7ed/volume-data-source-validator/0.log" Apr 22 14:09:30.346192 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.346160 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"] Apr 22 14:09:30.346480 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.346468 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="40c9e05d-a355-4746-b30f-56fb43b54267" containerName="registry"
Apr 22 14:09:30.346523 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.346481 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c9e05d-a355-4746-b30f-56fb43b54267" containerName="registry"
Apr 22 14:09:30.346555 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.346533 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="40c9e05d-a355-4746-b30f-56fb43b54267" containerName="registry"
Apr 22 14:09:30.349406 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.349384 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.351525 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.351503 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98vmz\"/\"openshift-service-ca.crt\""
Apr 22 14:09:30.351625 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.351508 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98vmz\"/\"kube-root-ca.crt\""
Apr 22 14:09:30.351625 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.351536 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-98vmz\"/\"default-dockercfg-254cb\""
Apr 22 14:09:30.356053 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.356030 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"]
Apr 22 14:09:30.412806 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.412771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkxt8\" (UniqueName: \"kubernetes.io/projected/415ed9a2-f979-4dda-b326-73abb3a29072-kube-api-access-gkxt8\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.413048 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.412838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-podres\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.413048 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.412868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-proc\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.413048 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.412937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-sys\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.413048 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.412956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-lib-modules\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.433248 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.433221 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qlspg_af2a53dd-540c-49c6-b29d-228576b6c6ef/dns/0.log"
Apr 22 14:09:30.456076 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.456052 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qlspg_af2a53dd-540c-49c6-b29d-228576b6c6ef/kube-rbac-proxy/0.log"
Apr 22 14:09:30.496875 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.496848 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g8tbz_bc02fe1b-6157-4e28-a646-7be5ed635282/dns-node-resolver/0.log"
Apr 22 14:09:30.513747 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.513718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-sys\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.513919 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.513750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-lib-modules\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.513919 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.513779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkxt8\" (UniqueName: \"kubernetes.io/projected/415ed9a2-f979-4dda-b326-73abb3a29072-kube-api-access-gkxt8\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.513919 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.513829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-podres\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.513919 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.513854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-proc\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.513919 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.513860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-sys\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.514096 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.513938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-lib-modules\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.514096 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.513949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-proc\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.514096 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.514017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/415ed9a2-f979-4dda-b326-73abb3a29072-podres\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.521201 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.521175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkxt8\" (UniqueName: \"kubernetes.io/projected/415ed9a2-f979-4dda-b326-73abb3a29072-kube-api-access-gkxt8\") pod \"perf-node-gather-daemonset-cxgz7\" (UID: \"415ed9a2-f979-4dda-b326-73abb3a29072\") " pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.660530 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.660433 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:30.779215 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.779181 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"]
Apr 22 14:09:30.782033 ip-10-0-136-73 kubenswrapper[2575]: W0422 14:09:30.782003 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod415ed9a2_f979_4dda_b326_73abb3a29072.slice/crio-3db05010b18f97285fc578d921d309cf0236b7523244fb8dec8c473c6a633859 WatchSource:0}: Error finding container 3db05010b18f97285fc578d921d309cf0236b7523244fb8dec8c473c6a633859: Status 404 returned error can't find the container with id 3db05010b18f97285fc578d921d309cf0236b7523244fb8dec8c473c6a633859
Apr 22 14:09:30.783691 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.783673 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:09:30.933159 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:30.933080 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2x5k7_4f52168a-3467-4e13-b154-1feaf9796063/node-ca/0.log"
Apr 22 14:09:31.623637 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:31.623606 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6784d5bdf4-cp9jk_31773dcc-6b07-4788-8a81-d7978b0c63fc/router/0.log"
Apr 22 14:09:31.630654 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:31.630618 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7" event={"ID":"415ed9a2-f979-4dda-b326-73abb3a29072","Type":"ContainerStarted","Data":"ce7c9eb2f6ab7a211bb7f6436f0f5aa48b1a59d9de0f2dc2021e3211e3686989"}
Apr 22 14:09:31.630654 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:31.630651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7" event={"ID":"415ed9a2-f979-4dda-b326-73abb3a29072","Type":"ContainerStarted","Data":"3db05010b18f97285fc578d921d309cf0236b7523244fb8dec8c473c6a633859"}
Apr 22 14:09:31.630923 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:31.630742 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:31.644499 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:31.644447 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7" podStartSLOduration=1.644433712 podStartE2EDuration="1.644433712s" podCreationTimestamp="2026-04-22 14:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:09:31.643971629 +0000 UTC m=+2917.422658405" watchObservedRunningTime="2026-04-22 14:09:31.644433712 +0000 UTC m=+2917.423120545"
Apr 22 14:09:31.917114 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:31.917031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-28xhg_71d8f771-bd2a-4877-b621-ee39745c59d3/serve-healthcheck-canary/0.log"
Apr 22 14:09:32.277244 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:32.277211 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-c7lmh_6a9a5a71-f594-4aa2-a20c-e3b81689cb97/insights-operator/0.log"
Apr 22 14:09:32.278637 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:32.278610 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-c7lmh_6a9a5a71-f594-4aa2-a20c-e3b81689cb97/insights-operator/1.log"
Apr 22 14:09:32.297162 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:32.297132 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5wkct_db237155-08f3-4006-8e49-5c56556feb45/kube-rbac-proxy/0.log"
Apr 22 14:09:32.316298 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:32.316268 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5wkct_db237155-08f3-4006-8e49-5c56556feb45/exporter/0.log"
Apr 22 14:09:32.336420 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:32.336394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5wkct_db237155-08f3-4006-8e49-5c56556feb45/extractor/0.log"
Apr 22 14:09:36.872391 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:36.872362 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vb8xm_a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30/migrator/0.log"
Apr 22 14:09:36.909698 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:36.909665 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vb8xm_a582ca34-4c37-42fd-9ba2-cb3ff6d5ca30/graceful-termination/0.log"
Apr 22 14:09:37.187527 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:37.187496 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9s844_314bb891-5872-4d07-b293-eb6ba8a1c926/kube-storage-version-migrator-operator/1.log"
Apr 22 14:09:37.188749 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:37.188729 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9s844_314bb891-5872-4d07-b293-eb6ba8a1c926/kube-storage-version-migrator-operator/0.log"
Apr 22 14:09:37.642534 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:37.642466 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-98vmz/perf-node-gather-daemonset-cxgz7"
Apr 22 14:09:37.978953 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:37.978923 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82s2m_d5dedb94-cce8-40ef-8b20-152362aec6dc/kube-multus-additional-cni-plugins/0.log"
Apr 22 14:09:37.997786 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:37.997758 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82s2m_d5dedb94-cce8-40ef-8b20-152362aec6dc/egress-router-binary-copy/0.log"
Apr 22 14:09:38.017617 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:38.017592 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82s2m_d5dedb94-cce8-40ef-8b20-152362aec6dc/cni-plugins/0.log"
Apr 22 14:09:38.038809 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:38.038786 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82s2m_d5dedb94-cce8-40ef-8b20-152362aec6dc/bond-cni-plugin/0.log"
Apr 22 14:09:38.057256 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:38.057232 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82s2m_d5dedb94-cce8-40ef-8b20-152362aec6dc/routeoverride-cni/0.log"
Apr 22 14:09:38.075601 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:38.075579 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82s2m_d5dedb94-cce8-40ef-8b20-152362aec6dc/whereabouts-cni-bincopy/0.log"
Apr 22 14:09:38.094291 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:38.094269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82s2m_d5dedb94-cce8-40ef-8b20-152362aec6dc/whereabouts-cni/0.log"
Apr 22 14:09:38.404706 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:38.404624 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g5ctl_544999a6-323d-481d-b6ad-d24f5da3e82f/kube-multus/0.log"
Apr 22 14:09:38.516202 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:38.516174 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jgxt9_8b92597f-aa17-456d-bc65-ee5880d70a69/network-metrics-daemon/0.log"
Apr 22 14:09:38.534308 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:38.534281 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jgxt9_8b92597f-aa17-456d-bc65-ee5880d70a69/kube-rbac-proxy/0.log"
Apr 22 14:09:39.364317 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.364285 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-controller/0.log"
Apr 22 14:09:39.383765 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.383736 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/0.log"
Apr 22 14:09:39.409765 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.409735 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovn-acl-logging/1.log"
Apr 22 14:09:39.443835 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.443796 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/kube-rbac-proxy-node/0.log"
Apr 22 14:09:39.467455 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.467431 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 14:09:39.485641 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.485619 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/northd/0.log"
Apr 22 14:09:39.504960 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.504900 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/nbdb/0.log"
Apr 22 14:09:39.524005 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.523975 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/sbdb/0.log"
Apr 22 14:09:39.682122 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:39.682093 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kf5cf_a28f28cc-e746-49c4-bf70-a476e379f760/ovnkube-controller/0.log"
Apr 22 14:09:41.255472 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:41.255427 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-9jhgc_519ff98a-672a-4535-ba51-0ecaffb33bc5/check-endpoints/0.log"
Apr 22 14:09:41.277371 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:41.277345 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ggdrt_767888b5-4be3-4a3e-ac92-a5c0cd2708fe/network-check-target-container/0.log"
Apr 22 14:09:42.185442 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:42.185412 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lld46_9e919041-6f50-4989-b55f-057c690de2ab/iptables-alerter/0.log"
Apr 22 14:09:42.716269 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:42.716236 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-l8mkn_55d1184b-6b1c-43fd-9fdf-a5cbe05174b6/tuned/0.log"
Apr 22 14:09:44.368505 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:44.368478 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-5b69k_26fb6a86-2731-45ec-bf1d-5a84dbd6e4de/cluster-samples-operator/0.log"
Apr 22 14:09:44.395722 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:44.395691 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-5b69k_26fb6a86-2731-45ec-bf1d-5a84dbd6e4de/cluster-samples-operator-watch/0.log"
Apr 22 14:09:45.377788 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:45.377751 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-qkdgb_325a1de2-a59a-4875-9ae5-6279a61a3d7c/service-ca-operator/1.log"
Apr 22 14:09:45.379245 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:45.379221 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-qkdgb_325a1de2-a59a-4875-9ae5-6279a61a3d7c/service-ca-operator/0.log"
Apr 22 14:09:45.708731 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:45.708704 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-rzt66_2c113ea9-ff52-47d4-aa14-73f60e288cb4/service-ca-controller/0.log"
Apr 22 14:09:46.155790 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:46.155719 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7sk7b_15e591d3-68e8-48e6-854d-1b459e3bf1c1/csi-driver/0.log"
Apr 22 14:09:46.188973 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:46.188942 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7sk7b_15e591d3-68e8-48e6-854d-1b459e3bf1c1/csi-node-driver-registrar/0.log"
Apr 22 14:09:46.219447 ip-10-0-136-73 kubenswrapper[2575]: I0422 14:09:46.219417 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7sk7b_15e591d3-68e8-48e6-854d-1b459e3bf1c1/csi-liveness-probe/0.log"