Apr 17 16:28:53.353818 ip-10-0-132-44 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:28:53.353829 ip-10-0-132-44 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:28:53.353837 ip-10-0-132-44 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:28:53.354154 ip-10-0-132-44 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:29:03.444872 ip-10-0-132-44 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:29:03.444886 ip-10-0-132-44 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0f290741e72742198c4359f0920c8a5b --
Apr 17 16:31:14.502490 ip-10-0-132-44 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:14.947373 ip-10-0-132-44 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:14.947373 ip-10-0-132-44 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:14.947373 ip-10-0-132-44 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:14.947373 ip-10-0-132-44 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:14.947373 ip-10-0-132-44 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:14.949911 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.949832 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:14.953703 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953683 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:14.953703 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953700 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:14.953703 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953704 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:14.953703 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953707 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953711 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953715 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953718 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953720 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953723 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953726 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953729 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953732 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953734 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953737 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953740 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953743 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953746 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953748 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953751 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953753 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953756 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953759 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953761 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:14.953868 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953764 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953767 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953769 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953772 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953774 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953777 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953780 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953782 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953785 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953788 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953791 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953793 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953796 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953799 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953802 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953805 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953808 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953811 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953814 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953816 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:14.954356 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953819 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953821 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953824 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953826 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953829 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953831 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953834 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953839 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953843 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953846 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953849 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953852 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953856 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953860 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953864 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953867 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953870 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953873 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953875 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:14.954917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953879 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953881 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953884 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953887 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953890 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953893 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953896 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953898 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953901 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953904 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953907 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953909 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953912 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953914 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953916 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953919 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953923 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953926 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953929 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:14.955388 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953932 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953934 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953938 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953942 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.953945 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954342 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954347 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954350 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954353 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954356 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954359 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954361 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954365 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954368 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954370 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954373 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954376 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954379 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954382 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954385 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:14.955837 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954388 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954390 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954393 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954396 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954398 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954401 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954404 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954406 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954409 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954412 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954415 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954417 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954420 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954422 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954425 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954427 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954430 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954433 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954435 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954438 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:14.956328 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954440 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954443 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954445 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954448 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954450 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954453 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954455 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954458 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954461 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954464 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954466 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954469 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954472 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954475 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954478 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954480 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954483 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954485 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954487 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954490 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:14.956825 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954492 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954495 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954500 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954504 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954506 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954509 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954512 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954515 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954517 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954520 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954522 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954525 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954527 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954530 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954532 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954535 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954538 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954541 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954543 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954546 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:14.957330 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954548 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954551 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954554 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954557 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954559 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954563 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954567 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954569 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954572 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954575 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.954577 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.955996 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956006 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956014 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956018 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956023 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956027 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956031 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956036 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956039 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956042 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:14.957861 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956046 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956049 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956052 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956056 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956059 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956062 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956065 2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956067 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956070 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956074 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956078 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956081 2572 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956084 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956088 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956092 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956095 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956098 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956101 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956104 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956108 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956111 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956114 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956117 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956121 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956124 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:14.958402 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956127 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956130 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956134 2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956136 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956142 2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956145 2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956148 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956151 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956154 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956157 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956160 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956163 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956166 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956169 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956172 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956175 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956178 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:31:14.959009
ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956181 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956185 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956202 2572 flags.go:64] FLAG: --feature-gates="" Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956206 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956210 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956213 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956216 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956219 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:31:14.959009 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956222 2572 flags.go:64] FLAG: --help="false" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956227 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-132-44.ec2.internal" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956230 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956233 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956236 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956240 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 
16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956244 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956247 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956250 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956253 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956257 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956260 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956263 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956266 2572 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956269 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956272 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956275 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956278 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956281 2572 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956284 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956287 2572 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956290 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956295 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:14.959628 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956298 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956301 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956304 2572 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956307 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956311 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956314 2572 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956317 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956321 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956325 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956329 2572 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956338 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956341 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: 
I0417 16:31:14.956344 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956347 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956350 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956353 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956356 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956364 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956367 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956370 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956373 2572 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956376 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956381 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956384 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:14.960170 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956387 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956390 2572 flags.go:64] FLAG: --port="10250" Apr 17 16:31:14.960805 ip-10-0-132-44 
kubenswrapper[2572]: I0417 16:31:14.956393 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956396 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0823ed7ddcb040d8b" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956399 2572 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956402 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956405 2572 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956408 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956411 2572 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956415 2572 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956418 2572 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956422 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956425 2572 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956428 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956431 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956435 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956437 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:14.960805 ip-10-0-132-44 
kubenswrapper[2572]: I0417 16:31:14.956440 2572 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956445 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956448 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956451 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956454 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956457 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956460 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956463 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956466 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:14.960805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956469 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956472 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956475 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956478 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956481 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956484 2572 flags.go:64] FLAG: 
--system-cgroups="" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956488 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956493 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956496 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956499 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956503 2572 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956506 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956509 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956512 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956515 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956518 2572 flags.go:64] FLAG: --v="2" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956522 2572 flags.go:64] FLAG: --version="false" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956526 2572 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956531 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.956534 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956624 
2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956627 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956631 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956634 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:14.961462 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956639 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956642 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956644 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956647 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956650 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956653 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956656 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956659 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956661 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 
16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956664 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956666 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956669 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956672 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956674 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956677 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956680 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956682 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956684 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956688 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:14.962042 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956692 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956695 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956697 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956700 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956703 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956705 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956708 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956710 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956713 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956717 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956720 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956723 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956726 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956730 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956733 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956736 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956738 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956741 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956743 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956746 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:14.962562 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956748 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956751 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956753 2572 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956756 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956759 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956761 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956764 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956766 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956769 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956771 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956774 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956776 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956779 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956781 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956784 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956787 2572 feature_gate.go:328] unrecognized feature 
gate: AlibabaPlatform Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956789 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956792 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956794 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956797 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:14.963113 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956799 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956802 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956804 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956807 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956810 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956813 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956816 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956819 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 
16:31:14.956821 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956824 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956826 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956829 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956832 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956834 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956837 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956840 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956845 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956848 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956850 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956853 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:14.963636 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956856 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall 
Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956858 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.956861 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.957650 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.963880 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.963897 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963944 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963949 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963952 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963956 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963960 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963964 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963968 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963971 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963974 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:14.964118 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963977 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963980 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963983 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963986 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963989 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963992 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963994 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963997 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.963999 2572 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964002 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964005 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964007 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964010 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964013 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964015 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964018 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964021 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964024 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964026 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964029 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:14.964525 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964032 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:14.965006 
ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964035 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964038 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964041 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964044 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964046 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964049 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964052 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964054 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964057 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964059 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964062 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964064 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964067 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 
16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964069 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964072 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964075 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964078 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964080 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964083 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:14.965006 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964085 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964088 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964090 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964093 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964095 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964097 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 
16:31:14.964100 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964103 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964105 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964108 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964111 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964114 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964116 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964119 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964123 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964126 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964128 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964131 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964134 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 
16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964136 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:14.965515 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964139 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964141 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964144 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964147 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964150 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964153 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964155 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964158 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964161 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964163 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964167 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964171 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964174 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964176 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964179 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964182 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:14.965991 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964184 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.964207 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964301 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964305 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964308 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 
17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964311 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964315 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964318 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964322 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964326 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964329 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964332 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964335 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964338 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964341 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:14.966482 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964343 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964346 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:14.966890 ip-10-0-132-44 
kubenswrapper[2572]: W0417 16:31:14.964348 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964351 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964353 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964356 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964358 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964361 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964363 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964366 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964368 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964371 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964373 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964376 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964378 2572 feature_gate.go:328] unrecognized feature 
gate: OVNObservability Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964381 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964383 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964386 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964388 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964391 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:14.966890 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964394 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964396 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964399 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964403 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964406 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964409 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964412 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964416 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964418 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964421 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964423 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964426 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964428 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964431 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964433 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964436 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964438 2572 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964441 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964443 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964446 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:14.967382 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964448 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964451 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964453 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964456 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964458 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964461 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964464 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964466 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964468 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964471 2572 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964473 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964476 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964478 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964481 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964483 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964486 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964488 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964494 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964498 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964500 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:14.967877 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964503 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964505 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 
17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964508 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964510 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964513 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964515 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964518 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964520 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964523 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964526 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964529 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964531 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:14.964534 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.964539 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false 
NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.965303 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 16:31:14.968369 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.968062 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 16:31:14.969063 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.969051 2572 server.go:1019] "Starting client certificate rotation" Apr 17 16:31:14.969163 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.969147 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:31:14.969218 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.969181 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:31:14.994876 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.994860 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:31:14.998489 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:14.998375 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:31:15.011837 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.011816 2572 log.go:25] "Validated CRI v1 runtime API" Apr 17 16:31:15.017570 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.017551 2572 log.go:25] "Validated CRI v1 image API" Apr 17 16:31:15.018673 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.018649 2572 server.go:1452] "Using cgroup driver setting received from the 
CRI runtime" cgroupDriver="systemd" Apr 17 16:31:15.024036 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.024016 2572 fs.go:135] Filesystem UUIDs: map[72d9d0ee-8145-4bdb-a846-99cf1631a78d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a0590fcb-433d-4312-b279-d5fe9713533d:/dev/nvme0n1p4] Apr 17 16:31:15.024114 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.024035 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 16:31:15.027726 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.027700 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:15.029621 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.029510 2572 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:15.027675043 +0000 UTC m=+0.406347209 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098999 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec240df561489b4e265ac1aecaa17a2d SystemUUID:ec240df5-6148-9b4e-265a-c1aecaa17a2d BootID:0f290741-e727-4219-8c43-59f0920c8a5b Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6e:19:d2:89:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6e:19:d2:89:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:6f:31:49:4a:3a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 16:31:15.029621 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.029617 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 16:31:15.029734 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.029695 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 16:31:15.031420 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.031395 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 16:31:15.031556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.031423 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-44.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 16:31:15.031600 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.031565 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 16:31:15.031600 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.031573 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 16:31:15.031600 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.031585 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:15.031676 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.031614 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:15.033008 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.032997 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:15.033117 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.033108 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:31:15.035469 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.035459 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:31:15.035503 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.035477 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 16:31:15.035503 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.035490 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:31:15.035503 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.035499 2572 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:31:15.035611 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.035508 2572 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 16:31:15.036568 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.036556 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:15.036619 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.036575 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:15.039665 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.039651 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:31:15.041314 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.041302 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:31:15.042696 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042655 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:31:15.042741 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042712 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:31:15.042741 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042720 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:31:15.042741 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042726 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:31:15.042741 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042741 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:31:15.042901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042747 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:31:15.042901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042753 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 
16:31:15.042901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042758 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:31:15.042901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042765 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:31:15.042901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042771 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:31:15.042901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042788 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:31:15.042901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.042797 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:31:15.043732 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.043722 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:31:15.043732 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.043732 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:31:15.047072 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.047052 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:31:15.047156 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.047086 2572 server.go:1295] "Started kubelet" Apr 17 16:31:15.047686 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.047664 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:31:15.047910 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.047860 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:31:15.048030 ip-10-0-132-44 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 16:31:15.052023 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.052001 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:31:15.052450 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.052432 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-44.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:31:15.053295 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.053243 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:31:15.054452 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.054431 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:31:15.059097 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.059071 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:15.059682 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.059662 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:31:15.059761 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.059704 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-44.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:31:15.059761 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.059724 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:31:15.062617 ip-10-0-132-44 
kubenswrapper[2572]: I0417 16:31:15.061556 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.061646 2572 factory.go:153] Registering CRI-O factory Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.061688 2572 factory.go:223] Registration of the crio container factory successfully Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.061745 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.061754 2572 factory.go:55] Registering systemd factory Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.061763 2572 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.061795 2572 factory.go:103] Registering Raw factory Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.061810 2572 manager.go:1196] Started watching for new ooms in manager Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.061937 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.062266 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.062341 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.062351 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:31:15.062617 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.059748 2572 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-44.ec2.internal.18a731edb5bda46d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-44.ec2.internal,UID:ip-10-0-132-44.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-44.ec2.internal,},FirstTimestamp:2026-04-17 16:31:15.047064685 +0000 UTC m=+0.425736852,LastTimestamp:2026-04-17 16:31:15.047064685 +0000 UTC m=+0.425736852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-44.ec2.internal,}" Apr 17 16:31:15.063781 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.063628 2572 manager.go:319] Starting recovery of all containers Apr 17 16:31:15.063781 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.063728 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:31:15.064067 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.064048 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found" Apr 17 16:31:15.064203 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.064159 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 16:31:15.069753 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.069724 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-44.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 16:31:15.074082 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.074069 2572 manager.go:324] Recovery completed Apr 17 16:31:15.077431 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.077415 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vflt8" Apr 17 16:31:15.077773 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.077762 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:15.079879 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.079864 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:15.079956 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.079890 2572 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:15.079956 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.079899 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:15.080398 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.080385 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:31:15.080398 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.080396 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:31:15.080485 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.080410 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:15.081825 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.081761 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-44.ec2.internal.18a731edb7b24f00 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-44.ec2.internal,UID:ip-10-0-132-44.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-44.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-44.ec2.internal,},FirstTimestamp:2026-04-17 16:31:15.079876352 +0000 UTC m=+0.458548517,LastTimestamp:2026-04-17 16:31:15.079876352 +0000 UTC m=+0.458548517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-44.ec2.internal,}" Apr 17 16:31:15.082840 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.082817 2572 policy_none.go:49] "None policy: Start" Apr 17 16:31:15.082913 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.082863 2572 
memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:31:15.082913 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.082889 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:31:15.086333 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.086316 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vflt8" Apr 17 16:31:15.120738 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.120717 2572 manager.go:341] "Starting Device Plugin manager" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.120749 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.120761 2572 server.go:85] "Starting device plugin registration server" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.120982 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.120992 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.121109 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.121172 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.121183 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.121655 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 16:31:15.127805 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.121696 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-44.ec2.internal\" not found" Apr 17 16:31:15.186037 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.186012 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:31:15.187151 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.187136 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:31:15.187248 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.187160 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:31:15.187248 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.187176 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:31:15.187248 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.187183 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:31:15.187248 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.187226 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:31:15.189592 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.189574 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:15.221823 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.221780 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:15.222982 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.222967 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:15.223051 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.222997 2572 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:15.223051 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.223012 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:15.223051 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.223034 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.230531 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.230517 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.230598 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.230536 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-44.ec2.internal\": node \"ip-10-0-132-44.ec2.internal\" not found" Apr 17 16:31:15.257994 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.257973 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found" Apr 17 16:31:15.287776 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.287752 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal"] Apr 17 16:31:15.287859 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.287807 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:15.288535 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.288519 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:15.288616 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.288551 2572 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-132-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:15.288616 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.288565 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:15.289657 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.289643 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:15.289779 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.289766 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.289826 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.289790 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:15.290285 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.290269 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:15.290365 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.290300 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:15.290365 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.290313 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:15.290365 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.290269 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:15.290365 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.290363 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:15.290532 ip-10-0-132-44 
kubenswrapper[2572]: I0417 16:31:15.290375 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:15.291422 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.291408 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.291471 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.291433 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:15.292015 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.292002 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:15.292083 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.292031 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:15.292083 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.292044 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-44.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:15.311703 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.311683 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-44.ec2.internal\" not found" node="ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.315347 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.315333 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-44.ec2.internal\" not found" node="ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.358845 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.358825 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not 
found" Apr 17 16:31:15.364359 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.364343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/97b6bf974c89d22078b0dac98765eb2d-config\") pod \"kube-apiserver-proxy-ip-10-0-132-44.ec2.internal\" (UID: \"97b6bf974c89d22078b0dac98765eb2d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.364447 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.364372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00990f5dcecb088795fe2c671a9f2474-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal\" (UID: \"00990f5dcecb088795fe2c671a9f2474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.364447 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.364399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00990f5dcecb088795fe2c671a9f2474-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal\" (UID: \"00990f5dcecb088795fe2c671a9f2474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal" Apr 17 16:31:15.459551 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.459531 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found" Apr 17 16:31:15.464907 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.464884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/97b6bf974c89d22078b0dac98765eb2d-config\") pod \"kube-apiserver-proxy-ip-10-0-132-44.ec2.internal\" (UID: \"97b6bf974c89d22078b0dac98765eb2d\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:15.464960 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.464917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00990f5dcecb088795fe2c671a9f2474-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal\" (UID: \"00990f5dcecb088795fe2c671a9f2474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:15.464960 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.464937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00990f5dcecb088795fe2c671a9f2474-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal\" (UID: \"00990f5dcecb088795fe2c671a9f2474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:15.465042 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.464981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/97b6bf974c89d22078b0dac98765eb2d-config\") pod \"kube-apiserver-proxy-ip-10-0-132-44.ec2.internal\" (UID: \"97b6bf974c89d22078b0dac98765eb2d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:15.465042 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.464990 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00990f5dcecb088795fe2c671a9f2474-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal\" (UID: \"00990f5dcecb088795fe2c671a9f2474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:15.465042 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.465023 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00990f5dcecb088795fe2c671a9f2474-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal\" (UID: \"00990f5dcecb088795fe2c671a9f2474\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:15.560304 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.560259 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:15.613724 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.613696 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:15.617214 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.617186 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:15.660737 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.660719 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:15.761233 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.761212 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:15.861702 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.861648 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:15.874430 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.874409 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:15.961761 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:15.961740 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:15.968993 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.968975 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:31:15.969128 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.969108 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:15.969203 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:15.969164 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:16.059174 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.059148 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:31:16.062050 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:16.062032 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:16.072896 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.072871 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:16.088106 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.088074 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:15 +0000 UTC" deadline="2027-10-25 08:08:53.203542514 +0000 UTC"
Apr 17 16:31:16.088164 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.088107 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13335h37m37.115438501s"
Apr 17 16:31:16.091283 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:16.091255 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00990f5dcecb088795fe2c671a9f2474.slice/crio-e4eca3b877148f69707366673e117a0230ace8bc97541f4cafde7b536ebc5669 WatchSource:0}: Error finding container e4eca3b877148f69707366673e117a0230ace8bc97541f4cafde7b536ebc5669: Status 404 returned error can't find the container with id e4eca3b877148f69707366673e117a0230ace8bc97541f4cafde7b536ebc5669
Apr 17 16:31:16.091691 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:16.091665 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b6bf974c89d22078b0dac98765eb2d.slice/crio-9bff31e5a8968cc225d0cfa024ec9ffc7388c0a3ee86f977f763cd266db1589e WatchSource:0}: Error finding container 9bff31e5a8968cc225d0cfa024ec9ffc7388c0a3ee86f977f763cd266db1589e: Status 404 returned error can't find the container with id 9bff31e5a8968cc225d0cfa024ec9ffc7388c0a3ee86f977f763cd266db1589e
Apr 17 16:31:16.097021 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.097007 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:31:16.147123 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.147104 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xbrrk"
Apr 17 16:31:16.155862 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.155844 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xbrrk"
Apr 17 16:31:16.162522 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:16.162510 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:16.190436 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.190400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal" event={"ID":"00990f5dcecb088795fe2c671a9f2474","Type":"ContainerStarted","Data":"e4eca3b877148f69707366673e117a0230ace8bc97541f4cafde7b536ebc5669"}
Apr 17 16:31:16.191285 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.191258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal" event={"ID":"97b6bf974c89d22078b0dac98765eb2d","Type":"ContainerStarted","Data":"9bff31e5a8968cc225d0cfa024ec9ffc7388c0a3ee86f977f763cd266db1589e"}
Apr 17 16:31:16.262622 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:16.262602 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:16.313277 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.313257 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:16.371004 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:16.370982 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-44.ec2.internal\" not found"
Apr 17 16:31:16.406786 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.406730 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:16.460032 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.460011 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:16.475570 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.475551 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:16.476477 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.476462 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal"
Apr 17 16:31:16.487548 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:16.487534 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:17.036598 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.036553 2572 apiserver.go:52] "Watching apiserver"
Apr 17 16:31:17.045053 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.045025 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:31:17.047743 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.047718 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-svpg4","openshift-image-registry/node-ca-b58dr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal","openshift-multus/multus-additional-cni-plugins-gl5np","openshift-multus/network-metrics-daemon-lt7mn","openshift-network-diagnostics/network-check-target-qch6k","openshift-ovn-kubernetes/ovnkube-node-nnv8v","kube-system/konnectivity-agent-fh942","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd","openshift-multus/multus-2fb27","openshift-network-operator/iptables-alerter-jl2wf","kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal"]
Apr 17 16:31:17.050525 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.050500 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.050780 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.050753 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b58dr"
Apr 17 16:31:17.051753 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.051735 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gl5np"
Apr 17 16:31:17.053055 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.053011 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:17.053350 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.053089 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:17.053456 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.053248 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:17.053456 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.053366 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:31:17.053456 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.053379 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:31:17.053456 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.053377 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2tf2s\""
Apr 17 16:31:17.053630 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.053336 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:17.053630 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.053390 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7vn4z\""
Apr 17 16:31:17.053783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.053772 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:31:17.054478 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.054455 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:31:17.054561 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.054515 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:31:17.055163 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.054898 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:31:17.055163 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.054956 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-krbhr\""
Apr 17 16:31:17.055163 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.054971 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:31:17.055163 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.054902 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:31:17.055837 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.055815 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:17.055947 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.055884 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:17.056008 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.055986 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.057130 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.057113 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fh942"
Apr 17 16:31:17.058494 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.058426 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-thqqx\""
Apr 17 16:31:17.058605 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.058587 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd"
Apr 17 16:31:17.060527 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.060393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 16:31:17.060631 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.060533 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 16:31:17.061300 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.060910 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 16:31:17.061300 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.061120 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.061456 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.061444 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:31:17.061559 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.061540 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:31:17.061559 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.061556 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-85c7f\""
Apr 17 16:31:17.061974 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.061955 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:31:17.062632 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.062520 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xjsmz\""
Apr 17 16:31:17.063714 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.062838 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:31:17.063714 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.062952 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 16:31:17.063714 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.063237 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 16:31:17.063714 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.063268 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 16:31:17.063714 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.063681 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 16:31:17.064849 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.064827 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:31:17.065050 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.065028 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8p95g\""
Apr 17 16:31:17.065626 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.065605 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jl2wf"
Apr 17 16:31:17.068231 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.068185 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:17.068374 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.068222 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:31:17.068374 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.068258 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:17.068374 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.068271 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pfgg7\""
Apr 17 16:31:17.075278 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-host\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.075374 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdvm\" (UniqueName: \"kubernetes.io/projected/757f13f2-9e57-4475-b1d7-97713e23ab42-kube-api-access-qrdvm\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.075374 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-log-socket\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.075374 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075335 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01242e37-ff64-4fe3-825f-6cb0f669e5b7-cni-binary-copy\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.075374 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075363 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxhq\" (UniqueName: \"kubernetes.io/projected/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-kube-api-access-rsxhq\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:17.075658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:17.075658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysctl-d\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.075658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-kubelet\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.075658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/908fb34a-55a4-4783-af03-7b4b7c408f98-host\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr"
Apr 17 16:31:17.075658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075583 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/908fb34a-55a4-4783-af03-7b4b7c408f98-serviceca\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr"
Apr 17 16:31:17.075658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-os-release\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np"
Apr 17 16:31:17.075658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-lib-modules\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.075658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-kubelet\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovnkube-script-lib\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-device-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-socket-dir-parent\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-run-netns\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-node-log\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-socket-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075950 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzhtq\" (UniqueName: \"kubernetes.io/projected/91b78255-7a27-4463-8656-4713778fa480-kube-api-access-pzhtq\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd"
Apr 17 16:31:17.076002 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.075976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcmr\" (UniqueName: \"kubernetes.io/projected/908fb34a-55a4-4783-af03-7b4b7c408f98-kube-api-access-qbcmr\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysconfig\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-tuned\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076233 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-cni-bin\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-cni-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076349 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-etc-kubernetes\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysctl-conf\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076396 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-systemd-units\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bab37d78-677d-45dc-81ad-fba92f1bf0c6-agent-certs\") pod \"konnectivity-agent-fh942\" (UID: \"bab37d78-677d-45dc-81ad-fba92f1bf0c6\") " pod="kube-system/konnectivity-agent-fh942"
Apr 17 16:31:17.076433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c85e109-cdf7-4611-80ae-1231d32d04b9-iptables-alerter-script\") pod \"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2pcr\" (UniqueName: \"kubernetes.io/projected/9c85e109-cdf7-4611-80ae-1231d32d04b9-kube-api-access-l2pcr\") pod \"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076499 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-run\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-cni-bin\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-cni-multus\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-hostroot\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-cni-netd\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-cnibin\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-multus-certs\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-ovn\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.076783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8grp\" (UniqueName: \"kubernetes.io/projected/2a93066e-fe3b-416c-ab13-3098d92ffb5f-kube-api-access-t8grp\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v"
Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np"
Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-modprobe-d\") pod \"tuned-svpg4\" (UID: 
\"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-etc-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-env-overrides\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bab37d78-677d-45dc-81ad-fba92f1bf0c6-konnectivity-ca\") pod \"konnectivity-agent-fh942\" (UID: \"bab37d78-677d-45dc-81ad-fba92f1bf0c6\") " pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.076962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-slash\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovn-node-metrics-cert\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077044 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-system-cni-dir\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-systemd\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.077120 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-var-lib-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 
16:31:17.077127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-registration-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-os-release\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-daemon-config\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077263 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-var-lib-kubelet\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077302 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/757f13f2-9e57-4475-b1d7-97713e23ab42-tmp\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-etc-selinux\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077340 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-sys-fs\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-system-cni-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cnibin\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077434 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6qc6\" (UniqueName: \"kubernetes.io/projected/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-kube-api-access-g6qc6\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077460 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " 
pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-systemd\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.077530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077522 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovnkube-config\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.078217 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077544 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-netns\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.078217 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b76x7\" (UniqueName: \"kubernetes.io/projected/01242e37-ff64-4fe3-825f-6cb0f669e5b7-kube-api-access-b76x7\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.078217 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077589 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c85e109-cdf7-4611-80ae-1231d32d04b9-host-slash\") pod 
\"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf" Apr 17 16:31:17.078217 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077611 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-kubernetes\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.078217 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.078217 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077654 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-conf-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.078217 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.077674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-sys\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.157162 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.157120 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" 
expiration="2028-04-16 16:26:16 +0000 UTC" deadline="2027-10-08 03:04:04.975388655 +0000 UTC" Apr 17 16:31:17.157162 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.157161 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12922h32m47.81823129s" Apr 17 16:31:17.162800 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.162779 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:31:17.177914 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.177894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-daemon-config\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.178023 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.177925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.178023 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.177942 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.178023 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.177961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-var-lib-kubelet\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.178176 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/757f13f2-9e57-4475-b1d7-97713e23ab42-tmp\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.178176 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-etc-selinux\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.178176 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-sys-fs\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.178176 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-var-lib-kubelet\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.178376 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" 
(UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-etc-selinux\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.178376 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178222 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-system-cni-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.178376 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-system-cni-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.178376 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cnibin\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.178376 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6qc6\" (UniqueName: \"kubernetes.io/projected/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-kube-api-access-g6qc6\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.178376 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cnibin\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.178376 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178268 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-sys-fs\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.178376 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:17.178724 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178457 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:31:17.178724 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-systemd\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.178724 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovnkube-config\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.178724 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-netns\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.178724 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-systemd\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.178724 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.178600 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:17.178724 ip-10-0-132-44 kubenswrapper[2572]: I0417 
16:31:17.178647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b76x7\" (UniqueName: \"kubernetes.io/projected/01242e37-ff64-4fe3-825f-6cb0f669e5b7-kube-api-access-b76x7\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c85e109-cdf7-4611-80ae-1231d32d04b9-host-slash\") pod \"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf" Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-kubernetes\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178883 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 
16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-conf-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c85e109-cdf7-4611-80ae-1231d32d04b9-host-slash\") pod \"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf" Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-daemon-config\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.178880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.179002 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs podName:3e538eeb-9985-4bbd-ae4b-d6ac1469dba0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:17.678976693 +0000 UTC m=+3.057648865 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs") pod "network-metrics-daemon-lt7mn" (UID: "3e538eeb-9985-4bbd-ae4b-d6ac1469dba0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:17.179033 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-netns\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-sys\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-conf-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-host\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-sys\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-kubernetes\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-host\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179213 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdvm\" (UniqueName: \"kubernetes.io/projected/757f13f2-9e57-4475-b1d7-97713e23ab42-kube-api-access-qrdvm\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-log-socket\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179301 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-log-socket\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.179551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01242e37-ff64-4fe3-825f-6cb0f669e5b7-cni-binary-copy\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxhq\" (UniqueName: \"kubernetes.io/projected/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-kube-api-access-rsxhq\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysctl-d\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-kubelet\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179857 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/908fb34a-55a4-4783-af03-7b4b7c408f98-host\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/908fb34a-55a4-4783-af03-7b4b7c408f98-serviceca\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-os-release\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-kubelet\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-lib-modules\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.179973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovnkube-config\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.180029 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180027 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/908fb34a-55a4-4783-af03-7b4b7c408f98-host\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-kubelet\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysctl-d\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-lib-modules\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-os-release\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovnkube-script-lib\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01242e37-ff64-4fe3-825f-6cb0f669e5b7-cni-binary-copy\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-device-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-kubelet\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-socket-dir-parent\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-device-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180263 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-socket-dir-parent\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-run-netns\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-node-log\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.180556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-socket-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180387 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-node-log\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzhtq\" (UniqueName: \"kubernetes.io/projected/91b78255-7a27-4463-8656-4713778fa480-kube-api-access-pzhtq\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-run-netns\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180461 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcmr\" (UniqueName: \"kubernetes.io/projected/908fb34a-55a4-4783-af03-7b4b7c408f98-kube-api-access-qbcmr\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180559 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-socket-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/908fb34a-55a4-4783-af03-7b4b7c408f98-serviceca\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysconfig\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-tuned\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180709 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysconfig\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-cni-bin\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-cni-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180759 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovnkube-script-lib\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-etc-kubernetes\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.181320 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-cni-bin\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-multus-cni-dir\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysctl-conf\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180903 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-etc-kubernetes\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-systemd-units\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180953 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bab37d78-677d-45dc-81ad-fba92f1bf0c6-agent-certs\") pod \"konnectivity-agent-fh942\" (UID: \"bab37d78-677d-45dc-81ad-fba92f1bf0c6\") " pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c85e109-cdf7-4611-80ae-1231d32d04b9-iptables-alerter-script\") pod \"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.180981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-systemd-units\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181029 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-sysctl-conf\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2pcr\" (UniqueName: \"kubernetes.io/projected/9c85e109-cdf7-4611-80ae-1231d32d04b9-kube-api-access-l2pcr\") pod \"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-run\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-cni-bin\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-cni-multus\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181272 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-run\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-hostroot\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181315 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-cni-netd\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-cni-bin\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-cnibin\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-hostroot\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-var-lib-cni-multus\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-multus-certs\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181472 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-host-run-multus-certs\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-cnibin\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-cni-netd\") pod 
\"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-ovn\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9c85e109-cdf7-4611-80ae-1231d32d04b9-iptables-alerter-script\") pod \"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8grp\" (UniqueName: \"kubernetes.io/projected/2a93066e-fe3b-416c-ab13-3098d92ffb5f-kube-api-access-t8grp\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181623 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-ovn\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-modprobe-d\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-etc-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-run-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181890 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-env-overrides\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181922 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-etc-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.182871 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bab37d78-677d-45dc-81ad-fba92f1bf0c6-konnectivity-ca\") pod \"konnectivity-agent-fh942\" (UID: \"bab37d78-677d-45dc-81ad-fba92f1bf0c6\") " pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.181969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-slash\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovn-node-metrics-cert\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182039 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-system-cni-dir\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bab37d78-677d-45dc-81ad-fba92f1bf0c6-konnectivity-ca\") pod \"konnectivity-agent-fh942\" (UID: \"bab37d78-677d-45dc-81ad-fba92f1bf0c6\") " pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-systemd\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-modprobe-d\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.183552 ip-10-0-132-44 
kubenswrapper[2572]: I0417 16:31:17.182503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-var-lib-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182536 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a93066e-fe3b-416c-ab13-3098d92ffb5f-env-overrides\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-slash\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a93066e-fe3b-416c-ab13-3098d92ffb5f-var-lib-openvswitch\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-system-cni-dir\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-systemd\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-registration-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.183552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-os-release\") pod \"multus-2fb27\" (UID: 
\"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.184098 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/91b78255-7a27-4463-8656-4713778fa480-registration-dir\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.184098 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.182779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01242e37-ff64-4fe3-825f-6cb0f669e5b7-os-release\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.184098 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.183152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/757f13f2-9e57-4475-b1d7-97713e23ab42-tmp\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.184098 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.183452 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/757f13f2-9e57-4475-b1d7-97713e23ab42-etc-tuned\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.184946 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.184929 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bab37d78-677d-45dc-81ad-fba92f1bf0c6-agent-certs\") pod \"konnectivity-agent-fh942\" (UID: \"bab37d78-677d-45dc-81ad-fba92f1bf0c6\") " 
pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:17.185021 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.184934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a93066e-fe3b-416c-ab13-3098d92ffb5f-ovn-node-metrics-cert\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.191674 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.191102 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:17.191674 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.191138 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:17.191674 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.191151 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wb9qp for pod openshift-network-diagnostics/network-check-target-qch6k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:17.191674 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.191228 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp podName:d9099ff1-798e-434b-8980-189e358b2f96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:17.691210749 +0000 UTC m=+3.069882906 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wb9qp" (UniqueName: "kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp") pod "network-check-target-qch6k" (UID: "d9099ff1-798e-434b-8980-189e358b2f96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:17.194918 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.194895 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b76x7\" (UniqueName: \"kubernetes.io/projected/01242e37-ff64-4fe3-825f-6cb0f669e5b7-kube-api-access-b76x7\") pod \"multus-2fb27\" (UID: \"01242e37-ff64-4fe3-825f-6cb0f669e5b7\") " pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.195332 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.195309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcmr\" (UniqueName: \"kubernetes.io/projected/908fb34a-55a4-4783-af03-7b4b7c408f98-kube-api-access-qbcmr\") pod \"node-ca-b58dr\" (UID: \"908fb34a-55a4-4783-af03-7b4b7c408f98\") " pod="openshift-image-registry/node-ca-b58dr" Apr 17 16:31:17.196461 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.196421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2pcr\" (UniqueName: \"kubernetes.io/projected/9c85e109-cdf7-4611-80ae-1231d32d04b9-kube-api-access-l2pcr\") pod \"iptables-alerter-jl2wf\" (UID: \"9c85e109-cdf7-4611-80ae-1231d32d04b9\") " pod="openshift-network-operator/iptables-alerter-jl2wf" Apr 17 16:31:17.196461 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.196440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzhtq\" (UniqueName: \"kubernetes.io/projected/91b78255-7a27-4463-8656-4713778fa480-kube-api-access-pzhtq\") pod \"aws-ebs-csi-driver-node-dhvrd\" (UID: \"91b78255-7a27-4463-8656-4713778fa480\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.196626 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.196444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8grp\" (UniqueName: \"kubernetes.io/projected/2a93066e-fe3b-416c-ab13-3098d92ffb5f-kube-api-access-t8grp\") pod \"ovnkube-node-nnv8v\" (UID: \"2a93066e-fe3b-416c-ab13-3098d92ffb5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.196891 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.196868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6qc6\" (UniqueName: \"kubernetes.io/projected/665698fb-87f6-4ef0-a908-b1d2f13eb9d6-kube-api-access-g6qc6\") pod \"multus-additional-cni-plugins-gl5np\" (UID: \"665698fb-87f6-4ef0-a908-b1d2f13eb9d6\") " pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.197367 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.197340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxhq\" (UniqueName: \"kubernetes.io/projected/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-kube-api-access-rsxhq\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:17.198233 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.198216 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdvm\" (UniqueName: \"kubernetes.io/projected/757f13f2-9e57-4475-b1d7-97713e23ab42-kube-api-access-qrdvm\") pod \"tuned-svpg4\" (UID: \"757f13f2-9e57-4475-b1d7-97713e23ab42\") " pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.335749 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.335650 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:17.364156 ip-10-0-132-44 kubenswrapper[2572]: 
I0417 16:31:17.364132 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-svpg4" Apr 17 16:31:17.370757 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.370733 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b58dr" Apr 17 16:31:17.379200 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.379169 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gl5np" Apr 17 16:31:17.384832 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.384815 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:17.390891 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.390875 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:17.397360 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.397342 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" Apr 17 16:31:17.402886 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.402871 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2fb27" Apr 17 16:31:17.409388 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.409372 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jl2wf" Apr 17 16:31:17.684863 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.684833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:17.685032 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.685012 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:17.685103 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.685091 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs podName:3e538eeb-9985-4bbd-ae4b-d6ac1469dba0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:18.685070441 +0000 UTC m=+4.063742606 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs") pod "network-metrics-daemon-lt7mn" (UID: "3e538eeb-9985-4bbd-ae4b-d6ac1469dba0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:17.698026 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:17.698005 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908fb34a_55a4_4783_af03_7b4b7c408f98.slice/crio-72fcbc581161a3bfcb8ed9f22aeb8369b3310d2c2484e89364fd9f4cc2f43df7 WatchSource:0}: Error finding container 72fcbc581161a3bfcb8ed9f22aeb8369b3310d2c2484e89364fd9f4cc2f43df7: Status 404 returned error can't find the container with id 72fcbc581161a3bfcb8ed9f22aeb8369b3310d2c2484e89364fd9f4cc2f43df7 Apr 17 16:31:17.699639 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:17.699613 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod665698fb_87f6_4ef0_a908_b1d2f13eb9d6.slice/crio-2ab9b9cc9c9b13b5bc5662725ca1d5da2b54de29c8b373cd0fe1be0577e5d465 WatchSource:0}: Error finding container 2ab9b9cc9c9b13b5bc5662725ca1d5da2b54de29c8b373cd0fe1be0577e5d465: Status 404 returned error can't find the container with id 2ab9b9cc9c9b13b5bc5662725ca1d5da2b54de29c8b373cd0fe1be0577e5d465 Apr 17 16:31:17.700631 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:17.700608 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c85e109_cdf7_4611_80ae_1231d32d04b9.slice/crio-5756ecf3928b9cd2807d124c00aa1e49aa8dd7d53af1958131108bacb63efa45 WatchSource:0}: Error finding container 5756ecf3928b9cd2807d124c00aa1e49aa8dd7d53af1958131108bacb63efa45: Status 404 returned error can't find the container with id 5756ecf3928b9cd2807d124c00aa1e49aa8dd7d53af1958131108bacb63efa45 Apr 17 16:31:17.701704 
ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:17.701679 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbab37d78_677d_45dc_81ad_fba92f1bf0c6.slice/crio-df723ad1c948766c43238f1f45846d560c3d5f2934b96475bed439eaccfc6132 WatchSource:0}: Error finding container df723ad1c948766c43238f1f45846d560c3d5f2934b96475bed439eaccfc6132: Status 404 returned error can't find the container with id df723ad1c948766c43238f1f45846d560c3d5f2934b96475bed439eaccfc6132 Apr 17 16:31:17.702448 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:17.702401 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01242e37_ff64_4fe3_825f_6cb0f669e5b7.slice/crio-72130a536d30e3048490fcf20861807afb0f63ed6832ff089aa7a13006a0809d WatchSource:0}: Error finding container 72130a536d30e3048490fcf20861807afb0f63ed6832ff089aa7a13006a0809d: Status 404 returned error can't find the container with id 72130a536d30e3048490fcf20861807afb0f63ed6832ff089aa7a13006a0809d Apr 17 16:31:17.705030 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:17.705009 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a93066e_fe3b_416c_ab13_3098d92ffb5f.slice/crio-8db5c3dd705c910f23526f6fc10a481dbb7a643b307282b51d01c007ac393220 WatchSource:0}: Error finding container 8db5c3dd705c910f23526f6fc10a481dbb7a643b307282b51d01c007ac393220: Status 404 returned error can't find the container with id 8db5c3dd705c910f23526f6fc10a481dbb7a643b307282b51d01c007ac393220 Apr 17 16:31:17.705917 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:17.705895 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b78255_7a27_4463_8656_4713778fa480.slice/crio-902def260869390c2cf6c9e43daac2b5ee6418dc8c246b9bf089b7712be17728 WatchSource:0}: Error 
finding container 902def260869390c2cf6c9e43daac2b5ee6418dc8c246b9bf089b7712be17728: Status 404 returned error can't find the container with id 902def260869390c2cf6c9e43daac2b5ee6418dc8c246b9bf089b7712be17728 Apr 17 16:31:17.706679 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:17.706583 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757f13f2_9e57_4475_b1d7_97713e23ab42.slice/crio-7bca62d785c9b6c45d6100da5c2f083368fa59e935fefe083ee668328190705f WatchSource:0}: Error finding container 7bca62d785c9b6c45d6100da5c2f083368fa59e935fefe083ee668328190705f: Status 404 returned error can't find the container with id 7bca62d785c9b6c45d6100da5c2f083368fa59e935fefe083ee668328190705f Apr 17 16:31:17.785394 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:17.785369 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:17.785520 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.785506 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:17.785559 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.785523 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:17.785559 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.785532 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wb9qp for pod openshift-network-diagnostics/network-check-target-qch6k: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:17.785618 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:17.785580 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp podName:d9099ff1-798e-434b-8980-189e358b2f96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:18.785559289 +0000 UTC m=+4.164231460 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wb9qp" (UniqueName: "kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp") pod "network-check-target-qch6k" (UID: "d9099ff1-798e-434b-8980-189e358b2f96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:18.157538 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.157446 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:16 +0000 UTC" deadline="2027-11-16 11:41:54.173469061 +0000 UTC" Apr 17 16:31:18.157538 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.157482 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13867h10m36.015989822s" Apr 17 16:31:18.189679 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.188932 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:18.189679 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:18.189057 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96" Apr 17 16:31:18.197866 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.197814 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gl5np" event={"ID":"665698fb-87f6-4ef0-a908-b1d2f13eb9d6","Type":"ContainerStarted","Data":"2ab9b9cc9c9b13b5bc5662725ca1d5da2b54de29c8b373cd0fe1be0577e5d465"} Apr 17 16:31:18.205075 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.205017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fh942" event={"ID":"bab37d78-677d-45dc-81ad-fba92f1bf0c6","Type":"ContainerStarted","Data":"df723ad1c948766c43238f1f45846d560c3d5f2934b96475bed439eaccfc6132"} Apr 17 16:31:18.208017 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.207979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b58dr" event={"ID":"908fb34a-55a4-4783-af03-7b4b7c408f98","Type":"ContainerStarted","Data":"72fcbc581161a3bfcb8ed9f22aeb8369b3310d2c2484e89364fd9f4cc2f43df7"} Apr 17 16:31:18.211121 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.211095 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal" event={"ID":"97b6bf974c89d22078b0dac98765eb2d","Type":"ContainerStarted","Data":"b054437e1cf89a01aa7c8e01ff181b8811bf1708233c67567b723a3b9168481e"} Apr 17 16:31:18.216058 ip-10-0-132-44 kubenswrapper[2572]: 
I0417 16:31:18.216006 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-svpg4" event={"ID":"757f13f2-9e57-4475-b1d7-97713e23ab42","Type":"ContainerStarted","Data":"7bca62d785c9b6c45d6100da5c2f083368fa59e935fefe083ee668328190705f"} Apr 17 16:31:18.223867 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.223841 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" event={"ID":"91b78255-7a27-4463-8656-4713778fa480","Type":"ContainerStarted","Data":"902def260869390c2cf6c9e43daac2b5ee6418dc8c246b9bf089b7712be17728"} Apr 17 16:31:18.225413 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.225376 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"8db5c3dd705c910f23526f6fc10a481dbb7a643b307282b51d01c007ac393220"} Apr 17 16:31:18.237437 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.237396 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jl2wf" event={"ID":"9c85e109-cdf7-4611-80ae-1231d32d04b9","Type":"ContainerStarted","Data":"5756ecf3928b9cd2807d124c00aa1e49aa8dd7d53af1958131108bacb63efa45"} Apr 17 16:31:18.248315 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.248291 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2fb27" event={"ID":"01242e37-ff64-4fe3-825f-6cb0f669e5b7","Type":"ContainerStarted","Data":"72130a536d30e3048490fcf20861807afb0f63ed6832ff089aa7a13006a0809d"} Apr 17 16:31:18.694664 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.694630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: 
\"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:18.694820 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:18.694796 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:18.694889 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:18.694861 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs podName:3e538eeb-9985-4bbd-ae4b-d6ac1469dba0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:20.694838304 +0000 UTC m=+6.073510457 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs") pod "network-metrics-daemon-lt7mn" (UID: "3e538eeb-9985-4bbd-ae4b-d6ac1469dba0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:18.795692 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.795613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:18.795836 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:18.795791 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:18.795836 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:18.795812 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:18.795836 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:18.795824 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wb9qp for pod openshift-network-diagnostics/network-check-target-qch6k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:18.795986 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:18.795877 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp podName:d9099ff1-798e-434b-8980-189e358b2f96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:20.795859447 +0000 UTC m=+6.174531620 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wb9qp" (UniqueName: "kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp") pod "network-check-target-qch6k" (UID: "d9099ff1-798e-434b-8980-189e358b2f96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:18.921043 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:18.920765 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:19.188853 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:19.188659 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:19.188853 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:19.188805 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:19.263642 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:19.263603 2572 generic.go:358] "Generic (PLEG): container finished" podID="00990f5dcecb088795fe2c671a9f2474" containerID="92feb76423d8d24f8ce39d8b7201d804e3d53c1864a2c08124442b7eb43f832b" exitCode=0
Apr 17 16:31:19.264035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:19.264001 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal" event={"ID":"00990f5dcecb088795fe2c671a9f2474","Type":"ContainerDied","Data":"92feb76423d8d24f8ce39d8b7201d804e3d53c1864a2c08124442b7eb43f832b"}
Apr 17 16:31:19.280707 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:19.280249 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-44.ec2.internal" podStartSLOduration=3.280232905 podStartE2EDuration="3.280232905s" podCreationTimestamp="2026-04-17 16:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:18.23746886 +0000 UTC m=+3.616141037" watchObservedRunningTime="2026-04-17 16:31:19.280232905 +0000 UTC m=+4.658905080"
Apr 17 16:31:20.187748 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:20.187716 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:20.187933 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:20.187847 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:20.272776 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:20.272739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal" event={"ID":"00990f5dcecb088795fe2c671a9f2474","Type":"ContainerStarted","Data":"f560f0d1dfb109fb91701ac0fc99e78e58a5be2134c064c26ff8c683856728ff"}
Apr 17 16:31:20.709472 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:20.709432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:20.709667 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:20.709615 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:20.709729 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:20.709678 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs podName:3e538eeb-9985-4bbd-ae4b-d6ac1469dba0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:24.709657994 +0000 UTC m=+10.088330152 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs") pod "network-metrics-daemon-lt7mn" (UID: "3e538eeb-9985-4bbd-ae4b-d6ac1469dba0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:20.810540 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:20.810503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:20.810717 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:20.810666 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:20.810717 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:20.810685 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:20.810717 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:20.810698 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wb9qp for pod openshift-network-diagnostics/network-check-target-qch6k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:20.810878 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:20.810766 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp podName:d9099ff1-798e-434b-8980-189e358b2f96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:24.810746547 +0000 UTC m=+10.189418714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wb9qp" (UniqueName: "kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp") pod "network-check-target-qch6k" (UID: "d9099ff1-798e-434b-8980-189e358b2f96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:21.187677 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:21.187630 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:21.187853 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:21.187779 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:22.188397 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:22.188363 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:22.188846 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:22.188483 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:23.188372 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:23.188333 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:23.188531 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:23.188469 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:24.188027 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:24.187990 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:24.188234 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:24.188125 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:24.740163 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:24.740123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:24.740603 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:24.740301 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:24.740603 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:24.740377 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs podName:3e538eeb-9985-4bbd-ae4b-d6ac1469dba0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:32.74035754 +0000 UTC m=+18.119029722 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs") pod "network-metrics-daemon-lt7mn" (UID: "3e538eeb-9985-4bbd-ae4b-d6ac1469dba0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:24.840580 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:24.840547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:24.840748 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:24.840726 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:24.840790 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:24.840758 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:24.840790 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:24.840771 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wb9qp for pod openshift-network-diagnostics/network-check-target-qch6k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:24.840849 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:24.840831 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp podName:d9099ff1-798e-434b-8980-189e358b2f96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:32.840810005 +0000 UTC m=+18.219482172 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wb9qp" (UniqueName: "kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp") pod "network-check-target-qch6k" (UID: "d9099ff1-798e-434b-8980-189e358b2f96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:25.189707 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:25.189237 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:25.189707 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:25.189351 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:26.188346 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:26.188309 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:26.188771 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:26.188437 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:27.188172 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:27.187949 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:27.188362 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:27.188294 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:28.188323 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:28.188293 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:28.188477 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:28.188392 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:29.188420 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:29.188384 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:29.188621 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:29.188523 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:30.188162 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:30.188128 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:30.188339 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:30.188262 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:31.187914 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:31.187883 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:31.188345 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:31.188000 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:32.187939 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:32.187911 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:32.188393 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:32.188026 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:32.804988 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:32.804956 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:32.805122 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:32.805073 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:32.805169 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:32.805132 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs podName:3e538eeb-9985-4bbd-ae4b-d6ac1469dba0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.805116473 +0000 UTC m=+34.183788629 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs") pod "network-metrics-daemon-lt7mn" (UID: "3e538eeb-9985-4bbd-ae4b-d6ac1469dba0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:32.905321 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:32.905284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:32.905456 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:32.905412 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:32.905456 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:32.905425 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:32.905456 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:32.905434 2572 projected.go:194] Error preparing data for projected volume kube-api-access-wb9qp for pod openshift-network-diagnostics/network-check-target-qch6k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:32.905600 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:32.905482 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp podName:d9099ff1-798e-434b-8980-189e358b2f96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.905465755 +0000 UTC m=+34.284137922 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wb9qp" (UniqueName: "kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp") pod "network-check-target-qch6k" (UID: "d9099ff1-798e-434b-8980-189e358b2f96") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:33.189959 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:33.189932 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:33.190390 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:33.190056 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:34.187602 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:34.187569 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:34.187774 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:34.187675 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:35.188918 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.188714 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:35.189713 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:35.189018 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:35.297559 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.297531 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2fb27" event={"ID":"01242e37-ff64-4fe3-825f-6cb0f669e5b7","Type":"ContainerStarted","Data":"1126ae675a997d6506d1c989f2ff3703fa4aeda3dd854867e335e17ceb11b4fd"}
Apr 17 16:31:35.298989 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.298961 2572 generic.go:358] "Generic (PLEG): container finished" podID="665698fb-87f6-4ef0-a908-b1d2f13eb9d6" containerID="718d28b687ad6dafbe592d6a1161fb7cd989c8109d1287783e7248f53b9058e9" exitCode=0
Apr 17 16:31:35.299159 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.299044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gl5np" event={"ID":"665698fb-87f6-4ef0-a908-b1d2f13eb9d6","Type":"ContainerDied","Data":"718d28b687ad6dafbe592d6a1161fb7cd989c8109d1287783e7248f53b9058e9"}
Apr 17 16:31:35.300582 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.300356 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fh942" event={"ID":"bab37d78-677d-45dc-81ad-fba92f1bf0c6","Type":"ContainerStarted","Data":"2e3e1709a8d3e530ddb937b2ec2f8e4ba379394b4524db8301d05d2b53751a63"}
Apr 17 16:31:35.301786 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.301670 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b58dr" event={"ID":"908fb34a-55a4-4783-af03-7b4b7c408f98","Type":"ContainerStarted","Data":"2a6a25636aa26d57a4ba3cc1cb684f9614f202f8f9983ba07bab62b49f74d1c1"}
Apr 17 16:31:35.303004 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.302984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-svpg4" event={"ID":"757f13f2-9e57-4475-b1d7-97713e23ab42","Type":"ContainerStarted","Data":"dc0758813164194909b41ba6c74ca2070c28bd9e7f1e146da1e1472e6ca98599"}
Apr 17 16:31:35.304418 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.304395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" event={"ID":"91b78255-7a27-4463-8656-4713778fa480","Type":"ContainerStarted","Data":"8a22a7b6cf0dc85b7da68649d33e85ac0f4cad515f925fff5d9c495bc5df112e"}
Apr 17 16:31:35.307331 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.307315 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:31:35.307596 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.307579 2572 generic.go:358] "Generic (PLEG): container finished" podID="2a93066e-fe3b-416c-ab13-3098d92ffb5f" containerID="f9785836b9e819e28cc21732eba7d5746eebac6fea6b183786b683b69919c23d" exitCode=1
Apr 17 16:31:35.307669 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.307617 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"0fbb59c62ffe63777d43d875562e594c4eab7f971e7bc77ca214190557ceb428"}
Apr 17 16:31:35.307669 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.307636 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"30cef1935bfa42da28a41644ca70a1366631acbc6e0f531746035e8cc508a695"}
Apr 17 16:31:35.307669 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.307650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"a929ab1cf79aee7575029ea7fb5969dca34c744c00d97508c5984cea8525bc1a"}
Apr 17 16:31:35.307669 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.307664 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"1460574051a6df77f2572bfdb35938d15ddf9445ce1d1a9046b982c94269bedd"}
Apr 17 16:31:35.307858 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.307676 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerDied","Data":"f9785836b9e819e28cc21732eba7d5746eebac6fea6b183786b683b69919c23d"}
Apr 17 16:31:35.307858 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.307688 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"3f16d0d4e21bed8a74a49273ac5e81433dc0316c5f75ad43c5208a93662f7ef3"}
Apr 17 16:31:35.312027 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.311990 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2fb27" podStartSLOduration=3.45050753 podStartE2EDuration="20.311978582s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="2026-04-17 16:31:17.704743952 +0000 UTC m=+3.083416121" lastFinishedPulling="2026-04-17 16:31:34.566215018 +0000 UTC m=+19.944887173" observedRunningTime="2026-04-17 16:31:35.311833696 +0000 UTC m=+20.690505881" watchObservedRunningTime="2026-04-17 16:31:35.311978582 +0000 UTC m=+20.690650747"
Apr 17 16:31:35.312292 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.312268 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-44.ec2.internal" podStartSLOduration=19.31226208 podStartE2EDuration="19.31226208s" podCreationTimestamp="2026-04-17 16:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:20.298088826 +0000 UTC m=+5.676760999" watchObservedRunningTime="2026-04-17 16:31:35.31226208 +0000 UTC m=+20.690934254"
Apr 17 16:31:35.337824 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.337794 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fh942" podStartSLOduration=8.237076587 podStartE2EDuration="20.337784423s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="2026-04-17 16:31:17.703710937 +0000 UTC m=+3.082383091" lastFinishedPulling="2026-04-17 16:31:29.804418771 +0000 UTC m=+15.183090927" observedRunningTime="2026-04-17 16:31:35.32425745 +0000 UTC m=+20.702929627" watchObservedRunningTime="2026-04-17 16:31:35.337784423 +0000 UTC m=+20.716456597"
Apr 17 16:31:35.354890 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.354848 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-svpg4" podStartSLOduration=3.590270016 podStartE2EDuration="20.354835005s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="2026-04-17 16:31:17.708428267 +0000 UTC m=+3.087100421" lastFinishedPulling="2026-04-17 16:31:34.472993252 +0000 UTC m=+19.851665410" observedRunningTime="2026-04-17 16:31:35.337514956 +0000 UTC m=+20.716187131" watchObservedRunningTime="2026-04-17 16:31:35.354835005 +0000 UTC m=+20.733507181"
Apr 17 16:31:35.354994 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.354946 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b58dr" podStartSLOduration=3.766190488 podStartE2EDuration="20.354940384s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="2026-04-17 16:31:17.699515474 +0000 UTC m=+3.078187627" lastFinishedPulling="2026-04-17 16:31:34.288265369 +0000 UTC m=+19.666937523" observedRunningTime="2026-04-17 16:31:35.35455639 +0000 UTC m=+20.733228578" watchObservedRunningTime="2026-04-17 16:31:35.354940384 +0000 UTC m=+20.733612558"
Apr 17 16:31:35.581050 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:35.580917 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 16:31:36.132809 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:36.132713 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:35.581047326Z","UUID":"ac333e96-c5f5-4a8d-8523-443fb0d3af80","Handler":null,"Name":"","Endpoint":""}
Apr 17 16:31:36.135935 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:36.135903 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 16:31:36.135935 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:36.135938 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 16:31:36.187393 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:36.187369 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:36.187536 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:36.187472 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96"
Apr 17 16:31:36.313179 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:36.313145 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" event={"ID":"91b78255-7a27-4463-8656-4713778fa480","Type":"ContainerStarted","Data":"74f13e7b9727259b62523e0d1a622181ca2d59c92289fca1b7c87cbac0c00caf"}
Apr 17 16:31:36.314626 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:36.314519 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jl2wf" event={"ID":"9c85e109-cdf7-4611-80ae-1231d32d04b9","Type":"ContainerStarted","Data":"b957b6862439b3396f04612c163f46e491acf766838c723feca2f4e1468e43b0"}
Apr 17 16:31:37.187712 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.187680 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:37.187886 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:37.187794 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0"
Apr 17 16:31:37.317747 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.317671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" event={"ID":"91b78255-7a27-4463-8656-4713778fa480","Type":"ContainerStarted","Data":"855b4bbe76410185449b58091f700797501b487bfd53665ff642d2c55ebc27ea"}
Apr 17 16:31:37.320263 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.320245 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:31:37.320552 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.320535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"04c14df5d78a6edfa3fdecae70965b422d692880926d70d9dca38d8a00490dd0"}
Apr 17 16:31:37.345367 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.345324 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhvrd" podStartSLOduration=3.63408581 podStartE2EDuration="22.345310201s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="2026-04-17 16:31:17.708291153 +0000 UTC m=+3.086963317" lastFinishedPulling="2026-04-17 16:31:36.419515552 +0000 UTC m=+21.798187708"
observedRunningTime="2026-04-17 16:31:37.34499984 +0000 UTC m=+22.723672009" watchObservedRunningTime="2026-04-17 16:31:37.345310201 +0000 UTC m=+22.723982377" Apr 17 16:31:37.345589 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.345554 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jl2wf" podStartSLOduration=5.576908137 podStartE2EDuration="22.345548464s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="2026-04-17 16:31:17.702598205 +0000 UTC m=+3.081270373" lastFinishedPulling="2026-04-17 16:31:34.471238541 +0000 UTC m=+19.849910700" observedRunningTime="2026-04-17 16:31:36.32748086 +0000 UTC m=+21.706153038" watchObservedRunningTime="2026-04-17 16:31:37.345548464 +0000 UTC m=+22.724220638" Apr 17 16:31:37.538838 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.538810 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5v2lr"] Apr 17 16:31:37.544310 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.544290 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:37.544424 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:37.544358 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5v2lr" podUID="345c95f5-2f92-4ba5-8afd-6484fb524fad" Apr 17 16:31:37.641500 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.641474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/345c95f5-2f92-4ba5-8afd-6484fb524fad-dbus\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:37.641649 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.641518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/345c95f5-2f92-4ba5-8afd-6484fb524fad-kubelet-config\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:37.641649 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.641603 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:37.742814 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.742784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/345c95f5-2f92-4ba5-8afd-6484fb524fad-dbus\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:37.742952 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.742830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/345c95f5-2f92-4ba5-8afd-6484fb524fad-kubelet-config\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:37.742952 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.742878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:37.743058 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.742966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/345c95f5-2f92-4ba5-8afd-6484fb524fad-kubelet-config\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:37.743058 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:37.742975 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:37.743058 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:37.743032 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret podName:345c95f5-2f92-4ba5-8afd-6484fb524fad nodeName:}" failed. No retries permitted until 2026-04-17 16:31:38.243015773 +0000 UTC m=+23.621687939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret") pod "global-pull-secret-syncer-5v2lr" (UID: "345c95f5-2f92-4ba5-8afd-6484fb524fad") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:37.743228 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:37.743158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/345c95f5-2f92-4ba5-8afd-6484fb524fad-dbus\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:38.138672 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:38.138638 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:38.139317 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:38.139294 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:38.188390 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:38.188367 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:38.188531 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:38.188478 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96" Apr 17 16:31:38.246790 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:38.246758 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:38.246920 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:38.246873 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:38.247036 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:38.246932 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret podName:345c95f5-2f92-4ba5-8afd-6484fb524fad nodeName:}" failed. No retries permitted until 2026-04-17 16:31:39.246914742 +0000 UTC m=+24.625586910 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret") pod "global-pull-secret-syncer-5v2lr" (UID: "345c95f5-2f92-4ba5-8afd-6484fb524fad") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:39.187670 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:39.187641 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:39.188317 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:39.187646 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:39.188317 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:39.187773 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5v2lr" podUID="345c95f5-2f92-4ba5-8afd-6484fb524fad" Apr 17 16:31:39.188317 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:39.187839 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0" Apr 17 16:31:39.196746 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:39.196716 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:39.197486 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:39.197471 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fh942" Apr 17 16:31:39.256370 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:39.256333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:39.256506 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:39.256495 2572 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:39.256572 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:39.256560 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret podName:345c95f5-2f92-4ba5-8afd-6484fb524fad nodeName:}" failed. No retries permitted until 2026-04-17 16:31:41.256541899 +0000 UTC m=+26.635214056 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret") pod "global-pull-secret-syncer-5v2lr" (UID: "345c95f5-2f92-4ba5-8afd-6484fb524fad") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:40.187473 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.187286 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:40.187657 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:40.187561 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96" Apr 17 16:31:40.327671 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.327639 2572 generic.go:358] "Generic (PLEG): container finished" podID="665698fb-87f6-4ef0-a908-b1d2f13eb9d6" containerID="492ed08dbbd15f82e83a542a0b89cc5df7cd4c843fb4dca952ad14f78bfa9195" exitCode=0 Apr 17 16:31:40.328082 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.327679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gl5np" event={"ID":"665698fb-87f6-4ef0-a908-b1d2f13eb9d6","Type":"ContainerDied","Data":"492ed08dbbd15f82e83a542a0b89cc5df7cd4c843fb4dca952ad14f78bfa9195"} Apr 17 16:31:40.330754 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.330731 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log" Apr 17 16:31:40.331083 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.331062 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"f9633d3508968da73f431245930b30e244a4b8136f20c33a6975515629cb2f70"} Apr 17 16:31:40.331563 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.331517 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:40.331563 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.331541 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:40.331657 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.331644 2572 scope.go:117] "RemoveContainer" containerID="f9785836b9e819e28cc21732eba7d5746eebac6fea6b183786b683b69919c23d" Apr 17 16:31:40.346404 ip-10-0-132-44 kubenswrapper[2572]: I0417 
16:31:40.346386 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:40.346758 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:40.346745 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:41.188563 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.188338 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:41.188704 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:41.188682 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0" Apr 17 16:31:41.188962 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.188937 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:41.189113 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:41.189045 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5v2lr" podUID="345c95f5-2f92-4ba5-8afd-6484fb524fad" Apr 17 16:31:41.272044 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.272021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:41.272154 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:41.272141 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:41.272219 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:41.272204 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret podName:345c95f5-2f92-4ba5-8afd-6484fb524fad nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.272175862 +0000 UTC m=+30.650848015 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret") pod "global-pull-secret-syncer-5v2lr" (UID: "345c95f5-2f92-4ba5-8afd-6484fb524fad") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:41.334433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.334405 2572 generic.go:358] "Generic (PLEG): container finished" podID="665698fb-87f6-4ef0-a908-b1d2f13eb9d6" containerID="777ffee9e7a92acdd9b6ad88f188fea26e2261be1ccfcd300edd3fffa93e2ef6" exitCode=0 Apr 17 16:31:41.334881 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.334501 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gl5np" event={"ID":"665698fb-87f6-4ef0-a908-b1d2f13eb9d6","Type":"ContainerDied","Data":"777ffee9e7a92acdd9b6ad88f188fea26e2261be1ccfcd300edd3fffa93e2ef6"} Apr 17 16:31:41.337555 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.337531 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log" Apr 17 16:31:41.337847 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.337829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" event={"ID":"2a93066e-fe3b-416c-ab13-3098d92ffb5f","Type":"ContainerStarted","Data":"49d6b36c90b10da39266791bb342bbc4a3fb56e9d8ecd60d86d62908d35c37f3"} Apr 17 16:31:41.337945 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.337933 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:31:41.424344 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.424271 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" podStartSLOduration=9.62030549 podStartE2EDuration="26.424259045s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" 
firstStartedPulling="2026-04-17 16:31:17.707174334 +0000 UTC m=+3.085846487" lastFinishedPulling="2026-04-17 16:31:34.511127875 +0000 UTC m=+19.889800042" observedRunningTime="2026-04-17 16:31:41.423805991 +0000 UTC m=+26.802478176" watchObservedRunningTime="2026-04-17 16:31:41.424259045 +0000 UTC m=+26.802931219" Apr 17 16:31:41.469675 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.469648 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lt7mn"] Apr 17 16:31:41.469755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.469738 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:41.469861 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:41.469840 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0" Apr 17 16:31:41.475380 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.475362 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5v2lr"] Apr 17 16:31:41.475450 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.475420 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:41.475508 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:41.475490 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5v2lr" podUID="345c95f5-2f92-4ba5-8afd-6484fb524fad" Apr 17 16:31:41.476741 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.476719 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qch6k"] Apr 17 16:31:41.476821 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:41.476809 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:41.476898 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:41.476883 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96" Apr 17 16:31:42.341306 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:42.341231 2572 generic.go:358] "Generic (PLEG): container finished" podID="665698fb-87f6-4ef0-a908-b1d2f13eb9d6" containerID="4699867e9da4f1ee83480c81a04c5fe849f5c6191d8e03a943d591398c5eaf6c" exitCode=0 Apr 17 16:31:42.341615 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:42.341320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gl5np" event={"ID":"665698fb-87f6-4ef0-a908-b1d2f13eb9d6","Type":"ContainerDied","Data":"4699867e9da4f1ee83480c81a04c5fe849f5c6191d8e03a943d591398c5eaf6c"} Apr 17 16:31:42.341615 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:42.341562 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:31:43.188398 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:43.188359 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:43.188564 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:43.188470 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5v2lr" podUID="345c95f5-2f92-4ba5-8afd-6484fb524fad" Apr 17 16:31:43.188705 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:43.188678 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:43.188839 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:43.188732 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:43.188839 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:43.188821 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0" Apr 17 16:31:43.188933 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:43.188910 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96" Apr 17 16:31:45.188962 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.188924 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:45.189687 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:45.189013 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qch6k" podUID="d9099ff1-798e-434b-8980-189e358b2f96" Apr 17 16:31:45.189687 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.189029 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:45.189687 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.189154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:45.189687 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:45.189207 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lt7mn" podUID="3e538eeb-9985-4bbd-ae4b-d6ac1469dba0" Apr 17 16:31:45.189687 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:45.189245 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5v2lr" podUID="345c95f5-2f92-4ba5-8afd-6484fb524fad" Apr 17 16:31:45.305223 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.305125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:45.305359 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:45.305271 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:45.305359 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:45.305336 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret podName:345c95f5-2f92-4ba5-8afd-6484fb524fad nodeName:}" failed. No retries permitted until 2026-04-17 16:31:53.30531632 +0000 UTC m=+38.683988475 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret") pod "global-pull-secret-syncer-5v2lr" (UID: "345c95f5-2f92-4ba5-8afd-6484fb524fad") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:45.554654 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.554622 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rtbw5"] Apr 17 16:31:45.595464 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.595405 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.597985 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.597965 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 16:31:45.598107 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.597989 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-25cmm\"" Apr 17 16:31:45.599023 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.599007 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 16:31:45.708108 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.708077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-tmp-dir\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.708236 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.708131 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675h5\" (UniqueName: 
\"kubernetes.io/projected/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-kube-api-access-675h5\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.708287 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.708249 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-hosts-file\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.809314 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.809286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-tmp-dir\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.809434 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.809350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-675h5\" (UniqueName: \"kubernetes.io/projected/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-kube-api-access-675h5\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.809434 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.809408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-hosts-file\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.809577 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.809561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-hosts-file\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.809635 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.809594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-tmp-dir\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.819307 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.819285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-675h5\" (UniqueName: \"kubernetes.io/projected/33d7fffe-ef0b-495e-adb5-fbc82f11a1f0-kube-api-access-675h5\") pod \"node-resolver-rtbw5\" (UID: \"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0\") " pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.904722 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:45.904694 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rtbw5" Apr 17 16:31:45.913667 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:45.913630 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d7fffe_ef0b_495e_adb5_fbc82f11a1f0.slice/crio-1b35f6acc9f796b5e9f243554e2cf6004ad613d8c2b0074aec075e4147557528 WatchSource:0}: Error finding container 1b35f6acc9f796b5e9f243554e2cf6004ad613d8c2b0074aec075e4147557528: Status 404 returned error can't find the container with id 1b35f6acc9f796b5e9f243554e2cf6004ad613d8c2b0074aec075e4147557528 Apr 17 16:31:46.349019 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.348989 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rtbw5" event={"ID":"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0","Type":"ContainerStarted","Data":"9d132d14bbc133f70338249be02b14cc3f438870c8f5b2ec4c3e180eb2d94aa0"} Apr 17 16:31:46.349019 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.349023 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rtbw5" event={"ID":"33d7fffe-ef0b-495e-adb5-fbc82f11a1f0","Type":"ContainerStarted","Data":"1b35f6acc9f796b5e9f243554e2cf6004ad613d8c2b0074aec075e4147557528"} Apr 17 16:31:46.366431 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.366391 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rtbw5" podStartSLOduration=1.3663782119999999 podStartE2EDuration="1.366378212s" podCreationTimestamp="2026-04-17 16:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:46.365432571 +0000 UTC m=+31.744104747" watchObservedRunningTime="2026-04-17 16:31:46.366378212 +0000 UTC m=+31.745050380" Apr 17 16:31:46.464533 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.464469 2572 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-132-44.ec2.internal" event="NodeReady" Apr 17 16:31:46.464634 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.464580 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:31:46.499960 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.499938 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"] Apr 17 16:31:46.510636 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.510611 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"] Apr 17 16:31:46.510775 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.510763 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:31:46.513997 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.513971 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 16:31:46.514994 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.514974 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.515083 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.515055 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9pgds\"" Apr 17 16:31:46.516176 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.516158 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.516294 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.516161 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 16:31:46.521905 
ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.521886 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ftwtr"] Apr 17 16:31:46.522065 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.522047 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" Apr 17 16:31:46.526932 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.525861 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 16:31:46.526932 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.525877 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ng9td\"" Apr 17 16:31:46.526932 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.526622 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.527620 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.526653 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.527620 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.526766 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 16:31:46.537580 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.537427 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-fb6d49967-7zjnb"] Apr 17 16:31:46.537822 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.537800 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-ftwtr" Apr 17 16:31:46.540445 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.540427 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.540606 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.540591 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-ckhcv\"" Apr 17 16:31:46.540819 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.540799 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 16:31:46.540891 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.540806 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 16:31:46.541121 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.541105 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.547695 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.547675 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8c9df5d-9kzhz"] Apr 17 16:31:46.547885 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.547863 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.549398 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.549381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 16:31:46.551817 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.551119 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:31:46.551817 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.551130 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-psh7p\"" Apr 17 16:31:46.551817 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.551405 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:31:46.551817 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.551654 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:31:46.560422 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.560405 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:31:46.562183 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.562166 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv"] Apr 17 16:31:46.562285 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.562270 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:46.565474 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.565455 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.565613 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.565541 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 16:31:46.565613 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.565577 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xnps4\"" Apr 17 16:31:46.565811 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.565458 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 16:31:46.565920 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.565902 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 16:31:46.566080 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.566028 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 16:31:46.566080 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.566000 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.579539 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.579518 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq"] Apr 17 16:31:46.579673 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.579657 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" Apr 17 16:31:46.581969 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.581950 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 16:31:46.582234 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.582217 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.582279 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.582239 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.582792 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.582777 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 16:31:46.582880 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.582864 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-zlk9g\"" Apr 17 16:31:46.599771 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.599751 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"] Apr 17 16:31:46.599891 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.599878 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq" Apr 17 16:31:46.602605 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.602590 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.602683 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.602618 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zg6bm\"" Apr 17 16:31:46.602740 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.602699 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.616035 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.616015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjchd\" (UniqueName: \"kubernetes.io/projected/81ae48b9-3879-4953-b4f9-833feac79819-kube-api-access-qjchd\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" Apr 17 16:31:46.616116 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.616064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/418111de-59f5-4b93-bf45-150196b0de95-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:31:46.616157 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.616123 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rjh2\" (UniqueName: 
\"kubernetes.io/projected/418111de-59f5-4b93-bf45-150196b0de95-kube-api-access-7rjh2\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:31:46.616214 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.616178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81ae48b9-3879-4953-b4f9-833feac79819-serving-cert\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" Apr 17 16:31:46.616302 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.616285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:31:46.616364 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.616319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ae48b9-3879-4953-b4f9-833feac79819-config\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" Apr 17 16:31:46.618450 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.618434 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-t8bs6"] Apr 17 16:31:46.618546 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.618532 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" Apr 17 16:31:46.621114 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.621095 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.621260 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.621123 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-vzv4m\"" Apr 17 16:31:46.621390 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.621376 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 16:31:46.621489 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.621460 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.636651 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636635 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:46.636733 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636716 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"] Apr 17 16:31:46.636783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636746 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ftwtr"] Apr 17 16:31:46.636783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636759 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-fb6d49967-7zjnb"] Apr 17 16:31:46.636783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636770 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq"] Apr 17 16:31:46.636783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636772 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:31:46.636783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636781 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"] Apr 17 16:31:46.636951 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636791 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv"] Apr 17 16:31:46.636951 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636803 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-t8bs6"] Apr 17 16:31:46.636951 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636817 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wtckq"] Apr 17 16:31:46.636951 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.636842 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:31:46.639400 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.639381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.639484 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.639450 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.639484 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.639474 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 16:31:46.639586 
ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.639558 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 16:31:46.639800 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.639785 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qjmc6\"" Apr 17 16:31:46.645572 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.645554 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 16:31:46.649380 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.649364 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nnv8v" Apr 17 16:31:46.649443 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.649388 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-psgfk"] Apr 17 16:31:46.649443 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.649425 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:31:46.651798 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.651783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:31:46.651874 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.651813 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.651941 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.651931 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.652007 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.651953 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cnl7t\"" Apr 17 16:31:46.661331 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.661314 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-w25s5"] Apr 17 16:31:46.661452 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.661438 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:46.663787 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.663769 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:31:46.663899 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.663880 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:31:46.663967 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.663916 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xz9tf\"" Apr 17 16:31:46.675397 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.675372 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65"] Apr 17 16:31:46.681915 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.681897 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"] Apr 17 16:31:46.682020 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.682006 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:31:46.682236 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.682213 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65" Apr 17 16:31:46.684436 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.684414 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 16:31:46.684517 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.684494 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 16:31:46.684588 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.684525 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-cgkh6\"" Apr 17 16:31:46.684588 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.684538 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-pqtpb\"" Apr 17 16:31:46.684837 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.684821 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:31:46.684926 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.684827 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:31:46.689344 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.689305 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"] Apr 17 16:31:46.690071 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.689792 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"
Apr 17 16:31:46.692316 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.692297 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 16:31:46.692468 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.692449 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 16:31:46.692569 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.692550 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 16:31:46.692753 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.692660 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 16:31:46.695827 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.695809 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"]
Apr 17 16:31:46.696027 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.696009 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.698307 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.698280 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 16:31:46.698391 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.698321 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 16:31:46.698442 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.698398 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 16:31:46.698669 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.698656 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 16:31:46.704435 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.704417 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"]
Apr 17 16:31:46.704550 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.704538 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79f8c9df5d-9kzhz"]
Apr 17 16:31:46.704632 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.704623 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wtckq"]
Apr 17 16:31:46.704726 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.704715 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65"]
Apr 17 16:31:46.704827 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.704815 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-psgfk"]
Apr 17 16:31:46.704915 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.704904 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-w25s5"]
Apr 17 16:31:46.705010 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.704995 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"]
Apr 17 16:31:46.705010 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.705012 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"]
Apr 17 16:31:46.705139 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.705023 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"]
Apr 17 16:31:46.705139 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.704651 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"
Apr 17 16:31:46.707391 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.707374 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 17 16:31:46.707467 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.707380 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-r9x27\""
Apr 17 16:31:46.716635 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716585 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b3cf-4356-4853-b790-fdfd4d1b8d21-service-ca-bundle\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.716635 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:46.716790 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lmz\" (UniqueName: \"kubernetes.io/projected/1cd90798-a06c-4c4c-9c1a-45465cc231dd-kube-api-access-l8lmz\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"
Apr 17 16:31:46.716790 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716672 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4138b3cf-4356-4853-b790-fdfd4d1b8d21-snapshots\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.716790 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b3cf-4356-4853-b790-fdfd4d1b8d21-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.716790 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716751 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4138b3cf-4356-4853-b790-fdfd4d1b8d21-serving-cert\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.716790 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.717021 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716836 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4050d306-2166-4792-b643-4e16417bd406-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv"
Apr 17 16:31:46.717021 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"
Apr 17 16:31:46.717021 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.716996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ae48b9-3879-4953-b4f9-833feac79819-config\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"
Apr 17 16:31:46.717162 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.717021 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 16:31:46.717162 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717033 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:46.717162 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-ca-trust-extracted\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.717162 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.717086 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls podName:418111de-59f5-4b93-bf45-150196b0de95 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.217068458 +0000 UTC m=+32.595740613 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zdpww" (UID: "418111de-59f5-4b93-bf45-150196b0de95") : secret "cluster-monitoring-operator-tls" not found
Apr 17 16:31:46.717162 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/418111de-59f5-4b93-bf45-150196b0de95-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"
Apr 17 16:31:46.717162 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-certificates\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.717459 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717183 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rjh2\" (UniqueName: \"kubernetes.io/projected/418111de-59f5-4b93-bf45-150196b0de95-kube-api-access-7rjh2\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"
Apr 17 16:31:46.717459 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717422 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4138b3cf-4356-4853-b790-fdfd4d1b8d21-tmp\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.717518 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"
Apr 17 16:31:46.717518 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81ae48b9-3879-4953-b4f9-833feac79819-serving-cert\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"
Apr 17 16:31:46.717580 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhw9\" (UniqueName: \"kubernetes.io/projected/19d7e13d-e66e-48ec-b132-f6e07a38ea96-kube-api-access-rdhw9\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:46.717580 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717547 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m64xh\" (UniqueName: \"kubernetes.io/projected/4050d306-2166-4792-b643-4e16417bd406-kube-api-access-m64xh\") pod \"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv"
Apr 17 16:31:46.717671 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-stats-auth\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:46.717671 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ae48b9-3879-4953-b4f9-833feac79819-config\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"
Apr 17 16:31:46.717671 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-bound-sa-token\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.717671 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-default-certificate\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:46.717843 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-image-registry-private-configuration\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.717843 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717750 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4050d306-2166-4792-b643-4e16417bd406-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv"
Apr 17 16:31:46.717843 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-trusted-ca\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.717843 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjchd\" (UniqueName: \"kubernetes.io/projected/81ae48b9-3879-4953-b4f9-833feac79819-kube-api-access-qjchd\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"
Apr 17 16:31:46.717843 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717835 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lbs4\" (UniqueName: \"kubernetes.io/projected/4138b3cf-4356-4853-b790-fdfd4d1b8d21-kube-api-access-5lbs4\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.718034 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmm8j\" (UniqueName: \"kubernetes.io/projected/c7871d70-1e63-496f-835f-a447b5d1800c-kube-api-access-nmm8j\") pod \"volume-data-source-validator-7c6cbb6c87-mkhdq\" (UID: \"c7871d70-1e63-496f-835f-a447b5d1800c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq"
Apr 17 16:31:46.718034 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717870 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24th\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-kube-api-access-j24th\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.718034 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-installation-pull-secrets\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.718034 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.717890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/418111de-59f5-4b93-bf45-150196b0de95-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"
Apr 17 16:31:46.720949 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.720930 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81ae48b9-3879-4953-b4f9-833feac79819-serving-cert\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"
Apr 17 16:31:46.727520 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.727501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rjh2\" (UniqueName: \"kubernetes.io/projected/418111de-59f5-4b93-bf45-150196b0de95-kube-api-access-7rjh2\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"
Apr 17 16:31:46.728326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.728303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjchd\" (UniqueName: \"kubernetes.io/projected/81ae48b9-3879-4953-b4f9-833feac79819-kube-api-access-qjchd\") pod \"service-ca-operator-d6fc45fc5-78xv2\" (UID: \"81ae48b9-3879-4953-b4f9-833feac79819\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"
Apr 17 16:31:46.818397 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j24th\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-kube-api-access-j24th\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.818582 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818415 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.818582 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-installation-pull-secrets\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.818582 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8e1510f-1cb7-4361-bb57-1944dc90fae3-trusted-ca\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6"
Apr 17 16:31:46.818760 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-tmp\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"
Apr 17 16:31:46.818760 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4jp\" (UniqueName: \"kubernetes.io/projected/5f749b09-28ed-4c6f-a64f-2df1f97857d6-kube-api-access-hl4jp\") pod \"network-check-source-8894fc9bd-nlj65\" (UID: \"5f749b09-28ed-4c6f-a64f-2df1f97857d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65"
Apr 17 16:31:46.818760 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kx4v\" (UniqueName: \"kubernetes.io/projected/684a0c83-c816-4304-bc54-42eabb96cf54-kube-api-access-9kx4v\") pod \"managed-serviceaccount-addon-agent-86d88845b5-l6pz4\" (UID: \"684a0c83-c816-4304-bc54-42eabb96cf54\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"
Apr 17 16:31:46.818886 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b3cf-4356-4853-b790-fdfd4d1b8d21-service-ca-bundle\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.818886 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818820 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-ca\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.818886 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:46.818886 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lmz\" (UniqueName: \"kubernetes.io/projected/1cd90798-a06c-4c4c-9c1a-45465cc231dd-kube-api-access-l8lmz\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"
Apr 17 16:31:46.819059 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4138b3cf-4356-4853-b790-fdfd4d1b8d21-snapshots\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.819059 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.818920 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b3cf-4356-4853-b790-fdfd4d1b8d21-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.819059 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.819001 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:31:46.819059 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.819051 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.319033575 +0000 UTC m=+32.697705742 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : secret "router-metrics-certs-default" not found
Apr 17 16:31:46.819648 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.819605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4138b3cf-4356-4853-b790-fdfd4d1b8d21-serving-cert\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.819739 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.819668 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.819739 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.819701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/684a0c83-c816-4304-bc54-42eabb96cf54-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86d88845b5-l6pz4\" (UID: \"684a0c83-c816-4304-bc54-42eabb96cf54\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"
Apr 17 16:31:46.819838 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.819747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4050d306-2166-4792-b643-4e16417bd406-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv"
Apr 17 16:31:46.819838 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.819798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4138b3cf-4356-4853-b790-fdfd4d1b8d21-snapshots\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.819935 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.819844 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:31:46.819935 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.819857 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fb6d49967-7zjnb: secret "image-registry-tls" not found
Apr 17 16:31:46.819935 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.819909 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls podName:b6f18147-3b7c-4098-b765-2b71bf2dc0f9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.31989321 +0000 UTC m=+32.698565370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls") pod "image-registry-fb6d49967-7zjnb" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9") : secret "image-registry-tls" not found
Apr 17 16:31:46.820086 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.819948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njm86\" (UniqueName: \"kubernetes.io/projected/c8e1510f-1cb7-4361-bb57-1944dc90fae3-kube-api-access-njm86\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6"
Apr 17 16:31:46.820086 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.819982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:46.820086 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-ca-trust-extracted\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.820086 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b3cf-4356-4853-b790-fdfd4d1b8d21-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.820086 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820033 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5"
Apr 17 16:31:46.820350 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-certificates\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:46.820350 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.820110 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.320095556 +0000 UTC m=+32.698767736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : configmap references non-existent config key: service-ca.crt
Apr 17 16:31:46.820350 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng7lh\" (UniqueName: \"kubernetes.io/projected/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-kube-api-access-ng7lh\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.820350 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4138b3cf-4356-4853-b790-fdfd4d1b8d21-tmp\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr"
Apr 17 16:31:46.820350 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"
Apr 17 16:31:46.820350 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820298 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.820350 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxrq\" (UniqueName: \"kubernetes.io/projected/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-kube-api-access-lwxrq\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"
Apr 17 16:31:46.820350 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f91565e7-f3a4-4471-a2b5-0fcd0292176b-tmp-dir\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk"
Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhw9\" (UniqueName: \"kubernetes.io/projected/19d7e13d-e66e-48ec-b132-f6e07a38ea96-kube-api-access-rdhw9\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m64xh\" (UniqueName: \"kubernetes.io/projected/4050d306-2166-4792-b643-4e16417bd406-kube-api-access-m64xh\") pod \"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-stats-auth\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4050d306-2166-4792-b643-4e16417bd406-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-bound-sa-token\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-ca-trust-extracted\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.820549 2572 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4138b3cf-4356-4853-b790-fdfd4d1b8d21-tmp\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr" Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.820696 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls podName:1cd90798-a06c-4c4c-9c1a-45465cc231dd nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.320680336 +0000 UTC m=+32.699352504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-chmjc" (UID: "1cd90798-a06c-4c4c-9c1a-45465cc231dd") : secret "samples-operator-tls" not found Apr 17 16:31:46.820755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e1510f-1cb7-4361-bb57-1944dc90fae3-serving-cert\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-default-certificate\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: 
\"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-hub\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hckgl\" (UniqueName: \"kubernetes.io/projected/f91565e7-f3a4-4471-a2b5-0fcd0292176b-kube-api-access-hckgl\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-image-registry-private-configuration\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820948 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.820975 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-klusterlet-config\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e1510f-1cb7-4361-bb57-1944dc90fae3-config\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fct8\" (UniqueName: \"kubernetes.io/projected/ff048557-6618-44a8-9b98-aa0325746b04-kube-api-access-7fct8\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4050d306-2166-4792-b643-4e16417bd406-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-trusted-ca\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/289e73e3-2dca-4a41-b5bf-d6148102c16a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:31:46.821225 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f91565e7-f3a4-4471-a2b5-0fcd0292176b-config-volume\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:46.821852 
ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lbs4\" (UniqueName: \"kubernetes.io/projected/4138b3cf-4356-4853-b790-fdfd4d1b8d21-kube-api-access-5lbs4\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr" Apr 17 16:31:46.821852 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmm8j\" (UniqueName: \"kubernetes.io/projected/c7871d70-1e63-496f-835f-a447b5d1800c-kube-api-access-nmm8j\") pod \"volume-data-source-validator-7c6cbb6c87-mkhdq\" (UID: \"c7871d70-1e63-496f-835f-a447b5d1800c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq" Apr 17 16:31:46.821852 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.821580 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-certificates\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.822542 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.822479 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-trusted-ca\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.822542 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.822484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4138b3cf-4356-4853-b790-fdfd4d1b8d21-serving-cert\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr" Apr 17 16:31:46.823341 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.823321 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-installation-pull-secrets\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.823660 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.823639 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-stats-auth\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:46.823868 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.823836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-default-certificate\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:46.824063 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.824045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4050d306-2166-4792-b643-4e16417bd406-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" Apr 17 16:31:46.824973 
ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.824934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-image-registry-private-configuration\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.826410 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.826385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b3cf-4356-4853-b790-fdfd4d1b8d21-service-ca-bundle\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " pod="openshift-insights/insights-operator-585dfdc468-ftwtr" Apr 17 16:31:46.828623 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.828580 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-bound-sa-token\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.828957 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.828722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j24th\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-kube-api-access-j24th\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:46.829268 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.829231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m64xh\" (UniqueName: \"kubernetes.io/projected/4050d306-2166-4792-b643-4e16417bd406-kube-api-access-m64xh\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-dnxcv\" (UID: \"4050d306-2166-4792-b643-4e16417bd406\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" Apr 17 16:31:46.829495 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.829478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhw9\" (UniqueName: \"kubernetes.io/projected/19d7e13d-e66e-48ec-b132-f6e07a38ea96-kube-api-access-rdhw9\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:46.829584 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.829523 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lmz\" (UniqueName: \"kubernetes.io/projected/1cd90798-a06c-4c4c-9c1a-45465cc231dd-kube-api-access-l8lmz\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" Apr 17 16:31:46.829639 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.829608 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmm8j\" (UniqueName: \"kubernetes.io/projected/c7871d70-1e63-496f-835f-a447b5d1800c-kube-api-access-nmm8j\") pod \"volume-data-source-validator-7c6cbb6c87-mkhdq\" (UID: \"c7871d70-1e63-496f-835f-a447b5d1800c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq" Apr 17 16:31:46.831085 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.831065 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lbs4\" (UniqueName: \"kubernetes.io/projected/4138b3cf-4356-4853-b790-fdfd4d1b8d21-kube-api-access-5lbs4\") pod \"insights-operator-585dfdc468-ftwtr\" (UID: \"4138b3cf-4356-4853-b790-fdfd4d1b8d21\") " 
pod="openshift-insights/insights-operator-585dfdc468-ftwtr" Apr 17 16:31:46.834282 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.834262 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" Apr 17 16:31:46.848080 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.848055 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-ftwtr" Apr 17 16:31:46.888503 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.888083 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" Apr 17 16:31:46.908425 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.907990 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.922771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e1510f-1cb7-4361-bb57-1944dc90fae3-serving-cert\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.922816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-hub\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.922846 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hckgl\" (UniqueName: \"kubernetes.io/projected/f91565e7-f3a4-4471-a2b5-0fcd0292176b-kube-api-access-hckgl\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.922878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.922903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.922932 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-klusterlet-config\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.922976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e1510f-1cb7-4361-bb57-1944dc90fae3-config\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.923002 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fct8\" (UniqueName: \"kubernetes.io/projected/ff048557-6618-44a8-9b98-aa0325746b04-kube-api-access-7fct8\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.923027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.923051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/289e73e3-2dca-4a41-b5bf-d6148102c16a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.923074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f91565e7-f3a4-4471-a2b5-0fcd0292176b-config-volume\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.923126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-service-proxy-server-cert\") pod 
\"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" Apr 17 16:31:46.923237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.923153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8e1510f-1cb7-4361-bb57-1944dc90fae3-trusted-ca\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:31:46.928407 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.923178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-tmp\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv" Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.928787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4jp\" (UniqueName: \"kubernetes.io/projected/5f749b09-28ed-4c6f-a64f-2df1f97857d6-kube-api-access-hl4jp\") pod \"network-check-source-8894fc9bd-nlj65\" (UID: \"5f749b09-28ed-4c6f-a64f-2df1f97857d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65" Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.928831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kx4v\" (UniqueName: \"kubernetes.io/projected/684a0c83-c816-4304-bc54-42eabb96cf54-kube-api-access-9kx4v\") pod \"managed-serviceaccount-addon-agent-86d88845b5-l6pz4\" (UID: \"684a0c83-c816-4304-bc54-42eabb96cf54\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4" Apr 17 
16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.928861 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-ca\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.928860 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-tmp\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"
Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.927461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8e1510f-1cb7-4361-bb57-1944dc90fae3-trusted-ca\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6"
Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.928959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/684a0c83-c816-4304-bc54-42eabb96cf54-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86d88845b5-l6pz4\" (UID: \"684a0c83-c816-4304-bc54-42eabb96cf54\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"
Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.929016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njm86\" (UniqueName: \"kubernetes.io/projected/c8e1510f-1cb7-4361-bb57-1944dc90fae3-kube-api-access-njm86\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6"
Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.929065 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5"
Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.929102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7lh\" (UniqueName: \"kubernetes.io/projected/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-kube-api-access-ng7lh\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.929177 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.929156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.929712 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.929229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxrq\" (UniqueName: \"kubernetes.io/projected/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-kube-api-access-lwxrq\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"
Apr 17 16:31:46.929712 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.929260 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f91565e7-f3a4-4471-a2b5-0fcd0292176b-tmp-dir\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk"
Apr 17 16:31:46.930833 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.925875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/289e73e3-2dca-4a41-b5bf-d6148102c16a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5"
Apr 17 16:31:46.931971 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.928738 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:46.931971 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.931048 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls podName:f91565e7-f3a4-4471-a2b5-0fcd0292176b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.431023544 +0000 UTC m=+32.809695709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls") pod "dns-default-psgfk" (UID: "f91565e7-f3a4-4471-a2b5-0fcd0292176b") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:46.931971 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.931089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e1510f-1cb7-4361-bb57-1944dc90fae3-config\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6"
Apr 17 16:31:46.931971 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.929901 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:46.931971 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.931168 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:31:46.931971 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.931185 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert podName:ff048557-6618-44a8-9b98-aa0325746b04 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.431171087 +0000 UTC m=+32.809843243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert") pod "ingress-canary-wtckq" (UID: "ff048557-6618-44a8-9b98-aa0325746b04") : secret "canary-serving-cert" not found
Apr 17 16:31:46.931971 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:46.931237 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert podName:289e73e3-2dca-4a41-b5bf-d6148102c16a nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.431220069 +0000 UTC m=+32.809892234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-w25s5" (UID: "289e73e3-2dca-4a41-b5bf-d6148102c16a") : secret "networking-console-plugin-cert" not found
Apr 17 16:31:46.931971 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.931915 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.933868 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.933839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f91565e7-f3a4-4471-a2b5-0fcd0292176b-tmp-dir\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk"
Apr 17 16:31:46.934723 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.934688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f91565e7-f3a4-4471-a2b5-0fcd0292176b-config-volume\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk"
Apr 17 16:31:46.940955 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.939727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fct8\" (UniqueName: \"kubernetes.io/projected/ff048557-6618-44a8-9b98-aa0325746b04-kube-api-access-7fct8\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq"
Apr 17 16:31:46.940955 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.940320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4jp\" (UniqueName: \"kubernetes.io/projected/5f749b09-28ed-4c6f-a64f-2df1f97857d6-kube-api-access-hl4jp\") pod \"network-check-source-8894fc9bd-nlj65\" (UID: \"5f749b09-28ed-4c6f-a64f-2df1f97857d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65"
Apr 17 16:31:46.940955 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.940626 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e1510f-1cb7-4361-bb57-1944dc90fae3-serving-cert\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6"
Apr 17 16:31:46.941841 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.941791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng7lh\" (UniqueName: \"kubernetes.io/projected/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-kube-api-access-ng7lh\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.942591 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.942541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kx4v\" (UniqueName: \"kubernetes.io/projected/684a0c83-c816-4304-bc54-42eabb96cf54-kube-api-access-9kx4v\") pod \"managed-serviceaccount-addon-agent-86d88845b5-l6pz4\" (UID: \"684a0c83-c816-4304-bc54-42eabb96cf54\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"
Apr 17 16:31:46.943316 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.943263 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxrq\" (UniqueName: \"kubernetes.io/projected/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-kube-api-access-lwxrq\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"
Apr 17 16:31:46.944038 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.943973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njm86\" (UniqueName: \"kubernetes.io/projected/c8e1510f-1cb7-4361-bb57-1944dc90fae3-kube-api-access-njm86\") pod \"console-operator-9d4b6777b-t8bs6\" (UID: \"c8e1510f-1cb7-4361-bb57-1944dc90fae3\") " pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6"
Apr 17 16:31:46.950605 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.948490 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6"
Apr 17 16:31:46.950605 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.949477 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/684a0c83-c816-4304-bc54-42eabb96cf54-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86d88845b5-l6pz4\" (UID: \"684a0c83-c816-4304-bc54-42eabb96cf54\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"
Apr 17 16:31:46.954713 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.954353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-ca\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.955632 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.954868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.955632 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.955425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.955632 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.955434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c83fe77d-ef92-4eec-bda5-e4a74cd955d3-klusterlet-config\") pod \"klusterlet-addon-workmgr-74bdc8ccc4-vwkwv\" (UID: \"c83fe77d-ef92-4eec-bda5-e4a74cd955d3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"
Apr 17 16:31:46.955951 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.955869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cbfcaa7c-7f0a-4a57-96cc-78428bb81916-hub\") pod \"cluster-proxy-proxy-agent-54fd8bd9fb-srkv5\" (UID: \"cbfcaa7c-7f0a-4a57-96cc-78428bb81916\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:46.956730 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:46.956542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hckgl\" (UniqueName: \"kubernetes.io/projected/f91565e7-f3a4-4471-a2b5-0fcd0292176b-kube-api-access-hckgl\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk"
Apr 17 16:31:47.000818 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.000357 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65"
Apr 17 16:31:47.009727 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.009326 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"
Apr 17 16:31:47.027270 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.026932 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"
Apr 17 16:31:47.039092 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.038742 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"
Apr 17 16:31:47.060095 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.057821 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ftwtr"]
Apr 17 16:31:47.060095 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.058081 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2"]
Apr 17 16:31:47.069380 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:47.069337 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81ae48b9_3879_4953_b4f9_833feac79819.slice/crio-767f4ce1fca8599916216b11f2e85585d40e0803c9872d9f3f7ec5e33ed10172 WatchSource:0}: Error finding container 767f4ce1fca8599916216b11f2e85585d40e0803c9872d9f3f7ec5e33ed10172: Status 404 returned error can't find the container with id 767f4ce1fca8599916216b11f2e85585d40e0803c9872d9f3f7ec5e33ed10172
Apr 17 16:31:47.099795 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.099749 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv"]
Apr 17 16:31:47.151611 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.151553 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq"]
Apr 17 16:31:47.190307 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.188713 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5v2lr"
Apr 17 16:31:47.190307 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.189209 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k"
Apr 17 16:31:47.190307 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.189659 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn"
Apr 17 16:31:47.195380 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.193896 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fwf7s\""
Apr 17 16:31:47.195380 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.194043 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:31:47.195380 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.194070 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6m66z\""
Apr 17 16:31:47.195380 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.194368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 16:31:47.199505 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.199458 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-t8bs6"]
Apr 17 16:31:47.210824 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:47.209390 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8e1510f_1cb7_4361_bb57_1944dc90fae3.slice/crio-26173b07e83ca4fec79ca854fd660a46fb3cbdbef1d50714c33842d3fd89cf2f WatchSource:0}: Error finding container 26173b07e83ca4fec79ca854fd660a46fb3cbdbef1d50714c33842d3fd89cf2f: Status 404 returned error can't find the container with id 26173b07e83ca4fec79ca854fd660a46fb3cbdbef1d50714c33842d3fd89cf2f
Apr 17 16:31:47.219026 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.218953 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65"]
Apr 17 16:31:47.224456 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.224396 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv"]
Apr 17 16:31:47.241621 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.241571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"
Apr 17 16:31:47.241880 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.241861 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 16:31:47.241969 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.241929 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls podName:418111de-59f5-4b93-bf45-150196b0de95 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.241909554 +0000 UTC m=+33.620581708 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zdpww" (UID: "418111de-59f5-4b93-bf45-150196b0de95") : secret "cluster-monitoring-operator-tls" not found
Apr 17 16:31:47.253282 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.253261 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5"]
Apr 17 16:31:47.255492 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:47.255457 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbfcaa7c_7f0a_4a57_96cc_78428bb81916.slice/crio-1bacb5adc7e04406a2eed767337d3ae95348242e276be9664bbd4168da1bf5b7 WatchSource:0}: Error finding container 1bacb5adc7e04406a2eed767337d3ae95348242e276be9664bbd4168da1bf5b7: Status 404 returned error can't find the container with id 1bacb5adc7e04406a2eed767337d3ae95348242e276be9664bbd4168da1bf5b7
Apr 17 16:31:47.263508 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.263486 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4"]
Apr 17 16:31:47.266362 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:31:47.266336 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod684a0c83_c816_4304_bc54_42eabb96cf54.slice/crio-4ebe570761215167197437d9c529dc44162dec4929b5d50f5fca4052507b3523 WatchSource:0}: Error finding container 4ebe570761215167197437d9c529dc44162dec4929b5d50f5fca4052507b3523: Status 404 returned error can't find the container with id 4ebe570761215167197437d9c529dc44162dec4929b5d50f5fca4052507b3523
Apr 17 16:31:47.342355 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.342329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:47.342453 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.342409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:47.342508 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.342465 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:31:47.342508 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.342487 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fb6d49967-7zjnb: secret "image-registry-tls" not found
Apr 17 16:31:47.342613 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.342538 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls podName:b6f18147-3b7c-4098-b765-2b71bf2dc0f9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.342520357 +0000 UTC m=+33.721192524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls") pod "image-registry-fb6d49967-7zjnb" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9") : secret "image-registry-tls" not found
Apr 17 16:31:47.342613 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.342547 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 16:31:47.342613 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.342465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"
Apr 17 16:31:47.342765 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.342675 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.342658537 +0000 UTC m=+33.721330696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : configmap references non-existent config key: service-ca.crt
Apr 17 16:31:47.342765 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.342740 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls podName:1cd90798-a06c-4c4c-9c1a-45465cc231dd nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.342728544 +0000 UTC m=+33.721400723 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-chmjc" (UID: "1cd90798-a06c-4c4c-9c1a-45465cc231dd") : secret "samples-operator-tls" not found
Apr 17 16:31:47.342765 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.342739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:47.342919 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.342798 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:31:47.342919 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.342833 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.342824722 +0000 UTC m=+33.721496875 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : secret "router-metrics-certs-default" not found
Apr 17 16:31:47.351732 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.351706 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ftwtr" event={"ID":"4138b3cf-4356-4853-b790-fdfd4d1b8d21","Type":"ContainerStarted","Data":"26d2016e25aecb51dd2b7a8201b7b1b5ac682d8235e1760bafe80fc5f81a3505"}
Apr 17 16:31:47.352880 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.352859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerStarted","Data":"1bacb5adc7e04406a2eed767337d3ae95348242e276be9664bbd4168da1bf5b7"}
Apr 17 16:31:47.354028 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.354004 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" event={"ID":"c8e1510f-1cb7-4361-bb57-1944dc90fae3","Type":"ContainerStarted","Data":"26173b07e83ca4fec79ca854fd660a46fb3cbdbef1d50714c33842d3fd89cf2f"}
Apr 17 16:31:47.355232 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.355205 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" event={"ID":"4050d306-2166-4792-b643-4e16417bd406","Type":"ContainerStarted","Data":"4897d364b7959c80adc5bb67e8ee359b59eae9228996ae2d47c4954961c95c5e"}
Apr 17 16:31:47.356234 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.356212 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" event={"ID":"81ae48b9-3879-4953-b4f9-833feac79819","Type":"ContainerStarted","Data":"767f4ce1fca8599916216b11f2e85585d40e0803c9872d9f3f7ec5e33ed10172"}
Apr 17 16:31:47.357306 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.357273 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4" event={"ID":"684a0c83-c816-4304-bc54-42eabb96cf54","Type":"ContainerStarted","Data":"4ebe570761215167197437d9c529dc44162dec4929b5d50f5fca4052507b3523"}
Apr 17 16:31:47.358295 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.358268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv" event={"ID":"c83fe77d-ef92-4eec-bda5-e4a74cd955d3","Type":"ContainerStarted","Data":"b9b0632638003b13661f847e6c17c8705528440fc4271779c81918bf208d9844"}
Apr 17 16:31:47.359373 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.359347 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq" event={"ID":"c7871d70-1e63-496f-835f-a447b5d1800c","Type":"ContainerStarted","Data":"474957d6e82bc01aa6f86e652985387d04db65f4f47316ecdd87e26232208140"}
Apr 17 16:31:47.360426 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.360401 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65" event={"ID":"5f749b09-28ed-4c6f-a64f-2df1f97857d6","Type":"ContainerStarted","Data":"e1121f6e3d6d72717982057e995b55d62d29750caed79daea695b36813348f66"}
Apr 17 16:31:47.444166 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.444131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq"
Apr 17 16:31:47.444346 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.444209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk"
Apr 17 16:31:47.444346 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.444308 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:47.444459 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.444372 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert podName:ff048557-6618-44a8-9b98-aa0325746b04 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.444351683 +0000 UTC m=+33.823023836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert") pod "ingress-canary-wtckq" (UID: "ff048557-6618-44a8-9b98-aa0325746b04") : secret "canary-serving-cert" not found
Apr 17 16:31:47.444459 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:47.444417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5"
Apr 17 16:31:47.444603 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.444588 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:47.444669 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.444639 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls podName:f91565e7-f3a4-4471-a2b5-0fcd0292176b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.444623476 +0000 UTC m=+33.823295633 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls") pod "dns-default-psgfk" (UID: "f91565e7-f3a4-4471-a2b5-0fcd0292176b") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:47.444761 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.444736 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:31:47.444817 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:47.444788 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert podName:289e73e3-2dca-4a41-b5bf-d6148102c16a nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.444774157 +0000 UTC m=+33.823446311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-w25s5" (UID: "289e73e3-2dca-4a41-b5bf-d6148102c16a") : secret "networking-console-plugin-cert" not found
Apr 17 16:31:48.253777 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.253220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"
Apr 17 16:31:48.253777 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.253495 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 16:31:48.253777 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.253559 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls podName:418111de-59f5-4b93-bf45-150196b0de95 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.253539893 +0000 UTC m=+35.632212053 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zdpww" (UID: "418111de-59f5-4b93-bf45-150196b0de95") : secret "cluster-monitoring-operator-tls" not found
Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.354236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.354291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb"
Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.354355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz"
Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.354409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"
Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.354620 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.354683 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls podName:1cd90798-a06c-4c4c-9c1a-45465cc231dd nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.354662751 +0000 UTC m=+35.733334912 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-chmjc" (UID: "1cd90798-a06c-4c4c-9c1a-45465cc231dd") : secret "samples-operator-tls" not found Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.355033 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.355048 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fb6d49967-7zjnb: secret "image-registry-tls" not found Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.355090 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls podName:b6f18147-3b7c-4098-b765-2b71bf2dc0f9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.35507644 +0000 UTC m=+35.733748598 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls") pod "image-registry-fb6d49967-7zjnb" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9") : secret "image-registry-tls" not found Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.355144 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.355180 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:50.35516769 +0000 UTC m=+35.733839863 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : secret "router-metrics-certs-default" not found Apr 17 16:31:48.355292 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.355262 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.35525173 +0000 UTC m=+35.733923885 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : configmap references non-existent config key: service-ca.crt Apr 17 16:31:48.455770 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.454903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:31:48.455770 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.454970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:48.455770 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.455104 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:31:48.455770 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.455266 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:31:48.455770 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.455338 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert podName:289e73e3-2dca-4a41-b5bf-d6148102c16a nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.455317589 +0000 UTC m=+35.833989745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-w25s5" (UID: "289e73e3-2dca-4a41-b5bf-d6148102c16a") : secret "networking-console-plugin-cert" not found Apr 17 16:31:48.455770 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.455732 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:48.455770 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.455778 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert podName:ff048557-6618-44a8-9b98-aa0325746b04 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.455763725 +0000 UTC m=+35.834435891 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert") pod "ingress-canary-wtckq" (UID: "ff048557-6618-44a8-9b98-aa0325746b04") : secret "canary-serving-cert" not found Apr 17 16:31:48.456349 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.455834 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:48.456349 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.455867 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls podName:f91565e7-f3a4-4471-a2b5-0fcd0292176b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:50.455856131 +0000 UTC m=+35.834528297 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls") pod "dns-default-psgfk" (UID: "f91565e7-f3a4-4471-a2b5-0fcd0292176b") : secret "dns-default-metrics-tls" not found Apr 17 16:31:48.861697 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.861658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:31:48.862268 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.861955 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:31:48.862268 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:48.862022 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs podName:3e538eeb-9985-4bbd-ae4b-d6ac1469dba0 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:20.862002591 +0000 UTC m=+66.240674761 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs") pod "network-metrics-daemon-lt7mn" (UID: "3e538eeb-9985-4bbd-ae4b-d6ac1469dba0") : secret "metrics-daemon-secret" not found Apr 17 16:31:48.962541 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.962506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:48.991941 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:48.991914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb9qp\" (UniqueName: \"kubernetes.io/projected/d9099ff1-798e-434b-8980-189e358b2f96-kube-api-access-wb9qp\") pod \"network-check-target-qch6k\" (UID: \"d9099ff1-798e-434b-8980-189e358b2f96\") " pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:49.030403 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:49.029927 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:31:50.275664 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:50.275630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:31:50.276036 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.275786 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:31:50.276036 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.275862 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls podName:418111de-59f5-4b93-bf45-150196b0de95 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.275846501 +0000 UTC m=+39.654518659 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zdpww" (UID: "418111de-59f5-4b93-bf45-150196b0de95") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:31:50.376509 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:50.376474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:50.376690 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:50.376562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:50.376690 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:50.376628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" Apr 17 16:31:50.376690 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.376631 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:50.376690 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.376652 2572 projected.go:194] Error preparing data for projected 
volume registry-tls for pod openshift-image-registry/image-registry-fb6d49967-7zjnb: secret "image-registry-tls" not found Apr 17 16:31:50.376845 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.376708 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls podName:b6f18147-3b7c-4098-b765-2b71bf2dc0f9 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.376693209 +0000 UTC m=+39.755365361 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls") pod "image-registry-fb6d49967-7zjnb" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9") : secret "image-registry-tls" not found Apr 17 16:31:50.376845 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.376733 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.376713936 +0000 UTC m=+39.755386106 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : configmap references non-existent config key: service-ca.crt Apr 17 16:31:50.376845 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.376761 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:31:50.376845 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.376809 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls podName:1cd90798-a06c-4c4c-9c1a-45465cc231dd nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:54.376799571 +0000 UTC m=+39.755471724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-chmjc" (UID: "1cd90798-a06c-4c4c-9c1a-45465cc231dd") : secret "samples-operator-tls" not found Apr 17 16:31:50.376845 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:50.376837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:50.377093 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.376945 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:31:50.377093 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.376985 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.37697456 +0000 UTC m=+39.755646713 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : secret "router-metrics-certs-default" not found Apr 17 16:31:50.477731 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:50.477702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:31:50.477901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:50.477746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:50.477901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:50.477831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:31:50.477901 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.477844 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:50.477901 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.477880 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:50.478126 ip-10-0-132-44 kubenswrapper[2572]: E0417 
16:31:50.477914 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert podName:ff048557-6618-44a8-9b98-aa0325746b04 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.477894556 +0000 UTC m=+39.856566719 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert") pod "ingress-canary-wtckq" (UID: "ff048557-6618-44a8-9b98-aa0325746b04") : secret "canary-serving-cert" not found Apr 17 16:31:50.478126 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.477933 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls podName:f91565e7-f3a4-4471-a2b5-0fcd0292176b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.477923353 +0000 UTC m=+39.856595505 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls") pod "dns-default-psgfk" (UID: "f91565e7-f3a4-4471-a2b5-0fcd0292176b") : secret "dns-default-metrics-tls" not found Apr 17 16:31:50.478126 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.477983 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:31:50.478126 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:50.478030 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert podName:289e73e3-2dca-4a41-b5bf-d6148102c16a nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.478017495 +0000 UTC m=+39.856689653 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-w25s5" (UID: "289e73e3-2dca-4a41-b5bf-d6148102c16a") : secret "networking-console-plugin-cert" not found Apr 17 16:31:53.305461 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:53.305432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:53.307908 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:53.307885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/345c95f5-2f92-4ba5-8afd-6484fb524fad-original-pull-secret\") pod \"global-pull-secret-syncer-5v2lr\" (UID: \"345c95f5-2f92-4ba5-8afd-6484fb524fad\") " pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:53.519744 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:53.519707 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5v2lr" Apr 17 16:31:54.315352 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:54.315207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:31:54.315906 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.315788 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:31:54.315906 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.315860 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls podName:418111de-59f5-4b93-bf45-150196b0de95 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.315839276 +0000 UTC m=+47.694511432 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zdpww" (UID: "418111de-59f5-4b93-bf45-150196b0de95") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:31:54.416996 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:54.416954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:54.417165 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:54.417009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:31:54.417165 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:54.417066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:31:54.417165 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:54.417128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" Apr 17 16:31:54.417165 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.417137 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:31:54.417446 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.417286 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.417264952 +0000 UTC m=+47.795937106 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : secret "router-metrics-certs-default" not found Apr 17 16:31:54.418753 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.417846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.417814454 +0000 UTC m=+47.796486620 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : configmap references non-existent config key: service-ca.crt Apr 17 16:31:54.418753 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.418136 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:54.418753 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.418223 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fb6d49967-7zjnb: secret "image-registry-tls" not found Apr 17 16:31:54.418753 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.418486 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:31:54.418753 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.418544 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls podName:b6f18147-3b7c-4098-b765-2b71bf2dc0f9 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.418509077 +0000 UTC m=+47.797181244 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls") pod "image-registry-fb6d49967-7zjnb" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9") : secret "image-registry-tls" not found Apr 17 16:31:54.419078 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.418787 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls podName:1cd90798-a06c-4c4c-9c1a-45465cc231dd nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:02.418747529 +0000 UTC m=+47.797419697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-chmjc" (UID: "1cd90798-a06c-4c4c-9c1a-45465cc231dd") : secret "samples-operator-tls" not found Apr 17 16:31:54.518309 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:54.518038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:31:54.518309 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:54.518159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:31:54.518309 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:31:54.518217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:31:54.518309 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.518269 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:31:54.518638 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.518350 2572 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:54.518638 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.518363 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert podName:289e73e3-2dca-4a41-b5bf-d6148102c16a nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.518344149 +0000 UTC m=+47.897016305 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-w25s5" (UID: "289e73e3-2dca-4a41-b5bf-d6148102c16a") : secret "networking-console-plugin-cert" not found Apr 17 16:31:54.518638 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.518408 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls podName:f91565e7-f3a4-4471-a2b5-0fcd0292176b nodeName:}" failed. No retries permitted until 2026-04-17 16:32:02.518392285 +0000 UTC m=+47.897064439 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls") pod "dns-default-psgfk" (UID: "f91565e7-f3a4-4471-a2b5-0fcd0292176b") : secret "dns-default-metrics-tls" not found Apr 17 16:31:54.518638 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.518269 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:54.518638 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:31:54.518448 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert podName:ff048557-6618-44a8-9b98-aa0325746b04 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:02.518437716 +0000 UTC m=+47.897109874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert") pod "ingress-canary-wtckq" (UID: "ff048557-6618-44a8-9b98-aa0325746b04") : secret "canary-serving-cert" not found Apr 17 16:32:00.857490 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:00.857462 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5v2lr"] Apr 17 16:32:00.862677 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:00.862641 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod345c95f5_2f92_4ba5_8afd_6484fb524fad.slice/crio-b8f6344787d59415d93467af36545e2819c3c8e7f6ff7649f7d446c74500b811 WatchSource:0}: Error finding container b8f6344787d59415d93467af36545e2819c3c8e7f6ff7649f7d446c74500b811: Status 404 returned error can't find the container with id b8f6344787d59415d93467af36545e2819c3c8e7f6ff7649f7d446c74500b811 Apr 17 16:32:00.953688 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:00.953601 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qch6k"] Apr 17 16:32:00.955546 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:00.955487 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9099ff1_798e_434b_8980_189e358b2f96.slice/crio-b23591bc5f6354f2a517c5c7e1bb7f748e00d664373d8ae1af66f988a74c0a79 WatchSource:0}: Error finding container b23591bc5f6354f2a517c5c7e1bb7f748e00d664373d8ae1af66f988a74c0a79: Status 404 returned error can't find the container with id b23591bc5f6354f2a517c5c7e1bb7f748e00d664373d8ae1af66f988a74c0a79 Apr 17 16:32:01.398237 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.397528 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq" event={"ID":"c7871d70-1e63-496f-835f-a447b5d1800c","Type":"ContainerStarted","Data":"b17197515fd9f02e83b0c9eac3afb5883cd02e801fe59044ad3ea4f4037ba754"} Apr 17 16:32:01.400062 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.399603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65" event={"ID":"5f749b09-28ed-4c6f-a64f-2df1f97857d6","Type":"ContainerStarted","Data":"8b7ad2caa86aa12f19ec515a0312f842acecc4bedbe7ecaac9ff354bcc3aae9a"} Apr 17 16:32:01.401251 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.401168 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ftwtr" event={"ID":"4138b3cf-4356-4853-b790-fdfd4d1b8d21","Type":"ContainerStarted","Data":"b3bcc011f61f6aa809b22d2fc845912868395d4d93caad1c31604ac1b9e6dbf8"} Apr 17 16:32:01.404372 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.404098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5v2lr" event={"ID":"345c95f5-2f92-4ba5-8afd-6484fb524fad","Type":"ContainerStarted","Data":"b8f6344787d59415d93467af36545e2819c3c8e7f6ff7649f7d446c74500b811"} Apr 17 16:32:01.405625 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.405594 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerStarted","Data":"5f4d7a9fea46d1d7b285f1374bcd759751c49a42d9b6aec1df4500af60c59c96"} Apr 17 16:32:01.407306 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.407284 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/0.log" Apr 17 16:32:01.407404 ip-10-0-132-44 kubenswrapper[2572]: I0417 
16:32:01.407327 2572 generic.go:358] "Generic (PLEG): container finished" podID="c8e1510f-1cb7-4361-bb57-1944dc90fae3" containerID="463d5d74b28b6b2899541cedee9f6b7334b95506d3f8f312aee44ba21f979518" exitCode=255 Apr 17 16:32:01.407404 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.407392 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" event={"ID":"c8e1510f-1cb7-4361-bb57-1944dc90fae3","Type":"ContainerDied","Data":"463d5d74b28b6b2899541cedee9f6b7334b95506d3f8f312aee44ba21f979518"} Apr 17 16:32:01.407878 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.407589 2572 scope.go:117] "RemoveContainer" containerID="463d5d74b28b6b2899541cedee9f6b7334b95506d3f8f312aee44ba21f979518" Apr 17 16:32:01.413411 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.412481 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" event={"ID":"4050d306-2166-4792-b643-4e16417bd406","Type":"ContainerStarted","Data":"37dae34c79a3db2defe79eb69ad2f86f17c1ef70e55c17798fbaaf1dd11d643f"} Apr 17 16:32:01.415601 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.415530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" event={"ID":"81ae48b9-3879-4953-b4f9-833feac79819","Type":"ContainerStarted","Data":"38d39f89afff7f38e6400bb085d3ee19cb2245f3faf511749ee766885c2edf68"} Apr 17 16:32:01.419577 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.419543 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4" event={"ID":"684a0c83-c816-4304-bc54-42eabb96cf54","Type":"ContainerStarted","Data":"9b79e9872e4d8a36833642a7b5ed0eab9fb111f38b90c6904b7c1b87260a7988"} Apr 17 16:32:01.421755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.421648 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qch6k" event={"ID":"d9099ff1-798e-434b-8980-189e358b2f96","Type":"ContainerStarted","Data":"7e9cff3314f37a71abb36ad43604d1f2b05a9cbd3158eda17f67742a548fdad2"} Apr 17 16:32:01.421755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.421679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qch6k" event={"ID":"d9099ff1-798e-434b-8980-189e358b2f96","Type":"ContainerStarted","Data":"b23591bc5f6354f2a517c5c7e1bb7f748e00d664373d8ae1af66f988a74c0a79"} Apr 17 16:32:01.422938 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.421951 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-mkhdq" podStartSLOduration=27.779337912 podStartE2EDuration="41.421933939s" podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.177172612 +0000 UTC m=+32.555844765" lastFinishedPulling="2026-04-17 16:32:00.819768623 +0000 UTC m=+46.198440792" observedRunningTime="2026-04-17 16:32:01.418427367 +0000 UTC m=+46.797099543" watchObservedRunningTime="2026-04-17 16:32:01.421933939 +0000 UTC m=+46.800606115" Apr 17 16:32:01.422938 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.422488 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:32:01.429380 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.429355 2572 generic.go:358] "Generic (PLEG): container finished" podID="665698fb-87f6-4ef0-a908-b1d2f13eb9d6" containerID="1e7ad6c42f642d636f1b846be85bf5bb1e416ec137beef8be3dac4276de04242" exitCode=0 Apr 17 16:32:01.429461 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.429421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gl5np" 
event={"ID":"665698fb-87f6-4ef0-a908-b1d2f13eb9d6","Type":"ContainerDied","Data":"1e7ad6c42f642d636f1b846be85bf5bb1e416ec137beef8be3dac4276de04242"} Apr 17 16:32:01.432268 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.432248 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv" event={"ID":"c83fe77d-ef92-4eec-bda5-e4a74cd955d3","Type":"ContainerStarted","Data":"eb5dba81a816a0f1c7887ba867e78f95fffd21db00dcca44bcfce494571ef98d"} Apr 17 16:32:01.432881 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.432864 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv" Apr 17 16:32:01.434245 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.434227 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv" Apr 17 16:32:01.443503 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.443436 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" podStartSLOduration=27.824859055 podStartE2EDuration="41.443422913s" podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.12856044 +0000 UTC m=+32.507232593" lastFinishedPulling="2026-04-17 16:32:00.747124292 +0000 UTC m=+46.125796451" observedRunningTime="2026-04-17 16:32:01.443361723 +0000 UTC m=+46.822033899" watchObservedRunningTime="2026-04-17 16:32:01.443422913 +0000 UTC m=+46.822095088" Apr 17 16:32:01.471963 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.471492 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" podStartSLOduration=27.795815403 podStartE2EDuration="41.471478389s" 
podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.071518691 +0000 UTC m=+32.450190846" lastFinishedPulling="2026-04-17 16:32:00.747181676 +0000 UTC m=+46.125853832" observedRunningTime="2026-04-17 16:32:01.470112651 +0000 UTC m=+46.848784827" watchObservedRunningTime="2026-04-17 16:32:01.471478389 +0000 UTC m=+46.850150564" Apr 17 16:32:01.510247 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.510170 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-ftwtr" podStartSLOduration=29.801846101 podStartE2EDuration="42.510154834s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.074775525 +0000 UTC m=+32.453447680" lastFinishedPulling="2026-04-17 16:31:59.783084249 +0000 UTC m=+45.161756413" observedRunningTime="2026-04-17 16:32:01.509815791 +0000 UTC m=+46.888487961" watchObservedRunningTime="2026-04-17 16:32:01.510154834 +0000 UTC m=+46.888827010" Apr 17 16:32:01.510615 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.510582 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-nlj65" podStartSLOduration=23.998614882 podStartE2EDuration="37.510575147s" podCreationTimestamp="2026-04-17 16:31:24 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.234758419 +0000 UTC m=+32.613430585" lastFinishedPulling="2026-04-17 16:32:00.746718683 +0000 UTC m=+46.125390850" observedRunningTime="2026-04-17 16:32:01.487746123 +0000 UTC m=+46.866418299" watchObservedRunningTime="2026-04-17 16:32:01.510575147 +0000 UTC m=+46.889247327" Apr 17 16:32:01.574657 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.573500 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-74bdc8ccc4-vwkwv" podStartSLOduration=21.990909581 podStartE2EDuration="35.573482351s" 
podCreationTimestamp="2026-04-17 16:31:26 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.234773233 +0000 UTC m=+32.613445386" lastFinishedPulling="2026-04-17 16:32:00.817345991 +0000 UTC m=+46.196018156" observedRunningTime="2026-04-17 16:32:01.572385433 +0000 UTC m=+46.951057609" watchObservedRunningTime="2026-04-17 16:32:01.573482351 +0000 UTC m=+46.952154527" Apr 17 16:32:01.590212 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:01.588133 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qch6k" podStartSLOduration=46.588114952 podStartE2EDuration="46.588114952s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:32:01.586741988 +0000 UTC m=+46.965414160" watchObservedRunningTime="2026-04-17 16:32:01.588114952 +0000 UTC m=+46.966787144" Apr 17 16:32:02.391737 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.391699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:32:02.392160 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.391871 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:32:02.392160 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.391952 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls podName:418111de-59f5-4b93-bf45-150196b0de95 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:18.391929584 +0000 UTC m=+63.770601750 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zdpww" (UID: "418111de-59f5-4b93-bf45-150196b0de95") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:32:02.445348 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.445229 2572 generic.go:358] "Generic (PLEG): container finished" podID="665698fb-87f6-4ef0-a908-b1d2f13eb9d6" containerID="c2bfbd420cf737088392eb39b1bd2ac6218f9f3864998013e3c2c203c196cc73" exitCode=0 Apr 17 16:32:02.445348 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.445309 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gl5np" event={"ID":"665698fb-87f6-4ef0-a908-b1d2f13eb9d6","Type":"ContainerDied","Data":"c2bfbd420cf737088392eb39b1bd2ac6218f9f3864998013e3c2c203c196cc73"} Apr 17 16:32:02.447842 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.447823 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log" Apr 17 16:32:02.448428 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.448411 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/0.log" Apr 17 16:32:02.448503 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.448448 2572 generic.go:358] "Generic (PLEG): container finished" podID="c8e1510f-1cb7-4361-bb57-1944dc90fae3" containerID="968677a0d3d65b5b54cb0eecb0e5c428ee5c8b6b44540954d08e8c823ceaa942" exitCode=255 Apr 17 16:32:02.449007 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.448793 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" event={"ID":"c8e1510f-1cb7-4361-bb57-1944dc90fae3","Type":"ContainerDied","Data":"968677a0d3d65b5b54cb0eecb0e5c428ee5c8b6b44540954d08e8c823ceaa942"} Apr 17 16:32:02.449007 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.448831 2572 scope.go:117] "RemoveContainer" containerID="463d5d74b28b6b2899541cedee9f6b7334b95506d3f8f312aee44ba21f979518" Apr 17 16:32:02.449175 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.449114 2572 scope.go:117] "RemoveContainer" containerID="968677a0d3d65b5b54cb0eecb0e5c428ee5c8b6b44540954d08e8c823ceaa942" Apr 17 16:32:02.449383 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.449333 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t8bs6_openshift-console-operator(c8e1510f-1cb7-4361-bb57-1944dc90fae3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" podUID="c8e1510f-1cb7-4361-bb57-1944dc90fae3" Apr 17 16:32:02.469776 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.469724 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86d88845b5-l6pz4" podStartSLOduration=22.911571177 podStartE2EDuration="36.469710127s" podCreationTimestamp="2026-04-17 16:31:26 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.268284687 +0000 UTC m=+32.646956852" lastFinishedPulling="2026-04-17 16:32:00.826423646 +0000 UTC m=+46.205095802" observedRunningTime="2026-04-17 16:32:01.605639082 +0000 UTC m=+46.984311252" watchObservedRunningTime="2026-04-17 16:32:02.469710127 +0000 UTC m=+47.848382303" Apr 17 16:32:02.494462 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.494437 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:02.494558 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.494505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:32:02.494621 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.494582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:02.494621 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.494591 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:32:02.494740 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.494642 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:18.49462569 +0000 UTC m=+63.873297857 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : secret "router-metrics-certs-default" not found Apr 17 16:32:02.494740 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.494665 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" Apr 17 16:32:02.494740 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.494715 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle podName:19d7e13d-e66e-48ec-b132-f6e07a38ea96 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:18.494699226 +0000 UTC m=+63.873371381 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle") pod "router-default-79f8c9df5d-9kzhz" (UID: "19d7e13d-e66e-48ec-b132-f6e07a38ea96") : configmap references non-existent config key: service-ca.crt Apr 17 16:32:02.494903 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.494745 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:32:02.495122 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.494975 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:32:02.495122 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.494995 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fb6d49967-7zjnb: secret "image-registry-tls" not found Apr 17 16:32:02.495402 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.495053 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls podName:b6f18147-3b7c-4098-b765-2b71bf2dc0f9 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:18.495038171 +0000 UTC m=+63.873710329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls") pod "image-registry-fb6d49967-7zjnb" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9") : secret "image-registry-tls" not found Apr 17 16:32:02.495402 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.495387 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls podName:1cd90798-a06c-4c4c-9c1a-45465cc231dd nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:18.495368268 +0000 UTC m=+63.874040437 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-chmjc" (UID: "1cd90798-a06c-4c4c-9c1a-45465cc231dd") : secret "samples-operator-tls" not found Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.595942 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.596092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:02.596150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.596389 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.596441 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert podName:ff048557-6618-44a8-9b98-aa0325746b04 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:18.596424163 +0000 UTC m=+63.975096320 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert") pod "ingress-canary-wtckq" (UID: "ff048557-6618-44a8-9b98-aa0325746b04") : secret "canary-serving-cert" not found Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.596989 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.597038 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert podName:289e73e3-2dca-4a41-b5bf-d6148102c16a nodeName:}" failed. No retries permitted until 2026-04-17 16:32:18.597022034 +0000 UTC m=+63.975694201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-w25s5" (UID: "289e73e3-2dca-4a41-b5bf-d6148102c16a") : secret "networking-console-plugin-cert" not found Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.597355 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:02.597822 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:02.597423 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls podName:f91565e7-f3a4-4471-a2b5-0fcd0292176b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:18.59740464 +0000 UTC m=+63.976076794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls") pod "dns-default-psgfk" (UID: "f91565e7-f3a4-4471-a2b5-0fcd0292176b") : secret "dns-default-metrics-tls" not found Apr 17 16:32:03.124818 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.124789 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w"] Apr 17 16:32:03.140817 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.140779 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w"] Apr 17 16:32:03.141023 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.141002 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" Apr 17 16:32:03.146299 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.146074 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 16:32:03.146299 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.146139 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-sh9kx\"" Apr 17 16:32:03.147324 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.147301 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 16:32:03.202310 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.202281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcdtk\" (UniqueName: 
\"kubernetes.io/projected/4758db3d-9d86-4b79-bb5f-feb582fbe734-kube-api-access-xcdtk\") pod \"migrator-74bb7799d9-wbp5w\" (UID: \"4758db3d-9d86-4b79-bb5f-feb582fbe734\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" Apr 17 16:32:03.303685 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.303608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcdtk\" (UniqueName: \"kubernetes.io/projected/4758db3d-9d86-4b79-bb5f-feb582fbe734-kube-api-access-xcdtk\") pod \"migrator-74bb7799d9-wbp5w\" (UID: \"4758db3d-9d86-4b79-bb5f-feb582fbe734\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" Apr 17 16:32:03.312765 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.312706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcdtk\" (UniqueName: \"kubernetes.io/projected/4758db3d-9d86-4b79-bb5f-feb582fbe734-kube-api-access-xcdtk\") pod \"migrator-74bb7799d9-wbp5w\" (UID: \"4758db3d-9d86-4b79-bb5f-feb582fbe734\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" Apr 17 16:32:03.453011 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.452943 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log" Apr 17 16:32:03.453417 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.453401 2572 scope.go:117] "RemoveContainer" containerID="968677a0d3d65b5b54cb0eecb0e5c428ee5c8b6b44540954d08e8c823ceaa942" Apr 17 16:32:03.453639 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:03.453607 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t8bs6_openshift-console-operator(c8e1510f-1cb7-4361-bb57-1944dc90fae3)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" podUID="c8e1510f-1cb7-4361-bb57-1944dc90fae3" Apr 17 16:32:03.455880 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.455863 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" Apr 17 16:32:03.457485 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.457446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gl5np" event={"ID":"665698fb-87f6-4ef0-a908-b1d2f13eb9d6","Type":"ContainerStarted","Data":"200d62bdca698748320ab294777456aa1aec5a23c0496ac562aaba808d02cf89"} Apr 17 16:32:03.501249 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.501180 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gl5np" podStartSLOduration=7.093449813 podStartE2EDuration="48.501161611s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="2026-04-17 16:31:17.701548103 +0000 UTC m=+3.080220261" lastFinishedPulling="2026-04-17 16:31:59.109259892 +0000 UTC m=+44.487932059" observedRunningTime="2026-04-17 16:32:03.498260565 +0000 UTC m=+48.876932741" watchObservedRunningTime="2026-04-17 16:32:03.501161611 +0000 UTC m=+48.879833784" Apr 17 16:32:03.591580 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.591550 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w"] Apr 17 16:32:03.596020 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:03.595990 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4758db3d_9d86_4b79_bb5f_feb582fbe734.slice/crio-6983388b336e9d523cf43445a39972279c33257abad89bd94a5b88b8439598fa WatchSource:0}: Error finding container 6983388b336e9d523cf43445a39972279c33257abad89bd94a5b88b8439598fa: Status 404 returned 
error can't find the container with id 6983388b336e9d523cf43445a39972279c33257abad89bd94a5b88b8439598fa Apr 17 16:32:03.621724 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:03.621699 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rtbw5_33d7fffe-ef0b-495e-adb5-fbc82f11a1f0/dns-node-resolver/0.log" Apr 17 16:32:04.225378 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.225351 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b58dr_908fb34a-55a4-4783-af03-7b4b7c408f98/node-ca/0.log" Apr 17 16:32:04.329782 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.329747 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wbb7t"] Apr 17 16:32:04.333411 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.333389 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.336630 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.336101 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:32:04.336630 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.336402 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:32:04.337934 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.336885 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vwmhs\"" Apr 17 16:32:04.342130 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.342106 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wbb7t"] Apr 17 16:32:04.415037 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.414964 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnj6\" (UniqueName: \"kubernetes.io/projected/76fc56e5-185d-49c6-a5b5-79e4afc9575c-kube-api-access-rjnj6\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.415179 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.415066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/76fc56e5-185d-49c6-a5b5-79e4afc9575c-data-volume\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.415179 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.415091 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/76fc56e5-185d-49c6-a5b5-79e4afc9575c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.415326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.415246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.415381 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.415350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/76fc56e5-185d-49c6-a5b5-79e4afc9575c-crio-socket\") pod 
\"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.462014 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.461975 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" event={"ID":"4758db3d-9d86-4b79-bb5f-feb582fbe734","Type":"ContainerStarted","Data":"6983388b336e9d523cf43445a39972279c33257abad89bd94a5b88b8439598fa"} Apr 17 16:32:04.516153 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.516115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/76fc56e5-185d-49c6-a5b5-79e4afc9575c-crio-socket\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.516317 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.516170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnj6\" (UniqueName: \"kubernetes.io/projected/76fc56e5-185d-49c6-a5b5-79e4afc9575c-kube-api-access-rjnj6\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.516317 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.516268 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/76fc56e5-185d-49c6-a5b5-79e4afc9575c-crio-socket\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.516317 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.516307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/76fc56e5-185d-49c6-a5b5-79e4afc9575c-data-volume\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.516467 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.516334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/76fc56e5-185d-49c6-a5b5-79e4afc9575c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.516467 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.516446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.516610 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:04.516594 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 16:32:04.516677 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:04.516665 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls podName:76fc56e5-185d-49c6-a5b5-79e4afc9575c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:05.016645949 +0000 UTC m=+50.395318102 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wbb7t" (UID: "76fc56e5-185d-49c6-a5b5-79e4afc9575c") : secret "insights-runtime-extractor-tls" not found Apr 17 16:32:04.516805 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.516785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/76fc56e5-185d-49c6-a5b5-79e4afc9575c-data-volume\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.517007 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.516987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/76fc56e5-185d-49c6-a5b5-79e4afc9575c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:04.524800 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:04.524779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnj6\" (UniqueName: \"kubernetes.io/projected/76fc56e5-185d-49c6-a5b5-79e4afc9575c-kube-api-access-rjnj6\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:05.022834 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:05.022796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " 
pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:05.023021 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:05.022958 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 16:32:05.023021 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:05.023020 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls podName:76fc56e5-185d-49c6-a5b5-79e4afc9575c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:06.023000859 +0000 UTC m=+51.401673019 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wbb7t" (UID: "76fc56e5-185d-49c6-a5b5-79e4afc9575c") : secret "insights-runtime-extractor-tls" not found Apr 17 16:32:05.466827 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:05.466784 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5v2lr" event={"ID":"345c95f5-2f92-4ba5-8afd-6484fb524fad","Type":"ContainerStarted","Data":"77d09b1b43108f3b20a1a03f088dec02852587945362a778ad0e1ba081c30c53"} Apr 17 16:32:05.481548 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:05.481500 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5v2lr" podStartSLOduration=24.174704704 podStartE2EDuration="28.48148691s" podCreationTimestamp="2026-04-17 16:31:37 +0000 UTC" firstStartedPulling="2026-04-17 16:32:00.864758068 +0000 UTC m=+46.243430240" lastFinishedPulling="2026-04-17 16:32:05.17154028 +0000 UTC m=+50.550212446" observedRunningTime="2026-04-17 16:32:05.480801253 +0000 UTC m=+50.859473430" watchObservedRunningTime="2026-04-17 16:32:05.48148691 +0000 UTC m=+50.860159085" Apr 17 
16:32:06.033433 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.033403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:06.033570 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:06.033552 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 16:32:06.033642 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:06.033618 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls podName:76fc56e5-185d-49c6-a5b5-79e4afc9575c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:08.033602752 +0000 UTC m=+53.412274909 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wbb7t" (UID: "76fc56e5-185d-49c6-a5b5-79e4afc9575c") : secret "insights-runtime-extractor-tls" not found Apr 17 16:32:06.387157 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.387125 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-s7cnh"] Apr 17 16:32:06.390966 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.390945 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.394109 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.394045 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 16:32:06.394254 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.394120 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 16:32:06.394254 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.394049 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-fvlgx\"" Apr 17 16:32:06.394373 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.394280 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 16:32:06.394428 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.394393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 16:32:06.394547 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.394522 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-s7cnh"] Apr 17 16:32:06.471396 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.471360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" event={"ID":"4758db3d-9d86-4b79-bb5f-feb582fbe734","Type":"ContainerStarted","Data":"bed6c67e847bc3927ec01ae13f423bb80361ed9e2d56b997ef1bbe70851de870"} Apr 17 16:32:06.471396 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.471399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" 
event={"ID":"4758db3d-9d86-4b79-bb5f-feb582fbe734","Type":"ContainerStarted","Data":"e8287a5144f6b26532ff826b18e47a50864f072b0d5ac827118f743df84b82cd"} Apr 17 16:32:06.538351 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.538325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-896h4\" (UniqueName: \"kubernetes.io/projected/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-kube-api-access-896h4\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.538515 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.538501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-signing-cabundle\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.538562 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.538526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-signing-key\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.639414 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.639344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-signing-cabundle\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.639414 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.639375 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-signing-key\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.639574 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.639422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-896h4\" (UniqueName: \"kubernetes.io/projected/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-kube-api-access-896h4\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.639926 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.639901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-signing-cabundle\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.641837 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.641808 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-signing-key\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.650642 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.650621 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-896h4\" (UniqueName: \"kubernetes.io/projected/1538a06f-28d8-41cf-9ce2-e3496b79d8e7-kube-api-access-896h4\") pod \"service-ca-865cb79987-s7cnh\" (UID: \"1538a06f-28d8-41cf-9ce2-e3496b79d8e7\") " pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.700062 ip-10-0-132-44 
kubenswrapper[2572]: I0417 16:32:06.700042 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-s7cnh" Apr 17 16:32:06.831138 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.831093 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wbp5w" podStartSLOduration=1.4102481789999999 podStartE2EDuration="3.831078912s" podCreationTimestamp="2026-04-17 16:32:03 +0000 UTC" firstStartedPulling="2026-04-17 16:32:03.598223835 +0000 UTC m=+48.976895993" lastFinishedPulling="2026-04-17 16:32:06.019054569 +0000 UTC m=+51.397726726" observedRunningTime="2026-04-17 16:32:06.489624925 +0000 UTC m=+51.868297101" watchObservedRunningTime="2026-04-17 16:32:06.831078912 +0000 UTC m=+52.209751086" Apr 17 16:32:06.832500 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.832478 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-s7cnh"] Apr 17 16:32:06.834260 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:06.834232 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1538a06f_28d8_41cf_9ce2_e3496b79d8e7.slice/crio-e0e18dca6ac14f61b6f7b9a75af772f32f6dea928cc9a32b0f5cc7f35ca13e52 WatchSource:0}: Error finding container e0e18dca6ac14f61b6f7b9a75af772f32f6dea928cc9a32b0f5cc7f35ca13e52: Status 404 returned error can't find the container with id e0e18dca6ac14f61b6f7b9a75af772f32f6dea928cc9a32b0f5cc7f35ca13e52 Apr 17 16:32:06.949104 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.949069 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:32:06.949104 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.949105 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:32:06.949488 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:06.949475 2572 scope.go:117] "RemoveContainer" containerID="968677a0d3d65b5b54cb0eecb0e5c428ee5c8b6b44540954d08e8c823ceaa942" Apr 17 16:32:06.949661 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:06.949645 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-t8bs6_openshift-console-operator(c8e1510f-1cb7-4361-bb57-1944dc90fae3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" podUID="c8e1510f-1cb7-4361-bb57-1944dc90fae3" Apr 17 16:32:07.475496 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:07.475451 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-s7cnh" event={"ID":"1538a06f-28d8-41cf-9ce2-e3496b79d8e7","Type":"ContainerStarted","Data":"863ae856829318726495cfadfe5e261633b61a814fdebbdc3063b4f1043d5038"} Apr 17 16:32:07.475496 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:07.475496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-s7cnh" event={"ID":"1538a06f-28d8-41cf-9ce2-e3496b79d8e7","Type":"ContainerStarted","Data":"e0e18dca6ac14f61b6f7b9a75af772f32f6dea928cc9a32b0f5cc7f35ca13e52"} Apr 17 16:32:07.491949 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:07.491902 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-s7cnh" podStartSLOduration=1.49188794 podStartE2EDuration="1.49188794s" podCreationTimestamp="2026-04-17 16:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:32:07.491153625 +0000 UTC m=+52.869825801" watchObservedRunningTime="2026-04-17 
16:32:07.49188794 +0000 UTC m=+52.870560115" Apr 17 16:32:08.052668 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:08.052635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:08.052822 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:08.052747 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 16:32:08.052822 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:08.052799 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls podName:76fc56e5-185d-49c6-a5b5-79e4afc9575c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:12.05278589 +0000 UTC m=+57.431458042 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wbb7t" (UID: "76fc56e5-185d-49c6-a5b5-79e4afc9575c") : secret "insights-runtime-extractor-tls" not found Apr 17 16:32:11.790181 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:11.790140 2572 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3c42a1537aa253b95ed15d8d7409baca451b6e5b835cef115761cc41d7505f1e: fetching blob: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" image="registry.redhat.io/multicluster-engine/cluster-proxy-rhel9@sha256:6bc1f1785b98b2aec519cbfe5ee40d67a537eae488061c116d0db5472b9bf2c5" Apr 17 16:32:11.790586 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:11.790326 2572 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:addon-agent,Image:registry.redhat.io/multicluster-engine/cluster-proxy-rhel9@sha256:6bc1f1785b98b2aec519cbfe5ee40d67a537eae488061c116d0db5472b9bf2c5,Command:[/agent],Args:[--v=2 --hub-kubeconfig=/etc/kubeconfig/kubeconfig --cluster-name=f7e74665-42ea-4532-ad4c-e8dc6d05ede4 
--proxy-server-namespace=multicluster-engine],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hub-kubeconfig,ReadOnly:true,MountPath:/etc/kubeconfig/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hub,ReadOnly:true,MountPath:/etc/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ng7lh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000590000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-proxy-proxy-agent-54fd8bd9fb-srkv5_open-cluster-management-agent-addon(cbfcaa7c-7f0a-4a57-96cc-78428bb81916): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3c42a1537aa253b95ed15d8d7409baca451b6e5b835cef115761cc41d7505f1e: fetching blob: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 16:32:11.931334 
ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:11.931301 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3c42a1537aa253b95ed15d8d7409baca451b6e5b835cef115761cc41d7505f1e: fetching blob: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" Apr 17 16:32:12.090987 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:12.090956 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:12.091134 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:12.091117 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 16:32:12.091234 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:12.091215 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls podName:76fc56e5-185d-49c6-a5b5-79e4afc9575c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:20.091171926 +0000 UTC m=+65.469844095 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wbb7t" (UID: "76fc56e5-185d-49c6-a5b5-79e4afc9575c") : secret "insights-runtime-extractor-tls" not found Apr 17 16:32:12.490580 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:12.490544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerStarted","Data":"bc268551ccc0207b5ecb067d593d3d14e2e9a547d96d225829c393b01d55943d"} Apr 17 16:32:12.491609 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:12.491586 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/multicluster-engine/cluster-proxy-rhel9@sha256:6bc1f1785b98b2aec519cbfe5ee40d67a537eae488061c116d0db5472b9bf2c5\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3c42a1537aa253b95ed15d8d7409baca451b6e5b835cef115761cc41d7505f1e: fetching blob: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" Apr 17 16:32:13.493892 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:13.493854 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/multicluster-engine/cluster-proxy-rhel9@sha256:6bc1f1785b98b2aec519cbfe5ee40d67a537eae488061c116d0db5472b9bf2c5\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from 
manifest list: reading blob sha256:3c42a1537aa253b95ed15d8d7409baca451b6e5b835cef115761cc41d7505f1e: fetching blob: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" Apr 17 16:32:17.029252 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:17.029172 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerName="proxy-agent" probeResult="failure" output="Get \"http://10.132.0.20:8888/healthz\": dial tcp 10.132.0.20:8888: connect: connection refused" Apr 17 16:32:17.029688 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:17.029288 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" Apr 17 16:32:17.029740 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:17.029715 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="proxy-agent" containerStatusID={"Type":"cri-o","ID":"5f4d7a9fea46d1d7b285f1374bcd759751c49a42d9b6aec1df4500af60c59c96"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" containerMessage="Container proxy-agent failed liveness probe, will be restarted" Apr 17 16:32:17.029814 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:17.029798 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerName="proxy-agent" containerID="cri-o://5f4d7a9fea46d1d7b285f1374bcd759751c49a42d9b6aec1df4500af60c59c96" gracePeriod=30 Apr 17 16:32:17.029989 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:17.029957 2572 prober.go:120] "Probe 
failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:32:17.204324 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:17.204296 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/multicluster-engine/cluster-proxy-rhel9@sha256:6bc1f1785b98b2aec519cbfe5ee40d67a537eae488061c116d0db5472b9bf2c5\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3c42a1537aa253b95ed15d8d7409baca451b6e5b835cef115761cc41d7505f1e: fetching blob: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" Apr 17 16:32:17.505496 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:17.505464 2572 generic.go:358] "Generic (PLEG): container finished" podID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerID="5f4d7a9fea46d1d7b285f1374bcd759751c49a42d9b6aec1df4500af60c59c96" exitCode=2 Apr 17 16:32:17.505644 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:17.505542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerDied","Data":"5f4d7a9fea46d1d7b285f1374bcd759751c49a42d9b6aec1df4500af60c59c96"} Apr 17 16:32:17.505644 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:17.505589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" 
event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerStarted","Data":"4f6b270a8bfc7cb61ff6fcb3589bf001f4e1ea601350918e7acb2e2a642f4321"} Apr 17 16:32:17.506835 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:17.506805 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/multicluster-engine/cluster-proxy-rhel9@sha256:6bc1f1785b98b2aec519cbfe5ee40d67a537eae488061c116d0db5472b9bf2c5\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3c42a1537aa253b95ed15d8d7409baca451b6e5b835cef115761cc41d7505f1e: fetching blob: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" Apr 17 16:32:18.444844 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.444807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:32:18.447423 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.447399 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/418111de-59f5-4b93-bf45-150196b0de95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zdpww\" (UID: \"418111de-59f5-4b93-bf45-150196b0de95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:32:18.545515 ip-10-0-132-44 kubenswrapper[2572]: 
I0417 16:32:18.545482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:18.545515 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.545521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:32:18.545688 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.545550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:18.545688 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.545591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" Apr 17 16:32:18.546290 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.546256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d7e13d-e66e-48ec-b132-f6e07a38ea96-service-ca-bundle\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: 
\"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:18.548018 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.547995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"image-registry-fb6d49967-7zjnb\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:32:18.548018 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.548011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19d7e13d-e66e-48ec-b132-f6e07a38ea96-metrics-certs\") pod \"router-default-79f8c9df5d-9kzhz\" (UID: \"19d7e13d-e66e-48ec-b132-f6e07a38ea96\") " pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:18.548146 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.548050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cd90798-a06c-4c4c-9c1a-45465cc231dd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-chmjc\" (UID: \"1cd90798-a06c-4c4c-9c1a-45465cc231dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" Apr 17 16:32:18.623408 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.623383 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9pgds\"" Apr 17 16:32:18.631811 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.631791 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" Apr 17 16:32:18.646756 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.646735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:32:18.646851 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.646773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:32:18.646918 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.646864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:32:18.648922 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.648895 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f91565e7-f3a4-4471-a2b5-0fcd0292176b-metrics-tls\") pod \"dns-default-psgfk\" (UID: \"f91565e7-f3a4-4471-a2b5-0fcd0292176b\") " pod="openshift-dns/dns-default-psgfk" Apr 17 16:32:18.649123 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.649102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/289e73e3-2dca-4a41-b5bf-d6148102c16a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-w25s5\" (UID: \"289e73e3-2dca-4a41-b5bf-d6148102c16a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:32:18.649123 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.649118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff048557-6618-44a8-9b98-aa0325746b04-cert\") pod \"ingress-canary-wtckq\" (UID: \"ff048557-6618-44a8-9b98-aa0325746b04\") " pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:32:18.662335 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.662317 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-psh7p\"" Apr 17 16:32:18.670894 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.670877 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:32:18.674727 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.674697 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xnps4\"" Apr 17 16:32:18.682611 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.682589 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:18.729797 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.729537 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-vzv4m\"" Apr 17 16:32:18.738370 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.737006 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" Apr 17 16:32:18.759317 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.759176 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww"] Apr 17 16:32:18.760544 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.760526 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cnl7t\"" Apr 17 16:32:18.761648 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:18.761620 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418111de_59f5_4b93_bf45_150196b0de95.slice/crio-d47c23892787211360aed4b6712990408df9c4c84f5bb4a7dc19bdfbc0589ea3 WatchSource:0}: Error finding container d47c23892787211360aed4b6712990408df9c4c84f5bb4a7dc19bdfbc0589ea3: Status 404 returned error can't find the container with id d47c23892787211360aed4b6712990408df9c4c84f5bb4a7dc19bdfbc0589ea3 Apr 17 16:32:18.768585 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.768564 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wtckq" Apr 17 16:32:18.772078 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.772026 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xz9tf\"" Apr 17 16:32:18.781436 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.780417 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-psgfk" Apr 17 16:32:18.796718 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.795175 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-cgkh6\"" Apr 17 16:32:18.807003 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.803444 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" Apr 17 16:32:18.819827 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.818854 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-fb6d49967-7zjnb"] Apr 17 16:32:18.824431 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:18.824383 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f18147_3b7c_4098_b765_2b71bf2dc0f9.slice/crio-f9e133c56ff52c1b89b6463cbd9d0127d2f8b7a0538dc7f75c0a9d9909ee3de8 WatchSource:0}: Error finding container f9e133c56ff52c1b89b6463cbd9d0127d2f8b7a0538dc7f75c0a9d9909ee3de8: Status 404 returned error can't find the container with id f9e133c56ff52c1b89b6463cbd9d0127d2f8b7a0538dc7f75c0a9d9909ee3de8 Apr 17 16:32:18.844592 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.844544 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79f8c9df5d-9kzhz"] Apr 17 16:32:18.920399 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.920355 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc"] Apr 17 16:32:18.945956 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.945595 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wtckq"] Apr 17 16:32:18.961820 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:18.961797 2572 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff048557_6618_44a8_9b98_aa0325746b04.slice/crio-7f0683891aac8dfba17c69f6d2c24350136963bb3aa66252c05b8cf63f712bf3 WatchSource:0}: Error finding container 7f0683891aac8dfba17c69f6d2c24350136963bb3aa66252c05b8cf63f712bf3: Status 404 returned error can't find the container with id 7f0683891aac8dfba17c69f6d2c24350136963bb3aa66252c05b8cf63f712bf3 Apr 17 16:32:18.971371 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.971294 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-psgfk"] Apr 17 16:32:18.974016 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:18.973986 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91565e7_f3a4_4471_a2b5_0fcd0292176b.slice/crio-92d97acb3ff48a437478f8ccab55e77ddd019409a97b68ad789d6f446e01dc39 WatchSource:0}: Error finding container 92d97acb3ff48a437478f8ccab55e77ddd019409a97b68ad789d6f446e01dc39: Status 404 returned error can't find the container with id 92d97acb3ff48a437478f8ccab55e77ddd019409a97b68ad789d6f446e01dc39 Apr 17 16:32:18.985903 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:18.985885 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-w25s5"] Apr 17 16:32:18.989350 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:18.989318 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289e73e3_2dca_4a41_b5bf_d6148102c16a.slice/crio-48346025e0f836c9d3c54eb449f9dc0887b7208ef44fe7d6c5b4d15b126be556 WatchSource:0}: Error finding container 48346025e0f836c9d3c54eb449f9dc0887b7208ef44fe7d6c5b4d15b126be556: Status 404 returned error can't find the container with id 48346025e0f836c9d3c54eb449f9dc0887b7208ef44fe7d6c5b4d15b126be556 Apr 17 16:32:19.188669 ip-10-0-132-44 kubenswrapper[2572]: I0417 
16:32:19.188643 2572 scope.go:117] "RemoveContainer" containerID="968677a0d3d65b5b54cb0eecb0e5c428ee5c8b6b44540954d08e8c823ceaa942" Apr 17 16:32:19.513589 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.513497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wtckq" event={"ID":"ff048557-6618-44a8-9b98-aa0325746b04","Type":"ContainerStarted","Data":"7f0683891aac8dfba17c69f6d2c24350136963bb3aa66252c05b8cf63f712bf3"} Apr 17 16:32:19.516230 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.516045 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log" Apr 17 16:32:19.516230 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.516161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" event={"ID":"c8e1510f-1cb7-4361-bb57-1944dc90fae3","Type":"ContainerStarted","Data":"8423fbad7373e865750bcd5a8c89393b7890e0f74320142d5617e3743881d945"} Apr 17 16:32:19.516877 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.516820 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:32:19.519650 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.519566 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" event={"ID":"1cd90798-a06c-4c4c-9c1a-45465cc231dd","Type":"ContainerStarted","Data":"dcf9e47007ad3712888482892717ee172aeb5207ead12b24ad3edcaed37b7ed1"} Apr 17 16:32:19.521397 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.521329 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" 
event={"ID":"19d7e13d-e66e-48ec-b132-f6e07a38ea96","Type":"ContainerStarted","Data":"bb1c21c0c7b72cbe53cce00e630366a8b3c976e64ac79197774059b73e893e23"} Apr 17 16:32:19.521397 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.521363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" event={"ID":"19d7e13d-e66e-48ec-b132-f6e07a38ea96","Type":"ContainerStarted","Data":"7abae930fdd713e42d5f4897a61900bbfa31f736ad3b6145ac1645fa8241bcf3"} Apr 17 16:32:19.522887 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.522865 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" event={"ID":"418111de-59f5-4b93-bf45-150196b0de95","Type":"ContainerStarted","Data":"d47c23892787211360aed4b6712990408df9c4c84f5bb4a7dc19bdfbc0589ea3"} Apr 17 16:32:19.524340 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.524297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" event={"ID":"289e73e3-2dca-4a41-b5bf-d6148102c16a","Type":"ContainerStarted","Data":"48346025e0f836c9d3c54eb449f9dc0887b7208ef44fe7d6c5b4d15b126be556"} Apr 17 16:32:19.525755 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.525730 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-psgfk" event={"ID":"f91565e7-f3a4-4471-a2b5-0fcd0292176b","Type":"ContainerStarted","Data":"92d97acb3ff48a437478f8ccab55e77ddd019409a97b68ad789d6f446e01dc39"} Apr 17 16:32:19.528497 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.528473 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" event={"ID":"b6f18147-3b7c-4098-b765-2b71bf2dc0f9","Type":"ContainerStarted","Data":"1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481"} Apr 17 16:32:19.528597 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.528506 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" event={"ID":"b6f18147-3b7c-4098-b765-2b71bf2dc0f9","Type":"ContainerStarted","Data":"f9e133c56ff52c1b89b6463cbd9d0127d2f8b7a0538dc7f75c0a9d9909ee3de8"} Apr 17 16:32:19.528976 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.528936 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:32:19.555989 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.555932 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" podStartSLOduration=64.555918681 podStartE2EDuration="1m4.555918681s" podCreationTimestamp="2026-04-17 16:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:32:19.554071213 +0000 UTC m=+64.932743386" watchObservedRunningTime="2026-04-17 16:32:19.555918681 +0000 UTC m=+64.934590859" Apr 17 16:32:19.556132 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.556074 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" podStartSLOduration=46.022509988 podStartE2EDuration="59.556066677s" podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.212911473 +0000 UTC m=+32.591583627" lastFinishedPulling="2026-04-17 16:32:00.746468147 +0000 UTC m=+46.125140316" observedRunningTime="2026-04-17 16:32:19.534364766 +0000 UTC m=+64.913036942" watchObservedRunningTime="2026-04-17 16:32:19.556066677 +0000 UTC m=+64.934738987" Apr 17 16:32:19.574774 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.573549 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" podStartSLOduration=59.573533101 
podStartE2EDuration="59.573533101s" podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:32:19.572516795 +0000 UTC m=+64.951188972" watchObservedRunningTime="2026-04-17 16:32:19.573533101 +0000 UTC m=+64.952205276" Apr 17 16:32:19.683831 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.683804 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:19.687092 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.686880 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:19.900537 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:19.900329 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-t8bs6" Apr 17 16:32:20.166460 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:20.165948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:20.175993 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:20.175931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/76fc56e5-185d-49c6-a5b5-79e4afc9575c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbb7t\" (UID: \"76fc56e5-185d-49c6-a5b5-79e4afc9575c\") " pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:20.251587 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:20.251488 2572 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vwmhs\"" Apr 17 16:32:20.258971 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:20.258927 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wbb7t" Apr 17 16:32:20.533209 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:20.533093 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:20.534733 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:20.534671 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79f8c9df5d-9kzhz" Apr 17 16:32:20.871340 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:20.871310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:32:20.873371 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:20.873343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e538eeb-9985-4bbd-ae4b-d6ac1469dba0-metrics-certs\") pod \"network-metrics-daemon-lt7mn\" (UID: \"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0\") " pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:32:21.139333 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:21.139260 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6m66z\"" Apr 17 16:32:21.147528 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:21.147504 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lt7mn" Apr 17 16:32:22.520601 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.520552 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wbb7t"] Apr 17 16:32:22.531601 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:22.531418 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76fc56e5_185d_49c6_a5b5_79e4afc9575c.slice/crio-f14680eb3b7b4aa7795785fa9867e19746e9a4b832cf2cf5fab8dea21167ecb1 WatchSource:0}: Error finding container f14680eb3b7b4aa7795785fa9867e19746e9a4b832cf2cf5fab8dea21167ecb1: Status 404 returned error can't find the container with id f14680eb3b7b4aa7795785fa9867e19746e9a4b832cf2cf5fab8dea21167ecb1 Apr 17 16:32:22.538955 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.538797 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lt7mn"] Apr 17 16:32:22.542472 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.542415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" event={"ID":"1cd90798-a06c-4c4c-9c1a-45465cc231dd","Type":"ContainerStarted","Data":"e95bfcec20395c21522b3f994a1b8d58cb931d813d112fe128926dd363645aa9"} Apr 17 16:32:22.544079 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.544045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" event={"ID":"418111de-59f5-4b93-bf45-150196b0de95","Type":"ContainerStarted","Data":"d3e7f0a5536481ba39193af3fa58e930866855e04aefc1853904f5dd2591d7ce"} Apr 17 16:32:22.545857 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.545806 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" 
event={"ID":"289e73e3-2dca-4a41-b5bf-d6148102c16a","Type":"ContainerStarted","Data":"6e65bb49b13bf598ce9670301c25336699b152efbfbe32b131b154624c1e87ad"} Apr 17 16:32:22.548171 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.548141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbb7t" event={"ID":"76fc56e5-185d-49c6-a5b5-79e4afc9575c","Type":"ContainerStarted","Data":"f14680eb3b7b4aa7795785fa9867e19746e9a4b832cf2cf5fab8dea21167ecb1"} Apr 17 16:32:22.549712 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:22.549644 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e538eeb_9985_4bbd_ae4b_d6ac1469dba0.slice/crio-6ab7d4b10b4cb8d186a8143015e86f054c59c5c73cbcfff1b4c4034b81504fde WatchSource:0}: Error finding container 6ab7d4b10b4cb8d186a8143015e86f054c59c5c73cbcfff1b4c4034b81504fde: Status 404 returned error can't find the container with id 6ab7d4b10b4cb8d186a8143015e86f054c59c5c73cbcfff1b4c4034b81504fde Apr 17 16:32:22.549953 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.549932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wtckq" event={"ID":"ff048557-6618-44a8-9b98-aa0325746b04","Type":"ContainerStarted","Data":"8907ba0cdc3e4253f19348164762ec9a4364355083b1001b14d4466bf2ed8a61"} Apr 17 16:32:22.567489 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.565968 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zdpww" podStartSLOduration=58.976271523 podStartE2EDuration="1m2.565951112s" podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="2026-04-17 16:32:18.765800737 +0000 UTC m=+64.144472889" lastFinishedPulling="2026-04-17 16:32:22.355480325 +0000 UTC m=+67.734152478" observedRunningTime="2026-04-17 16:32:22.564572491 +0000 UTC m=+67.943244681" 
watchObservedRunningTime="2026-04-17 16:32:22.565951112 +0000 UTC m=+67.944623292" Apr 17 16:32:22.581953 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.581889 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wtckq" podStartSLOduration=33.190787003 podStartE2EDuration="36.581868518s" podCreationTimestamp="2026-04-17 16:31:46 +0000 UTC" firstStartedPulling="2026-04-17 16:32:18.964171337 +0000 UTC m=+64.342843490" lastFinishedPulling="2026-04-17 16:32:22.355252835 +0000 UTC m=+67.733925005" observedRunningTime="2026-04-17 16:32:22.580267167 +0000 UTC m=+67.958939342" watchObservedRunningTime="2026-04-17 16:32:22.581868518 +0000 UTC m=+67.960540725" Apr 17 16:32:22.597902 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:22.597836 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-w25s5" podStartSLOduration=50.236895351 podStartE2EDuration="53.597817253s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:32:18.991140526 +0000 UTC m=+64.369812683" lastFinishedPulling="2026-04-17 16:32:22.352062414 +0000 UTC m=+67.730734585" observedRunningTime="2026-04-17 16:32:22.596394402 +0000 UTC m=+67.975066578" watchObservedRunningTime="2026-04-17 16:32:22.597817253 +0000 UTC m=+67.976489431" Apr 17 16:32:23.555258 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:23.555215 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" event={"ID":"1cd90798-a06c-4c4c-9c1a-45465cc231dd","Type":"ContainerStarted","Data":"fe2b2ab6d0315095b517c1305802450c8c1f519f3aa458e44518576d02ea1c22"} Apr 17 16:32:23.556484 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:23.556459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lt7mn" 
event={"ID":"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0","Type":"ContainerStarted","Data":"6ab7d4b10b4cb8d186a8143015e86f054c59c5c73cbcfff1b4c4034b81504fde"} Apr 17 16:32:23.558468 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:23.558264 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-psgfk" event={"ID":"f91565e7-f3a4-4471-a2b5-0fcd0292176b","Type":"ContainerStarted","Data":"121329128c9727605bb2b3e47b13819bcdb29ae95f9367c81679f729fa1f9abd"} Apr 17 16:32:23.558468 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:23.558296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-psgfk" event={"ID":"f91565e7-f3a4-4471-a2b5-0fcd0292176b","Type":"ContainerStarted","Data":"02765b42549f25e4d37e794836c4876a16b3b4edd781aade55fa7e3f169f72d3"} Apr 17 16:32:23.558468 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:23.558382 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-psgfk" Apr 17 16:32:23.559928 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:23.559895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbb7t" event={"ID":"76fc56e5-185d-49c6-a5b5-79e4afc9575c","Type":"ContainerStarted","Data":"dce15e554f981dcac1474b66a0022d5cf29acd9d2e85617f5d0c7c46c55ee90b"} Apr 17 16:32:23.572039 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:23.571999 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-chmjc" podStartSLOduration=60.188276537 podStartE2EDuration="1m3.571984551s" podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="2026-04-17 16:32:18.979687436 +0000 UTC m=+64.358359590" lastFinishedPulling="2026-04-17 16:32:22.363395437 +0000 UTC m=+67.742067604" observedRunningTime="2026-04-17 16:32:23.570631565 +0000 UTC m=+68.949303744" watchObservedRunningTime="2026-04-17 16:32:23.571984551 
+0000 UTC m=+68.950656727" Apr 17 16:32:23.587516 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:23.587466 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-psgfk" podStartSLOduration=34.208267937 podStartE2EDuration="37.587452091s" podCreationTimestamp="2026-04-17 16:31:46 +0000 UTC" firstStartedPulling="2026-04-17 16:32:18.976134764 +0000 UTC m=+64.354806920" lastFinishedPulling="2026-04-17 16:32:22.355318916 +0000 UTC m=+67.733991074" observedRunningTime="2026-04-17 16:32:23.585858257 +0000 UTC m=+68.964530433" watchObservedRunningTime="2026-04-17 16:32:23.587452091 +0000 UTC m=+68.966124266" Apr 17 16:32:24.564470 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:24.564428 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbb7t" event={"ID":"76fc56e5-185d-49c6-a5b5-79e4afc9575c","Type":"ContainerStarted","Data":"1429c56deb2a6f8dc27d31a159509854f380755d38d7a6efb2ba8cc89ad5151b"} Apr 17 16:32:24.565913 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:24.565876 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lt7mn" event={"ID":"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0","Type":"ContainerStarted","Data":"93bdd19b9d0c3f9f51a36747c0d3051d8bae0e53dd2ac73303b983f53b9f7a4c"} Apr 17 16:32:24.565913 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:24.565905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lt7mn" event={"ID":"3e538eeb-9985-4bbd-ae4b-d6ac1469dba0","Type":"ContainerStarted","Data":"b441359751f68642c89329c3ab4b173561686eccc7c42004f2351191771a4c6b"} Apr 17 16:32:24.583483 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:24.583439 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lt7mn" podStartSLOduration=67.984577295 podStartE2EDuration="1m9.583425585s" podCreationTimestamp="2026-04-17 
16:31:15 +0000 UTC" firstStartedPulling="2026-04-17 16:32:22.553140534 +0000 UTC m=+67.931812688" lastFinishedPulling="2026-04-17 16:32:24.151988822 +0000 UTC m=+69.530660978" observedRunningTime="2026-04-17 16:32:24.581544674 +0000 UTC m=+69.960216850" watchObservedRunningTime="2026-04-17 16:32:24.583425585 +0000 UTC m=+69.962097760" Apr 17 16:32:26.574678 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:26.574646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbb7t" event={"ID":"76fc56e5-185d-49c6-a5b5-79e4afc9575c","Type":"ContainerStarted","Data":"eb2454a86155b90b2b7665341c25fe1341d1bb23531ad2fcedb7ece4fa9ce580"} Apr 17 16:32:26.599901 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:26.599845 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wbb7t" podStartSLOduration=19.125797341 podStartE2EDuration="22.59982913s" podCreationTimestamp="2026-04-17 16:32:04 +0000 UTC" firstStartedPulling="2026-04-17 16:32:22.641813117 +0000 UTC m=+68.020485271" lastFinishedPulling="2026-04-17 16:32:26.115844903 +0000 UTC m=+71.494517060" observedRunningTime="2026-04-17 16:32:26.598792221 +0000 UTC m=+71.977464396" watchObservedRunningTime="2026-04-17 16:32:26.59982913 +0000 UTC m=+71.978501307" Apr 17 16:32:27.027657 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.027611 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerName="proxy-agent" probeResult="failure" output="Get \"http://10.132.0.20:8888/healthz\": dial tcp 10.132.0.20:8888: connect: connection refused" Apr 17 16:32:27.027849 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.027681 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" 
Apr 17 16:32:27.028152 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.028118 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:32:27.028270 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.028217 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="proxy-agent" containerStatusID={"Type":"cri-o","ID":"4f6b270a8bfc7cb61ff6fcb3589bf001f4e1ea601350918e7acb2e2a642f4321"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" containerMessage="Container proxy-agent failed liveness probe, will be restarted" Apr 17 16:32:27.028332 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.028309 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerName="proxy-agent" containerID="cri-o://4f6b270a8bfc7cb61ff6fcb3589bf001f4e1ea601350918e7acb2e2a642f4321" gracePeriod=30 Apr 17 16:32:27.579929 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.579898 2572 generic.go:358] "Generic (PLEG): container finished" podID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerID="4f6b270a8bfc7cb61ff6fcb3589bf001f4e1ea601350918e7acb2e2a642f4321" exitCode=2 Apr 17 16:32:27.580353 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.579969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerDied","Data":"4f6b270a8bfc7cb61ff6fcb3589bf001f4e1ea601350918e7acb2e2a642f4321"} Apr 17 16:32:27.580353 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.580006 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerStarted","Data":"b306993d0ee127acbf9f7c57338b21a474bfc67e028cd483081395181cf5ed4e"} Apr 17 16:32:27.580353 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:27.580021 2572 scope.go:117] "RemoveContainer" containerID="5f4d7a9fea46d1d7b285f1374bcd759751c49a42d9b6aec1df4500af60c59c96" Apr 17 16:32:30.593365 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:30.593332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerStarted","Data":"8a99a4700dca633212408bef67972a92f858bd04a04e678fbc94e6182ed89aa4"} Apr 17 16:32:30.615919 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:30.615874 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podStartSLOduration=22.192068625 podStartE2EDuration="1m4.615861527s" podCreationTimestamp="2026-04-17 16:31:26 +0000 UTC" firstStartedPulling="2026-04-17 16:31:47.257345379 +0000 UTC m=+32.636017531" lastFinishedPulling="2026-04-17 16:32:29.681138279 +0000 UTC m=+75.059810433" observedRunningTime="2026-04-17 16:32:30.613789467 +0000 UTC m=+75.992461641" watchObservedRunningTime="2026-04-17 16:32:30.615861527 +0000 UTC m=+75.994533729" Apr 17 16:32:33.461220 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:33.461179 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qch6k" Apr 17 16:32:33.567744 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:33.567720 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-psgfk" Apr 17 16:32:36.573314 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.573285 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-fb6d49967-7zjnb"] Apr 17 16:32:36.577718 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.577688 2572 patch_prober.go:28] interesting pod/image-registry-fb6d49967-7zjnb container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 16:32:36.577853 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.577739 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" podUID="b6f18147-3b7c-4098-b765-2b71bf2dc0f9" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:32:36.613437 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.613402 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-744dd97c69-4j2xx"] Apr 17 16:32:36.655484 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.655462 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-744dd97c69-4j2xx"] Apr 17 16:32:36.655605 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.655564 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.700772 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.700746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de97fb72-3543-47b3-b17e-ff7b86ce5de1-trusted-ca\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.700887 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.700785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de97fb72-3543-47b3-b17e-ff7b86ce5de1-image-registry-private-configuration\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.700887 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.700804 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkvj4\" (UniqueName: \"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-kube-api-access-bkvj4\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.700887 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.700835 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-registry-tls\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.700887 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.700855 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de97fb72-3543-47b3-b17e-ff7b86ce5de1-registry-certificates\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.700887 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.700876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de97fb72-3543-47b3-b17e-ff7b86ce5de1-installation-pull-secrets\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.701069 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.700973 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-bound-sa-token\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.701069 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.701008 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de97fb72-3543-47b3-b17e-ff7b86ce5de1-ca-trust-extracted\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.801660 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.801624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-registry-tls\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.801660 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.801662 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de97fb72-3543-47b3-b17e-ff7b86ce5de1-registry-certificates\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.801894 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.801684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de97fb72-3543-47b3-b17e-ff7b86ce5de1-installation-pull-secrets\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.801894 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.801713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-bound-sa-token\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.801894 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.801733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de97fb72-3543-47b3-b17e-ff7b86ce5de1-ca-trust-extracted\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 
16:32:36.801894 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.801761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de97fb72-3543-47b3-b17e-ff7b86ce5de1-trusted-ca\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.801894 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.801792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de97fb72-3543-47b3-b17e-ff7b86ce5de1-image-registry-private-configuration\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.802143 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.801924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkvj4\" (UniqueName: \"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-kube-api-access-bkvj4\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.802369 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.802330 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de97fb72-3543-47b3-b17e-ff7b86ce5de1-ca-trust-extracted\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.802923 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.802898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/de97fb72-3543-47b3-b17e-ff7b86ce5de1-registry-certificates\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.803090 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.803064 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de97fb72-3543-47b3-b17e-ff7b86ce5de1-trusted-ca\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.804363 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.804343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de97fb72-3543-47b3-b17e-ff7b86ce5de1-installation-pull-secrets\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.804464 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.804361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-registry-tls\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.804524 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.804465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de97fb72-3543-47b3-b17e-ff7b86ce5de1-image-registry-private-configuration\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.811300 
ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.811224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkvj4\" (UniqueName: \"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-kube-api-access-bkvj4\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.811551 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.811534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de97fb72-3543-47b3-b17e-ff7b86ce5de1-bound-sa-token\") pod \"image-registry-744dd97c69-4j2xx\" (UID: \"de97fb72-3543-47b3-b17e-ff7b86ce5de1\") " pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:36.964765 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:36.964739 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:37.028565 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.028302 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:32:37.028565 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.028374 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" Apr 17 16:32:37.028972 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.028948 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"bc268551ccc0207b5ecb067d593d3d14e2e9a547d96d225829c393b01d55943d"} 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 16:32:37.029054 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.029000 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" podUID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerName="service-proxy" containerID="cri-o://bc268551ccc0207b5ecb067d593d3d14e2e9a547d96d225829c393b01d55943d" gracePeriod=30 Apr 17 16:32:37.084729 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.084702 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-744dd97c69-4j2xx"] Apr 17 16:32:37.087672 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:37.087639 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde97fb72_3543_47b3_b17e_ff7b86ce5de1.slice/crio-c41d79fc8502e47803c2ae5a42a181fdd2744f9634e268ccf2f8d3db67733eff WatchSource:0}: Error finding container c41d79fc8502e47803c2ae5a42a181fdd2744f9634e268ccf2f8d3db67733eff: Status 404 returned error can't find the container with id c41d79fc8502e47803c2ae5a42a181fdd2744f9634e268ccf2f8d3db67733eff Apr 17 16:32:37.615710 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.615678 2572 generic.go:358] "Generic (PLEG): container finished" podID="cbfcaa7c-7f0a-4a57-96cc-78428bb81916" containerID="bc268551ccc0207b5ecb067d593d3d14e2e9a547d96d225829c393b01d55943d" exitCode=2 Apr 17 16:32:37.616102 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.615753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerDied","Data":"bc268551ccc0207b5ecb067d593d3d14e2e9a547d96d225829c393b01d55943d"} Apr 17 
16:32:37.616102 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.615803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-54fd8bd9fb-srkv5" event={"ID":"cbfcaa7c-7f0a-4a57-96cc-78428bb81916","Type":"ContainerStarted","Data":"f862fb96d07475763ce2b73305a8266e014c1b3f384988d0b84d4fb6e3667627"} Apr 17 16:32:37.617152 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.617131 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" event={"ID":"de97fb72-3543-47b3-b17e-ff7b86ce5de1","Type":"ContainerStarted","Data":"17c57e0514833766aa47a035f25f94e691af075c7b0c85a4014cd6f3263376f6"} Apr 17 16:32:37.617278 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.617156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" event={"ID":"de97fb72-3543-47b3-b17e-ff7b86ce5de1","Type":"ContainerStarted","Data":"c41d79fc8502e47803c2ae5a42a181fdd2744f9634e268ccf2f8d3db67733eff"} Apr 17 16:32:37.617278 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.617215 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:32:37.660886 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:37.660838 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" podStartSLOduration=1.660822181 podStartE2EDuration="1.660822181s" podCreationTimestamp="2026-04-17 16:32:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:32:37.658218557 +0000 UTC m=+83.036890734" watchObservedRunningTime="2026-04-17 16:32:37.660822181 +0000 UTC m=+83.039494357" Apr 17 16:32:46.375829 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.375790 2572 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bb578"] Apr 17 16:32:46.381804 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.381779 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.384604 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.384583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:32:46.385092 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.385072 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:32:46.385445 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.385424 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:32:46.385680 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.385660 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pjgf8\"" Apr 17 16:32:46.385997 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.385972 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:32:46.480955 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.480925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-sys\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.481080 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.480968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-wtmp\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.481080 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.480999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-textfile\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.481080 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.481033 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-root\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.481080 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.481058 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-accelerators-collector-config\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.481306 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.481091 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-metrics-client-ca\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.481306 ip-10-0-132-44 
kubenswrapper[2572]: I0417 16:32:46.481138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-tls\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.481306 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.481215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjms8\" (UniqueName: \"kubernetes.io/projected/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-kube-api-access-jjms8\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.481306 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.481243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.577391 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.577367 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:32:46.581991 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.581957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjms8\" (UniqueName: \"kubernetes.io/projected/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-kube-api-access-jjms8\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582125 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.581996 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582125 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-sys\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582125 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-sys\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582421 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-wtmp\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582421 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-textfile\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582421 ip-10-0-132-44 kubenswrapper[2572]: I0417 
16:32:46.582273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-root\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582421 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-accelerators-collector-config\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582421 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-metrics-client-ca\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582421 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-tls\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582421 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-wtmp\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 
16:32:46.582754 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:46.582475 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 16:32:46.582754 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-root\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.582754 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:32:46.582540 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-tls podName:f46749eb-09cb-49a4-a99e-7ed87d5d0ec6 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:47.08252015 +0000 UTC m=+92.461192307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-tls") pod "node-exporter-bb578" (UID: "f46749eb-09cb-49a4-a99e-7ed87d5d0ec6") : secret "node-exporter-tls" not found Apr 17 16:32:46.582944 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.582773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-textfile\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.583326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.583291 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-metrics-client-ca\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " 
pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.583326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.583299 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-accelerators-collector-config\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.584968 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.584946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:46.592489 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:46.592466 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjms8\" (UniqueName: \"kubernetes.io/projected/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-kube-api-access-jjms8\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:47.086702 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:47.086665 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-tls\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:47.088990 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:47.088965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/f46749eb-09cb-49a4-a99e-7ed87d5d0ec6-node-exporter-tls\") pod \"node-exporter-bb578\" (UID: \"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6\") " pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:47.295351 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:47.295316 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bb578" Apr 17 16:32:47.307024 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:32:47.306995 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf46749eb_09cb_49a4_a99e_7ed87d5d0ec6.slice/crio-706521034b4c50ee30dc423ce20ffdc6f54dd5005f959df18d7e4cab8884c47b WatchSource:0}: Error finding container 706521034b4c50ee30dc423ce20ffdc6f54dd5005f959df18d7e4cab8884c47b: Status 404 returned error can't find the container with id 706521034b4c50ee30dc423ce20ffdc6f54dd5005f959df18d7e4cab8884c47b Apr 17 16:32:47.647958 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:47.647921 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bb578" event={"ID":"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6","Type":"ContainerStarted","Data":"706521034b4c50ee30dc423ce20ffdc6f54dd5005f959df18d7e4cab8884c47b"} Apr 17 16:32:48.652032 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:48.651994 2572 generic.go:358] "Generic (PLEG): container finished" podID="f46749eb-09cb-49a4-a99e-7ed87d5d0ec6" containerID="0181d4e345a29860e8ba63d67df27a21a9b485cc3ee83728915e8c8eff48ec41" exitCode=0 Apr 17 16:32:48.652501 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:48.652042 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bb578" event={"ID":"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6","Type":"ContainerDied","Data":"0181d4e345a29860e8ba63d67df27a21a9b485cc3ee83728915e8c8eff48ec41"} Apr 17 16:32:49.657748 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:49.657716 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bb578" event={"ID":"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6","Type":"ContainerStarted","Data":"c6b40cae387980c7eac4c9bd3b5a9fcb0b31ee1814a7c57fe46452912410ff39"} Apr 17 16:32:49.657748 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:49.657753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bb578" event={"ID":"f46749eb-09cb-49a4-a99e-7ed87d5d0ec6","Type":"ContainerStarted","Data":"ef61abb134b4a8cbc6caeec1734516ff91af8ddc180b5308db2699dad7c3da9c"} Apr 17 16:32:49.680007 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:49.679959 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bb578" podStartSLOduration=2.83328799 podStartE2EDuration="3.679946015s" podCreationTimestamp="2026-04-17 16:32:46 +0000 UTC" firstStartedPulling="2026-04-17 16:32:47.309084017 +0000 UTC m=+92.687756173" lastFinishedPulling="2026-04-17 16:32:48.155742042 +0000 UTC m=+93.534414198" observedRunningTime="2026-04-17 16:32:49.678481815 +0000 UTC m=+95.057153991" watchObservedRunningTime="2026-04-17 16:32:49.679946015 +0000 UTC m=+95.058618189" Apr 17 16:32:58.623317 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:32:58.623288 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-744dd97c69-4j2xx" Apr 17 16:33:01.592927 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:01.592876 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" podUID="b6f18147-3b7c-4098-b765-2b71bf2dc0f9" containerName="registry" containerID="cri-o://1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481" gracePeriod=30 Apr 17 16:33:01.828070 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:01.828048 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:33:02.011068 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011036 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j24th\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-kube-api-access-j24th\") pod \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " Apr 17 16:33:02.011265 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011090 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-image-registry-private-configuration\") pod \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " Apr 17 16:33:02.011265 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011120 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-bound-sa-token\") pod \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " Apr 17 16:33:02.011265 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011156 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-trusted-ca\") pod \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " Apr 17 16:33:02.011265 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011237 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-installation-pull-secrets\") pod \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\" (UID: 
\"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " Apr 17 16:33:02.011265 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011264 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-certificates\") pod \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " Apr 17 16:33:02.011522 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011294 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") pod \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " Apr 17 16:33:02.011522 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011333 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-ca-trust-extracted\") pod \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\" (UID: \"b6f18147-3b7c-4098-b765-2b71bf2dc0f9\") " Apr 17 16:33:02.011896 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011710 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b6f18147-3b7c-4098-b765-2b71bf2dc0f9" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:33:02.011896 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.011767 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b6f18147-3b7c-4098-b765-2b71bf2dc0f9" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:33:02.013690 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.013663 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b6f18147-3b7c-4098-b765-2b71bf2dc0f9" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:33:02.013844 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.013815 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b6f18147-3b7c-4098-b765-2b71bf2dc0f9" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:33:02.013931 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.013864 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-kube-api-access-j24th" (OuterVolumeSpecName: "kube-api-access-j24th") pod "b6f18147-3b7c-4098-b765-2b71bf2dc0f9" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9"). InnerVolumeSpecName "kube-api-access-j24th". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:33:02.013931 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.013905 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b6f18147-3b7c-4098-b765-2b71bf2dc0f9" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:33:02.014028 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.014011 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b6f18147-3b7c-4098-b765-2b71bf2dc0f9" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:33:02.019288 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.019266 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b6f18147-3b7c-4098-b765-2b71bf2dc0f9" (UID: "b6f18147-3b7c-4098-b765-2b71bf2dc0f9"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:33:02.112685 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.112664 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-ca-trust-extracted\") on node \"ip-10-0-132-44.ec2.internal\" DevicePath \"\"" Apr 17 16:33:02.112685 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.112684 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j24th\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-kube-api-access-j24th\") on node \"ip-10-0-132-44.ec2.internal\" DevicePath \"\"" Apr 17 16:33:02.112799 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.112694 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-image-registry-private-configuration\") on node \"ip-10-0-132-44.ec2.internal\" DevicePath \"\"" Apr 17 16:33:02.112799 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.112706 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-bound-sa-token\") on node \"ip-10-0-132-44.ec2.internal\" DevicePath \"\"" Apr 17 16:33:02.112799 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.112714 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-trusted-ca\") on node \"ip-10-0-132-44.ec2.internal\" DevicePath \"\"" Apr 17 16:33:02.112799 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.112722 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-installation-pull-secrets\") on node \"ip-10-0-132-44.ec2.internal\" DevicePath \"\"" Apr 17 
16:33:02.112799 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.112731 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-certificates\") on node \"ip-10-0-132-44.ec2.internal\" DevicePath \"\"" Apr 17 16:33:02.112799 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.112739 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b6f18147-3b7c-4098-b765-2b71bf2dc0f9-registry-tls\") on node \"ip-10-0-132-44.ec2.internal\" DevicePath \"\"" Apr 17 16:33:02.702944 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.702905 2572 generic.go:358] "Generic (PLEG): container finished" podID="b6f18147-3b7c-4098-b765-2b71bf2dc0f9" containerID="1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481" exitCode=0 Apr 17 16:33:02.703348 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.702993 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" Apr 17 16:33:02.703348 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.702995 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" event={"ID":"b6f18147-3b7c-4098-b765-2b71bf2dc0f9","Type":"ContainerDied","Data":"1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481"} Apr 17 16:33:02.703348 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.703043 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fb6d49967-7zjnb" event={"ID":"b6f18147-3b7c-4098-b765-2b71bf2dc0f9","Type":"ContainerDied","Data":"f9e133c56ff52c1b89b6463cbd9d0127d2f8b7a0538dc7f75c0a9d9909ee3de8"} Apr 17 16:33:02.703348 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.703061 2572 scope.go:117] "RemoveContainer" containerID="1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481" Apr 17 16:33:02.711764 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.711747 2572 scope.go:117] "RemoveContainer" containerID="1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481" Apr 17 16:33:02.711992 ip-10-0-132-44 kubenswrapper[2572]: E0417 16:33:02.711957 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481\": container with ID starting with 1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481 not found: ID does not exist" containerID="1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481" Apr 17 16:33:02.712039 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.711998 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481"} err="failed to get container status 
\"1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481\": rpc error: code = NotFound desc = could not find container \"1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481\": container with ID starting with 1b18581a3d7c9d7a3e9f30684fbe7380b38fab30a58f229a097cf36a94ecf481 not found: ID does not exist" Apr 17 16:33:02.726514 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.726494 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-fb6d49967-7zjnb"] Apr 17 16:33:02.730216 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:02.730178 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-fb6d49967-7zjnb"] Apr 17 16:33:03.192061 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:03.192031 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f18147-3b7c-4098-b765-2b71bf2dc0f9" path="/var/lib/kubelet/pods/b6f18147-3b7c-4098-b765-2b71bf2dc0f9/volumes" Apr 17 16:33:12.732863 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:12.732826 2572 generic.go:358] "Generic (PLEG): container finished" podID="4050d306-2166-4792-b643-4e16417bd406" containerID="37dae34c79a3db2defe79eb69ad2f86f17c1ef70e55c17798fbaaf1dd11d643f" exitCode=0 Apr 17 16:33:12.733336 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:12.732904 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" event={"ID":"4050d306-2166-4792-b643-4e16417bd406","Type":"ContainerDied","Data":"37dae34c79a3db2defe79eb69ad2f86f17c1ef70e55c17798fbaaf1dd11d643f"} Apr 17 16:33:12.733336 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:12.733234 2572 scope.go:117] "RemoveContainer" containerID="37dae34c79a3db2defe79eb69ad2f86f17c1ef70e55c17798fbaaf1dd11d643f" Apr 17 16:33:13.737652 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:13.737618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dnxcv" event={"ID":"4050d306-2166-4792-b643-4e16417bd406","Type":"ContainerStarted","Data":"fb9f08b6c76adaa7e479ede579fc8e4a85596d8e3cfa185cbd20ceeae50be81c"} Apr 17 16:33:27.780066 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:27.779986 2572 generic.go:358] "Generic (PLEG): container finished" podID="81ae48b9-3879-4953-b4f9-833feac79819" containerID="38d39f89afff7f38e6400bb085d3ee19cb2245f3faf511749ee766885c2edf68" exitCode=0 Apr 17 16:33:27.780485 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:27.780059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" event={"ID":"81ae48b9-3879-4953-b4f9-833feac79819","Type":"ContainerDied","Data":"38d39f89afff7f38e6400bb085d3ee19cb2245f3faf511749ee766885c2edf68"} Apr 17 16:33:27.780485 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:27.780371 2572 scope.go:117] "RemoveContainer" containerID="38d39f89afff7f38e6400bb085d3ee19cb2245f3faf511749ee766885c2edf68" Apr 17 16:33:28.784685 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:28.784648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-78xv2" event={"ID":"81ae48b9-3879-4953-b4f9-833feac79819","Type":"ContainerStarted","Data":"bef3454692db13aa0f493326ce2fe806b317de36e08f5743e37130d14b8bd24e"} Apr 17 16:33:31.794594 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:31.794520 2572 generic.go:358] "Generic (PLEG): container finished" podID="4138b3cf-4356-4853-b790-fdfd4d1b8d21" containerID="b3bcc011f61f6aa809b22d2fc845912868395d4d93caad1c31604ac1b9e6dbf8" exitCode=0 Apr 17 16:33:31.794594 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:31.794558 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ftwtr" 
event={"ID":"4138b3cf-4356-4853-b790-fdfd4d1b8d21","Type":"ContainerDied","Data":"b3bcc011f61f6aa809b22d2fc845912868395d4d93caad1c31604ac1b9e6dbf8"} Apr 17 16:33:31.795085 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:31.794855 2572 scope.go:117] "RemoveContainer" containerID="b3bcc011f61f6aa809b22d2fc845912868395d4d93caad1c31604ac1b9e6dbf8" Apr 17 16:33:32.803042 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:33:32.803007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ftwtr" event={"ID":"4138b3cf-4356-4853-b790-fdfd4d1b8d21","Type":"ContainerStarted","Data":"73366310f6fbdd34c44fef3e84112668fbd827f4db9cc7d47d7d136048631d26"} Apr 17 16:36:15.076967 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:36:15.076936 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log" Apr 17 16:36:15.077691 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:36:15.077662 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log" Apr 17 16:36:15.082391 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:36:15.082366 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log" Apr 17 16:36:15.083086 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:36:15.083067 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log" Apr 17 16:36:15.088365 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:36:15.088346 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:41:15.098935 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:41:15.098907 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log"
Apr 17 16:41:15.100980 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:41:15.100955 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log"
Apr 17 16:41:15.103459 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:41:15.103438 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:41:15.105487 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:41:15.105469 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:46:15.119413 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:46:15.119381 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log"
Apr 17 16:46:15.122438 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:46:15.122415 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log"
Apr 17 16:46:15.124070 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:46:15.124052 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:46:15.127228 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:46:15.127210 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:51:15.139657 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:51:15.139623 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log"
Apr 17 16:51:15.144063 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:51:15.144040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log"
Apr 17 16:51:15.144200 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:51:15.144099 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:51:15.148337 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:51:15.148320 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:54:16.906340 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:16.906313 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5v2lr_345c95f5-2f92-4ba5-8afd-6484fb524fad/global-pull-secret-syncer/0.log"
Apr 17 16:54:17.074540 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:17.074494 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fh942_bab37d78-677d-45dc-81ad-fba92f1bf0c6/konnectivity-agent/0.log"
Apr 17 16:54:17.166517 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:17.166448 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-44.ec2.internal_97b6bf974c89d22078b0dac98765eb2d/haproxy/0.log"
Apr 17 16:54:21.102795 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:21.102755 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zdpww_418111de-59f5-4b93-bf45-150196b0de95/cluster-monitoring-operator/0.log"
Apr 17 16:54:21.261545 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:21.261517 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bb578_f46749eb-09cb-49a4-a99e-7ed87d5d0ec6/node-exporter/0.log"
Apr 17 16:54:21.283771 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:21.283751 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bb578_f46749eb-09cb-49a4-a99e-7ed87d5d0ec6/kube-rbac-proxy/0.log"
Apr 17 16:54:21.307489 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:21.307471 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bb578_f46749eb-09cb-49a4-a99e-7ed87d5d0ec6/init-textfile/0.log"
Apr 17 16:54:23.148692 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.148660 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-w25s5_289e73e3-2dca-4a41-b5bf-d6148102c16a/networking-console-plugin/0.log"
Apr 17 16:54:23.569783 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.569716 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/1.log"
Apr 17 16:54:23.573697 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.573677 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-t8bs6_c8e1510f-1cb7-4361-bb57-1944dc90fae3/console-operator/2.log"
Apr 17 16:54:23.981217 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.981174 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"]
Apr 17 16:54:23.981477 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.981465 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6f18147-3b7c-4098-b765-2b71bf2dc0f9" containerName="registry"
Apr 17 16:54:23.981519 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.981478 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f18147-3b7c-4098-b765-2b71bf2dc0f9" containerName="registry"
Apr 17 16:54:23.981556 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.981547 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6f18147-3b7c-4098-b765-2b71bf2dc0f9" containerName="registry"
Apr 17 16:54:23.984456 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.984436 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:23.987142 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.987111 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l7rfq\"/\"kube-root-ca.crt\""
Apr 17 16:54:23.987267 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.987112 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l7rfq\"/\"openshift-service-ca.crt\""
Apr 17 16:54:23.987267 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.987231 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-l7rfq\"/\"default-dockercfg-9vr2t\""
Apr 17 16:54:23.994479 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:23.994459 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"]
Apr 17 16:54:24.072697 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.072668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-lib-modules\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.072697 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.072697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-proc\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.072849 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.072716 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-podres\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.072849 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.072779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-sys\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.072849 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.072839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvfs2\" (UniqueName: \"kubernetes.io/projected/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-kube-api-access-gvfs2\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.173928 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.173902 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvfs2\" (UniqueName: \"kubernetes.io/projected/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-kube-api-access-gvfs2\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.174326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.173943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-lib-modules\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.174326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.173960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-proc\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.174326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.173975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-podres\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.174326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.173998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-sys\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.174326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.174056 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-proc\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.174326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.174079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-sys\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.174326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.174117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-podres\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.174326 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.174117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-lib-modules\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.182039 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.182013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvfs2\" (UniqueName: \"kubernetes.io/projected/e0f46fb5-ffad-4b6e-aa59-161f40529fc5-kube-api-access-gvfs2\") pod \"perf-node-gather-daemonset-st7c7\" (UID: \"e0f46fb5-ffad-4b6e-aa59-161f40529fc5\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.295182 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.295134 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:24.391013 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.390989 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-mkhdq_c7871d70-1e63-496f-835f-a447b5d1800c/volume-data-source-validator/0.log"
Apr 17 16:54:24.410204 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.410162 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"]
Apr 17 16:54:24.412679 ip-10-0-132-44 kubenswrapper[2572]: W0417 16:54:24.412650 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode0f46fb5_ffad_4b6e_aa59_161f40529fc5.slice/crio-f8f65dd2140c2a4fbb314ff801c1dccbf050a547dd107353bbe8fbc5c3d4a396 WatchSource:0}: Error finding container f8f65dd2140c2a4fbb314ff801c1dccbf050a547dd107353bbe8fbc5c3d4a396: Status 404 returned error can't find the container with id f8f65dd2140c2a4fbb314ff801c1dccbf050a547dd107353bbe8fbc5c3d4a396
Apr 17 16:54:24.414122 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:24.414106 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:54:25.084279 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.084255 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-psgfk_f91565e7-f3a4-4471-a2b5-0fcd0292176b/dns/0.log"
Apr 17 16:54:25.103920 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.103901 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-psgfk_f91565e7-f3a4-4471-a2b5-0fcd0292176b/kube-rbac-proxy/0.log"
Apr 17 16:54:25.174669 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.174649 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rtbw5_33d7fffe-ef0b-495e-adb5-fbc82f11a1f0/dns-node-resolver/0.log"
Apr 17 16:54:25.340169 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.340087 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7" event={"ID":"e0f46fb5-ffad-4b6e-aa59-161f40529fc5","Type":"ContainerStarted","Data":"eccaebb4d8ab600c5ff231d4fda90119ac6a42f36fee16939ec540f64b0eafcd"}
Apr 17 16:54:25.340169 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.340125 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7" event={"ID":"e0f46fb5-ffad-4b6e-aa59-161f40529fc5","Type":"ContainerStarted","Data":"f8f65dd2140c2a4fbb314ff801c1dccbf050a547dd107353bbe8fbc5c3d4a396"}
Apr 17 16:54:25.340169 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.340155 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:25.355715 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.355680 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7" podStartSLOduration=2.35566784 podStartE2EDuration="2.35566784s" podCreationTimestamp="2026-04-17 16:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:25.354005354 +0000 UTC m=+1390.732677546" watchObservedRunningTime="2026-04-17 16:54:25.35566784 +0000 UTC m=+1390.734340052"
Apr 17 16:54:25.560266 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.560234 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-744dd97c69-4j2xx_de97fb72-3543-47b3-b17e-ff7b86ce5de1/registry/0.log"
Apr 17 16:54:25.577772 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:25.577752 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b58dr_908fb34a-55a4-4783-af03-7b4b7c408f98/node-ca/0.log"
Apr 17 16:54:26.305788 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:26.305759 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79f8c9df5d-9kzhz_19d7e13d-e66e-48ec-b132-f6e07a38ea96/router/0.log"
Apr 17 16:54:26.669993 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:26.669960 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wtckq_ff048557-6618-44a8-9b98-aa0325746b04/serve-healthcheck-canary/0.log"
Apr 17 16:54:26.996875 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:26.996799 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-ftwtr_4138b3cf-4356-4853-b790-fdfd4d1b8d21/insights-operator/0.log"
Apr 17 16:54:26.997304 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:26.997287 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-ftwtr_4138b3cf-4356-4853-b790-fdfd4d1b8d21/insights-operator/1.log"
Apr 17 16:54:27.084355 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:27.084320 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbb7t_76fc56e5-185d-49c6-a5b5-79e4afc9575c/kube-rbac-proxy/0.log"
Apr 17 16:54:27.103422 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:27.103404 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbb7t_76fc56e5-185d-49c6-a5b5-79e4afc9575c/exporter/0.log"
Apr 17 16:54:27.123183 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:27.123158 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbb7t_76fc56e5-185d-49c6-a5b5-79e4afc9575c/extractor/0.log"
Apr 17 16:54:31.351633 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:31.351604 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-st7c7"
Apr 17 16:54:33.118958 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:33.118930 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wbp5w_4758db3d-9d86-4b79-bb5f-feb582fbe734/migrator/0.log"
Apr 17 16:54:33.140728 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:33.140702 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wbp5w_4758db3d-9d86-4b79-bb5f-feb582fbe734/graceful-termination/0.log"
Apr 17 16:54:33.538265 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:33.538145 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dnxcv_4050d306-2166-4792-b643-4e16417bd406/kube-storage-version-migrator-operator/1.log"
Apr 17 16:54:33.539075 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:33.539054 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dnxcv_4050d306-2166-4792-b643-4e16417bd406/kube-storage-version-migrator-operator/0.log"
Apr 17 16:54:34.530702 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:34.530634 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2fb27_01242e37-ff64-4fe3-825f-6cb0f669e5b7/kube-multus/0.log"
Apr 17 16:54:34.814106 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:34.814044 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gl5np_665698fb-87f6-4ef0-a908-b1d2f13eb9d6/kube-multus-additional-cni-plugins/0.log"
Apr 17 16:54:34.840610 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:34.840592 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gl5np_665698fb-87f6-4ef0-a908-b1d2f13eb9d6/egress-router-binary-copy/0.log"
Apr 17 16:54:34.863671 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:34.863650 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gl5np_665698fb-87f6-4ef0-a908-b1d2f13eb9d6/cni-plugins/0.log"
Apr 17 16:54:34.886599 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:34.886580 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gl5np_665698fb-87f6-4ef0-a908-b1d2f13eb9d6/bond-cni-plugin/0.log"
Apr 17 16:54:34.908748 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:34.908725 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gl5np_665698fb-87f6-4ef0-a908-b1d2f13eb9d6/routeoverride-cni/0.log"
Apr 17 16:54:34.931364 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:34.931342 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gl5np_665698fb-87f6-4ef0-a908-b1d2f13eb9d6/whereabouts-cni-bincopy/0.log"
Apr 17 16:54:34.955059 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:34.955032 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gl5np_665698fb-87f6-4ef0-a908-b1d2f13eb9d6/whereabouts-cni/0.log"
Apr 17 16:54:35.172831 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:35.172810 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lt7mn_3e538eeb-9985-4bbd-ae4b-d6ac1469dba0/network-metrics-daemon/0.log"
Apr 17 16:54:35.190899 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:35.190880 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lt7mn_3e538eeb-9985-4bbd-ae4b-d6ac1469dba0/kube-rbac-proxy/0.log"
Apr 17 16:54:36.672530 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.672493 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-controller/0.log"
Apr 17 16:54:36.692169 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.692146 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/0.log"
Apr 17 16:54:36.697866 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.697829 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovn-acl-logging/1.log"
Apr 17 16:54:36.716414 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.716389 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/kube-rbac-proxy-node/0.log"
Apr 17 16:54:36.738219 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.738133 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 16:54:36.760709 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.760687 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/northd/0.log"
Apr 17 16:54:36.780807 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.780792 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/nbdb/0.log"
Apr 17 16:54:36.802127 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.802109 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/sbdb/0.log"
Apr 17 16:54:36.890107 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:36.890077 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nnv8v_2a93066e-fe3b-416c-ab13-3098d92ffb5f/ovnkube-controller/0.log"
Apr 17 16:54:37.800174 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:37.800145 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-nlj65_5f749b09-28ed-4c6f-a64f-2df1f97857d6/check-endpoints/0.log"
Apr 17 16:54:37.880163 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:37.880139 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qch6k_d9099ff1-798e-434b-8980-189e358b2f96/network-check-target-container/0.log"
Apr 17 16:54:38.805507 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:38.805481 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jl2wf_9c85e109-cdf7-4611-80ae-1231d32d04b9/iptables-alerter/0.log"
Apr 17 16:54:39.415872 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:39.415842 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-svpg4_757f13f2-9e57-4475-b1d7-97713e23ab42/tuned/0.log"
Apr 17 16:54:41.199658 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:41.199629 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-chmjc_1cd90798-a06c-4c4c-9c1a-45465cc231dd/cluster-samples-operator/0.log"
Apr 17 16:54:41.214815 ip-10-0-132-44 kubenswrapper[2572]: I0417 16:54:41.214785 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-chmjc_1cd90798-a06c-4c4c-9c1a-45465cc231dd/cluster-samples-operator-watch/0.log"