Apr 24 22:29:31.049868 ip-10-0-136-66 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:29:31.561024 ip-10-0-136-66 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:31.561024 ip-10-0-136-66 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:29:31.561024 ip-10-0-136-66 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:31.561024 ip-10-0-136-66 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:29:31.561024 ip-10-0-136-66 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:31.562975 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.562883    2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:29:31.567706 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567681    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:31.567706 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567700    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:31.567706 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567705    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:31.567706 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567709    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:31.567706 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567712    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567716    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567720    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567724    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567729    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567733    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567736    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567741    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567745    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567748    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567752    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567756    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567759    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567763    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567767    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567770    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567774    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567778    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567782    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:31.567991 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567786    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567790    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567794    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567802    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567807    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567811    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567814    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567819    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567823    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567826    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567830    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567834    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567838    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567842    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567847    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567852    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567857    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567861    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567865    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567869    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:31.568763 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567873    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567879    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567884    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567888    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567892    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567896    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567904    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567909    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567913    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567917    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567921    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567925    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567930    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567934    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567938    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567941    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567945    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567950    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:31.569478 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567953    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567958    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567961    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567965    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567970    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567974    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567978    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567982    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567986    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567992    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.567996    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568000    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568004    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568008    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568012    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568016    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568020    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568026    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568030    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568035    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568042    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:31.569950 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568046    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568050    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568054    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568061    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568704    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568716    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568721    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568726    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568730    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568735    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568739    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568743    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568747    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568751    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568756    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568762    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568767    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568770    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568774    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:31.570764 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568778    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568782    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568786    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568790    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568795    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568800    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568804    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568808    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568812    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568818    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568822    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568826    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568831    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568834    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568838    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568843    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568847    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568851    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568856    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568860    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:31.571546 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568864    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568870    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568874    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568877    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568881    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568886    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568891    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568895    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568899    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568903    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568907    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568911    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568917    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568923    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568927    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568931    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568935    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568939    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568943    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:31.572067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568947    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568951    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568955    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568959    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568964    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568968    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568972    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568976    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568980    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568984    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568989    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568993    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.568997    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569002    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569006    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569010    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569014    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569018    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569022    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569026    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:31.572527 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569030    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569034    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569038    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569042    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569047    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569051    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569055    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569059    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569064    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569068    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569072    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.569076    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570056    2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570072    2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570093    2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570106    2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570113    2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570119    2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570125    2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570137    2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570143    2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570148    2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:29:31.573158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570154    2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570159    2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570164    2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570169    2568 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570174    2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570179    2568 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570184    2568 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570188    2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570199    2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570205    2568 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570210    2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570215    2568 flags.go:64] FLAG: --config-dir=""
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570220    2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570226    2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570232    2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570237    2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570242    2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570248    2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570252    2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570257    2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570262    2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570268    2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570272    2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570279    2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570284    2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:29:31.573938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570288    2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570293    2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570302    2568 flags.go:64] FLAG: --enable-server="true"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570307    2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570314    2568 flags.go:64] FLAG: --event-burst="100"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570319    2568 flags.go:64] FLAG: --event-qps="50"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570324    2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570329    2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570334    2568 flags.go:64] FLAG: --eviction-hard=""
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570340    2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570345    2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570350    2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570355    2568 flags.go:64] FLAG: --eviction-soft=""
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570359    2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570364    2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570370    2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570375    2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570379    2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570384    2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570389    2568 flags.go:64] FLAG: --feature-gates=""
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570395    2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570400    2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570405    2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 22:29:31.574541
ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570410 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570414 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:29:31.574541 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570419 2568 flags.go:64] FLAG: --help="false" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570424 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-136-66.ec2.internal" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570430 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570434 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570439 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570444 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570450 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570454 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570459 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570463 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570469 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570474 2568 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570479 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570483 2568 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570489 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570493 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570498 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570503 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570507 2568 flags.go:64] FLAG: --lock-file="" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570512 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570517 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570521 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570531 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:29:31.575267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570536 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570541 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570545 2568 flags.go:64] FLAG: --logging-format="text" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 
22:29:31.570550 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570556 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570560 2568 flags.go:64] FLAG: --manifest-url="" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570583 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570590 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570595 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570601 2568 flags.go:64] FLAG: --max-pods="110" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570606 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570610 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570615 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570619 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570624 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570629 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570633 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570646 2568 flags.go:64] 
FLAG: --node-status-max-images="50" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570650 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570656 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570662 2568 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570667 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570675 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570679 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:29:31.575834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570685 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570690 2568 flags.go:64] FLAG: --port="10250" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570695 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570699 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-067500c1042a85927" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570705 2568 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570709 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570714 2568 flags.go:64] FLAG: --register-node="true" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570721 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 24 
22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570725 2568 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570731 2568 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570737 2568 flags.go:64] FLAG: --registry-qps="5" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570741 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570746 2568 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570752 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570757 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570761 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570766 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570770 2568 flags.go:64] FLAG: --runonce="false" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570775 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570780 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570784 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570789 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570793 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" 
Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570798 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570803 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570808 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:29:31.576426 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570812 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570817 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570822 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570827 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570832 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570837 2568 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570842 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570851 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570855 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570860 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570866 2568 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570870 2568 
flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570875 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570880 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570885 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570890 2568 flags.go:64] FLAG: --v="2" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570896 2568 flags.go:64] FLAG: --version="false" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570902 2568 flags.go:64] FLAG: --vmodule="" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570909 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.570914 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571067 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571073 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571077 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571082 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:31.577067 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571086 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 
22:29:31.571090 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571094 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571098 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571102 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571107 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571111 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571115 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571119 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571123 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571127 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571132 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571137 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571141 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: 
W0424 22:29:31.571150 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571155 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571181 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571188 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571193 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571197 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:31.577650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571202 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571206 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571211 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571215 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571219 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571232 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571237 2568 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571241 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571246 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571250 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571255 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571259 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571263 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571267 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571271 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571275 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571281 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571287 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571291 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:31.578149 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571296 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571312 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571319 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571324 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571328 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571332 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571338 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571344 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571348 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571352 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571356 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571360 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571365 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571369 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571373 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571377 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571382 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571385 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571390 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571394 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:31.578695 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571398 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571402 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571406 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571410 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571414 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571418 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571422 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571426 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571430 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571436 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571442 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571447 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571452 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571456 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571460 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571464 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571468 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571472 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571476 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:31.579166 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571482 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:31.579656 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571486 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:31.579656 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571490 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:31.579656 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.571494 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:31.579656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.572542 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:31.579773 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.579663 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 22:29:31.579802 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.579775 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579825 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579830 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579834 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579837 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579841 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579844 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579846 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579849 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579852 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579855 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579858 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579861 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579863 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579866 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579869 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579871 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:31.579867 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579874 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579877 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579880 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579883 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579886 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579889 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579891 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579894 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579897 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579899 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579902 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579904 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579907 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579909 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579912 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579918 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:31.580347
ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579920 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579923 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579925 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579928 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:31.580347 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579930 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579933 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579935 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579937 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579940 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579942 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579945 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579947 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579950 2568 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579952 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579955 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579957 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579959 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579962 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579965 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579969 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579971 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579974 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579977 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579980 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:31.580922 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579982 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 
22:29:31.579985 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579987 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579990 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579992 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.579996 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580000 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580003 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580006 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580008 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580011 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580014 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580016 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580019 2568 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580021 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580024 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580026 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580029 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580031 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:31.581406 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580034 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580036 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580039 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580043 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580046 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580049 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580052 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580056 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580059 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580062 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580064 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.580070 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580165 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580169 2568 feature_gate.go:328] unrecognized feature 
gate: VSphereMultiDisk Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580172 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:31.581892 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580175 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580178 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580181 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580184 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580186 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580189 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580192 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580196 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580198 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580201 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580203 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580206 2568 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580208 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580210 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580213 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580215 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580218 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580220 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580223 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580225 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:31.582248 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580227 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580230 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580233 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580235 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:31.582774 
ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580238 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580241 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580244 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580246 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580249 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580251 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580255 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580259 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580261 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580264 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580267 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580269 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580271 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580274 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580276 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:31.582774 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580279 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580659 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580663 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580666 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580668 2568 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580671 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580674 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580676 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580678 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580681 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580684 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580686 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580689 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580691 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580694 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580696 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580699 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580701 2568 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580704 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580706 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:31.583212 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580709 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580712 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580714 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580716 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580718 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580721 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580723 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580726 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580728 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580730 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: 
W0424 22:29:31.580733 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580735 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580737 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580740 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580743 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580745 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580748 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580750 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580753 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 22:29:31.583727 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580756 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:31.584528 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580759 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:31.584528 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580761 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:31.584528 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580764 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:31.584528 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:31.580766 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:31.584528 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.580771 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:31.584528 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.581711 2568 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 22:29:31.585468 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.585453 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 22:29:31.586510 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.586499 2568 server.go:1019] "Starting client certificate rotation" Apr 24 22:29:31.586611 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.586594 2568 certificate_manager.go:422] "Certificate 
rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 22:29:31.586668 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.586640 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 22:29:31.615682 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.615652 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 22:29:31.620358 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.620338 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 22:29:31.638095 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.638070 2568 log.go:25] "Validated CRI v1 runtime API" Apr 24 22:29:31.645345 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.645331 2568 log.go:25] "Validated CRI v1 image API" Apr 24 22:29:31.646799 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.646782 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 22:29:31.646874 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.646797 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:29:31.652041 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.652023 2568 fs.go:135] Filesystem UUIDs: map[24b1619b-d893-47fb-8d32-0deeea3ceeaf:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 960a86b0-7c30-4dfc-94f9-b6968b67ad46:/dev/nvme0n1p4] Apr 24 22:29:31.652090 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.652043 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} 
/run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 22:29:31.658775 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.658662 2568 manager.go:217] Machine: {Timestamp:2026-04-24 22:29:31.656484664 +0000 UTC m=+0.472810733 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098738 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec210ab154d535c03eca8c21f7f44755 SystemUUID:ec210ab1-54d5-35c0-3eca-8c21f7f44755 BootID:01f3c425-cd20-407c-9678-77d176dcfd6b Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:55:30:cd:9c:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:55:30:cd:9c:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8a:3c:28:0f:d0:4f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 
Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 22:29:31.658775 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.658771 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 22:29:31.658902 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.658890 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 22:29:31.660173 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.660145 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 22:29:31.660324 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.660177 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-136-66.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 22:29:31.660370 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.660336 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 22:29:31.660370 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.660345 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 22:29:31.660370 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.660359 2568 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:29:31.661262 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.661252 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:29:31.663272 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.663261 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:29:31.663383 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.663375 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 22:29:31.666168 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.666159 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 24 22:29:31.666200 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.666172 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 22:29:31.666200 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.666196 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 22:29:31.666277 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.666205 2568 kubelet.go:397] "Adding apiserver pod source" Apr 24 22:29:31.666277 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.666217 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 22:29:31.667581 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.667555 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:29:31.667630 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.667588 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:29:31.670991 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.670979 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 22:29:31.672764 ip-10-0-136-66 
kubenswrapper[2568]: I0424 22:29:31.672752 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 22:29:31.674975 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.674963 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 22:29:31.675016 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.674981 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 22:29:31.675016 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.674988 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 22:29:31.675016 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.674993 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 22:29:31.675016 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.674999 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 22:29:31.675016 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.675005 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 22:29:31.675016 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.675010 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 22:29:31.675016 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.675015 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:29:31.675240 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.675022 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:29:31.675240 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.675027 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:29:31.675240 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.675036 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:29:31.675240 
ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.675044 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:29:31.676078 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.676068 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:29:31.676109 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.676079 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:29:31.679852 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.679838 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:29:31.679933 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.679877 2568 server.go:1295] "Started kubelet" Apr 24 22:29:31.683101 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.679983 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 22:29:31.683669 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.683645 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-66.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 22:29:31.683780 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.683692 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-66.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 22:29:31.683786 ip-10-0-136-66 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 22:29:31.684082 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.684052 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 22:29:31.684296 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.684163 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 22:29:31.684353 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.684309 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 22:29:31.687032 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.687000 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 22:29:31.688439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.688399 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 24 22:29:31.690306 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.689144 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-66.ec2.internal.18a96b8ac20fde7e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-66.ec2.internal,UID:ip-10-0-136-66.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-66.ec2.internal,},FirstTimestamp:2026-04-24 22:29:31.67985011 +0000 UTC m=+0.496176179,LastTimestamp:2026-04-24 22:29:31.67985011 +0000 UTC m=+0.496176179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-66.ec2.internal,}" Apr 24 22:29:31.694384 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.694367 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 22:29:31.695248 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.695225 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pgzs2" Apr 24 22:29:31.700374 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.700358 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 22:29:31.700450 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.700379 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 22:29:31.700773 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.700754 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pgzs2" Apr 24 22:29:31.701134 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701112 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 22:29:31.701134 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701114 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 22:29:31.701263 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701145 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 22:29:31.701263 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701215 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 24 22:29:31.701263 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701224 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 24 22:29:31.701607 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701446 2568 factory.go:221] Registration 
of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 22:29:31.701607 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701462 2568 factory.go:55] Registering systemd factory Apr 24 22:29:31.701607 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701471 2568 factory.go:223] Registration of the systemd container factory successfully Apr 24 22:29:31.701607 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.701528 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:31.701785 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701757 2568 factory.go:153] Registering CRI-O factory Apr 24 22:29:31.701785 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701775 2568 factory.go:223] Registration of the crio container factory successfully Apr 24 22:29:31.701870 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701800 2568 factory.go:103] Registering Raw factory Apr 24 22:29:31.701870 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.701814 2568 manager.go:1196] Started watching for new ooms in manager Apr 24 22:29:31.702188 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.702172 2568 manager.go:319] Starting recovery of all containers Apr 24 22:29:31.703190 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.703169 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 22:29:31.703282 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.703262 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io 
\"ip-10-0-136-66.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 22:29:31.709381 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.709363 2568 manager.go:324] Recovery completed Apr 24 22:29:31.713012 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.712999 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:31.715730 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.715714 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:31.715816 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.715748 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:31.715816 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.715763 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:31.716274 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.716258 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 22:29:31.716345 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.716275 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 22:29:31.716345 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.716293 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:29:31.720136 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.720124 2568 policy_none.go:49] "None policy: Start" Apr 24 22:29:31.720184 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.720140 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 22:29:31.720184 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.720150 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 24 
22:29:31.765116 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.765102 2568 manager.go:341] "Starting Device Plugin manager" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.765167 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.765180 2568 server.go:85] "Starting device plugin registration server" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.765409 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.765421 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.765502 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.765606 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.765615 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.766325 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.766358 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.779225 2568 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.780510 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.780534 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.780550 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.780556 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.780648 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 22:29:31.790779 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.782921 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:31.866246 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.866154 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:31.867143 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.867127 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:31.867248 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.867156 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:31.867248 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.867168 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:31.867248 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.867192 2568 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-66.ec2.internal" Apr 24 22:29:31.876856 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.876836 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-66.ec2.internal" Apr 24 22:29:31.876967 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.876862 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-66.ec2.internal\": node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:31.880889 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.880866 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal"] Apr 24 22:29:31.880971 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.880929 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:31.881946 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.881930 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:31.882033 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.881958 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:31.882033 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.881971 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:31.884392 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.884380 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:31.884518 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.884504 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:31.884554 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.884533 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:31.885080 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.885066 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:31.885143 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.885082 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:31.885143 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.885099 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:31.885143 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.885087 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:31.885143 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.885108 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:31.885143 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.885116 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:31.887408 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.887394 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" Apr 24 22:29:31.887460 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.887418 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:31.888516 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.888503 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:31.888610 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.888529 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:31.888610 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:31.888538 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:31.891988 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.891969 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:31.906160 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.906143 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-66.ec2.internal\" not found" node="ip-10-0-136-66.ec2.internal" Apr 24 22:29:31.910525 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.910508 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-66.ec2.internal\" not found" node="ip-10-0-136-66.ec2.internal" Apr 24 22:29:31.992832 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:31.992811 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.002238 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.002221 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a49d08f0104145377483dfb31696bdfe-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal\" (UID: \"a49d08f0104145377483dfb31696bdfe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.002315 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.002264 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a49d08f0104145377483dfb31696bdfe-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal\" (UID: \"a49d08f0104145377483dfb31696bdfe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.002315 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.002283 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2f140c6571039b5778eaeb104c6d62fa-config\") pod \"kube-apiserver-proxy-ip-10-0-136-66.ec2.internal\" (UID: \"2f140c6571039b5778eaeb104c6d62fa\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.093368 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.093326 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.102647 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.102625 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a49d08f0104145377483dfb31696bdfe-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal\" (UID: \"a49d08f0104145377483dfb31696bdfe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.102695 ip-10-0-136-66 
kubenswrapper[2568]: I0424 22:29:32.102654 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a49d08f0104145377483dfb31696bdfe-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal\" (UID: \"a49d08f0104145377483dfb31696bdfe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.102695 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.102675 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2f140c6571039b5778eaeb104c6d62fa-config\") pod \"kube-apiserver-proxy-ip-10-0-136-66.ec2.internal\" (UID: \"2f140c6571039b5778eaeb104c6d62fa\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.102755 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.102716 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2f140c6571039b5778eaeb104c6d62fa-config\") pod \"kube-apiserver-proxy-ip-10-0-136-66.ec2.internal\" (UID: \"2f140c6571039b5778eaeb104c6d62fa\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.102755 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.102738 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a49d08f0104145377483dfb31696bdfe-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal\" (UID: \"a49d08f0104145377483dfb31696bdfe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.102820 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.102749 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a49d08f0104145377483dfb31696bdfe-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal\" (UID: \"a49d08f0104145377483dfb31696bdfe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.194064 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.193986 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.208447 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.208427 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.214483 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.214465 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" Apr 24 22:29:32.294967 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.294930 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.395530 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.395502 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.496087 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.496005 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.586509 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.586478 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 22:29:32.587124 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.586636 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected 
watch close - watch lasted less than a second and no items received" Apr 24 22:29:32.596634 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.596613 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.607248 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.607230 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:32.697624 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.697595 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.700948 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.700923 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 22:29:32.704277 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.704245 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:24:31 +0000 UTC" deadline="2028-01-22 14:00:49.57427172 +0000 UTC" Apr 24 22:29:32.704277 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.704274 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15303h31m16.870000296s" Apr 24 22:29:32.709691 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.709543 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:29:32.730988 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.730970 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n5x65" Apr 24 22:29:32.742689 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.742670 2568 csr.go:270] "Certificate 
signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n5x65" Apr 24 22:29:32.797871 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.797853 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:32.805097 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:32.805067 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f140c6571039b5778eaeb104c6d62fa.slice/crio-35d03f3df1d617e74409b04253b6601994fb24a66b9b36248f8cc74be998b1aa WatchSource:0}: Error finding container 35d03f3df1d617e74409b04253b6601994fb24a66b9b36248f8cc74be998b1aa: Status 404 returned error can't find the container with id 35d03f3df1d617e74409b04253b6601994fb24a66b9b36248f8cc74be998b1aa Apr 24 22:29:32.805287 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:32.805269 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda49d08f0104145377483dfb31696bdfe.slice/crio-b3cd825fa0ea6087bc260212513345a2d6256935368a8a7e146f17a05799717f WatchSource:0}: Error finding container b3cd825fa0ea6087bc260212513345a2d6256935368a8a7e146f17a05799717f: Status 404 returned error can't find the container with id b3cd825fa0ea6087bc260212513345a2d6256935368a8a7e146f17a05799717f Apr 24 22:29:32.811732 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.811715 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:29:32.879976 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:32.879949 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:32.898930 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.898899 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 
22:29:32.999471 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:32.999435 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:33.100040 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.099955 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-66.ec2.internal\" not found" Apr 24 22:29:33.129339 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.129314 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:33.201080 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.201035 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" Apr 24 22:29:33.213798 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.213773 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:29:33.214867 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.214854 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" Apr 24 22:29:33.220687 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.220662 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:29:33.504151 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.504006 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:33.667733 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.667700 2568 apiserver.go:52] "Watching apiserver" Apr 24 22:29:33.674646 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.674618 2568 reflector.go:430] 
"Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 22:29:33.676167 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.676136 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal","openshift-multus/multus-additional-cni-plugins-m8pjb","openshift-multus/network-metrics-daemon-2wftt","kube-system/konnectivity-agent-js2gj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs","openshift-image-registry/node-ca-qz7zn","openshift-multus/multus-qbqlv","openshift-network-diagnostics/network-check-target-lclmx","openshift-network-operator/iptables-alerter-s2xd9","openshift-ovn-kubernetes/ovnkube-node-tbc58","kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal","openshift-cluster-node-tuning-operator/tuned-qz6cr"] Apr 24 22:29:33.679404 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.679372 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.681559 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.681533 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:33.681673 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.681623 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:33.681790 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.681770 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:29:33.681915 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.681897 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:29:33.681979 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.681929 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 22:29:33.682071 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.682058 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:29:33.682132 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.682087 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 22:29:33.682195 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.682182 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jbwlx\"" Apr 24 22:29:33.686535 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.686515 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:33.686668 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.686649 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.688316 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.688299 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 22:29:33.689048 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.688627 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8747p\"" Apr 24 22:29:33.689048 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.688662 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 22:29:33.689048 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.688686 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 22:29:33.689048 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.688697 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-crzrv\"" Apr 24 22:29:33.691911 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.691412 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.693115 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.693097 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:33.693462 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.693443 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:33.693758 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.693740 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9nvt5\"" Apr 24 22:29:33.693988 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.693976 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.695917 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.695795 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 22:29:33.695917 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.695830 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ztw56\"" Apr 24 22:29:33.696080 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.696063 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 22:29:33.696196 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.696177 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:29:33.696278 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.696258 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 22:29:33.696737 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.696447 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 22:29:33.696737 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.696460 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 22:29:33.696737 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.696613 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qz7zn" Apr 24 22:29:33.698873 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.698851 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.698991 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.698975 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:33.699065 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.699013 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 22:29:33.699065 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.699051 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:33.699191 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.699176 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 22:29:33.699256 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.699200 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wwb2r\"" Apr 24 22:29:33.699495 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.699479 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 22:29:33.700702 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.700685 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 22:29:33.701027 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.701010 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 22:29:33.701107 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.701013 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5r9c4\"" Apr 24 22:29:33.701627 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.701297 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 22:29:33.701627 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.701327 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-s2xd9" Apr 24 22:29:33.702167 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.702149 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 22:29:33.703155 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.703136 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 22:29:33.703377 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.703362 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:33.703847 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.703693 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:33.703847 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.703796 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gvh2n\"" Apr 24 22:29:33.712438 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-os-release\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.712585 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712547 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " 
pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.712660 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712619 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-daemon-config\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.712660 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712646 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-kubelet\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.712766 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712671 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysctl-conf\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.712766 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-system-cni-dir\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.712766 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712719 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.712766 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-k8s-cni-cncf-io\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.712949 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712771 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-systemd\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.712949 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712794 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n9q8\" (UniqueName: \"kubernetes.io/projected/0f2be5fe-c0eb-4a07-8625-e3980ae39927-kube-api-access-8n9q8\") pod \"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9" Apr 24 22:29:33.712949 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712817 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-cnibin\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.712949 ip-10-0-136-66 
kubenswrapper[2568]: I0424 22:29:33.712842 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-kubelet\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.712949 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712865 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-log-socket\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.712949 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712888 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-lib-modules\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.712949 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712910 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-cnibin\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.712949 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712933 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-tuned\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" 
Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712954 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-tmp\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.712977 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-system-cni-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-cni-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713041 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-multus-certs\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-slash\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713109 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-cni-bin\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovn-node-metrics-cert\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713204 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhj8w\" (UniqueName: \"kubernetes.io/projected/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-kube-api-access-lhj8w\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713229 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/07631fd3-43b5-43d2-9831-42159a9806e2-host\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f2be5fe-c0eb-4a07-8625-e3980ae39927-host-slash\") pod \"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713278 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d80044f-c134-4e09-8b63-d8d8d4e50a46-cni-binary-copy\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.713313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-conf-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-systemd-units\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713404 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-node-log\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-modprobe-d\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713451 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-run\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713476 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c-agent-certs\") pod \"konnectivity-agent-js2gj\" (UID: \"77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c\") " pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713499 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpb2r\" (UniqueName: \"kubernetes.io/projected/99d5da74-fb38-467a-951e-9d474464c9b1-kube-api-access-gpb2r\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: 
I0424 22:29:33.713524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-hostroot\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713546 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-etc-kubernetes\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713587 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-run-ovn-kubernetes\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713627 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-etc-selinux\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713667 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsnqh\" (UniqueName: \"kubernetes.io/projected/1f3a7dfc-adce-4b4b-967d-4c568dacabda-kube-api-access-nsnqh\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: 
\"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-cni-bin\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-ovn\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713780 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.713822 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713819 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovnkube-script-lib\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713842 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-systemd\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713865 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-kubelet-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713889 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-socket-dir-parent\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713913 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-cni-multus\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713938 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-run-netns\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.714492 ip-10-0-136-66 
kubenswrapper[2568]: I0424 22:29:33.713960 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-kubernetes\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.713982 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-var-lib-kubelet\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714005 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-sys-fs\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714030 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714052 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-netns\") pod 
\"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b48k\" (UniqueName: \"kubernetes.io/projected/8d80044f-c134-4e09-8b63-d8d8d4e50a46-kube-api-access-2b48k\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714099 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-env-overrides\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714144 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-var-lib-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714168 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-etc-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.714492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714193 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-socket-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714217 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-registration-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c-konnectivity-ca\") pod \"konnectivity-agent-js2gj\" (UID: \"77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c\") " pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714262 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-os-release\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.715053 ip-10-0-136-66 
kubenswrapper[2568]: I0424 22:29:33.714284 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-device-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87s9q\" (UniqueName: \"kubernetes.io/projected/9b056a73-ce1c-4e88-9a66-ebfc4498a736-kube-api-access-87s9q\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07631fd3-43b5-43d2-9831-42159a9806e2-serviceca\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714375 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f2be5fe-c0eb-4a07-8625-e3980ae39927-iptables-alerter-script\") pod 
\"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714399 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-cni-netd\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714425 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovnkube-config\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-host\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjhv\" (UniqueName: \"kubernetes.io/projected/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-kube-api-access-vnjhv\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714520 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysconfig\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-sys\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714589 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzv2b\" (UniqueName: \"kubernetes.io/projected/07631fd3-43b5-43d2-9831-42159a9806e2-kube-api-access-vzv2b\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn" Apr 24 22:29:33.715053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714613 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.715597 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.714635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysctl-d\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.743593 ip-10-0-136-66 kubenswrapper[2568]: I0424 
22:29:33.743386 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:32 +0000 UTC" deadline="2027-10-01 17:04:59.655572753 +0000 UTC" Apr 24 22:29:33.743714 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.743597 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12594h35m25.911984044s" Apr 24 22:29:33.784711 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.784663 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" event={"ID":"2f140c6571039b5778eaeb104c6d62fa","Type":"ContainerStarted","Data":"35d03f3df1d617e74409b04253b6601994fb24a66b9b36248f8cc74be998b1aa"} Apr 24 22:29:33.785639 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.785616 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" event={"ID":"a49d08f0104145377483dfb31696bdfe","Type":"ContainerStarted","Data":"b3cd825fa0ea6087bc260212513345a2d6256935368a8a7e146f17a05799717f"} Apr 24 22:29:33.815671 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-var-lib-kubelet\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.815801 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815697 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-sys-fs\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 
22:29:33.815801 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815725 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:33.815801 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815752 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-netns\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.815801 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b48k\" (UniqueName: \"kubernetes.io/projected/8d80044f-c134-4e09-8b63-d8d8d4e50a46-kube-api-access-2b48k\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815806 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-env-overrides\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815818 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-netns\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 
22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-var-lib-kubelet\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815856 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-var-lib-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-etc-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-socket-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815938 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-registration-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c-konnectivity-ca\") pod \"konnectivity-agent-js2gj\" (UID: \"77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c\") " pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:33.815995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.815987 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-os-release\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816037 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-device-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816073 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-etc-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: 
\"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816078 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816123 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87s9q\" (UniqueName: \"kubernetes.io/projected/9b056a73-ce1c-4e88-9a66-ebfc4498a736-kube-api-access-87s9q\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07631fd3-43b5-43d2-9831-42159a9806e2-serviceca\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816167 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f2be5fe-c0eb-4a07-8625-e3980ae39927-iptables-alerter-script\") pod \"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816166 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-socket-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816182 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-cni-netd\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.816171 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816197 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovnkube-config\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816216 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-host\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.816245 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs podName:9b056a73-ce1c-4e88-9a66-ebfc4498a736 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:34.316222892 +0000 UTC m=+3.132548955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs") pod "network-metrics-daemon-2wftt" (UID: "9b056a73-ce1c-4e88-9a66-ebfc4498a736") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816250 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-host\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816278 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjhv\" (UniqueName: \"kubernetes.io/projected/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-kube-api-access-vnjhv\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816306 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysconfig\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.816439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816332 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-sys\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.816439 
ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816356 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzv2b\" (UniqueName: \"kubernetes.io/projected/07631fd3-43b5-43d2-9831-42159a9806e2-kube-api-access-vzv2b\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816384 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysctl-d\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-os-release\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816457 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816483 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-daemon-config\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-kubelet\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysctl-conf\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816556 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-system-cni-dir\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-env-overrides\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816710 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-k8s-cni-cncf-io\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816671 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-k8s-cni-cncf-io\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816733 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c-konnectivity-ca\") pod \"konnectivity-agent-js2gj\" (UID: \"77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c\") " pod="kube-system/konnectivity-agent-js2gj"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-systemd\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n9q8\" (UniqueName: \"kubernetes.io/projected/0f2be5fe-c0eb-4a07-8625-e3980ae39927-kube-api-access-8n9q8\") pod \"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-cnibin\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.817201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816818 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-os-release\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816823 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-kubelet\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816857 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-kubelet\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816857 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-log-socket\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816883 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-lib-modules\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816898 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-log-socket\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816909 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-cnibin\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-tuned\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-tmp\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816955 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-system-cni-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-device-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.816036 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-sys-fs\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817020 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-registration-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817031 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-cni-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817075 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-multus-certs\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817106 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-slash\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817140 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817165 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-cni-bin\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818047 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovn-node-metrics-cert\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817193 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysctl-d\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817222 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhj8w\" (UniqueName: \"kubernetes.io/projected/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-kube-api-access-lhj8w\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817246 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07631fd3-43b5-43d2-9831-42159a9806e2-host\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817262 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-os-release\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f2be5fe-c0eb-4a07-8625-e3980ae39927-host-slash\") pod \"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817324 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f2be5fe-c0eb-4a07-8625-e3980ae39927-host-slash\") pod \"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-var-lib-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817388 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07631fd3-43b5-43d2-9831-42159a9806e2-serviceca\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-sys\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817670 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-run-multus-certs\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817729 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-slash\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-openvswitch\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817815 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-cni-bin\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817879 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysconfig\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.817959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f2be5fe-c0eb-4a07-8625-e3980ae39927-iptables-alerter-script\") pod \"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818018 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-cni-netd\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.818809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07631fd3-43b5-43d2-9831-42159a9806e2-host\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818125 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818130 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-system-cni-dir\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818168 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-kubelet\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818295 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-sysctl-conf\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818347 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-cnibin\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818386 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-systemd\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818487 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovnkube-config\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818633 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-cnibin\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818723 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-daemon-config\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818747 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-lib-modules\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818818 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-cni-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.818865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-system-cni-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819018 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99d5da74-fb38-467a-951e-9d474464c9b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819078 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d80044f-c134-4e09-8b63-d8d8d4e50a46-cni-binary-copy\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-conf-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819160 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-systemd-units\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819167 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.819545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819193 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-node-log\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-conf-dir\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99d5da74-fb38-467a-951e-9d474464c9b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-modprobe-d\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819375 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-node-log\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819455 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-run\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819417 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-systemd-units\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819502 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c-agent-certs\") pod \"konnectivity-agent-js2gj\" (UID: \"77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c\") " pod="kube-system/konnectivity-agent-js2gj"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpb2r\" (UniqueName: \"kubernetes.io/projected/99d5da74-fb38-467a-951e-9d474464c9b1-kube-api-access-gpb2r\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819559 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-modprobe-d\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-run\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-hostroot\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-etc-kubernetes\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-run-ovn-kubernetes\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819863 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-etc-selinux\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsnqh\" (UniqueName: \"kubernetes.io/projected/1f3a7dfc-adce-4b4b-967d-4c568dacabda-kube-api-access-nsnqh\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819920 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-cni-bin\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.819943 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-ovn\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.820376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820003 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d80044f-c134-4e09-8b63-d8d8d4e50a46-cni-binary-copy\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820012 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-hostroot\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovnkube-script-lib\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820102 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-etc-selinux\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-systemd\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820143 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820156 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-kubelet-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820182 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-socket-dir-parent\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820176 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-cni-bin\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820226 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-run-ovn\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-cni-multus\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820269 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-systemd\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820271 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-run-netns\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820309 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-kubernetes\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr"
Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820385 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-etc-kubernetes\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820406 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-host-var-lib-cni-multus\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.821158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820446 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-kubernetes\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.821928 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820459 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f3a7dfc-adce-4b4b-967d-4c568dacabda-kubelet-dir\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.821928 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-run-ovn-kubernetes\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.821928 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820498 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/8d80044f-c134-4e09-8b63-d8d8d4e50a46-multus-socket-dir-parent\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.821928 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.820498 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-host-run-netns\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.822135 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.822108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovnkube-script-lib\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.826702 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.826666 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-etc-tuned\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.826799 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.826764 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-ovn-node-metrics-cert\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.827077 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.827048 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c-agent-certs\") pod \"konnectivity-agent-js2gj\" (UID: \"77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c\") " pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:33.828423 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.828397 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:33.828509 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.828429 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:33.828509 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.828450 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fr9m5 for pod openshift-network-diagnostics/network-check-target-lclmx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:33.828645 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:33.828514 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5 podName:5346f0d6-5375-4d8a-9fb6-8f7c3a45720a nodeName:}" failed. No retries permitted until 2026-04-24 22:29:34.328488785 +0000 UTC m=+3.144814845 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fr9m5" (UniqueName: "kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5") pod "network-check-target-lclmx" (UID: "5346f0d6-5375-4d8a-9fb6-8f7c3a45720a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:33.829836 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.829811 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzv2b\" (UniqueName: \"kubernetes.io/projected/07631fd3-43b5-43d2-9831-42159a9806e2-kube-api-access-vzv2b\") pod \"node-ca-qz7zn\" (UID: \"07631fd3-43b5-43d2-9831-42159a9806e2\") " pod="openshift-image-registry/node-ca-qz7zn" Apr 24 22:29:33.831479 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.831457 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87s9q\" (UniqueName: \"kubernetes.io/projected/9b056a73-ce1c-4e88-9a66-ebfc4498a736-kube-api-access-87s9q\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:33.831625 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.831505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b48k\" (UniqueName: \"kubernetes.io/projected/8d80044f-c134-4e09-8b63-d8d8d4e50a46-kube-api-access-2b48k\") pod \"multus-qbqlv\" (UID: \"8d80044f-c134-4e09-8b63-d8d8d4e50a46\") " pod="openshift-multus/multus-qbqlv" Apr 24 22:29:33.831835 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.831818 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsnqh\" (UniqueName: \"kubernetes.io/projected/1f3a7dfc-adce-4b4b-967d-4c568dacabda-kube-api-access-nsnqh\") pod \"aws-ebs-csi-driver-node-l4mqs\" (UID: \"1f3a7dfc-adce-4b4b-967d-4c568dacabda\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:33.831976 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.831952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n9q8\" (UniqueName: \"kubernetes.io/projected/0f2be5fe-c0eb-4a07-8625-e3980ae39927-kube-api-access-8n9q8\") pod \"iptables-alerter-s2xd9\" (UID: \"0f2be5fe-c0eb-4a07-8625-e3980ae39927\") " pod="openshift-network-operator/iptables-alerter-s2xd9" Apr 24 22:29:33.832424 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.832385 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjhv\" (UniqueName: \"kubernetes.io/projected/ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d-kube-api-access-vnjhv\") pod \"ovnkube-node-tbc58\" (UID: \"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:33.833664 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.833624 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpb2r\" (UniqueName: \"kubernetes.io/projected/99d5da74-fb38-467a-951e-9d474464c9b1-kube-api-access-gpb2r\") pod \"multus-additional-cni-plugins-m8pjb\" (UID: \"99d5da74-fb38-467a-951e-9d474464c9b1\") " pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:33.834793 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.834771 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhj8w\" (UniqueName: \"kubernetes.io/projected/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-kube-api-access-lhj8w\") pod \"tuned-qz6cr\" (UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.836196 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.836175 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a-tmp\") pod \"tuned-qz6cr\" 
(UID: \"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a\") " pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:33.993445 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:33.993401 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" Apr 24 22:29:34.002298 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.002272 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qbqlv" Apr 24 22:29:34.014095 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.014072 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:34.019073 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.019050 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" Apr 24 22:29:34.026814 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.026792 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:34.035407 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.035355 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qz7zn" Apr 24 22:29:34.042048 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.042028 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" Apr 24 22:29:34.048539 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.048519 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-s2xd9" Apr 24 22:29:34.323727 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.323646 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:34.323884 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:34.323792 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:34.323884 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:34.323861 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs podName:9b056a73-ce1c-4e88-9a66-ebfc4498a736 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:35.323841898 +0000 UTC m=+4.140167966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs") pod "network-metrics-daemon-2wftt" (UID: "9b056a73-ce1c-4e88-9a66-ebfc4498a736") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:34.424817 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.424786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:34.424971 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:34.424938 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:34.424971 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:34.424956 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:34.424971 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:34.424968 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fr9m5 for pod openshift-network-diagnostics/network-check-target-lclmx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:34.425163 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:34.425032 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5 podName:5346f0d6-5375-4d8a-9fb6-8f7c3a45720a nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:35.425013549 +0000 UTC m=+4.241339608 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fr9m5" (UniqueName: "kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5") pod "network-check-target-lclmx" (UID: "5346f0d6-5375-4d8a-9fb6-8f7c3a45720a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:34.622808 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:34.622776 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d80044f_c134_4e09_8b63_d8d8d4e50a46.slice/crio-1771d4000c14a1958e665410ab5b74d73997ddf7377d3d5da2f0a8daa3f85e78 WatchSource:0}: Error finding container 1771d4000c14a1958e665410ab5b74d73997ddf7377d3d5da2f0a8daa3f85e78: Status 404 returned error can't find the container with id 1771d4000c14a1958e665410ab5b74d73997ddf7377d3d5da2f0a8daa3f85e78 Apr 24 22:29:34.624470 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:34.624393 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99d5da74_fb38_467a_951e_9d474464c9b1.slice/crio-ed4e6e1c1c0db1d6a0019499106dd9eb8d59bf85fb7fc77fdf46b8ce82b2b30b WatchSource:0}: Error finding container ed4e6e1c1c0db1d6a0019499106dd9eb8d59bf85fb7fc77fdf46b8ce82b2b30b: Status 404 returned error can't find the container with id ed4e6e1c1c0db1d6a0019499106dd9eb8d59bf85fb7fc77fdf46b8ce82b2b30b Apr 24 22:29:34.625097 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:34.625070 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad18a0fd_1bbf_4f92_9239_77f7b1a9ae7d.slice/crio-40c7140ef779d1799b566b2169607c40366d32153b11463ef41117a2214a8d1e WatchSource:0}: Error finding container 
40c7140ef779d1799b566b2169607c40366d32153b11463ef41117a2214a8d1e: Status 404 returned error can't find the container with id 40c7140ef779d1799b566b2169607c40366d32153b11463ef41117a2214a8d1e Apr 24 22:29:34.629778 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:34.629750 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8bf15d5_9bb6_4feb_bddd_e1ccb5b4028a.slice/crio-546ad4d3cb49944b9b22ac02ba04a4f9b305bae0efff142379b7f9ea5579b592 WatchSource:0}: Error finding container 546ad4d3cb49944b9b22ac02ba04a4f9b305bae0efff142379b7f9ea5579b592: Status 404 returned error can't find the container with id 546ad4d3cb49944b9b22ac02ba04a4f9b305bae0efff142379b7f9ea5579b592 Apr 24 22:29:34.632060 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:34.632035 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07631fd3_43b5_43d2_9831_42159a9806e2.slice/crio-8f34ac8763df7963ad3b11e25448b354614eb5b107209d20b82052ab2362f9a7 WatchSource:0}: Error finding container 8f34ac8763df7963ad3b11e25448b354614eb5b107209d20b82052ab2362f9a7: Status 404 returned error can't find the container with id 8f34ac8763df7963ad3b11e25448b354614eb5b107209d20b82052ab2362f9a7 Apr 24 22:29:34.632396 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:34.632375 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77dbb1fd_4a06_4c52_8ec4_35b9d8f89a3c.slice/crio-0cd0ef122548b1f3d01a9234fcb13e4b4907bdeb3a427fa89a7e275a59d9e0ce WatchSource:0}: Error finding container 0cd0ef122548b1f3d01a9234fcb13e4b4907bdeb3a427fa89a7e275a59d9e0ce: Status 404 returned error can't find the container with id 0cd0ef122548b1f3d01a9234fcb13e4b4907bdeb3a427fa89a7e275a59d9e0ce Apr 24 22:29:34.744283 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.744251 2568 certificate_manager.go:715] "Certificate rotation 
deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:32 +0000 UTC" deadline="2028-01-29 07:37:19.274413916 +0000 UTC" Apr 24 22:29:34.744283 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.744280 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15465h7m44.530136915s" Apr 24 22:29:34.781727 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.781701 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:34.781891 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:34.781837 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:34.789141 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.789115 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" event={"ID":"2f140c6571039b5778eaeb104c6d62fa","Type":"ContainerStarted","Data":"a8ba09baaf2d1219f725311c682d2d8df43030cd800e1edbd7a9e633ebddcdbe"} Apr 24 22:29:34.790238 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.790210 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qz7zn" event={"ID":"07631fd3-43b5-43d2-9831-42159a9806e2","Type":"ContainerStarted","Data":"8f34ac8763df7963ad3b11e25448b354614eb5b107209d20b82052ab2362f9a7"} Apr 24 22:29:34.791932 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.791909 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s2xd9" 
event={"ID":"0f2be5fe-c0eb-4a07-8625-e3980ae39927","Type":"ContainerStarted","Data":"4f04927802a3cf41ff83281d1f4a81c856260433c192c357e0b0b69e9f8e0b5d"} Apr 24 22:29:34.792909 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.792877 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" event={"ID":"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a","Type":"ContainerStarted","Data":"546ad4d3cb49944b9b22ac02ba04a4f9b305bae0efff142379b7f9ea5579b592"} Apr 24 22:29:34.793912 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.793892 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" event={"ID":"99d5da74-fb38-467a-951e-9d474464c9b1","Type":"ContainerStarted","Data":"ed4e6e1c1c0db1d6a0019499106dd9eb8d59bf85fb7fc77fdf46b8ce82b2b30b"} Apr 24 22:29:34.794741 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.794723 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" event={"ID":"1f3a7dfc-adce-4b4b-967d-4c568dacabda","Type":"ContainerStarted","Data":"ed9e14d03b8b3b88959c270e8a83fcfe17426ef1418087c2911cafb3c20d2709"} Apr 24 22:29:34.795743 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.795723 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-js2gj" event={"ID":"77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c","Type":"ContainerStarted","Data":"0cd0ef122548b1f3d01a9234fcb13e4b4907bdeb3a427fa89a7e275a59d9e0ce"} Apr 24 22:29:34.796721 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.796702 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"40c7140ef779d1799b566b2169607c40366d32153b11463ef41117a2214a8d1e"} Apr 24 22:29:34.797606 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:34.797586 2568 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-qbqlv" event={"ID":"8d80044f-c134-4e09-8b63-d8d8d4e50a46","Type":"ContainerStarted","Data":"1771d4000c14a1958e665410ab5b74d73997ddf7377d3d5da2f0a8daa3f85e78"} Apr 24 22:29:35.330322 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:35.330281 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:35.330507 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:35.330441 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:35.330587 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:35.330509 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs podName:9b056a73-ce1c-4e88-9a66-ebfc4498a736 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:37.330489868 +0000 UTC m=+6.146815948 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs") pod "network-metrics-daemon-2wftt" (UID: "9b056a73-ce1c-4e88-9a66-ebfc4498a736") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:35.431051 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:35.431014 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:35.431231 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:35.431200 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:35.431231 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:35.431228 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:35.431345 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:35.431240 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fr9m5 for pod openshift-network-diagnostics/network-check-target-lclmx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:35.431345 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:35.431310 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5 podName:5346f0d6-5375-4d8a-9fb6-8f7c3a45720a nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:37.431289334 +0000 UTC m=+6.247615392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fr9m5" (UniqueName: "kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5") pod "network-check-target-lclmx" (UID: "5346f0d6-5375-4d8a-9fb6-8f7c3a45720a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:35.781758 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:35.781733 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:35.782163 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:35.781857 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:36.781323 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:36.781290 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:36.781498 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:36.781432 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:36.824259 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:36.824221 2568 generic.go:358] "Generic (PLEG): container finished" podID="a49d08f0104145377483dfb31696bdfe" containerID="bca9e5fdb8c7165534dda32a651c7170e3d88ec060413400631928a78c4bba09" exitCode=0 Apr 24 22:29:36.824741 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:36.824274 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" event={"ID":"a49d08f0104145377483dfb31696bdfe","Type":"ContainerDied","Data":"bca9e5fdb8c7165534dda32a651c7170e3d88ec060413400631928a78c4bba09"} Apr 24 22:29:36.846197 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:36.846136 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-66.ec2.internal" podStartSLOduration=3.84611971 podStartE2EDuration="3.84611971s" podCreationTimestamp="2026-04-24 22:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:34.801268517 +0000 UTC m=+3.617594596" watchObservedRunningTime="2026-04-24 22:29:36.84611971 +0000 UTC m=+5.662445801" Apr 24 22:29:37.347948 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:37.347909 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:37.348139 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:37.348087 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:37.348220 
ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:37.348155 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs podName:9b056a73-ce1c-4e88-9a66-ebfc4498a736 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:41.348136178 +0000 UTC m=+10.164462239 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs") pod "network-metrics-daemon-2wftt" (UID: "9b056a73-ce1c-4e88-9a66-ebfc4498a736") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:37.448304 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:37.448265 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:37.448476 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:37.448429 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:37.448476 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:37.448449 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:37.448476 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:37.448461 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fr9m5 for pod openshift-network-diagnostics/network-check-target-lclmx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 24 22:29:37.448650 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:37.448529 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5 podName:5346f0d6-5375-4d8a-9fb6-8f7c3a45720a nodeName:}" failed. No retries permitted until 2026-04-24 22:29:41.448509804 +0000 UTC m=+10.264835880 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fr9m5" (UniqueName: "kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5") pod "network-check-target-lclmx" (UID: "5346f0d6-5375-4d8a-9fb6-8f7c3a45720a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:37.783350 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:37.783318 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:37.783503 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:37.783439 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:38.781515 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:38.781485 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:38.781945 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:38.781699 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:39.783618 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:39.783582 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:39.784054 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:39.783715 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:40.781611 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:40.781578 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:40.781782 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:40.781741 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:41.378491 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:41.377906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:41.378491 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:41.378095 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:41.378491 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:41.378159 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs podName:9b056a73-ce1c-4e88-9a66-ebfc4498a736 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:49.378139064 +0000 UTC m=+18.194465121 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs") pod "network-metrics-daemon-2wftt" (UID: "9b056a73-ce1c-4e88-9a66-ebfc4498a736") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:41.478995 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:41.478385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:41.478995 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:41.478538 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:41.478995 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:41.478556 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:41.478995 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:41.478587 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fr9m5 for pod openshift-network-diagnostics/network-check-target-lclmx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:41.478995 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:41.478644 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5 podName:5346f0d6-5375-4d8a-9fb6-8f7c3a45720a nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:49.478626184 +0000 UTC m=+18.294952249 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fr9m5" (UniqueName: "kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5") pod "network-check-target-lclmx" (UID: "5346f0d6-5375-4d8a-9fb6-8f7c3a45720a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:41.781994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:41.781964 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:41.782151 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:41.782074 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:42.781282 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:42.781246 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:42.781761 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:42.781364 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:43.781374 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:43.781203 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:43.781761 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:43.781442 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:44.781764 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:44.781728 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:44.782173 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:44.781857 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:45.781217 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:45.781190 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:45.781391 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:45.781302 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:46.781576 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:46.781534 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:46.781969 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:46.781664 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:47.780845 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:47.780810 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:47.781008 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:47.780916 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:48.781507 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:48.781465 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:48.781938 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:48.781604 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:49.281039 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.280997 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wx9wp"] Apr 24 22:29:49.356034 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.356005 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.358185 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.358160 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:29:49.358185 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.358185 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-f49vr\"" Apr 24 22:29:49.358371 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.358305 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:29:49.433758 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.433726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8jrh\" (UniqueName: \"kubernetes.io/projected/e67c0e72-592f-4ae7-a84c-a898562b0176-kube-api-access-c8jrh\") pod \"node-resolver-wx9wp\" (UID: \"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.433936 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.433770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:49.433936 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:49.433891 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:49.434040 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.433887 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/e67c0e72-592f-4ae7-a84c-a898562b0176-hosts-file\") pod \"node-resolver-wx9wp\" (UID: \"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.434040 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:49.433949 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs podName:9b056a73-ce1c-4e88-9a66-ebfc4498a736 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:05.433933808 +0000 UTC m=+34.250259864 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs") pod "network-metrics-daemon-2wftt" (UID: "9b056a73-ce1c-4e88-9a66-ebfc4498a736") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:49.434040 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.433984 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e67c0e72-592f-4ae7-a84c-a898562b0176-tmp-dir\") pod \"node-resolver-wx9wp\" (UID: \"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.534816 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.534736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8jrh\" (UniqueName: \"kubernetes.io/projected/e67c0e72-592f-4ae7-a84c-a898562b0176-kube-api-access-c8jrh\") pod \"node-resolver-wx9wp\" (UID: \"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.534981 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.534837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e67c0e72-592f-4ae7-a84c-a898562b0176-hosts-file\") pod \"node-resolver-wx9wp\" (UID: 
\"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.534981 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.534864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e67c0e72-592f-4ae7-a84c-a898562b0176-tmp-dir\") pod \"node-resolver-wx9wp\" (UID: \"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.534981 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.534897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:49.535238 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.535192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e67c0e72-592f-4ae7-a84c-a898562b0176-hosts-file\") pod \"node-resolver-wx9wp\" (UID: \"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.535360 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:49.535344 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:49.535423 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:49.535370 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:49.535423 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:49.535384 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fr9m5 for pod 
openshift-network-diagnostics/network-check-target-lclmx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:49.535530 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:49.535443 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5 podName:5346f0d6-5375-4d8a-9fb6-8f7c3a45720a nodeName:}" failed. No retries permitted until 2026-04-24 22:30:05.535425101 +0000 UTC m=+34.351751172 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fr9m5" (UniqueName: "kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5") pod "network-check-target-lclmx" (UID: "5346f0d6-5375-4d8a-9fb6-8f7c3a45720a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:49.535609 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.535536 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e67c0e72-592f-4ae7-a84c-a898562b0176-tmp-dir\") pod \"node-resolver-wx9wp\" (UID: \"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.546008 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.545976 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8jrh\" (UniqueName: \"kubernetes.io/projected/e67c0e72-592f-4ae7-a84c-a898562b0176-kube-api-access-c8jrh\") pod \"node-resolver-wx9wp\" (UID: \"e67c0e72-592f-4ae7-a84c-a898562b0176\") " pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.665861 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.665828 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wx9wp" Apr 24 22:29:49.782026 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:49.781509 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:49.782026 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:49.781663 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:50.781229 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:50.781194 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:50.781414 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:50.781312 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:51.391798 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:29:51.391769 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode67c0e72_592f_4ae7_a84c_a898562b0176.slice/crio-a5fbf64f08048ab8c69e07bad7bf2254eedc53d5604a4f3803e10071e9b879aa WatchSource:0}: Error finding container a5fbf64f08048ab8c69e07bad7bf2254eedc53d5604a4f3803e10071e9b879aa: Status 404 returned error can't find the container with id a5fbf64f08048ab8c69e07bad7bf2254eedc53d5604a4f3803e10071e9b879aa Apr 24 22:29:51.782252 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.782015 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:51.782443 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:51.782361 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:51.847855 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.847805 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" event={"ID":"1f3a7dfc-adce-4b4b-967d-4c568dacabda","Type":"ContainerStarted","Data":"782785667a2f3df9bae23ba4aea154cb093e60b671ac0f84bddc2e295eaa9dd5"} Apr 24 22:29:51.849089 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.849066 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-js2gj" event={"ID":"77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c","Type":"ContainerStarted","Data":"66235f92e89915f4e1f5752c84a40de04128ac067bbda8b1b372a1ed7045238f"} Apr 24 22:29:51.850346 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.850326 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"c45361a3c5482b0772fe594a86cc0b7f4671dca7be6a377a3848e9545d903d68"} Apr 24 22:29:51.851725 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.851707 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qbqlv" event={"ID":"8d80044f-c134-4e09-8b63-d8d8d4e50a46","Type":"ContainerStarted","Data":"7a8796db6332a42f9d63ab16f8fe0b66fc65fd17f2633c2fe2d98ba4582dbf5c"} Apr 24 22:29:51.852938 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.852919 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wx9wp" event={"ID":"e67c0e72-592f-4ae7-a84c-a898562b0176","Type":"ContainerStarted","Data":"b5760d12e2fd8e18f4659f41b95bdd48ac60d203072d13bd74e77c7790f67a4a"} Apr 24 22:29:51.853036 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.852943 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wx9wp" 
event={"ID":"e67c0e72-592f-4ae7-a84c-a898562b0176","Type":"ContainerStarted","Data":"a5fbf64f08048ab8c69e07bad7bf2254eedc53d5604a4f3803e10071e9b879aa"} Apr 24 22:29:51.854367 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.854332 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" event={"ID":"a49d08f0104145377483dfb31696bdfe","Type":"ContainerStarted","Data":"2f48c170411656c5da0c7d6350152bb1cfa55970e6f5e74099483fd90a463250"} Apr 24 22:29:51.855837 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.855603 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qz7zn" event={"ID":"07631fd3-43b5-43d2-9831-42159a9806e2","Type":"ContainerStarted","Data":"c00839776ac699aa27b384f13249a2ad4b1a4d555254ca7db5ffc332c39da794"} Apr 24 22:29:51.857072 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.857051 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" event={"ID":"f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a","Type":"ContainerStarted","Data":"cf3c93900c64c9bde39eb9c75b386466a11762adfe9ec982b85ab7b960a0c671"} Apr 24 22:29:51.858166 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.858146 2568 generic.go:358] "Generic (PLEG): container finished" podID="99d5da74-fb38-467a-951e-9d474464c9b1" containerID="3f6138af4b896a8b9777b4ad54a0917cc46dc9073810a788464347689c8bc36f" exitCode=0 Apr 24 22:29:51.858247 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.858173 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" event={"ID":"99d5da74-fb38-467a-951e-9d474464c9b1","Type":"ContainerDied","Data":"3f6138af4b896a8b9777b4ad54a0917cc46dc9073810a788464347689c8bc36f"} Apr 24 22:29:51.877857 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.877813 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-js2gj" podStartSLOduration=4.13534067 podStartE2EDuration="20.877797271s" podCreationTimestamp="2026-04-24 22:29:31 +0000 UTC" firstStartedPulling="2026-04-24 22:29:34.63448549 +0000 UTC m=+3.450811548" lastFinishedPulling="2026-04-24 22:29:51.376942092 +0000 UTC m=+20.193268149" observedRunningTime="2026-04-24 22:29:51.861821743 +0000 UTC m=+20.678147820" watchObservedRunningTime="2026-04-24 22:29:51.877797271 +0000 UTC m=+20.694123348" Apr 24 22:29:51.877990 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.877966 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qbqlv" podStartSLOduration=4.083713407 podStartE2EDuration="20.877960647s" podCreationTimestamp="2026-04-24 22:29:31 +0000 UTC" firstStartedPulling="2026-04-24 22:29:34.625297574 +0000 UTC m=+3.441623634" lastFinishedPulling="2026-04-24 22:29:51.419544819 +0000 UTC m=+20.235870874" observedRunningTime="2026-04-24 22:29:51.8777116 +0000 UTC m=+20.694037690" watchObservedRunningTime="2026-04-24 22:29:51.877960647 +0000 UTC m=+20.694286725" Apr 24 22:29:51.894429 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.894396 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qz6cr" podStartSLOduration=4.150097507 podStartE2EDuration="20.894383409s" podCreationTimestamp="2026-04-24 22:29:31 +0000 UTC" firstStartedPulling="2026-04-24 22:29:34.632643797 +0000 UTC m=+3.448969866" lastFinishedPulling="2026-04-24 22:29:51.376929709 +0000 UTC m=+20.193255768" observedRunningTime="2026-04-24 22:29:51.894093967 +0000 UTC m=+20.710420065" watchObservedRunningTime="2026-04-24 22:29:51.894383409 +0000 UTC m=+20.710709487" Apr 24 22:29:51.906581 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.906535 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qz7zn" podStartSLOduration=3.163801134 
podStartE2EDuration="19.906522798s" podCreationTimestamp="2026-04-24 22:29:32 +0000 UTC" firstStartedPulling="2026-04-24 22:29:34.634205632 +0000 UTC m=+3.450531702" lastFinishedPulling="2026-04-24 22:29:51.376927305 +0000 UTC m=+20.193253366" observedRunningTime="2026-04-24 22:29:51.906487412 +0000 UTC m=+20.722813501" watchObservedRunningTime="2026-04-24 22:29:51.906522798 +0000 UTC m=+20.722848876" Apr 24 22:29:51.921412 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.921375 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wx9wp" podStartSLOduration=2.921361941 podStartE2EDuration="2.921361941s" podCreationTimestamp="2026-04-24 22:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:51.920901233 +0000 UTC m=+20.737227310" watchObservedRunningTime="2026-04-24 22:29:51.921361941 +0000 UTC m=+20.737688019" Apr 24 22:29:51.953371 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:51.953333 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-66.ec2.internal" podStartSLOduration=18.953320094 podStartE2EDuration="18.953320094s" podCreationTimestamp="2026-04-24 22:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:51.952962685 +0000 UTC m=+20.769288763" watchObservedRunningTime="2026-04-24 22:29:51.953320094 +0000 UTC m=+20.769646171" Apr 24 22:29:52.781291 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.781050 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:52.781874 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:52.781412 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:52.862051 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.862011 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s2xd9" event={"ID":"0f2be5fe-c0eb-4a07-8625-e3980ae39927","Type":"ContainerStarted","Data":"6ab8bba8f60481573fdd16c593184b1c0dbb9b252f64ae6e8b08e63b175125b1"} Apr 24 22:29:52.864853 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.864831 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log" Apr 24 22:29:52.865203 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.865142 2568 generic.go:358] "Generic (PLEG): container finished" podID="ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d" containerID="3484f44bffeb97f49306e76c77d2b5f03e1ffd3f3bc7fe4bee3f940175937a42" exitCode=1 Apr 24 22:29:52.865203 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.865202 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"e65e74cad0eb514137207143a8dd2e69fe9493b140a7a094a6430f3444952e3d"} Apr 24 22:29:52.865337 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.865231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" 
event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"da9dfe09cf8795140cc7e91f91562d378c374a182f7ed82ad9916ce9bac6e3fc"} Apr 24 22:29:52.865337 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.865241 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"05576e128556b87c0a4052acba37b5de14d4098a156feb6c0ef2e8ff545daa0b"} Apr 24 22:29:52.865337 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.865269 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"aa640e82e0ef23f7573c530a1996bdc015dcd4522e78b06011cab1e8c4e33999"} Apr 24 22:29:52.865337 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.865278 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerDied","Data":"3484f44bffeb97f49306e76c77d2b5f03e1ffd3f3bc7fe4bee3f940175937a42"} Apr 24 22:29:52.875620 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.875556 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s2xd9" podStartSLOduration=4.136151068 podStartE2EDuration="20.875544927s" podCreationTimestamp="2026-04-24 22:29:32 +0000 UTC" firstStartedPulling="2026-04-24 22:29:34.637537353 +0000 UTC m=+3.453863409" lastFinishedPulling="2026-04-24 22:29:51.376931207 +0000 UTC m=+20.193257268" observedRunningTime="2026-04-24 22:29:52.875538111 +0000 UTC m=+21.691864191" watchObservedRunningTime="2026-04-24 22:29:52.875544927 +0000 UTC m=+21.691871005" Apr 24 22:29:52.925687 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:52.925661 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" 
path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 22:29:53.780083 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:53.779984 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:29:52.925683302Z","UUID":"cd32d49a-8224-439b-964f-c5c12527a025","Handler":null,"Name":"","Endpoint":""} Apr 24 22:29:53.780766 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:53.780747 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:53.780893 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:53.780861 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:53.781894 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:53.781874 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 22:29:53.782366 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:53.781901 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 22:29:53.868729 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:53.868692 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" event={"ID":"1f3a7dfc-adce-4b4b-967d-4c568dacabda","Type":"ContainerStarted","Data":"7a834c0127a73e8a8e764416bd67fc05aaea03df50ec9c973c250d271d50b2cf"} Apr 24 22:29:54.645678 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:54.645644 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:54.646345 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:54.646324 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:54.780728 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:54.780700 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:54.780875 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:54.780819 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:54.875184 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:54.875149 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" event={"ID":"1f3a7dfc-adce-4b4b-967d-4c568dacabda","Type":"ContainerStarted","Data":"5f09fb0018f02749d3c7fcfeb4c3846643d16679b636b3fa5624a346e9dba5a2"} Apr 24 22:29:54.875834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:54.875338 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:54.876165 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:54.876147 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-js2gj" Apr 24 22:29:54.908851 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:54.908752 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-l4mqs" podStartSLOduration=3.564712348 podStartE2EDuration="22.908738751s" podCreationTimestamp="2026-04-24 22:29:32 +0000 UTC" firstStartedPulling="2026-04-24 22:29:34.637814196 +0000 UTC m=+3.454140251" lastFinishedPulling="2026-04-24 22:29:53.981840599 +0000 UTC m=+22.798166654" observedRunningTime="2026-04-24 22:29:54.908652333 +0000 UTC m=+23.724978412" watchObservedRunningTime="2026-04-24 22:29:54.908738751 +0000 UTC m=+23.725064829" Apr 24 22:29:55.783795 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:55.783767 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:55.783970 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:55.783870 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:56.781344 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:56.781316 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:56.781860 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:56.781417 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:56.881236 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:56.881209 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log" Apr 24 22:29:56.881532 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:56.881512 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"62bbdf926e354afaa0b93741c9fbc5255cbb8aed0e0d2dce3964e37cd08e0e51"} Apr 24 22:29:56.882958 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:56.882934 2568 generic.go:358] "Generic (PLEG): container finished" podID="99d5da74-fb38-467a-951e-9d474464c9b1" containerID="cdbdc096d77dbdb0ca895562793740b09d9716a9c924c2e33310161fecdb845c" exitCode=0 Apr 24 22:29:56.883053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:56.883011 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" event={"ID":"99d5da74-fb38-467a-951e-9d474464c9b1","Type":"ContainerDied","Data":"cdbdc096d77dbdb0ca895562793740b09d9716a9c924c2e33310161fecdb845c"} Apr 24 22:29:57.301002 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.300962 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-j22h7"] Apr 24 22:29:57.304410 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.304390 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.304550 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:57.304465 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-j22h7" podUID="5b342b72-c7fb-4579-947f-7a261031b1a3" Apr 24 22:29:57.393129 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.393088 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.393280 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.393154 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b342b72-c7fb-4579-947f-7a261031b1a3-kubelet-config\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.393280 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.393216 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b342b72-c7fb-4579-947f-7a261031b1a3-dbus\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.494505 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.494475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b342b72-c7fb-4579-947f-7a261031b1a3-kubelet-config\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.494505 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.494507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b342b72-c7fb-4579-947f-7a261031b1a3-dbus\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.494706 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.494553 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.494706 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.494618 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5b342b72-c7fb-4579-947f-7a261031b1a3-kubelet-config\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.494706 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:57.494669 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:57.494811 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:57.494719 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret podName:5b342b72-c7fb-4579-947f-7a261031b1a3 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:57.994706271 +0000 UTC m=+26.811032327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret") pod "global-pull-secret-syncer-j22h7" (UID: "5b342b72-c7fb-4579-947f-7a261031b1a3") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:57.494943 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.494800 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5b342b72-c7fb-4579-947f-7a261031b1a3-dbus\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.781845 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.781808 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:57.782214 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:57.781939 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:57.886052 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.885961 2568 generic.go:358] "Generic (PLEG): container finished" podID="99d5da74-fb38-467a-951e-9d474464c9b1" containerID="e7164a1e4c543de298717bdbc983f22120ae9880e1df1a19249e2048c1118b92" exitCode=0 Apr 24 22:29:57.886052 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.886003 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" event={"ID":"99d5da74-fb38-467a-951e-9d474464c9b1","Type":"ContainerDied","Data":"e7164a1e4c543de298717bdbc983f22120ae9880e1df1a19249e2048c1118b92"} Apr 24 22:29:57.997695 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:57.997664 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:57.997870 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:57.997810 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:57.997933 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:57.997875 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret podName:5b342b72-c7fb-4579-947f-7a261031b1a3 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.997855555 +0000 UTC m=+27.814181615 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret") pod "global-pull-secret-syncer-j22h7" (UID: "5b342b72-c7fb-4579-947f-7a261031b1a3") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:58.780737 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.780705 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:29:58.780889 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.780705 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:58.780889 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:58.780836 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:29:58.780889 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:58.780883 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-j22h7" podUID="5b342b72-c7fb-4579-947f-7a261031b1a3" Apr 24 22:29:58.890522 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.890496 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log" Apr 24 22:29:58.890934 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.890879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"809c6c56aa7e5ba178e2b67a55fcf0fa5ff2943f6f97ab3b37e35cc1d5d40f92"} Apr 24 22:29:58.891238 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.891212 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:58.891379 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.891359 2568 scope.go:117] "RemoveContainer" containerID="3484f44bffeb97f49306e76c77d2b5f03e1ffd3f3bc7fe4bee3f940175937a42" Apr 24 22:29:58.894626 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.894552 2568 generic.go:358] "Generic (PLEG): container finished" podID="99d5da74-fb38-467a-951e-9d474464c9b1" containerID="d976077a9bd50fdac2b0718cc0ca7fdd5e7df41f8fb8f6e3a17128e2cdd94d8e" exitCode=0 Apr 24 22:29:58.894626 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.894605 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" event={"ID":"99d5da74-fb38-467a-951e-9d474464c9b1","Type":"ContainerDied","Data":"d976077a9bd50fdac2b0718cc0ca7fdd5e7df41f8fb8f6e3a17128e2cdd94d8e"} Apr 24 22:29:58.907202 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:58.907176 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:59.006644 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:59.006612 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:29:59.006848 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:59.006828 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:59.006906 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:59.006900 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret podName:5b342b72-c7fb-4579-947f-7a261031b1a3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:01.006877649 +0000 UTC m=+29.823203724 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret") pod "global-pull-secret-syncer-j22h7" (UID: "5b342b72-c7fb-4579-947f-7a261031b1a3") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:59.784720 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:59.784519 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:29:59.784880 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:29:59.784817 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:29:59.904439 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:59.904359 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log" Apr 24 22:29:59.907792 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:59.907708 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" event={"ID":"ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d","Type":"ContainerStarted","Data":"0bde7390dc2ed1f231130a74a97c6603b0850552ffd45abe14e8fbf7a6c6475b"} Apr 24 22:29:59.909112 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:59.909088 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:59.909249 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:59.909126 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:59.928213 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:59.928163 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:29:59.946351 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:29:59.946286 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" podStartSLOduration=11.871449364 podStartE2EDuration="28.94626645s" podCreationTimestamp="2026-04-24 22:29:31 +0000 UTC" firstStartedPulling="2026-04-24 22:29:34.628319668 +0000 UTC m=+3.444645724" lastFinishedPulling="2026-04-24 22:29:51.703136743 +0000 UTC m=+20.519462810" observedRunningTime="2026-04-24 22:29:59.944597425 +0000 UTC m=+28.760923505" watchObservedRunningTime="2026-04-24 22:29:59.94626645 +0000 UTC m=+28.762592530" Apr 24 22:30:00.198591 ip-10-0-136-66 
kubenswrapper[2568]: I0424 22:30:00.198544 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-j22h7"] Apr 24 22:30:00.198742 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:00.198718 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:30:00.198850 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:00.198826 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-j22h7" podUID="5b342b72-c7fb-4579-947f-7a261031b1a3" Apr 24 22:30:00.202533 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:00.202309 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lclmx"] Apr 24 22:30:00.202533 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:00.202390 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:30:00.202533 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:00.202456 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a" Apr 24 22:30:00.218225 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:00.218198 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2wftt"] Apr 24 22:30:00.218371 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:00.218324 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:30:00.218454 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:00.218434 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736" Apr 24 22:30:01.028038 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:01.027955 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:30:01.028612 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:01.028232 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:01.028612 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:01.028289 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret podName:5b342b72-c7fb-4579-947f-7a261031b1a3 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:05.028270093 +0000 UTC m=+33.844596152 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret") pod "global-pull-secret-syncer-j22h7" (UID: "5b342b72-c7fb-4579-947f-7a261031b1a3") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:01.783149 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:01.783116 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx"
Apr 24 22:30:01.783322 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:01.783203 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a"
Apr 24 22:30:01.783582 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:01.783533 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j22h7"
Apr 24 22:30:01.783695 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:01.783636 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-j22h7" podUID="5b342b72-c7fb-4579-947f-7a261031b1a3"
Apr 24 22:30:01.783695 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:01.783674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt"
Apr 24 22:30:01.783807 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:01.783741 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736"
Apr 24 22:30:03.781706 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:03.781638 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j22h7"
Apr 24 22:30:03.782156 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:03.781781 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt"
Apr 24 22:30:03.782156 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:03.781781 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-j22h7" podUID="5b342b72-c7fb-4579-947f-7a261031b1a3"
Apr 24 22:30:03.782156 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:03.781800 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx"
Apr 24 22:30:03.782156 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:03.781883 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wftt" podUID="9b056a73-ce1c-4e88-9a66-ebfc4498a736"
Apr 24 22:30:03.782156 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:03.781944 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lclmx" podUID="5346f0d6-5375-4d8a-9fb6-8f7c3a45720a"
Apr 24 22:30:05.056709 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.056673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7"
Apr 24 22:30:05.057228 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.056808 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:05.057228 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.056858 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret podName:5b342b72-c7fb-4579-947f-7a261031b1a3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.056845596 +0000 UTC m=+41.873171652 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret") pod "global-pull-secret-syncer-j22h7" (UID: "5b342b72-c7fb-4579-947f-7a261031b1a3") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:05.460004 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.459921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt"
Apr 24 22:30:05.460146 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.460031 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:05.460146 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.460085 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs podName:9b056a73-ce1c-4e88-9a66-ebfc4498a736 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.460070896 +0000 UTC m=+66.276396956 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs") pod "network-metrics-daemon-2wftt" (UID: "9b056a73-ce1c-4e88-9a66-ebfc4498a736") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:05.501340 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.501313 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-66.ec2.internal" event="NodeReady"
Apr 24 22:30:05.501497 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.501427 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 22:30:05.550478 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.550442 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5bdcbf4855-czjjp"]
Apr 24 22:30:05.560926 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.560709 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx"
Apr 24 22:30:05.560926 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.561012 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:05.560926 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.561076 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:05.560926 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.561094 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fr9m5 for pod openshift-network-diagnostics/network-check-target-lclmx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:05.560926 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.561175 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5 podName:5346f0d6-5375-4d8a-9fb6-8f7c3a45720a nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.561152656 +0000 UTC m=+66.377478714 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fr9m5" (UniqueName: "kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5") pod "network-check-target-lclmx" (UID: "5346f0d6-5375-4d8a-9fb6-8f7c3a45720a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:05.575176 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.575148 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2"]
Apr 24 22:30:05.575328 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.575310 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.579004 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.578979 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 22:30:05.579137 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.579008 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 22:30:05.579137 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.578980 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8jvtj\""
Apr 24 22:30:05.579234 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.579144 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 22:30:05.589323 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.588235 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 22:30:05.599988 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.599956 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"]
Apr 24 22:30:05.600122 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.600104 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2"
Apr 24 22:30:05.603247 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.603220 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.603357 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.603220 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 22:30:05.603357 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.603284 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.603649 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.603631 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hxwqm\""
Apr 24 22:30:05.611900 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.611878 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"]
Apr 24 22:30:05.612027 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.612012 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.614103 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.614083 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2wn7s\""
Apr 24 22:30:05.614527 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.614510 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 22:30:05.615689 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.615666 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.615790 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.615753 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 22:30:05.615927 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.615911 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.630683 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.630650 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ddhcv"]
Apr 24 22:30:05.630811 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.630794 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"
Apr 24 22:30:05.635319 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.635296 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 22:30:05.635468 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.635305 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 22:30:05.635529 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.635466 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bkdpz\""
Apr 24 22:30:05.635678 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.635660 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.636617 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.636596 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.657500 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.657465 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596"]
Apr 24 22:30:05.657500 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.657488 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.661099 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.661070 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40b71b75-d3f5-4c46-82d7-81a92e3e572d-ca-trust-extracted\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.661099 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.661098 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn5m9\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-kube-api-access-bn5m9\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.661271 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.661119 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-installation-pull-secrets\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.661271 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.661139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.661392 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.661274 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-trusted-ca\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.661392 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.661311 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-image-registry-private-configuration\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.661392 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.661341 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-certificates\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.661392 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.661368 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-bound-sa-token\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.662350 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.662328 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 22:30:05.662675 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.662643 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.663062 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.663043 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.663199 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.663174 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-f85jf\""
Apr 24 22:30:05.663284 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.663208 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 22:30:05.668344 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.668324 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 22:30:05.672691 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.672674 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"]
Apr 24 22:30:05.672815 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.672801 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596"
Apr 24 22:30:05.676998 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.676971 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.677106 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.677083 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-r92mp\""
Apr 24 22:30:05.677740 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.677701 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.690733 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.690707 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rlv6p"]
Apr 24 22:30:05.690852 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.690836 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"
Apr 24 22:30:05.695718 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.695694 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 22:30:05.696071 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.696054 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.696144 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.696091 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.696144 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.696119 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mtpf2\""
Apr 24 22:30:05.696259 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.696203 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 22:30:05.702882 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.702854 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6cf5756446-6zvgg"]
Apr 24 22:30:05.703011 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.702996 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.709246 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.709222 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9zfdh\""
Apr 24 22:30:05.709534 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.709516 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.709649 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.709632 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.709706 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.709639 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 22:30:05.710079 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.710031 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 22:30:05.714581 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.714548 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 22:30:05.715019 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.715005 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bdcbf4855-czjjp"]
Apr 24 22:30:05.715077 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.715026 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"]
Apr 24 22:30:05.715180 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.715164 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.717149 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.717131 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 22:30:05.717225 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.717199 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 22:30:05.719023 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.719010 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 22:30:05.719155 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.719137 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 22:30:05.719301 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.719283 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-p9zjk\""
Apr 24 22:30:05.719380 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.719365 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.719664 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.719647 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.727017 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.726990 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp"]
Apr 24 22:30:05.727161 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.727144 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"
Apr 24 22:30:05.729690 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.729669 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 22:30:05.729768 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.729739 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-p6295\""
Apr 24 22:30:05.729818 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.729771 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 22:30:05.739234 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.739212 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596"]
Apr 24 22:30:05.739234 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.739239 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"]
Apr 24 22:30:05.739359 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.739344 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp"
Apr 24 22:30:05.741702 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.741685 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.742081 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.742067 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.742188 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.742172 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-r7bkx\""
Apr 24 22:30:05.753698 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.753674 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2"]
Apr 24 22:30:05.753698 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.753703 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr"]
Apr 24 22:30:05.753850 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.753814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:05.756065 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.756041 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 22:30:05.756065 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.756046 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 22:30:05.756242 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.756106 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 22:30:05.756242 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.756047 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 22:30:05.761718 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761691 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45450237-ea60-4420-a8ca-ce71d93e2261-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.761810 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761725 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45450237-ea60-4420-a8ca-ce71d93e2261-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.761810 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-config\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"
Apr 24 22:30:05.761810 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-trusted-ca\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.761810 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"
Apr 24 22:30:05.761935 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761820 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-image-registry-private-configuration\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.761935 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-certificates\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.761935 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-bound-sa-token\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:05.761935 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761922 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-snapshots\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.762095 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.761957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ns5m\" (UniqueName: \"kubernetes.io/projected/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-kube-api-access-5ns5m\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.762135 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762096 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwg9r\" (UniqueName: \"kubernetes.io/projected/760dbb36-9588-4e56-bc23-e7c58e592567-kube-api-access-qwg9r\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2"
Apr 24 22:30:05.762135 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762128 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4z7v\" (UniqueName: \"kubernetes.io/projected/45450237-ea60-4420-a8ca-ce71d93e2261-kube-api-access-b4z7v\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.762207 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762151 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc4rj\" (UniqueName: \"kubernetes.io/projected/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-kube-api-access-bc4rj\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"
Apr 24 22:30:05.762207 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762173 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2"
Apr 24 22:30:05.762207 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762192 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-serving-cert\") pod \"insights-operator-585dfdc468-ddhcv\"
(UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv" Apr 24 22:30:05.762329 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762219 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40b71b75-d3f5-4c46-82d7-81a92e3e572d-ca-trust-extracted\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.762329 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn5m9\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-kube-api-access-bn5m9\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.762329 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv" Apr 24 22:30:05.762329 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762302 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-installation-pull-secrets\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.762329 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762327 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.762555 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762352 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftpnv\" (UniqueName: \"kubernetes.io/projected/e2e30cff-84a0-405d-bdd7-3d9f61b16917-kube-api-access-ftpnv\") pod \"volume-data-source-validator-7c6cbb6c87-gz596\" (UID: \"e2e30cff-84a0-405d-bdd7-3d9f61b16917\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596" Apr 24 22:30:05.762555 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762376 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-tmp\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv" Apr 24 22:30:05.762555 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-service-ca-bundle\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv" Apr 24 22:30:05.762555 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762420 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-certificates\") pod 
\"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.762555 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.762516 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:05.762555 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.762532 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bdcbf4855-czjjp: secret "image-registry-tls" not found Apr 24 22:30:05.762768 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762580 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40b71b75-d3f5-4c46-82d7-81a92e3e572d-ca-trust-extracted\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.762768 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.762595 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls podName:40b71b75-d3f5-4c46-82d7-81a92e3e572d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.262580854 +0000 UTC m=+35.078906910 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls") pod "image-registry-5bdcbf4855-czjjp" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d") : secret "image-registry-tls" not found Apr 24 22:30:05.762768 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.762684 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-trusted-ca\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.767049 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.767010 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-image-registry-private-configuration\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.769321 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.769298 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-installation-pull-secrets\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.769429 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.769412 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rlv6p"] Apr 24 22:30:05.769468 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.769435 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"] Apr 24 22:30:05.769468 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.769445 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"] Apr 24 22:30:05.769468 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.769455 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"] Apr 24 22:30:05.769609 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.769523 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr" Apr 24 22:30:05.772134 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.772059 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 22:30:05.772254 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.772140 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-mhqv7\"" Apr 24 22:30:05.772448 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.772427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-bound-sa-token\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.772974 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.772950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn5m9\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-kube-api-access-bn5m9\") pod 
\"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:05.787802 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.787780 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp"] Apr 24 22:30:05.787943 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.787925 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:30:05.788023 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.787926 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:30:05.788170 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.787925 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:05.788418 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.788309 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:30:05.790263 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.790246 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2wb7b\"" Apr 24 22:30:05.790492 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.790479 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:05.790858 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.790842 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 22:30:05.790943 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.790871 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 22:30:05.790993 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.790965 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-94q8c\"" Apr 24 22:30:05.791043 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791034 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"] Apr 24 22:30:05.791090 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791052 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"] Apr 24 22:30:05.791090 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791064 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"] Apr 24 22:30:05.791090 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791076 2568 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr"] Apr 24 22:30:05.791090 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791089 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ddhcv"] Apr 24 22:30:05.791206 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791102 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"] Apr 24 22:30:05.791206 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791114 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6cf5756446-6zvgg"] Apr 24 22:30:05.791206 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791127 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x22nw"] Apr 24 22:30:05.791327 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791312 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 22:30:05.791655 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791640 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 22:30:05.791708 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.791693 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 22:30:05.814553 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.814532 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wsnnl"] Apr 24 22:30:05.814686 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.814671 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:05.818143 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.818122 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hjbjm\"" Apr 24 22:30:05.818227 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.818120 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 22:30:05.818529 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.818512 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 22:30:05.836064 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.836038 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wsnnl"] Apr 24 22:30:05.836064 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.836065 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x22nw"] Apr 24 22:30:05.836215 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.836167 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:05.838943 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.838921 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 22:30:05.839055 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.839002 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 22:30:05.839055 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.839008 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 22:30:05.839055 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.839015 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g8k4l\"" Apr 24 22:30:05.863431 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863397 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-config\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j" Apr 24 22:30:05.863623 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863440 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d94d4d43-b527-44fc-a469-c293ea0d393c-trusted-ca\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" Apr 24 22:30:05.863623 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863467 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nnvsz\" (UniqueName: \"kubernetes.io/projected/00bf21d3-25d9-4d4b-a2f5-c99835cb1daa-kube-api-access-nnvsz\") pod \"network-check-source-8894fc9bd-lb8gp\" (UID: \"00bf21d3-25d9-4d4b-a2f5-c99835cb1daa\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp" Apr 24 22:30:05.863623 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62qqc\" (UniqueName: \"kubernetes.io/projected/ab2d689d-cd6c-4576-962c-fe8652ae8258-kube-api-access-62qqc\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm" Apr 24 22:30:05.863623 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45450237-ea60-4420-a8ca-ce71d93e2261-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7" Apr 24 22:30:05.863623 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j" Apr 24 22:30:05.863913 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-snapshots\") pod 
\"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv" Apr 24 22:30:05.863913 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863678 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw686\" (UniqueName: \"kubernetes.io/projected/f030e0a1-53de-48ef-9d7b-0da9ecbb9124-kube-api-access-jw686\") pod \"managed-serviceaccount-addon-agent-fb556c45-sgwdr\" (UID: \"f030e0a1-53de-48ef-9d7b-0da9ecbb9124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr" Apr 24 22:30:05.863913 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863705 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:05.863913 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4st\" (UniqueName: \"kubernetes.io/projected/788467b9-4514-4bb7-88c9-87f727737472-kube-api-access-9t4st\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:05.863913 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863810 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-serving-cert\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") 
" pod="openshift-insights/insights-operator-585dfdc468-ddhcv" Apr 24 22:30:05.863913 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863850 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94d4d43-b527-44fc-a469-c293ea0d393c-config\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" Apr 24 22:30:05.863913 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.863881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv" Apr 24 22:30:05.864166 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864005 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-default-certificate\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:05.864166 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864042 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-config\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j" Apr 24 22:30:05.864166 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864053 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:05.864166 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:05.864166 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftpnv\" (UniqueName: \"kubernetes.io/projected/e2e30cff-84a0-405d-bdd7-3d9f61b16917-kube-api-access-ftpnv\") pod \"volume-data-source-validator-7c6cbb6c87-gz596\" (UID: \"e2e30cff-84a0-405d-bdd7-3d9f61b16917\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596" Apr 24 22:30:05.864166 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-tmp\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv" Apr 24 22:30:05.864428 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ab2d689d-cd6c-4576-962c-fe8652ae8258-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm" Apr 24 22:30:05.864428 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864217 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrlj\" (UniqueName: \"kubernetes.io/projected/463655ac-43dd-4bbb-9fc3-a11c952398a9-kube-api-access-gjrlj\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:05.864428 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864241 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45450237-ea60-4420-a8ca-ce71d93e2261-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7" Apr 24 22:30:05.864428 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864264 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/788467b9-4514-4bb7-88c9-87f727737472-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:05.864627 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwg9r\" (UniqueName: \"kubernetes.io/projected/760dbb36-9588-4e56-bc23-e7c58e592567-kube-api-access-qwg9r\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:05.864627 
ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864532 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab2d689d-cd6c-4576-962c-fe8652ae8258-tmp\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:05.864627 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864590 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4z7v\" (UniqueName: \"kubernetes.io/projected/45450237-ea60-4420-a8ca-ce71d93e2261-kube-api-access-b4z7v\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.864627 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864618 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc4rj\" (UniqueName: \"kubernetes.io/projected/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-kube-api-access-bc4rj\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"
Apr 24 22:30:05.864627 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864349 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-snapshots\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.864627 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2"
Apr 24 22:30:05.864627 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864673 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr8xq\" (UniqueName: \"kubernetes.io/projected/d94d4d43-b527-44fc-a469-c293ea0d393c-kube-api-access-gr8xq\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864644 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-tmp\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864705 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94d4d43-b527-44fc-a469-c293ea0d393c-serving-cert\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.864724 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864731 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f030e0a1-53de-48ef-9d7b-0da9ecbb9124-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fb556c45-sgwdr\" (UID: \"f030e0a1-53de-48ef-9d7b-0da9ecbb9124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr"
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864761 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29de487b-ac86-4115-aa5b-af699bbd6649-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.864787 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls podName:760dbb36-9588-4e56-bc23-e7c58e592567 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.364769923 +0000 UTC m=+35.181095983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-984q2" (UID: "760dbb36-9588-4e56-bc23-e7c58e592567") : secret "samples-operator-tls" not found
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864806 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-stats-auth\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-service-ca-bundle\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.865113 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.864926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ns5m\" (UniqueName: \"kubernetes.io/projected/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-kube-api-access-5ns5m\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.865591 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.865377 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-service-ca-bundle\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.865740 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.865710 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.866385 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.866364 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"
Apr 24 22:30:05.866549 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.866532 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-serving-cert\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.866684 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.866670 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45450237-ea60-4420-a8ca-ce71d93e2261-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.874159 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.874137 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45450237-ea60-4420-a8ca-ce71d93e2261-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.876339 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.876319 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftpnv\" (UniqueName: \"kubernetes.io/projected/e2e30cff-84a0-405d-bdd7-3d9f61b16917-kube-api-access-ftpnv\") pod \"volume-data-source-validator-7c6cbb6c87-gz596\" (UID: \"e2e30cff-84a0-405d-bdd7-3d9f61b16917\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596"
Apr 24 22:30:05.886486 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.886437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwg9r\" (UniqueName: \"kubernetes.io/projected/760dbb36-9588-4e56-bc23-e7c58e592567-kube-api-access-qwg9r\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2"
Apr 24 22:30:05.890797 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.890754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc4rj\" (UniqueName: \"kubernetes.io/projected/aa9e4a3b-8580-4463-a5cd-6aa7ba82738b-kube-api-access-bc4rj\") pod \"service-ca-operator-d6fc45fc5-2wr7j\" (UID: \"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"
Apr 24 22:30:05.891185 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.891167 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4z7v\" (UniqueName: \"kubernetes.io/projected/45450237-ea60-4420-a8ca-ce71d93e2261-kube-api-access-b4z7v\") pod \"kube-storage-version-migrator-operator-6769c5d45-6dhr7\" (UID: \"45450237-ea60-4420-a8ca-ce71d93e2261\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.892389 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.892368 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ns5m\" (UniqueName: \"kubernetes.io/projected/9c268f7d-cce3-435b-bc67-fc4ba3d34b62-kube-api-access-5ns5m\") pod \"insights-operator-585dfdc468-ddhcv\" (UID: \"9c268f7d-cce3-435b-bc67-fc4ba3d34b62\") " pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.921899 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.921868 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"
Apr 24 22:30:05.924176 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.924149 2568 generic.go:358] "Generic (PLEG): container finished" podID="99d5da74-fb38-467a-951e-9d474464c9b1" containerID="8e9de886846f50f58b96ebc0fc9297d22cb7380d99b4a3b74e98bb98c4ad7437" exitCode=0
Apr 24 22:30:05.924272 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.924216 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" event={"ID":"99d5da74-fb38-467a-951e-9d474464c9b1","Type":"ContainerDied","Data":"8e9de886846f50f58b96ebc0fc9297d22cb7380d99b4a3b74e98bb98c4ad7437"}
Apr 24 22:30:05.939309 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.939278 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"
Apr 24 22:30:05.965859 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.965827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.965994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.965871 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl"
Apr 24 22:30:05.965994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.965892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.965994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.965922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ab2d689d-cd6c-4576-962c-fe8652ae8258-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:05.965994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.965943 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrlj\" (UniqueName: \"kubernetes.io/projected/463655ac-43dd-4bbb-9fc3-a11c952398a9-kube-api-access-gjrlj\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.965994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.965959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-ca\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"
Apr 24 22:30:05.965994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.965978 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/788467b9-4514-4bb7-88c9-87f727737472-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"
Apr 24 22:30:05.965994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.965995 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jkbw\" (UniqueName: \"kubernetes.io/projected/8e2721e7-1e03-4664-9c72-58e388853714-kube-api-access-8jkbw\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-hub\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966039 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab2d689d-cd6c-4576-962c-fe8652ae8258-tmp\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966067 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr8xq\" (UniqueName: \"kubernetes.io/projected/d94d4d43-b527-44fc-a469-c293ea0d393c-kube-api-access-gr8xq\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94d4d43-b527-44fc-a469-c293ea0d393c-serving-cert\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966101 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f030e0a1-53de-48ef-9d7b-0da9ecbb9124-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fb556c45-sgwdr\" (UID: \"f030e0a1-53de-48ef-9d7b-0da9ecbb9124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29de487b-ac86-4115-aa5b-af699bbd6649-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966170 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-stats-auth\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1832b77e-f856-4269-bb4c-9f4e5c59c722-tmp-dir\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d94d4d43-b527-44fc-a469-c293ea0d393c-trusted-ca\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966304 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnvsz\" (UniqueName: \"kubernetes.io/projected/00bf21d3-25d9-4d4b-a2f5-c99835cb1daa-kube-api-access-nnvsz\") pod \"network-check-source-8894fc9bd-lb8gp\" (UID: \"00bf21d3-25d9-4d4b-a2f5-c99835cb1daa\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966319 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-ddhcv"
Apr 24 22:30:05.966333 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966329 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62qqc\" (UniqueName: \"kubernetes.io/projected/ab2d689d-cd6c-4576-962c-fe8652ae8258-kube-api-access-62qqc\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966380 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966415 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw686\" (UniqueName: \"kubernetes.io/projected/f030e0a1-53de-48ef-9d7b-0da9ecbb9124-kube-api-access-jw686\") pod \"managed-serviceaccount-addon-agent-fb556c45-sgwdr\" (UID: \"f030e0a1-53de-48ef-9d7b-0da9ecbb9124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966473 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4st\" (UniqueName: \"kubernetes.io/projected/788467b9-4514-4bb7-88c9-87f727737472-kube-api-access-9t4st\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmt5h\" (UniqueName: \"kubernetes.io/projected/1832b77e-f856-4269-bb4c-9f4e5c59c722-kube-api-access-fmt5h\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966581 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94d4d43-b527-44fc-a469-c293ea0d393c-config\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966607 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1832b77e-f856-4269-bb4c-9f4e5c59c722-config-volume\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966632 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wqn2\" (UniqueName: \"kubernetes.io/projected/de52ac65-738f-4a82-8711-dc6a41133d4a-kube-api-access-6wqn2\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966662 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/de52ac65-738f-4a82-8711-dc6a41133d4a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"
Apr 24 22:30:05.967053 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.966701 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-default-certificate\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.967549 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.967374 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 22:30:05.967549 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.967461 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert podName:29de487b-ac86-4115-aa5b-af699bbd6649 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.467440093 +0000 UTC m=+35.283766172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zz9fq" (UID: "29de487b-ac86-4115-aa5b-af699bbd6649") : secret "networking-console-plugin-cert" not found
Apr 24 22:30:05.967780 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.967748 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 22:30:05.967827 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.967800 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.467783729 +0000 UTC m=+35.284109785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : secret "router-metrics-certs-default" not found
Apr 24 22:30:05.968959 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.967895 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.467884889 +0000 UTC m=+35.284210947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : configmap references non-existent config key: service-ca.crt
Apr 24 22:30:05.968959 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.968619 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab2d689d-cd6c-4576-962c-fe8652ae8258-tmp\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:05.968959 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.968906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29de487b-ac86-4115-aa5b-af699bbd6649-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"
Apr 24 22:30:05.969404 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.969381 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 22:30:05.969492 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:05.969452 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls podName:788467b9-4514-4bb7-88c9-87f727737472 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.469435578 +0000 UTC m=+35.285761651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqq25" (UID: "788467b9-4514-4bb7-88c9-87f727737472") : secret "cluster-monitoring-operator-tls" not found
Apr 24 22:30:05.970519 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.970492 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94d4d43-b527-44fc-a469-c293ea0d393c-config\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.970982 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.970960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-stats-auth\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.971362 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.971338 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-default-certificate\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:05.973429 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.971505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d94d4d43-b527-44fc-a469-c293ea0d393c-trusted-ca\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.973429 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.971787 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/788467b9-4514-4bb7-88c9-87f727737472-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"
Apr 24 22:30:05.974156 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.974129 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ab2d689d-cd6c-4576-962c-fe8652ae8258-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:05.974430 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.974404 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94d4d43-b527-44fc-a469-c293ea0d393c-serving-cert\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.975384 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.975363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f030e0a1-53de-48ef-9d7b-0da9ecbb9124-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fb556c45-sgwdr\" (UID: \"f030e0a1-53de-48ef-9d7b-0da9ecbb9124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr"
Apr 24 22:30:05.980040 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.979857 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr8xq\" (UniqueName: \"kubernetes.io/projected/d94d4d43-b527-44fc-a469-c293ea0d393c-kube-api-access-gr8xq\") pod \"console-operator-9d4b6777b-rlv6p\" (UID: \"d94d4d43-b527-44fc-a469-c293ea0d393c\") " pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:05.981529 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.981479 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596"
Apr 24 22:30:05.982132 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.982105 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw686\" (UniqueName: \"kubernetes.io/projected/f030e0a1-53de-48ef-9d7b-0da9ecbb9124-kube-api-access-jw686\") pod \"managed-serviceaccount-addon-agent-fb556c45-sgwdr\" (UID: \"f030e0a1-53de-48ef-9d7b-0da9ecbb9124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr"
Apr 24 22:30:05.984641 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.984594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62qqc\" (UniqueName: \"kubernetes.io/projected/ab2d689d-cd6c-4576-962c-fe8652ae8258-kube-api-access-62qqc\") pod \"klusterlet-addon-workmgr-6b9cb95d68-slldm\" (UID: \"ab2d689d-cd6c-4576-962c-fe8652ae8258\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:05.985708 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.985681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnvsz\" (UniqueName: \"kubernetes.io/projected/00bf21d3-25d9-4d4b-a2f5-c99835cb1daa-kube-api-access-nnvsz\") pod \"network-check-source-8894fc9bd-lb8gp\" (UID: \"00bf21d3-25d9-4d4b-a2f5-c99835cb1daa\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp"
Apr 24 22:30:05.986611 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.986589 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4st\" (UniqueName: \"kubernetes.io/projected/788467b9-4514-4bb7-88c9-87f727737472-kube-api-access-9t4st\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"
Apr 24 22:30:05.994924 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:05.994883 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrlj\" (UniqueName: \"kubernetes.io/projected/463655ac-43dd-4bbb-9fc3-a11c952398a9-kube-api-access-gjrlj\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:06.012378 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.012016 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:06.049284 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.048803 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp"
Apr 24 22:30:06.066639 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.063412 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"
Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmt5h\" (UniqueName: \"kubernetes.io/projected/1832b77e-f856-4269-bb4c-9f4e5c59c722-kube-api-access-fmt5h\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw"
Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069095 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1832b77e-f856-4269-bb4c-9f4e5c59c722-config-volume\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw"
Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069121 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wqn2\" (UniqueName: \"kubernetes.io/projected/de52ac65-738f-4a82-8711-dc6a41133d4a-kube-api-access-6wqn2\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"
Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069150 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/de52ac65-738f-4a82-8711-dc6a41133d4a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"
Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069209 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069286 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-ca\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069327 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jkbw\" (UniqueName: \"kubernetes.io/projected/8e2721e7-1e03-4664-9c72-58e388853714-kube-api-access-8jkbw\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069351 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-hub\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069440 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 
22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069469 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1832b77e-f856-4269-bb4c-9f4e5c59c722-tmp-dir\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069549 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:06.069656 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.069632 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.073108 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.071034 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/de52ac65-738f-4a82-8711-dc6a41133d4a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.073108 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.072183 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:06.073108 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.072267 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert podName:8e2721e7-1e03-4664-9c72-58e388853714 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.572247813 +0000 UTC m=+35.388573869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert") pod "ingress-canary-wsnnl" (UID: "8e2721e7-1e03-4664-9c72-58e388853714") : secret "canary-serving-cert" not found Apr 24 22:30:06.080982 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.080691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1832b77e-f856-4269-bb4c-9f4e5c59c722-config-volume\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:06.080982 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.080934 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:06.080982 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.080963 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1832b77e-f856-4269-bb4c-9f4e5c59c722-tmp-dir\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:06.081203 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.081013 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls podName:1832b77e-f856-4269-bb4c-9f4e5c59c722 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.580993367 +0000 UTC m=+35.397319424 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls") pod "dns-default-x22nw" (UID: "1832b77e-f856-4269-bb4c-9f4e5c59c722") : secret "dns-default-metrics-tls" not found Apr 24 22:30:06.090083 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.089693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.094702 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.094345 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr" Apr 24 22:30:06.094702 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.094698 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmt5h\" (UniqueName: \"kubernetes.io/projected/1832b77e-f856-4269-bb4c-9f4e5c59c722-kube-api-access-fmt5h\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:06.096703 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.096647 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-hub\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.097013 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.096986 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.098421 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.098387 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/de52ac65-738f-4a82-8711-dc6a41133d4a-ca\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.099685 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.099414 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wqn2\" (UniqueName: \"kubernetes.io/projected/de52ac65-738f-4a82-8711-dc6a41133d4a-kube-api-access-6wqn2\") pod \"cluster-proxy-proxy-agent-68f7c77688-tn2fw\" (UID: \"de52ac65-738f-4a82-8711-dc6a41133d4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.100017 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.099975 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jkbw\" (UniqueName: \"kubernetes.io/projected/8e2721e7-1e03-4664-9c72-58e388853714-kube-api-access-8jkbw\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:06.117598 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.114398 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:30:06.171175 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.170548 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7"] Apr 24 22:30:06.269649 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.262255 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j"] Apr 24 22:30:06.272610 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.272488 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:06.273343 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.272870 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:06.273343 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.272893 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bdcbf4855-czjjp: secret "image-registry-tls" not found Apr 24 22:30:06.276044 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.274119 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls podName:40b71b75-d3f5-4c46-82d7-81a92e3e572d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:07.274088621 +0000 UTC m=+36.090414688 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls") pod "image-registry-5bdcbf4855-czjjp" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d") : secret "image-registry-tls" not found Apr 24 22:30:06.276044 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.275221 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ddhcv"] Apr 24 22:30:06.304243 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.304208 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-rlv6p"] Apr 24 22:30:06.316599 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.316543 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596"] Apr 24 22:30:06.352221 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.352194 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp"] Apr 24 22:30:06.367093 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.367072 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr"] Apr 24 22:30:06.369798 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.369773 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm"] Apr 24 22:30:06.374972 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.374949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:06.375117 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.375099 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:06.375190 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.375181 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls podName:760dbb36-9588-4e56-bc23-e7c58e592567 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:07.375163477 +0000 UTC m=+36.191489549 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-984q2" (UID: "760dbb36-9588-4e56-bc23-e7c58e592567") : secret "samples-operator-tls" not found Apr 24 22:30:06.382681 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.382654 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw"] Apr 24 22:30:06.386040 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:06.386010 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c268f7d_cce3_435b_bc67_fc4ba3d34b62.slice/crio-2a86dbcf0e47e65f38326826d6e70480e77e7f526e317b8445131f3245373112 WatchSource:0}: Error finding container 2a86dbcf0e47e65f38326826d6e70480e77e7f526e317b8445131f3245373112: Status 404 returned error can't find the container with id 2a86dbcf0e47e65f38326826d6e70480e77e7f526e317b8445131f3245373112 Apr 24 22:30:06.386496 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:06.386462 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e30cff_84a0_405d_bdd7_3d9f61b16917.slice/crio-e712a2313e9d1a631cd727e195fa63e3a9ded893b677b93ca9a8a445a9917992 WatchSource:0}: Error finding container e712a2313e9d1a631cd727e195fa63e3a9ded893b677b93ca9a8a445a9917992: Status 404 returned error can't find the container with id e712a2313e9d1a631cd727e195fa63e3a9ded893b677b93ca9a8a445a9917992 Apr 24 22:30:06.387626 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:06.387601 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00bf21d3_25d9_4d4b_a2f5_c99835cb1daa.slice/crio-ceb59a6433b2f04aaefadfe55c4a398a404d17576e2c44082226f16c4712fbe7 WatchSource:0}: Error finding container ceb59a6433b2f04aaefadfe55c4a398a404d17576e2c44082226f16c4712fbe7: Status 404 returned error can't find the container with id ceb59a6433b2f04aaefadfe55c4a398a404d17576e2c44082226f16c4712fbe7 Apr 24 22:30:06.388342 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:06.388318 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf030e0a1_53de_48ef_9d7b_0da9ecbb9124.slice/crio-a0deed0e103f7de7ad98df47e59544ffa86cf908314f55926b86ed6c1d85b002 WatchSource:0}: Error finding container a0deed0e103f7de7ad98df47e59544ffa86cf908314f55926b86ed6c1d85b002: Status 404 returned error can't find the container with id a0deed0e103f7de7ad98df47e59544ffa86cf908314f55926b86ed6c1d85b002 Apr 24 22:30:06.389018 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:06.388855 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd94d4d43_b527_44fc_a469_c293ea0d393c.slice/crio-7a0136d0096ad5d5d7262f6c9dda9feb1526b5433b0cf9f6fc3d593085c1356a WatchSource:0}: Error finding container 7a0136d0096ad5d5d7262f6c9dda9feb1526b5433b0cf9f6fc3d593085c1356a: Status 404 returned error can't find 
the container with id 7a0136d0096ad5d5d7262f6c9dda9feb1526b5433b0cf9f6fc3d593085c1356a Apr 24 22:30:06.396240 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:06.396212 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2d689d_cd6c_4576_962c_fe8652ae8258.slice/crio-e3d56b7d17032e3ec13b9d2834d32009fa8c379d7d0dc0c9a3d06e28b5fa45e7 WatchSource:0}: Error finding container e3d56b7d17032e3ec13b9d2834d32009fa8c379d7d0dc0c9a3d06e28b5fa45e7: Status 404 returned error can't find the container with id e3d56b7d17032e3ec13b9d2834d32009fa8c379d7d0dc0c9a3d06e28b5fa45e7 Apr 24 22:30:06.399522 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:06.399497 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa9e4a3b_8580_4463_a5cd_6aa7ba82738b.slice/crio-b8e95447e29aa9c43fc39b271b6e80e908b7b77572fa0318b09d5c65a7b0accf WatchSource:0}: Error finding container b8e95447e29aa9c43fc39b271b6e80e908b7b77572fa0318b09d5c65a7b0accf: Status 404 returned error can't find the container with id b8e95447e29aa9c43fc39b271b6e80e908b7b77572fa0318b09d5c65a7b0accf Apr 24 22:30:06.476049 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.475948 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:06.476049 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.476003 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: 
\"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:06.476049 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.476039 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:06.476317 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.476094 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" Apr 24 22:30:06.476317 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.476102 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:06.476317 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.476155 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:06.476317 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.476165 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls podName:788467b9-4514-4bb7-88c9-87f727737472 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:07.476150576 +0000 UTC m=+36.292476633 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqq25" (UID: "788467b9-4514-4bb7-88c9-87f727737472") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:06.476317 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.476183 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:06.476317 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.476198 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:07.476180175 +0000 UTC m=+36.292506237 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:06.476317 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.476221 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:07.476211969 +0000 UTC m=+36.292538029 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : secret "router-metrics-certs-default" not found Apr 24 22:30:06.476317 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.476235 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert podName:29de487b-ac86-4115-aa5b-af699bbd6649 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:07.476227811 +0000 UTC m=+36.292553869 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zz9fq" (UID: "29de487b-ac86-4115-aa5b-af699bbd6649") : secret "networking-console-plugin-cert" not found Apr 24 22:30:06.577679 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.577630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:06.577825 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.577792 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:06.577877 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.577862 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert podName:8e2721e7-1e03-4664-9c72-58e388853714 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:07.577845741 +0000 UTC m=+36.394171802 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert") pod "ingress-canary-wsnnl" (UID: "8e2721e7-1e03-4664-9c72-58e388853714") : secret "canary-serving-cert" not found Apr 24 22:30:06.678414 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.678380 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:06.678624 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.678560 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:06.678685 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:06.678658 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls podName:1832b77e-f856-4269-bb4c-9f4e5c59c722 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:07.678636506 +0000 UTC m=+36.494962572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls") pod "dns-default-x22nw" (UID: "1832b77e-f856-4269-bb4c-9f4e5c59c722") : secret "dns-default-metrics-tls" not found Apr 24 22:30:06.931353 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.931283 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ddhcv" event={"ID":"9c268f7d-cce3-435b-bc67-fc4ba3d34b62","Type":"ContainerStarted","Data":"2a86dbcf0e47e65f38326826d6e70480e77e7f526e317b8445131f3245373112"} Apr 24 22:30:06.932536 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.932476 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr" event={"ID":"f030e0a1-53de-48ef-9d7b-0da9ecbb9124","Type":"ContainerStarted","Data":"a0deed0e103f7de7ad98df47e59544ffa86cf908314f55926b86ed6c1d85b002"} Apr 24 22:30:06.940597 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.939661 2568 generic.go:358] "Generic (PLEG): container finished" podID="99d5da74-fb38-467a-951e-9d474464c9b1" containerID="45d513d867ae6849a71ec2a21725d378170396b363b1959303b02081ee98b964" exitCode=0 Apr 24 22:30:06.940597 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.939744 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" event={"ID":"99d5da74-fb38-467a-951e-9d474464c9b1","Type":"ContainerDied","Data":"45d513d867ae6849a71ec2a21725d378170396b363b1959303b02081ee98b964"} Apr 24 22:30:06.942456 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.942416 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm" event={"ID":"ab2d689d-cd6c-4576-962c-fe8652ae8258","Type":"ContainerStarted","Data":"e3d56b7d17032e3ec13b9d2834d32009fa8c379d7d0dc0c9a3d06e28b5fa45e7"} Apr 24 22:30:06.944120 
ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.944096 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp" event={"ID":"00bf21d3-25d9-4d4b-a2f5-c99835cb1daa","Type":"ContainerStarted","Data":"ceb59a6433b2f04aaefadfe55c4a398a404d17576e2c44082226f16c4712fbe7"} Apr 24 22:30:06.956536 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.956499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596" event={"ID":"e2e30cff-84a0-405d-bdd7-3d9f61b16917","Type":"ContainerStarted","Data":"e712a2313e9d1a631cd727e195fa63e3a9ded893b677b93ca9a8a445a9917992"} Apr 24 22:30:06.961285 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.961231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7" event={"ID":"45450237-ea60-4420-a8ca-ce71d93e2261","Type":"ContainerStarted","Data":"1625d4c597840ee1922a264041c79d705bbe0052d9afa660d6ac2360cf3e9603"} Apr 24 22:30:06.964019 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.963989 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" event={"ID":"d94d4d43-b527-44fc-a469-c293ea0d393c","Type":"ContainerStarted","Data":"7a0136d0096ad5d5d7262f6c9dda9feb1526b5433b0cf9f6fc3d593085c1356a"} Apr 24 22:30:06.966249 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.966212 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j" event={"ID":"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b","Type":"ContainerStarted","Data":"b8e95447e29aa9c43fc39b271b6e80e908b7b77572fa0318b09d5c65a7b0accf"} Apr 24 22:30:06.967580 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:06.967547 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" event={"ID":"de52ac65-738f-4a82-8711-dc6a41133d4a","Type":"ContainerStarted","Data":"7f54d6fd913169eaace919e465f10cd46d04d394f426198801a7bd7e080731d1"} Apr 24 22:30:07.287493 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.287450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:07.287951 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.287698 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:07.287951 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.287717 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bdcbf4855-czjjp: secret "image-registry-tls" not found Apr 24 22:30:07.287951 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.287778 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls podName:40b71b75-d3f5-4c46-82d7-81a92e3e572d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.287758842 +0000 UTC m=+38.104084900 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls") pod "image-registry-5bdcbf4855-czjjp" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d") : secret "image-registry-tls" not found Apr 24 22:30:07.389659 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.389014 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:07.389659 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.389189 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:07.389659 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.389255 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls podName:760dbb36-9588-4e56-bc23-e7c58e592567 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.389235886 +0000 UTC m=+38.205561965 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-984q2" (UID: "760dbb36-9588-4e56-bc23-e7c58e592567") : secret "samples-operator-tls" not found Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.490031 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.490154 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.490213 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.490251 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: 
\"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.490429 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.490409367 +0000 UTC m=+38.306735423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.490881 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.490935 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert podName:29de487b-ac86-4115-aa5b-af699bbd6649 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.490917779 +0000 UTC m=+38.307243836 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zz9fq" (UID: "29de487b-ac86-4115-aa5b-af699bbd6649") : secret "networking-console-plugin-cert" not found Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.490993 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.491030 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls podName:788467b9-4514-4bb7-88c9-87f727737472 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.491019384 +0000 UTC m=+38.307345456 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqq25" (UID: "788467b9-4514-4bb7-88c9-87f727737472") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.491084 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:07.491211 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.491115 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.491105426 +0000 UTC m=+38.307431484 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : secret "router-metrics-certs-default" not found Apr 24 22:30:07.592203 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.591706 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:07.592203 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.592126 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:07.592963 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.592940 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert podName:8e2721e7-1e03-4664-9c72-58e388853714 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.592916109 +0000 UTC m=+38.409242182 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert") pod "ingress-canary-wsnnl" (UID: "8e2721e7-1e03-4664-9c72-58e388853714") : secret "canary-serving-cert" not found Apr 24 22:30:07.693621 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.693581 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:07.693861 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.693739 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:07.693861 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:07.693804 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls podName:1832b77e-f856-4269-bb4c-9f4e5c59c722 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:09.693786276 +0000 UTC m=+38.510112331 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls") pod "dns-default-x22nw" (UID: "1832b77e-f856-4269-bb4c-9f4e5c59c722") : secret "dns-default-metrics-tls" not found Apr 24 22:30:07.996590 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:07.996320 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" event={"ID":"99d5da74-fb38-467a-951e-9d474464c9b1","Type":"ContainerStarted","Data":"4b3e215247e771f762878ec175188dbdf52e003d5c2b41598267719768f784f5"} Apr 24 22:30:08.045004 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:08.044125 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m8pjb" podStartSLOduration=6.820945253 podStartE2EDuration="37.044104269s" podCreationTimestamp="2026-04-24 22:29:31 +0000 UTC" firstStartedPulling="2026-04-24 22:29:34.626717254 +0000 UTC m=+3.443043317" lastFinishedPulling="2026-04-24 22:30:04.849876264 +0000 UTC m=+33.666202333" observedRunningTime="2026-04-24 22:30:08.041910992 +0000 UTC m=+36.858237060" watchObservedRunningTime="2026-04-24 22:30:08.044104269 +0000 UTC m=+36.860430349" Apr 24 22:30:09.313712 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:09.312782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:09.313712 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.312940 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:09.313712 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.312957 2568 projected.go:194] Error 
preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bdcbf4855-czjjp: secret "image-registry-tls" not found Apr 24 22:30:09.313712 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.313020 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls podName:40b71b75-d3f5-4c46-82d7-81a92e3e572d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.312997803 +0000 UTC m=+42.129323864 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls") pod "image-registry-5bdcbf4855-czjjp" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d") : secret "image-registry-tls" not found Apr 24 22:30:09.414198 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:09.413490 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:09.414198 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.413777 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:09.414198 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.413843 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls podName:760dbb36-9588-4e56-bc23-e7c58e592567 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.413823788 +0000 UTC m=+42.230149843 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-984q2" (UID: "760dbb36-9588-4e56-bc23-e7c58e592567") : secret "samples-operator-tls" not found Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:09.514430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:09.514516 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:09.514557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:09.514660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: 
\"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.514833 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.514899 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.514880015 +0000 UTC m=+42.331206071 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : secret "router-metrics-certs-default" not found Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.515309 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.515355 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls podName:788467b9-4514-4bb7-88c9-87f727737472 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.515341443 +0000 UTC m=+42.331667506 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqq25" (UID: "788467b9-4514-4bb7-88c9-87f727737472") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:09.515508 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.515433 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.515423614 +0000 UTC m=+42.331749675 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:09.516822 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.516029 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:09.516822 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.516082 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert podName:29de487b-ac86-4115-aa5b-af699bbd6649 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.516066882 +0000 UTC m=+42.332392942 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zz9fq" (UID: "29de487b-ac86-4115-aa5b-af699bbd6649") : secret "networking-console-plugin-cert" not found Apr 24 22:30:09.618506 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:09.617818 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:09.618506 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.618023 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:09.618506 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.618081 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert podName:8e2721e7-1e03-4664-9c72-58e388853714 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.618063209 +0000 UTC m=+42.434389273 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert") pod "ingress-canary-wsnnl" (UID: "8e2721e7-1e03-4664-9c72-58e388853714") : secret "canary-serving-cert" not found Apr 24 22:30:09.719071 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:09.719033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:09.719416 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.719396 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:09.719498 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:09.719467 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls podName:1832b77e-f856-4269-bb4c-9f4e5c59c722 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:13.719449108 +0000 UTC m=+42.535775178 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls") pod "dns-default-x22nw" (UID: "1832b77e-f856-4269-bb4c-9f4e5c59c722") : secret "dns-default-metrics-tls" not found Apr 24 22:30:13.057868 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.057829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:30:13.073510 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.073478 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5b342b72-c7fb-4579-947f-7a261031b1a3-original-pull-secret\") pod \"global-pull-secret-syncer-j22h7\" (UID: \"5b342b72-c7fb-4579-947f-7a261031b1a3\") " pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:30:13.307224 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.307182 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-j22h7" Apr 24 22:30:13.360903 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.360869 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:13.361075 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.361045 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:13.361075 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.361069 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bdcbf4855-czjjp: secret "image-registry-tls" not found Apr 24 22:30:13.361173 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.361160 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls podName:40b71b75-d3f5-4c46-82d7-81a92e3e572d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.36113646 +0000 UTC m=+50.177462518 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls") pod "image-registry-5bdcbf4855-czjjp" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d") : secret "image-registry-tls" not found Apr 24 22:30:13.461424 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.461386 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:13.461616 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.461521 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:13.461616 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.461602 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls podName:760dbb36-9588-4e56-bc23-e7c58e592567 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.461587505 +0000 UTC m=+50.277913578 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-984q2" (UID: "760dbb36-9588-4e56-bc23-e7c58e592567") : secret "samples-operator-tls" not found Apr 24 22:30:13.562684 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.562598 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:13.562940 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.562683 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:13.562940 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.562729 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:13.562940 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.562793 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:13.562940 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.562822 2568 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" Apr 24 22:30:13.562940 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.562871 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls podName:788467b9-4514-4bb7-88c9-87f727737472 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.562849864 +0000 UTC m=+50.379175944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqq25" (UID: "788467b9-4514-4bb7-88c9-87f727737472") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:13.562940 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.562923 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:13.563241 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.562973 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert podName:29de487b-ac86-4115-aa5b-af699bbd6649 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.56295729 +0000 UTC m=+50.379283346 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zz9fq" (UID: "29de487b-ac86-4115-aa5b-af699bbd6649") : secret "networking-console-plugin-cert" not found Apr 24 22:30:13.563241 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.563035 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:13.563241 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.563069 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.563057649 +0000 UTC m=+50.379383712 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : secret "router-metrics-certs-default" not found Apr 24 22:30:13.563241 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.563151 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.563124122 +0000 UTC m=+50.379450204 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:13.664027 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.663974 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:13.664243 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.664128 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:13.664243 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.664199 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert podName:8e2721e7-1e03-4664-9c72-58e388853714 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.664179194 +0000 UTC m=+50.480505250 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert") pod "ingress-canary-wsnnl" (UID: "8e2721e7-1e03-4664-9c72-58e388853714") : secret "canary-serving-cert" not found Apr 24 22:30:13.764823 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:13.764793 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:13.765009 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.764963 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:13.765071 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:13.765043 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls podName:1832b77e-f856-4269-bb4c-9f4e5c59c722 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.765023312 +0000 UTC m=+50.581349387 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls") pod "dns-default-x22nw" (UID: "1832b77e-f856-4269-bb4c-9f4e5c59c722") : secret "dns-default-metrics-tls" not found Apr 24 22:30:18.644765 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:18.644739 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-j22h7"] Apr 24 22:30:18.647694 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:18.647666 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b342b72_c7fb_4579_947f_7a261031b1a3.slice/crio-eba3adf046f3aaf7d3c0d4f9ac1e57e9c6fe089659d73da0f847e394620bc30d WatchSource:0}: Error finding container eba3adf046f3aaf7d3c0d4f9ac1e57e9c6fe089659d73da0f847e394620bc30d: Status 404 returned error can't find the container with id eba3adf046f3aaf7d3c0d4f9ac1e57e9c6fe089659d73da0f847e394620bc30d Apr 24 22:30:19.032892 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.032836 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr" event={"ID":"f030e0a1-53de-48ef-9d7b-0da9ecbb9124","Type":"ContainerStarted","Data":"3df2298746388517c50f32a984ab7e8650d13cca03396a35838740651766ee9a"} Apr 24 22:30:19.034166 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.034140 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-j22h7" event={"ID":"5b342b72-c7fb-4579-947f-7a261031b1a3","Type":"ContainerStarted","Data":"eba3adf046f3aaf7d3c0d4f9ac1e57e9c6fe089659d73da0f847e394620bc30d"} Apr 24 22:30:19.035628 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.035604 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm" 
event={"ID":"ab2d689d-cd6c-4576-962c-fe8652ae8258","Type":"ContainerStarted","Data":"4bbfce6c56172592192480e04ba808cfa7c1e8632916bd1ee827c9bad7cd2bdc"} Apr 24 22:30:19.035815 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.035802 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm" Apr 24 22:30:19.037509 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.037484 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp" event={"ID":"00bf21d3-25d9-4d4b-a2f5-c99835cb1daa","Type":"ContainerStarted","Data":"e1f0a805f0678b615988cc3b04b89a52bd337a6f067ce85bb8cb62a267cacd6c"} Apr 24 22:30:19.037873 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.037853 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm" Apr 24 22:30:19.039494 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.039094 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596" event={"ID":"e2e30cff-84a0-405d-bdd7-3d9f61b16917","Type":"ContainerStarted","Data":"4b22c8b1fe3c3015ce7d113c7704ec86750e6434cb30e5e7870740b4d22e1fab"} Apr 24 22:30:19.041130 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.041103 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7" event={"ID":"45450237-ea60-4420-a8ca-ce71d93e2261","Type":"ContainerStarted","Data":"746eabf7443465fbd13da4986746e09e2a434ec139dd0f1a09b06c1bda1623ea"} Apr 24 22:30:19.045096 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.045008 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/0.log" Apr 24 22:30:19.045096 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.045047 2568 generic.go:358] "Generic (PLEG): container finished" podID="d94d4d43-b527-44fc-a469-c293ea0d393c" containerID="75b8320d29c5e8cdb9ece1d3643d11141a749342a79c86b024a638a32ae77e4f" exitCode=255 Apr 24 22:30:19.045236 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.045151 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" event={"ID":"d94d4d43-b527-44fc-a469-c293ea0d393c","Type":"ContainerDied","Data":"75b8320d29c5e8cdb9ece1d3643d11141a749342a79c86b024a638a32ae77e4f"} Apr 24 22:30:19.045401 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.045383 2568 scope.go:117] "RemoveContainer" containerID="75b8320d29c5e8cdb9ece1d3643d11141a749342a79c86b024a638a32ae77e4f" Apr 24 22:30:19.047882 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.046833 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j" event={"ID":"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b","Type":"ContainerStarted","Data":"6903b5448a7d4a28cfd3f2942269744bc50822baae13bc00193dd5503583e20b"} Apr 24 22:30:19.048462 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.048441 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" event={"ID":"de52ac65-738f-4a82-8711-dc6a41133d4a","Type":"ContainerStarted","Data":"799d1ed82c2ffaf81061df1761906a19d8f7de585cd5dedec2f745be110bfa10"} Apr 24 22:30:19.050035 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.050019 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ddhcv" 
event={"ID":"9c268f7d-cce3-435b-bc67-fc4ba3d34b62","Type":"ContainerStarted","Data":"8d370121bef4c49366b0498424e7b35929ed8ec43038438383a33092e5814b6d"} Apr 24 22:30:19.073416 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.072016 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fb556c45-sgwdr" podStartSLOduration=26.941497505 podStartE2EDuration="39.072001485s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.390447803 +0000 UTC m=+35.206773859" lastFinishedPulling="2026-04-24 22:30:18.52095176 +0000 UTC m=+47.337277839" observedRunningTime="2026-04-24 22:30:19.0710127 +0000 UTC m=+47.887338779" watchObservedRunningTime="2026-04-24 22:30:19.072001485 +0000 UTC m=+47.888327562" Apr 24 22:30:19.110736 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.109999 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7" podStartSLOduration=21.651354944 podStartE2EDuration="32.109983135s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.191305885 +0000 UTC m=+35.007631943" lastFinishedPulling="2026-04-24 22:30:16.649934077 +0000 UTC m=+45.466260134" observedRunningTime="2026-04-24 22:30:19.109382443 +0000 UTC m=+47.925708520" watchObservedRunningTime="2026-04-24 22:30:19.109983135 +0000 UTC m=+47.926309220" Apr 24 22:30:19.175142 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.175064 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gz596" podStartSLOduration=20.044897002 podStartE2EDuration="32.17504555s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.38929004 +0000 UTC m=+35.205616095" 
lastFinishedPulling="2026-04-24 22:30:18.519438584 +0000 UTC m=+47.335764643" observedRunningTime="2026-04-24 22:30:19.142222555 +0000 UTC m=+47.958548634" watchObservedRunningTime="2026-04-24 22:30:19.17504555 +0000 UTC m=+47.991371629" Apr 24 22:30:19.221006 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.220333 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j" podStartSLOduration=20.10194485 podStartE2EDuration="32.22031126s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.401055978 +0000 UTC m=+35.217382047" lastFinishedPulling="2026-04-24 22:30:18.519422394 +0000 UTC m=+47.335748457" observedRunningTime="2026-04-24 22:30:19.176107458 +0000 UTC m=+47.992433537" watchObservedRunningTime="2026-04-24 22:30:19.22031126 +0000 UTC m=+48.036637339" Apr 24 22:30:19.255004 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.254950 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b9cb95d68-slldm" podStartSLOduration=27.116239205 podStartE2EDuration="39.25493135s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.399391913 +0000 UTC m=+35.215717971" lastFinishedPulling="2026-04-24 22:30:18.538084058 +0000 UTC m=+47.354410116" observedRunningTime="2026-04-24 22:30:19.254487919 +0000 UTC m=+48.070813995" watchObservedRunningTime="2026-04-24 22:30:19.25493135 +0000 UTC m=+48.071257429" Apr 24 22:30:19.292522 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.291238 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lb8gp" podStartSLOduration=20.159051862 podStartE2EDuration="32.291218758s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.38947062 +0000 UTC 
m=+35.205796676" lastFinishedPulling="2026-04-24 22:30:18.521637504 +0000 UTC m=+47.337963572" observedRunningTime="2026-04-24 22:30:19.290183332 +0000 UTC m=+48.106509411" watchObservedRunningTime="2026-04-24 22:30:19.291218758 +0000 UTC m=+48.107544837" Apr 24 22:30:19.337243 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:19.337182 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-ddhcv" podStartSLOduration=20.2062442 podStartE2EDuration="32.337169612s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.388549216 +0000 UTC m=+35.204875285" lastFinishedPulling="2026-04-24 22:30:18.519474635 +0000 UTC m=+47.335800697" observedRunningTime="2026-04-24 22:30:19.3364589 +0000 UTC m=+48.152784980" watchObservedRunningTime="2026-04-24 22:30:19.337169612 +0000 UTC m=+48.153495681" Apr 24 22:30:20.054732 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.054641 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log" Apr 24 22:30:20.055250 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.055233 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/0.log" Apr 24 22:30:20.055358 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.055272 2568 generic.go:358] "Generic (PLEG): container finished" podID="d94d4d43-b527-44fc-a469-c293ea0d393c" containerID="76ab86624a9c8d43f39fe5fcdb5b9a58038611e34f61f23a6a9fd166cc753717" exitCode=255 Apr 24 22:30:20.056288 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.055484 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" 
event={"ID":"d94d4d43-b527-44fc-a469-c293ea0d393c","Type":"ContainerDied","Data":"76ab86624a9c8d43f39fe5fcdb5b9a58038611e34f61f23a6a9fd166cc753717"} Apr 24 22:30:20.056288 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.055536 2568 scope.go:117] "RemoveContainer" containerID="75b8320d29c5e8cdb9ece1d3643d11141a749342a79c86b024a638a32ae77e4f" Apr 24 22:30:20.057075 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.056962 2568 scope.go:117] "RemoveContainer" containerID="76ab86624a9c8d43f39fe5fcdb5b9a58038611e34f61f23a6a9fd166cc753717" Apr 24 22:30:20.057181 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:20.057156 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rlv6p_openshift-console-operator(d94d4d43-b527-44fc-a469-c293ea0d393c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" podUID="d94d4d43-b527-44fc-a469-c293ea0d393c" Apr 24 22:30:20.791159 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.789824 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll"] Apr 24 22:30:20.798905 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.796807 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" Apr 24 22:30:20.802722 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.802694 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll"] Apr 24 22:30:20.803392 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.803372 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 22:30:20.803637 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.803618 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 22:30:20.804682 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.804650 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-srm98\"" Apr 24 22:30:20.839449 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.839415 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt6r2\" (UniqueName: \"kubernetes.io/projected/7666dac8-f01e-4d50-86a2-4a298cdbb22b-kube-api-access-rt6r2\") pod \"migrator-74bb7799d9-zd4ll\" (UID: \"7666dac8-f01e-4d50-86a2-4a298cdbb22b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" Apr 24 22:30:20.940201 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:20.940162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt6r2\" (UniqueName: \"kubernetes.io/projected/7666dac8-f01e-4d50-86a2-4a298cdbb22b-kube-api-access-rt6r2\") pod \"migrator-74bb7799d9-zd4ll\" (UID: \"7666dac8-f01e-4d50-86a2-4a298cdbb22b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" Apr 24 22:30:20.953155 ip-10-0-136-66 kubenswrapper[2568]: 
I0424 22:30:20.953094 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt6r2\" (UniqueName: \"kubernetes.io/projected/7666dac8-f01e-4d50-86a2-4a298cdbb22b-kube-api-access-rt6r2\") pod \"migrator-74bb7799d9-zd4ll\" (UID: \"7666dac8-f01e-4d50-86a2-4a298cdbb22b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" Apr 24 22:30:21.063589 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.063328 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log" Apr 24 22:30:21.064020 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.063924 2568 scope.go:117] "RemoveContainer" containerID="76ab86624a9c8d43f39fe5fcdb5b9a58038611e34f61f23a6a9fd166cc753717" Apr 24 22:30:21.064386 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.064358 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rlv6p_openshift-console-operator(d94d4d43-b527-44fc-a469-c293ea0d393c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" podUID="d94d4d43-b527-44fc-a469-c293ea0d393c" Apr 24 22:30:21.115616 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.115551 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" Apr 24 22:30:21.278158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.278123 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll"] Apr 24 22:30:21.282223 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:21.282166 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7666dac8_f01e_4d50_86a2_4a298cdbb22b.slice/crio-acef3d9eb77f672c1a4ae5edbeb0f4e34a714f44b70c016d5a1613fd4782cf26 WatchSource:0}: Error finding container acef3d9eb77f672c1a4ae5edbeb0f4e34a714f44b70c016d5a1613fd4782cf26: Status 404 returned error can't find the container with id acef3d9eb77f672c1a4ae5edbeb0f4e34a714f44b70c016d5a1613fd4782cf26 Apr 24 22:30:21.445765 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.445681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:21.445938 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.445833 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:21.445938 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.445853 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5bdcbf4855-czjjp: secret "image-registry-tls" not found Apr 24 22:30:21.445938 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.445927 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls 
podName:40b71b75-d3f5-4c46-82d7-81a92e3e572d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.445908182 +0000 UTC m=+66.262234257 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls") pod "image-registry-5bdcbf4855-czjjp" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d") : secret "image-registry-tls" not found Apr 24 22:30:21.546477 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.546442 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:21.546663 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.546623 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 22:30:21.546749 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.546696 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls podName:760dbb36-9588-4e56-bc23-e7c58e592567 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.546680053 +0000 UTC m=+66.363006113 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-984q2" (UID: "760dbb36-9588-4e56-bc23-e7c58e592567") : secret "samples-operator-tls" not found Apr 24 22:30:21.648010 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.647922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:21.648010 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.648005 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.648062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.648069 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls podName:788467b9-4514-4bb7-88c9-87f727737472 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.648050939 +0000 UTC m=+66.464376996 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rqq25" (UID: "788467b9-4514-4bb7-88c9-87f727737472") : secret "cluster-monitoring-operator-tls" not found Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.648107 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.648138 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.648188 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.648173368 +0000 UTC m=+66.464499432 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : secret "router-metrics-certs-default" not found Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.648204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.648227 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle podName:463655ac-43dd-4bbb-9fc3-a11c952398a9 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.648219758 +0000 UTC m=+66.464545814 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle") pod "router-default-6cf5756446-6zvgg" (UID: "463655ac-43dd-4bbb-9fc3-a11c952398a9") : configmap references non-existent config key: service-ca.crt Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.648263 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:21.648313 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.648299 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert podName:29de487b-ac86-4115-aa5b-af699bbd6649 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:37.648289286 +0000 UTC m=+66.464615343 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zz9fq" (UID: "29de487b-ac86-4115-aa5b-af699bbd6649") : secret "networking-console-plugin-cert" not found Apr 24 22:30:21.749447 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.749369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:21.749621 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.749524 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:21.749621 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.749603 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert podName:8e2721e7-1e03-4664-9c72-58e388853714 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.749585715 +0000 UTC m=+66.565911777 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert") pod "ingress-canary-wsnnl" (UID: "8e2721e7-1e03-4664-9c72-58e388853714") : secret "canary-serving-cert" not found Apr 24 22:30:21.849877 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:21.849843 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:21.850036 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.849995 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:21.850080 ip-10-0-136-66 kubenswrapper[2568]: E0424 22:30:21.850061 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls podName:1832b77e-f856-4269-bb4c-9f4e5c59c722 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:37.850041858 +0000 UTC m=+66.666367918 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls") pod "dns-default-x22nw" (UID: "1832b77e-f856-4269-bb4c-9f4e5c59c722") : secret "dns-default-metrics-tls" not found Apr 24 22:30:22.067362 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.067322 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" event={"ID":"7666dac8-f01e-4d50-86a2-4a298cdbb22b","Type":"ContainerStarted","Data":"acef3d9eb77f672c1a4ae5edbeb0f4e34a714f44b70c016d5a1613fd4782cf26"} Apr 24 22:30:22.732369 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.732327 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v4cv5"] Apr 24 22:30:22.763896 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.763852 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v4cv5"] Apr 24 22:30:22.764090 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.763993 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.766604 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.766560 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 22:30:22.767599 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.767557 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 22:30:22.768267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.767791 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 22:30:22.768267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.767894 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 22:30:22.768267 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.768084 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bdggz\"" Apr 24 22:30:22.859084 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.859049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-signing-key\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.859256 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.859204 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d7tr\" (UniqueName: \"kubernetes.io/projected/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-kube-api-access-4d7tr\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " 
pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.859256 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.859247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-signing-cabundle\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.960503 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.960466 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7tr\" (UniqueName: \"kubernetes.io/projected/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-kube-api-access-4d7tr\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.960682 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.960513 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-signing-cabundle\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.960682 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.960634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-signing-key\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.973190 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.973161 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-signing-cabundle\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.975380 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.975353 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-signing-key\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:22.982655 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:22.982591 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d7tr\" (UniqueName: \"kubernetes.io/projected/ee3c94ed-70b7-4b3d-83eb-2a0034a3c439-kube-api-access-4d7tr\") pod \"service-ca-865cb79987-v4cv5\" (UID: \"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439\") " pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:23.076307 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:23.076268 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-v4cv5" Apr 24 22:30:23.509362 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:23.509336 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wx9wp_e67c0e72-592f-4ae7-a84c-a898562b0176/dns-node-resolver/0.log" Apr 24 22:30:24.367983 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:24.367958 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-v4cv5"] Apr 24 22:30:24.372920 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:24.372898 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee3c94ed_70b7_4b3d_83eb_2a0034a3c439.slice/crio-3d9cc697549a406e46ec798ddb875cb1b89a9ede0b06f5a1b066c88d0b3c6dad WatchSource:0}: Error finding container 3d9cc697549a406e46ec798ddb875cb1b89a9ede0b06f5a1b066c88d0b3c6dad: Status 404 returned error can't find the container with id 3d9cc697549a406e46ec798ddb875cb1b89a9ede0b06f5a1b066c88d0b3c6dad Apr 24 22:30:24.512546 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:24.512522 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qz7zn_07631fd3-43b5-43d2-9831-42159a9806e2/node-ca/0.log" Apr 24 22:30:25.077377 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.077341 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-v4cv5" event={"ID":"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439","Type":"ContainerStarted","Data":"ed0f1f68c2194aabaf154de3c43d8bde42ead6c35e0edf557c5467dd3c1b852c"} Apr 24 22:30:25.077377 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.077378 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-v4cv5" event={"ID":"ee3c94ed-70b7-4b3d-83eb-2a0034a3c439","Type":"ContainerStarted","Data":"3d9cc697549a406e46ec798ddb875cb1b89a9ede0b06f5a1b066c88d0b3c6dad"} Apr 24 
22:30:25.079293 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.079270 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" event={"ID":"de52ac65-738f-4a82-8711-dc6a41133d4a","Type":"ContainerStarted","Data":"53f20260c532319c6fef0f763edb24cd51ca96f3abc0429703989733a9d2dd7a"} Apr 24 22:30:25.079396 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.079299 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" event={"ID":"de52ac65-738f-4a82-8711-dc6a41133d4a","Type":"ContainerStarted","Data":"bc470339e339291e0db9bd451b235b92f2ff6f8077d851127ba54396c3f37ada"} Apr 24 22:30:25.080753 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.080729 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" event={"ID":"7666dac8-f01e-4d50-86a2-4a298cdbb22b","Type":"ContainerStarted","Data":"3d9e738759162c3aa623c4dbb9796155b38af16768a562e9fa747281fe74dab9"} Apr 24 22:30:25.080839 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.080758 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" event={"ID":"7666dac8-f01e-4d50-86a2-4a298cdbb22b","Type":"ContainerStarted","Data":"cfc45eb2505c0d5f02d855aacf6a2d10786aeb2b5abbd8401700dd269ba3ccb9"} Apr 24 22:30:25.081813 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.081785 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-j22h7" event={"ID":"5b342b72-c7fb-4579-947f-7a261031b1a3","Type":"ContainerStarted","Data":"c75fcb262e76914bf0ebec630e364ae92255fc14d4663f757e7b998f015ce72c"} Apr 24 22:30:25.098930 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.098874 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca/service-ca-865cb79987-v4cv5" podStartSLOduration=3.098859266 podStartE2EDuration="3.098859266s" podCreationTimestamp="2026-04-24 22:30:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:25.096423877 +0000 UTC m=+53.912749961" watchObservedRunningTime="2026-04-24 22:30:25.098859266 +0000 UTC m=+53.915185345" Apr 24 22:30:25.107759 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.107735 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zd4ll_7666dac8-f01e-4d50-86a2-4a298cdbb22b/migrator/0.log" Apr 24 22:30:25.118280 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.118246 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-j22h7" podStartSLOduration=22.528218308 podStartE2EDuration="28.118233935s" podCreationTimestamp="2026-04-24 22:29:57 +0000 UTC" firstStartedPulling="2026-04-24 22:30:18.649438773 +0000 UTC m=+47.465764829" lastFinishedPulling="2026-04-24 22:30:24.239454387 +0000 UTC m=+53.055780456" observedRunningTime="2026-04-24 22:30:25.117852935 +0000 UTC m=+53.934179014" watchObservedRunningTime="2026-04-24 22:30:25.118233935 +0000 UTC m=+53.934560013" Apr 24 22:30:25.134636 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.134600 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zd4ll" podStartSLOduration=2.058275021 podStartE2EDuration="5.134590435s" podCreationTimestamp="2026-04-24 22:30:20 +0000 UTC" firstStartedPulling="2026-04-24 22:30:21.284439921 +0000 UTC m=+50.100765977" lastFinishedPulling="2026-04-24 22:30:24.360755325 +0000 UTC m=+53.177081391" observedRunningTime="2026-04-24 22:30:25.13352449 +0000 UTC m=+53.949850569" watchObservedRunningTime="2026-04-24 22:30:25.134590435 +0000 UTC 
m=+53.950916510" Apr 24 22:30:25.157698 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.157660 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" podStartSLOduration=27.334453949 podStartE2EDuration="45.157649636s" podCreationTimestamp="2026-04-24 22:29:40 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.401356558 +0000 UTC m=+35.217682614" lastFinishedPulling="2026-04-24 22:30:24.224552245 +0000 UTC m=+53.040878301" observedRunningTime="2026-04-24 22:30:25.156717301 +0000 UTC m=+53.973043403" watchObservedRunningTime="2026-04-24 22:30:25.157649636 +0000 UTC m=+53.973975714" Apr 24 22:30:25.308147 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.308120 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zd4ll_7666dac8-f01e-4d50-86a2-4a298cdbb22b/graceful-termination/0.log" Apr 24 22:30:25.511982 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:25.511946 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6dhr7_45450237-ea60-4420-a8ca-ce71d93e2261/kube-storage-version-migrator-operator/0.log" Apr 24 22:30:26.012515 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:26.012481 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" Apr 24 22:30:26.012690 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:26.012535 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" Apr 24 22:30:26.012935 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:26.012920 2568 scope.go:117] "RemoveContainer" containerID="76ab86624a9c8d43f39fe5fcdb5b9a58038611e34f61f23a6a9fd166cc753717" Apr 24 22:30:26.013116 ip-10-0-136-66 
kubenswrapper[2568]: E0424 22:30:26.013099 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-rlv6p_openshift-console-operator(d94d4d43-b527-44fc-a469-c293ea0d393c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" podUID="d94d4d43-b527-44fc-a469-c293ea0d393c" Apr 24 22:30:31.923412 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:31.923384 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tbc58" Apr 24 22:30:37.505248 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.505208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:30:37.505665 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.505329 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:37.507872 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.507846 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"image-registry-5bdcbf4855-czjjp\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:37.507994 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.507909 2568 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:37.517986 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.517952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b056a73-ce1c-4e88-9a66-ebfc4498a736-metrics-certs\") pod \"network-metrics-daemon-2wftt\" (UID: \"9b056a73-ce1c-4e88-9a66-ebfc4498a736\") " pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:30:37.605361 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.605328 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2wb7b\"" Apr 24 22:30:37.605742 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.605718 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: \"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:37.605801 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.605781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:30:37.608336 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.608311 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/760dbb36-9588-4e56-bc23-e7c58e592567-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-984q2\" (UID: 
\"760dbb36-9588-4e56-bc23-e7c58e592567\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:37.608448 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.608401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr9m5\" (UniqueName: \"kubernetes.io/projected/5346f0d6-5375-4d8a-9fb6-8f7c3a45720a-kube-api-access-fr9m5\") pod \"network-check-target-lclmx\" (UID: \"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a\") " pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:30:37.612436 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.612423 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wftt" Apr 24 22:30:37.621394 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.621352 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-94q8c\"" Apr 24 22:30:37.628445 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.628424 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:30:37.687331 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.687306 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8jvtj\"" Apr 24 22:30:37.695917 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.695889 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:30:37.706614 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.706348 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:37.706614 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.706415 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:37.706614 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.706454 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:37.706614 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.706529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" Apr 24 22:30:37.708210 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.708187 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/463655ac-43dd-4bbb-9fc3-a11c952398a9-service-ca-bundle\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:37.709593 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.709549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788467b9-4514-4bb7-88c9-87f727737472-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rqq25\" (UID: \"788467b9-4514-4bb7-88c9-87f727737472\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:37.709971 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.709939 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/29de487b-ac86-4115-aa5b-af699bbd6649-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zz9fq\" (UID: \"29de487b-ac86-4115-aa5b-af699bbd6649\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" Apr 24 22:30:37.710699 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.710679 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/463655ac-43dd-4bbb-9fc3-a11c952398a9-metrics-certs\") pod \"router-default-6cf5756446-6zvgg\" (UID: \"463655ac-43dd-4bbb-9fc3-a11c952398a9\") " pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:37.721464 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.721311 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hxwqm\"" Apr 24 22:30:37.729084 ip-10-0-136-66 kubenswrapper[2568]: I0424 
22:30:37.729055 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" Apr 24 22:30:37.749774 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.749611 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2wftt"] Apr 24 22:30:37.753650 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:37.753617 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b056a73_ce1c_4e88_9a66_ebfc4498a736.slice/crio-b278e30683f8371e581dae25c949758398fdf3d85427c4e119f0c83a03f02f50 WatchSource:0}: Error finding container b278e30683f8371e581dae25c949758398fdf3d85427c4e119f0c83a03f02f50: Status 404 returned error can't find the container with id b278e30683f8371e581dae25c949758398fdf3d85427c4e119f0c83a03f02f50 Apr 24 22:30:37.769944 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.769922 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lclmx"] Apr 24 22:30:37.772251 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:37.772207 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5346f0d6_5375_4d8a_9fb6_8f7c3a45720a.slice/crio-a3c64fdfe55a338494012f68168844f8961e1b2a7a6715a1884c69d1cab7f416 WatchSource:0}: Error finding container a3c64fdfe55a338494012f68168844f8961e1b2a7a6715a1884c69d1cab7f416: Status 404 returned error can't find the container with id a3c64fdfe55a338494012f68168844f8961e1b2a7a6715a1884c69d1cab7f416 Apr 24 22:30:37.801252 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.801167 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mtpf2\"" Apr 24 22:30:37.807841 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.807817 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:37.810726 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.810356 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" Apr 24 22:30:37.812164 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.812122 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e2721e7-1e03-4664-9c72-58e388853714-cert\") pod \"ingress-canary-wsnnl\" (UID: \"8e2721e7-1e03-4664-9c72-58e388853714\") " pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:37.825941 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.825915 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-p9zjk\"" Apr 24 22:30:37.834670 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.834641 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6cf5756446-6zvgg" Apr 24 22:30:37.837218 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.837164 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-p6295\"" Apr 24 22:30:37.843864 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.843841 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5bdcbf4855-czjjp"] Apr 24 22:30:37.845834 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.845551 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" Apr 24 22:30:37.861013 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:37.860973 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b71b75_d3f5_4c46_82d7_81a92e3e572d.slice/crio-2133d38938bf60cba40e623c299f8923cc0039123189eeb30466e6be2fdb0990 WatchSource:0}: Error finding container 2133d38938bf60cba40e623c299f8923cc0039123189eeb30466e6be2fdb0990: Status 404 returned error can't find the container with id 2133d38938bf60cba40e623c299f8923cc0039123189eeb30466e6be2fdb0990 Apr 24 22:30:37.884458 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.884280 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2"] Apr 24 22:30:37.912530 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.910830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:37.918504 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.918315 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1832b77e-f856-4269-bb4c-9f4e5c59c722-metrics-tls\") pod \"dns-default-x22nw\" (UID: \"1832b77e-f856-4269-bb4c-9f4e5c59c722\") " pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:37.940340 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.940115 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hjbjm\"" Apr 24 22:30:37.947397 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.947188 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g8k4l\"" Apr 24 22:30:37.948248 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.948223 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x22nw" Apr 24 22:30:37.955591 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.955172 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wsnnl" Apr 24 22:30:37.997077 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:37.996535 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25"] Apr 24 22:30:38.002457 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:38.002423 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod788467b9_4514_4bb7_88c9_87f727737472.slice/crio-b999c9384b174bc657073b4ac79d0c0e9512814a643a604e14a612e25e068950 WatchSource:0}: Error finding container b999c9384b174bc657073b4ac79d0c0e9512814a643a604e14a612e25e068950: Status 404 returned error can't find the container with id b999c9384b174bc657073b4ac79d0c0e9512814a643a604e14a612e25e068950 Apr 24 22:30:38.017982 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.017887 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6cf5756446-6zvgg"] Apr 24 22:30:38.023030 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:38.022760 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod463655ac_43dd_4bbb_9fc3_a11c952398a9.slice/crio-86c396195da3d8b4f37e51bf81c0e54f672430619d03af9d598947041ab2619b WatchSource:0}: Error finding container 86c396195da3d8b4f37e51bf81c0e54f672430619d03af9d598947041ab2619b: Status 404 returned error can't find the container with id 86c396195da3d8b4f37e51bf81c0e54f672430619d03af9d598947041ab2619b Apr 
Apr 24 22:30:38.042084 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.040243 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq"]
Apr 24 22:30:38.045137 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:38.045112 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29de487b_ac86_4115_aa5b_af699bbd6649.slice/crio-532efc398ec7a4a604f3610b999b5fc2be80bc0b367cf8cabc045b78512ef642 WatchSource:0}: Error finding container 532efc398ec7a4a604f3610b999b5fc2be80bc0b367cf8cabc045b78512ef642: Status 404 returned error can't find the container with id 532efc398ec7a4a604f3610b999b5fc2be80bc0b367cf8cabc045b78512ef642
Apr 24 22:30:38.108988 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.108962 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wsnnl"]
Apr 24 22:30:38.116678 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:38.116649 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2721e7_1e03_4664_9c72_58e388853714.slice/crio-45ba941ae756ea94802ff5fbdfd748dc5cb3792c858aaedcd6097587aad8e5c3 WatchSource:0}: Error finding container 45ba941ae756ea94802ff5fbdfd748dc5cb3792c858aaedcd6097587aad8e5c3: Status 404 returned error can't find the container with id 45ba941ae756ea94802ff5fbdfd748dc5cb3792c858aaedcd6097587aad8e5c3
Apr 24 22:30:38.120546 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.120517 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6cf5756446-6zvgg" event={"ID":"463655ac-43dd-4bbb-9fc3-a11c952398a9","Type":"ContainerStarted","Data":"86c396195da3d8b4f37e51bf81c0e54f672430619d03af9d598947041ab2619b"}
Apr 24 22:30:38.122036 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.122011 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" event={"ID":"40b71b75-d3f5-4c46-82d7-81a92e3e572d","Type":"ContainerStarted","Data":"e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563"}
Apr 24 22:30:38.122131 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.122045 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" event={"ID":"40b71b75-d3f5-4c46-82d7-81a92e3e572d","Type":"ContainerStarted","Data":"2133d38938bf60cba40e623c299f8923cc0039123189eeb30466e6be2fdb0990"}
Apr 24 22:30:38.122131 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.122086 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp"
Apr 24 22:30:38.123217 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.123181 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" event={"ID":"788467b9-4514-4bb7-88c9-87f727737472","Type":"ContainerStarted","Data":"b999c9384b174bc657073b4ac79d0c0e9512814a643a604e14a612e25e068950"}
Apr 24 22:30:38.124298 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.124267 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2wftt" event={"ID":"9b056a73-ce1c-4e88-9a66-ebfc4498a736","Type":"ContainerStarted","Data":"b278e30683f8371e581dae25c949758398fdf3d85427c4e119f0c83a03f02f50"}
Apr 24 22:30:38.125466 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.125441 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" event={"ID":"29de487b-ac86-4115-aa5b-af699bbd6649","Type":"ContainerStarted","Data":"532efc398ec7a4a604f3610b999b5fc2be80bc0b367cf8cabc045b78512ef642"}
Apr 24 22:30:38.126805 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.126775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" event={"ID":"760dbb36-9588-4e56-bc23-e7c58e592567","Type":"ContainerStarted","Data":"5e86b2cabaa145fa1eccb851b40f5ffcefd05d68380ece2d77c7171e30b89ede"}
Apr 24 22:30:38.127235 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.127190 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x22nw"]
Apr 24 22:30:38.128250 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.128231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lclmx" event={"ID":"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a","Type":"ContainerStarted","Data":"4a1a1368a96ae4975edf6702cfbe0ae0ba1081965c1a752b99751787becb43e8"}
Apr 24 22:30:38.128313 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.128257 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lclmx" event={"ID":"5346f0d6-5375-4d8a-9fb6-8f7c3a45720a","Type":"ContainerStarted","Data":"a3c64fdfe55a338494012f68168844f8961e1b2a7a6715a1884c69d1cab7f416"}
Apr 24 22:30:38.128427 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.128405 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lclmx"
Apr 24 22:30:38.130475 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:38.130454 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1832b77e_f856_4269_bb4c_9f4e5c59c722.slice/crio-2522d4f17c80d960f1da179c99b6912f5685311b96926ffe35802f6adcb7adb6 WatchSource:0}: Error finding container 2522d4f17c80d960f1da179c99b6912f5685311b96926ffe35802f6adcb7adb6: Status 404 returned error can't find the container with id 2522d4f17c80d960f1da179c99b6912f5685311b96926ffe35802f6adcb7adb6
Apr 24 22:30:38.173421 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:38.173364 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" podStartSLOduration=66.173346967 podStartE2EDuration="1m6.173346967s" podCreationTimestamp="2026-04-24 22:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:38.15238785 +0000 UTC m=+66.968713927" watchObservedRunningTime="2026-04-24 22:30:38.173346967 +0000 UTC m=+66.989673046"
Apr 24 22:30:39.135046 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:39.134296 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6cf5756446-6zvgg" event={"ID":"463655ac-43dd-4bbb-9fc3-a11c952398a9","Type":"ContainerStarted","Data":"3c1f62cd8bacaeec105b105da0549ccfcc397c1f89ab77da34a47f82d488575e"}
Apr 24 22:30:39.137497 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:39.137468 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x22nw" event={"ID":"1832b77e-f856-4269-bb4c-9f4e5c59c722","Type":"ContainerStarted","Data":"2522d4f17c80d960f1da179c99b6912f5685311b96926ffe35802f6adcb7adb6"}
Apr 24 22:30:39.140710 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:39.140683 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wsnnl" event={"ID":"8e2721e7-1e03-4664-9c72-58e388853714","Type":"ContainerStarted","Data":"45ba941ae756ea94802ff5fbdfd748dc5cb3792c858aaedcd6097587aad8e5c3"}
Apr 24 22:30:39.155755 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:39.155683 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lclmx" podStartSLOduration=67.155640971 podStartE2EDuration="1m7.155640971s" podCreationTimestamp="2026-04-24 22:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:38.173085825 +0000 UTC m=+66.989411916" watchObservedRunningTime="2026-04-24 22:30:39.155640971 +0000 UTC m=+67.971967050"
Apr 24 22:30:39.157914 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:39.157725 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6cf5756446-6zvgg" podStartSLOduration=52.157711241 podStartE2EDuration="52.157711241s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:39.154943755 +0000 UTC m=+67.971269834" watchObservedRunningTime="2026-04-24 22:30:39.157711241 +0000 UTC m=+67.974037319"
Apr 24 22:30:39.781103 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:39.781073 2568 scope.go:117] "RemoveContainer" containerID="76ab86624a9c8d43f39fe5fcdb5b9a58038611e34f61f23a6a9fd166cc753717"
Apr 24 22:30:39.835518 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:39.835487 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:39.838262 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:39.838242 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:40.143286 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:40.143187 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:40.144611 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:40.144584 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6cf5756446-6zvgg"
Apr 24 22:30:42.005597 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.005552 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cr5q4"]
Apr 24 22:30:42.025703 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.025664 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.027960 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.027935 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 22:30:42.027960 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.027938 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 22:30:42.028158 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.027969 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ksx55\""
Apr 24 22:30:42.050297 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.050268 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqbjm\" (UniqueName: \"kubernetes.io/projected/a50379b2-619d-4cdd-9cb9-259c5a1cde61-kube-api-access-lqbjm\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.050453 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.050326 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a50379b2-619d-4cdd-9cb9-259c5a1cde61-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.050453 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.050393 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a50379b2-619d-4cdd-9cb9-259c5a1cde61-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.050592 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.050476 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a50379b2-619d-4cdd-9cb9-259c5a1cde61-crio-socket\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.050592 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.050511 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a50379b2-619d-4cdd-9cb9-259c5a1cde61-data-volume\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.057063 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.057034 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cr5q4"]
Apr 24 22:30:42.151391 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.151348 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a50379b2-619d-4cdd-9cb9-259c5a1cde61-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.151559 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.151410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a50379b2-619d-4cdd-9cb9-259c5a1cde61-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.151559 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.151548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a50379b2-619d-4cdd-9cb9-259c5a1cde61-crio-socket\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.151760 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.151603 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a50379b2-619d-4cdd-9cb9-259c5a1cde61-data-volume\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.151760 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.151620 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqbjm\" (UniqueName: \"kubernetes.io/projected/a50379b2-619d-4cdd-9cb9-259c5a1cde61-kube-api-access-lqbjm\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.151945 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.151925 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a50379b2-619d-4cdd-9cb9-259c5a1cde61-crio-socket\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.163622 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.163595 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a50379b2-619d-4cdd-9cb9-259c5a1cde61-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.163750 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.163592 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a50379b2-619d-4cdd-9cb9-259c5a1cde61-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.163750 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.163681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a50379b2-619d-4cdd-9cb9-259c5a1cde61-data-volume\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.186401 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.186336 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqbjm\" (UniqueName: \"kubernetes.io/projected/a50379b2-619d-4cdd-9cb9-259c5a1cde61-kube-api-access-lqbjm\") pod \"insights-runtime-extractor-cr5q4\" (UID: \"a50379b2-619d-4cdd-9cb9-259c5a1cde61\") " pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.335257 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.335221 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cr5q4"
Apr 24 22:30:42.745486 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:42.745442 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cr5q4"]
Apr 24 22:30:43.156772 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.156730 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" event={"ID":"788467b9-4514-4bb7-88c9-87f727737472","Type":"ContainerStarted","Data":"c57a9771cb0b0b51835e1c5b2286523ee78ddc9fefe84418b42c16042be57421"}
Apr 24 22:30:43.160461 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.160395 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2wftt" event={"ID":"9b056a73-ce1c-4e88-9a66-ebfc4498a736","Type":"ContainerStarted","Data":"7c560f4c09a55d8ab0d2a31ca1226e12cc676235d1fe6a983202452c2437d311"}
Apr 24 22:30:43.160461 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.160430 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2wftt" event={"ID":"9b056a73-ce1c-4e88-9a66-ebfc4498a736","Type":"ContainerStarted","Data":"8557ad4661c9727b2cd2ce08f7e042d6772605f0283d16ed4fba47530cd04ca8"}
Apr 24 22:30:43.162144 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.162118 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" event={"ID":"29de487b-ac86-4115-aa5b-af699bbd6649","Type":"ContainerStarted","Data":"156e001ea58866dcc98ed6ee5b01a6eba0fa2a419ab2d726c516bd965de4094b"}
Apr 24 22:30:43.164060 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.164042 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" event={"ID":"760dbb36-9588-4e56-bc23-e7c58e592567","Type":"ContainerStarted","Data":"02b1cd391b83f88c0123933daf6febc45a412c9ba75e586bdd4291f84f66ddd0"}
Apr 24 22:30:43.164161 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.164151 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" event={"ID":"760dbb36-9588-4e56-bc23-e7c58e592567","Type":"ContainerStarted","Data":"b6c3fac2e4b27890015ed6719a4c9c1e8cd9fbdb05e11e36507a88892efce71f"}
Apr 24 22:30:43.165495 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.165475 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:30:43.165591 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.165542 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" event={"ID":"d94d4d43-b527-44fc-a469-c293ea0d393c","Type":"ContainerStarted","Data":"98981030e9fba201184b752cdc1bc30ba2d33cbbe7072a39fc65e33a9d931ca7"}
Apr 24 22:30:43.166078 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.166032 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:43.167846 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.167753 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x22nw" event={"ID":"1832b77e-f856-4269-bb4c-9f4e5c59c722","Type":"ContainerStarted","Data":"b67e475b6fd8e7fb09671842fbc878f14b07c4122202bcee2f811379a17d808a"}
Apr 24 22:30:43.167846 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.167780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x22nw" event={"ID":"1832b77e-f856-4269-bb4c-9f4e5c59c722","Type":"ContainerStarted","Data":"69894a5369dc35fff7377f531d814505c3d2e112a2384a3c2ce9a8242f9cffef"}
Apr 24 22:30:43.167846 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.167823 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x22nw"
Apr 24 22:30:43.169452 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.169423 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wsnnl" event={"ID":"8e2721e7-1e03-4664-9c72-58e388853714","Type":"ContainerStarted","Data":"eb06c02505e32455ad07edca414802afeaae05193c6eddc3f00ca8b332d0e344"}
Apr 24 22:30:43.170939 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.170906 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cr5q4" event={"ID":"a50379b2-619d-4cdd-9cb9-259c5a1cde61","Type":"ContainerStarted","Data":"67f5699381dea2c31853d0a7adfab331404635524377b3666b3d6d46d192ea51"}
Apr 24 22:30:43.170939 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.170928 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cr5q4" event={"ID":"a50379b2-619d-4cdd-9cb9-259c5a1cde61","Type":"ContainerStarted","Data":"3b363890557fc125d5ecf77a30b74f89e7fa43f9e67fc3474272102218b081ab"}
Apr 24 22:30:43.188209 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.187954 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rqq25" podStartSLOduration=51.625859235 podStartE2EDuration="56.187937116s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:38.006091037 +0000 UTC m=+66.822417094" lastFinishedPulling="2026-04-24 22:30:42.568168904 +0000 UTC m=+71.384494975" observedRunningTime="2026-04-24 22:30:43.186961643 +0000 UTC m=+72.003287721" watchObservedRunningTime="2026-04-24 22:30:43.187937116 +0000 UTC m=+72.004263195"
Apr 24 22:30:43.234727 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.234674 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2wftt" podStartSLOduration=67.430864343 podStartE2EDuration="1m12.234657894s" podCreationTimestamp="2026-04-24 22:29:31 +0000 UTC" firstStartedPulling="2026-04-24 22:30:37.756325446 +0000 UTC m=+66.572651503" lastFinishedPulling="2026-04-24 22:30:42.560118983 +0000 UTC m=+71.376445054" observedRunningTime="2026-04-24 22:30:43.234134385 +0000 UTC m=+72.050460475" watchObservedRunningTime="2026-04-24 22:30:43.234657894 +0000 UTC m=+72.050983971"
Apr 24 22:30:43.312388 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.310899 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x22nw" podStartSLOduration=33.873520897 podStartE2EDuration="38.310877521s" podCreationTimestamp="2026-04-24 22:30:05 +0000 UTC" firstStartedPulling="2026-04-24 22:30:38.132249313 +0000 UTC m=+66.948575369" lastFinishedPulling="2026-04-24 22:30:42.569605923 +0000 UTC m=+71.385931993" observedRunningTime="2026-04-24 22:30:43.264232364 +0000 UTC m=+72.080558444" watchObservedRunningTime="2026-04-24 22:30:43.310877521 +0000 UTC m=+72.127203600"
Apr 24 22:30:43.312388 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.312248 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p" podStartSLOduration=44.291039796 podStartE2EDuration="56.312235211s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:06.390908265 +0000 UTC m=+35.207234320" lastFinishedPulling="2026-04-24 22:30:18.412103665 +0000 UTC m=+47.228429735" observedRunningTime="2026-04-24 22:30:43.310364542 +0000 UTC m=+72.126690621" watchObservedRunningTime="2026-04-24 22:30:43.312235211 +0000 UTC m=+72.128561290"
Apr 24 22:30:43.349557 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.349491 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wsnnl" podStartSLOduration=33.769081818 podStartE2EDuration="38.349474933s" podCreationTimestamp="2026-04-24 22:30:05 +0000 UTC" firstStartedPulling="2026-04-24 22:30:38.118755677 +0000 UTC m=+66.935081734" lastFinishedPulling="2026-04-24 22:30:42.699148789 +0000 UTC m=+71.515474849" observedRunningTime="2026-04-24 22:30:43.346935033 +0000 UTC m=+72.163261113" watchObservedRunningTime="2026-04-24 22:30:43.349474933 +0000 UTC m=+72.165801083"
Apr 24 22:30:43.378825 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.378770 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-984q2" podStartSLOduration=51.79354811 podStartE2EDuration="56.378752409s" podCreationTimestamp="2026-04-24 22:29:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:37.984008537 +0000 UTC m=+66.800334593" lastFinishedPulling="2026-04-24 22:30:42.569212835 +0000 UTC m=+71.385538892" observedRunningTime="2026-04-24 22:30:43.378286969 +0000 UTC m=+72.194613048" watchObservedRunningTime="2026-04-24 22:30:43.378752409 +0000 UTC m=+72.195078488"
Apr 24 22:30:43.424445 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.424385 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zz9fq" podStartSLOduration=44.903269123 podStartE2EDuration="49.42436403s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:30:38.04711117 +0000 UTC m=+66.863437226" lastFinishedPulling="2026-04-24 22:30:42.568206069 +0000 UTC m=+71.384532133" observedRunningTime="2026-04-24 22:30:43.421784355 +0000 UTC m=+72.238110433" watchObservedRunningTime="2026-04-24 22:30:43.42436403 +0000 UTC m=+72.240690110"
Apr 24 22:30:43.753461 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:43.753429 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-rlv6p"
Apr 24 22:30:44.177017 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:44.176925 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cr5q4" event={"ID":"a50379b2-619d-4cdd-9cb9-259c5a1cde61","Type":"ContainerStarted","Data":"07b3337f0bc66a2e9d4d49cb76ce4fd51b719c7e56a9c08e0bdf4ddd84c1ce59"}
Apr 24 22:30:46.184576 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:46.184518 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cr5q4" event={"ID":"a50379b2-619d-4cdd-9cb9-259c5a1cde61","Type":"ContainerStarted","Data":"a746be9a06141f50e04976ccc5f9e7275be3c5dfef6852d2b92761358c5e4b9d"}
Apr 24 22:30:46.211523 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:46.211473 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cr5q4" podStartSLOduration=2.349370369 podStartE2EDuration="5.211456972s" podCreationTimestamp="2026-04-24 22:30:41 +0000 UTC" firstStartedPulling="2026-04-24 22:30:42.886643662 +0000 UTC m=+71.702969722" lastFinishedPulling="2026-04-24 22:30:45.748730267 +0000 UTC m=+74.565056325" observedRunningTime="2026-04-24 22:30:46.211099575 +0000 UTC m=+75.027425653" watchObservedRunningTime="2026-04-24 22:30:46.211456972 +0000 UTC m=+75.027783070"
Apr 24 22:30:53.180376 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:53.180336 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x22nw"
Apr 24 22:30:53.905789 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:53.905760 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-s4959"]
Apr 24 22:30:53.945343 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:53.945311 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:53.949218 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:53.949193 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 22:30:53.953646 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:53.951898 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 22:30:53.956678 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:53.956513 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fcsn5\""
Apr 24 22:30:53.956775 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:53.956757 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 22:30:53.975388 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:53.975368 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 22:30:54.027707 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027674 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hsh4\" (UniqueName: \"kubernetes.io/projected/bbc35a5f-594f-4186-9264-6ccd3b19d334-kube-api-access-4hsh4\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.027707 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027710 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-accelerators-collector-config\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.027896 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-tls\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.027896 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027786 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-root\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.027896 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027811 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-sys\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.027896 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-wtmp\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.027896 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-textfile\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.027896 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027864 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbc35a5f-594f-4186-9264-6ccd3b19d334-metrics-client-ca\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.027896 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.027888 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128593 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hsh4\" (UniqueName: \"kubernetes.io/projected/bbc35a5f-594f-4186-9264-6ccd3b19d334-kube-api-access-4hsh4\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128593 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128587 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-accelerators-collector-config\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-tls\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-root\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128658 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-sys\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128672 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-wtmp\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-textfile\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbc35a5f-594f-4186-9264-6ccd3b19d334-metrics-client-ca\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.128809 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128763 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-sys\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.129116 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128805 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-root\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.129116 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-wtmp\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.129116 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.128919 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.129224 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.129131 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-textfile\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.129290 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.129271 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-accelerators-collector-config\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.139344 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.139322 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-tls\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.139344 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.139338 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbc35a5f-594f-4186-9264-6ccd3b19d334-metrics-client-ca\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959"
Apr 24 22:30:54.141048 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.141025 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName:
\"kubernetes.io/secret/bbc35a5f-594f-4186-9264-6ccd3b19d334-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959" Apr 24 22:30:54.170611 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.170505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hsh4\" (UniqueName: \"kubernetes.io/projected/bbc35a5f-594f-4186-9264-6ccd3b19d334-kube-api-access-4hsh4\") pod \"node-exporter-s4959\" (UID: \"bbc35a5f-594f-4186-9264-6ccd3b19d334\") " pod="openshift-monitoring/node-exporter-s4959" Apr 24 22:30:54.259336 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:54.259300 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s4959" Apr 24 22:30:54.268772 ip-10-0-136-66 kubenswrapper[2568]: W0424 22:30:54.268744 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbc35a5f_594f_4186_9264_6ccd3b19d334.slice/crio-1d79ae2d26ca27ca997e629c6ab36fe461a3594766dbc6c1de5056699dd789bb WatchSource:0}: Error finding container 1d79ae2d26ca27ca997e629c6ab36fe461a3594766dbc6c1de5056699dd789bb: Status 404 returned error can't find the container with id 1d79ae2d26ca27ca997e629c6ab36fe461a3594766dbc6c1de5056699dd789bb Apr 24 22:30:55.207883 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:55.207850 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s4959" event={"ID":"bbc35a5f-594f-4186-9264-6ccd3b19d334","Type":"ContainerStarted","Data":"1d79ae2d26ca27ca997e629c6ab36fe461a3594766dbc6c1de5056699dd789bb"} Apr 24 22:30:56.212282 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:56.212247 2568 generic.go:358] "Generic (PLEG): container finished" podID="bbc35a5f-594f-4186-9264-6ccd3b19d334" containerID="b7039ef387a8b104355ba7540a6487cf5dbff35be179a39e7211331394cac389" 
exitCode=0 Apr 24 22:30:56.212731 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:56.212339 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s4959" event={"ID":"bbc35a5f-594f-4186-9264-6ccd3b19d334","Type":"ContainerDied","Data":"b7039ef387a8b104355ba7540a6487cf5dbff35be179a39e7211331394cac389"} Apr 24 22:30:57.217401 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:57.217363 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s4959" event={"ID":"bbc35a5f-594f-4186-9264-6ccd3b19d334","Type":"ContainerStarted","Data":"4da71e069f272aabe45a45674bc39e92af7913a777150b8b11bf5071dc0e8eb5"} Apr 24 22:30:57.217852 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:57.217408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s4959" event={"ID":"bbc35a5f-594f-4186-9264-6ccd3b19d334","Type":"ContainerStarted","Data":"62e04c7310429160a30bfd1b844f5420b98c70e0e72f2f52a671ca4a8447d455"} Apr 24 22:30:57.239016 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:57.238968 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-s4959" podStartSLOduration=2.987195798 podStartE2EDuration="4.238953629s" podCreationTimestamp="2026-04-24 22:30:53 +0000 UTC" firstStartedPulling="2026-04-24 22:30:54.270384737 +0000 UTC m=+83.086710793" lastFinishedPulling="2026-04-24 22:30:55.522142565 +0000 UTC m=+84.338468624" observedRunningTime="2026-04-24 22:30:57.238273121 +0000 UTC m=+86.054599201" watchObservedRunningTime="2026-04-24 22:30:57.238953629 +0000 UTC m=+86.055279706" Apr 24 22:30:57.700933 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:57.700892 2568 patch_prober.go:28] interesting pod/image-registry-5bdcbf4855-czjjp container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" 
start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 22:30:57.701097 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:57.700960 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" podUID="40b71b75-d3f5-4c46-82d7-81a92e3e572d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:30:59.145664 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:30:59.145634 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:31:04.292866 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:04.292827 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5bdcbf4855-czjjp"] Apr 24 22:31:09.143145 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:09.143112 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lclmx" Apr 24 22:31:29.313120 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.313083 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" podUID="40b71b75-d3f5-4c46-82d7-81a92e3e572d" containerName="registry" containerID="cri-o://e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563" gracePeriod=30 Apr 24 22:31:29.553612 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.553588 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:31:29.596300 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596226 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn5m9\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-kube-api-access-bn5m9\") pod \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " Apr 24 22:31:29.596300 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596269 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-certificates\") pod \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " Apr 24 22:31:29.596300 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596299 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-installation-pull-secrets\") pod \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " Apr 24 22:31:29.596538 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596328 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40b71b75-d3f5-4c46-82d7-81a92e3e572d-ca-trust-extracted\") pod \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " Apr 24 22:31:29.596538 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596367 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-bound-sa-token\") pod \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\" (UID: 
\"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " Apr 24 22:31:29.596538 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596394 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") pod \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " Apr 24 22:31:29.596538 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596420 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-image-registry-private-configuration\") pod \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " Apr 24 22:31:29.596538 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596446 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-trusted-ca\") pod \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\" (UID: \"40b71b75-d3f5-4c46-82d7-81a92e3e572d\") " Apr 24 22:31:29.596865 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596786 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "40b71b75-d3f5-4c46-82d7-81a92e3e572d" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:29.597018 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.596992 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "40b71b75-d3f5-4c46-82d7-81a92e3e572d" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:29.598990 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.598945 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-kube-api-access-bn5m9" (OuterVolumeSpecName: "kube-api-access-bn5m9") pod "40b71b75-d3f5-4c46-82d7-81a92e3e572d" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d"). InnerVolumeSpecName "kube-api-access-bn5m9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:29.599210 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.599181 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "40b71b75-d3f5-4c46-82d7-81a92e3e572d" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:29.599507 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.599473 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "40b71b75-d3f5-4c46-82d7-81a92e3e572d" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:29.599687 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.599663 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "40b71b75-d3f5-4c46-82d7-81a92e3e572d" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:29.599687 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.599668 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "40b71b75-d3f5-4c46-82d7-81a92e3e572d" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:29.605050 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.605023 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b71b75-d3f5-4c46-82d7-81a92e3e572d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "40b71b75-d3f5-4c46-82d7-81a92e3e572d" (UID: "40b71b75-d3f5-4c46-82d7-81a92e3e572d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:31:29.697290 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.697241 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40b71b75-d3f5-4c46-82d7-81a92e3e572d-ca-trust-extracted\") on node \"ip-10-0-136-66.ec2.internal\" DevicePath \"\"" Apr 24 22:31:29.697290 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.697285 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-bound-sa-token\") on node \"ip-10-0-136-66.ec2.internal\" DevicePath \"\"" Apr 24 22:31:29.697290 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.697296 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-tls\") on node \"ip-10-0-136-66.ec2.internal\" DevicePath \"\"" Apr 24 22:31:29.697290 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.697307 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-image-registry-private-configuration\") on node \"ip-10-0-136-66.ec2.internal\" DevicePath \"\"" Apr 24 22:31:29.697545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.697317 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-trusted-ca\") on node \"ip-10-0-136-66.ec2.internal\" DevicePath \"\"" Apr 24 22:31:29.697545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.697325 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bn5m9\" (UniqueName: \"kubernetes.io/projected/40b71b75-d3f5-4c46-82d7-81a92e3e572d-kube-api-access-bn5m9\") on node \"ip-10-0-136-66.ec2.internal\" DevicePath \"\"" Apr 24 22:31:29.697545 
ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.697335 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40b71b75-d3f5-4c46-82d7-81a92e3e572d-registry-certificates\") on node \"ip-10-0-136-66.ec2.internal\" DevicePath \"\"" Apr 24 22:31:29.697545 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:29.697344 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40b71b75-d3f5-4c46-82d7-81a92e3e572d-installation-pull-secrets\") on node \"ip-10-0-136-66.ec2.internal\" DevicePath \"\"" Apr 24 22:31:30.314904 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.314868 2568 generic.go:358] "Generic (PLEG): container finished" podID="45450237-ea60-4420-a8ca-ce71d93e2261" containerID="746eabf7443465fbd13da4986746e09e2a434ec139dd0f1a09b06c1bda1623ea" exitCode=0 Apr 24 22:31:30.315360 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.314962 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7" event={"ID":"45450237-ea60-4420-a8ca-ce71d93e2261","Type":"ContainerDied","Data":"746eabf7443465fbd13da4986746e09e2a434ec139dd0f1a09b06c1bda1623ea"} Apr 24 22:31:30.315501 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.315471 2568 scope.go:117] "RemoveContainer" containerID="746eabf7443465fbd13da4986746e09e2a434ec139dd0f1a09b06c1bda1623ea" Apr 24 22:31:30.316738 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.316365 2568 generic.go:358] "Generic (PLEG): container finished" podID="aa9e4a3b-8580-4463-a5cd-6aa7ba82738b" containerID="6903b5448a7d4a28cfd3f2942269744bc50822baae13bc00193dd5503583e20b" exitCode=0 Apr 24 22:31:30.316738 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.316441 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j" 
event={"ID":"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b","Type":"ContainerDied","Data":"6903b5448a7d4a28cfd3f2942269744bc50822baae13bc00193dd5503583e20b"} Apr 24 22:31:30.316843 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.316765 2568 scope.go:117] "RemoveContainer" containerID="6903b5448a7d4a28cfd3f2942269744bc50822baae13bc00193dd5503583e20b" Apr 24 22:31:30.317502 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.317481 2568 generic.go:358] "Generic (PLEG): container finished" podID="40b71b75-d3f5-4c46-82d7-81a92e3e572d" containerID="e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563" exitCode=0 Apr 24 22:31:30.317600 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.317535 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" Apr 24 22:31:30.317600 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.317545 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" event={"ID":"40b71b75-d3f5-4c46-82d7-81a92e3e572d","Type":"ContainerDied","Data":"e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563"} Apr 24 22:31:30.317672 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.317602 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5bdcbf4855-czjjp" event={"ID":"40b71b75-d3f5-4c46-82d7-81a92e3e572d","Type":"ContainerDied","Data":"2133d38938bf60cba40e623c299f8923cc0039123189eeb30466e6be2fdb0990"} Apr 24 22:31:30.317672 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.317621 2568 scope.go:117] "RemoveContainer" containerID="e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563" Apr 24 22:31:30.328471 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.328450 2568 scope.go:117] "RemoveContainer" containerID="e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563" Apr 24 22:31:30.328778 ip-10-0-136-66 kubenswrapper[2568]: 
E0424 22:31:30.328756 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563\": container with ID starting with e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563 not found: ID does not exist" containerID="e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563" Apr 24 22:31:30.328858 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.328784 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563"} err="failed to get container status \"e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563\": rpc error: code = NotFound desc = could not find container \"e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563\": container with ID starting with e8081fe53f579e7851d421c170c9fc97005cc1a99b0bb5d99bf008c779200563 not found: ID does not exist" Apr 24 22:31:30.345132 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.345108 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5bdcbf4855-czjjp"] Apr 24 22:31:30.349790 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:30.349767 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5bdcbf4855-czjjp"] Apr 24 22:31:31.321825 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:31.321780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6dhr7" event={"ID":"45450237-ea60-4420-a8ca-ce71d93e2261","Type":"ContainerStarted","Data":"c7d86ac267b7dfc90c561ef649eca8ebb0151014b704a2a641d1a5af8e9ae371"} Apr 24 22:31:31.323403 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:31.323378 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2wr7j" event={"ID":"aa9e4a3b-8580-4463-a5cd-6aa7ba82738b","Type":"ContainerStarted","Data":"e89b3e63a2bb4c1d9ab57ec50a15684d765a2d29ab51aafeb6bbee49c4081e24"} Apr 24 22:31:31.789296 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:31.789241 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b71b75-d3f5-4c46-82d7-81a92e3e572d" path="/var/lib/kubelet/pods/40b71b75-d3f5-4c46-82d7-81a92e3e572d/volumes" Apr 24 22:31:34.334922 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:34.334895 2568 generic.go:358] "Generic (PLEG): container finished" podID="9c268f7d-cce3-435b-bc67-fc4ba3d34b62" containerID="8d370121bef4c49366b0498424e7b35929ed8ec43038438383a33092e5814b6d" exitCode=0 Apr 24 22:31:34.335197 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:34.334974 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ddhcv" event={"ID":"9c268f7d-cce3-435b-bc67-fc4ba3d34b62","Type":"ContainerDied","Data":"8d370121bef4c49366b0498424e7b35929ed8ec43038438383a33092e5814b6d"} Apr 24 22:31:34.335347 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:34.335327 2568 scope.go:117] "RemoveContainer" containerID="8d370121bef4c49366b0498424e7b35929ed8ec43038438383a33092e5814b6d" Apr 24 22:31:35.339354 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:35.339313 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ddhcv" event={"ID":"9c268f7d-cce3-435b-bc67-fc4ba3d34b62","Type":"ContainerStarted","Data":"69ced8d4029b3c0da14294ed053ac78c2ac5c077672755059e5f57867576eedc"} Apr 24 22:31:56.116172 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:31:56.116129 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" podUID="de52ac65-738f-4a82-8711-dc6a41133d4a" containerName="service-proxy" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Apr 24 22:32:06.115907 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:32:06.115866 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" podUID="de52ac65-738f-4a82-8711-dc6a41133d4a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:32:16.116524 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:32:16.116438 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" podUID="de52ac65-738f-4a82-8711-dc6a41133d4a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:32:16.116524 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:32:16.116517 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" Apr 24 22:32:16.117040 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:32:16.117023 2568 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"53f20260c532319c6fef0f763edb24cd51ca96f3abc0429703989733a9d2dd7a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 22:32:16.117080 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:32:16.117060 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" podUID="de52ac65-738f-4a82-8711-dc6a41133d4a" containerName="service-proxy" containerID="cri-o://53f20260c532319c6fef0f763edb24cd51ca96f3abc0429703989733a9d2dd7a" gracePeriod=30 Apr 24 22:32:16.465103 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:32:16.465013 2568 generic.go:358] "Generic (PLEG): 
container finished" podID="de52ac65-738f-4a82-8711-dc6a41133d4a" containerID="53f20260c532319c6fef0f763edb24cd51ca96f3abc0429703989733a9d2dd7a" exitCode=2 Apr 24 22:32:16.465103 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:32:16.465090 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" event={"ID":"de52ac65-738f-4a82-8711-dc6a41133d4a","Type":"ContainerDied","Data":"53f20260c532319c6fef0f763edb24cd51ca96f3abc0429703989733a9d2dd7a"} Apr 24 22:32:16.465283 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:32:16.465130 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68f7c77688-tn2fw" event={"ID":"de52ac65-738f-4a82-8711-dc6a41133d4a","Type":"ContainerStarted","Data":"f676b234eed13cb1ead0c528dfd5d0b9abcff1deb1d733fdcc9ed085003fe3b0"} Apr 24 22:34:31.708046 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:34:31.708017 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log" Apr 24 22:34:31.713770 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:34:31.713744 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log" Apr 24 22:34:31.713941 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:34:31.713802 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log" Apr 24 22:34:31.719716 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:34:31.719691 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:34:31.720254 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:34:31.720236 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:39:31.731006 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:39:31.730979 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:39:31.735280 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:39:31.735253 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:39:31.736291 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:39:31.736273 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:39:31.740917 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:39:31.740892 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:44:31.753147 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:44:31.753117 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:44:31.755050 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:44:31.755020 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:44:31.757948 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:44:31.757927 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:44:31.759941 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:44:31.759920 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:49:31.774556 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:49:31.774529 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:49:31.775040 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:49:31.774814 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:49:31.780275 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:49:31.780255 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:49:31.780469 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:49:31.780455 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:54:31.796119 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:54:31.796088 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:54:31.796632 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:54:31.796288 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:54:31.801011 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:54:31.800991 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:54:31.801203 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:54:31.801188 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:59:31.818466 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:59:31.818433 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:59:31.818979 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:59:31.818854 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 22:59:31.824146 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:59:31.824125 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 22:59:31.824559 ip-10-0-136-66 kubenswrapper[2568]: I0424 22:59:31.824534 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:04:31.838918 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:04:31.838888 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:04:31.840715 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:04:31.840692 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:04:31.844097 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:04:31.844076 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:04:31.845776 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:04:31.845757 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:09:31.858871 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:09:31.858839 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:09:31.861741 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:09:31.861719 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:09:31.864266 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:09:31.864246 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:09:31.867156 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:09:31.867139 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:14:31.879279 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:14:31.879250 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:14:31.885776 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:14:31.885754 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:14:31.887193 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:14:31.887171 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:14:31.890937 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:14:31.890919 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:19:31.902353 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:19:31.902322 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:19:31.906162 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:19:31.906140 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:19:31.907422 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:19:31.907398 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:19:31.911148 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:19:31.911132 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:24:31.922612 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:24:31.922537 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:24:31.928067 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:24:31.928042 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:24:31.928448 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:24:31.928430 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:24:31.933548 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:24:31.933524 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:29:31.942211 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:29:31.942170 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:29:31.948168 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:29:31.948143 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:29:31.948740 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:29:31.948720 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:29:31.953606 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:29:31.953590 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:33:37.030323 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:37.030294 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-j22h7_5b342b72-c7fb-4579-947f-7a261031b1a3/global-pull-secret-syncer/0.log"
Apr 24 23:33:37.032417 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:37.032397 2568 ???:1] "http: TLS handshake error from 10.0.136.66:36072: EOF"
Apr 24 23:33:37.238755 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:37.238723 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-js2gj_77dbb1fd-4a06-4c52-8ec4-35b9d8f89a3c/konnectivity-agent/0.log"
Apr 24 23:33:37.355464 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:37.355390 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-66.ec2.internal_2f140c6571039b5778eaeb104c6d62fa/haproxy/0.log"
Apr 24 23:33:40.558158 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:40.558129 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-rqq25_788467b9-4514-4bb7-88c9-87f727737472/cluster-monitoring-operator/0.log"
Apr 24 23:33:40.879697 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:40.879625 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s4959_bbc35a5f-594f-4186-9264-6ccd3b19d334/node-exporter/0.log"
Apr 24 23:33:40.901501 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:40.901477 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s4959_bbc35a5f-594f-4186-9264-6ccd3b19d334/kube-rbac-proxy/0.log"
Apr 24 23:33:40.922585 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:40.922544 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s4959_bbc35a5f-594f-4186-9264-6ccd3b19d334/init-textfile/0.log"
Apr 24 23:33:42.655721 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:42.655692 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-zz9fq_29de487b-ac86-4115-aa5b-af699bbd6649/networking-console-plugin/0.log"
Apr 24 23:33:43.127104 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:43.127024 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/1.log"
Apr 24 23:33:43.131664 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:43.131642 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-rlv6p_d94d4d43-b527-44fc-a469-c293ea0d393c/console-operator/2.log"
Apr 24 23:33:43.929409 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:43.929380 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-gz596_e2e30cff-84a0-405d-bdd7-3d9f61b16917/volume-data-source-validator/0.log"
Apr 24 23:33:44.115377 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.115345 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"]
Apr 24 23:33:44.115680 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.115668 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40b71b75-d3f5-4c46-82d7-81a92e3e572d" containerName="registry"
Apr 24 23:33:44.115735 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.115682 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b71b75-d3f5-4c46-82d7-81a92e3e572d" containerName="registry"
Apr 24 23:33:44.115770 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.115740 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="40b71b75-d3f5-4c46-82d7-81a92e3e572d" containerName="registry"
Apr 24 23:33:44.118646 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.118630 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.120690 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.120674 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rmwdq\"/\"kube-root-ca.crt\""
Apr 24 23:33:44.121389 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.121369 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rmwdq\"/\"openshift-service-ca.crt\""
Apr 24 23:33:44.121521 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.121408 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rmwdq\"/\"default-dockercfg-z2mgt\""
Apr 24 23:33:44.123300 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.123279 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"]
Apr 24 23:33:44.207936 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.207847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-podres\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.207936 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.207889 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-proc\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.208121 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.207958 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-lib-modules\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.208121 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.207996 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-sys\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.208121 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.208047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csbzq\" (UniqueName: \"kubernetes.io/projected/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-kube-api-access-csbzq\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309159 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-lib-modules\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309159 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309165 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-sys\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309342 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309193 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csbzq\" (UniqueName: \"kubernetes.io/projected/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-kube-api-access-csbzq\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309342 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309217 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-podres\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309342 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309239 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-proc\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309342 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309258 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-sys\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309342 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309298 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-lib-modules\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309342 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309329 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-proc\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.309541 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.309384 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-podres\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.316756 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.316725 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csbzq\" (UniqueName: \"kubernetes.io/projected/ff14f5ad-73a3-4ad8-b080-a1916c5deabf-kube-api-access-csbzq\") pod \"perf-node-gather-daemonset-6rrl7\" (UID: \"ff14f5ad-73a3-4ad8-b080-a1916c5deabf\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.430020 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.429983 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:44.547723 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.547527 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"]
Apr 24 23:33:44.550709 ip-10-0-136-66 kubenswrapper[2568]: W0424 23:33:44.550681 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podff14f5ad_73a3_4ad8_b080_a1916c5deabf.slice/crio-829b88b28c19a00251e9dd70eca84170bd5b041a2809b2d49e96545e84a76e0a WatchSource:0}: Error finding container 829b88b28c19a00251e9dd70eca84170bd5b041a2809b2d49e96545e84a76e0a: Status 404 returned error can't find the container with id 829b88b28c19a00251e9dd70eca84170bd5b041a2809b2d49e96545e84a76e0a
Apr 24 23:33:44.552526 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.552512 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:33:44.726431 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.726355 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x22nw_1832b77e-f856-4269-bb4c-9f4e5c59c722/dns/0.log"
Apr 24 23:33:44.746009 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.745985 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x22nw_1832b77e-f856-4269-bb4c-9f4e5c59c722/kube-rbac-proxy/0.log"
Apr 24 23:33:44.811557 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.811530 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wx9wp_e67c0e72-592f-4ae7-a84c-a898562b0176/dns-node-resolver/0.log"
Apr 24 23:33:44.994226 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.994140 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7" event={"ID":"ff14f5ad-73a3-4ad8-b080-a1916c5deabf","Type":"ContainerStarted","Data":"77940c6213a1c2a97b9dcba5e3932c601fc975e79528d87ebe9dd31e9d995794"}
Apr 24 23:33:44.994226 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.994176 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7" event={"ID":"ff14f5ad-73a3-4ad8-b080-a1916c5deabf","Type":"ContainerStarted","Data":"829b88b28c19a00251e9dd70eca84170bd5b041a2809b2d49e96545e84a76e0a"}
Apr 24 23:33:44.994226 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:44.994200 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:45.277211 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:45.277183 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qz7zn_07631fd3-43b5-43d2-9831-42159a9806e2/node-ca/0.log"
Apr 24 23:33:45.956147 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:45.956120 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6cf5756446-6zvgg_463655ac-43dd-4bbb-9fc3-a11c952398a9/router/0.log"
Apr 24 23:33:46.326799 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:46.326772 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wsnnl_8e2721e7-1e03-4664-9c72-58e388853714/serve-healthcheck-canary/0.log"
Apr 24 23:33:46.661671 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:46.661589 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-ddhcv_9c268f7d-cce3-435b-bc67-fc4ba3d34b62/insights-operator/0.log"
Apr 24 23:33:46.662362 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:46.662340 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-ddhcv_9c268f7d-cce3-435b-bc67-fc4ba3d34b62/insights-operator/1.log"
Apr 24 23:33:46.680511 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:46.680490 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cr5q4_a50379b2-619d-4cdd-9cb9-259c5a1cde61/kube-rbac-proxy/0.log"
Apr 24 23:33:46.699616 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:46.699591 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cr5q4_a50379b2-619d-4cdd-9cb9-259c5a1cde61/exporter/0.log"
Apr 24 23:33:46.718805 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:46.718770 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cr5q4_a50379b2-619d-4cdd-9cb9-259c5a1cde61/extractor/0.log"
Apr 24 23:33:51.007533 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:51.007505 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7"
Apr 24 23:33:51.022810 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:51.022761 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-6rrl7" podStartSLOduration=7.022744305 podStartE2EDuration="7.022744305s" podCreationTimestamp="2026-04-24 23:33:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:33:45.007886028 +0000 UTC m=+3853.824212107" watchObservedRunningTime="2026-04-24 23:33:51.022744305 +0000 UTC m=+3859.839070388"
Apr 24 23:33:52.973821 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:52.973791 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zd4ll_7666dac8-f01e-4d50-86a2-4a298cdbb22b/migrator/0.log"
Apr 24 23:33:52.996455 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:52.996425 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zd4ll_7666dac8-f01e-4d50-86a2-4a298cdbb22b/graceful-termination/0.log"
Apr 24 23:33:53.357193 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:53.357166 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6dhr7_45450237-ea60-4420-a8ca-ce71d93e2261/kube-storage-version-migrator-operator/1.log"
Apr 24 23:33:53.358053 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:53.358037 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6dhr7_45450237-ea60-4420-a8ca-ce71d93e2261/kube-storage-version-migrator-operator/0.log"
Apr 24 23:33:54.542125 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.542097 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m8pjb_99d5da74-fb38-467a-951e-9d474464c9b1/kube-multus-additional-cni-plugins/0.log"
Apr 24 23:33:54.562657 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.562631 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m8pjb_99d5da74-fb38-467a-951e-9d474464c9b1/egress-router-binary-copy/0.log"
Apr 24 23:33:54.588387 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.588361 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m8pjb_99d5da74-fb38-467a-951e-9d474464c9b1/cni-plugins/0.log"
Apr 24 23:33:54.608388 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.608361 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m8pjb_99d5da74-fb38-467a-951e-9d474464c9b1/bond-cni-plugin/0.log"
Apr 24 23:33:54.629164 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.629142 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m8pjb_99d5da74-fb38-467a-951e-9d474464c9b1/routeoverride-cni/0.log"
Apr 24 23:33:54.650251 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.650227 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m8pjb_99d5da74-fb38-467a-951e-9d474464c9b1/whereabouts-cni-bincopy/0.log"
Apr 24 23:33:54.673483 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.673457 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m8pjb_99d5da74-fb38-467a-951e-9d474464c9b1/whereabouts-cni/0.log"
Apr 24 23:33:54.937043 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.936964 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qbqlv_8d80044f-c134-4e09-8b63-d8d8d4e50a46/kube-multus/0.log"
Apr 24 23:33:54.957620 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.957590 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2wftt_9b056a73-ce1c-4e88-9a66-ebfc4498a736/network-metrics-daemon/0.log"
Apr 24 23:33:54.978600 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:54.978554 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2wftt_9b056a73-ce1c-4e88-9a66-ebfc4498a736/kube-rbac-proxy/0.log"
Apr 24 23:33:56.515466 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.515436 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-controller/0.log"
Apr 24 23:33:56.539605 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.539557 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/0.log"
Apr 24 23:33:56.554808 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.554778 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovn-acl-logging/1.log"
Apr 24 23:33:56.580604 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.577743 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/kube-rbac-proxy-node/0.log"
Apr 24 23:33:56.604794 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.604767 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 23:33:56.628696 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.628671 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/northd/0.log"
Apr 24 23:33:56.650432 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.650404 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/nbdb/0.log"
Apr 24 23:33:56.674400 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.674368 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/sbdb/0.log"
Apr 24 23:33:56.762169 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:56.762137 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tbc58_ad18a0fd-1bbf-4f92-9239-77f7b1a9ae7d/ovnkube-controller/0.log"
Apr 24 23:33:57.700052 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:57.700023 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-lb8gp_00bf21d3-25d9-4d4b-a2f5-c99835cb1daa/check-endpoints/0.log"
Apr 24 23:33:57.768179 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:57.768143 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lclmx_5346f0d6-5375-4d8a-9fb6-8f7c3a45720a/network-check-target-container/0.log"
Apr 24 23:33:58.687206 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:58.687180 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s2xd9_0f2be5fe-c0eb-4a07-8625-e3980ae39927/iptables-alerter/0.log"
Apr 24 23:33:59.345680 ip-10-0-136-66 kubenswrapper[2568]: I0424 23:33:59.345652 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-qz6cr_f8bf15d5-9bb6-4feb-bddd-e1ccb5b4028a/tuned/0.log"