Article

Variable step-size convex regularized PRLS algorithms

Journal

SIGNAL PROCESSING
Volume 214

Publisher

ELSEVIER
DOI: 10.1016/j.sigpro.2023.109251

Keywords

Sparse RLS algorithms; Proportionate updating; Zero-attracting; Variable step-size

In this paper, an enhanced sparsity-aware recursive least squares (RLS) algorithm is proposed, which combines the proportionate updating (PU) and zero-attracting (ZA) mechanisms, and introduces a general convex regularization (CR) function and variable step-size (VSS) technique to improve performance.
The proportionate updating (PU) and zero-attracting (ZA) mechanisms have previously been applied independently in the development of sparsity-aware recursive least squares (RLS) algorithms. Recently, we proposed an enhanced l1-proportionate RLS (l1-PRLS) algorithm that combines the PU and ZA mechanisms. The l1-PRLS employs a fixed step size, which trades off transient (initial convergence) and steady-state performance. In this letter, the l1-PRLS is improved in two aspects: first, we replace the l1-norm penalty with a general convex regularization (CR) function, yielding the CR-PRLS algorithm; second, we further introduce the variable step-size (VSS) technique into the CR-PRLS, leading to the VSS-CR-PRLS algorithm. Theoretical and numerical results are provided to corroborate the superiority of the improved algorithm.
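The abstract does not give the paper's update recursions, so the following is only a rough sketch of the zero-attracting ingredient it names: a standard RLS system identifier with an added l1 (sign) attraction term that pulls inactive taps toward zero. All names, sizes, and parameter values are illustrative assumptions, and the proportionate-gain and variable step-size refinements of the actual VSS-CR-PRLS algorithm are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse system to identify (illustrative, not from the paper).
N = 16                        # filter length
w_true = np.zeros(N)
w_true[[2, 7, 11]] = [1.0, -0.5, 0.8]

lam = 0.99                    # RLS forgetting factor
rho = 1e-4                    # zero-attraction (l1 penalty) strength
P = np.eye(N) * 100.0         # inverse-correlation matrix estimate
w = np.zeros(N)               # adaptive filter weights
x_buf = np.zeros(N)           # tapped-delay-line input buffer

for n in range(2000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()
    d = w_true @ x_buf + 0.01 * rng.standard_normal()  # noisy desired signal

    # Standard RLS gain vector and a priori error
    Px = P @ x_buf
    k = Px / (lam + x_buf @ Px)
    e = d - w @ x_buf

    # RLS update plus zero-attracting correction on the weights
    w = w + k * e - rho * np.sign(w)
    P = (P - np.outer(k, Px)) / lam

print(np.round(w, 2))
```

The zero-attraction term introduces a small bias on the active taps (roughly rho/(1 - lam) in magnitude under this sketch's assumptions) in exchange for faster decay of the inactive taps; the paper's CR and VSS extensions are aimed at precisely this trade-off.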
