This is my old webpage. A recent version is [[https://themelis.github.io/|here]].

===== Research =====
\\
  
  
  
//**A variational Bayes framework for sparse adaptive estimation**// \\
/*K.E. Themelis, A.A. Rontogiannis, K.D. Koutroumbas*/ {{ ::screenshot_from_2013-12-31_17_25_34.png?nolink&300|}}
**Abstract** Recently, a number of $\ell_1$-norm regularized least squares (LS) type algorithms have been proposed to address the problem of //sparse// adaptive signal estimation and system identification. From a Bayesian perspective, this task is equivalent to maximum a posteriori (MAP) estimation under a sparsity-promoting heavy-tailed prior for the parameters of interest. Following a different approach, this paper develops a unifying framework of sparse //variational Bayes// schemes that employ heavy-tailed priors in conjugate hierarchical form to facilitate posterior inference. The resulting fully automated variational schemes are first presented in a batch iterative mode. Then it is shown that, by properly exploiting the structure of these batch estimation schemes, new sparse adaptive variational Bayes algorithms can be derived, which have the ability to impose and track sparsity during real-time processing in a time-varying environment. The most important feature of the proposed algorithms is that they completely eliminate the need for computationally costly parameter fine-tuning, a necessary ingredient of sparse adaptive deterministic algorithms. Simulation results are provided to demonstrate the effectiveness of the proposed sparse adaptive variational algorithms against state-of-the-art deterministic techniques for adaptive channel estimation. The results show that the proposed variational Bayes algorithms are numerically robust and have in general superior estimation performance compared to their deterministic counterparts. /* ** under review**   [[http://members.noa.gr/themelis/lib/exe/fetch.php?media=code:asvb_demo_code.zip|Matlab code]] */
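To illustrate the general idea of mean-field variational Bayes with a conjugate hierarchical sparsity prior, here is a minimal sketch in Python. It implements a standard batch VB update for sparse Bayesian linear regression with a Gaussian-Gamma hierarchy (ARD-style per-coefficient precisions), not the specific adaptive algorithms of the paper; the function name and all parameter choices are illustrative assumptions.

```python
import numpy as np

def vb_sparse_regression(X, y, n_iter=50):
    """Batch mean-field VB for the sparse linear model y = X w + noise.

    Each weight w_i gets its own Gamma-distributed precision alpha_i
    (a conjugate hierarchy whose marginal is heavy-tailed), so sparsity
    is imposed automatically, with no tuning parameter to hand-pick.
    """
    N, M = X.shape
    alpha = np.ones(M)          # posterior means of per-weight precisions
    beta = 1.0                  # posterior mean of the noise precision
    XtX, Xty = X.T @ X, X.T @ y
    for _ in range(n_iter):
        # q(w) is Gaussian: covariance and mean given current alpha, beta
        Sigma = np.linalg.inv(beta * XtX + np.diag(alpha))
        mu = beta * Sigma @ Xty
        # q(alpha_i) update (flat hyperprior): E[alpha_i] = 1 / E[w_i^2]
        alpha = 1.0 / (mu**2 + np.diag(Sigma))
        # q(beta) update from the expected squared residual
        resid = y - X @ mu
        beta = N / (resid @ resid + np.trace(XtX @ Sigma))
    return mu, Sigma

# Usage: coefficients of a sparse vector are recovered, with the
# irrelevant ones shrunk toward zero by their large alpha_i.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
w = np.zeros(20)
w[[2, 7, 11]] = [1.5, -2.0, 0.8]
y = X @ w + 0.01 * rng.standard_normal(200)
mu, _ = vb_sparse_regression(X, y)
```

The adaptive schemes in the paper go further by turning such batch updates into recursive, per-sample updates that track time-varying sparse parameters; the sketch above only shows the batch building block.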
  
  
  * Statistical signal and image processing
  * Probabilistic machine learning
  * Sparse and low-rank representations
  
  
  
  * Spectral unmixing, hyperspectral image processing
  * Adaptive estimation, wireless channel equalization
  
==== Collaborators ====