
Sparse Dictionaries (How to Make or Find them)


When a signal is said to be sparse in an engineering sense, it really means that the signal can be expanded either in a small number of terms or in a series with rapidly decaying coefficients. In the former case, one speaks of a strictly sparse signal; in the latter, of a compressible signal. In order to produce compressed measurements, one first needs to know in which family of functions the signal of interest is sparse. Depending on the case, one might be lucky and know that the signal is sparse in a basis found in harmonic analysis (2.1), or one may have to do some work devising this sparse basis through an algorithm dedicated to finding sparse dictionaries from a set of signal examples (2.2 and 2.3). Finally, Remi Gribonval and Karin Schnass provide, in Dictionary Identification - Sparse Matrix-Factorisation via L1-Minimisation, an estimate of the number of training examples needed to build a dictionary.
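As a minimal sketch of the distinction above (not from the original page; the signal sizes and decay rate are illustrative assumptions), one can compare a strictly k-sparse vector with a compressible one through their best k-term approximation error:

```python
# Minimal sketch: strictly sparse vs. compressible coefficient vectors,
# compared through the relative error of the best k-term approximation.
import numpy as np

rng = np.random.default_rng(0)
n, k = 1024, 32

# Strictly sparse: exactly k nonzero coefficients.
strictly_sparse = np.zeros(n)
strictly_sparse[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# Compressible: coefficients decaying like i^(-1.5) (hypothetical decay rate).
compressible = rng.standard_normal(n) * (np.arange(1, n + 1) ** -1.5)

def best_k_term_error(x, k):
    """Relative L2 error after keeping only the k largest-magnitude coefficients."""
    idx = np.argsort(np.abs(x))[:-k]      # indices of the n-k smallest entries
    discarded = np.zeros_like(x)
    discarded[idx] = x[idx]               # the part lost when keeping k largest terms
    return np.linalg.norm(discarded) / np.linalg.norm(x)

print("strictly sparse:", best_k_term_error(strictly_sparse, k))  # exactly 0
print("compressible:   ", best_k_term_error(compressible, k))     # small but nonzero
```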

2.1 Basis Functions for which signals are either sparse or compressible

Fourier, polynomials, etc.
All kinds of wavelets and higher-dimensional related functions (a few are listed in Where is the Starlet).
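As a quick, hedged illustration (in Python rather than the Matlab toolboxes referenced below; the test signal is an arbitrary choice), one can check that a smooth signal is compressible in a Fourier-type basis by looking at how fast its DCT coefficients decay:

```python
# Minimal sketch: a smooth signal is compressible in an orthonormal DCT basis.
import numpy as np
from scipy.fft import dct

n = 1024
t = np.linspace(0, 1, n)
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)

coeffs = dct(signal, norm='ortho')          # orthonormal DCT-II coefficients
sorted_mag = np.sort(np.abs(coeffs))[::-1]  # magnitudes, largest first

# Fraction of the signal energy captured by the 20 largest coefficients.
energy_top20 = np.sum(sorted_mag[:20] ** 2) / np.sum(sorted_mag ** 2)
print(f"energy in top 20 of {n} coefficients: {energy_top20:.6f}")  # close to 1
```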
2.2 Algorithms that find sparse dictionaries

These algorithms are presented in:

Online Learning for Matrix Factorization and Sparse Coding by Julien Mairal, Francis Bach, Jean Ponce, Guillermo Sapiro [the code is released as SPArse Modeling Software, or SPAMS]

Dictionary Learning Algorithms for Sparse Representation (Matlab implementation of FOCUSS/FOCUSS-CNDL is here)
Multiscale sparse image representation with learned dictionaries [Matlab implementation of the K-SVD algorithm is here; a newer implementation by Ron Rubinstein is here]
Efficient sparse coding algorithms [Matlab code is here]

Non-negative Sparse Modeling of Textures (NMF) [Matlab implementation of NMF (Non-negative Matrix Factorization) and NTF (Non-negative Tensor Factorization); a faster implementation of NMF can be found here; here is a more recent Non-Negative Tensor Factorizations package]
Shift Invariant Sparse Coding of Image and Music Data. Matlab implementation is here.

MoTIF: an Efficient Algorithm for Learning Translation Invariant Dictionaries, but also Learning Multi-Modal Dictionaries. Also, more recent: Shift-invariant dictionary learning for sparse representations: extending K-SVD.
Thresholded Smoothed-L0 (SL0) Dictionary Learning for Sparse Representations by Hadi Zayyani, Massoud Babaie-Zadeh and Remi Gribonval.
Let us note the Matlab Toolbox Sparsity by Gabriel Peyre, which implements some of these techniques. Knowledge of the specific signal domain makes it possible to build these hopefully small dictionaries. A minimal sketch of the sparse-coding/dictionary-update loop that most of these algorithms share is given below.
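The sketch below (in Python, not taken from any of the Matlab packages above; the training data, sizes, and the MOD-style least-squares update are illustrative assumptions) shows the alternating structure common to these methods: sparse-code the data against the current dictionary, then update the dictionary against the current codes.

```python
# Minimal sketch of alternating dictionary learning:
#   1) sparse coding via Orthogonal Matching Pursuit (scikit-learn),
#   2) a MOD-style least-squares dictionary update (K-SVD would update atom by atom).
import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
n_features, n_atoms, n_samples, sparsity = 64, 128, 500, 5

# Synthetic training signals (in practice: image patches, audio frames, ...).
X = rng.standard_normal((n_features, n_samples))

# Random initial dictionary with unit-norm atoms (columns).
D = rng.standard_normal((n_features, n_atoms))
D /= np.linalg.norm(D, axis=0)

for it in range(10):
    # Sparse coding: each column of X approximated with `sparsity` atoms of D.
    A = orthogonal_mp(D, X, n_nonzero_coefs=sparsity)   # codes, shape (n_atoms, n_samples)
    # Dictionary update: least-squares fit of D to the fixed codes.
    D = X @ np.linalg.pinv(A)
    D /= np.linalg.norm(D, axis=0) + 1e-12               # re-normalize atoms
    err = np.linalg.norm(X - D @ A) / np.linalg.norm(X)
    print(f"iteration {it}: relative reconstruction error {err:.3f}")
```

For real data, the SPAMS package listed above, or scikit-learn's MiniBatchDictionaryLearning (which is based on the online scheme of Mairal et al.), would be preferable to this toy loop.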

For a review of the state of the art on how to compile dictionaries from training signals, and the attendant theoretical issues, check the document by Remi Gribonval for his Habilitation a Diriger des Recherches, entitled Sur quelques problèmes mathématiques de modélisation parcimonieuse, translated as Sparse Representations: From Source Separation to Compressed Sensing. A video and an audio-only recording of this presentation (in French) are available; the accompanying slides, in English, are here.

2.3 Data Driven Dictionaries

The next step will almost certainly bring about techniques that find elements within a manifold, as opposed to a full set of functions: some sort of Data Driven Dictionary. In this setting, one can list:

Geometric harmonics (as demoed here)
Diffusion Wavelets (Matlab code for Diffusion Geometry and Diffusion Wavelets)
Treelets. A code in Matlab is available here.
Some of these techniques are being used for dimensionality reduction, which in effect states that datasets are compressible when represented in these dictionaries.
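To make the idea of a data-driven dictionary concrete, here is a minimal, hedged sketch (in Python, with purely synthetic data; the kernel bandwidth and sample sizes are illustrative assumptions) in the spirit of diffusion geometry: build a Gaussian-kernel graph on the samples and use the leading eigenvectors of its normalized affinity matrix as a data-adapted basis. This only illustrates the construction; it is not the Diffusion Wavelets code linked above.

```python
# Minimal sketch of a data-adapted basis in the spirit of diffusion geometry:
# leading eigenvectors of a normalized Gaussian-kernel affinity matrix of the samples.
import numpy as np

rng = np.random.default_rng(0)

# Samples lying near a one-dimensional manifold (a noisy circle) in R^2.
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.column_stack([np.cos(theta), np.sin(theta)]) + 0.02 * rng.standard_normal((300, 2))

# Pairwise squared distances and a Gaussian kernel (bandwidth chosen by hand).
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
eps = 0.1
W = np.exp(-sq_dists / eps)

# Symmetric normalization of the affinity matrix, then eigendecomposition.
d = W.sum(axis=1)
M = W / np.sqrt(np.outer(d, d))
eigvals, eigvecs = np.linalg.eigh(M)          # ascending eigenvalues

# The leading nontrivial eigenvectors form a small, data-adapted "dictionary"
# in which smooth functions on the manifold have rapidly decaying coefficients.
basis = eigvecs[:, ::-1][:, 1:6]              # skip the trivial top eigenvector
print("basis shape:", basis.shape)
print("eigenvalues of the leading basis elements:", np.round(eigvals[::-1][1:6], 3))
```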

weblink: https://sites.google.com/site/igorcarron2/cs#sparse
http://guiuestc.i.sohu.com/blog/view/165291424.htm