April 19, 2008

Flat Noise Set - on Fast Noiselet transform

After a first try at providing some code for the noiselet transform matrix, which yields a flat noise set of coefficients from the Haar system (and at obtaining nice pictures), a valuable implementation has been made public. No lie, fastest Matlab code for noiselets is now available from Laurent Jacques (direct link). David Donoho(*) recently recalled that the compressed sensing literature shares similarities with concepts in other fields (an often essential remark). For instance, in data stream processing, with its "heavy hitters" or "Iceberg queries". The Fast Noiselet is not the nastiest floe in this ocean of literature.

(*) David L. Donoho, "Compressed Sensing", IEEE Transactions on Information Theory, vol. 52, no. 4, pp. 1289-1306, April 2006 (pdf copy).
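For readers who want to play with the idea outside Matlab, here is a minimal Python sketch (not Laurent Jacques' implementation) of a noiselet-like fast transform. It assumes the unitary 2x2 kernel (1/2)[[1-i, 1+i], [1+i, 1-i]] applied in a Hadamard-style butterfly, so the row ordering may differ from the canonical noiselet indexing, but the O(N log N) cost and the flat-magnitude coefficients are visible.

import numpy as np

def fast_noiselet_like(x):
    """Butterfly applying the unitary kernel (1/2)[[1-1j, 1+1j], [1+1j, 1-1j]]
    at every dyadic scale: a noiselet-like transform, up to row ordering."""
    x = np.asarray(x, dtype=complex).copy()
    n = x.shape[-1]
    assert (n & (n - 1)) == 0, "length must be a power of two"
    h = 1
    while h < n:
        for start in range(0, n, 2 * h):
            a = x[..., start:start + h].copy()
            b = x[..., start + h:start + 2 * h].copy()
            x[..., start:start + h] = 0.5 * ((1 - 1j) * a + (1 + 1j) * b)
            x[..., start + h:start + 2 * h] = 0.5 * ((1 + 1j) * a + (1 - 1j) * b)
        h *= 2
    return x

e = np.zeros(8)
e[3] = 1.0                           # a spike (a normalized Haar step behaves similarly)
c = fast_noiselet_like(e)
print(np.abs(c))                     # a flat set: every coefficient has magnitude 1/sqrt(8)
print(np.linalg.norm(c))             # 1.0, the transform is unitary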

On complex Hadamard matrices

April 13, 2008

Connections or connexions?

François Morellet is a painter, sculptor and engraver who turned to abstraction in the 1950s. Some of his works carry strange connections with sparsity, wavelets and even compressive sensing. First, his name, akin both to a crocodile (Morelet's crocodile) and to Jean Morlet, who died almost one year ago. Then, he authored a work entitled "4 trames de tirets simples (non quinconce) 0°-45°-90°-135°", which obviously refers to a set of four oriented separable ("non quincunx") frames, very similar to complex dual-tree wavelets. The remaining terms, "tirets simples" (simple dashes), may refer to Haar- or Hadamard-like vectors. Less structured grids (see Any grid) are also present, as in this "Répartition aléatoire de 40 000 carrés" (random distribution of 40,000 squares):

The following "3 doubles trames 0°, 30°, 60°" (three double frames at 0°, 30°, 60°) is far better structured. Connections. Or Connexions (cnx.org), at Rice University, which provides a Content Commons of free, open-licensed educational materials in fields such as music, electrical engineering and psychology. Remember that Rice also hosts a Compressive Sensing repository.

April 9, 2008

RIP scene sends Smog (Noiselet today)

Recently, Processing Vein Mess was proposed as an applicative anagram for Compressive sensing. Its Compressed sensing avatar possesses a much nicer anagram: "RIP scene sends Smog". This was illustrated in a recent talk by Albert Cohen around the Restricted Isometry Property (RIP). For practical applications, the onset of compressive sensing lies in the development of specific observation matrices. Noiselets are an example of those. Let us quote artist Michael Thieke about them:
Very sparse. Very minimal.
These musicians with their instruments make sounds that may not have been intended by the original inventors. They do this in a way that at first seems very random. After a longer listen, the inspirations soak through. These “noiselets and sounduals” (my words entirely) may be improvised, but they are very potent in their expressive capability.

Real, or imaginary (for noiselet Matlab code)?

April 4, 2008

RIP, not RIP - a little bit on compressed sensing

Shall the Restricted Isometry Property (RIP) Rest In Peace (RIP again)?

This afternoon, Albert Cohen presented recent findings (obtained with Wolfgang Dahmen and Ronald DeVore) on Compressed sensing and best k-term approximation. These findings pertain to the fields of compressive sensing (CS, see Nuit Blanche for some live coverage) and non-linear approximation (NLA), namely: how do fixed linear measurements (CS) and adaptive linear measurements (NLA) compare for signal approximation in the ℓp norm, both in a deterministic and in a probabilistic setting? A. Cohen nicely reviewed some references on the topic, including the interesting FRI (Finite Rate of Innovation) framework by Pier Luigi Dragotti and Martin Vetterli (et al.), nicely covered here (we come back to this approach later), and the compressive sensing resources.

He emphasized an approach based on decoupling (1) the quantity of information and (2) the algorithms. The basic question is: given some unknown x, presumably sparse, linearly observed via y = Px, find a decoder D (generally non-linear) such that D(Px) is close enough to x. The pair (P, D) is characterized by some "instance optimality" property. The work focuses on the characterization of the null space ker(P), since it essentially measures the ambiguity of P in recovering a sparse x from y. The existence of the decoder is related (Lemma 3.1 of Compressed sensing and best k-term approximation) to the injectivity of P and to the relative spread of ker(P). Compared to the Candès-Tao-Romberg Restricted Isometry Property, this approach differs both in the algorithmic viewpoint and in its stability. It is important to mention that the "ker" (null space) point of view still applies when x = Ms is a quasi-sparsified version (through a wavelet transform, for instance) of some signal s, even when PM does not satisfy the RIP while P does.
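As a toy illustration of this decoupled (P, D) viewpoint, here is a minimal Python sketch, assuming a random Gaussian P (my choice, not a matrix discussed in the talk) and an ℓ1-minimization decoder written as a linear program; with enough measurements, D(Px) typically recovers the sparse x exactly.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 64, 24, 3                           # ambient dimension, measurements, sparsity (illustrative)
P = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian observation matrix (an assumption)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = P @ x                                     # the fixed, non-adaptive linear measurements

def D(y, P):
    """Decoder: min ||x||_1 subject to Px = y (basis pursuit), posed as a
    linear program over the split x = u - v with u, v >= 0."""
    m, n = P.shape
    c = np.ones(2 * n)                        # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([P, -P])                 # equality constraint P(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

x_hat = D(y, P)
print(np.linalg.norm(x_hat - x))              # typically close to 0: D(Px) recovers x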

Coming back to Finite Rate of Innovation. A signal band-limited to [-1/2, 1/2] is perfectly reconstructed from at least one uniform sample per unit of time, through a cardinal sine kernel, which may be viewed as an innovation rate per unit of time. While the Nyquist-Shannon condition is sufficient, it is not necessary. FRI derives conditions, based on parametric hypotheses on the signals (e.g. formed of p Diracs per time period), for perfectly reconstructing signals after an acquisition made of a sampling kernel followed by uniform sampling. Interestingly, in contrast to standard CS, the sparsity in FRI resides in a continuous domain rather than a discrete one, which seems somewhat more natural. On the other hand, non-uniform sampling is more efficiently dealt with by CS.
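To make the "p Diracs per time period" idea concrete, here is a small Python sketch of an annihilating-filter (Prony-type) recovery in the noiseless case, assuming access to 2p Fourier-series coefficients of the Dirac stream; the indices and variable names are illustrative, not taken from the cited papers.

import numpy as np

rng = np.random.default_rng(1)
p = 3
t = np.sort(rng.uniform(0, 1, p))        # unknown Dirac locations in [0, 1)
a = rng.uniform(1, 2, p)                 # unknown amplitudes

m = np.arange(2 * p)                     # 2p Fourier-series coefficients X[m] = sum_k a_k exp(-2i*pi*m*t_k)
X = (a * np.exp(-2j * np.pi * np.outer(m, t))).sum(axis=1)

# Annihilating filter h (with h[0] = 1): sum_l h[l] X[m - l] = 0 for m = p..2p-1
A = np.array([[X[p + r - l] for l in range(1, p + 1)] for r in range(p)])
b = -X[p:2 * p]
h = np.linalg.solve(A, b)

# Dirac locations from the roots u_k = exp(-2i*pi*t_k) of the filter polynomial
roots = np.roots(np.concatenate(([1.0], h)))
t_hat = np.sort(np.mod(-np.angle(roots) / (2 * np.pi), 1))

# Amplitudes from the (Vandermonde) linear system X[m] = sum_k a_k u_k^m
U = np.exp(-2j * np.pi * np.outer(m, t_hat))
a_hat = np.linalg.lstsq(U, X, rcond=None)[0].real

print(np.allclose(t_hat, t), np.allclose(a_hat, a))   # True True in the noiseless case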

For recent accounts on FRI:
Estimating Signals with Finite Rate of Innovation from Noisy Samples: A Stochastic Algorithm, by Vincent Y. F. Tan, Vivek K. Goyal
Compressive Sampling: Sparse Sampling of Signal Innovations, by Thierry Blu, Pier-Luigi Dragotti, Martin Vetterli, Pina Marziliano, Lionel Coulot
Nuit Blanche coverage of the above paper
Nuit Blanche on neutron transport (very cold neutrons?)
