compressed sensing and a blog

My friend’s blog led me to Terence Tao’s blog, where a mathematician writes about applied mathematics among other topics, and a glance tells me that every posting is well written. The posts on compressed sensing and single pixel cameras drew my attention in particular, because the topic speaks to astronomers interested in the virtual observatory[1] and in image processing[2] (it is no exaggeration to say that observational astronomy, in a broad sense, starts with taking pictures), to statisticians working on multidimensional applications, not to mention engineers in signal and image processing.

A particular interest of mine from his post is that compressed sensing could resolve bandwidth problems in astronomy and help with the subsequent sequential analysis of astronomical data (streaming data analysis). Overall, the list of applications at the end of his post may enlighten scientists probing the sky with telescopes in different wavebands.

  1. see the slog posting “Virtual Observatory”[]
  2. see the slog posting “The power of wavedetect”[]
5 Comments
  1. hlee:

    I’m sure some researchers will want to join this new field of compressed sensing and its applications. For them, the following link could serve as a starting point: Compressive Sensing Resources

    10-25-2007, 4:53 pm
  2. vlk:

    Interesting. From what I gather, compressive sensing is a way to build up a high-fidelity non-parametric model of the source from imperfect data. Sounds impressive and magical, but don’t we do this all the time when we point Chandra at an extended source? You get some idea of the structure in a source with 1000 counts, more with 10000, and by 100000, it is getting to be good enough for a press release. For it to be really useful in astronomy — and I admit I haven’t been able to grok all the math involved, so maybe the theory does deal with this — can it be used to tell whether the detected structure is “real” at some significance?

    11-04-2007, 12:37 am
  3. vlk:

    Just to clarify the above comment: compressed sensing suggests that the best strategy for reconstructing something is via a randomized set of pixel measurements. That is exactly what one gets with X-ray detectors: Nature supplies the randomization in the form of the photon events, which occur randomly over the detector. As the number of events increases, the number of pixels that are sensed increases, and you build up a high-fidelity image of the source with no aliasing effects. Is this a fair analogy? Or am I missing something about compressed sensing?
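
    To make this picture concrete, here is a toy sketch (illustration only: a made-up two-blob source on a 32x32 detector, and plain photon-by-photon sampling rather than a compressed-sensing measurement). Photon events land on random pixels in proportion to the source intensity, and the binned counts converge to the source map as the events accumulate:

        import numpy as np

        rng = np.random.default_rng(2)

        # Made-up extended source on a 32x32 detector (two Gaussian blobs)
        yy, xx = np.mgrid[0:32, 0:32]
        intensity = (np.exp(-((xx - 16) ** 2 + (yy - 10) ** 2) / 20.0)
                     + 0.5 * np.exp(-((xx - 8) ** 2 + (yy - 22) ** 2) / 8.0))
        p = (intensity / intensity.sum()).ravel()   # per-pixel photon probabilities

        for n_events in (1_000, 10_000, 100_000):
            hits = rng.choice(p.size, size=n_events, p=p)            # random photon events
            image = np.bincount(hits, minlength=p.size) / n_events   # binned counts
            err = np.abs(image - p).sum()    # L1 gap to the true normalized map
            print(f"{n_events:>7d} counts: L1 error = {err:.3f}")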

    Also, the “single pixel camera” is very reminiscent of coded mask aperture imaging, of the kind used in, e.g., RHESSI.

    11-04-2007, 12:08 pm
  4. hlee:

    I guess different objectives and interests can make the same topic look quite different, and until a careful comparison is made it is hard to tell. The randomization depends on the basis (wavelets, FFT, PCA) and on the data (photons), and the intrinsic models for earthly images and heavenly ones call for different randomizations. Furthermore, I have seen the term randomization interpreted differently in statistics and in astronomy; therefore, their ideas and yours may or may not coincide.

    Instead, to digress, this paper from my reading list may interest you and others: “An EM algorithm for wavelet-based image restoration” by M. A. T. Figueiredo and R. D. Nowak, IEEE Transactions on Image Processing, vol. 12, no. 8, pp. 906–916, Aug. 2003. It incorporates uncertainties through the EM algorithm.
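
    For a flavor of that class of algorithms, a rough sketch (illustration only, not the authors’ code: the uniform blur, Haar wavelet, threshold, and iteration count are all made up, and it uses NumPy, SciPy, and PyWavelets). It alternates a data-fidelity step with soft thresholding of the wavelet detail coefficients:

        import numpy as np
        import pywt
        from scipy.ndimage import uniform_filter

        rng = np.random.default_rng(1)

        # Toy "sky": a few point sources on a 64x64 grid, blurred and noised
        x_true = np.zeros((64, 64))
        x_true[rng.integers(0, 64, 6), rng.integers(0, 64, 6)] = 10.0

        def blur(im):
            return uniform_filter(im, size=3)   # symmetric, so (approximately) its own adjoint

        y = blur(x_true) + 0.1 * rng.normal(size=x_true.shape)

        def soft_wavelet(im, tau, wavelet="haar", level=3):
            """Soft-threshold the detail coefficients of a 2-D wavelet transform."""
            coeffs = pywt.wavedec2(im, wavelet, level=level)
            out = [coeffs[0]] + [tuple(pywt.threshold(c, tau, mode="soft") for c in d)
                                 for d in coeffs[1:]]
            return pywt.waverec2(out, wavelet)

        x = y.copy()
        for _ in range(50):
            z = x + blur(y - blur(x))         # data-fidelity (Landweber-style) step
            x = soft_wavelet(z, tau=0.05)     # sparsity-promoting wavelet shrinkage

        print("residual RMS:", np.sqrt(np.mean((blur(x) - y) ** 2)))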

    11-04-2007, 11:48 pm
  5. Igor Carron:

    Hi,

    I found your discussion through the blog search engine of Google, so I am a little late.

    Yes, in the implementations of the single pixel camera (Rice) or of the spectral imager (Duke), the measurement techniques used in Compressed Sensing are an instance of coded aperture. I believe the major difference is that here the coded aperture is an incoherent “mask”. In most cases, these incoherent “masks” use randomly located holes, as opposed to the well-defined set of holes of a normal coded aperture. The pattern is random, but it is known. For instance, the kind of randomization seen with X-rays and mentioned by vlk is not known and therefore does not fit this framework.

    The main discovery of compressed sensing is that, by using these random “masks”, you are in essence already obtaining compressed information that needs no further alteration. The actual signal reconstruction uses new techniques based on the fact that the objects being imaged are sparse and that the system of equations being solved is underdetermined. The reconstruction step requires knowledge of the “random” masks.
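
    As a rough illustration of that last point (a minimal sketch with made-up sizes, using a random Gaussian matrix as the known “mask” and plain iterative soft thresholding as the solver): the measurements alone are a short vector, and recovering the sparse signal requires both the measurements and the matrix.

        import numpy as np

        rng = np.random.default_rng(0)

        n, m, k = 256, 64, 8                 # signal length, measurements, nonzeros
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)

        A = rng.normal(size=(m, n)) / np.sqrt(m)   # the known random "mask"
        y = A @ x_true                             # m incoherent measurements, m << n

        # Iterative soft thresholding (ISTA) for the underdetermined, sparse problem:
        # a gradient step on ||y - Ax||^2 followed by an L1 shrinkage step.
        lam = 0.05
        step = 1.0 / np.linalg.norm(A, 2) ** 2
        x = np.zeros(n)
        for _ in range(500):
            x = x + step * A.T @ (y - A @ x)
            x = np.sign(x) * np.maximum(np.abs(x) - lam * step, 0.0)

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))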

    You might find these entries useful:
    http://nuit-blanche.blogspot.com/2007/10/compressed-sensing-hardware.html
    http://nuit-blanche.blogspot.com/2007/11/compressed-sensing-hardware.html
    http://nuit-blanche.blogspot.com/2008/01/compressed-sensing-learning-bases-from.html

    All my entries on the Compressed Sensing subject are here:
    http://nuit-blanche.blogspot.com/search/label/compressed%20sensing

    The Rice repository is a must read.

    Igor.

    01-23-2008, 4:44 am