The AstroStat Slog » regularization
http://hea-www.harvard.edu/AstroStat/slog
Weaving together Astronomy+Statistics+Computer Science+Engineering+Instrumentation, far beyond the growing borders

Astroinformatics
Mon, 13 Jul 2009, by hlee
http://hea-www.harvard.edu/AstroStat/slog/2009/astroinformatics/

For roughly a decade there have been journals dedicated to bioinformatics. Astronomy, on the other hand, has none, although astronomers have a long history of compiling huge catalogs and data archives. Prof. Bickel's comments during his plenary lecture at the IMS-APRM, particularly on sparse matrices and on the philosophical issues in choosing principal components, led me to wonder why astronomers do not discuss astroinformatics.

Nevertheless, I’ve noticed a few astronomers rigorously applying principal component analysis (PCA) to reduce the dimensionality of a data set. An evident example of a PCA application in astronomy is photo-z. In contrast to this wide use of PCA, almost no publication studies its statistical adequacy by investigating the properties of the covariance matrix and its estimation, particularly when the matrix is sparse. Even worse, measurement errors are improperly implemented, since statisticians’ dimension reduction methodology never had to confront astronomers’ measurement errors. How to choose the number of components is seldom discussed, because significance in a physics model rarely agrees with statistical significance. This disagreement tends to stretch out scientific writing and make it hard on readers; as a compromise, the statistical parts are omitted, which leaves the publications feeling incomplete to me.

In wavelet multiscale imaging, thanks to easy visualization on intuitive scales, the coarse-to-fine-scale approach and the assumption of independent noise make it possible to clean a noisy image and accentuate its features. Likewise, principal components and other dimension reduction methods in statistics capture certain features via transformed metrics and regularized, or penalized, objective functions. These features need not match the important features in astrophysics unless the likelihood function and the selected priors match the physics models. To my knowledge, astronomical papers exploiting PCA for dimension reduction and prediction rarely explain why PCA was chosen, how sparsity in the covariance matrix is compensated for, and other questions that are major topics in bioinformatics. There, these questions are explored to justify the selection of gene attributes or biomarkers for a given response such as blood pressure or cancer type. Instead of binning and chi-square minimization, statisticians explore strategies for compensating sparsity in the data set, to obtain unbiased best fits and honest error bars based on assumptions and theory that match the data.
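To make the PCA discussion above concrete, here is a minimal sketch of PCA via eigendecomposition of the sample covariance matrix, on a synthetic data set (the data, the 90% variance cutoff, and all names are illustrative assumptions, not taken from any photo-z code):

```python
import numpy as np

# Illustrative sketch: PCA by eigendecomposition of the sample covariance,
# applied to a synthetic data set standing in for photometric measurements.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))            # 500 objects, 10 bands/features
X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]   # induce correlation between bands

Xc = X - X.mean(axis=0)                   # center each feature
cov = Xc.T @ Xc / (Xc.shape[0] - 1)       # sample covariance matrix
evals, evecs = np.linalg.eigh(cov)        # eigh: cov is symmetric
order = np.argsort(evals)[::-1]           # sort by decreasing variance
evals, evecs = evals[order], evecs[:, order]

# Keep enough components to explain 90% of the variance -- one of the
# ad hoc selection rules whose statistical justification the text questions.
frac = np.cumsum(evals) / evals.sum()
k = int(np.searchsorted(frac, 0.90)) + 1
scores = Xc @ evecs[:, :k]                # reduced-dimension representation
print(k, scores.shape)
```

Note that this sketch ignores measurement errors entirely, which is exactly the gap the text points out: standard PCA has no slot for per-object error bars.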

Luckily, there are efforts among some renowned astronomers to form an astroinformatics community. At the dawn of bioinformatics, geneticists were responsible for the "bio" part and statisticians for the "informatics", until young scientists were educated enough to carry out bioinformatics by themselves. Observing this trend, partly from statistics conferences, gave me the urge to ponder why there has been a shortage of statisticians' involvement in astronomy, despite its long history and its plethora of catalogs and data archives. A few postings will follow on what I felt while working among astronomers. I hope this small bridging effort narrows the gap between the two communities. My personal wish is to see astroinformatics prosper as bioinformatics has.

Wavelet-regularized image deconvolution
Fri, 12 Jun 2009, by hlee
http://hea-www.harvard.edu/AstroStat/slog/2009/wavelet-regularized-image-deconvolution/

A Fast Thresholded Landweber Algorithm for Wavelet-Regularized Multidimensional Deconvolution
Vonesch and Unser (2008)
IEEE Trans. Image Proc. vol. 17(4), pp. 539-549

Quoting the authors, I too would say that recovering the original image from the observed one is an ill-posed problem. They trace the efforts on wavelet regularization in deconvolution back to a few relatively recent publications by astronomers. Therefore, I guess the topic and the algorithm of this paper could attract some attention from astronomers.

They explain the wavelet-based reconstruction procedure in simple terms: the matrix-vector product w = Wx yields the coefficients of x in the wavelet basis, and W^T w = W^T W x reconstructs the signal from these coefficients.
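The W / W^T notation can be illustrated with a toy orthonormal transform. In this sketch W is a single-level Haar transform built as an explicit matrix (an assumption for illustration; real codes use fast transforms, e.g. PyWavelets, rather than dense matrices):

```python
import numpy as np

# For an orthonormal wavelet basis, w = Wx gives the coefficients and
# W^T w recovers x exactly, since W^T W = I.
n = 8
W = np.zeros((n, n))
for i in range(n // 2):
    # coarse (average) coefficients
    W[i, 2*i] = W[i, 2*i + 1] = 1 / np.sqrt(2)
    # detail (difference) coefficients
    W[n//2 + i, 2*i], W[n//2 + i, 2*i + 1] = 1/np.sqrt(2), -1/np.sqrt(2)

x = np.arange(n, dtype=float)
w = W @ x            # wavelet coefficients of x
x_rec = W.T @ w      # perfect reconstruction for orthonormal W
print(np.allclose(x_rec, x))  # True
```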

Their assumed model is

y = H x_orig + b,

where y and x_orig are vectors containing uniform samples of the measured and original signals, respectively; b represents the measurement error. H is a square (block) circulant matrix that approximates convolution with the PSF. The deconvolution problem is then to find an estimate that minimizes the cost function

J(x) = J_data(x) + λ J_reg(x).

They describe that "this functional can also be interpreted as a (negative) log-likelihood in a Bayesian statistical framework, and deconvolution can then be seen as a maximum a posteriori (MAP) estimation problem." This description of the cost function also applies to topics appearing frequently in regression and classification problems, such as ridge regression, quantile regression, the LASSO, LAR, model/variable selection, state-space models from time series, spatial statistics, etc.
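As a concrete sketch of this kind of minimization, here is the plain (unaccelerated) thresholded Landweber iteration that the paper's fast algorithm improves upon, on a synthetic 1-D problem. All the data, the Gaussian PSF, and the parameter values are illustrative assumptions; for brevity the thresholding is done directly on the signal rather than on genuine wavelet coefficients:

```python
import numpy as np

# Thresholded Landweber: x <- soft(x + tau * H^T (y - H x), tau * lam),
# which iteratively minimizes J(x) = ||y - Hx||^2 / 2 + lam * ||x||_1.
rng = np.random.default_rng(1)
n = 64
h = np.exp(-0.5 * (np.arange(n) - n//2)**2 / 4.0)   # Gaussian PSF
h /= h.sum()
H = np.array([np.roll(h, k - n//2) for k in range(n)])  # circulant blur

x_orig = np.zeros(n)
x_orig[[20, 40]] = 1.0                               # sparse original signal
y = H @ x_orig + 0.01 * rng.normal(size=n)           # y = H x_orig + b

lam = 0.005
tau = 1.0 / np.linalg.norm(H, 2)**2                  # step size <= 1/||H||^2
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
for _ in range(200):                                 # Landweber + threshold
    x = soft(x + tau * H.T @ (y - H @ x), tau * lam)
print(np.argsort(x)[-2:])                            # largest entries
```

The soft threshold is what the L1 penalty contributes; with λ = 0 the loop reduces to the classical Landweber iteration for least squares, i.e. unpenalized chi-square minimization.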

The observed image is the d-dimensional convolution of an original image (the characteristic function of the object of interest) with the impulse response (or PSF) of the imaging system.

The notion of regularization, or penalizing the likelihood, seems not well received among astronomers: my observation is that chi-square minimization (the simple least squares method) without a penalty is often suggested and used in astronomical data analysis. Since image analysis with wavelets is popular in astronomy, the fast algorithm for wavelet-regularized variational deconvolution introduced in this paper could bring faster results to astronomers, and could offer better insight into the underlying physical processes by separating noise and background in a model-based fashion rather than by simple background subtraction.
