Shaw Prize in Mathematical Sciences 2013 — David L Donoho

The Shaw Prize was established in 2002 by the well-known Hong Kong media mogul and philanthropist Sir Run Run Shaw, now 106 years old. Often considered the Asian counterpart of the Nobel Prize, it was first awarded in 2004 and honours individuals who have made “significant breakthroughs in academic and scientific research or application and whose work has resulted in a positive and profound impact on mankind”. It is awarded for research work in astronomy, life science and medicine, and mathematical sciences. Each prize carries a monetary reward of one million US dollars.

The Shaw Prize in Mathematical Sciences 2013 has been awarded to David L Donoho of Stanford University for “his profound contributions to modern mathematical statistics and in particular the development of optimal algorithms for statistical estimation in the presence of noise and of efficient techniques for sparse representation and recovery in large data-sets.”

David Donoho is the Anne T and Robert M Bass Professor of the Humanities and Sciences, and Professor of Statistics, at Stanford University, USA. He obtained his BS from Princeton University in 1978 and his PhD from Harvard University in 1983; his PhD advisor was the well-known Swiss statistician Peter Jost Huber. Donoho was on the faculty of the University of California, Berkeley from 1984 to 1990 before moving to Stanford University. He is a former Presidential Young Investigator, a Fellow of the American Academy of Arts and Sciences, a SIAM Fellow, a foreign associate of the French Academy of Sciences, and a member of the US National Academy of Sciences. In 1991 he was awarded a MacArthur Fellowship (popularly called the “Genius Grant”).

Donoho is well known for his pioneering work in statistical theory and its wide range of applications to practical problems. One of his primary interests is robust statistics, and he has devised strategies for detecting errors in databases containing many dissimilar types of data. In his recent research on wavelets, he has developed a suite of interactive computer modules for exploring their properties; in particular, he has developed an application of the wavelet transform to noise reduction.

Donoho’s research is prolific and much of it is collaborative, producing some of the most cited papers in the literature of the mathematical sciences. In a survey of high-impact papers of the 1990s conducted in 2000 by ISI (the Institute for Scientific Information), he ranked among the top five authors in mathematics, with 26 papers cited a total of 1,146 times at that point.

Among his most influential work is joint research with Iain Johnstone, Dominique Picard and Gerard Kerkyacharian on the use of wavelet transforms in noise reduction. For example, the idea of Donoho–Johnstone thresholding is fundamental to the method of wavelet shrinkage used to remove noise from signals, images and all kinds of data. Coupled with the fast wavelet transform (invented in the 1980s), this collaborative research has led to spectacular results and benefits in many scientific fields, in particular in MRI (magnetic resonance imaging) used in medical radiology for body scans. Its impact is already being felt in many facets of modern society, wherever there is a need to process information and data: weather forecasts, traffic forecasts, traffic cameras, tweets, wireless communications, internet searches and many other aspects of modern life.
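As a rough illustration of this approach, the following minimal sketch applies wavelet shrinkage with soft thresholding at the Donoho–Johnstone “universal” threshold to a noisy one-dimensional signal. It uses the PyWavelets library; the choice of wavelet, decomposition level and the median-based noise estimate are common textbook defaults chosen for the demonstration, not a reproduction of Donoho’s own code.

```python
import numpy as np
import pywt  # PyWavelets


def denoise(signal, wavelet="db4", level=4):
    # Decompose the noisy signal into wavelet coefficients.
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Estimate the noise level from the finest-scale detail coefficients
    # (median absolute deviation rule).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    # Donoho-Johnstone "universal" threshold: sigma * sqrt(2 log n).
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    # Soft-threshold every detail band; keep the coarse approximation.
    shrunk = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                            for c in coeffs[1:]]
    # Reconstruct the denoised signal from the shrunken coefficients.
    return pywt.waverec(shrunk, wavelet)


# Example: a sine wave corrupted by Gaussian noise.
t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)
clean = denoise(noisy)
```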

A recurrent theme in Donoho’s work is “sparse approximation” and what might be considered its mathematical offspring, “compressed sensing”. In sparse approximation, a mathematical model is formulated to cope with a large amount of observed data and to extract the basic ingredients or “atoms” that build up the data, under the assumption that only a few atoms are required to specify each piece of data. Mathematically, it is concerned with the estimation of a sparse multi-dimensional vector satisfying a linear system of equations, given high-dimensional observed data and a “design” matrix. A 2009 joint paper with A M Bruckstein and M Elad on sparse approximation is one of the most cited papers in mathematics. In 2006, Donoho, and independently E Candes and T Tao, developed ideas that led to the topic of compressed sensing, which provides the methodology for “getting more with less” from collected data, leading to dramatic results in applications: reductions in measurement time, sampling rates and use of analogue-to-digital converter resources, among others. The possibilities for application and benefits to quality of life appear to be tremendous.

On a philosophical and ethical level, Donoho has been a strong advocate of reproducible computational research. The computational part of his research is guided by a principle which, he says, was instilled in him by the Stanford earth scientist Jon Claerbout in the early 1990s. In his informal talks, interviews and essays, Donoho has repeatedly appealed to the scientific community to make its computational results transparent by making the algorithms implemented, and even the software used, freely available to fellow scientists. Altruism aside, he notes that this guiding principle has also brought him the personal benefit of consistently high citation counts for his research papers.
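As a rough, hypothetical illustration of the sparse recovery problem described above (not Donoho’s own method or code), the sketch below generates a sparse vector x, takes far fewer random linear measurements y = Ax than there are unknowns, and recovers x by L1-regularised least squares (the Lasso), a convex-relaxation approach closely related to the basis pursuit used in compressed sensing. The matrix sizes, sparsity level and regularisation parameter are arbitrary choices made only for the demonstration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

n, m, k = 200, 80, 8                  # unknowns, measurements, non-zeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random "design"/sensing matrix
y = A @ x_true                                 # m measurements, with m << n

# L1-regularised least squares promotes sparsity and can recover x
# from many fewer measurements than unknowns.
lasso = Lasso(alpha=1e-3, max_iter=50_000, fit_intercept=False)
lasso.fit(A, y)
x_hat = lasso.coef_

print("relative recovery error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```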

Y K Leong

[This article originally appeared in the Asia Pacific Mathematics Newsletter, Volume 3 No. 3 (July 2013), published by World Scientific. It has been republished here with special permission from World Scientific.]
