Thursday July 28/8:00
Regularization and Multidimensional Function Estimation
The "curse of dimensionality" means that many more data points are required to estimate functions in two or more variables than in one. To stabilize the function estimation, the fit is regularized, reducing the effective number of degrees of freedom and suppressing spurious oscillations. However, as regularization increases, the bias error increases. To minimize the total error, the bias error must be estimated self-consistently. The level of smoothing is adjusted to be small where the function varies rapidly. Similarly, total variation diminishing numerical schemes are used to minimize the oscillations in the estimated function. The speakers will discuss these techniques and their applications in image processing, machine learning, time-frequency analysis and growth curves.
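The regularization trade-off described above can be illustrated with a minimal sketch (not any speaker's specific method): a penalized least-squares smoother that minimizes ||y - f||² + λ||D₂f||², where D₂ is the discrete second-difference operator. Increasing λ lowers the effective degrees of freedom and damps oscillations, at the cost of bias. All names and parameter choices here are illustrative.

```python
import numpy as np

def smooth(y, lam):
    """Penalized least-squares smoother (a discrete smoothing spline):
    argmin_f ||y - f||^2 + lam * ||D2 f||^2.
    Larger lam -> fewer effective degrees of freedom, more bias."""
    n = len(y)
    # (n-2) x n second-difference matrix: rows give f_i - 2 f_{i+1} + f_{i+2}
    D2 = np.diff(np.eye(n), n=2, axis=0)
    A = np.eye(n) + lam * D2.T @ D2
    return np.linalg.solve(A, y)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(100)

f_hat = smooth(y, lam=10.0)

# roughness (sum of squared second differences) drops under smoothing
def roughness(f):
    return np.sum(np.diff(f, 2) ** 2)

assert roughness(f_hat) < roughness(y)
```

The bias side of the trade-off appears as λ → ∞: the estimate tends toward a straight line, regardless of the true function's curvature.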
Organizer: Kurt S. Riedel
Courant Institute of Mathematical Sciences,
New York University
- 8:00: Use of Regularization for the Estimation of Boundary Curves in Image Data.
Maijian Qian, California State University, Fullerton; and Finbarr O'Sullivan, University of Washington
- 8:30: Practical Algorithm for Multiscale Image Restoration via Nonlinear PDEs with Statistical Constraints on Level Sets and Regions.
Lenny Rudin, Cognitech, Inc.; and Stanley Osher, Cognitech, Inc., and University of California, Los Angeles
- 9:00: Optimal Data-based Kernel Estimation and Nonparametric Growth Curves.
Kurt S. Riedel, Organizer
- 9:30: Smoothing Spline Analysis of Variance of Data from Exponential Families.
Yuedong Wang and Grace Wahba, University of Wisconsin, Madison; Chong Gu, Purdue University, West Lafayette; Ronald Klein and Barbara Klein, University of Wisconsin