



Jodi Mead, Boise State University

Most inverse problems are ill-posed because inputs such as parameters, physics, and data are missing or inconsistent. This results in solution estimates that are non-unique or unstable, i.e., small changes in the inputs produce large changes in the estimates. One common approach to resolving ill-posedness is to use regularization methods, whereby information is added to the problem so that the data are not overfitted. Alternatively, one could take the Bayesian point of view, assign a probability distribution to the unknowns, and estimate it using Monte Carlo techniques.
In this work we take the regularization approach and use uncertainties to weight the added information and the data in an optimization problem. This allows us to apply statistical tests with the null hypothesis that the inputs are combined within their uncertainty ranges to produce estimates of the unknowns. For example, the Discrepancy Principle can be viewed as using a chi-squared test to determine the regularization parameter. The chi-squared method developed by my colleagues and me uses a chi-squared test similar to the Discrepancy Principle, but differs in that the test is applied to the regularized residual rather than the data residual. This approach leads to a general methodology of using statistical tests to estimate regularization parameters or uncertainties in an inversion. We will give statistical tests for nonlinear algorithms and show results from benchmark problems in geophysics. We will also describe how statistical tests can be used to find a regularization parameter for Total Variation and show results from imaging.
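To make the Discrepancy Principle concrete, the following is a minimal, illustrative sketch (not the speaker's method) of choosing a Tikhonov regularization parameter so that the data residual matches its expected noise level: for m data points with i.i.d. Gaussian noise of standard deviation sigma, one seeks lambda such that ||A x_lambda - b||^2 is approximately m sigma^2. The matrix A, the bisection search, and all variable names here are assumptions for the example.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    # Solve min_x ||A x - b||^2 + lam * ||x||^2 via the normal equations.
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def discrepancy_lambda(A, b, sigma, lam_lo=1e-10, lam_hi=1e10, iters=100):
    """Find lam with ||A x_lam - b||^2 ~= m * sigma^2, the expected
    squared norm of m i.i.d. N(0, sigma^2) noise terms (Discrepancy
    Principle). The residual grows monotonically with lam, so a
    bisection on log(lam) suffices for this illustration."""
    m = len(b)
    target = m * sigma**2

    def resid2(lam):
        r = A @ tikhonov_solve(A, b, lam) - b
        return r @ r

    lo, hi = np.log(lam_lo), np.log(lam_hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if resid2(np.exp(mid)) < target:
            lo = mid  # residual too small: solution overfits, increase lam
        else:
            hi = mid  # residual too large: solution oversmoothed, decrease lam
    return np.exp(0.5 * (lo + hi))

# Small synthetic test problem (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = rng.standard_normal(20)
sigma = 0.1
b = A @ x_true + sigma * rng.standard_normal(50)

lam = discrepancy_lambda(A, b, sigma)
r = A @ tikhonov_solve(A, b, lam) - b
print(lam, r @ r, 50 * sigma**2)  # residual should be close to m * sigma^2
```

The chi-squared method discussed in the talk instead tests the full regularized functional against a chi-squared distribution; the sketch above only illustrates the simpler data-residual version.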

Monday, 10 February 2014, 3:10 p.m. in Math 103; refreshments at 4:00 p.m. in Math Lounge 109

Spring 2014 Colloquia & Events, Mathematical Sciences, University of Montana
