Hans-Werner van Wyk

Department of Scientific Computing, Florida State

Variational Approximation of Uncertain Parameters

Collaborators: Jeff Borggaard (Virginia Tech)

In \cite{Borggaard2012,vanWyk2014} we propose and analyze a novel variational approach to the statistical identification of the spatially varying, uncertain diffusion coefficient \(q\) in the second-order elliptic PDE \begin{equation}\label{eqn:model} -\nabla\cdot(q\nabla u) = f \text{ in } D,\quad u = 0 \text{ on } \partial D, \end{equation} from statistical descriptions \(\hat u\) of the model output \(u\). Here \(D\) is a spatial domain and \((\Omega,\mathcal F,d\omega)\) is a complete probability space. We formulate the parameter identification problem as an infinite dimensional constrained optimization problem \begin{equation}\label{eqn:min_inf} \min_{(q,u)\in Q_{\mathrm{ad}}\times \mathscr{H}_0^1} \frac{1}{2}\|u-\hat u\|_{\mathscr{H}_0^1}^2 + \frac{\beta}{2}\|q\|_{\mathscr{H}}^2, \ \ \text{subject to } e(q,u) = 0, \end{equation} where \(Q_\mathrm{ad}\) is the set of admissible parameters, \(\mathscr{H}_0^1 = H_0^1(D) \otimes L^2(\Omega)\) and \(\mathscr{H}=H^2(D)\otimes L^2(\Omega)\) are appropriate stochastic Sobolev spaces, and \(e(q,u)=0\) is the equality constraint \eqref{eqn:model}, formulated as a stochastic partial differential equation. This non-linear optimization problem elegantly combines the spatial and statistical estimation of the unknown parameter \(q\). Although solutions to \eqref{eqn:min_inf} exist and satisfy a saddle point condition \cite{vanwyk2014gbe}, locating them directly by means of standard, gradient-based optimization strategies is problematic, owing to the lack of differentiability of \(e(q,u)\) as a function of \(q\). This difficulty is a direct consequence of the inherent lack of smoothness of the functions \(q\), \(u\), and \(\hat u\) in the stochastic component.

A spectral approximation of the uncertain observations (via a truncated Karhunen–Loève expansion) allows us to approximate \(\hat u(x,\omega)\approx\hat{u}^n(x,Y(\omega))\), where \(Y = (Y_1,\ldots,Y_n)\) is a random vector with density \(\rho:\Gamma \rightarrow \mathbb{R}\). We then approximate the infinite dimensional problem \eqref{eqn:min_inf} by a finite dimensional, deterministic optimization problem, the so-called 'finite noise' problem, in which we seek the unknown parameter \(q^n(x,Y(\omega))\) in the space of bounded mixed derivatives and which we solve by means of an efficient augmented Lagrangian algorithm. Posing the problem in this space not only guarantees the Fréchet differentiability of the 'finite noise' equality constraint \(e(q^n,u^n)=0\), but also allows for the use of numerical discretization schemes based on hierarchical finite elements and wavelets, approximations known for their effectiveness in mitigating the so-called 'curse of dimensionality'. Our main result in this regard links the 'finite noise' problem to the infinite dimensional problem \eqref{eqn:min_inf}, by using non-linear regularization theory and exploiting the fact that the random variables \(\{Y_i\}_{i=1}^\infty\) act as variables in the 'finite noise' problem and as an orthonormal set in the infinite dimensional problem \eqref{eqn:min_inf}. The result is summarized in the following theorem.
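The truncation step above can be sketched numerically: discretize the covariance operator of the observed field on a grid, keep the \(n\) dominant eigenpairs, and represent realizations through the random vector \(Y\). This is a generic discrete Karhunen–Loève computation under illustrative assumptions (uniform grid, exponential covariance kernel, Gaussian \(Y\)), not the specific discretization used in the cited work.

```python
import numpy as np

def kl_truncation(cov, x, n_terms):
    """Discrete Karhunen-Loeve truncation of a random field on grid x.

    Approximates the covariance operator by simple quadrature, keeps the
    n_terms largest eigenpairs, and returns eigenvalues together with
    eigenfunctions normalized in the (discrete) L^2 inner product.
    """
    h = x[1] - x[0]                           # uniform grid spacing
    C = cov(x[:, None], x[None, :]) * h       # quadrature-weighted covariance
    vals, vecs = np.linalg.eigh(C)
    idx = np.argsort(vals)[::-1][:n_terms]    # dominant modes first
    return vals[idx], vecs[:, idx] / np.sqrt(h)

# Illustrative field: exponential covariance, correlation length 0.5.
x = np.linspace(0.0, 1.0, 200)
lam, phi = kl_truncation(lambda s, t: np.exp(-np.abs(s - t) / 0.5), x, 5)

# One 'finite noise' realization: sum_i sqrt(lam_i) * phi_i(x) * Y_i.
Y = np.random.default_rng(0).standard_normal(5)
field = phi @ (np.sqrt(lam) * Y)
```

The rapid decay of the retained eigenvalues is what makes a small truncation dimension \(n\), and hence a tractable 'finite noise' problem, possible.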

Theorem
Let \(\|\hat u-\hat u^n\|_{\mathscr{H}_0^1}\rightarrow 0\) and \(\beta_n\rightarrow 0\) as \(n\rightarrow \infty\). Then the sequence of minimizers \(q_*^n\) of the finite noise problem has a subsequence that converges weakly to a minimizer of \eqref{eqn:min_inf}, and the limit of every weakly convergent subsequence is itself a minimizer of \eqref{eqn:min_inf}. The corresponding model outputs converge strongly to the model output of the infinite dimensional minimizer.

As for deterministic inverse problems, the identification of uncertain parameters is ill-posed, and regularization, whether through smoothing or through spectral and iterative methods, plays a significant role in this context. We are currently working on incorporating regularization more fully into the sampling procedure, through the use of sensitivity information \cite{borggaard2013suq}, trust-region methods, or multilevel sampling. In recent work \cite{vanwyk2014ius}, we propose a multilevel sampling scheme for parameter identification that uses diagnostic convergence indicators to inform adaptive mesh refinement in cases where parametric uncertainty is localized. The development of a variational calculus with respect to random parameters and the efficient evaluation of stochastic `gradients' is another direction in which we wish to expand. Finally, we are interested in incorporating sensor placement and experimental design into our analysis.