Finding directions of variability: A dimension reduction strategy for multivariate response surfaces with applications in uncertainty quantification
Tuesday, April 2, 2013, 11:00 am-12:00 pm
Abstract

As computational power increases, scientists and engineers
increasingly rely on simulations of complex models to test hypotheses
and inform analyses. Inputs to these models, such as boundary
conditions and material properties, are often underspecified due to a
lack of data, which leads to uncertainty in the model outputs.
Uncertainty quantification (UQ) attempts to endow simulation outputs
with measures of confidence given uncertainty in the model inputs.
Monte Carlo methods are the workhorse of UQ; model inputs are sampled
from a distribution, and the corresponding model outputs are treated
as a data set for statistical analyses. However, the slow convergence
of Monte Carlo approximations coupled with the time-intensive model
evaluations has led many to employ response surfaces; once the
response surface has been trained with a few expensive simulations,
its inexpensive predictions can be used in place of the full model.
Unfortunately, most response surfaces suffer from the curse of
dimensionality, i.e., the work required to create an accurate
approximation increases exponentially as the number of inputs
increases.
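
To make the workflow above concrete, here is a minimal Python sketch of the surrogate-based Monte Carlo idea: a handful of expensive model runs train a cheap approximation, whose predictions then drive the statistical analysis. The toy model, sample sizes, and quadratic surrogate are illustrative assumptions, not anything from the talk.

    # Minimal sketch of the surrogate-based Monte Carlo workflow described above.
    # The "expensive model" is a stand-in; in practice it is a costly simulation.
    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_model(x):
        """Placeholder for a costly simulation with a 5-dimensional input."""
        return np.exp(0.3 * x[0] + 0.7 * x[1]) + 0.1 * np.sum(x**2)

    d = 5                                   # number of uncertain inputs
    X_train = rng.uniform(-1, 1, (50, d))   # a few expensive training runs
    y_train = np.array([expensive_model(x) for x in X_train])

    # Fit a cheap surrogate (here: a quadratic polynomial via least squares).
    def features(X):
        return np.hstack([np.ones((len(X), 1)), X, X**2])

    coef, *_ = np.linalg.lstsq(features(X_train), y_train, rcond=None)

    # Monte Carlo on the surrogate: many cheap predictions instead of full runs.
    X_mc = rng.uniform(-1, 1, (100_000, d))
    y_mc = features(X_mc) @ coef
    print("estimated mean:", y_mc.mean(), "estimated std:", y_mc.std())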

Many practical models with high dimensional inputs vary primarily
along only a few directions in the space of inputs. I will describe a
method for detecting and exploiting these primary directions of
variability to construct a response surface on a low dimensional
linear subspace of the full input space; detection is accomplished
through analysis of the gradient of the model output with respect to
the inputs, and the subspace is defined by a projection. I will show
error bounds for the low dimensional approximation that motivate
computational heuristics for building a Kriging response surface on
the subspace. As a demonstration, I will apply the method to a
nonlinear heat transfer problem on a turbine blade, where a 250-parameter
model for the heat flux represents uncertain transition to
turbulence of the flow field. I will also discuss the range of
existing applications of the method and future research
challenges.
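
The following sketch illustrates the gradient-based detection described above on a toy problem: sampled gradients form the matrix C = E[∇f ∇fᵀ], its dominant eigenvectors define the low-dimensional subspace, and a Kriging (Gaussian process) surface is fit on the projected inputs. The toy model, sample counts, and choice of subspace dimension are assumptions for illustration, not the turbine-blade application.

    # Hedged sketch of the dimension-reduction idea: estimate dominant directions
    # of variability from sampled gradients, project the inputs onto that
    # subspace, and fit a Kriging (Gaussian process) surface there.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(1)
    d = 20                                   # nominal input dimension

    # Toy model that varies mostly along two hidden directions (columns of A).
    A = rng.standard_normal((d, 2))

    def f(x):
        return np.sin(A[:, 0] @ x) + 0.5 * (A[:, 1] @ x) ** 2

    def grad_f(x):
        return np.cos(A[:, 0] @ x) * A[:, 0] + (A[:, 1] @ x) * A[:, 1]

    # 1. Sample gradients and form C, the average outer product of gradients.
    X = rng.uniform(-1, 1, (200, d))
    G = np.array([grad_f(x) for x in X])
    C = G.T @ G / len(X)

    # 2. Eigendecompose C; a gap in the spectrum suggests the subspace dimension.
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    k = 2                                    # chosen from the eigenvalue gap
    W1 = eigvecs[:, order[:k]]               # basis for the low-dimensional subspace

    # 3. Project the inputs and fit a Kriging response surface on the subspace.
    y = np.array([f(x) for x in X])
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X @ W1, y)

    # Cheap predictions for new inputs use only their projected coordinates.
    X_new = rng.uniform(-1, 1, (5, d))
    print(gp.predict(X_new @ W1))
    print(np.array([f(x) for x in X_new]))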

Another challenge that arises in UQ is managing and analyzing the
large volumes of data (e.g., tens to hundreds of terabytes) generated
by many runs of a highly resolved physical simulation. In the second
part of the talk, I will discuss recent activities employing
Hadoop/MapReduce to compute the singular value decomposition of a
simulation data set whose elements depend on space, time, and
model parameters; the computed decomposition is used to build reduced
order models for the physical simulation.
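
As a rough illustration of the pattern (not the talk's Hadoop implementation), the single-machine sketch below mimics the map/reduce structure for the SVD of a tall-and-skinny matrix: each block of rows contributes a partial Gram matrix, the partial results are summed in a reduce step, and a small eigenproblem plus one more pass over the data recovers the factors. The matrix sizes here are placeholders.

    # Single-machine sketch of a MapReduce-style SVD for a tall-and-skinny matrix
    # (many space/time samples, comparatively few columns).
    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((10_000, 8))          # stand-in simulation data matrix
    blocks = np.array_split(A, 10, axis=0)        # rows distributed across workers

    # "Map": partial Gram matrices per block; "reduce": sum them.
    gram = sum(block.T @ block for block in blocks)

    # A small dense eigenproblem gives V and the squared singular values.
    eigvals, V = np.linalg.eigh(gram)
    order = np.argsort(eigvals)[::-1]
    sigma = np.sqrt(np.maximum(eigvals[order], 0.0))
    V = V[:, order]

    # A second pass over the blocks recovers U = A V diag(1/sigma).
    U = np.vstack([block @ V / sigma for block in blocks])

    # Checks: singular values match a direct SVD, and the factors reconstruct A.
    _, s_ref, _ = np.linalg.svd(A, full_matrices=False)
    print(np.allclose(sigma, s_ref))
    print(np.allclose(U * sigma @ V.T, A))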

Bio

Paul Constantine is a postdoctoral researcher in Stanford's Center for
Turbulence Research. He received his Ph.D. in Computational and
Mathematical Engineering from Stanford in 2009 and spent the following
two years as the von Neumann Research Fellow at Sandia National
Laboratories' Computer Science Research Institute. His research
interests include dimension reduction and reduced order modeling
techniques for complex, high performance simulations with applications
in uncertainty quantification. He is also actively exploring the
MapReduce programming framework as a potential tool for managing and
analyzing large amounts of data generated by an ensemble of highly
resolved simulations.

This talk is organized by Howard Elman