Get Started with FePEST

Besides the standard task of model calibration (often referred to as parameter estimation), FePEST provides a wide range of possibilities for exploring best- and worst-case scenarios, quantifying uncertainty, and more. This section is intended to give you a road map of the different workflows in FePEST.

How to Start: Some Concepts

  • Estimation
  • Prediction
  • Monte Carlo
  • Pareto
  • Iterative Ensemble Smoother

    Estimation

    Estimation mode is applied for model calibration, also known as history matching. FePEST combines several tools and options of PEST's estimation and regularization modes to get the most out of the PEST suite.

    KEYWORDS: Observation Definitions, Parameter Definitions, Prior Information, Tikhonov regularization, Subspace regularization, Super Parameters
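
    The quantity minimized in Estimation mode can be illustrated with a small sketch. The code below is a minimal, hypothetical Python example (not FePEST or PEST code) of a weighted least-squares measurement objective function combined with a Tikhonov regularization term that penalizes departures from preferred parameter values; all variable names, weights and the regularization weight factor mu are assumptions made for illustration only.

        import numpy as np

        def regularized_objective(simulated, observed, obs_weights,
                                  params, preferred_params, reg_weights, mu):
            """Sketch of a regularized objective: measurement term + mu * Tikhonov term."""
            residuals = obs_weights * (observed - simulated)
            phi_measurement = float(residuals @ residuals)      # sum of squared weighted residuals
            reg_residuals = reg_weights * (preferred_params - params)
            phi_regularization = float(reg_residuals @ reg_residuals)
            return phi_measurement + mu * phi_regularization

        # toy usage with made-up numbers
        sim = np.array([10.2, 9.8, 11.1])    # simulated heads
        obs = np.array([10.0, 10.0, 11.0])   # observed heads
        w   = np.array([1.0, 1.0, 2.0])      # observation weights
        p   = np.log10([1e-4, 5e-5])         # log-transformed conductivities
        p0  = np.log10([1e-4, 1e-4])         # preferred (prior) values
        rw  = np.array([1.0, 1.0])           # regularization weights
        print(regularized_objective(sim, obs, w, p, p0, rw, mu=0.1))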


    Prediction

    When FePEST runs in Prediction mode, the task is to identify best- or worst-case scenarios. What counts as best or worst depends strictly on the conceptual model. A worst-case scenario analysis could, for example, be the parameter estimation that drives the concentration at a specific river section, or the pressure on the walls of an open-cast mine, to its maximum.

    KEYWORDS: Predictive Analysis
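
    The idea behind a predictive analysis can be sketched as a constrained optimization: push a prediction to its extreme while keeping the measurement objective function close to its calibrated minimum. The following is a minimal, hypothetical sketch using SciPy on a toy linear model; the acceptable threshold phi_max and all model details are assumptions for illustration and do not reproduce the actual PEST algorithm.

        import numpy as np
        from scipy.optimize import minimize

        # toy linear "model": two parameters map to three observations and one prediction
        J = np.array([[1.0, 0.5], [0.3, 1.2], [0.8, 0.8]])   # observation sensitivities
        obs = np.array([1.4, 1.6, 1.7])                      # field observations
        g = np.array([2.0, -1.0])                            # prediction sensitivities

        def phi(p):                                          # measurement objective function
            r = J @ p - obs
            return float(r @ r)

        p_best = np.linalg.lstsq(J, obs, rcond=None)[0]      # best-fit parameters
        phi_max = phi(p_best) + 0.1                          # acceptable degradation (assumed)

        # worst case: maximize the prediction g @ p subject to phi(p) <= phi_max
        res = minimize(lambda p: -(g @ p), x0=p_best, method="SLSQP",
                       constraints=[{"type": "ineq", "fun": lambda p: phi_max - phi(p)}])
        print("worst-case prediction:", g @ res.x)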


    Monte Carlo

    FePEST can run the entire Monte Carlo workflow with several levels of flexibility. The analysis can either be carried out with a model that is not necessarily calibrated (a so-called Pre-Calibration Monte Carlo Analysis), or, if the model has already been calibrated, a Post-Calibration Monte Carlo Analysis can take full advantage of the knowledge gained during regularization (e.g. a reduced number of problem dimensions).

    KEYWORDS: Stochastic Parameter Generation, Null-Space Projection
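
    A Pre-Calibration Monte Carlo Analysis essentially samples parameter sets from their prior ranges, runs the model for each set and summarizes the spread of the resulting predictions. The sketch below illustrates that loop with a made-up forward model and assumed log-normal priors; none of the names refer to actual FePEST functions.

        import numpy as np

        rng = np.random.default_rng(seed=42)

        def forward_model(conductivity, recharge):
            """Stand-in for a FEFLOW run: returns a single prediction (e.g. drawdown)."""
            return recharge / conductivity * 1e-3

        n_realizations = 500
        # log-normal priors for conductivity [m/s] and recharge [m/d] (assumed ranges)
        conductivity = 10 ** rng.normal(loc=-4.0, scale=0.5, size=n_realizations)
        recharge     = 10 ** rng.normal(loc=-3.5, scale=0.2, size=n_realizations)

        predictions = np.array([forward_model(k, r) for k, r in zip(conductivity, recharge)])
        print("5th / 50th / 95th percentile of prediction:",
              np.percentile(predictions, [5, 50, 95]))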


    Pareto

    Sometimes a specific observation set (or group) strongly influences the estimation (i.e. calibration) of the parameter sets. For these cases, FePEST provides the newly implemented Pareto operation mode. It allows you to understand the contribution of a specific observation group to the total measurement objective function, e.g. how the calibrated parameters would change if this group were given more or less weight in the calibration.

    KEYWORDS: Pareto, Observation Definitions
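
    Conceptually, the Pareto mode traces the trade-off between fitting one observation group and fitting the remaining observations by re-solving the estimation problem for a series of weight factors. The toy sketch below does this for a small linear problem; the weight sweep and the linear model are assumptions for illustration, not the PEST implementation.

        import numpy as np

        # toy linear model: 2 parameters, observation group A and group B
        J_a = np.array([[1.0, 0.2], [0.4, 1.0]]);  obs_a = np.array([1.1, 1.3])
        J_b = np.array([[0.9, 0.9], [0.1, 1.5]]);  obs_b = np.array([2.1, 1.2])

        def phi(J, p, obs):
            r = J @ p - obs
            return float(r @ r)

        # sweep the weight on group A and record both objective-function components
        for w in [0.1, 0.5, 1.0, 2.0, 10.0]:
            J_stack   = np.vstack([w * J_a, J_b])
            obs_stack = np.concatenate([w * obs_a, obs_b])
            p, *_ = np.linalg.lstsq(J_stack, obs_stack, rcond=None)
            print(f"weight {w:5.1f}: phi_A = {phi(J_a, p, obs_a):.4f}, "
                  f"phi_B = {phi(J_b, p, obs_b):.4f}")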


    Iterative Ensemble Smoother

    The Iterative Ensemble Smoother (IES) is a recent addition to the PEST++ software. This operation mode works significantly differently from the classic gradient-based Gauss-Levenberg-Marquardt (GLM) search method used in PEST / PEST++-GLM. Instead of a single model, a specified number (e.g. 100) of different models is created by randomizing the parameter values according to the specified prior uncertainty range. During each iteration, all of these ensemble members are then calibrated simultaneously, minimizing the measurement objective function while keeping each member as close as possible to its individual (random) initial parameter values.

    The first major benefit of this method is that the resulting calibrated ensembles inherently contain information about the parameter uncertainty in a Bayesian sense. The second major benefit is that the gradient information (Jacobian matrix) can be estimated from the correlated behavior of the ensemble members instead of by numerical differentiation as in traditional GLM. This makes the process much more robust against model instabilities and reduces the numerical effort significantly (by up to two orders of magnitude).

    KEYWORDS: Ensemble Smoother, Observation Definitions
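
    The core of an ensemble-smoother update can be sketched in a few lines: the cross-covariance between the ensemble parameters and the ensemble-simulated observations takes the place of a numerically differentiated Jacobian. The following is a strongly simplified, hypothetical single update step on a toy linear model; it omits the iteration, localization and inflation details of the actual PEST++ IES implementation.

        import numpy as np

        rng = np.random.default_rng(7)

        # toy linear "model" with a synthetic truth to recover
        J_true = np.array([[1.0, 0.5], [0.2, 1.3], [0.7, 0.7]])
        p_true = np.array([1.0, 2.0])
        obs = J_true @ p_true
        noise_std = 0.05

        def forward(p):
            """Stand-in for one FEFLOW forward run."""
            return J_true @ p

        # prior ensemble of parameter sets drawn from an assumed prior
        n_ens = 100
        P = 1.0 + rng.normal(scale=1.0, size=(2, n_ens))               # (n_par, n_ens)
        Y = np.column_stack([forward(P[:, j]) for j in range(n_ens)])  # (n_obs, n_ens)

        # empirical (cross-)covariances from the ensemble anomalies
        Pa = P - P.mean(axis=1, keepdims=True)
        Ya = Y - Y.mean(axis=1, keepdims=True)
        C_py = Pa @ Ya.T / (n_ens - 1)
        C_yy = Ya @ Ya.T / (n_ens - 1)
        R = noise_std ** 2 * np.eye(len(obs))          # observation noise covariance

        # Kalman-gain-like update applied to every member (perturbed observations)
        K = C_py @ np.linalg.inv(C_yy + R)
        D = obs[:, None] + rng.normal(scale=noise_std, size=Y.shape)
        P_post = P + K @ (D - Y)

        print("prior mean:    ", P.mean(axis=1))
        print("posterior mean:", P_post.mean(axis=1))
        print("true values:   ", p_true)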

How to Start: Standard Workflow

Out-of-the-box Workflows

Drawing sensitivity maps

  1. Standard workflow

  2. Run PEST in Estimation mode and compute the Jacobian matrix only (Termination criteria in the Problem Settings dialog).

  3. Use the Sensitivity Export utility (see the sketch after this list for what these sensitivities represent).
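
As a rough illustration of what such a sensitivity map is based on: for each parameter, a composite sensitivity can be computed as the magnitude of the corresponding column of the weighted Jacobian matrix, normalized by the number of observations. The sketch below computes this from a made-up Jacobian and weight vector; the parameter names are hypothetical and the snippet is not a reader for the actual FePEST export format.

    import numpy as np

    # hypothetical weighted Jacobian: rows = observations, columns = parameters
    jacobian = np.array([[0.8, 0.1, 0.0],
                         [0.5, 0.9, 0.2],
                         [0.1, 0.4, 1.1],
                         [0.0, 0.2, 0.7]])
    weights = np.array([1.0, 1.0, 2.0, 0.5])     # observation weights (assumed)
    n_obs = jacobian.shape[0]

    # composite sensitivity per parameter: norm of the weighted Jacobian column / n_obs
    weighted = weights[:, None] * jacobian
    composite_sensitivity = np.linalg.norm(weighted, axis=0) / n_obs
    for name, s in zip(["kxx_zone1", "kxx_zone2", "recharge"], composite_sensitivity):
        print(f"{name:10s}  {s:.4f}")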

Running a sensitivity analysis

  1. Standard workflow

  2. Use the Sensitivity Analysis (SENSAN) utility (see the sketch after this list for the basic idea of such a run).
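
The idea behind such a sensitivity run can be sketched as a simple one-at-a-time perturbation loop: each parameter is perturbed in turn, the model is rerun, and the change of the outputs is recorded. The code below is a generic, hypothetical sketch of that loop with a stand-in model, not an interface to the SENSAN utility itself.

    import numpy as np

    def run_model(params):
        """Stand-in for a single forward run; returns a few model outputs."""
        k, recharge, storage = params
        return np.array([recharge / k, storage * recharge, k * 1e4])

    base = np.array([1e-4, 2e-4, 1e-3])       # base parameter set (assumed values)
    base_out = run_model(base)
    rel_step = 0.01                           # 1 % relative perturbation

    for i, name in enumerate(["k", "recharge", "storage"]):
        perturbed = base.copy()
        perturbed[i] *= (1.0 + rel_step)
        delta_out = run_model(perturbed) - base_out
        # finite-difference sensitivity of each output to this parameter
        sens = delta_out / (base[i] * rel_step)
        print(f"{name:8s} sensitivities: {np.round(sens, 3)}")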

 

For a more detailed exploration of how the uncertainty of the parameter definitions propagates to the model predictions (observations), it is strongly recommended to use the Calibration-constrained Monte Carlo Analysis workflow.
