Get Started with FePEST
Besides the standard task of model calibration (often called parameter estimation), FePEST provides a wide range of possibilities: exploring best- and worst-case scenarios, quantifying uncertainty, and more. This section maps out the different workflows available in FePEST.
How to Start: Some Concepts
 Estimation
 Prediction
 Monte Carlo
 Pareto
 Iterative Ensemble Smoother
Estimation
Estimation mode is used for model calibration, also called history matching. FePEST combines several tools and options of PEST's estimation and regularization modes to get the most out of the PEST suite.
KEYWORDS: Observation Definitions, Parameter Definitions, Prior Information, Tikhonov regularization, Subspace regularization, Super Parameters
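At the core of estimation mode is PEST's gradient-based Gauss-Levenberg-Marquardt (GLM) search. Purely as an illustration of the principle (not FePEST's or PEST's implementation), the sketch below applies a damped GLM upgrade step to a toy linear model; all names and values are hypothetical:

```python
import numpy as np

def glm_step(jac, residuals, lam):
    """One damped Gauss-Levenberg-Marquardt upgrade:
    solve (J^T J + lam I) dp = J^T r for the parameter change dp."""
    JtJ = jac.T @ jac
    rhs = jac.T @ residuals
    return np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), rhs)

# Toy linear "model": observations depend linearly on two parameters
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))          # sensitivities (the Jacobian here)
obs = X @ np.array([2.0, -1.0])       # synthetic data from "true" params

p = np.zeros(2)                       # initial parameter guess
for _ in range(5):
    r = obs - X @ p                   # residuals (observed - simulated)
    p = p + glm_step(X, r, lam=0.1)   # damped upgrade step

print(p)                              # close to the true values [2, -1]
```

In the real workflow the Jacobian is recomputed by model runs at each iteration and the damping parameter lambda is adjusted adaptively; the fixed `lam=0.1` here is a simplification.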
Prediction
When FePEST runs in Prediction mode, the task is to identify best- or worst-case scenarios. What counts as best or worst depends entirely on the conceptual model. A worst-case analysis could, for example, search for the parameter set that leads to the maximum concentration at a specific river section, the maximum pressure on open-cast mine walls, etc.
KEYWORDS: Predictive Analysis
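Conceptually, predictive analysis seeks the extreme value of a prediction among all parameter sets that still calibrate the model acceptably. The sketch below illustrates this idea with a toy linear model and `scipy.optimize`; the sensitivities, the data, and `phi_max` are all made-up values, not FePEST inputs:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical linear toy model: observations and a prediction
# are both linear in two parameters.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # obs sensitivities
obs = np.array([1.0, 2.0, 3.0])                       # field data
pred = np.array([1.0, -1.0])                          # prediction sensitivity

def phi(p):
    """Measurement objective function (sum of squared residuals)."""
    r = obs - X @ p
    return float(r @ r)

phi_max = 0.5          # highest objective value still considered calibrated

# Worst case: maximize the prediction while keeping phi(p) <= phi_max
res = minimize(lambda p: -(pred @ p), x0=np.array([1.0, 2.0]),
               constraints=[{"type": "ineq",
                             "fun": lambda p: phi_max - phi(p)}])
print("worst-case prediction:", pred @ res.x)
```

PEST's predictive analysis mode solves essentially this kind of constrained problem, but for a full numerical model rather than a closed-form function.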
Monte Carlo
FePEST can run the entire Monte Carlo workflow with several levels of flexibility. The analysis can be evaluated with a model that is not necessarily calibrated (a so-called Pre-Calibration Monte Carlo Analysis), or with an already calibrated model (a Post-Calibration Monte Carlo Analysis), in which case it can take full advantage of the knowledge gained during regularization (e.g. the reduced number of problem dimensions).
KEYWORDS: Stochastic Parameter Generation, Null-Space Projection
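The idea behind stochastic parameter generation can be shown in a few lines: draw parameter realizations from a prior distribution, run the model for each, and summarize the resulting predictive distribution. The sketch below stands in for a pre-calibration analysis, with a hypothetical one-parameter Darcy model in place of a FEFLOW run:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(k):
    """Hypothetical stand-in for a FEFLOW run: head drop across a
    1D column of conductivity k (Darcy, fixed flux q and length L)."""
    q, L = 1e-5, 100.0
    return q * L / k

# Pre-calibration Monte Carlo: sample log-conductivity from the prior
n = 1000
log_k = rng.normal(loc=-4.0, scale=0.5, size=n)   # prior: log10 K ~ N(-4, 0.5)
heads = np.array([model(10.0 ** lk) for lk in log_k])

# Summarize the predictive distribution
print("median head drop:", np.median(heads))
print("95% interval:", np.percentile(heads, [2.5, 97.5]))
```

In a post-calibration analysis, null-space projection restricts the random variation to parameter combinations that the calibration data cannot constrain, so every realization remains (approximately) calibrated.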
Pareto
Sometimes a specific observation set (or group) strongly influences the estimation (i.e. calibration) of parameter sets. For such cases FePEST provides the newly implemented Pareto operation mode. It lets you assess the contribution of a specific observation group to the total measurement objective function, e.g. how the calibrated parameters change when this group is weighted more or less heavily in the calibration.
KEYWORDS: Pareto, Observation Definitions
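A Pareto analysis can be pictured as sweeping the weight of one observation group and recording how the two group objective functions trade off against each other. The toy least-squares sketch below (all values hypothetical) traces such a front for a single parameter:

```python
import numpy as np

# Two observation groups that pull the single parameter p in
# different directions (hypothetical example).
X1, y1 = np.array([1.0, 1.0]), np.array([1.0, 1.2])   # group A prefers p ~ 1.1
X2, y2 = np.array([1.0]), np.array([2.0])             # group B prefers p ~ 2.0

front = []
for w in [0.0, 0.5, 1.0, 2.0]:        # Pareto sweep: weight on group B
    # Weighted least-squares solution of the combined objective
    p = (X1 @ y1 + w * X2 @ y2) / (X1 @ X1 + w * X2 @ X2)
    phi_a = np.sum((y1 - X1 * p) ** 2)
    phi_b = np.sum((y2 - X2 * p) ** 2)
    front.append((phi_a, phi_b))
    print(f"w={w:3.1f}  p={p:5.3f}  phi_A={phi_a:6.4f}  phi_B={phi_b:6.4f}")
```

As the weight grows, the fit to group B improves while the fit to group A degrades; plotting `phi_a` against `phi_b` yields the Pareto front that PEST's Pareto mode computes for the full model.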
Iterative Ensemble Smoother
The Iterative Ensemble Smoother (IES) is a recent addition to the PEST++ software. This operation mode works significantly differently from the classic gradient-based Gauss-Levenberg-Marquardt (GLM) search method used in PEST / PESTPP-GLM. Instead of a single model, a specified number (e.g., 100) of different models are created by randomizing the parameter values according to the specified prior uncertainty range. During each iteration, all these ensemble members are then calibrated simultaneously, minimizing the measurement objective function while staying as close as possible to the individual (random) initial parameter values. The first major benefit of this method is that the resulting calibrated ensembles inherently contain information about the parameter uncertainty in a Bayesian sense. The second major benefit is that the gradient information (Jacobian matrix) can be estimated from the correlated behavior of the ensemble members instead of by numerical differentiation as in traditional GLM. This makes the process far more robust against model instabilities and reduces the numerical effort significantly (by up to two orders of magnitude).
KEYWORDS: Ensemble Smoother, Observation Definitions
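The ensemble-based estimation of gradient information described above can be sketched with a simple ensemble Kalman-style update. Note that this is a strongly simplified illustration, not the actual PEST++ IES algorithm; the forward model, noise level, and ensemble size are all made up:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(p):
    """Hypothetical linear forward model: two observations, two parameters."""
    return np.array([p[0] + p[1], p[0] - p[1]])

obs = np.array([3.0, 1.0])            # data (consistent with params [2, 1])
obs_err = 0.1                         # observation noise std. dev.

# Draw an ensemble of parameter sets from the prior
n_ens = 50
ens = rng.normal(loc=0.0, scale=1.0, size=(n_ens, 2))

for _ in range(4):                    # iterative smoother sweeps
    sims = np.array([model(p) for p in ens])
    # Ensemble cross-covariances replace a finite-difference Jacobian
    dp = ens - ens.mean(axis=0)
    ds = sims - sims.mean(axis=0)
    C_ps = dp.T @ ds / (n_ens - 1)    # parameter-output covariance
    C_ss = ds.T @ ds / (n_ens - 1)    # output-output covariance
    K = C_ps @ np.linalg.inv(C_ss + obs_err**2 * np.eye(2))  # gain matrix
    # Pull each member toward its own perturbed copy of the observations
    noisy = obs + rng.normal(scale=obs_err, size=(n_ens, 2))
    ens = ens + (noisy - sims) @ K.T

print("posterior mean:", ens.mean(axis=0))
```

Because no perturbation runs per parameter are needed, the cost per iteration is one model run per ensemble member, independent of the number of adjustable parameters; this is where the large savings for highly parameterized models come from.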
How to Start: Standard Workflow
Out-of-the-box Workflows
Drawing sensitivity maps

Run PEST in Estimation mode and compute the Jacobian matrix only (set the corresponding Termination criteria option in the Problem Settings dialog).
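Filling the Jacobian matrix amounts to perturbing each adjustable parameter in turn, rerunning the model, and dividing the change in each simulated observation by the perturbation. A minimal forward-difference sketch (with a hypothetical toy model in place of the FEFLOW run) looks like this:

```python
import numpy as np

def model(p):
    """Hypothetical forward model returning simulated observation values."""
    return np.array([p[0] ** 2 + p[1], np.sin(p[0]) * p[1]])

def jacobian(model, p, rel=0.01):
    """Forward-difference Jacobian: one extra model run per parameter,
    which is essentially how PEST fills its sensitivity matrix."""
    base = model(p)
    J = np.zeros((base.size, p.size))
    for j in range(p.size):
        dp = rel * max(abs(p[j]), 1e-8)   # relative perturbation increment
        pert = p.copy()
        pert[j] += dp
        J[:, j] = (model(pert) - base) / dp
    return J

p = np.array([1.0, 2.0])
J = jacobian(model, p)
# Composite sensitivity of each parameter (column norm), the kind of
# quantity visualized in a sensitivity map
print(np.linalg.norm(J, axis=0))
```

In FePEST the resulting sensitivities can then be displayed spatially over the model domain as sensitivity maps.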
Running a sensitivity analysis
For a more detailed exploration of how the uncertainty of parameter definitions propagates to the model predictions (observations), it is strongly recommended to use the Calibration-constrained Monte Carlo Analysis workflow.