Quant with Vahab
Quant Systems Lab · Control Systems for Quantitative Finance

Model Calibration and Validation

Calibration fits model parameters to market data; validation checks whether the model is fit for purpose.

Explanation

Calibration chooses parameters to minimise mispricing against a set of liquid instruments, subject to constraints.
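A minimal sketch of that idea, using hypothetical smile quotes and a quadratic model as a stand-in for a real pricing model (the strikes, vols, and parametrisation below are illustrative, not market data):

```python
import numpy as np

# Hypothetical market quotes: moneyness K/F and implied vols at liquid strikes.
moneyness = np.array([0.80, 0.90, 1.00, 1.10, 1.20])
market_vols = np.array([0.28, 0.23, 0.20, 0.21, 0.24])

# Calibrate a quadratic smile sigma(m) = a*(m-1)^2 + b*(m-1) + c by
# least squares against the quotes (the "minimise mispricing" step).
x = moneyness - 1.0
design = np.column_stack([x**2, x, np.ones_like(x)])
params, *_ = np.linalg.lstsq(design, market_vols, rcond=None)

fitted = design @ params
rmse = np.sqrt(np.mean((fitted - market_vols) ** 2))
print(f"a, b, c = {params}, RMSE = {rmse:.4f}")
```

Constraints (e.g. convexity of the smile, bounds on parameters) would turn this into a constrained optimisation rather than plain least squares.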

Good calibration tracks quality metrics such as error distributions, stability over time, and parameter plausibility.
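One of those metrics, parameter stability, can be checked by re-calibrating on consecutive days and measuring drift. A sketch with hypothetical quotes, where day two is a small parallel shift of day one:

```python
import numpy as np

# Hypothetical quotes on two consecutive days; a stable calibration should
# move its parameters only slightly when the market moves slightly.
m = np.array([0.80, 0.90, 1.00, 1.10, 1.20])
day1 = np.array([0.28, 0.23, 0.20, 0.21, 0.24])
day2 = day1 + 0.002  # small parallel shift of the smile

def calibrate(vols):
    # Quadratic smile in (m - 1); returns coefficients (a, b, c).
    return np.polyfit(m - 1.0, vols, 2)

p1, p2 = calibrate(day1), calibrate(day2)
drift = np.abs(p2 - p1)
print("parameter drift:", drift)
```

Here a parallel shift moves only the level parameter c, which is plausible; large jumps in a or b for a small market move would be a stability red flag.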

Validation assesses whether the model is structurally appropriate for its use-case, beyond just calibration fit.
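Structural checks of this kind can run on the fitted model over a dense grid rather than only at the quoted strikes. A sketch, assuming hypothetical fitted quadratic parameters:

```python
import numpy as np

# Hypothetical fitted quadratic smile parameters (a, b, c) from a calibration.
a, b, c = 1.43, -0.10, 0.203

grid = np.linspace(0.7, 1.3, 121)
vols = a * (grid - 1.0) ** 2 + b * (grid - 1.0) + c

# Checks that go beyond calibration error:
positive = bool(np.all(vols > 0))            # implied vols must stay positive
convex = a > 0                               # smile convexity (plausibility)
wings_ok = vols[0] < 1.0 and vols[-1] < 1.0  # wings not exploding
print(positive, convex, wings_ok)
```

A production validation suite would add use-case-specific tests (e.g. no-arbitrage conditions, behaviour under stressed inputs), but the pattern is the same: assertions about structure, not just fit.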

In a central library, calibration and validation produce versioned artefacts and documentation, not just numbers.
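A minimal sketch of such an artefact, with hypothetical fields and a content hash as the version identifier:

```python
from dataclasses import dataclass, asdict
import hashlib
import json

# A minimal calibration artefact: parameters plus the metrics and rationale
# that validation needs, identified by a deterministic content hash.
@dataclass(frozen=True)
class CalibrationArtifact:
    model: str
    params: tuple
    rmse_cal: float
    rmse_val: float
    rationale: str

    def version_id(self) -> str:
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()[:12]

art = CalibrationArtifact(
    model="poly-smile-deg2",
    params=(1.43, -0.10, 0.203),
    rmse_cal=0.0062,
    rmse_val=0.0141,
    rationale="quadratic chosen: val/cal RMSE ratio acceptable, wings plausible",
)
print(art.version_id())
```

Hashing the full artefact means any change to parameters, metrics, or rationale produces a new version, which makes results reproducible and auditable.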


Tags: calibration · validation · model risk · parameters
Interactive visualisation

This chart shows a volatility smile: market implied vols as points, with a polynomial model calibrated to a subset of them. Calibration error is measured on the calibration set, validation error on the held-out points. Use the controls to change model flexibility and noise and see how underfit, good fit, and overfit behave.

[Chart: implied volatility vs moneyness K/F (0.80–1.20), showing the fitted model smile, the underlying true shape (hidden in practice), calibration quotes, and validation quotes.]
Numbers
Degree: 2
RMSE (calibration) ≈ 0.0062 · RMSE (validation) ≈ 0.0141
Val / Cal RMSE ratio ≈ 2.27
Interpretation

Calibration is about matching the data you chose. Validation asks whether, given noise and structural limits, the model stays sensible on data you did not directly fit.

Underfit models leave systematic structure in the residuals. Overfit models drive calibration error down but see validation error explode as they chase noise. In a library, calibration and validation should produce a versioned artefact with both error metrics and rationale, not just a parameter vector.
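The underfit / good fit / overfit pattern can be reproduced in a few lines. This sketch fits polynomials of increasing degree to noisy synthetic quotes (the smile shape, noise level, and strike grid are illustrative) and scores both the calibration and held-out strikes:

```python
import numpy as np

# Synthetic smile: a known true shape plus quote noise.
rng = np.random.default_rng(1)
m = np.linspace(0.80, 1.20, 9)
true = 0.20 + 1.4 * (m - 1.0) ** 2            # true shape (hidden in practice)
quotes = true + rng.normal(0.0, 0.004, m.size)

# Alternate strikes: even-indexed for calibration, odd-indexed for validation.
cal, val = slice(0, None, 2), slice(1, None, 2)

results = {}
for deg in (1, 2, 4):
    coeffs = np.polyfit(m[cal] - 1.0, quotes[cal], deg)
    fit = np.polyval(coeffs, m - 1.0)
    rc = np.sqrt(np.mean((fit[cal] - quotes[cal]) ** 2))
    rv = np.sqrt(np.mean((fit[val] - quotes[val]) ** 2))
    results[deg] = (rc, rv)
    print(f"deg={deg}: RMSE(cal)={rc:.4f}  RMSE(val)={rv:.4f}")
```

Degree 1 underfits (large errors on both sets); degree 4 interpolates the five calibration points exactly, driving calibration RMSE to zero while chasing noise, so only the validation RMSE reveals the problem.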