Making Predictions And Estimation Methods

Given that we have uncertainty in the observations, we are interested in constructing a probabilistic description $p_{\bar{\theta}}(\theta \mid y)$ of the parameters $\theta$, either as a distribution or as a set of samples. We implement three estimators for this task: the LSQEstimator, the LinearApproxEstimator, and the MCMCEstimator.

Each estimator constructs predictions either as a set of samples or as a distribution, via predictsamples and predictdist, respectively. Conversion between the two representations is handled automatically, by sampling from the distribution or by fitting a multivariate normal distribution to the samples.
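For illustration, a typical call based on the signatures documented below might look like the following sketch. The model, data, prior, the UncorrGaussianNoiseModel constructor, and the default estimator constructor are assumptions made for this example, not verified API beyond the signatures shown on this page:

```julia
using Distributions, LinearAlgebra
using ProbabilisticParameterEstimators

# Hypothetical setup: a linear model with two parameters.
f(x, θ) = θ[1] * x + θ[2]
xs = collect(0.0:0.5:5.0)
ys = 2.0 .* xs .- 1.0 .+ 0.1 .* randn(length(xs))

paramprior = MvNormal(zeros(2), I)   # prior / source of initial guesses
# Assumed noise-model constructor; any NoiseModel subtype would work here.
noise_model = UncorrGaussianNoiseModel([Normal(0, 0.1) for _ in xs])

est = LinearApproxEstimator()        # assumed default constructor
θdist = predictdist(est, f, xs, ys, paramprior, noise_model)
θsamples = predictsamples(est, f, xs, ys, paramprior, noise_model, 100)
```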

Estimator Overview

Making Predictions

ProbabilisticParameterEstimators.predictdist — Function
predictdist(::EstimationMethod, f, xs, ys, paramprior::Sampleable, noise_model::NoiseModel)

Solve the parameter estimation problem $y = f(x, \theta) + \varepsilon$ and directly return the distribution of $\theta$.

Depending on the method, the distribution is either generated directly, or constructed by fitting a multivariate normal distribution to generated samples. The paramprior is used either as a prior for Bayesian methods, or to sample initial guesses for iterative methods.

ProbabilisticParameterEstimators.predictsamples — Function
predictsamples(::EstimationMethod, f, xs, ys, paramprior::Sampleable, noise_model::NoiseModel, nsamples)

Solve the parameter estimation problem $y = f(x, \theta) + \varepsilon$ and return samples from the probability distribution of $\theta$.

Depending on the method, the samples are either generated directly, or sampled from the predicted distribution. The paramprior is used either as a prior for Bayesian methods, or to sample initial guesses for iterative methods.


Estimation Methods

ProbabilisticParameterEstimators.LSQEstimator — Type
struct LSQEstimator{ST<:Function, SAT<:NamedTuple} <: ProbabilisticParameterEstimators.EstimationMethod

The LSQEstimator works by sampling noise $\varepsilon^{(k)}$ from the noise model and repeatedly solving a least-squares parameter estimation problem for modified observations $y - \varepsilon^{(k)}$, i.e.

$\theta^{(k)} = \arg \min_\theta \sum_i w_i \left( (y_i - \varepsilon_i^{(k)}) - f(x_i, \theta) \right)^2$

for uncorrelated noise, where the weights $w_i$ are chosen as the inverse variances. For correlated noise, the weights are derived from the full covariance matrix. The paramprior is used to sample initial guesses for $\theta$.

Therefore predictsamples will solve nsamples optimization problems, returning one sample per solve. predictdist will do the same and then fit an MvNormal distribution to the samples.
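The resampling scheme can be illustrated with a stdlib-only sketch for a linear model, where each weighted least-squares solve has a closed form via the normal equations. This is an illustration of the idea, not the package's implementation (which handles nonlinear $f$ via NonlinearSolve):

```julia
using LinearAlgebra, Random, Statistics

Random.seed!(0)

# Hypothetical linear model f(x, θ) = θ[1]*x + θ[2].
xs = collect(0.0:0.5:5.0)
θ_true = [2.0, -1.0]
σs = fill(0.1, length(xs))                  # per-observation noise std. dev.
ys = θ_true[1] .* xs .+ θ_true[2] .+ σs .* randn(length(xs))

A = [xs ones(length(xs))]                   # design matrix
W = Diagonal(1 ./ σs .^ 2)                  # weights w_i = inverse variances

# Resampling scheme: draw ε⁽ᵏ⁾ from the noise model and re-solve the weighted
# least-squares problem for the modified observations y - ε⁽ᵏ⁾.
nsamples = 500
samples = map(1:nsamples) do _
    ε = σs .* randn(length(xs))
    (A' * W * A) \ (A' * W * (ys .- ε))     # closed-form weighted LSQ solve
end

θ_mean = mean(samples)                      # elementwise mean over samples
```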

Fields

  • solvealg::Function: Function that creates the solver algorithm; it will be called with the autodiff method fixed.

  • solveargs::NamedTuple: kwargs passed to NonlinearSolve.solve. Defaults to (; reltol=1e-3).

ProbabilisticParameterEstimators.LinearApproxEstimator — Type
struct LinearApproxEstimator{ST<:Function, SAT<:NamedTuple} <: ProbabilisticParameterEstimators.EstimationMethod

The LinearApproxEstimator solves an optimization problem similar to that of LSQEstimator, but solves it only once, and then constructs a multivariate normal distribution centered at the solution. The covariance is constructed by computing the Jacobian of $f(x, \theta)$ and (roughly) multiplying it with the observation uncertainty; see also the Wikipedia article on non-linear least squares. The paramprior is used to sample initial guesses for $\theta$.

Because the distribution is constructed directly, predictdist solves only one optimization problem and computes one Jacobian, directly yielding an MvNormal; this makes it very efficient. predictsamples simply samples from this distribution, which is also very fast.
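For a linear model the covariance construction reduces to the standard uncertainty-propagation result for weighted least squares, which the following stdlib-only sketch computes; for nonlinear $f$ the Jacobian would instead be evaluated at the optimum:

```julia
using LinearAlgebra

# For the linear model f(x, θ) = θ[1]*x + θ[2], the Jacobian of the stacked
# predictions with respect to θ is constant, so the linearization is exact.
xs = collect(0.0:0.5:5.0)
J = [xs ones(length(xs))]                   # ∂f/∂θ, one row per observation
Σy = Diagonal(fill(0.1^2, length(xs)))      # observation covariance (uncorrelated)

# Standard result: the parameter covariance is (Jᵀ Σy⁻¹ J)⁻¹.
Σθ = inv(J' * inv(Σy) * J)
```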

Fields

  • solvealg::Function: Function that creates the solver algorithm; it will be called with the autodiff method fixed.

  • solveargs::NamedTuple: kwargs passed to NonlinearSolve.solve. Defaults to (; ).

ProbabilisticParameterEstimators.MCMCEstimator — Type
struct MCMCEstimator{ST<:Turing.InferenceAlgorithm, SAT<:NamedTuple} <: EstimationMethod

The MCMCEstimator simply phrases the problem as a Markov chain Monte Carlo (MCMC) inference problem, which we solve using Turing.jl. Therefore, predictions will always first create samples (after discarding a number of warmup steps), to which a distribution may then be fitted.

See also predictdist and predictsamples.
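The underlying idea can be sketched as a minimal Turing.jl model, assuming uncorrelated Gaussian noise with known standard deviation; the model name, prior, and data below are illustrative assumptions, not the package's internal model:

```julia
using Turing, LinearAlgebra

# Illustrative model for y = f(x, θ) + ε with uncorrelated Gaussian noise.
@model function estimation_model(f, xs, ys, σ)
    θ ~ MvNormal(zeros(2), I)               # parameter prior (illustrative)
    for i in eachindex(xs)
        ys[i] ~ Normal(f(xs[i], θ), σ)      # likelihood of each observation
    end
end

f(x, θ) = θ[1] * x + θ[2]
xs = collect(0.0:0.5:5.0)
ys = 2.0 .* xs .- 1.0

# NUTS sampling, matching the estimator's default samplealg.
chain = sample(estimation_model(f, xs, ys, 0.1), NUTS(), 200; progress=false)
```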

Fields

  • samplealg::Turing.Inference.InferenceAlgorithm: Inference algorithm type for MCMC sampling. Defaults to NUTS.

  • sampleargs::NamedTuple: kwargs passed to Turing.sample. Defaults to (; drop_warmup=true, progress=false, verbose=false).
