Merged
2 changes: 2 additions & 0 deletions _quarto.yml
@@ -79,6 +79,7 @@ website:
- usage/custom-distribution/index.qmd
- usage/probability-interface/index.qmd
- usage/modifying-logprob/index.qmd
- usage/external-likelihoods/index.qmd
- usage/tracking-extra-quantities/index.qmd
- usage/predictive-distributions/index.qmd
- usage/mode-estimation/index.qmd
@@ -208,6 +209,7 @@ usage-automatic-differentiation: usage/automatic-differentiation
usage-custom-distribution: usage/custom-distribution
usage-dynamichmc: usage/dynamichmc
usage-external-samplers: usage/external-samplers
usage-external-likelihoods: usage/external-likelihoods
usage-mode-estimation: usage/mode-estimation
usage-modifying-logprob: usage/modifying-logprob
usage-performance-tips: usage/performance-tips
56 changes: 56 additions & 0 deletions usage/external-likelihoods/index.qmd
@@ -0,0 +1,56 @@
---
title: External Likelihoods
engine: julia
---

```{julia}
#| echo: false
#| output: false
using Pkg;
Pkg.instantiate();
```

Sometimes a model's likelihood is not expressed directly as a distribution over observed data, but is instead computed by an external algorithm.
A common example is **state-space models**, where a filtering algorithm (e.g. a Kalman filter or a particle filter) marginalises out the latent states and returns the marginal log-likelihood of the observations given the model parameters.
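To make the filtering step concrete, the sketch below implements what such a filter computes in the simplest possible case: a scalar local-level model, where the Kalman recursions reduce to a few lines. This is an illustrative implementation for exposition only, not the API of `SSMProblems.jl` or `GeneralisedFilters.jl`; the function name and keyword arguments are made up for this example.

```julia
# Scalar Kalman filter for the local-level model
#   x_t = x_{t-1} + w_t,  w_t ~ Normal(0, q)
#   y_t = x_t + v_t,      v_t ~ Normal(0, r)
# Returns the marginal log-likelihood of y, with the latent
# states x marginalised out by the filtering recursions.
function kalman_loglik(y, q, r; m0=0.0, P0=1.0)
    m, P = m0, P0
    ll = 0.0
    for yt in y
        P += q                                      # predict: variance grows by q
        S = P + r                                   # innovation variance
        ll += -0.5 * (log(2π * S) + (yt - m)^2 / S) # accumulate log N(yt; m, S)
        K = P / S                                   # Kalman gain
        m += K * (yt - m)                           # update filtered mean
        P *= 1 - K                                  # update filtered variance
    end
    return ll
end
```

The returned value is exactly the quantity that gets injected into a Turing model with `@addlogprob!`, as shown in the minimal example below.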

In this setting Turing only needs to sample the model parameters; the likelihood contribution is injected into the model with the [`@addlogprob!`]({{< meta usage-modifying-logprob >}}) macro.

## Minimal example

The function below stands in for an external filtering algorithm — for instance one provided by [`SSMProblems.jl`](https://github.com/TuringLang/SSMProblems.jl) or [`GeneralisedFilters.jl`](https://github.com/TuringLang/GeneralisedFilters.jl).
Here we simply compute the log-likelihood of a Gaussian with unit variance, which is sufficient to demonstrate the integration pattern.

```{julia}
using Turing

# Mock filter — computes the Gaussian log-likelihood (constant terms
# omitted as they do not affect MCMC).
function run_external_filter(data, θ)
    return -0.5 * sum((data .- θ) .^ 2)
end

@model function external_likelihood_demo(data)
    θ ~ Normal(0, 1)
    @addlogprob! run_external_filter(data, θ)
end
```

We can now sample from this model in the usual way:

```{julia}
data = randn(100)
model = external_likelihood_demo(data)
chain = sample(model, NUTS(), 100)
```

Because the mock filter computes a Gaussian log-likelihood with unit variance, the posterior for `θ` should concentrate around the sample mean of `data`.
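In fact, because both the prior and the mock likelihood are Gaussian, the posterior is available in closed form and gives a useful sanity check on the chain. With a `Normal(0, 1)` prior and a unit-variance likelihood over `n` observations, the posterior is `Normal(sum(data) / (n + 1), 1 / (n + 1))`. The helper names below are illustrative, not part of any package:

```julia
# Closed-form conjugate posterior for θ under a Normal(0, 1) prior
# and a unit-variance Gaussian likelihood over n observations:
#   θ | data ~ Normal(sum(data) / (n + 1), 1 / (n + 1))
posterior_mean(data) = sum(data) / (length(data) + 1)
posterior_var(data)  = 1 / (length(data) + 1)
```

With `n = 100` the posterior mean differs from the sample mean by a factor of `100/101`, so comparing `mean(chain[:θ])` against `posterior_mean(data)` is a quick way to check that the injected likelihood is being used correctly.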

## When to use this pattern

Use `@addlogprob!` whenever the likelihood of your observations is computed by code that lives outside Turing's `~` syntax. Typical cases include:

- **State-space filtering** — packages such as `SSMProblems.jl` and `GeneralisedFilters.jl` evaluate the marginal likelihood via Kalman or particle filters.
- **Hidden Markov Models** — the [HMM tutorial]({{< meta hidden-markov-model >}}#efficient-inference-with-the-forward-algorithm) shows the same pattern using `HiddenMarkovModels.jl` and `logdensityof`.
- **Any domain-specific likelihood** — whenever you have a function that returns a log-probability, you can plug it in with `@addlogprob!`.
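As a concrete instance of the HMM bullet above, the pattern can be sketched as follows. This assumes the `HMM` constructor and `logdensityof` function from `HiddenMarkovModels.jl` (where `logdensityof` runs the forward algorithm to compute the marginal log-likelihood); the two-state model and its parameter values are purely illustrative:

```julia
using Turing
using HiddenMarkovModels: HMM, logdensityof

@model function hmm_marginal(obs)
    # Sample only the parameter of interest; the latent state
    # sequence is marginalised out by the forward algorithm.
    p ~ Beta(2, 2)  # self-transition probability for both states
    trans = [p 1-p; 1-p p]
    init = [0.5, 0.5]
    hmm = HMM(init, trans, [Normal(-1, 1), Normal(1, 1)])
    # Inject the marginal log-likelihood of the observations.
    @addlogprob! logdensityof(hmm, obs)
end
```

The HMM tutorial linked above covers this in full, including why marginalising the discrete states lets you use gradient-based samplers such as NUTS.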

For more details on the macro itself, see [Modifying the Log Probability]({{< meta usage-modifying-logprob >}}).