From 90cbe4cfa31516e0df4ff49203700bc6d619e282 Mon Sep 17 00:00:00 2001 From: hardik-xi11 Date: Mon, 30 Mar 2026 00:56:10 +0530 Subject: [PATCH 1/2] add a new page for external likelihoods --- _quarto.yml | 2 + usage/external-likelihoods/index.qmd | 74 ++++++++++++++++++++++++++++ 2 files changed, 76 insertions(+) create mode 100644 usage/external-likelihoods/index.qmd diff --git a/_quarto.yml b/_quarto.yml index 883be4de2..998a63c57 100644 --- a/_quarto.yml +++ b/_quarto.yml @@ -79,6 +79,7 @@ website: - usage/custom-distribution/index.qmd - usage/probability-interface/index.qmd - usage/modifying-logprob/index.qmd + - usage/external-likelihoods/index.qmd - usage/tracking-extra-quantities/index.qmd - usage/predictive-distributions/index.qmd - usage/mode-estimation/index.qmd @@ -208,6 +209,7 @@ usage-automatic-differentiation: usage/automatic-differentiation usage-custom-distribution: usage/custom-distribution usage-dynamichmc: usage/dynamichmc usage-external-samplers: usage/external-samplers +usage-external-likelihoods: usage/external-likelihoods usage-mode-estimation: usage/mode-estimation usage-modifying-logprob: usage/modifying-logprob usage-performance-tips: usage/performance-tips diff --git a/usage/external-likelihoods/index.qmd b/usage/external-likelihoods/index.qmd new file mode 100644 index 000000000..92cf465a9 --- /dev/null +++ b/usage/external-likelihoods/index.qmd @@ -0,0 +1,74 @@ +--- +title: External Likelihoods +engine: julia +--- + +```{julia} +#| echo: false +#| output: false +using Pkg; +Pkg.instantiate(); +``` + +Sometimes a model's likelihood is not expressed directly as a distribution over +observed data, but is instead computed by an external algorithm. A common example +is **state-space models**, where a filtering algorithm +(e.g. a Kalman filter or a particle filter) marginalises out the latent states +and returns the marginal log-likelihood of the observations given the model +parameters. 
+ +In this setting Turing only needs to sample the model parameters; the +likelihood contribution is injected into the model with the +[`@addlogprob!`]({{< meta usage-modifying-logprob >}}) macro. + +## Minimal example + +The function below stands in for an external filtering algorithm — for instance +one provided by +[`SSMProblems.jl`](https://github.com/TuringLang/SSMProblems.jl) or +[`GeneralisedFilters.jl`](https://github.com/TuringLang/GeneralisedFilters.jl). +Here we simply compute the log-likelihood of a Gaussian with unit variance, +which is sufficient to demonstrate the integration pattern. + +```{julia} +using Turing + +# Mock filter — computes the Gaussian log-likelihood (constant terms +# omitted as they do not affect MCMC). +function run_external_filter(data, θ) + return -0.5 * sum((data .- θ) .^ 2) +end + +@model function external_likelihood_demo(data) + θ ~ Normal(0, 1) + Turing.@addlogprob! run_external_filter(data, θ) +end +``` + +We can now sample from this model in the usual way: + +```{julia} +data = randn(100) +model = external_likelihood_demo(data) +chain = sample(model, NUTS(), 100) +``` + +Because the mock filter computes a Gaussian log-likelihood with unit variance, +the posterior for `θ` should concentrate around the sample mean of `data`. + +## When to use this pattern + +Use `@addlogprob!` whenever the likelihood of your observations is computed +by code that lives outside Turing's `~` syntax. Typical cases include: + +- **State-space filtering** — packages such as `SSMProblems.jl` and + `GeneralisedFilters.jl` evaluate the marginal likelihood via Kalman or + particle filters. +- **Hidden Markov Models** — the + [HMM tutorial]({{< meta hidden-markov-model >}}#efficient-inference-with-the-forward-algorithm) + shows the same pattern using `HiddenMarkovModels.jl` and `logdensityof`. +- **Any domain-specific likelihood** — whenever you have a function that returns + a log-probability, you can plug it in with `@addlogprob!`. 
+ +For more details on the macro itself, see +[Modifying the Log Probability]({{< meta usage-modifying-logprob >}}). From 96530fd32e2faf48c76085ea9c7d30bb315c2550 Mon Sep 17 00:00:00 2001 From: hardik-xi11 Date: Mon, 30 Mar 2026 21:26:51 +0530 Subject: [PATCH 2/2] make it pretty --- usage/external-likelihoods/index.qmd | 42 ++++++++-------------------- 1 file changed, 12 insertions(+), 30 deletions(-) diff --git a/usage/external-likelihoods/index.qmd b/usage/external-likelihoods/index.qmd index 92cf465a9..9ad653af6 100644 --- a/usage/external-likelihoods/index.qmd +++ b/usage/external-likelihoods/index.qmd @@ -10,25 +10,15 @@ using Pkg; Pkg.instantiate(); ``` -Sometimes a model's likelihood is not expressed directly as a distribution over -observed data, but is instead computed by an external algorithm. A common example -is **state-space models**, where a filtering algorithm -(e.g. a Kalman filter or a particle filter) marginalises out the latent states -and returns the marginal log-likelihood of the observations given the model -parameters. +Sometimes a model's likelihood is not expressed directly as a distribution over observed data, but is instead computed by an external algorithm. +A common example is **state-space models**, where a filtering algorithm (e.g. a Kalman filter or a particle filter) marginalises out the latent states and returns the marginal log-likelihood of the observations given the model parameters. -In this setting Turing only needs to sample the model parameters; the -likelihood contribution is injected into the model with the -[`@addlogprob!`]({{< meta usage-modifying-logprob >}}) macro. +In this setting Turing only needs to sample the model parameters; the likelihood contribution is injected into the model with the [`@addlogprob!`]({{< meta usage-modifying-logprob >}}) macro. 
## Minimal example -The function below stands in for an external filtering algorithm — for instance -one provided by -[`SSMProblems.jl`](https://github.com/TuringLang/SSMProblems.jl) or -[`GeneralisedFilters.jl`](https://github.com/TuringLang/GeneralisedFilters.jl). -Here we simply compute the log-likelihood of a Gaussian with unit variance, -which is sufficient to demonstrate the integration pattern. +The function below stands in for an external filtering algorithm — for instance one provided by [`SSMProblems.jl`](https://github.com/TuringLang/SSMProblems.jl) or [`GeneralisedFilters.jl`](https://github.com/TuringLang/GeneralisedFilters.jl). +Here we simply compute the log-likelihood of a Gaussian with unit variance, which is sufficient to demonstrate the integration pattern. ```{julia} using Turing @@ -41,7 +31,7 @@ end @model function external_likelihood_demo(data) θ ~ Normal(0, 1) - Turing.@addlogprob! run_external_filter(data, θ) + @addlogprob! run_external_filter(data, θ) end ``` @@ -53,22 +43,14 @@ model = external_likelihood_demo(data) chain = sample(model, NUTS(), 100) ``` -Because the mock filter computes a Gaussian log-likelihood with unit variance, -the posterior for `θ` should concentrate around the sample mean of `data`. +Because the mock filter computes a Gaussian log-likelihood with unit variance, the posterior for `θ` should concentrate around the sample mean of `data`. ## When to use this pattern -Use `@addlogprob!` whenever the likelihood of your observations is computed -by code that lives outside Turing's `~` syntax. Typical cases include: +Use `@addlogprob!` whenever the likelihood of your observations is computed by code that lives outside Turing's `~` syntax. Typical cases include: -- **State-space filtering** — packages such as `SSMProblems.jl` and - `GeneralisedFilters.jl` evaluate the marginal likelihood via Kalman or - particle filters. 
-- **Hidden Markov Models** — the - [HMM tutorial]({{< meta hidden-markov-model >}}#efficient-inference-with-the-forward-algorithm) - shows the same pattern using `HiddenMarkovModels.jl` and `logdensityof`. -- **Any domain-specific likelihood** — whenever you have a function that returns - a log-probability, you can plug it in with `@addlogprob!`. +- **State-space filtering** — packages such as `SSMProblems.jl` and `GeneralisedFilters.jl` evaluate the marginal likelihood via Kalman or particle filters. +- **Hidden Markov Models** — the [HMM tutorial]({{< meta hidden-markov-model >}}#efficient-inference-with-the-forward-algorithm) shows the same pattern using `HiddenMarkovModels.jl` and `logdensityof`. +- **Any domain-specific likelihood** — whenever you have a function that returns a log-probability, you can plug it in with `@addlogprob!`. -For more details on the macro itself, see -[Modifying the Log Probability]({{< meta usage-modifying-logprob >}}). +For more details on the macro itself, see [Modifying the Log Probability]({{< meta usage-modifying-logprob >}}).
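
The page's mock filter only computes a unit-variance Gaussian log-likelihood; the same pattern extends directly to a real filtering algorithm. The sketch below (not part of this patch, and independent of `SSMProblems.jl`/`GeneralisedFilters.jl` — the names `kalman_loglik` and `local_level` are illustrative) runs an exact Kalman filter for a scalar local-level model and injects the resulting marginal log-likelihood with `@addlogprob!`, just as the page describes:

```julia
using Turing

# Illustrative only: an exact Kalman filter for a scalar local-level model
#   x_t = x_{t-1} + w_t,   w_t ~ Normal(0, q)
#   y_t = x_t + v_t,       v_t ~ Normal(0, r)
# Returns the marginal log-likelihood of y with the latent states x
# marginalised out, playing the role of `run_external_filter` above.
function kalman_loglik(y, q, r; m0=0.0, P0=1.0)
    m, P, ll = m0, P0, 0.0
    for yt in y
        P += q                                      # predict: variance grows by q
        S = P + r                                   # innovation variance
        ll += -0.5 * (log(2π * S) + (yt - m)^2 / S) # Gaussian log-density of innovation
        K = P / S                                   # Kalman gain
        m += K * (yt - m)                           # update: posterior mean
        P *= 1 - K                                  # update: posterior variance
    end
    return ll
end

@model function local_level(y)
    q ~ truncated(Normal(0, 1); lower=0)  # state-noise variance
    r ~ truncated(Normal(0, 1); lower=0)  # observation-noise variance
    @addlogprob! kalman_loglik(y, q, r)
end
```

Sampling then works exactly as in the mock example, e.g. `sample(local_level(randn(100)), NUTS(), 100)`; Turing samples only `q` and `r`, while the filter marginalises the latent states internally.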