Codecov Report

❌ Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main    #1286      +/-   ##
==========================================
+ Coverage   78.62%   82.32%   +3.69%
==========================================
  Files          50       49       -1
  Lines        3631     3496     -135
==========================================
+ Hits         2855     2878      +23
+ Misses        776      618     -158
```

☔ View full report in Codecov by Sentry.
Benchmark Report (collapsed details: Computer Information, Benchmark Results)
Closes #1309.

Simplifies `run_ad` by removing the `varinfo` keyword and using a single `transform_strategy` path. Also replaces the `varinfo` field in `ADResult` with `ldf::LogDensityFunction` and removes unused imports.

### Migration note

Before:

```julia
run_ad(model, adtype; varinfo=linked_vi)
```

After:

```julia
run_ad(model, adtype; transform_strategy=LinkAll())
```

---------

Co-authored-by: Penelope Yong <penelopeysm@gmail.com>
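To make the migration concrete, here is a minimal sketch of a call before and after this PR. The model and the exact set of `run_ad` keywords are illustrative assumptions; only the `transform_strategy=LinkAll()` keyword and the `ldf` field on the result are taken from this PR's description.

```julia
using DynamicPPL, Distributions, ADTypes, ForwardDiff

# A toy model with a constrained variable, so linking actually matters.
@model function demo()
    x ~ Beta(2, 2)
end

# Before this PR (no longer supported):
# result = run_ad(demo(), AutoForwardDiff(); varinfo=linked_vi)

# After this PR: request linking via a transform strategy instead.
result = run_ad(demo(), AutoForwardDiff(); transform_strategy=LinkAll())

# The result now carries the LogDensityFunction it was computed with,
# instead of a varinfo field.
result.ldf isa LogDensityFunction
```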
DynamicPPL.jl documentation for PR #1286 is available at:
Closes #1256.

---------

Co-authored-by: hardik-xi11 <hardiklaha@gmail.com>
Co-authored-by: Penelope Yong <penelopeysm@gmail.com>
Closes #1249. I'll write more here later, but you can see the changelog for a good overview of this PR.

Things to do:

- [x] Generalise LinkedVecTransformAccumulator to something that's more like FixedTransformAccumulator
- [x] Add a convenience function for 'static'-fying an LDF (?)
- [x] Add tests for new behaviour
- [x] Add dev docs on FixedTransform

Possibly in a separate PR:

- [ ] Look at the Turing VI interface to see what it needs from `bijector(model)`, and whether that can be removed (though, note also TuringLang/Turing.jl#2783)

----

Some scripts to use to benchmark:

```julia
using DynamicPPL, Distributions, LogDensityProblems, Random, Chairmarks, ForwardDiff, ADTypes, LinearAlgebra

# @model function esc(J, y, sigma)
#     mu ~ Normal(0, 5)
#     tau ~ truncated(Cauchy(0, 5); lower=0)
#     theta ~ MvNormal(fill(mu, J), tau^2 * I)
#     for i in 1:J
#         y[i] ~ Normal(theta[i], sigma[i])
#     end
# end
# J = 8
# y = [28, 8, -3, 7, -1, 1, 18, 12]
# sigma = [15, 10, 16, 11, 9, 11, 10, 18]
# m = esc(J, y, sigma)

@model function f()
    x ~ product_distribution([Beta(2, 2), Uniform(4, 5), Normal()])
end
m = f()

# LDF with fixed transforms
ldf1 = LogDensityFunction(m, getlogjoint_internal, LinkAll(); adtype=AutoForwardDiff(), fix_transforms=true)
p = rand(Xoshiro(468), ldf1)
@b LogDensityProblems.logdensity(ldf1, p)
@b LogDensityProblems.logdensity_and_gradient(ldf1, p)

# LDF without fixed transforms
ldf2 = LogDensityFunction(m, getlogjoint_internal, LinkAll(); adtype=AutoForwardDiff())
p = rand(Xoshiro(468), ldf2)
@b LogDensityProblems.logdensity(ldf2, p)
@b LogDensityProblems.logdensity_and_gradient(ldf2, p)
```

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Hardik Kumar <hardikkumarpro0005@gmail.com>
Co-authored-by: hardik-xi11 <hardiklaha@gmail.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
This extracts the raw value from a `TransformedValue`. In the cases where the transform is dynamic, it also demands that the distribution be supplied.

It also adds an overload for `ParamsWithStats(vector, ldf)`, which avoids re-evaluating the model when it's not needed (i.e., in the special case where the LDF is constructed with fixed transforms for all variables).

- ~~I found that not having this was actually a blocker for the VI stuff upstream because in the docs there's an example of `rand(vi_result, 100000)` (https://turinglang.org/docs/tutorials/variational-inference/#obtaining-summary-statistics). Previously, this would just return each sample as a raw vector, which I don't like (TuringLang/Turing.jl#2783): I would much prefer it return a `VarNamedTuple`, and so I changed it to do so. However, in the process, it ends up evaluating the model 100000 times and (for some unknown reason) crashes Julia on my laptop. This is a quick workaround to avoid having to evaluate the model, although in general I'd probably still like to know why it crashes (it can't be that handling 100000 VNTs is problematic, otherwise we would already have tons of issues with MCMC).~~ The crash (which I think is an OOM) is fixed by #1350.

The code is mostly Claude, but I gave it a lot of steering, and did a few changes by hand. I will self-review this tomorrow.

Closes #1347 (partly ... for now)
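A minimal sketch of how the new overload might be used, assuming the `LogDensityFunction` constructor and `fix_transforms` keyword shown elsewhere in this thread; the `ParamsWithStats(vector, ldf)` argument order follows the description above, and the model is illustrative:

```julia
using DynamicPPL, Distributions, ADTypes, ForwardDiff, Random

@model function f()
    x ~ product_distribution([Beta(2, 2), Uniform(4, 5), Normal()])
end

# An LDF with fixed transforms for all variables: the special case where
# ParamsWithStats(vector, ldf) can skip re-evaluating the model.
ldf = LogDensityFunction(f(), getlogjoint_internal, LinkAll();
                         adtype=AutoForwardDiff(), fix_transforms=true)

v = rand(Xoshiro(468), ldf)    # a raw parameter vector in linked space
ps = ParamsWithStats(v, ldf)   # no model re-evaluation needed here
```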
@sunxd3 Would you be happy to release this in its current state? (For context, about 90% of the changes are the fixed transform stuff which you already reviewed)
This was discussed offline and I've addressed all of the review comments, so I will merge. I will keep the MLD extension here for this release and maybe remove it in a patch release once it can be synchronised with upstream.
Thanks, Penny! |
Please see the changelog for info. The main changes are fixed transforms + removal of `vi[vn]`.

CI on Turing has already been run against this branch and there are no blockers. See TuringLang/Turing.jl#2803.