
Discuss LLVM-level autodiff via Enzyme for cross-language composability #130

@seabbs

Description


Note: This issue was drafted with LLM assistance based on web research.

Context

The paper discusses backend flexibility and the potential for different computational backends. One avenue not currently discussed is LLVM-level automatic differentiation via Enzyme, which could enable cross-language composability.

What is Enzyme?

Enzyme is an LLVM plugin that performs automatic differentiation at the compiler level rather than the language level. Because it operates on LLVM intermediate representation, a single unified tool can differentiate programs written in C, C++, Swift, Julia, Rust, and Fortran, as well as code from frameworks that lower to LLVM IR.

Key advantage: Enzyme differentiates code after LLVM optimisation occurs, producing substantially faster derivatives than tools that differentiate before optimisation (Moses & Churavy, 2020).

Current ecosystem status

Julia (Enzyme.jl)

  • Enzyme.jl provides a Julia frontend to Enzyme and is selectable as an AD backend through DifferentiationInterface.jl.

Rust

  • Enzyme-based autodiff is being integrated into the Rust compiler, but remains experimental and nightly-only at the time of writing.

Conference and community

  • Enzyme has an active cross-language community, including the EnzymeCon workshop series.

Potential benefits for composable ID modelling

  1. Cross-language components: Domain experts could write performance-critical components in Rust, C, or C++ whilst keeping them automatically differentiable within the framework
  2. Performance: LLVM-level optimisations applied before differentiation can yield substantial speedups
  3. Already integrated: Turing.jl (our current backend) already supports Enzyme via DifferentiationInterface.jl
  4. Broader contributor base: Researchers comfortable in Rust or C++ could contribute components without learning Julia
  5. Memory safety + performance: Rust components would bring memory safety guarantees with C-like performance

Relevance to the paper

This relates to:

  • Discussion of backend flexibility and abstract backends (Section 3)
  • Goal of lowering barriers for domain experts to contribute
  • Cross-ecosystem accessibility (currently addressed via EpiAwareR)
  • Backend-agnostic computational components (Section 3.2)

Questions to consider

  1. Is this worth mentioning in the discussion of alternative approaches or future work?
  2. Does it strengthen the argument about composability across computational paradigms?
  3. Is the Rust/cross-language angle too speculative given current limitations?
  4. Should we mention that Turing.jl already supports Enzyme, even if we don't use it in the case studies?

Key references

  • Moses, W. S., & Churavy, V. (2020). Instead of rewriting foreign code for machine learning, automatically synthesize fast gradients. Advances in Neural Information Processing Systems 33 (NeurIPS 2020).
