refactor(bb): remove --output_format, have one-size-fits-all format #16201
Conversation
Continuing the vertical slice of doing CIVC API with the new BBAPI end-to-end, this:
- revives a schema compiler that we were using with the circuits C++ code, adapting it for the unified bbapi
- adds code to bb.js, currently unused, to wrap a native process and seamlessly have bb.js use the native implementations via a process that takes msgpack in and spits it out
- implements gates and ivc verify in CIVC, and uses them in api_client_ivc.cpp (which backs the CLI)
- moves AztecBackend to the new API, using generated bindings
- refactors bb.js to adapt to the new generation source

Co-authored-by: Claude <noreply@anthropic.com>
In `stdlib_uint` we no longer need logical operations, because the only places they were used (the std/turbo versions of sha256, blake2s, and blake3s) have been removed. So it's best to reduce the complexity of the `uint` class and keep it minimal. Removed the following functions from the `uint` class:
```cpp
operator^
operator&
operator|
operator~
operator>>
operator<<
ror
rol
logic_operator
```
…ments as input and returns the commitment to the merged table (#15949) We modify the `MergeVerifier` so that it gets the subtable commitments as input and returns the commitment to the merged table. The reason for this change is that, given the new structure of `ClientIVC` following [#15704](#15704), we can't access the merged table commitments from inside `complete_hiding_circuit_logic`. This PR is in preparation for [#15829](#15829).

Co-authored-by: AztecBot <tech@aztecprotocol.com>
The tar file is not supposed to be checked in, according to 50a1bd3#r2229436867.
TLDR: the `uint` arithmetic operators `+` and `-` had a coding error, and as
a result we weren't actually supporting lazy arithmetic over integers.
This PR simplifies the `uint` class to no longer allow "unbounded" values.
#### The Issue
In the current `uint` class, we allow "unbounded" values, for example, a
`uint32_ct` can contain a value > 32 bits. This was done to allow lazy
arithmetic before such values were "normalized". This is because a call
to `normalize()` is expensive: it decomposes the value in 12-bit slices
and range-constrains each slice.
In practice though, the addition and subtraction operators actually
didn't allow any overflow, due to a coding error.
On adding two $\textsf{uint}x$ values $a$ and $b$ (where $x \in \{8, 16, 32, 64\}$), we currently do:
https://github.com/AztecProtocol/aztec-packages/blob/5c2c217a2f1b05ae226a16ee19a99079dbba8fec/barretenberg/cpp/src/barretenberg/stdlib/primitives/uint/arithmetic.cpp#L27-L47
Assuming $a, b$ are both witnesses, `create_balanced_add_gate` creates
the following constraint:
$$a + b = q \cdot \textcolor{grey}{2^x} + r$$
where the quotient $q$ and remainder $r$ are computed as:
$$q := \left\lfloor \frac{(a \textsf{ mod } 2^x) + (b \textsf{ mod } 2^x)}{2^x} \right\rfloor, \quad r := \left((a \textsf{ mod } 2^x) + (b \textsf{ mod } 2^x)\right) \textsf{ mod } 2^x.$$
In other words, the quotient and remainder are computed from the
"truncated" values of $a$ and $b$ when it should have been from the
"unbounded" values. Effectively, this means we are not actually
supporting lazy arithmetic (i.e., arithmetic operations expect inputs to
be "normalized"). I wrote a test
[here](https://github.com/AztecProtocol/aztec-packages/blob/ace0afdb4fb773cfc50af92930ecb94993ab72a5/barretenberg/cpp/src/barretenberg/stdlib/primitives/uint/uint.test.cpp#L243-L271)
that fails when it should ideally have passed, which confirmed the
coding error.
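As a concrete instance with $x = 32$: take an unbounded $a = 2^{32} + 5$ and $b = 3$. The gate derives $q$ and $r$ from the truncated values $5$ and $3$, giving $q = 0$ and $r = 8$, but the constraint then requires $a + b = 2^{32} + 8$ to equal $q \cdot 2^{32} + r = 8$, which fails even though the lazy addition itself is perfectly well-defined.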
#### Solution(s)
One way to fix this is to actually use `get_unbounded_value()` in place
of `get_value()` (on lines 27 and 28 in `operator+` above). But we never
really were using the benefits of lazy addition (because of this silly
error). So we decided it's better to remove the functionality related to
"unbounded" uint values.
Thus, we remove the `witness_status` member of the `uint` class as it
tracks if a `uint` needs to be "normalized". As a consequence, we now
need to "normalize" in every constructor where we weren't constraining
the accumulators (i.e., `byte_array` and `std::vector<bool_t>`).
Further, in `operator+` and `operator-` we normalize the result. We also
removed `get_unbounded_value()`, as it is no longer used anywhere.
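To make the old vs. new behaviour concrete, here is a small self-contained sketch (a toy bigint model of the gate equation above, written in TypeScript rather than the actual C++ stdlib code), reusing the numbers from the example above:

```ts
// Toy model (bigints, not the real C++ stdlib code) of the balanced-add
// constraint a + b = q * 2^x + r described above, for x = 32.
const X = 32n;
const TWO_X = 1n << X;

// q and r as the old operator+ computed them: from the *truncated* inputs.
function quotientRemainderFromTruncated(a: bigint, b: bigint): [bigint, bigint] {
  const sum = (a % TWO_X) + (b % TWO_X);
  return [sum / TWO_X, sum % TWO_X];
}

function constraintHolds(a: bigint, b: bigint, q: bigint, r: bigint): boolean {
  return a + b === q * TWO_X + r;
}

// A "lazy" (unbounded) input: a uint32 whose underlying value exceeds 32 bits.
const a = TWO_X + 5n;
const b = 3n;

const [q, r] = quotientRemainderFromTruncated(a, b);
console.log(constraintHolds(a, b, q, r)); // false: the gate rejects a valid lazy addition

// After this PR every value is normalized (reduced mod 2^x) before it is used,
// so the truncated and actual values coincide and the constraint holds.
const aNorm = a % TWO_X;
const [q2, r2] = quotientRemainderFromTruncated(aNorm, b);
console.log(constraintHolds(aNorm, b, q2, r2)); // true
```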
…rcuit (#15829) We make the merged table received by the Merge verifier in the hiding circuit a public input to the hiding circuit. This is needed because the Merge verifier will soon receive `t_commitments`, `T_prev_commitments` as inputs rather than reading them from the proof.

**EDIT:** To complete the work on the consistency checks, and to ensure the soundness of the Goblin verification, the merged table received by the Merge verifier in the last step of a Goblin accumulation must be set to be a public input of the circuit that performs the verification, so that the verifier can extract that public input and use it as the commitment to the previous table in the Merge verification. For example, in ClientIVC the last Merge verification before the final Goblin verification happens in the HidingKernel, so we need to add the merged table commitments received by the Merge verifier inside the HidingKernel to be public inputs of the HidingKernel.

After this PR, `MegaVerifier = UltraVerifier<MegaFlavor>` always expects the inputs to be `PairingInputs` + commitments to ECC op tables. These inputs are produced by the class `HidingKernelIO` (even though in the future we might consider changing this name).

The PR required changes to various tests to accommodate the new structure of the public inputs.

Co-authored-by: AztecBot <tech@aztecprotocol.com>
Co-authored-by: ledwards2225 <98505400+ledwards2225@users.noreply.github.com>
Co-authored-by: sergei iakovenko <105737703+iakovenkos@users.noreply.github.com>
Co-authored-by: ludamad <adam.domurad@gmail.com>
Co-authored-by: Suyash Bagad <suyash@aztecprotocol.com>
Co-authored-by: Jonathan Hao <jonathan@aztec-labs.com>
Co-authored-by: Raju Krishnamoorthy <krishnamoorthy@gmail.com>
Co-authored-by: notnotraju <raju@aztec-labs.com>
Co-authored-by: Lucas Xia <lucasxia01@gmail.com>
Co-authored-by: Khashayar Barooti <khashayar@aztecprotocol.com>
Co-authored-by: Jean M <132435771+jeanmon@users.noreply.github.com>
Co-authored-by: Alex Gherghisan <alexghr@users.noreply.github.com>
Co-authored-by: Santiago Palladino <spalladino@users.noreply.github.com>
Co-authored-by: Santiago Palladino <santiago@aztec-labs.com>
Co-authored-by: ludamad <domuradical@gmail.com>
Co-authored-by: maramihali <mara@aztecprotocol.com>
Co-authored-by: Sarkoxed <75146596+Sarkoxed@users.noreply.github.com>
- Replace api_ultra_honk.cpp implementation with bbapi-based version
- Port all UltraHonk functionality to bbapi commands:
  - CircuitProve with flavor selection (including STARKNET_GARAGA_FLAVORS)
  - CircuitComputeVk for verification key generation
  - CircuitVerify for proof verification
  - CircuitInfo for gate counting (replaces direct gate_count calls)
  - VkAsFields and ProofAsFields for field element conversions
  - CircuitWriteSolidityVerifier for Solidity contract generation
- Add comprehensive tests for all bbapi UltraHonk operations
- Maintain backward compatibility with existing API interface
- Keep CircuitCheck, CircuitProveAndVerify, and CircuitBenchmark as unimplemented

This centralizes the UltraHonk implementation in bbapi, making it easier to maintain and extend.
- Create new BbApiUltraHonkBackend that uses bbapi commands instead of old WASM API
- Maintains same public API as UltraHonkBackend for easy migration
- Supports all oracle hash types (poseidon2, keccak, starknet)
- Add migration guide to help users transition
- Export new backend from main index

This allows TypeScript users to use the same unified bbapi interface that is available in the CLI and C++ API, improving consistency and enabling better code reuse across language bindings.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
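As a rough sketch of what a migrated call site might look like, assuming the drop-in surface described above (the constructor and method shapes are copied from the `UltraHonkBackend` usage quoted later in this conversation; the import path and option names are assumptions):

```ts
// Sketch only: BbApiUltraHonkBackend is described as keeping the same public API
// as UltraHonkBackend, so only the class name should need to change at call sites.
// noir_program.bytecode, witness and debugLogger come from Noir compilation and
// execution, exactly as in the browser-test snippet quoted further down.
import { BbApiUltraHonkBackend } from '@aztec/bb.js';

const backend = new BbApiUltraHonkBackend(noir_program.bytecode, { logger: debugLogger });
const proof = await backend.generateProof(witness);
const verified = await backend.verifyProof(proof);
```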
- Add proof_as_fields command to convert proofs to field elements
- Add vk_as_fields command to convert verification keys to field elements
- Support --mega_honk flag for MegaHonk VK format
- Output results as JSON files (proof_fields.json, vk_fields.json)
- Add comprehensive tests for ProofAsFields and VkAsFields in bbapi

These commands expose the already-implemented bbapi functionality through the CLI, making it accessible to users who need field representations of proofs and verification keys for recursive verification.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Document the BBAPI architecture, available commands, usage examples, and migration guidance. This helps developers understand how to use the new unified API interface across different language bindings.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Remove bbapi README and TypeScript migration guide as requested.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
```cpp
opt_value = T(value);
}

template <typename T>
```
this was interfering
Removed the '--output_format bytes' flag from the proving command and clarified the binary output format description.
```bash
;;
bootstrap_e2e_hack)
echo "WARNING: This assumes your PR only changes barretenberg and the rest of the repository is unchanged from master."
echo "WARNING: This is only sound if you have not changed VK generation! (or noir-projects VKs will be incorrect)."
```
You can just do `BOOTSTRAP_AFTER=barretenberg ./bootstrap.sh` now.
```ts
// TODO: once UP is removed we can just roll this into the bas `Barretenberg` class.

export class BarretenbergVerifier {
```
replaced with the more general UltraHonkVerifier that mirrors the proving options
```ts
]);
const fieldsJson = JSON.parse(rawFields);
const fields = fieldsJson.map(Fr.fromHexString);
const rawBinary = await fs.readFile(path.join(vkDirectoryPath, VK_FILENAME));
```
adapting to the new format - fields are still very easy to get
ledwards2225 left a comment:
Just lovely - thanks for this
Removed padding logic for buffer chunks in binaryToFields function.
```ts
};
}

export class UltraHonkVerifierBackend {
```
@ludamad Is it advised that we always use this strategy for verification now?
In Noir's browser tests, using just the UltraHonkBackend like so:
```ts
const backend = new UltraHonkBackend(noir_program.bytecode, { logger: debugLogger });
const proof = await backend.generateProof(witness);
// JS verification
const verified = await backend.verifyProof(proof);
```
gives these failures:
```
❌ Noir end to end test > a_1_mul (Compile, Execute, Prove, Verify)
Error: Expected variant name 'CircuitComputeVkResponse' but got 'CircuitProveResponse'
    at ../../node_modules/@aztec/bb.js/src/cbind/generated/async.ts:25:14
    at async UltraHonkBackend.verifyProof (../../node_modules/@aztec/bb.js/src/barretenberg/backend.ts:160:21)
    at async n.<anonymous> (test/browser/compile_prove_verify.test.ts:72:21)

❌ Noir end to end test > assert_statement (Compile, Execute, Prove, Verify)
Error: prover trying to get too many points in MemBn254CrsFactory! 513 vs 4097
    at throw_or_abort_impl (../../node_modules/@aztec/bb.js/src/barretenberg_wasm/barretenberg_wasm_base/index.ts:56:16)
    at wasm://wasm/0235bac2:wasm-function[11]:0x1d9be
    at wasm://wasm/0235bac2:wasm-function[3441]:0x4f69e8
    at wasm://wasm/0235bac2:wasm-function[396]:0x2d638
    at wasm://wasm/0235bac2:wasm-function[6683]:0x82d855
    at wasm://wasm/0235bac2:wasm-function[608]:0x38ecf
    at wasm://wasm/0235bac2:wasm-function[541]:0x35b7c
    at wasm://wasm/0235bac2:wasm-function[732]:0x4092f
    at wasm://wasm/0235bac2:wasm-function[729]:0x3ca45
    at wasm://wasm/0235bac2:wasm-function[730]:0x3ea78
```
I can see these errors being confusing for users. Is UltraHonkBackend::verify_proof still being used anywhere? Can we axe it so as to avoid these issues?
Ah, realizing this may just be an async issue which only pops up in the browser, thus needing the isolated verification call (though I have not tested this). Either way, it looks like the docs would need updating: https://barretenberg.aztec.network/docs/how_to_guides/on-the-browser (cc @signorecello). Ignore me, these look to all be async issues with the web-test-runner we are using.
This changes vks to be serialized as bytes using to_field_elements. Proofs were already byte-compatible; this lets bb not have to worry about how to split fields into JSON.
The changes are widespread, as we now need to split the fields from the binary vk/proof, but this is now trivial with the new deserialization format.
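As an illustration of that splitting on the bb.js side, here is a minimal sketch (the 32-byte field size, the import path, and the exact helper signature are assumptions; the `binaryToFields` name and `Fr.fromHexString` both appear elsewhere in this PR):

```ts
import { Fr } from '@aztec/bb.js';

// Split a byte-serialized vk/proof into its constituent field elements.
// Assumes each field element occupies exactly 32 bytes, with no padding
// between chunks (the chunk-padding logic was removed earlier in this PR).
function binaryToFields(buf: Uint8Array): Fr[] {
  const fields: Fr[] = [];
  for (let i = 0; i < buf.length; i += 32) {
    const chunk = buf.slice(i, i + 32);
    const hex = '0x' + Array.from(chunk, b => b.toString(16).padStart(2, '0')).join('');
    fields.push(Fr.fromHexString(hex));
  }
  return fields;
}
```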