Add support for "phantom" BOLT 12 offers, up to the invoice_request step #4335
| Original file line number | Diff line number | Diff line change |
|---|---|---|
|
|
@@ -58,7 +58,7 @@ impl<'a, T: Writeable> Writeable for ChaChaPolyWriteAdapter<'a, T> { | |
| } | ||
|
|
||
| /// Encrypts the provided plaintext with the given key using ChaCha20Poly1305 in the modified | ||
| /// with-AAD form used in [`ChaChaDualPolyReadAdapter`]. | ||
| /// with-AAD form used in [`ChaChaTriPolyReadAdapter`]. | ||
| pub(crate) fn chachapoly_encrypt_with_swapped_aad( | ||
| mut plaintext: Vec<u8>, key: [u8; 32], aad: [u8; 32], | ||
| ) -> Vec<u8> { | ||
|
|
@@ -84,34 +84,48 @@ pub(crate) fn chachapoly_encrypt_with_swapped_aad( | |
| plaintext | ||
| } | ||
|
|
||
| #[derive(PartialEq, Eq)] | ||
| pub(crate) enum TriPolyAADUsed { | ||
| /// No AAD was used. | ||
| /// | ||
| /// The HMAC validated with standard ChaCha20Poly1305. | ||
| NoAAD, | ||
| /// The HMAC validated using the first AAD provided. | ||
| A, | ||
| /// The HMAC validated using the second AAD provided. | ||
| B, | ||
| } | ||
|
|
||
| /// Enables the use of the serialization macros for objects that need to be simultaneously decrypted | ||
| /// and deserialized. This allows us to avoid an intermediate Vec allocation. | ||
| /// | ||
| /// This variant of [`ChaChaPolyReadAdapter`] calculates Poly1305 tags twice, once using the given | ||
| /// key and once with the given 32-byte AAD appended after the encrypted stream, accepting either | ||
| /// being correct as sufficient. | ||
| /// This variant of [`ChaChaPolyReadAdapter`] calculates Poly1305 tags thrice, once using the given | ||
| /// key and twice with each of the two given 32-byte AADs appended after the encrypted stream, | ||
| /// accepting any being correct as sufficient. | ||
| /// | ||
| /// Note that we do *not* use the provided AAD as the standard ChaCha20Poly1305 AAD as that would | ||
| /// Note that we do *not* use the provided AADs as the standard ChaCha20Poly1305 AAD as that would | ||
| /// require placing it first and prevent us from avoiding redundant Poly1305 rounds. Instead, the | ||
| /// ChaCha20Poly1305 MAC check is tweaked to move the AAD to *after* the contents being | ||
| /// checked, effectively treating the contents as the AAD for the AAD-containing MAC but behaving | ||
| /// like classic ChaCha20Poly1305 for the non-AAD-containing MAC. | ||
| pub(crate) struct ChaChaDualPolyReadAdapter<R: Readable> { | ||
| pub(crate) struct ChaChaTriPolyReadAdapter<R: Readable> { | ||
|
Contributor
Would you say there is sufficient test coverage on this? I think there is some higher-level coverage, but no direct unit test.
Collaborator
Author
I think so? I mean, if it's horribly broken the higher-level code will also break. It's also somewhat indirectly fuzzed through the onion decoding fuzzing.
Contributor
I think lower-level unit tests do have value even if there are higher-level tests already.
Collaborator
Author
Not really sure what you'd want to test? That it can decrypt correctly? That's definitely well-covered. The one bug we did have in it was found by our fuzz test. More test coverage is great, but more tests to get the same coverage seems like a net negative.
||
| pub readable: R, | ||
| pub used_aad: bool, | ||
| pub used_aad: TriPolyAADUsed, | ||
| } | ||
|
|
||
| impl<T: Readable> LengthReadableArgs<([u8; 32], [u8; 32])> for ChaChaDualPolyReadAdapter<T> { | ||
| impl<T: Readable> LengthReadableArgs<([u8; 32], [u8; 32], [u8; 32])> | ||
| for ChaChaTriPolyReadAdapter<T> | ||
| { | ||
| // Simultaneously read and decrypt an object from a LengthLimitedRead storing it in | ||
| // Self::readable. LengthLimitedRead must be used instead of std::io::Read because we need the | ||
| // total length to separate out the tag at the end. | ||
| fn read<R: LengthLimitedRead>( | ||
| r: &mut R, params: ([u8; 32], [u8; 32]), | ||
| r: &mut R, params: ([u8; 32], [u8; 32], [u8; 32]), | ||
| ) -> Result<Self, DecodeError> { | ||
| if r.remaining_bytes() < 16 { | ||
| return Err(DecodeError::InvalidValue); | ||
| } | ||
| let (key, aad) = params; | ||
| let (key, aad_a, aad_b) = params; | ||
|
|
||
| let mut chacha = ChaCha20::new(&key[..], &[0; 12]); | ||
| let mut mac_key = [0u8; 64]; | ||
|
|
@@ -125,7 +139,7 @@ impl<T: Readable> LengthReadableArgs<([u8; 32], [u8; 32])> for ChaChaDualPolyRea | |
| let decrypted_len = r.remaining_bytes() - 16; | ||
| let s = FixedLengthReader::new(r, decrypted_len); | ||
| let mut chacha_stream = | ||
| ChaChaDualPolyReader { chacha: &mut chacha, poly: &mut mac, read_len: 0, read: s }; | ||
| ChaChaTriPolyReader { chacha: &mut chacha, poly: &mut mac, read_len: 0, read: s }; | ||
|
|
||
| let readable: T = Readable::read(&mut chacha_stream)?; | ||
| while chacha_stream.read.bytes_remain() { | ||
|
|
@@ -142,14 +156,18 @@ impl<T: Readable> LengthReadableArgs<([u8; 32], [u8; 32])> for ChaChaDualPolyRea | |
| mac.input(&[0; 16][0..16 - (read_len % 16)]); | ||
| } | ||
|
|
||
| let mut mac_aad = mac; | ||
| let mut mac_aad_a = mac; | ||
| let mut mac_aad_b = mac; | ||
|
|
||
| mac_aad.input(&aad[..]); | ||
| mac_aad_a.input(&aad_a[..]); | ||
| mac_aad_b.input(&aad_b[..]); | ||
| // Note that we don't need to pad the AAD since it's a multiple of 16 bytes | ||
|
|
||
| // For the AAD-containing MAC, swap the AAD and the read data, effectively. | ||
| mac_aad.input(&(read_len as u64).to_le_bytes()); | ||
| mac_aad.input(&32u64.to_le_bytes()); | ||
| mac_aad_a.input(&(read_len as u64).to_le_bytes()); | ||
| mac_aad_b.input(&(read_len as u64).to_le_bytes()); | ||
| mac_aad_a.input(&32u64.to_le_bytes()); | ||
| mac_aad_b.input(&32u64.to_le_bytes()); | ||
|
|
||
| // For the non-AAD-containing MAC, leave the data and AAD where they belong. | ||
| mac.input(&0u64.to_le_bytes()); | ||
|
|
@@ -158,23 +176,25 @@ impl<T: Readable> LengthReadableArgs<([u8; 32], [u8; 32])> for ChaChaDualPolyRea | |
| let mut tag = [0 as u8; 16]; | ||
| r.read_exact(&mut tag)?; | ||
| if fixed_time_eq(&mac.result(), &tag) { | ||
| Ok(Self { readable, used_aad: false }) | ||
| } else if fixed_time_eq(&mac_aad.result(), &tag) { | ||
| Ok(Self { readable, used_aad: true }) | ||
| Ok(Self { readable, used_aad: TriPolyAADUsed::NoAAD }) | ||
| } else if fixed_time_eq(&mac_aad_a.result(), &tag) { | ||
| Ok(Self { readable, used_aad: TriPolyAADUsed::A }) | ||
| } else if fixed_time_eq(&mac_aad_b.result(), &tag) { | ||
| Ok(Self { readable, used_aad: TriPolyAADUsed::B }) | ||
| } else { | ||
| return Err(DecodeError::InvalidValue); | ||
| } | ||
| } | ||
| } | ||
|
|
||
| struct ChaChaDualPolyReader<'a, R: Read> { | ||
| struct ChaChaTriPolyReader<'a, R: Read> { | ||
| chacha: &'a mut ChaCha20, | ||
| poly: &'a mut Poly1305, | ||
| read_len: usize, | ||
| pub read: R, | ||
| } | ||
|
|
||
| impl<'a, R: Read> Read for ChaChaDualPolyReader<'a, R> { | ||
| impl<'a, R: Read> Read for ChaChaTriPolyReader<'a, R> { | ||
| // Decrypts bytes from Self::read into `dest`. | ||
| // After all reads complete, the caller must compare the expected tag with | ||
| // the result of `Poly1305::result()`. | ||
|
|
@@ -349,15 +369,15 @@ mod tests { | |
| } | ||
|
|
||
| #[test] | ||
| fn short_read_chacha_dual_read_adapter() { | ||
| // Previously, if we attempted to read from a ChaChaDualPolyReadAdapter but the object | ||
| fn short_read_chacha_tri_read_adapter() { | ||
| // Previously, if we attempted to read from a ChaChaTriPolyReadAdapter but the object | ||
| // being read is shorter than the available buffer while the buffer passed to | ||
| // ChaChaDualPolyReadAdapter itself always thinks it has room, we'd end up | ||
| // ChaChaTriPolyReadAdapter itself always thinks it has room, we'd end up | ||
| // infinite-looping as we didn't handle `Read::read`'s 0 return values at EOF. | ||
| let mut stream = &[0; 1024][..]; | ||
| let mut too_long_stream = FixedLengthReader::new(&mut stream, 2048); | ||
| let keys = ([42; 32], [99; 32]); | ||
| let res = super::ChaChaDualPolyReadAdapter::<u8>::read(&mut too_long_stream, keys); | ||
| let keys = ([42; 32], [98; 32], [99; 32]); | ||
| let res = super::ChaChaTriPolyReadAdapter::<u8>::read(&mut too_long_stream, keys); | ||
| match res { | ||
| Ok(_) => panic!(), | ||
| Err(e) => assert_eq!(e, DecodeError::ShortRead), | ||
|
|
||
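To make the swapped-AAD construction above concrete, here is a small, self-contained sketch of the byte stream each candidate Poly1305 tag is computed over (illustrative only; the `pad16` and `mac_input_*` helpers are hypothetical and not part of the PR). All three candidates share the padded-ciphertext prefix, which is why the adapter can clone a single Poly1305 state accumulated over the ciphertext and only diverge at the tail; whichever candidate tag matches the received tag determines `TriPolyAADUsed::{NoAAD, A, B}`.

```rust
/// Pads `data` with zero bytes up to the next 16-byte boundary, since Poly1305
/// processes its input in 16-byte blocks.
fn pad16(data: &[u8]) -> Vec<u8> {
    let mut out = data.to_vec();
    if out.len() % 16 != 0 {
        let pad = 16 - (out.len() % 16);
        out.resize(out.len() + pad, 0);
    }
    out
}

/// Standard ChaCha20Poly1305 with an empty AAD:
/// pad16(ciphertext) || le64(0) || le64(ciphertext.len()).
fn mac_input_no_aad(ciphertext: &[u8]) -> Vec<u8> {
    let mut input = pad16(ciphertext);
    input.extend_from_slice(&0u64.to_le_bytes());
    input.extend_from_slice(&(ciphertext.len() as u64).to_le_bytes());
    input
}

/// The "swapped AAD" form: the ciphertext sits where the AAD normally would and
/// the 32-byte AAD sits where the ciphertext normally would (no padding needed,
/// since 32 is already a multiple of 16):
/// pad16(ciphertext) || aad || le64(ciphertext.len()) || le64(32).
fn mac_input_swapped_aad(ciphertext: &[u8], aad: &[u8; 32]) -> Vec<u8> {
    let mut input = pad16(ciphertext);
    input.extend_from_slice(aad);
    input.extend_from_slice(&(ciphertext.len() as u64).to_le_bytes());
    input.extend_from_slice(&32u64.to_le_bytes());
    input
}

fn main() {
    let ciphertext = [0x11u8; 20];
    let aad_a = [42u8; 32];
    // Both layouts begin with the same pad16(ciphertext) prefix, so the Poly1305
    // state built up while decrypting can be reused for every candidate check.
    assert_eq!(
        &mac_input_no_aad(&ciphertext)[..32],
        &mac_input_swapped_aad(&ciphertext, &aad_a)[..32]
    );
}
```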
| Original file line number | Diff line number | Diff line change |
|---|---|---|
|
|
@@ -13416,6 +13416,45 @@ macro_rules! create_offer_builder { ($self: ident, $builder: ty) => { | |
|
|
||
| Ok(builder.into()) | ||
| } | ||
|
|
||
| /// Creates an [`OfferBuilder`] such that the [`Offer`] it builds is recognized by any | ||
| /// [`ChannelManager`] (or [`OffersMessageFlow`]) using the same [`ExpandedKey`] (as returned | ||
| /// from [`NodeSigner::get_expanded_key`]). This allows any nodes participating in a BOLT 11 | ||
| /// "phantom node" cluster to also receive BOLT 12 payments. | ||
| /// | ||
| /// Note that, unlike with BOLT 11 invoices, BOLT 12 "phantom" offers do not in fact have any | ||
| /// "phantom node" appended to receiving paths. Instead, multiple blinded paths are simply | ||
| /// included which terminate at different final nodes. | ||
| /// | ||
| /// `other_nodes_channels` must be set to a list of each participating node's `node_id` (from | ||
| /// [`NodeSigner::get_node_id`] with a [`Recipient::Node`]) and its channels. | ||
| /// | ||
| /// `path_count_limit` is used to limit the number of blinded paths included in the resulting | ||
| /// [`Offer`]. Note that if this is less than the number of participating nodes (i.e. | ||
| /// `other_nodes_channels.len() + 1`) not all nodes will participate in receiving funds. | ||
| /// Because the parameterized [`MessageRouter`] will only get a chance to limit the number of | ||
| /// paths *per-node*, it is important to set this for offers which will be included in a QR | ||
| /// code. | ||
| /// | ||
| /// See [`Self::create_offer_builder`] for more details on the blinded path construction. | ||
| /// | ||
| /// [`ExpandedKey`]: inbound_payment::ExpandedKey | ||
| pub fn create_phantom_offer_builder( | ||
| &$self, other_nodes_channels: Vec<(PublicKey, Vec<ChannelDetails>)>, | ||
|
||
| path_count_limit: usize, | ||
| ) -> Result<$builder, Bolt12SemanticError> { | ||
| let mut peers = Vec::with_capacity(other_nodes_channels.len() + 1); | ||
| peers.push(($self.get_our_node_id(), $self.get_peers_for_blinded_path())); | ||
| for (node_id, peer_chans) in other_nodes_channels { | ||
| peers.push((node_id, Self::channel_details_to_forward_nodes(peer_chans))); | ||
| } | ||
|
|
||
| let builder = $self.flow.create_phantom_offer_builder( | ||
| &*$self.entropy_source, peers, path_count_limit | ||
| )?; | ||
|
|
||
| Ok(builder.into()) | ||
| } | ||
| } } | ||
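As a rough usage sketch (hypothetical, not from the PR, and not compilable on its own): a node in a two-node phantom cluster could drive the new builder roughly as below. `channel_manager` is the local LDK `ChannelManager`; `other_node_id` and `other_node_channels` are assumed to have been shared out-of-band by the second participating node (e.g. its `NodeSigner::get_node_id(Recipient::Node)` result and its channel list); the `description`/`amount_msats`/`build` chain follows the existing `create_offer_builder` flow and may differ by LDK version.

```rust
// Hypothetical fragment; assumes an enclosing fn returning
// Result<Offer, Bolt12SemanticError> and the usual LDK imports in scope.
let peers = vec![(other_node_id, other_node_channels)];

// Limit the offer to two blinded paths so it stays QR-code friendly; with two
// participating nodes, both still end up reachable.
let offer = channel_manager
    .create_phantom_offer_builder(peers, 2)?
    .description("donations".to_string())
    .amount_msats(10_000)
    .build()?;

// Any node in the cluster sharing the same ExpandedKey can now recognize and
// answer invoice_requests for `offer`.
```

As the docs above note, if `path_count_limit` is smaller than the number of participating nodes, some of them simply will not appear in any of the offer's blinded paths.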
|
|
||
| macro_rules! create_refund_builder { ($self: ident, $builder: ty) => { | ||
|
|
@@ -14045,6 +14084,41 @@ where | |
| now | ||
| } | ||
|
|
||
| /// Converts a list of channels to a list of peers which may be suitable to receive onion | ||
| /// messages through. | ||
| fn channel_details_to_forward_nodes( | ||
| mut channel_list: Vec<ChannelDetails>, | ||
| ) -> Vec<MessageForwardNode> { | ||
| channel_list.sort_unstable_by_key(|chan| chan.counterparty.node_id); | ||
| let mut res = Vec::new(); | ||
| // TODO: When MSRV reaches 1.77 use chunk_by | ||
| let mut start = 0; | ||
| while start < channel_list.len() { | ||
| let counterparty_node_id = channel_list[start].counterparty.node_id; | ||
| let end = channel_list[start..] | ||
| .iter() | ||
| .position(|chan| chan.counterparty.node_id != counterparty_node_id) | ||
|
||
| .map(|pos| start + pos) | ||
| .unwrap_or(channel_list.len()); | ||
|
|
||
| let peer_chans = &channel_list[start..end]; | ||
| if peer_chans.iter().any(|chan| chan.is_usable) | ||
| && peer_chans.iter().any(|c| c.counterparty.features.supports_onion_messages()) | ||
| { | ||
| res.push(MessageForwardNode { | ||
| node_id: peer_chans[0].counterparty.node_id, | ||
| short_channel_id: peer_chans | ||
| .iter() | ||
| .filter(|chan| chan.is_usable) | ||
| .filter_map(|chan| chan.short_channel_id) | ||
| .min(), | ||
|
Contributor
Why min?
Collaborator
Author
It's the oldest, matching the existing logic we have for the non-phantom case.
Contributor
Is that to maximize the chances that it still exists? Curious where the existing logic is located also.
Collaborator
Author
Yea, basically. Absent any better criteria we assume that the oldest channel is most likely to stick around. We do it in
||
| }) | ||
| } | ||
| start = end; | ||
| } | ||
| res | ||
| } | ||
|
|
||
| fn get_peers_for_blinded_path(&self) -> Vec<MessageForwardNode> { | ||
| let per_peer_state = self.per_peer_state.read().unwrap(); | ||
| per_peer_state | ||
|
|
||
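For readers skimming the grouping logic in `channel_details_to_forward_nodes`, here is a self-contained sketch of the same idea using simplified stand-in types (LDK's real `ChannelDetails` and `MessageForwardNode` carry more fields): channels are sorted by counterparty, each counterparty with at least one usable channel and onion-message support yields one forward node, and the lowest (oldest) usable SCID is preferred on the assumption that it is the channel most likely to still exist.

```rust
#[derive(Clone)]
struct Chan {
    node_id: [u8; 33], // stand-in for the counterparty `PublicKey`
    is_usable: bool,
    supports_onion_messages: bool,
    short_channel_id: Option<u64>,
}

#[derive(Debug, PartialEq)]
struct ForwardNode {
    node_id: [u8; 33],
    short_channel_id: Option<u64>,
}

fn to_forward_nodes(mut chans: Vec<Chan>) -> Vec<ForwardNode> {
    // Group channels by counterparty by sorting on the node id.
    chans.sort_unstable_by_key(|c| c.node_id);
    let mut res = Vec::new();
    let mut start = 0;
    while start < chans.len() {
        let node_id = chans[start].node_id;
        // Find the end of this counterparty's run of channels.
        let end = chans[start..]
            .iter()
            .position(|c| c.node_id != node_id)
            .map(|pos| start + pos)
            .unwrap_or(chans.len());
        let peer_chans = &chans[start..end];
        if peer_chans.iter().any(|c| c.is_usable)
            && peer_chans.iter().any(|c| c.supports_onion_messages)
        {
            res.push(ForwardNode {
                node_id,
                // Prefer the oldest (lowest) usable SCID, as it is the most
                // likely to still exist by the time the offer is used.
                short_channel_id: peer_chans
                    .iter()
                    .filter(|c| c.is_usable)
                    .filter_map(|c| c.short_channel_id)
                    .min(),
            });
        }
        start = end;
    }
    res
}

fn main() {
    let chans = vec![
        Chan { node_id: [2; 33], is_usable: true, supports_onion_messages: true, short_channel_id: Some(7) },
        Chan { node_id: [2; 33], is_usable: true, supports_onion_messages: true, short_channel_id: Some(3) },
        Chan { node_id: [9; 33], is_usable: false, supports_onion_messages: true, short_channel_id: Some(1) },
    ];
    let nodes = to_forward_nodes(chans);
    // Only the first counterparty qualifies, and its oldest SCID (3) is chosen.
    assert_eq!(nodes, vec![ForwardNode { node_id: [2; 33], short_channel_id: Some(3) }]);
}
```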
| Original file line number | Diff line number | Diff line change |
|---|---|---|
|
|
@@ -4404,21 +4404,41 @@ pub fn create_chanmon_cfgs(node_count: usize) -> Vec<TestChanMonCfg> { | |
|
|
||
| pub fn create_chanmon_cfgs_with_legacy_keys( | ||
| node_count: usize, predefined_keys_ids: Option<Vec<[u8; 32]>>, | ||
| ) -> Vec<TestChanMonCfg> { | ||
| create_chanmon_cfgs_internal(node_count, predefined_keys_ids, false) | ||
| } | ||
|
|
||
| pub fn create_phantom_chanmon_cfgs(node_count: usize) -> Vec<TestChanMonCfg> { | ||
| create_chanmon_cfgs_internal(node_count, None, true) | ||
| } | ||
|
|
||
| pub fn create_chanmon_cfgs_internal( | ||
| node_count: usize, predefined_keys_ids: Option<Vec<[u8; 32]>>, phantom: bool, | ||
| ) -> Vec<TestChanMonCfg> { | ||
| let mut chan_mon_cfgs = Vec::new(); | ||
| let phantom_seed = if phantom { Some(&[42; 32]) } else { None }; | ||
| for i in 0..node_count { | ||
| let tx_broadcaster = test_utils::TestBroadcaster::new(Network::Testnet); | ||
| let fee_estimator = test_utils::TestFeeEstimator::new(253); | ||
| let chain_source = test_utils::TestChainSource::new(Network::Testnet); | ||
| let logger = test_utils::TestLogger::with_id(format!("node {}", i)); | ||
| let persister = test_utils::TestPersister::new(); | ||
| let seed = [i as u8; 32]; | ||
| let keys_manager = if predefined_keys_ids.is_some() { | ||
| let mut seed = [i as u8; 32]; | ||
| if phantom { | ||
| // We would ideally randomize keys on every test run, but some tests fail in that case. | ||
|
Contributor
Why is that ideal? Aren't non-deterministic tests ideal?
Collaborator
Author
We've historically had tests that accidentally pass because of key ordering or hashmap ordering, which can hide bugs. IMO tests should be robust against basic things like that (which might change in future PRs anyway!).
Contributor
I think there are pros and cons. Fully deterministic tests are desirable too sometimes.
Collaborator
Author
Indeed, there are tradeoffs. In this case it's not a behavior change, so there aren't many :p
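Stepping outside the diff for a moment: the std-only randomness source the test harness relies on in the lines just below can be exercised in isolation. A minimal, self-contained sketch (variable names are illustrative):

```rust
use std::collections::hash_map::RandomState;
use std::hash::{BuildHasher, Hasher};

fn main() {
    let mut seed = [0u8; 32];
    // RandomState is randomly keyed by the standard library, so finishing a
    // fresh DefaultHasher over empty input yields a value that differs across
    // test runs without pulling in a dedicated RNG dependency.
    let rand_val = RandomState::new().build_hasher().finish();
    // Only the first 8 bytes are randomized, matching the harness above.
    seed[..8].copy_from_slice(&rand_val.to_ne_bytes());
    println!("randomized seed prefix: {:02x?}", &seed[..8]);
}
```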
||
| // Instead, we only randomize in the phantom case. | ||
| use core::hash::{BuildHasher, Hasher}; | ||
| // Get a random value using the only std API to do so - the DefaultHasher | ||
| let rand_val = std::collections::hash_map::RandomState::new().build_hasher().finish(); | ||
| seed[..8].copy_from_slice(&rand_val.to_ne_bytes()); | ||
| } | ||
| let keys_manager = test_utils::TestKeysInterface::with_settings( | ||
| &seed, | ||
| Network::Testnet, | ||
| // Use legacy (V1) remote_key derivation for tests using legacy key sets. | ||
| test_utils::TestKeysInterface::with_v1_remote_key_derivation(&seed, Network::Testnet) | ||
| } else { | ||
| test_utils::TestKeysInterface::new(&seed, Network::Testnet) | ||
| }; | ||
| predefined_keys_ids.is_some(), | ||
| phantom_seed, | ||
| ); | ||
| let scorer = RwLock::new(test_utils::TestScorer::new()); | ||
|
|
||
| // Set predefined keys_id if provided | ||
|
|
||
You could have `None` here in case the node isn't configured for phantom payments. I think that might help keep that case in mind for readers, and also saves some computation?

Yea, we don't currently have a separate method on the signer that returns `None` in cases where we aren't doing phantom, so we'd have to add a new key-fetcher (or `bool`-fetcher) to decide whether to do phantom validation. I thought about doing that, but it seemed way cleaner to reuse the `ExpandedKey` (which is already used for all the inbound-payment key logic, including phantom). It's also not very much compute at all to just do one round of poly1305 plus the finish logic, so I'm not sure it's worth the extra work.