Conversation
Hey @extronics! Thanks for contributing. I like the idea a lot. Just thinking out loud a bit, but I was curious if it would be easier to just have this verbose mode be the internal, default data structure and then have the existing API methods re-traverse that and pluck out the current (non-verbose) structure. Do you think the maintenance worry of forgetting about verbose mode's alternate structure is a real concern? Or is it unlikely that we'll need to really touch this moving forward?
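For what it's worth, a minimal sketch of what that re-traversal could look like, assuming a hypothetical verbose node shape with `path` and `dependencies` fields (not the actual structure from this PR) and the existing nested-object output keyed by absolute path:

```js
// Purely illustrative sketch (hypothetical field names, not the structure
// from this PR): a verbose node is assumed to look like
//   { path: '/abs/file.js', dependencies: [verboseNode, ...] }
// and the existing API output is the nested object keyed by absolute path.
function toPlainTree(verboseNode, converted = new Map()) {
  // Re-use already-converted sub-trees so duplicates (and cycles)
  // don't blow up the output.
  if (converted.has(verboseNode.path)) {
    return converted.get(verboseNode.path);
  }

  const plain = {};
  converted.set(verboseNode.path, plain);

  for (const child of verboseNode.dependencies) {
    plain[child.path] = toPlainTree(child, converted);
  }

  return plain;
}
```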
@mrjoelkemp Good point, this would solve some of the maintenance concerns. I think the performance penalty won't be a huge concern as long as the re-traversal while building the output also re-uses duplicate sub-trees (https://github.com/dependents/node-dependency-tree/pull/123/files#diff-e727e4bdf3657fd1d798edcd6b099d6e092f8573cba266154583a746bba0f346L144). From my observations, the really huge dependency trees originate from source files that include some top-level file of their own codebase again. I've seen dependency trees in popular libraries in the range of 50 million items, with almost all of those coming from duplicate sub-trees. Skipping those still made for sub-millisecond traversal, because you end up visiting only a few dozen unique nodes. That being said, I know there is a lot of obscure stuff going on in real-life codebases, so I might be missing some reasons to avoid re-traversal :)
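A rough sketch of the kind of duplicate-skipping traversal I mean, assuming duplicate sub-trees share an object reference in the existing output (which may not hold for every code path):

```js
// Counts dependency entries in the existing nested-object output, but
// expands each unique sub-tree object only once. Assumption: duplicate
// sub-trees share a reference; if they don't, identity-based skipping
// won't help and a path-keyed cache would be needed instead.
function countEntries(tree, seen = new Set()) {
  let count = 0;
  for (const subTree of Object.values(tree)) {
    count += 1;                      // one dependency edge
    if (seen.has(subTree)) continue; // duplicate sub-tree: skip re-expansion
    seen.add(subTree);
    count += countEntries(subTree, seen);
  }
  return count;
}
```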
In my use case, I'm processing the files identified by node-dependency-tree again. In order to avoid re-resolving the dependencies in my own code, I added the `partial` to the output. This way, it's easy to identify which syntax node in a Babel AST refers to which dependency.

The format I chose is easy to extend with further internals of node-dependency-tree if the need arises (see README.md).
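Roughly, the idea looks like this (the field names here are illustrative only; the actual format is described in README.md):

```js
// Illustrative only: field names are a sketch, not the exact format
// described in this PR's README. The point is that each resolved
// dependency keeps the raw import specifier ("partial") next to the
// resolved path, so a Babel ImportDeclaration's source.value can be
// matched to a tree node without re-running module resolution.
const exampleVerboseEntry = {
  path: '/project/src/index.js',
  dependencies: [
    {
      partial: './utils/format',            // what the source file imported
      path: '/project/src/utils/format.js', // what it resolved to
      dependencies: []
    }
  ]
};

// Matching an AST node to the tree then becomes a simple lookup:
function findByPartial(entry, specifier) {
  return entry.dependencies.find((dep) => dep.partial === specifier);
}
```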
I was not able to run the tests; many of them already failed without my changes. Are they supposed to be functional? Can you give me some pointers on what's needed to get them running?
If you like the change, I'd also add some tests, provided I can get them to run in the first place.