"# transformer-pytorch"
This is implementation of Original Transformer in PyTorch. The implementation is based on the research paper "Attention is All You Need".
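As a minimal sketch of the architecture this repository implements, the snippet below builds the original Transformer configuration (d_model = 512, 8 attention heads, 6 encoder and 6 decoder layers) with PyTorch's built-in `torch.nn.Transformer`; this repo's own module and class names are not shown here, so the stand-in API is an assumption.

```python
# Sketch only: uses torch.nn.Transformer as a stand-in for this repo's
# own modules, with the hyperparameters from "Attention Is All You Need".
import torch
import torch.nn as nn

model = nn.Transformer(
    d_model=512,           # embedding / model dimension
    nhead=8,               # number of attention heads
    num_encoder_layers=6,
    num_decoder_layers=6,
)

# Default layout is (sequence length, batch, d_model).
src = torch.rand(10, 32, 512)  # source sequence
tgt = torch.rand(20, 32, 512)  # target sequence

out = model(src, tgt)
print(out.shape)  # torch.Size([20, 32, 512])
```

The output keeps the target sequence's shape, since the decoder produces one d_model-sized vector per target position.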