"Attention Is All You Need"

The paper proposes the Transformer, a model architecture that eschews recurrence and instead relies entirely on an attention mechanism to draw global dependencies between input and output.
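The core building block of that attention mechanism is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch (the toy shapes and the helper name are illustrative, not from the paper):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value vectors
    return weights @ V

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
```

Scaling by sqrt(d_k) keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.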