- #2555 Update Transformer Decoder Layer to return decoder self-attention (opened Sep 1, 2020 by de9uch1; 3 of 4 tasks)
- #2551 Update Transformer Encoder Layer to return encoder self-attention (opened Sep 1, 2020 by de9uch1; 3 of 4 tasks)
- #1984 Support quantization in Fairseq Sequence generator [CLA Signed, fb-exported] (opened Apr 8, 2020 by cndn)
- #1975 Deprecate the SequenceGenerator with the Scripted vision (#1120) [CLA Signed, fb-exported] (opened Apr 7, 2020 by liuchen9494)
- #1972 auto-formatting fairseq/sequence_generator.py [CLA Signed, fb-exported] (opened Apr 7, 2020 by jhcross)
- #1891 Allow cosine scheduler to use --lr instead of --max-lr [CLA Signed] (opened Mar 23, 2020 by myleott)
- #1838 Implemented Time stretch for speech recognition [CLA Signed] (opened Mar 13, 2020 by mattiadg; 3 of 4 tasks)