fairseq S2T follows fairseq's careful design for scalability and extensibility. We provide end-to-end workflows from data pre-processing and model training to offline (and online) inference. We implement state-of-the-art RNN-based as well as Transformer-based models and open-source detailed training recipes. Fairseq's machine translation models and language models can be seamlessly integrated into S2T workflows for multi-task learning.
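The end-to-end workflow mentioned above can be sketched with fairseq's command-line entry points. This is a minimal sketch only: the data directory, save directory, subsets, and hyper-parameter values below are illustrative assumptions, not a recipe from the paper (the detailed recipes live in the repository linked below).

```shell
# Illustrative fairseq S2T workflow sketch; paths and hyper-parameters are assumptions.
# 1. Pre-process audio and transcripts into manifests/features, e.g. with one of
#    the prep scripts shipped under examples/speech_to_text/ in the repository.

# 2. Train a small speech-to-text Transformer.
fairseq-train ${DATA_DIR} \
  --task speech_to_text --arch s2t_transformer_s \
  --config-yaml config.yaml --train-subset train --valid-subset dev \
  --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
  --optimizer adam --lr 2e-3 --lr-scheduler inverse_sqrt --warmup-updates 10000 \
  --max-tokens 40000 --save-dir ${SAVE_DIR}

# 3. Offline inference with beam search, scored with WER.
fairseq-generate ${DATA_DIR} \
  --task speech_to_text --config-yaml config.yaml --gen-subset test \
  --path ${SAVE_DIR}/checkpoint_best.pt --beam 5 --scoring wer
```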


Code :

https://github.com/pytorch/fairseq/tree/master/examples/speech_to_text




