Ultra-large-scale pre-trained models can effectively improve performance on a variety of tasks, but they also bring a heavy computational burden to inference. Easy and Efficient Transformer (EET) delivers a significant performance improvement over existing schemes, achieving a 1.5-15x speedup over the state of the art, varying with context length. Compared to Faster Transformer's implementation of GPT-2 on an A100, EET achieves a 1.5x speedup. EET is available on GitHub.

Author(s) : Gongzheng li, Yadong Xi, Jingzhen Ding, Duan Wang, Bai Liu, Changjie Fan, Xiaoxi Mao, Zeng Zhao

