Federated learning enables a cluster of decentralized mobile devices at the edge to collaboratively train a shared machine learning model, while keeping the raw training samples on device. This decentralized training approach is demonstrated as a practical solution to mitigate the risk of privacy leakage. By considering the unique characteristics of FL edge deployment judiciously, AutoFL achieves 3.6 times faster model convergence time, and 4.7 and 5.2 times higher energy efficiency for local clients and globally over the cluster of K participants, respectively.
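As a rough illustration of the training loop the abstract describes, the sketch below shows a FedAvg-style round in which K selected devices train locally on their own private data and only their model weights are averaged by the server; raw samples never leave the device. The model, data, and names (local_update, federated_round, devices, k) are illustrative assumptions for the sketch, not AutoFL's actual implementation or the authors' code.

```python
# Minimal FedAvg-style sketch of the federated-learning loop described above.
# All names here are illustrative assumptions, not taken from the AutoFL paper.
import numpy as np

def local_update(global_weights, local_data, lr=0.1, epochs=1):
    """Each device trains on its own raw samples; only weights leave the device."""
    w = global_weights.copy()
    X, y = local_data
    for _ in range(epochs):
        # Gradient step for a simple linear model (placeholder for the real model).
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_weights, devices, k):
    """Select K participants and average their locally trained weights."""
    participants = np.random.choice(len(devices), size=k, replace=False)
    updates = [local_update(global_weights, devices[i]) for i in participants]
    return np.mean(updates, axis=0)  # simple unweighted average

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Each "device" holds its own private data shard.
    devices = []
    for _ in range(10):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=50)
        devices.append((X, y))

    w = np.zeros(2)
    for _ in range(20):
        w = federated_round(w, devices, k=5)
    print("learned weights:", w)  # should approach true_w
```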

Author(s) : Young Geun Kim, Carole-Jean Wu

Links : PDF - Abstract

Code :

Keywords : autofl - times - participants - energy - faster
