- Lei Jiao
Federated learning (FL) is an emerging technique for training models from decentralized data. Compared to learning from data held in central storage, FL offers the benefits of privacy preservation and reduced communication bandwidth. A challenge in FL is that data and model characteristics can vary widely across tasks, and an FL task with an improper configuration can waste substantial computation and communication resources and may cause the trained model to diverge from the optimal result. In this talk, I will present adaptive FL methods that learn near-optimal configurations (e.g., synchronization interval, compressed model size) over time during the FL process, so as to reach the best model accuracy with the smallest amount of training time. These adaptive FL algorithms are derived from convergence analysis, online learning, and related analytical techniques. Their performance is evaluated both theoretically and empirically. Some open problems will also be outlined.
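To make the idea of an adaptive configuration concrete, the following is a minimal, self-contained sketch of FedAvg-style training where the synchronization interval (number of local steps between global averaging, often denoted tau) is adapted during training. The toy problem, the function names, and the simple rule that halves tau when the global loss stops improving are all illustrative assumptions for this sketch, not the speaker's actual algorithm.

```python
# Illustrative sketch (not the speaker's algorithm): FedAvg on a 1-D least-squares
# problem, with a naive rule that adapts the synchronization interval tau.

def local_gd(w, data, tau, lr=0.05):
    """Run tau full-batch gradient-descent steps of loss = mean((w*x - y)^2)."""
    for _ in range(tau):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg_adaptive(client_data, rounds=30, tau0=8):
    w, tau = 0.0, tau0
    prev_loss = float("inf")
    for _ in range(rounds):
        # Each client trains locally for tau steps; the server averages the results.
        local_models = [local_gd(w, data, tau) for data in client_data]
        w = sum(local_models) / len(local_models)
        # Naive adaptation rule (an assumption for this sketch): if the global
        # loss stops improving, synchronize more often by halving tau.
        loss = sum((w * x - y) ** 2 for data in client_data for x, y in data)
        if loss > prev_loss and tau > 1:
            tau = max(1, tau // 2)
        prev_loss = loss
    return w, tau

# Two clients whose data is generated from the same ground truth y = 2x,
# but observed at different input points.
clients = [[(x, 2.0 * x) for x in (0.5, 1.0, 1.5)],
           [(x, 2.0 * x) for x in (0.8, 1.2, 2.0)]]
w, tau = fed_avg_adaptive(clients)
print(round(w, 2))  # → 2.0
```

In a realistic setting, the adaptation rule would instead be driven by a convergence bound or an online-learning criterion that trades off communication cost against model progress, which is the subject of the talk.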
Shiqiang Wang is a Research Staff Member in the Distributed AI Department at the IBM T. J. Watson Research Center. He received his Ph.D. from Imperial College London in 2015. His current research focuses on the intersection of machine learning, distributed systems, optimization, networking, and signal processing. Dr. Wang has served as a technical program committee (TPC) member of several international conferences, including ICML, NeurIPS, ICDCS, AISTATS, and IJCAI, and as an associate editor of the IEEE Transactions on Mobile Computing. He received the IBM Outstanding Technical Achievement Award (OTAA) in 2019, multiple Invention Achievement Awards from IBM since 2016, a Best Paper Finalist recognition at the IEEE International Conference on Image Processing (ICIP) 2019, and the Best Student Paper Award of the Network and Information Sciences International Technology Alliance (NIS-ITA) in 2015. For more details, please visit: https://shiqiang.wang/