Cisco Research marches towards democratizing federated learning

Myungjin Lee

Wednesday, July 6th, 2022


Federated learning (FL) is an emerging paradigm in which machine learning models are built in a privacy-preserving manner. Geo-distributed training workers collaboratively build a global model by sharing their local models' weights with a central server, rather than sharing their raw data.
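The core of this weight-sharing step can be sketched with federated averaging (FedAvg): each worker trains on its own data and sends only weights to the server, which combines them into a global model. The sketch below is purely illustrative; the function and variable names are assumptions and not part of Flame's actual API.

```python
# Minimal FedAvg-style aggregation sketch (illustrative only, not Flame's API):
# the server averages client weight vectors, weighted by local dataset size.

def federated_average(client_weights, client_sizes):
    """Return the size-weighted average of per-client weight vectors."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    global_weights = [0.0] * n_params
    for weights, size in zip(client_weights, client_sizes):
        for i, w in enumerate(weights):
            global_weights[i] += w * (size / total)
    return global_weights

# Two workers report local weights and dataset sizes; raw data never leaves them.
w_global = federated_average([[1.0, 2.0], [3.0, 4.0]], [100, 300])
print(w_global)  # [2.5, 3.5]
```

In each training round the server broadcasts the updated global weights back to the workers, which resume local training, so the only traffic is model parameters.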

While technologies around FL are evolving extremely fast, operationalizing them is becoming complex and challenging. Hence, Cisco Research set an ambitious vision of democratizing FL and started an in-house research project called Flame, now an open-source project. In addition, we have initiated collaborations with researchers at several universities on projects that align well with our research vision, funding them through our sponsored research program. Since then, we have been actively seeking to create synergies between Flame and the sponsored projects by absorbing new inventions into our system.

We recently hosted our first federated learning summit and virtually brought together our funded university principal investigators (PIs) and the broad Cisco community to continue to learn and collaborate with one another. Each PI gave a 30-minute talk on their respective research topic, touching upon various aspects of federated learning such as FL algorithms, resource heterogeneity, healthcare use cases, incentives, collaborative learning, and personalization.

Mosharaf Chowdhury (UMich) opened the main course of the summit by introducing FedScale, an open-source FL engine, and talked about algorithms for selecting clients and scheduling client resources to boost training performance. Alexey Tumanov (GA Tech) introduced a mechanism to train custom models that are tuned for different latency-accuracy tradeoffs at deployment. Radu Marculescu (UT Austin) introduced a novel dynamic group clustering approach that accounts for edge devices' mobility and hardware heterogeneity to improve training efficiency under latency and energy constraints. Mochen Yang and Xuan Bi (UMN) touched upon an important but often neglected aspect of healthcare FL: incentives. Their team developed game-theoretic models to understand conditions and derive mechanisms for maintaining healthy FL partnerships. Ada Gavrilovska (GA Tech) brought a unique perspective on training at the edge and presented a new approach called collaborative learning, in which knowledge can be transferred among clients at the edge. Ju Sun (UMN) highlighted how he applied federated learning in his healthcare research projects and shared his experience with the non-technical challenges of FL.

Cisco Research believes in the potential of federated learning. We will continue to foster research and collaboration with our PIs, develop new inventions and integrate them into our open-source project so that research and open-source communities can truly benefit from our contributions in the long run. We look forward to sharing the outcomes of these projects and Flame in future blog posts.