FedCGD: Collective Gradient Divergence Optimized Scheduling for Wireless Federated Learning
Tan Chen, Jintao Yan, Yuxuan Sun, Sheng Zhou, Zhisheng Niu · June 09, 2025
Summary
FedCGD optimizes device scheduling for wireless federated learning under data heterogeneity and bandwidth constraints. It introduces collective gradient divergence (CGD), combining device-level and sample-level CGDs, as a measure tied to convergence. FedCGD schedules devices to minimize the sum of multi-level CGDs, balancing a weighted earth mover's distance against sampling variance, and solves the scheduling problem in polynomial time. On CIFAR-10 classification it improves accuracy by up to 4.2% while scheduling 41.8% fewer devices, and it allows flexible adjustment between reducing the two CGD components.
Introduction
Background
Overview of wireless federated learning challenges
Importance of addressing data heterogeneity and bandwidth constraints
Objective
Aim of FedCGD in enhancing wireless federated learning efficiency
Focus on improving convergence through a novel CGD measure
Method
CGD Measure
Definition and integration of device-level and sample-level CGDs
Purpose: Improving model convergence by accounting for both device-level and sample-level diversity (see the sketch after this list)
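The paper's exact CGD definitions are not reproduced here; below is a minimal NumPy sketch under stated assumptions: device-level CGD is taken as the gap between the scheduled devices' aggregate gradient and the all-device gradient, and sample-level CGD as the sampling variance of a device's mini-batch gradient estimate. The function names and array shapes are illustrative, not from the paper.

```python
import numpy as np

def device_level_cgd(device_grads, scheduled, weights):
    """Gap between the scheduled devices' aggregate gradient and the
    population-wide gradient (one plausible reading of device-level CGD)."""
    device_grads = np.asarray(device_grads)          # shape: (num_devices, dim)
    weights = np.asarray(weights, dtype=float)       # e.g. dataset-size weights
    global_grad = np.average(device_grads, axis=0, weights=weights)
    w_s = weights[scheduled] / weights[scheduled].sum()
    scheduled_grad = np.average(device_grads[scheduled], axis=0, weights=w_s)
    return np.linalg.norm(scheduled_grad - global_grad)

def sample_level_cgd(per_sample_grads, batch_size):
    """Sampling variance of a device's mini-batch gradient estimate
    (one plausible reading of sample-level CGD)."""
    per_sample_grads = np.asarray(per_sample_grads)  # shape: (num_samples, dim)
    full_grad = per_sample_grads.mean(axis=0)
    var = np.mean(np.sum((per_sample_grads - full_grad) ** 2, axis=1))
    return var / batch_size                          # variance shrinks with batch size
```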
Minimization of CGDs
Objective: Minimizing the sum of multi-level CGDs
Methodology: Balancing a weighted earth mover's distance against sampling variance
Computational complexity: Polynomial-time scheduling algorithm (a greedy sketch follows this list)
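The paper reports that minimizing the sum of multi-level CGDs, trading a weighted earth mover's distance against sampling variance, can be solved in polynomial time; its exact algorithm is not given here. The following is a greedy polynomial-time sketch under assumptions: the device-level term uses an L1 proxy for the EMD between the scheduled set's aggregate label distribution and the global one, the sample-level term shrinks with the amount of scheduled data, and `schedule_devices`, `budget`, and `lambda_` are illustrative names and parameters.

```python
import numpy as np

def schedule_devices(label_dists, data_sizes, budget, lambda_=1.0):
    """Greedy polynomial-time sketch: select up to `budget` devices so that a
    device-level surrogate (distance of the scheduled label mix to the global
    distribution) plus a sample-level surrogate (sampling variance) is small."""
    label_dists = np.asarray(label_dists, dtype=float)   # (num_devices, num_classes)
    data_sizes = np.asarray(data_sizes, dtype=float)     # samples held by each device
    global_dist = np.average(label_dists, axis=0, weights=data_sizes)

    def objective(selected):
        sizes = data_sizes[selected]
        mix = np.average(label_dists[selected], axis=0, weights=sizes)
        emd_term = np.abs(mix - global_dist).sum()       # L1 proxy for earth mover's distance
        var_term = 1.0 / sizes.sum()                     # more scheduled data -> lower variance
        return emd_term + lambda_ * var_term

    selected, remaining = [], list(range(len(data_sizes)))
    for _ in range(budget):
        if not remaining:
            break
        best = min(remaining, key=lambda k: objective(selected + [k]))
        if selected and objective(selected + [best]) >= objective(selected):
            break                                        # stop when no device improves the objective
        selected.append(best)
        remaining.remove(best)
    return selected
```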
Flexible Adjustment
Mechanism for adjusting the trade-off between reducing device-level and sample-level CGD (usage example after this list)
Benefits: Optimizing resource allocation and improving learning outcomes
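Continuing the sketch above (reusing the illustrative `schedule_devices`), the `lambda_` knob is one hypothetical way to realize this flexible adjustment: a small value emphasizes matching the global label distribution (device-level CGD), while a large value emphasizes scheduling more data to suppress sampling variance (sample-level CGD).

```python
import numpy as np

rng = np.random.default_rng(0)
label_dists = rng.dirichlet(np.ones(10), size=20)   # 20 devices, 10 classes (CIFAR-10-like)
data_sizes = rng.integers(100, 1000, size=20)

# Emphasize device-level CGD (label-distribution match) vs. sample-level CGD (variance).
print(schedule_devices(label_dists, data_sizes, budget=5, lambda_=0.1))
print(schedule_devices(label_dists, data_sizes, budget=5, lambda_=10.0))
```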
Application
Dataset Classification
Case study: CIFAR-10 dataset
Results: Up to 4.2% increase in classification accuracy
Reduction in devices scheduled: 41.8%
Scalability and Flexibility
Discussion on FedCGD's adaptability in varying network conditions
Comparison with baseline methods in terms of performance and resource utilization
Conclusion
Summary of Contributions
Recap of FedCGD's innovations in wireless federated learning
Future Directions
Potential areas for further research and development
Impact on broader applications of federated learning
Basic info
distributed, parallel, and cluster computing
machine learning
artificial intelligence
Insights
What is the CGD measure introduced in FedCGD, and how does it improve convergence?
How does FedCGD address data heterogeneity in wireless federated learning?
How does FedCGD minimize the sum of multi-level CGDs, and what metrics are balanced in this process?
What are the performance improvements of FedCGD on the CIFAR-10 dataset, and how does it compare to other methods in terms of device scheduling?