L-FGADMM
This article proposes a communication-efficient decentralized deep learning algorithm, coined layer-wise federated group ADMM (L-FGADMM). To minimize an empirical risk, every worker […]
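A minimal sketch of the group-ADMM idea this entry builds on, not the paper's exact L-FGADMM update rules: workers arranged on a chain alternately solve local subproblems and exchange models only with their neighbors, while dual variables enforce consensus. The quadratic local loss, chain topology, penalty value, and all variable names below are illustrative assumptions, and the layer-wise communication schedule that gives L-FGADMM its name is omitted.

```python
# Sketch: decentralized empirical risk minimization via group-ADMM-style updates
# over a chain of workers (assumed quadratic local losses for closed-form steps).
import numpy as np

rng = np.random.default_rng(0)
N, d, m, rho = 6, 5, 20, 1.0                       # workers, model dim, samples/worker, penalty

A = [rng.normal(size=(m, d)) for _ in range(N)]    # local features
b = [rng.normal(size=m) for _ in range(N)]         # local targets
theta = [np.zeros(d) for _ in range(N)]            # local models
lam = [np.zeros(d) for _ in range(N - 1)]          # duals for constraints theta_n = theta_{n+1}

def local_update(n):
    """Closed-form minimizer of worker n's local augmented Lagrangian,
    using only the latest models and duals shared by its chain neighbors."""
    rhs = A[n].T @ b[n]
    k = 0                                          # number of neighbors
    if n > 0:                                      # left-neighbor coupling term
        rhs += lam[n - 1] + rho * theta[n - 1]
        k += 1
    if n < N - 1:                                  # right-neighbor coupling term
        rhs += -lam[n] + rho * theta[n + 1]
        k += 1
    return np.linalg.solve(A[n].T @ A[n] + k * rho * np.eye(d), rhs)

for it in range(100):
    for n in range(0, N, 2):                       # one group of workers updates in parallel
        theta[n] = local_update(n)
    for n in range(1, N, 2):                       # then the other group
        theta[n] = local_update(n)
    for n in range(N - 1):                         # dual ascent on consensus violations
        lam[n] += rho * (theta[n] - theta[n + 1])

print("max consensus gap:",
      max(np.linalg.norm(theta[n] - theta[n + 1]) for n in range(N - 1)))
```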
Mix2FLD
This letter proposes a novel communication-efficient and privacy-preserving distributed machine learning framework, coined Mix2FLD. To address uplink-downlink capacity asymmetry, local model outputs […]
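As the name suggests, mixup is one ingredient of this framework; the sketch below shows only that generic device-side step under stated assumptions: each device linearly combines pairs of raw samples before anything leaves the device, so no individual sample is exposed. The function name, fixed mixing ratio, and label handling are hypothetical, and the uplink output exchange and server-side inverse-mixup / output-to-model conversion are not reproduced here.

```python
# Sketch: two-sample mixup applied to a device's local batch before upload.
import numpy as np

def mixup_pairs(x, y, lam=0.5, rng=None):
    """Linearly mix each sample (and its label) with a randomly chosen partner."""
    rng = rng or np.random.default_rng()
    j = rng.permutation(len(x))                    # random partner index for every sample
    return lam * x + (1 - lam) * x[j], lam * y + (1 - lam) * y[j]

x = np.random.rand(32, 784)                        # toy local batch (e.g. flattened images)
y = np.eye(10)[np.random.randint(0, 10, 32)]       # one-hot labels
x_mix, y_mix = mixup_pairs(x, y, lam=0.6)
```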
Federated Deep Reinforcement Learning for Internet of Things with Decentralized Cooperative Edge Caching
Edge caching is an emerging technology for addressing massive content access in mobile networks to support rapidly growing Internet-of-Things (IoT) services and […]
Incentivize to Build
Federated learning (FL) rests on the notion of training a global model in a decentralized manner. Under this setting, mobile devices perform […]
Blockchained On-Device Federated Learning
By leveraging blockchain, this letter proposes a blockchained federated learning (BlockFL) architecture where local learning model updates are exchanged and verified. This […]
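A minimal sketch of the verification side this entry refers to, not the BlockFL protocol itself: digests of exchanged local model updates are recorded in hash-chained blocks, so any device can recompute the hashes and confirm nothing was altered. The block fields, the hashing of serialized weights, and the absence of mining, consensus, and reward logic are simplifying assumptions.

```python
# Sketch: hash-chained ledger of model-update digests with a verification pass.
import hashlib, json, time

def block_hash(body):
    """Deterministic SHA-256 digest of a block body."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash, updates):
    """Bundle model-update digests into a block linked to the previous block."""
    body = {"prev": prev_hash, "time": time.time(), "updates": updates}
    return {"hash": block_hash(body), **body}

def verify_chain(chain):
    """Recompute every hash and check each block points at its predecessor."""
    for prev, blk in zip(chain, chain[1:]):
        body = {k: v for k, v in blk.items() if k != "hash"}
        if blk["hash"] != block_hash(body) or blk["prev"] != prev["hash"]:
            return False
    return True

genesis = make_block("0" * 64, [])
blk = make_block(genesis["hash"],
                 [{"device": "d1",
                   "update_sha256": hashlib.sha256(b"serialized weights").hexdigest()}])
print(verify_chain([genesis, blk]))                # True
```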
Federated Learning for Ultra-Reliable Low-Latency V2V Communications
In this paper, a novel joint transmit power and resource allocation approach for enabling ultra-reliable low-latency communication (URLLC) in vehicular networks is […]