[2] LI T, SAHU A K, TALWALKAR A, et al. Federated learning: challenges, methods, and future directions[J]. IEEE Signal Processing Magazine, 2020, 37(3): 50-60.
[3] BRISIMI T S, CHEN R D, MELA T, et al. Federated learning of predictive models from federated electronic health records[J]. International Journal of Medical Informatics, 2018, 112: 59-67.
[4] MCMAHAN H B, MOORE E, RAMAGE D, et al. Communication-efficient learning of deep networks from decentralized data[C]//SINGH A, ZHU X J. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS). Cambridge: JMLR, 2017: 1273-1282.
[5] YU H, YANG S, ZHU S H. Parallel restarted SGD with faster convergence and less communication: demystifying why model averaging works for deep learning[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto, California: AAAI Press, 2019: 5693-5700.
[6] ZENG M Y, WANG X M, PAN W J, et al. Heterogeneous training intensity for federated learning: a deep reinforcement learning approach[J]. IEEE Transactions on Network Science and Engineering, 2023, 10(2): 990-1002.
[7] KARIMIREDDY S P, KALE S, MOHRI M, et al. SCAFFOLD: stochastic controlled averaging for federated learning[OL]. (2021-04-09)[2023-04-30]. https://doi.org/10.48550/arXiv.1910.06378.
[8] SUN H R, HONG M Y. Distributed non-convex first-order optimization and information processing: lower complexity bounds and rate optimal algorithms[J]. IEEE Transactions on Signal Processing, 2019, 67(22): 5912-5928.
[9] WANG S Q, TUOR T, SALONIDIS T, et al. Adaptive federated learning in resource constrained edge computing systems[J]. IEEE Journal on Selected Areas in Communications, 2019, 37(6): 1205-1221.
[10] MA L S, SU W, LI X Z, et al. Heterogeneous data backup against early warning disasters in geo-distributed data center networks[J]. Journal of Optical Communications and Networking, 2018, 10(4): 376-385.
[11] WANG Z G, ZHANG J W, CHANG T H, et al. Distributed stochastic consensus optimization with momentum for nonconvex nonsmooth problems[J]. IEEE Transactions on Signal Processing, 2021, 69: 4486-4501.
[12] MORAFAH M, VAHIDIAN S, WANG W J, et al. FLIS: clustered federated learning via inference similarity for non-IID data distribution[J]. IEEE Open Journal of the Computer Society, 2023, 4: 109-120.
[13] SHAO Y L, GÜNDÜZ D, LIEW S C. Federated edge learning with misaligned over-the-air computation[J]. IEEE Transactions on Wireless Communications, 2022, 21(6): 3951-3964.
[14] LI X X, JIANG M R, ZHANG X F, et al. FedBN: federated learning on non-IID features via local batch normalization[C]//9th International Conference on Learning Representations. [S.l.]: ICLR, 2021.
[15] FAGBOHUNGBE O, QIAN L. The effect of batch normalization on noise resistant property of deep learning models[J]. IEEE Access, 2022, 10: 127728-127741.
[16] CHEN Z D, DENG L, LI G Q, et al. Effective and efficient batch normalization using a few uncorrelated data for statistics estimation[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(1): 348-362.
[17] AWAIS M, IQBAL M T B, BAE S H, et al. Revisiting internal covariate shift for batch normalization[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(11): 5082-5092.
[18] XIAO H, RASUL K, VOLLGRAF R. Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms[OL]. (2017-09-15)[2023-04-30]. https://doi.org/10.48550/arXiv.1708.07747.
[19] KRIZHEVSKY A. Learning multiple layers of features from tiny images[D]. Toronto: University of Toronto, 2009.
[20] ZHOU Z W, LIU W K, ZHONG X Y. Adaptive weight quantization for communication-efficient federated learning[J]. Control Theory & Applications, 2022, 39(10): 1961-1968.
[21] CHEN L, ZHANG S P, WANG H H, et al. Facial expression recognition based on multi-task learning and knowledge graph[J]. Journal of Wuhan Institute of Technology, 2021, 43(6): 681-688.