Python Machine Learning Projects

Abstract:

Many real-world applications require learning over massive, distributed data. With the rise of smart mobile devices and Internet of Things (IoT) devices, privacy and security concerns make data sharing difficult. Federated learning jointly trains a global model across multiple devices without uploading their data to a central server, preserving privacy and security. However, most federated learning research uses machine learning models with full-precision weights, which contain many redundant parameters that need not be transmitted to the server, wasting communication bandwidth. A federated trained ternary quantization (FTTQ) algorithm optimizes the quantized networks on the clients through a self-learning quantization factor. Theoretical analysis proves the convergence of the quantization factors, the unbiasedness of FTTQ, and its reduced weight divergence. Based on FTTQ, a ternary federated averaging protocol (T-FedAvg) reduces both upstream and downstream communication in federated learning systems. Our results show that the proposed T-FedAvg reduces communication costs and even performs slightly better on non-IID data than the canonical federated learning algorithms.
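To illustrate the core idea of ternary quantization, here is a minimal NumPy sketch that maps a full-precision weight tensor to the three values {-alpha, 0, +alpha}. The threshold rule and the scaling-factor estimate below are illustrative assumptions; in FTTQ the quantization factor is learned during training on each client rather than computed with a fixed heuristic.

```python
import numpy as np

def ternarize(w, threshold_ratio=0.05):
    """Quantize full-precision weights to {-alpha, 0, +alpha}.

    threshold_ratio is a hypothetical heuristic (TTQ-style); FTTQ
    instead learns the quantization factor on the client.
    """
    # threshold separating weights to zero out from weights to keep
    delta = threshold_ratio * np.max(np.abs(w))
    mask = np.abs(w) > delta                      # surviving weights
    # scaling factor: mean magnitude of the surviving weights
    alpha = np.mean(np.abs(w[mask])) if mask.any() else 0.0
    return alpha * np.sign(w) * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
q = ternarize(w)
# q takes at most three distinct values, so each weight can be sent
# to the server in ~2 bits instead of 32, cutting communication cost
```

Transmitting only a sign per weight plus one scalar alpha per layer is what yields the upstream/downstream savings that T-FedAvg builds on.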

Note: Please discuss with our team before submitting this abstract to your college. The abstract or synopsis varies based on each student's project requirements.
