Abstract:
Nonconvex regularized optimization problems arise widely in machine learning and data processing. Because of the problem structure, the iteratively reweighted algorithm is well suited to consensus optimization of this type, and classical decentralized algorithms can be implemented over a connected network in which agents communicate with their neighbors and perform local computations. This paper proposes accelerating the iteratively reweighted scheme by adding an inertial term to each iteration, and we also accelerate it using diminishing stepsizes. Our algorithms reduce to existing decentralized schemes in special cases and suggest new ones. We prove convergence of both algorithms under several assumptions on the objective function, and the Kurdyka-Łojasiewicz property yields convergence rates under constant stepsizes. Numerical results demonstrate the efficiency of the proposed algorithms.