NS2 Projects

Abstract:

Solving linear programming (LP) and nonlinear programming (NLP) problems is important because of their widespread use in real-world applications. For NLPs, no single method is guaranteed to find the global optimum. For LPs, the simplex algorithm, which has dominated the field for decades, searches only the boundary of the feasible region (its vertices) and ignores the interior.

In this article, we study two gradient-based methods that explore the whole feasible region and achieve faster convergence on both LP and NLP problems, including IoT applications such as the software-defined Internet of Vehicles (SDIoV) and vehicular ad hoc networks (VANETs).

The gradient-simplex algorithm (GSA) for LPs first reduces the search space by moving through the interior of the feasible region in the direction of the objective gradient, and then explores the reduced boundary to find an optimal solution. For NLPs, the evolutionary-gradient algorithm (EGA) maintains a population of candidate solutions, uses it to estimate the gradient, and evolves the population step by step toward better solutions.
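As a rough illustration of the interior move in GSA (a minimal Python sketch under our own assumptions, not the authors' implementation), the snippet below starts from an interior point of a small LP's feasible region and steps along the objective gradient until a constraint becomes active; GSA's boundary phase would then continue from the face it lands on:

    import numpy as np

    def interior_gradient_move(A, b, x, c):
        # One GSA-style interior move for  max c^T x  s.t.  A x <= b:
        # from an interior feasible point x, step along the objective
        # gradient c as far as feasibility allows, landing on the
        # boundary face where a boundary search would then take over.
        d = c / np.linalg.norm(c)      # gradient of a linear objective is c
        slack = b - A @ x              # strictly positive in the interior
        rates = A @ d                  # how fast each constraint tightens
        steps = [s / r for s, r in zip(slack, rates) if r > 1e-12]
        return x + min(steps) * d      # largest step that stays feasible

    # Tiny example:  max x + 2y  s.t.  x + y <= 4,  x <= 3,  y <= 3,  x, y >= 0
    A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
    b = np.array([4.0, 3.0, 3.0, 0.0, 0.0])
    x0 = np.array([0.5, 0.5])          # interior starting point
    print(interior_gradient_move(A, b, x0, c=np.array([1.0, 2.0])))  # -> [1.5 2.5]

On this toy LP the move lands on the face x + y = 4, which contains the true optimum (1, 3), so the subsequent boundary search only needs to explore that reduced face rather than all vertices.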
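In the same spirit, here is a minimal sketch of the population-based gradient estimation behind EGA; the update rule, parameter values, and names are our illustrative choices, not the paper's. Each step perturbs the current point to form a population, weights each perturbation by the objective change it produces to estimate the gradient, and steps downhill:

    import numpy as np

    rng = np.random.default_rng(0)

    def ega_step(f, x, step=0.1, pop_size=20, sigma=0.05):
        # One EGA-style step (illustrative): estimate the gradient of f
        # from a population of perturbed candidates, then move downhill.
        perturbations = rng.normal(scale=sigma, size=(pop_size, x.size))
        fx = f(x)
        # E[(f(x + p) - f(x)) * p] is approximately sigma^2 * grad f(x),
        # so rescale the population average accordingly.
        grad = sum((f(x + p) - fx) * p for p in perturbations) / (pop_size * sigma**2)
        return x - step * grad / np.linalg.norm(grad)  # unit step downhill

    # Tiny example: minimize f(x, y) = x^2 + y^2 starting from (2, -2).
    f = lambda v: float((v ** 2).sum())
    x = np.array([2.0, -2.0])
    for _ in range(60):
        x = ega_step(f, x)
    print(x)  # ends close to the minimizer at the origin

Because only objective evaluations are needed, a step of this form also applies to NLPs where analytic gradients are unavailable.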

Extensive simulations show that both approaches efficiently solve optimization problems with large feasible spaces and outperform state-of-the-art methods. Results for GSA on SDIoV and VANET scenarios of various sizes are included.

