Python Machine Learning Projects

Abstract:

The ParametRIc MAnifold Learning (PRIMAL) algorithm for Gaussian mixture models (GMMs) assumes that GMMs lie on or near a manifold of probability distributions generated from a low-dimensional hierarchical latent space through parametric mappings. Inspired by principal component analysis (PCA), the latent space and parametric mappings model the generative processes for the priors, means, and covariance matrices of the mixture components. The hierarchical latent space uses linear or kernelized mappings to capture dependencies among the latent variables. The mapping parameters and the hierarchical latent space are learned by minimizing the Kullback-Leibler divergence (KLD) between the ground-truth GMMs and the GMMs generated from the manifold. Because the KLD between two GMMs is intractable, the objective function is optimized with a variational approximation and a variational EM algorithm. PRIMAL learns a continuous and interpretable manifold of GMM distributions and achieves low reconstruction error on synthetic data, flow cytometry data, eye-fixation data, and topic models.

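Illustrative sketch (not part of the original abstract): the exact KLD between two GMMs has no closed form, which is why PRIMAL resorts to a variational approximation and a variational EM algorithm. The short Python sketch below only illustrates the underlying quantity, estimating KL(p || q) between two scikit-learn GMMs by Monte Carlo sampling. The toy data, variable names, and the mc_kl_divergence helper are assumptions made for illustration; this is not the PRIMAL algorithm itself.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy data: two slightly shifted pairs of Gaussian clouds (illustrative only).
X_a = np.vstack([rng.normal(0.0, 1.0, size=(500, 2)),
                 rng.normal(4.0, 1.0, size=(500, 2))])
X_b = np.vstack([rng.normal(0.5, 1.0, size=(500, 2)),
                 rng.normal(4.5, 1.0, size=(500, 2))])

# Fit a "ground-truth" GMM p and a second GMM q to compare against it.
p = GaussianMixture(n_components=2, random_state=0).fit(X_a)
q = GaussianMixture(n_components=2, random_state=0).fit(X_b)

def mc_kl_divergence(p, q, n_samples=10_000):
    # Monte Carlo estimate of KL(p || q) = E_p[log p(x) - log q(x)]:
    # draw samples from p and average the log-density difference.
    samples, _ = p.sample(n_samples)
    return np.mean(p.score_samples(samples) - q.score_samples(samples))

print("Estimated KL(p || q):", mc_kl_divergence(p, q))

In PRIMAL, this intractable divergence is handled with a variational bound optimized by a variational EM algorithm; the Monte Carlo estimate above is only a simple way to check how close a manifold-generated GMM is to a ground-truth one.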
