Python Deep Learning Projects

Abstract:

This paper generalizes the Attention in Attention (AiA) mechanism of P. Fang et al., 2019 by using explicit mappings into reproducing kernel Hilbert spaces to generate attention values from input feature maps. In the AiA mechanism, inner and outer attention modules interact to build local-global interdependencies. The second-order polynomial attention and Gaussian attention modules explicitly exploit the non-linear properties of the input features. AiA-Net is a deep convolutional neural network built from the proposed AiA blocks; it extracts a discriminative pedestrian representation from complementary person appearance and part features. Ablation studies confirm the efficacy of the AiA mechanism and the benefit of exploiting the feature map's non-linear properties in attention design. The method also outperforms state-of-the-art approaches on several benchmarks, and AiA blocks further achieve state-of-the-art video person retrieval performance.
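To illustrate the idea of generating attention values via an explicit kernel feature map, the sketch below implements attention weights from the explicit map of a second-order polynomial kernel, k(x, y) = (x·y + c)^2. This is a minimal illustrative sketch, not the paper's actual AiA implementation; the function names, shapes, and the softmax normalization are assumptions made here for clarity.

```python
import numpy as np

def poly2_feature_map(x, c=1.0):
    """Explicit feature map phi for the second-order polynomial kernel
    k(x, y) = (x . y + c)^2, so that phi(x) . phi(y) = k(x, y).
    x: (n, d) array of feature vectors (shapes are illustrative)."""
    n, d = x.shape
    quad = (x[:, :, None] * x[:, None, :]).reshape(n, d * d)  # all x_i * x_j terms
    lin = np.sqrt(2.0 * c) * x                                # sqrt(2c) * x_i terms
    const = np.full((n, 1), c)                                # constant term
    return np.concatenate([quad, lin, const], axis=1)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kernel_attention(x, c=1.0):
    """Attention weights from explicit feature maps:
    A = softmax(phi(x) phi(x)^T), row-normalized."""
    phi = poly2_feature_map(x, c)
    scores = phi @ phi.T  # kernel similarity matrix
    return softmax(scores, axis=-1)

# Sanity check: the explicit map reproduces the kernel value exactly.
x = np.random.randn(4, 3)
k_explicit = poly2_feature_map(x) @ poly2_feature_map(x).T
k_direct = (x @ x.T + 1.0) ** 2
assert np.allclose(k_explicit, k_direct)
```

A Gaussian attention module could be sketched analogously by replacing the polynomial kernel with an RBF similarity; the Gaussian kernel's exact feature map is infinite-dimensional, which is why the paper's explicit-mapping formulation is the interesting part.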
