Abstract:
Attention has become an increasingly popular component of deep neural networks for interpreting and improving model performance, yet little research has examined how attention progresses to accomplish a task and whether that progression is reasonable. We propose an Attention with Reasoning capability (AiR) framework that uses attention to understand and improve the process leading to task outcomes. We first define an evaluation metric based on a sequence of atomic reasoning operations, enabling a quantitative measurement of attention that takes the reasoning process into account. Using human eye-tracking and answer-correctness data, we then analyze machine and human attention mechanisms with respect to their reasoning capability and their impact on task performance. To improve the attention and reasoning of visual question answering models, we propose supervising attention learning progressively along the reasoning process and differentiating correct from incorrect attention patterns. We demonstrate the effectiveness of the proposed framework in analyzing and modeling attention with better reasoning capability and task performance.