Preksha Nema
The heads hypothesis: A unifying statistical approach towards understanding multi-headed attention in BERT
On the Importance of Local Information in Transformer Based Models
On the weak link between importance and prunability of attention heads
Towards Interpreting BERT for Reading Comprehension Based QA
Towards Transparent and Explainable Attention Models
Let's Ask Again: Refine Network for Automatic Question Generation
ElimiNet: A Model for Eliminating Options for Reading Comprehension with Multiple Choice Questions
Generating Descriptions from Structured Data Using a Bifocal Attention Mechanism and Gated Orthogonalization
Towards a Better Metric for Evaluating Question Generation Systems
Diversity driven attention model for query-based abstractive summarization