Author
Listed:
- Ibidapo Dare Dada
(Department of Computer and Information Sciences, Covenant University, Ota P.M.B. 1023, Ogun State, Nigeria
Department of Computer Science, Federal University of Agriculture, Abeokuta P.M.B. 2240, Ogun State, Nigeria)
- Adio T. Akinwale
(Department of Computer Science, Federal University of Agriculture, Abeokuta P.M.B. 2240, Ogun State, Nigeria)
- Idowu A. Osinuga
(Department of Mathematics, Federal University of Agriculture, Abeokuta P.M.B. 2240, Ogun State, Nigeria)
- Henry Nwagu Ogbu
(Department of Computer and Information Sciences, Covenant University, Ota P.M.B. 1023, Ogun State, Nigeria)
- Ti-Jesu Tunde-Adeleke
(Department of Computer and Information Sciences, Covenant University, Ota P.M.B. 1023, Ogun State, Nigeria)
Abstract
This study developed and evaluated transformer-based models enhanced with inter-sentence attention (iAttention) mechanisms to improve the automatic grading of student responses to open-ended questions. Traditional transformer models emphasize intra-sentence relationships and often fail to capture the complex semantic alignments needed for accurate assessment. To overcome this limitation, three iAttention mechanisms, iAttention_TF-IDF, iAttention_word, and iAttention_HW, were proposed to enhance the model's capacity to align key ideas between student and reference answers, improving its ability to capture important semantic relationships between the words of two sentences. Unlike previous approaches that rely solely on aggregated sentence embeddings, the proposed method introduces inter-sentence attention layers that explicitly model semantic correspondence between individual sentences. This enables finer-grained matching of key concepts, reasoning, and logical structure, which is crucial for fair and reliable assessment. The models were evaluated on multiple benchmark datasets, including Semantic Textual Similarity (STS), SemEval-2013 Beetle, SciEntsBank, Mohler, and a composite of university-level educational datasets (U-datasets). Experimental results demonstrated that models integrating iAttention consistently outperform the baselines, achieving higher Pearson and Spearman correlation scores on the STS, Mohler, and U-datasets, as well as superior Macro-F1, Weighted-F1, and Accuracy on the Beetle and SciEntsBank datasets. This approach contributes to the development of scalable, consistent, and fair automated grading systems by narrowing the gap between machine evaluation and human judgment, ultimately leading to more accurate and efficient assessment practices.
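For readers unfamiliar with inter-sentence (cross) attention, the NumPy sketch below illustrates the general idea described in the abstract: student-answer token embeddings attend over reference-answer token embeddings, and the aligned representations are pooled either uniformly (a word-level variant) or with TF-IDF weights (analogous in spirit to iAttention_TF-IDF). This is a minimal illustrative sketch, not the authors' implementation; the function name inter_sentence_attention, the tensor shapes, and the pooling choices are assumptions.

    # Minimal sketch (not the paper's implementation): cross-attention between
    # token embeddings of a student answer and a reference answer, optionally
    # re-weighted by TF-IDF scores. All names and shapes are assumptions.
    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def inter_sentence_attention(student_emb, reference_emb, student_tfidf=None):
        """Cross-attend student tokens (queries) over reference tokens (keys/values).

        student_emb:   (m, d) token embeddings of the student answer
        reference_emb: (n, d) token embeddings of the reference answer
        student_tfidf: optional (m,) TF-IDF weights that emphasise informative
                       student tokens when pooling (a stand-in for iAttention_TF-IDF)
        Returns a pooled (d,) vector aligning the student answer to the reference.
        """
        d = student_emb.shape[1]
        scores = student_emb @ reference_emb.T / np.sqrt(d)  # (m, n) similarity
        attn = softmax(scores, axis=-1)                      # each student token attends to reference tokens
        aligned = attn @ reference_emb                       # (m, d) reference context per student token

        if student_tfidf is not None:                        # TF-IDF-weighted pooling variant
            w = student_tfidf / (student_tfidf.sum() + 1e-9)
            return w @ aligned
        return aligned.mean(axis=0)                          # plain word-level pooling variant

    # Toy usage with random embeddings standing in for transformer outputs.
    rng = np.random.default_rng(0)
    student, reference = rng.normal(size=(7, 16)), rng.normal(size=(9, 16))
    tfidf = rng.uniform(size=7)
    pooled = inter_sentence_attention(student, reference, tfidf)
    print(pooled.shape)  # (16,)

The TF-IDF weighting in this sketch simply biases the pooled representation toward content-bearing student tokens; the paper's actual iAttention variants should be consulted for the precise formulations.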
Suggested Citation
Ibidapo Dare Dada & Adio T. Akinwale & Idowu A. Osinuga & Henry Nwagu Ogbu & Ti-Jesu Tunde-Adeleke, 2025.
"iAttention Transformer: An Inter-Sentence Attention Mechanism for Automated Grading,"
Mathematics, MDPI, vol. 13(18), pages 1-31, September.
Handle:
RePEc:gam:jmathe:v:13:y:2025:i:18:p:2991-:d:1750354