Original scientific article

AUTOMATED ESSAY EVALUATION SYSTEMS FOR SCIENTIFIC WRITING IN ENGINEERING EDUCATION

By

Izzatbek Rejapov, Mamun University, Uzbekistan

Abdurahim Mannonov, Tashkent State University of Oriental Studies, Tashkent, Uzbekistan

Gavharxon Saydaliyeva, Tashkent University of Information Technology, Tashkent, Uzbekistan

Gaybullaev Otabek, Samarkand State Institute of Foreign Languages, Samarkand, Uzbekistan

Sherzod Djabbarov, Jizzakh State Pedagogical University, Jizzakh, Uzbekistan

Dilnoza Ziyoyeva, Bukhara State Medical Institute named after Abu Ali ibn Sina, Bukhara, Uzbekistan

Ulugbek Eshkuvvatov, Termez State University of Engineering and Agrotechnology, Termez, Uzbekistan

Nargiza Mannapova, Uzbek National Pedagogical University named after Nizami, Tashkent, Uzbekistan

Abstract

Despite the importance of scientific writing for engineering students, its technical discourse poses a significant challenge for learners. Automated Essay Evaluation (AEE) systems promise timely, reliable feedback and thus open new possibilities for writing instruction through expedited, automated support. This paper examines the development of AEE systems designed specifically for scientific writing in engineering education. We review research that applies Natural Language Processing (NLP), machine learning, and rule-based linguistic analysis to evaluate structural coherence, domain vocabulary, and argumentative discourse. We also weigh the pedagogical advantages and drawbacks of AEE with respect to fostering self-regulated learning, supporting instructors, and maintaining assessment rigor. Results from pilot case studies and experiments show improved student writing performance and engagement with the task. The findings underscore the need for domain-specific adaptation of AEE systems, together with algorithmic transparency and attention to the ethics of automated assessment, and they challenge prevailing assumptions about the use of AEE in engineering writing instruction.
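To illustrate the kind of rule-based analysis the abstract mentions, the sketch below computes three simple features an AEE system might extract from a draft: lexical diversity, domain-vocabulary coverage, and a crude coherence proxy based on word overlap between adjacent sentences. This is a minimal illustration only; the function names, the tiny `DOMAIN_TERMS` lexicon, and the specific metrics are assumptions for demonstration, not the systems reviewed in the paper.

```python
# Hypothetical sketch of rule-based AEE feature extraction.
# All names and the mini-lexicon below are illustrative assumptions.
import re

DOMAIN_TERMS = {"stress", "load", "tensile", "modulus", "circuit"}  # assumed mini-lexicon

def extract_features(essay: str) -> dict:
    words = re.findall(r"[a-z]+", essay.lower())
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]

    # Lexical diversity: unique words / total words (vocabulary-richness proxy).
    diversity = len(set(words)) / len(words) if words else 0.0

    # Domain-vocabulary coverage: fraction of lexicon terms used in the essay.
    coverage = len(DOMAIN_TERMS & set(words)) / len(DOMAIN_TERMS)

    # Coherence proxy: mean Jaccard word overlap between adjacent sentences.
    overlaps = []
    for a, b in zip(sentences, sentences[1:]):
        wa = set(re.findall(r"[a-z]+", a.lower()))
        wb = set(re.findall(r"[a-z]+", b.lower()))
        overlaps.append(len(wa & wb) / max(len(wa | wb), 1))
    coherence = sum(overlaps) / len(overlaps) if overlaps else 0.0

    return {"diversity": diversity, "coverage": coverage, "coherence": coherence}
```

A real system would replace these surface features with learned models, but even this sketch shows how structural coherence and domain vocabulary can be operationalized as measurable quantities.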


License

This is an open access article distributed under the Creative Commons Attribution Non-Commercial (CC BY-NC) License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


The statements, opinions, and data contained in the journal are solely those of the individual authors and contributors and not of the publisher or the editor(s). The journal stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.