Original scientific article

PYTHON-DRIVEN ADAPTIVE TESTING ALGORITHMS FOR PERSONALIZED ASSESSMENT IN E-LEARNING PLATFORMS

By
Ankita Sappa

Wichita State University, Wichita, United States

Abstract

Recent advances in e-learning systems have highlighted the need for more tailored and effective assessment methods. E-learning has become increasingly common, but it brings its own challenges. This study presents the development of an adaptive testing framework implemented in Python, in which algorithms driven by a learner's real-time performance data continually adjust the difficulty and order of the questions presented. The system combines Item Response Theory (IRT) with machine learning to estimate each learner's mastery level and modify the assessment sequence in real time. Across a broad range of assessments and learner profiles, the framework yielded improved accuracy, shorter evaluation times, and greater user satisfaction. Key performance measures, such as response time, range of participation, and prediction precision, were evaluated with real data in a simulated e-learning setting. The research is significant in demonstrating how responsive testing frameworks in Python can make digital assessment more adaptive and extend customized learning experiences across all levels. This work provides, for the first time, an open-source model to be built upon within the educational technology ecosystem, while also opening pathways for the design of future intelligent tutoring systems.
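To make the approach concrete, the sketch below illustrates the kind of adaptive loop the abstract describes: a two-parameter logistic (2PL) IRT model, grid-based maximum-likelihood ability estimation, and maximum-information item selection, written in Python with NumPy. The item bank, parameter ranges, test length, and function names are illustrative assumptions, not the framework's actual implementation.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL IRT: probability of a correct response for ability theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)

def estimate_theta(responses, items, grid=np.linspace(-4, 4, 161)):
    """Maximum-likelihood ability estimate over a coarse grid
    (a simple, dependency-free stand-in for Newton-Raphson or EAP)."""
    log_lik = np.zeros_like(grid)
    for (a, b), correct in zip(items, responses):
        p = p_correct(grid, a, b)
        log_lik += np.log(p) if correct else np.log(1.0 - p)
    return grid[np.argmax(log_lik)]

def next_item(theta, item_bank, administered):
    """Select the unadministered item with maximum information at theta."""
    candidates = [i for i in range(len(item_bank)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *item_bank[i]))

# Toy simulation: a 50-item bank with random (discrimination, difficulty) pairs.
rng = np.random.default_rng(0)
item_bank = [(rng.uniform(0.8, 2.0), rng.uniform(-2.0, 2.0)) for _ in range(50)]

true_theta = 0.7          # simulated learner ability
theta_hat = 0.0           # start from the prior mean
administered, responses, items_used = set(), [], []

for _ in range(10):       # fixed-length test, purely for illustration
    idx = next_item(theta_hat, item_bank, administered)
    administered.add(idx)
    a, b = item_bank[idx]
    responses.append(bool(rng.random() < p_correct(true_theta, a, b)))
    items_used.append((a, b))
    theta_hat = estimate_theta(responses, items_used)

print(f"estimated ability after 10 items: {theta_hat:.2f}")
```

A production system would typically replace the grid estimator with an EAP or Newton-Raphson update and let the machine-learning component, rather than the maximum-information rule alone, inform which item is presented next.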


This is an open access article distributed under the Creative Commons Attribution Non-Commercial (CC BY-NC) License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

