Adaptive task selection in automated educational software: A comparative study

Rina Azoulay, Esther David, Mireille Avigal, Dorit Hutzler

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

3 Scopus citations

Abstract

In this study, we consider the challenge of adapting the difficulty level of the tasks suggested to a student by an educational software system. We investigate the effectiveness of different learning algorithms for adapting task difficulty to a student's level and compare their efficiency by means of simulation with virtual students. Our results demonstrate that the methods based on Bayesian inference outperformed most of the other methods, while in dynamic improvement domains the item response theory method achieved the best results. Because correctly adapting tasks to individual learners' abilities can increase their learning gains and satisfaction, our study can assist designers of intelligent tutoring systems in selecting an appropriate adaptation method, given the needs and goals of the educational system and the characteristics of the learners.
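
To illustrate the kind of adaptation policy the abstract refers to, the following minimal sketch combines a Bayesian update of a student-ability estimate with an IRT-style logistic success model to pick the next task difficulty. It is not the authors' method or the chapter's evaluated models; the ability grid, the one-parameter logistic model, and the 0.7 target success rate are assumptions chosen only for demonstration.

```python
# Illustrative sketch only: a minimal Bayesian difficulty-adaptation loop.
# The ability grid, logistic success model, and 0.7 target success rate are
# assumptions for demonstration, not the models evaluated in the chapter.
import numpy as np

ABILITY_LEVELS = np.linspace(-3, 3, 61)    # candidate student ability values
DIFFICULTIES = np.linspace(-2.5, 2.5, 11)  # difficulty of available tasks

def p_correct(ability, difficulty):
    """Probability of solving a task (1-parameter logistic, IRT-style)."""
    return 1.0 / (1.0 + np.exp(-(ability - difficulty)))

def update_belief(belief, difficulty, correct):
    """Bayesian update of the ability distribution after one observed answer."""
    likelihood = p_correct(ABILITY_LEVELS, difficulty)
    if not correct:
        likelihood = 1.0 - likelihood
    posterior = belief * likelihood
    return posterior / posterior.sum()

def choose_task(belief, target=0.7):
    """Pick the task whose expected success probability is closest to the target."""
    expected = np.array([(belief * p_correct(ABILITY_LEVELS, d)).sum()
                         for d in DIFFICULTIES])
    return DIFFICULTIES[np.argmin(np.abs(expected - target))]

# Example: simulate a virtual student with true ability 1.0
rng = np.random.default_rng(0)
belief = np.ones_like(ABILITY_LEVELS) / ABILITY_LEVELS.size  # uniform prior
for step in range(20):
    d = choose_task(belief)
    correct = rng.random() < p_correct(1.0, d)
    belief = update_belief(belief, d, correct)
print("estimated ability:", (belief * ABILITY_LEVELS).sum())
```

In a setting like the one simulated in the study, the virtual student answers each selected task, the belief over ability is updated, and the next task is chosen to keep the expected success rate near the target.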

Original language: English
Title of host publication: Intelligent Systems and Learning Data Analytics in Online Education
Publisher: Elsevier
Pages: 179-204
Number of pages: 26
ISBN (Electronic): 9780128234105
DOIs
State: Published - 1 Jan 2021
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2021 Elsevier Inc. All rights reserved.

Keywords

  • Adaptive task selection
  • Educational technology
  • Intelligent tutoring systems
  • Machine learning
  • Reinforcement learning
