The Electronic Journal of e-Learning provides perspectives on topics relevant to the study, implementation and management of e-Learning initiatives
Journal Article

The Role of Essay Tests Assessment in e‑Learning: A Japanese Case Study  pp173-178

Minoru Nakayama, Hiroh Yamamoto

© Mar 2010 Volume 8 Issue 2, ECEL 2009, Editor: Shirley Williams, Florin Salajan, pp51 - 208


Abstract

e‑Learning has some restrictions on how learning performance is assessed. Online testing is usually in the form of multiple‑choice questions, without any essay type of learning assessment. Major reasons for employing multiple‑choice tasks in e‑learning include ease of implementation and ease of managing learners' responses. To address this limitation in online assessment of learning, this study investigated an automatic assessment system as a natural language processing tool for conducting essay‑type tests in online learning. The study also examined the relationship between learner characteristics and learner performance in essay‑testing. Furthermore, the use of evaluation software for scoring Japanese essays was compared with experts' assessment and scoring of essay tests. Students were enrolled in two‑unit courses taught by the same professor: a hybrid learning course at the bachelor's level, a fully online course at the bachelor's level, and a hybrid learning course at the master's level. All students took part in the final test, which included two essay‑tests, at the end of the course, and received the appropriate credit units. Learner characteristics were measured using five constructs: motivation, personality, thinking styles, information literacy and self‑assessment of online learning experience. The essay‑tests were assessed by two outside experts, who found the two essay‑tests to be sufficient for course completion. Another score, generated using assessment software, consisted of three factors: rhetoric, logical structure and content fitness. Results show that experts' assessment significantly correlates with the logical structure factor of the essay for all courses. This suggests that expert evaluation of the essay is focused on logical structure rather than on other factors. When the scores of the experts' assessment were compared between the hybrid learning and fully online courses at the bachelor's level, no significant differences were found. This indicates that in fully online learning, as well as in hybrid learning, learning performance can be measured using essay tests without the need for a face‑to‑face session to conduct this type of assessment.


Keywords: online learning, essay-testing, learner characteristics, learning performance
