The Electronic Journal of e-Learning provides perspectives on topics relevant to the study, implementation and management of e-Learning initiatives

Journal Article

Student Online Readiness Assessment Tools: A Systematic Review Approach  pp376-384

Farid Alem, Michel Plaisent, Prosper Bernard, Okoli Chitu

© Jul 2014 Volume 12 Issue 4, Editors: Dr Rikke Ørngreen and Dr Karin Tweddell Levinsen, pp313 - 410


Abstract

Abstract: Although tools exist to assess students' readiness in an online learning context, little is known about their psychometric properties. A systematic review was conducted of 5,107 published and unpublished papers identified in a literature search on student online readiness assessment tools between 1990 and 2010. The objective of this paper was to identify, through the systematic review, the different tools used to assess students' level of preparation for an online learning environment, whether or not they were published in scientific journals, and to determine which of these tools have been validated. The results show that no standard tool exists and that only ten instruments have been developed and published over the past 20 years to assess students' readiness. In addition, few of the published tools demonstrated good psychometric qualities, and many unpublished, homemade tools were developed internally by teams of professors at universities without regard to their psychometric quality. It also appears that the tools published in scientific journals are rarely used by universities that offer online courses; generally, universities prefer to develop their own instruments to fit their online programs.

 

Keywords: Systematic review of online preparedness, Tool validity, Readiness for online learning, Internet-delivered training

 


Journal Article

A Roadmap to Cope with Common Problems in E‑Learning Research Designs  pp336-349

Javier Sarsa, Tomás Escudero

© Dec 2016 Volume 14 Issue 5, Editor: Robert Ramberg, pp291 - 349


Abstract

Abstract: E‑learning research is full of difficulties, as is educational research in general. The high number of features involved in e‑learning processes usually complicates and masks the identification and isolation of the factors that cause the expected benefits, when such benefits exist. At the same time, a number of threats can weaken the validity of the research: disregard of previous research, use of small samples, absence of randomization in the assignment to groups, ineffective designs, lack of objectivity in the measuring process, poor descriptions of the research in publications (which leave few possibilities for replication), wrong statistical procedures, inappropriate inference of results, and so on. These obstacles accumulate and are carried through the whole research process, resulting in low‑quality or irrelevant studies. This theoretical paper suggests a roadmap for facing the most common problems in e‑learning research. The roadmap sets out cautions that must be considered at each stage of the research, together with recommendations to increase the validity and reproducibility of results. The roadmap and conclusions included in this paper have been drawn from our experience in educational and e‑learning research, from our long record as reviewers for key journals in these fields, and from readings of significant research handbooks. It is not a strict guide but a set of milestones at which it is necessary to stop and reflect.

 

Keywords: e-Learning research, educational technology, research designs, e-learning effectiveness, methodology, validity

 


Journal Issue

Volume 14 Issue 5 / Dec 2016  pp291‑349

Editor: Robert Ramberg


Editorial

Guest Editors


Robert Ramberg earned his PhD in cognitive psychology at the department of psychology, Stockholm University, and holds a position as professor at the department of computer and systems sciences, Stockholm University (technology enhanced learning and collaboration). Ramberg also holds a position as research director at the Swedish air force simulation center (FLSC), Swedish Defense Research Agency. Broadly conceptualized, his research focuses on the design and evaluation of representations and representational artefacts to support learning, training and collaboration. Of particular interest to his research are socio‑cultural perspectives on learning, cognition and pedagogy, how these theories must be adapted when designing and evaluating technology enhanced learning and training environments, and, more specifically, how artifacts of various kinds (information technology and other tools) mediate human action, collaboration and learning.

 

Keywords: Higher Education, Action Research, Digital Competencies, Mixed methods research, Technology enhanced learning, Staff development, HEIs, Technology acceptance, Power, Culture, Foucault, Ofsted, Autonetnography, ANG, Autoethnography, Meta-ethnography, eLearning, Networked learning, Reflexivity, eResearch methodology, Online learner and teacher scholarship, Online professional development, e-Learning research, Educational technology, Research designs, e-Learning effectiveness, Methodology, Validity

 
