Journal Article

A Data Warehouse Model for Micro‑Level Decision Making in Higher Education  pp235-244

Liezl van Dyk

© Nov 2008 Volume 6 Issue 3, Editor: Shirley Williams, Laura Czerniewicz, pp161 - 254


Abstract

An abundance of research, by educational researchers and scholars of teaching and learning alike, can be found on the use of ICT to plan, design and deliver learning and assessment activities. This covers the first steps of the instructional design process quite thoroughly. However, the use of ICT and quantitative methods to close the instructional design cycle, by supporting sustainable decision making about the effectiveness of teaching processes, holds much untapped potential. In this paper a business intelligence approach is followed in an attempt to take advantage of ICT to enable the evaluation of the effectiveness of the process of facilitating learning. The focus is on micro‑level decision support based on data drawn from the Learning Management System (LMS). Three quantifiable measures of online behaviour and three quantifiable measures of teaching effectiveness are identified from the literature to arrive at a 3x3 matrix from which nine measures of e‑teaching effectiveness can be derived by means of pair‑wise correlation. The value and significance of information increase within the context of other information. The paper shows how the value of LMS tracking data increases within the context of data from other modules or other years, and that useful information is created when this tracking data is correlated with measures of teaching effectiveness such as results, learning styles and student satisfaction. This information context can only be created when a deliberate business intelligence approach is followed. A data warehouse model is proposed to accomplish exactly this.
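As an illustration of the pair‑wise correlation idea behind the 3x3 matrix, the sketch below correlates three LMS tracking measures with three teaching‑effectiveness measures to yield the nine derived values. The behaviour measure names and sample data are hypothetical; only "results, learning styles and student satisfaction" come from the abstract, and this is not the paper's own model.

```python
# Hypothetical sketch of the 3x3 pair-wise correlation matrix described in the
# abstract: three LMS tracking measures of online behaviour correlated with
# three measures of teaching effectiveness. Names and data are illustrative.
import numpy as np

# One value per student; columns are invented LMS tracking measures.
behaviour = {
    "logins_per_week": np.array([4, 7, 2, 9, 5, 6]),
    "content_views":   np.array([30, 55, 12, 80, 41, 47]),
    "forum_posts":     np.array([1, 5, 0, 8, 3, 4]),
}

# Corresponding teaching-effectiveness measures for the same students.
effectiveness = {
    "module_result":      np.array([55, 72, 48, 85, 63, 70]),
    "learning_style_fit": np.array([0.4, 0.7, 0.3, 0.9, 0.5, 0.6]),
    "satisfaction":       np.array([3.1, 4.2, 2.8, 4.8, 3.6, 4.0]),
}

# Build the 3x3 matrix: one Pearson correlation per behaviour/effectiveness pair.
matrix = {
    (b_name, e_name): float(np.corrcoef(b_vals, e_vals)[0, 1])
    for b_name, b_vals in behaviour.items()
    for e_name, e_vals in effectiveness.items()
}

for (b_name, e_name), r in matrix.items():
    print(f"corr({b_name}, {e_name}) = {r:+.2f}")
```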

 

Keywords: learning management system, data warehouse, student tracking, decision support, student feedback, learning styles

 


Journal Article

Game Inspired Tool Support for e‑Learning Processes  pp101-110

Marie-Thérèse Charles, David Bustard, Michaela Black

© Jun 2009 Volume 7 Issue 2, Editor: Shirley Williams, pp85 - 190


Abstract

Student engagement is crucial to the success of e‑learning but is often difficult to achieve in practice. One significant factor is the quality of the learning content; also important, however, is the suitability of the process through which that material is studied. In recent years much research has been devoted to improving e‑learning content, but considerably less attention has been given to enhancing the associated e‑learning process. This paper focuses on that process, considering in particular how student engagement might be improved using techniques common in digital games. The work is motivated by a belief that, with careful design, e‑learning systems may be able to achieve the levels of engagement expected of digital games. In general, such games succeed by entertaining players, building on their natural curiosity and competitiveness to encourage them to continue to play. This paper supports a belief that by adopting some of the engagement techniques used in games, e‑learning can become equally successful. In particular, the paper considers how the learning process might become a form of game that helps sustain continued study. Factors affecting engagement and elements of digital games that make them engaging are identified. A proposal for improving engagement is then outlined. The approach is to encourage student involvement by rewarding desirable behaviour, including the completion of optional challenges, and giving regular feedback on performance, measured against others in the same class. Feedback is provided through a web‑based tool. The paper describes an exploratory assessment of both the tool and the approach through action research. Results for two linked university modules teaching software development are presented. The results so far are very encouraging in that student engagement and performance have increased, especially at the weaker end of the class. Limitations of the approach are also outlined, together with an indication of future research plans.
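As a rough sketch of the reward‑and‑feedback mechanism described in the abstract, not of the authors' own web‑based tool, the fragment below awards points for completed core tasks and optional challenges and reports each student's standing relative to the rest of the class; the point values and activity names are invented.

```python
# Minimal sketch: reward desirable behaviour (including optional challenges)
# and feed back each student's class-relative standing. All constants invented.
from dataclasses import dataclass

CORE_TASK_POINTS = 10          # hypothetical reward per completed core task
OPTIONAL_CHALLENGE_POINTS = 15  # optional challenges rewarded more generously


@dataclass
class Activity:
    core_tasks_done: int
    challenges_done: int


def score(a: Activity) -> int:
    """Translate logged activity into reward points."""
    return a.core_tasks_done * CORE_TASK_POINTS + a.challenges_done * OPTIONAL_CHALLENGE_POINTS


def class_feedback(activity_log: dict[str, Activity]) -> dict[str, str]:
    """Return a short feedback line per student: points and rank within the class."""
    points = {student: score(a) for student, a in activity_log.items()}
    ranking = sorted(points, key=points.get, reverse=True)
    return {
        student: f"{points[student]} points, ranked {ranking.index(student) + 1} of {len(ranking)}"
        for student in points
    }


if __name__ == "__main__":
    log = {
        "alice": Activity(core_tasks_done=5, challenges_done=3),
        "bob":   Activity(core_tasks_done=4, challenges_done=0),
        "carol": Activity(core_tasks_done=5, challenges_done=1),
    }
    for student, line in class_feedback(log).items():
        print(f"{student}: {line}")
```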

 

Keywords: e-learning, digital games, engagement, feedback, action research

 


Journal Article

Enhancing the Impact of Formative Feedback on Student Learning Through an Online Feedback System  pp111-122

Thanos Hatziapostolou, Iraklis Paraskakis

© Mar 2010 Volume 8 Issue 2, ECEL 2009, Editor: Shirley Williams, Florin Salajan, pp51 - 208


Abstract

Formative feedback is instrumental in the learning experience of a student. It can be effective in promoting learning if it is timely, personal, manageable, motivational, and in direct relation with the assessment criteria. Despite its importance, however, research suggests that students are discouraged from engaging in the feedback process, primarily for reasons that relate to lack of motivation and difficulty in relating to and reflecting on the feedback comments. In this paper we present the Online FEdback System (OFES), an e‑learning tool that effectively supports the provision of formative feedback. Our aims are to enhance feedback reception and to strengthen the quality of feedback through the way feedback is communicated to the students. We propose that an effective feedback communication mechanism should be integrated into a student's online learning space, and it is anticipated that this provision will motivate students to engage with feedback. Empirical evidence suggests that the developed system successfully addressed the issues of student engagement and motivation and achieved its objectives. The results of using the system for two years indicate a positive perception among the students which, in turn, encourages us to further explore its effectiveness by extending its functionality and integrating it into an open source learning management system.

 

Keywords: formative feedback, online feedback, student engagement, student motivation

 


Journal Article

An Automated Individual Feedback and Marking System: An Empirical Study  pp1-14

Trevor Barker

© Apr 2011 Volume 9 Issue 1, ECEL 2010 special issue, Editor: Carlos Vaz de Carvalho, pp1 - 114


Abstract

The recent National Student Survey showed that feedback to students was an ongoing problem in Higher Education. This paper reports on the extension of our past research into the provision of automated feedback for objective testing. In the research presented here, the system has been further developed for marking practical and essay questions and providing automated feedback. Recent research at the University of Hertfordshire was able to show that learners and tutors accept and value our automated feedback approach based on objective tests and Computer Adaptive Testing. The research reported in this paper is an important extension to this work. The automated feedback system developed for objective testing has been extended to include practical testing and essay type questions. The automated feedback system, which can be used within any subject area, is based on a simple marking scheme created by the subject tutor as a text file according to a simple template. Marks for each option and a set of feedback statements are held within a database on a computer. As marks are awarded for each question by the teacher, an individual feedback file is created automatically for each learner. Teachers may also add and modify comments for each learner and save additional feedback to the database for later use. Each individual feedback file was emailed automatically to learners. The development of the system is explained in the paper, and testing and evaluation with 350 first year (one final practical test), 120 second year (one written and one practical test) and 100 final year (one final practical test) undergraduate Computer Science students is reported. It was found that the time to mark practical and essay type tests was reduced by more than 30% in all cases compared to previous years. More importantly, it was possible to provide good quality individual feedback to learners rapidly. Feedback was delivered to all within three weeks of the test submission date. For end‑of‑module tests this was particularly beneficial, as it had proven difficult in the past to provide feedback after modules had ended. Examples of the feedback provided are presented in the paper, and the development of the system using a user‑centred approach based on student and staff evaluation is explained. The comments of staff teaching on these modules and a sample of students who took part in this series of evaluations of the system are presented. The results of these evaluations were very positive and are reported in the paper, showing the changes that were made to the system at each iteration of the development cycle. The provision of fast, effective feedback is vital, and this system was found to be an important addition to the tools available.
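A minimal sketch of the kind of pipeline the abstract describes, assuming an invented marking‑scheme format (the paper's own template is not reproduced here): a tutor‑written scheme is parsed, the marks awarded per question select a feedback statement, and an individual feedback file is assembled for each learner. The automatic emailing step is omitted.

```python
# Hedged sketch: tutor marking scheme -> awarded marks -> per-learner feedback file.
# The "question | max marks | feedback per band" text format is made up for illustration.
from pathlib import Path

MARKING_SCHEME = """\
Q1 | 10 | Excellent use of data structures | Sound attempt, revise complexity analysis | See lecture 3 on lists and dictionaries
Q2 | 15 | Clear, well-tested solution | Logic is right but test coverage is thin | Review the unit-testing practical
"""


def parse_scheme(text: str) -> dict[str, tuple[int, list[str]]]:
    """Map question id -> (max marks, feedback statements ordered best to weakest)."""
    scheme = {}
    for line in text.strip().splitlines():
        qid, max_marks, *feedback = [field.strip() for field in line.split("|")]
        scheme[qid] = (int(max_marks), feedback)
    return scheme


def feedback_for(scheme, awarded: dict[str, int], extra_comment: str = "") -> str:
    """Pick one feedback statement per question based on the fraction of marks awarded."""
    lines = []
    for qid, mark in awarded.items():
        max_marks, statements = scheme[qid]
        band = 0 if mark >= 0.7 * max_marks else 1 if mark >= 0.4 * max_marks else 2
        lines.append(f"{qid}: {mark}/{max_marks} - {statements[band]}")
    if extra_comment:
        lines.append(f"Tutor comment: {extra_comment}")
    return "\n".join(lines)


if __name__ == "__main__":
    scheme = parse_scheme(MARKING_SCHEME)
    report = feedback_for(scheme, {"Q1": 8, "Q2": 6}, extra_comment="Good progress overall.")
    Path("feedback_student_001.txt").write_text(report)  # one feedback file per learner
    print(report)
```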

 

Keywords: assessment, feedback, automated systems, development, evaluation

 


Journal Article

Online formative assessment in higher education: Its pros and cons  pp228-236

Zwelijongile Gaylard Baleni

© Apr 2015 Volume 13 Issue 4, ECEL 2014, Editor: Kim Long, pp205 - 315


Abstract

Online and blended learning have become common educational strategies in higher education. Lecturers have to re‑theorise certain basic concerns of teaching, learning and assessment in non‑traditional environments. These concerns include perceptions such as the cogency and trustworthiness of assessment in online environments in relation to serving its intended purposes, as well as understanding how formative assessment operates within an online learning environment. Also important is how formative assessment benefits both student learning and teaching within pedagogical strategies in an online context. This paper's concern is how online formative assessment supports teaching and learning, and how lecturers and students benefit from it. A mixed‑method questionnaire on formative assessment, with a main focus on how formative assessment operates within online contexts, was used to collect data from courses using Blackboard. Lecturers and students at a comprehensive university were the population. Various techniques for formative assessment linked with online tools, such as discussion forums and objective tests, were used. The benefits identified include improved student commitment, faster feedback, enhanced flexibility around the time and place of taking the assessment task, and value in the process for students; lecturers also benefited from less marking time and savings in administrative costs. The crucial findings are that effective online formative assessment can nurture a student‑ and assessment‑centred focus through formative feedback and enrich student commitment with valued learning experiences. Ongoing trustworthy assessment tasks and interactive formative feedback were identified as significant features that address threats to validity and trustworthiness within the milieu of online formative assessment.

 

Keywords: online formative assessment, formative feedback, student engagement, learning

 


Journal Article

The effectiveness of instructor personalized and formative feedback provided by instructor in an online setting: some unresolved issues  pp196-203

Dolors Plana-Erta, Soledad Moya, Pep Simo

© Jul 2016 Volume 14 Issue 3, Editor: Rikke Ørngreen and Karin Levinsen, pp150 - 232


Abstract

Formative feedback has great potential for teaching and learning in online undergraduate programmes. There is a large number of courses where the main source of feedback is the instructor. This is particularly seen in subjects where assessments are designed around specific activities that are the same for all students, and where the assessment is performed by the instructor, not by a peer. Additionally, in introductory or basically procedural courses there is often a need for instructor feedback, as opposed to peer feedback, as it demands high‑quality feedback both in the content and in the process in order not to mislead students. Personalized feedback provided by the instructor is therefore an academic demand in current educational models that have positioned the student at the center of the learning process. However, in the present context of high student‑staff ratios, it is not easy to extend the use of individual comments delivered by instructors across the academic community. This article focuses on the virtual higher education environment, given its present and future potential as well as the number of queries currently surrounding it. Literature on formative feedback in higher education has been reviewed for the period 2000 to 2014, in order to find answers as to which aspects are relevant to efficiently implement personalized feedback prepared by the teacher. Findings show that effective personalized feedback in a virtual environment requires a three‑dimensional analysis: from the student perspective, from the instructor perspective and from the media perspective (written text, video recording or audio recording), in order to find shared aspects that contribute to the enhancement of the use of personalized feedback performed by faculty.

 

Keywords: formative feedback, effective feedback, online feedback, student-professor dialogue

 


Journal Article

iSELF: The development of an Internet‑Tool for Self‑Evaluation and Learner Feedback  pp313-325

Nicolet Theunissen, Hester Stubb

© Jul 2014 Volume 12 Issue 4, Editor: Dr Rikke Ørngreen and Dr Karin Tweddell Levinsen, pp313 - 410


Abstract

This paper describes the theoretical basis and development of the iSELF: an Internet‑tool for Self‑Evaluation and Learner Feedback to stimulate self‑directed learning in ubiquitous learning environments. In ubiquitous learning, learners follow their own trails of interest, scaffolded by coaches, peers and tools for thinking and learning. Ubiquitous learning solutions include on‑ and off‑line, formal and informal learning. To benefit from its possibilities, learners need to develop competencies for self‑directed learning. To do so, a self‑evaluation tool can help the learner to gain insight into his/her own development, to manage and monitor his/her own learning process, to collaborate in learning, to relate the learning to 'real life' needs, and to take control over educational decisions. The iSELF was developed in an iterative process, complying with the following high‑level requirements: (1) Enabling learning anytime, anywhere; (2) Supporting self‑directed learning; (3) Evaluating learner, learning solutions and job needs; (4) Assessing learner competencies; (5) Using a card‑sort method for questionnaires; (6) Facilitating questionnaires 'under construction'; and (7) User‑friendly design. The resulting online tool contained a card‑sort module, looking somewhat like a 'solitaire' game, a profile module to evaluate core competencies, and a feedback module to suggest learning possibilities. For illustration, 14 different studies that contributed to the development of iSELF and to the development of self‑evaluation questionnaires compliant with iSELF are briefly discussed. These illustrative studies included various populations: e.g. students, employees from small and medium enterprises, crisis management organizations, and the military. Usefulness and usability of the self‑evaluation tool were valued positively. The iSELF contributes to an adaptive ubiquitous learning environment in which the learner can make educational decisions according to self‑directed learning principles, and will stimulate self‑directed learning in a ubiquitous learning environment.

 

Keywords: self-evaluation, self-assessment, internet-tool, ubiquitous learning, self-directed learning, feedback

 


Journal Issue

Volume 9 Issue 1, ECEL 2010 special issue / Apr 2011  pp1‑114

Editor: Carlos Vaz de Carvalho


Editorial

e‑Learning is one of the most active fields of research and practice in Europe, in all the education and training sectors. The use of new and innovative technologies for learning is raising expectations and motivation among researchers, teachers, students and other education stakeholders. The European Conference on e‑Learning (ECEL) is an annual event that has been at the forefront of this revolution. It brings together groups of people in a variety of areas related to e‑Learning, seeking to combine cutting‑edge research with practical, real‑life applications in order to advance the state of e‑Learning around Europe.

The 9th European Conference on e‑Learning ‑ ECEL 2010 took place in Porto, Portugal. Porto is renowned for its historical City Centre (World Heritage) and its wine but also for being an innovation‑prone city which is an excellent environment for an e‑learning conference. This special edition of EJEL is dedicated to ECEL 2010.

From an initial submission of 220 abstracts, 97 papers were published in the Conference Proceedings after the double‑blind peer review process, an acceptance rate that places ECEL 2010 at the top of conference quality rankings. The number of high‑quality submissions to the conference required a thorough process of selection by the session chairs and the editors to finally produce this edition of the journal. The selected articles cover different points of view of e‑learning, from a more technological approach to a more pedagogical one.

The first set of articles is precisely concerned with technological aspects and, in particular, with the importance of computer aided assessment systems to the efficiency of e‑learning. Trevor Barker presents a study on the importance of automated feedback in providing good quality individual feedback to learners. He also demonstrates that these systems, by relieving teachers of the exhaustive task of test marking, can give them more time for communicating with students. Escudeiro and Cruz present a very innovative approach to the grading of students' free‑text answers. Their work minimizes fluctuations in the evaluation criteria, improves detection of plagiarism, reduces the assessment process time and allows teachers to focus on feedback to the students. Gütl, Lankmayr, Weinhofer and Höfler approach the design, development and validation of an automatic test item creation tool. This tool is able to extract concepts from textual learning content and create different types of questions on the basis of those concepts.

To complete this more technically‑oriented view, Kurilovas, Bireniene and Serikoviene present a model and several scientific methods for the quality evaluation of Learning Objects (LOs). They pay special attention to their reusability level, in particular, when crossing linguistic barriers.

The second set of articles focuses on pedagogical aspects of e‑learning and, in particular, on student‑related issues. Karin Levinsen presents new concepts related to e‑learning. She addresses the phenomenology of acquiring digital literacy and self‑programming in order to be able to identify relevant learning objectives and scaffolding.

Marques and Belo approach the profiling of students through their web usage habits. Through their investigations they can discover what students do, by establishing user navigation patterns on web‑based platforms, and learn how they explore and search the pages of the sites they visit. Nakayama and Yamamoto also address student issues by examining participants’ assessments made during the transitional phase in a learning environment which includes blended and fully online courses. O’Hara, Reis, Esteves, Brás and Branco focus on the effectiveness of learning through sports with the systematic integration of interactive situations in different contexts, with or without electronic devices. Sabey and Horrocks tackle the need for new electronic resources for health research for use within the context of a classroom‑taught course. They describe the process of developing an interactive resource incorporating a narrative element. Finally, Tuncay, Stanescu and Tuncay present a very innovative approach to the use of metaphors in e‑learning to reinforce communication between students and teachers.
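As a purely illustrative aside on navigation‑pattern profiling of the kind summarised above (the issue keywords mention clickstream analysis and Markov chains), the sketch below estimates first‑order transition probabilities between pages from click sessions. It is not Marques and Belo's actual method; page names and sessions are invented.

```python
# Hypothetical sketch: estimate a first-order Markov transition matrix from
# clickstream sessions to characterise how students navigate a web platform.
from collections import Counter, defaultdict


def transition_matrix(sessions: list[list[str]]) -> dict[str, dict[str, float]]:
    """Estimate page-to-page transition probabilities from ordered click sessions."""
    counts: dict[str, Counter] = defaultdict(Counter)
    for session in sessions:
        for current_page, next_page in zip(session, session[1:]):
            counts[current_page][next_page] += 1
    return {
        page: {nxt: n / sum(following.values()) for nxt, n in following.items()}
        for page, following in counts.items()
    }


if __name__ == "__main__":
    # Each inner list is one student's ordered sequence of visited pages (invented).
    sessions = [
        ["home", "lecture_notes", "quiz", "forum"],
        ["home", "forum", "lecture_notes", "quiz"],
        ["home", "lecture_notes", "quiz"],
    ]
    for page, probs in transition_matrix(sessions).items():
        print(page, "->", {nxt: round(p, 2) for nxt, p in probs.items()})
```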

As chair of ECEL 2010 and editor of this special edition of EJEL I feel privileged to have been in contact with such exciting thoughts, ideas and projects presented by the authors. It is now my pleasure to pass on to you this collection of articles, knowing for sure that they will motivate you to continue or even to start your research, development or use of e‑Learning as a major learning strategy. I also look forward to meeting you in Brighton, this autumn, for another fascinating ECEL conference.

 

Keywords: active learning, assessment, assessment in transition, automated systems, automated test item creation, blended learning, Clickstream analysis, computer-based assessment, design for teaching and learning, development, distance learning, e-assessment, eLearning, evaluation, evidence-based practice, feedback, free-text assisted grading, fully online learning, learning objects, lifelong learning, Markov chains, metaphors, multiple criteria decision analysis, narrative, natural language processing, Navigation paths analysis, networked society, nurse education, online learning, optimisation, quality evaluation, research methods teaching, reusability, self-directed learning, self-programming, skills acquisition, sport, student assessment, students, SurveyMonkey, task design, technology, text mining, web based elearning platforms, web usage profiling.

 
