The Electronic Journal of e-Learning provides perspectives on topics relevant to the study, implementation and management of e-Learning initiatives.
For general enquiries email administrator@ejel.org

Journal Article

Multiple Criteria Evaluation of Quality and Optimisation of e‑Learning System Components  pp141-150

Eugenijus Kurilovas, Valentina Dagiene

© Mar 2010 Volume 8 Issue 2, ECEL 2009, Editors: Shirley Williams, Florin Salajan, pp51 - 208


Abstract

The main research object of this paper is the investigation and proposal of a comprehensive quality evaluation tool for Learning Object Repositories (LORs), suitable for their multiple criteria decision analysis, evaluation and optimisation. Both LOR 'internal quality' and 'quality in use' evaluation (decision making) criteria are analysed in the paper. The authors have analysed several well-known LOR quality evaluation tools. In their opinion, a comprehensive multiple criteria LOR quality evaluation tool should include both general software 'internal quality' evaluation criteria and 'quality in use' evaluation criteria suitable for the particular project or user. The proposed LOR 'Architecture' group criteria are general 'internal quality' evaluation criteria, while 'Metadata', 'Storage', 'Graphical user interface' and 'Other' are customisable 'quality in use' evaluation criteria. The authors also present their comprehensive Virtual Learning Environments (VLEs) quality evaluation tool, which combines both 'internal quality' (i.e., 'General Architecture') and 'quality in use' (i.e., 'Adaptation') technological evaluation criteria, and they propose using a quality evaluation rating tool while evaluating LORs and VLEs. The authors show that if we want to optimise LORs and VLEs (or other learning software packages) for individual learner needs, i.e., to personalise his/her learning process in the best way according to prerequisites, preferred learning speed, methods, etc., we should use the experts' additive utility function, which combines the proposed LOR and VLE expert evaluation criteria ratings with the experts' preferred weights of the evaluation criteria. In this case we have a multiple criteria optimisation task using criteria ratings and their weights.
Quality evaluation criteria of the main e-Learning system components, i.e., LORs and VLEs, are further investigated as possible learning software package optimisation parameters. A scalarisation method is explored in the paper for optimising learning software packages according to individual learners' needs. Several open source VLE evaluation results are also presented in the paper.
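The experts' additive utility function described in the abstract is, in essence, a weighted sum of criteria ratings. A minimal sketch in Python follows, assuming hypothetical criteria names, ratings and weights; none of these values come from the paper:

```python
def additive_utility(ratings, weights):
    """Weighted-sum (additive) utility of expert criteria ratings.

    ratings: criterion -> expert rating (e.g. on a 1-5 scale)
    weights: criterion -> expert-preferred weight; weights must sum to 1
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * ratings[c] for c in weights)

# Hypothetical expert ratings for one learning software package
# (illustration only, not the paper's data).
ratings = {"architecture": 4, "metadata": 5, "storage": 3}
# Hypothetical expert-preferred weights reflecting one learner profile.
weights = {"architecture": 0.5, "metadata": 0.3, "storage": 0.2}

score = additive_utility(ratings, weights)  # 0.5*4 + 0.3*5 + 0.2*3 = 4.1
```

Personalisation then amounts to choosing a different weight vector for each learner profile and selecting the package with the highest utility, which is the scalarisation of the multiple criteria optimisation task the abstract refers to.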

 

Keywords: managing quality in e-learning, multiple criteria evaluation, learning object repositories, virtual learning environments, optimisation

 


Journal Article

Learning Objects and Virtual Learning Environments Technical Evaluation Criteria  pp127-136

Eugenijus Kurilovas, Valentina Dagiene

© Jun 2009 Volume 7 Issue 2, Editor: Shirley Williams, pp85 - 190


Abstract

The main scientific problems investigated in this article concern the technical evaluation of quality attributes of the main components of e-Learning systems (referred to here as DLEs: Digital Libraries of Educational Resources and Services), i.e., Learning Objects (LOs) and Virtual Learning Environments (VLEs). The main research object of the work is the effectiveness of methods for evaluating the quality of DLE components. The aim of the article is to analyse popular existing LO and VLE technical evaluation tools, to formulate new, more complex tools for the technical quality evaluation of LOs and VLEs based on the requirements for a flexible DLE, and to evaluate the most popular open source VLEs against the new, more complex criteria. Complex tools have been created for the evaluation of DLE components, based on a flexible approach. The authors have analysed existing tools for the technical evaluation of LOs and found that they have a number of limitations: some do not examine the different LO life cycle stages, others insufficiently examine technical evaluation criteria before LO inclusion in the repository, and all of them insufficiently examine LO reusability criteria. Therefore, a more complex LO technical evaluation tool is needed. The analysis showed that this new tool should include LO technical evaluation criteria suitable for the different LO life cycle stages, i.e., criteria applied before, during and after LO inclusion in the repository, as well as LO reusability criteria. The authors have also examined several VLE technical evaluation tools suitable for a flexible DLE and found that these too have limitations: several practically do not examine VLE adaptation capabilities criteria, while others insufficiently examine general technical criteria. A more complex VLE technical evaluation tool is therefore needed.
Therefore the authors propose an original, more complex set of VLE technical evaluation criteria combining (1) General (Overall architecture and implementation; Interoperability; Internationalisation and Localisation; Accessibility) and (2) Adaptation (Adaptability; Personalisation; Extensibility and Adaptivity) VLE technical evaluation criteria. The authors have also selected and proposed a universal, clear and convenient rating tool for evaluating DLE components, and have evaluated the three most popular open source VLEs against the technical (both general and adaptation) criteria using this rating tool.
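The rating-tool comparison of VLEs against the two criteria groups can be sketched as a weighted ranking. The VLE names, group ratings and weights below are placeholders for illustration only, not the paper's evaluation results:

```python
# Placeholder ratings of three VLEs against the two criteria groups
# (General, Adaptation); hypothetical values, not the paper's data.
vles = {
    "VLE-A": {"general": 4, "adaptation": 3},
    "VLE-B": {"general": 3, "adaptation": 2},
    "VLE-C": {"general": 3, "adaptation": 3},
}
# Expert-preferred weights for the two criteria groups (sum to 1).
weights = {"general": 0.6, "adaptation": 0.4}

# Weighted additive score per VLE, then pick the best-ranked one.
scores = {name: sum(weights[g] * rating[g] for g in weights)
          for name, rating in vles.items()}
best = max(scores, key=scores.get)  # the VLE with the highest utility
```

Re-weighting the two groups (e.g. emphasising Adaptation for a project focused on personalised learning) can change the ranking, which is exactly why the authors treat the weights as expert-chosen parameters rather than fixed constants.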

 

Keywords: managing quality in e-learning, technical evaluation, virtual learning environments, learning objects, repositories

 
