A Field Study of a Video Supported Seamless-Learning-Setting with Elementary Learners

Thomas Fößl, Martin Ebner, Sandra Schön, Andreas Holzinger

Research output: Contribution to journal › Article › peer-review

Abstract

Seamless learning aims to initiate learning processes that extend beyond the limits of lessons and classrooms. At the same time, this approach fosters self-regulated learning by means of inspiring, open educational settings. Advanced learning materials are easily accessible via mobile digital devices connected to the Internet. This study explored whether and to what extent an open learning approach can be initiated with the support of videos and incentives. The study took place in a real-world setting during conventional mathematics classes in an Austrian secondary school with N = 85 children with an average age of 10.6 years. For the investigation, a traditional face-to-face mathematics teaching environment was completely replaced by an open learning environment, in which the elementary learners were able to choose their own learning pace and preferences via example videos. In addition to the open educational approach and the videos, learning was also incentivised through a reward system of “stars.” A pre-test/post-test control-group study showed that learning performance increased significantly, which we attribute to the combination of the novel teaching and learning setting and the coupled incentives.
Original language: English
Pages (from-to): 321–336
Journal: Educational Technology & Society
Volume: 19
Issue number: 1
DOIs
Publication status: Published - 2016

Keywords

  • open education
  • mathematics
  • seamless learning

ASJC Scopus subject areas

  • Computer Science Applications

Fields of Expertise

  • Information, Communication & Computing

Treatment code (Nähere Zuordnung)

  • Theoretical
  • Experimental
