Sensors for reflection and learning (Topic of the month)

There is a whole movement around using sensor data for self-tracking and self-analysis.

The Quantified Self Group has lots of examples in their blog about using sensor information from different tools, apps, and gadgets, importing it into visualisation tools, sharing it in social communities, and monitoring your activities. There is even a best-practice guide and an active discussion forum about which tools, apps, and gadgets are best to use for self-tracking.

A nice overview of activities in that area is given in a recent article from Technology Review titled “The Measured Life“.

Applied to learning support and the TEL community, I see some relations to user modeling, adaptive systems, personalisation, and recently learning analytics. The tricky thing is that in a lot of the self-tracking examples you can easily track your steps per day, the number of calories consumed, the types of physical movements you made, or the type of sleep you had or did not have, but it is pretty hard to track the things that are relevant for learning. (Quantify learning? Probably.) In relation to personalisation and personalized learning, the question is how far the systems are used just for tracking and mirroring, and whether they get used for coaching, tutoring, mentoring, or even control. The vision would then become a kind of sensor-enabled cyborg with feedback loops to control every aspect of life.

Nevertheless, more and more sensors can be used for measuring and quantifying more complex phenomena such as stress, attention, or collaboration. In most cases, the combination of several sensors, together with personal data provided by the users, yields the results needed. It is somewhat comparable to context-aware systems and context modelling: the more sensor data and sources you have, the more precise the results can become.
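The idea of combining several sensor streams with self-reported data can be sketched in a few lines. This is a minimal, illustrative example: the sensor names, value ranges, and weights are hypothetical assumptions, not taken from any real system or study.

```python
# Illustrative sketch: estimating a "stress" score by fusing several
# sensor streams with a self-report. All sensor names, value ranges,
# and weights are hypothetical assumptions.

def normalise(value, low, high):
    """Map a raw reading onto 0..1, clamped to the expected range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def stress_score(heart_rate, skin_conductance, self_report):
    """Weighted combination of normalised signals; each extra source
    contributes to the estimate."""
    signals = [
        (normalise(heart_rate, 50, 120), 0.4),      # beats per minute
        (normalise(skin_conductance, 1, 20), 0.3),  # microsiemens
        (normalise(self_report, 1, 5), 0.3),        # 1-5 Likert answer
    ]
    return sum(value * weight for value, weight in signals)

print(round(stress_score(95, 12, 4), 2))
```

The weighted-sum fusion is of course the simplest possible choice; the point is only that adding sources (here, a self-report next to two sensors) narrows down what a single noisy signal could not tell you on its own.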

In contrast to the approach of collecting more and more data, I also see another path to go, comparable to the discussions in user modeling, adaptive systems, and nowadays learning analytics and reflection support via mirroring.

  • We can definitely use sensor data to learn and reflect about ourselves when we use sensors to collect data and relate them to the right baselines and personal yardsticks. As some recent results of Christian Glahn and also Dominique Verpoorten show, the choice of the right framing of the data is essential.
  • We can also use sensor data to trigger the right questions; in that sense, sensor information could rather be used to trigger context-specific experience sampling. So the system does not do the inference: we have to infer what is wrong and what is right when the system detects that something is definitely going wrong.
  • Last but not least, direct feedback from sensor information can also be very helpful in learning contexts; most of these approaches are in any case related to Donald Schön's concepts of reflection-in-action and reflection-on-action.
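The first two points above can be sketched together: relate sensor data to a personal baseline, and when it deviates strongly, ask the learner a question rather than letting the system infer the cause. All thresholds, numbers, and the prompt text below are hypothetical assumptions for illustration.

```python
# Illustrative sketch: compare readings to a personal baseline and,
# on strong deviation, trigger an experience-sampling question.
# All numbers and the prompt text are hypothetical assumptions.

def deviates_from_baseline(readings, baseline, tolerance=0.25):
    """True if the average reading deviates from the personal
    baseline by more than the given relative tolerance."""
    average = sum(readings) / len(readings)
    return abs(average - baseline) / baseline > tolerance

def experience_sample(readings, baseline):
    """Return a context-specific question only when something looks
    off; otherwise stay silent (mirroring, not control)."""
    if deviates_from_baseline(readings, baseline):
        return ("Your activity differs a lot from your usual level. "
                "What changed today?")
    return None

# Usage: daily step counts against a personal baseline of 8000 steps.
print(experience_sample([3000, 4200, 3500], 8000))  # asks a question
print(experience_sample([7900, 8200, 8100], 8000))  # stays silent
```

The design choice here mirrors the point in the list: the system only detects that something deviates from the learner's own yardstick; making sense of the deviation is left to the learner.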

So, as a first point to make here: lots of methods for tracking and self-monitoring are already available, and more will follow soon. The main question is how to use them in ways that make learners curious, motivate them, stimulate collaboration and social interchange, and give control to the learner.

This entry was posted in mobile learning.