Wolfgang’s recent overview of the discussion on perspectives of learning analytics discusses tactics for anonymising “group analytics” for personal use. The main critique of that blog entry is that the common approach to ensuring “privacy” is to flatten the data so that no individual can be identified. However, this might render the resulting information useless for learning support. Privacy aspects certainly raise ethical issues for educational applications of analytical data. I consider data flattening a naïve approach to enforcing privacy.
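To make the problem with flattening concrete, here is a minimal sketch in Python (the record structure and field names are my own illustrative assumptions, not anything from Wolfgang’s post): once individual records are collapsed into group aggregates, nobody can be identified, but the per-learner signal that learning support would act upon disappears with them.

```python
# Minimal sketch of "data flattening": collapsing individual learner
# records into group aggregates. All names and fields are illustrative
# assumptions for this example only.
from statistics import mean

records = [
    {"learner": "anna", "forum_posts": 12, "quiz_score": 0.9},
    {"learner": "bert", "forum_posts": 1, "quiz_score": 0.4},
    {"learner": "clara", "forum_posts": 7, "quiz_score": 0.7},
]

# The flattened view protects identities: no individual is visible anymore.
flattened = {
    "forum_posts_avg": mean(r["forum_posts"] for r in records),
    "quiz_score_avg": mean(r["quiz_score"] for r in records),
}
print(flattened)

# ... but the aggregate can no longer show that one learner barely
# participates and might need support: exactly the information that
# learning support requires has been removed.
```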
First of all, the common data flattening approach takes a very pessimistic viewpoint: literally everybody is an enemy from whom you need protection (e.g., because you might get bullied). Of course, the education system is not a friendly environment, but equally not everybody is an enemy. There are social planes that influence individual learners in different ways.
Secondly, privacy is tightly coupled to personal perception: everybody draws the line between private and public differently, and what is considered private is quite fragmented rather than a homogeneous space. The aspect of fragmentation has been quite nicely addressed by Google Plus, which breaks with the “all private” or “all public” dichotomy of prior social network applications. This fragmentation is further extended by individual choices of what is considered private or public information.
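A rough sketch of what such fragmentation can look like at the data level (the circle names, items, and function below are hypothetical illustrations, not Google’s actual data model): each piece of information carries its own audience, so privacy is decided per item rather than for the whole profile.

```python
# Hypothetical sketch of circle-style audience scoping, loosely inspired
# by the circles in Google Plus; all names and structures are my own
# assumptions for illustration.
circles = {
    "tutors": {"dr_meyer"},
    "peers": {"bert", "clara"},
}

# Each item is shared with selected circles instead of being
# "all private" or "all public".
items = [
    {"info": "time on task per week", "audiences": {"tutors"}},
    {"info": "badge: forum helper", "audiences": {"peers", "public"}},
]

def visible_to(viewer):
    """Items the viewer may see: public items, or items shared with a
    circle the viewer belongs to."""
    return [
        item for item in items
        if "public" in item["audiences"]
        or any(viewer in circles.get(c, set()) for c in item["audiences"])
    ]

print(visible_to("bert"))      # sees only the public badge item
print(visible_to("dr_meyer"))  # sees both items (tutor circle + public)
```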
Thirdly, privacy, and more particularly data privacy, is considered something that has to be protected. Data privacy is frequently used as a synonym for data protection, for which the provider of an IT system is responsible. However, data protection and data privacy are two very different concepts. I agree that data protection has to be assured by a system provider, but data privacy is a shared responsibility of those who run a system and those who provide the data. Closely related to this problem area is transparency. Learners can only take personal responsibility for their data and its analysis if they are aware of what data and analytical approaches are available.
The primary privacy issues that affect learning analytics relate to three different problem areas:
- Social planes
- Personalisation
- Transparency
The dimension of social planes for providing tunable perspectives on learning analytics data has recently been discussed in Flores et al. (2011).
Personalisation of information distribution is currently diffusing into a range of social software platforms after Google introduced circles in its Plus service.
Transparency for supporting learning has been covered by Verpoorten et al. (2009) and Glahn (2010).
The main ethical challenge is to integrate these dimensions into an educationally sound framework. This will not be achievable without rethinking, and sometimes disrupting, popular educational design approaches, paradigms, and organisational policies.