‘EdTech’ as fetishism/The ‘EdTech’ commodity chain

My point is not that everything is bad, but that everything is dangerous. – Michel Foucault, 1983

I first heard the above quote late last year in a presentation by Neil Selwyn (highly recommended) at the University of Oxford on ‘teachers’ work in the digital age’. Ever since, the quote has come regularly to mind as I think, talk and write about technology: it helps me avoid tipping too far into unproductive pessimism, although I’m sure I still do so frequently!

How does the Foucault quote relate to technology in education?

The introduction of technologies (I’m particularly interested in those which are connected to the internet) into our schools and classrooms affects everyone involved in a range of ways. Some of these effects are positive and some are negative. Some of them are obvious, predictable and predicted by those who are responsible for their introduction. However, some are certainly not obvious, predictable or predicted. This, to me, is the significance of the quote.

So, while acknowledging that there are positive effects, it is essential for educators to consider carefully any co-present dangers. That’s the focus of this post.

‘Edtech’ as fetishism

In a section of Capital called ‘The Fetishism of the Commodity and Its Secret‘, Karl Marx lays out his concept of fetishism to account for “how the market system and the money-forms disguise real social relations through the exchange of” commodities (Harvey 2010).

David Harvey (2013, p. 4) provides a succinct explanation:

By fetishism, Marx was referring to the various masks, disguises and distortions of what is really going on around us. ‘If everything were as it appeared on the surface,’ he wrote, ‘there would be no need for science.’ We need to get behind surface appearances if we are to act coherently in the world. Otherwise, acting in response to misleading surface signals typically produces disastrous outcomes…we need to unmask what is truly happening underneath a welter of often mystifying surface appearances.

I think this can be applied quite easily to technology in education. For example, we may observe our students genuinely engaged and ‘productive’ while using their own mobile devices for an augmented reality activity. However, concealed behind these very positive ‘surface signals’ are ‘real social relations’ (or perhaps ‘dangers’, in Foucault’s terms); when we consider these hidden social realities, the technology begins to appear highly problematic, especially in the context of education.

Update, 5/2/16:

Along similar lines is the familiar concept of ‘reification’, as explained by Leo Marx in his 2010 essay ‘Technology: The emergence of a hazardous concept‘:

Reification, as the philosopher Georg Lukacs famously explained, is what occurs when we endow a human activity with the characteristics of a thing or things. It thereby acquires, as he put it, “a ‘phantom-objectivity,’ an autonomy that seems so strictly rational and all-embracing as to conceal every trace of its fundamental nature: the relation between people.” [from Lukacs’s History and Class Consciousness: Studies in Marxist Dialectics, Cambridge, 1971] In contemporary discourse, private and public, technologies [by which Marx means broader ‘sociotechnical systems’ which have material, bureaucratic and ideological components] are habitually represented by “things” – by their most conspicuous artifactual embodiments: transportation technology by automobiles, airplanes, and railroads; nuclear technology by reactors, power plants, and bombs; information technology by computers, mobile telephones, and television; and so on. By consigning technologies to the realm of things, this well-established iconography distracts attention from the human – socio-economic and political – relations which largely determine who uses them and for what purposes. (p. 576, emphasis added)

The ‘edtech’ commodity chain

One of these social realities is that the student’s mobile device is part of a commodity chain, “a network of labour and production processes whose end result is a finished commodity” (Hopkins and Wallerstein 1986, p. 159).

The following video, by Unknown Fields Division, ‘unmasks’ some of the earlier stages in this commodity chain:

It shows, for example, an artificial lake in Baotou, Inner Mongolia, filled with the toxic by-product of rare earth mining. It also claims that:

  • 1 ton of rare earth produces 75 tons of acidic waste water.
  • 380g of toxic sludge is produced in order to extract the 8 rare earth minerals required for one smartphone.
  • One laptop produces 122kg of toxic waste.

Based on these figures, the desktop computer I’m using to write this post will obviously also have generated a significant amount of toxic waste.

Expanding on these points in an article for the BBC last year, Tim Maughan wrote:

The intriguing thing about both neodymium and cerium [“used to polish touchscreens on smartphones and tablets”] is that while they’re called rare earth minerals, they’re actually fairly common. Neodymium is no rarer than copper or nickel and quite evenly distributed throughout the world’s crust. While China produces 90% of the global market’s neodymium, only 30% of the world’s deposits are located there. Arguably, what makes it, and cerium, scarce enough to be profitable are the hugely hazardous and toxic process needed to extract them from ore and to refine them into usable products. For example, cerium is extracted by crushing mineral mixtures and dissolving them in sulphuric and nitric acid, and this has to be done on a huge industrial scale, resulting in a vast amount of poisonous waste as a byproduct. It could be argued that China’s dominance of the rare earth market is less about geology and far more about the country’s willingness to take an environmental hit that other nations shy away from.

In light of this, it would be sickeningly ironic to expect our Chinese students to use a mobile device to research environmental issues.

Update, 1/2/16: For more on this, I highly recommend this review, by Karen Gregory (@claudiakincaid), of Jussi Parikka‘s A Geology of Media.

Our students in the ‘edtech’ commodity chain

Towards the other end of the “network of labour and production processes” that is the ‘edtech’ commodity chain, we find data. Or, as Audrey Watters (2016, p. 28) prefers to call them, ‘life bits’.

Life bits is a better term than data to describe all that we’re creating now thanks to new technologies…[because it] might help us to recognize that all this data we’re creating – intentionally and unintentionally – are us. Life bits represent bits and bytes of data, sure, but also bits and pieces of our lives. Opening up life bits for others to mine sounds a lot less appealing than mining something impersonal like data.

In my ‘Student Data Blueprint’ post last year, I explored in detail some of the ways that students’ ‘life bits’ have become a commodity, so I won’t go into it again here. However, I would like to suggest that the ‘life bits’/data created by our students – particularly when they use their own devices – are part of the same commodity chain that we saw in the Unknown Fields Division video. Not only that, the students themselves become labourers in the production of a data commodity, which moves on down the chain to data brokers to be turned into yet another commodity – the ‘data package’. Similarly, data produced by teachers via their free labour is used to create commodities such as the TurnItIn Scoring Engine.

Again, we see that concealed behind the ‘surface signals’ are some disturbing social realities, ones which might undermine several of our goals (empowering students or giving them ‘agency’), messages (the importance of critical thinking) and policies (privacy policies) in education.

Response

In addition to the ‘five-point plan‘ I proposed last year, I would briefly suggest two further responses.

 1. DBYOD

At the very least, we should tell students and staff, ‘Don’t Bring Your Own Device’. Things get even more ‘dangerous’ if we also have students use public wifi networks, set up accounts at random, or download apps without reading the Terms of Service.

I think education providers (regardless of whether they are public or private sector, for-profit or non-profit) have an ethical responsibility to consider the social realities of technology and to act accordingly. If students and teachers are expected to use the internet in the course of their work or study, acting accordingly includes providing devices and recommending services that are specifically set up to avoid involving students and staff in these processes without their understanding or consent. Education providers should also be employing IT staff with the skills to do this and, through induction and training, ensuring that students and staff understand these issues. To do otherwise is reckless.

For another perspective on school staff taking more responsibility, see this post, ‘Critical Thinking and the Student Privacy Debate’.

Update, 1/2/16: A mea culpa

About three years ago, when I was doing and saying the kinds of things with/about technology that I now criticise others for, I wrote this post about an experience with Socrative which “showed me clearly the potential of using mobile devices in the classroom.”

I do still get students to use Socrative occasionally, but only on the desktop PCs in my school’s computer lab rather than on their own devices. However, I feel even that is potentially problematic.

2. Get real about ‘digital literacy’

If we aim for our students to develop ‘digital literacy’ but ignore these ‘dangerous’ aspects of technology (while at the same time fussing over whether there is one ‘digital literacy’ or five or seven ‘digital literacies’), then I argue we are deluding ourselves and letting our students down.

Nicky Hockly, for example, wrote in 2013 for the British Council that, when talking about ‘digital literacy’, we “are not just talking about a checklist of technical skills, but also about the social practices that surround the use of new media.” But based on her published work, Hockly’s conception of these ‘social practices’ doesn’t seem to include the issues I’ve described above. Similarly, in a 2014 presentation, Mark Pegrum is quoted by Hockly as saying that “digital literacies [are] essential skills our students need to acquire for full participation in the world beyond the classroom.” And this extract from the 2013 book Digital Literacies co-authored by Hockly, Pegrum and Gavin Dudeney presents a somewhat limited view of the ‘dangers’.

On the other hand, one more recent conceptualisation of ‘digital literacies’ – Mozilla’s ‘Web Literacy’ standards – does highlight the following ‘sub-literacies’ (?) relating to privacy:

  • Debating privacy as a value and right in a networked world.
  • Explaining ways in which unsolicited third parties can track users across the web.
  • Controlling (meta)data shared with online services.
  • Identifying rights retained and removed through user agreements.
  • Managing and shaping online identities.

From what I’ve seen over the last couple of years, though, these tend to be overlooked in favour of, for example, the dubious ‘remix literacy’. I believe that the ‘dangerous’ aspects should be emphasised much more. To this end, Mozilla also provide some potentially useful Teaching Activities, which are as good a starting point as any I’ve seen for concerned educators who want to explore these issues with their students (a rough sketch of one possible activity follows below).
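As a concrete illustration of the kind of activity I have in mind – my own rough sketch, not one of Mozilla’s, and deliberately simplified – the short Python script below fetches a web page and lists the other domains it asks a browser to contact. Seeing how many unfamiliar third-party domains sit behind a single page gives students a tangible starting point for discussing how ‘unsolicited third parties can track users across the web’.

    # Rough sketch: list the third-party domains a web page pulls resources from.
    # Standard library only; a real page may also load trackers via scripts and
    # CSS that this simple src/href scan will miss.
    import re
    import sys
    from urllib.parse import urlparse
    from urllib.request import urlopen

    def third_party_domains(url):
        """Return the external domains referenced in src/href attributes of the page."""
        html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        own_domain = urlparse(url).netloc
        domains = set()
        for link in re.findall(r'(?:src|href)=["\'](https?://[^"\']+)', html):
            domain = urlparse(link).netloc
            if domain and domain != own_domain:
                domains.add(domain)
        return domains

    if __name__ == "__main__":
        if len(sys.argv) != 2:
            sys.exit("usage: python third_parties.py <url>")
        for domain in sorted(third_party_domains(sys.argv[1])):
            print(domain)

Comparing the results for a few sites students use every day can prompt the question ‘who are these companies, and what do they know about me?’ – which is exactly where the Mozilla activities can pick up.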
