30 May 2024
Joyeeta Dey and Radhika Gorur

The "Friendly Users" Who Feed AI

In this blogpost, which was previously published in NORRAG’s 4th Policy Insights publication on “AI and Digital Inequalities”, Joyeeta Dey and Radhika Gorur draw attention to how AI technologies are designed to configure their users and constrain how users engage with them.

‘User friendliness’ is a well-established concept in technology design. To encourage larger and more diverse groups to adopt new technologies, designers strive to make their technologies user friendly, adding features that make them easier to use. To achieve this, technology companies may invite potential users to test technologies and provide feedback so that the technology can be improved. However, even as the technology is being tweaked, the user is also being “configured”—that is, technologies are designed to constrain how users engage with them and to shape how users are disposed towards the technology (Woolgar, 1990).

One aspect of “configuring the user” is what we have termed “making the user friendly”—a phenomenon which became visible in our research on digital infrastructures of education in India (Gorur & Dey, 2021). We found that the state made substantial investments in influencing the user, persuading reluctant teachers and headteachers to labour in ways that would sustain the technology.

The idea that technology companies exploit our labour to fuel their business is beginning to be more widely understood. Instagram and Facebook might be “free” services, but they are only free if we don’t acknowledge the free labour we provide as users by uploading content and allowing our data to be collected. Without this free labour, these platforms could not function. Similarly, in ed-tech, tech-creators offer teachers ‘tech-upskilling’ courses, often for free, but skilling up users is a way to enrol more people into using their technology (Thompson, 2022). Studies have shown that even when people are aware they are being manipulated, they are willing to go along with it because they have something to gain (for example, Williams et al., 2017).

Like corporations, states also depend on the labour of ‘uploaders’ to sustain their systems, but these uploaders have neither the option of not participating nor any guarantee of benefiting from their labour. We observed this in our study of India’s Unified Digital Information on School Education (UDISE). The largest Education Management Information System in the world, UDISE contains information on nearly 1.5 million schools, 9.5 million teachers and 265 million students across a wide range of indicators. Although this database is presented as serving the needs of schools and communities, a comprehensive study found that UDISE data was mainly used for government financial planning (Bordoloi & Kapoor, 2018). Neither parents nor teachers were aware of how the data could be put to use for their benefit. The state provided extensive training to ensure the input of data by teachers but almost none on how the data and analyses from the system could be utilised by (and be useful to) those who laboured to populate the database. The key function of the database appeared to be to enforce accountability. In other words, teachers and headteachers were being incorporated into developing instruments that would mainly be used in their own surveillance and monitoring.

How is the state able to persuade teachers and headteachers to provide such labour when the burden of non-teaching work is already so high? Our study identified several strategies ranging from inspirational messaging that framed uploading of data as a patriotic duty to regulation that threatened to cancel the registration of schools that did not comply with data demands.

India has launched ambitious plans for the deployment of artificial intelligence (AI) in public policy and administration, including in education. Education databases form the basis of AI interventions, such as the identification of potential school dropouts in the state of Andhra Pradesh. Globally, the UNESCO initiative Digital Transformation Collaborative (DTC), of which companies such as Google, Ericsson and Microsoft are founding members, is looking to encourage nations of the Global South to embark on comprehensive digitisation of their education systems, which will enable the widespread use of AI.

As more and more data is demanded to fuel the hi-tech dreams of nations and global agencies, it is crucial to become aware of the labour inequities that may come in their wake and to adopt fairer and more productive approaches to data generation. Much is written about how important it is for states to regulate private technology companies. While this is of course critically important, states should also ensure that they themselves do not engage in extractive practices.


About the Authors:

Joyeeta Dey, National Institute of Advanced Studies, Bengaluru, India

Radhika Gorur, Deakin University, Melbourne, Australia


