300 Million Hours of Free Labour

Facebook's algorithms enable new forms of human rights violations: mechanisms of exploitation and manipulation that influence billions of people. At re:publica, Vladan Joler, Djordje Krivokapic, Ben Wagner and Julia Powles demand that users be able to exert control over their own data.

Mark Zuckerberg has only just published Facebook's official mission in his manifesto: to create a more open society by connecting people. In fact, however, what Facebook has mainly achieved is making its users transparent and predictable. Complex algorithms continuously analyse every input and activity.

With the data it collects, Facebook not only creates a perfectly customised virtual reality; in 2015 the company also made around 15 billion dollars in profit. That profit rests on the immaterial labour of the 1.6 billion Facebook users worldwide. "On a daily basis, we provide Facebook with 300 million hours of free labour", says Vladan Joler, CEO of ShareLab. Spread across those 1.6 billion users, that works out to roughly eleven minutes of free labour per person every day. While the complexity of Facebook's data grows, its users' understanding of it shrinks rapidly.

Together with his colleague Djordje Krivokapic, Joler has attempted to trace how Facebook user data is utilised. To do this, they analysed over 8,000 patents owned by the company. What they discovered are new forms of labour: "We are the raw material", says Krivokapic. "Humans produce the material, the algorithms do the work. Only a select few are earning money with this." Whether users read an article on Facebook, like a picture, write comments or simply scroll through the news feed: they are selling themselves - for free.

This creates a huge gap between the few who own this factory of immaterial labour and the many who work for them, generating no less than 300 petabytes of data per day. "That's more than 300,000,000 hours of free digital work a day", explains Joler.

Facebook sells the data its algorithms collect as a resource. The company offers advertising agencies the opportunity to filter Facebook users by categories such as ethnicity, travel behaviour and income in order to identify potential customers. According to the ShareLab researchers, this means that at the epicentre of the social network, immaterial labour meets the surveillance industry.

It is precisely for this reason that Julia Powles, a researcher on data protection, demands more transparency in algorithms. Facebook should give users a say in how their data is processed, says Powles. "Control over data has for the most part been lost, and a detangling is highly necessary."

By Marielle Klein (FF) and Franziska Hoppen (EJS)

Photo credit: MCB / Uwe Völkner (CC BY 2.0)