Make tech companies liable for "harmful and misleading material" on their platforms

Published 1 August 2018

In a withering report on its 18-month investigation into fake news and the use of data and “dark ads” in elections, the U.K. Parliament’s Digital, Culture, Media and Sport Committee (DCMS Committee) says that Facebook’s egregious indifference to its corporate responsibility has led to a massive failure with far-reaching consequences. The Committee charges that Facebook “obfuscated” and refused to investigate how its platform was abused by the Russian government until forced to do so by pressure from the U.S. Senate Intelligence Committee. In the most damning section of the report, the Committee offers evidence that Facebook’s indifference aided and abetted the incitement and persecution of the Rohingya ethnic group in Myanmar, causing large-scale death and the flight of hundreds of thousands of Rohingya from Myanmar to Bangladesh.

In the first interim report of its Disinformation and “fake news” inquiry, published over the weekend, the Committee warns that we are facing a democratic crisis founded on the manipulation of personal data and the targeting of pernicious views at users, particularly during elections and referenda. The Committee outlines a series of recommendations to tackle the worldwide problem of disinformation and fake news.

Damian Collins MP (Conservative, Folkestone and Hythe), chair of the Committee, said:

We are facing nothing less than a crisis in our democracy – based on the systematic manipulation of data to support the relentless targeting of citizens, without their consent, by campaigns of disinformation and messages of hate.

In this inquiry we have pulled back the curtain on the secretive world of the tech giants, which have acted irresponsibly with the vast quantities of data they collect from their users. Despite concerns being raised, companies like Facebook made it easy for developers to scrape user data and to deploy it in other campaigns without their knowledge or consent. Throughout our inquiry these companies have tried to frustrate scrutiny and obfuscated in their answers. The light of transparency must be allowed to shine on their operations and they must be made responsible, and liable, for the way in which harmful and misleading content is shared on their sites.