Britain needs a Which?-style independent regulatory body to help citizens who feel they have been wronged online, alongside new safety standards and tests, a House of Lords inquiry into internet regulation heard this week. Julian Blake reports.
Rachel Coldicutt, chief executive at Doteveryone, told the Lords select committee on communications that a new regulatory body should “know how the internet works and work like the internet”. It should be accompanied by more public education, she added.
“We’ve spent two years looking at how technology changes the world and we are pretty unambiguously in favour of regulation,” Coldicutt told the inquiry. “There’s a clear appetite in the public for additional regulation too.”
She added: “It isn’t something that can be solved simply with self-regulation, and there is potentially an issue with government taking on the whole regulatory matter, not least because its own use of the internet probably needs to be subject to regulation in itself.
“So we are in favour of an independent body that understands how technology works, that is supported by public education, with some kind of body for people to turn to because that body knows who to ask, and has standards and safety tests.”
Coldicutt identified three areas that she sees as needing extra regulation: platforms, emerging technologies such as AI and machine learning, and the use of technology by government and public services.
She pointed to the consumer protection body Which? and the approach taken by Public Health England as offering models for independent regulation. She also said “government needs to create a culture of responsibility” and provide “big regulatory incentives” to persuade tech platforms to change.
Doteveryone, founded by Martha Lane Fox, champions “responsible technology” for the good of everyone in society and “considers its impact and understands its consequences and seeks to mitigate those”.
The think tank has published two reports this year into UK public perceptions of technology. The first, on digital attitudes, found that two in three people want the government to play a role in regulating tech businesses, while the second, on digital understanding, called for new codes of practice and trusted independent information.
Regulation of the internet has moved fast up the political agenda in the wake of this spring’s revelations of Facebook data breaches by Cambridge Analytica.
Data breaches and a lack of transparency have already created a major trust deficit for tech firms, with surveys by Demos and others reporting worryingly low levels of public faith in technology businesses.
Senior politicians have weighed in on the issue, with Jeremy Hunt last month warning social media businesses of regulation if they failed to do more to protect children’s mental health, while culture secretary Matt Hancock has warned that he “will not hesitate to strengthen the law”.
In parliament, Damian Collins MP chairs an increasingly feisty inquiry into fake news, while the House of Lords last month put the ethics of artificial intelligence into focus. MPs Darren Jones and Lee Rowley are scoping the idea of a commission into tech ethics, helped by Microsoft, Google and others.
In the US, meanwhile, Facebook chief executive Mark Zuckerberg was last month hauled before Congress to testify on data misuse and fake news. Regulation there could follow, looking at making social media companies liable for content posted on their platforms, and possibly breaking up big tech monopolies to curb their power.
This week’s House of Lords regulation inquiry, chaired by Lord Gilbert of Panteg, was holding the second of two evidence sessions on regulating the internet and the processes platforms use to moderate the content they host.
The session also heard from Konstantinos Komaitis from the Internet Society and digital media policy consultant Julian Coles. The London School of Economics, Communications Chambers and the Open Rights Group have also presented evidence on online safety, content regulation and data protection.