
UK government proposes regulator to deal with online harms, though not piracy matters

By | Published on Tuesday 9 April 2019


The UK government yesterday unveiled plans for a new internet regulator that will seek to encourage, pressure and force digital platforms to do more to stop what ministers are calling ‘online harms’.

Launching a white paper on the topic, culture minister Jeremy Wright said at an event at the British Library: “We propose a duty of care for those online companies which allow users to share or discover user-generated content, or that allow users to interact with each other online. A duty to do all that is reasonable to keep their users safe online. That duty will be enforced by an independent regulator”.

Of course the term ‘online harm’ covers all sorts of misbehaviour that can occur on the net. The white paper does list what ministers are grouping under that term but it’s quite a long list. It includes cyberbullying, cyberstalking, disinformation, revenge porn, incitement of violence and the distribution of terrorist or violent content, or of images of child abuse. Children accessing pornography and other unsuitable content is also included, as is the online sale of illegal goods.

From a music industry perspective, the obvious thing that is missing from the list is copyright infringement. Ministers possibly think that the responsibility of online platforms to restrict the distribution of unlicensed content has been dealt with elsewhere – not least in the soon-to-be-passed European Copyright Directive – though the copyright industries had hoped that intellectual property would also have been part of this new discussion.

Noting the exclusion of copyright matters from the government’s list of online harms, the boss of record industry trade group the BPI, Geoff Taylor, said yesterday: “The BPI welcomes the global lead the UK government is taking to make the big tech platforms much more accountable for the content they host and the online harm they enable”.

“However”, he went on, “after all the promises made around the Digital Charter [that the government published last year], it is disappointing that the opportunity has been missed to create a framework that also reduces economic harm online, and the damage this causes to consumers and to business. IP infringement is one of the biggest inhibitors to the growth of our creative industries – at a time when they should be one of the UK’s top priorities after Brexit. We strongly urge the government to listen to the many concerned voices across the wider creative community, and to think through its approach again”.

Even without copyright being included, the white paper will receive pushback from the tech sector, especially given that the proposed new regulator will have the power to fine digital companies that fail to meet this new duty of care. It might have other powers too, including the power to instigate web-blocks against offending sites.

Of course, no one denies that mainstream and widespread internet usage has created an assortment of challenges for society at large, and many of the things on the government’s list of online harms are serious problems that need addressing. And while some tech giants are now talking up how they are trying to meet these challenges themselves, some would argue that such voluntary initiatives are too little too late.

There is a parallel here with the recent debates around copyright online, in that many tech companies turned themselves into content distributors and media platforms without dealing with any of the tricky issues that traditional content distributors and media platforms had always had to contend with, like copyright, fact-checking, defamation and decency.

By ignoring those tricky issues, the tech firms were able to distribute content and launch media channels on the cheap, gaining competitive advantage in the process. Then, when finally forced to address these entirely foreseeable problems, the tech firms argued that doing so could destroy their business models – even though those business models only ever really worked because of the initial cheat, ie ignoring all the tricky issues of content distribution.

However, at the same time, law-makers have often proven themselves to be pretty damn ignorant of how the internet actually works, while also being very optimistic indeed about what automated filtering can really achieve. Which means that government-led attempts to regulate the internet and the distribution of harmful content have had mixed results to date.

Plus, while copyright critics often exaggerate the impact enforcing IP rights has on the freedom of expression, once you start regulating hateful posts, fake news and indecent content, you quickly hit upon significant free speech issues. Because who is to decide what is actually hateful, fake and indecent? Can a government regulator really be trusted with making such judgements over the tidal wave of content that appears online every day?

In its response to the online harms white paper, the Open Rights Group stated: “On the positive side, the paper lists free expression online as a core value to be protected and addressed by the regulator. However, despite the apparent prominence of this value, the mechanisms to deliver this protection and the issues at play are not explored in any detail at all. [And] in many cases, online platforms already act as though they have a duty of care towards their users”.

With that in mind, the Open Rights Group added, if the proposed new duty of care for net firms is “drawn narrowly so that it only bites when there is clear evidence of real, tangible harm and a reason to intervene, nothing much will change. However, if it’s drawn widely, sweeping up too much content, it will start to act as a justification for widespread internet censorship”.

Although Wright, like his white paper, didn’t go into many specifics, he did acknowledge that the big challenge here is getting the balance right between stopping online harms and protecting free speech.

He said: “The regulator will take account of the need to promote innovation and freedom of speech. It will adopt a risk-based approach, prioritising action where there is the greatest evidence of threat or harm to individuals or to wider society. It will also adopt a proportionate approach – taking account of a company’s size and resources. It will be regulation designed to be intelligent, but most of all designed to be effective”.

You can read the white paper here.


