[Newsletter Archives] Newsflash – 17/02/2020 – OHWP Update

The following is the basic content of an email that was sent to our email list subscribers. We try to post a copy of email newsletters here roughly two weeks after they were sent to subscribers. If you want content like this delivered free to your inbox, long before it arrives here, then click here to subscribe. Due to the way content renders on the internet, this won’t be an exact version of the original email.

Hi <<First Name>>

UK Online Harms ‘Regulation’ – Update published

If you’ve been following the UK’s government-led consultation into how it should handle legislation that will reduce Online Harms, then you’ll be interested to know this.

There’s been a release of new information – an update containing the summarised consultation responses. Worthy of note:
“In particular, while the risk-based and proportionate approach proposed by the White Paper was positively received by those we consulted with, written responses and our engagement highlighted questions over a number of areas, including freedom of expression and the businesses in scope of the duty of care. Having carefully considered the information gained during this process, we have made a number of developments to our policies. These are clarified in the ‘Our Response’ section below.”

Keep in touch.
Matt

Online Harms Reduction Regulator – firm steps emerge – will it affect you?

On the 14th January 2020, Lord McNally placed a bill in front of Parliament that would “assign certain functions of OFCOM in relation to online harms regulation”. This executive summary appears to be a loose description of what’s contained, with the text seemingly requiring OFCOM to write a report each year with recommendations for the introduction of an Online Harms Reduction Regulator.
It is not clear why recommendations are required every year, nor why the lead has now moved from DCMS to OFCOM (I can only assume that it is because OFCOM is much closer to the cut-and-thrust of Regulation).

So what might it mean for Platform and Service Providers?

In the short term, probably not a lot; however, there are a couple of key points that providers may want to keep abreast of:
1) We can see that progress is being made – and is more likely to increase than decrease.
2) The Online Harms (as initially laid out in the OHWP) have been narrowed down to focus on certain Harms in particular. This means that a Platform Provider is probably well advised to ensure that these are being tackled actively. The Harms laid out in the Bill are:
(a) terrorism (or it could be in reference to this definition);
(b) dangers to persons aged under 18 and vulnerable adults;
(c) racial hatred, religious hatred, hatred on the grounds of sex or hatred on the grounds of sexual orientation;
(d) discrimination against a person or persons because of a protected characteristic;
(e) fraud or financial crime;
(f) intellectual property crime;
(g) threats which impede or prejudice the integrity and probity of the electoral process; and
(h) any other harms that OFCOM deem appropriate.

Unfortunately, as you have probably recognised, these descriptions of Online Harms do not correlate well with those laid out in the OHWP – and so OFCOM will probably struggle initially with the lack of a tight definition of these Harms before it can make any meaningful report.

The Bill is in its Second Reading and we will try to provide further updates as it progresses.

Will Article 13 of the Copyright Directive damage your platform?

There’s been a long, winding journey for the European legislation on Copyright, culminating today in a vote that passed it.

Most notably, Article 13 is likely to impact Platform and Service providers seeking to keep their Platform clean.

Article 13 holds larger technology companies responsible for material posted without a copyright licence. It says that content-sharing services must license copyright-protected material from the rights holders, or they could be held liable unless:

  • it made “best efforts” to get permission from the copyright holder
  • it made “best efforts” to ensure that material specified by rights holders was not made available
  • it acted quickly to remove any infringing material of which it was made aware

Those platform providers (especially those with User Generated Content (UGC)) will recognise and may be comfortable with such clauses. They are similar to best-practice approaches for protecting Users online from harmful content.

If you’re looking for a Content Moderation Company (even one that can help you with the EU Copyright Directive), head over to the Content Moderation Marketplace and find one that works for you.