Will Article 13 of the Copyright Directive damage your platform?

The European copyright legislation has had a long, winding journey, culminating today in a vote to pass it.

Most notably, Article 13 is likely to impact platform and service providers seeking to keep their platforms clean.

Article 13 holds larger technology companies responsible for material posted without a copyright licence. It says that content-sharing services must license copyright-protected material from the rights holders, or they could be held liable unless:

  • it made “best efforts” to get permission from the copyright holder
  • it made “best efforts” to ensure that material specified by rights holders was not made available
  • it acted quickly to remove any infringing material of which it was made aware

Platform providers (especially those with User Generated Content (UGC)) will recognise and may be comfortable with such clauses. They are similar to best-practice approaches for protecting users online from harmful content.
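One common reading of the “best efforts” clauses is that platforms will need some automated way of matching uploads against material specified by rights holders. As a purely illustrative sketch (nothing in the Directive mandates this approach, and the names and hash list below are hypothetical), a naive version might compare each upload against a list of exact content hashes; real systems generally use perceptual fingerprinting, since exact hashes fail on re-encoded or edited copies:

```python
import hashlib

# Hypothetical: a set of content digests supplied by rights holders.
# The digest below is a placeholder, not a real claimed work.
RIGHTS_HOLDER_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def is_claimed_content(upload_bytes: bytes) -> bool:
    """Return True if an upload exactly matches material specified by rights holders."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in RIGHTS_HOLDER_HASHES


def handle_upload(upload_bytes: bytes) -> str:
    # Block (or route to manual review) anything that matches the claimed list,
    # and keep a record so takedown requests can be actioned quickly.
    if is_claimed_content(upload_bytes):
        return "blocked_pending_licence_check"
    return "published"
```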

If you’re looking for a Content Moderation Company (even one that can help you with the EU Copyright Directive), head over to the Content Moderation Marketplace and find one that works for you.

How to make your website safer for humans – the basics in the UK

We often have conversations with small platform and service providers that are starting to think about making their service a safer place for their users. This applies to almost all types of websites and internet-based services, but in particular to those that provide a social media function, a chat platform, or User Generated Content (UGC).

One of the first things mentioned is the lack of clear guidance on what the service or platform’s responsibilities are and what steps it should be taking.

A good place to start in the UK is the UK Council for Child Internet Safety and the principles it provides.

Take a look at the HTML version here.

Whilst this is targeted towards protecting children online, it is sound advice for any platform or service seeking to protect users online.

We’ll unpack more of this in later posts, but consider, for example, this piece of their advice:

…use tools such as search algorithms to look for slang words typically used by children and young people, and to identify children under 13 who may have lied about their age at registration

Then you may also want to take a look at our Online Safety Content Moderation Company list.
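To make that advice concrete, here is a minimal, hypothetical sketch of the kind of “search algorithm” the guidance describes: scanning user text for a configurable list of slang terms and flagging the account for human review. The slang list and function names are illustrative only; a real deployment would need far more care around context, false positives, and age signals beyond keywords:

```python
import re

# Illustrative only: a configurable list of slang terms that might suggest
# a younger user. A real list would be maintained and reviewed by safety staff.
SLANG_TERMS = ["yr 7", "yr 8", "my mum says", "after school club"]


def flag_possible_underage(text: str, terms: list[str] = SLANG_TERMS) -> list[str]:
    """Return the slang terms found in a piece of user text (case-insensitive)."""
    found = []
    lowered = text.lower()
    for term in terms:
        # Whole-phrase match to reduce accidental hits inside longer words.
        if re.search(r"\b" + re.escape(term.lower()) + r"\b", lowered):
            found.append(term)
    return found


# Example: flagged profiles would go to a human moderation queue,
# not be actioned automatically.
profile_bio = "Can't chat after school club on Tuesdays!"
if flag_possible_underage(profile_bio):
    print("Route this account to manual age review.")
```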

The Online Harms White Paper (OHWP) and Safety by Design (SbD) – Will they conflict?

The Online Harms White Paper will be the biggest step forward for online safety this decade.

2019 is turning out to offer green shoots of hope in the world of online safety. We fully expect to see the much-anticipated Online Harms White Paper from the Dynamic Duo (the Home Office and the Department for Culture, Media and Sport (DCMS)), as well as the much-needed Safety by Design framework from the Australian eSafety Commissioner.

You may think that these are not connected, and indeed they may not be, but parliamentary comment suggests otherwise…

Lord Ashton of Hythe suggests that the OHWP may either contain SbD principles or reference some (we don’t know whether this was a reference to the Australian eSafety Commissioner’s work).

We hope that we’ll see some coherent advice for online safety to help companies reduce online harms, but the truth is that nobody knows what the OHWP actually contains.

Whatever the outcome, we’ll read it and try to help you with a clear steer about how to make your platform safe.

Happy 2019