We often have conversations with small platform and service providers that are starting to think about making their service a safer place for their users. This applies to almost all types of websites and internet-based services, but in particular those that provide a social media function, a chat platform, or User Generated Content (UGC).
One of the first things mentioned is the lack of clear guidance on what the service or platform’s responsibilities are and what steps it should take.
A good place to start in the UK is the UK Council for Child Internet Safety and the principles they provide.
Take a look at the HTML version here.
Whilst this is targeted towards protecting children online, it is sound advice for any platform or service seeking to protect users online.
We’ll unpack more on this in later posts, but if you’re looking at their advice (for example):
…use tools such as search algorithms to look for slang words typically used by children and young people, and to identify children under 13 who may have lied about their age at registration
Then you may also want to take a look at our Online Safety Content Moderation Company list.
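To make the quoted advice a little more concrete, here is a minimal sketch of what "search for slang words" might look like in practice. The wordlist and function name are entirely hypothetical (the guidance doesn't prescribe any implementation), and a real moderation pipeline would use a curated, regularly updated term list and treat any match only as a signal for human review:

```python
import re

# Hypothetical slang wordlist for illustration only; a real deployment
# would maintain a curated, regularly refreshed list.
SLANG_TERMS = {"sus", "no cap", "lowkey"}

def flag_possible_underage(messages):
    """Return the slang terms found across a user's messages.

    A non-empty result is a weak signal that the account may belong
    to a younger user than claimed at registration. It should queue
    the account for human review, never trigger automatic action.
    """
    found = set()
    for msg in messages:
        text = msg.lower()
        for term in SLANG_TERMS:
            # Word-boundary match so short terms don't fire inside
            # longer words (e.g. "sus" inside "sustain").
            if re.search(r"\b" + re.escape(term) + r"\b", text):
                found.add(term)
    return found
```

For example, `flag_possible_underage(["that was sus, no cap"])` returns `{"sus", "no cap"}`, while a message like "a sustainable plan" matches nothing.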