Child abuse predators exploit COVID-19 for online abuse

This is a truly awful story about online abuse, but I thought it was worth raising some awareness of it. The article raises two really troubling issues: first, offenders taking advantage of COVID-19 to abuse children; second, the devastating effects that the discovery of an abuser has on the spouse (and the rest of the family, of course).

This highlights the importance of online safety work in combating online abuse, and how many parties have a part to play in the solution.

https://www.theguardian.com/society/2020/may/14/child-abuse-predator-handbook-lists-ways-to-target-children-during-coronavirus-lockdown

Picture of a child sitting down using their phone

Sadly this is only one of many online harms, but it’s one of the largest and one that causes the most pain.

The impact on the rest of the family is seldom discussed, but worth being aware of. Thankfully there are some charities that support those left in the wake: the article highlights PartnerSpeak, and in the UK there are organisations like Mothers of Sexually Abused Children (MOSAC), amongst others.

Thanks go to the Australian eSafety Commissioner’s Office for the work highlighting this.

[Newsletter Archives] – Newsletter – 30/04/2020 – Great Advice If You Have Young People Online

The following is the content of an email that was sent to our email list subscribers. We try to post a copy of email newsletters here roughly two weeks after they were sent to subscribers. If you want content like this, free, in your inbox, long before it arrives here, then click here to subscribe. Due to the way content renders on the internet, it won’t be an exact version of the original email.

Hi << Test First Name >>

Advice for parents, guardians and carers of young people online during the COVID-19 Pandemic.

The ever brilliant Australian eSafety Office has released a well put together guide here:

https://www.esafety.gov.au/key-issues/covid-19/advice-parents-carers

As they say:

Staying connected online has never been more important, now that many of us are physically isolated from family members, friends, colleagues and support networks. The internet is a great way to socialise, learn, work, play and be entertained. But there are also risks. So eSafety is adding new content every day to help you stay safe online.

We highly recommend heading over there and checking it out.

Keep in touch.
Matt

[Newsletter Archives] Newsletter – 24/02/2020 – 3 Quick Tips For Online Safety


Hi << Test First Name >>

3 Quick Tips to make your Service Safer for your users.


You don’t have the time to wade through reams and reams of government-issued documentation and guidance about online safety for your users, so today I’ve picked just a few tips to pique your interest. I’d be keen to hear back (just hit ‘reply’, or comment on the blog post version of this) to let me know what you’d like to hear more about, and I’ll make it happen.

1. Content moderation

Pre, post or reactive? If you’ve not engaged with a content moderation service for online safety yet, you may have thought you just ‘plug it in’ and it happens. Well… you have to make some choices, I’m afraid. Here are some of the basic strategies available to you:
Pre-moderation. Content from users (for example, User Generated Content (UGC)) is placed in a queue, and your moderation service (automated or human) forms an opinion before it gets published.
Post-moderation. As you may have already guessed, this allows content from your users to be published immediately (giving that warm glow of achievement of having ‘gone live’), but the content is replicated in a queue to be moderated as soon as possible.
Reactive moderation. This lets the community (or, and this is not good, law enforcement!) report content to be moderated. Those reports can then be fed into your automated, semi-automated or human moderation team.
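To make the difference between the three strategies concrete, here is a minimal sketch in Python. It is purely illustrative: the class and method names (`ModerationQueue`, `submit`, `report`, `moderate`) are my own inventions, not any real moderation product’s API.

```python
from collections import deque

class ModerationQueue:
    """Illustrative sketch of pre-, post- and reactive moderation flows."""

    def __init__(self, strategy):
        self.strategy = strategy  # "pre", "post" or "reactive"
        self.published = []       # content currently visible to users
        self.queue = deque()      # content awaiting a moderation decision

    def submit(self, content):
        if self.strategy == "pre":
            # Pre-moderation: hold content until a moderator approves it.
            self.queue.append(content)
        else:
            # Post- and reactive moderation: publish immediately.
            self.published.append(content)
            if self.strategy == "post":
                # Post-moderation: also copy into the review queue.
                self.queue.append(content)

    def report(self, content):
        # Reactive moderation: the community flags already-published content.
        if self.strategy == "reactive" and content in self.published:
            self.queue.append(content)

    def moderate(self, is_acceptable):
        # A moderator (human or automated) works through the queue.
        while self.queue:
            content = self.queue.popleft()
            if is_acceptable(content):
                if content not in self.published:
                    self.published.append(content)
            elif content in self.published:
                self.published.remove(content)
```

Note the trade-off the sketch makes visible: pre-moderation means nothing appears until someone looks at it, post-moderation means harmful content can be live until the queue is worked through, and reactive moderation depends on someone reporting it at all.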

2. IWF Make a Report Link

If your users are in the UK, they can report images or videos of child sexual abuse they come across directly to the IWF. You can link to the ‘Make a Report’ button (https://report.iwf.org.uk/en/report) as an easy option for your users.

3. Default Settings

If you operate in the UK, you are now required to make sure that your service’s default settings are ‘high privacy’ (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child). This is in the new ICO Age Appropriate Design Code (coming into force soon), and the ICO will have the power to fine companies that do not comply!
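As a rough sketch of what ‘high privacy by default’ can look like in practice, here is an illustrative settings object where every field defaults to the most protective option. The field names are assumptions of mine for illustration, not taken from the ICO code or any real service.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Illustrative high-privacy defaults; a user must opt IN to anything less."""
    profile_visibility: str = "private"   # profile not discoverable by default
    location_sharing: bool = False        # geolocation off by default
    personalised_ads: bool = False        # profiling/targeting off by default
    contact_from_strangers: bool = False  # only known contacts can message

# A newly created account gets the safe defaults without any user action.
new_account = AccountSettings()
```

The key design point is that the safe state requires no action from the child: anything less private is an explicit, deliberate change.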

Hope those tiny-tips offer some pointers, but please do reply and let me know if you have other ideas.

Best

Keep in touch.
Matt

[Newsletter Archives] Newsflash – 17/02/2020 – OHWP Update


Hi <<First Name>>

UK Online Harms ‘Regulation’ – Update published

If you’ve been following the UK government-led consultation into how it should handle legislation that will reduce online harms, then you’ll be interested to know this.

There’s been a release of new information; an update that contains the summarised responses. Worthy of note:
In particular, while the risk-based and proportionate approach proposed by the White Paper was positively received by those we consulted with, written responses and our engagement highlighted questions over a number of areas, including freedom of expression and the businesses in scope of the duty of care. Having carefully considered the information gained during this process, we have made a number of developments to our policies. These are clarified in the ‘Our Response’ section below.

Keep in touch.
Matt

Major Tech companies sign up to 11 Principles to counter Child Abuse

Some of the large social media platforms (Facebook, Apple, Google and Twitter) have signed up, at the launch of the Five Country Ministerial declaration, to 11 Principles to make the internet safer.

These principles make it clear that they will take further steps to make their users safer, and should be applauded. It steers clear of the current debate about end-to-end encryption (more on that later) but is a strong move in the right direction.

Online Harms Reduction Regulator – firm steps emerge – will it affect you?

On 14th January 2020, Lord McNally placed a bill in front of Parliament that would “assign certain functions of OFCOM in relation to online harms regulation”. This executive summary appears to be a loose description of what’s contained, with the text seemingly requiring OFCOM to write a report each year with recommendations for the introduction of an Online Harms Reduction Regulator.
It is not clear why recommendations are required every year, nor why the lead has now moved from DCMS to OFCOM (I can only assume it is because OFCOM is much closer to the cut-and-thrust of regulation).

So what might it mean for Platform and Service Providers?

In the short term, probably not a lot; however, there are a couple of key points that providers may want to keep abreast of:
1) We can see that progress is being made – and is more likely to increase than decrease.
2) The Online Harms (as initially laid out in the OHWP) have been narrowed down to focus on certain ones in particular. This means that a Platform Provider is probably well advised to ensure that these are being tackled actively. The Harms laid out in the paper are:
(a) terrorism (or it could be in reference to this definition);
(b) dangers to persons aged under 18 and vulnerable adults;
(c) racial hatred, religious hatred, hatred on the grounds of sex or hatred on the grounds of sexual orientation;
(d) discrimination against a person or persons because of a protected characteristic;
(e) fraud or financial crime;
(f) intellectual property crime;
(g) threats which impede or prejudice the integrity and probity of the electoral process; and
(h) any other harms that OFCOM deem appropriate.

Unfortunately, as you have probably recognised, these descriptions of Online Harms are not well correlated with those laid out in the OHWP – and so OFCOM will probably struggle initially with the lack of tight definition of these Harms, before it can make any meaningful report.

The Bill is in its Second Reading, and we will try to provide further updates as it progresses.