Cease.AI uses artificial intelligence to detect new child sexual abuse material and rescue victims faster than manual techniques allow. Trained on real images, the AI scans, identifies, and flags new images containing child abuse with unprecedented accuracy. It serves two primary customer bases:
For Law Enforcement
- Identify and rescue victims of child abuse faster
- Reduce investigators' manual workload
- Protect investigators' mental health and reduce trauma
For Social Platforms
- Protect brand reputation by detecting illegal content
- Establish your platform as a safe online space
- Reduce moderator PTSD from exposure to damaging content
By working with both law enforcement and social platforms, we not only help investigators rescue victims faster, we also provide early child sexual abuse material (CSAM) detection at the source.
Cease.AI was built in collaboration with the RCMP (Royal Canadian Mounted Police) through the federally funded Build in Canada Program, a generous Mitacs grant, and partnerships with leading Canadian universities. CEASE is an ensemble of neural networks that combines multiple AI models to detect new images containing child abuse.
How it works
For investigators, our easy-to-use plugin helps reduce workloads by filtering, sorting, and removing non-CSAM, allowing them to focus their efforts on new child abuse images. Investigators upload case images, run their hash lists to eliminate known material, then let the AI identify, suggest a label, and prioritise images that contain previously uncatalogued CSAM. Better tools for overworked investigators and reduced mental stress will only help them reach innocent victims faster.
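CEASE's internal pipeline is not public, but the triage step described above, eliminating known material via hash lists before the AI reviews what remains, can be sketched roughly as follows. This is a minimal illustration using plain SHA-256 file hashes and hypothetical function names; real investigative workflows use forensic or perceptual hash databases rather than simple file digests.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def triage(image_paths, known_hashes):
    """Split case images into already-catalogued material (hash match)
    and previously unseen images to be prioritised for AI review."""
    known, unseen = [], []
    for p in image_paths:
        (known if sha256_of(p) in known_hashes else unseen).append(p)
    return known, unseen
```

In this sketch, only the `unseen` list would be passed on to the classifier, which is what lets investigators focus their effort on previously uncatalogued images.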
More information is available at CEASE.AI.