Microsoft shares new technique for detecting online grooming of children in video game chat
"Combating online child exploitation should and must be a universal call to action."
Microsoft has unveiled Project Artemis, a technique for detecting online predators who attempt to groom children for sexual purposes through chat in video games.
Project Artemis is applied to historical text-based chat conversations; it evaluates and "rates" characteristics of a conversation and assigns an overall probability rating, Microsoft said.
Companies that use the technique can set a threshold on this rating to determine when a flagged conversation should be sent to a human moderator for review. The idea is that a moderator would, if necessary, refer imminent threats to police and other relevant authorities.
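As described, each conversation receives an overall probability rating, and each company sets its own threshold for escalating flagged conversations to a human moderator. A minimal sketch of that escalation flow, where every name and value is hypothetical (the article does not disclose how Project Artemis computes its rating):

```python
def flag_for_review(conversation_score: float, threshold: float) -> bool:
    """Return True when a conversation's probability rating meets or
    exceeds the company-configured threshold, i.e. when it should be
    escalated to a human moderator. Hypothetical helper, not part of
    any published Project Artemis API."""
    return conversation_score >= threshold

# Example: a company configures its own escalation threshold at 0.8.
ratings = [0.35, 0.82, 0.91, 0.60]
flagged = [r for r in ratings if flag_for_review(r, threshold=0.8)]
# flagged -> [0.82, 0.91]: only these would reach a human moderator.
```

The design point the article emphasizes is that the threshold is not fixed by Microsoft: each adopting company tunes the trade-off between moderator workload and the risk of missing a grooming attempt.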
Project Artemis was developed in collaboration with a raft of companies, including Roblox, one of the most popular video games in the world and a massive hit on Xbox with children. The technique is now freely available to companies that offer a chat function, and Microsoft said it has already used it for some time in programs on Xbox.
"Project Artemis is a significant step forward, but it is by no means a panacea," Microsoft said.
"Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems. But we are not deterred by the complexity and intricacy of such issues. On the contrary, we are making the tool available at this point in time to invite further contributions and engagement from other technology companies and organisations with the goal of continuous improvement and refinement.
"At Microsoft, we embrace a multi-stakeholder model to combat online child exploitation that includes survivors and their advocates, government, tech companies and civil society working together. Combating online child exploitation should and must be a universal call to action."