Twitch and other video platforms must take new measures to protect users
Video-sharing platforms (VSPs) such as Twitch will have to take further measures to protect users from harmful content, according to Ofcom.
New regulations are intended to protect users from content relating to terrorism, child sexual abuse and racism, with VSPs such as Twitch, TikTok, Snapchat and Vimeo expected to take "appropriate measures".
Under-18s must also be protected from material which might impair their physical, mental or moral development.
VSPs that breach the guidelines can be fined or, in serious cases, have their services suspended entirely.
The new requirements came into force in November 2020, and Ofcom has been developing its regulatory framework since then. That framework includes:
- Having, and effectively implementing, terms and conditions for harmful material
- Having, and effectively implementing, flagging, reporting or rating mechanisms
- Applying appropriate age assurance and/or parental control measures to protect under-18s
- Establishing easy-to-use complaints processes
- Providing media literacy tools and information
Ofcom research found that 70 percent of VSP users had seen something potentially harmful in the previous three months, with a third experiencing hateful content directly.
Ofcom will not investigate individual videos, but promises a "rigorous but fair" approach to maintaining standards across VSPs.
"Online videos play a huge role in our lives now, particularly for children," said chief executive Dame Melanie Dawes. "But many people see hateful, violent or inappropriate material while using them.
"The platforms where these videos are shared now have a legal duty to take steps to protect their users."