Ofcom given greater powers over the web in the UK

As you’ve probably seen on the news this week, the watchdog Ofcom is set to have more power to force social media firms and others to take action when it comes to potentially harmful online content. Currently, the organisation regulates the telecoms industry and the broadcast media, and deals with complaints about them, but is not responsible for online safety.

Bosses of internet companies that host user-generated content could face hefty fines, and even jail terms, for failing to safeguard users from ‘harmful and illegal content’, for example material relating to terrorism, violence or cyber-bullying.

The move, welcomed by campaigners and charities such as the NSPCC, follows widespread calls for social media corporations to take greater responsibility for their content.

Until now, the likes of YouTube and Twitter have, in the main, regulated themselves.

As regulator, Ofcom, whose new chief executive Dame Melanie Dawes will be in post from next month, won’t remove particular posts from social media platforms. But the likes of Google and Facebook will have to publish clear statements detailing the behaviour and content they believe to be acceptable on their websites.

Equally, platforms will need to ‘minimise the risks’ of potentially harmful content appearing, while removing content promptly where necessary, and enforcing standards ‘consistently and transparently’.

The government insists the changes would maintain free speech online for adults and target only the bigger web-based businesses. However, some smaller companies have argued that the changes place a huge burden on them to police content that is potentially harmful but not illegal.

The key proposals

Key points from the new proposals include:

  • Any business enabling user-generated content to be shared, from video uploads to comments, is likely to be affected, covering hundreds of thousands of UK businesses
  • The government is keen to revive age verification for some websites, after a bid to introduce it last year was abandoned
  • Web-based businesses will have to publish yearly transparency reports outlining harmful content removed and describing how they are meeting their own standards

This is the government’s initial response to its Online Harms consultation, conducted nearly a year ago, which attracted 2,500 replies. A full response is expected in the spring.

It’s definitely something to be aware of in terms of social media and content such as reader comments, although we’re not expecting it to be a major issue for the vast majority of our clients. At Front Page, we can advise if you’d like to discuss it in more detail; pick up the phone for an informal chat.