New UK ‘Duty of Care’ Rules To Apply To Social Media Companies

The new ‘Online Harms’ whitepaper marks a world first as the UK government plans to introduce regulation to hold social media and other tech companies to account for the nature of the content they display, backed by the policing power of an independent regulator and the threat of fines or a ban.

Duty of Care

The proposed new legal framework from the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office aims to place a duty of care on social media and tech companies to protect users from harmful content, including cyberbullying, terrorist material, disinformation, child sexual exploitation and content that encourages damaging behaviours.

The need for such regulation has been recognised for some time and was brought into sharper focus recently by the death in the UK of 14-year-old Molly Russell, who was reported to have viewed online material about depression and suicide. It was underlined again in March this year when the mass shooting at a mosque in New Zealand was live-streamed on one of Facebook’s platforms, prompting Australia to propose fines for social media and web-hosting companies, and imprisonment for their executives, if violent content is not removed.

The Proposed Measures

The proposed measures by the UK government in its white paper include:

  • Imposing a new statutory “duty of care” that will hold companies accountable for the safety of their users, as well as a commitment to tackle the harm caused by their services.
  • Tougher requirements on tech companies to stop the dissemination of child abuse and terrorist content online.
  • The appointment of an independent regulator with the power to force social media platforms and tech companies to publish transparency reports on the amount of harmful content on their platforms and what they are doing to address the issue.
  • Forcing companies to respond to users’ complaints, and act quickly to address them.
  • The introduction of codes of practice by the regulator, which will include requirements to minimise the spread of misleading and harmful disinformation using dedicated fact checkers, particularly at election time.
  • The introduction of a “safety by design” framework that could help companies to incorporate the necessary online safety features in their new apps and platforms at the development stage.

GDPR-Style Fines (Or A Ban)

Digital, Culture, Media and Sport Secretary Jeremy Wright has said that tech companies that don’t do everything reasonably practicable to stop harmful content on their platforms could face fines comparable with those imposed for serious GDPR breaches, i.e. up to 4% of a company’s annual turnover.

It has also been suggested that, under the new rules policed by an independent regulator, bosses could be held personally accountable for failing to stop harmful content on their platforms, and that in the most serious cases companies could be banned from operating in Britain altogether if they do not do everything reasonably practicable to stop harmful content being spread via their platforms.

Balance

Although there is general recognition that regulation to protect people, particularly the young, from harmful or damaging content is a good thing, a proportionate and predictable balance needs to be struck between protecting society and supporting innovation and free speech.

Facebook is reported to have said that it looks forward to working with the government to ensure new regulations are effective and apply a standard approach across platforms.

Criticism

The government’s proposals will now undergo a 12-week consultation, but the main criticism to date has been that parts of the government’s approach are too vague and that regulation alone cannot solve all the problems.

What Does This Mean For Your Business?

Clearly, the UK government believes that self-regulation among social media and tech companies does not work. The tech industry has generally responded positively to the government’s proposals and to an approach that is risk-based and proportionate rather than one-size-fits-all. The hope is that the vaguer elements of the proposals can be clarified and improved over the 12-week consultation period.

To ensure the maximum protection for UK citizens, any regulations should be complemented by ongoing education for children, young people and adults to make sure that they have the skills and awareness to navigate the digital world safely and securely.
