The Department for Digital, Culture, Media and Sport (DCMS) has today launched a consultation to help tackle various 'online harms'.
DCMS intends to launch a binding code of practice for online platforms and tech companies, overseen and enforced by an independent regulator. It isn't clear whether this role will fall to an existing regulator (e.g. Ofcom) or to a new body established for the purpose, though a new regulator seems the most likely option.
DCMS believes that the problem is multifaceted, ranging from the spread of illegal and 'unacceptable' content (the latter being a worryingly ambiguous phrase) which can "threaten national security or the physical safety of children" through to the general "growing concerns about the potential impact on [children's] mental health and wellbeing" and "excessive screen time".
It sets out the Government's concerns in some detail, which include:
"Terrorist groups use the internet to spread propaganda designed to radicalise vulnerable people, and distribute material designed to aid or abet terrorist attacks. There are also examples of terrorists broadcasting attacks live on social media. Child sex offenders use the internet to view and share child sexual abuse material, groom children online, and even live stream the sexual abuse of children.
"There is also a real danger that hostile actors use online disinformation to undermine our democratic values and principles. Social media platforms use algorithms which can lead to ‘echo chambers’ or ‘filter bubbles’, where a user is presented with only one type of content instead of seeing a range of voices and opinions. This can promote disinformation by ensuring that users do not see rebuttals or other sources that may disagree and can also mean that users perceive a story to be far more widely believed than it really is.
"Rival criminal gangs use social media to promote gang culture and incite violence. This, alongside the illegal sale of weapons to young people online, is a contributing factor to senseless violence, such as knife crime, on British streets.
"Other online behaviours or content, even if they may not be illegal in all circumstances, can also cause serious harm. The internet can be used to harass, bully or intimidate, especially people in vulnerable groups or in public life. Young adults or children may be exposed to harmful content that relates, for example, to self-harm or suicide. These experiences can have serious psychological and emotional impact. There are also emerging challenges about designed addiction to some digital services and excessive screen time."
Industry reactions have been mixed, though overall cautiously positive, provided the new regulator doesn't stray too far into censorship and restricting lawful freedom of expression.
Most worrying, perhaps, is the potential for 'mission creep': the proposals go beyond regulating unlawful behaviours (which should be covered by robustly debated and properly passed legislation, as well as by developments in the common law) to cover behaviours the government considers merely 'unacceptable', which could be banned under the new code of practice without proper checks, balances and oversight - a clear threat to free speech.
Rebecca Stimson, Facebook's head of UK policy, said in a statement: "New regulations are needed so that we have a standardised approach across platforms and private companies aren't making so many important decisions alone. New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech."
Twitter's head of UK public policy Katy Minshall said in a statement: "We look forward to engaging in the next steps of the process, and working to strike an appropriate balance between keeping users safe and preserving the open, free nature of the internet."
TechUK, an umbrella group representing the UK's technology industry, said the government must be "clear about how trade-offs are balanced between harm prevention and fundamental rights".
The public consultation, which closes on 1 July 2019, seeks views on various aspects of the government's plans for regulating and tackling online harms, including:
- the online services in scope of the regulatory framework;
- options for appointing an independent regulatory body to implement, oversee and enforce the new regulatory framework;
- the enforcement powers of an independent regulatory body;
- potential redress mechanisms for online users; and
- measures to ensure regulation is targeted and proportionate for industry.
White Paper & Digital Charter
The detailed, 99-page White Paper can be found here.
The Government's updated Digital Charter can be found here.
The plans call for an independent regulator to hold internet companies to account. It would be funded by the tech industry. The government has not decided whether a new body will be established, or an existing one handed new powers. The regulator will define a "code of best practice" that social networks and internet companies must adhere to. As well as Facebook, Twitter and Google, the rules would apply to messaging services such as Snapchat and cloud storage services. The regulator will have the power to fine companies and publish notices naming and shaming those that break the rules. The government says it is also considering fines for individual company executives and making search engines remove links to offending websites.