The Government’s proposed Online Harms legislation still lacks “clarity” in how it would be enforced, according to industry stakeholders.
A report from the Parliamentary Internet, Communications and Technology Forum (PICTFOR) had asked tech firms, online safety groups and other organisations for their thoughts on the Government’s plans for the proposed regulation.
And while support for the proposals was clear, some bodies raised concerns about uncertainty over who would fall under the regulation, while others said more support and education were needed to help individuals improve their digital skills.
Companies including BT, Google, TikTok and Huawei were among those who contributed, as well as civil society groups and other campaigners.
Proposals for the legislation include introducing a new regulator to oversee internet companies, which the Government has suggested could be Ofcom, and a duty of care to users by which firms must abide.
But in its response to PICTFOR, BCS, the Chartered Institute for IT, said the current proposals “leave room for debate about who is regulated and consequently who holds the duty of care – the platform or the regulator”.
“To enforce the legalities that lie with a duty of care, there needs to be clarity around whether this equates to the tort of negligence in civil law, or whether the term ‘duty of care’ is being used with less legal precision, without sufficient consideration to its consequences or meaning,” it said.
“If this does not gain clarity, it will lead to regulatory uncertainty and the potential for online harms to continue without anyone being held accountable, or for those who are taking their responsibilities seriously to take steps that do not align with unclear expectations.”
Social media platform TikTok also suggested more work was needed around how platforms monitored for harmful content.
“The current system has a flaw that needs to be fixed. In the simplest terms, if a platform looks for harmful and illegal content, it could become liable; if it does not look, it cannot become liable,” the video-sharing app said.
“This acts as a disincentive against platforms that want to invest in and develop techniques to detect, review and remove videos that contain illegal content.
“That is why TikTok supports the development of a ‘good Samaritan clause’ which will give legal clarity to allow and encourage platforms to proactively remove illegal content. We would like the online harms framework to be aligned with this approach.”
Some stakeholders also urged the Government to consider more support for the public and digital skills training in order to improve online wellbeing more generally.
The Good Things Foundation, a digital inclusion charity, said it supported the regulations but remained “deeply concerned that not enough is being done to protect, empower and support people – particularly those with no or limited digital access, skills or confidence”.
Tech giant Huawei added: “To truly be the safest place in the world to be online – which is the stated ambition of DCMS – all users, children, teenagers, adults and seniors must have the education, training and skills to enable them to take responsibility for their online lives.”
Campaigners have been left frustrated by repeated delays to the proposed regulation, with a draft Bill now not expected to reach Parliament until next year; the coronavirus pandemic has been cited as a key reason for the delay.