Four years in the making, the European Union’s new data protection rules have finally been agreed by the European Council and now await the approval of the European Parliament. But a last-minute addition has sparked a debate about responsibility and consent: it proposes raising to 16 the digital “age of consent”, below which organisations may not lawfully handle an individual’s personal data without parental consent. This would force younger teenagers to gain their parents’ permission to access social networking sites such as Facebook, Snapchat, WhatsApp or Instagram.

By Rebecca Wong, Nottingham Trent University
While raising this digital age of consent from 13 (the threshold set in the US) to 16 would strengthen the protections younger teenagers receive, there are doubts about whether it would be enforceable. How would the firms behind social networks verify their users’ ages, for example, or whether they had their parents’ permission? There are already Facebook and Instagram users below the age of 16, so the change could entail closing those accounts – how would this be policed?
Could parents or social network providers be prosecuted for allowing the under-16s to access a social network? The proposed new EU rules, the General Data Protection Regulation, would impose heavy fines (of up to 4% of annual global turnover) on organisations or firms that breach data protection laws, which means the likes of Facebook would have a strong incentive to comply. But there are few obvious ways to do so.
Additionally, any ban may lead some teenagers to lie about their age in order to create or maintain an account; pretending to be older than they are could put them in even more danger. Janice Richardson, former coordinator of the European Safer Internet Network, said that denying the under-16s access to social media would “deprive young people of educational and social opportunities in a number of ways, yet would provide no more (and likely even less) protection”.
Sophisticated age verification software would be needed, such as scanning machine-readable documents like passports. But would this be sufficient to satisfy the legal threshold? It would also introduce a further problem: firms would need to acquire and store yet more sensitive personal data.
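The article does not say what form such verification might take. Purely as an illustration of the kind of check involved, the sketch below (in Python, with hypothetical helper names) reads the date-of-birth field from the second line of a passport’s machine-readable zone, which follows the ICAO 9303 TD3 layout, and tests whether the holder has reached a given age.

from datetime import date

def parse_mrz_birth_date(mrz_line2):
    # Date of birth occupies positions 14-19 (YYMMDD) of the second MRZ line
    # in the ICAO 9303 TD3 (passport) layout.
    raw = mrz_line2[13:19]
    yy, mm, dd = int(raw[0:2]), int(raw[2:4]), int(raw[4:6])
    # Crude two-digit-year resolution: assume the holder was not born in the future.
    year = 2000 + yy if 2000 + yy <= date.today().year else 1900 + yy
    return date(year, mm, dd)

def is_at_least(birth_date, years, on=None):
    # Age in whole years on the given date (today by default).
    on = on or date.today()
    age = on.year - birth_date.year - ((on.month, on.day) < (birth_date.month, birth_date.day))
    return age >= years

# Fictitious specimen MRZ line (holder born 12 August 1974):
mrz = "L898902C36UTO7408122F1204159ZE184226B<<<<<10"
print(is_at_least(parse_mrz_birth_date(mrz), 16))  # True

Even a correct parse like this only shows that a document encodes a date of birth; it does not prove the document belongs to the person behind the keyboard, which is where the real difficulty of satisfying any legal threshold would lie.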
Informed consent
One of the chief concerns during the consultation process for the General Data Protection Regulation was the growth of social networking sites such as Facebook and how data protection rules applied to them. In November 2011, the then EU Justice Commissioner Viviane Reding said she was concerned about the growth of digital advertising and the lack of understanding of how it involved harvesting and analysis of personal information. These concerns led to the decision to update the Data Protection Directive to reflect the many changes in how we use the internet since it was passed in 1995.
While the preamble to the General Data Protection Regulation states that young people deserve protection as they may be less aware of risks and their rights in relation to their personal data, this appears to be a paternalistic view adopted by the European Commission.
For example, the Swedish Data Protection Board (similar to the UK Information Commissioner’s Office) conducted a study of 522 participants aged between 15 and 18 and found that the majority had experienced unkind words written about them, around a quarter had been sexually harassed online, and half of those on Facebook had had their accounts hijacked. But it also found that the young people had a generally good understanding of privacy issues.
On the other hand, a study from Ofcom, the UK communications watchdog, found that teenagers couldn’t tell the difference between search results and the adverts placed around them – a sign that young people’s understanding of how the web works, and of the role their personal data plays, is not always strong enough to amount to real, informed consent.
Negotiations ultimately allowed member states to opt out of the requirement to raise the digital age of consent, but issues remain. With an opt-out agreed, member state governments may set the age as low as 13, which would cause confusion given the way the internet crosses borders. Would a 15-year-old in one country find that their use of social media became illegal as they crossed the border into another?
Facebook, which started among US universities, was originally aimed at the over-17s before dropping its minimum age to 13, hugely expanding its user base. But this move was not without difficulties, and an estimated 7.5m Facebook users are under the minimum age. Facebook founder Mark Zuckerberg wants the minimum age of 13 removed altogether.
The question is whether such young teenagers or children can take responsibility for holding social network accounts. While concerns about protecting teenagers from the potential dangers of social networking are well-intentioned, the genie is already out of the bottle. Parental guidance and education are perhaps a better approach than the long arm of the law.
This article was originally published on The Conversation.