In-home smart devices such as AI virtual voice assistants, smart appliances, and security and monitoring technologies are entering our homes and redefining our daily routines.
These devices may be gathering and sharing children’s data, thereby posing a threat to the privacy of children and their families, a new report warns.
The ‘Home Life Data and Children’s Privacy’ report calls for new privacy measures to safeguard kids and make sure age appropriate design code is included with home automation technologies.
The report was compiled by Dr Veronica Barassi of Goldsmiths, University of London.
The lived experience of childhood is being transformed by the production of personally identifying digital data, according to the report.
“From doctor’s appointments to artificial intelligence in the home, from social media to mobile apps, children’s everyday life is recorded, stored and shared in ways that were not possible before.”
The very newness of home automation environments means we do not know what algorithms are doing with this ‘messy’ data, which includes children’s data, argues Dr Barassi.
“Firms currently fail to recognise the privacy implications of children’s daily interactions with home automation technologies that are not designed or targeted at them,” she adds.
“Despite GDPR, it’s left up to parents to protect their children’s privacy and navigate a confusing array of terms and conditions.”
The report also argues that Home Hubs threaten to further socialise children into divulging their data, which is one reason design code in the home must create safeguards across all ages.
Barassi stated that by introducing the concept of ‘home life data’ in the report, she wishes to draw attention to the fact that the data collected by home hub technologies is not only personal (individual) data but also household, family and highly contextual data.
The report was submitted to the Information Commissioner’s Office (ICO) as part of a call for evidence for age appropriate design code to protect children’s privacy.
How is home automation enabled?
Broadly speaking (and at the time of writing), home automation is enabled by several sets of technologies, including:
- artificial intelligence devices (e.g. virtual assistants, robots that act as home assistants, artificial intelligence toys);
- entertainment devices (e.g. smart TVs, whole-house wireless music systems, video games);
- home appliances (e.g. smart fridges, smart toilets, smart washing machines);
- security technologies (e.g. smart locks, surveillance cameras, and alarms that can detect intruders and are equipped with special sensors to detect floods, fires, etc.);
- energy and utilities monitoring and measuring tools (e.g. meters that monitor water and energy consumption);
- lighting devices (e.g. smart bulbs and switches that can be controlled remotely);
- specific-solution devices (e.g. devices that offer particular solutions, such as support with recycling or intercom systems).
Debates about the privacy implications of AI home assistants and the Internet of Things focus largely on the collection and use of personal data, the report states.
Yet these debates lack a nuanced understanding of the different data flows that emerge from everyday digital practices and interactions in the home and that include the data of children.
Dr Barassi argued in the report that aggregated profiles constitute a particular risk to children’s privacy.
“Let’s imagine that you are having dinner with a friend who has a child who suffers from diabetes, and you ask Google Assistant or Alexa to look for information on ‘diabetes in children’. That information would be automatically stored on your profile,” she wrote.
“Let’s also imagine that in the weeks to come you feel concerned about your own child getting diabetes and you start looking for information on symptoms. All these data traces would imply that you would probably be profiled as a ‘parent’ with a ‘diabetes interest’ (this is a guess, because there is so much secrecy about the ways in which we are being profiled).
“If this is the case, the question emerges naturally: if you shared your ‘household’ profile with your child, and you were profiled as a parent with a diabetes interest, would your child be profiled as possibly diabetic? The problem is that we don’t know the answer.”
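The profiling risk described above can be illustrated with a toy sketch. The function name, tagging rules and queries below are entirely hypothetical; real ad-profiling pipelines are proprietary and far more sophisticated, which is precisely the report’s point about secrecy.

```python
from collections import Counter

# Hypothetical keyword-to-interest rules; real profiling systems are
# proprietary, so this is purely illustrative.
INTEREST_RULES = {
    "diabetes": "diabetes interest",
    "children": "parent",
    "symptoms": "health concern",
}

def profile_household(query_log):
    """Aggregate voice-assistant queries from a shared household
    account into a single profile of inferred interests."""
    tags = Counter()
    for query in query_log:
        for keyword, tag in INTEREST_RULES.items():
            if keyword in query.lower():
                tags[tag] += 1
    return tags

# Queries made weeks apart, for different reasons, land in one profile:
household_queries = [
    "diabetes in children",           # asked on behalf of a friend
    "diabetes symptoms in children",  # asked later, out of worry
]
profile = profile_household(household_queries)
# The aggregated profile cannot distinguish who asked or why: every
# member of the household, including the child, inherits the tags.
```

The sketch shows why household-level aggregation is the crux: the profile is attached to the shared account, not to the individual who spoke, so inferences drawn from an adult’s queries can silently extend to a child.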
The report concluded by calling for new measures and solutions to safeguard children and to make sure that age appropriate design code is included within home automation technologies.