Drawing parallels – the processing of data about children in education and social care

By Sarah Gorin, Ros Edwards and Val Gillies

During our research, we have been learning more about the ways that Government agencies such as health, social care and education collect, process and join up information about families. Schools, like other Government agencies, collect and process an increasing volume of information about children. Data is collected for administrative purposes, such as monitoring attendance, attainment, progress and performance; for safeguarding children; and to promote and support education and learning.

Information about children is not only captured by schools, for their own purposes and for purposes determined by the Government, but also by private educational technology (EdTech) companies, who gather data on children via their use of apps that may be free to download and recommended by teachers as promoting learning. These companies may sell this information on for marketing or research purposes. Since the pandemic, the use of EdTech has grown exponentially, meaning the data being gathered on children, both through schools and by EdTech providers, is greater still, raising the stakes for the protection of children's personal data.

A new report by the Digital Futures Commission (DFC), 'Education Data Reality: The challenges for schools in managing children's education data', examines the views of professionals who work in or with schools on the procurement of, data protection for, and uses of digital technologies in schools. The report describes the range of EdTech used in schools and the complex issues that managing it presents.

In a blog about the report, its main author, Sarah Turner, highlights four key issues that constrain children's best interests:

  • The benefits of EdTech and the data processed from children in schools are currently not discernible or in children's best interests. Nor are they proportionate to the scope, scale and sensitivity of data currently processed from children in schools.
  • Schools have limited control or oversight over data processed from children through their uses of EdTech. The power imbalance between EdTech providers and schools is structured into the terms of use that schools sign up to and exacerbated by external pressure to use some EdTech services.
  • There is a distinct lack of comprehensive guidance for schools on how to manage EdTech providers' data practices. Nor is there a minimum standard for acceptable features, data practices and evidence-based benefits to help schools navigate the currently fragmented EdTech market and select appropriate EdTech that offers educational benefits proportionate to the data it processes.
  • Patchy access to and security of digital devices at school and home due to cost and resource barriers means that access to digital technologies to deliver and receive education remains inequitable.

The report focuses on the processing of children's education data; however, there are many interesting parallels with the findings from our project on the way data about families is collected, processed and used by local authorities:

  • Firstly, there is a lack of evidence about the benefits of using digital technologies in both schools and local authorities, and a lack of understanding of the risks to children's data privacy.
  • There is a lack of government guidance for schools, as there is for local authorities, about the digital technologies they employ, meaning that organisations are left individually responsible for ensuring that they comply with the General Data Protection Regulation (GDPR).
  • Schools, like local authorities, are time-, resource- and expertise-poor. Often neither has the data protection expertise to understand and weigh the risks against the benefits of data processing for children's best interests.
  • There is a lack of transparency in how data is collected, handled and processed by Government agencies, as well as by third parties who gain access to data about families, whether through children using their apps for educational purposes or through local authorities employing them to develop predictive analytics systems.
  • Public awareness and understanding of how data is collected and processed, and of the risks that data sharing poses to children's privacy, remain low among parents and children alike.

We welcome this new report by the Digital Futures Commission and hope that it stimulates more discussion and awareness amongst professionals and families.

Children's visibility, vulnerability and voice in official statistics and their use

By Sarah Gorin, Ros Edwards and Val Gillies

Throughout our project we have been looking at parental social licence for the linking together of Government data about families' lives across areas such as health, education and social care. Whilst our research focus has been on parents, it is also important that we listen to children's views. A vast amount of data is collected about children across Government and non-Government agencies, yet children and young people are rarely asked what they consider to be acceptable uses of their personal information. It is important that children are given this opportunity under Article 12 of the UN Convention on the Rights of the Child, which requires that children's views be heard and considered on all matters that affect them.

A recent report, 'Visibility, Vulnerability and Voice', by the Office for Statistics Regulation (an independent body that regulates the use of official statistics) has drawn attention to the importance of including children and young people in official statistics.

The report provides a framework for considering the needs of children and young people in the development of official statistics, which it names the '3Vs' framework. It suggests viewing statistics about children and young people through three lenses: 'Visibility', making statistics on children and young people available; 'Vulnerability', ensuring the collection and analysis of data about children who are vulnerable to poorer outcomes; and 'Voice', ensuring that statistics reflect the views of children and young people and that they are given a voice in how their data is used.

In considering children's 'Voice', the Office for Statistics Regulation recommends that all official statistics producers should:

  • Seek the views of children and young people themselves rather than relying on proxies from adults.
  • Consider, and respond to, the data needs of children and young people.
  • Involve children and young people in the development of statistics for and about them.
  • Ensure children and young people have a voice in how their data are used in official statistics and in research using the data underpinning them.

Whilst the report focuses on the need to involve children and young people in the development of official statistics, the same applies more broadly to the development of policy around the use of data. A report by DefendDigitalMe, 'The Words We Use in Data Policy', considers the way children are framed in data policy and the lack of representation of, or engagement with, children about their views. We welcome these reports and their focus on, and commitment to, improving opportunities for children and young people to be involved in developments in the way their data is linked together and used.

Voices come together to highlight the need to strengthen children's data rights in response to the Government consultation 'Data: a new direction'

By Sarah Gorin, Ros Edwards and Val Gillies

As part of our research project, we are constantly trying to keep up with developments in the use of children's data by Government. In Autumn 2021 the U.K. Government released a consultation, 'Data: a new direction', to which we responded, drawing on our learning from the parents who have participated in our research.

Our submission highlighted the already fragile nature of public trust in joining up administrative data about families, particularly amongst marginalised groups, and the need to press pause on data sharing and artificial intelligence (AI) systems in the U.K. until there is greater regulation and safeguarding of families' data from use by private companies.

Fortunately, we are not alone in raising concerns about the proposals…

Organisations such as the 5Rights Foundation and DefendDigitalMe have also made submissions to the consultation that highlight the lack of attention to the impact of these proposals on children and their families.

In this blog, we summarise some of the key recommendations made in response to the consultation by LSE's Professor Sonia Livingstone and 5Rights Foundation researcher Dr Kruakae Pothong, who have drawn on their extensive research experience to inform their children's rights-based response (for more detail see their LSE blog).

Professor Livingstone and Dr Pothong's submission highlights the need for greater, not lesser, support for children's rights as data subjects and questions whether the proposed changes to the UK General Data Protection Regulation are indeed in line with international human rights and child rights developments.

Key concerns raised in the submission include:

  • The need to maintain a clear boundary between the use of personal data for scientific research and the re-use of this scientific research for commercial purposes, with meaningful consent obtained for any further processing to occur.
  • The importance of requiring scientific and public interest research that identifies children to demonstrate fully how the acquisition and processing of data would affect children and how they can exercise their data rights.
  • The importance of not proceeding with the Government's proposed "limited, exhaustive list of legitimate interests" as a basis for processing data. Currently, legitimate interests can be a lawful basis for processing data only when processing is 'necessary' and when there is a balance between the interests of the data subjects and others' interests.
  • The need to maintain rather than remove (as proposed in the Consultation) the requirement for organisations to undertake data protection impact assessments. They argue that instead the additional safeguards of the data protection impact assessment and a child rights impact assessment should be mandated for use before and after the processing of childrenā€™s personal data.
  • The importance of retaining the requirement to have Data Protection Officers in all circumstances, especially educational contexts.
  • Maintaining free subject access requests, since charging families would adversely affect children's rights and make it harder for children and families to correct inaccurate information held about them, potentially affecting them negatively in both the short and long term.

The Government consultation closed in November 2021 and we await the outcome.

Using Artificial Intelligence in public services – does it breach people's privacy?

By Ros Edwards, Sarah Gorin and Val Gillies

As part of our research, we recently asked parents what they thought about the use of data linkage and predictive analytics to identify families to target public services.

They told us that they didn't trust these processes. This was particularly the case among marginalised social groups. In other words, the groups of parents who are most likely to be the focus of these AI identification practices are the least likely to see them as legitimate. Now a new report by the United Nations High Commissioner for Human Rights, Michelle Bachelet, highlights major concerns about the impact of artificial intelligence, including profiling, automated decision-making and machine learning, on individuals' right to privacy.

The report makes a number of recommendations, including a moratorium on the use of AI systems that pose a serious risk to human rights and the banning of social scoring of individuals by Governments or AI systems that categorise individuals into groups on discriminatory grounds.

The report, 'The right to privacy in the digital age' (2021), builds on two previous reports by the High Commissioner on the right to privacy in the digital age and incorporates the views of international experts at a virtual seminar, as well as responses to the High Commissioner's call for input from member states, including the U.K.

It examines the impact of digital systems such as artificial intelligence in four sectors, including public services. Artificial intelligence is used in public services such as social care, health, policing, social security and education in a range of ways, including decision-making about welfare benefits and flagging families for visits by children's social care services.

Concerns are expressed about the linking together of large health, education and social care data sets with other data held by private companies, such as social media companies or data brokers, who, the report says, may gather information outside protective legal frameworks. The involvement of private companies in the construction, development and management of public sector data systems also means they can gain access to data sets containing information about large parts of the population.

There are additional concerns about the potential inaccuracy of historic data and the implications of that for future decision-making. The report states that these systems unequally "expose, survey and punish welfare beneficiaries" and that conditions are imposed on individuals that can undermine their autonomy and choice.

In the Netherlands, a court banned a digital welfare fraud detection system, ruling that it infringed individuals' right to privacy. The system provided central and local authorities with the power to share and analyse data that were previously kept separately, including data on employment, housing, education, benefits and health insurance, as well as other forms of identifiable data. The tool targeted low-income and minority neighbourhoods, leading to de facto discrimination based on socioeconomic background.

The recommendations in the report include:

  • using a human rights based approach
  • ensuring legislation and regulation are in line with the risk to human rights, with sectors including social protection to be prioritised
  • development of sector specific regulation requirements
  • drastic improvements to transparency efforts, including the use of registers for AI that contain key information about AI tools and their use; informing affected individuals when decisions are being or have been made automatically or with the help of automation tools; and notifying individuals when the personal data they provide will become part of a data set used by an AI system.

Given the concerns about the risks that data linkage and predictive analytics pose to the human rights of individuals and families, it is vital to pay heed to the UN High Commissioner's call for a moratorium. Public authorities need to pay meaningful attention to the lack of social legitimacy for AI, as evidenced in our research, and to ask themselves whether the risk of further distrust and disengagement among already marginalised social groups, and the consequences for a cohesive and equal society, is worth it.

Would you like to take part in our research?

We are looking for parents to take part in the project. We want to know your views and experiences of the way that information about families is collected and used by policy-makers and service providers. 

There are two ways you can take part:

  1. As part of a group discussion – if you are a parent (of at least one child aged 16 or under), you can take part in an online group discussion that will last about 45 minutes.
  2. In a one-to-one discussion – if you are a parent (of at least one child aged 16 or under) and have had contact with family support services (this may be children's social work services, early years or a voluntary organisation that supports families), you can take part in an individual discussion with us that will last about 45 minutes, either online or by phone.

All group and individual participants will receive a £25 e-voucher in thanks for their time and trouble, and we can provide a top-up voucher for participants using pay-as-you-go.

The research has ethical approval from the University of Southampton.

If you would like to receive further information or to talk about the possibility of participating in the research, please contact Sarah Gorin, University of Southampton, at s.j.gorin@soton.ac.uk