Drawing parallels – the processing of data about children in education and social care

By Sarah Gorin, Ros Edwards and Val Gillies

During our research, we have been learning more about the ways that Government agencies such as health, social care and education collect, process and join up information about families. Schools, like other Government agencies, collect and process an increasing volume of information about children. Data is collected for administrative purposes, such as monitoring attendance, attainment, progress and performance; for safeguarding children; and to promote and support education and learning.

Information about children is not only captured by schools for their own purposes and for purposes determined by the Government, but also by private educational technology (EdTech) companies, which gather data on children via apps that may be free to download and recommended by teachers as promoting learning. These companies may sell on information for marketing or research purposes. Since the pandemic, the use of EdTech has grown rapidly, meaning the data being gathered on children both through schools and by EdTech providers is greater still, raising the stakes for the protection of children’s personal data.

A new report by The Digital Futures Commission (DFC) ‘Education Data Reality: The challenges for schools in managing children’s education data’ examines the views of professionals who work in or with schools on the procurement of, data protection for, or uses of digital technologies in schools. The report describes the range of EdTech used in schools and the complex issues that managing it presents.

In a blog about the report, the main author Sarah Turner highlights four key issues that constrain children’s best interests:

  • The benefits of EdTech and the data processed from children in schools are currently not discernible or in children’s best interests. Nor are they proportionate to the scope, scale and sensitivity of data currently processed from children in schools.
  • Schools have limited control or oversight over data processed from children through their uses of EdTech. The power imbalance between EdTech providers and schools is embedded in the terms of use that schools sign up to, and exacerbated by external pressure to use some EdTech services.
  • There is a distinct lack of comprehensive guidance for schools on how to manage EdTech providers’ data practices. Nor is there a minimum standard for acceptable features, data practices and evidence-based benefits for schools to navigate the currently fragmented EdTech market and select appropriate EdTech that offers educational benefits proportionate to the data it processes.
  • Patchy access to and security of digital devices at school and home due to cost and resource barriers means that access to digital technologies to deliver and receive education remains inequitable.

The report focuses on the processing of education data about families; however, there are many interesting parallels with the findings from our project on the way data about families is collected, processed and used by local authorities:

  • Firstly, there is a lack of evidence about the benefits of the use of digital technologies in both schools and in local authorities and a lack of understanding about the risks to children’s data privacy.
  • There is a lack of government guidance for schools, as there is for local authorities, about the digital technologies they employ, meaning that organisations are left individually responsible for ensuring that they are compliant with the General Data Protection Regulation (GDPR).
  • Schools, like local authorities, are time, resource and expertise poor. Often neither have the data protection expertise to understand and weigh the risks and benefits of data processing for children’s best interests.
  • There is a lack of transparency in how data is collected, handled and processed by Government agencies as well as third parties who gain access to data about families, either through children using their apps for educational purposes or through local authorities employing them for the development of predictive analytics systems.
  • Public awareness of how data is collected and processed, and of the risks that data sharing poses to children’s privacy, is low; these issues are not well understood by parents or children.

We welcome this new report by the Digital Futures Commission and hope that it stimulates more discussion and awareness amongst professionals and families.

Problem-solving for Problem-solving: Data Analytics to Identify Families for Service Intervention

Presentation at the British Sociological Association Annual Conference 2021

Project PI Ros Edwards presented findings from our project at the British Sociological Association annual conference 2021.

The paper, Problem-solving for Problem-solving: Data Analytics to Identify Families for Service Intervention, looks at the way that the promise of technological fixes in the family policy field has created dependencies between public services and data analytics companies, entrenching a focus on individual families, rather than social conditions, as the source of social problems.

Watch Ros’ presentation.

The paper that was the basis of Ros’s presentation has now been published in Critical Social Policy.

What do parents think about linking information for family interventions? What we know to date.

Our project team look at responses to a pilot survey of parents undertaken ahead of our new research.

National and local government departments and services collect and hold a range of information about parents and their children. These records include details of the taxes people pay, their medical records, school data and police files. But what do parents think about the idea of the data held on their children being linked with theirs, so that government services can identify families for possible interventions? A pilot survey leading up to a new research project indicates they’re not very keen on the idea. 

Data linkage is a hotly debated topic, with arguments about better targeted, more focused public services often countered by concerns around privacy and a lack of trust in the organisations collecting and using the data.
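For readers who want a concrete picture of what “linking” records means in practice, it is essentially a join on a shared identifier across datasets held by different services. A minimal, purely illustrative Python sketch (all records, field names and identifiers below are invented for illustration, not drawn from any real system):

```python
# Hypothetical records from two services, keyed on a shared child identifier.
school = {101: {"attendance_pct": 96}, 102: {"attendance_pct": 78}, 103: {"attendance_pct": 88}}
health = {101: {"gp_visits": 1}, 103: {"gp_visits": 5}, 104: {"gp_visits": 2}}

# Linkage here is an inner join on the shared identifier: only children
# appearing in both datasets produce a combined record.
linked = {
    cid: {**school[cid], **health[cid]}
    for cid in school.keys() & health.keys()
}

print(sorted(linked))  # identifiers of children present in both datasets
```

Real-world linkage is far messier than this sketch suggests (inconsistent identifiers, spelling variations, probabilistic matching), which is one reason data accuracy features among the concerns discussed below.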

When it comes to the use of linked data to identify families where it’s thought there may be heightened risk of things such as child abuse or neglect, or truancy and anti-social behaviour, there can be few more contentious areas. But real evidence on what parents think is thin on the ground and this is where our research project hopes to make some inroads and get a better handle on things. 

Pilot survey

A voluntary pilot survey of parents carried out via the Mumsnet website and Twitter received a total of 365 responses – mostly from white mothers in relatively well-off households. We might assume that this group would feel more secure about the linkage and use of records on them and their children, and that they would generally be trusting of organisations such as government departments and local councils. So what did we find?

Only half the parents said they had heard about data linkage and how it worked. Fewer than half thought it acceptable to use it to improve the planning and delivery of family support services. Far fewer (around 15 per cent) thought it should be used to identify specific families who might need intervention but hadn’t asked for support, or to save public money by preventing or catching family problems early.

Their concerns included: families’ right to privacy; increasing stigma; oversimplification of risk factors; discouraging families from seeking help; and problems with data accuracy and safety. About 1 in 5 did agree that the more we know about families, the more the nation’s wellbeing can be improved.

Question of trust

There was little trust in organisations that might link data for the above reasons, and specific distrust of private companies: only 4 parents said they would definitely trust them, while 309 said they wouldn’t. One participant said:

“Having algorithms produced by private companies, who are not transparent, making dubious links, which do correlate with truth, is a very dangerous tool. It will not benefit society but will only benefit private companies.” 

Parents were almost entirely against government accessing financial details such as bank details, credit cards, supermarket records, CCTV and social media posts as the following comments from participants show:

“More detailed and personal information, PARTICULARLY medical records/social services reports/DLA info, should not be ‘joined up’ as this breaches personal privacy.”  

“It might be acceptable to link to credit records and food shopping records on an anonymous basis to produce aggregate information or statistics for research purposes e.g. a study looking at the possible relationships between children’s diets and school attainment, or the impact of debt on parents BUT this sort of data analysis should NEVER identify individuals.”

“Government surveillance of families, without knowledge or consent, is an extremely questionable approach. Government should focus on supporting families, providing public services, creating jobs, ensuring quality of housing, eliminating poverty and increasing community capacities of resilience, care, safety, cohesion and fun.”

Lack of acceptance

So among this group of parents, where we might have expected greater levels of support for the idea of linking data for the purposes of identifying and targeting service intervention, we find, in fact, a substantial lack of acceptance and trust around the idea of family data linkage.

A web-based survey of this nature gives us some indication of what people are thinking, but it’s important to look at this more carefully. As part of our new research project, we’re commissioning a robustly designed and conducted survey of around 900 parents and carrying out a range of in-depth interviews with parents in families most likely to be the target of this type of data linkage and the policies associated with it.