Why parents need to understand how their family’s data is being used!

Our research team is encouraging parents to take time to better understand how their family’s data is collected, shared and used for service intervention. To help with that, we have created a short video animation, which you can view below.

This video has been produced with help from animation specialists Cognitive Media Ltd. It is designed to inform parents about how data on them and their children is collected, shared and used by local and national government services to identify families for intervention. 

The video is also designed to prompt discussion around data security, consent, and the extent of public acceptance and trust, or social licence, for these activities.

University of Southampton Principal investigator Professor Ros Edwards said:

We believe that policy developments and data linkage and analytics practices to inform service interventions are moving ahead of public knowledge and consent. We should all take time to understand this better and consider what it means. We hope this video will help prompt discussions among parents and groups working with parents to make sure we are better informed and better equipped to challenge these processes if we think they need challenging.

Co-investigator Professor Val Gillies from the University of Westminster added:

If you are a parent, a group working with parents, or simply someone with an interest in how our data is being used, watch our animation. If you’d like to know more about what we’ve found, and about other resources and outputs that might be of interest, take a look at the rest of our website.

Lack of transparency in privacy notices about children’s data

Privacy notices are there to tell users of services how information about them is being collected, used and linked together. As a parent, you may wish to know whether information about your child is being linked with other information about you or them, and who can see this information. While this seems straightforward, privacy notices are often opaque: they refer to broad categories of data use, and the specific ways data might be used and linked with other data can be hard to discern.

In 2023 the Government is changing the way it collects data about children who have an Education, Health and Care (EHC) plan, or for whom a plan has been requested. The data collected and used is changing from aggregated data (which does not identify individual children) to person-level data on every child.

The Department for Education has provided guidance for local authorities on how to write privacy notices on their websites to reflect this change.

However, the suggested privacy notice talks about the sharing of data but does not mention that children’s data may be linked to other data sources about families. This is what is written in the Department for Education’s accompanying guidance document:

Person level data will enable a better understanding of the profile of children and young people with EHC plans and allow for more insightful reporting. The person information will allow for linking to other data sources to further enrich the data collected on those with EHC plans. DfE (2022) p.11

Local authorities will be required to pass person-level data about children to the Department for Education, and yet it remains very unclear how the department will use it.
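To make concrete what “linking” person-level data to other sources can involve (and why aggregated counts cannot be linked in the same way), below is a minimal illustrative sketch. The datasets, field names and values are hypothetical and are not taken from the DfE guidance or any real collection; it simply shows that once each row identifies an individual child, records from different administrative sources can be joined on a shared identifier.

```python
# Hypothetical illustration only: invented field names and values,
# not the DfE's actual datasets or linkage process.
import pandas as pd

# Person-level EHC plan records (each row identifies an individual child).
ehc_plans = pd.DataFrame({
    "pupil_id": ["P001", "P002", "P003"],
    "ehc_plan_status": ["active", "requested", "active"],
})

# A second administrative dataset about the same children.
attendance = pd.DataFrame({
    "pupil_id": ["P001", "P002", "P004"],
    "absence_rate": [0.02, 0.11, 0.05],
})

# Because both tables carry an individual identifier, they can be joined,
# enriching one set of records with information from the other.
linked = ehc_plans.merge(attendance, on="pupil_id", how="left")
print(linked)

# Aggregated data (e.g. counts of plans per local authority) contains no such
# identifier, which is why the move to person-level collection is what makes
# this kind of linkage possible.
```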

Parents may also be forgiven for feeling concerned about the safety of their children’s information once it is passed on. The Information Commissioner’s Office (ICO) has reported a serious breach involving children’s data, in which a database of children’s data held by the DfE was used to provide age-verification services to gambling companies.

Department for Education reprimanded by ICO for children’s information data breach

The Department for Education (“DfE”) has been reprimanded by the ICO for a data breach arising from the unlawful processing of personal data, including children’s data contained in approximately 28 million records, between 2018 and 2020. The DfE had provided the screening company Trust Systems Software UK Ltd (“Trustopia”) with access to the Learning Records Service (“LRS”), a database containing pupils’ learning records used by schools and higher education institutions. Despite not being a provider of educational services, Trustopia was allowed access to the LRS and used the database for age verification services, which were offered to gambling companies (to confirm their customers were over 18).

The ICO determined that the DfE had failed to protect against the unauthorised processing of data contained in the LRS. As the data subjects were unaware of the processing and unable to object or withdraw consent to the processing, the ICO deemed that the DfE had breached Article 5(1)(a) UK GDPR. Additionally, the DfE had failed to ensure the confidentiality of the data contained in the LRS, in breach of the DfE’s security obligations pursuant to Article 5(1)(f) UK GDPR.

In the reprimand the ICO noted that, but for the DfE being a public authority, the ICO would have fined the DfE just over £10 million. The reprimand from the ICO sets out the remedial actions that the DfE needs to take to improve its compliance with the UK GDPR, including: (1) improving the transparency of the LRS so that data subjects are able to exercise their rights under the UK GDPR; and (2) reviewing internal security procedures to reduce the likelihood of further breaches in the future. The DfE has since removed access to the LRS for 2,600 of the 12,600 organisations which originally had access to the database.

See:

  • https://ico.org.uk/media/action-weve-taken/4022280/dfe-reprimand-20221102.pdf
  • Department for Education (2022) Special educational needs person level survey 2023: guide. https://www.gov.uk/government/publications/special-educational-needs-person-level-survey-2023-guide

Education data futures: book launch

Our research team is pleased to share details of the launch of a book which includes findings from our project.

The Digital Futures Commission launch of Education Data Futures is being held on World Children’s Day, November 21, 2022.

The book, a collection of essays from regulators, specialists and academics working on the problems and possibilities of children’s education data, is being launched by Baroness Beeban Kidron and Sonia Livingstone who will be joined by a range of other guests.

Our project is delighted to have contributed a chapter to the book that outlines some of our findings about the extent to which parents from different social groups trust schools and other public services to share and electronically link data about their children and family. The chapter goes on to relate these findings to the wider issues of legitimacy and suspicion that underpin social licence, as well as to the implications for government efforts to bring together and use administrative records from different sources.

We argue that government and public services need to engage in greater transparency and accountability to parents, enabling them to challenge and dissent from electronic merging of their data, but that efforts towards informing parents are likely to be received and judged quite differently among different social groups of parents.

The book is open access and, after the launch, will be downloadable from the Digital Futures Commission’s website, where hard copies may also be ordered.

Governments’ use of automated decision-making systems reflects systemic issues of injustice and inequality

By Joanna Redden, Associate Professor, Information and Media Studies, Western University, Canada

In 2019, former UN Special Rapporteur Philip Alston said he was worried we were “stumbling zombie-like into a digital welfare dystopia.” He had been researching how government agencies around the world were turning to automated decision-making systems (ADS) to cut costs, increase efficiency and target resources. ADS are technical systems designed to help or replace human decision-making using algorithms.

Alston was worried for good reason. Research shows that ADS can be used in ways that discriminate, exacerbate inequality, infringe upon rights, sort people into different social groups, wrongly limit access to services and intensify surveillance.

For example, families have been bankrupted and forced into crises after being falsely accused of benefit fraud.

Researchers have identified how facial recognition systems and risk assessment tools are more likely to wrongly identify people with darker skin tones and women. These systems have already led to wrongful arrests and misinformed sentencing decisions.

Often, people only learn that they have been affected by an ADS application when one of two things happen: after things go wrong, as was the case with the A-levels scandal in the United Kingdom; or when controversies are made public, as was the case with uses of facial recognition technology in Canada and the United States.

Automated problems

Greater transparency, responsibility, accountability and public involvement in the design and use of ADS are important to protect people’s rights and privacy. There are three main reasons for this:

  1. these systems can cause a lot of harm;
  2. they are being introduced faster than necessary protections can be implemented; and
  3. there is a lack of opportunity for those affected to make democratic decisions about whether they should be used and, if so, how they should be used.

Our latest research project, Automating Public Services: Learning from Cancelled Systems, provides findings aimed at helping prevent harm and contribute to meaningful debate and action. The report provides the first comprehensive overview of systems being cancelled across western democracies.

Researching the factors and rationales leading to cancellation of ADS systems helps us better understand their limits. In our report, we identified 61 ADS that were cancelled across Australia, Canada, Europe, New Zealand and the U.S. We present a detailed account of systems cancelled in the areas of fraud detection, child welfare and policing. Our findings demonstrate the importance of careful consideration and concern for equity.

Reasons for cancellation

There are a range of factors that influence decisions to cancel the uses of ADS. One of our most important findings is how often systems are cancelled because they are not as effective as expected. Another key finding is the significant role played by community mobilization and research, investigative reporting and legal action.

Our findings demonstrate there are competing understandings, visions and politics surrounding the use of ADS.

[Image: a table showing the range of factors that influence decisions to cancel ADS systems. (Data Justice Lab), Author provided]

Hopefully, our recommendations will lead to increased civic participation and improved oversight, accountability and harm prevention.

In the report, we point to widespread calls for governments to establish resourced ADS registers as a basic first step to greater transparency. Some countries, such as the U.K., have stated plans to do so, while others, like Canada, have yet to move in this direction.

Our findings demonstrate that the use of ADS can lead to greater inequality and systemic injustice. This reinforces the need to be alert to how the use of ADS can create differential systems of advantage and disadvantage.

Accountability and transparency

ADS need to be developed with care and responsibility by meaningfully engaging with affected communities. There can be harmful consequences when government agencies do not engage the public in discussions about the appropriate use of ADS before implementation.

This engagement should include the option for community members to decide areas where they do not want ADS to be used. Examples of good government practice can include taking the time to ensure independent expert reviews and impact assessments that focus on equality and human rights are carried out.

[Image: a list of recommendations for governments using ADS systems; governments can take several different approaches to implement ADS in a more accountable manner. (Data Justice Lab), Author provided]

We recommend strengthening accountability for those wanting to implement ADS by requiring proof of accuracy, effectiveness and safety, as well as reviews of legality. At minimum, people should be able to find out if an ADS has used their data and, if necessary, have access to resources to challenge and redress wrong assessments.

There are a number of cases listed in our report where government agencies’ partnership with private companies to provide ADS services has presented problems. In one case, a government agency decided not to use a bail-setting system because the proprietary nature of the system meant that defendants and officials would not be able to understand why a decision was made, making an effective challenge impossible.

Government agencies need to have the resources and skills to thoroughly examine how they procure ADS systems.

A politics of care

All of these recommendations point to the importance of a politics of care. This requires those wanting to implement ADS to appreciate the complexities of people, communities and their rights.

Key questions need to be asked about how the use of ADS creates blind spots. Scoring and sorting systems increase the distance between administrators and the people they are meant to serve, and can oversimplify, infer guilt, wrongly target and stereotype people through categorizations and quantifications.

Good practice, in terms of a politics of care, involves taking the time to carefully consider the potential impacts of ADS before implementation and being responsive to criticism, ensuring ongoing oversight and review, and seeking independent and community review.

Drawing parallels – the processing of data about children in education and social care

By Sarah Gorin, Ros Edwards and Val Gillies

During our research, we have been learning more about the ways that Government agencies such as health, social care and education collect, process and join up information about families. Schools, like other Government agencies, collect and process an increasing volume of information about children. Data is collected for administrative purposes, such as monitoring attendance, attainment, progress and performance; for safeguarding children; and to promote and support education and learning.

Information about children is not only captured by schools for their own purposes and purposes determined by the Government, but also by private educational technology (EdTech) companies, who gather data on children via apps that may be free to download and recommended by teachers as promoting learning. These companies may sell on information for marketing or research purposes. Since the pandemic, the use of EdTech has grown exponentially, meaning the data being gathered on children both through schools and by EdTech providers is greater still, raising the stakes in terms of the protection of children’s personal data.

A new report by the Digital Futures Commission (DFC), ‘Education Data Reality: The challenges for schools in managing children’s education data’, examines the views of professionals who work in or with schools on the procurement of, data protection for, and uses of digital technologies in schools. The report describes the range of EdTech used in schools and the complex issues that managing it presents.

In a blog about the report, the main author Sarah Turner highlights four key issues that constrain children’s best interests:

  • The benefits of EdTech and the data processed from children in schools are currently not discernible or in children’s best interests. Nor are they proportionate to the scope, scale and sensitivity of data currently processed from children in schools.
  • Schools have limited control or oversight over data processed from children through their uses of EdTech. The power imbalance between EdTech providers and schools is structured in the terms of use they signed up to and exacerbated by external pressure to use some EdTech services.
  • There is a distinct lack of comprehensive guidance for schools on how to manage EdTech providers’ data practices. Nor is there a minimum standard for acceptable features, data practices and evidence-based benefits for schools to navigate the currently fragmented EdTech market and select appropriate EdTech that offers educational benefits proportionate to the data it processes.
  • Patchy access to and security of digital devices at school and home due to cost and resource barriers means that access to digital technologies to deliver and receive education remains inequitable.

The report focuses on the processing of education data about families; however, there are many interesting parallels with the findings from our project on the way data about families is collected, processed and used by local authorities:

  • Firstly, there is a lack of evidence about the benefits of the use of digital technologies in both schools and local authorities, and a lack of understanding about the risks to children’s data privacy.
  • There is a lack of government guidance for schools, as there is for local authorities, about the digital technologies they employ, meaning that organisations are left individually responsible for ensuring they are compliant with the General Data Protection Regulation (GDPR).
  • Schools, like local authorities, are time-, resource- and expertise-poor. Often neither has the data protection expertise to understand and weigh the risks versus the benefits of data processing for children’s best interests.
  • There is a lack of transparency in how data is collected, handled and processed by Government agencies as well as third parties who gain access to data about families, either through children using their apps for educational purposes or through local authorities employing them for the development of predictive analytics systems.
  • Public awareness and understanding of how data is collected and processed, and of the risks data sharing poses to children’s privacy, are low; these issues are not well understood by parents or children.

We welcome this new report by the Digital Futures Commission and hope that it stimulates more discussion and awareness amongst professionals and families.