Voices come together to highlight the need to strengthen children’s data rights in response to Government consultation ‘Data: a new direction’

By Sarah Gorin, Ros Edwards and Val Gillies

As part of our research project, we are constantly trying to keep up with developments in the Government’s use of children’s data. In Autumn 2021 the UK Government released a consultation, ‘Data: a new direction’, to which we responded, drawing on our learning from parents who have participated in our research.

Our submission highlighted the already fragile nature of public trust in joining up administrative data about families, particularly amongst marginalised groups. It called for a pause on data sharing and artificial intelligence (AI) systems in the UK, for greater regulation, and for the safeguarding of families’ data from use by private companies.

Fortunately, we are not alone in raising concerns about the proposals.

Organisations such as the 5Rights Foundation and DefendDigitalMe have also made submissions to the consultation that highlight the lack of attention to the impact of these proposals on children and their families.

In this blog, we summarise some of the key recommendations made in response to the consultation by LSE’s Professor Sonia Livingstone and 5Rights Foundation researcher Dr Kruakae Pothong, who have drawn on their extensive research experience to inform their children’s rights-based response (for more detail see their LSE blog).

Professor Livingstone and Dr Pothong’s submission highlights the need for greater, not lesser, support for children’s rights as data subjects and questions whether the proposed changes to the UK General Data Protection Regulation are in line with international human rights and children’s rights developments.

Key concerns raised in the submission include:

  • The need to maintain a clear boundary between the use of personal data for scientific research and the re-use of this scientific research for commercial purposes, with meaningful consent obtained for any further processing to occur.
  • The importance of requiring scientific and public interest research that identifies children to demonstrate fully how the acquisition and processing of data would affect children and how they can exercise their data rights.
  • The importance of not proceeding with the Government’s proposed “limited, exhaustive list of legitimate interests” as a basis for processing data. Currently, legitimate interests can be a lawful basis for processing data only when processing is ‘necessary’ and when there is a balance between the interests of the data subjects and others’ interests.
  • The need to maintain rather than remove (as proposed in the Consultation) the requirement for organisations to undertake data protection impact assessments. They argue that instead the additional safeguards of the data protection impact assessment and a child rights impact assessment should be mandated for use before and after the processing of children’s personal data.
  • The importance of retaining the requirement to have Data Protection Officers in all circumstances, especially educational contexts.
  • Maintaining free subject access requests, as charging families would adversely affect children’s rights and make it harder for children and families to correct inaccurate information held about them, potentially harming them in the short and long term.

The Government consultation closed in November 2021 and we await the outcome.

Generating transparency where none exists: just how are data analytics used in children’s services?

By Val Gillies, Ros Edwards and Sarah Gorin 

The Government’s public consultation on changes to the data protection framework emphasises the importance of public trust and transparency. But when, as part of our research, we tried to establish basic facts about the extent to which local authorities are linking and analysing data on children and families, we hit a brick wall.

Our research aims to provide a clearer picture of what parents think about the ways information about them and their children may be linked together and used by local councils. An important part of this work has been speaking directly to parents to gauge how much support for, and trust in, this type of activity there is. Alongside these valuable and enlightening conversations, we have also been trying to map the state of play among British local authorities and to find out exactly which authorities are doing what when it comes to operational data linking and matching and the application of predictive analytics to families’ data.
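To make these terms concrete, here is a purely hypothetical sketch, in Python using the pandas library, of what ‘linking and matching’ records about a family and applying a simple predictive score could look like. The datasets, field names, weights and threshold are all invented for illustration; they are not drawn from any council’s actual system.

```python
# Illustrative sketch only: invented data and scoring rule, not any council's real system.
import pandas as pd

# Two hypothetical service datasets sharing a common family identifier.
education = pd.DataFrame({
    "family_id": [1, 2, 3],
    "school_absence_rate": [0.02, 0.15, 0.30],
})
housing = pd.DataFrame({
    "family_id": [1, 2, 3],
    "rent_arrears": [0, 1, 1],
})

# 'Data linkage': join records about the same family held by different services.
linked = education.merge(housing, on="family_id")

# 'Predictive analytics': even a simple weighted score like this is an algorithm
# that ranks families for possible intervention.
linked["risk_score"] = (
    0.7 * linked["school_absence_rate"] + 0.3 * linked["rent_arrears"]
)

# Families above an arbitrary threshold get flagged for attention.
flagged = linked[linked["risk_score"] > 0.2]
print(flagged[["family_id", "risk_score"]])
```

Even a toy example like this shows that any such scoring rule is an algorithm, however the organisations using it choose to describe it.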

The Government’s declared commitment is to roll out a ‘world class nationwide digital infrastructure’ and ‘unlock the power of data’, but there is currently no central record available of which authorities are doing what.

Freedom of information requests

To try to find out, we submitted Freedom of Information requests to 220 local authorities across the UK. The 149 English councils participating in the ‘Troubled Families Programme’ (now called the Supporting Families Programme) must, by necessity, link and analyse datasets to ‘identify’ ‘troubled’ households and claim payment-by-results from central government. Yet only 76 responded that they used data analytics. The remainder claimed that their systems did not meet our definition or responded with a straight ‘no’ to all our questions about their use.

English councils claiming to fall outside our criteria responded in vague and evasive ways. For example, some responded ‘no’ when asked about their engagement with data analytics, either positioning family work as separate from children’s services or preferring the term ‘data matching’. Further investigation established that many of these councils do in fact use systems with predictive analytic capacity.

For example, Achieving for Children, a social enterprise providing services for children in several local authorities, responded to our FoI that beyond ‘some basic data monitoring … we do not currently use nor have we previously used any data analytics, predictive analytics and or artificial intelligence systems to assist with this work’. Yet they have used business intelligence technologies on a range of projects using predictive analytics/algorithms, as noted on the UK Authority Digital Data and Technology for the Public Good website.

Side-stepping terms

Our FoI research also showed that councils side-stepped the term ‘algorithm’ and the concept of AI. Even where they engaged in predictive analytics, they denied they were using algorithms, although it is hard to envisage one without the other given the tools they were employing.

Much of the information we received was incomplete, insufficient or simply irrelevant. A number of councils claimed exemption from the FoI on grounds of cost or commercial confidentiality. Where we followed up with more carefully worded requests, we received ambiguously worded replies.

Some local authorities were more forthcoming and open, listing the various tools and companies used to conduct their data analytics, with Microsoft Business Intelligence the most commonly cited tool. Dorset County Council has a case study on the Local Government Association website of how the tool can be used to ‘enable local professionals to identify potential difficulties for individual children before they become serious problems’. Our FoI established that the council plans to make greater use of AI in the future.

Our analysis of the responses we received, together with information we have sourced from elsewhere, points to a clear shift in service priorities away from early intervention for parental education towards child protection and crime prevention. The earlier focus on linking parenting skills to social mobility is now muted, with rationales for data innovation focusing almost exclusively on the pre-emption of problems rather than on the maximisation of children’s future potential.

Our findings around children’s services are reinforced by work by Big Brother Watch, which has published a detailed analysis of the use of hidden algorithms by councils that use trackers to identify disadvantaged households in order to target them for interventions. The organisation found one of the biggest players in this area, Xantura, to be particularly secretive.

Public trust 

Local authorities draw on a wide range of data ‘solutions’ to classify, flag, target and intervene in disadvantaged families and their children. Yet parents are not generally informed of how their information is being used and for what purpose. As we have shown, even as researchers we found this difficult to establish.

From our work here, it is hard to see how public trust and transparency will be achieved from the opaque, ambiguous and even evasive base that our FoI requests revealed.

Freedom of Information Requests on the Use of Data Analytics in Children’s Services: Generating Transparency is research by Val Gillies and Bea Gardner, with Ros Edwards and Sarah Gorin. 

Using Artificial Intelligence in public services – does it breach people’s privacy?

By Ros Edwards, Sarah Gorin and Val Gillies

As part of our research, we recently asked parents what they thought about the use of data linkage and predictive analytics to identify families to target public services.

They told us that they didn’t trust these processes. This was particularly the case among marginalised social groups. In other words, the groups of parents who are most likely to be the focus of these AI identification practices are least likely to see them as legitimate. Now a new report by the United Nations High Commissioner for Human Rights, Michelle Bachelet, highlights major concerns about the impact of artificial intelligence, including profiling, automated decision-making and machine learning, upon individuals’ right to privacy.

The report makes a number of recommendations, including a moratorium on the use of AI systems that pose a serious risk to human rights and the banning of social scoring of individuals by Governments or AI systems that categorise individuals into groups on discriminatory grounds.

‘The right to privacy in the digital age’ report (2021) builds on two previous reports by the High Commissioner looking at the right to privacy in the digital age and incorporates the views of international experts at a virtual seminar, as well as responses to the High Commissioner’s call for input into the report from member states, including the UK.

It examines the impact of digital systems such as artificial intelligence in four sectors, including public services. Artificial intelligence is used in public services such as social care, health, police, social security and education in a range of ways, such as decision-making about welfare benefits and flagging families for visits by children’s social care services.

Concerns are expressed about the linking together of, for example, large health, education and social care data sets with other data held by private companies, such as social media companies or data brokers who, the report says, may gather information outside protective legal frameworks. The involvement of private companies in the construction, development and management of public sector data systems also means they can gain access to data sets containing information about large parts of the population.

There are additional concerns about the potential inaccuracy of historic data and the implications of that for future decision-making. The report states that these systems unequally “expose, survey and punish welfare beneficiaries” and that conditions are imposed on individuals that can undermine their autonomy and choice.

A digital welfare fraud detection system was banned by a court in the Netherlands, which ruled that it infringed individuals’ right to privacy. The system provided central and local authorities with the power to share and analyse data that were previously kept separately, including on employment, housing, education, benefits and health insurance, as well as other forms of identifiable data. The tool targeted low-income and minority neighbourhoods, leading to de facto discrimination based on socioeconomic background.

The recommendations in the report include:

  • using a human rights based approach
  • ensuring legislation and regulation are in line with the risk to human rights, with sectors including social protection to be prioritised
  • development of sector specific regulation requirements
  • drastic improvements to efforts regarding transparency, including use of registers for AI that contain key information about AI tools and their use, informing affected individuals when decisions are being or have been made automatically or with the help of automation tools, and notifying individuals when the personal data they provide will become part of a data set used by an AI system.

Given the concerns about the risks that data linkage and predictive analytics pose to the human rights of individuals and families, it is vital to pay heed to the UN High Commissioner’s call for a moratorium. Public authorities need to pay meaningful attention to the lack of social legitimacy for AI, as evidenced in our research, and to ask themselves whether the risk of further distrust and disengagement from already marginalised social groups, and the consequences for a cohesive and equal society, is worth it.

A murky picture – who uses data linkage and predictive analytics to intervene in families’ lives?

In the first of a series of blogs discussing key issues and challenges that arise from our project, Dr Sarah Gorin discusses the problems encountered by our team as we try to find out which local authorities in the UK are using data linkage and predictive analytics to help them make decisions about whether to intervene in the lives of families.

As background context to our project, it seemed important to establish how many local authorities are using data linkage and predictive analytics with personal data about families, and in what ways. To us this seemed a straightforward question, and yet it has been surprisingly hard to gain an accurate picture. Six months into our project, we are still struggling to find out.

In order to get some answers, we have been reaching out to other interested parties and have had numerous people get in touch with us too: from academic research centres, local authorities and independent foundation research bodies, to government-initiated research and evaluation centres. Even government-linked initiatives are finding this difficult, not just us academic outsiders!

So what are the issues that have been making this so difficult for us and others?

No centralised system of recording

One of the biggest problems is finding information. There is currently no centralised way that local authorities routinely record their use of personal data about families for data linkage or predictive analytics. In 2018, the Guardian highlighted the development of the use of predictive analytics in child safeguarding and the associated concerns about ethics and data privacy. They wrote:

“There is no national oversight of predictive analytics systems by central government, resulting in vastly different approaches to transparency by different authorities.”

This means that it is very difficult for anyone to find out relevant information about what is being done in their own or other local authorities. Not only does this have ethical implications in terms of the transparency, openness and accountability of local authorities but also, more importantly, it means that families who experience interventions by services are unlikely to know how their data has been handled and what criteria have been used to identify them.

Several European cities are trialling the use of a public register for mandatory reporting of the use of algorithmic decision-making systems. The best way to take this forward is being discussed here and in other countries.
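By way of illustration, the sketch below sets out the kind of fields an entry in such a register might record. This is our own hypothetical structure, loosely inspired by the city algorithm registers being trialled in Europe, not a description of any existing scheme.

```python
# Hypothetical sketch of a public algorithm register entry; field names are our own invention.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AlgorithmRegisterEntry:
    system_name: str                 # the tool or project name
    responsible_department: str      # which part of the authority owns it
    purpose: str                     # what decisions the system informs
    data_sources: List[str] = field(default_factory=list)  # datasets linked or matched
    legal_basis: str = ""            # lawful basis for processing under GDPR
    human_oversight: str = ""        # how automated outputs are reviewed
    contact: str = ""                # where affected residents can ask questions

# Example entry (entirely fictitious).
entry = AlgorithmRegisterEntry(
    system_name="Family support risk model (hypothetical)",
    responsible_department="Children's services",
    purpose="Flag families for early-help assessment",
    data_sources=["education", "housing", "children's social care"],
    legal_basis="Public task",
    human_oversight="Scores reviewed by a practitioner before any contact is made",
    contact="dataprotection@example-council.gov.uk",
)
print(entry)
```

A register entry along these lines would at least let families see which datasets about them had been linked, for what purpose, and on what legal basis.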

Pace of change

Another issue is the pace of change. Searching the internet for information about which local authorities are linking families’ personal data and using it for predictive analytics is complicated by the lack of a common language to describe the issues. A myriad of terms is being used, and they change over time: ‘data linkage’; ‘data warehousing’; ‘risk or predictive analytics’; ‘artificial intelligence’ (AI); ‘machine learning’; ‘predictive algorithms’; ‘algorithmic or automated decision-making’, to name but a few.

The speed of change also means that whilst some local authorities that were developing systems several years ago may have cancelled or paused their use of predictive analytics, others may have since started to develop it.

The Cardiff University Data Justice Lab, in partnership with the Carnegie UK Trust, is undertaking a project to map where and why government departments and agencies in Europe, Australia, Canada, New Zealand and the United States have decided to pause or cancel their use of algorithmic and automated decision support systems.

General Data Protection Regulation (GDPR)

GDPR, and the variation in the way in which it is being interpreted, may be another significant problem preventing us from getting to grips with what is going on. Under GDPR, individuals have the right to be informed about:

  • the collection and use of their personal data
  • information including the purposes for processing personal data
  • retention periods for data held
  • and with whom personal data will be shared

As part of their responsibilities under GDPR, local authorities should publish a privacy notice which includes the lawful basis for processing data as well as the purposes of the processing. However, the way that local authorities interpret this seems to vary, as does the quality, amount of detail and level of transparency of the information in privacy notices. Local authorities may provide only general statements about the deployment of predictive analytics and can lack transparency about exactly what data is being used and for what purpose.

Lack of transparency

This lack of transparency was identified in a review by the Committee on Standards in Public Life, which published a report on Artificial Intelligence and Public Standards in February 2020. The report highlighted that the Government and public sector organisations are failing to be sufficiently open. It stated:

“Evidence submitted to this review suggests that at present the government and public bodies are not sufficiently transparent about their use of AI. Many contributors, including a number of academics, civil society groups and public officials said that it was too difficult to find out where the government is currently using AI. Even those working closely with the UK government on the development of AI policy, including staff at the Alan Turing Institute and the Centre for Data Ethics and Innovation, expressed frustration at their inability to find out which government departments were using these systems and how.” (p.18)

Whilst some local authorities seem less than forthcoming in divulging information, this is not the case for all. For example, in Essex, a Centre for Data Analytics has been formed as a partnership between Essex County Council, Essex Police and the University of Essex. They have developed a website and associated media that provide information about the predictive analytics projects they are undertaking using families’ data from a range of partners, including the police and health services.

So what are we doing?

As part of our project on parental social licence for data linkage and analytics, our team is gathering information through internet searching and snowballing to put together as full a picture as we can, and we will continue to do so throughout the course of the project. So far, the most useful sources of information have included:

  • the Cardiff University Data Justice Lab report that examines the uses of data analytics in public services in the UK, through both Freedom of Information requests to all local authorities and interviews/workshops with stakeholders
  • the WhatDoTheyKnow website, which allows you to search previous FOI requests
  • internet searches for relevant local authority documents, such as commissioning plans, community safety strategies and Local Government Association Digital Transformation Strategy reports
  • media reports
  • individual local authority and project websites

It would seem we have some way to go yet, but it is a work in progress!

If you are interested in this area, we’d be pleased to hear of others’ experiences. And if you’d like to contribute a blog on this or a related topic, do get in touch via our email: datalinkingproject@gmail.com