Our research team is encouraging parents to take time to better understand how their family’s data is collected, shared and used for service intervention. To help with that, we have created a short video animation, which you can view below.
This video has been produced with help from animation specialists Cognitive Media Ltd. It is designed to inform parents about how data on them and their children is collected, shared and used by local and national government services to identify families for intervention.
The video is also designed to prompt discussion around data security, consent, and the extent of public acceptance and trust or social licence in these activities.
University of Southampton Principal investigator Professor Ros Edwards said:
We believe that policy developments and data linkage and analytics practices to inform service interventions are moving ahead of public knowledge and consent. We should all take time to understand this better and consider what it means. We hope this video will help prompt discussions among parents and groups working with parents, to make sure we are better informed and better equipped to challenge these processes if we think they need challenging.
Co-investigator Professor Val Gillies from the University of Westminster added:
If you are a parent, a group working with parents, or simply someone with an interest in how our data is being used, watch our animation. If you’d like to know more about what we’ve found, and about other resources and outputs that might be of interest, take a look at the rest of our website.
Our research team is pleased to share details of the launch of a book which includes findings from our project.
The Digital Futures Commission launch of Education Data Futures is being held on World Children’s Day, November 21, 2022.
The book, a collection of essays from regulators, specialists and academics working on the problems and possibilities of children’s education data, is being launched by Baroness Beeban Kidron and Sonia Livingstone who will be joined by a range of other guests.
Our project is delighted to have contributed a chapter to the book which outlines some of our findings about the extent to which parents from different social groups trust schools and other public services to share and electronically link data about their children and family. The chapter goes on to relate these findings to the wider social licence issues of legitimacy and suspicion, as well as to the implications for government efforts to bring together and use administrative records from different sources.
We argue that government and public services need to engage in greater transparency and accountability to parents, enabling them to challenge and dissent from electronic merging of their data, but that efforts towards informing parents are likely to be received and judged quite differently among different social groups of parents.
The book is open access and, after the launch, will be downloadable from the Digital Futures Commission’s website, where hard copies may also be ordered.
By Joanna Redden, Associate Professor, Information and Media Studies, Western University, Canada
In 2019, former UN Special Rapporteur Philip Alston said he was worried we were “stumbling zombie-like into a digital welfare dystopia”. He had been researching how government agencies around the world were turning to automated decision-making systems (ADS) to cut costs, increase efficiency and target resources. ADS are technical systems designed to help or replace human decision-making using algorithms.
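As a purely illustrative example (not drawn from any real system), an ADS in an area such as fraud detection may amount to little more than a coded rule applied to case records; the field names and thresholds below are invented.

```python
# A minimal, invented illustration of an automated decision-making system:
# a coded rule applied to case data that assists or replaces a human judgement.
# Field names and thresholds are hypothetical, not taken from any real system.
def flag_for_fraud_review(claim: dict) -> bool:
    """Hypothetical rule: decide whether a benefits claim is sent for investigation."""
    return (claim["income_discrepancy_gbp"] > 500
            or claim["address_changes_last_year"] >= 3)

print(flag_for_fraud_review({"income_discrepancy_gbp": 650, "address_changes_last_year": 1}))  # True
```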
Greater transparency, responsibility, accountability and public involvement in the design and use of ADS is important to protect people’s rights and privacy. There are three main reasons for this:
they are being introduced faster than necessary protections can be implemented; and
there is a lack of opportunity for those affected to make democratic decisions about whether they should be used and, if so, how they should be used.
Our latest research project, Automating Public Services: Learning from Cancelled Systems, provides findings aimed at helping prevent harm and contribute to meaningful debate and action. The report provides the first comprehensive overview of systems being cancelled across western democracies.
Researching the factors and rationales leading to cancellation of ADS systems helps us better understand their limits. In our report, we identified 61 ADS that were cancelled across Australia, Canada, Europe, New Zealand and the U.S. We present a detailed account of systems cancelled in the areas of fraud detection, child welfare and policing. Our findings demonstrate the importance of careful consideration and concern for equity.
Reasons for cancellation
There are a range of factors that influence decisions to cancel the use of ADS. One of our most important findings is how often systems are cancelled because they are not as effective as expected. Another key finding is the significant role played by community mobilization and research, investigative reporting and legal action.
Our findings demonstrate there are competing understandings, visions and politics surrounding the use of ADS.
Hopefully, our recommendations will lead to increased civic participation and improved oversight, accountability and harm prevention.
In the report, we point to widespread calls for governments to establish resourced ADS registers as a basic first step to greater transparency. Some countries, such as the U.K., have stated plans to do so, while others, like Canada, have yet to move in this direction.
Our findings demonstrate that the use of ADS can lead to greater inequality and systemic injustice. This reinforces the need to be alert to how the use of ADS can create differential systems of advantage and disadvantage.
Accountability and transparency
ADS need to be developed with care and responsibility by meaningfully engaging with affected communities. There can be harmful consequences when government agencies do not engage the public in discussions about the appropriate use of ADS before implementation.
This engagement should include the option for community members to decide areas where they do not want ADS to be used. Examples of good government practice can include taking the time to ensure independent expert reviews and impact assessments that focus on equality and human rights are carried out.
We recommend strengthening accountability for those wanting to implement ADS by requiring proof of accuracy, effectiveness and safety, as well as reviews of legality. At minimum, people should be able to find out if an ADS has used their data and, if necessary, have access to resources to challenge and redress wrong assessments.
There are a number of cases listed in our report where government agencies’ partnerships with private companies to provide ADS services have presented problems. In one case, a government agency decided not to use a bail-setting system because the proprietary nature of the system meant that defendants and officials would not be able to understand why a decision was made, making an effective challenge impossible.
Government agencies need to have the resources and skills to thoroughly examine how they procure ADS systems.
A politics of care
All of these recommendations point to the importance of a politics of care. This requires those wanting to implement ADS to appreciate the complexities of people, communities and their rights.
Key questions need to be asked about how the use of ADS leads to blind spots: through scoring and sorting systems, ADS increase the distance between administrators and the people they are meant to serve, and can oversimplify, infer guilt, wrongly target and stereotype people through categorizations and quantifications.
Good practice, in terms of a politics of care, involves taking the time to carefully consider the potential impacts of ADS before implementation and being responsive to criticism, ensuring ongoing oversight and review, and seeking independent and community review.
The Government’s public consultation on changes to the data protection framework emphasises the importance of public trust and transparency. But when, as part of our research, we tried to establish basic facts about the extent to which local authorities are linking and analysing data on children and families, we hit a brick wall.
Our research is aiming to provide a clearer picture of what parents think about the ways information about them and their children may be linked together and used by local councils. An important part of this work has been speaking directly to parents to see how much support for and trust in this type of activity there is. Alongside these valuable and enlightening conversations, we have also been trying to map the state of play among British local authorities and to find out exactly which authorities are doing what when it comes to operational data linking and matching and the application of predictive analytics to families’ data.
To try to find out, we submitted Freedom of Information requests to 220 local authorities across the UK. The 149 English councils participating in the ‘Troubled Families Programme’ (now called the Supporting Families Programme) must, by necessity, link and analyse datasets to ‘identify’ ‘troubled’ households and claim payment-by-results from central government. Yet only 76 responded that they used data analytics. The remainder claimed that their systems did not meet our definition or responded with a straight ‘no’ to all our questions about their use.
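As a rough sketch of what this kind of operational data linking involves, the example below (with invented column names and records) matches education, housing and police data on a shared family identifier and counts how many criteria each household appears to meet.

```python
# A rough sketch, with invented column names and records, of the kind of data
# linkage described above: matching records held by different services on a
# shared identifier and counting how many criteria a household meets.
import pandas as pd

education = pd.DataFrame({"family_id": [1, 2, 3], "persistent_absence": [1, 0, 1]})
housing   = pd.DataFrame({"family_id": [1, 2, 3], "rent_arrears":       [0, 1, 1]})
police    = pd.DataFrame({"family_id": [1, 3],    "asb_report":         [1, 1]})

linked = (education
          .merge(housing, on="family_id", how="left")
          .merge(police, on="family_id", how="left")
          .fillna(0))

# Tally how many indicators each linked household meets.
linked["criteria_met"] = linked[["persistent_absence", "rent_arrears", "asb_report"]].sum(axis=1)
print(linked)
```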
English councils claiming to be outside our criteria responded in a vague and evasive way. For example, some responded ‘no’ when asked about their engagement with data analytics, either by positioning family work as separate from children’s services or by using the term ‘data matching’ instead. Further investigation established that many of these councils do in fact use systems with predictive analytic capacity.
For example, Achieving for Children, a social enterprise providing services for children in several local authorities, responded to our FoI that beyond ‘some basic data monitoring … we do not currently use nor have we previously used any data analytics, predictive analytics and or artificial intelligence systems to assist with this work’. Yet they have used business intelligence technologies on a range of projects using predictive analytics/algorithms, as noted on the UK Authority Digital Data and Technology for the Public Good website.
Side-stepping terms
Our FoI research also showed that councils side-stepped the term ‘algorithm’ and the concept of AI. Even where they engaged in predictive analytics, they denied they were using algorithms – it’s hard to envisage one without the other, given the tools they were employing.
We received a lot of incomplete, insufficient or irrelevant information. A number of councils claimed exemption from the FoI on cost grounds or commercial confidentiality. Where we followed up with more carefully worded requests, we received ambiguously worded replies.
Some local authorities were more forthcoming and open, listing the various tools and companies used to conduct their data analytics. Microsoft Business Intelligence was the most common tool cited. Dorset County Council has a case study on the Local Government Association website of how the tool can be used to ‘enable local professionals to identify potential difficulties for individual children before they become serious problems’. Our FoI established that the council plans to make greater use of AI in the future.
Our analysis of the responses we received, together with information we have sourced from elsewhere, points to a clear shift in service priorities away from early intervention for parental education towards child protection and crime prevention. The earlier focus on linking parenting skills to social mobility is now muted, with rationales for data innovation focusing almost exclusively on the pre-emption of problems rather than on the maximisation of children’s future potential.
Our findings around children’s services have been reinforced by work by Big Brother Watch, which has published a detailed analysis of the use of hidden algorithms by councils that use trackers to identify disadvantaged households in order to target them for interventions. The organisation found one of the biggest players in this area, Xantura, to be particularly secretive.
Public trust
A wide range of data ‘solutions’ are drawn on by local authorities to classify, flag, target and intervene in disadvantaged families and their children. Yet parents are not generally informed of how their information is being used and for what purpose. As we have shown, it is difficult even for us as researchers to establish.
From our work here, it is hard to see how public trust and transparency will be achieved from the opaque, ambiguous and even evasive base that our FoI requests revealed.
The National Data Strategy encourages the UK’s central and local government to team up with the private sector to digitally share and join up records to inform and improve services. One example of this is the area of troublesome families, where it’s thought that the use of merged records and algorithms can help spot or pre-empt issues by intervening early. But there are questions over this approach, and this is something our project has been looking into. In our first published journal article, we have been examining the rationales presented by the parties behind data analytics used in this context, to see if they really do present solutions.
The application of algorithmic tools is a form of technological solution: based on indicators in the routinely collected data, they seek to draw out profiles, patterns and predictions that enable services to target and fix troublesome families. But local authorities often need to turn to commercial data analytic companies to build the required digital systems and algorithms.
In our paper we analysed national and local government reports and statements, and the websites of data analytic companies, addressing data linkage and analytics in the family intervention field. We looked in particular at the rationales for and against data integration and analytics. We used a ‘problem-solving’ analytic approach, which focuses on how issues are produced as particular sorts of problems that demand certain sorts of solutions to fix them. This helped us to identify a double-faceted chain of problems and solutions.
Seeking and targeting families
Families in need of intervention and costing public money are identified as a social problem, and local authorities are given the responsibility of fixing that problem. Local authorities need to seek out and target these families for intervention, and it is experts in data analytics who will solve that identification problem for them. In turn, the companies are reliant on citizens being turned into data (datafied) by local authorities and other public services.
We identified three main sorts of rationale in the data analytic companies’ promotion of their products as solutions to local authorities’ problems: the power of superior knowledge, harnessing time, and economic efficiency.
Companies promote their automated data analytics products as powerful and transformational. They hand control of superior, objective and accurate knowledge to local authorities, so that they can use profiling criteria to identify families with hidden risks for intervention. And their systems help local authority services such as social care and education collaborate with other services like health and the police, through data sharing and integration.
Data analytics is presented as harnessing time in the service of local authorities: an early warning system that enables them to identify families quickly as problems arise. It offers a holistic view based on the existing past records that local authorities hold about families, with ‘real time’ present administrative data on families inputted as it comes in. In turn, this provides foresight, helping local authorities into the future by predicting which families are likely to become risks and acting to pre-empt this, planning ahead using accurate information.
Another key selling point for data analytics companies is that their products allow economic efficiency. Local authorities will know how much families cost them, and can make assured decisions about where to put or withdraw resources of finance and staffing. Data analytic products produce data trails that allow local authorities to prepare government returns and respond to future central government payment-by-results initiatives, maximising the income that can be secured for their constrained budgets.
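To give a sense of what such products amount to in practice, here is a deliberately simplified, hypothetical sketch of a weighted risk-scoring routine; the indicators, weights and threshold are invented and do not describe any particular company’s system.

```python
# A deliberately simplified, hypothetical sketch of the kind of scoring these
# products perform: indicators drawn from linked service records are weighted,
# summed into a family "risk score", and households above a threshold are
# flagged for intervention. All names, weights and thresholds are invented.
WEIGHTS = {
    "school_absence": 2.0,
    "housing_arrears": 1.5,
    "police_contact": 3.0,
    "benefit_sanction": 1.0,
}
FLAG_THRESHOLD = 4.0  # hypothetical cut-off for an "early warning" flag

def risk_score(indicators: dict) -> float:
    """Sum the weights of the indicators present in a household's linked record."""
    return sum(WEIGHTS[name] * value for name, value in indicators.items() if name in WEIGHTS)

household = {"school_absence": 1, "police_contact": 1}
score = risk_score(household)
print(score, "flagged" if score >= FLAG_THRESHOLD else "not flagged")  # 5.0 flagged
```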
Questions to be asked
But there are questions to be asked about whether or not data linkage and analytics really do provide powerful and efficient solutions, and we consider these in our article. Concerns have been raised about errors and bias in administrative records, resulting in the unfair targeting of certain families.
Particular groups of parents and families are disproportionately represented in social security, social care and criminal justice systems, meaning that existing social divisions of class, race and gender are built into the data sets. For example, there is evidence that racial and gender profiling discriminations are built into the data, such as the inclusion in the Metropolitan Police Gangs Matrix of young Black men who have never been in trouble. And automated modelling equates socio-economic disadvantage with risk of child maltreatment, meaning that families are more likely to be identified for early intervention just because they are poor. On top of that, studies drawing on longitudinal data are showing that the success rates of predictive systems are worryingly low.
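The way disadvantage can come to stand in for risk can be illustrated with a small synthetic example: if the labels a model learns from reflect past referral patterns that already fall more heavily on deprived families, the fitted model ends up flagging poverty. All data and feature names below are synthetic.

```python
# A synthetic sketch of how a predictive model can come to equate disadvantage
# with risk: the "past referral" labels it learns from already correlate with a
# deprivation indicator, so the fitted model weights that indicator heavily.
# All data and feature names are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
deprivation = rng.integers(0, 2, size=n)   # 1 = household in a deprived area (synthetic)
other_factor = rng.integers(0, 2, size=n)  # an unrelated synthetic characteristic

# Referrals historically happen far more often to deprived households,
# independently of underlying need.
labels = rng.random(n) < (0.05 + 0.30 * deprivation)

model = LogisticRegression().fit(np.column_stack([deprivation, other_factor]), labels)
print(dict(zip(["deprivation", "other_factor"], model.coef_[0].round(2))))
# The deprivation coefficient dominates, so "risk" predictions largely track poverty.
```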
All of this raises a more fundamental question of whether or not algorithms should be built and implemented at all for services that intervene in families’ lives. In the next stage of our research, we will be asking parents about their views on this and on the way that information about families is collected and used by policy-makers and service providers.