Our project team writes for the LSE Impact Blog
http://eprints.lse.ac.uk/111339/1/impactofsocialsciences_2021_05_06_is_a_breakdown_in_trust.pdf
Parental social licence for data linkage for service intervention
a UKRI research project
Project PI Ros Edwards presented findings from our project at the British Sociological Association annual conference 2021.
The paper, Problem-solving for Problem-solving: Data Analytics to Identify Families for Service Intervention, looks at how the promise of technological fixes in the family policy field has created a set of dependencies between public services and data analytics companies, entrenching a focus on individual families, rather than social conditions, as the source of social problems.
The paper that was the basis of Ros's presentation has now been published in Critical Social Policy.
In the first of a series of blogs discussing key issues and challenges that arise from our project, Dr Sarah Gorin discusses the problems encountered by our team as we try to find out which local authorities in the UK are using data linkage and predictive analytics to help them make decisions about whether to intervene in the lives of families.
As background context to our project, it seemed important to establish how many local authorities are using data linkage and predictive analytics with personal data about families, and in what ways. To us this seemed a straightforward question, and yet it has been surprisingly hard to gain an accurate picture. Six months into our project, we are still struggling to find out.
In order to get some answers, we have been reaching out to other interested parties, and numerous people have got in touch with us too: from academic research centres and local authorities to independent foundation research bodies and government-initiated research and evaluation centres. Even government-linked initiatives are finding this difficult, not just us academic outsiders!
So what are the issues that have been making this so difficult for us and others?
One of the biggest problems is finding information. There is currently no centralised way that local authorities routinely record their use of personal data about families for data linkage or predictive analytics. In 2018, the Guardian highlighted the development of the use of predictive analytics in child safeguarding and the associated concerns about ethics and data privacy. They wrote:
“There is no national oversight of predictive analytics systems by central government, resulting in vastly different approaches to transparency by different authorities.”
This means that it is very difficult for anyone to find out relevant information about what is being done in their own or other local authorities. Not only does this have ethical implications for the transparency, openness and accountability of local authorities but, more importantly, it means that families who experience interventions by services are unlikely to know how their data has been handled and what criteria have been used to identify them.
Several European cities are trialling a public register for mandatory reporting of the use of algorithmic decision-making systems. The best way to take this forward is being discussed here and in other countries.
Another issue is the pace of change. Searching the internet for information about which local authorities are linking families' personal data and using it for predictive analytics is complicated by the lack of a common language to describe the issues. A myriad of terms is in use, and they change over time: “data linkage”; “data warehousing”; “risk or predictive analytics”; “artificial intelligence” (AI); “machine learning”; “predictive algorithms”; “algorithmic or automated decision-making”, to name but a few.
The speed of change also means that while some local authorities that were developing systems several years ago may have cancelled or paused their use of predictive analytics, others may since have started developing it.
The Cardiff University Data Justice Lab, in partnership with the Carnegie UK Trust, is undertaking a project to map where and why government departments and agencies in Europe, Australia, Canada, New Zealand and the United States have decided to pause or cancel their use of algorithmic and automated decision support systems.
GDPR, and the variation in the way in which it is being interpreted, may be another significant problem preventing us from getting to grips with what is going on. Under GDPR, individuals have the right to be informed about the collection and use of their personal data.
As part of their responsibilities under GDPR, local authorities should publish a privacy notice which includes the lawful basis for processing data as well as the purposes of the processing. However, the way that local authorities interpret this seems to vary, as does the quality, amount of detail given and level of transparency of information on privacy notices. Local authorities may only provide general statements about the deployment of predictive analytics and can lack transparency about exactly what data is being used and for what purpose.
This lack of transparency was identified in a review by the Committee on Standards in Public Life, which published a report on Artificial Intelligence and Public Standards in February 2020. The report highlighted that government and public sector organisations are failing to be sufficiently open. It stated:
“Evidence submitted to this review suggests that at present the government and public bodies are not sufficiently transparent about their use of AI. Many contributors, including a number of academics, civil society groups and public officials said that it was too difficult to find out where the government is currently using AI. Even those working closely with the UK government on the development of AI policy, including staff at the Alan Turing Institute and the Centre for Data Ethics and Innovation, expressed frustration at their inability to find out which government departments were using these systems and how.” (p.18)
Whilst some local authorities seem less than forthcoming in divulging information, this is not the case for all. For example, in Essex, a Centre for Data Analytics has been formed as a partnership between Essex County Council, Essex Police and the University of Essex. They have developed a website and associated media that provide information about the predictive analytics projects they are undertaking using families' data from a range of partners including the police and health services.
As part of our project on parental social licence for data linkage and analytics, our team is gathering information through internet searching and snowballing, putting together as much information as we can find, and will continue to do so throughout the course of the project. So far, the most useful sources of information have included:
It would seem we have some way to go yet, but it is a work in progress!
If you are interested in this area we'd be pleased to know of others' experiences, or if you'd like to contribute a blog on this or a related topic, do get in touch via our email datalinkingproject@gmail.com
National and local government departments and services collect and hold a range of information about parents and their children. These records include details of the taxes people pay, their medical records, school data and police files. But what do parents think about the idea of the data held on their children being linked with theirs, so that government services can identify families for possible interventions? A pilot survey leading up to a new research project indicates they're not very keen on the idea.
Data linkage is a hotly debated topic, with arguments for better targeted, more focused public services often countered by concerns about privacy and a lack of trust in the organisations collecting and using the data.
When it comes to the use of linked data to identify families where it's thought there may be a heightened risk of things such as child abuse or neglect, or truancy and anti-social behaviour, there can be few more contentious areas. But real evidence on what parents think is thin on the ground, and this is where our research project hopes to make some inroads.
A voluntary pilot survey of parents carried out via the Mumsnet website and Twitter received a total of 365 responses – mostly from white mothers from relatively well-off households. We might assume that this group would feel more secure about the linkage and use of records on them and their children, and that they might generally be trusting of organisations such as government departments and local councils. So what did we find?
Only half the parents said they had heard about data linkage and how it worked. Less than half thought it acceptable to use it to improve the planning and delivery of family support services. Far fewer (around 15 per cent) thought it should be used to identify specific families who might need intervention but hadn't asked for support, or to save public money by preventing or catching family problems early.
Their concerns included: families' right to privacy; increasing stigma; oversimplification of risk factors; discouraging families from seeking help; and problems with data accuracy and safety. About 1 in 5 did agree that the more we know about families, the more the nation's wellbeing can be improved.
There was little trust in organisations that might link data for the above reasons, and specific distrust of private companies: only 4 parents said they would definitely trust them, while 309 said they would not. One participant said:
“Having algorithms produced by private companies, who are not transparent, making dubious links, which do correlate with truth, is a very dangerous tool. It will not benefit society but will only benefit private companies.”
Parents were almost entirely against government accessing financial details such as bank details, credit cards, supermarket records, CCTV and social media posts, as the following comments from participants show:
“More detailed and personal information, PARTICULARLY medical records/social services reports/DLA info, should not be ‘joined up’ as this breaches personal privacy.”
“It might be acceptable to link to credit records and food shopping records on an anonymous basis to produce aggregate information or statistics for research purposes e.g. a study looking at the possible relationships between children’s diets and school attainment, or the impact of debt on parents BUT this sort of data analysis should NEVER identify individuals.”
“Government surveillance of families, without knowledge or consent, is an extremely questionable approach. Government should focus on supporting families, providing public services, creating jobs, ensuring quality of housing, eliminating poverty and increasing community capacities of resilience, care, safety, cohesion and fun.”
So among this group of parents, where we might have expected greater levels of support for the idea of linking data for the purposes of identifying and targeting service intervention, we in fact find a substantial lack of acceptance and trust around the idea of family data linkage.
A web-based survey of this nature gives us some indication of what people are thinking, but it's important to look at this more carefully. As part of our new research project, we're commissioning a robustly designed and conducted survey of around 900 parents and carrying out a range of in-depth interviews with parents in the families most likely to be the target of this type of data linkage and the policies associated with it.