Using Artificial Intelligence in public services – does it breach people's privacy?

By Ros Edwards, Sarah Gorin and Val Gillies

As part of our research, we recently asked parents what they thought about the use of data linkage and predictive analytics to identify families to target with public services.

They told us that they didn't trust these processes. This was particularly the case among marginalised social groups. In other words, the groups of parents most likely to be the focus of these AI identification practices are the least likely to see them as legitimate. Now a new report by the United Nations High Commissioner for Human Rights, Michelle Bachelet, highlights major concerns about the impact of artificial intelligence, including profiling, automated decision-making and machine learning, on individuals' right to privacy.

The report makes a number of recommendations, including a moratorium on the use of AI systems that pose a serious risk to human rights, and the banning of social scoring of individuals by governments and of AI systems that categorise individuals into groups on discriminatory grounds.

The right to privacy in the digital age: report (2021) builds on two previous reports by the High Commissioner looking at the right to privacy in the digital age, and incorporates the views of international experts at a virtual seminar, as well as responses to the High Commissioner's call for input into the report from member states, including the U.K.

It examines the impact of digital systems such as artificial intelligence in four sectors, including public services. Artificial intelligence is used in public services such as social care, health, police, social security and education in a range of ways, such as decision-making about welfare benefits and flagging families for visits by children's social care services.

Concerns are expressed about, for example, the linking together of large health, education and social care data sets with other data held by private companies, such as social media companies or data brokers, who, the report says, may gather information outside protective legal frameworks. The involvement of private companies in the construction, development and management of public sector data systems also means they can gain access to data sets containing information about large parts of the population.

There are additional concerns about the potential inaccuracy of historic data and the implications of that for future decision-making. The report states that these systems unequally "expose, survey and punish welfare beneficiaries" and that conditions are imposed on individuals that can undermine their autonomy and choice.

In the Netherlands, a court banned a digital welfare fraud detection system, ruling that it infringed individuals' right to privacy. The system provided central and local authorities with the power to share and analyse data that were previously kept separately, including on employment, housing, education, benefits and health insurance, as well as other forms of identifiable data. The tool targeted low-income and minority neighbourhoods, leading to de facto discrimination based on socioeconomic background.

The recommendations in the report include:

  • using a human rights-based approach
  • ensuring legislation and regulation are in line with the risk to human rights, with sectors including social protection to be prioritised
  • developing sector-specific regulation requirements
  • drastically improving transparency, including the use of registers for AI that contain key information about AI tools and their use, informing affected individuals when decisions are being or have been made automatically or with the help of automation tools, and notifying individuals when the personal data they provide will become part of a data set used by an AI system.

Given the concerns about the risks that the use of data linkage and predictive analytics poses to the human rights of individuals and families, it is vital to pay heed to the UN High Commissioner's call for a moratorium. Public authorities need to pay meaningful attention to the lack of social legitimacy for AI, as evidenced in our research, and to ask themselves whether the risk of further distrust and disengagement from already marginalised social groups, and the consequences for a cohesive and equal society, is worth it.

Would you like to take part in our research?

We are looking for parents to take part in the project. We want to know your views and experiences of the way that information about families is collected and used by policy-makers and service providers. 

There are two ways you can take part:

  1. As part of a group discussion – If you are a parent (of at least one child aged 16 or under), you can take part in an online group discussion that will last about 45 minutes.
  2. In a one-to-one discussion – If you are a parent (of at least one child aged 16 or under) and have had contact with family support services (this may be children's social work services, early years or a voluntary organisation that supports families), you can take part in an individual discussion with us that will last about 45 minutes, either online or by phone.

All group and individual participants will receive a £25 e-voucher in thanks for their time and trouble, and we can provide a top-up voucher for participants using pay-as-you-go.

The research has ethical approval from the University of Southampton.

If you would like to receive further information or talk about the possibility of participating in the research, please contact Sarah Gorin, University of Southampton, at s.j.gorin@soton.ac.uk

Running focus groups with parents in a Covid-19 setting – how will we do it?

In this second project blog, the research team reflect on how Covid-19, and the restrictions it has placed on all our lives, have led to methodological, ethical and practical challenges in running focus groups with parents on buy-in for linking and analysing data about families. They outline the challenges they face and how they're adapting their approach.

For the next stage of our project, we're conducting focus groups to explore how particular social groups of parents understand and talk about their perspectives on data linkage and predictive analytics. Back in early 2020, we were optimistic about the possibility of being able to conduct these groups face-to-face by the time we reached this stage of our research. Now, though, it's clear we'll need to move online, and we've been thinking about the issues we'll face and how to deal with them.

Questions we're grappling with include:

  • What might moving online mean for how we recruit participants? 
  • How can we best organise groups and engage parents with the project? 
  • How can we develop content for online groups that will firstly, encourage parents to contribute and enjoy the research process, and secondly, be relevant to our research endeavour?

What will moving online mean for recruiting participants?

Our intention was – and still is – to hold focus group discussions with homogeneous groups of parents, to explore the consensus of views on what is and isn't acceptable (social licence) in joining together and using parents' administrative records.

We're using the findings from our earlier probability-based survey of parents to identify social groups of parents whose views stand out. These include home-owning parents in professional and managerial occupations, who have stronger social licence, and mothers on low incomes, Black parents, and lone parents and parents in larger families living in rented accommodation, who tend to have weak or no social licence.

Our original plan was to recruit participants for our focus groups by contacting local community and interest groups, neighbourhood networks, services such as health centres and schools, workplaces and professional associations. We still plan to do this, but we're concerned that the pandemic is placing huge pressures on community groups, services for families and businesses, and we need to be prepared that helping us to identify parents to participate in research may not be a priority or, as with schools, may not be appropriate.

So we've also been considering recruitment through online routes, such as advertising in relevant Facebook groups, using Twitter, and placing advertisements on websites likely to be accessed by parents. It'll be interesting to see if these general reach-outs get us anywhere.

An important aspect of recruitment to our study is how to include marginalised parents. This can be a conundrum whether research is face-to-face or online. Face-to-face, we would have spent quite a bit of time establishing trust in person, which is not feasible now. Finding ways to reach out and convince these parents to participate is going to be an additional challenge. Our ideas for trying to engage these parents include advertising via foodbanks, neighbourhood support networks and housing organisations.

And there's an additional problem for online methods, highlighted by the inequalities of online schooling: parents who have limited or no online access. Further, Covid-19 is especially affecting parents living in poverty, and we don't want to add to any stress they're likely to be under.

Enticing affluent parents working in professional and managerial occupations to participate may also be difficult under the current circumstances. They may be juggling full-time jobs and (currently) home schooling, and feeling under pressure. Either way, marginalised or affluent, we think we'll need to be flexible, offering group times in evenings and at weekends, for example.

How should we change the way we organise groups and engage parents with the project? 

We know from reading the literature that online groups can face higher drop-out rates than face-to-face ones. Will the pandemic and its potential effect on parents' physical and mental health mean that we face even higher drop-out rates? One strategy we hope will help is establishing personal links by contacting participants and chatting to them informally before the focus group takes place.

We've been mulling over using groups containing people who know each other, for example if they're members of a community group or accessed through a workplace, and groups that bring together participants who are unknown to each other. Because we're feeling a bit unsure about recruitment and organisation, we've decided to go down both routes as and when opportunities present themselves. We'll need to be aware of this as an issue when we come to do the analysis, though.

We're also thinking of organising more groups, with fewer participants in each group than we would have done face-to-face (after all, we're not going to be confined by our original travel and venue hire budget). Even in our online research team meetings we can cut across and interrupt each other, and discussion doesn't flow in quite the same way. Reading participants' body language and non-verbal cues in an online focus group is going to be more difficult. Smaller numbers in the group may help a bit, but it can still be difficult to see everyone if, for example, someone is using a mobile phone. We'll just have to see how this goes and how best to handle it.

There's also a dilemma about how many of the project team to involve in the focus groups. We'll need a team member to facilitate the group, but previous research suggests it might be useful to have at least one other to monitor the chat and sort out any technical issues. But with a group as small as 4–6 participants, will that seem off-putting for parents? It's hard to know, so it may be a case of trying it in order to find out!

What should we consider in developing content that's engaging for parents and relevant to our research?

What we'll miss by holding our group discussions online is the settling in and chatting that puts us and our participants at ease – how are you, would you like a drink, there are some biscuits if you want, let me introduce you to … and so on. We don't think we can replicate this easily.

But we've been pondering our opening icebreaker – should we ask something like…

'If you could be anywhere else in the world, where would you be?'

or

'What would be the one thing you'd pack in a lockdown survival kit?'

And we're also planning to use a couple of initial questions that use the online poll function. Here's an instance where we think there's an advantage over in-person groups, because participants can vote in the poll anonymously.

After that, we'll be attempting to open up the discussion to focus on the issues at the heart of our research – what our participants feel is and isn't acceptable in various scenarios involving the use of data linkage and predictive analytics.

Ensuring the well-being of parents after focus groups is always important, but with online groups it may be harder if participants are not identified through community groups in which there's already access to support. We plan to contact people via email after the groups, but it's hard to know whether parents would let us know if the discussions raised issues for them. We have also given some thought to whether we could use online noticeboards for participants to post any further comments they may have about social licence after they've had time to reflect, but realistically we don't know whether they would be used.

It'll be interesting to see whether the concerns we've discussed here are borne out in practice, and whether our hoped-for means of addressing them work. And also, what sort of challenges arise for our online focus group discussions that we haven't thought of in advance!

If you have any ideas that might help us with our focus groups, please do get in touch with us via datalinkingproject@gmail.com