Applying a Democratic Brake to the Hegemony of Efficiency – a lesson from cultural heritage

By Nicola Horsley

In our recently published book The Trouble with Big Data, Jennifer Edmond, Jörg Lehmann, Mike Priddy and I draw on our findings from the Knowledge Complexity (KPLEX) project to examine how the inductive imperative of big data, honed in the sphere of business and elsewhere, crosses over into the cultural realm. The book details how cultural heritage practitioners’ deep understanding of the material they work with, and of the potential for its use and misuse when linked with other data, is being displaced.

It is often remarked that public debate, critical thought and legal regulation simply cannot keep up with the pace of technological change, resulting in technologies being adopted for purposes that stray beyond ethical guidelines. Thirty years ago, Neil Postman drew on the work of Frederick W. Taylor to describe the principles of Technopoly, which revolves around the primacy of efficiency as the goal of human endeavour and thought, and prizes measurement and machine calculation over human judgement, which is seen as flawed and needlessly complex. For Technopoly to take hold, a particular knowledge landscape had to materialise: one in which data were divorced from context and collection purpose, disconnected from theory or meaning, and travelling in no particular direction.

In the KPLEX project, we were interested to learn how cultural heritage practitioners’ expertise was being marginalised by the offer of a standardised interface through which the knowledge seeker could find data that satisficed as an answer to their research question, bypassing the contextual information that engaging with a human expert might offer. The standardisation of interfaces between knowledge seekers and myriad knowledge institutions has obscured huge differences in the organisation, values and practices of those institutions. Many elements of archival practitioners’ work go unsung, but the drive to furnish users with detailed information about collections without their having to ask for it suggests that dialogic exchange with experts on the source material has come to be treated as an unnecessary barrier, and removed.

While greater anonymity can promise to tackle problems of bias and prejudice, the reality is that an overwhelming amount of information that never becomes formally recorded as metadata continues to be held as tacit knowledge by cultural heritage practitioners themselves. In the KPLEX interviews, the head of a service team at a national library described how digitisation presented the mammoth challenge of impressing the importance of context upon a user who has landed on a page without any understanding of how what they are viewing relates to the institution’s collections as a whole (never mind the collections of other institutions). The library’s traditional visitors, it was felt, developed an awareness of the number of boxes of material that related to their query versus the number they had actually got to grips with, whereas today’s user, presented with a satisficing Google-lookalike result, has her curiosity curtailed.

The introduction of new data systems to cultural heritage institutions usually involves archival practitioners working with data engineers to build systems that reorganise and reconstitute holdings and metadata to facilitate digital sensemaking techniques, with the burden usually falling on archival practitioners to upskill in computational thinking. Unintended consequences in general, and unreasonable proxies and imperfect, satisficing answers in particular, are at the heart of cultural knowledge practitioners’ reservations about datafication, and these should not be glossed over as resistance to change in their practice or professional status. Underlying these concerns is a dynamic familiar to readers of Bruno Latour’s Science in Action: How to Follow Scientists and Engineers through Society, which observed how differences in practice were translated into technical problems to which engineers could then apply technological ‘solutions’, a phenomenon sometimes referred to as ‘techno-solutionism’. The result, for the user searching the collections of a library, museum, gallery or archive, is a slick interface that feels familiar because it appears to function in the same way as the search engines she uses for a range of other purposes throughout her day.

The ‘quick wins’ of Google’s immediacy and familiarity are a constant thorn in the side of practitioners concerned with upholding rigour in research methods, and there is a real fear that the celebration of openness works as a diversion: away from the complex material excluded from it, and away from any awareness that this eclipsing of the hidden by the ‘open’ is happening at all. The new normal of the Google paradigm is having a direct effect on how knowledge seekers understand how to ask for knowledge, what timeframe and format of information is appropriate and desirable, and what constitutes a final result. Callon and Latour’s description of the black box seems more pertinent than ever. What is more, the coming together of the archival and computational paradigms is seen as imperilling archivists’ fundamental values if the result is modelled on the algorithms of Google and Facebook, as one interviewee at a national library described:

"Even though people believe they see everything, they might see even less than before because they’re only being shown the things that the algorithm believes they want to see. So, I’m really concerned with that increasing dominance of these organisations that commercial interests will increasingly drive knowledge creation …" KPLEX interviewee

Pasquale (2015) describes Google and Apple as ‘the Walmarts of the information economy’, in that they ‘habituate users to value the finding service itself over the sources of the things found’. This invisibilisation of provenance may be the most insidious effect of datafication because, when we are presented with irreconcilable knowledge claims, our capacity to judge and choose between them will have been diminished.

The unchecked permeation of commercial practices and values into every aspect of our engagement with services that trade on the social is already highly questionable. When those practices and values are imported wholesale into public services, we really need a democratic braking system. This is why it is crucial that any ‘advances’ in data infrastructure or practices are designed with the involvement of, if not led by, the people who have the closest relationship to the data, rather than those with merely transferable technical expertise. That cannot be done without exploding the myth that the challenges society faces are mere technical problems, and without returning to an understanding of the social and an appreciation of the complexity of human stories.

Nicola Horsley is a research fellow at the Centre for Interdisciplinary Research in Citizenship, Education and Society (CIRCES), where she works on research concerned with education for democratic citizenship, human rights education and the social inclusion of migrants and refugees through educational processes.

The Trouble With Big Data: How Datafication Displaces Cultural Practices, by Jennifer Edmond, Nicola Horsley, Jörg Lehmann and Mike Priddy, is published by Bloomsbury.

