This essay begins with a brief overview of the current state of affairs in surveillance, which I describe as a surveillance society. Here I look at the increasingly technical and depersonalised nature of surveillance, as well as the use of Artificial Intelligence (AI) and machine learning to implement broad-stroke surveillance principles. In contrast to this perspective, which operates largely independently of us, I place a Christian understanding of surveillance within a relational vision of God’s knowing and his monitoring of us as people within relationship. This relational matrix, I argue, draws us into an ethical perspective on surveillance in which members of the community owe each other certain moral responsibilities in the manner in which we surveil. By examining the story of Shiphrah and Puah, I contend that these commitments exist within a larger framework of power, in which the relational is a direct challenge to depersonalised surveillance. Leading from this story, I invite us to define ‘us’ more precisely in the context of the political slogan ‘Nothing about us without us!’ Here I follow Eric Stoddart’s lead in advocating for a preferential interpretation of ‘us’ as the marginalised and those discriminated against by surveillance. Using this framework, I conclude with a liberative reading of the Parable of the Lost Sheep, as well as some practical suggestions for how we, as people with power, may utilise our privilege for the benefit of the preferred ‘us’.
We currently live in what Gary Marx and Oscar Gandy refer to as a surveillance society, a sociological phrase used to describe the emergence and development of surveillant culture.[1] Today, the great majority of our social and economic activities generate unique records that are maintained in sophisticated computer databases controlled by governments, commercial organisations, and, indeed, even individuals. In other words, almost every element of modern life is dependent on the presence and ubiquity of surveillance, which often occurs quite independently of any enthusiasm on our part to be watched. Thus, to avoid being monitored would require us to withdraw from nearly every smoothly functioning aspect of our economic and social lives. Surveillance itself is bound up in notions of governance and power, which in turn shape the trajectory of social development. These uses of power, according to Foucault, help to define the categories of acceptable and unacceptable social behaviour, therefore producing ‘an arrangement whose internal mechanisms produce the relation in which individuals are caught up.’[2] This principle of power is concentrated in the distribution of ‘bodies, surfaces, lights, gazes’, which, as non-human objects, work to deindividualise power from the agents of control.[3] Indeed, this deindividualisation is subtly increased by the present-day technological character of surveillance, which literally occurs out of sight in the mostly unknowable realm of the cloud. As a consequence of technological advancement, surveillance is increasingly applied universally by algorithmic fiat, rather than through the individual monitoring of a person. As Andrew Ferguson has put it, speaking of policing, ‘the combination of new data sources, better algorithms, expanding systems of shared networks, and the possibility of proactively finding hidden insights and clues about crime has led to a new age of potential surveillance.’[4] Surveillance, then, is happening ‘without us’ to the extent that power now resides in the technology itself rather than in the individuals being surveilled, and, increasingly, not even in the agents of surveillance.
By speaking of an ‘us’ in the context of artificial intelligence and surveillance, I am referring to a relatively simplistic portrayal of the dissonance between human and machine cognition, and our respective potential involvements in surveillance. Later in this essay I will explore the implications of defining ‘us’ more precisely. For the time being, it is sufficient to note that I am making this statement in the context of the political slogan ‘Nothing about us without us!’, which I will henceforth use to critique depersonalised surveillance systems, particularly those based on AI and machine learning. The use of technology in surveillance is well documented in the literature, and has been broadly defined by Marx as the ‘new surveillance.’[5] This new surveillance increasingly utilises sophisticated artificial intelligence to monitor individuals via facial recognition technology, intelligent policing algorithms, and anomaly detection systems. The Carnegie Endowment AI Global Surveillance (AIGS) Index suggests that at least 75 of the 176 countries surveyed are actively using AI technologies for surveillance purposes.[6] While I do not subscribe to conspiracy theories about the dystopian consequences of technology, I do recognise that prioritising AI over people as the primary agents of surveillance risks creating an additional determinant of power that operates outside of human community, cognition, and thus relationships. In artificial intelligence, machine-learning models constantly construct and reshape data based on prior computations, meaning that the system’s underlying logic is not static. When combined with computing power far beyond the capabilities of the human intellect, this provides a vast monitoring capability that is anonymous, decentralised, and self-reinforcing. Indeed, many people would see these computations as more objective than human analysis and therefore accord them preferential consideration.[7]
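To make concrete what I mean by a self-reinforcing surveillance logic, the following sketch is purely illustrative: a toy anomaly detector, written in Python, whose picture of ‘normal’ behaviour is re-estimated only from the records it has itself already accepted, so that its prior classifications reshape the data it subsequently learns from. The data stream, thresholds, and update rule are my own hypothetical assumptions and do not describe any deployed system.

```python
# Illustrative sketch only: a toy 'anomaly detector' whose decision rule is
# re-estimated from the records it has itself already classified as normal.
# All names, values, and thresholds are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Simulated stream of behavioural records (e.g. movement or transaction features).
stream = rng.normal(loc=0.0, scale=1.0, size=(1000, 3))

mean = np.zeros(3)      # the model's current picture of 'normal' behaviour
var = np.ones(3)
threshold = 3.0         # flag anything more than 3 standard deviations away
alpha = 0.01            # learning rate for the running estimates

flagged = []
for t, record in enumerate(stream):
    z = np.abs((record - mean) / np.sqrt(var))
    if z.max() > threshold:
        flagged.append(t)   # the record is marked anomalous and excluded...
        continue
    # ...so the notion of 'normal' is reshaped only by previously accepted records:
    mean = (1 - alpha) * mean + alpha * record
    var = (1 - alpha) * var + alpha * (record - mean) ** 2

print(f"{len(flagged)} of {len(stream)} records flagged as anomalous")
```

The point is structural rather than technical: whatever the model flags is excluded from its future sense of the normal, so its categories harden without any individual watcher deciding that they should.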
Given these developments in technology, it is not surprising that many commentators have drawn parallels between surveillance technology and the notion of God’s all-seeing gaze.[8] The ‘God’s Eye’ hacking tool from the Fast and Furious film franchise is an excellent example of this point. In the films, the device is capable of hacking any camera-based technology and rapidly delivering a video feed of the specified subject. It is easy to see here the parallel to God’s overarching view of the world, which Carl Jung has described as ‘never resting, ubiquitous, and all seeing.’[9] The concept of an omniscient and all-seeing God, which appears often in the Old and New Testaments, is profoundly woven into the Judeo-Christian heritage and the fabric of Western civilisation in general.[10] It is fair to say that the notion of God’s omniscience is much maligned in popular culture, and some regard the divine eye as a precursor to the pervasive monitoring we currently face under the digital eye.[11] Here, I want to push back against this claim and suggest that the notion of a God who sees everything provides a template for a relational model of surveillance. This understanding of God’s knowledge is rooted in the unfolding of God’s narrative, which is ultimately grounded in the work of liberation. Thus, when God listens in on the Israelites in Egypt and hears their suffering, the point is that the practice of surveillance is relational and personal, ultimately leading to liberative action.[12] This expression of surveillance is more than simply a knowledge of particular things and forms; it is a broader perspective. It is an understanding that is marked, not so much by data and processing, but by the relational connection between the watcher and the surveilled. As Rachel Muers has put it, this offers ‘a different possible starting point for the thinking of divine omniscience, one that understands God’s knowledge within the context of God’s relation to what – and whom – God knows.’[13]
The concept of relational knowledge challenges the global grasp of the new surveillance and substitutes it with a vision of surveillance that can only be discovered relationally. Because this activity takes place between individuals, it compels us to pause and engage in critical reflection about the ethical obligations that might arise between people. As Christine Korsgaard has put it, ‘the subject matter of morality is not what we should bring about, but how we should relate.’[14] Inherent to this ethical model is that the very nature of a relational experience plays a constitutive role in determining our ethical response. Thus, the door is firmly shut here to the idea of neutral agents of surveillance, because the simple presence of an ‘us’ entails an ethical commitment. It is difficult to see how we, as subjects of power in the Foucauldian sense, can resist the pressure to become indifferent to the surveillance apparatus. However, as Foucault also observed, where there is power, there is resistance. Reclaiming our relational commitments to one another in the context of surveillance allows us to experience a reversal of power in specific circumstances and therefore to oppose injustice. We can see an example of this resistance in the midwives, Shiphrah and Puah, who refuse to become agents of surveillance on behalf of the Pharaoh.[15] The significance of this episode in the larger narrative of the Bible cannot be overstated, not least because they may even have delivered Moses. As Eric Stoddart notes, ‘Jewish, and thereby Christian, self-understanding turns on a story of two women resisting their overlord’s surveillance.’[16] The unfolding of the Kingdom thus begins with Shiphrah and Puah acknowledging their relational commitments to their community and, consequently, challenging a depersonalised and unjust application of surveillance.
So far in this investigation, the idea of an ‘us’ has been used quite broadly, under the premise that all human relationships resemble one another and function monolithically in relation to surveillance. Of course, if we accept that power and knowledge are inextricably linked, then even relational knowledge implies the presence of power as a fundamental and unavoidable characteristic. The control of surveillance mechanisms in social systems, even intelligent ones, is directly related to individual human agency and the eventual use of information. Thus, whether we approach each other as those being observed or as those gaining information and power through surveillance shapes our human relationships. Perhaps, as with Shiphrah and Puah, there is scope within relations of power for particular types of knowledge and orders of surveillance which are not oppressive. But without clearer demarcations of how an individual can act ethically in the midst of surveillance technology, it is difficult to distinguish good from bad actions, even when they occur relationally. I want to suggest here that the way forward lies in defining more closely who ‘us’ is in relation to surveillance and, like our biblical heroines, understanding our relationships not just in terms of universal representation, but in terms of specific actions and subversions of power. The ethics of surveillance therefore becomes an ethics of resistance as well as a means of nurturing and creating new relationships. This understanding does not fully eliminate the consequences of power and knowledge that emerge in the forms of surveillance. It does, however, use these discourses of observation, relationship, and power as the occasion for an individual to resist.
I derive this concept of specific communal solidarity from the political slogan ‘Nothing about us without us!’ and its use in the disability community to describe the collective political and social demands of disabled activists.[17] As James Charlton puts it, ‘the demand Nothing About Us Without Us is a demand for self-determination and a necessary precedent to liberation.’[18] Going forward, a clear articulation of this principle will require us to specify quite precisely the needs, conditions, and consciousness that constitute ‘us’, and therefore to transform the notion from a political slogan into practical action. This is not a simple undertaking, since it necessitates clearly naming the sites of action where God is and is not at work. In Derridean terms, it is a question of knowing what is to be done and not done, for ‘no justice is exercised, no justice is rendered, no justice becomes effective…without a decision that cuts and divides.’[19] Here I follow Eric Stoddart’s lead in advancing a preferential optic for those who are poor and marginalised as the communities of action for our political slogan. I do so based on a liberative reading of scripture, in which I note that ‘the preferential commitment to the poor is at the very heart of Jesus’ preaching of the Reign of God.’[20] In contrast to universalising conceptions of justice, this is a liberation hermeneutic founded in the realities of oppressed experience and the understanding that not all experiences are created equally. The preferential option thus represents a radical commitment to viewing ‘us’ as those who are marginalised, digitally impoverished, and disproportionately harmed by surveillance technology. As Stoddart notes, ‘a preferential optic for those who are poor is an endeavour to take the ‘Nothing about us, without us’ clarion call seriously.’[21]
The discriminatory nature of the surveillance status quo further emphasises the importance of this preferential vision. Although surveillance is thoroughly embedded in society, its supposedly universal application is nonetheless selective. Partiality of sight is encoded into surveillance systems themselves, and discrimination is a key function for sorting and categorising. Gandy’s concept of the ‘panoptic sort’ names these discriminatory processes by which ‘data sorts individuals into categories and classes on the basis of routine measurements.’[22] These classifications are then used to ‘control their access to the goods and services that define life in the modern capitalist society.’[23] In recent years, much attention has turned to the application of intelligent facial recognition technology and its use in surveillance systems as a way of perpetuating racism. In 2018, the ACLU conducted a test of the facial recognition system ‘Rekognition’ in which images of members of Congress were compared with a database of mugshots.[24] The software wrongly identified 28 members of Congress as persons who had been arrested for a crime. Unsurprisingly, the erroneous matches were overwhelmingly made up of persons of colour, including six members of the Congressional Black Caucus.[25] Data is thereby separated from a relational knowledge of human existence and processed in a biased technological matrix, where people who are not favoured are profiled and discriminated against. As David Lyon puts it, this is digital discrimination, whereby ‘flows of personal data – abstracted information – are sifted and channelled in the process of risk assessment, to privilege some and disadvantage others, to accept some as legitimately present and to reject others.’[26] In many cases, these prejudices stem from the homogeneity of those who create the technology, as well as their privileged status as people who are not affected by its discriminatory consequences. To address this imbalance, as previously stated, we need a preferential option for people who are marginalised, as well as a relational inclusion of their voices and needs in any conversations concerning surveillance technology.
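To illustrate how such sorting can discriminate even when a single, apparently neutral rule is applied to everyone, the short Python sketch below imagines a face matcher that declares a ‘match’ above a fixed similarity threshold. The score distributions are invented for the purpose of the example and make no claim about Rekognition or any real system; they simply assume that non-matching comparisons score higher, on average, for one group, as can happen when training data under-represents that group.

```python
# Illustrative sketch only: how a single confidence threshold can yield unequal
# false-match rates if a matcher's scores are poorly calibrated for one group.
# The score distributions below are invented; they describe no real system.
import numpy as np

rng = np.random.default_rng(1)
THRESHOLD = 0.80   # a 'match' is declared above this similarity score


def false_match_rate(score_mean: float, n: int = 10_000) -> float:
    """Share of non-matching comparisons that still clear the threshold."""
    scores = rng.normal(loc=score_mean, scale=0.08, size=n)
    return float((scores > THRESHOLD).mean())


# Hypothetical assumption: non-match scores run higher for group B than group A,
# e.g. because the training data under-represented group B.
rate_a = false_match_rate(score_mean=0.55)
rate_b = false_match_rate(score_mean=0.70)

print(f"False-match rate, group A: {rate_a:.3%}")
print(f"False-match rate, group B: {rate_b:.3%}")
```

Under that assumption the same threshold yields markedly more false matches for the under-represented group, which is the structural shape of the ACLU finding described above.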
What does it mean, then, to practically live out the political demand of ‘nothing about us, without us’ in the context of a preferential optic for the marginalised in a surveillance society? I want to conclude this essay by providing some initial suggestions on the basis of a liberative reading of the Parable of the Lost Sheep.[27] In this reading, I encourage those of us who have been privileged by a surveillance society to identify with the shepherd as the agent of surveillance who searches for the lost sheep. A comparable reading from the perspective of the lost and devalued might similarly be possible, but it does not relate to my specific context and voice. The first insight I draw out here concerns our culpability for the loss of the animal, or, in my reading, for the exclusion of, and active discrimination against, the marginalised through surveillance. Brian Hauglid argues that in ancient Middle Eastern culture, it was common to say ‘the sheep went astray’ rather than ‘I lost the sheep.’[28] By attending to the shepherd’s own agency in the active participle, ἀπολέσας, I invite us to repent of the times when we have been complicit in the surveillance status quo. Secondly, in the shepherd’s active search for the missing sheep, I commit us to practical action aimed at restoring right relationships and including the voices of others in the flock. Finally, I recognise that the deliberate inviting of friends and neighbours to celebrate once the sheep is found is a visible reminder that all surveillance should occur in a relational community that rejoices when there is justice. The solution to restoring the flock does not lie in a depersonalised approach, but rather in setting aside our own privilege and going out into the wilderness in preferential search for the one who has been lost and excluded. This approach is a reversal of our existing social and economic common sense, and it is, I hope, a first suggestion that links together some of the concepts offered in this essay.
Thus, as I hope this essay has demonstrated, there are a number of ways in which specific examples of surveillance from the Bible may help us understand the political demand of ‘Nothing about us without us!’ in the context of twenty-first-century surveillance. First, the demand for an ‘us’ compels us to consider the increasingly digital and automated trajectory of surveillance technologies. In contrast to this depersonalised approach, the concept of God watching over us demonstrates relational surveillance and an ethical commitment based on the moral responsibilities we owe to one another. Defining who ‘us’ is more closely, in the context of a preferential optic for the marginalised, allows us to identify more clearly the sites of participation for political and social action. These sites of engagement are centred around marginalised people and voices that are being watched ‘without’ their involvement. In closing, I propose that we, as individuals privileged in a surveillance society, might recognise our complicity in the surveillance system and resolve to utilise our power to achieve the integral liberation of others.
[1] David Murakami Wood, ‘The Surveillance Society,’ European Journal of Criminology 6, no. 2 (2009): 179.
[2] Michel Foucault, Discipline and Punish: The Birth of the Prison, trans. Alan Sheridan (New York: Pantheon, 1977), 202.
[3] Foucault, Discipline and Punish, 202.
[4] Andrew G. Ferguson, The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement (New York: New York University Press, 2020), 19.
[5] Gary T. Marx, Windows into the Soul: Surveillance and Society in an Age of High Technology (Chicago: University of Chicago Press, 2016), 17.
[6] Steven Feldstein, ‘The Global Expansion of AI Surveillance’, Carnegie Endowment for International Peace, September 2019, https://carnegieendowment.org/files/WP-Feldstein-AISurveillance_final1.pdf
[7] Alan R. Wagner, Jason Borenstein, and Ayanna Howard, ‘Overtrust in the Robotic Age,’ Communications of the ACM 61, no. 9 (2018): 9.
[8] David Lyon, ‘God’s Eye: A Reason for Hope,’ Surveillance & Society 16, no. 4 (2018): 546.
[9] Carl Jung, The Structure and Dynamics of the Psyche (London: Routledge, 2014), 197.
[10] Lyon, ‘God’s Eye,’ 549.
[11] Astrid Schmidt-Burkhardt, ‘The All-Seer: God’s Eye as Proto-Surveillance,’ in Ctrl [Space]: Rhetorics of Surveillance from Bentham to Big Brother, ed. Thomas Y. Levin (Cambridge, MA: The MIT Press, 2002), 16-31.
[12] Exodus 2:23.
[13] Rachel Muers, Keeping God’s Silence: Towards a Theological Ethics of Communication (Malden, MA: Blackwell Publishing, 2004), 194.
[14] Christine M. Korsgaard, ‘The Reasons We Can Share: An Attack on the Distinction between Agent-relative and Agent-neutral Values,’ Social Philosophy and Policy 10, no. 1 (1993): 24.
[15] Exodus 1:15–21.
[16] Eric Stoddart, ‘Lecture 1.1 ‘Surveillance’ in the Hebrew Bible / Old Testament’, (DI5924 Surveillance, Theology and the Bible Class lecture, University of St Andrews, Semester 1, 2021-2022).
[17] James I. Charlton, Nothing about Us without Us: Disability Oppression and Empowerment (Berkeley: University of California Press, 1998), 3.
[18] Charlton, Nothing about Us, 17.
[19] Jacques Derrida, Acts of Religion, ed. Gil Anidjar (New York: Routledge, 2002), 252.
[20] Gustavo Gutierrez, ‘Option for the Poor,’ in Systematic Theology: Perspectives from Liberation Theology (Maryknoll, NY: Orbis Books, 1996), 39.
[21] Eric Stoddart, The Common Gaze: Surveillance and the Common Good (London: SCM Press, 2021), 128.
[22] Oscar H. Gandy, The Panoptic Sort: A Political Economy of Personal Information (Boulder: Westview Press, 1993), 15.
[23] Gandy, The Panoptic Sort, 15.
[24] Jacob Snow, ‘Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots’, ACLU, July 26 2018, https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28
[25] Snow, ‘Amazon’s Face Recognition.’
[26] David Lyon, Surveillance after September 11 (Cambridge: Polity Press, 2003), 81.
[27] Matthew 18:12–14 and Luke 15:3–7.
[28] Brian Hauglid, ‘Luke’s Three Parables of the Lost and Found: A Textual Study,’ in The Life and Teachings of Jesus Christ, ed. Richard Neitzel Holzapfel (Utah: Deseret Book Co, 2006), 245.