Below is a briefing paper I prepared on recent developments in artificial intelligence and digital technologies, as well as some Christian responses, for the group Christians in Parliament.
Rapid advances in artificial intelligence and digital technologies are making positive contributions in many areas of society, including healthcare, internet communications, transport and financial services. Yet at the same time these advances are creating unique challenges for governments and regulatory authorities around the world. This short paper aims to highlight some pressing issues and provide a Christian framework as a foundation for engagement.
1. Speed of change and unanticipated consequences
Steve Jobs announced the release of the first Apple iPhone in 2007, yet barely 13 years later it was estimated that there were 3.5 billion smartphones across the world, representing about 50% of the world’s population1. Nobody foresaw how this powerful and addictive technology would change human behaviour, relationships and society within a decade. Unanticipated consequences seem to be a feature of tech innovation. Since the start of Facebook in 2004, Mark Zuckerberg has been working on a plan to connect every single human being on the planet. What could possibly go wrong? The benefits of instantaneous digital communication for individuals, families, companies and governments are obvious. And yet Facebook and other social media platforms have inadvertently led to genocidal riots, election manipulation, teenage suicides, AI bots disseminating fake news, and an epidemic of abuse, hate speech, cybercrime and trolling. The coronavirus pandemic has been accompanied by a global digital pandemic of disinformation and conspiracy theories. Arguably the physical pandemic is proving easier to control than the digital version! An infamous slogan of Silicon Valley tech companies is “Move fast and break things”, revealing a naïve assumption that disruption of established structures and frameworks is always positive. Yet it is obvious that in a complex and interlinked society, destabilising innovations may lead to unexpected and damaging consequences.
2. Surveillance capitalism and behaviour modification
Early in 2000 the Google founders Larry Page and Sergey Brin discovered that capturing, storing and analysing the fragments of personal data, the ‘data exhaust’, that all internet users left behind as they navigated the web yielded insights into individual thoughts and emotions, as well as rich predictive signals of future behaviour. The development of ‘free’ internet products – Gmail, Google Maps, Google Street View, Google Books, the Android operating system and so on – allowed Google to harvest ever more detailed and invasive data from literally billions of users, providing advertisers and other commercial players with extraordinary power to target their products and modify individual behaviour for commercial gain. Facebook, Amazon, Microsoft and other tech giants have adopted the same methodology, described by Shoshana Zuboff as ‘surveillance capitalism’2. The annual profits and stock valuations of these companies testify to the unprecedented commercial power they exert over our lives. In order to maximise their profits, tech companies have employed sophisticated and covert behaviour modification strategies to maximise the time we spend with their products. Most of us spend more than an hour per day interacting with our smartphones, and many spend as much as a quarter of our waking hours. One of the inevitable consequences of the COVID-19 pandemic has been that we are all spending a greater proportion of our time online, and therefore we are more vulnerable to the effects of covert manipulation. As Tristan Harris of the Center for Humane Technology put it, “the problem isn’t that people lack willpower; it’s that there are a thousand people on the other side of the screen whose job is to break down the self-regulation you have.”3 It is striking that the behaviour modification techniques used so successfully by the tech giants have similarities with those employed in fixed odds betting terminals.
This technological onslaught tends to lead to a sense of individual hopelessness and fatalism, described by Zuboff as ‘psychic numbing’4. We know that we are being monitored and manipulated and yet we continue to use the technology because it seems that ‘there is no alternative’.
2. Shoshana Zuboff, The Age of Surveillance Capitalism, Profile Books, 2018
3. Quoted in Adam Alter, Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, Penguin Books, 2017
4. Shoshana Zuboff, The Age of Surveillance Capitalism, Profile Books, 2018
3. Concentrations of power, expertise and economic resources
A small number of commercial companies and stakeholders, based primarily in the USA and China, are exerting extraordinary monopolistic power and influence across the world. In 2020 the five leading US tech companies had a combined valuation of more than 6 trillion dollars5. Unlike most western governments, which are burdened by massive debt repayments, the tech companies are sitting on unimaginable mountains of cash. And they are using their economic firepower to defend their own interests. They are hiring the best technical brains around the world, buying up or neutering start-ups that threaten any competition, and spending tens of millions of dollars per annum lobbying politicians. For several years Google has topped the list of company spending on lobbying US politicians6. The technology is very new but the power dynamics are very old and very well-defended. One of the unforeseen side-effects of the COVID-19 pandemic has been to further entrench the power and global influence of the tech giants. Old-fashioned approaches to competition regulation, such as breaking up companies into smaller competing units, have proved to be ineffective. The technology platforms depend on ‘network effects’. The more people who use the platform, the more valuable the service becomes – not just to users but also to advertisers and shareholders. The power of the Google search engine as a means of generating a vast income stream and providing overt and covert data for advertisers rests on the fact that over 92% of all internet searches around the world are conducted through Google7. In January 2020 the runner-up search engine was Bing with 2.4%. Given the speed of change and the astonishing concentration of power, expertise and money, it is not surprising that the battle between major tech companies and regulators seems so one-sided.
7. Search Engine Market Share Worldwide, StatCounter Global Stats, https://gs.statcounter.com/search-engine-market-share
4. Chatbots and simulated companions
As the technology continues to develop at breakneck speed, new and challenging ethical and social issues are being raised. Hundreds of commercial companies around the world are developing AI chatbots and devices such as Amazon’s Alexa, Google Home and Apple’s Siri. The companies are engaged in an intense competition to have their devices present within every home, every workplace and every vehicle. It seems likely that interactions with apparently human-like and ‘emotionally intelligent’ AIs will become commonplace within the next 10 years. AI chatbots are being promoted to provide companionship for elderly people or lonely individuals, counselling for those with mental health issues, medical advice for the sick and compassionate friendship for those who are grieving. But how should we think of these synthetic ‘relationships’? Can they play a helpful role for those grieving the loss of a loved one or those merely wishing to have an honest and self-disclosing conversation? Or could synthetic relationships with AIs somehow interfere with the messy process of real human-to-human interactions, and with our understanding of what it means to be a person?
5. Hidden bias in machine learning systems
AI systems are increasingly being used in employment screening to identify the most promising candidates. The US company HireVue has developed a widely used job interview video platform that uses AI to assess candidates and predict their likelihood of success in a particular job8. Such AI systems are trained using data about previous applicants. Yet it is increasingly recognised that algorithmic systems can introduce bias (such as covert discrimination based on gender, race and age), and that they lack accountability and transparency. Another troubling example is the use of predictive algorithms in the criminal justice system. In some courts in both the USA and the UK, algorithms trained on historical crime data are being used to estimate the likelihood that a particular defendant will reoffend9,10. The output of the algorithm is then used by a judge to aid decisions about parole, rehabilitation and other crucial outcomes. Yet there is an obvious risk that algorithms trained on historical data will amplify and perpetuate embedded social biases. And because most algorithms are protected by commercial confidentiality, it is also virtually impossible to scrutinise the details of the decision-making process.
10. Algorithms in the criminal justice system, Law Society, June 2019
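The mechanism by which historical bias is absorbed and reproduced can be made concrete with a small, purely illustrative sketch. The data, the group labels and the decision rule below are all invented for the illustration; this is not HireVue’s or any court’s actual system. In the synthetic history, equally skilled candidates from group B were only hired at a markedly higher skill threshold; a screening tool that simply learns hire rates from that record then scores two identically skilled candidates differently because of group membership alone.

```python
import random
from collections import defaultdict

random.seed(0)

# Synthetic historical hiring record (hypothetical): each candidate has a
# true 'skill' score between 0 and 1 and an irrelevant group label A or B.
# Historically, group B candidates were hired only if noticeably more
# skilled -- an embedded bias in the past decisions.
history = []
for _ in range(2000):
    group = random.choice("AB")
    skill = random.random()
    threshold = 0.5 if group == "A" else 0.7   # the biased historical rule
    history.append((group, skill, skill > threshold))

# A naive 'AI' screen: estimate the historical hire rate for each
# (group, skill band) combination -- in effect what a model trained on
# past outcomes learns implicitly.
counts = defaultdict(lambda: [0, 0])           # (group, band) -> [hired, total]
for group, skill, hired in history:
    band = int(skill * 10)
    counts[(group, band)][0] += int(hired)
    counts[(group, band)][1] += 1

def predicted_hire_rate(group, skill):
    hired, total = counts[(group, int(skill * 10))]
    return hired / total if total else 0.0

# Two equally skilled candidates receive very different scores purely
# because of group membership: the model has learned the old bias.
print(predicted_hire_rate("A", 0.65))
print(predicted_hire_rate("B", 0.65))
```

The point of the sketch is that no one programmed the tool to discriminate; the disparity re-emerges automatically from the training data, which is why transparency and auditability of such systems matter.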
6. Disruption of established employment patterns
During the first Industrial Revolution in northern England, the mechanized factory displaced craft and agricultural workers, traditional middle-income jobs dried up, low wage employment increased and the profits of mill-owners surged. It took over half a century until ordinary working people saw the benefits of the Industrial Revolution trickle down11. There are obvious parallels with the current era. Although in the long run it is likely that many new jobs will be created, in the meantime many traditional white-collar and blue-collar occupations will become unsustainable. It is clearly not realistic to expect most middle-aged industrial or retail workers to re-train as web designers or data engineers, and it is certainly possible that technological unemployment will lead to widespread social disruption, which may continue for many decades. Many skilled and semi-skilled workers are likely to be made worse off initially and they may never benefit directly from new job creation. In the remainder of this short paper we will look at some fundamental responses to these challenging developments from the perspective of the Christian faith.
11. Carl Benedikt Frey, The Technology Trap, Princeton University Press, 2019
From a Christian perspective, the ability to develop innovative technology can be seen as part of our God-given created nature, the means by which we fulfil the creation mandates to ‘fill the earth and subdue it’ (Genesis 1:28) and to care for all of creation (Genesis 1:28-30, Genesis 2:5-8). In the early chapters of Genesis we see this illustrated in the development of agriculture, metalwork, musical instruments and ship-building. And it has often been remarked that the grand narrative of the Bible starts in a garden but ends in a city, the New Jerusalem. And a city is unequivocally an artefact, a product of technology. So the Christian faith starts and ends with an unashamedly pro-technology orientation. However, we cannot be naively enthusiastic about everything that technology offers. Many technologists adopt a rather simplistic or ‘instrumentalist’ view of their products. They are regarded as neutral tools which can be used for good or evil. But in reality, technology represents a much more profound and complex reality. George Grant, the Canadian Christian philosopher, defined technology as ‘the interpenetration of making and knowing, orientated to the mastery of nature, including human nature.’12 In a fallen world the mastery of nature always carries with it evil and manipulative possibilities. The first example of this in the Biblical narrative is in the story of the tower of Babel (Genesis 11), where human technology is used ‘to make a name for ourselves’ and in defiance of God’s command. So we should celebrate the enormous potential for the common good and for the promotion of human flourishing that AI and digital technologies bring, whilst being clear-eyed about the need to resist and confront their potential for unhelpful and destructive mastery over our human nature. Whenever we are being persuaded to adopt a new technology, we must ask ourselves what we are giving up.
What precious aspect of our humanity and of our community might we unintentionally fail to nurture and ultimately lose if we go down this road? Just because we can develop a technological ‘solution’ does not mean that we should.
12. George Grant, “Justice and the Right to Life” in The George Grant Reader, University of Toronto Press, 1998
Human uniqueness and embodiment
The development of AI and intelligent machines raises profound questions about what it means to be human, and how human beings can flourish in a technologically dominated world. Digital tech emphasises the value of disembodied information which can be extracted, analysed, manipulated, copied, stored and transmitted instantaneously around the world. It is notable that some recent secular thinkers are adopting a strange kind of materialistic Platonism in which our bodies are seen as merely containers for the real gold of our existence, the unique and precious information which resides in our brains. Some tech pioneers dream of a future existence in which all our desires and longings can be met instantaneously and effortlessly in the digital realm. Yet in Christian thinking we are created out of physical stuff, the dust of the earth, and we are physically located in time and space. We are each of us embodied persons, made in God’s image and designed for intimate and loving relationships with other physical and embodied beings. Each of us is a unique person created by a relational God for relationships. And our humanity, embodied in flesh, is central to our relationships (Genesis 2:23-24). Mysteriously and wonderfully, our fleshly embodiment is vindicated in the Incarnation and Resurrection of Jesus Christ when the Word himself became flesh (John 1:14, Luke 24:39). Machines cannot share our embodiment or our personhood. They are artefacts of human creativity, with the potential to support our unique human calling, but they can never enter into genuine human relationships. Behind the simulated compassion of AI bots and companion robots it is possible to see a shallow and instrumentalised understanding of relationships, orientated towards the satisfaction of my internal emotional needs. 
But the Biblical faith provides a richer and deeper perspective on human relationality, providing at its highest a reflection of the self-giving love between the Persons of the triune God. So as we reflect on the possibilities that technology offers, we must ask how we can build a future in which physically embodied human beings can flourish, and in which face-to-face embodied relationships can be celebrated and protected. Many thoughtful observers in our society have growing intuitions that the increasing emphasis on living digitally disembodied lives is unhealthy. The malign effects of the COVID-19 pandemic have reinforced our understanding of the limitations and impoverishment of disembodied virtual relationships. But, perhaps uniquely, orthodox Christianity provides a theologically and philosophically robust explanation for why human embodiment matters. In the person of the incarnate and risen Jesus Christ our physical, touchable human existence is forever vindicated and celebrated.
Work and leisure
The Biblical narrative repeatedly stresses the importance and significance of work. The Creator God himself is a worker and he calls us to work to subdue the creation and to explore its rich potential. Work, although contaminated by the consequences of the Fall, still provides meaning, significance and purpose to our human lives. Therefore, some technological visions of the future, in which every human desire, longing and whim can be instantly and effortlessly satisfied by beneficent machines, do not mesh with a Christian understanding of human flourishing. Christians understand the importance of having a sense of purpose and meaning and the intrinsic value of serving others, of building flourishing communities and relationships and of serving the common good. However, as we look to the future, we may see an increasing separation between the concepts of ‘work’ on the one hand and ‘paid employment’ on the other. Some form of focused labour is essential to our humanity, but it need not necessarily be the primary means by which our material needs are met. New forms of unpaid work, such as caring for others, creative activities, community building, creation care and charitable engagement, may be required for those who are unable to find jobs that generate an income.
Protection of the vulnerable
A Biblical understanding of economics and social justice leads to a focus on the most vulnerable in our societies. In ancient Jewish society it was the widows, orphans and immigrants who represented the most vulnerable, and time and again the God of the Bible proclaimed that he was the Defender of the defenceless and called his people to practical action for the most vulnerable (Deuteronomy 10:17-19). Therefore, as we consider the control and regulation of powerful digital technologies, our Christian responsibility is to identify those who are most vulnerable to abuse and manipulation, and to develop robust and effective strategies for protection and harm minimisation. It is generally recognised that in the treatment of heroin addicts it is most desirable that they should be helped to become drug-free. Yet for those who continue in addictive behaviours the provision of clean syringes and needles is recognised as being ethically appropriate because it reduces death and permanent harm. In the same way, although it may be desirable to help people live without addictive digital technologies, it may be a higher priority to find new regulatory approaches which counteract or minimise the addictive power of the technologies. One example might be outlawing hidden behaviour modification techniques which reinforce frequent smartphone and internet usage.
NEW APPROACHES TO REGULATION
It is clearly inappropriate to expect technologists and commercial companies to regulate themselves and the onus is on regulatory authorities and governments around the world to develop innovative and effective means of control and oversight. This is now a rapidly developing area and many new initiatives are underway. The UK is at the forefront of many of these developments and there are new opportunities for Christians to make strategic alliances with others who share common objectives. Possible initiatives include:
1. Enforcing greater transparency for consumers when AI systems are operating, including targeted advertising, dynamic pricing systems, AI screening of job applications and so on. This includes providing feedback on why individuals have been rejected by AI systems.
2. Enforcing the ‘intelligibility’ of AI systems (the ability to understand the means by which the system has come to a decision) in critical areas such as healthcare, criminal justice and military applications.
3. Developing agreed quality assurance and certification standards for AI systems and algorithms.
4. Enforcing transparency when human/machine confusion is possible. “I’m required by law to remind you that I’m only a machine and not a human person …” (similar to current warnings about CCTV or audio recordings).
5. Enforcing transparency about the commercial value of my individual data to tech companies such as Google and Facebook and enforcing the option of paid-for confidential services and products of equal quality to ‘free’ data-mined services.
6. Regulating the availability and use of sophisticated companion and sex robots. This might include, for example, developing legislation against the enactment of abuse, rape or torture with highly realistic child humanoid robots.
7. Developing new strategies to help workers made redundant as a result of technological unemployment. These might include supporting adult education, supporting the costs of relocation, reducing barriers to switching jobs and providing transitional financial support.
Rapid advances in artificial intelligence and digital technologies are creating unique challenges for governments and regulatory authorities around the world. These are compounded by the speed of technological change, unanticipated consequences, the rise of invasive surveillance and data capture for commercial gain and huge concentrations of power, expertise and economic resources in a small number of companies. There is an urgent need for regulatory authorities and governments around the world to develop innovative and effective means of control and oversight. The Christian faith provides foundations and a framework for what it means to be human – for both our flourishing and our protection – which policymakers can draw upon. As we reflect on the possibilities that technology offers, we must ask how we can build a future in which physically embodied human beings can flourish, the vulnerable can be protected, and face-to-face embodied relationships can be celebrated and nurtured.
Secular books and resources
House of Lords Select Committee on Artificial Intelligence report, 2018, AI in the UK: ready, willing and able? https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/100.pdf
Nuffield Foundation, Ethical and Societal Implications of Data and AI, 2018 https://www.nuffieldfoundation.org/sites/default/files/files/Ethical-and-SocietalImplications-of-Data-and-AI-report-Nuffield-Foundat.pdf
European Commission, Ethics Guidelines for trustworthy AI, 2019 https://ec.europa.eu/digital-single-market/en/news/ethics-guidelines-trustworthy-ai
European Union White Paper on Artificial Intelligence – a European approach to excellence and trust, 2020 https://ec.europa.eu/info/sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf
Machine Learning. Royal Society Report, 2017 https://royalsociety.org/~/media/policy/projects/machine-learning/publications/machine-learning-report.pdf
US Congress, Investigation of Competition in Digital Markets, 2020 https://judiciary.house.gov/uploadedfiles/competition_in_digital_markets.pdf
Christian books and resources
AI and simulated relationships, John Wyatt, Cambridge Papers, December 2019 https://www.jubilee-centre.org/cambridge-papers/artificial-intelligence-and-simulated-relationships
Artificially Intelligent? The myths, realities and trajectories of AI, Calum Samuelson, Jubilee Centre https://www.jubilee-centre.org/artificially-intelligent-ebook
Alien Minds, John Wyatt, IVP (forthcoming 2021)
The robot will see you now, ed John Wyatt and Stephen Williams, SPCK (forthcoming 2021)
Organisations and websites
Center for Humane Technology, https://humanetech.com
Leverhulme Centre for the Future of Intelligence, http://lcfi.ac.uk
Techhuman website, Technology + Humanity + Faith https://www.techhuman.org