For as long as we’ve been curious about anything, we’ve been curious about the future. What happens next is a topic many writers have attempted to pin down, with science fiction and dystopia among the most widely read genres. And, like so many of our thoughts these days, mine started after watching a Netflix documentary, Coded Bias (crucial viewing for anyone interested in how technology will, and already has, profoundly impacted our lives). From facial recognition software failing to recognise Black faces, to algorithms rejecting women for jobs and generating worse credit ratings for them than for their male counterparts, to the ethics and often racial bias of technological surveillance, the documentary shows how a biased society produces biased technology that can be detrimental to the future. So, when we are the ones that create, program and control technology, are we the ones that create its problems, and is it more ‘human’ than we think? Can understanding human nature help us to understand and improve technology, even when the two are seen as so inherently different?
This ‘inherent’ difference was particularly highlighted by Philip K. Dick in his novel Do Androids Dream of Electric Sheep?. From the outset, the novel draws a clear distinction between the ‘human’ and the ‘non-human’, between what is real and what is artificial. The whole concept of reality is treated as a status symbol and a commodity to be bought: the ultimate pride is owning a ‘real animal’ as a pet, and the greatest shame is owning a man-made replica. This logic extends, in the extreme, to ‘humans’ and ‘androids’, with androids seen as a threat that needs to be ‘retired’, a fact treated as mundane because they are not deemed ‘real’.
What is interesting, however, is the metric used to measure who is human and who is a ‘Nexus-6’ type android, and the ideology behind it. The decisive factor, both in determining who is an android and in what, in the minds of their human counterparts, makes them so dangerous, is a lack of empathy. The test devised to tell a Nexus-6 android from a human is a series of questions about the subject’s reactions to hypothetical scenarios involving the treatment or mistreatment of animals. Empathy, however, is an interesting trait to choose, especially in the world of the novel.
For a start, a lack of empathy is a trait seen as both inherently ‘not-human’ and, troublingly, very human. In the technological world, robots whose lack of emotion reads as ‘uncanny’ to human eyes are seen as frightening; in the human world, so are ‘psychopaths’ who commit great atrocities without feeling any empathy. Empathy helps us to cooperate and stops us from hurting each other. Just living in the world, though, means it doesn’t take long to think of the many times, big and small, we’ve seen real, human people act unethically.
This ties in with the main concept of the test - the idea of not wanting to see animals hurt. In the world in which the book is set, the population of animals has dwindled, making it even more vital that they are cared for. Not caring about their mistreatment, or failing to understand their importance, is seen as demonstrating a lack of empathy and taken as evidence that the person with whom you are speaking is not human but an android. But what is not mentioned at this point is why there are so few animals in the first place. The dystopian world of the novel is set after ‘World War Terminus’, a mass nuclear war, has left the Earth virtually uninhabitable, with its occupants having to colonise Mars for a better environment and most animal life killed off. It could be argued that judging someone’s empathy by how they treat the few remaining, highly prized animals is hypocritical coming from the race whose disregard for the welfare of the planet and its inhabitants killed them off in the first place. To complete the picture, animals are not treasured because of some ‘empathetic’ connection to them; their rarity has increased both their economic value and their cultural position as a ‘status symbol’, which is why they are so prized.
The hypocrisy surrounding the concept of empathy presented by Philip K. Dick doesn’t end there. Whilst androids are presented as cold and purely technological, the ideas of reality and artifice, of the biological and the technological, have become so closely entwined that the ‘humans’ within the novel have themselves become more like ‘robots’ to our eyes. This is presented right from the start, when Rick Deckard wakes up and discusses the ‘Penfield mood organ’, a technological device that can set and change his precise mood for the day. Later in the novel, we see an almost virtual-reality experience designed to be practised regularly to increase empathy. With these factors combined, the ‘authenticity’ of any emotion is open to scrutiny. When feelings are generated not by the mind but meticulously planned and strictly controlled by an external technological device, are they real? It could be argued that the humans controlling them have become as emotionless as the androids they are fighting. Empathy is presented as a natural instinct absent in the androids but, whilst even the best of us can find it hard work to be kind and slip up from time to time, it is shown here to be a much more commercialised and artificial process. The way humans work on their ‘empathy’ almost mimics a machine-learning process.
With this in mind, a final point to consider is the differentiation between ‘human’ and ‘android’ in general. Whilst comparing a science-fiction novel to real-world atrocities risks diminishing their real-life impact, it can’t be denied that the novel echoes colonial patterns of thinking. Colonising groups have always used misguided, under-researched and often stereotyped ‘traits’ and ‘differences’ as justification for murder and the systematic subjugation of another group. As has so often been done, the ‘humans’ have taken a supposed trait, or absence of a trait, read it as a potential ‘threat’ and used it as a reason to ‘retire’ the androids, completely ignoring the hypocrisy of their own actions. In this case, however, the treatment of a different group as not ‘human’ is taken very literally.
Whilst Philip K. Dick wrote long before the technology we see today was even predicted, let alone created, one thing is certain: technology and humanity interact, and as technology starts to govern our lives ever more, the errors of the past and the present will make their way into the future; it’ll just be in a different form.