Zendegi by Greg Egan

It’s a curious coincidence that this book opens with a problem I’ve been wrestling with for some time. Being one of the dinosaurs, I’m still hoarding my collection of singles and LPs accumulated over the early years. I copied the 78s to tape many moons ago, but I worry about how long the tapes will remain playable. Like Martin Seymour in Zendegi by Greg Egan (Night Shade Books, 2010), I dream of digitising all the recordings but find myself lacking the will. My wife has little interest and will not shed many tears if the original recordings are put onto the funeral pyre when my body is finally sent on its way. She’s not a Hindu and, therefore, would not consider sati (or suttee) an appropriate way of celebrating my death. But relieving herself of the option of replaying some of the hits from the 1950s might give her peace in her remaining years.

Anyway, Martin discovers that, unless you carefully check the sound levels on all the records being transferred to the computer, it’s very easy to end up with clipping, i.e. distorted sound. Something of a perfectionist myself, I would not be able to listen to any of the affected tracks. Because he’s pressed for time, Martin makes the discovery after he has disposed of the originals. This loss makes him sad but, in a more serious way, it also foreshadows the problems explored in this book. It all starts with the efforts of Nasim Golestani to map the part of a finch’s brain that decides what song to sing. She eventually creates a computer model that replicates bird song. It’s not clear how successful this is because it’s a bit difficult to ask real finches what they think of the tone and melody produced by the computerised version. The rest of the book then moves up to artificial intelligence experiments on replicating human abilities. Not unnaturally, there are some rich people who think it would be just dandy to have themselves uploaded and so achieve immortality.
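As a purely illustrative aside (nothing like this appears in the book), the check Martin skips could be sketched in a few lines of Python: scan a finished 16-bit transfer for samples sitting at or near full scale, the tell-tale sign of clipping. The file name and threshold here are my own assumptions, not anything from the novel.

```python
import struct
import wave

def count_clipped_samples(path, threshold=32766):
    """Return (clipped, total) sample counts for a 16-bit PCM WAV file."""
    with wave.open(path, "rb") as wav:
        if wav.getsampwidth() != 2:
            raise ValueError("this sketch only handles 16-bit PCM")
        frames = wav.readframes(wav.getnframes())
    # Unpack interleaved little-endian 16-bit samples and count those
    # at or beyond the chosen threshold (full scale is +/-32767).
    samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
    clipped = sum(1 for s in samples if abs(s) >= threshold)
    return clipped, len(samples)

if __name__ == "__main__":
    clipped, total = count_clipped_samples("transfer.wav")  # hypothetical file
    print(f"{clipped} of {total} samples at or above the clipping threshold")
```

Run against each transfer before the originals go anywhere near the bin, a report of more than a handful of clipped samples is the cue to lower the input level and record the side again.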

Greg Egan keeps this real in his consistent rejection of the notion that it would be possible to make a recording of anyone’s brain waves and so reproduce the human being. The best his scientists can manage is the replication of physical skills in avatars. Zendegi is a gaming platform, and its owners make a lot of money out of people wanting to play football and other sports alongside or against their favourite players. Even inducing natural language abilities is fraught with difficulty because, as with the bird song, the computers have no understanding of how and why each individual element is significant. So avatars can be given access to comprehensive vocabularies but, even with multiple brain scans taken over months, there’s no consistency in the avatar’s performance as the target human. There’s no reasonable prospect of being able to “clone” a human personality by digitising his or her brain waves.

This is not to say that avatars could not undertake routine tasks and so displace the need for human labour. For example, it might be possible to build systems sophisticated enough to replace call centre staff or to perform other tasks not relying on face-to-face contact with real people. In a sense, this simply extends the displacement of the thousands of administrative and secretarial staff in the management of any business. With software able to take dictation from bosses who refuse to learn how to type, there’s no longer a need for shorthand and typing skills sitting expensively in another office, nor for the clerks who file all the paper copies of the correspondence generated, nor for the filing cabinets themselves, which in turn closes down industrial production and terminates further jobs. All forms of automation seriously limit the need for human workers. Machines are cheaper and, once they have learned the jobs, make fewer mistakes. So, in all this continuing debate about the extent to which real-world societies should allow the development of automated systems, Greg Egan is asking and answering some relevant questions.

However, I find it strange that he should place most of the action in a near-future Iran. Although it’s certainly relevant to consider whether, in any sense, machines might capture souls, the political backstory to this novel simply gives us a thriller scenario and does not significantly advance the science fiction element. I’m not convinced the Islamic reaction to the phenomenon of avatars in a gaming environment does much to advance the plot. The reaction of the Christians to the Zendegi project, and to another US-based attempt to create a massive AI capable of running human government, is somewhat predictable and not given much space for development. Indeed, the whole tenor of the book is less science fictional than I expected. The first third is more or less a straight thriller about journalism, and the latter two-thirds is the increasingly sentimental story of Martin and his son. Although the two parts of the book do tie together in the relationship between Martin and Omar, initially a neighbour who gets involved in helping Martin get the news, Martin is somewhat self-absorbed as a person and fails to understand the significance of the relationship. He sees surface reality and is not particularly good at assessing the person underneath. As an early incident shows, you can dress a man up in women’s clothing, but this does not convert the man into a woman. Gender identity is based on the whole package of the personality, the physical behaviour and the context. Similarly, you can capture features of human behaviour in avatars on Zendegi, but this does not make them human.

So Zendegi is a sentimental journey through life made by two slightly inadequate people. Neither Martin nor Nasim is particularly successful as a human being, although they do manage to get things done. They work on a project together and it fails. I think that sums it all up, really. The book is good in part but unsatisfying because it never really engages with the social and political implications of the work being done. We see that work, but there’s not enough meaningful discussion of it. The real questions are whether something approximating a human is better than nothing at all and, if what you create is a kind of Frankenstein monster, whether it would be moral and legal to kill it by wiping it from the server.

For another review of a book by Greg Egan, see The Clockwork Rocket.
