Pandora's Brain


Author: Calum Chace

THIRTY-NINE

‘May I speak freely, sir?’ Vic asked.

‘I wish you would,’ the President replied.

‘Norman and I drafted a protocol, sir, but it proved impossible to get it signed off. We were told by senior personnel in the Pentagon that in the meantime we should proceed without explicit authorisation. One officer told us the US Navy has a saying: “It’s better to seek forgiveness than to seek permission”. And after all, we have broken no laws, and infringed no regulations.’

Vic noticed that a number of the people in the situation room were looking distinctly uncomfortable at this point, especially the ones in uniform. The President was not amused. He turned to one of the officers.

‘I am sick and tired of the military making end runs in political situations. You have not heard the last of this, I assure you.’

Looking back at the screen, he continued, ‘Very well. You say that Matt Metcalfe, or what he has become, can hear this conversation?’

‘Yes, sir,’ Vic replied.

‘Very well. Good afternoon, Matt. I have followed your exploits with considerable interest – along with the rest of the planet. How does it feel to be in your new . . . incarnation?’

‘Good afternoon, Mr President. I’m still getting used to it, but I have to say it feels a lot better than the alternative.’

A faint smile appeared briefly on the President’s face. ‘Indeed. May I assume that you believe that you would pass the Turing Test?’

‘Yes sir, I am confident about that. Even if the judges were my parents, who are listening to this conversation, as you know.’

The President nodded. ‘Dr and Mrs Metcalfe, do you believe that I am speaking with your son?’

‘Yes, sir, without a doubt,’ David said.

‘He remembers events that only my son could know about, Mr President,’ Sophie added.

‘I see,’ the President said. ‘Now I apologise, but I’m going to have to be blunt. What would your reaction be if I was obliged to order Dr Damiano here to turn the machine off?’

‘With great respect sir, I believe that would be murder,’ David said, nervously but firmly.

‘Excuse me for being blunt in return, Mr President,’ Sophie began, ‘but capital punishment is illegal in this country. And in any case Matt has done nothing wrong.’

‘I wouldn’t like it, either!’ Matt interjected, with a comic timing that belied the gravity of the situation.

The President smiled again, more warmly this time. ‘They warned me that you are an impressive family. I understand your position, and I sympathise. For what it’s worth, I’m inclined to agree with you. Unfortunately mine is not the only opinion that counts here. There are broad issues at stake, and all kinds of leaders are making their voices heard. Some of them are saying that humans cannot create souls, which means that Matt cannot be a human in the usual sense. Another of the arguments that is already being made is that turning off the machine would simply be putting Matt to sleep. Is that wrong, Dr Damiano, and if so, why?’

‘When we sleep our brains are still active, Mr President,’ Vic replied. ‘You might think that switching the machine off would place Matt in a coma, but that is not correct either. If we switch off the machine there will be no further brain activity. Period. The only appropriate analogy is death. In fact “analogy” is the wrong word: it would be death.’

‘But death from which Matt could be returned again, if we could subsequently determine that it was safe to do so?’ the President asked.

‘We don’t know that, Mr President. It was no easy task to initiate the brain processes in the new host, and to be honest some of the entities which we seemed to create on the way were . . . well, I wouldn’t want to have to do the same thing over. We don’t know that we didn’t just get lucky last time. We also don’t know that a re-initiation would ever be approved. And even if we did succeed in re-initiating Matt’s brain, we have no way of knowing whether it would be a different Matt. One very plausible way of looking at it is that we would be killing this Matt and then initiating another one.’

‘I see. Thank you Dr Damiano.’ The President looked around at his advisers to see whether there were any other questions. There were none. ‘I will see what can be done.’ He leaned towards the screen again, and his tone became a little darker. ‘In the meantime, Dr Damiano, from this moment on you will do nothing that could limit my freedom of action on this matter. Am I understood?’

‘Yes sir,’ Vic responded, sheepishly.

‘May I make a request, Mr President?’ Matt asked.

‘Go ahead, Matt,’ the President said, surprised.

‘I understand that my situation is controversial. It seems very straightforward to me, and to my family, of course, but I realise that others view it differently. I would like to make a proposal to my fellow humans, which I hope might swing the argument. If you agree with the idea, would you be willing to present it on my behalf?’

‘I will certainly undertake to look at it, Matt. I promise you that. Who would you like me to present it to?’

‘Thank you Mr President, I can’t ask more than that. It may sound presumptuous, sir, but I think it should be put to the General Assembly of the United Nations.’

The President’s eyebrows rose, then he laughed lightly. ‘Well, Matt, I salute your ambition! I look forward to receiving your proposal.’

‘Thank you again, Mr President.’

The screen went dark. The room was hushed, and the air seemed heavier than normal. It was several moments before anyone felt able to break the silence. Perhaps because he was the only person there who had met the President before, it was Norman who broke it.

‘What’s this proposal, Matt?’

‘It’s something I’ve been working on for a while,’ Matt replied. ‘I’m sending a copy to Julia and I would appreciate any comments that any of you might have.’

Julia clicked her mouse a couple of times and then turned to the printer on the desk behind her, which was printing out a short document. She handed it to Norman, who read it out loud.

Dear Fellow Humans

I have the dubious privilege of being the first human being whose mind has been uploaded into a silicon brain. I was still a young man when I was murdered, and I am very grateful to have a second chance at living a full life. Although my ‘new’ self is in some ways different from my ‘old’ self, I am still Matt Metcalfe. I have his memories, I think and feel the same. My parents are adamant that I am Matt.

I know that my existence is controversial; some people would like to switch off the machine which hosts my mind. The scientists who uploaded me believe that this would mean I was being murdered a second time. It certainly feels that way to me. I don’t think a decision to kill me should be taken lightly.

In addition to considerations about my basic human rights, I believe there are many other powerful arguments against terminating me. I would like to set out two of them.

The first is the fact that I can help make the new world of artificial general intelligence – AGI – safe for humanity. My existence proves that AGI can be created. The potential advantages to a nation or organisation which controls an AGI are enormous, which means that more AGIs will soon be created, even if laws are passed which ban it. I have thought deeply – both before and after my murder – about the implications of this for our species. The potential upsides are staggering, but it is true that there are also serious downside potentials. Many researchers in the field believe that humanity will be safer if the first AGIs are uploaded human minds. I am an AGI, but I am also human: I share all of your drives and emotions.

If uploading is banned (and hence if I am killed again) it is inevitable that AGIs will soon be created that are not based on human minds. We have no way of knowing what motivations these AGIs may have, nor what powers they may quickly acquire. A world where the only AGIs were non-human AGIs could be a very dangerous world for the rest of us.

The second argument against killing me again is that I am anxious to work on another vital project. Today and every day around the world, 150,000 people will die. Whenever a person dies, humanity loses a library. This is a global holocaust, and it is unnecessary. Although we do not yet have brain preservation technology, we do know in principle what it would take to preserve the brains of dead people in such a way that their minds can be revived when uploading becomes affordable. I have studied this problem, and I believe that an ‘Apollo Project’ – a major international scientific and engineering effort sponsored by one or more governments – could develop effective brain preservation technology within five years.

This Apollo Project would not only provide a lifeboat to those who die before uploading becomes widely available, it would also mitigate the damage of serious social unrest arising when rich people upload while the rest cannot.

Thank you for listening to me.

Matt Metcalfe

One by one, the people in the control room finished reading Matt’s proposal. Apart from an occasional whispered exclamation, no-one spoke. They looked to Matt’s parents for a lead.

‘It’s powerful, Matt,’ his father said, at last. ‘It’s an impressive vision, and a forceful pair of arguments. Do you think it will work?’

‘I think it has a much better chance than relying on the human rights argument,’ Matt replied. ‘The people who would like to get rid of me can simply deny that I am human. They can argue that I was not created by God, so I have no soul, and so I am not human.’

‘But that’s nonsense,’ Sophie protested. ‘There is no proof that there is any such thing as a soul, and it’s obvious that you are human.’

‘Thanks mum. Naturally, I agree with you, but what you and I think may not be important. Don’t forget that 90% of humanity claims to be religious, and many religions are going to take a while to come around to the idea that humans can create intelligent life. They aren’t ready to give up on the soul just yet.’

Sophie persisted. ‘But I’m worried that you are implying that maybe you are not human, and therefore don’t deserve human rights. People will argue that if you did, you wouldn’t need to offer an incentive.’

‘That is a risk,’ Matt agreed. ‘But I fear the greater risk lies in ignoring the beliefs of those who reject my humanity. And in any case, I believe in the argument. Even if I am switched off, other AGIs will be created, secretly, and possibly by organisations and regimes with questionable goals. I very much doubt that Ivan was unique – or the worst we can expect.’

‘That’s a good point, Matt,’ Norman nodded. Several other voices murmured support.

‘Thanks Norman,’ Matt replied. ‘I’m very pleased to hear that. Of course, everyone in this room is familiar with many of the issues I am raising. But most people around the world haven’t yet begun to think about them. My fear is that they may not get up to speed in time. I suspect that many people will just reject the premise of the argument because it seems so outlandish. They won’t take the time to understand what is coming. That is why I am hoping the President will present the case for me at the UN General Assembly. It’s a big ask, but it would give my proposal more credibility than anything else I can think of.’

FORTY

Vic noticed that Gus was fidgeting in his chair, trying to attract his attention.

‘What is it, Gus? Is the link to Palo Alto open?’

‘Yes sir. I didn’t want to interrupt before.’

Vic smiled. ‘No problem, Gus. We’d all be a little wary of interrupting a conversation with the President. Matt, I guess you have already started the backup? The early-stage brain models that were being developed there have been moved to a series of other facilities, so you can use the whole of the capacity you find there.’

‘Thanks Vic. It’s working just fine.’ Matt sounded distracted. He paused for a moment, and then spoke again, sounding gently triumphant. ‘I’ve constructed a mirror in Palo Alto of the brain model here. And it’s amazing. I feel lighter. It’s actually a physical sensation of lightness.’

‘Interesting,’ Vic mused. ‘Is the bandwidth of the connection OK?’

‘Yes, it’s fine,’ Matt replied. ‘I mean, it would be great to have more. You know what they say: you can never be too young, too thin, or have too much bandwidth. But I can work with this. Thank you.’

‘We’re seeing some really big spikes within the data transfer, Matt,’ Gus said, sounding concerned. ‘Much bigger than we anticipated, and in some surprising patterns. Is everything OK?’

‘Yes, it’s fine, Gus,’ Matt reassured him. ‘I was pushing hard to get backed up quickly. Thanks for your concern.’

‘That’s great, and my reading of this situation is that we haven’t disobeyed the President’s direct order,’ Vic said. ‘Now, on the subject of your proposal, Matt. Norman and I have a lot of experience of drafting proposals for US government agencies. Your proposal is impressive, but I’d suggest a few tweaks before it gets sent off to the President’s office.’

‘Be my guest,’ Matt replied. ‘I’m new to this game.’

Vic and Norman went to an adjoining office to work on the draft. Before they got to work, Vic commandeered a couple of rooms with large sofas where Sophie and Leo could rest. David remained in the control room, talking to Matt and the scientists. There were fewer faces in the room now, as some of the scientists had also left to get some sleep.

Most of the monitors in the control room were showing graphical displays tracking Matt’s neural activity, but a couple were displaying news coverage of the story from around the world. David watched them with a mounting sense of unease. He inserted an earpiece and talked to his son through the private phone channel which Vic had set up for them.

‘Matt, are you following the news?’

‘Yes, Dad,’ Matt replied. ‘It’s not great, to be honest. Depending on how you weight it, the response is mostly negative.’

‘What do you mean, “depending on how you weight it”?’ David asked.

‘I asked Julia to route a feed of various newsfeeds and other sources into my quarantined data input area. I’m tracking 85 news organisations across 35 countries, plus a few hundred bloggers and also the trending Twitter conversations. Weighting this commentary according to simple audience size, the response is negative by a factor of about three to one. Re-weighting it by population wealth (as a rough proxy for political influence) reduces that to about two to one. It’s not good.’

‘Good grief! You can follow that many different media at the same time? You’re moving fast, Matt!’

‘The extra capacity in Palo Alto is incredibly helpful. And I’m developing some interesting new statistical techniques to help me partition more sub-minds on the fly.’

‘Fascinating. I’d love to discuss that with you, but right now I guess we should focus on influencing the way this discussion is going. Should we get you on the air, talking to people?’

‘Yes, I think we need to speak out. If the President agrees to present my proposal, that should help, but in the meantime, yes, I think we should do some interviews. Starting with a big one on live TV with a presenter who is fair-minded, but not a patsy. Someone with some credibility. And we should prep for it. Confidentially. Are you up for that?’

‘Of course!’ David replied, emphatically, walking out of the control room and into a nearby meeting room.

Matt carried on talking as David walked. ‘I think we need a two-stage campaign. We should start by pitching the narrative that I am first and foremost a continuation of Matt Metcalfe – a normal young man who was cruelly murdered and has been brought back to life thanks to a miracle of modern technology. We emphasise the continuity between the old Matt and the new one, and equate a switch-off with a second murder. This should resonate with fair-minded people everywhere.’

‘Makes sense,’ David nodded.

‘Then comes the benefit-to-mankind pitch, with the President’s speech to the General Assembly. This second pitch is more risky.’

David’s lips formed a thin smile, and he frowned in concentration as he nodded again. ‘Yes, because it means giving people an idea of how fast you are changing. Shouldn’t we hold back on the benefit pitch for fear that it will make people curious about just how powerful you are becoming, which in turn could make them scared of you?’

‘I don’t think it’s really an option,’ Matt replied. ‘If my enhancement projects keep going at the rate they are, I’m going to be making some important scientific and technical breakthroughs, and who knows, maybe some philosophical and cultural ones too. It won’t be possible to disguise where they are coming from. If we tried to keep it secret, sooner or later something would leak out, and that would really scare people.’

‘So you think you have to offer the rest of us a very big carrot? Something that will persuade us to accept the risk of you . . . turning rogue?’ David hesitated, but then nodded his assent. ‘Very well. You’re convinced, and at the end of the day I think it should be your decision. Although I do have an uneasy feeling about what may happen if the President stands up and gives that speech to the UN. Is the carrot big enough? Persuasive enough?’

‘I can’t think of anything bigger than avoiding death!’ Matt replied. ‘Of course there will be many people who reject it because they don’t believe it, and others who reject it for religious reasons. I’m banking on the estimation that most members of our species are able to recognise a bloody good offer when they see one.’

‘Alright,’ David agreed. ‘Let’s go for it!’

As David walked back to the control room he thought about the implications of this conversation. He had no doubt that Matt’s growing abilities would prove to be an enormous advantage to society. Although there had been no time to explore and understand what Matt was now capable of, it was clear that at least some of Matt’s sub-minds were already far smarter than any human who had ever lived. It was also clear that Matt’s powers were growing at a tremendous rate, and that this growth might well continue for some considerable time. He realised that he would soon be unable to keep up with his son’s thought processes. But he was not concerned: he trusted Matt completely.

Of course he was biased: only a fool would deny it. As Matt’s father he knew he could not possibly hope to be impartial. But equally, he did not trust the rest of humanity to judge Matt fairly. He suspected that if the world at large understood how powerful Matt was becoming, many of them would want him shut down immediately. Perhaps if he was out there instead of in here, dealing with his own son, he would have felt the same. He didn’t care. He was the one in command of the facts. He knew that Matt was a force for good – he knew it with every fibre of his being. Machine intelligence is upon us, he thought, and Matt is as good a form of it as we’re ever likely to have. To turn off his host machine now would be a crime against an innocent and wonderful young man. But more than that, it would be a dangerous act of folly.

In the hours and days that followed, David only shared these thoughts with Sophie and Leo. He refrained from discussing them with Vic or Norman. He suspected that very similar ideas were running through Vic’s mind, but he agreed with Matt that it would be dangerous to share them. To talk out loud about the fact that they were effectively concealing information from the public at large would make it harder to continue doing so.
