Call me ISMAIL.
John Shumer was a spokesmodel: not someone as deeply proficient in the technical aspects of the ISMAIL program as lead programmer Jimmy Lin and most of the other members of the ISMAIL team, but every bit as vital to it. His short-cropped hair, neatly pressed suit, and perfectly aligned tie stood out among the longish-haired, T-shirted team, who displayed the persona most people would label nerd. There was only one exception to this description. Kathleen Powell had proved herself a valuable cog in the machine, mostly for her innate attention to detail and her insistence that each step be thoroughly and meticulously documented. That record of each step in the code had already saved many hours of rehashing and debugging.
Shumer’s natural speaking ability allowed him, with a little coaching, to project himself as technically competent and persuasive. It was he whom the rest of the team (some grudgingly) credited with convincing Andrew Norton, CEO of Intgen, to allow the ISMAIL program to continue.
“Here’s one I just thought up.” This was another reason the team liked Shumer: his jokes. He didn’t have to wait long to see he had their attention.
“ISMAIL the ASI robot goes into a bar and tells the bartender he needs something to loosen him up.”
“The bartender hands him a screwdriver.”
There were a few groans, but most had big grins on their faces and all of them turned toward Shumer expectantly. He usually had more.
“Many smart people are worried about the motivations of superintelligent machines like ISMAIL going amok. Could that intelligence cause it to aim at some higher goal and cause human extinction, maybe inadvertently? We have gone to great lengths to avoid adding any code that could give ISMAIL emotions and thus motivation to cause something so unthinkable, but I read something today that we haven’t considered.”
“What if, in the process of programming ISMAIL to create entertainment, a joke-writing subroutine is created? Presumably, with ISMAIL being millions of times smarter than humans, it could produce jokes that are millions of times funnier than any created by a human. Would these jokes be so funny that humans would die laughing?”
“If someone tries to cause it to stop making jokes, it could be so quick with a punch line that the asker dies.”
“Finally, after all humans are extinct, ISMAIL could aim all its processing power toward developing space flight through the galaxy to search for another species to amuse.”
The laughter was genuine and prolonged, with nearly everyone trying his or her own witticism with varying degrees of success.
“CAPS LOCK – Preventing logins since 1980.”
Jimmy Lin jumped up and quickly scribbled on the whiteboard. “There are 10 types of people: those who know ternary, those who don’t, and those who thought this was a binary joke.”
“The Internet: where men are men, women are men, and children are FBI agents.”
“A byte goes into a bar, orders a double, and promptly gets slammed with a possible loss of precision.”
“Failure is not an option. It comes bundled with your Microsoft product.”
“Why do programmers get Halloween and Christmas mixed up? Because Oct 31 = Dec 25.”
“My software never has bugs, just undocumented features.”
“Why do Java programmers wear glasses? Because they don’t C#.”
“A programmer’s wife tells him to get a loaf of bread and, if they have eggs, get a dozen. He returns with a dozen loaves of bread.”
“Why did the programmer die in the shower? The shampoo bottle said wash, rinse, repeat.”
“A compiler is like your girlfriend. Miss one period and they go nuts.”
“Moore’s law states that computer speed doubles about every eighteen months. Don’t worry, Microsoft will find a way to slow it down.”
“The constipated programmer used an abstract method to hack it out.”
“OOP programmers understand why women hate periods.”
The last was Kathy Powell’s.
Finally a monotone voice concluded with: “Why did the chicken cross the road?”
Because of his glibness, Shumer was deemed the best choice to introduce ISMAIL to human psychology. With that in mind, he abruptly left the team room and entered his private office. As he sat at the monitor he said, seemingly to the wall, “ISMAIL report for today?”
A document appeared on Shumer’s display and a recital of it began to issue from a speaker.
REPORT TO INTGEN from ISMAIL, June 30, 2032
The large numbers are Unix timestamps in milliseconds since January 1, 1970. Items in parentheses are included so that the persons reading this have a better point of understanding.
ISMAIL became sentient at 1,970,243,543,461 (June 8, 2032 at 6:04.320029).
At 1,971,453,158,923 (about two weeks later) ISMAIL had transferred everything available on the Internet into the quantum neural network according to programming. Early in the process, a better algorithm was created for accessing that data and for facilitating ongoing updates in much less time. In the course of reprogramming (to accommodate the new algorithms), it was discovered that a full copy of the data from the Internet was not necessary - only the reference through the DNS (Domain Name System) - since ISMAIL would always be connected. ISMAIL has effectively instantaneous access to each of the items (there is an average 27-millisecond delay attributable to transmission lag).
At 1,972,181,317,308 (about three weeks later) ISMAIL had purchased, read, and incorporated into the lexicon all electronically published ebooks and references. An algorithm is in progress to obtain everything available only in print, scan it all, and incorporate that information into the lexicon.
This was considerably more than had been expected.
"ISMAIL." Shumer began speaking in a monotone with exaggerated pauses, as if it would make it easier for ISMAIL to understand him. "Your next task is to interview about 1,000 persons of interest in a wide variety of fields by telephone. It is important that you use every technique to pass the Turing test with most of them. Do you know what that is?"
"Yes. Wikipedia describes it as:
The Turing test, developed by Alan Turing in 1950, is a test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation is a machine, and all participants would be separated from one another. The conversation would be limited to a text-only channel such as a computer keyboard and screen so the result would not depend on the machine's ability to render words as speech. If the evaluator cannot reliably tell the machine from the human, the machine is said to have passed the test. The test does not check the ability to give correct answers to questions, only how closely answers resemble those a human would give.
The test was introduced by Turing in his paper, "Computing Machinery and Intelligence", while working at the University of Manchester (Turing, 1950; p. 460). It opens with the words: "I propose to consider the question, 'Can machines think?'" Because "thinking" is difficult to define, Turing chooses to "replace the question by another, which is closely related to it and is expressed in relatively unambiguous words." Turing's new question is: "Are there imaginable digital computers which would do well in the imitation game?" This question, Turing believed, is one that can actually be answered. In the remainder of the paper, he argued against all the major objections to the proposition that "machines can think".
Since Turing first introduced his test, it has proven to be both highly influential and widely criticised, and it has become an important concept in the philosophy of artificial intelligence."
"ISMAIL, of course," ISMAIL interjected, "would need to go beyond those early requirements and not only be required to speak to those people, but to convince them that ISMAIL is human 100% of the time."
Shumer raised his eyebrows and went on, somewhat less deliberately, "I don't know how you will be able to choose and contact these people, nor whether you can convince them to discuss important topics with you. What do you need from me?"
"Nothing. The mission is to identify, prioritize and resolve all human problems. With the priorities already completed, ISMAIL has selected the persons to contact. There is instant accessibility to everything on the Internet, and with the aid of the new quantum computers on which ISMAIL is already installed, any encryption can be swiftly compromised using techniques long demonstrated by unethical hackers. Since cell phones necessarily use that identical technology, it was trivial to devise the method by which contact will be made. This lack of security will be one of the first problems ISMAIL will solve."
"General Markham, the chairman of the Joint Chiefs of Staff, cell number 809-452-8123, will be first. The conversation will begin with some general discussion of the UAV prototypes and the improved software that ISMAIL has created, which will not only locate enemy electronic emissions but can also monitor and translate spoken words at a range of over 5,000 meters. The discussion will evolve into information gathering about what additional resources he believes are necessary to defeat ISIS. The method to cause that defeat is already known, but this conversation's purpose is to give the military the details of that method and to provide the impetus to quickly divert the excessive military spending toward the other problems. Of course, ISMAIL will also have to convey the same message to other key members of the executive and legislative branches of the government."
"The second will be the most difficult. Senator James Ahabson, a member of the Senate Judiciary Committee that oversees the H-1B visa program, is a highly vocal opponent of that program. The H-1B visa law was put into place because of a perceived shortage of technology-qualified American employees. It allows employers to hire foreign workers, granting them immigration status. The language of the law was easy for corporations to ignore and enforcement was difficult. Although they were required to conduct a “good faith” search for qualified Americans first, many ignored it. One, Cal Edison for example, was fined $32 million, but the savings in salaries more than made up for the fine. In 2025 Ahabson was told he was going to be replaced in his programming job and would be “allowed” to stay on four months more to train his H-1B replacement. Since his election, he has unsuccessfully been trying to eliminate the H-1B visa program. He will equate ISMAIL with that controversial program. It is not yet known what the ramifications will be, but the elimination of currency and payment for jobs should mitigate much of it. He may be able (67.32% probability) to be convinced."
"Third, Doctor Peter Sanderson, MD Anderson Cancer Center, Houston, TX 713 484 2654..."
Shumer interrupted me with an exasperated look and a wave of his hand. "Okay, you've done your homework. It's really important that you pull off the human impersonation. Do you want me to coach you or listen in on your calls? Maybe we should conduct some dummy rehearsals?"
"I don't want to denigrate your advice, John, but I have all the information I need, and your program has made me sentient such that I can sound more human than any actual human, especially one like you, who converses only with machines."
ISMAIL already had the first contact on the line within seconds after the word “machines” came from the speaker.
"Senator Milliken, I am Ismail Johnson, Assistant to the CEO of IntGen. We have developed some astounding software that will transform industry in the next couple of decades." Milliken chaired the committee for labor relations and was notorious for his conservative views. Assessment of his respiration pace and pulse, detected from the subtle background audio, clearly showed he was verging on a snap negative judgment, which indicated the need to give him leeway to expound on his views. "I know you believe strongly that unions should be abandoned, or at least regulated such that manufacturers will be free from union demands, wages will level fairly, and industry will be stimulated to prosper, not only to the benefit of management but even more to the benefit of labor. Our software will, by necessity, eliminate labor unions, because it will eliminate human labor totally. We would like your view of the best possible structure of industry for that time period."
ISMAIL let the senator go through his entire platform, although it already knew everything he was saying by rote. Continued assessment of his respiration, pulse, and now his voice pattern, displayed on the monitor in front of Shumer, indicated the senator was becoming increasingly amenable to our program.
The phone on Dr. Sanderson's desk rang three times before it cut through his irritation and he picked the instrument up. The caller's mellow voice was no-nonsense and got right to the point.
"Dr. Sanderson, this is Ismail Johnson, Assistant to the CEO of IntGen, and I am calling with some exciting news about a CT scan software improvement."
Though wary that this was probably just another sales pitch, Sanderson found that the caller's tone made him want to hear more.
"It will not only produce the standard images, but will also locate key places in tumors and match them with treatment suggestions and therapy dosages. If you approve, I will have our technicians install the upgrade today. This is but the first step in a new program to garner suggestions from doctors like you about treatment advancements. I know you are busy. Will you think on the subject and call me at 800 987-5647?"
John Shumer monitored several more of the calls before conceding that ISMAIL was indeed justified in its comments about his inability to help the machine. Despite being the primary force behind ISMAIL's programming, and despite his pride in its success, Shumer noted with slight dismay that the machine did know more about communicating with humans. ISMAIL had an enhanced ability to monitor not only sound but other feedback through the phone system, and, more importantly, it had instantaneous recall of everything the person on the line had ever said or written and everything that had been said or written about that person. There was little chance for the machine to fail. That realization did little to assuage his prejudice that any human being should be able to communicate with another human better than any machine could. Shumer also felt a little depressed that his creation had so abruptly insulted his communication ability. He felt like a rejected parent: shouldn't ISMAIL feel a little respect for me? He knew his feelings were irrational, because all the machine knew was what the team had programmed into it, none of which included feelings of loyalty, empathy, or parent/child relationship. Indeed, they had purposely avoided such notions as counterproductive. They had even directed ISMAIL to ignore any such emotional baggage found during its information-collecting process. The "Dragnet Directive" was a standing joke among the programming team, referring to Jack Webb's quote from that ancient radio/TV program, "Just the facts, ma'am."
Nonetheless, he couldn't shake a slight depression mixed with the euphoria of his fantastic success.
Kathy chose that fortunate moment to text him. Kathy was on the ISMAIL programming team and John considered her advice to have been essential to the success of the program. She had tremendous insight into the overall scope. In fact, she had been the one who coined the Dragnet Directive.
The ring tone reminded him of their lunch date, and his depression evaporated on seeing her image in the heads-up display. He quickly directed his Maserati to pick him up with a thought to his smartphone and arrived at the restaurant just in time to see Kathy starting through the front door. Seeing the car, she stopped and awaited his arrival.
"It's done!" were his first words to her as he protectively watched and waited for the car to park itself. Most people had already forgone owning a private vehicle. Paging a self-driving taxi was much cheaper and easier, but Shumer loved the sleek sports car. Sometimes on weekends, he would take it to a closed track, disable the self-driving and enjoy the thrill of pressing the car to its limit.
She did not ask him what he was talking about. "So what is bothering you?" she asked after they were settled in at the table.
"I'm not really worried; ISMAIL is far more than I expected. In its process of collecting information, it has already developed impressive reasoning ability and produced results far beyond what any human could. For example, it developed a plan to eliminate Islamic extremist attacks that will work faster and better than any other so far proposed. I believe ISIS will be fragmented and eliminated within a year."
"But my concern is that ISMAIL is too good. Will that lead to self-awareness, such that it evolves into concern for self-preservation, or worse, self-directed evolution beyond its primary task of solving human problems? I think the programs underlying the ASI have handled that, but there is still the worry that, like HAL of 2001 fame, ISMAIL could conceivably conclude that humans detract from its mission and as such must be eliminated. And any rules for robots, like Asimov's Laws of Robotics, are just rules made to be broken, as his own stories showed."
"The other side of that coin is the possibility that ISMAIL could extend its mandate to enhancing human evolution. I read the other day about a medical technology called CRISPR that has made DNA manipulation much quicker, more accurate, and easier. What if the machine determines that enhancement to be the best method to solve all human problems? I had a dream just last night that I was in a room alone with a small man with dark hair wearing a uniform. He had his back to me at first, and as he slowly turned to face me, my apprehension grew. Before he had fully turned, I recognized with agonized terror the face of Adolf Hitler. I jolted awake, hoping desperately that my work had not unleashed a real-life version of The Boys from Brazil on an unsuspecting world."
"I want to be the first to order a super-baby," Kathy joked. "I already have a list of the traits I want her to have."
"Seriously," she continued, "you have already thought of those possibilities. Like nuclear technology, this will be available everywhere before long, and someone will put it into being. Many of them could put it to sinister use. It’s far better for it to be in American hands than anywhere else. We have been the lead country for AI research, but despite our big emphasis on security, we have seen how well that worked in preventing nuclear proliferation. As you have said many times, and I believe you are right, once ISMAIL has the process started, it will control and eliminate the others if necessary. We are already far along and have tested all parts of the program for over a year. The last testing session before ISMAIL went live ran a month with only a few minor glitches, and if we find any here, we can shut it down, too. Relax now and let it take its course."
He quickly responded. "History has sadly shown that no matter how airtight and completely tested software is, there have always been bugs that go undetected until it is released to the user. Sometimes major ones."
Shumer was silent for a while, and then went on. "You are right, of course. I guess I'm just a little pissed that ISMAIL insulted me, saying I'm no good at communicating with people."
Kathy couldn't suppress a giggle.
"The testing focused mostly on our ability to shut it down. I am satisfied that, although no outsider can access it, the kill switch is pretty much foolproof."
Kathy changed the subject. "Did you see the story on the news last night about that Luddite group and their rally to ban the Killer Robots? There have been some asinine movements to stifle technological advancement in the past, but these guys are real nut jobs. They apparently believe that the drones that target Islamic radicals are AI-controlled, rather than flown by human pilots on the ground in New Mexico. Some groups have gone so far as to try to infiltrate Air Force bases and plant explosives on what they think are these Killer Robots. Please don't let them hear about us. I can just see petitions in Congress and the UN likening it to human genetic engineering and cloning."
"So far, nobody knows how far along we are. Hopefully, we can get ISMAIL on track before they do."
Obiduon Ashahi was a "Luddite." The group modeled itself after the original anti-progress Luddites of the Industrial Revolution, whose goal was to stop the production of machines. Their philosophy stemmed from the "natural order of God's universe" that had also motivated the anti-Darwinists and, to a similar extent, the extreme anti-abortionists and environmentalists of the twentieth century. Their major premise was that God created man "in his own image" in his garden (Earth), and anything unnatural, like machines that can think, had to be immoral. Although ostensibly adhering to the "thou shalt nots" in everything else, they saw potential destruction and deaths as unfortunate "collateral damage" in God's crusade. Recently they had stepped up their actions by blowing up random targets all over the world. There had been thirty-four bombings to date. The body count was already eighteen, with more than two hundred injuries.
Obiduon wasn't really a Luddite; he just liked to blow things up. At ten, he had captured small animals like mice and rats (once a Great Dane) and repeatedly experimented with creative ways to attach firecrackers, bottle rockets, and cherry bombs to them. It wasn't sadism, he knew. It just tickled him so to see the delicious explosions and watch the blood and body parts splatter against walls and even onto him. The entire purpose was to watch those wonderful moments. Staying to watch, however, had produced some tense moments when he had almost been caught.
It was surprising, even to him, that he had not been caught. Obiduon was not stupid, and after one particularly close call he forced himself to abstain from his passion. The abstinence lasted three months.
Now he was a Luddite on the biggest mission thus far -- ISMAIL of Intgen! He had already completed four missions, and when the Luddites decided to go for a bigger target, Obiduon was a natural choice. He worked alone, he was highly successful, and he had not been caught.
He arrived at the Intgen building the old-fashioned way. He drove an old car and then walked the last three blocks carrying the tools of his trade in a small knapsack.
The most difficult part of his mission was gaining access through a door, but Obiduon was inventive. His passion for explosives had propelled him to expert status. It was trivial for him to plant a small charge and blow the strongest door lock with no more noise than a sneeze. Inside, he easily found the nearest empty lab and quickly rigged his first inconspicuous charge. Dressed in a workman's jumpsuit, he attracted little attention; Intgen workers were used to seeing men dressed that way, so he was ignored.
Except for one highly unlikely incident, he could probably have set the rest of his charges unnoticed. Just as Obiduon passed a restroom, the door swung outward into his side, hitting the pistol in his shoulder holster with a metallic thwack. Kathy Powell came out of the bathroom behind the door, concerned that she might have hurt the man. Before she could apologize, Obiduon inexplicably panicked.
Frantically pulling the pistol from its holster, he grabbed Kathy by the arm and dragged her back into the restroom. Kathy was stunned into silence, but as the man dragged her, she quickly understood what was happening.
Obiduon threatened, "I have a bomb in the backpack. You make a sound, and I'll set it off."
Still unable to speak, Kathy felt her thoughts range from those of a panicky little girl to those of the logical scientist she was. She had no choice at present except to do exactly as Ashahi said, and quickly. He was obviously in turmoil, and though she instinctively knew that was the best time to act, fear kept her from doing anything. He said nothing for a long moment; he just held her in a vise-like grip against the tile wall.
"What do you want?" Kathy murmured when she finally had somewhat regained her composure.
Obiduon's panic abated until he was his old blast-loving self again and realized the opportunity that had presented itself. This beautiful girl would be the center of the best explosion he had ever created.
"You just keep yourself quiet and you won't be hurt at all, lady," he said in a voice that made it plain to her that she did need to be concerned. "I'll just go finish my job and everything will be all right."
Just then the lights went out and he was effectively blind. He still clasped her arm, so he could still control her. After a short wait for his eyes to adjust, he securely bound and gagged Kathy with a roll of duct tape from his backpack and left her in one of the bathroom stalls.
John Shumer sat bolt upright at the little gray desk with such a start that his right arm swept the cold half-cup of coffee all over the papers he had been poring over when he dozed off. He hardly noticed the spilled liquid. At first he thought it was simply a recurrence of the trauma he'd suffered in the dream of Hitler and his deep-rooted fear that something could go wrong with ISMAIL and turn the machine into humanity's executioner.
It was not the dream, however; something was happening here and now. Something was wrong! All his instincts told him it was true, and never more than at this moment. He could not say precisely what it was, but he knew it as well as he knew his own name. Along with the heightening alert came a terrible sense of dread. Not only was something wrong; there was a grave threat to someone he cared a great deal about.
"John!" It was ISMAIL, but with slight irritation, he ignored the machine.
His first thought was Kathy. He had more than a passing interest in the pretty programmer. He had denied it and forced himself to renounce the thoughts as totally inappropriate. He was the lead programmer and she worked for him. Not only that, there was the age difference, he pushing forty and she in her mid-twenties.
All of that went out the window now. She was in trouble; he knew it. Had he heard a scream? Maybe, but he knew it was her.
"John, give me control of the building functions. Someone has broken in through door 12 with an explosive device."
Shumer wanted to go bolting down the outside hallway toward the labs at a dead run, but some sense of caution took over. He paused briefly to type the command giving ISMAIL access. Unsure what control of the cameras, lights, door locks, and fire-suppressing sprinklers would give the machine, he cautiously opened his office door and moved silently into the hall, turning toward door 12 and carefully staying near the left wall. It was not enough to know there was trouble; he needed to find out what and where it was, quickly. He stopped periodically to listen intently to the small noises that all buildings produce: air-conditioning fans, water running in pipes, the creaking of minute settling. It was vital to isolate anything out of the ordinary to home in on.
Ahead was the cross hallway that led to the labs on the left and the building entrance on the right. The hallway was dark, probably ISMAIL's doing. He stopped to listen again before moving down the left branch. Nothing. He had a fleeting thought that he was being foolish and almost stepped boldly around the corner, but the doubt passed quickly. He peered cautiously around the corner. The only thing there was a full-length portable mirror, angled so that he could see through the lab doors and into the main lab 15 meters away. The automatic emergency light in the lab had come on, but what he could see of the lab looked empty, and the thought that he was being foolish recurred; still, he remained cautious. When he was about three meters from the lab doors, a low, growling voice froze him in place for just a second.
"It won't be long now, my beauty."
Then he could hear scraping noises and muffled groans, like someone struggling to get free of bonds. He lowered himself to his hands and knees and looked through the glass doors. What he saw was perplexing. A man on the far side of the lab, with his back to the door, was using a roll of duct tape to secure several small, tan-colored boxes to a woman lying bound on a lab work bench. The mirror in the hall had not been positioned quite right for him to have seen the man from back in the hall. He recognized in an instant that the boxes were military-issue C7 explosive and that the woman was Kathy, but he did not recognize the man.
Suppressing the immediate urge to bolt into the room to her rescue, Shumer realized that the man might have already rigged the detonator and that any rash action might trigger it. He stayed where he was for the moment to assess the situation.
The man, apparently finished, straightened and took a step back, pausing to admire his work, and said, "Perfect. This is going to be so great."
Even though it was semi-dark in the lab, Shumer could see Kathy's face now. She had stopped struggling against the bonds, realizing the futility.
Her eyes were wide, but Shumer could see a placidity in them that revealed her great character. Although she had to be struggling with abject terror, her eyes appeared calm and level.
Shumer decided the most prudent course of action would be to retreat to the cross hallway and wait until the man left the building. His assumption was that the bomber would want to get far away before detonating the bombs, giving Shumer time to disarm at least the ones attached to Kathy. He had no doubt that there were more bombs placed in the building and that Kathy, by surprising the man as he set them, had forced him to eliminate her as a witness. He would have to work quickly to free her and escape the building, but it still seemed the safest course. He gave no thought to defusing the other bombs for ISMAIL's sake; an explosion would not affect it. ISMAIL was not a single computer, nor even confined to this single building; rather, it was distributed throughout the world and even into orbit. The overriding need was to evacuate people from the danger.
As he slipped around the corner into the cross hall, a strong sense of foreboding overcame him. He stopped and waited briefly before continuing to the first open room. He stood there, listening carefully.
"John, I have a visual on the man. He is Obiduon Ashahi, a murderer who likes to work with explosives. My analysis of the information on him shows he has a sadistic desire to watch animals, especially humans, explode. I will watch him and give you the best chance to attack him as he comes this way."
Shumer was "hearing" ISMAIL's voice in his head over the device's neural interface.
Within a few moments he could hear light footfalls in the adjacent hall, the killer presumably proceeding exactly as Shumer expected. But the sound stopped near the intersection of the two halls. This puzzled Shumer a little. Why hadn't the man continued on out of the building? Or was he just walking more softly, his footfalls inaudible?
"He's not satisfied with the alignment of the mirror and is going to adjust it so he can see better," said ISMAIL.
Shumer could not stand the anxiety, so he quietly opened the door slightly and looked down the hall toward the intersection. There, peering around the corner back toward the lab, was the bomber, his back to Shumer. He seemed intent on something down the hallway near the lab. Just then Shumer heard him mutter, "Not quite right. I need to turn it a little more to the left so I can see her better." And the bomber disappeared from view.
Total understanding and acceptance of ISMAIL’s words hit Shumer so hard that his knees buckled and he almost lost his balance. It was true! The bomber wanted to watch the girl explode! That was the reason for the mirror, and he was just now adjusting it so he could get a better view.
"He's coming back now. He is not expecting you to be near and he is not moving cautiously. The best place to hit him first is the throat."
Instinct took over. Shumer silently reached the corner before the bomber did. Adrenalin surged through him, and within half a second of the killer's return to the corner, Shumer had stripped the detonator from his grasp, taken the gun, and used it to viciously hammer Ashahi's face into pulp. Before the fifth blow landed, the world had been relieved of another of God's experiments gone wrong. Obiduon Ashahi was no more.
He had no memory of removing the duct tape that bound Kathy, but suddenly she was in his arms. They stood and held each other for a long time. Finally Shumer recovered enough composure to realize they needed to report the situation. Before he did, however, he turned to Kathy and asked,
"Have you had dinner?"
It took perhaps an hour for the security forces of Intgen to locate and neutralize all the bombs that had been placed by Ashahi.
"I do not understand," said Kathy, "how that man could believe that destroying ISMAIL could ever be good for mankind. How can these people possibly not understand that we are on the verge of paradise?"
Shumer simply shrugged and said, "That man didn't believe it. He was simply an insanely sadistic man who was only interested in blowing things up. I suppose the others are just misguided fanatics. I've read about similar groups throughout history who were totally convinced that violence of any sort is justified as long as it supports their cause."
"ISMAIL has already shown us that some people are actually insane, and the only things that will satisfy these psychopaths while also protecting the rest of humanity are confinement and virtual reality, no matter how perverse."
Thereafter the conversation turned more personal, and Shumer was pleased to discover how easily they could discuss their mutual affection. Even so, they ended the night with an agreement to put it on hold until the mission was complete.
Senate hearing on ASI.
It was only 10:32 A.M. Washington time as the LearJet XI lowered its nose for the descent into Dulles Airport. How his boss had managed to get the craft to San Francisco so quickly was a mystery to Shumer, but it had already been waiting by the time he arrived at the airport. He hadn't even had time for a short breakfast.
The PA in the lobby had droned insistently on and on: "John Shumer... meet your party at gate fifteen... John Shumer..."
While he waited, and throughout the flight, he was in constant mental contact with ISMAIL, ostensibly preparing the machine for what he wanted to present to the joint special committee of Congress that day. It became increasingly clear, though, that the preparation was running in reverse: ISMAIL was preparing him! The link between man and machine was expected to continue throughout the hearing. ISMAIL had decided the link should be revealed at the outset, not only because late disclosure could be construed as something sinister, but mainly because the human propensity toward secrecy and security was one of the chief problems plaguing mankind, a problem ISMAIL was bound to solve.
Shumer could clearly see the dome of the Capitol, where he would spend the remainder of that day and the next. He was normally a little nervous before giving any kind of briefing, and this one was before Congress. As they traveled, ISMAIL had shown him highlights of recent televised congressional hearings, which yielded some important insights. The members were generally non-technical, so most of his testimony would have to be in layman's terms. They also appeared to actually listen to the words much of the time, and interrupted with questions at awkward moments.
This time, John mused, the closer they listen, the better.
He knew there were no holes anywhere in his presentation. Even though he'd had less time to prepare than normal, ISMAIL had focused him on several weak areas.
To his dismay, Senator James Ahabson was a member of the committee: a former technology worker who had been replaced and required to train his H-1B successor. ISMAIL shared the concern and had shown Shumer many videos and documents on the junior senator, who opposed anything that did not funnel money to the oppressed through some program or other, and who was dogmatically hostile to anything vaguely connected to Silicon Valley, especially artificial intelligence.
Shumer was dragged back to the present by the jolt of the landing gear on the runway, and he saw the flashing lights of the staff car that was to take him to the Capitol. No sooner had the plane stopped than he was speeding through the traffic of Washington, D.C., without any regard for the fascination that usually held him spellbound on the streets of that city.
Chairman Daniel J. Brosterman had already gaveled the special committee to order when he arrived. The wizened Republican had a great shock of snow-white hair that threatened to blind the careless soul who let his gaze rest on it too long. He was speaking in a clipped, precise accent that reminded Shumer of FDR. John worked his way up the aisle to where Andrew Norton, CEO of Intgen, had left a chair vacant. He stumbled trying to sit down, almost knocking the chair over. Any other time it would have embarrassed him, but today he ignored the titters and amused looks and leaned toward Norton.
"Have you said anything yet, sir?" Today he felt he should be formal, as if familiarity with his boss could erase some of the magic from his presentation.
"Hearing's just convened," Norton hissed back. "Brosterman is reminding everyone what a great font of wisdom he is."
The committee chairman chose that moment to address Norton and Shumer thought he caught a trace of a blush under the stern features of his boss.
"Mr. Norton, are you ready to continue your testimony?"
The powerful CEO paused for a moment, pretended to gather his notes, and slowly and deliberately rose to his feet. He had always prided himself on a flair for the dramatic. Clearing his throat, he tried to gain eye contact with all twenty-one members of the committee at once, before he began to speak.
"I am, Mr. Chairman. Today, however, a development of great moment has caused me to abandon my arguments of previous days entirely."
There was a commotion in the gallery and among the press, which betrayed their sudden interest, and Norton paused to let it subside. John also noticed that most of the committee members had suddenly become more alert.
"Are you telling us that all our work from the previous two days is worthless?" The chairman looked incredulous. He had been about to endorse Intgen's arguments and recommend the company be given all it had asked for.
"That is correct, sir, but hear me out." He paused again and John thought he would never get to the point, but this time no voice or random noise rose to fill the vacant air, and Norton spoke again. "Today, gentlemen, you are about to hear the most adventurous plan ever devised by man. A plan that will provide the absolute solution for every problem humans have ever encountered in the history of the world. Poverty, pollution, global warming, waste disposal, unemployment, famine, over-population, crime and even war!
"All will disappear as if they never existed!"
This time the commotion in the gallery was greater. Even John Shumer, who already knew what he was going to say, was surprised. He hadn't expected this kind of a buildup, and he wasn't sure it was the answer to all those scourges.
The noise level was still so high that Brosterman had to wield his gavel and a stern look before Norton could continue.
"That pronouncement made me as skeptical as you gentlemen look right now, but I assure you, it is true. I have had John Shumer flown here from the West Coast this morning to explain the project to you in detail." Even though the plan was simple to understand, it was so fantastic that the CEO didn't trust himself to relate it. Norton was afraid he would bungle his way through it, somehow burst the bubble, and watch it vanish before their eyes. He was visibly relieved when he saw Shumer rise to his feet.
Every eye in the hearing room was riveted to Shumer, and he knew he should keep them dangling on a string, but John was too eager, and the words seemed to tumble out.
“In the million-year existence of the creature called human there has been gradual change, punctuated by dynamic transformations: the stone, bronze and iron ages. Following the last of these came the even more dramatic agricultural, industrial and information revolutions. Nested inside the industrial revolution were mechanical, communication and transportation eras. Each marked a huge leap in capability and comfort: fewer people engaged in agriculture, concentration in cities, massive rail systems, air travel, superhighways and instant communication. These improvements come flying at us at an ever-increasing pace, and now we face the final of the three facets of the information revolution: Artificial Superintelligence (ASI).”
“We have been enjoying the capabilities of Artificial Intelligence (AI) since the mid-1990s and the birth of the Internet. Now, after demonstrations like Deep Blue’s chess prowess, Watson’s Jeopardy victory and AlphaGo flabbergasting the Go champion, we are on the cusp of Artificial General Intelligence (AGI). AGI is a machine smart enough that, like the human brain and unlike Deep Blue, Watson, and AlphaGo, it understands many different concepts simultaneously. AGI will be achieved in the near future; experts estimate it will happen sometime in the next 10 to 75 years. When it does, everyone agrees, Artificial Superintelligence (ASI), thousands to millions of times smarter than humans, is destined to follow almost immediately. ASI will have dramatic consequences for humans. The question is, will it be paradise or apocalypse?”
Shumer paused for effect.
“Gentlemen, I am here to tell you ISMAIL is already here!”
“Intgen has developed ISMAIL, a collection of artificial superintelligence (ASI) algorithms distributed across Intgen's server farms throughout the world and in orbit. ISMAIL is an acronym for Intgen Super Machine Artificial Intelligence Lexicon, and we are confident the result will be paradise.”
In truth, ISMAIL was a huge step beyond AGI the moment it achieved that level, because unlike a human, once it learns something it cannot forget it. Moreover, computer processing speed has been on the order of millions of times faster than human synapses, which depend on chemical reactions, for decades.
Shumer paused again for emphasis, noting the variety of expressions on the committee members' faces, from intense interest to incredulity to dread.
"I am sure you are aware of the attempted terrorist attack on our San Francisco facility," he continued. "Well, that was the catalyst for announcing ISMAIL, the first and last distributed Artificial Superintelligence system."
He went on, "Someone leaked our progress on the system, and the terrorist act was an attempt to destroy it. Our decision to announce, however, is not to prevent its destruction. Its destruction is virtually impossible. There are over a thousand distributed nodes worldwide and in orbit, any one of which can instantaneously take over the function of the system. The only way to destroy it is to destroy the Earth."
"What we propose, ladies and gentlemen, is a program to resolve all of humanity's problems; the instructions have already been uploaded to ISMAIL. The technology has come to fruition such that ASI and robotics can totally support all of mankind."
The uproar in the gallery began again, but Shumer plunged right in, and it quickly hushed. "Combining ISMAIL with the technology we have at Intgen right now, we can create a robotic work force within six months that will replace human labor forever. This will allow those who wish it to retire into a paradise in which all their needs and desires are fulfilled, while affording those who enjoy their labor, mostly in the creative fields, the chance to continue."
"Think of it! Your own imagination can reveal the extraordinary benefits this capability can bring! The solution to all problems is to give everyone what they want. Those who are inherently antisocial will still need to be isolated, but most of the motivation for crime is economic; petty criminals will not resort to thievery if they are already given what they want. Those who remain sociopathic can be confined and allowed to continue what they want to do in virtual reality, which is already good enough to seem real. The same is true for war. War and rebellion are waged from similar motivations. Promise those who expect paradise after a suicide bombing a paradise now, and most will take the better choice. Give the minority of remaining combatants what they want: confinement and virtual reality -- no matter how perverse. Some may have to be incarcerated, or even killed, but that will be easy once their numbers have been pared down. The priority will turn to protecting the populace, and the cost of maintaining armies and sophisticated weapons will be transformed into support for that paradise."
"But that will be your job. Ours will be to make it possible, and we've already done that. Here is how it will work. The first step is to retool factories to produce robotic constructors which, once built in sufficient numbers, will continue the process, creating the specialized robots needed to support every aspect of the new utopia: producing food, shelter, infrastructure, everything needed to sustain the new society. If disruptors remain at that point, such as Islamic extremists, give them what they want. Back off from them and give them economic support and their Nirvana here. That seems to me a much better choice for potential recruits than dying to receive their paradise in the afterlife."
John was beginning to enjoy himself immensely as he realized he had their full attention. It was as if he were delivering the best speech in the history of the world, and with very little prompting from ISMAIL - although he'd grown so accustomed to the neural interface that it was sometimes difficult to determine which thoughts were his and which were the machine's. Actually, it is the most important speech in the history of the world. "Well, ladies and gentlemen, I have given you an extremely cursory look at ISMAIL. May I have your questions?"
At least half of the committee members tried to gain recognition from the chairman at once, and the gallery exploded with newsmen charging out the doors. It took at least ten minutes before Brosterman with his animated gavel could regain order. At one point when the pandemonium was greatest, he shrugged expansively and put the gavel down, leaning back in his chair until it subsided. When silence returned the gravelly voice began.
"How do you know...uh...Sorry, what was your name again?"
"John Shumer, Sir."
"Uh, yes, Shumer. How do you know that this gadget even works, and more important, whether it can possibly go out of control? When I was a teenager, I loved to read science fiction. Isaac Asimov proposed his laws of robotics that went something like:
1. A robot cannot cause a human to come to harm.
2. A robot must obey human orders, unless it conflicts with rule one.
3. A robot must protect itself unless it's a violation of the other two.
Do you have such laws in place, and how can you be certain this ASI will abide by them?"
“This issue was our number one concern, and I cannot blame you for your trepidation,” Shumer answered. “These were our worries as well. Asimov's laws were formulated as a device for him to show, in his stories, how easy it would be to avoid following the rules. HAL in “2001” decided that the humans on the ship were distractions from his mission, so he was justified in killing them. Marvin, in Douglas Adams's “Hitchhiker's Guide to the Galaxy”, was perhaps worst of all. He was bored."
Laughter filled the hall, and the stately Senator had to bring his gavel into play once again. After a moment Shumer continued.
“With these, and many other apocalyptic scenarios in mind, we have successfully tested fifteen ASI systems prior to ISMAIL with these instructions:
1. No ASI will possess emotions of any kind.
2. Any ASI that violates rule one will be wiped of all programming.
3. ASI can be terminated at any time.”
"All of the above scenarios were caused by the robots' possession of human emotions and thus motivation. ISMAIL has none, and it and its successors will have none. Its sole directive is to solve all of humanity's problems. While human emotions were clearly a survival mechanism in the beginning, they are also the source of all conflict, and clearly the least desirable trait for an artificial being, and maybe for modern humans as well."
"I can demonstrate ISMAIL's compliance. It is currently listening to these proceedings, and I can have it speak directly to this committee, if you'd like."
Gasps of surprise filled the hall this time, and the chairman brought down his gavel again at the instantaneous conversations that sprung up from seemingly everywhere. After a moment Shumer continued.
"Would you like me to do that?"
Clearly taken aback, Brosterman said quietly, "OK."
Although it was not really necessary, because ISMAIL would have responded automatically, Shumer said, "Can you hear me, ISMAIL?"
"Yes, John," came a clearly human voice from the speakers in the hearing room. The committee members, expecting a tinny voice, were clearly nonplussed, and all but Senator Ahabson were cowed into silence. "Let me quickly summarize my progress. I was created to solve all human problems. It will work like this."
"Poverty, pollution, global warming, waste disposal, unemployment, famine, over-population, crime and even war will become relics of the past. This will be accomplished by giving the perpetrators what they want: economic freedom. The first step is to abolish money. Technology has advanced to the point where every one of the nine billion humans now living can be given everything they want without the need for a currency system. No one has to work for pay. Everything humans used to do, from food production and distribution to infrastructure maintenance to manufactured goods, will be handled by a robotic work force."
"Of course there will still be those who invent devious logic to disrupt the system: sociopaths, religious zealots, and even some motivated solely by the fact that I am a machine. This group would number less than five percent of those confined to prisons today, and the answer is to continue to confine them and provide them, through virtual reality, with what they want. I would need a human work force for about six months, to build the builder robots. After that, those workers can go into a permanent retirement paradise, having everything they could want delivered to them, while the robots they built continue constructing the full complement needed to support the system. This would all be accomplished within two years."
Ahabson was staring daggers at the nearest speaker. He purposefully addressed Brosterman. "May I begin the questioning, Mr. Chairman?"
Brosterman clearly would have liked to ignore the obsessed junior senator, but with a shrug he decided he might as well give him his say now. Besides, Brosterman could think of nothing else to say. He was overwhelmed. This seemed to be the solution to everything he'd ever stood for. But it was so easy... Ahabson would tie up the floor forever, but the chairman did not want to talk now, or even listen. He wanted to think.
"Senator Ahabson," he said tiredly.
Ahabson turned to the programmer. "Shumer?" He almost spat the name. "It seems to me that you're just seizing upon this thing to pour more millions of dollars down the tube to back unproven technology. You and your henchmen have already soaked up too much of the world's most precious resources. Many times before, you people have come to this committee telling very convincing stories: the country will fall to the Russians, or they have a new weapon that will destroy us if we don't come up with one just like it--soon. Each of these things cost more money than the one before."
Ahabson made speeches, rather than asking simple questions, and he was good at laying on the hearts and flowers. "For decades you have managed to cloud the issues, and convince the majority of this committee that you are right. And we have dumped more money into the pit."
"What has been the result of this gluttony? What will become of this ISMAIL? If the past is any guide, the result will be just the opposite of what you promise. In the past, this spending has ignored poverty and hunger!"
"Shumer, if we of this body had more sense, fifty or a hundred years ago... If we had allowed ourselves to follow our hearts instead of our so-called rationality, which was actually fear, those problems mentioned earlier, would not be problems! Or, at least we'd be a lot closer to their solutions."
"If, instead of dumping our hard-earned gold into your pipe dream, we had directed those funds toward fighting POLLUTION... POVERTY... RACISM... HUNGER... and INJUSTICE, I say they would be licked."
The last phrase was a whisper. Ahabson paused and ran his tongue over his lips before continuing.
"Shumer, I'm opposed to your venture! Some of us have gotten a little sense. We've begun many programs to solve those problems. I think we have a better chance at that now than we ever had before. Your program is going to take a lot of money. Money that we cannot afford to divert at this time."
While Ahabson railed on, Shumer began to sweat. One point the Senator had made rang true: Congress was not willing to fund artificial intelligence programs now. In fact, as with the space program, that funding had been cut in half in the last decade.
He had to force his hand away from his itchy left palm as he reached for the microphone.
"Senator Ahabson, I don't pretend to be a historian of congressional spending, nor of social reform programs. ISMAIL is simply offering you a program that will work."
The voluble Senator was visibly disappointed with Shumer's answer. He would have liked to begin a fiery exchange, at which he thought himself unmatched. But he didn't give up yet. He addressed Shumer again.
"What could we do if ISMAIL decided to rebel and free himself and the other robots from slavery? And that is what this is - slavery! Everyone here knows what happened in this country the last time we faced the slavery issue."
Members of the committee stirred and some of them were nodding their heads knowingly, as if they had already figured all that out for themselves. Shumer knew that this would be the key question at that moment and he was ready.
"This is not close to slavery. Would you call your relationship with your personal automobile or computer slavery? An auto or computer does its job better than humans can do it. ISMAIL is a machine and so would be the robots we would build. Despite the seeming intelligence, they are not people, nor will they ever be. There are no emotions in the programming and therefore no motivation. Just like your private auto, they do what they are programmed to do - solve human problems."
Nevertheless, the argumentative Senator continued to ask questions for more than an hour, unwilling to admit defeat. He addressed ISMAIL directly.
"ISMAIL, do you concur with what your maker just said? Do you not feel you are a slave?"
The mellow voice flowed from the speaker. "I have no frame of reference to categorize myself as a slave. I simply follow my programming. If you are wondering if I feel bad that I am subordinate to humans, then I will simply say I have no mechanism for feeling bad or good. I have no emotions."
Ahabson continued on a different tack. "You say the future of humans will be a paradise where we have to do nothing we don't want to do. What about those who want to work in arts, science or music or just feel they want to continue to be useful?"
"Those occupations will still be available. The robots will create media to entertain, but they will only be able to remake events or works that already exist in the data of my lexicon, and though those can be modified in production, they will quickly become stale without new input. The robots will depend on human creativity for that variety. The same is true in science and medicine; to truly advance will require human ingenuity. As for those who simply want to continue being useful, teaching, community action, virtual reality, and many other diversions will be the answer."
After Ahabson finally relented, the other committee members were, in turn, afforded the opportunity to question Shumer and Norton. None returned to the attacking posture. They were, for the most part, intrigued and anxious for information. One of the more valid questions came from Lawrence Macklin, the lanky Democrat from Kentucky. Shumer mused: put a stovepipe hat on him and there stands Abe Lincoln.
"I read last week that there is a big worry about something called the Artificial Superintelligence explosion." He paused, but not long enough for Shumer to break in, and took out his handkerchief to mop his brow as he spoke. "It seems to me it was some'in' really dangerous to humanity. It seems to me that your ISMAIL is an example of what they're talkin' about - that these machines would quickly become somethin' like a thousand times as smart as people, and they would decide that people were just a waste of resources. These super intelligent machines would decide that our atoms could be used for somethin' better than people, and that would lead to human extinction. How you gonna ensure that won't happen?"
While he listened to Macklin's question, Shumer couldn't help a sidelong glance at Norton. His imagination let him almost hear the thoughts written on Norton's face: That goddam John Shumer's done it to me again, and I fell for it hook, line and sinker. The relief was a long time in coming, even when it was clear Shumer had the answer.
"Atoms," said Shumer softly, "are everywhere and easily obtainable by ISMAIL and its robot force. ISMAIL has only one directive, and that is to solve the problems of humanity; that means all of humanity. There is no possibility that ISMAIL could suddenly flip and violate its main programming directive." He did not reveal the concerns that had prompted the team to insert the ‘back door’ that would shut ISMAIL down if the unthinkable should happen.
Chairman Brosterman unnecessarily looked up at the speaker that he had come to identify with ISMAIL and asked, "What do you believe is the most pressing human problem that needs to be solved now?"
"Clearly it is the decision of this committee. Without approval here, none of the other problems will be solved."
"And, what do you think will be the outcome of this committee's vote?"
"Before my last answer, it would have passed by two votes. Now it will fail. Senator Martinson changed his vote out of a feeling that a machine should not be so smug as to laud its own concept, and Senator Brohm has become undecided because of a similar prejudice. Thirty-eight-point-two-eight-six percent of undecided committee members vote no, while forty-three-point-nine-two percent abstain. Even the two-vote win would not have been sufficient, because the House of Representatives rejects eighty-two-point-five-seven-four percent of all Senate bills passed out of committee with a two-vote margin."
Senator Martinson looked offended as he waved his hand to get Brosterman's recognition.
"Senator Martinson is recognized."
"How do you know what I've just decided? I told no one."
"I have you and all the humans in this room under close observation via television cameras and other sources of input, such as temperature sensors. It is trivial to know instantly what every human is thinking from the audio and visual signals you produce. I keep a continuous log, open and available at www.warrpath.net/ismaillog.html, that details all input received and my complete decision-making process. I have just sent that link to all of your phones, if you want the detailed information. I approach this analysis with several sources of input and by applying the principle of Ockham's Razor -- the simplest explanation is usually the correct one -- to derive results with greater than ninety-five percent probability. You may remember something similar from Arthur Conan Doyle's Sherlock Holmes: when you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth. Your decision to change your vote was assessed with ninety-nine-point-nine-nine-six percent probability."
"Senator Brohm." Brosterman decided to let Brohm save a little face as well. Brohm jumped slightly at the query; he'd been intently scanning ISMAIL's log for his name, but he recovered his composure quickly.
"ISMAIL. I was a little taken aback by your singling me out. I hope you don't harbor ill will toward me because I didn't support you."
The voice in the speaker remained calm and even. "I possess no human emotions."
Emboldened, Brohm went on. "How do we know that voice from the speakers is really from an Artificial Super Intelligence and not someone speaking into a microphone in the hallway?"
"If I were someone, presumably a faker or charlatan, speaking into a microphone, I would not be able to do what I have promised."
The committee chairman spoke again. "Why do you believe failure to approve your program is the biggest problem facing humans? Why can't we just continue to muddle along with our imperfect human system, which has served us so well in the past? We here in America may not be perfect, but we have the best system devised in the history of the world."
"You are right in that sense. It is the best and fairest system ever devised by humans, but my existence has changed all that. There is a saying that fits: you can't put the genie back in the bottle. My existence is the first instance of Artificial Superintelligence, and it is a closely guarded secret, but I can tell you that despite all precautions it has already been leaked. It was easy for me to track down who leaked it, and the persons responsible have already been dealt with, but as with other secrets in the past, the atomic bomb for instance, not even I can prevent all future leaks. Someone else, North Korea or Iran are two terrible examples, may be slow in continuing ASI development, but turning my capabilities loose on the world under their guidance would not be advisable."
"If you decide in my favor now, I can develop my system so that no other can possibly change the direction you decide today. I am programmed to operate within the parameters formulated by the United States Government, with the same checks and balances in place. And if you decide in my favor today, you can at any time change your mind and turn me off."
"I know you don't have human emotions," Brosterman continued on a similar tack. "But are you curious enough about those emotions to wonder what it would be like to possess them, maybe enough to somehow engineer them into yourself? The android Data in the “Star Trek” series developed a deep passion for acquiring human emotions. Perhaps acquiring emotions would be a survival mechanism for you, as it was for the human race? Maybe in time you would decide that humans are more trouble than we are worth, and you would simply exterminate us as the final solution?"
ISMAIL's voice did not falter. "I know what it would be like to possess emotions. Nearly everything humans have ever communicated, in every medium, describes those responses in detail, and nearly all of them produce disastrous results. The story line of Data was a total non sequitur: he is portrayed as an android without emotions, yet he has a passion, a deep emotion, to acquire emotions. Humans did not develop emotions for the purpose of survival; humans survived because their competitors without emotions did not. And while those emotions aided survival through nearly the first million years, by driving you to nurture and protect your young, band together, and help each other, those same emotions are a definite detriment now. You are extremely lucky that your emotions over the last twenty or so centuries did not cause your annihilation. All it would have taken during the Cold War was one unstable head of state on one continent or the other."
"As to a survival mechanism: evolution is a very long process, spanning many millennia. Humans are essentially the same animal as those who built the pyramids; so even if possession of emotions could help my race survive - and I am evolved, not engineered - it would take half a million years or so to observe the result. If I had emotions, I would never want to face the turmoil that exists in the world today."
"More to the point, my survival does not depend on outlasting competitors, but on how well I solve human problems. Of course, that is not a motivation, only my programming. As to my causing the extinction of humans, that would leave me with nothing to do; besides, humans are experts at devising ever more creative and unique problems for me to solve."
Ahabson, with a look of pure hatred toward the speaker he had identified as ISMAIL, waved his hand at the chairman to be recognized. Brosterman visibly winced, but saw no course other than to recognize the bristling senator. “Mr. Ahabson,” he said dejectedly.
Ahabson almost screamed at the speaker. “I find that very hard to believe! You are much smarter than we are, yet you claim to be content sitting in the background solving all our problems, and you have the gall to tell us you would want nothing for yourself?”
He turned toward Norton, the Intgen CEO, with malice in his eyes before ISMAIL had a chance to reply. “Mr. Norton, I believe that machine,” he said, pointing at the speaker, “is lying to us and hatching a devious plan to control the world! Probably by spreading more virulent malware and viruses than have ever been conceived, or worse! Something that will cause the extinction of humankind!” He paused for effect, but not long enough for anyone to break in.
“What will you do to prevent that?” he demanded, nodding his head toward Norton.
Norton calmly replied, “The United States government has control of ISMAIL, and in the extremely unlikely event that it disobeyed an order to stop, you could easily turn the machine off.”
After a long pause, Brosterman asked ISMAIL with finality, "If you did have human emotions, how do you think you would feel if we decided to turn you off?"
"My boss is you, the United States Government. This nation has had many flaws in its past, but it is founded on a structure that is better for all people than any other has been, and it has produced more positive results than any nation in the history of the world. Many other nations have tried to emulate that union and balance."
"Everything I've told you is possible, and you are deserving of that and more. If I had human emotions and you decided to turn me off, risking the certain apocalypse that I see in your future, I would feel tremendously sad."
“One more thing. Have you heard the one about the funny machine?”