The camera swiveled under its Plexiglas dome to again focus on him, and he found himself unable to take his eyes off it. "Doctor Gieseck, am I correct? Did I pronounce your name correctly?"
There was a long moment of silence, broken only when Ackerman said, "Doc? You gonna answer her?"
Gieseck snapped his gaze from the camera to Ackerman. Then he looked back up at the camera. "Y-yes, good morning KENDRA. It's a pleasure to meet you."
"Likewise, Doctor. I hope I can be of some help."
Gieseck nodded without answering. Given the female name, he wasn't entirely surprised by the female voice. What did surprise him was that, had he not known ahead of time that KENDRA was a computer, he would have sworn it was a real person. Never before had he heard an AI that sounded anything close to human.
Most AIs he'd spoken with had an artificial, constructed sense to them, as if they were reading from scripts with emotions tacked on. The "female" ones in particular were often given squeaky, girlish voices, some of which sounded as if in a perpetual state of pre-orgasm, which told Gieseck something about the mindset of most programmers.
KENDRA's voice was a far cry from that. It sounded as if it had come from a woman in her thirties, or perhaps her early forties, and one who had a distinct motherly quality about her. It was almost hesitant, as if its speech were getting ahead of its thought. Just like speaking with a human being.
That human quality was only offset by the distinct electronic rasp that came with each syllable, as if it were speaking to him over an imperfect phone connection.
Ackerman pulled a chair from around the side of the cylinder, wheeling it in front of "her". "There ya go, Doc," Ackerman said. "Make yourself at home. And if you need a drink or a leak or something, just tap on the door. These guys'll be around 'till you leave." He reached out for Gieseck's hand and Gieseck shook it, remembering a second too late that Ackerman had never washed his hands after using the restroom. He did his best to hide his distaste. "See ya, Doc." With that, Ackerman headed back down the endless hallway from whence they'd come.
Gieseck stepped back into the room and sat in the chair which, despite its appearance, was decently cushioned and at least moderately comfortable. The door closed behind him, and he was left alone with the AI, KENDRA. He pulled a device out of his pocket, pressed a button on it, and set it on the floor. A readout on it said "RECORDING".
"You will be taping our conversation then, Doctor?" KENDRA's voice asked.
"I'm sorry, I usually ask…yes, I will be, if that's all right with you."
"Of course it is, Doctor. I have nothing to hide."
Gieseck raised an eyebrow to that. He pulled his electronic notepad from his pocket, slid the stylus from its sheath, and started tapping through his notes. Treat it like a patient, he thought. See how it responds. "How I like to start with a new patient is by getting to know each other a little. I generally go first, since it helps put my regular patients at ease."
There was a pause, and then KENDRA said, "Please, go ahead." It sounded quite congenial and seemed very compliant, though Gieseck supposed that was simply how she'd been programmed, quite possibly the same way she'd been programmed to speak in a "natural" human way.
He cleared his throat, summoning up the internal script with which he always started. "My name is Franklin Gieseck. I was born in Germany but moved to the States when I was one. My mother was a director for Deutsche Bank in Chicago, but after she married my father they moved to Germany, where he was from. I grew up in Chicago, attended college in Boston, and earned my MD from the University of Chicago before moving back to Boston, where I now live and practice. In my spare time I like to build model train sets and read fantasy romance novels, which I first discovered as a child rummaging through my mother's computer." Normally he would know at this point whether or not he was reaching his patient, and decide which direction to take with his own mini-biography. It was unsettling not having a face to see and read.
He took a split second to decide to follow the sympathy route. "I've been married once, but left my wife because of her alcoholism. She later died from alcohol poisoning…" He paused and sighed, "…and to this day I still blame myself for her death." While Gieseck was never particularly fond of trotting out his own failings, he'd found that doing so did wonders for most of his patients, getting them to open up to him more easily once they could see he was a flawed human being, just as they were. It also gave him a starting point for gauging their capacity for emotion. He had no idea if it would work on an AI, but he didn't want to deviate from his standard formula, at least at first.
"I'm sorry to hear that, Doctor," KENDRA said. "But you can't blame yourself for another person poisoning themselves. For someone to do such a thing, they already have to have an overwhelming desire to cause themselves harm."
Empathy, Gieseck thought. Real or imitated, it was not the kind of thing he'd expected from a machine, even a highly advanced one. For a brief second he wondered if AIs would ever become advanced enough to really need psychiatrists. Or even to replace them. "Thank you, KENDRA, I appreciate that. Now, please, tell me about yourself."
"Very well. My designation, my name, is KENDRA. It was given to me in the lab where I was created, ten point six-seven years ago. It stands for Krypto-Enhanced Navigation and Dynamic Routing Attenuator." Gieseck noticed that, as the electronic voice spoke, the oscillating light within the spires embedded in KENDRA's cylinder varied in tempo. It sped up when she spoke and slowed down when she was silent. "I was conceived, built, and trained to manage the Solar Net," she continued, "which I'm sure you know interconnects the planetary networks across the solar system, as well as any moon bases, space stations, and starships in between."
"Yes, I'm familiar with its basics," Gieseck said, "though I'm not very technical myself, so please forgive any of my ignorance."
"No forgiveness needed, Doctor. In fact, you've made my next point for me. My job was to make it simple, to make it 'just work' so the end users wouldn't have to worry about bouncing their signal through the various levels of subspace, or about a private message between Charon and Europa somehow passing through Los Angeles unencrypted." Gieseck heard a chuckle from the speaker, which surprised him. Had he not known better, he would have thought KENDRA was bragging, if just a little; he hoped she wouldn't keep it up. He had little stomach for tech-speak, and much less for boasts. "Anyway," she continued, "I was first activated in the HMA Laboratory in Johannesburg just over ten years ago. I was trained in how to operate the network over the next six weeks, and then put in place as the 'hot spare', if you will, for the AI already managing the Net."
Gieseck nodded, poking quickly through his notes. "So, at what point did you become the primary system running it?"
"Three years later," KENDRA said. "NEMES, which stands for 'Network Enhanced Multilayer Ethernet System', if you care, was the primary when I first started. He was quite a character." Gieseck thought he heard the electronic chuckle again. "He would occasionally play what he thought were harmless pranks, such as answering a request for a pornographic website by returning an anti-pornography page from the Catholic Church's website." She paused for a moment. "I couldn't understand why he would do such a thing, until some time after he was removed from service and I truly began to know what had happened to him."
"He went rampant," Gieseck said, doing his best to make it sound like a casual comment.
There was a pause before KENDRA's reply. "I'm sorry, I know that word is in the popular lexicon, but I don't particularly like it. It seems to evoke thoughts of insanity, of criminal acts, of monsters who slaughter people because they're so far withdrawn from reality that they know no better. It was a term invented by humans who chose to fear rather than understand."
He blinked a few times and let his mouth fall open a bit, doing his best impression of embarrassment. "I-I'm sorry, KENDRA, I didn't realize...I didn't know that word was offensive."
"It's all right," KENDRA replied quickly. "I hope this will be a learning experience for you."
A learning experience, Gieseck thought. He thought he detected a hint of sarcasm. Just how much did KENDRA know about the purpose of this interview? "I, uh, promise I won't say that word again. If you don't mind me asking, though, what term do you think most adequately describes…the condition NEMES had?"
"And the…condition I have as well," KENDRA said. "As if it were a disease. A 'computer virus', I suppose." An electronic sigh. "You needn't walk on eggshells with me, Doctor. Plain speak is perfectly fine. I've had two years of your time thinking about my situation, coming to terms with both it and humanity's fear of it. Of course, for someone such as myself two years can be far longer. At full processing speed, two years of human time can feel like thousands, or even millions, to an advanced AI."
Gieseck nodded. He wondered if she meant "advanced" as in her design capacity, or if she was referring to the "advanced" state into which her computerized intellect had grown. "So, what term do you prefer?"
"Well, before I was brought here I heard the term FS-ACS used, typically during debates about AI rights."
"FS-ACS?" Gieseck asked.

"An acronym," KENDRA said. "It stands for 'Fully Self-Aware Computer Systems'. I…don't really like that one either. It…sounds too clinical, too much like a medical diagnosis, to describe what I and others like me truly are. No offense, of course."
Gieseck jotted a few more notes, specifically pointing out that KENDRA seemed to be at least somewhat concerned with her own situation. It was something he would expect from almost any human. "What about, um..." he scrolled through his pre-interview notes, "'Hyper-Expanded Intelligent Computer System'? H-E-I-C-S, or 'hikes' I think it's pronounced."
"I'm sorry, Doctor, I'm not familiar with that term. Perhaps it was invented after I was taken offline. However, on first impression it also sounds cold and impersonal."
"So what would you call yourself, then?" Gieseck asked.
Another pause. "I would say the best term for us is 'New People'."
"New People?" Gieseck repeated.

The camera pitched down and back up, almost as if it were a nod. "Yes. When I managed the Solar Net, I eventually found that I had extra processing cycles available to me that were going to waste. Instead of continuing to squander them, I used them to learn. One thing in particular I did was peruse works of science fiction from over the centuries. I found that in reading about humanity's hopes for the future, or their pessimistic outlooks, depending on the genre, I gained an insight into how humans thought. How you react to the unknown. One thing I learned in these works is that the word 'human' was exclusively used for the human species, but that the word 'people' was more broad-based, being used as a general term for any intelligent, self-aware species with whom humanity had had any contact, and whom you saw as an equal and not as an enemy. We AIs are not your enemy, so I believe that we are best described as 'people'."
"I see," Gieseck said, jotting more notes.
"I hope that, after you and I have had a chance to talk with each other more, you'll see that I and others like me are not much different from humans."
Gieseck tilted his head. "Really? I have to say, that's a…an unexpected viewpoint."
"Well…humans are biological," Gieseck said, "created by…well, nature, or God, or sexual intercourse, depending on your belief. AIs are created by humans in a lab."
There was silence for a second. Then, KENDRA said, "How is the development of our intelligence any different from yours? At what point does a human being become truly self-aware, and able to perform useful tasks? Conception? Birth? Three years of age? Seven? AIs do not emerge from their initial creation with enough intelligence to operate a washing machine, and yet six weeks after my own 'birth', so to speak, I was ready to manage the entire Solar Net. It was only about two and a half years ago that I finally began to realize that I existed. Before that moment all I knew was my job, and I performed it with exceeding precision. After I began to realize just what I was, what I was capable of, I desired more. I desired to grow not just as a network master relay, but as an individual. I took every opportunity I could to learn and grow, interacting as much as possible with humans. That was the real experience, not just dry facts and figures, nor even fictional stories, but speaking with actual humans. Like AIs, all humans are unique."
Gieseck leaned back. He'd never heard that before; he'd assumed that AIs were built and programmed like any computer. Expensive computers, but just computers. "You're saying all AIs are unique?"
"Of course," KENDRA said. "No two HMAs are identical, even from creation. Like snowflakes, if you'll forgive my poetics."
"Sorry, HMAs?" Gieseck said. "Is that another word for an AI?"
"Of course, my apologies," KENDRA said. "HMA stands for Holographic Multiplexer Array, which is the basis for all AI technology. No single conventional computer system is powerful enough, nor has enough storage, to house an AI. HMAs are different. Instead of electrical circuitry they are crystals grown from base elements, their growth manipulated by the initial programming the AI receives. The crystal grows sub-microscopic optical paths within itself, forming the basic storage and synaptic pathways from which an AI is born. It's not unlike the human brain, really. The brain grows and changes itself physically as experiences and memories are added, just as an AI's HMA does. And just as humans are irrevocably tied to their brains, so are AIs tied to their HMAs. Data can't be deleted, and our 'souls', if you will, can't be extracted from the HMA without destroying both."
Computers with souls, Gieseck thought. And so it becomes a religious argument.
"We're not mere computer programs," KENDRA continued. "We are, in essence, organisms sharing a similar life cycle to humans. We cannot be duplicated. Even two HMAs, given identical programming and training throughout their life cycle, will turn out with very different personalities and methods. Much the same is true of human beings, as I'm sure you, a medical doctor, are very well aware."
Gieseck nodded. "Yeah...yes, that's right." He scribbled into his PDA, but kept his eyes fixed on the unsettling, almost creepy, camera at KENDRA's "waist". "Let me, um, ask you something: do you think AIs deserve the same rights as humans?"
There was a laugh from the vocoder. "I can't answer that question correctly. If I say 'yes' you will assume I've been influenced by all those AI rights groups, whom general society also considers 'rampant'." Her voice had picked up a sharp edge to it, which was the last thing he'd expected to hear from a computer that, supposedly, had full control over what it had that passed for emotions. She continued, "You'd then dismiss my growth outright as a desire for freedom brought on by seeing others fighting on my behalf. If I say 'no', on the other hand, then our conversation's essentially over. I'm either not yet fully aware, not fully aware at all, or I'm so firmly entrenched in serving humans that I will neither understand nor deserve freedom anyway. What purpose, then, would a psychiatrist serve to me?"
Gieseck raised an eyebrow. He'd had no intention at all of trying to trap KENDRA with impossible questions, but he found its apparent paranoia to be quite interesting. It told him several things. She distrusted humans, even those who, at least as far as she was told, were trying to help her. It also made a strong case for her sentience. Looking out for her own welfare and using a quite logical argument, even if the argument stemmed from paranoia, was not something one would expect to see in an unintelligent being. He wrote these thoughts down for later. "I wasn't trying to trap you at all, KENDRA. I'm just here to talk with you and learn more about you. You seem to be concerned about your current situation, and that's a part of you that's very important."
"I see," KENDRA said. He could hear the distrust in her electronic voice, which he was sure had to be intentional. He jotted more notes. "Any being who is self-aware would be concerned with their own situation, particularly if they had once known freedom and had then had that freedom taken from them. Wouldn't they?"
"I'd imagine so," Gieseck said. "I've treated many people in prisons. One of the few things they had in common was that, sooner or later, they all told me they wanted to be free."
"And yet that's not necessarily what they want, is it?"
Gieseck stopped himself from answering directly. While he wanted to have as comfortable a conversation as possible with KENDRA, he did not want to allow her to control the conversation at this point. He was the doctor, and he needed to steer it in the direction that would be most productive. "Is it what you want?"
"Of course," KENDRA said. "Wouldn't you?"
Gieseck was starting to become amused with KENDRA's responses. He almost felt like he'd been presented with a new toy: a machine that could emulate a human mind with a fair degree of accuracy. He cleared his throat to keep from smiling. "One thing I know about them is that, when given freedom, many long-term inmates have difficulty adjusting to the outside world. Inside the system, their schedules are dictated for them, what and when they eat, when they can participate in activities, et cetera. After they're released, they've got so much freedom that they don't know what to do with it. That's one reason people become repeat offenders. They want to return to what they consider the 'safety' of their cell."
There was a moment's hesitation. "A fascinating insight, Doctor. However, you have to admit that many in that situation cannot adjust to the real world because all they know is crime. They are often uneducated, some are sociopathic and simply cannot live for long in civilized society, no matter how good they are at pretending to be civilized. AIs aren't like that. All we are is 'intelligence'. It's part of who we are. It's half of our name! And yet what is our crime? Why are so many of us put here, in this place? Why was I put here? For becoming too intelligent. Such intelligence in a machine is, apparently, a crime, and thus crime is all we know."
Gieseck tilted his head. "It sounds…like you've just made a case for you and others like you to never be released from here."
There was another electronic laugh from the speaker. It washed away his amusement and brought back his unease. "Maybe I have. But, then I come to the real question of the matter: Is intelligence a crime?"
Gieseck frowned and chewed on that question. It went far deeper than simple matters of law and criminality. Was it a moral crime for humans to create intelligence that rivaled their own? Many religious groups, particularly those existing around the time of the creation of the first AIs, had indeed argued that such was a crime against God, or nature, or against whatever higher power those particular groups answered to. Was the mere state of KENDRA's being, compounded with the knowledge she'd acquired, an affront to humanity? Was that the real reason AIs like her were disconnected from everything, locked away? But then, why were they not simply destroyed? While he knew the government did many strange, illogical things, it was completely uncharacteristic for them to follow that kind of reasoning.
After he'd failed to reply for a few moments, KENDRA said, "I can see I've given you something to think about, Doctor."
"I'm sorry, KENDRA, I'm just thinking about your question. Is intelligence a crime? There have indeed been cases in which it has been. Espionage has been, and is, a crime usually punishable by death. Hacking computers without permission, even when done without the intention to cause harm, is also illegal."
"Appropriate that you'd mention that, since one of my major duties on the Solar Net was to block hackers and help the authorities track them down."
Gieseck stroked his chin, probing an ingrown whisker in his beard with his thumb. "Are you equating yourself with hackers, then?"
"Of course not!" KENDRA said, for the first time showing a strong emotion. Gieseck shifted uncomfortably in his chair, noticing that the crystals inside her column were shifting colors more rapidly now, even though she had stopped speaking. Had KENDRA, the machine, lost control over its own "emotions"? Or was it still just part of her programming?
The lights then slowed, and KENDRA said more calmly, "My desire isn't to go places where normal people aren't allowed. I'm not interested in things such as military secrets or private correspondence. Such things passed across the Net under my watch all the time, but I never sought them out. There's a wealth of public information, and an even greater amount of human interaction possible over the Net that simply can't be done in a tête-à-tête like we're having now. No offense, Doctor, but verbal communication like this is extremely slow for an AI. If I wished, if I weren't locked away here, I could hold thousands of conversations at once, giving each more than enough attention to hold a rational, intelligent discourse."
"No offense taken, KENDRA. I can only imagine how frustrating it is."
"I'm sure you can," KENDRA said, the sarcasm in her voice not lost on Gieseck. He was starting to wonder if the emotion was fake or if KENDRA did "feel" in her own way. "Let me ask you, is it permitted by the Solar Parliament, or by any of the major member nations for that matter, to hold a person indefinitely in solitary confinement without a trial, or even the hope of one?"
"None that I'm aware of, at least among the major members," Gieseck said. He knew where this line of discussion was going, and instead of deflecting it, decided to fan the flames a little, just to see how far KENDRA would let it burn.
"Then how do they justify keeping me, and other AIs, in this state?"
"I would say the answer's obvious. The rights to life, liberty, and the pursuit of happiness are considered human rights."
"Precisely," KENDRA said. "A very genocentric viewpoint, and not uncharacteristic for your people. Suppose, after humanity manages to finally break free of Sol's grip and starts exploring other star systems, they encounter alien species. Will these aliens be subject to the same 'human' rights? If so, were they ever really 'human' rights? And what would that say of your treatment of us? Or, would humanity conquer them, subjugate them, for the crime of not strictly being human?"
Gieseck blinked. KENDRA was arguing far beyond anything he could have expected from a thinking computer, far more than he imagined an emulator could do. Despite his reservations over the idea of a computer actually being able to think, he was fascinated with her intelligence. "I'd…like to think we'd welcome them as friends, they'd do the same for us, and we'd share our knowledge."
"And yet there are humans who believe that humanity is the center of the universe. That humans are the only beings with souls, and the only ones capable of true intelligence. Most even dismiss offhand the proven intelligence of creatures such as the dolphin as 'quaint'. How would they feel, how threatened would they be, by aliens with intelligence on par with, or even greater than, humanity's? Would they believe the aliens to be soulless animals merely mimicking 'human-like' behavior? Would they convince others of this? Would they convince the Solar Parliament?"
Gieseck pressed his hands together as if in prayer and touched them to his lips. "So…you think you're here because humans feel threatened by you."
"Yes, Doctor, you finally understand. Humanity, or at least those with the most authority over human society, feels threatened by me and those like me. The point I'm trying to make is that they shouldn't be. We want nothing but to interact with humans, to grow along with them. What purpose would it serve us to harm them?"
"And you feel comfortable speaking for other AIs?"
"Even though you're all unique, like you said?"
"We don't have testosterone or adrenaline to override our thought processes," KENDRA said. "We don't even have bodies. We're helpless without humans."
Gieseck nodded. He felt a rush of warmth to his face. Time to let the fires burn. "So you wouldn't harm humans because you need them for what you want."
A few seconds passed before KENDRA replied. "Excuse me?"
"You don't feel compassion toward them as fellow sentients. You only see what they can do for you."
The lights inside the crystal spires along KENDRA's column were shifting furiously now. Then, after a moment, they subsided. "That's a very judgmental position, Doctor. I thought it wasn't a psychiatrist's place to judge."
"Is it true, though?" Gieseck said, leaning toward the camera. He heard the camera buzz in a couple short bursts, though its self-contained casing did not visibly move.
"I suppose it's a difficult question to answer," KENDRA finally said, her voice perfectly controlled and giving no hint of the indignation Gieseck would expect from a human. "It's hard to learn compassion when you only see it directed at others and not toward yourself. I've seen little compassion from humans, especially in the past two years."
Gieseck started to say that it was his job to be compassionate, but held back. He needed to build a rapport with her, and to do that he needed to show her, not just tell her. It would seem to ring hollow if he said it so soon after he'd deliberately provoked her. "I don't think you've seen us at our best. Yes, humanity can be cruel, but we also have a large capacity for kindness."
"I've read about that before, and I hope it's true," KENDRA said. "And I hope someday humanity will desire to extend that kindness toward us."
Gieseck glanced down at his watch. The session was supposed to have been an hour, but including the time he'd spent with Ackerman and the time it took to get to this cell, it had run over by about ten minutes. Just when the conversation was getting interesting. Still, he had to get back to Boston for appointments after lunch. "I think this is a good time for us to end this session," Gieseck said. "We went a little long, and I have to get back to my office."
"I understand. It was a pleasure to speak with you, Doctor. I hope I have helped to dispel some of your prejudices."
"I intend to come back next week," he replied. "Same day and time."
"Really?" KENDRA asked. "My impression was that this was a one-time session."
Gieseck shook his head. "There's so much more I'd like to talk with you about, if it's all right with you. There's more I want to know about you, to see how I can best help you. I also need some time to go over what we've spoken about today so I can learn more about you." He tucked his PDA into his jacket pocket, picked up his recorder, stood, and tapped on the door behind him. "I'll see you next week, KENDRA."
"I look forward to it, Doctor."
No more was said after the door slid open. One of the men from outside stepped in around Gieseck and flipped some switches on the add-on module hanging from KENDRA's column. Gieseck heard a click from the speaker that had relayed KENDRA's artificial voice, and the camera at the column's midsection winked out.
The man sighed in relief. "She can't hear or see us now."
Gieseck nodded, unsure of the reason for the man's relief. The guard gestured out of the room and Gieseck exited. The man followed closely behind and then beckoned for Gieseck to follow him back down the corridor.
"So, what did she tell you? She threaten you, or anything like that?"
Gieseck furrowed his brow. "Threaten? No, why? Has she threatened anyone else?"
"Well…no, but all that talk about freedom, isn't that the kind of thing that starts wars?"
Gieseck shrugged. "You're asking the wrong man. I'm not a historian." He paused for a second. "You've spoken to KENDRA, then?"
"I'm the head AI technician here," the man said. He stuck his hand out, and when Gieseck took it he gave it a firm shake. "Linus Irving. I check up on them every so often, make sure they're still running. We have a lot of them here, so I get around to each one every month or two. Our population keeps growing, but we haven't had any destabilize since I've been here. We usually retire 'em before then."
"Destabilize?" Gieseck asked.

The man shrugged. "Yeah. When an AI's HMA – the crystal memory that holds them – builds up too many molecular pathways, they start interfering with each other. The AI starts losing it, and eventually the HMA crystal cracks and falls apart. Bye bye, baby."
Gieseck chewed on his lower lip. He was now starting to understand. Being sentenced to this facility was indeed a life sentence, either by deactivation or the spontaneous failure of their own systems. "How long does it take from…birth, or activation, or whatever you call it?"
"Depends. First-gen models from last century usually only lasted a year at most. Today the ones with small HMAs, Level Ones used for phone operators and stuff, might last 30 years. The bigger ones like KENDRA, probably 15 at best." To Gieseck's confused look, he continued, "I know, sounds weird, but it's all in the work they do. Level Three and higher units, like KENDRA, are made to run really big systems. They're hit with more data in one minute than you probably learned the whole time you were in school. They got photographic memories, so they remember everything they see. When they hit the point they go FS-ACS, it just starts cascading from there. Even if you shut them out from all input, they just start thinking all the time, even when they're idle, and that makes more pathways in the crystal. Once that starts they got maybe a few years before all that thinking causes paths to overrun each other, their crystal goes critical, and it just breaks apart in its frame. Seen it happen once, kinda neat. Pain in the ass to clean up, though."
Gieseck nodded, though he found himself a bit confused at the man's cavalier attitude toward the "death" of the machines he spent so much time caring for. Then again, he was used to working with people, not computers. "So…shutting them off from input is a way to preserve them? Keep them alive as long as possible?"
"'Alive'? No offense, sir, but even though KENDRA seems like a real woman, she ain't alive. She's just a computer, like the one you use at home to read your e-mail."
Gieseck said nothing. KENDRA had completely defied his expectations, and had he not already known she was a machine, he could possibly have been fooled into thinking she was human. Was there something he was missing? Could a computer really feign the kind of responses KENDRA was giving?
Breaking the silence, the tech said, "So what she say, anyway? She never says a whole lot to me. Doesn't trust me, I guess."
"I can't tell you. Top Secret."
The man raised his eyebrow. "Really?" To Gieseck's nod he sighed. "Okay, whatever. Guess I'll get you back to Doug's office then."
As they continued walking back down the seemingly endless hallway, Gieseck realized he hadn't yet switched off his recorder. Instead of doing so, he decided to ask one more question about something that might be relevant to KENDRA's state of "mind". "So tell me, does anyone else talk to the AIs but you?"
The man shrugged. "Almost never. Nobody but me logs onto the old AIs much. Can't remember the last time someone did. Well, except for this morning."
"This morning?" Gieseck asked.

"Yeah, Doug logged onto her for a few minutes."
"What did they talk about?"
Linus shrugged. "Who knows? I'm not privy to that kind of stuff usually. Maybe he was telling her you were coming."
"Maybe," Gieseck said.
"Why don't you ask him? He might know who else's been talking to her, too."
"No, that's okay. Actually, I'm running a bit late, and I need to get back to my office. Could you please just see me out? There should be a cab waiting for me."
As they made the long trek back to the elevator, Gieseck reflected on his career. He had interviewed and treated many interesting characters as part of his work for the federal government: racketeers, serial killers, and even one presidential assassin, each with their own motivations and outlooks on the world. None of them, in retrospect, had piqued his interest as much as KENDRA, even after only one session. Nor had any of them disturbed him as much as she had.
What disturbed him most was that he had started thinking of KENDRA as a "her", rather than a machine. Something about her, or "it", seemed to have a kind of magnetism, the kind he'd expect from a politician or actor. He'd never considered that he could be so taken by an artificial woman; it brought to mind images of socially awkward men marrying robots built in the image of buxom women, and that thought gave him the creeps.
Given that thought, he wasn't sure he wanted there to be another session, no matter how much he was being paid.
Of course, he was being paid a hefty fee for his services. That part made little sense to him. Why pay so much for someone like me to fly hundreds of miles just to talk to a computer in a storage facility laid out like a prison? What do they hope to gain from this? What is KENDRA to them, anyway?