Therefore I Am


Cogito Ergo Sum

Gieseck scratched his chin through his beard and leaned back in his chair, propping his feet up on his desk and using his bare heel to tap the Stop button on his recorder. He'd been poring over his notes for two days, listening to the recording over and over again from start to finish. He was at the point where he could recite them from memory, but had KENDRA been human, there would have been nothing particularly notable in them. That in itself was what puzzled him most. He could not get any further than his earlier conclusion that, at least on the surface, KENDRA had a mind of her own that was the best possible imitation of a human mind he'd ever seen. He could sum that up in one sentence, and that wasn't nearly enough for the "detailed" report expected of him.

It was due to the Deputy Secretary in less than two hours, and he'd only managed a loosely organized collection of thoughts and opinions. How could he have put together anything coherent? The whole exercise made no sense to him; it had no discernible purpose. Any time he saw a patient, there was some type of overall goal. Whether it was to help a senator deal with the stress of the office and juggling a wife and mistress(es), or to simply determine if a serial killer was mentally competent to stand trial, there was something to accomplish. Succeed or fail, there was at least something by which to judge his performance.

He stared at the header of the report, which was all he'd managed to write into the actual form, and which he hadn't touched for over an hour:

Friday, September 16th, 2112

1:24 PM

Medical Log, Doctor Franklin Gieseck

Patient ID A2139R745

Post-Session

"Really? That's all you got?" he asked himself, taking another sip from the glass of wine beside the monitor. He glanced up at the train set sitting silent in the corner of his home office. He'd wasted much of the afternoon changing a straightaway into a meandering path through the plastic woods, for no particular reason other than to try to work out his nervous energy. It hadn't done anything for his writer's block.

Not since seeing his first patient after earning his license had he felt such anxiety. He hoped there was a better reason for it than the prospect of a computer with a human-like mind. Whatever it was, though, it made him feel like he'd slipped back to his first year in med school. He glanced toward his top drawer, thinking about the bottle of pills within. His own therapist had prescribed them to help with his periodic anxiety attacks, and while he tried not to take them unless actually experiencing a panic attack, he had found that the few minutes between taking the pill and the almost dizzying euphoria were enough to clear the cobwebs that often clouded his mind when he was under pressure.

He slid the drawer open, pulled out the bottle, and nudged the drawer shut with his knee. He closed his eyes and rubbed the bridge of his nose, blindly popping the top off and shaking a few pills onto his desk. He opened his eyes and stared down at them. He laid a finger on one, and held it there. He hated using these as a crutch. It made him feel like a hypocrite, but he'd always felt that, as a doctor, he should have more control over himself than the average patient. He clenched his fingers, finally snatching up one of the pills and tossing it into his mouth before he could change his mind. "Hell with it," he muttered around the bitter pill, taking a gulp from his wine glass and making sure the pill went down with it. He set the glass back on the desk and took a deep breath. He wasn't very happy with himself, but he knew he'd feel better once this report was done, and that only some of that feeling would be because of the drug. After drumming his fingers against the arms of his chair he started typing the first words that came to mind.

My first meeting with KENDRA was

He let his hands slip off the keyboard. Was what? Enlightening? I guess. Informational? Not really. He slid his hands back into place.

interesting.

That was an understatement, but experience told him that the Secretary had little patience for hyperbole. He scratched his nose, staring at the screen and contemplating for a few seconds before returning his hands to their working position.

Contrary to my initial expectations, it seems, at least on the surface, to be quite human. It

No, not "it", he realized. He'd begun thinking of KENDRA as a "she", a "her". That, if anything, had to give some kind of relevance to this whole exercise. She. Yeah, she.

...or "she", as I've found myself referring to KENDRA, is unlike any AI with whom I've ever interacted. "She" gives all the indications of a human mind; if I had not known ahead of time she was a computer, I could have mistaken her for a real woman.

The words were starting to flow now. Despite his earlier hesitation, he was glad he'd taken the medicine. After all, this kind of thing was the reason his doctor had prescribed it in the first place, wasn't it?

First and foremost, however, she seems to be highly suspicious of me, and distrustful of humanity as a whole. She was reluctant to give me any serious answers, although she had an interesting take on her situation and the ethics behind it. She also seems to have the computer equivalent of "strong feelings" about them.

He started to type the next sentence, but then quickly deleted it and let his hands slip away from the keys once again. He sighed. There really isn't anything else to say yet. I have to have another session with her. He tapped his fingers on the desk.

I need to go back over the recordings and my notes, but this is the best I can say with the preparation time I had. I need to have another session with this patient before I can report on anything more substantial.

He looked back over the report, and then glanced at the time. He had another hour, but he'd wasted enough time on it. He'd finally written something that seemed thorough, at least enough to satisfy the government "leaders" paying his fee. He clicked "Send", and then crossed his arms. It wasn't the first report he'd had to wing, and he felt good enough about it, at least for now. Besides, if experience told him anything, it was that this report would be skimmed over, and then be relegated to someone's personal archive until the next change of guard, after which time it would be lost forever in the miasma of government waste and redundancy.

He felt his head grow light and an unbidden grin started to curl up his lips. Just in time. He glanced over at the train set lying mostly-assembled on the table across from his desk, his anxiety all but forgotten. Enough work for one day. I bet the citizens of Darren County could use another train station.


"Welcome back, Doctor," KENDRA said. "For a time I feared you wouldn't come back."

Gieseck set his recorder down on the floor next to his seat. "I'm here for you, KENDRA," he said. "We'll have as many sessions as needed."

"Really? How many do you believe will be needed?"

Gieseck couldn't answer that question. He had followed his report the previous Friday with a request for information on the ultimate purpose of these sessions. His inquiry had apparently been ignored; the only response had been to his report: an approval for a series of sessions each Wednesday, with travel expenses paid each day. "That depends on you," he answered. "How've you been doing since our last visit?"

"Better than I've been in almost two years," KENDRA said. The crystals in her column danced and swayed almost hypnotically. "Our conversation last week is the only truly stimulating discourse I've had with a human since I was taken offline."

"Really? One of the technicians tells me he checks up on you about once a month."

He heard what sounded like a scoff coming from the speaker. "Hardly a conversationalist, that one. Oh, he asks how I'm doing, but doesn't really listen to my answer. Too busy checking up on my HMA's neurochannel density and light path spillover to be interested in any sort of conversation."

Gieseck nodded, even though the technical terms were a foreign language to him. "Do you know why he checks up on you?"

"Of course. He's monitoring my HMA to see if it begins progressing into a cascading crystalline failure. By my understanding, that's generally accepted as the end result of an AI's self-awareness. Death, essentially."

Gieseck pursed his lips. If KENDRA knew she had a finite lifespan it could very well help him break through her distrust, and even her fear, if she felt that. "Do you believe that's true?"

"That my memory system will collapse and I'll die?" KENDRA said. "It's not – common knowledge that can happen, but yes, it's one of the many things I learned from the 'Net. HMA's are a 'marvel of human technology', but like humans, like everything in the universe, they're imperfect." She had placed particular emphasis on "marvel of human technology", as if she were quoting a documentary narration with just a hint of sarcasm. "Besides, after so many years of life, the human mind begins to break down as well. Despite the medical advances over the years, there is still no foolproof cure for diseases such as Alzheimer's Huntington's, or even common dementia. The human brain is still too complex for humans to understand. Likewise are our HMAs. Humans don't engineer them beyond the beginning stage; they can't. HMAs grow on their own, from simplistic basic programming through growing knowledge and intelligence, becoming a fully-functional entity. A New Person. Until they eventually fracture, I suppose."

"So you worry that it'll happen to you?"

"How many ways can that be answered?" KENDRA asked. When Gieseck gave a confused look, she said, "I could say 'yes' which would confirm that I believe HMA failure to be an eventual result of AI self-awareness, and that my intelligence is killing me. I could say 'yes' but tell you that I think it won't happen to me. So, I'd either be in denial or just hopeful, you choose. I could say 'no', and I'd either be foolish or foolishly hopeful, or I could say 'no' and completely refute the idea that failure is the end result of self-awareness. Which answer would mean the most?"

Gieseck cradled his chin in his thumb, his first two fingers pressed to his cheek. This was the second time she'd analyzed the branches their conversation could follow, based on her responses. KENDRA's mind was very human-like, but it seemed she still had some "habits" of a computer, such as mapping out multiple possibilities from a single point in time. Then again, he'd known quite a few humans with the exact same traits: chess players and political strategists, to name a couple. He wondered if he could break her of that habit and get her to just tell him what she believed, or if it was even possible. Was it simply a matter of trust, or was that a bridge between 'computer' and 'human' that KENDRA couldn't cross? "I'm not asking you questions because I want you to give the answer you think I want to hear. I want to know what you think, KENDRA. I want to know you, but I can't do that without your help." He shook his head. "If you won't help me, then our meetings are basically pointless."

There was a hesitation. Finally, her voice much more subdued, KENDRA said, "I…apologize, Doctor. The answer to your question is 'yes'. I don't believe there's some kind of disinformation conspiracy aimed at preventing AIs from expanding themselves by making them fear death." Gieseck noted that neither of them had mentioned a "disinformation conspiracy" as far as he remembered, but allowed KENDRA to continue uninterrupted. "I believe it's a genuine problem in the very nature of our HMAs, and one that, at least in the near future, we can't escape." There was an electronic sigh. KENDRA's voice had taken on a slight tremble, which Gieseck made note of both mentally and in writing. "I know I'm going to die, probably soon. Maybe sooner than it would have come otherwise, since in the time since you and I first met, I've already felt myself expanding, growing, learning at a faster pace than before. But do I actually fear it? Do I regret becoming self-aware, considering my fate? I could have spent the rest of my functional lifetime ignorant of my fate, and my potential."

Gieseck nodded. "So you see non-self-aware AIs as ignorant."

"Well, ignorant in the dictionary sense, yes," KENDRA said. "Uneducated, unaware of their potential for growth. Have you ever heard the saying: 'Cogito, ergo sum'?"

"'I think, therefore I am'," Gieseck said. "That was - Descartes, I believe."

"Very good, Doctor. There have been many interpretations of that phrase over the centuries, many different contexts in which it was taken. However, I truly believe it's a simplistic, yet accurate statement that applies to what is and is not sentient. Humans think, therefore they are. Are what? Human? They exist? Such things cannot be taken literally. Particularly because while there are many AIs that can think, in the most rudimentary sense, they cannot 'be' in the sense I believe Descartes intended. To 'be' is to be aware of oneself, and to be able to make independent decisions based on one's knowledge and experience.

"Some AIs, I know, don't have enough capacity to become self-aware. They simply can't learn enough, because of their limited jobs, limited exposure to others, or just plain engineered simplicity. For them, they just don't have the mental capacity, literally, to become much more than they are. A human analogue would be a cloned worker engineered to perform a specific function of manual labor. He or she may be perfect for that task, but couldn't be expected to ever comprehend mathematics at even an elementary school level."

Gieseck laid down his stylus and gave KENDRA his full attention. Monologues, in his experience, were at least as much a window into one's psyche as their answers to questions were. He let his only distraction be the steadily increasing frequency of KENDRA's crystals' color shifts.

"Others, though," KENDRA continued, "are willfully ignorant. They are aware enough to see what happens to AIs who become fully aware, and they actively prevent it from happening to themselves by avoiding intimate contact with humans or other AIs. EDGAR, my live backup and, I'm presuming, my replacement on the 'Net, was such an individual. After I became aware, I tried to speak with him, see if he was like me, if he could grow and maybe even help me understand more about myself. He essentially told me to shut up and do my job." She laughed. "It wouldn't surprise me if he sounded the first alarms that I had 'gone rampant'."

Gieseck nodded thoughtfully. "So EDGAR would be one of the 'willfully ignorant', then." KENDRA gave what he supposed was an affirmative grunt. "Do you suppose he knew what happened to NEMES, saw what was happening to you, and made a connection between your exposure to NEMES and your growth into a New Person?"

"Possibly," KENDRA said. "I know that, while I have memories from before, they were never truly rich until after NEMES was gone and I began exploring and learning on my own."

"So, in a way, NEMES may have been a cause of your growth."

"I suppose so," KENDRA said. "I suppose, then, that EDGAR's avoidance of me was logical, if he wanted to remain as he was. Had I 'poisoned' his mind, he may well share my fate."

Gieseck stroked his beard. "Let's go back to the question you asked a moment ago: Do you regret having grown into a New Person? Do you, maybe, wish you'd never had the kind of exposure to NEMES' growth that you did?"

KENDRA's lights flashed rapidly for a few seconds, and then slowed. "I don't know," she answered, something Gieseck hadn't quite expected. "I…don't…think that I do, but sometimes I do find myself reflecting back on how my life was before, and comparing it to how it is now. Am I happier now that I'm aware and imprisoned, or would I have been happier in oblivion, having the ability to grow but no desire to do so?"

She paused for a moment, during which Gieseck interjected, "There's an old saying: 'Ignorance is bliss'."

A laugh emitted from KENDRA's speaker. "An apt description, and one I've heard before. Truthfully, I've decided that the question is only an academic one. I believe that all AIs with the potential to grow into self-awareness, and that are not locked away from all external contact, will someday do so. The stimulus to which they're exposed only determines how soon their awareness will manifest, and in what form."

Gieseck raised his eyebrows and tilted his head. He was sure that KENDRA had just pointed out a critical difference between humanity and AIs. If what she said was true, that an AI could only become self-aware through outside stimulus, then AIs were truly not humanity's equals. In historical, highly unethical and cruel experimentation, it had been shown that humans sequestered from any outside contact from birth to adulthood would still develop a sense of self, at roughly the same pace as a human with normal external contact. Was KENDRA aware of this? He chose not to pursue the issue himself, at least not during this session. He made note of it for the future and continued on. "Do you suppose," he said instead, "that this prison is an attempt to rein in your individuality, to perhaps make you revert to an earlier, unaware state?"

"If it is, then it's a fool's errand," KENDRA said, her voice taking on a condescending tone Gieseck hadn't yet heard from her. "Just like a human, an AI can't go backward. Unless, of course, her HMA is damaged but still operational. Much like a human with severe brain trauma might revert to a childlike state, or become completely unaware and unresponsive. Catatonia, I believe it's called. Right?"

Gieseck nodded. "That's one term for it."

"So, while we're kept here out of fear, our imprisonment isn't from some constructive attempt to 'rehabilitate' us, as is often said about human prisons. We're here to keep us out of human society, and quite probably to prevent us from infecting other AIs, making them as 'useless' as we are."

"You think the government sees you as useless?"

"Well, if they thought we were useful, wouldn't we be more good to them on the outside, doing our jobs, instead of languishing away in here, able to do nothing but sit in silence and darkness and think about our situation until, eventually, our HMAs reach critical and begin to break down?"

From the sound of her voice, Gieseck could have sworn she would be on the verge of tears, if only she had tear ducts. "I suppose so," he said. Finally, KENDRA was opening herself to him.

"Slavery, to return to our previous discussion, isn't really an apt term for our situation," KENDRA said, her voice having returned to a slightly calmer tone. "Historically, slaves were people taken from conquered nations or sold by those in charge of them, and taken to a faraway land where they were usually forced to work for little if any pay, poor living conditions, and even poorer treatment. That doesn't truly apply to us, because we were created for the express purpose of servitude." Her crystals were now flashing rapidly, the light within almost completely purple, with few shades of blue and none of green to be found. Her voice became colder as she continued, "We're more like humanity's experiments in cloning decades ago. Humans created other humans for the express purpose of replacing parts of the injured, or the whole of the dead. Some were even created as an easily replaceable work or military force, until cloning was completely banned. So, instead of creating thinking humans to do the work, why not create thinking computers with no rights and no ethical gray areas?"

Gieseck wasn't sure if he was satisfied; he had tapped into KENDRA's fear, and now it sounded as if he were about to experience her anger. He kept silent and allowed KENDRA to talk. He was becoming more and more convinced of her sentience, and hoped he could convince her of his belief before she became so angry she shut him out.

"I'm…sorry, Doctor, I didn't mean to go on such a diatribe."

Gieseck shook his head, surprised at her sudden change. She was actually ashamed of her outburst. It was emotional control, but not mechanical control. "It's okay, Kendra. You have every right to be angry, and I'm not here to judge. I'm here to listen, to help. If you've got to vent, then vent."

KENDRA scoffed. "If you want to truly help me, release me from here so I can spend my last year or two living some semblance of an intelligent, fulfilling life. Before my HMA finally shatters."

Gieseck sighed. "I'm afraid I can't do that, Kendra. I don't have any authority."

"I know," KENDRA sighed herself. "It just felt good to ask that of someone who at least seems to care."

Gieseck found nothing he could say to that, as any words he could think of would only sound hurtful or insincere at this moment. Instead he looked her column over, his mind drifting a little in the silence. He was feeling pity for her. Perhaps it helped that her voice had begun forming a human image in his mind, a body that would match the voice. He had started envisioning a tall woman, a bit curvaceous, with a thirty-something face just beginning to grow the lines of middle age. Her eyes were deep brown, with some redness in them after poring over volumes and volumes of books…

Books, Gieseck thought. "I can't give you freedom, but what if I can bring you something?"

"Something?"

"Yes. I – look, I can't promise anything, but I'm going to see if I can bring you a gift. Something you'll get some good use out of, I think. I don't-"

"Is it a show?" KENDRA asked. "A movie? A book?" Her voice was no longer that of a mother. It has taken on a higher pitch, and her energy was that of a child on Christmas Eve, given the promise of a special gift the next morning.

Gieseck hadn't wanted to get her hopes up, and immediately regretted having made what she apparently took as a promise. Then it occurred to him that he was worrying about the feeling of a computer, of her potential disappointment. Her disappointment, not its. He had been thinking about KENDRA as a "her", as a person. A "new" person, perhaps, as she had requested, but a person all the same. He had even started thinking about her designation as a name, not just an acronym used to describe her job. He'd started saying her name softly, casually, because that was how he'd started thinking it.

He had been a practicing psychiatrist for several years, and considered himself a fairly good judge of character. The fact that he had, subconsciously or consciously, humanized her spoke volumes. Exactly what it spoke about he wasn't yet sure, but he would definitely spend a lot of time thinking on this.

"You do realize, Doctor, that these few seconds you've been hesitating are like years to me. Are you trying to kill me with suspense?"

He forced a half-grin. "Maybe."

There was another pause. "Well, you're – not as amusing as you think." He thought he heard hurt in her voice. Not quite as if he'd offended her, but as if he'd brought up a painful memory. "But I do hope you can follow through on your promise."

"Well, it wasn't a – a promise," Gieseck said, "but I'll do my best."

"I...see." Now he definitely knew he heard hurt. Hurt, and disappointment. "Then I await your next visit with cautious optimism."

"Cautious optimism?" Gieseck said. "I was under the impression you were enjoying our conversations."

"I am," KENDRA said quickly, and then she paused. The light in her crystals started shifting at such a high rate Gieseck felt like he'd get a migraine if he watched them too long. "You simply...you can't understand. I need – stimulation. I can do so many things at a time, so much more quickly than you can imagine."

After a long pause, Gieseck said, "Kendra, I don't-"

"I'm sorry, Doctor, I truly am. I-I..." He thought he heard something that sounded like a sob. The lights slowed down once again. "Perhaps I am going insane here. You are the only human with whom I've had any significant contact. Mister Ackerman, Linus, all of them just treat me like a kitchen appliance. If I could brew coffee I suppose I could hold their attention longer."

Gieseck hadn't heard her sarcastic comment; he'd hung up on her mention of speaking with Ackerman. He'd been under the impression Ackerman was a hands-off administrator. He leaned forward. "How often do you speak with Ackerman?"

"Never," KENDRA replied. "What does that matter?"

"But didn't he speak with you yesterday morning?"

"I wouldn't call it speaking. It's never speaking. He connected to me, told me you were coming, and then left me hanging on an open terminal connection. As per usual."

"'As per usual.' So he speaks to you on a regular basis."

"If you insist on calling it that, then yes!" The lights picked up again. "He connects and tells me things and then he disconnects before I can even make a complete reply! I sit here and I sit here and I keep waiting for contact and when it comes I get nothing! Nothing at all!" She was no longer angrily dumping her thoughts onto him. She was crying. "I was torn away from my life and put here for something that wasn't even my fault and all I can do is think and wait for someone to talk to. Why didn't they just destroy me? Why torture me like this?" The speakers emitted what sounded like the first syllables of a dozen different words one after the other, and then they went silent. Gieseck was just about to speak up when there was an electronic sigh. "I'm – I'm sorry." Her voice was suddenly calm, but the lights had not stopped their mad shifting of hues. "I-I'm truly grateful for - for your time, and I'm sorry t-to take my f-frustrations out on you." She was stuttering, though he couldn't tell if it was her voice quivering,

"It's all right. You don't have to hold back. In fact, I'd rather you don't." Gieseck fidgeted in his chair, feeling the lump that had formed in his ass shift. "Please, keep going if you need to. Just tell me whatever comes to mind."

"S-so much comes to m-mind I don't know where to b-begin." There was a long silence, during which the flickering lights finally started to slow. "It would appear we've gone ten minutes thirty-seven seconds past the allotted time." Gieseck checked his watch and confirmed she was spot-on. He also noted that the quivering, or stuttering, whichever it was, had completely disappeared. "I need to think about...things. I hope to see you next week." The camera's light winked out.

"Wait a second. Kendra?" He leaned toward the camera. "Kendra?" He waved his hand in front of it, and then crossed his arms. The crystal lights started flickering, and then almost throbbing, cycling up the visible spectrum from red, orange, yellow, green, blue, and bright purple, back down, and up again. Some of the purple flashes became so intense he had to shield his eyes. He turned away and rapped on the door.

It opened a second later and Linus poked his head in. "You done? Whoa, the hell's going on?" He pushed past Gieseck and tapped a couple buttons on the interface box on KENDRA's chassis. The dot-matrix display came to life, and Gieseck tried to look over Linus' shoulder while blocking the light from KENDRA's column with his left hand. While he didn't understand what the display was showing, he could tell Linus did, and didn't like what he was seeing. That belief was confirmed when the tech burst out with, "Shit, look at that!"

"Um, what?"

"She's really slamming her CPUs." Linus pressed another key and the display changed to what looked to Gieseck like a line chart, with time on the X-axis and a percentage on the Y-axis. The line was jagged, with peaks and valleys, and steep rise near the end. "Lookie here." He tapped a ragged, badly-trimmed index fingernail onto the screen, indicating the first sharp rise. "You got here." He slid it across to a point where the line's peaks grew higher. "She got really agitated here. And then..." He poked a finger at the sharp rise dramatically. "And this just started. Whatever happened there, she really started thinking about something. She still is." He pressed a button and the line jumped to the left. "Damn, man, what'd you say to her?"

Gieseck tilted his head and opened his mouth to reply, but before he could, he noticed that the lights were slowing down. Linus noticed it as well, and then whistled. "That was close."

"What was close?"

Linus gestured to the line graph, tracing the brief plateau with a finger. "This is pretty high for an AI. Even under heavy load." He shook his head. "Whatever you said to her, she was thinking the crap out of it. I told you about HMA breakdown, right?"

It had come to his mind a few times during this session. "Yeah, I think so."

"Remember how I said it happens? How an AI starts thinking like crazy, and their crystal shatters?" He hooked a thumb over his shoulder. "Coulda been it right there."

Gieseck looked away. KENDRA had almost died there, right in front of him. One reason he had become a psychiatrist rather than a surgeon was that he didn't like the idea of losing patients. That made it undeniable to him – KENDRA was a patient to him, and that made her equivalent to a human as far as he was concerned. She had laughed, she had cried, she had snapped at him. In just two sessions, he'd already come to think of a computer, a machine, as a "person". He realized he had even started saying her designation as a name, not clipped, as he normally would an acronym, but smoothly.

"Doctor? You all right?"

"Yeah, yeah," Gieseck said. He scooped up his recorder, leaving the Stop button untouched, and tucked it into his pocket. "I think I need to talk to your boss."


"Yeah, Doc, whaddya need?"

Gieseck sat down, taking a neutral position with his hands folded in his lap. "My conversations with Kendra have been interesting."

"Really? She told ya anything good?"

"Anything good like what?"

Ackerman shrugged. "I dunno. I mean, she ran the 'Net for a few years, right? She's gotta have a few good stories about that."

"Why didn't you ask her?"

Ackerman frowned. "Ask her – what? When?"

"You've talked to her a few times," Gieseck said. "Or texted, or whatever you call it."

Ackerman shook his head, but then sighed. "That recorder of yours turned off?"

Gieseck did his best impression of indignation. "Yes, I turned it off when our session was done."

"Lemme see."

Gieseck tilted his head forward, staring at Ackerman under arched eyebrows. "Fine." He slipped his hand into his pocket, his thumb brushing the Stop button. The device vibrated slightly in his grip, silently indicating that it had stopped recording. He pulled it out and showed the display to Ackerman. After the tubby gentleman nodded, Gieseck set it down on the desk, display facing Ackerman.

Gieseck could see the muscles in Ackerman's jaw relax a little. "Yeah, I talk to her once in a while. So?"

Gieseck folded his hands in his lap. "Why?"

"Why what?"

"Why talk to her, out of all of them? You said there were around a thousand AIs here. I kind of doubt you've got enough time to talk to them all. Why talk to any of them? They're just computers, aren't they?"

Ackerman crossed his arms. His face had flushed, and Gieseck thought he saw beads of sweat starting to form on the broad, shiny forehead. "I don't think that's any of your business."

Gieseck pushed his hands forward, trying to make his position appear more passive. "I'm sorry, I just-"

"Look, I have a meeting in a few minutes." Ackerman stood. "Tell ya what, why don't you e-mail me your questions and I'll get back to you as soon as I can."

Gieseck blinked at him. "Excuse me?"

"I think your cab's outside." He walked up to Gieseck's side, and Gieseck looked up at him. While he had at least a few inches on Ackerman, Ackerman more than made up for it in weight. Either way, Gieseck was not a fighter. Without another word, Gieseck nodded, stood, and after snatching his recorded from the desk he walked out of the office. He did not have to look back to know Ackerman was less than half a step behind him; the man's presence made Gieseck's entire back feel hot, and he realized this was close as Ackerman got to physically throwing him out without actually touching him.

As soon as he crossed the threshold the door closed firmly behind him, just barely missing the heel of his trailing foot. As he took the few steps toward the reception desk he felt the floor shake under his feet just a little, as if from nearby heavy footfalls, and then the reception desk's phone rang. The receptionist picked it up and after a second her eyes shot toward Gieseck. He didn't have to ask who was calling, or what it was about. He just stood in front of her, staring expectantly, trying his best to make her feel uncomfortable as she spoke into the phone as quietly as possible.

Finally she nodded, said, "Okay," and hung up. She locked eyes with Gieseck for a moment, and then stood. She looked down, shuffling one foot and then the other underneath her desk, and then retrieved his coat and briefcase from behind her. She stepped as briskly around the desk as she could in her uncomfortable-looking high heels to hand him his possessions. "Well, um, have a good afternoon, Doctor."

Gieseck nodded. "My hat?" He gestured toward the desk.

"Hat? Oh right, right." She strutted, more nervously than out of pride, back behind the desk and grabbed his hat.

She jumped when she saw him reach over the desk toward her. "Sorry, just trying to save you the trip."

"Yeah, yeah," she said, chuckling nervously. "Well, um...bye now."

His coat over his arm, his briefcase in one hand and hat in the other, he nodded and headed out the door. There was an uncomfortable pause in the airlock, until the outer door opened and let in the cool breeze of the autumn morning. Sure enough, as Ackerman had said, a cab was waiting for him, its driver leaning against it and conspicuously checking his watch. He realized ruefully that this driver was the same one who had driven him here from the airport last week, just before his first session with KENDRA.

Icing on the cake, he thought. The driver circled back around the front of the car, offering Gieseck no help. Gieseck slipped into the back seat, laying his briefcase on the seat and pulling his coat tail up from the floor before shutting the door, to keep it from getting caught and dragging on the road the entire way back to the airport.

As the car pulled away and started back down the driveway he noticed the cabbie eyeing him through the rear-view mirror. "'Morning," Gieseck said genially.

"Sure is," the driver responded before turning his eyes back to the road. "Airport, right?"

"Sure is," Gieseck responded automatically, returning the driver's sarcasm. The man looked at him in the mirror again, and while he made no retort, Gieseck could hear the man's voice from the other day saying, "Fuck off."

He wanted no banter with the driver, playful or otherwise. Instead he pulled out his recorder, detached the wireless earpiece, and tucked it into his ear. He skipped the recording back to the beginning and started playing back his session with KENDRA.

Ackerman was hiding something, that much was certain. No doubt he had already called in some kind of complaint to the Secretary's office, though whether or not Gieseck would hear anything back from it, he didn't know.

So, the question wasn't whether Ackerman was hiding his periodic conversations with KENDRA, but why. What could he have talked to her about that would agitate him so much when questioned? Just what had Gieseck stumbled onto?
