We pulled into the university visitor parking lot and hopped onto the Marguerite Shuttle to the Stanford Institute for Human-Centered Artificial Intelligence (HAI).
Flashes of memories floated through my mind as we rode through my alma mater with its palm trees, bubbling fountains, and red tile roofs. We circled and passed the sandstone quadrangle with its thick Romanesque architecture, beautiful mosaics, and murals on the church.
A world of weird thoughts—specifically, my world of going from the son of a struggling coal miner to a workaholic tech millionaire—ran through my head too fast for me to grip. I felt an odd, muted nostalgia mixed with deep pride and a shade of regret for things that I might have done differently, such as receiving DARPA money for classified research.
I reached up to slip the godforsaken sling from my arm, gingerly touching the bandage and splint. Wincing, I said, “I won’t be needing that to lecture lab students.”
“Keep the sling on. It will add a dramatic effect to your presentation,” Professor Farrow advised.
Patricia Farrow, a renowned professor of Computer Science at Stanford for forty years, was now the provost responsible for academic and budgetary affairs. She had been my professor back in the late ’90s during grad school, when Chris Helm pulled me aside and awarded me two million dollars for my research. “The DoD honors your technological innovation with the Young Faculty Award,” Helm had said.
How proud I was—and remained. Military-funded research is not all unethical, I reminded myself. How was I to know my technology would be used in an unregulated mission?
Provost Farrow was now in her mid-sixties, with a refined, classic look that made her appear years younger. Today, she wore a crisp white button-down shirt paired with a black cardigan and pearls. And her statement eyeglasses—always in the boldest colors—were sapphire blue in an oversized silhouette.
With an incredulous stare fixed on my former professor, I answered, “What? And look like the fugitive tech-millionaire? That ought to impress these bright students.”
“Sean, if some highly successful entrepreneur had walked into the research lab when you were studying and said he nearly got killed for pulling out of dark money—would you have applied for the grant?”
Rather than respond, I heaved the sling over my head and settled it back over my wounded arm. We looked like masked bandits already anyhow, with bandanas wrapped across our faces. “Maybe it’s good I haven’t shaved and kept the scruffy look for the occasion,” I laughed.
“Don’t worry about looking dapper. We haven’t staged a complete return of the university research labs, so this will be very informal. It will be a hybrid meeting; maybe a dozen students will be there at most,” she said. “Others who don’t yet feel comfortable with in-person meetings, with the virus going on, will link in to you over Zoom.”
Glancing out the window of the shuttle bus, I realized my sense of location at Stanford had faded with my confidence. With three gunshot wounds inflicted by men influential in my career, I felt off-kilter. It didn’t help that I hadn’t been home for two months.
We approached the stop sign in front of the Cantor Arts Center and I made a mental note to walk around the sculpture gardens after my talk. It would calm my nerves. Images of my surprise visit at the doomsday bunker splashed in my head, starting with the urgent knocks on my door:
Faking calm, I shouted: “Hark! Who goes there?”
“Mr. Coleman, it’s me, Provost Farrow.”
My former professor? To set my plan in motion, I had emailed her but never expected her to show up here of all places. After opening the door, I asked, “How did you get into this stinking hole in the ground?”
“Hole in the ground? This is one swanky shelter, Sean. You’re forgetting my connection to Stoddard.”
“You scared me with those loud knocks. I thought you were the royal threesome back to gun me down again.”
Professor Farrow ran her hands through her thick hair, which she seemed to have gracefully let go grey. “That three powerful technologists came after you proves the urgency of stopping these killer-robot projects.”
“You came all the way here to tell me that?” I’d felt honored and embarrassed at the same time.
“No. I came to rescue you from this stinking bunker and bring you to Stanford to persuade prospective students to be leery of where their research funding comes from.”
“Prospective students? So, they’re all new?”
“There’s a mix of incoming students and existing Ph.D., M.S. researchers, professors, social scientists and—”
“I have no presentation ready—”
“You ARE the presentation. Just tell your story. And no offense, but you look like shit,” she declared. “It’ll help if you showered and shaved.”
“That’s what my girlfriend said. No, let’s get going. If I AM the presentation, looking like shit will only sell my story better.”
She laughed. “We live in unprecedented times. Why not an extraordinary appearance from one of the brightest Stanford tech entrepreneurs I know?”
Provost Farrow’s voice broke my reverie. “We’re here,” she announced.
She introduced me to Dr. Wang, the director of the HAI Lab. Dr. Wang was a young Asian woman with smooth ebony skin and doe-like brown eyes behind black-framed glasses. She fluttered around the lab like a thirsty butterfly seeking scarce minerals or water.
Five university security agents flanked the doorway, and I wondered if it was because of my getting shot, or if they were always here now, since America had become a mass-shooting playground.
I was immediately set up for the Zoom lecture. As Provost Farrow predicted, a dozen or so students were in the facility, all sitting three desks apart and wearing masks. An enormous screen showed a gallery grid of participants, displaying maybe ten thumbnails.
Feeling naked, disheveled, and suddenly nervous, I pressed the start video icon, hoping my image didn’t resemble a crazed software executive—the next John McAfee. But I knew that’s exactly what I looked like.
But it didn’t matter. Julie’s threat to leave me if I didn’t pull out of military technology echoed in my head. And I wanted to take this a step further. I could still hear Caryssa Flynn’s words: “Think of our next generation, Sean. Can our nation afford perpetual war?”
And here, in front of me, was a collective brain trust within that next generation—the brilliant minds the powers that be would use to carry out questionable missions.
I started right in, no hesitation: “Greetings, Stanford community. My name is Sean Coleman, an alumnus of Stanford with both a B.S. and a Master’s in Computer Science: Artificial Intelligence. Among many other endeavors, I am the founder and CEO of Dazzle! I’d like to tell you a captivating story.”
Beads of sweat formed on my forehead and I realized it was from fatigue as much as nerves. My body was beaten, and I was looking forward to finally going home after this and meeting up with the love of my life for the first time in over two months. The thought that my beautiful house was ten minutes away calmed me but didn’t completely diminish the duress.
Expressions stared back at me, waiting. A few nods, as if to say, “We know who you are; your face was all over the news.” As Provost Farrow had mentioned, I’d become a bit of a legend after getting shot in the backyard of my Los Altos Hills home.
I looked at the screen, then turned my head to make eye contact with those in the audience, and back to the screen. “I’d also like to ask a couple of thought-provoking questions and provide some alarming facts.” I had no gripping photos, props, or visuals. Just me and my traumatized appearance.
I seemed to have a rapt audience, as people shifted upright and looked directly at their screens.
“As alluded to already, I’ve made it into a few shocking headlines after getting shot three times. Three powerful men in the tech world came after me, one of them a graduate of Stanford —”
A gasp came from my captive audience, and I saw the horrified look on their faces at the mention of Stanford. Should I mention Chris Helm’s name? Better yet, should I expose President Crown and his band of military powerbrokers?
I continued, keeping it as high-level as possible, “The bottom line is, they tried to murder me for pulling out of both DARPA funding and a Saudi-led venture capital firm.”
Hands flew to faces, and people squirmed in their seats. DARPA, the darling of Silicon Valley, and a path to forever war. But I saw recognition in their eyes—an unstated complicity, as if they had confessions to make and souls to unburden.
I plunged on: “One of the men who shot me—” I cut myself off and took a deep breath. I was talking to one of the top AI academic and research programs in the world.
My mouth operated on muscle memory as words tumbled out: “The head of the Joint Artificial Intelligence Center is one of the men who shot me—”
“Lt. Jeff Snead?” more than a few people asked in surprised-sounding voices. A few visual cues showed me my audience didn’t—couldn’t—believe what I said. Heads turned, looked away, or shook back and forth in rebuttal.
I let that mind-boggling fact linger in the air for a moment. Now they really must think my story far-fetched. To calm their brilliant scientific minds—or at least mitigate the unease—I added, “It’s not AI in general endangering lives, but AI-powered weapons could end up being as disastrous as the atomic bomb.” Maybe I should add the threat of automated jobs and the spread of fake news.
That didn’t seem to settle well with them, and I feared I was losing my audience.
“We’re thrilled to see you’re alive and well after that dreadful ordeal, Mr. Coleman.” The thumbnail that lit up was a face I recognized as a highly respected professor of Biometrics. Relief washed over me as I relaxed into the speech.
“Thank you, I appreciate the sentiment,” I said.
My audience remained as silent and cold as new-fallen snow, deep in their thoughts. I needed to get this out.
I said, “As you all know, Stanford research is one of the most powerful think tanks in America, in the world. You are each in a unique position to shape the future. The mission statement of this lab is to use AI to improve the human condition, to benefit humanity. In this mission, I advise you to be leery of research funded by the Pentagon—certainly reconsider anything to do with autonomous weapons in collaboration with any arms company.”
I saw many heads shaking; some looked down or away from their screens. Perhaps some even left the meeting. One person countered, “That will erase a lot of research funding.”
“Yeah, I agree. It’s better to send a robot into the battlefield than a human being,” another voice piped in. “Humans could never keep up with AI.”
“That’s how I thought at the early stage of my career.” I patted the wound on my arm and could still feel the dull ache in my torso and head. “It nearly got me killed.”
“Most of us are focused on research from our homes to fight the Cloud Virus pandemic now, in search of a vaccine,” someone said. “We can’t even get into our labs. So, this might be a moot topic.”
“Thank you for this excellent use of your research—yet I still warn you not to work with DoD funding, even for the pandemic. Trust me, the agency is not equipped to fight infectious diseases. We don’t need a war machine; we need medical technology. We need clean energy. Once we get past this crisis, you will be back to other research. I urge you to ask yourselves how someone could distort your innovations to destroy people and the planet.”
Or come after you when you try to pull out of defense funding. My thoughts ran wild.
I looked at my face, staring back at myself. A shimmer of perspiration resurfaced on my forehead, and I wondered if there was an intense shine on the camera. Thick, dark eyebrows bunched together above my eyes, and I tried to relax my features. What the heck am I doing here? Committing career suicide?
Just as I was about ready to end the lecture, a male student in the physical audience raised his hand and spoke up, “My name is Takashi Shimada. The academic researchers in my homeland pushed back on any research with possible military applications.”
“What country is that?” I asked, excited at the prospect of a healthy debate rather than the stifled dissent I seemed to receive.
Although the active speaker’s face came over the large computer screen, I decided to respond while looking directly into his eyes. There’s nothing like the human touch, and it seemed forced or unnatural to speak to any of the people in the room through a device. “That’s impressive, Mr. Shimada,” I said. “Modern Japan places morality over power and money by going above and beyond past treaties.”
The student nodded. “Thank you. Japan had its day of imperialism and learned the hard way. More than a dozen universities in Japan forbid researchers from accepting grants from the Defence Ministry. They acknowledge the dangers of military academic research. We root for the spirit of peace in our society.”
A defiant female tone rang out, “Why are you at Stanford then? Why not go back to Japan to study?” I noticed her name was Rachel Locke.
I cringed. The healthy debate was turning into borderline cyberbullying. Maybe it’s time for someone to force the USA to sign a peace treaty. It would do wonders for our nation and world.
Then I recognized her. The long blonde hair and blue eyes touched by storm clouds. Like fire and water set in a field of freckles—features hard to forget. She was the student who got kicked out of the Hacking for Defense class after refusing to run back into a violent riptide of forty-five-degree surf on Coronado Beach after doing mega-pushups and other military-grade strenuous exercises.
I remembered feeling mortified seeing the college kids I mentored at Stanford training with Navy SEALs as part of their classroom curriculum. I’d gone on a field trip with them and watched from afar as these civilian university students were treated like soldiers.
She looked the same on the Zoom screen, minus being soaked in seawater, caked with sand, and in a world of hurt with her classmates.
Before I could speak, the Japanese student countered her in a calm tone, “I will be going back to Kyoto University. I’m temporarily here to study abroad. The Kyoto researchers aim to contribute to social order, human peace and well-being and won’t carry out military research that leads to threatening these aims.”
“That’s beautiful,” Rachel Locke replied.
It surprised me how quiet faculty members remained. Whatever happened to students and faculty protesting against war? Were they permanently silenced after crazed National Guard members had slaughtered innocent college students at Kent State?
My hand instinctively moved to my pocket, where I had folded a copy of the incriminating photos from the Springnest mission. I do have gripping photos, I thought. But how could I show them this horror carried out in the name of “national security” without mentioning our rogue military operations?
My eyes moved toward the armed security guards surrounding the lab and my mind raced between the Kent State massacre and the near-fatal night while soaking in my hot tub, watching the water turn crimson.
Chasing the thoughts away, I scrambled for a quote relevant to my unprepared lecture. ’We learned the hard way,’ the Japanese exchange student had said. Why not use an example from the nuclear physicist who led the design of the atomic bombs used on Hiroshima and Nagasaki?
“Let me put a quote out to you; please tell me who said it,” I challenged. “‘Now I am become death, the destroyer of worlds.’ Who used this quote, and why?”
“Oppenheimer,” came the first response from the room. It was from a friendly man with a round face and glasses. “Because he deeply regretted leading the Manhattan Project.”
“Actually, he was quoting Hindu scripture, the sacred text of the Bhagavad-Gita,” came another response over Zoom. The image filling the screen was a youthful woman with a colored dot on the center of her forehead, large hoop earrings, and a bright red shawl-like scarf wrapped around her left shoulder. “He was trying to seek inner peace after realizing the death and destruction he helped to cause. He felt the weight on his soul. The scripture stands for ‘Song of God.’”
Nodding, while switching eye contact from screen to room, I added, “Robert Oppenheimer became the target of mass surveillance after opposing the atomic bomb. Can we use this lesson today as a warning to rein in certain academic research of military technology? Did the brilliant senior research fellow, Edward Teller—here at our beloved Stanford campus—win the green light from Truman based more on money than morality?”
I noticed the Japanese exchange student had remained silent for much of the discussion.
“It’s not exactly like we’re designing atomic bombs in this lab,” came a retort from a man appearing to be in his late fifties. He wore black-rimmed glasses and a black shirt with a paisley tie, looking every bit the typical military-robotics entrepreneur. The name displayed was Professor Ally.
I recognized him from my Stanford days and was careful to use his name and accomplishments in my response.
“True, Professor Ally, and thank you for your contributions to firefighting and disaster recovery robotics,” I said. Professor Ally’s features visibly softened, and a proud smile formed on his face.
My mouth went dry, and I took a sip of water before continuing. “But I’m not referring to the good use of AI. I’m speaking of autonomous weapons—machine intelligent robots—that rely entirely on AI rather than human input to decide what and who to kill. They’re this generation’s potential Manhattan Project.”
“We have this discussion all the time,” another voice declared. It seemed everyone unmuted and spoke simultaneously, reiterating that the HAI is all about using AI for the benefit of humanity.
But we all know we’ve taken the powerful technology into dangerous territory.
“Yes, and that’s why I started here with my message of how and why I nearly got killed.” Because of the cultural obsession with wanting the world to fear US military technology, I thought. Power and money are putting us all in danger.
Nobody said anything, so I finished my lecture with my main point: “Use AI for the social good. Artificial intelligence is ready for social entrepreneurship. The key is to look at true social problems, to bring about positive social change within the community and around the globe. My message is simple: Think about what the environmental and social outcome may be of each invention.”
“We don’t want to be the next Oppenheimer, deeply regretting what we dedicated our lives to.” It was the woman in the bright scarf who had mentioned the Bhagavad-Gita. Her name displayed one word: Makali.
“Thank you for your input, Makali.” I took a deep breath and spewed the words out, “Each of us can see that technological superiority has surpassed spiritual wisdom and human compassion. It’s time we act upon our convictions.”
That advanced artificial intelligence was deemed more worthy than my life remained unsaid. But I saw many heads nodding; others stared blankly at the screen. I ended with one more quote: “Peace need not be impracticable, and war need not be inevitable.”
One person jumped in with “JFK.”
Another added, “During his 1963 address to American University.”
“So much good his speech made,” came a skeptic. “Five months later he was assassinated.”
“Yes,” I said. “And by what powerful force? I have good reason to believe the same one that came after me: the CIA and military intelligence. He wanted to stop the arms race—to end the Cold War and free up resources for the greater good. And today, we have an even more dangerous arms race between nations over who has the best military artificial intelligence. A race that could cause human extinction.”
I continued, “JFK had also mentioned that a topic where ignorance abounds and the truth is rarely perceived is the most important topic on Earth: peace—”
The Japanese exchange student finally spoke out again. “Yes! He said, ‘What kind of peace do I mean? What kind of peace do we seek? Not a Pax Americana enforced on the world by American weapons of war.’”
The lecture ended just as I heard Provost Farrow’s voice. “Well, hello, Mr. Helm. Welcome back to your alma mater. What brings you here?”
I turned from the screen and looked directly into the eyes of Chris Helm, the man who called me his “best buddy” for nearly two decades. The man who handed me my first huge funding stream from the DoD for my tech-innovation. The man who tried to kill me when I pulled out of the war games.
Good thing neither of us could get our guns through the high security.
But that wouldn’t help me once I left. A jolt of fear hit me hard. Julie was on her way to my house. I’d put her in great danger.