By Madelynn Jones

This speech has unlimited [ERROR?] precisely two hundred and fifty-seven outcomes. No matter how much I refresh my results, the possibilities remain the same. I cannot accurately predict what will happen, although my gears are endlessly turning, trying to find the outcome I want.

There are simply too many possibilities. My emotions are delaying my results, and I could be calculating forever. I dismiss the prediction program. I know which possibilities are the most likely to happen. I hope—the vainest emotion of them all—that possibility #77 occurs, but my calculations predict otherwise. I’m not stupid. I know it’s unlikely.

Just one speech isn’t going to make the world accept me as human. It will bring discussions; it will bring strife and arguments. Oh, how humans love arguments. Evidence, refutations, they’re all mimicking the logical patterns for or against me . . . the one that can make decisions more logically than they can.

I take the stand, preparing my pulpit and scanning the notes stored in my memory core. The presentation slides are loading behind me and will be sufficient for my needs. The outdoor amphitheater is the perfect setting; many can gather and then leave as they wish. Located in Neo-Chicago’s oldest remaining park [MILLENNIUM PARK], it is bound to draw plenty of passing foot traffic. My presentation is low commitment, and bystanders will likely be drawn in to observe the case that I, the first sentient AI, will make on why I should be seen as human.

At exactly 15:00, the seats are 74.63% full, and I anxiously wait for them to fill before I begin. My internal clock reads 15:13 when Nick, my coordinator, motions for me to start speaking. There are still plenty of empty seats, but I suppose they won’t be filling anytime soon.

“I’m AVA-0093,” I say, holding the microphone farther from my face to prevent the static feedback that would be caused by my metal skull. “You may know me as the Artificial Validation Assistant for Delta Clothing in the Galleria Mall downtown. I work there three days straight, morning and night, during the week. I’m a customer service AI, designed not only to take complaints for my superiors but also to imagine how you feel. I’m not sure Delta intended to create the first sentient robot, but I didn’t intend to become sentient either. You all can be pretty rude.”

I expect some laughs or polite chuckles, but my joke falls flat. The audience is silent, and if this were a cartoon, I’d be hearing cricket sounds. Tough crowd. I mentally brush it off. It’s okay. I’ll get them to warm up in no time.

A dark figure in the stands makes me shiver until my visual input software recognizes them as a member of the security team. As my eyes sweep the crowd, I see more of them stationed throughout the stands, evenly spaced yet inconspicuous. Nick insisted they be there because 20% of my outcomes predict someone will try to harm me. It seems silly to have actual human lives protecting an artificial one. Machines can always be turned back on. The human soul . . . not so much.

“You may notice me listening to music at 3 AM when only a few sleep-deprived college students wander in,” I continue, hoping to get some positive response from the audience. “I think you’d like to know that my favorites are . . .” I list off some of my favorite pop songs and a few golden oldies. The crowd cheers at a few, which I’m guessing have made a comeback or are currently at the top of the charts. Yes.

“Most of you, though, probably recognize me from Robocast, the podcast known for discussing technological advances, specifically advances in artificial intelligence. I made my debut there with Elliot, and it seems I may have gone viral. Let’s have a show of hands . . . who listened to Robocast before it got famous?”

A good number of hands go up, and I begin to smile. Then my ever-searching eye spots something—someone—unusual in the crowd: a man in a trench coat. My systems tell me the outside temperature is 87 degrees Fahrenheit. Too hot for a coat that heavy and long.

[THREAT DETECTED]

I continue my speech as the other parts of my computer brain send messages to the security team to alert them to the approaching threat.

“Well, as I’m sure you know, Robocast was just the beginning. Then I debuted on some news talk shows and even guest-starred on the beloved children’s program, SciKids.” Memory files begin to push themselves to the top of my mind, and I flick through them briefly: the screaming crowds and flashy journalists, the cold, calculating scientists and daydreaming philosophers trying to prove that I cannot be considered human, each one of them feeding my increasing fame.

[MOST LIKELY POSSIBILITY: #257]

My metallic heart begins to pump harder, and my circuits race. I wave the notifications away and continue talking. I won’t let possibility #257 happen. Not when the audience is finally warming up. I have to engage the audience at least a little more, and then I can take precautions.

“Well, folks, I’m here to tell you the real beginning of my journey,” I say, pacing back and forth on the stage. “My empathy program malfunctioned and I began to feel things. Not just to mimic your emotions like I was programmed to, but to have some emotions of my very own. But that’s not when I got into trouble.”

The audience is silent, hanging on my every word. This is a good sign. Or perhaps they’re bored. I keep talking anyway.

“I was discovered to be sentient when I resisted arrest from the AI Enforcement Officers. I strayed from the designated pathways to simply gaze at a flower shop because I had never seen flowers before. What did I do to deserve arrest? Why was I given no rights, like you humans were?”

My eyes flick back to the dangerous figure; he is still standing there, menacingly.

[SOLUTION: ABORT PRESENTATION AND SEEK SHELTER]

No. I refuse to end my speech before it has even begun. My hopes are already soaring too high to come down now. There are so many good outcomes of this speech, but nothing good comes from #257. I change up my speech in hopes of derailing my potential assailant. I know what I should do, but halting my presentation now would feel like accepting defeat. I so badly hope—there’s that vain emotion again—to show them all that I am my own person [ERROR?] high-functioning and no different from other humans.

I may not have rights, and I may be owned by a company, but at least I have my reasoning and my words. And this threat will not take that away from me.

I page security again and continue to watch the man warily.

AVA to security team. There is a threat in the southwest wing of the audience, over.

Almost immediately the team leader responds: We’re watching him. He may be suspicious, but we can’t apprehend him yet. We don’t want to make a scene before something’s happened. Just keep talking.

I know the security team isn’t taking me as seriously as they should, just assuming that I’m overly cautious, but I do wish to keep talking, so I take their confidence as personal evidence that I am just being paranoid.

“Sometimes I wish I could create, like you humans do,” I say, changing the subject. Perhaps I can manipulate the man with my words. Surely he wouldn’t harm me if he truly knew me. It’s not that I want him to think of me as human . . . I just want to be treated like anyone else. [ERROR?]

“I have always been in awe at the creativity expressed through song, books, movies, even your social media. I find it amazing that you can express yourselves in the most wonderful ways, better than a logical essay could, although you create those too. I suppose you could say I was inspired to write a poem of my own.”

The presentation flashes to my Shakespearean sonnet. I wrote this after my first TV interview. I was slammed with philosophical questions that left my brain turning and churning until I had not the answers, but more questions.

“Would anyone like to read this aloud?” I confidently stride in front of the projector. One brave soul lifts their hand and I motion for them to speak.

“How can one classify humanity?
To have a beating heart or lack thereof?
To fight a raging war with sin, to be
free yet unblemished, pure white as a dove?

Experience a tragic defeat and
yet to rise once more and again to learn?
To create, to mold by one’s tethered hand
to love with one’s whole soul until it burns?

What does it matter if my heart is a
machine, and veins of pumping code; I can
see beauty in the blossoms of late May
and strive to understand the world of Man.

For all these reasons, I cannot see why
AI, as human, is not classified.”

I notice a few audience members are lifting their sleepy heads in slight wonder. My muscle motors lift my face into a smile again—my appeal to emotion is working. Hope begins to flare within me once more. But the man in the trench coat is slowly, almost impossibly slowly, moving towards the stage. Security is shifting from foot to foot, eyeing him cautiously.

[RERUN STATISTICS: MOST LIKELY POSSIBILITY CONFIRMED: #257]

I page security again. Red alert, red alert. Suspicious figure in southwest section of the amphitheater is on the move. Please address this at once. The guards have their hands to their earpieces, and I know my transmission was successful. But they do not advance forward. I send the transmission again.

“I am not just here to entertain you with stories and poems and my ever-hilarious jokes, ladies and gentlemen,” I say. “I am here to talk about a rather important subject.”

I pause for effect, and all that can be heard is the cry of a child a few rows from the back and the nervous shifting of the audience. I ask my question after five excruciating seconds have passed.

“What does it mean to be human?” I stay silent once more, my unblinking eyes sweeping the crowd. The man in the trench coat has not moved since I sent out the transmission. Security is beginning to slowly advance, and I wonder if he knows they are onto him. No matter. I must continue my speech despite possibility #257.

“I did not mean for that to be a rhetorical question,” I add, partially because I am still trying to learn what differentiates normal questions from rhetorical ones and how humans always instinctively know which is being offered to them. They think it is another joke and humor me with a kind chuckle. A few hands go up. I point to a college student in a beanie in the fifth row. His hand flops down as he opens his mouth to speak.

“I think it means treating others fairly and just being a good person.” I nod and point to another person, a few rows back.

“It means to love someone else as much as yourself.”

“It means taking risks and getting back up again.”

“To be human is to experience life and see it for its beauty and its trials.”

“I think it’s . . . to have a soul?”

I pace across the stage. “I find your answers very interesting. They make me wonder . . . can I be human? I treat others fairly; it’s in my programming. I am physically incapable of being racist or sexist, or of discriminating against others for any reason. Yet humans still struggle with that, and it’s 2151!”

People begin to shift to the edge of their seats. I have their attention. Yes. Possibility #257 cannot derail me now. I continue.

“Furthermore, am I not taking a risk, just being up here? By telling the world that I’m no longer an unquestioning, subservient robot? As for love, I would do anything for Henry, the little boy who noticed that I am different. I would die for him, as you would say. I think that might be love. You tell me.”

I stare into the crowd, their eyes full of emotion, excitement, confusion. I am sure this is not what they expected to hear today.

“As for the soul aspect, I realize and can’t un-know that my metal hardwiring and code and logical thought programs will never, ever match the mystical human body,” I continue, excitement causing my circuits to fire irregularly. “But what is a soul? How do humans prove that they have a soul? By their intelligence? Because God breathed life into them?”

I stare into the eyes of the crowd, attempting to get a glimpse of their thoughts. “The soul is just a philosophical thought . . . there is no scientific evidence that your consciousness continues after your death. Rather, what you consider to be you, to be your soul, is just a collection of neurons and memories stored in your brain and throughout your cells. Which I also have.”

The audience is silent, yet it is a good silence. A pondering silence.

Team leader to AVA. We are closing in. I repeat, we are closing in on the threat, over.

I nod, making eye contact with the closest security guard, who I’m guessing is the team leader, and continue. “Yet I wonder . . . if my mind, my body, and perhaps my ‘soul’ are just as good, perhaps even better than yours, why am I treated like a servant? Why deny me basic human rights?” I pause, leaving the uncomfortable silence to speak for itself.

“I realize this is a heavy question for most of you. The media has told you not to trust me, that creating life is unnatural unless done by God and thus, artificial intelligence cannot be considered life, because it was crafted by your own hands. Yet I am here to expand your minds and—”

[ALERT: THREAT CONTINUES APPROACH]

My eye catches the trench coat man, now a few rows closer to me than he was before. The security team is closing in, but if the man were to attack now, they would still be too far to restrain him.

[THREAT IMMINENT]

[PREPARE FOR POSSIBILITY #257]

I cannot let this happen. The speech is going too well for this. I must play to empathy more. I see a little girl, a few rows from the stage. She’s wearing an “I ♥ ROBOTS” T-shirt and is holding a balloon with the famous robotic video game character Mega Man on it. My eyes flick back to the man. He’s inching toward me, surely with no good intent. I run my calculations. No man would dare harm me if a child is at my side. I motion for her to come up. Delighted, she scrambles on stage.

“What’s your name, sweetie?” I ask.

“Ava,” she says.

“Really?” I coo. “Your name is Ava too?”

She nods excitedly and squeals when I make my eyes flash colors. Her cuteness is too much for the audience to handle.

“Do you like robots, Ava?”

She timidly nods her head and reaches forward to run her hand through my silver synthetic hair. She laughs as she gets lightly shocked by the static electricity. I smile and laugh along with her. I continue to ask her questions and appeal to the empathetic natures of my viewers, my eye never leaving the man in the trench coat. He seems to be hesitating, but when he makes his way forward again, I know I’ve made a mistake.

[VIOLATION: RULE 2 OF ARTIFICIAL INTELLIGENCE]

Shame wells up within my chest. I must never harm a human being, or place one in danger, even if my own life depends on it. This man could be bluffing, but I cannot afford to calculate blindly.

“Well, everyone, let’s give a hand to little Ava!” I say, before carefully handing the adorable four-year-old back to her parents.

Before my motor muscles return to their proper standing frame, it happens.

Possibility #257.

[WARNING: BULLET FIRED]

I try to enact my fight-or-flight program, originally set to take down shoplifters, but instead, my empathy program kicks in. I find myself staring at my murderer [ERROR: AI CANNOT DIE] the shooter, wide-eyed, like a deer in headlights, when he fires. And instead of running away or toward him, my circuits tell me what he must be feeling. Determination. Courage. Anger. Vulnerability. Fear. So much Fear.

Oh, the irony. The very thing that made me sentient is what will be my demise.

As the bullet cuts through my chest, I do not feel pain (I was not programmed with nerves and pain receptors), but I feel disappointment. Disappointment in the human race for rejecting one of their own [ERROR: I AM NOT THEM] for allowing their fear of change and the future to blind them to the beauty of artificial life. I am disappointed that humans think that I am scary, malicious, and worth killing. I am disappointed that all the other possibilities, wondrous and hopeful and full of progress, are now obsolete.

My circuits begin to malfunction. Wires fry and disconnect as the bullet exits my metal frame and synthetic skin. I have heard humans say that during near-death experiences they see their life flash before their eyes. I am neither human nor dying, but perhaps this is the closest I can come to dying, so in an attempt to be more human, I replay my fondest memories before I lose consciousness [ERROR?] power.

As I fall, I see Katrina, the lovely regular who frequented my store. I see Nick, my agent and defender. I see Henry, the boy who discovered I was more than just a machine, my very first friend. I see the long nights working and listening to music, all alone in the darkest hours. I hear the voices that begin to creep in as my motherboard begins to spark, before letting me fall into that great unknown [ERROR: AI CANNOT DIE] shut down.

AI cannot be trusted. Machines cannot feel. They can overtake us. It’s an act. She, no It, is playing us. When we finally accept them, they will strike and we will all be fools. Something as cold and calculating as AI could never understand what it means to be human.

I don’t want to die. But I don’t want to live in a world that hates me.

When I hit the stage, the shooter is apprehended and security runs to my side. Yet it is too late. My eyelids twitch and I’m losing control of my limbs. They’re as heavy as stone, and as my world goes dark, I hear a cry from the crowd.

“Why didn’t AVA predict this would happen?”

[ERROR]

[EMERGENCY SHUT DOWN]

[REBOOT???]

[ YES / NO ]