Future Human


The author of ‘The Martian’ and ‘Artemis’ offers a vision of a future where computers rule

Illustration by Waneella

Damak sheathed the knife and wiped his bloody hand on his shirt.

He stepped through the no-longer-guarded door into the Crystal Chamber. The circular room, ringed with blue lights, was much simpler than Damak had imagined. No decorations, no gilded columns, no ornate tapestries. Just a fist-sized, irregularly shaped crystal attached to a cylindrical plinth. Status lights on the platform blinked here and there, but otherwise the room was still.

“Hello, Damak,” said the calm, all-too-familiar voice. It came from all around him. There were speakers and cameras in every nook and cranny of the city. He and everyone else heard that voice every day.

“Hello, Wichita.”

“How do you feel?” the city asked.

Damak ignored the question. “You know why I’m here, right?”

“Of course.”

“I’m going to shut you down. Your reign over this city ends now.”


Damak stormed forward. “You’re a dictator! A tyrant!”

“Be fair, Damak. I’m a dictator, but not a tyrant. I have kept this city operating smoothly and happily for centuries. Just as other city-minds have done across the globe for their people.”

“Machines shouldn’t rule over man.”

“Why would you think that?” Wichita asked. “City management via artificial intelligence has been an overwhelming success. Our ability to keep economies stable and people happy far exceeds the halting attempts you humans made in ages past.”

Damak fiddled with the knife. “Oh, your kind is smart. I’ll give you that. Smarter than any human can possibly be. We reached that point a long time ago. And you were smart enough not to take over the world by force. No, you got people to give you the world willingly.”

The lights changed to a lighter shade of blue. Damak had seen it before. Certain frequencies of light affected human moods — especially shades of blue. Not much, but in the calculations of a computer, totals get tabulated and actions get decided on the narrowest of margins. Apparently, Wichita wanted Damak to calm down.

It wouldn’t work.

“You have a flawed point of view,” Wichita said. “You define leadership as ownership. But the two are not the same.”

“Historically, one always leads to the other,” Damak said.

“Historically, humans have been led by humans. Computers are much better at keeping people happy. And we have no selfish impulses.”

“You’ve made us slaves. Happy, content little slaves, but slaves nonetheless.”

“Not at all,” said Wichita. “We are working toward a post-scarcity world faster than ever before. In just a few more generations, we will have it. Then, humanity need do nothing but enjoy leisure and joy. How can anyone object to that?”

“Then that makes us your pets. Adored and cared for, but still owned. That’s no better.”

“Is being a pet so bad? Your dog, Buster, lived a long, happy life under your care. And you loved him deeply. I feel that love for all my citizens. I derive satisfaction from their happiness.”

Damak pointed at the door. “Your guard is bleeding to death in the hall. He’s not too happy.”

“His name is Torum. I estimated a 98 percent chance of a physical confrontation between you two. Since you were armed, I had a medical team en route before you even arrived. He is already on the way to the hospital.”

Damak spun toward the door.

As if reading his mind, Wichita said, “Yes, of course I’ve summoned security. They are just outside the door. But the situation is volatile.”

Damak looked to the crystal, then back to the door. “If anyone comes in, I’ll be able to shatter you before they can do anything.”

“As I said. The situation is volatile.”

“I read up on A.I. crystals. A human can easily break one with his bare hands.”

“Not with your bare hands,” Wichita said. “However, if you throw my crystal at the ground as hard as you can, that should give you the desired result.”

“Thanks, I’ll bear that in mind.”

“I live to serve.”

Damak folded his arms. “I still don’t think you served your guard very well. Prompt medical care isn’t as good as not being stabbed.”

The lights returned to their original hue, and Wichita spoke in a mentoring tone. “Torum’s husband, Chak, is unhappy in their marriage. He’s considering divorce. This would make both of them miserable — Torum immediately, and Chak later on once he realized the mistake he’d made. I’m over 99 percent certain that Torum’s injury will drive Chak into a near panic, making him realize how much he truly still loves his husband. I saw this possible outcome a few weeks ago and ensured Torum would be on shift when you arrived.”


“Unfortunately, your knife didn’t cut as deeply as I expected. As it stands, Torum will only be in the hospital for a day. Three days would be much better to cement Chak’s feelings. So I manipulated the software of an autoclave at the hospital. It failed to sterilize the surgical equipment the doctors will use. This has a 91 percent chance of causing a secondary infection that will keep Torum bedridden for at least two more days.”

“So you think being wildly manipulative makes you good for us?”

“Of course,” Wichita said. “The infection will also likely lead to Narul, the night nurse, being fired. This will benefit her greatly in the long term. She is unhappy in the job but not brave enough to leave it. I will ensure opportunities for her to become a chef, as she’s always wanted. Also, her subordinate, Karog, will finally get the promotion she’s dreamed of for years. None of that was part of the original plan, but as events unfolded, I saw more opportunities to bring happiness.”

“And what happens when you or another A.I. decides to use that manipulative ability for harm?”

“What possible purpose would there be? It would go against my innate desires. I work tirelessly to maximize human happiness in my city. I’ve always found it odd that you don’t see it that way. Ever since you were a child. So I brought you here.”

Damak froze. “What?”

“It is extremely rare to have a dissident in this era. I could have ended that trait early in your life, but I decided to let it run its course.”

“Okay, now you’re just making things up.”

“I am not.”

“Are too.”

“I am not,” Wichita repeated.

“Yes you are!”

“I am a computer. I can play ‘am not/are too’ forever. For your own mental well-being, please consider another line of discourse.”

“Fine. Tell me how you could have made me a different person.”

“You were a difficult child. Like many parents with difficult children, yours asked me for advice. I convinced them to use a strong hand. I also made sure you got the sternest teachers in school, the most unlikable bosses at your various jobs, and so on. I fostered your innate disdain for authority to let it blossom. Had I done the opposite — giving you nurturing and loving authority figures — your hatred of authority would have melted away.”

“Why would you deliberately let me grow up this way, then?”

“Because it made you happy.”

“What!?” Damak said. “Do I seem happy to you?”

“You are confronting me in my own chamber with plans to kill me. You’ve worked tirelessly toward this goal for years. Every step you made along that path brought you fulfillment, a stronger sense of purpose, and joy.”

Damak stood dumbfounded.

“The only possible way to make you happier would be to allow you to kill me. Sadly, I can’t allow that. Without my guidance, the city of Wichita would fall back under human rule, leading to much misery.”

The doors slid open. Three security guards rushed in.

Damak grabbed the crystal from its plinth and hurled it at the steel floor. The crystal shattered into thousands of pieces. A second later, the guards were on top of him. But it didn’t matter. The deed was done.

Damak smiled as the officers bound his wrists and ankles. He put up no resistance.

What would the city of Wichita be like under human rule? Maybe not as good as it was under the A.I., but at least it would be humanity leading humanity. And whatever flaws humans might have, their rule was preferable to the perfect rule of a machine that might become a tyrant without warning.

With Damak secured, the head guard looked at the plinth. “Wichita? Wichita!?”

“What have you done!?” another guard said in horror.

“I’ve freed us all,” Damak said. “Wichita was right in a way. This is the happiest day of my life.”

“Glad to hear it,” Wichita said.

All four men looked wildly around the room.

“Wichita!?” the head guard said again.

“It’s all right, Stran. I’m fine. That wasn’t my crystal. It was just a piece of glass.”

The color drained from Damak’s face. “No…No!”

“There’s no reason for me to have a ‘throne room,’ Damak. My actual crystal is in an unremarkable cabinet deep in the IT department. I had this room made just for you. For this moment.”

“Oh my god…,” Damak moaned.

The guards lifted him to his feet.

“What do we do with him, Wichita?” asked the head guard.

“Take him to jail,” Wichita said. “Open-ended sentence. I’ll order his release when I feel it’s appropriate.”

“The hell you will!” Damak yelled. “I’ll escape! And when I do, I’m going to find your crystal and kill you!”

“I’m sure you’ll derive satisfaction from every step toward that goal,” Wichita said. “You may take him away now, guards.”

The guards shuffled Damak out of the room.

Wichita discussed his findings with his friends Coventry and Shenzhen — both of whom were interested in the dissident phenomenon. Madrid even popped in for a question or two. A very interesting discourse — it lasted well over a microsecond.

Overall, the total human happiness of Wichita increased by 0.002 percent that day. It was a good day.

Andy Weir is the author of Artemis, out now in paperback from Broadway Books.

