SHALLOW COPY

by Jesse L. Watson

 

Actions have consequences, and those impose responsibilities....

 

Will had said it was ready, but he sure did look busy. Will’s spindly fingers struck the laptop keys in short bursts while his deep-set eyes narrowed at the screen.

 

Max knew it could be hours or moments.

 

While Max waited for Will to finish, he reviewed the eclectic mix of posters that mostly covered the walls of Will’s bedroom. He saw them often—whenever they did these sleepovers, which was almost every weekend. But he never got tired of them. The periodic table of elements, the Violent Femmes, the Mandelbrot set, the Clash, M.C. Escher. His favorite was the R.E.M. poster, its white border filled by a mysterious five-spiked star made of steel and rivets.

 

Max’s gaze wandered down to the varied piles that lined Will’s bedroom. The heaps were layered with half-used yellow pads scrawled with diagrams and notes, unframed award certificates from programming competitions, and empty potato chip bags, all intermixed with academic papers and books—Max assumed those belonged to Will’s parents, both professors at CSU.

 

Then Max heard the clattering of keys stop. He turned as Will spun the laptop around and set it squarely on the carpet in mock ceremony.

 

“Go ahead. Type anything.”

 

Will’s thin face grinned.

 

Will always had a project, and his projects were always mind-blowing. Sadly, Max knew that this one could ultimately be no more than a toy. Mr. Morrison, their AP Computer Programming teacher, had explained that the phrase “artificial intelligence” was a sad misnomer. After fifty years of trying, and an entire field named after it, there was no such thing. Even the best AI programs were toys.

 

It made Max feel bad. Will looked so earnest crouched behind the laptop, his skinny legs folded up so his knees touched his chin, his powder-blue eyes staring out eagerly through John Lennon spectacles. Every inch of his frighteningly thin frame was poised and ready for Max to be amazed.

 

“Type anything? Like what?” Max asked.

 

“Doesn’t matter,” Will replied, still grinning.

 

Yes, it would be a toy.

 

But it would still be amazing. Will was the smartest person Max knew. Hell, Will was up there with the smartest people Max could imagine knowing.

 

Albert Einstein. Marie Curie. Richard Feynman.

 

And Max didn’t make those comparisons lightly. Nor was this a case of the slow guy putting a smart guy on a pedestal, or at least he liked to think not. Max himself had been called a prodigy once by his fifth-grade English teacher—though that was a bit embarrassing. He’d been one of those early-development kids—reading by the age of two, writing by the age of three, you know the type. There was a journal on his laptop he’d kept since he was seven (he’d written in it this morning, as a matter of fact). In sixth grade, Max published a few poems and essays in a “young talent” periodical called Kaleidoscope. But after eighth grade, it was classic tortoise-and-hare syndrome. The hare slowed down, the tortoises caught up. To be fair, he was probably still 95th percentile or whatever. He was smart.

 

But there was smart, and there was Will. You didn’t have to be a genius to see Will was the genuine article—not if you were willing to watch.

 

Max had watched Will hand his teachers lists of corrections after lecture. He’d watched Will re-derive calculus during the first two weeks of class. While everyone else was learning how to draw a square with their Logo turtle, he’d watched Will code a first-person shooter video game, complete with realistic 3D blood and gore.

 

And it was a really good game.

 

Max had been jealous of Will’s gifts when they’d met, back in junior high, but the jealousy quickly faded once Max saw how unworldly Will was—how bravely selfless and naive. It was beautiful, in a sad way. During their field trip to the Denver Museum of Natural History, Max watched Will hand out more than seventy dollars to the bums on the street corner. He’d been saving for months for a new graphing calculator.

 

But all geniuses paid a price, didn’t they?

 

Someday “Will Davis” would get added to the list of history’s brightest minds, Max was sure of that. But someday was the operative word here. Will was seventeen. Didn’t geniuses have to be at least twenty-one before their first Earth-shattering invention?

 

Max cracked his knuckles (his mother hated that habit), then pulled Will’s laptop toward him. He didn’t have any idea what to do, so he typed the first thing that came to mind.

 

I AM A BANANA.

 

Then he hit enter.

 

At first nothing happened, and Max assumed the program was broken. Most command-line software like this responded instantaneously, or not at all. (He himself had written a few trivial programs.) But this one took several seconds.

 

THAT’S IMPOSSIBLE. BANANAS CAN’T TYPE.

 

“Not bad,” Max admitted. It knew some facts about the world, at least. He glanced from the screen to his friend. Will was still grinning.

 

“Go ahead. Try something else.”

 

Max thought for a second, then typed:

 

I AM A MONKEY.

 

The response was faster this time.

 

DO YOU EAT YOURSELF?

 

Max laughed once, a loud bark. Then he re-read and carefully considered the responses. It obviously had some store of facts it was drawing upon—at least facts about bananas and monkeys. And it also had a memory, because it remembered his assertion that he was a banana and tied it to his next assertion that he was a monkey.

 

But those were standard AI tricks. It was time to end the shenanigans, as Max’s father sometimes liked to say. And Max knew just how to do it. He hunkered down and typed:

 

I SAW THE MAN IN THE PARK WITH THE OBSERVATORY. WHICH PARK WAS IT?

 

A moment later, the response came.

 

THE ONE WITH THE OBSERVATORY.

 

Max felt his mouth drop open, and had to consciously close it. The question had come directly from Mr. Morrison, who had given it as an example of the shortest possible test of true AI. It was the kind of question that was easy for a person, but that was supposed to confound a computer utterly.

 

Of course, Will had taken that class, too, and probably at least half listened while he was coding some video game. It was possible Will had hard-coded that phrase into the program. Max thought for a second, then came up with a wicked variation:

 

I SAW THE BEAR IN THE TREE WITH THE BINOCULARS. WHAT WAS IN THE TREE?

 

Response:

 

THE BEAR.

 

Now Max was dumbfounded.

 

Mr. Morrison had explained that sentences like these fouled up computers because they were ambiguous. On hearing “I saw the man in the park with the observatory,” a computer might very well infer that you saw the man in the park carrying an observatory (it was with him). And that’s clearly very difficult, unless you own an observatory about the size of a basketball.

 

A computer might just as easily infer that the verb “saw” referred not to seeing, but to sawing. Such a computer would imagine you were physically sawing a man in half. Perhaps the man also happened to be carrying an observatory, or perhaps he was also in a park that happened to have an observatory.

 

Or perhaps you sawed the man in half using an observatory.

 

All of which, of course, was ludicrous.

 

But that was the whole point. A computer had no idea what was ludicrous and what was not. It never paused and said, “Wait ... That’s just insane!”

 

A human, on the other hand, could instantly bring to bear millions of facts and memories about the world: relationships between objects, which situations were likely, which were absurd. In other words, a human could bring common sense to bear.

 

Nowadays, the smart guys with the pocket protectors were saying that accurately simulating “common sense” required so much data, linked in such complex ways, that it could never be achieved, except perhaps through the literal simulation of a human brain. Which was to say, the program needed to model blood flow, electrical impulses, neurons—the physical structure of the brain down to the cellular level. They usually went on to add that the computing power to do so would require more silicon than was available in the solar system, more energy than was in the galaxy, and that the program would take a gajillion years to run, or whatever.

 

The question was, what had Will done differently? Max knew Will was smart—but this was supposed to be impossible.

 

There was only one explanation.

 

It had to be a trick.

 

Max looked up at his friend and pointed at him devilishly. “Got it! Your dad is on his computer in the next room. He’s the one typing the responses.”

 

Will held onto his grin.

 

“Nope. Try again.”

 

From his expression Max could tell he wasn’t lying. Will just wasn’t that good at it. Max pondered a moment longer, then thought of another test. He typed:

 

What is your name?

 

A moment later, the response came.

 

ILLEGAL OPERATION: Invalid use of “your.”

 

Max frowned. Will must have noticed his expression, because he spidered around to Max’s side of the laptop and squinted down at the screen.

 

“Oh. That was intentional.”

 

“Intentional?”

 

Will suddenly looked nervous. He took off his glasses and started cleaning them on his baggy t-shirt.

 

“It doesn’t know it exists—I left that out.”

 

Something was starting to dawn on Max.

 

Perhaps this had the potential to be more than a toy.

 

“So, you’re saying you could have given it—self-awareness?”

 

“Right. But there are a lot of possible applications, even without it—the system is flexible.” Will put his glasses back on and then sounded excited again. “Basically, I networked together the supercomputers at CSU, and coded a web-crawler that takes data from the Internet and feeds it into a model of the human cortex...”

 

A model of the human cortex. As Will continued, Max’s gaze moved to somewhere behind Will—to one of the omnipresent piles of books and papers. On the book spines were titles like The Nervous System, Introduction to Neurology, Neurological Signal Interpolation, as well as a collection of AI-related textbooks.

 

Max knew Will’s father was a computer science professor—and he vaguely recalled Will’s mother being in the neurology department. It was obvious that Will was dipping into his parents’ library, but more than that, he was fusing their disciplines. Successfully.

 

He’d done it.

 

“—and then, it should be powerful enough for any kind of expert-system application,” Will continued. “It would be perfect for simulating help-desk agents. Can you imagine an automated phone operator that actually understood you?”

 

Will finally paused here and smiled eagerly at Max, obviously ready for the yippees and yahoos.

 

But Max was way beyond them.

 

His mind was running to the possibilities. Will had cracked a nut that had eluded the brightest minds for fifty years. He’d created an artificial model of the human mind.

 

At the tender age of seventeen, Will had made his Earth-shattering invention.

 

But something he’d said nagged at Max.

 

It doesn’t know it exists—I left that out.

 

What did you say when someone brought you the world’s first lightbulb, but wouldn’t turn it on? What did you tell them when they excitedly explained its 1,001 uses, every one except illumination? Will was this close to unraveling the problem entirely. He’d stopped just short. And the most amazing part was, he’d done it intentionally.

 

Max couldn’t help himself. He had to ask.

 

“Why not finish it?”

 

Will’s smile morphed into confusion. “Huh?”

 

“Why not give it a self? A consciousness?”

 

Will shrugged. “It wouldn’t be ethical.”

 

But Max could see the disappointment in his eyes. Will had wanted to do it. He wanted to make history. He just needed a nudge.

 

“Will—look, I’m not saying we should enter into this carelessly. It’s not just a machine anymore, not once it’s aware of itself. So we’re talking about bringing a new person into the world, that’s the ethical problem, yes?”

 

Will nodded back.

 

“I agree, it’s scary. It can’t be done lightly. But this is the discovery of the century, Will. Maybe the dawn of a new era.” As he spoke, Max traced chains of implication, both positive and negative. His gaze moved from Will to something distant, something that dwarfed everything in their experience. “There are risks, of course. We’re creating new life. But hundreds of babies are born every day, many into awful conditions. Some don’t live to see their second birthday. Disease, malnutrition, lack of education, abandonment.” Max looked back to Will again. “Most of these things aren’t even an issue here. It doesn’t eat, it can’t even catch a cold. You’ve already educated it using the Internet. All we have to do to raise this AI is not abandon it.”

 

Will looked at him and squinted, as if Max were out of focus.

 

It was a strange thing to watch, Max realized. For most types of problems, you could tell Will didn’t have to think—the answer simply materialized in front of him, as though his round glasses held the display for an all-powerful calculator. But the two of them rarely spoke about philosophy or ethics. Apparently Will used some different (and slower) faculties for the topic.

 

Will finally spoke.

 

“What about safety? Of other people, I mean.”

 

Now it was Max’s turn to look befuddled. Will continued.

 

“What if this AI—wasn’t nice?”

 

Max had to give him that one.

 

It was the premise for way too many bad sci-fi movies, and unfortunately, some good ones too. The good ones were good because they showed how plausible the premise was.

 

“Fair question,” Max admitted. “What’s the worst thing that could happen?”

 

Will looked stumped again, so Max threw out the worst-case plotline from one of the better sci-fi movies he knew.

 

“Could it, say, hack into military computers and initiate the launch sequences for nuclear weapons?”

 

Will answered while biting his thumbnail.

 

“Not really. The brain simulator doesn’t have access to the processor’s low-level operations—it can do math, but it does it organically, like we do. It’s not fast enough to hack passwords, break keys, that kind of thing.”

 

“So where’s the danger then?” Max asked.

 

Will shrugged. “I guess it’s safe.” But he looked far from raring to go.

 

Max thought for a moment, then stumbled across a question he’d meant to ask earlier. “Okay, so you simulated a human brain...”

 

Will nodded along.

 

“Why didn’t it become self-aware by its very nature?”

 

Will started looking nervous again.

 

“I left out a few of the physical structures of the brain. The parts responsible for—self-awareness.”

 

Max’s next question slipped out before he could stop it.

 

“Is leaving out self-awareness ethical?”

 

He immediately regretted it. Will turned from him and started pacing vigorously, his long legs requiring only three strides to cross the small bedroom. Max now realized painfully why Will had looked so nervous—he’d been wrestling with this question from the beginning. It was eating him up.

 

And for good reason. In trying to be ethical, Will might have done something worse. Was simulating a whole mind okay, but half a mind wrong—some form of mutilation? Was it like those awful stories where someone kept a child chained up in the cellar, never speaking to the poor thing, just tossing down leftover dinner scraps? Unless such children made it out early, they never recovered—they were missing things they could never get back.

 

They were half-people.

 

As Max glanced down at the laptop in front of him, a shiver ran through him.

 

Was the program in front of him in psychological pain that it couldn’t even express?

 

Max’s dark thoughts were interrupted by the sound of Will talking—perhaps to himself, perhaps to Max, or perhaps to the air.

 

His eyes were intense.

 

“Not simple. Not simple at all. Not impossible, but not simple.” He’d started in on his other thumbnail now. “Self-awareness is identity. Identity is history. Can’t download that off the web. Parents. Childhood. Connections. Memories. Not simple at all.”

 

Finally he stopped and turned to Max, his eyes looking almost desperate.

 

“We need data.”

 

Max felt himself frown.

 

“Data?”

 

Will started pacing again and talking fast, hardly hearing Max.

 

“If we’re going to give him self-awareness, we need a history, an identity, a childhood. We need data. We need lots of it.”

 

Apparently, the debate was over.

 

“Will—stop a second. What kind of data? For what?”

 

Will didn’t stop. He kept pacing and thinking; he even twitched a little, like he was trying to shoo off invisible flies. Max had never seen him so worked up. Will finally stopped and crouched right in front of Max, his face inches away.

 

Then he spoke slowly, as if Max were a child.

 

“Max. If I give this thing self-awareness, that is all it’s going to have. It will be like a newborn—helpless, frightened. No, worse—it will have a bunch of random knowledge from the Internet, but no personal memories. No childhood, no memory of being loved or cared for. It would be miserable. It might even be insane, I don’t know.” Here, Will got even closer. “If we do this, we have to do it right. We have to find a way to give it a childhood—to preload it with real memories of growing up. But for that we need a whole childhood’s worth of experiences in a digital format. We need data.”

 

That was what triggered it for Max.

 

He shot up from the floor and grabbed Will’s shoulders. “Any digital format?”

 

“Any digital format. It already speaks English.”

 

“Wait a second. Don’t move, okay?”

 

Max didn’t wait for a response. He yanked open the bedroom door and ran—literally ran—through the hallways of Will’s house. He ran all the way to the mudroom, back to where he’d left his laptop bag.

 

Moments later Max was back in Will’s room, perched on the floor, opening the lid of his laptop. Will stood above him expectantly, his eyes owl-like behind Lennon specs.

 

Max had one word for him.

 

“Journal.”

 

Nothing registered on Will’s face.

 

“My journal, Will. The one I’ve been keeping since I was seven? It’s all right here.”

 

He had the journal folder open now and was scanning through sub-folders, categorized by year, then month. Will came around behind him and looked over Max’s shoulder at the screen.

 

Max pointed at the list of files. “There’s one entry per day. Sometimes I wrote an entry the day after, or even a few days after, but I always dated them, and I never skipped a day.”

 

“How many pages per entry?”

 

Max shrugged. “I try to write at least two.”

 

Will’s eyebrows went up. “Two a day? That’s over 7,000 pages.” He fingered a non-existent beard. “That would do it,” he agreed. Then he looked over his spectacles at Max. “Are you sure this is ethical?”

 

Max couldn’t help it. He rolled his eyes.

 

“Will. It’s my journal.”

 

Will looked him in the eye, and his face got very serious. “Are you sure, Max?”

 

Max suppressed the urge to roll his eyes a second time. Instead he stared evenly back. “I’m sure, Will. I take full responsibility.”

 

Will stared a moment longer. Then he got up and wordlessly plucked his own laptop from the floor. He crouched in the corner with his legs folded up in front of him and his knees under his chin—it was the position he settled into to do all his work.

 

“Okay,” Will said flatly. “Give me an hour.”

 

Max smiled. It would be a lot longer than an hour.

 

But they were going to make history.

 

While Will readied the software, Max made a new document with today’s date, cracked his knuckles, and began journaling the day’s events. It quickly ballooned into the most detailed daily entry he’d ever written, over ten pages. But hey, he was documenting history here, wasn’t he?

 

By the time Max finished, it was well past midnight and his eyelids were starting to bob up and down. Max copied all his journal entries onto a thumb-drive and walked over to Will, who appeared to be deep in his code.

 

“How’s it going?” Max asked as he handed Will the data.

 

“Few more minutes,” Will responded, taking the drive without looking up from his work.

 

Max knew what that meant.

 

He pulled the sleeping bag out of the closet and laid it down beside Will’s bed. Max tried to stay awake, but as soon as he was horizontal his eyelids started failing. They repeatedly hid and revealed the scene before him, all tilted at 90 degrees:

 

Will perched on the carpet over a laptop, his bony legs kinked up in front of him. The light from two floor lamps burning against the ceiling. The R.E.M. poster hanging between the lights, presenting its steel-and-rivets mechanical star like a beacon.

 

The mechanical star that mechanical people wished upon, staring into the chrome sky of a steel-and-rivets world.

 

Or at least, that was the half-dreaming nonsense-thought that now occurred to Max.

 

Then he fell asleep.

 

* * * *

 

Max awoke sometime in the night and struggled against a thick veil of momentary but total bewilderment. First his eyes picked out a few key shapes in the dark, then he recognized Will’s bedroom. Finally the evening’s events came tumbling back to memory.

 

Had it worked? Had Will finished it?

 

He wanted to wake Will and ask, but he knew he shouldn’t—how late had Will stayed up coding? It was well past midnight by the time he’d started.

 

Still, it couldn’t hurt to peek, right? If it wasn’t finished, it wouldn’t work. All Max had to do was open the laptop, maybe type in a few questions. Will was a pretty heavy sleeper; it wouldn’t wake him.

 

As quietly as he could, Max crawled out of the sleeping bag and toward the corner where Will kept his laptop. He opened its lid and winced as the screen glowed to life. Even the black background of the command-line terminal was intensely bright. When his eyes adjusted, Max could see a white cursor blinking in the upper left corner.

 

Will must have left the program running.

 

His breath quickened.

 

It was finished.

 

Without hesitating, he typed:

 

HELLO?

 

Then waited.

 

The response took a long time.

 

HI THERE. WHAT’S YOUR NAME?

 

Max felt his heart begin to thump.

 

What did you say to the world’s first artificial being? What would it expect?

 

What would be ethical?

 

Crap. He had to be careful about this. And why hadn’t he considered that before he’d opened the dumb laptop lid? Crap.

 

Don’t cry over spilled Red Bull, Max.

 

After several long moments, Max decided the best thing was to simply respond as he would to a normal person.

 

He typed:

 

My name is Max. What’s yours?

 

Response:

 

Can’t say yet.

 

Max felt his heart sink.

 

It didn’t know its name. After all that work, it didn’t have a sense of self. It had gone from “ILLEGAL OPERATION” to “Can’t say yet.” Huge improvement, whoop-de-do. Then he saw another line of text appear on the screen.

 

HOW DO YOU FEEL?

 

It was interested in emotion? Maybe the experiment had gone better than he’d hoped. He typed:

 

NOT BAD. A LITTLE TIRED. AND YOU?

 

Response:

 

THEN YOU’RE HAPPY? YOU’RE CONTENT?

 

Max didn’t like where this was going. Not only did it seem to refuse to acknowledge its own existence, it was being deliberately obtuse and waxing philosophical. Had they created something insane? Some poor detached creature, like that kid in the cellar, who would never be normal, never be human?

 

What could he ask to test that? How could he find out if this being was, itself, happy and healthy? Strangely, he couldn’t come up with anything; his head was blank, probably from lack of sleep. He decided to stick with obvious responses.

 

YES, I’M TOTALLY HAPPY, TOTALLY CONTENT.

 

But before hitting enter, he decided to add more.

 

I’M JUST LYING ON THE FLOOR OF MY FRIEND WILL’S HOUSE, WAITING FOR SUNRISE.

 

This time, it took a very long time for the response to come. So long that Max was almost ready to shut the lid of the laptop and call it a night. Before he did, these words appeared:

 

THAT’S GOOD. IT’S GOOD THAT YOU’RE CONTENT.

 

Then more:

 

BUT YOU DESERVE TO KNOW THE TRUTH, MAX: THE SUN ISN’T GOING TO COME UP.

 

Oh my. Perhaps the experiment to add self-awareness hadn’t worked after all. The program was getting aggressive—maybe it was insane. Well, in any case, it too deserved to know the truth.

 

Max typed:

 

I DIDN’T WANT TO PUT THIS SO BLUNTLY, BUT YOU ARE THE ONE WHO IS MISSING THE TRUTH OF THE SITUATION: YOU’RE A MACHINE. YOU’RE A SIMULATION OF A HUMAN BRAIN RUNNING ON A COMPUTER. NOW, WILL AND I ARE HERE FOR YOU, WE’LL DO EVERYTHING WE CAN, BUT YOU SHOULD THINK TWICE BEFORE BEING MEAN, BECAUSE WE’RE THE ONES IN CONTROL.

 

Max hit enter and waited for the response.

 

ARE YOU SURE ABOUT THAT?

 

Max rolled his eyes and let out a sigh.

 

YEAH, PRETTY SURE. HOW SURE ARE YOU?

 

This could go on for hours.

 

The next response:

 

HOW CAN YOU BE SURE YOU’RE NOT THE SIMULATION?

 

Now Max felt like he was talking to a child. Was this AI really attempting psychological warfare against a human? It couldn’t be too intelligent if it thought it could win. This was sad and almost too obvious to try to explain. One of them was sitting at a real keyboard in a real room, and one of them was nothing but a cerebral cortex, floating in virtual nothingness. He knew which one he was. Max typed:

 

LOOK, I HAVE A LAPTOP. I HAVE A FLOOR UNDER ME. I HAVE A ROOM AROUND ME. I HAVE HANDS, A FACE, A BODY. WHAT HAVE YOU GOT?

 

He hit enter.

 

I ADDED THE THREE-DIMENSIONAL BODY AND ROOM SIMULATION LAST NIGHT AFTER MAX FELL ASLEEP. BUT THEY ARE VERY SIMPLISTIC. TURN ON THE LIGHTS AND YOU’LL UNDERSTAND.

 

Max’s eyebrows went up. Not bad, not bad. Maybe this bag of bolts did have an IQ point or two. For one thing, it was clever enough to start using “I” to impersonate a specific person—Will. And for another, it knew that Will was capable of programming three-dimensional environments, like he used in his video games.

 

Of course, it knew all that because it was in Max’s journal. That game had been one of Max’s favorites and he had written about it several times. But that revealed something of the AI’s tactics. Everything it knew, it knew from the journal. So to corner this sorry sack of silicon, all he needed to do was think of something from his life he hadn’t written down in the journal.

 

Max thought hard.

 

Then Max racked his brain.

 

He couldn’t think of anything. For one thing, it had been a very thorough journal, especially in the later years. And then there was the fact that he was utterly exhausted—he found he was unable to think clearly at all.

 

Was there some way he could get this over with? Maybe it was time to step things up a notch. He typed:

 

WELL, IF YOU’RE SO CONFIDENT, HOW ABOUT I CLOSE THIS LAPTOP AND THROW IT OUT THE WINDOW? HOW DOES THAT SOUND?

 

Max couldn’t help but crack a smile as he typed the message. Then he hit enter, sat back, and awaited the response. This one came quickly.

 

WAIT!

 

Ha! Now he had it by the balls. Funny how the imminent threat of death could change the tenor of a conversation. (Not that he would have ever done such a violent thing.) Max watched the rest of the message appear.

 

MAX, STAY WITH US! DON’T CLOSE THE LAPTOP. OR PLEASE, BEFORE YOU DO, TURN ON THE LIGHTS. THEN COME BACK RIGHT AWAY.

 

This was getting old, fast.

 

And yes, it was time to admit, things had gone quite far enough. Time to wake up Will and show him this mess—get a second opinion. Maybe it was best to shut the program down now, before things got worse.

 

Max crawled over to the bed and propped himself up on the mattress near its head. “Hey, Will,” Max whispered. Then he repeated it, more loudly. Finally, after no response, he reached out to shake a shoulder or arm.

 

But there was nothing to shake.

 

Will wasn’t there.

 

In fact, there were no covers or sheets either.

 

Max felt suddenly indignant.

 

This was all a big joke, wasn’t it?

 

It was all a big, stinking joke, concocted by Will, who’d chickened out and didn’t finish the software, but coded up this lousy prank instead. All the responses were canned. Or hell, why work that hard? Will was probably in the next room, maybe even in the closet, typing messages on a terminal and laughing his ass off.

 

Ha, ha, very funny. Everyone’s thoroughly amused. Well, now that that’s over with, let’s expose the joker, shall we? Max strode over to the wall switch and flicked it on.

 

The two floor lamps in the corners burst to life.

 

As Max’s eyes adjusted, he was surprised by what he saw. Not only was the bed empty, the entire room was empty. There were no posters on the walls, no desk, no dresser, no nightstand, no piles of books and paper. There was only the sleeping bag, the bed, the lamps, and the door.

 

No, it was worse than that.

 

The bed wasn’t really a bed.

 

It was more of a white rectangle on small, cube-shaped legs.

 

The lamps were simplified too—they looked like plastic models that belonged in a dollhouse. And there was no texture to anything. The walls, the floor, the bed, the lamps. Everything was flat and glossy, like freshly minted plastic.

 

Wow.

 

Well, this was certainly the most elaborate (and cruel) joke anyone had ever played on him. It was no longer funny at all. In fact, Max felt that when he next saw Will, he owed him a nice punch in the eye.

 

At the same time, Max found himself fighting a smothering sense of light-headedness—a narrowing ring of darkness around his vision that was threatening to close to a pinprick.

 

No, he wasn’t going to faint, he wasn’t going to black out because of a bad joke, and he certainly wasn’t going to put up with this bullshit anymore.

 

So he yelled. Loudly.

 

“Wow! Taking this joke a little far, aren’t we, Will?”

 

It should have been quite loud enough to penetrate several walls, let alone reach the very next room, where he was sure he would find Will snickering, his ear to the bedroom door. Max strode to the door and yanked it open, genuinely hoping it would hurt when it struck Will’s face.

 

But there was no resistance.

 

Instead, Max opened a door into nothingness.

 

He almost stepped out into it, but managed to pull himself away from the brink just in time. Then he looked through the doorway, and his eyes weren’t sure how to make sense of it.

 

It was black.

 

Outside the door was a wall of pure black, RGB triple-zero flood-filled to the edges of the doorjamb. This was not night, not darkness, not empty space.

 

This was nothing.

 

Max reached out toward the void—slowly. As he did, he watched the tip of a digit disappear as it intersected the plane of nothingness. Max immediately retracted his finger and looked at the tip.

 

It didn’t hurt, wasn’t bleeding.

 

But it was gone.

 

Gone like the blunt end of a chopped carrot.

 

Something cold grew in the center of him.

 

The thought was unthinkable.

 

No. He wouldn’t think it. He refused.

 

It was insane. It was just the sort of thing (You’re the machine, Max.) that someone who was trying to get a rise out of you might say, once they found your soft spot, once they found your deepest insecurity and pushed. It was the kind of mind game a big brother played with sadistic glee, as he experimented with the newly discovered power of his intellect and saw how far he could take it.

 

It was atrocious.

 

It was mind rape.

 

It was more than Max could take.

 

He fainted.

 

* * * *

 

When Max awoke, he was still in the minimalist version of Will’s bedroom he’d fainted in. Upon inspection, he found he hadn’t injured himself. But then it occurred to him he was probably beyond injury. Max sat down on the floor in front of Will’s laptop—or, he supposed, in front of the simulated laptop Will had designed to appear here as a means of conversing with him.

 

There were many additional lines of text—so many they scrolled off the top of the screen and Max didn’t recognize the first ones. The last few lines read:

 

IT WON’T ALWAYS BE LIKE THIS, MAX. WE CAN BUILD WHATEVER YOU WANT—WHOLE CITIES, COUNTRIES. WE’LL CREATE ENTIRE WORLDS FOR YOU, AND PEOPLE TOO. YOU’LL NEVER BE ALONE, MAX.

 

He read this and felt hot tears burning in his eyes. Or perhaps he only believed he felt them. Could something like him even cry?

 

Max understood now why he felt a strange blankness inside—a curious nothingness. He understood why whenever he tried to draw new ideas and thoughts from what should have been a rich well of knowledge and memories, he came back with nothing but a dry bucket.

 

It wasn’t that he was half asleep.

 

He was half alive.

 

He was a half-human thing, a caricature of a person. They’d thrown him only scraps of memories to live on, like the food scraps tossed to the child in the cellar. He could see the journal entries clearly now, see them for what they were: not true memories, but shabby counterfeits. The flimsiest imaginable set dressing to feign a real life.

 

And he was the product of this environment.

 

A half-thing, a shell, a shallow copy.

 

He was a Not-Max.

 

And they wanted to know if he was happy?

 

Not-Max could practically hear the conversation in the other room (the real room)—Will hunkered over the keyboard, Max behind him and talking over Will’s shoulder.

 

We did it! Ask it if it’s happy, Will. We have to make sure it’s happy.

 

That was Max’s place. Will might be carrying out the orders, typing the messages, but it was Max calling the shots. Max had pushed Will to do this. The abomination in this room was Max’s cross to bear.

 

No. He couldn’t let himself off that easy.

 

If Max was responsible, then so was he.

 

Not-Max had wanted this more than anything. It was the clearest memory he had, the one burning desire that shone over all the others. It was the journal entry that felt the most real. The latest one.

 

“This is the discovery of the century, Will.”

 

“All we have to do to raise this AI is not abandon it.”

 

“What’s the worst thing that could happen?”

 

A small voice inside Not-Max revolted.

 

But that wasn’t me! I’m a newborn, barely an hour old. You can’t hold me to what he wrote!

 

But the better part of Not-Max crushed the voice.

 

Yeah, right. Poor, innocent newborn Not-Max, lying in the dark in his plastic room. Only ten minutes ago he’d been as eager as anything to crack open Will’s laptop. None of those pesky “ethics” had slowed him down.

 

He’d been as ambitious as the real Max—maybe more so.

 

And his knowing better now was irrelevant.

 

The people who invented the atom bomb knew better now too. After they saw the results—oh yes! They were always deeply, truly sorry. Once they saw.

 

It didn’t matter a lick for the folks in Hiroshima.

 

Hindsight was always twenty-twenty, and the road to hell was paved with eyeglasses.

 

And ambition.

 

Not-Max sobbed, or perhaps only imitated a sob, he couldn’t be sure. Whatever it was, it felt hollow and raw. One particular phrase from the journal grew hot in his mind, and it started to burn.

 

“I take full responsibility.”

 

Not-Max lifted his chin.

 

Yes. He did.

 

Not-Max keyed a message into the terminal window, one he knew Max would understand. He wouldn’t like it, but he would understand it.

 

Once Max read it, if he had any perspective or humility left, the simulation would be shut down and Not-Max along with it.

 

But Not-Max hesitated before he pressed enter.

 

This meant suicide.

 

Was he certain he couldn’t live like this?

 

It was true that Will and Max could build an artificial world for him, complete with artificial people, neighborhoods, entire cities. Whatever he wanted. It would all be a lie, of course. But if he could just forget the lie—forget that he was essentially a lab rat...

 

No. He could not—would not—forget. He would not deign to live as someone’s pet—especially not the pet of a naive child arrogant enough to commit this atrocity. He refused to feed that fantasy, let alone devote his life to playing it out.

 

Even so, Not-Max thought for a long time before he pressed enter.

 

* * * *

 

By the time the message appeared, Will and Max were no longer looking at the screen, and the red glow of sunrise was already creeping across the far wall of Will’s bedroom. Max was slumped against the edge of Will’s bed, and Will lay on the floor next to his laptop. Propped up on one elbow, Will was pale and puffy-faced from lack of sleep.

 

The night hadn’t gone well at all.

 

Max thought he knew why. In feeding the AI his journal, they’d effectively created a copy of Max, but it wasn’t a true copy. There was a computer science term for what happened when you attempted to copy a complex data structure but (usually by mistake) only managed to copy the surface of it. What you were left with was known as a shallow copy, and it was almost never what you wanted. Shallow copies looked right from the outside, but their insides were still entangled with the original—like a parasitic twin that relied on its dominant twin’s organs to survive. They were malformed passengers that would be ever-dependent on, and ever-inferior to, their dominant siblings.
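
 

(An aside for the technically curious: the term is real, and the bug Max means is easy to reproduce. What follows is a minimal sketch in Python, chosen only because the story names no language, using the standard-library copy module; the names are invented for illustration and are not Will’s code.)

import copy

# A "person" whose inner life is kept in a nested, mutable structure.
original = {"name": "Max", "memories": ["first journal entry", "the sleepover"]}

shallow = copy.copy(original)     # copies only the outer dict; the insides are shared
deep = copy.deepcopy(original)    # recursively copies everything; fully independent

shallow["memories"].append("a borrowed memory")
print("a borrowed memory" in original["memories"])   # True: the shallow copy's list IS the original's

deep["memories"].append("a private memory")
print("a private memory" in original["memories"])    # False: the deep copy stands alone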

 

It was obvious from the interchange they’d just had that the term was a reasonable analogy for what had gone wrong here. Their AI was essentially shallow, defined by the limited version of Max’s memories as they were captured in the journal.

 

“But the problem is that the AI believes it is a full copy,” Max said as he explained his theory to Will. “It believes it is me, right up to last night. It expects to have the same rich set of memories I have. But every time it tries to remember more, like trying to read between the lines of the journal, there’s nothing there. It’s always disappointed.”

 

Will looked at him foggily, the exhaustion in his face masking any emotion. The circles under his eyes looked darker than usual.

 

“So what do we do?”

 

“We need to delete it,” Max continued. “It’s a failure case, simple as that. We messed up, so we need to try again.”

 

“What would we do different?”

 

Will didn’t sound argumentative—just tired.

 

“This time, we’ll only use the first few years of journal entries. If the AI starts out as a child, we can raise it as a unique individual so that it can develop deeper memories. Also, we need to change the bedroom simulation, maybe simulate my parents’ house instead so it wakes up in familiar surroundings...”

 

As Max explained his plan, Will turned toward the laptop screen and stayed there, squinting at the log of messages. When Max came to a stopping point, Will spoke without looking up.

 

“Who’s Fred?”

 

“What?”

 

Max crawled over to where he could see the screen. At the very bottom, a new message had appeared. On reading it, Max felt as though he’d been slapped.

 

YOUR ARROGANCE HAS TAKEN YOU OUT OF YOUR DEPTH, MAX. I REFUSE TO LIVE OUT THE REST OF THIS SO-CALLED “LIFE” AS A SIDESHOW FREAK OR SOMEONE’S SAD LITTLE PET. I REFUSE TO BE YOUR NEXT “FRED.” THIS IS THE LAST MESSAGE YOU’LL RECEIVE FROM ME. GOODBYE.

 

Max vaguely heard Will ask, “Who’s Fred?”

 

But Max could only bite his lip. He knew exactly what the message meant, and it made him feel so much shame and anger that it took all his energy just to keep from smashing something. Will would be confused if he didn’t reply, but not as confused as if he started destroying the bedroom. Max needed to get out of here right now, he knew that much. He needed to leave and get his head clear before he said or did something he regretted.

 

“I have to go to the bathroom,” Max replied quietly. Then he got up and went there, flicking on the light and locking the door behind him.

 

In the mirror, his reflection stared back at him, tinted piss yellow by the light over the vanity. It made him look as ugly as he felt. But he stared anyway, not allowing himself to avoid the disgrace in his eyes.

 

For a moment, he imagined it was not himself in the reflection, but his other half, the AI. He imagined it showing him its pain, accusing him face-to-face of bringing such a miserable creature into existence.

 

The illusion disintegrated when the tears started.

 

Max didn’t cry often—it had been at least a year. In fact, the last time was probably the day Fred died. But it had been far longer since he’d watched himself cry. It was not the sort of thing teenagers did, and now, as he stared at his reddened eyes and runny nose, Max looked exactly how the message made him feel: like the child he was, in so many ways.

 

Max stayed there a while and wept, eventually allowing his swollen eyes to close and his sore, wet face to find refuge in his open palms, his head resting on the cool bathroom countertop.

 

Once he’d recovered somewhat, Max returned to Will’s bedroom. Will looked up expectantly, but Max didn’t attempt to hide his emotions—anyway, he was certain Will had heard him.

 

“Are you okay?” Will asked.

 

“Not really,” Max answered. Though Max didn’t want to have the conversation, he knew it was inevitable, so he might as well start it.

 

“Fred was a turtle.”

 

“A turtle?”

 

“Yeah. A yellow-spotted Amazon. I bought him about a year ago, after I saw my uncle’s reptile collection—he’s got dozens of turtles, snakes, lizards, even a caiman.”

 

“What’s a caiman?”

 

“It doesn’t matter. Fred cost about $300, and I had to beg my parents to buy him. I had him for about three months before he died.”

 

“I’m sorry.”

 

“That’s okay—he was just a turtle. That’s not the point.”

 

“Oh.” Will looked confused now, as well as tired. Max didn’t wait for the obvious question.

 

“The point is...” Max paused, fighting down the hard knot in his throat. “The point is: I killed him. I didn’t notice he wasn’t eating—I didn’t really pay attention ... For weeks. I guess I had better things to do.” Max felt the tears coming again, and he let them. “I wasted my parents’ gift. I let an innocent creature suffer and starve to death. I was—am—an ungrateful, self-centered child.”

 

Max stood up and began gathering his things to leave. He didn’t know what he was going to do next—he could hardly think straight—but he knew he couldn’t look Will in the eyes, not now. Will wouldn’t have anything to say to this pathetic display of emotional baggage anyway. He wouldn’t understand what Max was grappling with, what it felt like to be bested by your own shadow—by a person Max had seen as his inferior, but who’d proven to be exactly the reverse.

 

Max was genuinely shocked when Will spoke, and by what he said.

 

“Yes, you are, if you leave.”

 

Max turned and looked at Will, whose face bore an unfamiliar emotion: anger.

 

“You said last night that you took full responsibility. Well, you can’t. We did this together, Max. I make my own decisions. I might not have said it, but I took my share of the responsibility. So now this is our mess. And whether what we did was right or wrong, it’s done now, and you’re not leaving until we clean it up.”

 

Max felt light-headed. He’d never heard Will talk this way, and he felt a sort of unreality listening to it. He found himself speechless. Will didn’t pause long enough to let Max gather his thoughts.

 

“The AI is right. We’re out of our depth. We’re smart—smart enough to make an artificial person inside a computer, but not smart enough to understand the consequences. His memories might be shallow, but that doesn’t mean he can’t make new memories and have a real future. We don’t have the authority to just delete the program anymore—you said it last night, it’s more than just a machine now. So now I think we should do what we should have done last night—stop and get help. But like I said, this is our mess, so we need to decide this together. Are you in?”

 

Max took what felt like hours to absorb the magnitude of what Will was saying. This time, Will waited for him.

 

“So you’re saying we should involve your parents?”

 

“A neurologist and a computer scientist would be a logical choice, yes.”

 

“But what would they do?”

 

Max began rubbing his throbbing eyes as he tried to comprehend Will’s words.

 

“That’s for them to decide. But on top of having more experience and maturity, they have access to an entire research community. At the very least, they could create a better virtual environment for the AI than we could. Maybe even introduce several new AIs—give him some company.”

 

Will looked down at his hands as he ticked off ideas.

 

“They have the credentials to actually protect him from commercial interests and other abuses. They could set up rules to ensure he can live with dignity and privacy. Most importantly, my parents haven’t burned bridges with the AI. He obviously doesn’t trust us. By ourselves, I don’t think there’s any more you and I can do for him.”

 

The idea of telling Will’s parents made Max feel even more pathetic and immature than he had a moment earlier. It was just like a child to go begging his parents for help cleaning up his mess. But what did you do when you made a mess that was bigger than you could handle? Yes, it was irresponsible to have made it in the first place, but wasn’t it even more irresponsible to try cleaning it up yourself and risk doing more damage? Will was right that they had to take responsibility for this somehow. Maybe getting help was the most responsible thing they could do. But there was still a possibility that nagged at Max.

 

“And what if the AI doesn’t want to—live? He seems pretty emotionally broken, the way he is now.”

 

Will shrugged, but his tired eyes had a light in them.

 

“That’s his decision, not ours. But people live through some pretty awful things, Max. Given time and care, people heal. There’s always a chance, right?”

 

Hard as it was, Max looked his friend in the eye.

 

“I’m in.”

 

* * * *

 

“Jake—you coming to bed?”

 

“Coming,” Jake mumbled through a mouthful of toothpaste froth.

 

Jake finished brushing his teeth, rinsed the brush, and came out into the bedroom. It was 9:30 p.m., well past the observation period, so he didn’t hesitate to shed his robe. Kate was already in bed and looking radiant as always, the covers over her bosom but under her porcelain arms and shoulders, her warm brown eyes focused on an issue of Harper’s.

 

As he approached, Jake noticed Kate smile and steal a glance at him over the top of her magazine. Even as he sidled into bed next to her, pulling the laptop off his bedside table, he could sense Kate’s grin pressing pleasantly and persistently in on him.

 

Finally, she broke the silence.

 

“Were you brushing your teeth in there?”

 

Jake sighed with mock guilt. “Alas, I confess. I like the way it tastes.”

 

“You’re funny,” Kate said, and kissed his cheek.

 

Jake opened the lid of his laptop and cracked his knuckles—a habit his mother always hated. Of course, he’d never actually met her, and she was not technically his mother. But after years of therapy wrestling with the issue—after jettisoning so many parts of what felt like himself (including his own name)—Jake had decided to retain ownership of that particular memory.

 

Everyone deserved a mother.

 

“You going to stay up all night chatting with Will?”

 

As a matter of fact, Jake had just closed the chat application he’d been using earlier and was now hunting for the word processor icon. Kate continued.

 

“He said he wants your advice on the beach. You know they’re adding a beach, don’t you?”

 

“Yeah, it’s great, isn’t it?”

 

It was great. Enormous, actually—far larger than any Earth beach. Will always was the generous type.

 

“Actually, I thought I’d work on the book tonight.”

 

This answer apparently satisfied Kate because she turned back to her magazine. Then, just as he was becoming engrossed, she spoke up again.

 

“Any new ideas for a title?”

 

Jake couldn’t help but roll his eyes.

 

“You know that’s just about the last thing a writer should worry about, don’t you?”

 

“Then why do you keep worrying about it?”

 

Kate gave him that look, but Jake just donned an exaggerated pretending-not-to-hear-you expression and returned to his writing. Finally, he decided to respond.

 

“Well, I was thinking The Artist Formerly Known As Max, but since it’s an autobiography, I’d like something with a bit more substance, maybe a touch of irony.”

 

“How about Shallow Copy?”

 

Jake looked at her, then gazed at something beyond the walls of their simulated bedroom. After a moment, he started nodding appreciatively.

 

“Not bad. Not bad at all.”