Solomon's Code


  The rational part of Ava’s mind expected the news; she always knew how likely it was given her family history. Still, that word—she stared blankly for a moment while the jolt of “cancer” started to ease. The doctor leaned forward and put a hand on Ava’s forearm, and Ava remembered again why she kept coming back to her. “There’s plenty of time between now and any potential tumors that might come of this,” the doctor said. “We have a lot of options.”

  Ava started to breathe a little easier and tried to convince herself this was a good thing: All the tests and machines caught this long before she ran short on options. The doctor nodded to the screen on the wall, and up popped Ava’s customized patient portal. They waited a few seconds for the health monitor on Ava’s wrist to connect and upload her real-time biodata. There was her ex-boyfriend again, Connor’s voice oddly reassuring this time: “Do you want me to grant the patient portal access to your data?”

  The recommendations filled the screen, and the doctor started explaining. “Do you plan to have children? Research shows that pregnancy-related hormones can be a strong defense against the development of some breast cancers. If you have a child in the next ten years, the best models suggest your chances of developing cancers drop to about 13 percent. But given the types of cancers that run in your family, we also have a tailored set of hormone therapies you could choose from. Better yet, because you’re starting now, it won’t be nearly as harsh as the old hormone drugs your mom dealt with. There are some side effects, but they’re pretty mild in most patients. On their own, they’ll drop your chances of developing cancer to less than 20 percent. If you do either of those—the child or the hormone therapy—and replace your biological breasts with Lactadyne prosthetics in the next eight years, your chances of breast cancer are essentially nil.”

  The doctor noticed Ava’s furrowed brow. “Look, you don’t need to decide right now. Take a little time to think about it. You can go through all these options anytime on your portal. Meanwhile, take a couple days to look at this waiver, too. If you’re up for it, we can start collecting data about your home and habits through your health manager and its connection to your refrigerator, toilet, and ActiSensor mattress—the usual stuff like environmental quality, diet, and exercise. It weirds some people out, but the monitoring can help suggest simple lifestyle changes to improve your odds. Once we get that data, we can adjust some parameters in your daily life and think about how we might change your diet and exercise. You’re on Blue Star insurance, right? Their Survive ‘n’ Thrive incentive plan offers some great rewards for people who do environmental and health monitoring. You give up some control—but, hey, it is your health we’re talking about, after all.”

  The doctor chuckled. Ava shuddered. She didn’t much care for any of the options. The whole ride home she wondered how much the diagnosis would sidetrack her dreams. Will my predisposition toward cancer disqualify me from the Antarctica trip? I’m comfortable, but I don’t have piles of money—will the insurance company raise my premiums if I wait a few years before starting preventative therapies? What if my bosses find out? Will they ask me to leave or take a different role? And what about Emily? Should I tell my love, my partner, that I might develop cancer unless we adjust our lifestyle? Will she still want to buy the condo on the hill?

  Will Emily leave? Ava harbored no illusions about the program the doctor described. Sure, it would keep her clock ticking, but it also meant giving the machine a lot of control over her life, and she would pay a financial and personal price if she didn’t comply. As her PAL started describing the program during the ride home, it began to dawn on Ava just how comprehensive it would be. The insurance company and the doctor would put together a total treatment plan that would consider everything from her emotional well-being to the friends she hung out with. Her girls’ nights out would never fly, at least not to the heights they did last night. Would she have to rethink her entire social life, her friends, and her schedule to make healthier choices? She might have to change her home environment to maximize the hormonal therapy. She might have to reduce the stress of her job, maybe even change jobs altogether.

  Her mind was racing now: Will I have to give up Ayurveda because it’s not scientifically proven to minimize my risk? I could move to Germany, where regulators accept Ayurveda and allow personal AIs to integrate data from Indian and Chinese medicine. Emily loves traveling, but we never thought about living overseas . . .

  “Too much,” she whispered. “It’s too much.” She took a deep breath and massaged her temples. “I can’t go home right now, and I sure as hell can’t concentrate on work.” Her fingers trembled as she rifled through her purse to find her PAL. She chuckled about the device in her hand—whenever she needed a human touch, she relied on a machine to deliver it.

  “Ava! What’s going on?”

  “Mom?” she said, her voice cracking.

  Ava’s PAL had made the call automatically, a sensor in the earpiece picking up on her anxiety through the minute electromagnetic impulses in her brain and skin. The PAL instantly correlated the best person to call in her current state—always Mom or Dad, at least when Dad wasn’t gallivanting through some far-off place—which, of course, her PAL also knew to be the case during this time of the year. Ava couldn’t even recall whether she’d acknowledged a prompt to connect the call. Sometimes the PAL would just call automatically, as she had set it to do at especially stressful times.

  Normally, the chipper greeting and the background noise that came over her mom’s eight-year-old iPhone drove her nuts, like an old vinyl record. Today, it couldn’t have been more comforting. “Papa sends his best,” her mother said. “He’s teaching today in Shanghai. He said he got you the gift of the century. I told him I don’t even want to know.”

  “He got the beacon, too?”

  “Of course, honey. You haven’t delisted either of us yet. And you better not, either. You need your Swarm.”

  “Yeah, I suppose,” Ava said, trailing off. Every week, Ava’s PAL asked if she wanted to switch her alerts from the Swarm of friends and family to only Emily, who still couldn’t understand why Ava wouldn’t make the change. Ava tried to explain her relationship with her mom, the peace she got from the idea of multiple loved ones responding whenever she needed a comfort call or a reassuring holo-message. But Ava had substituted Connor for the Swarm back then, and the fact that she wouldn’t make the change now really burned at Emily.

  Mom’s voice snapped Ava back again: “So, Zut! in Berkeley, then? At least that’s what my phone says.”

  Connor’s disembodied voice piped in: “Fifteen minutes until we arrive at Zut!”

  Their favorite lunch spot. The restaurant they’d gone to for years. An easy drive from the city. The fact that Zut! just popped up as her new destination didn’t even register with Ava, though she hadn’t been there in months. Still, her PAL assigned it a unique rating, based on voice diagnostics and states of mind. Ava still marveled at how often Connor’s voice suggested the perfect place, just like he used to.

  When they met at lunch, Ava couldn’t stop hugging her mom. There was nothing like the real thing, communing with another body and all its warmth, tenderness, and vulnerability. It didn’t matter that her mom’s advice was pretty much exactly what Ava’s doctor and AI recommended. The emotional connection and the depth of familial love imbued it with so much more credibility. The AI knew, Ava thought, but Mom knows.

  “I survived,” her mom said, “and so will you. Your chances are so much better now, and at least you can take a little time to map out a more predictable path. God, I’ll never forget how shocked I felt when the first doctor told us to terminate the pregnancy.”

  Tears started to well in Ava’s eyes, but her mom pushed on: “Honey, there was no way I was going to let that happen. No way. When we talked to the second doctor, he realized that was nonnegotiable and looked for alternatives. It helped that he worked at a Catholic hospital, but I think he just understood the emotional side of it, the fact that fighting for something I so desperately wanted, motherhood, probably helped my chances.” Her mom shook her head, sighed, and wiped away a tear. Her eyes bored into Ava’s. “There’s nothing worse than someone or something telling you that you have no options—especially when they might be wrong. You need to take care of yourself, but you need to live your own life, too.”

  Ava looked up at the hills and smiled as she rode toward her film studio. Mom and Dad won’t be around forever, at least not physically, she mused, but something about the song selection during the drive reminded her of how intimately her PAL picked up on the little recordings, notes, conversations, and subtle guidance her parents always provided. Melody Gardot’s “Who Will Comfort Me” piped in, followed by Con Funk Shun’s “Shake and Dance With Me.” Dad’s favorites were ancient, but her PAL correlated her mood with data on her interactions with him and found the exact combination of empathy and pick-me-up she needed. “Go get the day,” the message on her PAL said. She didn’t even bother to check if her dad actually sent it, or if her AI just knew to post his favorite exhortation. She smiled again, soaking in the energy of the sunny day.

  At the studio, the walk to her desk always prompted a sense of gratitude. She had initially accepted a Wall Street job, opting for the money and the excitement without ever consulting the Career Navigator. Had she never bothered to take her mom’s advice and consult her old AI assistant before moving to New York City, who knows how many miserable years she would’ve spent at that investment bank?

  Fortunately, the Navigator homed in on her passion and predisposition for all things living and environmental, despite her best efforts to convince even herself otherwise. Career advisers, with their engrained biases and imperfect data, had told her she was a science ace, so the recommendation seemed to fit. It was definitely better than investment banking, anyway. She embarked on a mission to help mitigate climate change, enrolled in social justice programs, and spent a year as a park ranger in Tanzania. It was a fantastic time, but she never felt fully satisfied by the work. She spent a year debating herself until her AI finally projected a life picture that truly excited her. That beautiful, lifelike hologram of her work—not so different from the studio she stood in today—eventually took her to NYU’s writing and directing program. The first night out with her classmates, the night she noticed Connor sitting by himself at the end of the bar, she actually kissed her new PAL.

  Her job had changed dramatically in the years since. AI generated increasingly precise insights about audience consumption patterns, societal mood swings, and political trends. Now the studio’s AI capabilities distilled narratives that guided plot development and created meaning for people in their daily lives. Ava would guide those narratives and enrich them with emotional content, imaginative imagery, and storyboards that spoke to the human mind and spirit—however undefinable that still was in 2034.

  But not everyone integrated well when the studio, like so many other companies, installed deeper AI systems. Ava had a number of friends who started and aborted careers in different fields—accounting, civil engineering, and pharmacy majors who suddenly discovered their education had not prepared them for the days when machines would conduct analyses, calculations, and highly routine or repetitive tasks. Ava recalled all too well the many long nights of whiskey-induced commiseration with struggling friends. Yet it had been the same sort of AI insight that set her on the right path.

  Ava started flipping through the storyboards for the studio’s next animated feature, occasionally stopping to dictate a few ideas. Each time she felt especially inspired by a change, she’d reload the entire package and start reviewing the fully revised plotlines from the beginning. Today, though, she just couldn’t connect with the stories. Leaning back in her chair, she flipped her PAL to attention mode. Peso, her financial advice AI, immediately beeped in: “Hi, Ava. Looks like the markets will rebound tomorrow. We’re picking up on improving geopolitical, productivity, and climate forecasts for next quarter. I give it a 75 percent probability, and we still have some time to move. Shall I put $2,500 of your savings into the market? Your medical and communication data suggest you’ll be cutting back on consumables and travel over the next few months, so maybe put that money to good use in equities?”

  “Fine,” Ava replied in a resigned tone. It was the right advice, rational and purposeful, no matter how much it rekindled the anxieties from earlier in the day.

  “You sound worried,” her PAL said. “Do you want to speak with Zoe?”

  Ah, Zoe, the fin-psych. Fin-psychs hadn’t even existed until six or seven years ago. Before financial AIs hit the mainstream, no one needed people to help them process the difficult choices recommended by the machine. There were no more investment advisers, at least not as Ava remembered her parents’ meetings with them. AI could handle all the synthesis of quantifiable data. What people needed was the emotional intelligence to anchor those decisions and make them palatable. These frail, complex, and emotional animals still needed that support.

  “I need the support of a glass of wine,” Ava muttered to herself, gathering her things and heading out of the studio. She walked up the hill toward home. Despite all the support around her, both machine and human, she felt as fragile as ever. This must’ve been what Leo felt like, she thought as she walked past his old apartment. A few years ago, Leo, her old college friend, had locked himself inside, drank a bottle of top-shelf vodka, and overdosed on a fistful of pills. Soon after he got married a decade earlier, he railed against the “AI Gaydar” app that could identify the sexual preference of a person in a photo with disturbingly high precision.* Following the divorce, though, his PAL’s relationship advice started to convince him that maybe he wasn’t the Latin Lothario he’d always been conditioned to believe he was. If he ever admitted his sexual ambiguity to himself as his depression set in, he never accepted it.

  Neither did Connor. He left Ava the day after Leo’s wake, unable or unwilling to deal with the loss of a friend and the same sort of ambiguity her PAL expressed about her choice of partners. It had said her sexuality wasn’t as clear-cut and simple as either of them thought. She told herself and Connor that she didn’t fit a typical mold, whatever that was. And as she was coming to grips with her fuller identity, they fought in ways they never had before. She stopped knowing how to act around him, whether to argue with him or suppress her feelings about their relationship. After Leo’s suicide, it didn’t matter. Connor left for Canada, hurt and heartbroken. A year later, Emily entered Ava’s life.

  Ava smiled at the thought of her.

  “I need to change my Swarm settings,” she told herself as she walked into the condo she shared with Emily. “And I need to change this goddamned voice.”

  Her PAL asked about both, but she turned it off and poured herself a glass of wine instead. She dropped onto the couch, the lights automatically dimming and the speakers quietly sounding hints of waves lapping at the beach.

  Ava had already dozed off when the lock clicked open. Emily was home.

  AI TODAY AND TOMORROW

  For all the incredible capabilities AI will afford in the coming decades—and they will be incredible—the development of robust machine intelligence poses fundamental questions about our humanity. Machines can augment human capability with potentially stunning results, but their predictive elements might also limit what we, and those around us, believe we can accomplish. Your self-identity and approach to life could change, because your rational choices might eliminate several of the paths available to you. Can you really choose to remain blissfully ignorant anymore, intentionally stumbling through a life enriched by trial and error?

  These aren’t apocalyptic questions. The machine hasn’t taken over the world. Ava and her world—with all the benefits and complications AI adds to her health, love, and career decisions—still remain years in the future. But AI applications already control many facets of our lives, and each of the incremental advancements that lead us toward an existence like Ava’s might make perfect sense in the moment. They might benefit humanity by keeping our world safer (e.g., predicting crime), keeping it healthier (e.g., identifying cancer risks), or enhancing our lives (e.g., better matching workers with jobs or handling complex financial transactions). Each positive step forward might preclude a grievous error. But in so doing, it might also diminish serendipity and the chance to learn and emotionally grow from our mistakes. To the extent life is an exploration and meaning derives from experience, AI will change the very anthropological nature of individual self-discovery. At what cost? How do we govern the line between human and machine? Without a concerted societal effort now, will we even be able to govern that relationship in the future?

  We stand at a critical moment in the proliferation of intelligent systems. Pervasive computing technology, increasingly sophisticated data analytics, and a multitude of actors with conflicting interests have ushered us into a vibrant yet muddled Cambrian Period of human-digital coexistence, during which new AI applications will bloom as biological life did half a billion years ago. While these technologies produce immeasurable economic and social benefits, they also create equally potent barriers to transparency and balanced judgment. Sophisticated algorithms obscure the mechanics of choice, influence, and power. For evidence, we need only recall the events of the 2016 US presidential election, fraught with fake news reports and the interference of Russian hackers.

  Amid the turbulence, a new wave of research and investment in artificial intelligence gathered strength, the field reawakening from a long dormancy thanks to advances in neural networks, which are modeled loosely on the human brain. These technological architectures allow systems to structure and find patterns in massive unstructured data sets; improve performance as more data becomes available; identify objects quickly and accurately; and, increasingly, accomplish all that without humans classifying the streams of data fed into these computers.
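
  To make that idea concrete, here is a minimal sketch, purely illustrative and not drawn from the book, of the kind of pattern-finding the paragraph describes: a few lines of Python using the open-source scikit-learn library to train a small neural network that learns to recognize handwritten digits from labeled examples. The dataset, layer size, and iteration count are arbitrary demonstration choices.

    # Illustrative sketch only: a tiny neural network that learns to
    # recognize handwritten digits from labeled examples (scikit-learn).
    # All parameters below are arbitrary demonstration choices.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)  # 8x8 pixel digit images and their labels
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # One hidden layer of 64 units; more data and more training
    # generally improve accuracy on the held-out test set.
    model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
    model.fit(X_train, y_train)  # find patterns in the training examples

    print("test accuracy:", model.score(X_test, y_test))

  The last clause of the paragraph, learning without human-supplied labels, corresponds to unsupervised techniques such as clustering, which this labeled-digits sketch does not attempt to show.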