
Buzz Kill
Author: David Sosnowski

But to improve on identification and, beyond that, actually prevent suicides? That was where the advantages of lacking emotion and not feeling pressure were dubious at best if not altogether counterproductive. Instead, Buzz needed to care about suicide and its prevention not merely for the points it would score, but because it could understand and identify with a fellow consciousness in distress.

And so George grabbed his phone, opened up his proprietary texting app, and reached out to a fellow consciousness he felt sure would appreciate his distress.

48

“Sounds like Buzz needs some empathy,” Pandora wrote after George confided the bullet they’d dodged without even knowing it had been fired.

“Great,” George wrote back. “How am I supposed to code that?”

Pandora had been thinking about it—had mentioned it specifically to be asked—and so she wrote back: “I think that a machine learner that recognizes visual metaphors should come in handy.”

“But how would we implement that?”

“Give Buzz a bunch of labeled content and let it figure out what’s signal and what’s noise when it comes to empathy.”

“Labeled content usually means images,” George countered. “You got any pictures of empathy lying around?”

That was easy, Pandora thought. She’d set out the bait, and George bit. Because she did have a picture of empathy—or its lack, which was a start. It was a cartoon her dad had facing him during his sessions, as a reminder, or perhaps for inspiration. It featured two men arguing on either side of the number six (or nine) written on the ground between them. She took a picture with her phone and texted it as an attachment, adding the caption: “Counterexamples are important too.”

The problem with patching a self-evolving AI is realizing you need to do it before the AI evolves beyond your capacity to patch it. In George’s case, he got lucky. The image-processing subroutine was accessible and close enough to its original form to allow modifications. And so to Pandora’s empathetic counterexample, George added a variation with one of the two men sporting a thought bubble showing the other’s point of view, captioned “Now I see what you mean!” He also included an assortment of stills and video clips that came up when he searched on the words empathy, sympathy, compassion, and altruism, including several featuring Scrooge and the Ghosts of Christmas Past, Present, and Future, with an emphasis on the especially heart-tugging scenes involving Tiny Tim. Positive examples were made equal to E and counterexamples were identified as not-E, with the goal of isolating the nature of E by analyzing the relationship between “actor” and “target,” each of which was boxed and labeled accordingly.

George would tell Pandora once he was sure it worked, though there was one part of Buzz’s empathy training he wouldn’t share, seeing as it related to another secret he was already keeping from her. But the livestream of her hyperemotive face was just too good not to include, especially when he goosed it toward the empathetic end of her range by sending GIFs of cute babies, both human and not, with the latter favoring kittens—of course. These videos and her reactions to them were labeled E, with Pandora identified as “actor” and each respective cuteness, “target.” And, because counterexamples are important too, George (a practicing vegan) underlined the point, pairing videos of Pandora’s heart melting over puppies and ducklings with images of slaughterhouses labeled—of course—not-E. After all, if empathy was owed humans, withholding it from the other species we shared the planet with was just a little too anthropocentric for his taste.

Given his success in visualizing empathy, George decided to use the same approach to teach Buzz about depression, a mental state the AI would need to grok if it was going to grok suicide. Unfortunately, when he searched for training material, the majority of results were static images, usually heavily shadowed, featuring heads bent downward and framed against rain-streaked windows or covered with blankets. Noticeably absent was the actor-target dynamic he needed for Buzz to make a causal connection between the labeled image and a suicidal state of mind. Before abandoning the approach, however, George stopped and thought:

What is the problem I was hired to solve?

It hadn’t been to stop suicide in general; it wasn’t even stopping teen suicide, specifically. No, he’d been hired to stop teenage and young adult Quire users from killing themselves online, especially while logged in to their accounts. And while suicide classic was usually linked to depression, this new strain was often the result of some form of bullying, which was perfect from a machine-learning standpoint because it provided the same actor-target dynamic as Buzz’s empathy training. In fact, this new training could be considered a subset of that training—by providing plenty of counterexamples.

Searching on the terms bullying and cyberbullying, George found more than enough examples to choose from, including still images, video, and transcripts of actual exchanges between victims and their bullies. Both physical and verbal bullying were well represented. A lot of Buzz’s onboard Quire data fitting these categories had already been identified, and all George had to do was point Buzz in the right direction. There was so much good content, in fact, he felt kind of sad for feeling so happy about it—something he should probably talk to Roger about. Or not.

Before shutting the hood and recompiling his revised source code, George added an administrator-level approval process Buzz would have to clear before it could act autonomously. Specifically, it needed to pass the Turing test before it started chatting with real, innocent bystanders online. And not only that. Buzz would have to pass the Turing test to the satisfaction of George or his cocreator, Pandora. He thought about stipulating “and” to require Buzz to be declared conscious by both of them, like a generative adversarial network, or GAN, debating itself to the best answer. But there were plenty of ways to be killed or incapacitated in San Francisco, from cable cars to collapsing cranes to straight-up homicide, while Alaska had bears and, hell, just being outside too long. And if there was one law George had come to respect more than Moore’s law of exponentially increasing transistor density, it was Murphy’s: if anything can go wrong, it will. And so “or” would do—George decided—“or” would be fine.

49

Though both knew they couldn’t unring the bell, George and Milo tried. For one thing, George didn’t know anybody else to talk to at work and—despite his felonious ways—Milo could be both entertaining and informative. Sure, his Virgil was also a depressing SOB, but to hear Milo tell it, George’s optimism could stand a little depressing—or “rightsizing,” as the SOB himself liked to put it.

And so they kept meeting in the cafeteria, George’s office, and the 1980s arcade room, and Milo continued to provide peeks behind the corporate curtain, relaying gossip, theories, rumors, and facts, all in equal measure. Usually, these data dumps qualified as nice-to-knows, though occasionally a need-to-know got tossed in, along with a couple of would’ve-been-nice-to-know-sooners. For example:

“There’s bad news in the AV space,” Milo said, pulling up a chair in the cafeteria.

“I didn’t know QHQ was doing autonomous vehicles,” George said.
