Buzz Kill(69)
Author: David Sosnowski

 

 

45

To kill time while they waited for Buzz to “wake up and smell the qubits,” George and Pandora resumed their intellectual intercourse, this time a bit more critically. For example: “The Turing test is flawed,” he started them off.

“Explain,” she texted back.

“It assumes that there’s a ‘standard human’ that recognizes human-level intelligence.”

“Go on.”

“So you stack the deck by getting a human who’s bad at reading the difference between a human and a machine.”

“Like a UFO,” Pandora texted back.

“Explain.”

“If I don’t know what a plane is, it becomes a UFO.”

“Yes,” George texted back. “If the human doing the judging is bad with reading humans, the test subject could be pegged as a UTO: unidentified thinking object.”

“But who’s that bad at judging humans?”

“Half the people I work with are on the spectrum,” George texted back. “It’s why they’re happier with machines than their coworkers. Turing himself was probably autistic.”

“So the father of AI might have been bad at his own test,” Pandora typed, connecting the dots. “Continue.”

“Not that he wouldn’t fit in with most people nowadays,” George texted.

“Explain.”

“Quire set up a program where users could identify suspected chatbots in light of the whole election hacking thing.”

“Okay?”

“They had to pull the plug because people kept flagging political opponents as bots.”

“That’s crazy,” Pandora texted back. “Everybody knows MAGAs are pod people, not bots.”

“I don’t disagree,” George texted back. “But the right says the same about the left.”

“So people can’t agree on whether other people have a human level of intelligence?”

“In a nutshell.”

“We’re doomed.”

“Ya think?”

As a poorly socialized border troll, it behooved Pandora to offer a counterargument to George’s previous observation about the Turing test, and so: “The Turing test is flawed,” she typed.

“Didn’t we just do this?”

“You did. I played along. Now it’s my turn.”

“Okay.”

“People are predisposed to seeing humanness where there isn’t any,” Pandora typed. “We see Jesus on toast and a human face on Mars. We’ll anthropomorphize anything. We invented stick people as the universal stand-in for us. A circle for a head, a few lines for body, arms, and legs, and we’re good; that’s a person.”

“Eliza,” George suggested.

“Exactly. All that program was, was automated string jujitsu. But still it hooked people into talking to it even after they knew it was a program.”

“Speaking of,” George typed. “This is just a check-in but . . .”

“Poop.”

“Oh thank God . . .”

And then the qubits got interesting.

George had noticed it first, while going through Buzz’s trash. The k-worm had been programmed to oversample, flagging maybes as yeses, just in case. As a result, it consistently flagged more resources for evaluation than George and Pandora could have ever gotten to. And so they began their own triage process, using any indication that a given resource had been shelved at its source as good enough reason for them to dismiss it as well. Now that Buzz was making its own decisions about what flagged resources to incorporate, it continued to assume what others had flagged as trash was exactly that.

But going through Buzz's discards, George grew concerned by the sheer volume of "failed" AI projects that were piling up. Investigating further, he found they followed a pattern: early, initial promise, then a sudden trip off the rails, then abandonment. Much of this "trash" fell into the category of image recognition, a fairly representative AI challenge, suggesting that trouble here could herald trouble elsewhere, perhaps pointing to some principle of diminishing returns inherent in the whole enterprise of AI, dooming it forever to being close, but not quite.

After texting George off the ceiling, Pandora suggested they do a deep dive into one of these so-called “failures,” which, when you thought about it, were pretty counterintuitive. “Machine learning is supposed to be like a ratchet, correct?” she typed. “It doesn’t make sense to go backward. That’s like imagining a machine”—and she was going to write “developing dementia,” before deciding to switch to—“getting dumber the more it trains.”

And that’s when the two learned that you really do get what you pay for. In the case of their image recognition machine learner, abandoned for getting worse over time, the fault seemed to lie less with the system itself than with how much insight you can expect from human content labelers making minimum wage or less to decide if the machine learner has identified an image correctly. Because when George and Pandora reviewed the results for one of their “failed” systems, they saw something else at work, and it excited the hell out of them.

“You see it, right?” Pandora typed.

“I think so, but you say it,” George wrote back.

“The machines were downgraded for being more creative than the humans judging them.”

“So it seems,” George said, looking at the same data set as Pandora in which the AI had “mistakenly” labeled as “cat” a woman with leonine hair, a decidedly feline bit of Japanese calligraphy, the letters M and S and Q, and a cubist cat rendered by a Picasso wannabe.

“The system’s gone from simple identification into a kind of visual rhyming and metaphor,” Pandora went on. “It’s gone from imaging to imagination and got shut down for its trouble. It’s like the cyber equivalent of Thomas Edison being labeled as addled or Einstein flunking college.”

George, meanwhile, opted to come to the human judges’ defense. “To be fair,” he texted, “I can see how this would suck for an AV navigation system, but . . .” He didn’t continue, however, knowing Pandora would complete the thought, which she proceeded to do.

“. . . but it’s a great start for an AI that can think outside the box,” she typed.

“So to speak?” George texted.

“So to speak,” Pandora agreed.

 

 

46

Pandora kept her promise to visit Gladys in person more often, though her grandmother hadn’t made the complementary promise to be awake when she did. More and more often, she’d find her grandmother sound asleep in the middle of the day. No doubt this owed a lot to her being effectively bedridden—there were just two pieces of furniture on Gladys’s side of the room appropriate for sitting on: a guest chair and the bed—but the Alaskan night wasn’t helping, still being on the longish side, even in early March.

Sometimes Pandora would clear her throat, scrape the chair across the floor, or make some other noise to wake Gladys, but the question she asked herself increasingly was: Why? It was as if her grandmother had lost her reason for remaining conscious along with her memories. And so more often than not, she’d let Gladys sleep, occupying herself with homework or another brainstorming session with George—or trying to at least.
