A short story by Kevlin Henney
“Hmm, that seems a bit convenient.” Mel’s image appears, along with Ashley’s and Taylor’s.
“So, we’re identifying Lou as the bot?” Taylor grins.
“Steady on! My PC hasn’t got a camera and my kids are on the other devices.”
“I’m still saying it’s a bit convenient that we can’t see you,” Mel says. “Apart from these tests, what’s everybody been up to?”
“Not much,” Taylor says. “Lockdown, right?”
“I’m going for a run after this,” Ashley says.
“I’m heading out to the pub in a bit,” Lou says.
“The pub? I think I agree with Mel,” Taylor says. “You missed a trick there, Lou-bot. There’s a pandemic on.”
“There is, but I’m in New Zealand,” replies Lou.
“Good save.” Ashley smiles. “But I think a human would say something specific, like Christchurch or –”
“Akaroa,” says Lou. “Just outside Christchurch.”
“Even so,” Ashley continues, “it now sounds like you’re just backfilling detail from the web. Mel, Taylor, are we saying it’s Lou?”
The other two nod. Lou’s empty frame disappears.
“We’re still here,” Ashley says. “Wasn’t Lou, then.”
“Wasn’t Lou. Misdirection, even if unintentional,” Mel says. “Guess it would have been a bit too obvious – voice only, different time zone, unaffected by the pandemic. Must be one of you two, then.”
“Or one of you two,” Ashley smiles.
Taylor looms larger in frame, eyes scanning left to right and back again. “Well, damned if I can tell the deep fake one of you from the real person.”
“We need an AI to detect AIs!” chuckles Ashley.
– Ironic observation.
“We need something more Voight–Kampff than Turing,” Mel says.
– Pop-culture reference. Nice touch.
“What?” Taylor asks.
“Blade Runner, right?” says Ashley. “Emotional response rather than knowledge response?” Mel nods.
“Well, Lou didn’t seem very emotionally engaged, so I don’t know if that would have helped,” Taylor says.
“True,” Mel says. “Maybe Lou’s a laid-back person.”
“Or just indifferent and tired,” Ashley adds. “I mean, how many of these tests have you done and what other work are you doing during the day? We’re earning points, but that just converts to cash on the side. If you’re already spending all day staring at people on a screen, each test is just another meeting, right?”
“Lou didn’t try to defend against our accusation,” Taylor continues.
“Hmm, not letting it go, are you?” Mel sits back.
“And I think we empathize with why Lou might have been like that,” Ashley says. “Mel?”
“Agreed. It’s Taylor,” Mel says. Ashley nods.
Taylor’s frame disappears.
“Damn. Wasn’t Taylor, then,” says Ashley.
“Now it’s down to which of us accuses the other first.”
“Yeah. Well, I’m not calling it.” Ashley grins.
“Me neither,” says Mel. “This isn’t the Wild West – or Westworld. It’s not who shoots first; it’s who doesn’t.”
“Have you ever identified a bot in these tests?”
“No. I’ve always been the last one. The second to last one has always been the bot.”
“Wait a minute… The second to last has always accused me.”
“Like I said, same here.”
“What I mean is that we’ve both assumed the second to last was a bot, but we don’t know that, do we? We just know they accused us. That would make sense if they were a bot, but it’d also make sense if they were human.”
“Oh, you mean…”
– This is new.
– Yes. Not seen this in any of the other rounds.
“If I understand you, we’ve assumed one person in each test is the bot, but what if there’s no bot?”
“Exactly. We were never told one person was a bot. We were only told to ‘identify the bot’ and incorrect identification would drop the person from the call.”
“We just assumed there was a bot and the goal was to find them. But what if that’s not the goal? What if it’s a double-blind experiment?”
– They’re good.
“What if these tests form an infinite rather than a finite game? The goal is not to win but to keep playing?”
“Got it. Yes. Which means the aim is not to identify the bot: the aim is not to be identified as a bot.”
“Perhaps the people running these tests aren’t pitting bots against people to see which bots are better at passing as human: they’re trying to find people who are good at not being mistaken as bots. Maybe they want to train their bots on how those people behave?”
“More human than human,” Mel says.
“So, if we’re not accusing each other, what next?”
“Good question. Just sit and hope for a timeout?”
– Impressive. Advance them both to the next round?
– Agreed. We’ll have to get upstairs to sign that off.
– Yes. That decision still needs human approval.