AI is the New Snake Oil

The New York Times claimed that “A.I. Is Writing Fiction” despite having no solid evidence for their story’s central example. What actually happened, as reported in an article published just an hour earlier, is that Mia Ballard had been accused of using AI tools to write her novel, Shy Girl, and, as a result of those accusations, Hachette had pulled the book from UK shelves and cancelled the US release. Because neither Hachette nor Ballard has confirmed the use of AI tools, we can only speculate about whether the “A.I.” actually “wrote” the book.

What the NY Times published, in lieu of credible evidence, is a tweet from the CEO of a tech company (archive). Not only does this CEO’s main product detect AI, giving him an obvious incentive to overstate his findings, but his tweet also implies that he illegally downloaded the book from a piracy site to do his “research.”

And instead of explicitly stating this massive conflict of interest, or even mentioning the implied copyright violations, the NY Times posted an update that included their own “research”, which also implies that they, too, illegally copied the unpublished book.

Does this mean the NY Times supports copyright infringement? Considering they sued OpenAI for copying their articles without permission, I doubt it. More likely, they saw a tech professional in a high position confidently do something minor as part of a larger story and never questioned whether that seemingly “less important” action was appropriate. (This is purely conjecture, though.)

The point is that the New York Times, like so many others, was fooled by confident people with technology that, by its nature, exudes confidence.

America Loves a Confidence Man

It’s no secret that I love a good con artist story, whether it’s Huckleberry Finn or Ferris Bueller. Instead of the macho “tough guy” stereotype, I’ve always been drawn to depictions of men with silver tongues rather than iron fists. And I’m not alone, either, as this archetype spans mediums and genres, giving us recent examples ranging from Smoke and Stack in the horror-action film Sinners to Mr Wolf’s crew in the Ocean’s 11-inspired illustrated novel series The Bad Guys.

Beyond popular fiction, the “American Dream™” is built on the idea that people can climb from the bottom rung of society to the penthouse, if not through hard work and determination then through ingenuity and moxie. We, the people, revere “the outlaw” as much as “the cowboy”. This idea of the self-made man is what props up the US government, capitalism, and all institutions of imperial power from the White House to Wall Street. American heroes are individuals who rise to the top through brute force or gumption. And what trait do they have in common? The confidence of someone who earned their place.

But the problem with “confidence” is that it’s often read as an indicator of success when it isn’t a guaranteed byproduct of it. Speaking with confidence doesn’t prove that someone is right, only that they’re confident. So if you can’t make money working hard, why not focus on projecting confidence instead?

This is the basic gist of the “snake oil salesman”: he’s certain his product works. As he talks, he maintains the pulse of a napping security guard. His arms move around the product with great measure and infinite ease. He’s relaxed and calm, and those aren’t the traits of a liar trying to steal your money, right? Therefore, he must know what he’s talking about; or, at the very least, he must believe what he says.

Now replace that caricature with a tech bro: thin, if not athletic; non-traditionally handsome; casual clothes from overpriced brands; the title of “self-taught entrepreneur” because he was just too darn clever to follow the beaten path. On top of that, this man is talking about a field foreign to the masses, one that seems to change at an ever-increasing speed. If he says his machine learning model is better than OpenAI’s, it must be, right? In an age where all of Earth’s history lies within reach, there’s no way someone could lie about something that anyone with a Computer Science degree could easily disprove. Right?

Except we have near-limitless examples of confident tech bros dousing their laps in lighter fluid moments before striking a match. We see robots perform amazing feats, only to learn they were actually people in costumes. Investors are starting to get worried, and businesses are not seeing any gains from integrating AI into their workflows, leading some economists to predict a decrease in productivity.

Generative AI just isn’t as useful as it was promised to be.

Chatbots Are Confidence Men

A recent study compared the relative accuracy and confidence of human and AI responses. It found that AI models are confident regardless of accuracy:

In humans, confidence appears to encode an internal estimate of success probability that flexibly integrates information about task difficulty, discriminability, and correctness. In contrast, GPT-4’s confidence remains uniformly high, weakly sensitive to accuracy, and strongly shaped by task structure, suggesting that it reflects surface-level properties of its output distribution rather than a robust internal monitoring process. (source)

Chatbots are designed to be confident. In normal human interactions, confidence is often an indicator of accuracy. That’s the key difference: AI tends to be confident regardless of accuracy, while humans tend to be confident because of accuracy. AI isn’t designed to be accurate, just to sound confident.
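The gap the study describes is what researchers call miscalibration, and it’s easy to illustrate. Here’s a minimal Python sketch with invented toy numbers (not data from the cited study): each responder is a list of (stated confidence, was the answer correct) pairs, and the calibration gap is simply mean confidence minus actual accuracy.

```python
def calibration_gap(answers):
    """Mean stated confidence minus actual accuracy.

    `answers` is a list of (confidence, was_correct) pairs;
    a well-calibrated responder has a gap near zero.
    """
    mean_confidence = sum(conf for conf, _ in answers) / len(answers)
    accuracy = sum(correct for _, correct in answers) / len(answers)
    return mean_confidence - accuracy

# A human-like responder: confidence tracks correctness
# (high when right, low when wrong).
human = [(0.9, True), (0.8, True), (0.4, False), (0.3, False)]

# A chatbot-like responder: uniformly high confidence, same 50% accuracy.
bot = [(0.95, True), (0.95, False), (0.9, False), (0.9, True)]

print(round(calibration_gap(human), 3))  # small gap: 0.1
print(round(calibration_gap(bot), 3))    # large overconfidence gap: 0.425
```

Both toy responders are right exactly half the time; only the chatbot-like one keeps insisting, at 90%+ confidence, that it’s right. That steady, uniform certainty regardless of correctness is the snake-oil pattern in miniature.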

As a result, chatbots are, by definition, “confidence men,” no different than Tatum O’Neal in Paper Moon or Ryan O’Neal in real life (allegedly).


