
The Uncanny Valley

It used to be an obscure oddity, now we all need to understand it.

18 years ago, I posted this image:

…and I still can’t get it out of my head. Sorry.

Why do we have such a creeped-out reaction to images that aren’t quite right? A robot that looks too much like a person, or a song that we can somehow tell has an AI voice.

The creepiness predates AI, and was first named in a paper by Mori fifty years ago. But it’s so visceral that it almost certainly originated along with our fear of snakes and other evolutionary safeguards.

There are probably two things going on.

First, there’s a corpse alert. Corpses are dangerous, and something that reads as both alive and not alive is a warning sign. Same thing with zombies.

Second, imposter alert. Imposters are even more dangerous than predators, and we honed our imposter-detection skills a long time ago.

And now, everyone has AI available to them, and many of us are churning out experiences that border on the uncanny valley.

Not many people care about an automated drum track on a pop single, but we get uncomfortable when the lead singer isn’t quite human. We don’t mind when a website figures out our zip code for us, but when a bot apologizes for a late shipment, it means less than nothing. We’re okay with animation, but not with an educational video that combines beautifully shot real footage with an animated human that’s almost but not quite real…

While it’s possible to get used to snakes, and perhaps even to corpses, I’m not sure the general population is in any hurry to get used to either one, or to the uncanny valley.

It’s likely that AI quality will increase fast enough that many of the most egregious valley moments will stop happening. But none of that will help with the expectation chasm. When we install an AI admin, or use AI for customer service or therapy, we will always end up with a valley sooner or later.

The solution is simple but takes effort: don’t fake it. Celebrate your genre, make a promise and keep it. Not in the way we need to label the ingredients in food, but simply to avoid the surprise realization, to protect your customers from the ick. Triggering an evolutionary survival mechanism is rarely good for your career.

“I confused and alienated people as I worked to save money trying to get them to think this was a person” is not much of a mission statement. Our job is to find problems and solve them, not to hustle our way with shortcuts that feel creepy.


Three videos for today:

The talking dog and AI.

Hank Green on the essential Mola sunfish metaphor.

Talking with Jon and Becky about We are For Good and the work of non-profits.


The gap between “I” and “no one”

This is where empathy lies, and it’s an easy chasm to fall into.

“I can’t imagine eating durian ice cream,” is not the same as “no one likes durian ice cream.”

We fail as marketers, editors and project managers when we can’t find the empathy to bridge the gap. It’s a lovely shortcut to make things for yourself, to imagine that you are the client, the reader or the customer. But most of the time, you’re not.

“It’s not for me, but it might be for you.”


Popular (and good)

Popular is easy to measure. Good, not so much.

Setting out to make something popular requires only a focus on the crowd and on the moment. Most pop music is popular simply because that’s what it was built to do.

Good work can be good without being popular. And so the two goals aren’t easily aligned.

It helps to begin by becoming comfortable with what good feels like to you. Because conflating it with popular is a trap.


A nearly perfect score

After playing 498 days in a row, my score today in Bongo was the second-highest in the world.

There’s a difference between casual online games that have a right answer, and those that are open-ended.

In crossword puzzles and most of the games from the Times (like Wordle and Connections) you’re trying to guess what the puzzle constructor had in mind. This can lead to frustration, because the idiosyncratic nature of inventing clues and answers means that you might not be in sync with the person at the other end. They’re inherently closed systems.

Bongo, on the other hand, is generative and combinatorial. There are bazillions of possible right answers, and your goal is to find a right answer that’s worth more points than anyone else’s. It doesn’t matter that I invented the game: I have no advantage over anyone else, because we all begin with the same tiles.

For me, open-ended games are time well spent. Have fun.


AI slop

It’s not slop because it was created by an AI. It’s slop because it’s slop.

I just read the first two pages of a sci-fi novel on my Kindle. The author proudly proclaims that the 400-page book was created without any AI whatsoever. Alas, the book is slop. The writing is overwrought and the dialogue is banal. If a page isn’t worth writing, it’s unlikely a chapter is.

Slop happens when a marketer who should know better stops trying. It’s when we prioritize volume over impact. If we measure the cost of what we create instead of its value, it’s likely we’ll end up with slop.

AI makes this easier, no doubt. But it pays to focus on avoiding slop, not on worrying about how the slop is made.

The question is now, “Who approved this?” not “Who made this?”


[You're getting this note because you subscribed to Seth Godin's blog.]
