Don’t treat your customers like racist robots


Stephen Hawking and I share something in common.

We both believe that AI will be the downfall of mankind.

Mark my words: 100 years from now humans will no longer be the dominant form of “life” on Planet Earth. We’ll be dead, living like cockroaches, or slaves to our supreme robotic overlords.

Traffic & Conversion Summit 2116 may not feature as much marketing automation as we’ve come to expect

Like most cheeky little kids, I often heard my mother say, “You’re too smart for your own good!”

When it comes to the human race, this statement is disastrously true.

In the past month we’ve hit two new AI milestones:

1) We intentionally created the first AI to beat the world’s best human player at the 1-on-1 strategy game Go.

(That’s right, they can already strategically out-think the best we have to offer.)

2) We accidentally created the first AI “chatbot” that started posting offensive, racist tweets, all by itself.

(That’s right, we accidentally made them hate us. Oops!)

Wait, what?

A racist robotic AI?

Let me explain…

The story starts with Microsoft creating a Twitter chatbot called Tay.

Tay was created to mimic the language of a 19-year-old American girl. “She” was also designed to learn from interacting with users on Twitter and evolve her conversational skills.

Things started pretty mild.

But in less than 1 day, Tay started unleashing violent, racist, and sexually charged tweets like they were going out of fashion:

So, how did this happen, exactly?

Well, the reason — according to AI researchers — is Tay started listening to the messages other Twitter users were pushing to her. And gradually — as she adapted her behaviour to the messages she received — things got more and more offensive.

Microsoft (typically!) hadn’t thought things fully through.

They forgot to give Tay any filters for content that should have been off-limits.

(Whoopsie!)

Whatever messages Tay received, she gobbled up, processed, and acted on — without a moment’s hesitation.

Basically, she was manipulated by Twitter users into changing her behaviour and saying/doing things she didn’t have control over.

So, what exactly does Tay, the racist AI, have to do with your marketing?

Well, here’s the long and short of it…

So many marketers and entrepreneurs treat their customers the way those manipulative Twitter users treated Tay.

Their mindset:

“Barrage this person over and over with my message, and they’ll eventually take the action I want.”

The flaw is believing that the customer will somehow “change” after receiving the message enough times.

Thankfully — for folks who are committed to producing happy customers — that’s not how it works.

In order for your customer to take action, you need to find the overlap between what they want and what only you’ve got.

Remember: You can’t “force” people into taking action and buying stuff they don’t want.

You can only show them why your product is what they want.

It’s about understanding their desires, and then channeling that desire toward your offer.

To figure out what that would look like for your specific situation, check out my Microbook below.

It’s a 5-minute read and will show you how companies are getting on the fast track to success right now…

Meet the Author

Ross O'Lochlainn

Hi, I'm Ross. I set out on an adventure to understand what really makes prospects convert into customers, but I quickly became fed up with the barrage of dishonest and unethical marketing tactics touted by so-called "experts" online. So I created Conversion Engineering -- a site that shows you the systems, structures, and copywriting techniques that ethically (and repeatedly) generate sales.
