Sunday, September 16, 2007

Advertisers: Trust Scientists, not Their Toys


A photo illustration of our lab at Texas Tech University.



Thanks to the many colleagues who pointed out the recent Ad Age article titled, "Hidden Persuasion or Junk Science?"

In the article, Mya Frazier outlines how marketing consultants have recently been using the tools of neuroscience and psychophysiology to better understand consumers.

There are a lot of great points in the article, but to me the most important one is about motivation. To dredge up the clichéd Watergate-era quotation: "Follow the money."

People such as A.K. Pradeep, founder of Neurofocus, are in business to make money. That's fine. I am all about capitalism. But they are not scientists. They do not follow the facts for the facts' sake. They follow the money. And the money wants a quick solution. And there is no quick solution.

For years, I have been advocating the use of psychophysiological measures. I have attempted to argue against self-report measures. To shorten the case, I have reduced it to, "People lie." This idea is hardly mine alone. It's one that was cultivated in me by a group of like-minded scientists.

According to Ad Age, consultants agree with the basic tenet:

"Amid the many vagaries of marketing research, one thing is clear: Consumers lie. About what they want. About what they need. Sometimes they do it purposely. Most often they simply don't seem to realize what they're doing at all. Mr. Pradeep and his peers in the field of neuromarketing say they have the solution."

If people lie, then consultants lie. It logically follows. Trust me.

Although they agree with the basic tenet, they do not agree that it applies to them.

I'm not calling Mr. Pradeep a snake oil salesman. I don't know the man. I have no reason to believe that he's not the most well-intentioned consultant ever. But if consumers can lie because "they simply don't seem to realize what they're doing at all," then I see no reason why the same can't be true of marketing consultants, too.

As a scientist, I took entire courses trying to alert me to my biases. I've sat in coffeehouses with colleagues debating the nature of evidence. I really care about how I know what I know.

I know, for example, that it's in my nature to look for evidence that confirms my suspicions. So instead I look for negative evidence, or evidence that shows that I am wrong. This idea dates back hundreds of years and is common to science. It is far less common to industry.

Even looking for negative evidence is not enough. Even then, I am somewhat imprisoned by my own biases. We all are. That's why scientists publish their work in academic journals. To be published in a journal, a piece must undergo blind peer review by others in the field. That is, our names are stripped off, and similarly trained peers dissect the work. Only then do the ideas see the light of day.

The process has its flaws, sure. But at least our ideas receive some sort of scrutiny. This point was not lost on Ad Age.

"I don't want to trash people doing it, I'm just saying the incentives are such that there's no quality control because none of this data is published in peer-reviewed journals," said Paul J. Zak, the founding Director of the Center for Neuroeconomics Studies and a professor of economics at Claremont Graduate University. "I think the payoff is pretty low for marketers."

Here's the threat. I know a good deal about advertising. I also know a good deal about cognitive science. I certainly know more about the latter than the average marketing director. And I know that if you hook people up to any of the devices mentioned in the article, you're going to see differences.

The logic is simple:
  1. If you can perceive a difference between two stimuli, then that perception had to be a psychological event. That is, you psychologically perceived a difference.
  2. Psychological events tend to "live" in the brain.
  3. Finally, if you know there was a difference in the brain, and you go looking for a difference in the brain (or downstream peripheral nervous system), then you will find one.
And if you know something about advertising, you can interpret that difference in a logically consistent way.

But this is nothing more than a glorified focus group. Finding differences is child's play. The hard work is theorizing about the nature of those differences. That's very hard work. Trust me. And I see no incentive for consultants to do the hard work.

There is every incentive to look for any (likely) spurious correlation between data and sales. But there is much less incentive -- especially in the short-term -- to look for reasons why your relationship with an advertised brand might manifest itself in a particular way.
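
To make that concrete, here is a toy simulation of my own, entirely hypothetical and built from nothing but random numbers, showing how easily a persuasive-looking correlation between physiological data and sales can appear when you measure enough things on a small set of campaigns:

```python
# A purely hypothetical sketch (random numbers, not anyone's real data):
# with a couple dozen physiological measures and only a handful of
# campaigns, an "impressive" correlation with sales shows up by chance.
import numpy as np

rng = np.random.default_rng()

n_campaigns = 10   # a handful of ad campaigns
n_measures = 20    # simulated stand-ins for skin conductance, heart rate, EEG bands, etc.

sales = rng.normal(size=n_campaigns)                   # made-up sales figures
measures = rng.normal(size=(n_measures, n_campaigns))  # made-up physiological readings

# Correlate every measure with sales and keep the strongest relationship.
correlations = [np.corrcoef(m, sales)[0, 1] for m in measures]
best = max(correlations, key=abs)

print(f"Strongest correlation with sales: r = {best:.2f}")
# Most runs land somewhere around |r| = 0.6 or higher -- a number that would
# sound persuasive in a pitch, even though every value here is pure noise.
```

The point of the sketch is not that consultants fabricate numbers. It's that with small samples and many measures, noise alone will hand you a "finding," and only theory and replication can tell you whether it means anything.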

Allow me to give an example. More than two years ago, my lab set about investigating emotional psychophysiological responses to advertised brands. We were inspired by Saatchi & Saatchi CEO Kevin Roberts' ideas about Lovemarks.

We could have taken our show on the road after that initial idea. But we did not. We're scientists. We collected data. We tested some assumptions to try to ensure that we were not just seeing what we wanted to see. Those first data were recently published in the Proceedings of the American Academy of Advertising.

Before those ideas saw the light of day, three advertising scholars had to sign off on them. Now more than two years later, we are submitting the second round of data to the American Academy of Advertising for consideration.

It takes time to get it right. It's much easier to play a hunch. And if you have any idea what you are doing, then hunches often sound correct.

Most of the people mentioned in the article are thinking the right kinds of thoughts. They're doing the right kinds of things. They are just not doing them in the right kind of way: they are not giving the facts due diligence. It is this seemingly fast-and-loose treatment that makes terms such as "junk science" show up in headlines.

