Monday, April 09, 2007

Frame Your Science or Have it Framed

Update: Tuesday, April 10, 2007, 7:48 a.m.: Persuaders quotations added.

A few weeks ago, the Texas Tech College of Mass Communications faculty sat around a room talking with heads of media industries. The question was asked, "How often do you read our research in academic journals?"

The answer was never.

This is a problem. And if it is a problem with mass communications, it must be an even worse problem with more traditional sciences.

If you were to pick up the latest issue of a communications journal, you could probably read it and make some sense of it. If you're not a scientist, you might get a bit lost on the methodological detail. But you could probably learn how to make a better advertisement.

But you don't.

And we don't encourage you. As public scholars, most of us fail completely.

The problem is intensified with controversial topics, such as global warming, evolution, and genetically modified food products. In these areas, politicians, pundits, and a vast array of non-scientists have an agenda to push.

They do not want to do science. They want to sway public opinion. They want to do it quickly. And you're never going to do that with a long lecture on the science behind it all. Like an introductory textbook, they want a simple metaphor or exemplar. They want something simple.

"My grandfather was not a monkey," you might hear.

Sure, it oversimplifies everything and misses the scientific point, but it resonates.

Sometimes rather than an exemplar, politicians and pundits might try to frame the way in which an argument is discussed. GOP researcher Frank Luntz helped rename "global warming" as "climate change" for a large chunk of the Republican Party. Here is an excerpt from the Frontline documentary, "The Persuaders."



FRANK LUNTZ: Look, for years, political people and lawyers – who, by the way are the worst communicators – used the phrase "estate tax." And for years, they couldn't eliminate it. The public wouldn't support it because the word "estate" sounds wealthy. Someone like me comes around and realizes that it's not an estate tax, it's a death tax because you're taxed at death. And suddenly, something that isn't viable achieves the support of 75 percent of the American people. It's the same tax, but nobody really knows what an estate is, but they certainly know what it means to be taxed when you die. I'd argue that is a clarification, it's not an obfuscation.

DOUGLAS RUSHKOFF: Luntz has admonished Republican politicians to talk about "tax relief" instead of "tax cuts," and to replace the "war in Iraq" with the "war on terror." He once told his party to speak of "climate change," not "global warming."

FRANK LUNTZ: What is the difference? It is climate change. Some people call it global warming, some people call it climate change. What is the difference?

DOUGLAS RUSHKOFF: It apparently made enough difference to Republicans that they began to use "climate change" almost exclusively.

Sen. JAMES INHOFE (R-OK): –cause global – cause climate change.

SPENCER ABRAHAM, Secretary of Energy: –the President's global climate change initiative–

Vice Pres. DICK CHENEY: –climate change research–

Pres. GEORGE W. BUSH: –and we must address the issue of global climate change.



What is a scientist to do? My former colleague, Matt Nisbet, studies this area, and this week, together with science writer Chris Mooney, he came out with a bold agenda for scientists in the journal Science.

"In reality, citizens do not use the news media as scientists assume. Research shows that people are rarely well enough informed or motivated to weigh competing ideas and arguments. Faced with a daily torrent of news, citizens use their value predispositions (such as political or religious beliefs) as perceptual screens, selecting news outlets and Web sites whose outlooks match their own," Nisbet and Mooney write.

There are important implications when scientists step away from the proverbial microscope and into the policy arena. However, as Nisbet and Mooney point out, sticking strictly to the facts might end in a lost battle to defend their science.

Public science is a part of science, and science plays an increasingly important role in society. Scientists who hide in the ivory tower and allow others to frame their ideas may have honor on their side, but they will have few voters -- and little government funding -- behind them.

Read a Weblog-based discussion of this issue on Nisbet's blog.


Friday, February 02, 2007

Good People, K-Y Jokes Fill Lab


This world contains a lot of great people. If you read the news a lot, cynicism might color your perception. But great people dominate.

In my lab, we measure psychophysiological responses to media. One of the responses we measure is skin conductance, which indexes activity in the sympathetic nervous system because sympathetic arousal activates the eccrine sweat glands in the palms.
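For readers unfamiliar with the measure, here is a minimal sketch of the kind of logic involved in flagging skin conductance responses in a sampled signal. The sampling rate, the 0.05 microsiemens threshold, and the one-second window are textbook rules of thumb and my own illustrative choices, not our lab's actual analysis pipeline:

    # Count skin conductance responses (SCRs): rises of at least `threshold_us`
    # microsiemens within a short window. Parameters are illustrative defaults.
    def count_scrs(samples, sample_rate_hz=20, threshold_us=0.05, window_s=1.0):
        window = int(sample_rate_hz * window_s)
        count = 0
        i = 0
        while i + window < len(samples):
            rise = max(samples[i:i + window + 1]) - samples[i]
            if rise >= threshold_us:
                count += 1
                i += window  # skip past this response before looking for the next
            else:
                i += 1
        return count

    # A flat baseline with two brief rises yields two responses.
    signal = [2.00] * 40 + [2.10] * 20 + [2.00] * 40 + [2.08] * 20
    print(count_scrs(signal))  # -> 2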

Since salt affects conductivity, we need a saline-free gel for the skin conductance sensors. You can buy expensive electrode gel online, but you can also just use K-Y Jelly. When experiments are running, we use a lot of it, so we buy it in bulk.

And perhaps it is a sign of our sophomoric senses of humor, but buying K-Y Jelly in bulk is always funny. It is especially funny because we usually also buy medical tape (for holding down sensor wires) and paper towels in bulk. It's just an odd combination.

I've known the order was coming for many months, and Wednesday I finally provided my shopping list to the college's executive assistant. This is, perhaps, the nicest woman whom I have ever met, and I felt bad sending her out on the errand. But it's a government procurement card, and I cannot use it.

In a line for the ages, this poor woman was beginning to get a weird look from the cashier as she rang up 7 ... 8 ... 9 ... 10 ... 11 ... 12 tubes of K-Y. A final awkward glance from the cashier led our crack staffer to deadpan, "We're planning a big Super Bowl party."

Truly a line for the ages.

A couple of hours later we're sitting in lab meeting recounting this story with great hysterics, and I look at the people there, and I see a great group of people.

I didn't say anything at the time, but I felt damned lucky. I bolted halfway across the country for this job at Texas Tech, and six months later I have an amazing lab. My doctoral student is as good as any I have met, three master's students are working in the lab, and so is a great undergraduate (with another studying abroad in Italy this semester).

I have no idea how I managed to find 6 great people in just 6 months, but man are we having fun in the lab. The K-Y tale led to many bad jokes, and I think everyone was smiling most of the time.

Our first psychophysiology experiment participant should run in about two weeks. You might hear the champagne pop all the way from Lubbock.


Sunday, January 28, 2007

Science Publishing: Ideas or Dollars?

Myriad problems plague the academic publishing model. Quantifying success with numerical indicators encourages scholars to publish for publishing's sake.

Likewise, publishing in academic journals is prized far more than books or book chapters. In no small irony for faculty researchers, publishers make the money on journals; researchers can only make money from books and book chapters, the less-valued outlets.

Reading the Chronicle of Higher Education, I ran across this article.

Publishers' Group Reportedly Hires P.R. Firm to Counter Push for Free Access to Research Results
By SUSAN BROWN
The Association of American Publishers has hired a public-relations firm with a hard-hitting reputation to counter the open-access publishing movement, which campaigns for scientific results to be made freely available to the public, the journal Nature reported on Wednesday.

The firm, Dezenhall Resources, designs aggressive public-relations campaigns to counter activist groups, according to the Center for Media and Democracy, a nonprofit organization that monitors the public-relations business.

That's right. Hire an attack dog to tackle those radicals suggesting that science -- of all things -- should be about ideas rather than profits.

A few minutes later, I saw that my former colleague Matt Nisbet also wrote about high-priced journal subscriptions today. Nisbet references an excellent article in the Washington Post titled "Publishing Group Hires 'Pit Bull of PR.'"


Sunday, August 20, 2006

More Thoughts on Peer Review

Updated 9:22 p.m. August 20, 2006

It seems that academic peer review is my favorite rant on this Weblog.


  • On June 10, 2006, I said that Numbers Chasing Sullies Science in reference to many insecure scholars in my discipline who prefer to publish several heaps of garbage rather than a single original idea.
  • On May 7, 2006, I said that the Academic Publishing Model Is Broken in reference to how libraries were being pillaged by journals.
  • On December 3, 2005, I said that the System Is Fundamentally Broken after I had a good paper rejected because too many good papers were submitted.
  • And on November 24, 2005, I said that Some Days You Just Want to Know because I was spending more time anticipating reviewer complaints than concentrating on good science.

My favorite pet peeve made it back to the forefront Saturday when a review came back negative. I can handle negative reviews; they're part of the process. But this one was extremely wrong-minded.

In any given field, there are only a handful of journals that matter. And when a poorly trained editor rotates into the top spot, it is a painful few years.

Poor scientific training shows itself in an editor. You can mask poor training as a researcher. But as an editor, you are exposed. If you miss the point of being a scientist, you cannot be an editor. You will make bad decisions. You will let in finely polished bad ideas and fail to recognize good ideas when they fail to conform to some standard.

My career is far from perfect, but the best decision I ever made was to head to Bloomington, Indiana. Thanks to Annie Lang, I feel that I was well trained as a scientist. I care passionately about what it means to do science. I care about the process. And I respect the process.

And I hate ... HATE ... when the trappings of science get confused with science. Are the data good? Is the idea meaningful? That is the key. Yet too many poorly trained scientists -- and editor X -- seem not to understand. They appear to have gotten their Ph.D.s at the "walks like a duck" university.

Well, science that walks like a duck can still be -- and often is -- bad science. Polish gets confused for quality, and good ideas can be lost when they do not slide into the square hole as neatly as the square peg should. It takes good training to know the difference ... or to care enough to look for the difference.

When I received my first journal article to review, I asked Annie for advice. As always, she came through in the clutch. She said to separate the data from the idea. Are they good? Is it good science no matter how poorly written? Can the science be saved? And ever since, I have reviewed every article with this metric. Science deserves no less.

I wish that my coauthor and I had received the same treatment. Instead I spent today using the handful of good ideas in the review to improve the paper (there's some merit to the system), which will go out to a better journal tomorrow. And rest assured, this piece will find a home. The idea is good, even if the polish was a bit smudged.

In the meantime, we wait for another right-minded editor to rotate onto another journal.


Saturday, June 10, 2006

Numbers Chasing Sullies Science

It seems that the phenomenon of scholars whoring themselves for numbers is gaining more widespread attention.

My colleague, Dr. Matthew Nisbet, just forwarded an interesting article from the Wall Street Journal regarding the active manipulation of journal impact factors. Most people outside of academe are familiar with the "publish or perish" label. Although I find this characterization to be terribly misleading, the need to demonstrate an active research program does come with drawbacks.

As professors at research universities, we are expected to publish the results of our research in academic journals. To be published, a manuscript is "blind" reviewed by other "peer" researchers in the area. Since research specializations are often quite small, the degree to which these reviews are actually blind varies. That is, I know who does what in the study of emotion, attention, and media.

Some journals are better than others. However, calling one journal "better" than another is somewhat akin to calling chocolate ice cream better than vanilla ice cream. So there are quantitative indicators. One of these indicators is the impact factor. This statistic is kept by ISI Thomson, and it provides an index of how often a journal is referenced by other journal articles.

The thought is that if your work is important, then other people will cite it. The more that work in a particular journal is cited, the higher its impact factor climbs. Some journals are not tracked by ISI. Thus, they have no impact factors. They are the lepers of science.
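For the record, the arithmetic behind the impact factor is simple, assuming the standard two-year definition: citations received this year to a journal's articles from the previous two years, divided by the number of articles it published in those two years. Here is a minimal sketch in Python; the journal and its counts are invented for illustration:

    # Two-year impact factor: citations in year Y to items published in years
    # Y-1 and Y-2, divided by the number of citable items published in those
    # two years. The numbers below are made up for illustration.
    def impact_factor(citations_to_prior_two_years, items_in_prior_two_years):
        return citations_to_prior_two_years / items_in_prior_two_years

    # A hypothetical journal: 80 articles over two years drew 200 citations this year.
    print(round(impact_factor(200, 80), 2))  # -> 2.5

That single quotient is the number journals are now jockeying over.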

Keep in mind that I am a quantitative scientist. All of my research involves numbers. I love numbers. In this respect, I am like the Count from Sesame Street. "I love to count. Ah Ah Ah." However, when you introduce quantitative indicators to science, you immediately pervert the process.

The Wall Street Journal article (Begley, 2006) provides evidence that some journals are actively trying to manipulate their impact factors. Specifically, the WSJ reports that after an article is all but accepted, the American Journal of Respiratory and Critical Care Medicine asks every author to cite more papers from that journal before publication.

This is as blatant a manipulation of the process as I can imagine. But impact factor perversion is just the tip of the iceberg. As I have written before, numbers chasing threatens science at every level.

Take the NFL sack record as an analogy. New York Giants defensive end Michael Strahan set the all-time single-season sack record in January 2002 when he took down Green Bay Packers quarterback Brett Favre in the fourth quarter. But the play looked suspicious. It looked like a "gimme." Everybody in that stadium knew Strahan needed one sack for the record, and the takedown looked as if it could have been completed by a punter. Favre denied handing Strahan the sack, but few people who have seen the footage believe it. The record is tainted.

Allow me to give you a few more examples of how we are counting ourselves stupid in American science.

The very notion of impact factors has, I argue, a chilling effect. It is difficult to get truly "new" work published. For instance, I have two papers that won top paper awards from an academic society that are still awaiting journal homes. They are difficult to publish. They are new. They are not trendy. They do not fit high-impact journals. So they are published in a journal also known as my desk drawer.

So you run a risk as a young scholar. Ground-breaking work runs the risk of slow or no acceptance. Better to toe the line. Do menial work and cite the big names in the field. Imagine if all of science behaved this way.

Budding linguist Noam Chomsky published Syntactic Structures in 1957. Chomsky has gone on to be one of the most cited scholars of his generation, and no one can deny the influence of the 1957 volume on modern linguistics. But Syntactic Structures was not published by the biggest or best publisher. Instead it was published by Mouton in The Hague, Netherlands. Today, publishing in such an obscure outlet may cost someone tenure.

There is no denying the clear separation between the top and bottom journals in a field. But the finer gradations are far more subjective. Take, for instance, the emphasis on flagship journals by my soon-to-be-former employer, The Ohio State University.

The school's pattern of administration (available online) states, "Faculty of the School of Communication strive to become known for high quality research programs. Thus, tenure track faculty are expected to engage in a rigorous program of research that contributes to the advancement of the field of communication and to the prestige of the School." Later on the same page, it names three "flagship" journals in communication: Communication Research, Human Communication Research, and the Journal of Communication.

Herein lies the rub: OSU's School of Communication has approximately 27 tenured and tenure-track faculty members. Although numerical expectations are very "hand-wavy," to be considered successful, one usually publishes 2-3 peer-reviewed journal articles per year. And the communication faculty at just one university may easily be flooding these three journals with more than 50 submissions per year -- perhaps more than 100.
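Here is the back-of-the-envelope math behind that estimate. Every rate below is my own assumption for the sake of illustration, not an OSU figure:

    # Rough estimate of flagship-journal submissions from one school per year.
    # All rates are illustrative assumptions, not OSU data.
    faculty = 27                # tenured and tenure-track faculty in the school
    flagship_share = 0.75       # assumed fraction of papers sent first to a flagship journal
    resubmission_factor = 1.5   # assumed extra attempts after rejections within the trio

    for papers_per_person in (2, 3):
        submissions = faculty * papers_per_person * flagship_share * resubmission_factor
        print(f"{papers_per_person} papers per person -> roughly {submissions:.0f} submissions per year")

Nudge any of those assumed rates upward and the total clears 100 with ease.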

Although this may not directly inflate the impact factor, it does inflate another statistic: the rejection rate. The top journals all have high rejection rates. Like the best universities, the best journals are "hard to get into." And we are single-handedly cementing these journals' status as high-rent districts.

I do not point this out to fault OSU. To be clear, the system is driving this problem, not this individual school. However, just a few like-minded programs with large research faculties can unintentionally drive the field. Furthermore, we give three editors the power to decide what "matters" in communication.

If Chomsky had been held to this model (imagine him forced to publish in journals edited by behaviorists such as B. F. Skinner), stimulus-response models of cognition might still win the day.

This impact factor phenomenon also colors the process at the individual level. Just as with journals, it is popular to think that the more an individual is cited, the more important that individual's work is to the field. However, this assumes that no one is "working" the process.

It has been my observation -- and that of others, although I will not hold them accountable here -- that citation circles have developed within our field. That is, a group of 8-10 like-minded individuals can completely skew the process if they so desire.

It goes like this: These 8-10 individuals publish in a common area. So they cite each other ... a lot. And they co-author papers together, but not all at once. So they submit their articles to the journals, and the editor is most often not an expert in that particular sub-field. So the editor looks at the citations and invites reviews from authors cited heavily within the paper.

But wait! Those reviewers are within the circle. So there is no blind review. And even if the other circle members do not know the paper's authors with certainty, the paper sits well within their scientific paradigm, and it cites them a lot. This means that if it gets published, it makes the reviewers look good. So Henry Ford is proud, and the assembly line is pumping.

The papers have all the trappings of science. They look like science. They "quack" like science, if you will. But they are nothing like programmatic science. They are simple regurgitations of a handful of meager ideas.

I'm not alleging any smoke-filled rooms or Roswell-esque conspiracies. Read the research on flattery. It's hard to be "mean" to people who are kissing your butt. Even if the reviewers are trying to be impartial, the social psychology literature suggests that they cannot be.

So, there go the numbers like a runaway train. If you confuse success with visibility, you will then seek to be visible. And if you narrowly quantify success and then do everything in your power to light up that scoreboard, then the numbers will follow. What happens to a baseball team when its players begin chasing individual stats?

Science will suffer. Sure, progress will be made. But it will be made in spite of most of the research being done, rather than on the backs of most of the research being done. It's sad, really. It's a sad day when a leading newspaper can publish an admission by a journal editor that they send out a boilerplate letter urging more citations, and it is not a national scandal.

But the fact that it was reported is a sign that the runaway train had better watch out ... there might just be light at the end of the tunnel.

Reference

Begley, S. (2006, June 5). Science journals artfully try to boost their rankings. Wall Street Journal, p. B1.
