
Caveat lector
23 May 2001
Our current howler (part III): How they know
Synopsis: How does Pew know if a statement is "negative?" That's easy—they just ask the boss.
The First 100 Days: How Bush Versus Clinton Fared in the Press. Project for Excellence in Journalism, 5/01
It's almost impossible to overstate the ineptitude of Pew's latest study. Pew studied seven news orgs, and seven only, and claimed to have studied "the media." And how did Pew select those orgs? The logic is hard to make out.
There are four broadcast news networks; Pew studied them all. Meanwhile, there are three weekly newsmags; Pew studied just one. And when it came to newspapers, bad judgment ran riot. Pew selected only two, the Washington Post and the New York Times, while noting that both are "reputedly liberal," with editorial boards that have "liberal attitudes." The lapse in judgment is simply surreal.
Imagine if Pew had studied the Wall Street Journal and the Washington Times, and then had used the resulting data to make sweeping statements about "the media." That would have shown bizarre judgment. And so does the mess which Pew did hatch, in which Pew managed to obscure the key point which its research showed: that even in the "reputedly liberal" papers, President Bush received more favorable news reporting than President Clinton received eight years earlier. As part of the overall mess which Pew conjured, its study opens with a nugget statement which seems to say precisely the opposite (see THE DAILY HOWLER, 5/22/01). For sheer, complete, yowling ineptitude, it rarely gets better than that.
Meanwhile, how does Pew know when a story is "negative" (or "positive")? Nothing in its report is worth a fig unless Pew can make that judgment. After all, in assessing the tone of the Bush/Clinton coverage, Pew rated every "story" it studied as "positive," "negative" or "neutral/balanced." Does Pew have a good way to make these assessments? When major journalists review Pew's studies, they simply never ask (note Getler and Kelly last week).
But on its face, these judgments would seem to be highly subjective, and Pew's descriptions of its method are far from reassuring. Here is Pew's first explanation of how the "tone" of each story was assessed:
PEW REPORT: To measure tone, researchers counted all the assertions by journalists themselves or comments by their sources in the story that were either clearly negative or positive. For a story to be considered anything but neutral, the positive or negative comments within it must outnumber each other by a ratio of at least two-to-one. For example, for a budget story to be considered positive for Bush, there would have to be eight positive assertions for every four negative ones.
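For the record, the ratio rule Pew describes is simple enough to sketch in a few lines of code. This is a hypothetical illustration only: the function name and inputs are our own, and it assumes away the hard part—deciding which comments are "clearly" positive or negative in the first place.

```python
def code_story_tone(positive: int, negative: int) -> str:
    """Classify a story by the 2:1 ratio rule Pew describes.

    positive / negative are counts of assertions already judged
    "clearly positive" or "clearly negative" (the subjective step
    this sketch takes as given).
    """
    # One-sided stories: any comments on only one side beat the ratio.
    if negative == 0 and positive > 0:
        return "positive"
    if positive == 0 and negative > 0:
        return "negative"
    # Mixed stories: one side must outnumber the other at least 2:1.
    if negative > 0 and positive / negative >= 2:
        return "positive"
    if positive > 0 and negative / positive >= 2:
        return "negative"
    return "neutral/balanced"
```

Pew's own budget example (eight positive assertions, four negative) comes out "positive" under this rule, since 8:4 meets the 2:1 threshold; seven positive and four negative would be "neutral/balanced."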
But how does Pew know if a comment is "clearly negative?" That can be a tricky judgment, requiring a good ear for the political discourse. And is a news report "negative" just because it quotes people making "negative" statements? In its report, Pew offers fleeting overviews of some negative and positive stories. Consider one "negative" case:
PEW REPORT: Clinton's critical stories most often dealt with his leadership abilities. A few days before the vaccination story, NBC's Tim Russert assessed Clinton's problems. "There's concern among Democrats in Congress, Tom, that Governor Clinton, like Governor Carter, is used to working with weak legislatures. That's not Congress.... [Republicans will say:] On Zoe Baird, illegal immigrants, he was tone deaf.... Even his most avid supporters are saying that in the last few days, the president is stumbling." [Ellipses and brackets as found in Pew report]
According to Russert's report, even Clinton's strongest supporters were saying that he was stumbling. But if that's true, it surely is news, and Pew doesn't claim that Russert was wrong. But to Pew, reporting what Clinton's supporters were saying seems to count as "negative" reporting. Presumably, even that part of Russert's report would go down in the "negative" column.
Surely this is not what people mean when they ask if Clinton got a raw deal. But clearly, that is part of what Pew has in mind in judging news reports:
PEW REPORT: On January 28, [2001], the front of the New York Times carried the headline, "Bush's Transition Largely a Success, All Sides Suggest," and wrote in the lead, "As President Bush completes his first week in office, prominent Republicans and even many Democrats agree that he has presided over one of the most orderly and politically nimble White House transitions in at least 20 years."
This is offered as an example of Bush's "positive" coverage. But was the reporting true or false? If true (if "even many Democrats agree[d] that Bush ha[d] presided over" a superior transition), then surely the Times was right to say so. Again, this is surely not what most people mean when they wonder if Bush got an "easy ride," but this is clearly one of the ways that Pew assesses its "stories." And consider one last example, of "positive" reporting on Clinton:
PEW REPORT: Similar tendencies appeared on network news. A February first report on NBC quoted two experts applauding Clinton's vaccination plan for children as "long overdue" and the answer to "a tragedy." The only critical remark came as an indirect claim made by Robert Hager that "government officials say the [drug] companies are resisting." [Our emphasis]
But were the drug companies resisting? If they were, and Hager reported it, is that an example of "negative reporting" as people understand the term? Surely, when people claim that Clinton got "negative coverage," they don't refer to things like this, to a perfectly accurate statement that the drug companies opposed Clinton's plan. We don't know why Hager only quoted experts who said good things about the plan. (Was that the bulk of expert opinion?) But when he says the companies oppose the plan, should that be scored as a "negative" statement? Pew seems to score it as such.
It's hard to make an objective assessment of whether reporting is "positive" or "negative." Routinely, this point is completely obscured by these Pew reports. Typically, Pew shows no sign of having any idea that it is offering, at best, a crude measure. And mainstream pundits quote Pew's results as if they were gifts from the gods.
In fact, Pew is so much a part of the Official Press that it is virtually the PR arm of the press corps, offering timid critiques from the margin. And the entire press corps treats Pew's work as if it were set in stone. The present, hopelessly flawed report shows the problems that result. Pew has grown exceptionally lazy. It claims that sixty days is the same as a hundred; thinks seven news orgs make up "the media;" and doesn't seem to have the slightest idea of how crude a measure it employs. Indeed, part of the comedy in Pew's report came in the one place where it addresses this last problem. How does Pew know that a statement is negative? Read what Pew itself said:
PEW REPORT: Researchers coded each comment and innuendoes [sic] pertaining to that particular theme for it [sic] tone: positive, negative or neutral. Extra weight was given to text in the headline or lead paragraph of a story. When the ratio of positive to negative comments, or negative to positive comments, equaled or exceeded 2:1 a story was coded as a positive or negative assessment of the president. All other stories were classified as neutral.
All subjective variables were reviewed and confirmed by a senior manager. [Our emphasis]
Sic, sic, sic! (Maybe Pew could hire some proofreaders.) But how does Pew know if a comment is "negative?" Simple, folks: they just ask the manager! We're always intrigued by Pew's apparent lack of awareness of the basic problems involved in its methods.
Next: There were far fewer stories on Bush. Politely, Pew ignored one key reason.
The press corps' latest disgrace: We congratulate Salon for its new report on the "trashing of the White House" hoax. We strongly suggest that you read the report. You know what to do. Just click here.
The occasional update (5/24/01)
Drilling for crude: The crudeness of Pew's measure can be sketched in a thought experiment. Imagine a president coming to power in two parallel universes. He offers the same budget plan in each universe. But in one universe, he offers bribes to every member of Congress, and robs a bank to fund the effort. All his deeds become fully known. In that universe, the news reports would be full of people making "negative" statements. Would that mean that the president was a victim of "negative coverage?" Not as that phrase is normally understood. But to Pew, that president would be getting highly "negative" coverage. Conclusion? Whatever Pew is measuring with its described method, it bears a fairly crude relation to what people normally think of as "negative coverage."
One final caveat. Our bank-robbing president would get lots of "negative" stories, unless President Clinton also lived in that universe, and had recently gotten a speeding ticket. In that case, of course, our giver of bribes would get no press coverage at all.