Daily Howler: Uh-oh! The Post's favorite school had the second-lowest score in the state
HOW LOW DID THEY GO? Uh-oh! The Post’s favorite school had the second-lowest score in the state

PEP RALLIES—AND LOUDSPEAKER LESSONS: We’d like to return to other topics—to topics which directly involve real instruction in real low-income classrooms. And yes, we plan to comment on John Tierney’s column regarding charter schools in Milwaukee (the column appeared on March 7). We also plan to look at President Bush’s statement to those newspaper publishers in a bit more detail (see THE DAILY HOWLER, 3/11/06).

We’d like to return to other topics—but testing-and-reporting does matter. And this morning, the Post runs another large story about a school system’s preparations for this spring’s high-stakes tests. Indeed, here’s the headline atop Nick Anderson’s piece: “What Do We Want! High Test Scores!” In the sub-headline, the excitement continues: “Prince George’s Schools on Watch List Pile on Drills, Cheers as State Tests Loom.”

Yes, this report takes us back to low-income, majority-black Prince George’s County, just outside Washington, D.C.—and it describes the way the county is getting kids ready to take this spring’s high-stakes tests. Anderson’s report is quite informative, but we do have a standard objection. Although Anderson describes the county-wide frenzy which is building as the test date nears, he never mentions a salient fact. He never notes that pressures like these have produced many cases, all over the country, in which teachers, principals, and school systems have outright cheated on tests. Almost surely, the desire to get those test scores up produced the odd procedure in the state of Virginia which we’ve discussed for these several weeks (see below). No, no one is saying that Prince George’s County is doing something wrong as it preps for these tests. But we did start a bit at this statement:

ANDERSON (3/14/06): Prince George's officials have tried nearly everything in recent years to raise scores.
Ouch! The history here is clear, nationwide; all too often, when school officials “have tried nearly everything,” they have ended up slipping over the line. Let’s stress again—no one is saying that Prince George’s County has engaged in any misconduct at all. But this has been an important part of the national testing story for the past forty years—and it’s a part of this story which mainstream reporters almost never mention. In a story of this length, we think this omission is unfortunate. But then, reporters rarely display awareness of this general problem. (Note Anderson’s unblinking report about one school which has recorded “phenomenal gains.” In our experience, schools which record “phenomenal gains” are schools which should be double-checked.)

That said, we especially started at one part of this report. Yes, Prince George’s schools are holding pep rallies to motivate students for the big test. But at one point, Anderson described an educational practice which struck us as a bit odd:

ANDERSON: Like other schools, James Madison [Middle School] broadcasts MSA vocabulary words every morning over the loudspeaker. One recent word: "innocuous." Definition: "harmless, producing no injury." An announcer gave an example of usage tailored to adolescents: "No gossip is innocuous. Gossip always hurts somebody."
Really? The school broadcasts specific vocabulary words from the MSA (Maryland School Assessment) every day on the intercom? Let’s get technical: On a nationally norm-referenced, standardized test, this practice would plainly be inappropriate; you’d be preparing your students for specific test items in ways the norm group had not been prepared. That makes such a test invalid. By contrast, on a test like the MSA, the practice may be completely appropriate—but it does strike us as slightly odd. Is this really being done in all the state’s schools, in a uniform way? Does it involve specific vocabulary words which will appear in specific test items? If so, what’s the theory behind this sort of preparation? Don’t kids need to know lots of words—not just the handful which appear on one test? And by the way: If kids are being prepped in item-specific ways, are we really surprised when it turns out that they do fairly well on these tests?

Let’s stress this again: This practice may be wholly appropriate. Who knows—it may even make sense! But it did strike us as somewhat odd—and we’re accustomed to a world in which education reporters fail to note irregular practices. At the Maryland web site, we can’t find examples of vocabulary items from past tests. This procedure may be completely A-OK. But we’re going to make a couple of calls to double-check on this slightly odd practice.

By the way: If kids really learn from school-wide intercom broadcasts, why do we bother having teachers? Wouldn’t we save a lot of dough if we just let the principal do the work?

Continuing story: Yes, Virginia!

HOW LOW DID THEY GO: Should the Washington Post have presented Maury Elementary (Alexandria, Virginia) as “a study in pride, progress?” That’s how the paper described the small school in a top-of-the-front-page report last month (full links below). But how low did the Post really go in its search for a heart-warming story? With Virginia’s “school report cards” accessible again, we’ve checked through the state’s school systems, trying to find other schools which scored as low in third-grade reading as Maury seems to have done last year. As we have noted, only 27 percent of Maury’s third-graders seem to have passed the state “Reading/Language Arts” test—a test which was passed by 77 percent of third-graders statewide. (From now on, we’ll just call it “reading.”) Did any school in the state do worse? We have found only one such school: Annie B. Jackson Elementary School of tiny, rural Sussex County, where only 22 percent of third-graders passed the state reading test last spring. (Assiduous reader RC reports the same tentative finding. We’re disregarding the Richmond Alternative School, for reasons explained below.) In short, when the Post hailed Maury as “a study in progress,” it was hailing a school with one of the lowest reading performances in the entire state of Virginia! As we’ve noted, only two grade levels were tested last spring—third grade and fifth. And in third grade reading, only one school scored lower—one school in the whole state!

(Maury did score fairly well at the fifth-grade level. According to its school report card, 83 percent of Maury fifth-graders passed the state reading test, compared to 85 percent of fifth-graders statewide.)

Readers, whatever gave the Post the idea that Maury was “a study in progress?” As we’ve noted, it was the school’s combined “Grade 3 and 5” passing rate in reading—a passing rate which appears at the top of the Maury “report card.” According to that pleasing statistic, 92 percent of Maury students (“Grades 3 and 5” combined) passed the “English” test last year. (Inexcusably, the state uses a confusing array of names for its Reading/Language Arts test.) We’ve described the absurd statistical procedure which seems to have yielded that pleasing statistic. For today, let’s just note that schools all across Virginia seem to have gained from this procedure. Did Christmas come early at Maury last year, transforming a 27 percent into a pleasing 92? If so, then Christmas came early at many schools—although none of them seem to have gained as much from this rate-shifting process as Maury.

Yes, all around the state of Virginia, schools boast a passing rate for “Grade 3 and 5 English” which can’t be derived from the passing rates of the two grades individually. Consider J. L. Francis Elementary in Richmond, for example. According to its school report card, 60 percent of the school’s third-graders passed the reading test last spring, along with 68 percent of fifth-graders. But the combined passing rate, at the top of its “school report card,” is much more pleasing—85 percent! This pattern obtains in schools throughout the state, including award-winning Norfolk City—although nowhere to the extent seen at Maury. As such, it does seem that this odd statistical procedure—a procedure in place since 2001—has been systematically misleading citizens in every part of Virginia. We think it’s time that the state’s big newspapers investigate and report this odd pattern.

For the record, no district seems to have gained as much from this procedure as Alexandria. Last year, the system reported results for thirteen elementary schools—and in a good number, combined “Grade 3 and 5” passing rates were substantially inflated. Example: James K. Polk Elementary School. Deep down in its school report card, we see that 67 percent of third-graders passed the reading test, along with 65 percent of fifth-graders. But at the top of the card, we get the passing rate for the two grades combined—84 percent! Ditto Cora Kelly Magnet Elementary. Deep down in the data, we see that 60 percent of third-graders passed, along with 80 percent of fifth-graders. But what do we see at the top of the card? Happy days are here again! The passing rate for the two grades combined is presented as 90 percent! But then, Christmas came early at Jefferson-Houston Elementary too, where 44 percent of third-graders passed, along with 71 percent of fifth-graders. What does it say at the top of the school’s report card? Combined passing rate, 75 percent! In these schools, as in schools all over the state, combined passing rates are substantially higher than the passing rates of the two grades at issue. No, nobody’s passing rate was jacked up more than the passing rate at Maury. But if these passing rates derive from a bogus procedure, then bogus data have been peddled all across the state of Virginia—bogus data which persistently overstate school passing rates, of course.
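For readers who want to check our arithmetic: a combined passing rate for two grades pooled together is a cohort-size-weighted average of the two grades’ rates, so it must fall somewhere between those two rates—no matter how many students sit in each grade. Here’s a minimal sketch that tests the report-card figures cited above against that bound. (The rates and reported combined figures come from the report cards discussed in this piece; the cohort sizes in the usage example are hypothetical, since enrollment counts aren’t given here.)

```python
def combined_rate(rate3, n3, rate5, n5):
    """Passing rate (percent) when grades 3 and 5 are pooled.

    This is a weighted average, so the result always lies between
    rate3 and rate5, whatever the cohort sizes n3 and n5 are.
    """
    return (rate3 * n3 + rate5 * n5) / (n3 + n5)

# (grade-3 rate, grade-5 rate, combined rate reported at top of card)
schools = {
    "J. L. Francis":     (60, 68, 85),
    "James K. Polk":     (67, 65, 84),
    "Cora Kelly":        (60, 80, 90),
    "Jefferson-Houston": (44, 71, 75),
    "Maury":             (27, 83, 92),
}

for name, (r3, r5, reported) in schools.items():
    lo, hi = min(r3, r5), max(r3, r5)
    verdict = "possible" if lo <= reported <= hi else "impossible as a simple pooled rate"
    print(f"{name}: reported {reported}%, feasible range {lo}-{hi}% -> {verdict}")
```

Every reported combined rate in the list sits above the feasible range—which is exactly why these figures can’t be the straightforward pooled rates a reader would assume them to be.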

As we’ve seen, two explanations have been offered for this odd phenomenon. When we asked about Maury’s contradictory data, Alexandria testing director Monte Dawson described an absurd statistical procedure, even sending us detailed material (apparently from a technical manual), material which explained how this bizarre procedure works. Later, in an on-line reply in the Post, Jay Mathews gave a different explanation; he described a more sensible process, but he completely misstated what Dawson had told us, and many of the statistical complexities he described simply didn’t make any sense if his explanation was accurate. Beyond that, he didn’t seem to have asked Dawson about what we had been told, although Dawson had been his original source for the story about Maury’s high passing rates. Mathews described a more sensible process—but as a piece of basic journalism, his report didn’t seem to add up.

What explains the statewide pattern of apparently inflated passing rates? Will the real explanation please stand up? Last week, we thought we’d turn this puzzling story over to real news orgs. But with news orgs seeming to drag their heels, we’ll now go back to Jay’s basic sources to try to resolve this puzzle.

But understand: Last spring, all third-graders in the state of Virginia were given the third-grade “Reading/Language Arts” test. Statewide, 77 percent of third-graders passed on that standard first testing. But at Maury, only 27 percent of third-graders passed—the second-lowest result in the state. How low was the Post prepared to go in its search for a heart-warming story? In a rational world, Maury’s performance would have been cause for major concern. In our world—a world which loves a feel-good tale—the school was a “study in progress.”

By the way—Mathews quoted Maury parents who were thrilled by their school’s “progress.” Did they know about the school’s third-grade passing rates? (The rate was also quite low in math.) Would these parents have been so pleased if they knew that their third-graders had the second-lowest score in the state? Did they know how low the Post would go to hand them a heart-warming story?

DISREGARDING ONE ALTERNATIVE: One other school scored lower than Maury in third-grade reading—the Richmond Alternative School, where zero percent of third-graders passed last spring’s reading test. But Richmond Alternative is a very small K-12 school which had no female third-graders last year. Published info is hard to come by, but it seems to have been described in the Richmond Times-Dispatch as a school for disruptive students.

WE’D LOVE TO SEE THE TIMES-DISPATCH REPORT: We’d love to see state or national orgs report on Virginia’s “school report cards.” Do citizens understand the contradictory data routinely found in these reports? We doubt it; in fact, we’re certain they don’t. We’ve been working on this topic for a month, and we’re still not sure that we know just how these dueling data are derived. How did Maury’s 27 percent become a pleasing 92? And do Richmond parents understand these matters? We’ll promise you: No, they do not.

That said, we fired up Hotmail yesterday. We sent a version of that e-mail (see THE DAILY HOWLER, 3/13/06) to these people at the venerable Richmond Times-Dispatch:

Richmond Times-Dispatch
Pam Stallsmith, State Government & Politics/Special Assignments
Louise Seals, Managing Editor
Tom Kapsidelis, Virginia editor
Jeff Schapiro, State Government
Tyler Whitley, Politics
Holly Prestridge, Education reporter
Lindsay Kastner, Education reporter
Juan Lizama, Education reporter
Olympia Meola, Education reporter
We’d love to see the Times-Dispatch report this story. No one understands those school report cards—and if Dawson gave us the straight dope, the state has been using an absurd procedure which is churning out embellished data. Or is it OK to report false scores when we’re talking about low-income kids?

BASIC LINKS: On February 2, Maury hit the top of the Post’s front page. You know what to do—just click here.

We questioned this story the following week. See THE DAILY HOWLER, 2/6/06, then click forward from there.

Post reporter Jay Mathews followed up on February 28. Click here and you can read every word.

We responded all last week—and the state of Virginia has hidden its data. See THE DAILY HOWLER, 3/7/06, for our first installment.

For a list of Alexandria schools, just click here. Keep clicking to see each school’s “report card.”