AZ's tests to gauge school performance spur debate, criticism
Arizona Daily Star
April 25, 2007


By Daniel Scarpinato

Tucson, Arizona

Differing results

A disparity between how Arizona students perform on two different standardized tests is fueling debate about how Arizona matches up nationally.


TerraNova

Percentile represents the percentage of students nationally who score lower than Arizona students. Arizona fourth-graders, excluding English learners:

• Math: 58 percentile

• Reading: 52 percentile

• Language: 54 percentile

The Nation's Report Card

Percent represents how many students are rated proficient. Arizona fourth-graders:

• Math: 28 percent

• Reading: 24 percent

• Writing: 15 percent


PHOENIX — Superintendent of Public Instruction Tom Horne's oft-repeated claim that "Arizona students perform above the national average" is being scrutinized by some educators and national policy experts.

The claim was a mainstay of Horne's re-election campaign last year, but his persistent references to Arizona students' strong performance are fueling questions about the very national test he cites — the TerraNova.

The scrutiny includes questions about the way Arizona administers the exam, how the test questions are chosen and whether students and teachers are simply doing a better job figuring out what's on the test before they take it.

And some point to another exam — the National Assessment of Educational Progress, or NAEP — that shows Arizona students performing significantly below the national average.

For example, while the Horne-favored TerraNova test shows Arizona fourth-grade students doing better than 58 percent of students nationally, on the NAEP only 28 percent of Arizona students are considered proficient.

On reading, fourth-graders are in the 52nd percentile compared nationally, according to the TerraNova. But on the NAEP, only 24 percent are proficient.

"The TerraNova scores are implausibly high," says critic Matthew Ladner, vice president of research at the Goldwater Institute, a conservative think tank in Phoenix. "On TerraNova, they're knocking the ball out of the park, and on NAEP they're below average."

But Horne takes the opposite view, saying, "I consider NAEP very unreliable."

Horne points out that the NAEP uses a smaller sample size than the TerraNova, that its questions aren't necessarily aligned with Arizona's school standards and that states have different rules about testing students who are learning English.

In Arizona, English learners must be tested in English, not their native language.

What's normal?

The TerraNova is what is referred to in the education world as a nationally norm-referenced test.

The testing company — in this case, California-based CTB/McGraw-Hill — sets a national "norm" for student performance. Then students in Arizona and elsewhere take the test and are graded against that norm.

Voter-approved Proposition 203, which passed in 2000, requires the state to administer such a test.

NAEP is a standards-based test: students are measured against fixed proficiency standards, and state results are compared with the national average.

The problem with the TerraNova results, some critics say, is Arizona administers it as part of the AIMS test, a high-stakes exam used to measure school and student performance. Poor performance can result in state and/or federal intervention.

Some questions on the joint test count only for AIMS, some only for the TerraNova and some for both; the grading separates out which are which.

Because Arizona puts so much emphasis on AIMS, preparation for that test can also influence, and skew, TerraNova results, some educators say.

"The kids are just going to be more likely to take those tests seriously," says Jerry D'Agostino, a University of Arizona associate professor of education who specializes in achievement testing.

But Horne argues that because the only AIMS scores that count are for high school students who must pass to graduate, there is no extra pressure on students in lower grades to do well.

The pressure, he says, is on the system, not the students.

But Thomas Haladyna, a professor of education at Arizona State University who has helped steer Arizona's testing policies by serving on committees since the 1980s, argues the pressure translates to students. "AIMS is high-stakes," he said.

"I think there are legitimate differences of opinion, but I don't think that's one," said Horne.

Horne also cites the college entrance exams — SAT and ACT — to make the case that Arizona students are doing better than the national average.

But those scores can be misleading, Ladner and others say, since only select, college-bound students take them.

Turning up the heat

Education experts say that the practice of marrying standardized tests with school accountability measures has unintentionally led to a "Lake Wobegon effect," where everyone claims to be above average.

"The really fundamental problem is when all you do is tell people they are accountable for scores on one test, and turn up the heat on that test, you are setting up the opportunity to cut corners," says Daniel Koretz, a professor of education at Harvard University and a leading expert on achievement testing.

In Arizona, some policy experts, including Goldwater's Ladner, have criticized changes made to the AIMS test over the years — changes he and others say made it look like students are doing better but only made the test easier.

Despite the disagreements, most agree Arizona should have a way of measuring students against those in other states.

John Wright, president of the Arizona Education Association, the state's teacher lobby, says the priority should be using tests to identify if students are learning the standards Arizona has set.

"That is not identical state to state," he said.

Horne says the $2.1 million spent every year on administering the TerraNova is money well spent.

But as questions loom, others wonder what the test results are really showing parents, students and educators.

Koretz asks, "Are kids picking up skills that they still have after they leave the school system?"

● Contact reporter Daniel Scarpinato at 307-4339 or