[Publisher’s Note: As part of an ongoing effort to bring original, thoughtful commentary to you here at the FlashReport, we are pleased to present this column from Lance T. Izumi, Koret Senior Fellow and Senior Director of Education Studies at Pacific Research Institute.]
If you are new to the FlashReport, please check out the main site and the acclaimed FlashReport Weblog on California politics.
As California students take new state tests aligned to the controversial and much-criticized Common Core national education standards, the State Board of Education announced that scores on the tests will not be used to hold public schools accountable for performance in the 2014-15 school year. More troubling, however, are the findings of a just-released report detailing major flaws and deficiencies in the tests themselves.
Supporters of the Common Core national standards, which replaced California’s own state education standards, claim, among other things, that student scores on the new tests, which are designed to be taken on computers, shouldn’t be used for accountability purposes this year because of the supposed rigor of the Common Core standards. State Board of Education president Michael Kirst argued, “You can’t move from low level standards to standards that make students college and career-ready overnight.” However, even pro-Common Core researchers acknowledge that the Common Core standards are weaker than California’s former state standards.
Comparing California’s previous state standards to Common Core standards, the pro-Common Core Fordham Institute concluded, “California’s [English language arts] standards are clearer, more thorough, and easier to read than the Common Core standards.” Further, “California’s [math] standards are exceptionally clear and well presented, and indeed represent a model for mathematically sound writing” and are “easier to read and follow than Common Core.”
The problem with the new Common Core tests, therefore, isn’t that they are based on stronger standards, but, rather, that they are inherently flawed.
In a report released this month, education analyst Steven Rasmussen of SR Education Associates found that, based on publicly available sample math questions, the Common Core tests: “Violate the standards they are supposed to assess; cannot be adequately answered by students with the technology they are required to use; use confusing and hard-to-use interfaces; or are graded in such a way that incorrect answers are identified as correct and correct answers as incorrect.”
For example, on a high-school sample test, Rasmussen reviewed a word problem involving two cars, driven by Justin and Kim, which travel different distances on a gallon of gas because they consume fuel at different rates. The question asks test-takers, “how far can each car travel with 1 gallon of gas.” However, instead of simply filling blanks with the correct answers, the question asks students to drag the icons for Justin and Kim’s cars onto a number line on the computer screen. The trouble then starts.
Rasmussen discovered, “if a student calculates 25 1/5 as the mileage for Justin’s car and drags the car to this spot [on the number line], Justin’s car snaps to 25 – the correct answer.” Since the question doesn’t ask students to round their answers to the nearest whole number, snapping the student’s placement to the nearest whole number on the number line creates a disturbing problem: “The test’s scoring mechanism wouldn’t know the student calculated incorrectly. Any answer a student calculates between 24 ½ and 25 ½ becomes a correct answer.” Thus, “we can’t tell what a right answer means because the [testing] system corrects many wrong answers into right ones.”
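The scoring flaw Rasmussen describes is easy to see in miniature. The sketch below is a hypothetical reconstruction, not the actual test software: it assumes the interface snaps a dragged icon to the nearest whole number before the scorer ever sees it, which is exactly why any miscalculation within half a unit of 25 is graded as correct.

```python
# Hypothetical sketch of the number-line snapping flaw (not the real
# test code): the scorer only ever sees the snapped position, so it
# cannot distinguish a correct calculation from a near miss.

CORRECT_ANSWER = 25  # miles per gallon for Justin's car

def snap_to_nearest_integer(position: float) -> int:
    """Mimic the interface snapping a dragged icon to a whole number
    (half-up rounding, so 24.5 snaps up to 25)."""
    return int(position + 0.5)

def is_scored_correct(student_answer: float) -> bool:
    """The scoring mechanism compares only the snapped value."""
    return snap_to_nearest_integer(student_answer) == CORRECT_ANSWER

# A student who wrongly calculates 25 1/5 is still marked correct:
print(is_scored_correct(25.2))   # True
# Every answer from 24 1/2 up to (but not including) 25 1/2 scores
# as correct, exactly the window Rasmussen identifies:
print(is_scored_correct(24.5))   # True
print(is_scored_correct(25.5))   # False
```

Under this assumption the test can never detect a calculation error smaller than half a mile per gallon, which is what Rasmussen means when he says the system “corrects many wrong answers into right ones.”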
Rasmussen emphasizes that such defects on the new Common Core tests are not isolated. “The poor craft,” he concludes, “is not confined to this single test”; such flaws “run through all of the [grade-level math] tests.” Further, because thousands of test questions have already been produced and deposited in the test-item databank, “Bad items will surface on tests for years to come.”
Teachers report that their students are having great difficulty with the computer interface tools provided by the tests. For example, one fifth-grade teacher videotaped her tech-savvy students taking the sample test questions and found that they couldn’t figure out how to enter a fraction on the computer-screen keypad on the tests (e.g., when one student tried to enter “5 ¼” using the keypad, the keypad automatically made the “5” the numerator of the fraction). In fact, on a wide variety of types of questions, the students were perplexed as to how to use the different computer interface tools.
In conclusion, Rasmussen says, the “tests are lemons” that “fail to meet acceptable standards of quality and performance, especially with respect to their technology-enhanced items.” He warns, “No tests that are so flawed should be given to anyone” and urges, “They should be withdrawn from the market before they precipitate a national catastrophe.”