ATLANTA - The scoring errors on a standardized test used by schools to measure student performance in basic skills may never be fixed properly, state officials said Thursday.

``We don't know what the problem is,'' said Mike Harmon, program manager for test administration at the Georgia Department of Education.

``It is incredibly frustrating. And even more frustrating than it is for us here,'' Harmon said, ``is the frustration for teachers and for parents and for children.

``They worked so hard and they got these results back and they expected them to be an accurate representation, and we don't believe that it is,'' he said.

The state requires that the Stanford 9 test be given each spring to public school students in grades 3, 5 and 8 to measure their performance in math and reading.

School administrators say the test helps determine placement for students, deciding whether to put them in advanced or remedial classes. Some districts use the Stanford 9 to determine bonus money for administrators, teachers and staff.

The Stanford 9 ``is not just another test,'' said Martha Greenway, executive director of planning, research and policy with Fulton County schools. ``It's a critical measure of how our students are doing. And we use that to plan for the future.''

Some school officials have downplayed the impact, saying they have other ways to measure students' abilities, such as the state's curriculum test and the judgment of teachers and principals.

But Harcourt Educational Management, the testing company that developed the Stanford 9, has brought in three testing experts at its own expense to sort out the mess. The state paid $690,000 for the tests.

State officials first noticed a problem with this year's scores in May. In about one-third of the subject areas, scores rose or fell by large margins compared with the year before, Harmon said. A change of 1 to 3 percentage points is all that should reasonably be expected for a statewide average, he added.

Harcourt checked the tests and found they were scored accurately: no correct answers were marked as incorrect, or vice versa. The problem appears to be in ``equating'' the 2002 test with the 2001 test, making sure the two forms are of the same difficulty level so that scores on them can be compared.

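For readers unfamiliar with the term, the ``equating'' step described above can be illustrated with a toy sketch of mean-sigma linear equating, one standard psychometric method for placing scores from a new test form onto an old form's scale. The scores below are invented for illustration, and nothing here reflects Harcourt's actual procedure.

```python
import statistics

def linear_equate(new_scores, old_scores):
    """Map raw scores from a new test form onto the old form's scale
    by matching the mean and standard deviation (mean-sigma equating)."""
    mu_new, sd_new = statistics.mean(new_scores), statistics.stdev(new_scores)
    mu_old, sd_old = statistics.mean(old_scores), statistics.stdev(old_scores)
    slope = sd_old / sd_new
    return [slope * (x - mu_new) + mu_old for x in new_scores]

# Hypothetical raw scores: suppose the 2002 form was slightly harder,
# so raw scores ran five points lower than on the 2001 form.
scores_2001 = [50, 55, 60, 65, 70]
scores_2002 = [45, 50, 55, 60, 65]

# After equating, the 2002 scores share the 2001 mean and spread,
# so results from the two years can be compared directly.
equated = linear_equate(scores_2002, scores_2001)
```

If this step is done incorrectly, every reported score shifts even though each individual answer was graded correctly, which matches the pattern state officials describe.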
Harcourt spokesman Richard Blake said the company used a commonly accepted equating method, ``but in this case it produced anomalous results.''

``We are working as hard as we can to get this resolved as quickly as we can,'' Blake said. ``We are sympathetic that these problems, these delays, have caused problems for people.''

In June, the state Department of Education decided to make the test optional for districts next school year.
http://accesswdun.com/article/2002/7/192322
© Copyright 2015 AccessNorthGa.com