The Cranky Taxpayer
Until January, 2004, the Education Department web site described the accreditation requirement as follows:
As we shall see, that statement was not true. Indeed, the accreditation web page now is considerably more circumspect (in its wordy, bureaucratic way):
Even with all those caveats, you might have been inclined to credit the regulation that says "[t]he awarding of an accreditation rating shall be based on the percentage of students passing SOL tests . . . ."
In fact, the scoring only starts with the percentage of the kids who pass the test. Then we get the “adjustments.” As the critics of the test point out, “the state allows use of various accreditation-inflating strategies.” These include:
It makes sense to make allowances for disabled kids, for transfer students, and for students who do not speak English. Given that the test numbers can fluctuate, perhaps it makes sense to allow a rolling average of the scores. For a clear discussion of those adjustments, see the PAVURSOL web site.
As to the Bonus Points, however, the State Board has created two huge deceptions that can falsely inflate the scores. The bonus points and some other "adjustments" are not authorized by the regulations. In fact, some of the adjustments contradict the regulations. And for good measure, it looks like they have invented a couple of smaller "adjustments" without even a vote of the Education Board.
The Regulations provide for a "remediation recovery" program through the eighth grade for students who have flunked an English SOL test and in all twelve grades for the kids who have flunked a math SOL test. The students who go through that program and fail the retake do not count against the score for the school. The kids who finish the program and pass inflate the school's score.
Here is the way the state guidelines say it:
The passing rates on assessments administered in schools shall be calculated by dividing the total number [of] students in a school who pass the assessments (numerator) divided by the total number of students who take the assessments except that students who are re-tested and fail SOL tests in English and/or mathematics after participating in a remediation recovery program shall not be counted in the total number of students assessed.
Stated in the Mother Tongue, that means if a kid flunks the test in English or math and suffers through a “remediation recovery program” and then flunks again, he does not count as having taken the test the second time, i.e., the second failure does not lower the school's pass rate. Only the passes count in the scoring.
Of course, the pass rate for those who pass is 100% so counting only the passes inflates the score.
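The first guideline's arithmetic can be sketched as follows (the function and its names are mine; only the rule, that failed retakers vanish from the denominator, is from the guideline quoted above):

```python
# Sketch of the first guideline's arithmetic: retakers who fail after
# remediation recovery are simply removed from the denominator.
def guideline_pass_rate(total_passes, total_assessed, recovery_failures):
    return 100 * total_passes / (total_assessed - recovery_failures)

# 100 kids tested, 50 pass; 50 flunkers retake, 25 pass and 25 fail again.
# Passes = 50 + 25 = 75; tests taken = 100 + 50 = 150; failed retakes = 25.
print(guideline_pass_rate(50 + 25, 100 + 50, 25))  # 60.0, not the true 50%
```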
That Guideline is not consistent with the Board's Regulation:
At 8 VAC 20-131-280.C.4 the Regulations further tell us that "eligible students" are all those "enrolled in the school at a grade or course for which a SOL test is required" with exceptions for kids with limited English proficiency or disabilities.
The Guidelines, however, go on to provide a score inflator that appears to be contrary even to the guideline above:
That is, the kids who go through remediation recovery and pass the test are counted as passing but not as taking the test. This is even better than just not counting the remediation recovery kids who fail: This counts the passes in the numerator of the score but not in the denominator.
This can have a dramatic effect on the scoring: If 100 kids take the test and 50% of them pass, the raw pass score is, sure enough, 50%. But if 50 kids who have flunked earlier undergo “remediation recovery” and retake the test and half of them pass, the score is not 50/100 = 50%. Neither is it (50+25)/(100+50) = 50%. Nor is it (50+25)/(100+25) = 60%, a 10-point improvement over reality as provided by the first guideline above. No, it is (50+25)/100 = 75%, a 25-point improvement over reality.
Indeed, the effect can be more spectacular than this scenario suggests: Because a remediation recovery pass adds to the number of passes but not to the number of tests taken, the bonus points can produce a score greater than 100%.
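The scenario above can be sketched in a few lines of code (the function names and structure are mine; only the arithmetic comes from the text):

```python
def raw_pass_rate(passes, takers):
    """Pass rate as the Regulation describes it: % of eligible students passing."""
    return 100 * passes / takers

def adjusted_pass_rate(passes, takers, recovery_passes):
    """Pass rate as the Guidelines compute it: remediation recovery passes are
    added to the numerator, but the retakers never enter the denominator."""
    return 100 * (passes + recovery_passes) / takers

# The scenario from the text: 100 kids tested, 50 pass; 50 earlier flunkers
# retake after remediation recovery and 25 of them pass.
print(raw_pass_rate(50, 100))           # 50.0
print(adjusted_pass_rate(50, 100, 25))  # 75.0

# Because the denominator never grows, the "pass rate" can exceed 100%:
print(adjusted_pass_rate(95, 100, 20))  # 115.0
```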
In 2001 (see the Minute for 10/22/01, no longer available on line), the Board added another score booster: It voted (again without changing the regulation and this time without even modifying the Guidelines) to give the same boost for kids who are retaking and pass a SOL test for verified credit (generally high school kids, who need "verified credits" to graduate). Again, these passes add to the numerator of the score but the number of these students retaking the tests does not appear in the calculation. This also can run the score over 100%.
That, or something like it, happened in Richmond this year, where Franklin Military had a 112.5% (!) pass rate and Community High a 100.86% pass rate, both in English.
This maneuver is entirely contrary to the Regulation that provides, "Schools shall be evaluated by the percentage of the school's eligible students who achieve a passing score." These retake kids are not "eligible students" because they have taken the test as required. Thus they should not affect the score either way but they are being used to inflate the scores.
As if that were not enough, there are two little boosts and one coverup, all of which modify the regulation without being adopted as regulations. Indeed, two of these did not even enjoy a vote by the Board of Education:
In response to my Freedom of Information Act request, the Education Dept. produced a Word document that is titled "accreditation procedures/notes for 2003-2004." This document says, in part:
First, they reward the schools that permit cheating by removing the cheaters from the scoring, rather than scoring them as failures. The Board voted to approve this change (search the 9/28/00 Minute -- no longer available on line -- for "improprieties" to find it) but did not adopt the change as a regulation.
Next, the 100 point cap, which does not inflate the scores but does serve to hide their nature: The only reason to cap the score at "100% passing" is that the score in fact is not "% passing" and numbers above 100 would give the game away. Thus, to avoid embarrassing truths, the Department reports the 112.5% score at Franklin and the 100.86% at Community as "100" in each case.
Finally, even the round off is cooked, rounding up at 0.45 rather than 0.5. The folks at the Department say this is to prevent principals who are below but very near to the 0.5 point border for accreditation from tormenting the Department by "finding" a student or two who should be excluded from the scoring, thus qualifying for the round up and accreditation. They do not explain how this prevents the same gaming by principals who are within a student or two of the 0.45 cutoff.
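The rounding rule can be sketched as follows (the 0.45 cutoff is from the Department's own explanation; the implementation, applying the cutoff to the fractional part of the score, is my reconstruction):

```python
from decimal import Decimal

def department_round(score):
    """Round to the nearest whole point, but round UP from .45 instead of .5."""
    s = Decimal(str(score))  # exact decimal arithmetic avoids float edge cases
    whole = int(s)
    return whole + 1 if s - whole >= Decimal("0.45") else whole

print(department_round(69.45))  # 70 -- would be 69 under ordinary rounding
print(department_round(69.44))  # 69
```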
For these last two "adjustments," they say they consulted the Chairman of the Board. For sure, neither the Regulations nor the Guidelines authorizes these modifications of the Regulation.
As we have discussed, the Board's Regulations provide for accreditation "based on" the SOL scores. The Guidelines modify that to provide a big boost for remediation recovery. The Board voted on the retake boost for students seeking verified credits and the little boost by removing cheaters but did not incorporate them into either the Regulations or the Guidelines. The Board did not even vote on the 100% cap and deceptive roundup.
So, we have accreditation requirements in a regulation, in the guidelines, in Board minutes (but not in either the regulations or the guidelines), and in two changes the Board did not even vote upon.
The Virginia Register Act defines "guidance document" as follows:
My copy of the Administrative Process Act says "'Regulation' means any statement of general application, having the force of law, affecting the rights or conduct of any person. . . ." The Administrative Process Act also sets out a complicated process for adopting regulations. The Board did not follow that process when it adopted its Guidelines. Similarly, it did not follow the process when it adopted the other changes discussed above. For sure, the Department did not follow any process when it installed the 100% cutoff and the 0.45 roundup.
Beyond question, all the "adjustments" discussed here are statements of general application, having the force of law, affecting the rights of Virginia schools and their students. A few, such as the three-year rolling average, are in the Regulations. If the rest had been adopted as regulations they surely would be regulations. Instead, we have the Board and the Department modifying the Regulations by processes that do not amend or rescind the Regulations. Of course, if they can install accreditation requirements by fiat, they don't need the Regulations in the first place.
"Fiat" means "an authoritative or arbitrary order," as opposed to an Italian automobile or a duly adopted regulation. The other term that probably applies to these non-regulation adjustments is "unlawful."
You might think the State Education Department would put the "adjustments" on the web where every citizen can see how they are arriving at the accreditation decisions. You also might think pigs can fly.
I asked the Department for the data on their "adjustments." They gave me the "procedures" quoted above and a set of spreadsheets showing the adjusted numbers of students passing and failing each test at each school. That is not enough to figure out where all the adjustments came from.
So I asked for the data showing where those adjusted numbers came from. They replied:
It's hard to be sure what that means. It could well mean they are so ashamed of their shenanigans that they destroyed the audit trail. For sure they had destroyed the files.
To their credit, the folks at the Education Department met with me, discussed their procedures, and recreated the intermediate data for Richmond. They also fixed the description on their web page (as discussed above), insofar as their Board's regulation would allow it. That behavior was consistent with their reputation as one of the more capable (and more transparent) state agencies.
Stay tuned for an analysis of those recreated data. In the meantime, the numbers below are from the "adjusted" scores in the spreadsheets they produced earlier.
Also stay tuned to see whether they start reporting the size of the "adjustments" and whether they figure a way to stop calling the "adjusted" numbers "pass rates."
In any event, the 25-point enhancement calculated in the scenario above is not far from what happened in Richmond with some of the 2003 scores.
From the data they gave me earlier it is not possible to tease out just how they arrived at the adjusted scores. The data reveal the effects of some of the adjustments, however.
Here are the middle school accreditation numbers and the corresponding pass rates on the 8th Grade math test:
The accreditation scores average 14.2 points higher than the test scores.
Some of the kids take algebra (the number ranges from half of those taking the 8th grade test at AP Hill to 7% at Thompson) and a few take geometry (20% at Binford, fewer elsewhere). The pass rates in these advanced classes are excellent. If we include those scores, the accreditation inflation drops to 8.5 points:
The 8.5 difference, however, double-counts the smart kids (count them once when they take 8th grade math; count them again when they take an advanced math course). The fair measure of the school's failure is the raw, eighth grade score. By that measure, Boushall, Mosby, and Thompson are performing at sub-Petersburg levels; yet, with all the adjustments, all three are provisionally accredited in math.
The pass rates for the alternate assessment kids are very high, but the number of kids is small (eleven at Elkhardt, seven at Boushall, fewer elsewhere). The big boost in the "adjustments" comes from the count of the fifth grade math test:
Or, in graphical terms:
That's right folks: They are counting the passes but nowhere near all the takes. See above for their (ephemeral) authority for this.
If we look at the underlying data at Elkhardt, here is what we see:
The folks at the Department say those Fifth Grade scores are remediation recovery kids. Sure enough, the intermediate data they had destroyed show 18 5th Grade Math remediation passes.
It looks to my old eyes as if the Board's Regulations do not support this shenanigan. At 8 VAC 20-131-280.C the regs say:
The straightforward reading of that is that each school counts its own, not imported flunkers. In contrast, the Guidelines say (at pp. 1-2):
See above for a more elaborate discussion of this process of using Guidelines (or even lesser instruments) to modify the regulations.
This counting of passes but not failures can have a dramatic effect on the scoring. For example, the eighteen 5th grade scores at Elkhardt, plus ten alternate assessment passes, increased the (already inflated by the double-count) score from 61 to 75, a 13.8 point boost that took the school comfortably into full accreditation for math (If you include the alternate assessments in the overall scoring, the remediation recovery scores increase the score from 63 to 75, a 12-point boost). Yet the actual pass rate on the 8th grade test at Elkhardt was 54%, just above Petersburg performance. This is fraudulent accounting.
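The Elkhardt boost can be reproduced with the same inflated-numerator arithmetic. In the sketch below, the 28 bonus passes (18 remediation recovery plus 10 alternate assessment) are from the Department's figures; the base pass and taker counts are hypothetical, back-solved to match the 61-to-75 jump:

```python
# Hypothetical counts chosen to reproduce the Elkhardt math boost described
# above. Only the 28 bonus passes are from the Department's data; the base
# counts are inferred for illustration.
base_passes = 124       # assumed: kids counted as passing before the bonus
takers = 203            # assumed: kids counted in the denominator
bonus_passes = 18 + 10  # remediation recovery + alternate assessment passes

before = 100 * base_passes / takers
after = 100 * (base_passes + bonus_passes) / takers
print(round(before), round(after))  # 61 75 -- a boost of about 13.8 points
```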
As to middle school English, the major boost to the scores again came from the fifth grade test where, again, many more passed the test than were counted for taking it:
Turning to the remarkable 112.5% English score at Franklin: Here are the high school English scores for the end-of-course tests ("English" + writing) and the raw accreditation scores (unrounded but also not cut down to 100).
The funny business here comes from the eighth grade scores. Franklin and TJ had nice numbers of folks who passed but did not count as taking the test:
With the smaller enrollment at Franklin, the inflation was more dramatic.
On the other hand, Kennedy had a flock of eighth graders who were counted for flunking the 8th grade test. Go figure.
Again, the 8th graders came in at Franklin via the remediation recovery process:
For the history test in the elementary schools, they throw out the third grade score (per the regulations) if that will improve the school score. For 20 of 31 elementary schools, ignoring the third grade score had just that effect. At twelve of those schools, the improvement in the history score was more than five points:
Note the 18.5 (!) point boost at Patrick Henry.
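The throw-out-the-third-grade rule can be sketched in code. The pooling-by-counts mechanics below are my assumption; the rule itself (ignore grade 3 when that raises the score) is from the regulations as described:

```python
# Sketch of the "drop the third grade if it helps" rule. The pooled score is
# assumed to combine pass counts across grades; only grade 3 is droppable.
def subject_score(g3_pass, g3_take, other_pass, other_take):
    with_g3 = 100 * (g3_pass + other_pass) / (g3_take + other_take)
    without_g3 = 100 * other_pass / other_take
    return max(with_g3, without_g3)

# A school whose 3rd graders drag the history score down simply loses them:
print(round(subject_score(40, 100, 80, 100), 1))  # 80.0, not 60.0
```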
In exactly the same fashion, 15 of the 31 elementary schools got their science scores boosted by not counting the third grade scores. Six of those schools got a boost of more than five points with Broad Rock the winner at 12.3:
In a couple of years we'll see whether it was wise to reward these schools for shorting the history and science instruction in the third grade.
Here are the differences between the raw and the "adjusted" 3d Grade English scores:
Norrell is the big winner, enjoying a 33 point boost from 54.7 (the actual score) to 87.8 (the "adjusted" score). The average boost is 11.8 points.
Here are the same data for the 3d Grade Math Scores:
The big winner here is Reid, with a 40 point boost (that is NOT a typo) from 78.7% kids passing to an "adjusted score" of 118.6 (THAT is not a typo, either). The average boost is 15.9 points.
I could go on but I trust the point is clear. They are cooking the numbers and misrepresenting the inflated results as having something to do with pass rates. They are accrediting schools that, in fact, are doing a dismal job.
The net of all these "adjustments" was to take Richmond from about 11 fully accredited schools on the raw data to 23 on the adjusted data and from about 12 schools on warning to an adjusted 9.
I sure wish I could balance my bank book by this kind of process.
Another factor that adjusted the Richmond SOL scores was a very low percentage tested. Doubtless Richmond's already lousy SOL scores would be even worse if the school system tested those missing students. For reasons best known to the educrats, there is no accreditation penalty for not testing all the kids.
In contrast, the number tested has implications for Richmond's Adequate Yearly Progress under the No Child Left Behind Act. Indeed, the 91% overall test rate has prevented most of the Richmond schools from making Adequate Yearly Progress under the Act.