SOL Scores


The Cranky Taxpayer


The VDOE has a lovely (but verrrrry slow) front end on their SOL database that allows both summary and detailed examination of the data, going back to 2005. 


Math Disaster

After I pointed out the cancerous growth of the VGLA testing and its use to cheat on the SOL testing, notably in Richmond, the State Superintendent belatedly expressed "concern" about abuse of the VGLA.  In 2012, responding to prodding from the General Assembly, VDOE eliminated the VGLA along with the rollout of the new math tests.

As expected, the new (also tougher) math tests produced a statewide drop in the scores in 2012.  Another set of new math tests in 2019 raised the scores.

There's another nuance here: Statewide, economically disadvantaged ("ED") students pass the SOL tests at a rate ca. 20% lower than their more affluent ("Not ED") peers. Divisions with larger populations of ED students (e.g., Richmond with about 65% ED) show artificially lowered SOL average pass rates. Accordingly, the data below show the division averages for both ED and Not ED students.
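To see the arithmetic behind that, here is a minimal Python sketch of the enrollment-weighted average. The 60% ED and 80% Not ED pass rates are invented for illustration (they roughly match the ca. 20% statewide gap), not actual VDOE figures:

```python
# Sketch of how a larger ED share drags down a division's overall
# pass rate, even when the ED and Not ED rates themselves are fixed.
# Pass rates here are illustrative, not actual VDOE data.

def overall_pass_rate(ed_share, ed_rate=0.60, not_ed_rate=0.80):
    """Enrollment-weighted average pass rate for a division."""
    return ed_share * ed_rate + (1 - ed_share) * not_ed_rate

# A division at 30% ED vs. one at 65% ED (roughly Richmond's share):
low_ed = overall_pass_rate(0.30)   # 0.30*0.60 + 0.70*0.80 = 0.74
high_ed = overall_pass_rate(0.65)  # 0.65*0.60 + 0.35*0.80 = 0.67
print(f"{low_ed:.0%} vs {high_ed:.0%}")
```

Same ED rate, same Not ED rate, yet the overall average drops seven points just from the change in mix. That is why the comparisons below report ED and Not ED separately.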

The division average started back up in 2013; Richmond waited another year.  Richmond then recovered some, but the gains slid away.  Richmond enjoyed only part of the boost from the easier tests in 2019.

In 2019, Richmond had the fourth lowest math pass rate in Virginia for Not ED and third lowest for ED students, both one notch worse than 2018. (Halifax tested <10 Not ED students so had its results blocked by the suppression rule.)



English Disaster

As with the math test a year earlier, the new reading tests dropped scores statewide in 2013; Richmond scores again plummeted.

Richmond recovered some after 2013 but, again, the gains then petered out.

Richmond had the seventh lowest Not ED pass rate on the reading tests in the state in 2019, up from fourth lowest in 2018; Richmond's ED pass rate was second lowest, the same as in 2018.



The Explanation You Won't Hear from the (ex)Superintendent

The math scores dropped statewide in 2012 (and collapsed in Richmond) because HB304 made it tougher to use the VGLA to cheat.  The new reading tests in 2013 produced similar drops in the scores.  In both cases, the Richmond scores plunged, at least in part, because the new tests deprived RPS of the VGLA cheat. 

There was something else going on.  We heard tales in 2012 that the Counties had banded together to prepare for the new math test but that Richmond had done essentially nothing.  The pass rates are consistent with that: The tough new test dropped scores all over; the Counties did about the same as the State average while Richmond plummeted.

We heard the same stories in 2013 about the new reading test and, again, the scores are consistent with County preparation and Richmond lethargy.  To that point, I received an interesting email from a Richmond teacher (bowdlerized here to protect that teacher's identity):

. . . Richmond had several after school meetings that were supposed to help prepare teachers for the test. I went to all of these meetings (both in math and reading) and they were very poorly attended and they weren't helpful. . . They were mandatory, but you could tell RPS was just trying to cover their butts. That's the way it felt.

The reading test was hard. . . .

I am shocked that no one at RPS created practice online tests to help the students study and prepare in any subject, since we tested online for the first time.

The plan was always to blame the online test for the drop in scores since we hadn't taken them on the computers before.

As far as the VGLA stuff goes, I think that hit us hard. . . .

More recently we learn that the administration of the then-Superintendent did not update the curriculum for the new tests.  In light of that, the nose dive in scores is no surprise.

Those disasters are the product of the Brandon years:

Aug-02 Assoc. Superintendent
May-07 Deputy Superintendent
Aug-08 Interim Superintendent
Feb-09 Superintendent
May-13 Resigned

She retired and went off to VCU. Mere firing would not have been enough for her; I think she should have been fired and then sued for deliberately harming these kids.

But, then, her successor (January, 2014 to June, 2017) did not fix anything, either.


Don't Blame the Kids

We hear that the Richmond student population is particularly difficult because the kids are [pick your excuse].  The only excuse that I might credit is socioeconomic: Poorer kids don't perform as well in school as their better-off peers. 

The conventional proxy for socioeconomic status of a school population is the number of kids who qualify for free or reduced price lunches.  Indeed, the F/R percentage is the criterion for Title I money from the feds.  VDOE has settled on a more general measure, the percentage of students who are economically disadvantaged.  That term includes any student who "1) is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible for Medicaid, or 4) [is] identified as either Migrant or experiencing Homelessness."

The SOL database includes the numbers of ED and Not ED taking and passing the tests. So let's look at the reading pass rates as a function of the percentage of students whom VDOE identifies as "economically disadvantaged."

The least squares lines suggest that increasing ED populations are loosely correlated with decreasing Not ED pass rates (recalling, always, that correlation is not causation). 

To the point here, Richmond is the enlarged gold points; Norfolk is the red: Richmond is grossly underperforming both the fitted lines and its peer, Norfolk. 

The math data tell the same story.
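For anyone who wants to reproduce those fitted lines, here is a bare-bones least squares calculation in Python. The division data points below are invented for illustration; with the real numbers from the VDOE database, the same function gives the slopes and intercepts of the lines in the graphs:

```python
# Minimal least-squares fit (slope and intercept) of pass rate vs.
# ED percentage.  The five data points are hypothetical divisions,
# not actual VDOE data.

def least_squares(xs, ys):
    """Return (slope, intercept) of the ordinary least squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical divisions: (ED %, Not ED reading pass rate %)
ed_pct  = [20, 35, 50, 65, 80]
passing = [88, 85, 83, 79, 76]
slope, intercept = least_squares(ed_pct, passing)
print(f"pass rate = {slope:.2f} * ED% + {intercept:.1f}")
```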

In short: Socioeconomics does not provide an excuse, much less a reason, for Richmond's lousy performance.


Costly Failure

VDOE posts disbursements (and other) data in the Superintendent's Annual Report.

To get a bang per buck measure, here are the 2018 reading division pass rates plotted v. the 2018 disbursements.

Notes: The 2019 disbursement data won't be available until some time next Spring. The disbursement totals here do not include contingency, facility, or debt service spending.

The least squares fitted lines suggest that division pass rates decrease with increasing disbursements, particularly for the ED students, but the R² values tell us that the two variables are almost entirely uncorrelated.

Richmond is the enlarged gold (grossly underperforming) points.  The red are the peer cities, from the left Hampton, Newport News, and Norfolk. 

The math data tell the same story.
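For the curious, here is how that R² figure is computed: it is the fraction of the variance in the pass rates that the fitted line accounts for, so a value near zero means the line tells you almost nothing. The data points below are invented, chosen only to mimic a near-zero correlation:

```python
# R-squared for an ordinary least squares fit.  A value near zero
# means spending "explains" almost none of the pass-rate variation.
# The data points are hypothetical, not actual VDOE numbers.

def r_squared(xs, ys):
    """Squared Pearson correlation: explained fraction of variance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# Hypothetical divisions: (disbursements $K per student, ED pass rate %)
spend = [9, 10, 11, 12, 13, 14]
rate  = [70, 74, 68, 73, 69, 71]
print(f"R-squared = {r_squared(spend, rate):.3f}")  # near zero
```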


Don't Take My Word for It

Steve Fuhrmann of Charles City points out that two studies, from opposite ends of the political spectrum, reach pretty much the same conclusion as I do: Richmond is spending a lot of money and obtaining inferior results.

A 2008 study, based on 2005 data, from the Clare Boothe Luce Policy Institute, (link now broken) ranks the Virginia divisions by per student expenditure per average SOL point.  The study uses raw costs except that it corrects for the higher costs of the NoVa jurisdictions.  On that scale, Richmond is 7th from the most expensive, with a $160.54 cost that is over twice the $77.68 cost of the most effective division, Poquoson.

A more recent study from the Center for American Progress, based on 2008 data, rated school districts for productivity after controlling for cost of living and student needs.  Their interactive site (link now broken) shows Richmond at the bull's eye in the high cost, low production quadrant:


Cheating the Kids to get Better Scores

So, in Richmond we have very high cost and lousy performance.  The other Bad News is that Richmond has been inflating the SOL scores by getting rid of a third of the kids who enter high school.  The enrollment pattern by grade gives away the game:

(Data normalized to the 9th grade membership.)  Just looking at the raw enrollments, the State enrollment at Grade 12 is down 11% from Grade 9.  Richmond, however, is down almost three times as much, 32%.  In short, Richmond is culling the kids to improve its test scores. 
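The normalization is simple arithmetic. Here is a Python sketch using the percentages quoted above; the 9th-grade base of 100 is arbitrary, since only the ratio matters:

```python
# Grade-12 enrollment decline relative to grade-9 enrollment,
# the normalization used above.  The base of 100 is arbitrary.

def attrition(grade9, grade12):
    """Percent decline in enrollment from grade 9 to grade 12."""
    return 100 * (grade9 - grade12) / grade9

# Using the percentages quoted in the text:
print(f"State:    {attrition(100, 89):.0f}% decline")   # 11%
print(f"Richmond: {attrition(100, 68):.0f}% decline")   # 32%
```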

Viewed in terms of the cohort dropout rate, Richmond was worst in the state in '18 and 3.7 times the state average:

Division Dropout Rate
Richmond City 20.2%
Lunenburg County 15.2%
Manassas City 14.5%
Fredericksburg City 13.1%
King and Queen County 10.5%
Alexandria City 10.4%
Charles City County 10.2%
Norfolk City 10.0%
Manassas Park City 9.6%
Harrisonburg City 9.4%
Lancaster County 9.3%
State 5.5%

Or, compared to the peer cities:

Of course, the students who drop out don't graduate.

And Those Few Who Do Graduate . . .

The new Federal data, part of USDOE's reporting requirements under the American Recovery and Reinvestment Act, include more than just a less dishonest graduation rate: Virginia must report the number of students in the cohort who, having graduated with a standard or advanced diploma, enter a public, private, or 2-year college or university (an Institution of Higher Education, "IHE" in FederalSpeak) within sixteen months of graduation.  Here are the data for the 4-year cohort, expressed as a percentage of the cohort:

Then there are the SAT scores.

*You can see where they changed the test.


So there you have it: High cost, lousy pass rates, and poor performance.  And a history of cheating by the division and by schools.

Your tax dollars at "work."



Last updated 10/27/19
Please send questions or comments to John Butcher