SOL Scores

The VDOE has a lovely front end on their SOL database that allows both summary and detailed examination of the scores, going back to 2005. 

As to Richmond, the data raise the question of why the School Board settled with the (now former) Superintendent instead of firing her for incompetence and/or malfeasance.  They also raise the question of why the next Superintendent (January 2014 to June 2017) did not manage to correct his predecessor's mistakes.

 

Math Disaster Festers

After I pointed out the cancerous growth of VGLA testing and its use to cheat on the SOL tests, notably in Richmond, the State Superintendent belatedly expressed "concern" about abuse of the VGLA, and VDOE set out to eliminate it.  In 2012, the new math tests, responding to prodding from the General Assembly, eliminated the VGLA as to math.

As expected, the tougher new tests produced a statewide drop in the scores in 2012. 

[Graph: math SOL pass rates by year]

The division average started back up in 2013; Richmond waited another year, then rose for two years before falling back for another two.

The division scores minus the State average give a measure of each division's performance in relative terms.

[Graph: division math pass rates minus the state average, by year]

Richmond rose from second from the bottom in 2016 to third from the bottom in '17, courtesy of even larger pass rate drops in Petersburg and Danville.

Division                 2017 Math Pass Rate
Petersburg City          52%
Danville City            53.6%
Richmond City            54.5%
Greensville County       60%
Lancaster County         63%
Prince Edward County     64%
Brunswick County         66%
Alexandria City          66%
Covington City           66%
Martinsville City        66%
Fredericksburg City      67%

English Disaster

As with the math test a year earlier, the new reading tests dropped scores statewide in 2013; Richmond scores plummeted.

[Graph: reading SOL pass rates by year]

Richmond recovered some after 2013 but slid this year.

Versus the state, Richmond also slacked off this year and still grossly underperforms.

[Graph: Richmond reading pass rate minus the state average, by year]

Richmond had the lowest pass rate on the reading tests in the state in 2016; it advanced one notch this year courtesy of a larger decline in Greensville.

Division                 2017 Reading Pass Rate
Greensville County       57%
Richmond City            58.1%
Danville City            58.2%
Petersburg City          60%
Prince Edward County     62%
Brunswick County         63%
Buckingham County        64%
Martinsville City        65%
Harrisonburg City        65%
Hopewell City            69%

The Explanation You Won't Hear from the (ex)Superintendent

The math scores dropped statewide in 2012 (and dove in Richmond) because HB304 made it tougher to use the VGLA to cheat.  The new reading test in 2013 produced a huge drop in the Richmond scores.  In both cases, the Richmond scores plunged almost certainly because the new tests deprived RPS of the VGLA cheat. 

There was something else going on.  We heard tales in 2012 that the Counties had banded together to prepare for the new math test but that Richmond had done essentially nothing.  The scores above are consistent with that: The tough new test dropped scores all over but the Counties did fine v. the State average; indeed, Hanover improved.

We heard the same stories in 2013 about the new reading test and, again, the scores are consistent with County preparation and Richmond lethargy.  To the latter point, I received an interesting email from a Richmond teacher (bowdlerized here to protect that teacher's identity):

. . . Richmond had several after school meetings that were supposed to help prepare teachers for the test. I went to all of these meetings (both in math and reading) and they were very poorly attended and they weren't helpful. . . They were mandatory, but you could tell RPS was just trying to cover their butts. That's the way it felt.

The reading test was hard. . . .

I am shocked that no one at RPS created practice online tests to help the students study and prepare in any subject, since we tested online for the first time.

The plan was always to blame the online test for the drop in scores since we hadn't taken them on the computers before.

As far as the VGLA stuff goes, I think that hit us hard. . . .

More recently we learn that the administration of the former Superintendent did not update the curriculum for the new tests.  In light of that, the nose dive in scores is no surprise.

 

Some Details

The front end to the SOL database makes it much easier to extract data. 

Those data cast a clear light on the effect of Richmond's VGLA cheating.  We start with the reading pass rates by year.  Here are the Richmond and state data for students with and without disabilities; the second graph shows the Richmond scores minus the state scores, both with and without disabilities.

[Graphs: Richmond & state reading pass rates, with & without disabilities; second graph: Richmond minus state]

To the point here, Richmond's kids without disabilities underperformed the state average while the kids with disabilities outperformed the statewide average for students with disabilities.  Until 2013, that is, when the new test stopped Richmond's abuse of the VGLA (except for LEP students) and took the Richmond scores for students with disabilities below the state scores.  Then those kids with disabilities (and certainly some without who had been misclassified to improve their scores), who had coasted through on the bogus VGLA tests, were faced with real SOL tests.  Of course they performed badly, and kept on doing so.

The new math test in 2012, with no VGLA, confirms this picture.  Again we see Richmond's non-disabled students before 2012 generally underperforming the state average for students w/o disabilities, but its students with disabilities outperforming the state average for students with disabilities.  But the 2012 plunge in Richmond's scores, led by the students with disabilities, gives away the game.

[Graphs: Richmond & state math pass rates, with & without disabilities; second graph: Richmond minus state]

The drops in scores for students without disabilities, in 2013 for the reading test and in 2012 for the math test, are consistent with the poor preparation, mentioned above, for the new tests. 

These disasters are the product of the Brandon years:

Aug. 2002   Associate Superintendent
May 2007    Deputy Superintendent
Aug. 2008   Interim Superintendent
Feb. 2009   Superintendent
May 2013    Resigned

Under her watch, performance went from bad to appalling.

I don't think the (ex)Superintendent should merely have been fired; I think she should have been sued for deliberately harming these kids.

But, then, her successor (January, 2014 to June, 2017) did not fix anything, either.

 

Don't Blame the Kids

We hear that the Richmond student population is particularly difficult because the kids are [pick your excuse].  The only excuse that I might credit is socioeconomic: Poorer kids don't perform as well in school as their better-off peers. 

The conventional proxy for socioeconomic status of a school population is the number of kids who qualify for free or reduced price lunches.  Indeed, the F/R percentage is the criterion for Title I money from the feds.  VDOE has settled on a more general measure, the percentage of students who are economically disadvantaged.  That term includes any student who "1) is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible for Medicaid, or 4) [is] identified as either Migrant or experiencing Homelessness."

The enrollment data are available here and the SOL scores are here for both the general population and the economically disadvantaged.

So let's look at the Virginia division five-subject average pass rates as a function of the percentage of students whom VDOE identifies as "economically disadvantaged."

[Scatter plot: division five-subject average pass rate v. percent economically disadvantaged]

The data give a decent least squares fit (R² = 39%), suggesting that the ED percentage indeed correlates with the scores (recalling, always, that correlation is not causation).  On this graph, Richmond is the gold square (Richmond had the second lowest 5-subject score in the Commonwealth this year).  The red diamonds are, from the left, Hampton, Newport News, and Norfolk.

Plainly, Richmond is grossly underperforming both the fitted line and a number of other divisions that have similar socioeconomic situations.  Newport News and Norfolk, both old, urban jurisdictions, are particularly instructive. 

In short: Socioeconomics does not provide an excuse, much less a reason, for Richmond's lousy performance.

 

Costly Failure

VDOE posts disbursement (and other) data in the Superintendent's Annual Report.  The 2017 data won't be posted until sometime this spring, so we're stuck with the 2016 numbers.

To get a bang-per-buck measure, here are the 2016 division reading pass rates plotted v. the 2016 disbursements.

[Scatter plot: 2016 reading pass rate v. 2016 disbursement per student]

Note: The disbursement totals do not include contingency, facility, or debt service spending.

The least squares fitted line suggests that division pass rates decrease with increasing disbursements, but the R² of 0.5% tells us that the two variables are almost entirely uncorrelated.

Richmond is the gold square.  The red diamonds are the peer cities, from the left Hampton, Newport News, and Norfolk. 

The two points up at the top are the outperforming divisions: West Point on the left (low cost) and Falls Church on the right (high, Northern Virginia cost).

The division average expenditure is $12,001 per student.

 

Don't Take My Word for It

Steve Fuhrmann of Charles City points out that two studies, from opposite ends of the political spectrum, reach pretty much the same conclusion as I do: Richmond is spending a lot of money and obtaining inferior results.

A 2008 study from the Clare Boothe Luce Policy Institute, based on 2005 data (link now broken), ranks the Virginia divisions by per-student expenditure per average SOL point.  The study uses raw costs, except that it corrects for the higher costs of the NoVa jurisdictions.  On that scale, Richmond is 7th from the most expensive, with a $160.54 cost that is over twice the $77.68 cost of the most effective division, Poquoson.

A more recent study from the Center for American Progress, based on 2008 data, rated school districts for productivity after controlling for cost of living and student needs.  Their interactive site (link now broken) showed Richmond at the bull's eye of the high cost, low production quadrant.

 

Cheating the Kids to get Better Scores

So, in Richmond we have very high cost and lousy performance.  The other Bad News is that Richmond has been inflating the SOL scores by getting rid of a third of the kids who enter high school.  The enrollment pattern by grade gives away the game:

[Graph: enrollment by grade, normalized to grade 9 membership, Richmond v. state]

(Data normalized to the 9th grade membership.)  Just looking at the raw enrollments, the state enrollment at Grade 12 is down 13% from Grade 9.  Richmond, however, is down over twice as much: 33%.  In short, Richmond is culling the kids to improve its test scores.

Viewed in terms of the cohort dropout rate, Richmond is 11th worst in the state and almost double the state average:

Division                 Cohort Dropout Rate
Covington City           14%
Brunswick County         13%
Buena Vista City         12%
Hopewell City            12%
Giles County             11%
King William County      11%
Manassas Park City       11%
Alexandria City          11%
Lunenburg County         11%
Roanoke City             11%
Richmond City            9.9%
King and Queen County    9.8%
State                    5.3%

Or, compared to the peer cities:

[Graph: cohort dropout rates, Richmond v. peer cities]

And Those Few Who Do Graduate . . .

The new Federal data, part of USDOE's reporting requirements under the American Recovery and Reinvestment Act, include data beyond just a less dishonest graduation rate: Virginia must report the numbers of students in the cohort who, having graduated with a regular diploma, enter a public, private, or 2-year college or university (Institution of Higher Education, "IHE" in FederalSpeak) within sixteen months of graduation.  Here are the data for the 4-year cohort graduating in 2015, expressed as a percentage of the cohort:

[Graph: percentage of the 2015 cohort enrolling in an IHE within sixteen months, Richmond v. state]

I trust you got that: Even with Richmond's reporting of Maggie Walker students at schools they don't attend (you can be sure those MLW kids will graduate and do well afterward: average SAT scores in 2013 were 713 verbal, 692 math; average scholarship offer was $72,000 per student), the diploma graduates of RPS are much less successful than the state norm at getting into public universities and even community colleges. 

Then we have the 2012 high school graduates (again with real diplomas) who enrolled in a Virginia IHE within sixteen months of graduation and who completed at least one year's worth of college credit applicable to a degree within two years of enrollment in the IHE.

One can only conclude that Richmond is giving diplomas to a number of students who would not receive them in other divisions. 

Even more outrageously, as we have seen, Richmond has been boosting its scores by abusing the process for identifying and testing kids with disabilities.  To the good, the new math test in 2012 ended the abuse as to math (with a brutal cost to Richmond's scores, as set out above); the new English test in 2013 finished the job.  Richmond now has the third lowest math and second lowest reading pass rates in Virginia.

So there you have it: High cost and lousy pass rates and poor performance by those who do graduate.  And cheating to boost the numbers.

Your tax dollars at "work."

 


Last updated 09/24/17
Please send questions or comments to John Butcher