As with the math test a year earlier, the
reading tests dropped scores statewide in 2013; Richmond scores again plummeted.
Richmond recovered some after 2013 but, again, the gains
then petered out.
Richmond had the seventh lowest Not ED pass rate on the reading tests in the state
in 2019, up from fourth lowest in 2018; Richmond's ED pass rate was second lowest, the same as in 2018.
The Explanation You Won't Hear from RPS
The math scores dropped statewide in 2012 (and
in Richmond) because of the new, tougher tests; the new tests
also made it tougher to use the VGLA
to cheat. The new
reading tests in 2013 produced similar drops in the scores.
In both cases, the Richmond scores plunged, at least in part, because the new tests deprived RPS of the VGLA cheat.
There was something else going on.
We heard tales in 2012 that the Counties had banded together to prepare for
the new math test but that Richmond had done essentially nothing. The
pass rates are consistent with that: The tough new test dropped scores all
over; the Counties did about the same as the State average while Richmond plunged.
We heard the same stories in 2013 about the new
reading test and, again, the scores are consistent with County
preparation and Richmond lethargy. To that point, I
received an interesting email from a Richmond teacher (bowdlerized here to
protect that teacher's identity):
. . . Richmond had several after school meetings that were
supposed to help prepare teachers for the test. I went to all of
these meetings (both in math and reading) and they were very
poorly attended and they weren't helpful. . . They were
mandatory, but you could tell RPS was just trying to
cover their butts. That's the way it felt.
The reading test was hard. . . .
I am shocked that no one at
RPS created practice online tests to help the students study and
prepare in any subject, since we tested online for the first
time. . . . The plan was always to blame the online test for
the drop in scores since we hadn't taken them on the computers
before. . . .
As far as the VGLA stuff goes, I think that hit
us hard. . . .
More recently we learn that the RPS administration of the time
did not update the curriculum for the new tests. In light of
that, the nose dive in scores is no surprise.
Those disasters are the product of the then Superintendent.
She retired and went off to VCU. Retirement was too
good for her; I think she should have been fired and then sued for deliberately harming these kids.
But, then, her successor (January 2014 to June
2017) did not fix anything, either.
Don't Blame the Kids
We hear that the Richmond student population is
particularly difficult because the kids are [pick your excuse]. The
only excuse that I might credit is
socioeconomic: Poorer kids don't perform
as well in school as their better-off peers.
The conventional proxy for socioeconomic status
of a school population is the number of kids who qualify for
free or reduced price lunches.
Indeed, the F/R percentage is the criterion for
Title I money from the feds. VDOE has settled on a more general
measure, the percentage of students who are
economically disadvantaged. That term includes any student who "1)
is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible
for Medicaid, or 4) [is] identified as either Migrant or experiencing
Homelessness."
The SOL database includes the numbers of ED and
Not ED taking and passing the tests. So let's look at the reading pass rates as a function of the percentage of students whom VDOE identifies as "economically disadvantaged."
The least squares lines suggest that increasing
ED populations are loosely correlated with decreasing Not ED pass rates (recalling, always, that
correlation is not causation).
To the point here, Richmond is the enlarged
gold points; Norfolk is the red: Richmond is grossly underperforming
both the fitted lines and its peer, Norfolk.
The math data tell the same story.
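For readers who want to reproduce the fitted lines, here is a minimal sketch of the least squares fit. The division figures below are invented placeholders to show the mechanics, not the actual VDOE data:

```python
# Sketch of the least squares line for pass rate vs. ED percentage.
# The numbers below are illustrative placeholders, NOT actual VDOE division data.
import numpy as np

ed_percent = np.array([20, 35, 45, 60, 75], dtype=float)   # % economically disadvantaged
pass_rate = np.array([88, 84, 82, 78, 74], dtype=float)    # Not ED reading pass rate

# Fit pass_rate = slope * ed_percent + intercept
slope, intercept = np.polyfit(ed_percent, pass_rate, 1)

# Pearson r; correlation, not causation
r = np.corrcoef(ed_percent, pass_rate)[0, 1]

print(f"slope={slope:.4f}, intercept={intercept:.1f}, r={r:.3f}")
```

With the real data the correlation is only loose; these placeholder points are nearly collinear, so r comes out close to -1.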
In short: Socioeconomics
does not provide an excuse, much less a reason, for Richmond's lousy performance.
VDOE posts disbursements (and other) data in the
Superintendent's Annual Report.
To get a bang-per-buck measure, here are the 2018 division reading
pass rates plotted v. the 2018 disbursements.
Notes: The 2019 disbursement data won't be available
until sometime next spring. The disbursement totals here do not include
contingency, facility, or debt service spending.
The least squares fitted lines suggest that division pass
rates decrease with increasing disbursements, particularly for the ED
students, but the R² values
tell us that the two variables are almost entirely uncorrelated.
Richmond is the enlarged gold points (grossly underperforming).
The red points are
the peer cities; from the left: Hampton, Newport News, and Norfolk.
The math data tell the same story.
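For anyone who wants to check the (non)correlation themselves, here is a sketch of the R² calculation. The disbursement and pass rate figures are invented placeholders, not the Annual Report numbers:

```python
# Sketch of the R² check for pass rate vs. per-student disbursement.
# Figures are invented placeholders, NOT the actual Annual Report data.
import numpy as np

disbursement = np.array([9500, 10200, 11000, 12500, 14000], dtype=float)  # $ per student
pass_rate = np.array([80, 76, 82, 74, 79], dtype=float)                   # ED pass rate

slope, intercept = np.polyfit(disbursement, pass_rate, 1)
predicted = slope * disbursement + intercept

# R² = 1 - SS_residual / SS_total
ss_res = np.sum((pass_rate - predicted) ** 2)
ss_tot = np.sum((pass_rate - pass_rate.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"slope={slope:.6f}, R²={r_squared:.3f}")
```

A slightly negative slope with an R² near zero, as here, is the "almost entirely uncorrelated" pattern described above.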
Don't Take My Word for It
Steve Fuhrmann of Charles City points out that two studies, from opposite
ends of the political spectrum, reach pretty much the same conclusion as I
do: Richmond is spending a lot of money and obtaining inferior results.
A 2008 study, based on 2005 data, from the
Clare Boothe Luce
Policy Institute (link now broken) ranks the Virginia divisions by per student
expenditure per average SOL point. The study uses raw costs except
that it corrects for the higher costs of the NoVa jurisdictions. On
that scale, Richmond is 7th from the most expensive, with a $160.54 cost
that is over twice the $77.68 cost of the most effective division, Poquoson.
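The study's metric is simple division: per student expenditure divided by average SOL score. Using the two per-point costs quoted above (the only real numbers here), the "over twice" claim checks out:

```python
# Per-point costs taken from the Luce Institute study; the rest is arithmetic.
richmond_cost_per_point = 160.54   # $ per average SOL point
poquoson_cost_per_point = 77.68

ratio = richmond_cost_per_point / poquoson_cost_per_point
print(f"Richmond spends {ratio:.2f}x what Poquoson does per SOL point")
```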
A more recent study from the
Center for American Progress, based on 2008 data, rated school districts
for productivity after controlling for cost of living and student needs.
Their interactive site (link now broken) shows Richmond at the bull's eye in the high cost, low performance corner.
Cheating the Kids to Get Better Numbers
So, in Richmond we have very high cost and lousy performance.
The other Bad News is that Richmond has been
inflating the SOL scores by
getting rid of nearly a third
of the kids who enter high school. The
enrollment pattern by grade gives away the game:
(Data normalized to the 9th grade membership).
Just looking at the raw enrollments, the State enrollment
at Grade 12 is down 11% from Grade 9. Richmond, however, is down
almost three times
as much, 32%.
In short, Richmond is culling the kids to
improve its test scores.
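The normalization is straightforward arithmetic; here is a sketch, with placeholder enrollment counts chosen only to reproduce the percentages reported above (they are not the actual membership numbers):

```python
# Percent decline from Grade 9 to Grade 12 membership.
# Enrollment counts are illustrative placeholders, NOT actual VDOE membership
# data; they are chosen to reproduce the reported declines (state 11%, Richmond 32%).
def attrition(grade9, grade12):
    """Percent decline from Grade 9 to Grade 12 membership."""
    return 100.0 * (grade9 - grade12) / grade9

state = attrition(100_000, 89_000)
richmond = attrition(2_000, 1_360)
print(f"state: {state:.0f}%  Richmond: {richmond:.0f}%  ratio: {richmond / state:.1f}x")
```

That ratio, about 2.9, is the "almost three times" in the text.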
Viewed in terms of the
cohort dropout rate, Richmond was worst in the state in '18, at 3.7
times the state average:
Or, compared to the peer cities:
Of course, the students who drop out don't take the SOL tests.
And Those Few Who Do Graduate . . .
The new Federal data, part of USDOE's
reporting requirements under the American Recovery and Reinvestment Act, include
data beyond just a less dishonest graduation rate:
Virginia must report the numbers of students in the cohort who, having
graduated with a standard or advanced diploma, enter a public, private, or
2-year college or university (Institution of Higher Education, "IHE" in FederalSpeak) within sixteen months of graduation. Here are
the numbers for the 4-year cohort, expressed as a percentage of the cohort:
Then there are the SAT scores.
*You can see where they changed the test.
So there you have it: high cost, lousy pass rates, and
poor performance. And a history of cheating by the
division and by schools.
Your tax dollars at "work."