The Cranky Taxpayer 

Think what you will of the Federal Devils, they recognized some time back that the SOL correlates with socioeconomic status. For example, here are the Virginia 2015 reading pass rates by division vs. the division % of economically disadvantaged students:
We're not here to discuss whether this correlation suggests that more affluent families live in better school districts, whether their children are better prepared for school, whether their children have higher IQs, or whatever. The point here is that a teacher with a classroom full of more affluent kids can be a lousy teacher and still show better SOLs than a better teacher with a class of less affluent students. So, under the federal whip, Virginia from 2011 to 2014 collected Student Growth Percentiles.
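For readers who want to quantify that relationship themselves, the strength of a correlation like the one in the chart can be expressed as a Pearson coefficient. Here is a minimal Python sketch; the actual division-level pass rates and disadvantage percentages would come from the VDOE downloads, so the inputs below are hypothetical placeholders.

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical inputs: division pass rates vs. % economically disadvantaged
pass_rates = [85, 78, 72, 65, 60]
pct_disadvantaged = [20, 35, 45, 60, 70]
print(pearson_r(pass_rates, pct_disadvantaged))  # strongly negative
```

A coefficient near -1, as these placeholder numbers produce, is the pattern the 2015 reading data show: pass rates fall as the share of economically disadvantaged students rises.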
VDOE has a detailed discussion of the SGP here. In short: a kid who passed last year with a pretty good score, and who gets a score this year that is average for the statewide group of students who had the same performance last year, will score in the 50th percentile. By the same measure, a student who had an awful score last year is compared only to other students with that same score; if his performance is average for that group, he also will score in the 50th percentile. To this point, VDOE says:
Thus, the SGP measures a teacher's performance vs. other teachers with similar students. Slide 31 in the VDOE powerpoint is the Colorado data suggesting essentially no correlation between the SGP and economic disadvantage. More on that and related issues here and here.

You'd think that an enlightened Education Department would trumpet those data. Doubtless it would. But don't mistake our own State Department of Data Suppression for enlightened: Brian Davison, a parent of two Loudoun schoolchildren, had to sue VDOE and pay $1,100 to get the data, and even then with the teacher identities suppressed.

First, the dataset: VDOE has produced three sets of SGP data. They say there were errors in the first set, so I'll be looking at the 2d and 3d. The 2d set has statewide SOL and SGP data with anonymous student IDs but with the schools identified. VDOE suppressed the data for classes of fewer than ten students and for students who transferred during the school year. The 3d dataset also has anonymous student IDs (different from those in set 2, so there's no way to join the two sets) and anonymous teacher IDs, but with the schools not identified.

The 2d dataset has lots of duplicate records. Both sets 2 and 3 are missing a flock of Richmond data for 2013, especially in the middle schools. VDOE says that's because Richmond didn't report the data. Even so, there's a lot to learn from these data.

CAVEATS:
That said, here is the 2014 distribution of division average reading SGPs. On this graph, and the one below, Richmond is the yellow bar, Petersburg is red, Norfolk is blue, and Hampton is green. Excel is unable to readably list all the divisions on this graph; the list here and below omits every 2d division name. Thus, on the graph below, the yellow bar (Richmond) appears between Pulaski and Henry Counties. That's just an artifact of the program, not evidence that Richmond has disappeared.

The relatively higher SGP averages for Norfolk and, especially, Hampton tell us that, despite mediocre SOL pass rates, their students' reading scores are improving significantly year over year.

The graph above averages the results of reading tests in the five grades, 4-8 (the SOL score in Grade 3 is the starting point, so there's no third grade SGP). It turns out there is considerable variation from grade to grade. For example, Richmond:

As to math and Algebra I, Richmond does much better. As with the reading scores, the Richmond math scores plummet when the students enter middle school. Yet the state averages remain nearly flat (as they nearly should; if VDOE were not manipulating the data, the state average would be entirely flat at 50 on every test). Something uniquely ugly happens in the sixth grade in Richmond.
She replied:
Could be. The 8th Grade SGPs (which are below but approaching state average values) are based entirely on the change from previous years' middle school SOL scores, while the 7th Grade SGP scores can reach one year into elementary school and the 6th Grade SGP scores are based entirely on the change from students' SOL histories in elementary school. If Richmond's elementary SOL scores were artificially high and the middle school SOLs were low normal, the 6th graders and, to a lesser degree, the 7th graders would be starting at an artificially high SOL, so their SGP scores would show abnormally little improvement. That is, the SGP scores would be abnormally low in the sixth and, to a lesser degree, seventh grades.

As I mention above, the third SGP dataset from VDOE contains (anonymous) teacher IDs. This gives a first peek at how well, and how badly, some of Richmond's teachers are performing. With all the caveats listed above, let's start with the statewide distributions of teachers' average SGP scores in reading and math.
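The SGP mechanics described earlier (each student percentile-ranked only against the statewide group with the same prior-year score) can be sketched in a few lines of Python. This is a simplified illustration: VDOE's actual computation uses quantile regression over several years of score history, and the records below are hypothetical.

```python
from collections import defaultdict

def simple_sgp(records):
    """records: list of (student_id, prior_score, current_score).
    Returns student_id -> growth percentile, ranking each student's
    current score only against peers with the same prior score."""
    groups = defaultdict(list)
    for sid, prior, current in records:
        groups[prior].append((sid, current))
    sgp = {}
    for members in groups.values():
        scores = [c for _, c in members]
        n = len(scores)
        for sid, current in members:
            below = sum(1 for s in scores if s < current)
            ties = sum(1 for s in scores if s == current)  # includes self
            # midpoint percentile rank: the median peer lands at 50
            sgp[sid] = round(100 * (below + 0.5 * ties) / n)
    return sgp

# Three students who all scored 400 last year; the middle performer gets 50
print(simple_sgp([("a", 400, 450), ("b", 400, 430), ("c", 400, 470)]))
```

Note that the prior score, not the current one, determines the comparison group, which is why a student who started with an awful score can still earn a high SGP.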
Brian Davison points out that both distributions are reasonably symmetrical, suggesting that we do not have an unusually large number of teachers doing particularly well or poorly. That said, no parent will want a child to be subjected to the reading teacher in the first percentile, the other teacher in the second, or the three in the eighth.

We already have seen that the Richmond average reading SGP plunges from fifth to sixth grade. The Richmond distributions conform to that pattern. First, grade 5, statewide and then Richmond:
As you see, this distribution is a bit wider than the statewide distribution. That is, Richmond has relatively more excellent fifth grade reading teachers than the statewide average, and also relatively more who are not performing. Five (of sixty-seven) Richmond teachers are more than two standard deviations above the state average; three are more than two standard deviations below.

Only one of Richmond's twenty-one sixth grade reading teachers produced an average student improvement better than the state average; none was more than two standard deviations above the statewide average. Six (or seven, depending on the rounding) were more than two standard deviations below the state average, and four were more than three standard deviations below. The Richmond average is 1.5 standard deviations below the state average.
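The standard-deviation comparisons above can be reproduced mechanically once the per-teacher student SGP lists have been extracted from VDOE's third dataset. A minimal sketch, with hypothetical teachers in place of the real (anonymous) IDs:

```python
from statistics import mean, stdev

def teacher_z_scores(teacher_sgps):
    """teacher_sgps: dict of teacher_id -> list of student SGPs.
    Returns teacher_id -> (average SGP, z-score of that average
    against the distribution of all teachers' averages)."""
    averages = {t: mean(v) for t, v in teacher_sgps.items()}
    mu = mean(averages.values())
    sigma = stdev(averages.values())
    return {t: (avg, (avg - mu) / sigma) for t, avg in averages.items()}

# Hypothetical teachers: t2 sits one standard deviation below the mean
print(teacher_z_scores({"t1": [40, 60], "t2": [20, 30], "t3": [70, 80]}))
```

A teacher more than two standard deviations below the mean, in this scheme, is one whose z-score falls below -2, which is the yardstick used in the counts above.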
Turning to the math tests, here are the statewide and Richmond fifth-grade distributions.
Again, a lower average and larger numbers of good and bad teachers, but nothing startling. But see the sixth grade numbers:
The Richmond average is 1.75 standard deviations below the state average. Four of eighteen Richmond math teachers are more than two standard deviations below the state average. Only one is above the state average.

At present, parents take their kids' teachers willy-nilly. VDOE now has data that, in some cases, could tell those parents whether the teachers are effective. Yet VDOE says the "privacy" of those public employees is more important than informing the public about those employees' performance. VDOE's refusal to share these important data, bought with taxpayer dollars, is an abiding and outrageous insult to Virginia's taxpayers.

More data on teacher performance are here.

ANOTHER CAVEAT:
On another front, the 2014 SGP data tell us, again, that spending more money on schools does not produce more learning. First reading, then math:
Richmond is the gold squares; the red diamonds, from the left, are Hampton, Newport News, and Norfolk.
A Modest Proposal

SOL scores decrease with decreasing economic status of the family. Thus, the Feds have required (select the SGP Primer link) VDOE to compute a measure of learning, not family income. VDOE selected the SGP. VDOE now has three years of those SGP data that can be used to measure teacher effectiveness.

Last updated
11/12/15 