Disability Abuse


The Cranky Taxpayer


Richmond is abusing the process for identifying and testing kids with disabilities.  VDOE is letting them get away with it.

In particular,

  • Richmond has the second-highest rate of VGLA testing (substitute tests in grades 3-8 for kids with handicaps) in the state, while the Richmond rate of VSEP testing (substitute tests in high school for kids with handicaps) is exactly zero.  Yet the eligibility criteria for VGLA and VSEP are essentially the same.  The major difference is that Richmond grades the VGLA tests while a state contractor grades the VSEP.  Thus, the VGLA is a useful instrument for inflating AYP but the VSEP is not.

  • Under federal pressure on the number of tests, Richmond reduced the number of VAAP tests (alternative tests for students with "significant cognitive disabilities") by 57% in three years through "staff development" and "working with the schools."  Manifestly, most of that 57% were students who had been misclassified.  While they were thus misclassified, however, they helped improve AYP for their schools and for the division.

  • Richmond classifies an atypical number of black schoolchildren as "disabled," which contributes to Richmond's high rate of VGLA testing and which renders Richmond's zero rate of VSEP testing yet more irregular.

  • Aside from the monitoring of racial and other disparities required by federal law, the State Board of Education is doing nothing to rectify these abuses.

Note added on 11/25/2012:  Today I discovered a nifty new database on the VDOE Web site that gives further insight into the price Richmond is paying for its abuse of the VGLA.  Data are here.

Note added on 8/19/2012:  It looks like the combined effects of HB304 and the new math test have come home to roost on Richmond and the other jurisdictions that were abusing the VGLA, as discussed below.

Note added on 1/31/10:  VDOE recently reworked its Web site and broke all of my links to that site.  I spent a snow day today correcting links.  If you find any that still are broken, please drop me an email.

Note added on 4/22/10:  The General Assembly this year passed, and the Governor signed, HB304, which requires each Superintendent and School Board Chairman to annually certify that there is a justification for every student who takes the VGLA.  The original bill, submitted by Del. O'Bannon, would have required an audit of any division that administered the VGLA to more than 3% of its students.  VDOE (to its eternal shame) lobbied against that bill and was instrumental in watering it down to the form that passed.

A VDOE press release today announced the phase-out of the math VGLA in two years and the reading VGLA in three years, and the replacement of those tests with an online test, the Virginia Modified Achievement Standard Test, aka the VMAST.  The press release quotes the Superintendent as saying: "Today's announcement is the first step in carrying out the will of the General Assembly and addressing my own concerns about overuse and misuse of the VGLA."   In light of the information below and here, one cannot but wonder why it takes a new law to excite the Superintendent's "concerns."  It seems to me that we need a Superintendent whose concerns center on Virginia's students, not the administrators who have been abusing them via these alternative testing schemes.


Note added on 4/27/10:  Having seen precious little evidence of the Superintendent's "concerns about overuse of the VGLA" mentioned in that press release, I asked VDOE for any public records, aside from that press release, that establish or comment upon that "concern."  VDOE replied today by identifying Superintendent's Memo No. 096-10, dated April 23, 2010, which requires a written justification for every student in VGLA, using a state-issued form.  The memo quotes the requirement installed by HB304.  The reply further identified Superintendent's Memo No. 041-10 of Feb. 19, 2010.  This memo mentions a VDOE staff concern about the increased VGLA numbers and announces a "series of Web conferences to assist school divisions."

The February memo is dated ten days after HB304 passed the House and three days before it passed the Senate.  The April memo is dated forty-five days after the Governor signed HB304 into law.

And then we have the Buchanan mess, where VDOE in October 2009 visited the school system with the largest relative number of VGLA participants and found a Superintendent who admitted he had "encouraged the use of VGLA as a mechanism to assist schools in obtaining accreditation and in meeting AYP targets."   As of a few weeks ago, that Superintendent still had his job; instead of firing him, the State Superintendent merely required him to write a Corrective Action Plan.

And there you have it: Despite the plethora of data about abuse of the VGLA (teacher comments going back to 2006; Carol Wolf's blog post in June, 2009; lurid numbers in their own files, see below; the mess in Buchanan), nobody at VDOE was "concerned" enough to do anything at all until the legislation was through the House and on the rails to passage.  Then VDOE offered Web conferences.  That is, they offered to help the divisions that were abusing their own students measure the extent of their own cheating.  And only after the bill was adopted and about to take effect did VDOE demand that the divisions document each VGLA decision.   This is not evidence of "concern"; this is evidence that VDOE and its Superintendent have been co-conspirators in this wicked and shameful fraud.

And, as Chris Dovi reported on April 23, VDOE is at last getting rid of the VGLA.  It seems it takes a law to make these people pay attention to their duties.

Governor: How about a new Board of Education and a new Superintendent?


Note added on 5/9-10/10:  Carol Wolf has come up with still more interesting data:

If you have a strong stomach for pathetic lies, you might want to read this report (pdf) from RPS on VGLA participation rates.  The report, required in the wake of the enactment of HB304, says that at AP Hill Middle School, "[a]ll students with intellectual disabilities participated in the VGLA or VAAP in all content areas" in 2008-09.  The reason?  "student deficits in reading and mathematics and other challenges that impact school readiness."  The response?  "Staff developments"  (sic).

At Bellevue Elementary School, all the students with an "identified intellectual disability" were in the VAAP and "most" with a learning disability or "other" health impairment" were in the VGLA.  The response again will be "staff development."

There is more in the report if you can stand it.  They all fit the pattern: Blatant cheating by the school staff and no accountability.  Of course, given the "leadership" from the state, that absence of accountability can hardly be an accident.

Then here are the Richmond data by school with the state average shown by the orange line:

Those zeros at the high schools might be thought to be a good thing until one notices that the VSEP numbers there also are zeros.


Note added on 9/24/10:

Carol Wolf just came up with the 2009-10 VGLA data.  They show that HB304 is having an effect, even before it has been in effect for an entire school year; they further show that VDOE has the ability, albeit not the will, to fix this mess.

As to VDOE: On February 19, 2010, after HB304 had passed the House, the Superintendent became "concerned about the increase in the numbers of students with disabilities participating in VGLA."  She scheduled mandatory Web conferences for those divisions with participation rates of 25% or more in VGLA reading or math.  At the time, the State rates in both tests were close to 20%, so the criterion was a (user-friendly) 125% of the State rate.  Here are the participation rates at some selected divisions at that time:

            State   Buchanan  Charles City  Hampton  N. News  Norfolk  Richmond
Reading     20.4%    51.1%       37.0%        8.4%     6.0%     8.7%    43.4%
Math        19.9%    54.7%       39.5%        7.5%     7.4%     9.0%    38.7%
Science      4.7%    49.4%       28.9%        0.1%     1.2%     1.3%    30.1%
History/SS   7.0%    47.1%       22.0%        4.1%     0.3%     1.5%    32.8%
Writing      5.8%    61.7%        0.0%        4.3%     0.0%     1.2%    40.4%

Recasting those numbers as % of the state rates produces an interesting graph:
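That recast is simple arithmetic: each division's participation rate divided by the state rate.  A minimal sketch in Python, using the figures from the table above (only the Richmond column is shown, as an example):

```python
# Recasting participation rates as a percentage of the state rate.
# Figures are copied from the table above.

state = {"Reading": 20.4, "Math": 19.9, "Science": 4.7,
         "History/SS": 7.0, "Writing": 5.8}

richmond = {"Reading": 43.4, "Math": 38.7, "Science": 30.1,
            "History/SS": 32.8, "Writing": 40.4}

# Division rate / state rate, expressed as a percentage.
pct_of_state = {subject: round(100 * richmond[subject] / state[subject])
                for subject in state}

for subject, pct in pct_of_state.items():
    print(f"{subject}: {pct}% of the state rate")
```

Richmond's reading rate, for instance, works out to roughly 213% of the state rate; the science rate, where the state figure is tiny, balloons past 600%.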

Turning to current data: Richmond now is the "leader" in abuse of the VGLA, with a VGLA tests/ADM ratio of 15.5%, 460% of the state average.

Division VGLA/ADM
Richmond City  15.5%
Sussex County  14.4%
Radford City  11.0%
Highland County  10.8%
King William County  10.0%
Halifax County  9.7%
Nottoway County  8.3%
Northumberland County  7.9%
Greensville County  7.9%
Alleghany County  7.9%
Buckingham County  7.8%
Lunenburg County  7.6%
King George County  7.4%
Prince Edward County  7.3%
Caroline County  7.2%
Charles City County  7.1%
* * *   
State 3.4%

Buchanan County, which led the list last year, is at 3.6%, just above the state average, this year.  What happened, of course, is that they got caught cheating.

Here is the entire distribution:

Richmond is the gold bar; Charles City is green; Buchanan is magenta; the red bars are Norfolk, Hampton, and Newport News.

For another view, here is the VGLA/ADM ratio by year for Richmond, Buchanan, the State, Charles City, and Richmond's peer jurisdictions:

Unfortunately, the Buchanan Superintendent who admitted he had "encouraged the use of VGLA as a mechanism to assist schools in obtaining accreditation and in meeting AYP targets" still has his picture on the Buchanan County Schools Web page and, presumably, still has his job.  But then, you already knew how feckless and pusillanimous VDOE is.

These data also let us measure the effect of the Buchanan (and, probably, Richmond next year) cheating:

In English, the Buchanan score dropped four points from 2009 to 2010.  The math score dropped six points.  Indeed, if the VGLA cheating had continued, we could expect the scores to have continued to rise, so the effect of the reduced VGLA participation probably is larger than the 4 and 6 points from that crude calculation.

To the same end, Steve Fuhrmann points out that the Charles City VGLA/ADM data for fiscal '05 through '10 track the SOL scores quite well:

Indeed, the R-squared for the fitted line (SOL vs. % VGLA) is 73% for the math test and 95% for the English test.

Next year's Richmond data should be fascinating.


Some History

You may recall that Carol Wolf had dismissed as Urban Legend the widespread reports that Richmond and other affected divisions take the SOL scores of students at Maggie Walker Governor's School and apply them to the high schools in the students' "home" districts.  When her son, a student at Walker, asked why the school was not on the US News list of outstanding high schools, she discovered that the rumors were true: her son's score was being counted for AYP and accreditation purposes at John Marshall, which he does not attend.  She also learned that Richmond is further abusing the SOL process by counting the SOL scores of the temporary students at the Richmond Alternative School (aka the CCP) at that school, not at those students' actual home schools.

Carol learned her lesson.  So, in the context of complaints from some Richmond teachers and others that RPS was abusing the alternative testing programs for kids with disabilities, she asked VDOE for the data.  The always helpful Charles Pyle at VDOE provided a spreadsheet with the test counts, by division and by subject, going back to 2001-02.  Carol is not one of the Numerati but she thinks I am, so she gave me the spreadsheet.

Pyle and Paul Raskopf of the Special Ed. Division remained kind and professional (and helpful) in the face of my follow-up data requests.  The data they provided (notably here, here, and here (pdf), and more recently here) are shocking.

Carol's comments on those data are here.



But before we turn to the numbers, let's get on top of the acronyms.  Here is a summary taken from a VDOE (Virginia Department of Education) publication (pdf).

The No Child Left Behind (NCLB) Act of 2001 requires that all students, including those with disabilities, and those with limited proficiency in English [LEP], be assessed on statewide accountability measures to determine Adequate Yearly Progress (AYP). For all students with disabilities identified under the Individuals with Disabilities Education Improvement Act (IDEIA), the Individualized Education Program (IEP) team determines how the student will participate in the accountability system. For students identified under Section 504 of the Rehabilitation Act of 1973 as amended, the 504 committee determines how the student will participate.

In Virginia, students with disabilities have several options for participating in the state accountability system. They may participate in the SOL assessments without or with accommodations in the same manner that non-disabled students participate. Students in grades 3 through 8 with disabilities that prevent them from accessing the SOL test(s) in a content area, even with accommodations, may participate in the VGLA [Virginia Grade Level Alternative].

Similarly, LEP students who are at level 1 or level 2 of English language proficiency may take the SOL reading test with or without accommodations or the VGLA for reading. The LEP team makes participation decisions for eligible students. These decisions must be documented in the 2008-2009 LEP Student Assessment Participation Plan . . . .

The VSEP [Virginia Substitute Evaluation Program] is available to students with disabilities who are enrolled in courses with end-of-course SOL assessments [required for the standard diploma] and students in grades 9-12 who need the grade 8 literacy and numeracy certification required to earn a modified standard diploma. . . . All participation decisions are the responsibility of the student’s IEP team or 504 committee. . .

Under the Individuals with Disabilities Education Improvement Act of 2004 (IDEIA), . . . students with the most significant cognitive disabilities may be assessed on state-established content standards through an alternate assessment. NCLB guidance allows states to address this instructional challenge by developing grade-level state standards that have been “reduced in complexity and depth.” This concept is referred to as aligning content level standards. The concept of aligned content level standards for students with significant cognitive disabilities has been addressed in the design and implementation of VAAP [Virginia Alternate Assessment Program]. Using eligibility criteria, the IEP team must determine participation in the SOL assessments, VSEP, VGLA, or VAAP (emphasis and links supplied).

 Here is a table (pdf at p.2) from VDOE that sets out the entire program, with the source of content standards, the testing options, and the affected grades and subject areas.

VA. Assessment Program Options for Students with Disabilities

The structure makes some sense.  The premise of the SOL is accountability, based on impartial testing.  The VGLA and VSEP extend the SOL testing process to students whose disabilities interfere with taking the test, but not with learning the material.  The VAAP further extends the process to those kids who are so impaired that the schools develop individualized standards for each student.

Of course, a school or a division can abuse such programs.  The feds early on recognized that possibility and provided a partial remedy: In December 2003, the USDOE regulation at 34 CFR 200.13(c)(2)(i) capped the VAAP pass level at 1% of the school division enrollment and at 1% of the state enrollment for purposes of Adequate Yearly Progress under the No Child Left Behind Act.  Unfortunately, the feds did not impose similar controls on the VGLA/VSEP.


Remarkable Numbers in Richmond

Note: Graphs below updated 9/17/09 to include the 2008-09 test data, where available.

At the threshold, we might expect that the students with disabilities will not test as well as the general population, even with the VGLA/VSEP/VAAP in place.  The statewide data for 2008-09 (data are here) are consistent with that expectation.


Yet in Richmond, where the division average SOL score is well below the state average, we see the students with disabilities performing well above the statewide average for children with disabilities.

Note added on 8/7/09:  The three-year history emphasizes the difference: Richmond trails the state by 8-12 points on both tests but Richmond's students with disabilities lead their peers statewide by somewhere between 3 and 11 points:


Note added on 10/13/2010:  Data by grade from the VDOE database of assessment results paint a similarly dramatic picture.  Here are the 2010 statewide and Richmond reading scores first for all students and then for students with disabilities:

Notice how the "all students" results show Richmond well below the state average except in the fifth grade, and falling off rapidly above the fifth grade.  Yet, remarkably, except for grades 7 and 8, Richmond's students with disabilities perform much better than students with disabilities statewide.  And, again, the Richmond performance plummets after the fifth grade.

The drop after the fifth grade may say something about Richmond's miserable middle schools (grades 6-8).  To the point here, however, the much better performance of Richmond's students with disabilities suggests either a remarkable population of students with disabilities or widespread cheating in Richmond.

The math scores tell pretty much the same story: Inferior performance by all Richmond students but remarkably good performance by those with disabilities.


The phenomenon extends to diplomas:  Statewide in 2007-08, 55% of graduates received advanced diplomas while 17% of graduating students with disabilities received advanced diplomas.  In Richmond, only 37% of graduates received advanced diplomas but 22% of graduating students with disabilities received advanced diplomas.

How is it that Richmond's students perform well below state average but its students with disabilities perform well above the average for students with disabilities?

  1. Richmond's students with disabilities are remarkably smarter or less disabled than the statewide average?

  2. Richmond is labeling a disproportionate number of kids as having disabilities?

  3. Richmond's testing of kids with disabilities is less rigorous than the statewide average?

The data below demonstrate #2 conclusively and strongly suggest that #3 may also be operating here. 

Let us start by examining the remarkable growth of the alternative testing program and the remarkable lack of the VSEP.

VSEP as the Poor Stepchild

The VAAP has been in place for a while; the VGLA started in 2004.  The statewide VGLA and VAAP numbers have increased dramatically over time, while the VSEP numbers have remained quite small:

Note: The data here and below are the total number of tests, not the number of affected students.

Here are the Richmond numbers, first as number of tests and second as percentage of the Virginia total.

The reason for the decrease in VAAP in Richmond will become apparent below.

Richmond has very nearly 2% of the students in Virginia.  Thus, we see Richmond administering the VAAP at about three times the average rate and the VGLA at four or more times.  In contrast, the statewide rate of VSEP testing is very low but Richmond's rate is still lower: zero.

For another view of the explosive growth of the VGLA testing, here are the tests per student data for Richmond and three peer jurisdictions by year.

For the VGLA in 2008-09, Richmond had the second-highest participation rate in the state, 4.3 times the State average.

Division  Fall Membership  VGLA Tests  Rate
Buchanan County  3399 585 17.2%
Richmond City  23202 3769 16.2%
Highland County  273 40 14.7%
Radford City  1497 210 14.0%
Sussex County  1215 158 13.0%
Nottoway County  2428 283 11.7%
Alleghany County  2896 329 11.4%
Lee County  3694 398 10.8%
Charles City County  859 84 9.8%
Prince Edward County  2615 245 9.4%
Halifax County  6026 536 8.9%
Manassas Park City  2464 218 8.8%
King William County  2212 194 8.8%
Mecklenburg County  4837 414 8.6%
Southampton County  2850 240 8.4%
Lunenburg County  1686 140 8.3%
Northumberland County  1479 122 8.2%
Charlottesville City  4060 308 7.6%
Galax City  1361 103 7.6%
Caroline County  4244 318 7.5%
* * *
State Totals 1236546 47113 3.8%

The numbers here are December VGLA counts from the VDOE spreadsheet divided by the Fall Membership.
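For anyone who wants to check the arithmetic, a minimal sketch (the counts are copied from a few rows of the table above):

```python
# VGLA participation rate = December VGLA test count / Fall Membership.
# Figures copied from the table above.
divisions = {
    "Buchanan County": (585, 3399),       # (VGLA tests, Fall Membership)
    "Richmond City":   (3769, 23202),
    "State Totals":    (47113, 1236546),
}

for name, (tests, membership) in divisions.items():
    rate = 100 * tests / membership
    print(f"{name}: {rate:.1f}%")
```

Buchanan works out to 17.2%, Richmond to 16.2%, and the state to 3.8%, matching the table.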

Here is the entire dataset as a graph:

Richmond is the gold bar there; the red bars from the left are Hampton, Norfolk, and Newport News.  The State does not report the number when it is <10, so some of the zeros on the graph (and in the part of the table not reproduced above) will be small but nonzero.

Note added on 4/30/10:  As part of VDOE's belated reaction to the VGLA scandal, they are publishing the VGLA participation rates by subject (apparently as a % of the number of kids with handicaps).  The 2008-09 numbers are here.  As part of their belated, insufficient reaction, they are requiring training in divisions where the rate is > 25%.  How about that!  Cheat on the SOLs; abuse your kids; receive training so you can figure out how much you are cheating!

Here are the data, math participation as a function of English participation:

The red points are, from the left, Newport News, Hampton, and Norfolk.  Richmond is the gold square.  The blue lines are the state averages, 20.4% for English, 19.9% for math.  The 83% R-squared should not be a surprise: In the divisions that are playing it straight, the kid who needs one test likely will need the other; in the divisions that are cheating, what boosts the score on one test likely will boost it on the other.

Only fifty-seven divisions reported testing under the VSEP in 2007 and even in those divisions the rate of testing was very low.

Bedford County  35
Orange County  22
Warren County  21
Brunswick County  18
Goochland County  16
Hopewell City  14
Buena Vista City  14
King George County  13
Smyth County  13
Fairfax County  <
Virginia Beach City  <
Chesterfield County  <
Henrico County  <
Chesapeake City  <
Loudoun County  <
Norfolk City  <
Spotsylvania County  <
Hampton City  <
Williamsburg-James City County  <
Alexandria City  <
Albemarle County  <
Rockingham County  <
Frederick County  <
Augusta County  <
Accomack County  <
Petersburg City  <
Charlottesville City  <
Prince George County  <
Washington County  <
Culpeper County  <
Caroline County  <
Tazewell County  <
Pulaski County  <
Mecklenburg County  <
Powhatan County  <
Prince Edward County  <
Carroll County  <
Halifax County  <
Fluvanna County  <
Sussex County  <
Wythe County  <
Lee County  <
Salem City  <
Fredericksburg City  <
Northampton County  <
Westmoreland County  <
Buckingham County  <
Giles County  <
Grayson County  <
Alleghany County  <
Charles City County  <
Staunton City  <
Floyd County  <
Franklin City  <
Essex County  <
Radford City  <
State Total 331

"<" indicates fewer than ten.

As to the 2007 VAAP, the RPS test rate is 14th from the highest in the state (down from first in the previous year), 1.6 times the state average.

Division  VAAP Tests  Fall Membership  Rate
King and Queen County  47 802 5.9%
Hopewell City  219 4190 5.2%
Brunswick County  110 2167 5.1%
Sussex County  57 1215 4.7%
Charles City County  40 859 4.7%
Bath County  33 733 4.5%
Dinwiddie County  194 4675 4.1%
Surry County  41 1041 3.9%
Bristol City  83 2415 3.4%
Covington City  31 918 3.4%
Rappahannock County  31 921 3.4%
Mathews County  42 1260 3.3%
Madison County  59 1870 3.2%
Richmond City  731 23202 3.2%
Winchester City  117 3802 3.1%
Louisa County  141 4738 3.0%
Charlottesville City  117 4060 2.9%
King George County  111 4066 2.7%
Prince Edward County  69 2615 2.6%
Richmond County  32 1213 2.6%
Lunenburg County  44 1686 2.6%
Petersburg City  121 4675 2.6%
* * *
State Totals 24002 1236546 1.9%

Here is the graph of the entire VAAP distribution

Again the gold bar is Richmond.  The red bars, from the left, are Norfolk, Newport News, and Hampton.

These numbers are the sum of the counts for the tests and are not directly comparable to the 1% cap.  VDOE's 2007-08 data show Richmond at 3.2 times the cap on the English test and 3.1 times the cap on the math test.  Note also that the cap is on the number of passing scores, not the number of kids in the VAAP; a division that has a 70% pass rate in the VAAP can have just over 1.4% enrolled in the VAAP without hitting the 1% cap.
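The arithmetic behind that last point: Because the cap limits passing scores rather than enrollment, the enrollment headroom is the cap divided by the pass rate.  A sketch (the 70% pass rate is the illustrative figure from the text, not a reported number):

```python
# The federal 1% cap applies to *passing* VAAP scores as a share of
# enrollment, not to VAAP enrollment itself.  A division can therefore
# enroll up to cap / pass_rate of its students before hitting the cap.

cap = 0.01        # passing VAAP scores, as a fraction of enrollment
pass_rate = 0.70  # illustrative division-level VAAP pass rate

max_enrollment_share = cap / pass_rate
print(f"{max_enrollment_share:.2%}")   # just over 1.4% of enrollment
```

The lower the pass rate, the more students a division can park in the VAAP without tripping the federal cap.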


A VCU Study Confirms That the VGLA is Easy

Note added on 8/15/09:

A VCU Study (link broken by VDOE Web page redesign; the Study appears to have been taken down) under a grant from VDOE found the math VGLA too easy by 23% and the English tests too easy by 50%.  A skeptical reading of the Study suggests that those results may be understated.

Richmond Is Abusing the VSEP

Students in our high schools face a graduation requirement to pass six end of course SOL tests: two English, one math, one laboratory science, one history and social sciences, and one elective.  That is, high school students who are candidates for a standard diploma will take on the average 1.5 end of course SOL tests per year.  The VGLA stops at the eighth grade and the VSEP provides the end of course alternative.

Indeed, aside from the availability of VGLA to some LEP students for the English test, the eligibility for VSEP is the same as for VGLA: a disability that prevents the student from accessing the SOL test, even with accommodations.  Aside from those LEP students, unless the student's IEP or 504 plan is changed between the eighth and ninth grades, the student who was eligible for VGLA in the eighth grade would almost always be eligible to participate in the VSEP when tested on the same subject in high school. 

Yet the statewide number of VSEP tests is minuscule.  Most divisions, notably including Richmond, have not administered even one test under the VSEP, despite a hefty and growing number of VGLA tests.

Doubtless some of the lag in the VSEP numbers comes from the sheer workload of starting up a complicated program.  But the VGLA is nearly as complicated as the VSEP, and the VGLA has grown like a melanoma.  The explanation is more sinister than workload: The school divisions grade their own VGLA tests (pdf at p. 15), but VDOE, through its contractor, grades the VSEP (pdf at p. 18).  Thus, from the school division's viewpoint, the VGLA is an excellent tool for boosting AYP while the VSEP is not.  As a lesser matter, the VSEP also takes more work because of the requirement for preparing an annual plan for State approval.

Note added on 7/6/09: I chatted today with a parent who offered an alternative view of the VSEP scandal.  He says the tests in the VGLA are so easy that the kids cruise through and then hit a wall at the state-graded VSEP.  So, he suggests, there is no VSEP because most of the kids couldn't pass it.

Note added on 8/18/09: The Times-Dispatch today quoted Harley Tomey, the RPS Special Ed director, for two explanations for the very small VSEP numbers.  Both explanations try to put a reverse spin on the truth:

  1. Extra work: With the VGLA, the school decides that the student qualifies and starts collecting data; with the VSEP the school must submit a plan for State approval.  Of course, the crucial element here is not the need for a "plan" but the need for State approval.  The schools are generally free to manipulate the VGLA, but the prior State approval and, more to the point, the State grading are the real reason that Virginia schools administered 32,301 VGLA tests in 2008 but only 253 VSEP tests.

  2. VSEP doesn't count toward AYP:  Tomey says "people don't take [giving the VSEP] as option No. 1" because it hurts their AYP status.  That argument overlooks recent history: In 2007, both tests counted toward AYP; that year Virginia schools administered 16,657 VGLA tests and 177 VSEP tests.  That year, also, VCU performed a study that VDOE used to validate the VGLA.  The VSEP numbers were too small to validate so VDOE asked the feds to not count the VSEP for AYP purposes.  In short, the numbers were small before the VSEP didn't count.  Tomey's lame excuse has it exactly backward.

But don't take Tomey's or my word for it: Listen to the teachers (one of whom quoted a boss for the proposition that "My dog could pass VGLA").

Note added on 8/30/09:  Often it's useful to go look at the actual requirement.  The plans of which Tomey complains can be found here.  The EOC English Reading plan, for example, is a five-page Word document that requires a description of the methods or products by which the student will demonstrate achievement of sixteen specific SOLs.  The form has room for an answer of ca. 40 words in each category.  In the context of an IEP or 504 Plan that specifies why the student cannot access the multiple-choice SOL test, this 640-word plan should not present a great burden.  Yet Tomey says it is such a burden that no high school student in Richmond has yet participated in the VSEP, even though RPS administered over 3000 VGLA tests in 2006-07.

And just think: We pay Tomey to peddle this malignant nonsense.

 This perverse situation shows up in the graduation rate.  The real "graduation rate" is the rate of regular (standard plus advanced) diplomas.  The School Report Card and the Virginia Special Education Performance Reports disclose those rates for 2007-08:

               Richmond   State
All Students     60%       80%
w/ Disability    32%       44%

Here we see that two-thirds of Richmond's students with disabilities are dropping out or receiving one of the less-than-standard degrees.  These students do not have to pass the six end of course SOL's required for a standard diploma, so the lesser diploma becomes another perverse tool for boosting AYP.  There are costs, however.  This failure of Richmond's educational system leaves Richmond far short of the 45% graduation rate that is the Special Education Performance Target under the IDEA.  Even more to the point, it denies two-thirds of Richmond's high school special education students access to the VSEP as an alternative path to a regular diploma.

Note added on 7/16/09: As a further perverse incentive, students taking the VSEP are not counted toward either the participation (the 95% minimum) or proficiency (scoring) calculations for AYP because of the very small numbers.  This shines a brutal spotlight on our schools: If they were focused on their students' needs, they would offer VSEP as a path to the standard diploma, whether or not the scores counted toward AYP.  If they were cynically interested in improving AYP, and if they thought the VSEP test were easier than the SOL (or more subject to gaming), they would run up the numbers to the point where VDOE could validate the tests and count the scores.  Yet in 2007-08 Richmond gave 3,391 VGLA tests and zero VSEP tests; statewide the schools administered 32,301 VGLA tests but only 253 VSEP.  These numbers show beyond any quibble that (1) most schools are gaming the VGLA but realize they can't cook the VSEP, and (2) they are focused on AYP and not on the needs of their students.

Note added on 7/29/09: The graduation data on the VDOE Web site quantify the disgraceful effect of the easy VGLA and the absent VSEP.  But first, a reminder as to the various kinds of diplomas and alternatives:

  • Standard Diploma: What the name implies.  Requires six verified credits (student must pass end of course SOL's).

  • Advanced Studies Diploma: Tougher requirements, including nine verified credits vs. six for the standard diploma.

  • Modified Standard Diploma: Reduced requirements (basically 8th grade numeracy and literacy) for some students with disabilities.  Notice that the school system makes the eligibility determination after the student's 8th year, i.e., after the VGLA ceases to be available.

  • Other Diplomas: See the Web site.  All carry less significance than even the modified standard diploma and do not require passing the six EOC SOL's.  Note that the GAD is rare; only six in the state last year.

  • ISAEP: Alternate Education Plan for students sixteen years and older; this appears to be a second chance program leading to the GED.

Richmond gives fewer advanced studies diplomas than average and, with the exception of ISAEP, more of all the others.


Notice that the modified standard and certificate rates took off at the same time as the VGLA.  Overall, notice further Richmond's poor performance as to advanced studies diplomas, its overproduction of inferior degrees, and its total failure to offer the ISAEP "second chance." 

Diploma       Ric/State   Ric-State
Standard         120%         8%
Advanced          58%       -21%
Modified         302%         5%
Special          264%         5%
GED              412%         4%
ISAEP              0%        -3%
Certificate      349%         1%
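The two columns are a ratio (Richmond's rate as a percentage of the state rate) and a difference (in percentage points). Given both, one can back out the approximate rates each row implies; a sketch for three rows of the table, rounding error and all:

```python
# Back out the implied rates from the two columns in the table above:
# ratio = r/s and diff = r - s  =>  s = diff / (ratio - 1), r = ratio * s.
# Inputs are the published (rounded) table values, so outputs are rough.
rows = {
    # diploma: (Ric/State as a fraction, Ric-State in percentage points)
    "Standard": (1.20, 8),
    "Advanced": (0.58, -21),
    "ISAEP":    (0.00, -3),
}
for name, (ratio, diff) in rows.items():
    state = diff / (ratio - 1.0)
    richmond = ratio * state
    print(f"{name}: state ~ {state:.0f}%, Richmond ~ {richmond:.0f}%")
```

So, for example, the table implies a statewide standard-diploma rate of roughly 40% of completers against roughly 48% in Richmond.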

Graph added 1/10/10:

To get some further context, here are the rates for these various outcomes (degrees as % of completers) in 2008 for Richmond and three peer jurisdictions:

As you see, these other old cities give standard diplomas at rates higher than the state average and higher even than Richmond; they issue advanced diplomas at lower than the state rate but at a higher rate than Richmond.

Richmond's very high rates of granting lesser degrees persist in all areas except the ISAEP.  There, Richmond, being rid of some difficult kids, does not trouble itself to offer those kids a second chance that can't affect AYP.  Otherwise, as we see here, all those VGLA kids have to go somewhere; all too often, "somewhere" is someplace other than a standard or advanced diploma.

We saw above that an unusually large percentage of Richmond's students with disabilities receive advanced diplomas; yet the data here show that Richmond's portion of all completers receiving advanced diplomas is only 58% of the state rate.  That is, Richmond students overall do not earn advanced diplomas at nearly the state rate, even though Richmond's students with disabilities earn an unusually large number of them.  At the same time, Richmond's graduation rate (standard + advanced) for students with disabilities lags the state rate by 13%; that missing 13% winds up with inferior degrees.

All this, of course, is consistent with a focus on AYP but not on Richmond's students: Some of Richmond's VGLA students don't belong in the program and they prove it by obtaining advanced diplomas.  Many of Richmond's large number of VGLA students need the VSEP; in its absence they disappear into the modified standard or other lesser degrees.

[End of 7/29 note.]

Note added on 8/5/09:  Follow this link to see the effect of the missing VSEP on the on-time graduation rate.

The remedy, of course, is for our feckless State educational bureaucracy to rigorously audit the VGLA program of any division that does not offer a flourishing VSEP.  Or, still more directly, have the contractor grade the VGLA as well as the VSEP tests.  The current process of random audits (pdf at p. 20) is a blatant failure (for one thing the audits utterly fail to control the practice of testing and retesting until the student gets the right answer).

Alas, this systematic and perverse failure to offer VSEP is not the end of the matter.


VAAP and the Cap

We asked VDOE about Richmond's ongoing high VAAP count.  They replied in an email:

Beginning with the 2008 administration, the VDOE required school divisions that exceeded the one percent cap on the VAAP to overturn scores so that the pass rate did not exceed the cap.

That seems to say that, until 2008, VDOE ignored the federal law and allowed Richmond to report unlawful data. 
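For the mechanics of the cap: under the federal rule, VAAP passing scores may count as proficient only up to 1% of all students assessed; passes beyond that number are "overturned," i.e., recorded as not proficient.  A sketch of the arithmetic, with hypothetical counts (neither figure below is Richmond's actual number):

```python
import math

# Hypothetical inputs for illustration only.
students_assessed = 23_771  # total students tested in the division
vaap_passes = 450           # VAAP passing scores submitted

cap = math.floor(0.01 * students_assessed)  # at most this many may count
overturned = max(0, vaap_passes - cap)      # passes recorded as failing
print(cap, overturned)  # 237 213
```

The perverse incentive is obvious: every student moved from the VAAP into a category where passes do count makes the cap less binding.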

To its belated credit, however, VDOE summarily denied RPS's inept 2008 and 2009 requests for an exception to the cap.

By letter of June 12, 2008, signed by Deputy Superintendent Brandon on behalf of Superintendent Jewell-Sherman, RPS said: "This letter is to request an exception for to the one percent cap . . ."  In the third paragraph, these people who supervised the teachers who taught Richmond's kids continued their assault upon the Mother Tongue: "Additionally, students with disabilities tend to remain in RPS for there entire educational career . . ."(emphasis supplied in both cases).

The logic of the letter is even worse than the English.  The letter argues that approximately 10,000 "school-age students" in Richmond are not enrolled in RPS.  It then calculates that if these 10,000 were at RPS, the special education (i.e., kids with disabilities) percentage would drop from 19.4% to 13.3%.  This assumes, incredibly and with no supporting data whatever, that none of the 10,000 has a handicap. 
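The arithmetic is easy to check.  Using the 2007 division counts from the table further down this page (RPS presumably used slightly different enrollment figures, hence its 13.3%), a sketch of the letter's calculation:

```python
# RPS's argument: add 10,000 non-enrolled "school-age students" to the
# denominator and assume, without evidence, that none of them is disabled.
members, disabled = 23_771, 4_622  # 2007 counts from the table below
extra = 10_000                     # the letter's non-enrolled students

current = disabled / members
assumed = disabled / (members + extra)
print(f"{current:.1%} -> {assumed:.1%}")  # 19.4% -> 13.7%
```

The conclusion follows only from the unsupported zero-disability assumption; if the 10,000 had disabilities at anything like the statewide 13.7% rate, the Richmond percentage would barely move.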

By letter of July 28, 2008, State Superintendent Cannaday denied the request, saying: "A three member review team examined your division's supporting documentation and determined that the data did not meet the conditions set forth in Superintendent's Memo. No. 12 . . . ."

Extra Credit Question: Does the Cannaday letter violate the requirement in the Administrative Process Act that a case decision inform the party "of the factual . . . basis for an adverse decision"?

By letter of January 30, 2009 from Superintendent Brandon, RPS requested relief from the cap for 2008-09.  The English in this letter is better but the logic is not; the letter again assumes that the school-age children in Richmond who do not attend RPS are wholly without disabilities.  VDOE rejected the request on April 9.

The 2009 letter also contains a remarkable admission (another hat tip to Carol):

The table below shows that the number of VAAPs (sic) participants since 2006 has decreased by 57.3%.  This has been done through staff development and working with schools to ensure that IEP teams are following the VDOE VAAP participation guidance document and are using the Learner Characteristics Inventories as part of the IEP team decision making process.

The table indeed shows a 57% overall drop in the total VAAP participation, along with a 67% decrease in mental retardation handicaps in the VAAP:

Note: The data are for students taking the English and math tests; the available State data confirm that the VAAP numbers in Richmond are equal for the two tests and they are comfortably close to the Brandon numbers.

Year      Reading   Math   Brandon
2005-06     503      503     504
2006-07     454      454     454
2007-08     389      389     391
2008-09                      215
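The letter's percentage checks out against the Brandon numbers above:

```python
# The decrease claimed in the Brandon letter, from its own figures.
brandon = {"2005-06": 504, "2008-09": 215}
drop = (brandon["2005-06"] - brandon["2008-09"]) / brandon["2005-06"]
print(f"{drop:.1%}")  # 57.3%
```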

If, as Ms. Brandon wrote, Richmond could achieve this kind of decrease simply by "staff development" and "working with schools to ensure that IEP teams are following the VDOE VAAP participation document and are using the Learner Characteristics," we can be certain that the system earlier was overloaded with kids who had been misclassified, i.e., did not in fact have significant cognitive disabilities.  (Even I am not cynical enough to consider the alternative: That they kicked out kids who qualified for and needed the VAAP, solely to improve AYP.  But then, maybe I am merely naive.)

What do you suppose happens to the kids from Richmond's shrinking VAAP?  Either they go into the VGLA or, if in high school, they contribute to RPS's atrocious graduation rate.

Thus, we see that even VDOE's belated and user-friendly enforcement of the 1% cap can produce a 57% shrinkage of the VAAP in Richmond in three years.  We are left to wonder why VDOE is not doing something about the VGLA/VSEP.


Inventing Disabilities?

The data support RPS's assertion that Richmond has an unusually large number of kids with disabilities.  Unfortunately, the same data suggest that RPS invented many of those disabilities (as they admit they did under the VAAP).

For a start, here from the State Web site are the 2007 counts of students with disabilities, by division, expressed as a percentage of the 2007 Fall membership.

Richmond again is the yellow bar and Norfolk is red.  The green, from the left, are Chesapeake, Hanover, Henrico, and Chesterfield.

Division Members Total Disab. % Disab.
Prince Edward County Public Schools         2,666         556 20.9%
Buchanan County Public Schools         3,475         693 19.9%
Richmond City Public Schools       23,771      4,622 19.4%
Lee County Public Schools         3,694         711 19.2%
Halifax County Public Schools         6,101      1,140 18.7%
Covington City Public Schools            950         177 18.6%
Highland County Public Schools            283           51 18.0%
Russell County Public Schools         4,424         796 18.0%
Alleghany County Public Schools         2,914         522 17.9%
King and Queen County Public Schools            839         148 17.6%
Franklin County Public Schools         7,529      1,325 17.6%
Chesapeake City Public Schools       40,046      7,028 17.5%
Dickenson County Public Schools         2,533         444 17.5%
Scott County Public Schools         3,996         694 17.4%
Winchester City Public Schools         3,734         640 17.1%
Southampton County Public Schools         2,932         501 17.1%
* * *
State   1,231,506  168,441 13.7%

Looking at the top of the list, it might appear that, aside from Richmond and Chesapeake, the large numbers are peculiar to small school divisions.  A graph of % disabilities vs. enrollment provides a more nuanced picture:

Richmond is the gold square; Norfolk is the red diamond.  The green diamonds are, from the left, Hanover, Chesapeake, Henrico, and Chesterfield.  Having big, old Fairfax on the graph squashes the data; if we expand the axis to push Fairfax (and the lesser large divisions, Prince William, Va. Beach, Chesterfield, and Loudoun) off to the right, we see that the small divisions come in both high and low.  Of course, more scatter is to be expected when the numbers are small.  Interestingly, the middle-size divisions (ca. 10,000) tend to report low numbers of disabilities.  In any event, Richmond and Chesapeake are plainly unnatural.

BTW: The low-lying points are Clarke (2,226, 7.3%), York (12,844, 9.6%), and Stafford (26,594, 8.9%).

The least squares fit tells us that the reported percentage decreases by about 0.01 percentage point per thousand-student increase in enrollment, but the R² tells us that the two variables are essentially uncorrelated.
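For readers who want to reproduce the fit from the VDOE tables, here is a sketch of the computation.  The five points below are only the handful of divisions named on this page, not the full 132-division data set, so the slope and R² printed here are illustrative of the mechanics, not of the result reported in the text:

```python
# Ordinary least squares of % disabled on enrollment, plus R².
# Points: (fall membership, % with disabilities) for divisions quoted above.
points = [
    (2_226, 7.3),    # Clarke
    (12_844, 9.6),   # York
    (26_594, 8.9),   # Stafford
    (23_771, 19.4),  # Richmond
    (40_046, 17.5),  # Chesapeake
]
n = len(points)
sx = sum(x for x, _ in points)
sy = sum(y for _, y in points)
sxx = sum(x * x for x, _ in points)
sxy = sum(x * y for x, y in points)

slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in points)
ss_tot = sum((y - sy / n) ** 2 for _, y in points)
r_squared = 1 - ss_res / ss_tot

print(f"slope per 1,000 students: {slope * 1000:.3f}, R^2: {r_squared:.2f}")
```

Run against the full data set, the same computation yields the tiny slope and near-zero R² described above.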

A graph comparing Richmond and three similarly old, urban jurisdictions with the state average emphasizes that Richmond is unusual among its peers.

2007 % w Disab. for Urban Divisions

The State data further break out the disability counts by disability.  Here are those counts, again expressed as percentages of the Fall membership:

2007 Disability/Member by type and by division

As you see, the counts in the four divisions are similar to the statewide counts, except that Richmond is high (by over a third) in specific learning disabilities, very high (double the state rate) in emotional disturbance and developmental delay, and astoundingly high (over three times the State rate) in mental retardation. 

Before going much further, we should account for the peculiar racial composition of Richmond's public schools: Blacks are 20% of the Virginia population, 57% of the Richmond population, and 88% of the membership in RPS.  Fortunately, the data on the VDOE Web site give the school populations by race and sex and Excel's pivot table quickly enough breaks that out by division.  Mr. Raskopf of VDOE provided the statewide disability data (pdf) for 2008, broken out for each division by disability, sex, and race.  So let's look at the black students.
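The same breakdown the text describes doing with Excel's pivot table can be sketched in pandas.  The column names and counts below are hypothetical stand-ins; the actual VDOE extract will be shaped differently:

```python
import pandas as pd

# Hypothetical miniature of the VDOE membership extract: one row per
# division/race/sex combination, with a student count.
enrollment = pd.DataFrame({
    "Division": ["Richmond City", "Richmond City", "Norfolk City", "Norfolk City"],
    "Race":     ["Black", "Black", "Black", "Black"],
    "Sex":      ["M", "F", "M", "F"],
    "Count":    [10_400, 10_200, 8_100, 7_900],  # made-up illustrative counts
})

# Pivot: divisions down the side, sexes across the top, summed counts inside.
by_division = enrollment.pivot_table(
    index="Division", columns="Sex", values="Count", aggfunc="sum"
)
print(by_division)
```

Dividing a like-shaped pivot of disability counts by this membership pivot gives the rates graphed below.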

We saw above that the disability rates in Richmond, the State, and three older, urban jurisdictions reveal a very high rate in Richmond.  The same data, by sex, for the black students show the very same pattern:


Or, combining the two graphs:

As you see, Virginia's school systems find over twice as many disabilities among their black male students as among the black females.  The same is true in Richmond.  In Richmond, however, for both males and females the numbers are much larger than for the state or the other old, urban jurisdictions.  Plainly something is going on in Richmond.

For another look, here is the 2008 number of black males with disabilities in each school division expressed as a percentage of the number of black males.

Richmond is the gold square; the red diamonds from the left are Newport News, Hampton, and Norfolk; the green diamond is Chesapeake.

Again, we see the expected scatter where the membership (here of black males) is small; again we see a trend to about the average as the number of students in the division increases; again we see Richmond and Chesapeake sitting far above the trend.

Indeed, the Richmond numbers may be even more aberrant than they appear because Richmond is utterly failing to find any VSEP-eligible disabilities in its high schools.

One last datum on the question of race: Let's compare the disability rates for black males with those for white males:


Or, combining the two graphs:

Richmond Is Abusing the Process

The Richmond data are plainly abnormal.  The reason could be a pattern and frequency of disabilities in Richmond's schoolchildren remarkably different from those in other old cities and the state at large.  Or it could be that RPS is cooking the numbers.  William of Occam and experience elsewhere would counsel the latter explanation.

Likewise, Richmond's recent success in reducing its VAAP numbers, particularly the mental retardation count, suggests that the count is high because of the counters, not the counted.  That explanation also is consistent with the reports we hear from some teachers that too many school divisions are prone to slap the "disabled" label on kids who are disruptive or slow to learn.  It also is congruent with the division's incentive to get those kids tested to a less stringent standard than the regular SOL test.  The explanation also fits the grading scheme: The local division gets to grade the VGLA while VDOE's contractor grades the SOL.

In light of Richmond's history of gaming the system (suspending students at a scandalous rate, stealing SOL scores from Maggie Walker, ignoring the truancy laws, encouraging cheating by the teachers), it should be no surprise that RPS would invent disabilities.  Moreover, in light of VDOE's history of acquiescence in Richmond's cheating (see, e.g., here, here, here, here, and here) and its opaque and corrupt accreditation system, it is no surprise that VDOE is not exercising its considerable authority to regulate RPS.

The Watchdog as Lapdog

Indeed, aside from a manifestly ineffective system of random audits (pdf at p. 15) of the VGLA and a blinkered look for disparities under the IDEA, VDOE is not even looking to see whether RPS is mishandling Richmond's schoolchildren.

Note added on 11/14/09: Chuck Pyle at VDOE kindly provided data on the Audit.  It turns out they also audit the VAAP.  Here are the numbers for Richmond, Fairfax, and the State for the last three years:

Interestingly, these data suggest a 10-15% uncertainty in the grading process in the state-graded VAAP.  To the point here, the Fairfax data for the VGLA are consistent with the notion that Fairfax is far advanced in the process of abusing the VGLA.  Still more to the point, these data suggest that the problem with the VGLA is not so much the grading as the repeated testing:

A student can do the same work sheet over and
over again until they get it perfectly. No one is to know and
they don't want to know. They just want a "pass".

Let's start with the authority of the Board of Education:

  • The Board of Education is charged with general supervision of the public school system.

  • The "core" of Virginia's educational program is the SOL's, created under the first Standard of Quality.  The Board of Education has authority to seek compliance with the Standards of Quality and can sue to compel compliance.

  • The Board has the duty and authority to see that the truancy laws are enforced.

  • The Board can fine, suspend, or remove a division superintendent for "sufficient cause."

When we shared a preliminary analysis of the VGLA and VAAP data with VDOE, they pointed out their (belated) enforcement of the 1% cap on VAAP, discussed above.  Regarding the VGLA they said:

VDOE provides annual training to school divisions on the VGLA. The training includes criteria for students' participation in the VGLA as well as information on collection of evidence and scoring. The training is provided to division directors of testing and directors of special education and others at the discretion of the school division. The department is currently reviewing statewide data and is planning specific technical assistance to school divisions that are significantly above or below the state average. This assistance will include strategies for reviewing local VGLA participation rates and demographic factors.

So, they are "planning" technical assistance, not delivering it.  That assistance, when it arrives, will include "strategies for reviewing" participation rates, not an audit to be followed by a demand to stop gaming the system.

It gets worse: Even where VDOE is required to look for problems they are looking with their blinkers on. 

The regulations under the federal Individuals with Disabilities Education Act ("IDEA") require the State to have policies and procedures to prevent "inappropriate overidentification" (sic) of disabilities if the basis is race or ethnicity.   More specifically, 34 CFR 300.173 requires

The State must have in effect, consistent with the purposes of this part and with section 618(d) of the [Individuals with Disabilities Education] Act, policies and procedures designed to prevent the inappropriate overidentification or disproportionate representation by race and ethnicity of children as children with disabilities, including children with disabilities with a particular impairment described in §300.8.

and 34 CFR 300.646(a) provides

Each State that receives assistance under Part B of the Act . . . must provide for the collection and examination of data to determine if significant disproportionality based on race and ethnicity is occurring in the State and the [school division]s of the State with respect to—

(1) The identification of children as children with disabilities, including the identification of children as children with disabilities in accordance with a particular impairment described in section 602(3) of the Act;

(2) The placement in particular educational settings of these children; and

(3) The incidence, duration, and type of disciplinary actions, including suspensions and expulsions.

The Virginia reports responding to this mandate are here.  For 2007-08, the Richmond report says that there was no disproportionate representation in special education and no disproportionate representation in specific disability categories.  Read on down the report and you'll see this: "Data Source: School division submission."

More specifically, the Annual Performance Report for 2007-2008 (pdf) to the USDOE says VDOE performed a "level one" data analysis in six disability categories.  That found 101 of the 132 school divisions with possible disproportionate representation.  Then:

If a school division was identified in the level one analysis, the division was required to review individual student records for the racial/ethnic group(s) identified in the level one analysis. This record review required use of a checklist that allowed the school division to identify violations of procedural or regulatory requirements related to the identification of students for any of the six designated disability categories (emphasis supplied).

The specific elements of the review are here (pdf at p. 10).  In short, Richmond merely had to say that its classifications were regular on their face.  Thus, when Richmond flunked the level one screen in 2006 and 2007, there was no requirement that Richmond (or anybody else!) audit the process to show that the underlying data were valid.

So, if the numbers look bad, VDOE in effect measures whether a school division has its thumb on the scale by having that division do a record review to be sure it subtracted the tare.  And, it seems, VDOE accepts the answer without any independent verification, despite data in Richmond's case that fairly shout that something is amiss.

BTW: The numbers do look bad.  Here, for example, is an excerpt from the VDOE's 2007 level one screen:

And, aside from its blinkered review of the IDEA data, we saw above that VDOE is doing next to nothing about Richmond's obvious abuse of the VGLA/VSEP.

Indeed, it is clear that VDOE has adopted as much as it can of "don't ask, don't tell."  As with the 1% cap, when something is in the public record they sooner or later have to tell.  But it's clear that, unless the Governor or USDOE cracks the whip, they won't be asking about this lousy situation in Richmond or about those funny numbers in Chesapeake and Stafford.

Note added on June 23, 2009:

I was checking a link when I was reminded that VDOE this year is collecting truancy data (the number of unexcused absences for each student).  I asked whether those data are available on the Web and VDOE replied: "[F]or the first time, data concerning the number of unexcused absences is being collected by the Virginia Department of Education (VDOE) for the 2008-2009 school year to establish a potential baseline for future federal reporting requirements" (emphasis supplied).

There, in a nutshell, is the attitude of our educational bureaucracy:  A potential baseline for future federal reporting requirements causes them to go get the data.  Their need for data to enforce the truancy laws they are required to enforce fails to disturb their regulatory torpor.

So, as I said at the top: Richmond is abusing the process for identifying and testing kids with disabilities.  VDOE is letting them get away with it.

Your tax dollars at work.

Note added on 1/31/10:  For reasons that are not yet clear, VDOE performed an audit of the Buchanan County schools last October.  The results here show a shocking and deliberate abuse of the VGLA testing.



Last updated 09/29/13
Please send questions or comments to John Butcher