VCU Study

The Cranky Taxpayer


In August 2006, VDOE hired VCU to review the "alignment" of the SOL tests with the Standards of Learning.  The Study (the link to the VDOE site is now busted; the study apparently has been taken down; a pdf copy is here) included an assessment of the VGLA tests for Grades 3-8 in reading and math.

The report, dated January 8, 2007, concludes as to the VGLA:

The results of the VGLA Mathematics Grades 3-8 review indicated that generally the alternative portfolio assessments were aligned with the SOLs. The results suggest that range of SOLs demonstrated in the portfolio collections at each grade level accurately reflected the SOLs. The agreement between the DOK levels of the portfolios and that of the SOLs was generally consistent for Grades 3-8.

The alignment of the VGLA Reading Grades 3-8 portfolio collections was also quite good with regard to the coverage of the SOLs or range-of-knowledge. The evidence included in the portfolios generally addressed all of the SOLs at the specified grade levels. However, the inconsistency between the DOK levels demonstrated in the portfolios and those expressed in the SOLs, is a concern at Grades 3, 5, 6, and 8. The evidence included in these collections was generally at a lower cognitive level than that expressed in the related SOLs.

To translate this into English, we need to know that "alignment" involves two questions about the VGLA: (1) Whether the content measured in the VGLA test reflects the content required by the Standard of Learning (remember the VGLA is supposed to be at grade level and cover the same material as the SOL), and (2) whether the depth of knowledge ("DOK") presented in the VGLA evidence demonstrates the depth of knowledge required by the SOL, i.e., whether the test is tough enough. 

With that under our belts, we can state the two questions in the Mother Tongue:

  • Does the VGLA portfolio cover all the material in the SOL?

  • Is the test tough enough?

Thus translated, the report says:

  • The math test is pretty good, but

  • The English test is too easy.

Unfortunately, these conclusions considerably understate the problems the Study found in the VGLA.  Moreover, the Study protocols raise a serious question whether the Study, even so, was merely a whitewash afloat in a sea of jargon.


The Results are Worse Than the Summary Admits

The Study evaluated a set of "randomly selected" VGLA portfolios for Depth of Knowledge (again, "DOK"), i.e., whether the portfolio was as cognitively demanding as the Standard.  The DOK was adequate if >50% of the test items had DOK levels at or above the corresponding SOLs.  The standard was "weakly" met at 40 to 50% and not met below 40%.
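The Study's three-way scoring rule is simple enough to state as a few lines of code. This is only a sketch of the thresholds described above; the function name and percentage interface are mine, not the Study's:

```python
def dok_verdict(pct_at_or_above):
    """Classify a category's DOK alignment under the Study's thresholds.

    pct_at_or_above: the percentage (0-100) of test items whose DOK level
    is at or above the level of the corresponding SOL.
    """
    if pct_at_or_above > 50:
        return "met"          # adequate: more than 50% at or above
    if pct_at_or_above >= 40:
        return "weakly met"   # 40 to 50%
    return "not met"          # below 40%

print(dok_verdict(60))  # met
print(dok_verdict(45))  # weakly met
print(dok_verdict(30))  # not met
```

Note that under this reading, a category sitting exactly at 50% is only "weakly met," since the Study requires more than 50% for full credit.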

The Range of Knowledge similarly measured the extent to which the VGLA portfolios demonstrated the span of knowledge required by the Standard.

As to the math test, the report measures depth and range of knowledge in five areas.  Here is a summary of the results, where "<" denotes "weakly met":

              Depth              Range
           No   <   Yes      No   <   Yes
Grade 3     1        4                  5
Grade 4     1        4                  5
Grade 5     2        3                  5
Grade 6              5                  5
Grade 7     2        3                  5
Grade 8     1        4                  5
Total       7       23                 30

That is, the Study reports that the Depth of Knowledge required by the math VGLA tests was insufficient in seven of 30 respects, i.e., 23% of the time.  Thus we see that the testing rigor the report calls "generally consistent" in fact flunks almost a quarter of the time.

Viewed otherwise, a kid taking the VGLA can have his math scores boosted by up to 23% because of the easier tests.

If you think a 23% score boost would make the VGLA popular with a school or division focused on AYP, stay tuned:

Turning to the English test, because of the nature of the material there the report measures depth and range of knowledge in only two reporting categories ("Use of word analysis strategies and information resources" and "Demonstrat[ion of] comprehension of printed materials").  Here is a summary of the results:

              Depth              Range
           No   <   Yes      No   <   Yes
Grade 3     1        1                  2
Grade 4              2                  2
Grade 5     1        1                  2
Grade 6     2                           2
Grade 7              2                  2
Grade 8     1   1                       2
Total       5   1    6                 12

As to the depth of knowledge measured by the VGLA portfolios, the report says that the VGLA portfolios were "generally at a lower cognitive level than that expressed in the related SOLs."  In fact, as we see from the table, the VGLA tests were too easy in 50% of the categories measured.  That is, the "lower cognitive level" of the English VGLA tests offers the students a score boost of up to 50%.

In summary:

               Math                   English
Reported DOK   Generally sufficient   Generally lower than SOLs
Measured DOK   Insufficient by 23%    Insufficient by 50%
Effect         Up to 23% easier       Up to 50% easier

Is it any surprise that the schools are discovering handicaps and offering the VGLA at a prodigious rate?


Friendly Grader?

When the report characterizes a 23% score boost as "generally consistent" with the SOL, we have to wonder how tough the grader is.

The circumstances surrounding the Study reinforce our sense of wonder:

  • VDOE awarded the $61,200 grant for the Study to VCU, apparently without any competitive process.  Note added on 8/21/09: VDOE confirmed yesterday that there was no competitive process.  The Procurement Act permits this when the contractor is a State agency.  And, as we see below, the senior author of the Study is a known quantity at VDOE.

  • One of the authors is a former Fairfax history teacher.  She is the author of a paper that concludes "serious reconsideration must be given to the use of high-stakes consequences in current statewide testing programs."  That is, this author may well be an enemy of the SOL program.

  • The other author is Chairman of the Department of Foundations of Education at VCU.  He has received at least fourteen other grants from VDOE, three from Chesterfield, and three from Richmond.

  • VDOE selected, hired, and paid the Virginia teachers and administrators who performed the evaluations underlying the Study.

  • The report is not clear as to who "randomly selected" the VGLA portfolios to be analyzed.  The suspicious among us might think that VDOE did the selecting.  If so, the authors of the report have no way to know whether or not the portfolios were cherry-picked.  Note added on 8/20/09: The helpful folks at VDOE tell me that the portfolios came from the group selected by Pearson for audit and found to have been graded accurately; they wanted good portfolios so the alignment study would not be biased by data that manifestly did not reflect the underlying SOLs.  In any event, VDOE selected the subset of the Pearson portfolios, so the authors (and we) still don't know whether the portfolios in the Study were cherry-picked.  That defect, even standing alone, should disqualify the Study for its purpose of validating the VGLA tests.  Moreover, this information renders false the Study's statement that the portfolios "were randomly selected from those submitted for the 2005-2006 academic year."

  • The Study performed some rudimentary calculations as to the precision of the evaluations of the VGLA tests but it used the evaluations even where the agreement was poor.  More seriously, the Study did not report any effort to measure the accuracy of those evaluations.

So to validate the VGLA tests VDOE turned to the local university and two faculty members who are thoroughly embedded in the culture (supervised by VDOE) that finds it acceptable to slap a "handicap" label on kids in order to boost the SOL scores.  The Study used Virginia teachers and administrators (selected by VDOE) to do the evaluations and, apparently, did not even attempt to measure the accuracy of their evaluations.

If there is any surprise here it is that the Study discovered and reported that the VGLA tests are too easy.

- - -

Note added on November 12, 2009:

On Aug. 20, I chatted with Charles Pyle and Dr. Shelley Loving-Ryder of VDOE.  VCU withheld the underlying data, claiming that all the information had been given to VDOE and that they (VCU) no longer had it.  Pyle and Loving-Ryder said they had the report, five appendices that have not been made public (but that, based on VDOE's description of their content, do not interest me), and two boxes of forms filled in by the evaluators.  They said they do not have the spreadsheet or database for which the grant to VCU paid $3,000 in data-entry costs.

The folks at VDOE have always played straight with me, and it is not credible that these VCU scholars would give up the only copy of their data, so I thought VCU was stonewalling.  I escalated the inquiry over there to the General Counsel. 

The letter reproduced below summarizes both the problem and the current situation:


                                                                                                November 12, 2009

Hon. Arne Duncan
Secretary of Education
U.S. Department of Education
400 Maryland Ave., SW
Washington, DC  20202

Hon. Patricia I. Wright
Superintendent of Public Instruction
Post Office Box 2120
Richmond, Virginia 23218-2120

RE: Potential Defects in Virginia Alignment Study for the VGLA

Dear Secretary Duncan and Superintendent Wright:

Certain aspects of Virginia’s alignment study[1] of the Virginia Grade Level Alternative (VGLA)[2] and certain circumstances surrounding it suggest that the VGLA portion of the study may understate the lack of alignment, particularly as to depth of knowledge (DOK).

The circumstances begin with the following:

  • The number of VGLA tests administered in Virginia has exploded since the test first was administered in 2004, while the number of VSEP tests (the VGLA analog for students in high school) has remained minuscule.  For example, in 2008-09, Virginia schools administered 47,113 tests under the VGLA but only 331 under VSEP.  Indeed, because of the small number of VSEP tests, Virginia has been unable to assess the VSEP alignment and has abandoned VSEP testing for AYP.

  • One parent explained this disparity by saying that the tests in the locally-graded VGLA are so easy that the kids cruise through and then hit a wall at the state-graded VSEP.

  • A number of Virginia teachers[3] complain that children are misclassified into the VGLA, that the tests are notoriously easy, and that cheating is rampant, all in order to improve school and division SOL scores and AYP performance.

  • Data in Fairfax[4] and Richmond[5] support the teachers’ allegations.

In this context, the Study concludes that the DOK of the math VGLA tests reviewed was too easy in 23% of the categories and that of the reading tests in 50%.  Yet there is evidence that even these shocking numbers may be understated:

  • The Study claims that the VGLA portfolios studied were “randomly selected.”  Yet the Virginia Department of Education (VDOE) tells me that it selected the portfolios from those that had been evaluated as satisfactory in its audit process.[6]  Thus, the Study did not use “randomly selected” portfolios, but rather portfolios selected by an entity the Study failed to identify and preselected to be “good” data.

  • Virginia Commonwealth University (VCU) performed the Study under a grant from VDOE.  When I asked VCU for the database and statistical analyses that underlie the Study, they responded that all the underlying materials had been forwarded to VDOE.  VDOE, however, told me that they only had the grading sheets.

  • When I turned again to VCU, they admitted that they indeed had retained the underlying data, and they refused to produce those data, claiming that they are “proprietary.”[7]

Even if VDOE made its selection randomly, the universe of the study was not all VGLA portfolios but those that the audit found to have been graded satisfactorily.  Thus, the alignment analysis comes from a study that falsely reports the origin of its biased source of raw data.  Beyond that, VCU first lied to me about whether it had the database et al. and then refused to produce those data that were paid for with public funds, produced at a public institution, and used to validate a testing program in Virginia’s public schools.

It seems to me that the only explanation for these anomalies is that the data I requested will disclose further defects in the Study.   

I suggest that you demand that VCU produce those data both to you and to me so that we all can see for ourselves whether VCU’s remarkable behavior is, as it appears, a clumsy attempt at a coverup of defects in Virginia’s testing program.

With kindest regards, I am,

                                                                                                John R. Butcher



From: John Butcher [address deleted]

To:      David L Ross/HSC/VCU <>
Cc:      Susan T Ferguson/HSC/VCU <>, "Pyle, Charles (DOE)" <>
Date:   11/02/2009 09:45 AM
Subject:          FOIA Request


Thanks for your note.

I don't want to revisit the legal issues but I should caution you to not read any kind of concession into my last note.  The very notion that VCU would or could hide the data that underlie that public study report, paid for by public money, is somewhere between hilarious and offensive.

You offer by way of settlement a hard copy of "the statistical reports" that underlie the study if I will agree not to publish the data on my website.  That offer is unacceptable in at least four respects:

1.  Lack of specificity.  The term "the statistical reports" does not specify the nature of the reports or even whether VCU is offering all the "statistical reports."

2.  Hard copy.  The only way to analyze hard-copy data is to reenter the data.  I have no intention of reentering data that surely are already in computer-readable form.

3.  Restricted scope.  VCU has not specified the documents being withheld (in further violation of the Act, of course), so I have no idea of what I would be giving up.  For certain, the grant paid $3,000 for data entry, so there is a database (most likely in Excel or Access format) that you do not appear to be offering.  I attach the original list of requests (less the contract Ms. Lepley provided) for your reference.

4.  Secrecy.  The purpose of my FOIA request was to find out whether the VCU study for VDOE was, as appears likely from the face of the Study, a cover-up and to revisit the statistical analysis of the data that the report hints at but does not describe in any useful detail.  VCU's false response to my request for the data (that the responsive records had been turned over to VDOE) enhances the inference of a cover-up.  Your offer further buttresses that inference: If the data in fact support the conclusions in the study, VCU should be willing, nay happy, to see those data published where they can be subjected to analysis by the public.  Thus, I must conclude that VCU is hiding these data because it knows that the disclosure will be an embarrassment.  Stated otherwise, I am not willing to discuss my analysis of these data without providing access to the data; I am shocked that VCU does not take the same position.

If you would like to chat about this please give me a call.  If VCU wishes to make a more substantial offer, please let me have it by next Tuesday, November 10.  If we have not resolved this matter by then, or at least begun a more substantial conversation, I will (1) write the US Secretary of Education to complain about his reliance upon this flawed study that is founded upon secret data and (2) consult my lawyer about a suit to enforce FOIA.  Once those consultations begin, any settlement will have to include 125% of my attorney's fees (to cover the fees and the taxes I would have to pay on them).

I see you are out of town until Wednesday.  By copy of this to Ms. Ferguson I request under the Freedom of Information Act all public records created between August 9, 2009 and Ms. Lepley's false reply in which any officer or employee of VCU discusses, responds to, or refers to my FOIA request of August 9, 2009.  In particular, and without limitation, I request any email or other document in which Ms. Abrams or Mr. McMillan advises Ms. Lepley that the records that respond to the Aug. 9 request have been sent to VDOE.

Thanks again for your kind help with this interesting issue.


PS: It was miserably hot in Florida but I had the pleasure of the company of two of my siblings and the acquaintance of a number of remarkably stupid fish.  If you are having half as much fun now as I had then, you are enjoying yourself.


  • All summaries of data regarding the alignment of the VGLA.

  • All records that establish or comment upon the protocol for selecting VGLA portfolios for inclusion in the Study.

  • All records that identify the VGLA portfolios selected for inclusion in the Study.

  • All records that show, in the words of the Report, that "90 percent of all DOK ratings were within 1 point of each other."

  • All records that define or explain the meaning of the statement in the report that "90 percent of all DOK ratings were within 1 point of each other."

  • All records that establish or comment upon any statistical calculation regarding the validity of the ratings included in the Study.

  • All records of peer or other reviews of the Report.

  • All records that establish, report, or comment upon quality control or accuracy of the alignment reviews, whether by comparison of alignment reviews with those performed by Pearson, by evaluation of alignment reviews by VCU staff or faculty or students, or other measures taken to determine the accuracy of the alignment reviews performed for this study.

  • All records containing comments received from VDOE or others regarding drafts of the Study or regarding the Study itself or any study reports or draft study reports not contained in the Report.

  • All drafts of the Report and of any study reports not contained in the Report.

  • All study reports not contained in the Report.  In particular, but without limitation, all reports concerning the Study that were intended to be confidential or of limited distribution.


David L Ross/HSC/VCU wrote:


Thanks for your note.  I certainly have been somewhat dilatory in not having gotten back to you before now.  I have been incredibly busy and your issue kept getting pushed aside.  I'm trying to clean up enough things to allow me to leave town tomorrow, though, so this is a good opportunity to let you know where I think things currently stand.

After having spoken with VCU's researchers and its sponsored program director, and, without trying to rehash what has been said before, I believe all here remain comfortable with the positions that have been taken to date relative to your FOIA request - and I believe your last message had seemed to acknowledge the legitimacy of the proprietary/research exemption that previously was claimed.   While you also have raised the lack of timeliness of the assertion of those exemptions since they were not cited in the initial response to your request, I believe we all understand that the initial response that no documents of the type requested actually existed at VCU was an honest response by those who were responding - based upon the information that had been shared with them as of that point in time.  It was only after you reiterated your request, after having checked with the State Department of Education, that it was confirmed that certain records had been retained that previously had not been specifically disclosed.  Under these circumstances, we think it is unlikely that a court would rule that VCU effectively was estopped from raising a legitimate exemption once all the facts were known.

All this having been said, and in view of VCU's research data policy to which you made reference in a recent email, I now am advised that VCU is willing to supply a printout of the statistical reports that you have been seeking - which, I understand, would enable you to "look behind" the published study results.  There is concern about your past practices of posting data, etc. on your website/blog, however, and, the request, therefore, is that if the referenced reports were to be supplied, they would not be posted on your website.

This seems to represent a decent compromise of the issue we have been addressing - so let me know what you think.

I'll be out of town for a few days - returning to the office next Wednesday - so I'm copying Susan Ferguson so she will know the status of this matter and can be available to speak to any questions you may have.

Thanks - and I hope your time in Florida has been and will be enjoyable (apart from the heat/humidity).


David L. Ross
General Counsel
Virginia Commonwealth University
VCU General Counsel's Office
16 N. Laurel St.
P.O. Box 843090
Richmond, VA 23284-3090
Phone (804) 828-6610
Facsimile (804) 828-6614

The information contained in this email message and any attachments may be privileged, confidential, and protected from disclosure.  If the reader of this message is not the intended recipient, you are hereby notified that any dissemination, distribution, or copying of this communication is strictly prohibited.  If you have received this email in error, please notify me immediately by replying to this message and deleting it from your computer.  Thank you.



Note added on Nov. 19, 2009:

VCU responded yesterday to my FOIA request for the internal documents leading to their earlier (false) response that the database et al. that underlay the Study all had been sent to VDOE.  The response includes an email from Ms. Abrams that appears to be the source of the false information.

Where does this leave us?

  • The VCU study says that the portfolios it reviewed were "randomly selected."  VDOE says that it selected the portfolios from those that had not been overturned by the contractor's audit.

  • The audit last year overturned ca. 31% of the audited portfolios:

               Number       %
    Upheld       3104   69.3%
    Lowered       759   16.9%
    Raised        619   13.8%
    Total        4482

  • So the study found that 23% of the upheld 69.3% were too easy as to math (assuming for the moment that VDOE selected randomly from the non-overturned portfolios). 

  • The Study reports that the English portfolios were too easy in 50% of the upheld portfolios (same assumption).

    Thus, we can estimate the rates of bogus tests:

            Math:            16.9% + 23% (69.3%) = 33% bogus

            English:        16.9% + 50% (69.3%) =  52% bogus

    NOTE: This assumes that the overturn rates were the same for both math and reading, since we only have overall numbers from VDOE. 

    • Why would anybody use a test that gives false results half the time?

  • One of the study authors told VCU that all the files related to the Study had been sent to VDOE; now VCU admits that it still has the files. 

    • What do the misleading and probably false statement about "randomly selected" portfolios and the author's false statement to her University about the files say about the believability of the Study?

  • VCU refuses to produce the files because, VCU says, they are "proprietary." 

    • Why would VCU participate in a coverup?
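The bogus-rate arithmetic above is easy to verify.  Here is a short sketch (the variable names are mine; the overturn figures are from VDOE's audit numbers, and the too-easy rates are the Study's; as noted, the overturn rate is assumed equal for math and reading):

```python
# Portfolios the audit graded down count as bogus outright; of the upheld
# portfolios, the Study's too-easy fraction also counts as bogus.
lowered = 759 / 4482    # ~16.9% of audited portfolios graded down
upheld = 3104 / 4482    # ~69.3% of audited portfolios upheld

too_easy = {"Math": 0.23, "English": 0.50}  # Study's too-easy rates among upheld portfolios

for subject, rate in too_easy.items():
    bogus = lowered + rate * upheld
    print(f"{subject}: {bogus:.0%} bogus")
# Math: 33% bogus
# English: 52% bogus
```

The "Raised" portfolios are left out of the bogus count here, matching the text's calculation, which treats only lowered scores and too-easy upheld portfolios as defective.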

On this record, I suggest that only a school administrator or VDOE bureaucrat could say without blushing that the VGLA testing is not a wicked and shameful fraud.

Note added on Jan. 9, 2010:  Superintendent Wright replied on Nov. 30.  The letter doesn't really say anything but it hints at some further questions I should be asking.  For instance, in the third paragraph:

Although participation rate differences between VGLA and VSEP are to be expected due to significant programmatic distinctions, VDOE is currently analyzing participation data and will work with school divisions to address issues that are identified.

  • "Programmatic distinctions"?  Sure.  For instance LEP kids in the VGLA but not VSEP.  But enough to explain a 140:1 VGLA/VSEP ratio?  Only to a bureaucrat who doesn't care about the truth.

  • "Currently analyzing."  You mean this outrage wasn't obvious to you three years ago?

  • "Work with school divisions"?  You mean you're not going to make them stop this abuse?

The Smoking Gun!

Note added on 1/27/10:  Today I learned that VDOE performed something very close to an audit in Buchanan County, the division with the largest VGLA tests/student ratio last year.  The report of a visit on Oct. 29, 2009 provides some astounding results:

  • The Buchanan Superintendent admitted he had "encouraged the use of VGLA as a mechanism to assist schools in obtaining accreditation and in meeting AYP targets."

  • IEP's lacked "documentation" and rationales for inclusion in VGLA

  • VGLA was found to be applied across all content areas (i.e. maxed the number of VGLA tests each kid could take)

  • Widespread use of accommodations, "often . . . same across many students" (i.e. broad brush application)

  • IEP goals "same across many students" (ditto)

  • Sharp decrease in VGLA numbers since local investigation this summer

Thus we see an admission of malfeasance by the Superintendent and clear evidence of wholesale abuse of the VGLA (and the County's students) for a corrupt purpose.  We also see evidence that VDOE is perfectly capable of discovering this abuse.

It is encouraging to see VDOE taking this first step toward discharging its duty to supervise the public school system and to seek compliance with the Standards of Quality. Unfortunately, in addition to the question why VDOE has not looked at other divisions near the top of that list, this situation leaves open the fundamental questions of follow-up and accountability:

  • What will VDOE be doing to see whether its investigation and technical assistance have been effective?  This investigation must continue until Buchanan County has demonstrated that it has stopped abusing its students.

  • The Board, upon recommendation of the Superintendent, can fine, suspend, or remove the Buchanan Superintendent. Which will it be?

The Pusillanimous Regulator

Note added on 1/28/10:  Today I learned what VDOE did to the Superintendent in Buchanan.  They made him submit a Corrective Action Plan.

That's at least a start on follow-up, but it is more nonfeasance by VDOE:  If a kid cheats on a test, he gets an "F."  If a Superintendent cheats on AYP by jamming kids into the VGLA, he must make one of his staff write a Corrective Action Plan.

In comparison to VDOE, the federal bank regulators look fierce. 

Reading the Corrective Action Plan is disheartening in another dimension.

  • The first "Action" in the Plan is to "[e]nsure participation criteria is being used during IEP meetings in which VGLA placement is being considered."  Apparently "criteria" is singular in Buchanan County (either that or they have abolished subject-verb agreement) (or maybe they are trying to qualify to be an administrator in Richmond).

  • A later "Action" is "[t]o present information and updates on the current VGLA Irregularity."  It seems that packing kids into VGLA creates a "VGLA Irregularity" instead of a scandal.

I need a drink.


[2] The VGLA is Virginia’s on-grade alternative test in mathematics and reading for students in grades 3 through 8 with disabilities that prevent them from accessing the Standards of Learning (SOL) test(s) in a content area, even with accommodations.  The VGLA in reading also is available to LEP students at level 1 or 2 of English language proficiency.


[3]  One teacher quotes her director from the VGLA inservice: “My dog could pass VGLA.”


[6] VDOE has provided data showing that about 30% of the portfolios flunk the audit.


[7] See the attachment.





Last updated 04/01/12
Please send questions or comments to John Butcher