The Cranky Taxpayer
In August 2006 VDOE hired VCU to review the "alignment" of the SOL tests with the Standards of Learning. The Study (the link to the VDOE site now is busted; the study apparently was taken down; a pdf copy is here) included an assessment of the VGLA tests for Grades 3-8 in reading and math.
The report, dated January 8, 2007, concludes as to the VGLA:

[The results] of the VGLA Mathematics Grades 3-8 review indicated that generally the alternative portfolio assessments were aligned with the SOLs. The results suggest that [the] range of SOLs demonstrated in the portfolio collections at each grade level accurately reflected the SOLs. The agreement between the DOK levels of the portfolios and that of the SOLs was generally [consistent].
The alignment of the VGLA Reading Grades 3-8 portfolio collections was also quite good with regard to the coverage of the SOLs or range-of-knowledge. The evidence included in the portfolios generally addressed all of the SOLs at the specified grade levels. However, the inconsistency between the DOK levels demonstrated in the portfolios and those expressed in the SOLs, is a concern at Grades 3, 5, 6, and 8. The evidence included in these collections was generally at a lower cognitive level than that expressed in the related SOLs.
To translate this into English, we need to know that "alignment" involves two questions about the VGLA: (1) Whether the content measured in the VGLA test reflects the content required by the Standard of Learning (remember the VGLA is supposed to be at grade level and cover the same material as the SOL), and (2) whether the depth of knowledge ("DOK") presented in the VGLA evidence demonstrates the depth of knowledge required by the SOL, i.e., whether the test is tough enough.
With that under our belts, we can translate the report's two conclusions into the Mother Tongue:
Unfortunately, these conclusions considerably understate the problems the Study found in the VGLA. Moreover, the Study protocols raise a serious question whether the Study was, even so, merely a whitewash afloat in a sea of jargon.
The Results are Worse Than the Summary Admits
The Study evaluated a set of "randomly selected" VGLA portfolios for Depth of Knowledge (again, "DOK"), i.e., whether the portfolio was as cognitively demanding as the Standard. The DOK was adequate if >50% of the test items had DOK levels at or above the corresponding SOLs. The standard was "weakly" met at 40 to 50% and not met below 40%.
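The Study's three-tier rubric can be sketched in a few lines of code. This is purely an illustration of the thresholds stated above; the function name and interface are my own, not anything from the Study.

```python
def dok_rating(pct_at_or_above):
    """Classify a DOK result under the Study's stated rubric.

    pct_at_or_above: the percentage of test items whose Depth of
    Knowledge level is at or above that of the corresponding SOL.
    Thresholds: above 50% = met; 40 to 50% = "weakly" met;
    below 40% = not met.
    """
    if pct_at_or_above > 50:
        return "met"
    elif pct_at_or_above >= 40:
        return "weakly met"
    else:
        return "not met"

# Illustrative values straddling the two cutoffs.
for pct in (55, 50, 40, 35):
    print(pct, dok_rating(pct))
```

Note that the Study counts a category sitting exactly at 50% as only "weakly" met; the bar for a clean pass is strictly above half.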
The Range of Knowledge similarly measured the extent to which the VGLA portfolios demonstrated the span of knowledge required by the Standard.
As to the math test, the report measures depth and range of knowledge in five areas. Here is a summary of the results, where "<" denotes "weakly met":
That is, the Study reports that the Depth of Knowledge required by the math VGLA tests was insufficient in seven of 30 respects, i.e., 23% of the time. Thus we see that the testing rigor the report calls "generally consistent" in fact flunks almost a quarter of the time.
Viewed otherwise, a kid taking the VGLA can have his math scores boosted by up to 23% because of the easier tests.
If you think a 23% score boost would make the VGLA popular with a school or division focused on AYP, stay tuned:
Turning to the English test: because of the nature of the material, the report measures depth and range of knowledge in only two reporting categories ("Use of word analysis strategies and information resources" and "Demonstrat[ion of] comprehension of printed materials"). Here is a summary of the results:
As to the depth of knowledge measured by the VGLA portfolios, the report says that the VGLA portfolios were "generally at a lower cognitive level than that expressed in the related SOLs." In fact, as we see from the table, the VGLA tests were too easy in 50% of the categories measured. That is, the "lower cognitive level" of the English VGLA tests offers the students a score boost of up to 50%.
Is it any surprise that the schools are discovering handicaps and offering the VGLA at a prodigious rate?
When the report characterizes a 23% score boost as "generally consistent" with the SOL, we have to wonder how tough the grader is.
The circumstances surrounding the Study reinforce our sense of wonder:
So to validate the VGLA tests VDOE turned to the local university and two faculty members who are thoroughly embedded in the culture (supervised by VDOE) that finds it acceptable to slap a "handicap" label on kids in order to boost the SOL scores. The Study used Virginia teachers and administrators (selected by VDOE) to do the evaluations and, apparently, did not even attempt to measure the accuracy of their evaluations.
If there is any surprise here it is that the Study discovered and reported that the VGLA tests are too easy.
- - -
Note added on November 12, 2009:
On Aug. 20, I chatted with Charles Pyle and Dr. Shelley Loving-Ryder of VDOE. VCU withheld the underlying data, claiming that all the information had been given to VDOE and they (VCU) no longer had it. Pyle and Loving-Ryder said they had the report, five appendices that have not been made public (but that, based on VDOE's description of their content, do not interest me), and two boxes of forms filled in by the evaluators. They said they do not have the spreadsheet or database for which the grant to VCU paid $3,000 for data entry.
The folks at VDOE have always played straight with me, and it is not credible that these VCU scholars would give up the only copy of their data, so I thought VCU was stonewalling. I escalated the inquiry over there to the General Counsel.
The letter reproduced below summarizes both the problem and current situation:
November 12, 2009
Hon. Patricia I. Wright
RE: Potential Defects in Virginia Alignment Study for the VGLA
Dear Secretary Duncan and Superintendent Wright:
Certain aspects of Virginia’s alignment study of the Virginia Grade Level Alternative (VGLA) and certain circumstances surrounding it suggest that the VGLA portion of the study may understate the lack of alignment, particularly as to depth of knowledge (DOK).
The circumstances begin with the following:
· The number of VGLA tests administered in Virginia has exploded since the test first was administered in 2004, while the number of VSEP tests (the VGLA analog for students in high school) has remained minuscule. For example, in 2008-09, Virginia schools administered 47,113 tests under the VGLA but only 331 under VSEP. Indeed, because of the small number of VSEP tests, Virginia has been unable to assess the VSEP alignment and has abandoned VSEP testing for AYP.
· One parent explained this disparity by saying that the tests in the locally-graded VGLA are so easy that the kids cruise through and then hit a wall at the state-graded VSEP.
· A number of Virginia teachers complain that children are misclassified into the VGLA, that the tests are notoriously easy, and that cheating is rampant, all in order to improve school and division SOL scores and AYP performance.
In this context, the Study concludes that the DOK of the math VGLA tests reviewed were too easy in 23% of the categories and the reading test too easy in 50%. Yet there is evidence that even these shocking numbers may be understated:
· The Study claims that the VGLA portfolios studied were “randomly selected.” Yet the Virginia Department of Education (VDOE) tells me that it selected the portfolios from those that had been evaluated as satisfactory in its audit process. Thus, the study did not use “randomly selected” portfolios, but rather portfolios selected by an entity the Study failed to identify and preselected to be “good” data.
· Virginia Commonwealth University (VCU) performed the Study under a grant from VDOE. When I asked VCU for the database and statistical analyses that underlie the test, they responded that all the underlying materials had been forwarded to VDOE. VDOE, however, told me that they only had the grading sheets.
· When I turned again to VCU, they admitted that they indeed had retained the underlying data and they refused to produce those data, claiming that they are “proprietary.”
Even if VDOE made its selection randomly, the universe of the study was not all VGLA portfolios but those that the audit found to have been graded satisfactorily. Thus, the alignment analysis comes from a study that falsely reports the origin of its biased source of raw data. Beyond that, VCU first lied to me about whether it had the database et al. and then refused to produce those data that were paid for with public funds, produced at a public institution, and used to validate a testing program in Virginia’s public schools.
It seems to me that the only explanation for these anomalies is that the data I requested will disclose further defects in the Study.
I suggest that you demand that VCU produce those data both to you and to me so that we all can see for ourselves whether VCU’s remarkable behavior is, as it appears, a clumsy attempt at a coverup of defects in Virginia’s testing program.
With kindest regards, I am,
From: John Butcher [address deleted]
Thanks for your note.
I don't want to revisit the legal issues but I should caution you to not read any kind of concession into my last note. The very notion that VCU would or could hide the data that underlie that public study report, paid for by public money, is somewhere between hilarious and offensive.
You offer by way of settlement a hard copy of "the statistical reports" that underlie the study if I will agree not to publish the data on my website. That offer is unacceptable in at least four respects:
1. Lack of specificity. The term "the statistical reports" does not specify the nature of the reports or even whether VCU is offering all the "statistical reports."
2. Hard copy. The only way to analyze hard copy data is to reenter the data. I have no intention of reentering data that surely are already in computer-readable form.
3. Restricted scope. VCU has not specified the documents being withheld (in further violation of the Act, of course) so I have no idea of what I would be giving up. For certain, the grant paid $3,000 for data entry so there is a database (most likely in Excel or Access format) that you do not appear to be offering. I attach the original list of requests (less the contract Ms. Lepley provided) for your reference.
4. Secrecy. The purpose of my FOIA request was to find out whether the VCU study for VDOE was, as appears likely from the face of the Study, a cover-up and to revisit the statistical analysis of the data that the report hints at but does not describe in any useful detail. VCU's false response to my request for the data (that the responsive records had been turned over to VDOE) enhances the inference of a cover-up. Your offer further buttresses that inference: If the data in fact support the conclusions in the study, VCU should be willing, nay happy, to see those data published where they can be subjected to analysis by the public. Thus, I must conclude that VCU is hiding these data because it knows that the disclosure will be an embarrassment. Stated otherwise, I am not willing to discuss my analysis of these data without providing access to the data; I am shocked that VCU does not take the same position.
If you would like to chat about this please give me a call. If VCU wishes to make a more substantial offer, please let me have it by next Tuesday, November 10. If we have not resolved this matter by then, or at least begun a more substantial conversation, I will (1) write the US Secretary of Education to complain about his reliance upon this flawed study that is founded upon secret data and (2) consult my lawyer about a suit to enforce FOIA. Once those consultations begin, any settlement will have to include 125% of my attorney's fees (to cover the fees and the taxes I would have to pay on them).
I see you are out of town until Wednesday. By copy of this to Ms. Ferguson I request under the Freedom of Information Act all public records created between August 9, 2009 and Ms. Lepley's false reply in which any officer or employee of VCU discusses, responds to, or refers to my FOIA request of August 9, 2009. In particular, and without limitation, I request any email or other document in which Ms. Abrams or Mr. McMillan advises Ms. Lepley that the records that respond to the Aug. 9 request have been sent to VDOE.
Thanks again for your kind help with this interesting issue.
PS: It was miserably hot in Florida but I had the pleasure of the company of two of my siblings and the acquaintance of a number of remarkably stupid fish. If you are having half as much fun now as I had then, you are enjoying yourself.
• All summaries of data regarding the alignment of the VGLA.
• All records that establish or comment upon the protocol for selecting VGLA portfolios for inclusion in the Study.
• All records that identify the VGLA portfolios selected for inclusion in the Study.
• All records that show, in the words of the Report, that "90 percent of all DOK ratings were within 1 point of each other."
• All records that define or explain the meaning of the statement in the report that "90 percent of all DOK ratings were within 1 point of each other."
• All records that establish or comment upon any statistical calculation regarding the validity of the ratings included in the Study.
• All records of peer or other reviews of the Report.
• All records that establish, report, or comment upon quality control or accuracy of the alignment reviews, whether by comparison of alignment reviews with those performed by Pearson, by evaluation of alignment reviews by VCU staff or faculty or students, or other measures taken to determine the accuracy of the alignment reviews performed for this study.
• All records containing comments received from VDOE or others regarding drafts of the Study or regarding the Study itself or any study reports or draft study reports not contained in the Report.
• All drafts of the Report and of any study reports not contained in the Report.
• All study reports not contained in the Report. In particular, but without limitation, all reports concerning the Study that were intended to be confidential or of limited distribution.
David L Ross/HSC/VCU wrote:
Thanks for your note. I certainly have been somewhat dilatory in not having gotten back to you before now. I have been incredibly busy and your issue kept getting pushed aside. I'm trying to clean up enough things that will allow me to leave town tomorrow, though, so this is a good opportunity to let you know where I think things currently stand.
After having spoken with VCU's researchers and its sponsored program director, and, without trying to rehash what has been said before, I believe all here remain comfortable with the positions that have been taken to date relative to your FOIA request - and I believe your last message had seemed to acknowledge the legitimacy of the proprietary/research exemption that previously was claimed. While you also have raised the lack of timeliness of the assertion of those exemptions since they were not cited in the initial response to your request, I believe we all understand that the initial response that no documents of the type requested actually existed at VCU was an honest response by those who were responding - based upon the information that had been shared with them as of that point in time. It was only after you reiterated your request after having checked with the State Department of Education was it confirmed that certain records had been retained that previously had not been specifically disclosed. Under these circumstances, we think it is unlikely that a court would rule that VCU effectively was estopped from raising a legitimate exemption once all the facts were known.
All this having been said, and in view of VCU's research data policy to which you made reference in a recent email, I now am advised that VCU is willing to supply a printout of the statistical reports that you have been seeking - which, I understand, would enable you to "look behind" the published study results. There is concern about your past practices of posting data, etc. on your website/blog, however, and, the request, therefore, is that if the referenced reports were to be supplied, they would not be posted on your website.
This seems to represent a decent compromise of the issue we have been addressing - so let me know what you think.
I'll be out of town for a few days - returning to the office next Wednesday - so I'm copying Susan Ferguson so she will know the status of this matter and can be available to speak to any questions you may have.
Thanks - and I hope your time in Florida has been and will be enjoyable (apart from the heat/humidity).
David L. Ross
VCU responded yesterday to my FOIA request for the internal documents leading to their earlier (false) response that the database et al. that underlay the Study all had been sent to VDOE. The response includes an email from Ms. Abrams that appears to be the source of the false information:
Where does this leave us?
On this record, I suggest that only a school administrator or VDOE bureaucrat could say without blushing that the VGLA testing is not a wicked and shameful fraud.
Note added on Jan. 9, 2010: Superintendent Wright replied on Nov. 30. The letter doesn't really say anything but it hints at some further questions I should be asking. For instance, in the third paragraph:
Although participation rate differences between VGLA and VSEP are to be expected due to significant programmatic distinctions, VDOE is currently analyzing participation data and will work with school divisions to address issues that are …
The Smoking Gun!
Note added on 1/27/10: Today I learned that VDOE performed something very close to an audit in Buchanan County, the division with the largest VGLA tests/student ratio last year. The report of a visit on Oct. 29, 2009 provides some astounding results:
Thus we see an admission of malfeasance by the Superintendent and clear evidence of wholesale abuse of the VGLA (and the County's students) for a corrupt purpose. We also see evidence that VDOE is perfectly capable of discovering this abuse.
It is encouraging to see VDOE taking this first step toward discharging its duty to supervise the public school system and to seek compliance with the Standards of Quality. Unfortunately, in addition to the question why VDOE has not looked at other divisions near the top of that list, this situation leaves open the fundamental questions of follow-up and accountability:
Note added on 1/28/10: Today I learned what VDOE did to the Superintendent in Buchanan. They made him submit a Corrective Action Plan.
That's at least a start on follow-up, but it is more nonfeasance by VDOE: If a kid cheats on a test, he gets an "F." If a Superintendent cheats on AYP by jamming kids into the VGLA, he must make one of his staff write a Corrective Action Plan.
In comparison to VDOE, the federal bank regulators look fierce.
Reading the Corrective Action Plan is disheartening in another dimension.
I need a drink.
 The VGLA is Virginia’s on-grade alternative test in mathematics and reading for students in grades 3 through 8 with disabilities that prevent them from accessing the Standards of Learning (SOL) test(s) in a content area, even with accommodations. The VGLA in reading also is available to LEP students at level 1 or 2 of English language proficiency.
 http://crankytaxpayer.org/Schools/vgla_quotations.htm. One teacher quotes her director from the VGLA inservice: “My dog could pass VGLA.”
 VDOE has provided data showing that about 30% of the portfolios flunk the audit.
 See the attachment.