Below is Fred Smith’s initial reaction to the long-awaited release of the Technical Report on the 2013 New York State Common Core Math and English Language Arts (ELA) tests. Smith, a NYS testing expert and statistician, has long been sounding the alarm about the New York State Education Department’s (NYSED) lack of transparency. He is also an active member of Change the Stakes and has launched a campaign to say “NO!” to Pearson’s stand-alone field tests, which were administered throughout New York State in June 2014. Currently, Smith is scrutinizing the item-analysis data contained in the overdue 2013 Technical Report and “will be parsing some of its fuzzy verbiage.” At first glance, Smith reports, “there are a number of serious questions regarding the ELA exams that add weight to the concerns of educators and parents about their composition and use.”
Fred Smith: The New York State Education Department (NYSED) just posted the 2013 Technical Report — more than seven months past Pearson’s deliverable deadline. All 339 pages of it, in which NYSED and the publisher continue to withhold useful information that the technical reports contained before Pearson took over the state testing program.
So now we can see what data they are showing us about the quality of the 2013 Common Core-aligned baseline tests — three months after the 2014 exams were given. The foundational 2013 Common Core ELA and Math tests were described last year as providing a “transparent baseline.” NYSED is acting in bad faith, and its talk of transparency now invites sheer derision.
No matter what the selective disclosure of the delayed data shows, this is an unacceptable way to operate and the antithesis of transparency.
Here’s one piece of clever obfuscation, from the section on Embedded Field Test Items (p. 8):
“In 2010, the Department announced its commitment to embed multiple-choice items for field-testing within the Spring 2012 Grades 3–8 ELA and Mathematics Operational Tests; this commitment continued for the Spring 2013 administrations of the Common Core assessments. Embedding field-test items allows for a better representation of student responses and provides more reliable field-test data on which to build future operational tests. In other words, since the specific locations of the embedded field-test items were not disclosed and they look the same as operational items, students were unable to differentiate field-test items from operational test items. Therefore, field-test data derived from embedded items are free of the effects of differential student motivation that may characterize stand-alone field-test designs. Embedding field-test items also reduced the number of stand-alone field-tests during the spring of 2013 but did not eliminate the need for them.”
Yes, imagine if General Motors said: “And we are committed to selling cars with brakes, as it makes driving safer. But when we can’t do that as much as we’d like to, there are times we have to sell cars without brakes.”
Thank you, Fred, for your insights. Stay tuned for Part II.