2013-11-11

PARCC redux

Many textbooks and other materials are lightly edited and rebranded by their creators as Common Core aligned, but because there is no central ministry of education, as in Singapore, to review materials and issue an official government seal of approval, anyone can make such claims with impunity.  Some education departments are making their own determinations: NYC chose “Houghton Mifflin Harcourt's ‘Go Math’ program for elementary students, and Pearson's ‘Connected Math Program 3’ for the middle grades”, while Louisiana last year rejected “every math and reading textbook submitted by publishers”.

The precise wording of the 93-page Common Core State Standards for Mathematics notwithstanding, the inconsistency of interim assessments (independently developed and posed to students in states such as Kentucky, New York, Illinois and North Carolina) raises the question of whether these test questions accurately reflect the Standards and manifest Common Core’s intent.  No matter: states are barreling ahead with no independent oversight.

Carol Burris, a principal at a high school on New York’s Long Island whose essays often appear in the Washington Post blog The Answer Sheet, recently critiqued a math test for first graders as well as several sample math questions.  Lest we ourselves become overwhelmed by the myriad Common Core offerings that run the gamut, we declined to comment on those independently written questions, and instead continue to focus on states’ sample and/or actual assessments and, to date, the sample-only questions designed by the two “official” consortia, SBAC and PARCC.

This preamble brings us to PARCC’s latest batch of sample items, twelve in total, released in early November, for Grades 3-6 (nothing new for Grades 7 or 8) and high school.  Fasten your seat belt…

2013-11-07

Computer-based assessments

Our ongoing review of Common Core sample tasks from the SBAC and PARCC consortia and the State of Illinois, as well as questions that New York actually posed to students this year in the interim period before consortia assessments take over, has exposed issues with standards alignment, poor wording, incorrect mathematics, and odd interfaces.  But no issue stands out more than this: none of the SBAC or PARCC extended tasks as yet takes advantage of technology’s capabilities in a way that justifies the transition to computer-based assessments.

Jason Becker, on his blog, characterized it this way:
[The SBAC tasks] represent the worst of computerized assessment. Rather than demonstrating more authentic and complex tasks, they present convoluted scenarios and even more convoluted input methods. Rather than present multimedia in a way that is authentic to the tasks, we see heavy language describing how to input what amounts to multiple choice or fill-in the blank answers. What I see here is not worth the investment in time and equipment that states are being asked to make, and it is hardly a ‘next generation’ set of items that will allow us to attain more accurate measures of achievement.

2013-11-02

Illinois joins the procession

The Chicago Tribune on Halloween 2013 reported “Illinois grade school test scores plunge — especially in poor communities”.  Following on the heels of New York’s testing debacle, in which performance gaps widened as well, we’re beginning to detect a pattern.  

The Illinois State Board of Education, on its student assessment page, reports that it had to “adjust the performance levels on the ISAT for Reading and Mathematics to better align with the more rigorous standards of the Common Core”.  A separate document entitled “2013 ISAT Mathematics Assessment”, dated December 5, 2012, and authored by “Rachel Jachino, ISBE Mathematics Principal Consultant”, states that “[a]pproximately twenty percent (20%) of the operational items on the Reading and Mathematics ISATs were written to Common Core Standards and will be included as part of students’ scores/results for the 2013 ISAT.”  It remains unclear, though, what portion of the drop in 2013 scores is attributable to the state’s having “increased the scores required to pass ISAT math and reading tests by 13 to 30 points, depending on the test and grade” (Tribune), and what portion to an actual change in test content.

Is any improvement in store for this year?  Teachers don’t set cut scores, but they do deliver instruction, and though we’re well into the 2013-2014 school year, the 2014 ISAT Mathematics “Roadmap” (pdf), intended to guide that instruction, has yet to be released.

The ISBE recently removed sample questions for the 2013 tests on the premise that the “sample items displayed were not necessarily representative of the material that will appear on the 2014 ISAT”, which, incidentally, will be a one-year deal, because Illinois will “replace the ISAT with the Partnership for Assessment of Readiness for College and Careers (PARCC) assessments during the 2014-2015 school year.”

It’s a foggy road ahead indeed for Illinois teachers and their students.