Apples & Oranges: Why This Year’s State Test Was a Waste of Students’ Time

Last week a group of CTQ-Colorado teachers attended and offered comments at the State Board of Education meeting. The following post is a written version of my remarks. While the board passed a symbolic resolution (in a 4-3 party-line vote) to withdraw from PARCC, one state board member boldly and publicly supported the standards and advocated for the aligned system our students deserve. If we must insist on a standardized test, I want the best one for my students. Core advocates (and skeptics!) — I hope you’ll park your thoughts on PARCC (& Smarter Balanced) here.

Last month I proctored the transitional Colorado standardized assessment (TCAP) for what is supposed to be the last time. I watched my seventh graders tackle nine sections—three math and six literacy—over the span of two weeks. While strict proctoring guidelines prevent me from exploring the assessment myself, I was proud of the way my students handled this annual disruption to learning.

Of course, they’re trained well. For the past five years (or since my students were third graders), they’ve always taken CSAP or TCAP in the spring. The booklets, scripted directions, covered-up walls, and time constraints are very familiar. So, what made this year different?

This year, aspects of the assessment made my students snicker and sneer. This year, more than any other, they saw a sharp disconnect between TCAP and the critical thinking and collaborative problem solving they do in math class, or the authentic reading and writing tasks we work on in English class.


Teachers, myself included, are using the Colorado Academic Standards (which include the Common Core for English language arts and math) as a framework for our instruction. But the assessment looked far more like the CSAP of the past than the PARCC of the future.

What do students have to say about this year’s test? This is what I overheard during a lunch break after a few days of proctoring:

·      “Was that supposed to be an argument prompt? Where’s the research?”

·      “Why would they ask us to write about writing?”

·      “Why would we write multiple paragraphs about THAT topic?”

·      “Do you think they choose boring passages on purpose to see if they can get us to quit reading and give up?”

And my personal favorite:

·      “Why don’t they let our literacy teachers create the prompts? They’d never make us do stuff that boring!!”

After they voiced frustration about the decontextualized writing prompts, I gave myself a homework assignment. I put myself in the shoes of a seventh grader and accessed a PARCC sample item on the website. This particular bank of items focused on the life and accomplishments of Amelia Earhart. Here’s what I noticed:

·      The texts were interesting, and I was presented with three different sources and perspectives: a brief biography, a newspaper article, and a video clip.

·      The writing I was asked to compose was based on the topic I had just read about and researched. It was not a “random prompt” like the one my students described on the state test.

·      The multiple-choice items demanded that I actually read, understand, and think about the information being presented.

·      The texts mirrored what I would see in a real social studies or literacy class.

I do not believe there is (or will ever be) a perfect standardized assessment. Nor do I believe that as a state or nation we will dump standardized testing altogether. (Although I strongly support and would love to see performance assessment movements driven by expert practitioners go viral!)

But in the meantime, as a teacher working in the reality of high-stakes testing, I want the best possible standardized assessment for my students. Our current state test isn’t cutting it. PARCC is one giant leap forward. I hope Colorado doesn’t let perfect (or worse, politics) get in the way of significantly better. We cannot afford to waste our students’ time on assessments that aren’t aligned to the standards we are using to teach them.