Follow-up: Looking for data in all the wrong places

In Dan Brown’s high school English class, students trade their scantrons and pencils in for a multifaceted performance-based Shakespeare project that leaves behind a trail of “visible learning.”

Last Friday, my students didn't come to school. Instead, all staff members gathered in grade-level teams to measure their achievement in the second of our three-per-year "Data Days." During the preceding week the kids took two 105-minute interim assessments (English and math practice SAT or released state exams) with Scantron and open-response sections. Then the adults spent all day breaking down the data, talking standards, identifying students of concern, and writing action plans. This has become common practice in American schools.

The structure is great; the opportunity to talk with a horizontal cohort of educators about students and what's going on in class is invaluable.

The problem is the data. It didn't jibe with the achievement I'm seeing in class.

Many of my students who have surged in recent weeks absolutely tanked the test. Their multiple-choice scores are far below the caliber of achievement I've seen from them in class discussions, projects, and homework assignments. On the essay section, many were confused by the prompt and their writing veered way off-topic. A lot of these students have built their self-advocacy skills and know how to reach out to find clarification—but that's not allowed on tests the way it is in real life.

Timed tests are only a tiny part of success in college and beyond. The ability to work within deadlines is certainly a vital skill, but the highly pressurized, write-on-demand testing environment created in schools neither builds nor assesses it. Taking high-stakes exams is a game with its own rules that transfers few, if any, useful skills to adult life.

On data days, test scores should serve only as a jumping-off point for discussion; they are limited, even misleading, measurements of student achievement.

Looking carefully at visible learning is important. Looking at multiple-choice scores produced under high-stress conditions is not.


Teaching Ahead

This article originally appeared in Education Week Teacher as part of a publishing partnership with the Center for Teaching Quality. Reprinted with permission of the author.