Who Will Watch The Watchers?

Tomorrow in Algebra I’ll finish a problem and the hands will go up. I know the script by now. Azelle thinks I made a mistake. Reyna found a shortcut and wants to know if it will always work. Corina’s question is three days ahead of the class. Joey’s is three days behind. Maya’s is just right. The rest of the class falls into familiar roles.

As a classroom teacher, I’m always open to new tools that I can use to better understand my students and help them become smarter.

However, recent developments in technologies that claim they can fine-tune individualized instruction have me thinking that some tools should be left in the shed.

For example, to address the 21st-century job skill of sticking with tough problems and overcoming barriers, a new technology called affective computing will supplement other means of developing a profile of each student’s grit, tenacity, and perseverance.

Affective computing and other new technologies are discussed at length in two recent reports from the Department of Education (DOE): Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An Issue Brief and Promoting Grit, Tenacity, and Perseverance: Critical Factors for Success in the 21st Century (Draft). All of these technologies involve collecting massive amounts of individual student data.

And that has opinion leaders as diverse as Michelle Malkin and Diane Ravitch voicing concerns about how the DOE has weakened the Family Educational Rights and Privacy Act (FERPA) in order to give tech entrepreneurs unprecedented access to students’ private data for commercial reasons. Both authors identify a company called inBloom as a leader in this movement.

inBloom has received a $100 million grant from the Gates Foundation and had its database infrastructure built by a division of Rupert Murdoch’s News Corp.

inBloom, says Malkin, “evolved out of this strange-bedfellows partnership to operate the invasive database.” Ravitch adds, “Tech entrepreneurs are excited by the prospects of using student data for marketing purposes.”

inBloom’s website claims to guard data, except for the part buried in its privacy policy where it admits it can’t guarantee that stored or transmitted data are secure.

And besides, some of the means of collecting all that data are just plain creepy:

In affective computing, according to the Promoting Grit report, “Reactions to challenge—such as interest, frustration, anxiety, and boredom—may be measured through analysis of facial expressions, EEG brain wave patterns, skin conductance, heart rate variability, posture, and eye-tracking.” (Emphasis added.)

So where should a classroom teacher stand?

It takes me a while to know my class to the point that I can predict their questions and make my instruction more personal, and it’s one of the fun parts of teaching. Yet I’m not always right. I benefit from consulting with colleagues, talking to parents, and reviewing kids’ cumulative folders.

I also welcome the concept of some kind of easily accessible student database, into which I can enter my observations and see those of other teachers. My colleagues and I have even imagined software that could provide Amazon-style recommendations for individual students’ instruction, something on the order of, “Students who learned to find slope with this assignment learned to solve inequalities with this assignment.”
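This is speculation on my part, but the non-creepy version we imagine could start from nothing fancier than item-to-item co-occurrence, the kind of statistic Amazon-style recommenders are built on. Here’s a minimal sketch in Python; the assignment names, the successful_assignments records, and the recommend helper are all invented for illustration and don’t describe any real product.

```python
from collections import Counter
from itertools import permutations

# Hypothetical records: for each student, the assignments after which
# that student demonstrated mastery of the target skill.
successful_assignments = {
    "student_a": {"slope_worksheet_3", "inequalities_set_2"},
    "student_b": {"slope_worksheet_3", "inequalities_set_2", "graphing_lab_1"},
    "student_c": {"slope_worksheet_3", "graphing_lab_1"},
}

# Count how often each ordered pair of assignments succeeded for the same student.
co_counts = Counter()
for assignments in successful_assignments.values():
    for a, b in permutations(assignments, 2):
        co_counts[(a, b)] += 1

def recommend(assignment, top_n=3):
    """Assignments that most often succeeded for students who succeeded with `assignment`."""
    scored = [(other, n) for (first, other), n in co_counts.items() if first == assignment]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_n]

# "Students who learned to find slope with this assignment also learned..."
print(recommend("slope_worksheet_3"))
```

Notice what this sketch runs on: which assignments worked for which students, and nothing else. No EEG, no skin conductance, no eye-tracking required.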

But I don’t want teaching and learning to be creepy.

After all, if Amazon doesn’t need to know my EEG to recommend music based on books I order, why does Big Data need a student’s skin conductance to tell me he or she is nervous?

So, apart from reinstating the privacy guarantees of FERPA, here are four policy parameters to consider before allowing the DOE and organizations like inBloom to bring Big Data technologies to a district near you.

  • Districts should fully disclose any plans to adopt any Big Data technologies. The disclosure should include who is collecting the data, who is paying them, what data are to be collected, the means of collection, how the data will be used, and what rights protect parents and students.
  • Data collection should be independently audited, and severe penalties for privacy violations and commercial exploitation must be in effect. Data crunchers should be held liable for stolen data.
  • Parent and student permission must be obtained yearly before any data are collected, and parents and students must have full access to their files. They can withdraw their permission at any time, at which point their files must be destroyed.
  • No student or teacher in any classroom should be hooked up to any biometric device.

If parents, students, teachers, and district officials don’t demand limitations on data-mining, where will it end?

I’m facilitating the mandated online algebra curriculum. The Watchers, who now monitor teachers’ biometrics as well as students’, notice that my eye movement and skin conductivity register ANNOYED after Nia makes a typically irrelevant, but interesting, comment. They correlate my annoyance with Nelson’s increased ANXIETY, registered by a spike in his heart rate. I get an order in my earpiece to ask an easy question and call on him in a calm voice. But Albert is getting fidgety, like I’ve noticed he does when he’s bored, so I ask him to address Nia’s point. The Watchers record my noncompliance.

But nobody is watching the Watchers.