Is doing less harm enough for Duncan?

Earlier today, Secretary of Education Arne Duncan announced a slight yet potentially significant shift in his stance on teacher evaluation. Duncan extended an offer of flexibility to states: a one-year delay in implementing evaluation systems that would tie teacher performance to test results.

This is a good thing. And it could be the beginning of a great thing.

Admittedly, it has taken Duncan far too long to heed educators’ concerns about new evaluation systems reliant on tests not yet aligned with new college- and career-ready standards. Yes, teachers need more time to fine-tune their instruction in line with the new standards. Yes, any test involved in holding a teacher accountable must be fair, reliable, accurate, and well-aligned with the standards it measures.

And yes, when expert teachers raise sound objections to policies governing teaching and learning, decision makers have a responsibility to take them seriously.

But despite the dawdling, I am glad that Duncan is acknowledging the need for accountability systems to be reasonable, accurate, and responsible. And that he has now vowed a continued commitment to working in concert with educators to approach this policy goal sensibly.

So, what next? How can his administration ensure that evaluation systems actually fulfill their intended role of improving teacher effectiveness? Here are some teacher evaluation facts (based on researchers’ findings) and relevant next steps for the Duncan administration:

Fact: The “value-added” statistical techniques that are part of many states’ systems to assess teachers’ effectiveness in improving student achievement are extremely unstable.

  • Next step: The federal government could spread best practices for evaluation found in top-performing nations like Singapore, where trained observers use professional judgment—not rigid, formulaic statistical models—in assessing how teachers support the whole child and spread their teaching expertise.

Fact: Teachers rarely receive usable and timely feedback from the value-added data that are used to judge them.

  • Next step: The USDOE could help to ensure that evaluation systems are useful by encouraging states to implement serious peer review systems that give teachers information and support that help them improve throughout the school year. A number of school districts in the U.S.—including Montgomery County (MD) as well as Poway Unified and San Juan Unified school districts (CA)—have implemented such systems effectively.

Fact: Teachers do not have access to the high-quality professional development found in top-performing nations, and they have very little of the time needed to work with their colleagues on improving teaching and learning.

  • Next step: The USDOE should maximize its new Teach to Lead initiative—inviting accomplished teachers to create and lead professional learning systems that spread expertise to improve student outcomes. (Example: Check out some of teacherpreneur Ali Wright’s ideas for her home state of Kentucky.) Evaluation systems can contribute to teaching quality—but they are useless if teachers don’t have access to time and high-quality opportunities to learn and improve throughout their careers.

As a result of today’s announcement, Duncan’s administration will certainly do less harm than might otherwise be the case. But will they do what’s right to improve teaching effectiveness for the long haul? That much remains to be seen.

  • LotharKonietzko

    Duncan and company

    Barnett, I like your take on things here.  I have been truly frustrated with the President and Sec. of Ed. Duncan, so much so that I have lost all hope in both for any meaningful education policy that deals with reality.  The point that hits home for me as I gear up for my 16th year is the professional development piece.  As a dept. chair for Social Studies I’d like to do things with my colleagues that improve the teaching of the subject area.  Time for that is limited or not there because of other meetings we are required to attend that deal with looking at data, how to interpret the data, and what data is needed next in terms of pre and post testing kids.  Being in a priority school drains the lifeblood out of creative teachers because they are being forced to become statisticians rather than innovative teachers.  If I had time with my department members I would do a PD on trying to use the NCHE History Habits of Mind for teaching history/social studies.  That is not likely to occur because there will be some meeting to attend that nobody wants to be at, and thus inspiration is put on hold yet again thanks to the fallout from No Child Left Behind = No Teacher Left Standing and Race to the Top = Race to the Bottom.  Mr. Duncan has truly not inspired me; he frustrates me because he is too far removed from the day-to-day work of teachers in this country.  Thank you for posting this.


    • BarnettBerry

      You too nailed it, Lothar

      Lothar, you too hit the nail on the head. It is time for the Secretary to recognize how little time teachers have because of mandated data meetings that are out of touch with what top-performing nations expect of teachers. In Shanghai, for example, teachers’ PLCs are inquiry-driven, not data-driven. Big difference!


      • LotharKonietzko

        Inquiry-Driven PLC

        Is there a site/page that you could share that talks about inquiry-driven PLCs?  It seems that somebody would come back with, “you can’t have inquiry without data.”  Chicken-or-egg kind of circle in the making?  I can just hear somebody saying you need to look at your data and use that as your guide for questioning . . . how do you get around that?

        • BarnettBerry

          Inquiry vs. Data-driven PLCs

          The best example may be found in how Asian nations, particularly Singapore, engage in inquiry-driven PLCs — through lesson study. As this brief article notes, the key feature of lesson study is “looking deeper into the children’s responses and developing effective strategies through that.”



          NOT test scores plotted from an Excel spreadsheet.




  • SusanDudaOsborne

    I agree, but…


    Thank you for posting the most up-to-date comments from Mr. Duncan regarding teacher evaluation. While I agree with you that this is a good thing, what about the teachers I represented in Michigan who lost their jobs due to the flawed evaluation system that was put into place?

    I do like your “next steps”. I particularly like the Teach to Lead concept of providing accomplished teachers the ability to spread their expertise in leadership roles while staying in the classroom!

    It was nice talking with you in DC!

  • Barnett Berry

    You are so RIGHT!

    Susan, this is a point that escaped me. Thank you for bringing it up, as it is no trivial matter. The adage used by some VAM advocates is that you cannot let perfection get in the way of progress. But given the lives at stake, caution should not be thrown to the wind. In Houston, I believe a lawsuit has been filed claiming that administrators are changing — i.e., “distorting other measures of their effectiveness (e.g., observational scores) because administrators are being told to trust the EVAAS data and to force such alignments between the EVAAS and observational output to (1) manufacture greater alignment between the two and (2) artificially inflate what are currently the very low correlations being observed between the two.” Check out the VAMboozled blog.

  • Barbara W-F



    Thanks for breaking it down and making some valid suggestions. Unfortunately, I, like many teachers around the country, believe Duncan’s comments are too little and too late! I was involved in one of those conversations with USDOE staffers. I spoke truthfully and passionately about the job that teachers do every day and the difficult position high-stakes testing puts us in…teach to the test, improve the scores, get a high rating of effectiveness…or stay true to what you know is best for your students and risk losing your job?

    We have yet to design a test that measures what truly matters.  I am not anti-testing, although I am getting close, but I do want to see a BIG shift away from our reliance on a set of numbers.  I am highly offended when an administrator asks me where the data is to back up something I have stated…I am a professional…I know my students and I don’t need a battery of test scores to back me up…this is my area of expertise.  

    We have gone so far that we rely more on bubbles on a sheet of paper than our own expertise.

    I am reserving judgment on the Duncan blog…we will see if he can “put his money where his mouth is” and is sincere about what he said…


    2009 Rhode Island