Final Exam

For the past 25 years, the schools in Union City, New Jersey, have gradually improved and now rival some of the best schools in the state. How did they do it? Improbable Scholars author David Kirp, a professor of public policy who studied the schools for more than a year, concluded that Union City engaged in a process of “plan, do, review.” This systematic approach allowed district officials to select a path forward, implement it, assess its results, and make the necessary adjustments. The administrators, teachers, parents, school board members, and town officials of Union City aren’t smarter than anyone else. But they are far more patient and far less likely to be fooled into adopting a silver-bullet line of attack.

The New Jersey Department of Education (NJDOE) could learn a little something from this methodology. Instead, NJDOE officials have lurched forward, embracing the use of student growth percentiles (SGPs) to evaluate teachers, even though the research says they shouldn’t.

Dr. Bruce Baker, a professor at Rutgers’ Graduate School of Education, is one of the researchers who question the wisdom of the state’s proposed teacher evaluation regulations. Baker maintains that student growth data cannot show the degree to which a teacher influenced student progress. He concludes, therefore, that the data should not be used to measure teacher effectiveness.

Naturally, the NJDOE points to research that supports its course of action, namely the Gates Foundation Measures of Effective Teaching (MET) Project. In this study, student achievement measures were a critical component in determining teacher quality.

The problem is that the statistical measures used in the MET Project are very different from those proposed by the NJDOE. As University of Southern California Professor Morgan Polikoff noted in Baker’s School Finance 101 blog, “…the MET project says nothing at all about the use of SGPs.” Polikoff, a member of the MET Project research team, goes on to explain that “The growth measures used in the MET project were, in fact, based on value-added models (VAMs). The MET project’s VAMs, unlike student growth percentiles, included an extensive list of student covariates, such as demographics, free/reduced-price lunch, English language learner, and special education status. Extrapolating from these results and inferring that the same applies to SGPs is not an appropriate use of the available evidence.”
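The distinction Polikoff draws can be sketched in miniature. The data, the peer-matching scheme, and the band width below are all invented for illustration; neither the NJDOE’s actual SGP model nor the MET Project’s VAMs are reproduced here. The point is only that an SGP-style measure ranks a student against peers with similar prior scores, while a VAM-style measure additionally adjusts for student covariates such as low-income status.

```python
import random

random.seed(0)

# Invented records: (prior_score, current_score, low_income_flag).
# Purely synthetic data for illustration.
students = [(random.gauss(50, 10), random.gauss(50, 10), random.random() < 0.4)
            for _ in range(500)]

def sgp(prior, current, band=5.0):
    """SGP-style measure: the student's current score is ranked among
    peers with similar PRIOR scores only. No demographic covariates."""
    peers = [c for p, c, _ in students if abs(p - prior) <= band]
    below = sum(1 for c in peers if c < current)
    return 100.0 * below / len(peers)

def vam_style_residual(prior, current, low_income, band=5.0):
    """VAM-style measure (simplified): compare the student to peers matched
    on BOTH prior score and the covariate, so the covariate's association
    with achievement is removed before judging growth."""
    peers = [c for p, c, flag in students
             if abs(p - prior) <= band and flag == low_income]
    expected = sum(peers) / len(peers)
    return current - expected
```

The difference is visible in the function signatures alone: the SGP-style measure never sees the covariate, so any systematic effect of, say, poverty on achievement growth is folded into the teacher’s number, whereas the VAM-style measure nets it out first.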

Still, the NJDOE’s march toward the use of SGPs continues, even without evidence that this use can effectively or fairly determine teacher quality. The department—and the teachers and students of New Jersey—would be better served by an evolving approach toward long-term improvement, much like the one Union City used to meet its challenges.

So says noted researcher Dr. Howard Wainer, author of Uneducated Guesses: Using Evidence to Uncover Misguided Education Policies (Princeton University Press, 2011).

“Experience has taught us a great deal about what kinds of optimization methods work in complex systems and what kinds do not,” writes Wainer. “What does work is the implementation of constant experimentation, in which small changes are made to the process. Then the effects of the changes are assessed. If the process improves, the size of the changes is increased and the outcome monitored. If the process gets worse, the changes are reversed and another variable is manipulated. Gradually, the entire complex process moves toward optimization. If we follow this approach, when the future arrives the system of the future is there to greet it.”
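The “constant experimentation” Wainer describes is, in effect, a small-step hill-climbing loop: perturb one variable, measure, enlarge the change if the outcome improves, and reverse it and manipulate another variable if it worsens. A toy sketch of that loop follows; the quality function, starting point, and step sizes are invented for illustration and do not model any actual school system.

```python
def quality(x):
    # Hypothetical measured outcome of the system (higher is better).
    # An invented stand-in for whatever is being assessed.
    return -((x[0] - 3.0) ** 2 + (x[1] - 2.0) ** 2)

def optimize(x, rounds=400, init_step=0.5):
    """Constant-experimentation loop in the spirit of Wainer's description:
    small change -> assess -> keep and enlarge if better, reverse and move
    to another variable if worse."""
    x = list(x)
    steps = [init_step] * len(x)     # one small change size per variable
    best = quality(x)
    var = 0
    for _ in range(rounds):
        x[var] += steps[var]         # make a small change to one variable
        score = quality(x)
        if score > best:             # process improved: keep the change
            best = score
            steps[var] *= 1.5        # ...and increase the size of the change
        else:                        # process got worse: reverse the change
            x[var] -= steps[var]
            steps[var] *= -0.5       # try the other direction, more gently
            var = (var + 1) % len(x) # ...and manipulate another variable
    return x, best
```

Run from a poor starting point, the loop creeps toward the optimum without ever needing a model of the whole system, which is Wainer’s point: many small, reversible, measured changes beat one irreversible silver bullet.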

See for yourself

Read Howard Wainer’s April 8 op-ed, “The DOE’s Teacher Evaluation System Has Obvious Flaws That Ought to Be Corrected Before Initial Implementation,” at NJ Spotlight.

Bruce Baker’s April 10 blog post, “On Misrepresenting (Gates) MET to Advance State Policy Agendas,” can be found at his School Finance 101 blog.