Five years ago I wrote a commentary called "Professional Development: How Do We Know If It Works?" for Education Week. In the article, I argued that the only proper measure of success for teacher professional development is whether students learn more as a result. I pointed out that in the extensive literature on the "best" approaches to improving teachers' skills, there was vanishingly little evidence that interventions led to increased student learning. Mostly this was because the people planning, paying for, and delivering the teacher training were not gathering the data needed to see whether it helped students learn more.
Now, however, evidence is beginning to accumulate, at least in the area of mathematics. For a long time, intuition has told us that teachers who know more mathematics must teach it better, and indeed, studies have shown that secondary students whose teachers have advanced degrees in mathematics learn a little more than those whose teachers don't. But now (actually, in June 2009) Rolf Blank and Nina de las Alas of the Council of Chief State School Officers have published a meta-analysis of multiple studies that sheds some light on what professional development can do to improve student learning.
Blank and de las Alas combed through all the studies they could find on professional development in math and science over a 20-year period. Eventually they settled on 16 well-designed U.S. studies of math and science PD that included control groups (students whose teachers did not get the training) and sound outcome measures. Twelve of the 16 studies focused on math. The researchers also identified features common to many of the interventions.
Their findings? In general, the effect of math professional development was positive but modest, with effect sizes averaging .21 for differences in pre-post test growth. That's enough to take an average student from the 50th to the 58th percentile. Effects were greater on test measures that closely matched the training and lower on more distant tests such as state tests.
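The percentile claim is easy to check for yourself. Assuming test scores are normally distributed (the standard assumption behind this kind of conversion), shifting an average student up by .21 standard deviations moves them from the 50th percentile to roughly the 58th. A minimal sketch of that arithmetic, using only the Python standard library:

```python
import math

def percentile_from_effect_size(d):
    """Percentile rank reached by a student who starts at the 50th
    percentile and gains d standard deviations, assuming scores are
    normally distributed. Uses the normal CDF via math.erf."""
    return 100 * 0.5 * (1 + math.erf(d / math.sqrt(2)))

print(round(percentile_from_effect_size(0.21)))  # prints 58
```

The same function shows why "modest" effect sizes still matter at scale: an effect of 0.5 would carry an average student to about the 69th percentile.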
These successful PD programs tended to share several features. They focused on math content and how to teach it. They were long, averaging over 90 hours, with most spread out over six months or more. They included follow-up and reinforcement, with support from mentors, coaches, and colleagues to help teachers carry what they had learned into the classroom. Effects for elementary teachers tended to be higher than those for secondary teachers.
The report has lots of meaty detail and is well worth a read for those who want to design and carry out the most effective and cost-effective professional development. But for those who are not statistics nerds, it's good to know that we are finally accumulating rigorous evidence that the money school districts spend on programs to improve teacher skill really can pay off in increased student learning.