Why Education Experts Resist Effective Practices

Here is a powerful and compelling paper by educational psychologist Prof. Douglas Carnine, entitled “Why Education Experts Resist Effective Practices (And What It Would Take to Make Education More Like Medicine)”. It exposes deep-seated flaws in what passes for educational “research” and shows how damaging those flaws are to the teaching profession.

“Just think how often ‘research shows’ is used to introduce a statement that winds up being chiefly about ideology, hunch or preference. …The education field tends to rely heavily on qualitative studies, sometimes proclaiming open hostility towards modern statistical research methods. Even when the research is clear on a subject – such as how to teach first-graders to read – educators often willfully ignore the results when they don’t fit their ideological preferences.

“[Project Follow Through] compared constructivist education models with those based on direct instruction. One might have expected that, when the results showed that direct instruction models produced better outcomes, these models would have been embraced by the profession. Instead, many education experts discouraged their use.

“In this insightful paper, Doug examines several instances where educators either have introduced reforms without testing them first, or ignored (or deprecated) research when it did not yield the results they wanted.”

Without testing them first? Ah yes, we know something about that, don’t we? It’s time for everyone in the education community – teachers, parents and students – to demand better standards so that these sorts of mistakes never happen again.

Measuring the effectiveness of teaching practices

This is a reproduction of a table in Prof. John Hattie’s book, Visible Learning (2009). The “d” numbers are effect sizes; they measure the effectiveness of teaching practices. It turns out, surprisingly, that d is almost always greater than 0. “When teachers claim that they are having a positive effect on achievement or when a policy improves achievement this is almost a trivial claim: virtually everything works. One only needs a pulse and we can improve achievement.”

After synthesising over 800 meta-analyses representing tens of thousands of studies, Hattie concluded that we should be aiming for an effect size of 0.40 (the overall average) or higher.
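Hattie’s “d” is a standardized mean difference, commonly computed as Cohen’s d: the gap between the treatment and control group means, divided by their pooled standard deviation. A minimal sketch of that calculation, using made-up test scores (the numbers below are purely illustrative, not from Hattie’s data):

```python
import math

def cohens_d(group_a, group_b):
    """Effect size: standardized difference between two group means."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a = sum(group_a) / n_a
    mean_b = sum(group_b) / n_b
    # Sample variances (Bessel-corrected, dividing by n - 1)
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    # Pooled standard deviation across both groups
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical scores for a class taught with some intervention vs a control class
treatment = [78, 82, 85, 74, 90, 88, 76, 84]
control   = [70, 75, 72, 68, 80, 74, 71, 77]
print(round(cohens_d(treatment, control), 2))
```

On this reading, d = 0.40 means the average treated student scores 0.4 pooled standard deviations above the control mean; Hattie’s point is that practices below that bar are merely doing what “virtually everything” does.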

The results speak for themselves. Teachers as activators are more effective than teachers as facilitators. On the left, we have Direct Instruction and Mastery learning. On the right, we have Inquiry-based teaching and Problem-based learning.

On which side is the Numeracy Project?

[Figure: Visible Learning – teachers as activators vs teachers as facilitators]