In December 2020, RNZ reported that New Zealand’s Year 9 students had recorded their worst-ever results in maths and science. Four years earlier, it reported that the same cohort of students, then in Year 5, was the worst at maths in the English-speaking world.
We have far too many students struggling with basic numeracy tasks. Looking at the TIMSS 2019 results for Year 5 students:
- only 30% of this group could calculate 6 × 312 (2nd last, international average 64%)
- only 26% of this group could add 385 to 5876 (last, international average 63%)
- only 25% of this group could find the number that must be added to 73 to give a sum of 1068 (3rd last, international average 49%)
- only 19% of this group could divide 927 by 3 (2nd last, international average 48%)
- only 16% of this group could choose the correct answer to 27 × 43 (last, international average 52%)
NB: In 2019, TIMSS conducted their survey on paper in some countries (including New Zealand and Australia), and electronically in other countries (including England and Singapore). Relative placings and international averages are for the paper survey only. Combined relative placings on the questions above differ by no more than one place. Combined international averages on the questions above differ by no more than four percentage points.
In case you hadn’t noticed, that last question was multiple choice. New Zealand’s success rate is exactly the same as in 2015, and worse than what we would expect from random guessing (25%). An earlier cycle of TIMSS suggests that the success rate on a constructed-response version of the question would have been much lower.
This is not exactly news. Looking at the TIMSS 2015 results for Year 5 students:
- only 25% of this group could add 385 to 5876 (2nd last, international average 66%)
- only 20% of this group could divide 45 by 3 (11th last, international average 47%)
- only 17% of this group could subtract 532 from 4809 (2nd last, international average 57%)
- only 16% of this group could choose the correct answer to 27 × 43 (last, international average 51%)
In TIMSS 2011, New Zealand’s Year 5 students finished last-equal among peers in participating developed countries:
- almost half of this group could not add 218 and 191 in a basic word problem (8th last, international average 73%)
- only 32% of this group could calculate 5631 + 286 (2nd last, international average 72%)
- only 8% of this group could calculate 23 × 19 (6th last, international average 41%)
If you think it doesn’t matter that children can’t perform these basic numeracy tasks (e.g. “they don’t need to calculate any more because we have calculators” or “it’s more important to develop their reasoning and problem solving skills”), then think again. Across all countries participating in TIMSS 2019, there is a strong positive correlation between performance in the Number content domain and performance in the other two content domains, Measurement and Geometry, and Data. Similarly, there is a strong positive correlation between performance in the Knowing cognitive domain and performance in the other two cognitive domains, Applying and Reasoning. In other words, strong numerical proficiency paves the way for success in all aspects of Mathematics and Statistics, and you can’t solve higher-order problems or perform complex reasoning without a solid foundation of knowledge to begin with.
What on earth has gone wrong?
Twenty years ago, a radical new approach to teaching mathematics in New Zealand, known as the Numeracy Development Projects or just the Numeracy Project, was rolled out across the nation before there was any robust evidence of its effectiveness. In fact, the decade-long roll-out was the research experiment.
The goal was to raise student achievement by strengthening the capability of teachers through professional development. The philosophy was to prioritise conceptual understanding over procedural knowledge and skills (frequently called “rules”, with a negative connotation). Despite acknowledging the interdependence of “knowledge” and “strategy”, the teaching of knowledge was relegated to a mere “ten-minute whole class warm-up at the beginning of lessons”. Written methods such as column addition and subtraction, divisively labelled as “algorithms”, were not to be taught, if at all, until students had spent five or six years jumping through a series of hoops, calculating in their heads using increasingly complex mental strategies. The easiest methods for adding and subtracting numbers literally became the last lessons on addition and subtraction.
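To see just how mechanical the maligned column method is, here is a sketch of column addition expressed as code (the function name is my own). It is exactly the procedure a child performs on paper: work digit by digit from the right, write the ones digit, carry the tens digit.

```python
def column_add(a: int, b: int) -> int:
    """Standard column (vertical) addition with carrying."""
    digits_a = [int(d) for d in str(a)][::-1]  # least significant digit first
    digits_b = [int(d) for d in str(b)][::-1]
    result = []
    carry = 0
    for i in range(max(len(digits_a), len(digits_b))):
        da = digits_a[i] if i < len(digits_a) else 0
        db = digits_b[i] if i < len(digits_b) else 0
        total = da + db + carry
        result.append(total % 10)   # write down the ones digit
        carry = total // 10         # carry the tens digit to the next column
    if carry:
        result.append(carry)
    return int("".join(str(d) for d in reversed(result)))

print(column_add(385, 5876))  # the TIMSS task above: 6261
```

The whole method is a short loop with one piece of state (the carry). That is the procedure deferred for five or six years in favour of increasingly complex mental strategies.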
The Numeracy Project researchers created their own stages of progression called the Number Framework, and so the research conveniently showed that students were, um, progressing.
Or were they? In 2005, the National Education Monitoring Project reported that students were “improving in tasks that require quantitative reasoning skills, but declining in basic mathematics facts and solving simple number problems.”
In 2009, “there was no meaningful change in number task performance between 2005 and 2009, for either year 4 or year 8 students. The most notable change in performance was a decline for year 8 students on multiplication problems, where changes in computation strategy were clearly evident.”
The long-term trend from 1997 to 2009 was “a small net improvement in mathematics performance at year 4 level (held back from a larger improvement by the decline between 2001 and 2005 in basic fact knowledge), and essentially no net change in mathematics performance at year 8 level.”
It was clear the researchers needed to do something to address the deficiencies shown up by NEMP. They did do something. They shut down NEMP.
And, presumably, knowing their jobs and their Numeracy Project facilitators’ jobs were on the line, they secured their future by having the Number Framework embedded into the revised New Zealand Curriculum and writing the Mathematics National Standards to align with their aspirations, just as the experiment was coming to an end.
In 2010, the awful truth could not be hidden any longer. The final, longitudinal Numeracy Project study concluded that “the absolute levels on the Framework attained by students were in many cases well short of the numeracy expectations for students at particular year levels stated in the New Zealand Curriculum and in the Mathematics [National] Standards.”
An estimated $100M (based on $70M spent in the first seven years) of taxpayers’ money had been spent on a revolutionary approach to teaching maths, and it didn’t work. The experiment had failed.
But it was too late and presumably too embarrassing for the Ministry of Education to pull the plug, and much easier to go along with the idea that it was a “temporary situation while teachers are continuing to upskill themselves”. Apparently, “many teachers stuck very closely to the printed NDP resources (the “pink” books)”, which “could reflect the low levels of confidence that many teachers still have [despite two years of professional development]…In retrospect, it may be that teachers needed to receive support by facilitators for considerably longer…”.
It’s remarkable that the teachers could be criticised for doing exactly as they were told. Did the Ministry ever wonder whether the pink books themselves were the problem? And what does “considerably longer” than two years of professional development look like? It suggests that not even our student teachers would be adequately prepared to teach mathematics by the time they complete a three-year Bachelor of Teaching degree.
In New Zealand, a distinction is made between mathematical (content) knowledge and pedagogical (content) knowledge (i.e. how to teach maths). The parallels with the Numeracy Project are uncanny: initial teacher education (ITE) focusses almost entirely on pedagogical knowledge, with little regard for mathematical knowledge. A 2012 survey of first-year student teachers showed that “students enter ITE with minimal levels of mathematical content knowledge…It is questionable whether their performance can be brought to an acceptable level. Currently students are not assessed before graduation to ensure they meet numeracy competency requirements.”
If publishing the results of this survey was intended to raise the alarm about the universities’ inadequate preparation of their student teachers, it fell on deaf ears. To this day, only one university in New Zealand bothers to assess the numeracy of all of its student teachers before graduation.
What can be made of such a muted response by the universities? There is little incentive to improve the training of pre-service teachers if there is an opportunity to sell Ministry-funded professional development to in-service teachers. In the decade following the Numeracy Project, primary school teachers in New Zealand engaged in higher levels of mathematics professional development than the international average. Yet, the “temporary situation” of 2010 has not improved. Page 53 of this report provides insight into what some Ministry-funded professional development programmes look like.
If, on the other hand, publishing the results of this survey was intended to provide a reason for the failure of the Numeracy Project, the researchers had left it too late. They weren’t assessing students taught in the days of old; they were assessing the earliest victims of their own experiment. These kids were now feeding back into the system as teachers.
Enough is enough. The results speak for themselves. Our primary school teachers are not at fault; the fault lies with a flawed curriculum built on an academic theory of learning. Our children deserve a quality mathematics education; right now, some of them are not even getting a basic one.
What should happen next? That will be addressed in my next post.
Dr Audrey Tan, Mathmo Consulting