# What are National Standards worth when our 10-year-olds are the worst in the world at multiplication?

Teachers and parents would naturally think that if a child is at or above the National Standard in Mathematics, then that child must be doing okay.

Results from the Trends in International Mathematics and Science Study (TIMSS) tell us something quite different.

At the end of 2014, a representative sample of 6,321 New Zealand Year 5 students, with an average age of 10.0 years, was surveyed. Out of 49 countries, New Zealand placed 34th, behind all other participating predominantly English-speaking countries. Radio New Zealand put it a little more bluntly.

To be fair, some of the questions were considered too advanced for a New Zealand Year 5 student. However, when restricted to the questions deemed appropriate against the New Zealand Year 5 National Standards, the average student answered fewer than half of those questions correctly.

And yet, the National Standards data for 2014 tells us that 73.2% of New Zealand Year 5 students were at or above the National Standard. If we match this up with the TIMSS international benchmarks, it suggests that some of these students who were at or above the National Standard would probably have been classified as Low achievers in TIMSS.

A student meeting the Low international benchmark “has some basic mathematical knowledge. They can add and subtract whole numbers, have some understanding of multiplication by one-digit numbers, and can solve simple word problems. They have some knowledge of simple fractions, geometric shapes, and measurement. Students can read and complete simple bar graphs and tables.”

This is well below the standard we should expect for a 10-year-old. Now, spare a thought for the students who were classified as Below Low…

Sixteen per cent of our 10-year-old TIMSS participants were Below Low. These students completed fewer than half of the Low benchmark tasks correctly. This is a significant proportion compared with other countries, e.g. England (4%), the United States (5%) and Australia (9%). In the top-performing countries, less than 1% of 10-year-olds are Below Low.

More concerning are the statistically significant increases in the large proportions of Māori (26%) and Pasifika (31%) students who were Below Low. If we are going to address the inequality in this country, providing these students with a maths education leading to greater opportunities would be a very good place to start.

I analysed the performance of our TIMSS 10-year-olds, question by question. They were mostly on the wrong side of average, but the stand-out results were on the basic arithmetic questions.

To add to the humiliation of coming last, 27 × 43 was a multiple-choice question with four options. New Zealand’s result was worse than what we would expect from random guessing (25%). The previous cycle of TIMSS suggests the success rate on a constructed-response version would have been even lower.
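
The guessing baseline is easy to check. A minimal sketch (the item and option count are from the text above; the simulation is simply an illustration, not TIMSS methodology):

```python
import random

# The item in question: 27 x 43, presented with four answer options.
correct_answer = 27 * 43  # 1161

# A student guessing uniformly among 4 options is expected
# to be correct 1 in 4 times.
options = 4
expected_guess_rate = 1 / options  # 0.25

# Quick simulation of many random guessers (seeded for reproducibility).
random.seed(0)
trials = 100_000
hits = sum(random.randrange(options) == 0 for _ in range(trials))
simulated_rate = hits / trials

print(correct_answer)                                   # 1161
print(expected_guess_rate)                              # 0.25
print(abs(simulated_rate - expected_guess_rate) < 0.01) # True
```

Any national success rate below 25% on such an item therefore sits below the level pure guessing would produce, which is the point being made here.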

Our current maths curriculum has made our children so bad at basic arithmetic that they’d be better off guessing. Is this a standard to be proud of?

One might claim that it doesn’t matter – maths is not about the numbers, after all. TIMSS dispels that myth. There was a very strong positive correlation between country performance in Number versus both Geometric Shapes and Measures, and Data Display. There was also a very strong positive correlation between country performance in Knowledge versus both Applying and Reasoning. Suffice to say, number knowledge is a very strong predictor of success in all areas of mathematics.

Taxpayers should consider how much money has been spent on primary maths education since 2000 and question a further \$126m being spent over four years on more of the same, without addressing the obvious weak spot in the curriculum. I spoke to Radio NZ about it.

The cost of rolling out the Numeracy Project amounted to around \$70m in the first seven years. That’s about \$85m in today’s money. The goal of the Numeracy Project “was to improve student performance in mathematics through improving the professional capability of teachers”. It failed.

In 2015, the Minister of Education at the time said that around \$70m a year was available for professional development, and that was before she promised further money for maths professional development in response to the New Zealand Initiative’s Unaccountable report.

TIMSS informs us that New Zealand has higher proportions of teachers who participate in maths professional development compared to most other countries. Despite all this professional development, student performance has not improved since 2002, so why should we believe that further money spent on teacher training will make any difference?

To put these astonishing sums of money into perspective, the Government will spend just \$40m on rolling out the brand new Digital Technologies curriculum, including \$24m on teacher training. This is an area in which teachers will have very little experience, especially programming.

There is a much cheaper and more effective form of professional development. Roll out my recent presentation to members of the New Zealand Educational Institute (NZEI Te Riu Roa) nationwide, and then see the maths that our kids can do.

Dr Audrey Tan, Mathmo Consulting
2 September 2017

TIMSS resources:
TIMSS 2015: New Zealand Year 5 Maths results, Ministry of Education
What we know about maths achievement: New Zealand Year 5 and Year 9 results from TIMSS 2014/15, Ministry of Education
TIMSS 2015 International Database

# Have New Zealand’s TIMSS maths scores really improved?

The latest Trends in International Mathematics and Science Study (TIMSS) data has been released. At first glance, it looks like New Zealand’s maths scores have improved since 2010, but unfortunately we cannot be certain of this. The scores are published with a statistical margin of error, which means that if we were to run the survey again with different samples of children, we might not see the same “improvement”. If we include the published margins of error, we see overlapping bands of achievement rather than increasing lines from 2010 to 2014. In fact, over 20 years, New Zealand’s performance has been disappointingly consistent. We’re still below average.
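
The overlap argument can be sketched numerically. The scores and margins below are illustrative, not the published TIMSS figures; the point is that when the two confidence intervals share any values, an apparent rise may be nothing more than sampling noise:

```python
def interval(score, margin):
    """95% confidence band as (low, high), given a score and its margin of error."""
    return (score - margin, score + margin)

def overlaps(a, b):
    """True if two intervals share at least one value."""
    return a[0] <= b[1] and b[0] <= a[1]

# Illustrative cycle scores with a +/-5 point margin of error.
band_2010 = interval(486, 5)  # (481, 491)
band_2014 = interval(491, 5)  # (486, 496)

# The later score is higher, but the bands overlap, so the
# "improvement" is not statistically distinguishable from no change.
print(band_2014[0] > band_2010[0])       # True: the point estimate rose
print(overlaps(band_2010, band_2014))    # True: the rise may be noise
```

Only when the bands separate entirely can an increase be reported with confidence, which is what the Ministry’s own reporting reflects.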

The Ministry of Education has been honest and sober in its reporting, but nevertheless, the Minister of Education has said, in congratulatory tones, that average scores had increased! How can she claim there is an improvement when her own officials say that scores haven’t changed? Is she wilfully ignoring them, or does she need a lesson on how to interpret statistical reports?

There was some encouraging growth in the proportion of Year 5 students working at an “advanced” level. At the other end of the spectrum, however, fewer than half of the sampled students were working at the level of mathematics expected by the New Zealand Curriculum, and when looking only at the TIMSS questions that fit New Zealand curriculum expectations, the average student answered just under half of them correctly. We have a high proportion of under-achieving students compared with other countries, and at the Year 9 level, this proportion has grown since 1995.

The Bring Back Column Addition Campaign was launched in response to New Zealand’s poor performance in TIMSS 2011(*). It would appear there is no reason to stop campaigning. We asked for some simple, pragmatic changes to the curriculum that would allow under-achieving students to progress. Without them, any improvements are likely to remain statistically insignificant.

Dr Audrey Tan, Mathmo Consulting
29 November 2016

(*) Internationally, TIMSS data is labelled by the odd-numbered years in which students in the northern hemisphere are assessed.  New Zealand students are assessed at the end of the year prior, hence the even-numbered years referred to in the Ministry’s reports.

# A brighter future for mathematics education in New Zealand

In the NZ Herald, Peter Hughes asks how TIMSS 2011 and PISA 2009 can produce “such wildly contradictory results” reflecting New Zealand children’s performance in mathematics.

We can argue endlessly trying to compare two assessments of children of different ages, taken at different times, assessing different aspects of understanding mathematics. It is pointless, not least because PISA 2009 points to a population no longer representative of 15-year-olds in New Zealand today. Whatever PISA may tell us, it should not distract our attention from the “very depressing” results produced by New Zealand primary school children who have known no other approach to learning maths than that of the Numeracy Project, the very curriculum Peter Hughes helped to write.

Hughes says we urgently need to work on algebra and geometry rather than “number”. I couldn’t agree more. So much for the pioneering curriculum that was supposed to develop “algebraic thinking” in children! Why are our primary school children spending so much time taking numbers apart and putting them back together again? This introspective over-analysis of numbers is not a good use of a young child’s time.

Curriculum co-writer Vince Wright says the Numeracy Project’s failure to deliver improved pupil performance is due to the insufficient maths knowledge of our primary school teachers. We can continue to beg for more funding to up-skill our teachers, but wouldn’t it be more practical to simplify the curriculum to meet the skill set of the teachers we have, present and future?

A bright future for maths education in New Zealand depends on bright beginnings, hence my campaign to Bring back column addition to New Zealand’s early primary maths curriculum. The aim is to redress the balance between written and mental methods of computation, and to make the curriculum more accessible to a wider range of students. I do not advocate teaching column-based methods exclusively, but it’s a good place to start. As Sir Vaughan Jones says, “We have this wonderful decimal system which took tens of thousands of years to bring to perfection and to not take advantage of it for basic operations is nothing short of folly!”

When recent untested ideologies about learning take precedence over the principles of mathematics, which by their very nature are the most logical of all, one has to wonder where things are heading. There should be a division of responsibility in writing a mathematics curriculum: content and delivery. Those with true, long-term mathematical knowledge and experience should determine the content. The educationists should determine how to deliver that content, and ensure our teachers deliver it effectively.

Our university mathematics lecturers should influence what is taught in secondary schools. They, together with well-qualified and experienced secondary school teachers, should influence what is taught in primary schools. This is my idea for maths education reform, and a brighter future for maths education, in New Zealand.

Dr Audrey Tan, Mathmo Consulting
April 2013