researchED Auckland 2018


Isn’t it crazy that, in 2018, we’re still “working out what works” in Education?

In fact, some of us do already have a pretty good idea of what works, but getting the right people to listen is a different problem altogether.

And so, a group of like-minded individuals (and maybe a couple of sceptics) gave up their Saturday on Queen’s Birthday weekend to attend New Zealand’s very first researchED conference in Auckland. researchED is a growing movement based in the UK but spreading internationally, “a grass-roots, teacher-led project that aims to make teachers research-literate and pseudo-science proof” (and by golly does this country need proofing).

Founder Tom Bennett quickly realised that his own teacher training was based more on edu-myths and dogma (e.g. learning styles) than on any scientific, evidence-based research.  He’s not the only one.  Daisy Christodoulou’s book, Seven Myths About Education, is the coffee that any waking 21st century learning fanatic should smell.  Briar Lipson at the New Zealand Initiative hasn’t spent very long in this country, but she has already sized up our education system very well and should be commended for bringing researchED to New Zealand.

Every talk raised serious questions about how we teach in New Zealand, and everyone was there in the belief that we can, and should, be doing better.  Not surprisingly, the academics are calling for the Ministry of Education to change their ways and look for evidence before adopting fads as policies, while the pragmatic principals and teachers cannot afford to wait and are simply getting on with things.

The common thread of the day was subject knowledge and the importance of committing knowledge to long-term memory.  The 21st century learning ethos suggests that we should leapfrog, or at least skim over, such foundational knowledge in a bid to produce generic critical thinkers and problem solvers, but surely common sense tells us that students cannot reasonably be expected to think critically or solve problems unless they actually have some knowledge to work with.

I have no desire to repeat what has been said so well by others, so instead I will direct readers to a newly created blog by Derek Hopper, a music teacher at Tauraroa Area School who has read up on what works and is spreading the word.  He and his colleagues are seeing significant improvements in student behaviour and achievement. Happy students, happy teachers.  Having already spoken to a maths teacher at Tauraroa who is offering guidance to their primary teachers, I believe this school may well provide the model for other schools to follow.

Some other reflections of the day:

Tom Bennett, founder of researchED: Teachers might think that indulging in (catering for individual) learning styles is a harmless bit of fun, but there is no time to waste when teaching children from disadvantaged backgrounds.  Every minute counts.

Katharine Birbalsingh, keynote speaker and founder/Headmistress of the evidence-informed Michaela Community School in London: Her teachers do not play “Guess what’s in my head?”, i.e. they don’t question their students before the relevant knowledge has been taught, so that every student, regardless of their background, has an equal chance of answering the teachers’ questions correctly.  A subtle but powerful way to address social inequity and level the playing field.

Dr. Michael Johnston, Victoria University: When new skills are learned and practised sufficiently, they become automatic and free up the working memory to concentrate on higher-order thinking.  With particular reference to mathematics pedagogy, the current NCEA internal assessment system provides little incentive for students to practise skills and procedures to the point of automaticity, and if they haven’t reached that point, then they will struggle with the cognitive demands of solving the contextualised problems presented in assessment.

Prof. Elizabeth Rata, Auckland University: Already widely known for her views on the lack of academic knowledge in the curriculum.  When she used the definition of the apostrophe as an example of understanding the epistemic structure of academic knowledge, I genuinely thought she was going to ask the audience if they had spotted the misplaced apostrophe in the previous slide.  She didn’t.  I suddenly felt alone.

Dr. Graham McPhail, Auckland University: There is little evidence that deep learning occurs through subject integration.  Wineburg and Grossman (2000) warned that ‘often the choice to implement a new curriculum is based on symbolic factors, such as a desire to be seen as progressive and in the forefront of reform’.

Louise Zame, primary school teacher:  When listening to a teacher speak so eloquently about the professional challenges of implementing Inquiry Learning…to a bunch of 5-7 year olds…you realise just how much the Ministry of Education has lost the plot.  As part of her Master’s research, Louise asks the pertinent question: what content knowledge do young students (aged 5-7 years) gain through inquiry learning?

Dr. Shaun Hawthorne, Cognition Education Ltd: Prof. John Hattie has recently updated his list of influences on student achievement, and top of the list is now “collective teacher efficacy” with a whopping effect size of 1.57.  For those unfamiliar with Hattie’s effect size measure, almost everything on the list has a positive effect, so a positive effect alone is no cause for complacency: teachers and schools should be looking to maximise their impact and punch above the average effect size of 0.40.
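For readers who have never seen one calculated, here is a minimal sketch of an effect size in Python. The score lists are invented purely for illustration; Cohen’s d (the standard measure of this kind, and the one underlying comparisons like Hattie’s) is simply the difference in group means divided by the pooled standard deviation.

```python
import statistics

# Hypothetical test scores for a control class and a class receiving
# some intervention -- the numbers are invented for illustration only.
control = [52, 48, 55, 60, 47, 50, 53, 49]
intervention = [54, 55, 50, 58, 52, 56, 51, 53]

def cohens_d(a, b):
    """Difference in means divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    # statistics.variance() is the sample variance (n - 1 denominator)
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(b) - statistics.mean(a)) / pooled_var ** 0.5

d = cohens_d(control, intervention)
# An effect above 0.40 clears Hattie's "hinge point"; here d is roughly 0.5.
```

The point of the 0.40 benchmark is exactly this: since nearly any intervention produces some positive d, only effects comfortably above the average are worth celebrating.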

To finish:

  • I was probably the only person excited to spend a bit of time in the Vaughan Jones Room during the lunch break.
  • Great care must be exercised when evaluating “evidence-based research”.  There is a lot of rubbish out there.  For example, the Numeracy Development Projects “research” showed that if you teach children strategies then children will learn strategies.  Big deal.
  • The panel discussion at the end left me in no doubt of the monumental challenge we face trying to fix New Zealand’s education system. To quote John Morris, “Currently education policy is being determined by political imperatives. It should not be. All policy initiatives, and in education there are so many of them, should be evidence-based.”
  • Tom Haig from the NZPPTA was naturally highly sensitive to the political undertones of the day and felt the debate was too one-sided.  Perhaps that’s because there is little to debate when we rely on evidence.  If the focus on credible and reliable evidence can take the politics out of Education, then bring it on I say, for I can think of no group of stakeholders less politically-minded than our precious children.

Dr Audrey Tan, Mathmo Consulting
8 June 2018

NCEA – where less is more

Our “revolutionary” National Certificate(s) of Educational Achievement (NCEA) secondary school qualification, built on ideals of inclusion and equity, has failed to deliver on many fronts.

The New Zealand Initiative has highlighted some serious problems with the NCEA system. Briar Lipson’s well-researched report, Spoiled by Choice: How NCEA Hampers Education and What It Needs To Succeed, exposed the harsh reality behind the dramatic growth in the number of students achieving NCEA Level 2 each year. NCEA performance may be “improving”, but the international survey PISA shows that our 15-year-olds’ capabilities in maths, science and reading are declining.


Moreover, despite there being a minimum requirement for literacy and numeracy credits, a Tertiary Education Commission (TEC) study in 2014 found that reasonably large proportions of students with NCEA Level 1 (approximately half) or NCEA Level 2 (four out of 10) were not functionally literate, and similar proportions were not functionally numerate.


So what can we actually ascertain about a student with an NCEA qualification?

Very little, according to the Initiative’s second report, Score! Transforming NCEA Data. It drew attention to the huge variation in grade distribution across subject sub-fields. It is easier to gain an Excellence grade in Languages and Performing Arts than in Mathematics and Statistics. And, not surprisingly, they found that more students pass internal assessments than external assessments.


In an attempt to redress these imbalances, the economists at the Initiative came up with a Weighted Relative Performance Index (WRPI) that endeavours to make “sense” of a student’s NCEA credits and provide a fairer comparison of performance between students. 1

It was a laudable attempt. However, one blatant inequity in the NCEA system that was not addressed by the WRPI, and has yet to be discussed widely, is the absence of a sensible time restriction applied to individual external assessments.

Allow me to explain. In Mathematics, students at any of the three NCEA Levels may be entered for a three-hour external examination comprising up to three achievement standards. Each achievement standard is a self-contained assessment/paper, sealed up in its own plastic wrapping. (Biodegradable, I hope.)

It would seem reasonable that each of the three papers should be completed in approximately one hour. 2 Therefore, it would seem reasonable that if a school decides to enter a student for fewer than three achievement standards, then the duration of the exam should be reduced, i.e. one hour for one paper, two hours for two papers. But no, not with NCEA! Schools may enter students for one or two papers, and students still get three hours to complete them!

This obviously puts students entered for three papers at a disadvantage. Are these students supposed to console themselves that they are receiving a better education, even if their peers come out with higher grades because they had more time?

Schools can, and do, game the system by entering their students for fewer than three papers. But students can also game the system by electing to not attempt all of the papers for which they are entered. If they leave the plastic wrapping intact, they will receive a Standard Not Attempted/Assessed (SNA), and apparently this is better than a Not Achieved (N).

Really?? According to this memo, it’s better for schools because School Result Summaries will include N’s but not SNA’s. Also, SNA allows for the possibility that “a student ran out of time so an N would not be a fair result”. In other words, it’s better to fail to try than to try and fail.

For those chasing an Endorsement, it is actually in their best interests to attempt fewer papers and go for higher grades. For example, for a student chasing an Excellence endorsement, two Excellence grades would be preferable to three Merit grades.

Our national secondary school examination system actually rewards students for doing less work.

The acknowledgement of effort is lacking even within an individual paper. In a traditional marking scheme, every correct answer would contribute to the final mark, but NCEA is a standards-based assessment system with “top down” marking. 3 Therefore, confident students can take a gamble and jump straight to the harder parts of a question. It’s a risky strategy, but if it pays off, they may achieve Excellence having answered roughly a third of the paper. This flies in the face of the instruction printed on the cover page: “You should attempt ALL the questions in this booklet.” If you do well, then some of what you do could turn out to be a waste of time.

NCEA is a system that does little to incentivise students to put in maximum effort or to persevere if the results are likely to be sub-optimal. Until these problems are addressed, mediocrity will prevail. We welcome the Ministry of Education’s review of NCEA this year, and hope to be part of that discussion.

Dr Audrey Tan, Mathmo Consulting
21 March 2018

1   Ironically, NZQA in their utopian socialist bubble didn’t want us to compare students, but it’s happening anyway. Students compare themselves, universities have already come up with their own weighted metrics, and employers are learning that “E” grades on a Record of Achievement are actually higher than “A” grades.

2   Prior to 2013, each paper used to start with the recommendation “You are advised to spend 60 minutes answering the questions in this booklet.”, but not any more.

3   Anecdotally, not all markers appear to follow this methodology. Perhaps they too feel that any positive efforts should be acknowledged, even if ultimately ignored in the final score.

Maths I Can Do – a maths version of Shape of You by Ed Sheeran

It turns out Ed Sheeran’s number knowledge is not so bad, but subtraction is his weak spot.

Last month, I delivered a talk to members of the New Zealand Educational Institute (NZEI Te Riu Roa) in Christchurch. The talk was oversubscribed, limited by the size of the venue.

I explained why the current primary maths curriculum is failing our children and the cognitive science behind it. I demonstrated how to develop algebraic thinking (another big failure of the Numeracy Project) to support computational thinking (in the context of the new Digital Technologies curriculum).

I also responded to teachers’ feedback on the areas their students find particularly difficult. It wasn’t a great surprise to see that subtraction was a common problem.

It turns out Ed Sheeran’s number knowledge is not so bad, but subtraction is his weak spot too.

Ed’s maths quiz and fondness for mathematical symbols inspired me to write a maths-themed version of his Platinum hit “Shape of You”, with the deeper meaning of the lyrics revealed in my talk.

“Maths I Can Do” is for New Zealand teachers and their students to sing in their classrooms, but classrooms in other countries may enjoy it too. It is for non-profit educational purposes only. Please do not use it commercially.

Please share as widely as possible to raise awareness of New Zealand’s big maths problem.

Dr Audrey Tan, Mathmo Consulting
2 September 2017

What are National Standards worth when our 10-year-olds are the worst in the world at multiplication?

Our maths curriculum has made our children so bad at basic arithmetic that they’d be better off guessing.

Teachers and parents would naturally think that if a child is at or above the National Standard in Mathematics, then that child must be doing okay.

Results from the international Trends in Mathematics and Science Survey (TIMSS) tell us something quite different.

At the end of 2014, a representative sample of 6,321 New Zealand Year 5 students, with an average age of 10.0 years, was surveyed. Out of 49 countries, New Zealand placed 34th, behind all other participating predominantly English-speaking countries. Radio New Zealand put it a little more bluntly.

To be fair, some of the questions were considered too advanced for a New Zealand Year 5 student. However, when restricted to the questions deemed appropriate against the New Zealand Year 5 National Standards, the average student answered fewer than half of those questions correctly.

And yet, the National Standards data for 2014 tells us that 73.2% of New Zealand Year 5 students were at or above the National Standard. If we match this up with the TIMSS international benchmarks, it suggests that some of these students who were at or above the National Standard would probably have been classified as Low achievers in TIMSS.

A student meeting the Low international benchmark “has some basic mathematical knowledge. They can add and subtract whole numbers, have some understanding of multiplication by one-digit numbers, and can solve simple word problems. They have some knowledge of simple fractions, geometric shapes, and measurement. Students can read and complete simple bar graphs and tables.”

This is well below the standard we should expect for a 10-year-old. Now, spare a thought for the students who were classified as Below Low…

16% of our 10-year-old TIMSS participants were Below Low. These students completed fewer than half of the Low benchmark tasks correctly. This is a significant proportion compared to other countries, e.g. England (4%), the United States (5%), Australia (9%). In the top performing countries, less than 1% of their 10-year-olds are Below Low.

More concerning are the statistically significant increases in the large proportions of Māori (26%) and Pasifika (31%) students who were Below Low. If we are going to address the inequality in this country, providing these students with a maths education leading to greater opportunities would be a very good place to start.

I analysed the performance of our TIMSS 10-year-olds question by question. They were mostly on the wrong side of average, but the stand-out failures were the basic arithmetic questions.

To add to the humiliation of coming last, 27 × 43 was a multiple-choice question with four options. New Zealand’s result is worse than what we would expect from random guessing (25%). The previous cycle of TIMSS suggests that, as a constructed-response question, the success rate would have been even lower.

Our current maths curriculum has made our children so bad at basic arithmetic that they’d be better off guessing. Is this a standard to be proud of?

One might claim that it doesn’t matter – maths is not about the numbers, after all. TIMSS dispels that myth. There was a very strong positive correlation between country performance in Number and performance in both Geometric Shapes and Measures, and Data Display. There was also a very strong positive correlation between country performance in Knowledge and performance in both Applying and Reasoning. Suffice it to say, number knowledge is a very strong predictor of success in all areas of mathematics.

Taxpayers should consider how much money has been spent on primary maths education since 2000 and question a further $126m being spent over four years on more of the same, without addressing the obvious weak spot in the curriculum. I spoke to Radio NZ about it.

The cost of rolling out the Numeracy Project amounted to around $70m in the first seven years. That’s about $85m in today’s money. The goal of the Numeracy Project “was to improve student performance in mathematics through improving the professional capability of teachers”. It failed.

In 2015, the Minister of Education at the time said that around $70m a year was available for professional development, and that was before she promised further money for maths professional development in response to the New Zealand Initiative’s Unaccountable report.

TIMSS informs us that New Zealand has higher proportions of teachers who participate in maths professional development compared to most other countries. Despite all this professional development, student performance has not improved since 2002, so why should we believe that further money spent on teacher training will make any difference?

To put these astonishing sums of money into perspective, the Government will spend just $40m on rolling out the brand new Digital Technologies curriculum, including $24m on teacher training. This is an area in which teachers will have very little experience, especially programming.

There is a much cheaper and more effective form of professional development: roll out my recent presentation to members of the New Zealand Educational Institute (NZEI Te Riu Roa) nationwide, and then see the maths that our kids can do.

Dr Audrey Tan, Mathmo Consulting
2 September 2017

TIMSS resources:
TIMSS 2015: New Zealand Year 5 Maths results, Ministry of Education
What we know about maths achievement: New Zealand Year 5 and Year 9 results from TIMSS 2014/15, Ministry of Education
TIMSS 2015 International Database

Have New Zealand’s PISA rankings really improved?

The PISA 2015 results are out and the Minister of Education is claiming an improvement in New Zealand’s rankings! Unfortunately, a closer look at the Mathematics scores shows that citing a move from 23rd place to 21st place as an improvement is pure fantasy.

Liechtenstein, ranked 8th in 2012, did not participate in 2015. Had they participated, it is unlikely that their 2012 score (535) would have fallen far enough to affect New Zealand’s ranking. New Zealand went up a place simply because Liechtenstein pulled out.

Vietnam scored 511 in 2012 but has dropped back significantly to 495 in 2015, exactly the same score as New Zealand. It’s not clear to me why New Zealand was ranked one place ahead of Vietnam, and not the other way round.

These facts alone mean that New Zealand could easily be placed at 23rd again.

There are two other countries whose performance has affected New Zealand’s rankings. Australia has dropped significantly from 504 in 2012 to 494 in 2015. On the other hand, Norway has improved significantly from 489 in 2012 to 502 in 2015.

The net effect on New Zealand’s ranking is actually zero.
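The arithmetic can be made concrete with a short Python sketch. It uses only the country scores cited above, plus New Zealand’s published 2012 score of 500, and ignores every other country, so treat it as an illustration of the ranking logic rather than a recomputation of the full PISA table.

```python
# Mathematics mean scores for the countries discussed in the text.
# Liechtenstein is absent from the 2015 table because it withdrew.
scores_2012 = {"Liechtenstein": 535, "Vietnam": 511, "Australia": 504,
               "New Zealand": 500, "Norway": 489}
scores_2015 = {"Vietnam": 495, "Australia": 494,
               "New Zealand": 495, "Norway": 502}

nz12, nz15 = scores_2012["New Zealand"], scores_2015["New Zealand"]

# Countries above NZ in 2012 that are below, tied with, or absent from
# the 2015 table -- each one lifts NZ's apparent rank by a place.
lifted = [c for c, s in scores_2012.items()
          if c != "New Zealand" and s > nz12
          and scores_2015.get(c, -1) <= nz15]

# Countries below NZ in 2012 that are above it in 2015 -- each one
# pushes NZ's rank down a place.
pushed = [c for c, s in scores_2012.items()
          if c != "New Zealand" and s < nz12
          and scores_2015.get(c, -1) > nz15]

apparent_gain = len(lifted) - len(pushed)  # 3 lifts minus 1 push = 2 places
```

Of the three “lifts”, one is a withdrawal and one is merely a fall to a tie broken in New Zealand’s favour; discount those two, and Australia’s fall and Norway’s rise cancel exactly.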

A more mature approach to understanding the PISA results is to look at New Zealand’s recent and long-term score trends, relative to the OECD average.

From 2012 to 2015, all of our scores (in Maths, Science and Reading) have dropped, but in line with the OECD average. However, there is a much more concerning long-term decline, including a significant drop from 2009 to 2012 that does not follow the trend of the OECD average. The 28-point drop in Mathematics from 2003 to 2015 is equivalent to nearly a year’s worth of schooling.

Of particular concern are the growing proportions of low-achieving children performing below Level 2. In Reading, students below Level 2 “have difficulty with all but the simplest reading tasks measured by PISA. Level 2 is considered a baseline level at which students begin to demonstrate the reading skills and competencies that will enable them to participate effectively later in life.” In Mathematics, “Level 2 is considered to be a baseline level at which students begin to demonstrate the competencies that will enable them to participate actively in mathematics-related life situations.” In 2015, 22% of New Zealand’s 15-year-old students could “complete only relatively basic mathematics tasks and whose lack of skills is a barrier to learning”.

Source: NZ Ministry of Education, PISA 2015: New Zealand Summary Report

PISA (Programme for International Student Assessment) is an international study that assesses and compares how well countries are educationally preparing their 15-year-old students to meet real-life opportunities and challenges. With our apparent long-term decline in all three subjects, and in conjunction with our perennially poor performance in TIMSS, can we honestly say that New Zealand is heading in the right direction?

Dr Audrey Tan, Mathmo Consulting
7 December 2016

Have New Zealand’s TIMSS maths scores really improved?

Source: NZ Ministry of Education, TIMSS 2014/15 Year 5 full report

Source: NZ Ministry of Education, TIMSS 2014/15 Year 9 full report

The latest Trends in Mathematics and Science Study (TIMSS) data has been released. At first glance, it looks like New Zealand’s maths scores have improved since 2010, but unfortunately we cannot be certain of this. The scores are published with a statistical margin of error, which means that if we were to run the survey again with different samples of children, we might not see the same “improvement”. If we include the published margins of error, we see overlapping bands of achievement rather than increasing lines from 2010 to 2014. In fact, over 20 years, New Zealand’s performance has been disappointingly consistent. We’re still below average.
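To see why overlapping margins of error matter, here is a minimal sketch. The Year 5 mean scores (486 in 2010, 491 in 2014) are the figures reported by the Ministry; the standard errors are hypothetical round numbers of the typical size for TIMSS national samples, so treat the exact z value as illustrative.

```python
import math

# Year 5 mean maths scores from the Ministry's TIMSS reports;
# the standard errors below are assumed, illustrative values.
mean_2010, se_2010 = 486, 2.6
mean_2014, se_2014 = 491, 2.7

# 95% confidence intervals for each cycle
ci_2010 = (mean_2010 - 1.96 * se_2010, mean_2010 + 1.96 * se_2010)
ci_2014 = (mean_2014 - 1.96 * se_2014, mean_2014 + 1.96 * se_2014)

# Standard error of the difference between two independent means
se_diff = math.sqrt(se_2010**2 + se_2014**2)
z = (mean_2014 - mean_2010) / se_diff

# Below the 1.96 threshold: the 5-point "improvement" could be sampling noise.
significant = abs(z) > 1.96
```

With errors of this size, the two confidence intervals overlap and the difference falls short of statistical significance, which is exactly what the overlapping bands in the Ministry’s charts are telling us.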


The Ministry of Education has been honest and sober in its reporting, but nevertheless, the Minister of Education has said, in congratulatory tones, that average scores had increased! How can she claim there is an improvement when her own officials say that scores haven’t changed? Is she wilfully ignoring them, or does she need a lesson on how to interpret statistical reports?

There was some encouraging growth in the proportion of Year 5 students working at an “advanced” level. At the other end of the spectrum, however, fewer than half of the students sampled were working at the desired level of mathematics in the New Zealand Curriculum, and when looking only at the TIMSS questions that fit New Zealand curriculum expectations, the average student answered just under half of them correctly. We have a high proportion of under-achieving students compared to other countries, and at the Year 9 level, this proportion has grown since 1995.

The Bring Back Column Addition Campaign was launched in response to New Zealand’s poor performance in TIMSS 2011(*). It would appear there is no reason to stop campaigning. We asked for some simple, pragmatic changes to the curriculum that would allow under-achieving students to progress. Without them, any improvements are likely to remain statistically insignificant.

Dr Audrey Tan, Mathmo Consulting
29 November 2016

(*) Internationally, TIMSS data is labelled by the odd-numbered years in which students in the northern hemisphere are assessed.  New Zealand students are assessed at the end of the year prior, hence the even-numbered years referred to in the Ministry’s reports.