Have New Zealand’s TIMSS maths scores really improved?

[Chart: timss2014-yr5. Source: NZ Ministry of Education, TIMSS 2014/15 Year 5 full report]

[Chart: timss2014-yr9. Source: NZ Ministry of Education, TIMSS 2014/15 Year 9 full report]

The latest Trends in International Mathematics and Science Study (TIMSS) data has been released. At first glance, it looks like New Zealand’s maths scores have improved since 2010, but unfortunately we cannot be certain of this. The scores are published with a statistical margin of error, which means that if we were to run the survey again with different samples of children, we might not see the same “improvement”. If we include the published margins of error, we see overlapping bands of achievement rather than rising lines from 2010 to 2014. In fact, over 20 years, New Zealand’s performance has been disappointingly consistent. We’re still below average.
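To make the “overlapping bands” point concrete, here is a minimal sketch of how to read two published scores together with their margins of error. The numbers are hypothetical placeholders, not the Ministry’s published figures.

```python
# Sketch: comparing two mean TIMSS-style scores, each published with a
# margin of error. The values below are hypothetical, not the real data.

def confidence_band(mean, margin_of_error):
    """Return the (lower, upper) band implied by a published margin of error."""
    return (mean - margin_of_error, mean + margin_of_error)

def bands_overlap(band_a, band_b):
    """True if the two bands share any values, in which case the apparent
    'improvement' could plausibly be sampling noise rather than a real change."""
    return band_a[0] <= band_b[1] and band_b[0] <= band_a[1]

# Hypothetical example: 2010 mean of 486 +/- 5, 2014 mean of 491 +/- 5
band_2010 = confidence_band(486, 5)
band_2014 = confidence_band(491, 5)

print(band_2010, band_2014)                 # (481, 491) (486, 496)
print(bands_overlap(band_2010, band_2014))  # True: the bands overlap
```

Strictly speaking, a formal comparison tests the difference between the two means against the standard error of that difference; overlapping bands are a conservative rule of thumb, but they are enough to show why a reported rise cannot simply be taken at face value.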

[Chart: timss2014-yr5-dist. Source: NZ Ministry of Education, TIMSS 2014/15 Year 5 full report]

[Chart: timss2014-yr9-dist. Source: NZ Ministry of Education, TIMSS 2014/15 Year 9 full report]

The Ministry of Education has been honest and sober in its reporting, but nevertheless the Minister of Education has declared, in congratulatory tones, that average scores have increased! How can she claim there is an improvement when her own officials say that scores haven’t changed? Is she wilfully ignoring them, or does she need a lesson in how to interpret statistical reports?

There was some encouraging growth in the proportion of Year 5 students working at an “advanced” level, but at the other end of the spectrum, fewer than half of the students sampled were working at the level of mathematics expected by the New Zealand Curriculum, and on the TIMSS questions that match New Zealand curriculum expectations, the average student answered just under half correctly. We have a high proportion of under-achieving students compared with other countries, and at Year 9 this proportion has grown since 1995.

The Bring Back Column Addition Campaign was launched in response to New Zealand’s poor performance in TIMSS 2011(*). It would appear there is no reason to stop campaigning. We asked for some simple, pragmatic changes to the curriculum that would allow under-achieving students to progress. Without them, any improvements are likely to remain statistically insignificant.

Dr Audrey Tan, Mathmo Consulting
29 November 2016

 
(*) Internationally, TIMSS data is labelled by the odd-numbered years in which students in the northern hemisphere are assessed.  New Zealand students are assessed at the end of the year prior, hence the even-numbered years referred to in the Ministry’s reports.

MCAT (Mathematical Crisis, the Awful Truth) 2016

It’s time for New Zealand to look past the hysterical response to this year’s NCEA Level 1 MCAT exam and try to understand what’s really going on here.

Was the exam appropriate in level and difficulty?

In my previous post, I analysed the second of the two (supposedly) parallel papers and found that most of the questions were at a reasonable level for NCEA Level 1, and also reflective of the title “Apply algebraic procedures in solving problems”.

There was a section that was more investigative in nature and new for MCAT (but such questions have appeared in other Level 1 maths assessments in the past).  This section was made difficult by its poor construction and confusing wording, and most Level 1 students would have struggled to understand the intention.  But most exams have a Very Hard Question (VHQ), so I guess this is the VHQ for this exam.

Was it too different from previous years?

Apart from the investigative question, I don’t think so, but I might have answered differently last year, when there was a noticeable step up. From the 2015 MCAT Exemplar:

“This year at least one of the three questions will not have any directed straight procedure-based parts and the other questions a maximum of one such part. … Candidates will not be able to provide evidence by following a direction to solve factorised quadratics, factorise, expand, write or solve a linear equation, or simplify an expression involving the collection of like terms in response to being told to. One part in each question may direct the student to perform such procedures; but without further evidence at Achievement level, this will not be sufficient for the award of the standard. Utilising procedures such as factorising, simplifying a rational function, or writing an equation from a word problem will provide evidence of solving a problem. Candidates must know that given a word problem, they will be required to write equation(s) and demonstrate consistent use of these in solving a problem. Candidates will be expected to have a basic understanding of the relationship between a quadratic function and the associated graph.”

MCAT was last reviewed in 2013 and is up for review at the end of this year.  Whether a change in style between reviews is appropriate should certainly be up for discussion.

So why did students find it so difficult?

The unfortunate reality is that students did struggle with this exam.  The gap between what MCAT is expecting of students, and what students are actually capable of, is widening.

There are complaints that the lack of “gimme” questions at the start of the paper has left students “shell-shocked” and “killed” their confidence. Are we seriously saying that our students are capable of factorising a quadratic when explicitly told to do so, but are unable to decode a basic word problem and factorise a supplied quadratic expression for themselves, even though they probably wouldn’t know what else to do with an expanded quadratic? What does this say about the resourcefulness or resilience of our students?

We cannot blame this year’s Level 1 maths teachers for what has happened, and they should rightly feel insulted.  The problem started many years before this one.

Let’s do the maths.  Year 11 students in 2016 were Year 8 students in 2013.  This is the generation of students who were failing to grasp maths fundamentals such as fractions and decimals in Year 8.

What we’re really seeing here is the fruit of a flawed primary maths curriculum working its way through the system.  Even two and a half years at secondary school isn’t enough to turn things around.  The damage is too great.

If you look at what the Numeracy Project was trying to achieve at primary school level, our secondary school students should, by all accounts, be highly numerate problem solvers, but in fact they are worryingly innumerate and apparently not very good problem solvers either.  It’s ironic that one of the big selling points of this “new approach” to teaching maths was the development of early “Algebraic Thinking”.  I think we can safely call that a Not Achieved.

A systemic failure in mathematics education is playing out before our very eyes.  NZQA is trying to inch up the standard, year by year, when the reality is that students are actually getting worse at algebra, year by year.  When students are struggling to master the basics, it’s hard to see how teachers can lift their students to the higher levels of problem solving now expected.

Given that next year’s Year 11 students will be the same generation of 9-year-olds who performed so abysmally in TIMSS 2011, alarm bells should be ringing loudly.  It would not be surprising if fewer students were entered for next year’s MCAT.

Spring forward, fall back

NZQA could make the MCAT easier again, but that would be disappointing.  I believe this year’s MCAT is the standard we should be aspiring to.  If the examination team could tighten up on the construction of certain questions, the MCAT would be an examination to be proud of on the world stage.  (The assessment side of things, however, needs a lot more work.)

And whilst I accept that normalisation is sometimes necessary, I do not think that assessment schedules should be adjusted to meet pre-defined targets as a standard practice.  The universities have already discovered that NCEA grades are an unreliable measure of preparedness for tertiary study.

The best thing NZQA can do is go back to examining algebra at the end of the year.

September is a really bad time of year for students to face their first high-stakes external examination.  Some students barely appreciate its significance when it is tangled up with mock exams for other topics and subjects, while those who do appreciate it prioritise the MCAT at the expense of preparing for their mock exams.

The sensible thing to do, surely, is to fold it in with “Tables, Equations and Graphs”.  We’re already seeing questions about graphs in the MCAT anyway, and why shouldn’t we?  Algebra and graphs are not separate topics; they are inextricably linked.  As we now see, NCEA’s compartmentalising of topics into separate assessments is hurting students’ ability to make connections and become effective problem solvers.

The decision to deliver the assessment earlier in the year and have it administered by the schools has a distinct whiff of cost-cutting about it, but it has been a disaster for maths education and is costing the country dearly.  If we want students to pursue STEM subjects at university, we need to give them every chance of succeeding in algebra at Level 1, as this effectively marks the fork in the road between calculus and statistics at Level 2.  If we want to increase the “dollar value” of Kiwis contributing to New Zealand’s economy, fixing our maths education system is a very good place to start.

Dr Audrey Tan, Mathmo Consulting
22 September 2016

The primary maths issue that won’t go away

Oh dear, it’s that pesky maths problem that won’t go away, no matter how much Government money is thrown in the wrong direction.

Whilst I would never stand in the way of any initiative that raises the bar for teaching in New Zealand, we do not need specialist maths teachers at the primary school level. What we do need is to stop asking children how they know the answer to 3 + 4 is 7, and whether there are other ways to get that answer. Is it any wonder our Year 8 students are ill-prepared for secondary school maths when their precious brain power is wasted on such trivia?

Such patronising recommendations from so-called “specialists” highlight the lack of understanding in New Zealand of what success in maths looks like. It is scandalous that the Ministry of Education continues to cling to flawed ideals created by people with no mathematical qualifications or experience, despite every indication that our children are failing, year after year. They claim that implementing effective maths teaching and learning in classrooms is “challenging and complex”. It gives the impression they’d rather see students continue to fail at maths than acknowledge the compelling evidence for a quick and effective solution.

The Bring Back Column Addition campaign was never supposed to be a long-term crusade. I thought common sense would prevail; how wrong I was, and how much I have learned about attitudes within the education sector. This campaign will continue until the Minister of Education and her officials acknowledge that the acquisition of basic maths skills is not negotiable. Every child should leave primary school knowing their single-digit addition and multiplication facts as well as they know their alphabet. They should be able to add, subtract, multiply and divide numbers fluently. They should be able to work confidently with fractions, decimals and percentages. Every parent, as a client of the system, should demand this.

Education professor John O’Neill says it would take 20 years to pull this country out of its downward spiral. It may well take that long, but while there are still some practising teachers who can remember life before the dreadful Numeracy Project was dispersed over the country like a gas bomb, let’s harness that experience and give our current children a fighting chance. Teachers, please let your students line up the columns and get them doing maths again. It’s the least you can do for our kids and our country.

Dr Audrey Tan, Mathmo Consulting
May 2016



Related: “The Primary Issue: Ministry counts cost of children failing at maths”, NZ Herald (nzherald.co.nz). National Standards figures show maths scores declining since 2002, with one in four children behind in the subject by the time they leave primary school.

Leaving a legacy worth billions

[Images: BBCA logo, Flag of New Zealand]

I have said very little in public about the New Zealand flag referendum, apart from suggesting that the referendum should have been funded by the sale of tea towels. If everyone who voted correctly in the first referendum bought a tea towel of their preferred flag for approximately $25, that would have covered the estimated $26 million. Given the Prime Minister’s financial acumen, I’m surprised he didn’t think of that himself.

Instead, New Zealand’s coffers are $26 million poorer and our Prime Minister is still chasing a legacy.

It’s not too late for John Key to leave a legacy. Instead of worrying about New Zealand’s branding and the “billions” a new flag might be worth, he should worry about the 59% of Year 8 children who are struggling to grasp fractions and decimals. He could Bring Back Column Addition and give the 48% of Year 5 children who cannot add two three-digit numbers a fighting chance. He could allow children to learn their times tables without requiring them to use “number properties” to work them out. Perhaps then, more than 8% of Year 5 children would be able to multiply two two-digit numbers.

Since the Union Jack remains embedded in our national flag, let’s look at what Britain has been up to in recent years. They’ve brought back column addition, introduced testing of times tables, brought in teaching expertise from Shanghai, and created a Mathematics Mastery curriculum inspired by Singaporean methods. You may not agree with everything they’re doing, but at least they’re doing something.

Our mother country has recognised the widespread impact of adult innumeracy in the UK. If John Key’s government fixed our primary school maths curriculum, how much would that be worth to New Zealand’s future economy? Billions. Think about it, John.

Dr Audrey Tan, Mathmo Consulting
April 2016

What are the chances of completing a Disney-Pixar Domino Stars collection from a fixed number of packets?

It’s a strange thing for a mathematical person to say, but some people are luckier than others. Luckily for our son (or his parents), we shop at Countdown, which means he received a regular supply of Disney-Pixar Domino Stars throughout the 6.5-week promotion.

It was all a bit of a yawn to me this time round. Having calculated the chances of collecting a full set of Dreamworks Heroes Action Cards, I was neither surprised nor frustrated by the increasing number of duplicates as his collection grew. The expected number of packets needed to find each new domino grows hyperbolically as the set nears completion, implying a total spend of around $4480 to complete a set of 50. Almost $700 per week?? Not very likely.
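For anyone who wants to check that figure, here is a rough sketch of the underlying “coupon collector” calculation. It assumes each packet contains one domino drawn uniformly at random from the 50 designs, and (my added assumption, not stated above) that one packet is earned for roughly every $20 spent.

```python
# Coupon collector sketch: expected number of packets needed to complete a
# set of 50 dominoes, assuming each packet holds one domino drawn uniformly
# at random from the 50 designs. The $20-per-packet rate is an assumption.

from fractions import Fraction

SET_SIZE = 50
COST_PER_PACKET = 20  # assumed: one packet per ~$20 of shopping

# When k designs are still missing, the expected number of packets needed to
# find a new one is SET_SIZE / k, so the total expectation is
# SET_SIZE * (1/1 + 1/2 + ... + 1/SET_SIZE).
expected_packets = SET_SIZE * sum(Fraction(1, k) for k in range(1, SET_SIZE + 1))

print(float(expected_packets))                    # ~224.96 packets on average
print(float(expected_packets) * COST_PER_PACKET)  # ~$4499 at the assumed rate
```

Rounded down to 224 packets at $20 each, that is the $4480 quoted above, or nearly $700 per week over a 6.5-week promotion.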

To cut a long story short, on the last day of the promotion, we trudged home from a swap meet needing just two more to complete the set. Our chances of finding those last two looked bleak…except that a neighbour had just dropped off a big bag of 22 packets in our letterbox.

As I said, some people are luckier than others, and to our amazement, our son actually found the very two that he needed! Now how lucky is that?! I mean, what are the chances…?

Without going into too much detail, you can either focus on what you don’t want to happen, i.e. you get none of one of the coveted dominoes and any number of the other:

[Expression image: domino-stars-1]

or you can focus on what you do want to happen, i.e. you eventually find one of the two that you’re looking for, and then hope to find the other one amongst the packets remaining:

[Expression image: domino-stars-2]

But a friend of ours came up with an expression that best reflects what really happened, i.e. our son stopped opening packets as soon as he found the two he needed:

[Expression image: domino-stars-3]

Reassuringly, all three expressions evaluate to this rather impressive looking fraction:

[Expression image: domino-stars-4]

So it was very close to a 1 in 8 chance. Our son doesn’t realise just how lucky he is.
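As a sanity check on that figure, here is a small sketch that evaluates the “focus on what you don’t want to happen” route by inclusion-exclusion, assuming each of the 22 packets holds one domino drawn independently and uniformly at random from the 50 designs. The expressions pictured above may be arranged differently, but under these assumptions every route gives the same value.

```python
# Probability of finding the two remaining dominoes among 22 packets,
# assuming one domino per packet, drawn independently and uniformly at
# random from the 50 designs in the set.

from fractions import Fraction
import random

DESIGNS = 50
PACKETS = 22

def prob_complete_last_two(designs=DESIGNS, packets=PACKETS):
    """Inclusion-exclusion on the events 'never see domino A' and 'never see domino B'."""
    p_miss_one = Fraction(designs - 1, designs) ** packets   # miss one specific domino
    p_miss_both = Fraction(designs - 2, designs) ** packets  # miss both of them
    return 1 - 2 * p_miss_one + p_miss_both

exact = prob_complete_last_two()
print(exact)          # the exact probability as a fraction
print(float(exact))   # ~0.125, i.e. very close to a 1 in 8 chance

# Quick simulation as a cross-check.
def simulate(trials=100_000):
    hits = 0
    for _ in range(trials):
        seen = {random.randrange(DESIGNS) for _ in range(PACKETS)}
        if 0 in seen and 1 in seen:  # designs 0 and 1 stand in for the two needed
            hits += 1
    return hits / trials

print(simulate())     # should hover around 0.125
```

Under this model the value comes out at almost exactly 0.125, hence “very close to a 1 in 8 chance”.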

Dr Audrey Tan, Mathmo Consulting
September 2015

What about the children?

Over the past two weeks, there has been a flurry of public discussion about primary maths education in New Zealand.  Two reports, released within a week of each other, came out with the same message: our primary school children are performing very badly in maths.  Not exactly news, but it’s good to see the nation talking about it.

Releasing the Crown’s National Monitoring Study of Student Achievement in Mathematics and Statistics (2013?!) late on a Friday before the long Queen’s Birthday weekend may explain the muted response to the latest depressing statistic: only around 41% of the approximately 2000 Year 8 students sampled in 2013 met the expected level of achievement.  Our primary school leavers are struggling with decimals, fractions and percentages, and don’t I know it…

But what really got people riled up was The New Zealand Initiative’s report, “Un(ac)countable: Why millions on maths returned little”, written by Rose Patterson.  As the Minister of Education said herself, it provided a “fresh perspective” on New Zealand’s maths learning woes.

Now that the media hullabaloo is settling down, let’s try to set the record straight.  The report investigated and established the failure of the Numeracy Project by:

  • providing evidence of a decline in student maths performance that aligns with the roll-out of the Project;
  • debunking the myth that it’s not such a bad thing that Kiwi kids don’t “know” anything anymore because their strength lies in the higher-order areas of “applying” and “reasoning”, unlike their east-Asian rote-learning counterparts. Sorry, but it turns out those east-Asian kids are not only better at “knowing”, they’re also better at “applying” and “reasoning” because they’ve actually got some knowledge to work with.

The report also investigated maths teaching quality in our primary schools, citing a study in which a significant proportion of 125 student teachers were unable to answer some basic primary-level maths questions.  Personally, I feel this raises some serious questions about the quality of the Bachelor of Education degree.  Is it not reasonable to expect that all graduating primary school teachers should be able to do primary school maths?

Critics called it unfair, but Rose Patterson was professionally compelled to examine teacher quality after interviewing curriculum writer Vince Wright and maths education researcher Jenny Young-Loveridge.  Both interviewees prefer to blame the failure of the Numeracy Project on poor implementation by teachers rather than on its flawed ideology.  It really wasn’t the Herald’s finest moment when it accused the report, and by association the Minister of Education (who agreed to launch it), of criticising teachers.  Not a single journalist mentioned the report’s actual conclusion: that teacher quality is unlikely to have changed over the past 15 years, and that the decline in student maths performance was due to the Numeracy Project’s multiple-strategies approach to numeracy and the loss of emphasis on the basics.

I am pleased that I had the opportunity to speak publicly in support of our teaching workforce.  Frankly, it is shameful that the people who are supposed to be looking after our teachers chose not to defend them.  Instead, the NZEI Te Riu Roa president blamed the poor results on underfunding of teacher professional development (PD), even though the report pointed out that New Zealand already spends more money on maths PD than most other countries.  Even after the Minister of Education responded to the “maths problem” by promising to raise the quality of maths teaching through more PD, the NZEI president still wasn’t happy, saying things would only improve if the training was better than what is currently available.

Sigh. What a result. By the way, has anybody thought about the children lately? It is really sad when political and professional pride get in the way of helping our kids.

Had certain things happened, or not happened, the political response to the Un(ac)countable report might have panned out quite differently.  But, having been to Wellington and heard the Minister of Education’s response with my own ears, it is clear there is more work to be done.  Nobody could have held the Minister accountable for the mistakes of past governments.  In her own words, the Numeracy Project “was in line with international thinking at the time”, so she missed a great opportunity to renounce it and become a public hero.  Instead, everything’s gonna be alright now that we’ve got National Standards.  Erm, would these be the same National Standards that her Ministry has just deemed to be lacking in dependability?

What has been truly heartening, however, is the public’s response.  The Herald’s suspiciously unoriginal editorial and the unrepentant curriculum writers’ opinion editorial have been met with mockery akin to the Emperor’s New Clothes.  It’s good to see the public ain’t buying it any more.

I am proud of this campaign’s role in bringing the debate to this point, but merely talking about it won’t help our children.  A shocking amount of taxpayers’ money has been spent on what can only be described as a failed experiment.  By allowing it to continue, we are failing our children.  From here on, we are all accountable.

Dr Audrey Tan, Mathmo Consulting
June 2015

Open letter to Hon Hekia Parata MP, Minister of Education

Dear Minister Parata,

I write in response to your speech at the launch of the New Zealand Initiative’s report “Un(ac)countable: Why millions on maths returned little” on 4 June 2015.

With respect, the Crown’s recently released National Monitoring Study of Student Achievement has NOT shown that the system is failing a minority of students. To quote: “The curriculum expectation at Year 8 is that students will be working solidly at Level 4. About 41 percent of [approximately 2000] Year 8 students achieved at Level 4 or higher on the KAMSI assessment.” In actual fact, the system is failing the majority of primary school students in New Zealand, not a minority.

It is incorrect to suggest that the Un(ac)countable report continues the age-old debate in education between those who believe in rote learning and those who place a higher value on critical thinking. In suggesting so, you have precisely proved the report’s point: “Rather than striking a good balance between instrumental learning and relational learning, and enabling the two to build on each other, they tend to be falsely dichotomised. They should work in tandem.”

Looking at the graphs, we can see that Year 5 student performance in TIMSS has been on the decline since 2002, Year 9 student performance in TIMSS has also been on the decline since 2002, and PISA 2012 showed a sharp decline in the performance of 15-year-old students in mathematical literacy, in stark contrast to the OECD average. To say that New Zealand student performance in these international assessments has “declined slightly in recent years” is something of an understatement.

Striking the right balance between the practice and mastery of basic skills and developing higher-order thinking is much easier than you claim, and should be guided by evidence. A report released last month, written by mathematician Assoc. Prof. Anna Stokke for the C.D. Howe Institute in Canada, explains that “studies consistently show direct instruction is much more effective than discovery-based instruction, which leads to straightforward recommendations on how to tilt the balance toward best instructional techniques.”

Your commitment to raise the quality of maths teaching in New Zealand is welcome. However, the Un(ac)countable report shows that teacher quality is unlikely to have changed over the last 15 years, and the true reason for the decline in New Zealand student performance in mathematics is the loss of emphasis on the basics. Until the Ministry acknowledges the overwhelming evidence and addresses the deficiencies in curriculum content and delivery, throwing more money at professional development for teachers will, sadly, have little effect. That is why I have decided to share this letter, so that parents, teachers and principals can also examine the evidence and make appropriate choices for the children in front of them. By working together, I am confident that we will bring back column addition to New Zealand’s early primary maths curriculum.

Yours sincerely,

Dr Audrey M. Tan
Mathmo Consulting