It’s time for New Zealand to look past the hysterical response to this year’s NCEA Level 1 MCAT exam and try to understand what’s really going on here.

**Was the exam appropriate in level and difficulty?**

In my previous post, I analysed the second of the two (supposedly) parallel papers and found that most of the questions were at a reasonable level for NCEA Level 1, and also reflective of the title “Apply algebraic procedures in solving problems”.

There was a section that was more investigative in nature and new for MCAT (but such questions have appeared in other Level 1 maths assessments in the past). This section was made difficult by its poor construction and confusing wording, and most Level 1 students would have struggled to understand the intention. But most exams have a Very Hard Question (VHQ), so I guess this is the VHQ for this exam.

**Was it too different from previous years?**

Apart from the investigative question, I don’t think so, *but* I might have answered differently last year, when there was a noticeable step up in difficulty. From the 2015 MCAT Exemplar:

“*This year at least one of the three questions will not have any directed straight procedure-based parts and the other questions a maximum of one such part. … candidates will not be able to provide evidence by following a direction to solve factorised quadratics, factorise, expand, write or solve a linear equation, or simplify an expression involving the collection of like terms in response to being told to. One part in each question may direct the student to perform such procedures; but without further evidence at Achievement level, this will not be sufficient for the award of the standard. Utilising procedures such as factorising, simplifying a rational function, or writing an equation from a word problem will provide evidence of solving a problem. Candidates must know that given a word problem, they will be required to write equation(s) and demonstrate consistent use of these in solving a problem. Candidates will be expected to have a basic understanding of the relationship between a quadratic function and the associated graph.*”

MCAT was last reviewed in 2013 and is up for review at the end of this year. Whether a change in style between reviews is appropriate should certainly be up for discussion.

**So why did students find it so difficult?**

The unfortunate reality is that students *did* struggle with this exam. The gap between what MCAT is expecting of students, and what students are actually capable of, is widening.

There are complaints that the lack of “gimme” questions at the start of the paper left students “shell-shocked” and “killed” their confidence. Are we seriously saying that our students are capable of factorising a quadratic when explicitly told to do so, but unable to decode a basic word problem and factorise a *supplied* quadratic expression for themselves, even though factorising is probably the only thing they would know to do with an expanded quadratic? What does this say about the resourcefulness and resilience of our students?

We cannot blame this year’s Level 1 maths teachers for what has happened, and they should rightly feel insulted. The problem started many years before this one.

Let’s do the maths. Year 11 students in 2016 were Year 8 students in 2013. This is the generation of students who were failing to grasp maths fundamentals such as fractions and decimals in Year 8.

What we’re really seeing here is the fruits of a flawed primary maths curriculum floating its way through the system. Even two and a half years at secondary school isn’t enough to turn things around. The damage is too great.

If you look at what the Numeracy Project was trying to achieve at primary school level, our secondary school students should, by all accounts, be highly numerate problem solvers, but in fact they are worryingly innumerate and apparently not very good problem solvers either. It’s ironic that one of the big selling points of this “new approach” to teaching maths was the development of early “Algebraic Thinking”. I think we can safely call that a Not Achieved.

A systemic failure in mathematics education is playing out before our very eyes. NZQA is trying to inch up the standard, year by year, when the reality is that students are actually getting worse at algebra, year by year. When students are struggling to master the basics, it’s hard to see how teachers can lift their students to the higher levels of problem solving now expected.

Given that next year’s Year 11 students will be the same generation of 9-year-olds who performed so abysmally in TIMSS 2011, alarm bells should be ringing loudly. It would not be surprising if fewer students were entered for next year’s MCAT.

**Spring forward, fall back**

NZQA could make the MCAT easier again, but that would be disappointing. I believe this year’s MCAT is the standard we should be aspiring to. If the examination team could tighten up on the construction of certain questions, the MCAT would be an examination to be proud of on the world stage. (The assessment side of things, however, needs a lot more work.)

And whilst I accept that normalisation is sometimes necessary, I do not think that assessment schedules should be adjusted to meet pre-defined targets as a standard practice. The universities have already discovered that NCEA grades are an unreliable measure of preparedness for tertiary study.

The best thing NZQA can do is go back to examining algebra at the end of the year.

September is a really bad time of year for students to face their first high-stakes external examination. Some students barely appreciate its significance when it is tangled up with mock exams for other topics and subjects, while those who do appreciate it prioritise the MCAT at the expense of preparing for their mock exams.

The sensible thing to do, surely, is to fold it in with “Tables, Equations and Graphs”. We’re already seeing questions about graphs in the MCAT anyway, and why shouldn’t we? Algebra and Graphs are not separate topics; they are inextricably linked. As we now see, NCEA’s compartmentalising of topics into separate assessments is hurting students’ ability to make connections and become effective problem solvers.

The decision to deliver the assessment earlier in the year and have it administered by the schools has a distinct whiff of cost-cutting about it, but it has been a disaster for maths education and is costing the country dearly. If we want students to pursue STEM subjects at university, we need to give them every chance of succeeding in algebra at Level 1, as this effectively marks the fork in the road between calculus and statistics at Level 2. If we want to increase the “dollar value” of Kiwis contributing to New Zealand’s economy, fixing our maths education system is a very good place to start.

Dr Audrey Tan, Mathmo Consulting

22 September 2016