Maths I Can Do – a maths version of Shape of You by Ed Sheeran

It turns out Ed Sheeran’s number knowledge is not so bad, but subtraction is his weak spot.

Last month, I delivered a talk to members of the New Zealand Educational Institute (NZEI Te Riu Roa) in Christchurch. The talk was oversubscribed, limited by the size of the venue.

I explained why the current primary maths curriculum is failing our children and the cognitive science behind it. I demonstrated how to develop algebraic thinking (another big failure of the Numeracy Project) to support computational thinking (in the context of the new Digital Technologies curriculum).

I also responded to teachers’ feedback on the areas their students find particularly difficult. It wasn’t a great surprise to see that subtraction was a common problem.

It turns out Ed Sheeran’s number knowledge is not so bad, but subtraction is his weak spot too.

Ed’s maths quiz and fondness for mathematical symbols inspired me to write a maths-themed version of his Platinum hit “Shape of You”, with the deeper meaning of the lyrics revealed in my talk.

“Maths I Can Do” is for New Zealand teachers and their students to sing in their classrooms, but classrooms in other countries may enjoy it too. It is for non-profit educational purposes only. Please do not use it commercially.

Please share as widely as possible to raise awareness of New Zealand’s big maths problem.

Dr Audrey Tan, Mathmo Consulting
2 September 2017

What are National Standards worth when our 10-year-olds are the worst in the world at multiplication?

Our maths curriculum has made our children so bad at basic arithmetic that they’d be better off guessing.

Teachers and parents would naturally think that if a child is at or above the National Standard in Mathematics, then that child must be doing okay.

Results from the international Trends in Mathematics and Science Survey (TIMSS) tell us something quite different.

At the end of 2014, a representative sample of 6,321 Year 5 students with an average age of 10.0 years was surveyed. Out of 49 countries, New Zealand placed 34th, behind all other participating predominantly English-speaking countries. Radio New Zealand put it a little more bluntly.

To be fair, some of the questions were considered too advanced for a New Zealand Year 5 student. However, when restricted to the questions deemed appropriate against the New Zealand Year 5 National Standards, the average student answered fewer than half of those questions correctly.

And yet, the National Standards data for 2014 tells us that 73.2% of New Zealand Year 5 students were at or above the National Standard. If we match this up with the TIMSS international benchmarks, it suggests that some of these students who were at or above the National Standard would probably have been classified as Low achievers in TIMSS.

A student meeting the Low international benchmark “has some basic mathematical knowledge. They can add and subtract whole numbers, have some understanding of multiplication by one-digit numbers, and can solve simple word problems. They have some knowledge of simple fractions, geometric shapes, and measurement. Students can read and complete simple bar graphs and tables.”

This is well below the standard we should expect for a 10-year-old. Now, spare a thought for the students who were classified as Below Low…

16% of our 10-year-old TIMSS participants were Below Low. These students completed fewer than half of the Low benchmark tasks correctly. This is a significant proportion compared to other countries, e.g. England (4%), the United States (5%), Australia (9%). In the top performing countries, less than 1% of their 10-year-olds are Below Low.

More concerning are the statistically significant increases in the large proportions of Māori (26%) and Pasifika (31%) students who were Below Low. If we are going to address the inequality in this country, providing these students with a maths education leading to greater opportunities would be a very good place to start.

I analysed the performance of our TIMSS 10-year-olds, question by question. They were mostly on the wrong side of average, but the stand-out questions were the basic arithmetic questions:

To add to the humiliation of coming last, 27 x 43 was a multiple-choice question with four options. New Zealand’s result is worse than what we would expect from random guessing (25%). The previous cycle of TIMSS suggests the success rate would have been even lower had students been required to construct their own answers.

Our current maths curriculum has made our children so bad at basic arithmetic that they’d be better off guessing. Is this a standard to be proud of?
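The guessing baseline is easy to make concrete. The sketch below assumes nothing about the actual TIMSS data; it simply shows that the expected success rate when choosing uniformly among four options is 25%:

```python
import random

def expected_guess_rate(n_options: int) -> float:
    """Expected success rate when guessing uniformly among n options."""
    return 1 / n_options

def simulate_guessing(n_students: int, n_options: int = 4, seed: int = 0) -> float:
    """Simulate n_students each guessing one multiple-choice question at random."""
    rng = random.Random(seed)
    correct = sum(1 for _ in range(n_students) if rng.randrange(n_options) == 0)
    return correct / n_students

print(expected_guess_rate(4))              # 0.25
print(simulate_guessing(100_000))          # close to 0.25
```

Any cohort scoring below that 25% line on a four-option question is, as claimed, doing worse than blind guessing.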

One might claim that it doesn’t matter – maths is not about the numbers, after all. TIMSS dispels that myth. There was a very strong positive correlation between country performance in Number and performance in both Geometric Shapes and Measures, and Data Display. There was also a very strong positive correlation between country performance in Knowledge and performance in both Applying and Reasoning. Suffice it to say, number knowledge is a very strong predictor of success in all areas of mathematics.

Taxpayers should consider how much money has been spent on primary maths education since 2000 and question a further $126m being spent over four years on more of the same, without addressing the obvious weak spot in the curriculum. I spoke to Radio NZ about it.

The cost of rolling out the Numeracy Project amounted to around $70m in the first seven years. That’s about $85m in today’s money. The goal of the Numeracy Project “was to improve student performance in mathematics through improving the professional capability of teachers”. It failed.

In 2015, the Minister of Education at the time said that around $70m a year was available for professional development, and that was before she promised further money for maths professional development in response to the New Zealand Initiative’s Unaccountable report.

TIMSS informs us that New Zealand has higher proportions of teachers who participate in maths professional development compared to most other countries. Despite all this professional development, student performance has not improved since 2002, so why should we believe that further money spent on teacher training will make any difference?

To put these astonishing sums of money into perspective, the Government will spend just $40m on rolling out the brand new Digital Technologies curriculum, including $24m on teacher training. This is an area in which teachers will have very little experience, especially programming.

There is a much cheaper yet effective form of professional development. Roll out my recent presentation to members of the New Zealand Educational Institute (NZEI Te Riu Roa) nationwide, and then see the maths that our kids can do.

Dr Audrey Tan, Mathmo Consulting
2 September 2017

TIMSS resources:
TIMSS 2015: New Zealand Year 5 Maths results, Ministry of Education
What we know about maths achievement: New Zealand Year 5 and Year 9 results from TIMSS 2014/15, Ministry of Education
TIMSS 2015 International Database

Have New Zealand’s PISA rankings really improved?

The PISA 2015 results are out and the Minister of Education is claiming an improvement in New Zealand’s rankings! Unfortunately, a closer look at the Mathematics scores shows that citing a move from 23rd place to 21st place as an improvement is pure fantasy.

Liechtenstein, ranked 8th in 2012, did not participate in 2015. Had it taken part, its score (535 in 2012) would have been unlikely to fall far enough to affect New Zealand’s ranking. New Zealand automatically went up a place just because Liechtenstein pulled out.

Vietnam scored 511 in 2012 but has dropped back significantly to 495 in 2015, exactly the same score as New Zealand. It’s not clear to me why New Zealand was ranked one place ahead of Vietnam, and not the other way round.

These facts alone mean that New Zealand could easily be placed at 23rd again.

There are two other countries whose performance has affected New Zealand’s rankings. Australia has dropped significantly from 504 in 2012 to 494 in 2015. On the other hand, Norway has improved significantly from 489 in 2012 to 502 in 2015.

The net effect on New Zealand’s ranking is actually zero.

A more mature approach to understanding the PISA results is to look at New Zealand’s recent and long-term score trends, relative to the OECD average.

From 2012 to 2015, all of our scores (in Maths, Science and Reading) have dropped, but in line with the OECD average. However, there is a much more concerning long-term decline, with a significant drop from 2009 to 2012, that does not follow the same trend as the OECD average. The 28 point drop in Mathematics from 2003 to 2015 is equivalent to nearly a year’s worth of schooling.

Of particular concern are the growing proportions of low-achieving children performing below Level 2. In Reading, students below Level 2 “have difficulty with all but the simplest reading tasks measured by PISA. Level 2 is considered a baseline level at which students begin to demonstrate the reading skills and competencies that will enable them to participate effectively later in life.” In Mathematics, “Level 2 is considered to be a baseline level at which students begin to demonstrate the competencies that will enable them to participate actively in mathematics-related life situations.” In 2015, 22% of New Zealand’s 15-year-old students could “complete only relatively basic mathematics tasks and whose lack of skills is a barrier to learning”.

pisa2015-nzscience

pisa2015-nzreading

pisa2015-nzmathsscores

pisa2015-nzmathsproficiency

Source: NZ Ministry of Education, PISA 2015: New Zealand Summary Report

PISA (Programme for International Student Assessment) is an international study that assesses and compares how well countries are educationally preparing their 15-year-old students to meet real-life opportunities and challenges. With our apparent long-term decline in all three subjects, and in conjunction with our perennial poor performance in TIMSS, can we honestly say that New Zealand is heading in the right direction?

Dr Audrey Tan, Mathmo Consulting
7 December 2016

Have New Zealand’s TIMSS maths scores really improved?

timss2014-yr5
Source: NZ Ministry of Education, TIMSS 2014/15 Year 5 full report

timss2014-yr9
Source: NZ Ministry of Education, TIMSS 2014/15 Year 9 full report

The latest Trends in Mathematics and Science Study (TIMSS) data has been released. At first glance, it looks like New Zealand’s maths scores have improved since 2010, but unfortunately we cannot be certain of this. The scores are published with a statistical margin of error, which means that if we were to run the survey again with different samples of children, we might not see the same “improvement”. If we include the published margins of error, we see overlapping bands of achievement rather than increasing lines from 2010 to 2014. In fact, over 20 years, New Zealand’s performance has been disappointingly consistent. We’re still below average.
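The overlapping-bands point can be sketched numerically. The values below are illustrative, not the published TIMSS figures; the idea is simply that two mean scores are statistically indistinguishable when their confidence intervals overlap:

```python
def confidence_interval(mean: float, margin_of_error: float) -> tuple[float, float]:
    """Confidence interval as (low, high), given a published margin of error."""
    return (mean - margin_of_error, mean + margin_of_error)

def intervals_overlap(a: tuple[float, float], b: tuple[float, float]) -> bool:
    """True if the two (low, high) intervals have any values in common."""
    return a[0] <= b[1] and b[0] <= a[1]

# Hypothetical mean scores with a +/-10 point margin of error.
ci_2010 = confidence_interval(486, 10)   # (476, 496)
ci_2014 = confidence_interval(491, 10)   # (481, 501)

# The intervals overlap, so the apparent 5-point "improvement"
# could be sampling noise rather than a real change.
print(intervals_overlap(ci_2010, ci_2014))  # True
```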

timss2014-yr5-dist
Source: NZ Ministry of Education, TIMSS 2014/15 Year 5 full report

timss2014-yr9-dist
Source: NZ Ministry of Education, TIMSS 2014/15 Year 9 full report

The Ministry of Education has been honest and sober in its reporting, but nevertheless, the Minister of Education has said, in congratulatory tones, that average scores had increased! How can she claim there is an improvement when her own officials say that scores haven’t changed?  Is she wilfully ignoring them, or does she need a lesson on how to interpret statistical reports?

There was some encouraging growth in the proportion of Year 5 students working at an “advanced” level. At the other end of the spectrum, however, fewer than half of the sampled students were working at the desired level of mathematics in the New Zealand Curriculum, and when looking only at the TIMSS questions which fit with New Zealand curriculum expectations, the average student answered just under half of them correctly. We have a high proportion of under-achieving students compared to other countries, and at the Year 9 level, this proportion has grown since 1995.

The Bring Back Column Addition Campaign was launched in response to New Zealand’s poor performance in TIMSS 2011(*). It would appear there is no reason to stop campaigning. We asked for some simple, pragmatic changes to the curriculum that would allow under-achieving students to progress. Without them, any improvements are likely to remain statistically insignificant.

Dr Audrey Tan, Mathmo Consulting
29 November 2016

 
(*) Internationally, TIMSS data is labelled by the odd-numbered years in which students in the northern hemisphere are assessed.  New Zealand students are assessed at the end of the year prior, hence the even-numbered years referred to in the Ministry’s reports.

MCAT (Mathematical Crisis, the Awful Truth) 2016

It’s time for New Zealand to look past the hysterical response to this year’s NCEA Level 1 MCAT exam and try to understand what’s really going on here.

Was the exam appropriate in level and difficulty?

In my previous post, I analysed the second of the two (supposedly) parallel papers and found that most of the questions were at a reasonable level for NCEA Level 1, and also reflective of the title “Apply algebraic procedures in solving problems”.

There was a section that was more investigative in nature and new for MCAT (but such questions have appeared in other Level 1 maths assessments in the past).  This section was made difficult by its poor construction and confusing wording, and most Level 1 students would have struggled to understand the intention.  But most exams have a Very Hard Question (VHQ), so I guess this is the VHQ for this exam.

 Was it too different from previous years?

Apart from the investigative question, I don’t think so, but I might have said differently last year, when there was a noticeable step up.  From the 2015 MCAT Exemplar:

This year at least one of the three questions will not have any directed straight procedure-based parts and the other questions a maximum of one such part.…candidates will not be able to provide evidence by following a direction to solve factorised quadratics, factorise, expand, write or solve a linear equation, or simplify an expression involving the collection of like terms in response to being told to.  One part in each question may direct the student to perform such procedures; but without further evidence at Achievement level, this will not be sufficient for the award of the standard. Utilising procedures such as factorising, simplifying a rational function, or writing an equation from a word problem will provide evidence of solving a problem.  Candidates must know that given a word problem, they will be required to write equation(s) and demonstrate consistent use of these in solving a problem. Candidates will be expected to have a basic understanding of the relationship between a quadratic function and the associated graph.

MCAT was last reviewed in 2013 and is up for review at the end of this year.  Whether a change in style between reviews is appropriate should certainly be up for discussion.

So why did students find it so difficult?

The unfortunate reality is that students did struggle with this exam.  The gap between what MCAT is expecting of students, and what students are actually capable of, is widening.

There are complaints that the lack of “gimme” questions at the start of the paper has left students “shell-shocked” and “killed” their confidence.  Are we seriously saying that our students are capable of factorising a quadratic when explicitly told to do so, but they are unable to decode a basic word problem and factorise a supplied quadratic expression for themselves, even though they probably wouldn’t know of anything else to do with an expanded quadratic?  What does this say about the resourcefulness or resilience of our students?

We cannot blame this year’s Level 1 maths teachers for what has happened, and they should rightly feel insulted.  The problem started many years before this one.

Let’s do the maths.  Year 11 students in 2016 were Year 8 students in 2013.  This is the generation of students who were failing to grasp maths fundamentals such as fractions and decimals in Year 8.

What we’re really seeing here is the fruits of a flawed primary maths curriculum floating its way through the system.  Even two and a half years at secondary school isn’t enough to turn things around.  The damage is too great.

If you look at what the Numeracy Project was trying to achieve at primary school level, our secondary school students should, by all accounts, be highly numerate problem solvers, but in fact they are worryingly innumerate and apparently not very good problem solvers either.  It’s ironic that one of the big selling points of this “new approach” to teaching maths was the development of early “Algebraic Thinking”.  I think we can safely call that a Not Achieved.

A systemic failure in mathematics education is playing out before our very eyes.  NZQA is trying to inch up the standard, year by year, when the reality is that students are actually getting worse at algebra, year by year.  When students are struggling to master the basics, it’s hard to see how teachers can lift their students to the higher levels of problem solving now expected.

Given that next year’s Year 11 students will be the same generation of 9-year-olds who performed so abysmally in TIMSS 2011, alarm bells should be ringing loudly.  It would not be surprising if fewer students were entered for next year’s MCAT.

Spring forward, fall back

NZQA could make the MCAT easier again, but that would be disappointing.  I believe this year’s MCAT is the standard we should be aspiring to.  If the examination team could tighten up on the construction of certain questions, the MCAT would be an examination to be proud of on the world stage.  (The assessment side of things, however, needs a lot more work.)

And whilst I accept that normalisation is sometimes necessary, I do not think that assessment schedules should be adjusted to meet pre-defined targets as a standard practice.  The universities have already discovered that NCEA grades are an unreliable measure of preparedness for tertiary study.

The best thing NZQA can do is go back to examining algebra at the end of the year.

September is a really bad time of year for students to face their first high-stakes external examination.  Some students barely appreciate its significance when it is tangled up with mock exams for other topics and different subjects, and the ones that do appreciate its significance prioritise the MCAT at the expense of preparing for their mock exams.

The sensible thing to do, surely, is to fold it in with “Tables, Equations and Graphs”.  We’re already seeing questions about graphs in the MCAT anyway, and why shouldn’t we?  Algebra and Graphs are not separate topics; they are inextricably tied.  As we now see, NCEA’s compartmentalising of topics as separate assessments is hurting students’ ability to make connections and become effective problem solvers.

The decision to deliver the assessment earlier in the year and have it administered by the schools has a distinct whiff of cost-cutting about it, but it has been a disaster for maths education and is costing the country dearly.  If we want students to pursue STEM subjects at university, we need to give them every chance of succeeding in algebra at Level 1, as this effectively marks the fork in the road between calculus and statistics at Level 2.  If we want to increase the “dollar value” of Kiwis contributing to New Zealand’s economy, fixing our maths education system is a very good place to start.

Dr Audrey Tan, Mathmo Consulting
22 September 2016

Analysis of the 2016 MCAT Exam (Day 2)

The media is buzzing with excitement over last week’s NCEA Level 1 MCAT (Mathematics Common Assessment Task) examination.  Students are in tears and teachers are outraged over the exam that was “very different in style” and “far too difficult”.

For those who don’t know what the MCAT is, or why the MCAT might be so important, here are some salient facts:

  • The examination topic is algebra, specifically “Apply algebraic procedures in solving problems”.
  • This is an external NZQA exam, administered internally by the high schools in September. Prior to 2011, algebra was examined in November, as part of the three-hour end-of-year external exam.
  • Calculators are NOT allowed in this examination.
  • Schools may struggle to administer the exam in a single day, so there are two similar, but different, versions of the exam – one to be delivered on a Tuesday, the other on a Thursday.
  • The traditional assessment (as opposed to the trial online EMCAT) was made harder last year. From the official NZQA assessment (marking) schedule:  “The style of some of the questions in this year’s assessment has changed so as to align more closely with the requirements of the achievement standard. The title of the standard requires the candidate to use algebraic procedures in solving problems.”
  • Students who don’t do well in algebra at NCEA Level 1 will have limited options at NCEA Level 2. Such students are typically not allowed to study calculus at Level 2.
  • Students who don’t do well in algebra will struggle with calculus.
  • Students who don’t do well in algebra or calculus will find it difficult to pursue STEM subjects at university.

So just how different or difficult was the exam?  Now that the papers are in the public domain, I will review one of them (the Thursday paper) here.  If you aren’t interested in the details, you can skip to the conclusion here.

[Update: I have seen the assessment (marking) schedule and added comments below.  As with most NCEA exams, it’s not the questions I have so much of a problem with, it’s how they are assessed…]

image001.png

The intention here is probably for students to factorise the quadratic as image003.png and supply the two factors as the side lengths.  However, I suspect the examination team failed to notice that this question has infinitely many possible answers.  For example, another factorisation that students might have reasonably obtained is

image005.png

The wording of this question should have been sharpened so that the intended factorisation (if there was one) was made clear.  (See Question Three (a)(i) below.)

[Update: As predicted, the assessment schedule does not allow for any answer other than the intended factorisation.  I pity anyone who offered a different but valid answer.]

Poor wording aside, factorising quadratics is a basic algebraic procedure that is typically introduced to Year 10 students, and this question put a thin veneer of an application on top.  Level 1 students should be familiar with applying algebra in the context of measuring area, albeit in the opposite direction.  If they knew how to multiply two quantities to form the area of a rectangle, it doesn’t seem unreasonable to expect them to recognise that this question was asking them to reverse that process.

image007.png

This is a reasonable question.  Students at this level should be able to solve a quadratic equation that is not equal to zero to begin with, but can be made equal to zero and then factorised.

[Update: To my surprise, if this question was answered in full, it was worth an Excellence!  That should cheer up a few people.]
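The exam question itself isn’t reproduced here, but the technique described above (rearrange so the equation equals zero, then factorise) can be sketched with SymPy on a made-up example:

```python
from sympy import symbols, Eq, factor, solve

x = symbols("x")

# Hypothetical example of the same type: x^2 + 2x = 15.
equation = Eq(x**2 + 2*x, 15)

# Step 1: move everything to one side so the equation equals zero.
lhs = equation.lhs - equation.rhs           # x**2 + 2*x - 15

# Step 2: factorise, then read off the roots.
print(factor(lhs))                          # (x - 3)*(x + 5)
print(solve(equation, x))                   # roots are -5 and 3
```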

image008.png

This is a simultaneous equations question, not trivial, but reasonable.

[Update: If students were able to write down at least one equation correctly, that was worth an Achieved.]

image009.png

This is a basic procedure, namely adding algebraic fractions.  There is hardly anything to do here – they even supply the answer.  The subtraction of algebraic fractions appears more commonly in maths exams because students are frequently caught out by multiplication of negative numbers.
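A generic illustration of the sign trap (not the exam question): when subtracting algebraic fractions, the minus sign must be distributed over the whole second numerator, which SymPy can confirm:

```python
from sympy import symbols, together, simplify, expand

x = symbols("x")

# Hypothetical example: x/(x + 1) - 2/(x - 3), combined over a common denominator.
combined = together(x/(x + 1) - 2/(x - 3))
print(combined)

# The common error is forgetting to distribute the minus sign over 2*(x + 1),
# i.e. writing x*(x - 3) - 2*x + 2 instead of x*(x - 3) - 2*x - 2.
correct_numerator = x*(x - 3) - 2*(x + 1)
print(expand(correct_numerator))            # x**2 - 5*x - 2
```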

image010.png

This is probably one of the questions that some people are suggesting is more suitable for NCEA Level 2.  I am inclined to disagree.  Certainly, a general index equation might require logarithms to solve it, but not this one.  As long as students recognise that image011.png, this question is manageable.

Furthermore, this index question is not much harder than index questions in previous years.  Last year’s trial online EMCAT asked “What is the connection between image013.png and image015.png if image017.png?”  Again, as long as students know that image019.png, they should be able to say something sensible.  In the 2014 MCAT, students were asked to solve image021.png.

In summary, I feel that Question One is fair.  Part (a)(i) might have deviated from early questions in past papers of the “Factorise this” or “Simplify that” variety, but the application was not particularly difficult or surprising.  As NZQA pointed out last year, the title of the standard is “Apply algebraic procedures in solving problems”.

 

image023.png

This question is an algebraic substitution question.  The fact that the equation modelled a parabola is again a thin veneer of an application.  If students didn’t know what a parabola was, I hope they managed to ignore it and press on.

image024.png

Solving algebraic inequalities seems to unsettle students, even though the steps required are almost identical to the steps required to solve algebraic equations – there is only one extra thing to remember, which is to reverse the inequality if multiplying or dividing by a negative number.  All steps required to solve this question should be known to students at this level.  [Update: Even just one correct expansion was worth an Achieved.]
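As a generic illustration (again, not the exam question), solving -2x + 6 > 10 follows the same steps as the matching equation, with one flip of the inequality when dividing by -2. The hand-derived answer, x < -2, can be spot-checked numerically:

```python
# Solve -2x + 6 > 10 by hand:
#   -2x > 4        (subtract 6 from both sides)
#    x < -2        (divide by -2 AND reverse the inequality)
def satisfies(x: float) -> bool:
    """Does x satisfy the original inequality -2x + 6 > 10?"""
    return -2*x + 6 > 10

# Values below -2 should all work; values at or above -2 should all fail.
print(all(satisfies(x) for x in (-3, -2.5, -10)))   # True
print(any(satisfies(x) for x in (-2, 0, 5)))        # False
```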

Interestingly, this was one of the few occasions where the parallel question in the Tuesday paper was noticeably trickier at first appearance, because there were identical factors on both sides of the inequality and it might have been tempting to cancel them out.  Unfortunately, that would have led to something nonsensical.  The lack of equivalence between these parallel questions is something that should have been picked up by the examination team and corrected.

image025.png

This is potentially another question that looks like a Level 2 question, but isn’t.  A reasonable first attempt at solving this would be to rewrite the inequality as image026.png.  A sharp observer might notice that

image029.pngimage031.pngimage032.png

Either way, as long as students know their powers of 2 up to 32, they should see that the whole number image035.png has to be less than or equal to 5.  As noted above, index questions of this type have been asked in previous years.  Changing an equality symbol to an inequality symbol does not affect the algebra required to solve the problem.
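Without reproducing the exam’s expression, the reasoning described amounts to finding the largest whole-number exponent n with 2^n at most 32, which a student who knows the powers of 2 can do by inspection:

```python
def largest_exponent(base: int, limit: int) -> int:
    """Largest whole number n with base**n <= limit."""
    n = 0
    while base ** (n + 1) <= limit:
        n += 1
    return n

# A student who knows 2^5 = 32 can see the exponent must be at most 5.
print(largest_exponent(2, 32))  # 5
```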

image036.png

This question requires students to expand and simplify two expressions and then look for similarities between them.  If students had seen a similar 2014 MCAT question, in which students were asked to write image037.png in terms of image039.png, then they would have been adequately prepared to attempt this question.  [Update: Even just one correct expansion was worth an Achieved.]

image041.png

OK, this is where things start to get interesting!  Students were given 9 lines of paper in which to conduct their investigation and answer this part of the question.

Firstly, what is meant by “when Janine changes the order of the numbers in Line 1”?  The directness of the question “Does she get the same answer as in Line 4?” suggests that we are supposed to investigate a single event.  But there are 23 ways in which Janine could change the order of the numbers.  Are students expected to try all 23 ways?  Over 9 lines of paper, probably not.  If we are meant to infer that Janine changes the numbers only once, how can we possibly investigate if we don’t know what the new order looks like?  The wording of the question is decidedly murky.

In actual fact, the answer is “yes” or “no”, depending on how the numbers are re-ordered.  I’ve seen probability questions in which a “yes” is an Achieved answer but a “no” is a Merit answer, but at this early stage in this question, students would be choosing a new order at random, so it would be unfair if either answer fell into a different achievement band.

[Update:  If you tried one rearrangement and said “yes” or “no”, that was worth an Achieved.  So even though I understood the question well enough to perform a succinct investigation with the preferable answer of “no”, I would have only gotten an Achieved.  If you tried two rearrangements but didn’t say anything, that was also an Achieved.  The wording of the Merit criteria is just as nonsensical as the question, but I think they want at least two rearrangements with a point of difference and some sort of statement about one’s findings.]

image042.png

I had to parse this sentence many times and came to the conclusion that it doesn’t make a lot of sense, even after tweaking it to read “Find, using algebra, the relationship between the numbers in Line 1 and the numbers in Line 4 when she changes the order of the numbers in Line 1.”  If students are meant to investigate what happens to the expression for the number in Line 4 after the numbers are changed, then it has not been made very clear.  A better question would be “Using algebra, show how the number in Line 4 might be affected if Janine changes the order of the numbers in Line 1.”

This question doesn’t feel like problem solving.  We are using algebra to make a general observation.  Students at Level 1 will have very little experience of using algebra in this way, but such questions have been seen in other Level 1 Maths assessments in the past, namely “Tables, Equations and Graphs”, an end-of-year assessment concerned specifically with the application of algebra to graphs.

It is certainly possible to express the number in Line 4 in terms of the numbers in Line 1 at any given point in time.  For example, if the numbers in Line 1 are called image043.png, then the number in Line 4 is image045.png.  But perhaps we are supposed to pay attention to the pattern of the example numbers 2, 4, 6, 8 and call the numbers image047.png?  Is this important or is this a distraction?

I would conjecture that a student would be better able to answer part (i) after answering part (ii).  That is the whole point of algebra, after all.  It enables us to see patterns in numbers, or at least understand better the patterns that we see, because the numbers themselves often get in the way.  Assuming there is a pattern in the numbers in Line 1 gets in the way of truly understanding any pattern that might be observed in the triangular formation, and ideally we should not be distracted by this.  So my preferred setting of the first two parts of Question Two would be split into three parts:

  • Janine changed the order of the numbers in Line 1 and found that it changed the number in Line 4. What might Janine’s new ordering look like?
  • Janine wonders whether changing the order of the numbers in Line 1 will always change the number in Line 4. Use algebra to find the relationship between arbitrary numbers in Line 1 and the resulting number in Line 4.
  • Use your expression for the number in Line 4 to explain how Janine could change the order of the numbers in Line 1 but not change the number in Line 4.
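My reading of the puzzle (an assumption, since the question itself isn’t shown here) is that each line holds the sums of adjacent pairs from the line above. On that assumption, a short SymPy check shows why some reorderings of Line 1 leave Line 4 unchanged:

```python
from sympy import symbols, expand

a, b, c, d = symbols("a b c d")

def line4(numbers):
    """Assumed rule: each line is the pairwise sums of adjacent entries above."""
    line = list(numbers)
    while len(line) > 1:
        line = [line[i] + line[i + 1] for i in range(len(line) - 1)]
    return expand(line[0])

print(line4([a, b, c, d]))                           # a + 3*b + 3*c + d

# Swapping the middle two numbers leaves Line 4 unchanged...
print(line4([a, c, b, d]) == line4([a, b, c, d]))    # True

# ...but swapping an outer number with an inner one generally changes it.
print(line4([b, a, c, d]) == line4([a, b, c, d]))    # False
```

Since Line 4 expands to a + 3b + 3c + d, it is congruent to a + d modulo 3, which is presumably the observation behind the divisibility question in part (e).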

image049.png

If students were brave enough to attempt this question, hats off to them!  I quite enjoyed this question myself, but only students with a good understanding of divisibility would understand how to interpret the algebra in this question.  Furthermore, open questions such as “what do you know?” are only fair if the answers are marked “openly”.  Unfortunately, that’s not the case with NCEA.  They still have something specific in mind, answered to a greater (Excellence) or lesser (Achieved) extent, or somewhere in the middle (Merit).  Therefore, students should reasonably expect to be given better guidance as to what is intended by the question.  E.g. “If the number in Line 4 is divisible by 3, then identify the position of the number or numbers in Line 1 that are divisible by 3.”

My final comment about this question concerns its lack of continuity.  In part (iii), we were to assume there was a pattern in the numbers in Line 1, but it wasn’t the pattern demonstrated at the start of the question.  It appears that the 2, 4, 6, 8 pattern was a red herring, and the examination team should have chosen opening numbers that looked more random.

In summary, Question Two was fair up to part (d).  Part (e) was poorly written and too hard for Level 1.

 

image050.png

This question is almost identical to Question One (a)(i), but with one important difference: one of the side lengths is given, which makes the intended factorisation of the quadratic expression for the area clear.  That raises the question: why was Question One (a)(i) even included??
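A hypothetical example of the intended style (the actual expressions in the exam differ): if the area of a rectangle is x² + 7x + 12 and one side is x + 3, factorising the area forces the other side to be x + 4, which a quick numerical check confirms.

```python
# Illustrative only -- the actual expressions come from the exam paper.
# If a rectangle has area x**2 + 7*x + 12 and one given side x + 3,
# factorising the area gives the other side as x + 4.
def area(x):
    return x**2 + 7*x + 12

for x in range(1, 50):
    assert area(x) == (x + 3) * (x + 4)
```

Giving one side removes the ambiguity: there is only one sensible factorisation left to find.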

image051.png

Another open question!  What’s wrong with “State any restrictions on the value of x for this rectangle”?

[Update: This is amazing.  If you answer the question correctly, you only get an Achieved.  If you say something about the area or the side lengths, i.e. more than was asked for, you get a Merit!  If they wanted commentary on the area or the side lengths, why didn’t they just say “Explain.” at the end?]

image054.png

This is a “changing the subject of the formula” question, and questions of this type have appeared in every previous year’s paper.
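For readers unfamiliar with the phrase, “changing the subject” means rearranging a formula to isolate a different variable. A hypothetical example, not the formula in the exam: starting from A = πr², making r the subject gives r = √(A/π).

```python
import math

# Hypothetical example of "changing the subject" (not the exam's formula):
# from A = pi * r**2, making r the subject gives r = sqrt(A / pi).
def radius_from_area(A):
    return math.sqrt(A / math.pi)

# Round-trip check: compute an area from a radius, then recover the radius.
r = 2.5
A = math.pi * r**2
assert abs(radius_from_area(A) - r) < 1e-9
```

The skill being assessed is the symbolic rearrangement itself; the numerical check above is just a way of confirming the rearrangement is consistent.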

image056.png

Er, what’s this question doing here?  This is the sort of basic procedural question that would have been better placed as Question One (a)(i).

image057.png

This is not a trivial question, but it is suitable for Level 1.  However, it’s worth noting that the parallel question in the Tuesday paper resulted in a quadratic equation that was different enough that the two questions were not of equivalent difficulty.  The Tuesday students may have struggled more than the Thursday students.  This should not have happened.

[Update: Sure enough, the Tuesday question must have been done very badly, because you could get Excellence even if you didn’t quite solve the equation!  I’ve also heard reports of students (not ours!) getting caught up trying to calculate the square root of 8 without a calculator…]

image058.png

If students weren’t sure how to answer this question, looking at (ii) would have given them a good clue!

 

image060.png

Like many NCEA questions, this one is quite ‘wordy’ and requires a high level of literacy to understand.  It also requires knowledge of the features of the graph of a quadratic expression, and the use of algebra to solve a quadratic equation.  The icing on the cake is to form a percentage from the two quantities obtained (the maximum horizontal width and the width at a vertical depth of 3 cm).
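A sketch of the kind of calculation involved, with a made-up parabola (the exam's numbers will differ): model the cross-section so that the rim sits at y = 0, find the full width at the rim, solve a quadratic for the width at a depth of 3 cm, and form the percentage.

```python
import math

# Made-up numbers -- the exam's parabola will differ.
# Model the cross-section as y = x**2 - 9, with y = 0 at the rim, so the
# rim runs from x = -3 to x = 3.
max_width = 6.0                      # full width at the rim, in cm
# At a vertical depth of 3 cm (y = -3):  x**2 - 9 = -3  =>  x = +/- sqrt(6)
width_at_3cm = 2 * math.sqrt(6)      # roughly 4.9 cm
percentage = width_at_3cm / max_width * 100   # about 81.6%
```

Solving the quadratic is the algebraic heart of the problem; the percentage at the end is a single division.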

[Update: Solving the equation, i.e. the bulk of the work and the algebraic heart of the problem, was only worth a Merit.  The trivial step of calculating a numerical percentage at the end was what it took to get an Excellence.  Seriously?? Where is the extended abstract thinking in that?  I suspect the examination team pulled this question from a traditional “marks out of 100” paper but failed to modify it for NCEA.]

Although I think this is a reasonable Excellence question for this assessment, it is worth noting that this question could easily appear in the “Tables, Equations and Graphs” assessment at the end of the year.  There needs to be some discussion about whether or not teachers and students should expect the same knowledge to be potentially assessed twice.

It seems that some schools had not yet taught their students about graphs, but in all fairness, last year’s MCAT exam had questions that required graphing knowledge.

In summary, Question Three was challenging but fair, provided students had been taught the appropriate material.

In conclusion…

This exam wasn’t a walk in the park, but actually most of the questions were fair for Level 1, even if they weren’t identical in form to past exam questions.  There were certainly some poorly worded questions, but unfortunately I see them in NCEA maths exams every year.  MCAT 2016 is by no means the exception.

It is true that the MCAT now has fewer basic questions that test purely algebraic procedures, but most of these procedures should be introduced in Year 10, so it is not unreasonable to expect students to be ready to apply them in Year 11.  Given that the change in style occurred last year, I am surprised that the huge uproar didn’t occur 12 months ago.

Most importantly, I believe this year’s MCAT is the standard we should be aspiring to.  Media reports suggest the reality in schools is very different.  I will discuss this in my next post.

Dr Audrey Tan, Mathmo Consulting
20 September 2016

 

The primary maths issue that won’t go away

Oh dear, it’s that pesky maths problem that won’t go away, no matter how much Government money is thrown in the wrong direction.

Whilst I would never stand in the way of any initiative that raises the bar for teaching in New Zealand, we do not need specialist maths teachers at the primary school level. What we do need is to stop asking children how they know the answer to 3 + 4 is 7 and if there are other ways to get that answer. Is it any wonder our Year 8 students are ill-prepared for secondary school maths when their precious brain power is wasted on such trivia?

Such patronising recommendations from so-called “specialists” highlight the lack of understanding in New Zealand of what success in maths looks like. It is scandalous that the Ministry of Education continues to cling to flawed ideals created by people with no mathematical qualifications or experience, despite every indication that our children are failing, year after year. They claim that implementing effective maths teaching and learning in classrooms is “challenging and complex”. It gives the impression they’d rather see students continue to fail at maths than acknowledge the compelling evidence for a quick and effective solution.

The Bring Back Column Addition campaign was never supposed to be a long-term crusade. I thought common sense would prevail; how wrong I was, and how much I have learned about attitudes within the education sector. This campaign will continue until the Minister of Education and her officials acknowledge that the acquisition of basic maths skills is not negotiable. Every child should leave primary school knowing their single-digit addition and multiplication facts as well as they know their alphabet. They should be able to add, subtract, multiply and divide numbers fluently. They should be able to work confidently with fractions, decimals and percentages. Every parent, as a client of the system, should demand this.

Education professor John O’Neill says it would take 20 years to pull this country out of its downward spiral. It may well take that long, but while there are still some practising teachers who can remember life before the dreadful Numeracy Project was dispersed over the country like a gas bomb, let’s harness that experience and give our current children a fighting chance. Teachers, please let your students line up the columns and get them doing maths again. It’s the least you can do for our kids and our country.

Dr Audrey Tan, Mathmo Consulting
May 2016



The Primary Issue: Ministry counts cost of children failing at maths – NZ Herald (nzherald.co.nz). Maths scores have been declining since 2002, with National Standards figures showing one in four children are behind in the subject by the time they leave primary school.