researchED Auckland 2018

researchED2018

Isn’t it crazy that, in 2018, we’re still “working out what works” in Education?

In fact, some of us do already have a pretty good idea of what works, but getting the right people to listen is a different problem altogether.

And so, a group of like-minded individuals (and maybe a couple of sceptics) gave up their Saturday on Queen’s Birthday weekend to attend New Zealand’s very first researchED conference in Auckland. researchED is a growing movement based in the UK but spreading internationally, “a grass-roots, teacher-led project that aims to make teachers research-literate and pseudo-science proof” (and by golly does this country need proofing). Founder Tom Bennett quickly realised that his own teacher training was based more on edu-myths and dogma (e.g. learning styles) than any scientific, evidence-based research.  He’s not the only one.  Daisy Christodoulou’s book, Seven Myths About Education, is the coffee that any waking 21st century learning fanatic should smell.  Briar Lipson at the New Zealand Initiative hasn’t spent very long in this country, but has already sized up our education system very well and should be commended for bringing researchED to New Zealand.

Every talk raised serious questions about how we teach in New Zealand, and everyone was there in the belief that we can, and should, be doing better.  Not surprisingly, the academics are calling for the Ministry of Education to change their ways and look for evidence before adopting fads as policies, while the pragmatic principals and teachers cannot afford to wait and are simply getting on with things.

The common factor of the day was subject knowledge and the importance of committing knowledge to long-term memory.  The 21st century learning ethos suggests that we should leapfrog, or at least skim over, this foundational knowledge in a bid to produce generic critical thinkers and problem solvers, but surely common sense tells us we cannot reasonably expect students to think critically or solve problems unless they actually have some knowledge to work with.

I have no desire to repeat what has been said so well by others, so instead I will direct readers to a newly created blog by Derek Hopper, a music teacher at Tauraroa Area School who has read up on what works and is spreading the word.  He and his colleagues are seeing significant improvements in student behaviour and achievement. Happy students, happy teachers.  Having already spoken to a maths teacher at Tauraroa who is offering guidance to their primary teachers, I believe this school may well provide the model for other schools to follow.

Some other reflections of the day:

Tom Bennett, founder of researchED: Teachers might think that indulging in (catering for individual) learning styles is a harmless bit of fun, but there is no time to waste when teaching children from disadvantaged backgrounds.  Every minute counts.

Katharine Birbalsingh, keynote speaker and founder/Headmistress of the evidence-informed Michaela Community School in London: Her teachers do not play “Guess what’s in my head?”, i.e. they don’t question their students before the relevant knowledge has been taught, so that every student, regardless of their background, has an equal chance of answering the teachers’ questions correctly.  A subtle but powerful way to address social inequity and level the playing field.

Dr. Michael Johnston, Victoria University: When new skills are learned and practised sufficiently, they become automatic and free up the working memory to concentrate on higher-order thinking.  With particular reference to mathematics pedagogy, the current NCEA internal assessment system provides little incentive for students to practise skills and procedures to the point of automaticity, and if they haven’t reached that point, then they will struggle with the cognitive demands of solving the contextualised problems presented in assessment.

Prof. Elizabeth Rata, Auckland University: Already widely known for her views on the lack of academic knowledge in the curriculum.  When she used the definition of the apostrophe as an example of understanding the epistemic structure of academic knowledge, I genuinely thought she was going to ask the audience if they had spotted the misplaced apostrophe in the previous slide.  She didn’t.  I suddenly felt alone.

Dr. Graham McPhail, Auckland University: There is little evidence that deep learning occurs through subject integration.  Wineburg and Grossman (2000) warned that ‘often the choice to implement a new curriculum is based on symbolic factors, such as a desire to be seen as progressive and in the forefront of reform’.

Louise Zame, primary school teacher:  When listening to a teacher speak so eloquently about the professional challenges of implementing Inquiry Learning…to a bunch of 5-7 year olds…you realise just how much the Ministry of Education has lost the plot.  As part of her Master’s research, Louise asks the pertinent question: what content knowledge do young students (aged 5-7 years) gain through inquiry learning?

Dr. Shaun Hawthorne, Cognition Education Ltd: Prof. John Hattie has recently updated his list of influences on student achievement, and top of the list is now “collective teacher efficacy” with a whopping effect size of 1.57.  For those who don’t know about Hattie’s effect size measure, almost everything on the list has a positive effect, so teachers and schools should not be too complacent. They should be looking to maximise their impact, and punching above the average effect size of 0.40.
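For readers unfamiliar with the measure: Hattie’s figures are meta-analytic aggregates, but the underlying unit is a standardised mean difference between groups. As a rough illustration only, here is Cohen’s d, one common formulation (not necessarily Hattie’s exact methodology), with invented test scores:

```python
import statistics

def cohens_d(treatment, control):
    """Standardised mean difference: (mean_t - mean_c) / pooled SD."""
    n_t, n_c = len(treatment), len(control)
    v_t, v_c = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n_t - 1) * v_t + (n_c - 1) * v_c) / (n_t + n_c - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical test scores, for illustration only:
print(cohens_d([62, 70, 75, 81, 68], [58, 64, 69, 72, 60]))  # ~1.0
```

On this scale, 0.40 is Hattie’s “hinge point” for a year’s typical progress, which is why an influence of 1.57 is so striking.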

To finish:

  • I was probably the only person excited to spend a bit of time in the Vaughan Jones Room during the lunch break.
  • Great care must be exercised when evaluating “evidence-based research”.  There is a lot of rubbish out there.  For example, the Numeracy Development Projects’ “research” showed that if you teach children strategies then children will learn strategies.  Big deal.
  • The panel discussion at the end left me in no doubt of the monumental challenge we face trying to fix New Zealand’s education system. To quote John Morris, “Currently education policy is being determined by political imperatives. It should not be. All policy initiatives, and in education there are so many of them, should be evidence-based.”
  • Tom Haig from the NZPPTA was naturally highly sensitive to the political undertones of the day and felt the debate was too one-sided.  Perhaps that’s because there is little to debate when we rely on evidence.  If the focus on credible and reliable evidence can take the politics out of Education, then bring it on I say, for I can think of no group of stakeholders less politically-minded than our precious children.

Dr Audrey Tan, Mathmo Consulting
8 June 2018

NCEA – where less is more

Our “revolutionary” National Certificate(s) of Educational Achievement (NCEA) secondary school qualification, built on ideals of inclusion and equity, has failed to deliver on many fronts.

The New Zealand Initiative has highlighted some serious problems with the NCEA system. Briar Lipson’s well-researched report, Spoiled by Choice: How NCEA Hampers Education and What It Needs To Succeed, exposed the harsh reality behind the dramatic growth in the number of students achieving NCEA Level 2 each year. NCEA performance may be “improving”, but the international survey PISA shows that our 15-year-olds’ capabilities in maths, science and reading are declining.

Spoiled_by_choice_Fig1

Moreover, despite there being a minimum requirement for literacy and numeracy credits, a Tertiary Education Commission (TEC) study in 2014 found that reasonably large proportions of students with NCEA Level 1 (approximately half) or NCEA Level 2 (four out of 10) were not functionally literate, and similar proportions were not functionally numerate.

TEC_Literacy_Fig1
TEC_Numeracy_Fig2

So what can we actually ascertain about a student with an NCEA qualification?

Very little, according to the Initiative’s second report, Score! Transforming NCEA Data. It drew attention to the huge variation in grade distributions across subject sub-fields: it is comparatively easier to gain an Excellence grade in Languages and Performing Arts than it is in Mathematics and Statistics. And not surprisingly, they found that more students pass internal assessments than external assessments.

Score_Fig4
Score_Fig2

In an attempt to redress these imbalances, the economists at the Initiative came up with a Weighted Relative Performance Index (WRPI) that endeavours to make “sense” of a student’s NCEA credits and provide a fairer comparison of performance between students. 1

It was a laudable attempt. However, one blatant inequity in the NCEA system that was not addressed by the WRPI, and has yet to be discussed widely, is the absence of a sensible time restriction applied to individual external assessments.

Allow me to explain. In Mathematics, students at any of the three NCEA Levels may be entered for a three-hour external examination comprising up to three achievement standards. Each achievement standard is a self-contained assessment/paper, sealed up in its own plastic wrapping. (Biodegradable, I hope.)

It would seem reasonable that each of the three papers should be completed in approximately one hour. 2 Therefore, it would seem reasonable that if a school decides to enter a student for fewer than three achievement standards, then the duration of the exam should be reduced, i.e. one hour for one paper, two hours for two papers. But no, not with NCEA! Schools may enter students for one or two papers, and students still get three hours to complete them!

This obviously puts students entered for three papers at a disadvantage. Are these students supposed to console themselves that they are receiving a better education, even if their peers come out with higher grades because they had more time?

Schools can, and do, game the system by entering their students for fewer than three papers. But students can also game the system by electing to not attempt all of the papers for which they are entered. If they leave the plastic wrapping intact, they will receive a Standard Not Attempted/Assessed (SNA), and apparently this is better than a Not Achieved (N).

Really?? According to this memo, it’s better for schools because School Result Summaries will include N’s but not SNA’s. Also, SNA allows for the possibility that “a student ran out of time so an N would not be a fair result”. In other words, it’s better to fail to try than to try and fail.

For those chasing an Endorsement, it is actually in their best interests to attempt fewer papers and go for higher grades. For example, for a student chasing an Excellence endorsement, two Excellence grades would be preferable to three Merit grades.

Our national secondary school examination system actually rewards students for doing less work.

The acknowledgement of effort is lacking even within an individual paper. In a traditional marking scheme, every correct answer would contribute to the final mark, but NCEA is a standards-based assessment system with “top down” marking. 3 Therefore, confident students can take a gamble and jump straight to the harder parts of a question. It’s a risky strategy, but if it pays off, they may achieve Excellence having answered roughly a third of the paper. This flies in the face of the instruction printed on the cover page: “You should attempt ALL the questions in this booklet.” If you do well, then some of what you do could turn out to be a waste of time.

NCEA is a system that does little to incentivise students to put in maximum effort or to persevere if the results are likely to be sub-optimal. Until these problems are addressed, mediocrity will prevail. We welcome the Ministry of Education’s review of NCEA this year, and hope to be part of that discussion.

Dr Audrey Tan, Mathmo Consulting
21 March 2018

1   Ironically, NZQA in their utopian socialist bubble didn’t want us to compare students, but it’s happening anyway. Students compare themselves, universities have already come up with their own weighted metrics, and employers are learning that “E” grades on a Record of Achievement are actually higher than “A” grades.

2   Prior to 2013, each paper used to start with the recommendation “You are advised to spend 60 minutes answering the questions in this booklet”, but not any more.

3   Anecdotally, not all markers appear to follow this methodology. Perhaps they too feel that any positive efforts should be acknowledged, even if ultimately ignored in the final score.

Have New Zealand’s PISA rankings really improved?

The PISA 2015 results are out and the Minister of Education is claiming an improvement in New Zealand’s rankings! Unfortunately, a closer look at the Mathematics scores shows that citing a move from 23rd place to 21st place as an improvement is pure fantasy.

Liechtenstein, ranked 8th in 2012, did not participate in 2015. Had they participated, it is unlikely their score (535) would have fallen so far as to affect New Zealand’s ranking. New Zealand automatically went up a place simply because Liechtenstein pulled out.

Vietnam scored 511 in 2012 but has dropped back significantly to 495 in 2015, exactly the same score as New Zealand. It’s not clear to me why New Zealand was ranked one place ahead of Vietnam, and not the other way round.

These facts alone mean that New Zealand could easily be placed at 23rd again.

There are two other countries whose performance has affected New Zealand’s rankings. Australia has dropped significantly from 504 in 2012 to 494 in 2015. On the other hand, Norway has improved significantly from 489 in 2012 to 502 in 2015.

The net effect on New Zealand’s ranking is actually zero.

A more mature approach to understanding the PISA results is to look at New Zealand’s recent and long-term score trends, relative to the OECD average.

From 2012 to 2015, all of our scores (in Maths, Science and Reading) have dropped, but in line with the OECD average. However, there is a much more concerning long-term decline, with a significant drop from 2009 to 2012 that does not follow the same trend as the OECD average. The 28-point drop in Mathematics from 2003 to 2015 is equivalent to nearly a year’s worth of schooling.

Of particular concern are the growing proportions of low-achieving children performing below Level 2. In Reading, students below Level 2 “have difficulty with all but the simplest reading tasks measured by PISA. Level 2 is considered a baseline level at which students begin to demonstrate the reading skills and competencies that will enable them to participate effectively later in life.” In Mathematics, “Level 2 is considered to be a baseline level at which students begin to demonstrate the competencies that will enable them to participate actively in mathematics-related life situations.” In 2015, 22% of New Zealand’s 15-year-old students could “complete only relatively basic mathematics tasks” and their “lack of skills is a barrier to learning”.

pisa2015-nzscience

pisa2015-nzreading

pisa2015-nzmathsscores

pisa2015-nzmathsproficiency

Source: NZ Ministry of Education, PISA 2015: New Zealand Summary Report

PISA (Programme for International Student Assessment) is an international study that assesses and compares how well countries are educationally preparing their 15-year-old students to meet real-life opportunities and challenges. Given our apparent long-term decline in all three subjects, in conjunction with our perennial poor performance in TIMSS, can we honestly say that New Zealand is heading in the right direction?

Dr Audrey Tan, Mathmo Consulting
7 December 2016

MCAT (Mathematical Crisis, the Awful Truth) 2016

It’s time for New Zealand to look past the hysterical response to this year’s NCEA Level 1 MCAT exam and try to understand what’s really going on here.

Was the exam appropriate in level and difficulty?

In my previous post, I analysed the second of the two (supposedly) parallel papers and found that most of the questions were at a reasonable level for NCEA Level 1, and also reflective of the title “Apply algebraic procedures in solving problems”.

There was a section that was more investigative in nature and new for MCAT (but such questions have appeared in other Level 1 maths assessments in the past).  This section was made difficult by its poor construction and confusing wording, and most Level 1 students would have struggled to understand the intention.  But most exams have a Very Hard Question (VHQ), so I guess this is the VHQ for this exam.

Was it too different from previous years?

Apart from the investigative question, I don’t think so, but I might have said differently last year, when there was a noticeable step up.  From the 2015 MCAT Exemplar:

This year at least one of the three questions will not have any directed straight procedure-based parts and the other questions a maximum of one such part. … Candidates will not be able to provide evidence by following a direction to solve factorised quadratics, factorise, expand, write or solve a linear equation, or simplify an expression involving the collection of like terms in response to being told to.  One part in each question may direct the student to perform such procedures; but without further evidence at Achievement level, this will not be sufficient for the award of the standard. Utilising procedures such as factorising, simplifying a rational function, or writing an equation from a word problem will provide evidence of solving a problem.  Candidates must know that given a word problem, they will be required to write equation(s) and demonstrate consistent use of these in solving a problem. Candidates will be expected to have a basic understanding of the relationship between a quadratic function and the associated graph.

MCAT was last reviewed in 2013 and is up for review at the end of this year.  Whether a change in style between reviews is appropriate should certainly be up for discussion.

So why did students find it so difficult?

The unfortunate reality is that students did struggle with this exam.  The gap between what MCAT is expecting of students, and what students are actually capable of, is widening.

There are complaints that the lack of “gimme” questions at the start of the paper has left students “shell-shocked” and “killed” their confidence.  Are we seriously saying that our students are capable of factorising a quadratic when explicitly told to do so, but unable to decode a basic word problem and factorise a supplied quadratic expression for themselves, even though they probably wouldn’t know what else to do with an expanded quadratic?  What does this say about the resourcefulness or resilience of our students?

We cannot blame this year’s Level 1 maths teachers for what has happened, and they should rightly feel insulted.  The problem started many years before this one.

Let’s do the maths.  Year 11 students in 2016 were Year 8 students in 2013.  This is the generation of students who were failing to grasp maths fundamentals such as fractions and decimals in Year 8.

What we’re really seeing here is the fruits of a flawed primary maths curriculum floating its way through the system.  Even two and a half years at secondary school isn’t enough to turn things around.  The damage is too great.

If you look at what the Numeracy Project was trying to achieve at primary school level, our secondary school students should, by all accounts, be highly numerate problem solvers, but in fact they are worryingly innumerate and apparently not very good problem solvers either.  It’s ironic that one of the big selling points of this “new approach” to teaching maths was the development of early “Algebraic Thinking”.  I think we can safely call that a Not Achieved.

A systemic failure in mathematics education is playing out before our very eyes.  NZQA is trying to inch up the standard, year by year, when the reality is that students are actually getting worse at algebra, year by year.  When students are struggling to master the basics, it’s hard to see how teachers can lift their students to the higher levels of problem solving now expected.

Given that next year’s Year 11 students will be the same generation of 9-year-olds who performed so abysmally in TIMSS 2011, alarm bells should be ringing loudly.  It would not be surprising if fewer students were entered for next year’s MCAT.

Spring forward, fall back

NZQA could make the MCAT easier again, but that would be disappointing.  I believe this year’s MCAT is the standard we should be aspiring to.  If the examination team could tighten up on the construction of certain questions, the MCAT would be an examination to be proud of on the world stage.  (The assessment side of things, however, needs a lot more work.)

And whilst I accept that normalisation is sometimes necessary, I do not think that assessment schedules should be adjusted to meet pre-defined targets as a standard practice.  The universities have already discovered that NCEA grades are an unreliable measure of preparedness for tertiary study.

The best thing NZQA can do is go back to examining algebra at the end of the year.

September is a really bad time of year for students to face their first high-stakes external examination.  Some students barely appreciate its significance when it is tangled up with mock exams for other topics and different subjects, and the ones that do appreciate its significance prioritise the MCAT at the expense of preparing for their mock exams.

The sensible thing to do, surely, is to fold it in with “Tables, Equations and Graphs”.  We’re already seeing questions about graphs in the MCAT anyway, and why shouldn’t we?  Algebra and Graphs are not separate topics; they are inextricably tied.  As we now see, NCEA’s compartmentalising of topics as separate assessments is hurting students’ ability to make connections and become effective problem solvers.

The decision to deliver the assessment earlier in the year and have it administered by the schools has a distinct whiff of cost-cutting about it, but it has been a disaster for maths education and is costing the country dearly.  If we want students to pursue STEM subjects at university, we need to give them every chance of succeeding in algebra at Level 1, as this effectively marks the fork in the road between calculus and statistics at Level 2.  If we want to increase the “dollar value” of Kiwis contributing to New Zealand’s economy, fixing our maths education system is a very good place to start.

Dr Audrey Tan, Mathmo Consulting
22 September 2016

Analysis of the 2016 MCAT Exam (Day 2)

The media is buzzing with excitement over last week’s NCEA Level 1 MCAT (Mathematics Common Assessment Task) examination.  Students are in tears and teachers are outraged over the exam that was “very different in style” and “far too difficult”.

For those who don’t know what the MCAT is, or why the MCAT might be so important, here are some salient facts:

  • The examination topic is algebra, specifically “Apply algebraic procedures in solving problems”.
  • This is an external NZQA exam, administered internally by the high schools in September. Prior to 2011, algebra was examined in November, as part of the three-hour end-of-year external exam.
  • Calculators are NOT allowed in this examination.
  • Schools may struggle to administer the exam in a single day, so there are two similar, but different, versions of the exam – one to be delivered on a Tuesday, the other on a Thursday.
  • The traditional assessment (as opposed to the trial online EMCAT) was made harder last year. From the official NZQA assessment (marking) schedule:  “The style of some of the questions in this year’s assessment has changed so as to align more closely with the requirements of the achievement standard. The title of the standard requires the candidate to use algebraic procedures in solving problems.”
  • Students who don’t do well in algebra at NCEA Level 1 will have limited options at NCEA Level 2. Such students are typically not allowed to study calculus at Level 2.
  • Students who don’t do well in algebra will struggle with calculus.
  • Students who don’t do well in algebra or calculus will find it difficult to pursue STEM subjects at university.

So just how different or difficult was the exam?  Now that the papers are in the public domain, I will review one of them (the Thursday paper) here.  If you aren’t interested in the details, you can skip to the conclusion here.

[Update: I have seen the assessment (marking) schedule and added comments below.  As with most NCEA exams, it’s not the questions I have so much of a problem with, it’s how they are assessed…]

image001.png

The intention here is probably for students to factorise the quadratic as image003.png and supply the two factors as the side lengths.  However, I suspect the examination team failed to notice that this question has infinitely many possible answers.  For example, another factorisation that students might have reasonably obtained is

image005.png

The wording of this question should have been sharpened so that the intended factorisation (if there was one) was made clear.  (See Question Three (a)(i) below.)

[Update: As predicted, the assessment schedule does not allow for any answer other than the intended factorisation.  I pity anyone who offered a different but valid answer.]

Poor wording aside, factorising quadratics is a basic algebraic procedure that is typically introduced to Year 10 students, and this question put a thin veneer of an application on top.  Level 1 students should be familiar with applying algebra in the context of measuring area, albeit in the opposite direction.  If they knew how to multiply two quantities to form the area of a rectangle, it doesn’t seem unreasonable to expect them to recognise that this question was asking them to reverse that process.
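To make the “infinitely many answers” point concrete, take a hypothetical area of x^2 + 7x + 12 (the actual exam expression appears only in the image above). The intended side lengths would presumably be (x + 3) and (x + 4), but nothing in the question forces that split, because for any constant k > 0,

x^2 + 7x + 12 = (x + 3)(x + 4) = [k(x + 3)] × [(x + 4)/k],

so, for example, side lengths of (2x + 6) and (x/2 + 2) would be an equally valid answer.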

image007.png

This is a reasonable question.  Students at this level should be able to solve a quadratic equation that is not equal to zero to begin with, but can be made equal to zero and then factorised.

[Update: To my surprise, if this question was answered in full, it was worth an Excellence!  That should cheer up a few people.]

image008.png

This is a simultaneous equations question, not trivial, but reasonable.

[Update: If students were able to write down at least one equation correctly, that was worth an Achieved.]

image009.png

This is a basic procedure, namely adding algebraic fractions.  There is hardly anything to do here – they even supply the answer.  The subtraction of algebraic fractions appears more commonly in maths exams because students are frequently caught out by multiplication of negative numbers.
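A hypothetical example of that trap (the exam’s own fractions appear only in the image above):

x/(x + 1) − 2/(x + 3) = [x(x + 3) − 2(x + 1)] / [(x + 1)(x + 3)] = (x^2 + x − 2) / [(x + 1)(x + 3)],

where the classic error is failing to distribute the minus sign across the second numerator, writing x^2 + 3x − 2x + 2 instead of x^2 + 3x − 2x − 2.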

image010.png

This is probably one of the questions that some people are suggesting is more suitable for NCEA Level 2.  I am inclined to disagree.  Certainly, a general index equation might require logarithms to solve it, but not this one.  As long as students recognise that image011.png, this question is manageable.

Furthermore, this index question is not much harder than index questions in previous years.  Last year’s trial online EMCAT asked “What is the connection between image013.png and image015.png if image017.png?”  Again, as long as students know that image019.png, they should be able to say something sensible.  In the 2014 MCAT, students were asked to solve image021.png.

In summary, I feel that Question One is fair.  Part (a)(i) might have deviated from early questions in past papers of the “Factorise this” or “Simplify that” variety, but the application was not particularly difficult or surprising.  As NZQA pointed out last year, the title of the standard is “Apply algebraic procedures in solving problems”.

 

image023.png

This question is an algebraic substitution question.  The fact that the equation modelled a parabola is again a thin veneer of an application.  If students didn’t know what a parabola was, I hope they managed to ignore it and press on.

image024.png

Solving algebraic inequalities seems to unsettle students, even though the steps required are almost identical to the steps required to solve algebraic equations – there is only one extra thing to remember, which is to reverse the inequality if multiplying or dividing by a negative number.  All steps required to solve this question should be known to students at this level.  [Update: Even just one correct expansion was worth an Achieved.]

Interestingly, this was one of the few occasions where the parallel question in the Tuesday paper was noticeably trickier at first appearance, because there were identical factors on both sides of the inequality and it might have been tempting to cancel them out.  Unfortunately, that would have led to something nonsensical.  The lack of equivalence between these parallel questions is something that should have been picked up by the examination team and corrected.
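To illustrate how the cancelling trap works, here is a hypothetical inequality of that shape (the actual Tuesday question appears only in its paper):

(x + 1)(x − 2) < 3(x + 1).

Cancelling the common factor (x + 1) gives x − 2 < 3, i.e. x < 5, but dividing by (x + 1) is only valid when x + 1 > 0. The safe route is to bring everything to one side and factorise: (x + 1)(x − 5) < 0, giving the correct answer −1 < x < 5. Cancelling silently discards the left endpoint.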

image025.png

This is potentially another question that looks like a Level 2 question, but isn’t.  A reasonable first attempt at solving this would be to rewrite the inequality as image026.png.  A sharp observer might notice that

image029.pngimage031.pngimage032.png

Either way, as long as students know their powers of 2 up to 32, they should see that the whole number image035.png has to be less than or equal to 5.  As noted above, index questions of this type have been asked in previous years.  Changing an equality symbol to an inequality symbol does not affect the algebra required to solve the problem.
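For instance, if the inequality had the form 2^x ≤ 32 (I am guessing at the exact expression, which appears only in the image), then recognising that 32 = 2^5 gives x ≤ 5 straight away, so the whole-number solutions are x = 0, 1, 2, 3, 4, 5. No logarithms required.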

image036.png

This question requires students to expand and simplify two expressions and then look for similarities between them.  If students had seen a similar 2014 MCAT question, in which students were asked to write image037.png in terms of image039.png, then they would have been adequately prepared to attempt this question.  [Update: Even just one correct expansion was worth an Achieved.]

image041.png

OK, this is where things start to get interesting!  Students were given 9 lines of paper in which to conduct their investigation and answer this part of the question.

Firstly, what is meant by “when Janine changes the order of the numbers in Line 1”?  The directness of the question “Does she get the same answer as in Line 4?” suggests that we are supposed to investigate a single event.  But there are 23 ways in which Janine could change the order of the numbers.  Are students expected to try all 23 ways?  Over 9 lines of paper, probably not.  If we are meant to infer that Janine changes the numbers only once, how can we possibly investigate if we don’t know what the new order looks like?  The wording of the question is decidedly murky.

In actual fact, the answer is “yes” or “no”, depending on how the numbers are re-ordered.  I’ve seen probability questions in which a “yes” is an Achieved answer but a “no” is a Merit answer, but at this early stage in this question, students would be choosing a new order at random, so it would be unfair if either answer fell into a different achievement band.

[Update:  If you tried one rearrangement and said “yes” or “no”, that was worth an Achieved.  So even though I understood the question well enough to perform a succinct investigation with the preferable answer of “no”, I would have only gotten an Achieved.  If you tried two rearrangements but didn’t say anything, that was also an Achieved.  The wording of the Merit criteria is just as nonsensical as the question, but I think they want at least two rearrangements with a point of difference and some sort of statement about one’s findings.]

image042.png

I had to parse this sentence many times and came to the conclusion that it doesn’t make a lot of sense, even after tweaking it to read “Find, using algebra, the relationship between the numbers in Line 1 and the numbers in Line 4 when she changes the order of the numbers in Line 1.”  If students are meant to investigate what happens to the expression for the number in Line 4 after the numbers are changed, then it has not been made very clear.  A better question would be “Using algebra, show how the number in Line 4 might be affected if Janine changes the order of the numbers in Line 1.”

This question doesn’t feel like problem solving.  We are using algebra to make a general observation.  Students at Level 1 will have very little experience of using algebra in this way, but such questions have been seen in other Level 1 Maths assessments in the past, namely “Tables, Equations and Graphs”, an end-of-year assessment concerned specifically with the application of algebra to graphs.

It is certainly possible to express the number in Line 4 in terms of the numbers in Line 1 at any given point in time.  For example, if the numbers in Line 1 are called image043.png, then the number in Line 4 is image045.png.  But perhaps we are supposed to pay attention to the pattern of the example numbers 2, 4, 6, 8 and call the numbers image047.png?  Is this important or is this a distraction?
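To make the structure concrete: assuming the triangle works the way such puzzles usually do, with each line formed by adding adjacent pairs from the line above (the actual rule is only visible in the exam image), a quick sketch shows why the honest answer to part (i) is “yes or no, depending on the reordering”:

```python
from itertools import permutations

def line4(line1):
    """Collapse Line 1 down to Line 4 by summing adjacent pairs."""
    nums = list(line1)
    while len(nums) > 1:
        nums = [a + b for a, b in zip(nums, nums[1:])]
    return nums[0]

original = (2, 4, 6, 8)
target = line4(original)  # a + 3b + 3c + d = 2 + 12 + 18 + 8 = 40
# Line 4 = (a + b + c + d) + 2(b + c), so a reordering preserves Line 4
# exactly when its two middle entries still sum to b + c (here, 10).
unchanged = [p for p in permutations(original) if line4(p) == target]
print(target, unchanged)  # 8 of the 24 orderings leave Line 4 at 40
```

The weights 1, 3, 3, 1 are binomial coefficients, which is exactly the kind of general observation the algebra in part (ii) is presumably driving at.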

I would conjecture that a student would be better able to answer part (i) after answering part (ii).  That is the whole point of algebra, after all.  It enables us to see patterns in numbers, or at least understand better the patterns that we see, because the numbers themselves often get in the way.  Assuming there is a pattern in the numbers in Line 1 gets in the way of truly understanding any pattern that might be observed in the triangular formation, and ideally we should not be distracted by this.  So my preferred setting of the first two parts of Question Two would be split into three parts:

  • Janine changed the order of the numbers in Line 1 and found that it changed the number in Line 4. What might Janine’s new ordering look like?
  • Janine wonders whether changing the order of the numbers in Line 1 will always change the number in Line 4. Use algebra to find the relationship between arbitrary numbers in Line 1 and the resulting number in Line 4.
  • Use your expression for the number in Line 4 to explain how Janine could change the order of the numbers in Line 1 but not change the number in Line 4.

image049.png

If students were brave enough to attempt this question, hats off to them!  I quite enjoyed this question myself, but only students with a good understanding of divisibility would understand how to interpret the algebra in this question.  Furthermore, open questions such as “what do you know?” are only fair if the answers are marked “openly”.  Unfortunately, that’s not the case with NCEA.  They still have something specific in mind, answered to a greater (Excellence) or lesser (Achieved) extent, or somewhere in the middle (Merit).  Therefore, students should reasonably expect to be given better guidance as to what is intended by the question.  E.g. “If the number in Line 4 is divisible by 3, then identify the position of the number or numbers in Line 1 that are divisible by 3.”

My final comment about this question is the lack of continuity.  In part (iii), we were to assume there was a pattern in the numbers in Line 1, but it wasn’t the pattern demonstrated at the start of the question.  It appears that the 2, 4, 6, 8 pattern was a red herring, and the examination team should have chosen numbers that appeared to be more random.

In summary, Question Two was fair up to part (d).  Part (e) was poorly written and too hard for Level 1.

 

image050.png

This question is almost identical to Question One (a)(i) but it has an important difference.  One of the side lengths is given, which means the intended factorisation of the quadratic expression for the area has been made clear.  It raises the question: why was Question One (a)(i) even included??

image051.png

Another open question!  What’s wrong with “State any restrictions on the value of image052.png for this rectangle”?

[Update: This is amazing.  If you answer the question correctly, you only get an Achieved.  If you say something about the area or the side lengths, i.e. more than was asked for, you get a Merit!  If they wanted commentary on the area or the side lengths, why didn’t they just say “Explain.” at the end?]

image054.png

This is a “changing the subject of the formula” question, and there have always been questions of this type in previous years.

image056.png

Er, what’s this question doing here?  This is the sort of basic procedural question that would have been better placed as Question One (a)(i).

image057.png

This is not a trivial question, but it is suitable for Level 1.  However, it’s worth noting that the parallel question in the Tuesday paper resulted in a quadratic equation that was different enough to be inequivalent.  The Tuesday students may have struggled more than the Thursday students.  This should not have happened.

[Update: Sure enough, the Tuesday question must have been done very badly, because you could get Excellence even if you didn’t quite solve the equation!  I’ve also heard reports of students (not ours!) getting caught up trying to calculate the square root of 8 without a calculator…]

image058.png

If students weren’t sure how to answer this question, looking at (ii) would have given them a good clue!

 

image060.png

Like many NCEA questions, this is quite ‘wordy’ and requires a high level of literacy to understand the question.  It also requires knowledge about the features of the graph of a quadratic expression, and the use of algebra to solve a quadratic equation.  The icing on the cake is to form a percentage from the two quantities obtained (the maximum horizontal width and the width at a vertical depth of 3cm).

[Update: Solving the equation, i.e. the bulk of the work and the algebraic heart of the problem, was only worth a Merit.  The trivial step of calculating a numerical percentage at the end was what it took to get an Excellence.  Seriously?? Where is the extended abstract thinking in there?  I suspect the examination team pulled this question from a traditional “marks out of 100” paper but failed to modify it for NCEA.]

Although I think this is a reasonable Excellence question for this assessment, it is worth noting that this question could easily appear in the “Tables, Equations and Graphs” assessment at the end of the year.  There needs to be some discussion about whether or not teachers and students should expect the same knowledge to be potentially assessed twice.

It seems that some schools had not yet taught their students about graphs, but in all fairness, last year’s MCAT exam had questions that required graphing knowledge.

In summary, Question Three was challenging but fair, provided students had been taught the appropriate material.

In conclusion…

This exam wasn’t a walk in the park, but actually most of the questions were fair for Level 1, even if they weren’t identical in form to past exam questions.  There were certainly some poorly-worded questions, but unfortunately I see them in NCEA maths exams every year.  MCAT 2016 is by no means the exception.

It is true that the MCAT now has fewer basic questions that test purely algebraic procedures, but most of these procedures should be introduced in Year 10, so it is not unreasonable to expect students to be ready to apply them in Year 11.  Given that the change in style occurred last year, I am surprised that the huge uproar didn’t occur 12 months ago.

Most importantly, I believe this year’s MCAT is the standard we should be aspiring to.  Media reports suggest the reality in schools is very different.  I will discuss this in my next post.

Dr Audrey Tan, Mathmo Consulting
20 September 2016

What are the chances of completing a Disney-Pixar Domino Stars collection from a fixed number of packets?

It’s a strange thing for a mathematical person to say, but some people are luckier than others. Luckily for our son (or his parents), we shop at Countdown, which means he received a regular supply of Disney-Pixar Domino Stars throughout the 6.5-week promotion.

It was all a bit of a yawn to me, this time round. Having calculated the chances of collecting a full set of Dreamworks Heroes Action Cards, I was neither surprised nor frustrated by the increasing number of duplicates as his collection grew. The theoretical hyperbolic growth of the number of packets he would have to open to find a new domino implied a spend of $4480 to complete a set of 50. Almost $700 per week?? Not very likely.
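For the record, that estimate is the classic coupon collector calculation. A minimal sketch, assuming each packet holds one domino drawn uniformly at random from the 50 designs, and one packet per $20 spent (the rate the $4480 figure implies, so treat it as an assumption):

```python
from fractions import Fraction

N = 50  # dominoes in the full set
# Coupon collector: expected packets to complete the set is
# N * (1 + 1/2 + ... + 1/N).
expected_packets = N * sum(Fraction(1, k) for k in range(1, N + 1))
print(float(expected_packets))       # ~225 packets
print(float(expected_packets) * 20)  # ~$4499 at one packet per $20 spent
```

The small difference from the $4480 quoted above presumably comes down to rounding.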

To cut a long story short, on the last day of the promotion, we trudged home from a swap meet needing just two more to complete the set. Our chances of finding those last two looked bleak…except that a neighbour had just dropped off a big bag of 22 packets in our letterbox.

As I said, some people are luckier than others, and to our amazement, our son actually found the very two that he needed! Now how lucky is that?! I mean, what are the chances…?

Without going into too much detail, you can either focus on what you don’t want to happen, i.e. you get none of one of the coveted dominoes and any number of the other:

domino-stars-1

or you can focus on what you do want to happen, i.e. you eventually find one of the two that you’re looking for, and then hope to find the other one amongst the packets remaining:

domino-stars-2

But a friend of ours came up with an expression that best reflects what really happened, i.e. our son stopped opening packets as soon as he found the two he needed:

domino-stars-3

Reassuringly, all three expressions evaluate to this rather impressive looking fraction:

domino-stars-4
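For anyone wanting to check that fraction numerically rather than by expanding the expressions above, here is the first (complementary) argument in a couple of lines, under the same assumption that each packet holds one domino drawn uniformly at random from the 50 designs:

```python
# P(complete the pair in 22 packets), by inclusion-exclusion on the
# events "never found domino A" and "never found domino B":
p = 1 - 2 * (49/50) ** 22 + (48/50) ** 22
print(p)  # ~0.125, i.e. very close to 1 in 8
```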

So it was very close to a 1 in 8 chance. Our son doesn’t realise just how lucky he is.

Dr Audrey Tan, Mathmo Consulting
September 2015

Ministry’s concern over gaps in NZ maths teaching

Readers will be pleased to know that we already have a name for “space and shape” mathematics. It’s called geometry.

I put it to the Ministry of Education that students’ lack of exposure to “formal maths” is a direct consequence of students’ over-exposure to numeracy strategies. There are only so many hours in a school day, after all.

But let’s remain optimistic. The Ministry may never admit that the Numeracy Project was an abject failure, but the report paves the way for them to quietly sweep it under the carpet. The term “formal maths” suggests more direct teaching and less discovery-based learning.



Source: “Concern over gaps in NZ maths teaching”, NZ Herald (nzherald.co.nz): “Students are not being taught enough ‘space and shape’ mathematics and the ‘huge’ learning gap is hurting achievement, the Ministry of Education says.”