California Dreamin’ about equity in mathematics education

(Image source: https://wallpapersafari.com/)

It was a welcome surprise to hear Brian Conrad talk to Kerre Woodham last week.  Brian Conrad is a Professor of Mathematics and Director of Undergraduate Studies in Mathematics at Stanford University in California, USA. 

Woodham had come across Conrad’s recent op-ed in The Atlantic, in which he discussed the findings of his examination of a 1000-page draft of the controversial California Mathematics Framework (CMF).  The CMF is similar to New Zealand’s draft Common Practice Model (CPM) in that it proposes how teachers should teach and the rationale behind it, with a strong focus on equity.

I shan’t attempt to explain what is already very well articulated by Conrad himself.  Believe me when I say this guy is thorough.  You can go to the website he created to publicise his critique of the CMF, including a whole document about citation misrepresentations.  It is, unfortunately, no surprise to me that Conrad discovered so many false or misleading claims in the CMF.  One of the CMF authors has been called out in the past for making claims but citing research that says almost the exact opposite. 

Conrad also shines a light on the CMF’s false promise of greater equity by meddling with traditional pathways (algebra and calculus) in a bid to level the playing field.  Again, evidence of success in San Francisco over a span of 10 years was exposed as a misrepresentation of the facts.  Removing an “inequitable” option led to a reasonably predictable outcome: the socioeconomically advantaged students found other ways to access that option, while the socioeconomically disadvantaged students were left even further behind, thus widening the achievement gap.  In much the same way that New Zealand’s Ministry of Education sat on their hands for 10 years after the Numeracy Project turned out to be a failed experiment, California’s State Board of Education has decided to press on with the CMF’s flawed ideology, turning a blind eye to the truth and quietly removing references to the San Francisco experiment from the final document.

But algebra and calculus is so last century, right?  Instead, the CMF wants high schools to focus on offering alternative pathways (data literacy masquerading as the more rigorous data science) that Conrad describes as “off-ramps”.  That too has already happened here in New Zealand.  Once upon a time, there was a fork in the high school road at the end of Form 6 (Year 12), when students could choose to study calculus or statistics in Form 7 (Year 13).  That fork has moved forward to the end of Year 11 (Form 5), and in reality, for many students, their choice has been pre-determined.  The modular nature of NCEA means that schools can offer different courses with more or less algebra in Year 11.  Students with less algebra in their course are effectively limited to a statistics pathway, while students with more algebra in their course have the full choice of studying calculus or statistics in Year 12.  Many students discover too late that their study and future career options are limited by a lack of algebra (and hence calculus) in their maths education.  The struggle to get back onto the calculus track without the pre-requisite algebraic skills is enormous.  Meanwhile, students on the statistics pathway are also short-changed because, over time, the mathematical content has been eroded and replaced with statistical literacy – the ability to interpret statistical graphs and critique statistical claims.  Don’t get me wrong, these are very valuable skills to have in the age of information and disinformation, but as Conrad points out, statistical/data literacy on its own will not lead to exciting careers in data science, as promised by the CMF. 

It is important to note that high schools are not to blame; they have a responsibility to navigate the system and find achievable pathways for all students, including those who struggle with algebra.  Any high school maths teacher will tell you that students tend to struggle with algebra not because of the mysterious variables x and y, but because they lack fluency with basic arithmetic, especially division and fractions.  In other words, the problems start at primary school.  

While there are many similarities between what’s happening on both sides of the Pacific Ocean, there is one major difference.  California’s CMF underwent two rounds of public consultation before being adopted in July this year, whereas no public feedback was sought when New Zealand’s Ministry of Education released Phase 1 of the CPM.  In my previous post, I drew attention to the flimsy evidence offered to support one of the CPM’s theoretical teaching approaches.  Schools would be right to be wary of claims that such approaches are “evidence-based”, particularly when some of the CPM authors are also teacher educators who have a commercial interest in promoting their research. 

We are less than a week away from finding out whether the CPM will be made compulsory for all state school teachers in New Zealand.  Whether or not you believe in the sociocultural ideals of the CPM, the sleight of hand in attempting to push this model through without public consultation is troubling enough.

Dr Audrey M. Tan
9 October 2023

Further listening/reading about the California Mathematics Framework:

Ep 11. California’s math controversy with Jelani Nelson Part I | Anna Stokke
Ep 12. California’s math controversy with Jelani Nelson Part II | Anna Stokke
Ep 15. Modern relevance in the math curriculum with Brian Conrad | Anna Stokke
Jelani Nelson and Tom Loveless on the California Math Framework | American Enterprise Institute – AEI

When will they ever learn?

The findings of the 2022 National Monitoring Study of Student Achievement (NMSSA) in mathematics and statistics have been released, and it’s the same old same old. Only 42% of Year 8 students are meeting curriculum expectations, down from 45% in 2018, with statistically significant drops in achievement among girls, and Māori and Pacific students.  

If you want some good news for a change, the schools that I work with defy these depressing statistics. Read all about it here. It shows you what progress and achievement look like when you understand the mathematical and cognitive principles of teaching approaches that really work, and simply get on with teaching the kids.

The Minister of Education is probably right to be grateful that the results weren’t even worse after three years of disruption to learning during the Covid-19 pandemic. But can we honestly expect to raise student achievement when the Ministry continues to allow experimentation in the classroom, promoting “theoretical frameworks or approaches to teaching, informed by evidence of how ākonga learn”? It was dismaying to hear the Minister defend some of the highly dubious content in Phase 1 of the Common Practice Model (CPM), published earlier this year. Unlike the refreshed mathematics and statistics learning area, where feedback was sought on the first two drafts before it was finalised, there is no evidence that the CPM was ever released to the public in draft form.

In this document, we are told that “a critical maths pedagogical approach uses maths to develop critical awareness about wider social, environmental, political, ideological, and economic issues. Critical maths recognises the importance of understanding, interpreting, and addressing issues of power, social justice and equity in the community and the wider world.”

Of the references offered to support this approach:

  • two were not publicly accessible;
  • one was 25 years old, an indulgent academic introspection suggesting that teachers play “mind games” to increase their awareness of the “values” (definitely not the numerical type) they signal to students when they teach maths;
  • one was a respected university professor’s plea for undergraduate teaching to move away from textbook problem-solving and towards the use of more open problems. Written 13 years ago, this opinion piece is not even close to being educational research, let alone applicable to either critical maths or teaching school-aged children;
  • this one was actually an interesting read, but the authors rather undermined themselves by citing an example of moving beyond teaching about and for social justice to teaching with social justice. They admitted the problem wasn’t real-world, let alone related to injustice – it was just inclusive teaching.  Indeed, the authors acknowledged that mathematical proof is democratic in nature, so this was a fine example of how, if you respect the culture of mathematics, there is no injustice. Simple.

Which brings me back to one sentence in the CPM that ought to set alarm bells ringing:

“Ākonga are encouraged to interrogate dominant discourses and assumptions, including that maths is benign, neutral, and culture-free.”

The Common Practice Model Phase 1: Principles and Pedagogical Approaches

Is the Ministry seriously proposing that teachers should encourage their students to challenge the neutrality of mathematics, to mistrust the very tool that would empower them to rise above dominant discourses and assumptions and reach their own conclusions through objective thinking, critical analysis and logical deduction??

My mathematical training compels me to keep an open mind, so I decided to ask an entity that could reasonably be expected to be benign and neutral (but not culture-free since it was created by humans. Well, two out of three ain’t bad.) Here is what ChatGPT had to say about the matter:

Q: Is mathematics benign, neutral and culture-free?

A: Mathematics itself is a branch of knowledge that deals with logical reasoning and quantitative analysis. As a subject, mathematics is generally considered to be neutral and culture-free because mathematical concepts and principles are universal and can be applied across different cultures and contexts.

However, the way mathematics is taught, used, and applied in different cultures and contexts can reflect the values, beliefs, and biases of those cultures. For example, the way that mathematical concepts are taught and presented may differ based on cultural norms and traditions. Additionally, the use of mathematics in various fields such as finance, engineering, and social sciences can reflect cultural biases and power structures.

Overall, while mathematics as a subject is generally considered to be neutral and culture-free, its application and use can be influenced by cultural factors. It is important to be aware of these potential biases and to approach mathematics with a critical and culturally-sensitive perspective.

ChatGPT 3.5

So even an artificial intelligence engine is able to recognise the difference between mathematics being benign, neutral and culture-free, as opposed to its application or the way it is taught. That the Ministry of Education failed to make that distinction is disturbing.

By all means, let students use maths to examine social injustice…in their social science lessons.

Dr Audrey M. Tan
July 2023

NCEA Numeracy – a looming disaster

Weaving is the Big Idea of NCEA Numeracy, but it is starting to unravel. Although an NCEA Literacy disaster is looming too, I shall focus on Numeracy in this post, partly to reduce the word count, partly to avoid referring to the curiously punctuated Literacy & Communication and Maths Strategy, but mostly because I have first-hand experience of some of the work done on the Numeracy side, whereas I have only been an observer of the work done on the Literacy side.

Trial tests in 2021 and 2022 of the new NCEA Numeracy standard have not gone well.  So poor were the results of the 2021 pilot that the Ministry of Education had no choice but to postpone the co-requisite unit standard by one year, to 2024.  In the first instance, the standard will not be mandatory as originally intended; otherwise, too many students might fail to achieve the NCEA qualification.  

Questions about the readiness of secondary schools and their students, the practicalities of administering the test, and indeed the test itself, are being raised.  This debacle more or less confirms what we should have known all along: there is no quick fix to address New Zealand’s numeracy crisis.  

It is understandable that familial familiarity with the National Certificate of Educational Achievement (NCEA) prompted the then recently appointed Minister of Education to launch a review of the NCEA in 2018.  Feedback gathered during the review supported the findings of a Tertiary Education Commission study in 2014: the current NCEA numeracy requirement – a minimum of 10 Numeracy credits gathered from a selection of unit or achievement standards – “cannot be used as a reliable indicator of students’ numeracy capabilities”. (Unfortunately, I can attest to that.  Teachers and educators like myself do their utmost to get their students to pass, but unless students appreciate that we are not just teaching to the test, rather the skills acquired will be of genuine value in everyday life, there is no guarantee that students will retain those skills and go on to become numerate adult citizens.)

The official response was to replace the current NCEA numeracy requirement with a direct assessment of “foundational numeracy”.  Most people would agree this is a sensible idea.  But where is the roadmap for teachers and students?  How can we reasonably expect students to be ready for this assessment when we haven’t changed the teaching and learning of Mathematics and Statistics in all prior school years?  The Maths Strategy and Action Plan, published this year as part of The New Zealand Curriculum Refresh announced in February 2021, comes too late.  To quote a retired principal, “the decline in maths achievement in primary schools needs an immediate solution, not a five-year plan”.  By enacting the NCEA Change Programme before the Curriculum Refresh, the Ministry of Education has clearly put the cart before the horse.  

New Zealand’s maths education system is fractured, with no realised vision of the minimum 10-year journey towards Numeracy and beyond. The Ministry pumps out documents “filled with a lot of bureaucratic speak”, or inspirational videos aimed at…who? Oh, so the key message now is that “all [high school] teachers will be teachers of literacy and numeracy. All teachers will need to know their learners even more.” Sense the urgency, as high school teachers are given guidance on how to weave numeracy into their specialist subject. How ironic that we already have a workforce of primary school teachers who know their learners well. They teach across all subjects, including Mathematics and Statistics, and are perfectly positioned to weave the maths they are teaching into all aspects of their students’ learning. That, folks, is how we develop true numeracy.

Full disclosure: In 2020, I was part of the Numeracy Subject Expert Group (SEG) that developed the new NCEA Numeracy standard: a single standard and assessment worth 10 credits, which is a lot for a single standard – all standards worth Numeracy credits currently range from 2 to 6 credits. Many aspects of the assessment had already been pre-determined, e.g. the content level and the number of credits, but the structure and the timing of the assessment were up for discussion.  

My initial recommendation was that the Numeracy assessment should be split into three standards, as per the three strands of the Mathematics and Statistics curriculum, the rationale being “a single standard/assessment would be overwhelming for most learners, particularly given the importance and mandatory nature of the credits at stake”.  I also suggested that “we should allow for the accumulation of 10 Numeracy credits over more than one year.  For example, students might be able to apply their number knowledge to solve problems well before they are able to reason statistically.” A modular approach to the assessment would have relieved some of the pressure on students and their teachers.  In fact, one wonders why the Ministry did not simply look at developing the rarely-used current package of three numeracy unit standards (total of 10 credits – all three required) into the mandatory co-requisite.

Sadly, my recommendations were not taken up, and secondary schools are now faced with a single high-stakes assessment that takes so long that it causes major disruption to the usual school routine. Even more worrying is that the trials have focussed on students in Year 10, as if the standard is a pre-requisite for the NCEA, not a co-requisite. The intention is that students will sit the test when they are deemed ready by their teachers’ assessment tools. The mathematical level of the content might be suitable for Year 10, but that does not necessarily extend to an assessment of numeracy. At the time of developing the standard, numeracy was pre-defined as “the ability to access, use, interpret and communicate mathematical information and ideas, in order to engage in and manage the mathematical demands of a range of situations in learning, everyday life, participatory citizenship and work.” (Try saying all that in one breath…) Does that description align well with the maturity level of a Year 10 student? Why are trials not being run with older students, as suggested by a participating principal?

In 2007, one of the original writers of the Numeracy Project hoped that in fifteen years we’d be closer to having every child effective in mathematics. This is the year of reckoning…I think we’ll call that “Not Achieved”. And in the true spirit of the NCEA, the Numeracy Project writers are apparently allowed as many re-sits as they like; they are still contributing at every level, digging us into an ever deeper hole, writing ever longer sentences. As long as the Ministry of Education continues to rely on such expertise, we are likely to be waiting another fifteen years.

Dr. Audrey M. Tan
September 2022

NZPF Conference 2021

It was such a privilege to simply be at the New Zealand Principals’ Federation Conference in Rotorua at the start of last week, in the presence of so many dedicated primary and intermediate principals from around the whole country, let alone be invited to speak. The atmosphere was positive, energetic and energising. It really is no wonder these outstanding people were all selected to become school leaders!

Somehow, despite trying to remain discreet during the pōwhiri, I ended up on stage, sitting next to the Associate Minister of Education, Jan Tinetti. I was truly humbled by her special mention of me in her speech.

There were some brilliant speakers at the conference, and not only did I get to shake Sir Ian Taylor’s hand, I had a one-on-one with him at Rotorua Airport before we both flew home.

As for my own session, here is a bespoke slide to tie in with the conference theme “Power, Passion, Pace”, signifying the need to renew, refresh and move ahead after a fragmented year in 2020.

I asked my audience to engage with the sobering data presented, and to be inspired by the students at a decile 1 school whose lives changed forever when their teacher showed them how to line up the columns. This is not just about getting better results in our local and international tests – this is about equity in our education system, and ultimately, equity in our society.

The feedback has been overwhelmingly positive, with many special messages like this one:

“Your talk was the highlight of the NZPF Conference for me, as far as what can be done to change practice in my team to improve outcomes for students. What you said made sense and the changes that need to be made are not major. Thank you for enlightening us. I hope the government take notice and promote your ideas!”

Special thanks must go to NZPF President Perry Rush, for leading the charge this year to do something about New Zealand’s maths education. The hundreds of principals who jumped onto this website as soon as he sent them a link are a testament to his tremendous leadership. Let’s see what these amazing school leaders can do on the ground. Wishing them all Power, Passion and Pace!

Dr Audrey M Tan
9 August 2021

researchED Auckland 2018


Isn’t it crazy that, in 2018, we’re still “working out what works” in Education?

In fact, some of us do already have a pretty good idea of what works, but getting the right people to listen is a different problem altogether.

And so, a group of like-minded individuals (and maybe a couple of sceptics) gave up their Saturday on Queen’s Birthday weekend to attend New Zealand’s very first researchED conference in Auckland. researchED is a growing movement based in the UK but spreading internationally, “a grass-roots, teacher-led project that aims to make teachers research-literate and pseudo-science proof” (and by golly does this country need proofing). Founder Tom Bennett quickly realised that his own teacher training was based more on edu-myths and dogma (e.g. learning styles) than any scientific, evidence-based research.  He’s not the only one.  Daisy Christodoulou’s book, Seven Myths About Education, is the coffee that any waking 21st century learning fanatic should smell.  Briar Lipson at the New Zealand Initiative hasn’t spent very long in this country, but has already sized up our education system very well and should be commended for bringing researchED to New Zealand.

Every talk raised serious questions about how we teach in New Zealand, and everyone was there in the belief that we can, and should, be doing better.  Not surprisingly, the academics are calling for the Ministry of Education to change their ways and look for evidence before adopting fads as policies, while the pragmatic principals and teachers cannot afford to wait and are simply getting on with things.

The common factor of the day was subject knowledge and the importance of committing knowledge to long-term memory.  The 21st century learning ethos suggests that we should leapfrog, or at least skim over, these foundational skills in a bid to produce generic critical thinkers and problem solvers, but surely common sense tells us we cannot reasonably expect students to think critically or solve problems unless they actually have some knowledge to work with.

I have no desire to repeat what has been said so well by others, so instead I will direct readers to a newly created blog by Derek Hopper, a music teacher at Tauraroa Area School who has read up on what works and is spreading the word.  He and his colleagues are seeing significant improvements in student behaviour and achievement. Happy students, happy teachers.  Having already spoken to a maths teacher at Tauraroa who is offering guidance to their primary teachers, I believe this school may well provide the model for other schools to follow.

Some other reflections of the day:

Tom Bennett, founder of researchED: Teachers might think that indulging in (catering for individual) learning styles is a harmless bit of fun, but there is no time to waste when teaching children from disadvantaged backgrounds.  Every minute counts.

Katharine Birbalsingh, keynote speaker and founder/Headmistress of the evidence-informed Michaela Community School in London: Her teachers do not play “Guess what’s in my head?”, i.e. they don’t question their students before the relevant knowledge has been taught, so that every student, regardless of their background, has an equal chance of answering the teachers’ questions correctly.  A subtle but powerful way to address social inequity and level the playing field.

Dr. Michael Johnston, Victoria University: When new skills are learned and practised sufficiently, they become automatic and free up the working memory to concentrate on higher-order thinking.  With particular reference to mathematics pedagogy, the current NCEA internal assessment system provides little incentive for students to practise skills and procedures to the point of automaticity, and if they haven’t reached that point, then they will struggle with the cognitive demands of solving the contextualised problems presented in assessment.

Prof. Elizabeth Rata, Auckland University: Already widely known for her views on the lack of academic knowledge in the curriculum.  When she used the definition of the apostrophe as an example of understanding the epistemic structure of academic knowledge, I genuinely thought she was going to ask the audience if they had spotted the misplaced apostrophe in the previous slide.  She didn’t.  I suddenly felt alone.

Dr. Graham McPhail, Auckland University: There is little evidence that deep learning occurs through subject integration.  Wineburg and Grossman (2000) warned that ‘often the choice to implement a new curriculum is based on symbolic factors, such as a desire to be seen as progressive and in the forefront of reform’.

Louise Zame, primary school teacher:  When listening to a teacher speak so eloquently about the professional challenges of implementing Inquiry Learning…to a bunch of 5-7 year olds…you realise just how much the Ministry of Education has lost the plot.  As part of her Master’s research, Louise asks the pertinent question: what content knowledge do young students (aged 5-7 years) gain through inquiry learning?

Dr. Shaun Hawthorne, Cognition Education Ltd: Prof. John Hattie has recently updated his list of influences on student achievement, and top of the list is now “collective teacher efficacy” with a whopping effect size of 1.57.  For those who don’t know about Hattie’s effect size measure, almost everything on the list has a positive effect, so teachers and schools should not be too complacent. They should be looking to maximise their impact, and punching above the average effect size of 0.40.
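For readers unfamiliar with effect sizes, the basic ingredient behind figures like Hattie’s is a standardised mean difference such as Cohen’s d (Hattie’s numbers are meta-analytic aggregates of many such studies, which this sketch does not attempt to reproduce). The class scores below are invented purely for illustration.

```python
# Cohen's d: the standardised difference between two group means,
# the common building block of educational effect-size figures.
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Mean difference divided by the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Hypothetical test scores for two classes (illustrative only)
taught = [72, 75, 80, 68, 77, 74]
baseline = [65, 70, 66, 72, 68, 63]
d = cohens_d(taught, baseline)  # positive: the taught group scored higher
```

A d of 0.40, Hattie’s “hinge point”, means the groups differ by four-tenths of a standard deviation.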

To finish:

  • I was probably the only person excited to spend a bit of time in the Vaughan Jones Room during the lunch break.
  • Great care must be exercised when evaluating “evidence-based research”.  There is a lot of rubbish out there.  For example, the Numeracy Development Projects “research” showed that if you teach children strategies then children will learn strategies.  Big deal.
  • The panel discussion at the end left me in no doubt of the monumental challenge we face trying to fix New Zealand’s education system. To quote John Morris, “Currently education policy is being determined by political imperatives. It should not be. All policy initiatives, and in education there are so many of them, should be evidence-based.”
  • Tom Haig from the NZPPTA was naturally highly sensitive to the political undertones of the day and felt the debate was too one-sided.  Perhaps that’s because there is little to debate when we rely on evidence.  If the focus on credible and reliable evidence can take the politics out of Education, then bring it on I say, for I can think of no group of stakeholders less politically-minded than our precious children.

Dr Audrey Tan, Mathmo Consulting
8 June 2018

NCEA – where less is more

Our “revolutionary” National Certificate(s) of Educational Achievement (NCEA) secondary school qualification, built on ideals of inclusion and equity, has failed to deliver on many fronts.

The New Zealand Initiative has highlighted some serious problems with the NCEA system. Briar Lipson’s well researched report, Spoiled by Choice: How NCEA Hampers Education and What It Needs To Succeed, exposed the harsh reality behind the dramatic growth in numbers of students achieving NCEA Level 2 each year. NCEA performance may be “improving”, but the international survey PISA shows that our 15-year-olds’ capabilities in maths, science and reading are declining.

[Figure 1 from Spoiled by Choice]

Moreover, despite there being a minimum requirement for literacy and numeracy credits, a Tertiary Education Commission (TEC) study in 2014 found that reasonably large proportions of students with NCEA Level 1 (approximately half) or NCEA Level 2 (four out of 10) were not functionally literate, and similar proportions were not functionally numerate.

[TEC study, Figure 1: literacy]
[TEC study, Figure 2: numeracy]

So what can we actually ascertain about a student with an NCEA qualification?

Very little, according to the Initiative’s second report, Score! Transforming NCEA Data. It brought to attention the huge variation in grade distribution across subject sub-fields. It is comparatively easier to gain an Excellence grade in Languages and Performing Arts than it is in Mathematics and Statistics. And not surprisingly, they found that more students pass internal assessments than external assessments.

[Figure 4 from Score! Transforming NCEA Data]
[Figure 2 from Score! Transforming NCEA Data]

In an attempt to redress these imbalances, the economists at the Initiative came up with a Weighted Relative Performance Index (WRPI) that endeavours to make “sense” of a student’s NCEA credits and provide a fairer comparison of performance between students. 1

It was a laudable attempt. However, one blatant inequity in the NCEA system that was not addressed by the WRPI, and has yet to be discussed widely, is the absence of a sensible time restriction applied to individual external assessments.

Allow me to explain. In Mathematics, students at any of the three NCEA Levels may be entered for a three-hour external examination comprising up to three achievement standards. Each achievement standard is a self-contained assessment/paper, sealed up in its own plastic wrapping. (Biodegradable, I hope.)

It would seem reasonable that each of the three papers should be completed in approximately one hour. 2 Therefore, it would seem reasonable that if a school decides to enter a student for fewer than three achievement standards, then the duration of the exam should be reduced, i.e. one hour for one paper, two hours for two papers. But no, not with NCEA! Schools may enter students for one or two papers, and students still get three hours to complete them!

This obviously puts students entered for three papers at a disadvantage. Are these students supposed to console themselves that they are receiving a better education, even if their peers come out with higher grades because they had more time?

Schools can, and do, game the system by entering their students for fewer than three papers. But students can also game the system by electing to not attempt all of the papers for which they are entered. If they leave the plastic wrapping intact, they will receive a Standard Not Attempted/Assessed (SNA), and apparently this is better than a Not Achieved (N).

Really?? According to this memo, it’s better for schools because School Result Summaries will include N’s but not SNA’s. Also, SNA allows for the possibility that “a student ran out of time so an N would not be a fair result”. In other words, it’s better to fail to try than to try and fail.

For those chasing an Endorsement, it is actually in their best interests to attempt fewer papers and go for higher grades. For example, for a student chasing an Excellence endorsement, two Excellence grades would be preferable to three Merit grades.

Our national secondary school examination system actually rewards students for doing less work.

The acknowledgement of effort is lacking even within an individual paper. In a traditional marking scheme, every correct answer would contribute to the final mark, but NCEA is a standards-based assessment system with “top down” marking. 3 Therefore, confident students can take a gamble and jump straight to the harder parts of a question. It’s a risky strategy, but if it pays off, they may achieve Excellence having answered roughly a third of the paper. This flies in the face of the instruction printed on the cover page: “You should attempt ALL the questions in this booklet.” If you do well, then some of what you do could turn out to be a waste of time.

NCEA is a system that does little to incentivise students to put in maximum effort or to persevere if the results are likely to be sub-optimal. Until these problems are addressed, mediocrity will prevail. We welcome the Ministry of Education’s review of NCEA this year, and hope to be part of that discussion.

Dr Audrey Tan, Mathmo Consulting
21 March 2018

1   Ironically, NZQA in their utopian socialist bubble didn’t want us to compare students, but it’s happening anyway. Students compare themselves, universities have already come up with their own weighted metrics, and employers are learning that “E” grades on a Record of Achievement are actually higher than “A” grades.

2   Prior to 2013, each paper started with the recommendation “You are advised to spend 60 minutes answering the questions in this booklet”, but not any more.

3   Anecdotally, not all markers appear to follow this methodology. Perhaps they too feel that any positive efforts should be acknowledged, even if ultimately ignored in the final score.

Have New Zealand’s PISA rankings really improved?

The PISA 2015 results are out and the Minister of Education is claiming an improvement in New Zealand’s rankings! Unfortunately, a closer look at the Mathematics scores shows that citing a move from 23rd place to 21st place as an improvement is pure fantasy.

Liechtenstein, ranked 8th in 2012, did not participate in 2015. Had they participated in 2015, it is unlikely their score (535) would have fallen by so much as to affect New Zealand’s ranking. New Zealand automatically went up by a place just because Liechtenstein pulled out.

Vietnam scored 511 in 2012 but has dropped back significantly to 495 in 2015, exactly the same score as New Zealand. It’s not clear to me why New Zealand was ranked one place ahead of Vietnam, and not the other way round.

These facts alone mean that New Zealand could easily be placed at 23rd again.

There are two other countries whose performance has affected New Zealand’s rankings. Australia has dropped significantly from 504 in 2012 to 494 in 2015. On the other hand, Norway has improved significantly from 489 in 2012 to 502 in 2015.

The net effect on New Zealand’s ranking is actually zero.
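The tally can be checked with a few lines of arithmetic. This is only a sketch based on the scores quoted above, plus New Zealand’s 2012 maths score of 500, which comes from the same OECD tables rather than this article:

```python
# Maths scores (2012, 2015) for the countries whose movement affects NZ's rank;
# None means the country did not participate that year.
nz_2012, nz_2015 = 500, 495
movers = {
    "Liechtenstein": (535, None),  # withdrew after 2012
    "Vietnam":       (511, 495),   # now ties NZ's score
    "Australia":     (504, 494),   # dropped below NZ
    "Norway":        (489, 502),   # rose above NZ
}
ahead_2012 = [c for c, (before, _) in movers.items() if before > nz_2012]
ahead_2015 = [c for c, (_, after) in movers.items()
              if after is not None and after > nz_2015]
# Three of these countries sat above NZ in 2012 and only one in 2015:
# a two-place "rise" that owes nothing to NZ's own (falling) score.
print(len(ahead_2012) - len(ahead_2015))  # 2
```

The two-place improvement is entirely explained by other countries’ movements, not by any improvement in New Zealand’s own score.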

A more mature approach to understanding the PISA results is to look at New Zealand’s recent and long-term score trends, relative to the OECD average.

From 2012 to 2015, all of our scores (in Maths, Science and Reading) have dropped, but in line with the OECD average. However, there is a much more concerning long-term decline, with a significant drop from 2009 to 2012, that does not follow the same trend as the OECD average. The 28 point drop in Mathematics from 2003 to 2015 is equivalent to nearly a year’s worth of schooling.

Of particular concern are the growing proportions of low-achieving children performing below Level 2. In Reading, students below Level 2 “have difficulty with all but the simplest reading tasks measured by PISA. Level 2 is considered a baseline level at which students begin to demonstrate the reading skills and competencies that will enable them to participate effectively later in life.” In Mathematics, “Level 2 is considered to be a baseline level at which students begin to demonstrate the competencies that will enable them to participate actively in mathematics-related life situations.” In 2015, 22% of New Zealand’s 15-year-old students could “complete only relatively basic mathematics tasks”, and their “lack of skills is a barrier to learning”.

pisa2015-nzscience

pisa2015-nzreading

pisa2015-nzmathsscores

pisa2015-nzmathsproficiency

Source: NZ Ministry of Education, PISA 2015: New Zealand Summary Report

PISA (Programme for International Student Assessment) is an international study that assesses and compares how well countries are educationally preparing their 15-year-old students to meet real-life opportunities and challenges. With our apparent long-term decline in all three subjects, and in conjunction with our perennial poor performance in TIMSS, can we honestly say that New Zealand is heading in the right direction?

Dr Audrey Tan, Mathmo Consulting
7 December 2016

MCAT (Mathematical Crisis, the Awful Truth) 2016

It’s time for New Zealand to look past the hysterical response to this year’s NCEA Level 1 MCAT exam and try to understand what’s really going on here.

Was the exam appropriate in level and difficulty?

In my previous post, I analysed the second of the two (supposedly) parallel papers and found that most of the questions were at a reasonable level for NCEA Level 1, and also reflective of the title “Apply algebraic procedures in solving problems”.

There was a section that was more investigative in nature and new for MCAT (but such questions have appeared in other Level 1 maths assessments in the past).  This section was made difficult by its poor construction and confusing wording, and most Level 1 students would have struggled to understand the intention.  But most exams have a Very Hard Question (VHQ), so I guess this is the VHQ for this exam.

Was it too different from previous years?

Apart from the investigative question, I don’t think so, but I might have said differently last year, when there was a noticeable step up.  From the 2015 MCAT Exemplar:

This year at least one of the three questions will not have any directed straight procedure-based parts and the other questions a maximum of one such part.…candidates will not be able to provide evidence by following a direction to solve factorised quadratics, factorise, expand, write or solve a linear equation, or simplify an expression involving the collection of like terms in response to being told to.  One part in each question may direct the student to perform such procedures; but without further evidence at Achievement level, this will not be sufficient for the award of the standard. Utilising procedures such as factorising, simplifying a rational function, or writing an equation from a word problem will provide evidence of solving a problem.  Candidates must know that given a word problem, they will be required to write equation(s) and demonstrate consistent use of these in solving a problem. Candidates will be expected to have a basic understanding of the relationship between a quadratic function and the associated graph.

MCAT was last reviewed in 2013 and is up for review at the end of this year.  Whether a change in style between reviews is appropriate should certainly be up for discussion.

So why did students find it so difficult?

The unfortunate reality is that students did struggle with this exam.  The gap between what MCAT is expecting of students, and what students are actually capable of, is widening.

There are complaints that the lack of “gimme” questions at the start of the paper has left students “shell-shocked” and “killed” their confidence.  Are we seriously saying that our students are capable of factorising a quadratic when explicitly told to do so, but they are unable to decode a basic word problem and factorise a supplied quadratic expression for themselves, even though they probably wouldn’t know what else to do with an expanded quadratic?  What does this say about the resourcefulness or resilience of our students?

We cannot blame this year’s Level 1 maths teachers for what has happened, and they should rightly feel insulted.  The problem started many years before this one.

Let’s do the maths.  Year 11 students in 2016 were Year 8 students in 2013.  This is the generation of students who were failing to grasp maths fundamentals such as fractions and decimals in Year 8.

What we’re really seeing here is the fruits of a flawed primary maths curriculum floating its way through the system.  Even two and a half years at secondary school isn’t enough to turn things around.  The damage is too great.

If you look at what the Numeracy Project was trying to achieve at primary school level, our secondary school students should, by all accounts, be highly numerate problem solvers, but in fact they are worryingly innumerate and apparently not very good problem solvers either.  It’s ironic that one of the big selling points of this “new approach” to teaching maths was the development of early “Algebraic Thinking”.  I think we can safely call that a Not Achieved.

A systemic failure in mathematics education is playing out before our very eyes.  NZQA is trying to inch up the standard, year by year, when the reality is that students are actually getting worse at algebra, year by year.  When students are struggling to master the basics, it’s hard to see how teachers can lift their students to the higher levels of problem solving now expected.

Given that next year’s Year 11 students will be the same generation of 9-year-olds who performed so abysmally in TIMSS 2011, alarm bells should be ringing loudly.  It would not be surprising if fewer students were entered for next year’s MCAT.

Spring forward, fall back

NZQA could make the MCAT easier again, but that would be disappointing.  I believe this year’s MCAT is the standard we should be aspiring to.  If the examination team could tighten up on the construction of certain questions, the MCAT would be an examination to be proud of on the world stage.  (The assessment side of things, however, needs a lot more work.)

And whilst I accept that normalisation is sometimes necessary, I do not think that assessment schedules should be adjusted to meet pre-defined targets as a standard practice.  The universities have already discovered that NCEA grades are an unreliable measure of preparedness for tertiary study.

The best thing NZQA can do is go back to examining algebra at the end of the year.

September is a really bad time of year for students to face their first high-stakes external examination.  Some students barely appreciate its significance when it is tangled up with mock exams for other topics and different subjects, while those who do appreciate its significance prioritise the MCAT at the expense of preparing for their mock exams.

The sensible thing to do, surely, is to fold it in with “Tables, Equations and Graphs”.  We’re already seeing questions about graphs in the MCAT anyway, and why shouldn’t we?  Algebra and Graphs are not separate topics; they are inextricably linked.  As we now see, NCEA’s compartmentalising of topics as separate assessments is hurting students’ ability to make connections and become effective problem solvers.

The decision to deliver the assessment earlier in the year and have it administered by the schools has a distinct whiff of cost-cutting about it, but it has been a disaster for maths education and is costing the country dearly.  If we want students to pursue STEM subjects at university, we need to give them every chance of succeeding in algebra at Level 1, as this effectively marks the fork in the road between calculus and statistics at Level 2.  If we want to increase the “dollar value” of Kiwis contributing to New Zealand’s economy, fixing our maths education system is a very good place to start.

Dr Audrey Tan, Mathmo Consulting
22 September 2016

Analysis of the 2016 MCAT Exam (Day 2)

The media is buzzing with excitement over last week’s NCEA Level 1 MCAT (Mathematics Common Assessment Task) examination.  Students are in tears and teachers are outraged over the exam that was “very different in style” and “far too difficult”.

For those who don’t know what the MCAT is, or why the MCAT might be so important, here are some salient facts:

  • The examination topic is algebra, specifically “Apply algebraic procedures in solving problems”.
  • This is an external NZQA exam, administered internally by the high schools in September. Prior to 2011, algebra was examined in November, as part of the three-hour end-of-year external exam.
  • Calculators are NOT allowed in this examination.
  • Schools may struggle to administer the exam in a single day, so there are two similar, but different, versions of the exam – one to be delivered on a Tuesday, the other on a Thursday.
  • The traditional assessment (as opposed to the trial online EMCAT) was made harder last year. From the official NZQA assessment (marking) schedule:  “The style of some of the questions in this year’s assessment has changed so as to align more closely with the requirements of the achievement standard. The title of the standard requires the candidate to use algebraic procedures in solving problems.”
  • Students who don’t do well in algebra at NCEA Level 1 will have limited options at NCEA Level 2. Such students are typically not allowed to study calculus at Level 2.
  • Students who don’t do well in algebra will struggle with calculus.
  • Students who don’t do well in algebra or calculus will find it difficult to pursue STEM subjects at university.

So just how different or difficult was the exam?  Now that the papers are in the public domain, I will review one of them (the Thursday paper) here.  If you aren’t interested in the details, you can skip to the conclusion here.

[Update: I have seen the assessment (marking) schedule and added comments below.  As with most NCEA exams, it’s not the questions I have so much of a problem with, it’s how they are assessed…]

image001.png

The intention here is probably for students to factorise the quadratic as image003.png and supply the two factors as the side lengths.  However, I suspect the examination team failed to notice that this question has infinitely many possible answers.  For example, another factorisation that students might have reasonably obtained is

image005.png

The wording of this question should have been sharpened so that the intended factorisation (if there was one) was made clear.  (See Question Three (a)(i) below.)

[Update: As predicted, the assessment schedule does not allow for any answer other than the intended factorisation.  I pity anyone who offered a different but valid answer.]

Poor wording aside, factorising quadratics is a basic algebraic procedure that is typically introduced to Year 10 students, and this question put a thin veneer of an application on top.  Level 1 students should be familiar with applying algebra in the context of measuring area, albeit in the opposite direction.  If they knew how to multiply two quantities to form the area of a rectangle, it doesn’t seem unreasonable to expect them to recognise that this question was asking them to reverse that process.

image007.png

This is a reasonable question.  Students at this level should be able to solve a quadratic equation that is not equal to zero to begin with, but can be made equal to zero and then factorised.

[Update: To my surprise, if this question was answered in full, it was worth an Excellence!  That should cheer up a few people.]

image008.png

This is a simultaneous equations question, not trivial, but reasonable.

[Update: If students were able to write down at least one equation correctly, that was worth an Achieved.]

image009.png

This is a basic procedure, namely adding algebraic fractions.  There is hardly anything to do here – they even supply the answer.  The subtraction of algebraic fractions appears more commonly in maths exams because students are frequently caught out by multiplication of negative numbers.

image010.png

This is probably one of the questions that some people are suggesting is more suitable for NCEA Level 2.  I am inclined to disagree.  Certainly, a general index equation might require logarithms to solve it, but not this one.  As long as students recognise that image011.png, this question is manageable.

Furthermore, this index question is not much harder than index questions in previous years.  Last year’s trial online EMCAT asked “What is the connection between image013.png and image015.png if image017.png?”  Again, as long as students know that image019.png, they should be able to say something sensible.  In the 2014 MCAT, students were asked to solve image021.png.

In summary, I feel that Question One is fair.  Part (a)(i) might have deviated from early questions in past papers of the “Factorise this” or “Simplify that” variety, but the application was not particularly difficult or surprising.  As NZQA pointed out last year, the title of the standard is “Apply algebraic procedures in solving problems”.

image023.png

This question is an algebraic substitution question.  The fact that the equation modelled a parabola is again a thin veneer of an application.  If students didn’t know what a parabola was, I hope they managed to ignore it and press on.

image024.png

Solving algebraic inequalities seems to unsettle students, even though the steps required are almost identical to the steps required to solve algebraic equations – there is only one extra thing to remember, which is to reverse the inequality if multiplying or dividing by a negative number.  All steps required to solve this question should be known to students at this level.  [Update: Even just one correct expansion was worth an Achieved.]

Interestingly, this was one of the few occasions where the parallel question in the Tuesday paper was noticeably trickier at first appearance, because there were identical factors on both sides of the inequality and it might have been tempting to cancel them out.  Unfortunately, that would have led to something nonsensical.  The lack of equivalence between these parallel questions is something that should have been picked up by the examination team and corrected.

image025.png

This is potentially another question that looks like a Level 2 question, but isn’t.  A reasonable first attempt at solving this would be to rewrite the inequality as image026.png.  A sharp observer might notice that

image029.pngimage031.pngimage032.png

Either way, as long as students know their powers of 2 up to 32, they should see that the whole number image035.png has to be less than or equal to 5.  As noted above, index questions of this type have been asked in previous years.  Changing an equality symbol to an inequality symbol does not affect the algebra required to solve the problem.

image036.png

This question requires students to expand and simplify two expressions and then look for similarities between them.  If students had seen a similar 2014 MCAT question, in which students were asked to write image037.png in terms of image039.png, then they would have been adequately prepared to attempt this question.  [Update: Even just one correct expansion was worth an Achieved.]

image041.png

OK, this is where things start to get interesting!  Students were given 9 lines of paper in which to conduct their investigation and answer this part of the question.

Firstly, what is meant by “when Janine changes the order of the numbers in Line 1”?  The directness of the question “Does she get the same answer as in Line 4?” suggests that we are supposed to investigate a single event.  But there are 23 ways in which Janine could change the order of the numbers.  Are students expected to try all 23 ways?  Over 9 lines of paper, probably not.  If we are meant to infer that Janine changes the numbers only once, how can we possibly investigate if we don’t know what the new order looks like?  The wording of the question is decidedly murky.

In actual fact, the answer is “yes” or “no”, depending on how the numbers are re-ordered.  I’ve seen probability questions in which a “yes” is an Achieved answer but a “no” is a Merit answer, but at this early stage in this question, students would be choosing a new order at random, so it would be unfair if either answer fell into a different achievement band.

[Update:  If you tried one rearrangement and said “yes” or “no”, that was worth an Achieved.  So even though I understood the question well enough to perform a succinct investigation with the preferable answer of “no”, I would have only gotten an Achieved.  If you tried two rearrangements but didn’t say anything, that was also an Achieved.  The wording of the Merit criteria is just as nonsensical as the question, but I think they want at least two rearrangements with a point of difference and some sort of statement about one’s findings.]

image042.png

I had to parse this sentence many times and came to the conclusion that it doesn’t make a lot of sense, even after tweaking it to read “Find, using algebra, the relationship between the numbers in Line 1 and the numbers in Line 4 when she changes the order of the numbers in Line 1.”  If students are meant to investigate what happens to the expression for the number in Line 4 after the numbers are changed, then it has not been made very clear.  A better question would be “Using algebra, show how the number in Line 4 might be affected if Janine changes the order of the numbers in Line 1.”

This question doesn’t feel like problem solving.  We are using algebra to make a general observation.  Students at Level 1 will have very little experience of using algebra in this way, but such questions have been seen in other Level 1 Maths assessments in the past, namely “Tables, Equations and Graphs”, an end-of-year assessment concerned specifically with the application of algebra to graphs.

It is certainly possible to express the number in Line 4 in terms of the numbers in Line 1 at any given point in time.  For example, if the numbers in Line 1 are called image043.png, then the number in Line 4 is image045.png.  But perhaps we are supposed to pay attention to the pattern of the example numbers 2, 4, 6, 8 and call the numbers image047.png?  Is this important or is this a distraction?

I would conjecture that a student would be better able to answer part (i) after answering part (ii).  That is the whole point of algebra, after all.  It enables us to see patterns in numbers, or at least understand better the patterns that we see, because the numbers themselves often get in the way.  Assuming there is a pattern in the numbers in Line 1 gets in the way of truly understanding any pattern that might be observed in the triangular formation, and ideally we should not be distracted by this.  So my preferred setting of the first two parts of Question Two would be split into three parts:

  • Janine changed the order of the numbers in Line 1 and found that it changed the number in Line 4. What might Janine’s new ordering look like?
  • Janine wonders whether changing the order of the numbers in Line 1 will always change the number in Line 4. Use algebra to find the relationship between arbitrary numbers in Line 1 and the resulting number in Line 4.
  • Use your expression for the number in Line 4 to explain how Janine could change the order of the numbers in Line 1 but not change the number in Line 4.

image049.png

If students were brave enough to attempt this question, hats off to them!  I quite enjoyed this question myself, but only students with a good understanding of divisibility would understand how to interpret the algebra in this question.  Furthermore, open questions such as “what do you know?” are only fair if the answers are marked “openly”.  Unfortunately, that’s not the case with NCEA.  They still have something specific in mind, answered to a greater (Excellence) or lesser (Achieved) extent, or somewhere in the middle (Merit).  Therefore, students should reasonably expect to be given better guidance as to what is intended by the question.  E.g. “If the number in Line 4 is divisible by 3, then identify the position of the number or numbers in Line 1 that are divisible by 3.”

My final comment about this question concerns its lack of continuity.  In part (iii), we were to assume there was a pattern in the numbers in Line 1, but it wasn’t the pattern demonstrated at the start of the question.  It appears that the 2, 4, 6, 8 pattern was a red herring, and the examination team should have chosen numbers that appeared to be more random.

In summary, Question Two was fair up to part (d).  Part (e) was poorly written and too hard for Level 1.

image050.png

This question is almost identical to Question One (a)(i) but with an important difference.  One of the side lengths is given, which means the intended factorisation of the quadratic expression for the area has been made clear.  This raises the question: why was Question One (a)(i) even included??

image051.png

Another open question!  What’s wrong with “State any restrictions on the value of image052.png for this rectangle”?

[Update: This is amazing.  If you answer the question correctly, you only get an Achieved.  If you say something about the area or the side lengths, i.e. more than was asked for, you get a Merit!  If they wanted commentary on the area or the side lengths, why didn’t they just say “Explain.” at the end?]

image054.png

This is a “changing the subject of the formula” question, and there have always been questions of this type in previous years.

image056.png

Er, what’s this question doing here?  This is the sort of basic procedural question that would have been better placed as Question One (a)(i).

image057.png

This is not a trivial question, but it is suitable for Level 1.  However, it’s worth noting that the parallel question in the Tuesday paper resulted in a quadratic equation that was different enough to be inequivalent.  The Tuesday students may have struggled more than the Thursday students.  This should not have happened.

[Update: Sure enough, the Tuesday question must have been done very badly, because you could get Excellence even if you didn’t quite solve the equation!  I’ve also heard reports of students (not ours!) getting caught up trying to calculate the square root of 8 without a calculator…]

image058.png

If students weren’t sure how to answer this question, looking at (ii) would have given them a good clue!

image060.png

Like many NCEA questions, this is quite ‘wordy’ and requires a high level of literacy to understand the question.  It also requires knowledge about the features of the graph of a quadratic expression, and the use of algebra to solve a quadratic equation.  The icing on the cake is to form a percentage from the two quantities obtained (the maximum horizontal width and the width at a vertical depth of 3cm).

[Update: Solving the equation, i.e. the bulk of the work and the algebraic heart of the problem, was only worth a Merit.  The trivial step of calculating a numerical percentage at the end was what it took to get an Excellence.  Seriously?? Where is the extended abstract thinking in there?  I suspect the examination team pulled this question from a traditional “marks out of 100” paper but failed to modify it for NCEA.]

Although I think this is a reasonable Excellence question for this assessment, it is worth noting that this question could easily appear in the “Tables, Equations and Graphs” assessment at the end of the year.  There needs to be some discussion about whether or not teachers and students should expect the same knowledge to be potentially assessed twice.

It seems that some schools had not yet taught their students about graphs, but in all fairness, last year’s MCAT exam had questions that required graphing knowledge.

In summary, Question Three was challenging but fair, provided students had been taught the appropriate material.

In conclusion…

This exam wasn’t a walk in the park, but actually most of the questions were fair for Level 1, even if they weren’t identical in form to past exam questions.  There were certainly some poorly-worded questions, but unfortunately I see them in NCEA maths exams every year.  MCAT 2016 is by no means the exception.

It is true that the MCAT now has fewer basic questions that test purely algebraic procedures, but most of these procedures should be introduced in Year 10, so it is not unreasonable to expect students to be ready to apply them in Year 11.  Given that the change in style occurred last year, I am surprised that the huge uproar didn’t occur 12 months ago.

Most importantly, I believe this year’s MCAT is the standard we should be aspiring to.  Media reports suggest the reality in schools is very different.  I will discuss this in my next post.

Dr Audrey Tan, Mathmo Consulting
20 September 2016

What are the chances of completing a Disney-Pixar Domino Stars collection from a fixed number of packets?

It’s a strange thing for a mathematical person to say, but some people are luckier than others. Luckily for our son (or his parents), we shop at Countdown, which means he received a regular supply of Disney-Pixar Domino Stars throughout the 6.5-week promotion.

It was all a bit of a yawn to me, this time round. Having calculated the chances of collecting a full set of Dreamworks Heroes Action Cards, I was neither surprised nor frustrated by the increasing number of duplicates as his collection grew. The theoretical hyperbolic growth of the number of packets he would have to open to find a new domino implied a spend of $4480 to complete a set of 50. Almost $700 per week?? Not very likely.
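For readers curious about the $4480 figure, the standard coupon-collector argument reproduces it. This is only a sketch under assumptions the promotion never published: each packet holds one of the 50 dominoes, chosen independently and uniformly, and one packet is earned per $20 spent at the checkout.

```python
from fractions import Fraction

n = 50  # dominoes in a full set

# With k distinct dominoes in hand, a packet is new with probability (n - k)/n,
# so on average n/(n - k) packets are needed for the next new one; this growing
# term is the "hyperbolic growth" mentioned above.
expected_packets = sum(Fraction(n, n - k) for k in range(n))  # equals n * H_n

print(float(expected_packets))  # ≈ 224.96 packets
# At one (assumed) packet per $20 spent, that is roughly 225 * $20 ≈ $4500,
# the same order as the $4480 quoted above.
```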

To cut a long story short, on the last day of the promotion, we trudged home from a swap meet needing just two more to complete the set. Our chances of finding those last two looked bleak…except that a neighbour had just dropped off a big bag of 22 packets in our letterbox.

As I said, some people are luckier than others, and to our amazement, our son actually found the very two that he needed! Now how lucky is that?! I mean, what are the chances…?

Without going into too much detail, you can either focus on what you don’t want to happen, i.e. you get none of one of the coveted dominoes and any number of the other:

domino-stars-1

or you can focus on what you do want to happen, i.e. you eventually find one of the two that you’re looking for, and then hope to find the other one amongst the packets remaining:

domino-stars-2

But a friend of ours came up with an expression that best reflects what really happened, i.e. our son stopped opening packets as soon as he found the two he needed:

domino-stars-3

Reassuringly, all three expressions evaluate to this rather impressive looking fraction:

domino-stars-4

So it was very close to a 1 in 8 chance. Our son doesn’t realise just how lucky he is.
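As a sanity check on the “1 in 8” figure, here is an inclusion-exclusion sketch under the simplest model consistent with the discussion above: each of the 22 packets holds one of the 50 dominoes, independently and uniformly.

```python
from fractions import Fraction

n, packets = 50, 22  # set size, and packets left to open

miss_one = Fraction(n - 1, n) ** packets  # a particular wanted domino never appears
miss_two = Fraction(n - 2, n) ** packets  # neither wanted domino appears

# P(both wanted dominoes appear) by inclusion-exclusion
p_both = 1 - 2 * miss_one + miss_two

print(float(p_both))  # ≈ 0.125, i.e. very close to 1 in 8
```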

Dr Audrey Tan, Mathmo Consulting
September 2015