Do Maths exams hinder learning?

Sarah Tang and Candia Morgan consider how the design of Maths exams affects students' mathematical skills and their experience of the subject.


The debate about falling school standards, in mathematics and other subjects, is rehearsed every year as GCSE results are released. Meanwhile, employers’ claims that the mathematics skills of school leavers and graduates are declining – in the face of apparently improving exam results – are regularly reported in the press.

Yet the question of standards over time is deeply complicated by changes in the aims and content of the curriculum and of exams, and in the demographic makeup of the groups taking different exams.

High stakes

Since the introduction of national testing and the publication of league tables of results, exams taken in English schools have involved increasingly high stakes. In a climate of increased competition between schools, secondary schools are judged by the percentage of students who achieve at least 5 A* to C grades, including English and Mathematics, in their GCSE exams. There are strong incentives for teachers to ‘teach to the test’. Recent moves to make Ofsted inspections more data-driven are also likely to focus attention and effort on those students deemed to be on the C/D ‘borderline’.

Teachers and students work hard to achieve high standards, and they deserve a serious and measured evaluation of their achievements. Exam results clearly matter for students, but they also matter for schools. With public exams driving much of what happens in classrooms, the role of the exam is elevated, so it is crucial that we understand more about the content and nature of exams and how these have changed over time. This matters for curriculum designers, examiners and teachers, and for ensuring that exams support the aims of the national curriculum.

"There is a tension between assessing challenging mathematics, and enabling inclusive student access to high-stakes qualifications"

In our research, we do not ask “Have maths exams got easier?” directly, as we feel that this question is overly simplistic. We focus our attention instead on understanding what is different over time, and how these differences may play out in terms of the types of mathematical skills, and of mathematicians, that our schools produce. In doing so we find that the tension between the competing aims of the school maths curriculum is played out in the exam questions – a tension between assessing challenging mathematics, and enabling inclusive student access to high-stakes qualifications. One aim of school mathematics is to “provide a strong mathematical foundation for students who go on to study mathematics at a higher level post-16” (DfE, 2013, p. 3); but at the same time there is an expectation that “All students will develop confidence and competence” with the majority of the content (DfE, 2013, p. 4).

Design differences

Our project – The Evolution of School Mathematics Discourse – considers changes in higher-tier Mathematics GCSE exams over the past 30 years, and how these differences affect what students do. We analysed over 500 mathematics exam questions from the whole of this period, and found differences between years and between exam boards in the kind of mathematics that these exams expect students to engage in.

We looked at a wide range of indicators to analyse change over time. For example, we looked at the proportion of instructions to students to engage in mental processes such as ‘prove’ and ‘decide’, compared to instructions to perform material operations such as ‘calculate’ or ‘write down’. We also considered how much the student’s answer was supported and shaped by the format of the questions: were units provided in the answer line or in the question, and how was the space for the answer defined?

Some of these changes may appear too subtle to have any impact on how the student tackles the question. But we have found through student tests and interviews that changes in the wording, structure and form of questions have considerable impact on how the student engages with the task and on what mathematics is actually attempted.

In more recent exams, questions have tended to be more structured, with units provided for answers, with diagrams provided by the examiner, and with restricted space for working out. We have found that although this support enables more students to gain some success, when less support is provided, students tend to engage more with the context and have opportunities to show their ability to construct complete chains of reasoning.

However, students who were otherwise capable of solving geometric problems were surprised to be faced with a question in which they were not given a diagram, and found it difficult to construct one from the verbal information provided. This suggests that the practice of always including diagrams in exam questions has allowed the key mathematical skill of constructing diagrams to be neglected. When designing an exam, then, it is important to be aware that providing too much support can restrict students’ opportunities to show what they are capable of, and may affect the quality of their mathematical experience as teachers prepare them for the exams. This in turn may affect students’ preparation for further study of mathematics, both in terms of the skills they have acquired and their interest in the subject.

Question context: reality or ritual?

The application of mathematics in a range of contexts is another important feature of the curriculum. Our findings on the use of ‘everyday’ contexts in exam questions show that while the contextualisation of mathematics increased after the introduction of the GCSE in 1988, there was a return to the pre-GCSE level by 2011.

"When designing exams, too much support can restrict students' opportunities to show what they are capable of, and may ipact on the quality of their mathematical experience"

In earlier years, some contextualised problems demanded the use of algebra, but by 2011 almost all algebra questions were entirely abstract. In many topic areas, the depth of engagement with context reduced over the years and more questions involved what we term only a ‘ritual’ engagement with context, following patterns that students would be very familiar with from their classroom experience. In data-handling questions, the contextualisation became increasingly trivial and irrelevant, often simply naming objects that had been counted, but only requiring students to manipulate numbers, without referring back to the original context. We gave students questions involving different levels of contextualisation. Our analysis of their responses suggests that they can answer questions with ‘ritual’ contextualisation without paying attention to the context, whereas less familiar contexts can engage them deeply in making sense of the situation and applying their mathematical knowledge to its critical features.

When the context of a question is trivial or routine, we need to ask whether the objective of including ‘everyday’ applications in school maths is being met. Recent changes to GCSE mathematics have responded to a call for more functional elements and applications of mathematics. Our project did not extend to a substantial look at the most recent exams. But comparative analysis of a small number of questions suggests that some of the trends we describe here may be starting to reverse.

As we write, further changes are being made to the Key Stage 4 curriculum and to GCSE exams. Given that the current data-driven accountability system for judging school quality is unlikely to change any time soon, it is paramount that we consider what and how high-stakes exams are asking students to learn. We need to consider what mathematical knowledge, skills and attributes we want our young people to develop, and whether the form of the exams supports teachers and students in achieving them.

Sarah Tang is a research officer and Candia Morgan is Professor of Education in the Department of Curriculum, Pedagogy and Assessment at the Institute of Education, University of London.

References: 
  1. Department for Education (2013). Mathematics: GCSE subject content and assessment objectives. Reference: DFE-00233-2013. Accessed 18 February 2014.
