
A glowing school report for NZ? In part – but beware the shallow score-keeping

The real value of assessments such as TIMSS and PISA lies not so much in the crude comparisons as in the painstaking analysis of particular strengths and weaknesses that are revealed in our students’ performance in particular areas of the curriculum, writes John O’Neill.

The last fortnight has seen the release of results in the 2015 Trends in International Mathematics and Science Study for Year 5 and Year 9 students, and of the 2015 Programme for International Student Assessment for 15-year-olds (mostly Year 11). Depending on one’s preferred metaphor, they are neither triumph nor disaster, but may be something of a curate’s egg.

These international assessments, abbreviated as TIMSS and PISA, provide four-yearly opportunities to compare the performance of New Zealand learners with that of their same-age compatriots from previous assessment cycles, and with their same-age peers in other countries in the same assessment cycle.

A roadside sign for a school bus route along a country road.

On bus trips to school, New Zealand students are all ‘TIMSS this’ and ‘PISA that’

A first response typically reports country rankings, average scores and longitudinal trends in country and student performance. Has New Zealand’s country ranking improved or deteriorated relative to other countries and to previous cycles? Has the average score of New Zealand children increased or decreased? How have rankings and scores changed over time?

The answers to these trivial questions, and the narratives that accompany them, mostly reflect popular cultural and political emotions similar to those experienced while couch surfing major international sporting or cultural competitions (satisfaction, pride, disappointment, puzzlement, envy). But such narratives have little interest in what occurred during the months or years of everyday practice and repetition that preceded the event performance itself.

A second response is to identify those countries which are at the top of the rankings, and which consequently tend to have both the largest proportions of higher performing students and the smallest proportions of lower performing students. These countries are typically seen as exemplary, which inevitably leads to the lament: If only we could teach and learn more like them, our scores and our ranking would improve.

New Zealand is commonly praised for its above-average country score, but criticised for the magnitude of the gap in average scores between its highest and lowest performing students. This is typically described as an issue of equity because many of the lowest performing students are those living in the greatest material deprivation.

Māori and Pasifika students are over-represented among the materially deprived. Māori and Pasifika students also tend to perform somewhat below their non-Māori and non-Pasifika peers in the same socioeconomic circumstance. This raises the prospect that both material deprivation in family and community settings and insufficient cultural responsiveness in teaching, learning and curriculum contribute to lower levels of performance in international assessments.

Some other countries with significant proportions of students who live in material deprivation and/or which are highly culturally diverse do not have as wide a gap between the scores of the highest and lowest performing students. This argument permits officials to largely discount the effects of material deprivation on the grounds that if some countries, schools and teachers can enable materially deprived students to succeed, then there is no reason why all materially deprived students should not succeed. On this logic, it is the adoption of overseas successes that will reduce the gap effectively and efficiently.

A major problem with this argument is the assumption that because TIMSS and PISA each use the same assessments with same-age students, they are comparing apples with apples. This may arguably be true of the assessments in terms of their validity and reliability, but it ignores a number of education policy factors that are beyond the control or responsibility of the teacher and school.

First, there may be a stronger correspondence between what is taught, what is learned and what is tested in some high performing countries. Some countries that do well in international assessments ensure that their official curriculum deliberately and systematically prepares students for the kinds of questions they will be asked in international assessments. In New Zealand, while there is a national curriculum, it is liberally interpreted. There are many different combinations of subject choices and NCEA credits that students accumulate by Year 11, even in English, mathematics and science. Equally, not all New Zealand primary schools have teachers who are confident in their understanding of mathematics and science, so students may experience too little learning that invites and sustains their interest in these learning areas.

On the one hand, international assessment oriented curricula could be argued to prepare students well in some of the most important knowledge, skills and dispositions they need to succeed in later study and employment. On the other hand, these assessments “only” cover mathematics, science, reading and, in PISA 2015, collaborative problem-solving. There are many other important things to learn at school about education, cultural identification, employment, life and civic participation. It could well be that New Zealand performs much better in these other important areas than some countries that do better than us in international assessments. Not everything of educational value can or should be tested.

Second, there are long-standing disputes in the reading, mathematics and science communities about how best to teach these learning areas to students. Teachers are expected to teach according to national guidelines and to participate in officially endorsed and centrally funded professional learning opportunities, despite the reality that experts disagree and talk past each other about what teachers and learners should be doing in the classroom. Moreover, especially since the introduction of National Standards, teachers’ access to high quality professional development in these (and other) learning areas has been reduced, while the funding that is made available is instrumentally targeted at improving the outcomes of learning to meet Better Public Services targets, not the quality of learning experienced by the child. It has been observed, for example, that all countries that have introduced high stakes testing have subsequently seen their national performance decline in TIMSS and PISA.

Third, when politicians, think tanks, experts and others look overseas for “fixes”, they almost invariably cherry-pick evidence to suit their particular interests and agenda. Several years ago an Australian think tank report examined teaching and learning practices in high performing South East Asian countries and recommended adoption of what had been observed: larger classes, whole class teaching, direct instruction in preference to inquiry methods, regular peer observation of teachers, and regular practice by teachers of model lessons. There was, however, little discussion of the fact that teachers in these countries have on average many fewer contact hours than teachers in New Zealand; nor of the reality that in these countries parents spend large proportions of their income on supplementary private tutoring for their children, that students spend much of their time not at school doing homework, that entry to the best universities and the most lucrative employment pathways is entirely dependent on students’ scores and rankings in end-of-school examinations, that education may be more highly valued in cultural terms, and so on. All these factors shape learning and educational success too.

PISA and TIMSS do have value, but the value lies in painstaking analysis of the particular strengths and weaknesses that are revealed in our students’ performance in particular areas of the curriculum. This provides a starting point for conversations about how teaching and learning, curriculum and policy settings, including resourcing, may need to change to better support learning. Couch surfing PISA and TIMSS results tells us nothing of educational value.

This content is brought to you by AUT. As a contemporary university we’re focused on providing exceptional learning experiences, developing impactful research and forging strong industry partnerships. Start your university journey with us today.
