John Jerrim, Jake Anders, Silvan Häs, Nikki Shure & Laura Zieger

When the Programme for International Student Assessment (PISA) results are released every three years, it is now little surprise that a set of East Asian nations (e.g. Singapore, Taiwan, Japan, South Korea) dominate the top spots in these rankings. These nations typically substantially outperform most English-speaking Western nations, with one important exception – Canada. This has not gone unnoticed by policymakers and the education media. Indeed, after the release of the PISA 2015 results, Canada was described as an “education superpower”, with various theories (from the strong academic performance of immigrants through to high levels of student motivation) put forward to explain this result. Andreas Schleicher – who leads the OECD’s PISA programme – has suggested that Canada’s strong commitment to equity is the key.

But how much confidence can we really have in the Canadian PISA results?

One of the key pillars of PISA is meant to be that it is representative of each country’s 15-year-old population. If this is not achieved, then we are not comparing like-with-like. For instance, if country A were to disproportionately exclude some groups of students, then it cannot be fairly compared to country B where a representative cross-section of young people did actually take part. This situation could emerge if, for instance, children with Special Educational Needs (SEN) are identified and treated differently across countries. Alternatively, in some countries, a significant number of students and schools may refuse to participate in the study.

The reality is that this is exactly what happens in PISA – and we believe it could substantially undermine the Canadian results.

This point is illustrated in Figure 1, which draws upon figures reported in the PISA 2015 technical report. Clearly, the figures for Canada are striking. Only around half (53%) of 15-year-olds in Canada were covered within the PISA 2015 assessment. This compares to more than 90% of 15-year-olds in Japan and South Korea.

Figure 1. The number of 15-year-olds in Canada, Japan and South Korea and the (weighted) number covered within the PISA assessment.

Why do these figures for Canada look so bad? There is a mix of reasons.

First, schools in Canada were more likely to refuse to take part than schools in other countries, with the Canadian national report flagging particular issues within Quebec (where fewer than half of the schools approached agreed to take part, see Table A2). If certain types of school (e.g. those with lower-performing students) are less likely to take part than others (e.g. higher-performing schools), then this could lead to an upward bias in the Canadian PISA results.

Second, Canada was much more likely to exclude pupils from taking the PISA assessment due to issues such as Special Educational Needs (7.5% of 15-year-olds were excluded in Canada, compared to 2.4% in Japan and less than 1% in South Korea) – a group who are likely to be very low achievers.

Finally, students in Canada were less likely to actually sit the PISA assessment – even within schools that agreed to take part. Specifically, the official figures show that almost 20% of Canadian teenagers were counted as absent on the day of the PISA test compared to less than 3% of those in Japan and South Korea. It is well-known that certain types of student (e.g. lower achievers from lower socio-economic backgrounds) have higher absence rates from school and these characteristics are likely to be associated with performance on the PISA test. It hence seems likely that this would lead to an upward bias in the results.
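To see how these three factors compound, a rough back-of-the-envelope calculation helps. The sketch below is purely illustrative – it is not the OECD’s official coverage-index formula, and the school response rate used is an assumed placeholder (the exclusion and absence figures come from the text above):

```python
# Illustration of how exclusions, school non-response and student absence
# compound multiplicatively to shrink PISA population coverage.
# NOTE: this is NOT the official PISA coverage index, just a sketch.

exclusion_rate = 0.075   # pupils excluded (e.g. SEN) - figure from the text
absence_rate = 0.20      # pupils absent on test day - figure from the text
school_response = 0.85   # ASSUMED school participation rate (hypothetical)

coverage = (1 - exclusion_rate) * school_response * (1 - absence_rate)
print(f"Approximate population coverage: {coverage:.0%}")  # prints 63%
```

Even with a fairly optimistic assumed school response rate, coverage falls to around 63% – well below the 90%+ achieved by Japan and South Korea, and further within-school exclusions and non-response would push it lower still, towards the 53% Canada actually recorded.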

Together, this adds up to a significant problem, which we believe substantially undermines our confidence in the PISA 2015 data for Canada. There are particular problems in drawing comparisons with other “high-performing” countries – Japan and South Korea in our example – where a genuinely representative cross-section of children took part. Indeed, after scratching below the surface, the evidence for Canada being an “education superpower” does not seem particularly strong at all.

About the author(s)

John Jerrim

John Jerrim is Professor of Education and Social Statistics in the Institute of Education, University College London. His research interests include the economics of education, access to higher education, intergenerational mobility, cross-national comparisons and educational inequalities. He has worked extensively with the OECD Programme for International Student Assessment (PISA) data, with this research reported widely in the British media.

Jake Anders

Dr. Jake Anders is Associate Professor of Educational and Social Statistics and Deputy Director (CREATE, Early Years, and Schools) in the Centre for Education Policy and Equalising Opportunities at UCL Institute of Education, University College London. Jake's research focuses on understanding the causes and consequences of educational inequality and the evaluation of policies and programmes aiming to reduce it.

Silvan Häs

Silvan is a PhD student at UCL’s Institute of Education and Marie Curie Research Fellow as part of the OCCAM network.  In his research he focusses on parental worklessness and how it affects children’s education trajectories, using PISA data and the UK’s Millennium Cohort Study.

Nikki Shure

Dr. Nikki Shure is Lecturer (Assistant Professor) of Economics at University College London Institute of Education and Research Affiliate at the Institute for Labor Economics (IZA). Nikki's research interests include the effects of childcare on maternal labor supply, non-cognitive skills and educational outcomes, gender and ambition, international comparisons of education systems, and inequalities in access to higher education and the labor market.

Laura Zieger

Laura Zieger is a PhD student at the Department of Social Science at the Institute of Education, University College London. She is part of the European Training Network OCCAM which supports her research on the statistical methodology behind international large-scale assessments.