Is Sweden’s PISA Score Overstated? A Look at the Data
Linda Borger
Sweden’s 2018 PISA results sparked both celebration and controversy. When the first results were released, it seemed that Swedish students had made significant gains in reading, math, and science. But behind the headlines, a debate was brewing about how representative the sample of students really was. In a new study, we used detailed national register data to investigate whether Sweden’s PISA scores are an accurate reflection of student achievement. Our findings suggest that the sample may significantly overestimate Sweden’s educational performance.
PISA and the Importance of Sampling
The Programme for International Student Assessment (PISA) aims to compare student performance across countries every three years, but for these comparisons to be meaningful, the samples must be representative of each country’s student population. If large groups of students are excluded or fail to participate, the results can be biased, leading to inflated or deflated national scores. This study examines the Swedish PISA 2018 sample by comparing it to comprehensive register data, which includes all students born in 2002.
Findings: Overestimated Achievement and Narrowed Gaps
1. Higher Average Achievement
We found that the Swedish PISA sample substantially overestimated average achievement. Compared with national records, the sample's scores were consistently higher on several measures. For example, the PISA sample's average GPA was about 15 points higher than that of the full population, a gap corresponding to roughly one year of learning. This discrepancy suggests that Sweden's overall PISA score may overstate the country's true educational performance.
2. Less Variation in Scores
Not only did the PISA sample show higher average scores; it also captured less variation in performance. The standard deviation of the sample's scores was markedly lower than that of the national register data, meaning the sample did not reflect the full range of student performance. This narrowed spread suggests that lower-achieving students may have been missed, which can give a distorted picture of Sweden's educational landscape.
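To make the mechanism behind findings 1 and 2 concrete, here is a minimal Python sketch of how under-coverage of low-achieving students pushes a sample's mean up and its standard deviation down relative to register data. The GPA distribution, the selection model, and all numbers below are invented for illustration; this is not the study's data or code.

```python
import numpy as np

# Illustrative only: invented numbers and an invented selection mechanism,
# NOT the study's data or code.
rng = np.random.default_rng(seed=1)

# Hypothetical register data: GPA for an entire national cohort.
population_gpa = rng.normal(loc=230, scale=55, size=100_000).clip(0, 340)

# Hypothetical tested sample that under-covers low achievers: the lower a
# student's GPA, the less likely they are to end up in the sample.
inclusion_prob = 1.0 / (1.0 + np.exp(-(population_gpa - 180.0) / 40.0))
in_sample = rng.random(population_gpa.size) < 0.06 * inclusion_prob
sample_gpa = population_gpa[in_sample]

print(f"Population mean {population_gpa.mean():6.1f}, SD {population_gpa.std():5.1f}")
print(f"Sample mean     {sample_gpa.mean():6.1f}, SD {sample_gpa.std():5.1f}")
print(f"Mean gap        {sample_gpa.mean() - population_gpa.mean():+6.1f} GPA points")
```

Running the sketch shows the sample mean sitting above the population mean and the sample's standard deviation shrinking, the same pattern the register comparison revealed for the Swedish PISA 2018 sample.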
3. Under-representation of Certain Student Groups
We also found that students from lower socio-economic backgrounds and students with an immigrant background were underrepresented in the Swedish PISA sample, while native-born students and children of highly educated parents were overrepresented. Because the underrepresented groups tend to face greater educational challenges, their weaker coverage may well have contributed to the inflated scores.
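The same register-versus-sample logic applies to group composition. The short Python sketch below tabulates over- and under-representation by comparing group shares in a sample with shares in the full cohort; the shares are invented for demonstration and are not figures from the article.

```python
# Illustrative only: the group shares below are invented for demonstration
# and are NOT figures from the article.
register_shares = {   # share of each group in the full register cohort
    "immigrant background": 0.25,
    "parents without tertiary education": 0.40,
    "parents with tertiary education": 0.45,
}
sample_shares = {     # share of each group among students actually tested
    "immigrant background": 0.19,
    "parents without tertiary education": 0.33,
    "parents with tertiary education": 0.52,
}

for group, pop_share in register_shares.items():
    diff = sample_shares[group] - pop_share
    status = "over-represented" if diff > 0 else "under-represented"
    print(f"{group:38s} register {pop_share:5.0%}  sample {sample_shares[group]:5.0%}  "
          f"{diff:+5.0%} ({status})")
```

A comparison of this kind only requires that the register data record the same background variables as the assessment, which is what makes the Swedish registers such a useful benchmark.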
Why Does This Matter?
For policymakers and educators, reliable data is essential for making informed decisions. The inflated scores in Sweden’s 2018 PISA results could lead to an overly optimistic view of student achievement, potentially downplaying the need for targeted interventions. This could result in less support for students from disadvantaged backgrounds or fewer resources to address educational inequality.
As many countries face similar challenges with non-representative samples in large-scale assessments, the Swedish example underscores a broader issue. For PISA and similar assessments to provide a truly global benchmark, their sampling methods must be as rigorous and inclusive as possible.
Moving Forward: Stricter Standards Needed
Our findings call for stricter standards in PISA sampling to ensure that results accurately reflect the full student population. This includes lowering the allowable exclusion rates and raising participation requirements. Such adjustments would help future PISA results present a more balanced picture, giving policymakers a clearer idea of where educational support is most needed.
Ultimately, accurate data is essential for understanding and improving educational outcomes. Sweden’s experience with the 2018 PISA assessment is a reminder of the importance of scrutinizing data sources and methods to gain meaningful insights and, ultimately, a more equitable education system.
Read the full article: Borger, L., Johansson, S., & Strietholt, R. (2024). How representative is the Swedish PISA sample? A comparison of PISA and register data. Educational Assessment, Evaluation and Accountability, 36(3), 365–383.

