Statistics comparing countries with very different education systems can be difficult to interpret reliably. But some institutions do provide comparative figures, which are used, for example, to compare education spending or levels of achievement across countries.

The best source of information on how the UK system of education compares with those of other countries is the reports produced by the Organisation for Economic Co-operation and Development (OECD). Its regular publication Education at a Glance is available online and covers participation, achievement, spending, lifelong learning and conditions for pupils and teachers, as well as international trends.

The OECD Programme for International Student Assessment (PISA) is conducted every three years and compares the achievement of 15-year-old students in reading, maths and science, with one of the three subjects taking the main focus in each cycle. PISA produces several reports each year. Teaching across the OECD is compared by the OECD TALIS (Teaching and Learning International Survey).

The International Association for the Evaluation of Educational Achievement (IEA), an academic organisation, conducts two much longer-established tests. The Trends in International Mathematics and Science Study (TIMSS) provides “reliable and timely data” on the mathematics and science achievement of 4th and 8th grade students. In 2011 more than 60 countries and jurisdictions participated in TIMSS, which tested around 5 million students. The Progress in International Reading Literacy Study (PIRLS) is an international comparative study of the reading literacy of young students. PIRLS studies the reading achievement and reading behaviours and attitudes of fourth-grade students in participating countries.

PISA and IEA use rigorous and similar testing methodologies; both contract the National Foundation for Educational Research (NFER) to conduct their tests in the UK. TIMSS employs more multiple-choice and fewer problem-solving questions in maths than PISA. Neither the scores nor the international rankings are consistent between the two tests.

There are several useful websites for international comparison. EURYDICE, the education information network in Europe, produces comparable information on national education systems and policies and produces studies and analyses. EURYDICE also has its own database, EURYBASE.

The United Nations Educational, Scientific and Cultural Organisation (UNESCO) produces reports on education across the world.


A group of East Asian jurisdictions (Shanghai, Singapore, Korea, Taipei, Hong Kong and Japan) has by far the highest achievement in both PISA and IEA tests.

In the 2012 PISA maths test the UK achieved a mean score of 494, exactly the average for the 34 participating OECD countries. Within the UK, England scored 495, with Scotland on 498 and Northern Ireland on 487, neither statistically significantly different from England. Wales achieved 468, significantly lower than the other three countries. PISA calculates that 41 points is equivalent to the progress expected in one year of schooling; this provides the basis for politicians’ claims that our children are so many months behind. The UK was ranked 23rd of 65 participants (this included 31 “OECD partners”).
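The conversion behind those “months behind” claims is simple arithmetic. A minimal sketch, assuming only PISA’s own rule of thumb that 41 points correspond to roughly one year of schooling, applied to the 2012 scores quoted above:

```python
# Convert a PISA score gap into approximate months of schooling,
# using PISA's rule of thumb: 41 points ~ one school year.
POINTS_PER_YEAR = 41

def gap_in_months(score_a: float, score_b: float) -> float:
    """Months of schooling by which score_a leads score_b."""
    return (score_a - score_b) / POINTS_PER_YEAR * 12

# 2012 PISA maths scores quoted above
england, wales = 495, 468
print(f"Wales trails England by about {gap_in_months(england, wales):.1f} months")
```

On these figures the 27-point gap between England and Wales works out at roughly eight months of schooling, which is how such headline claims are typically derived.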

In the 2011 TIMSS test of grade 8 pupils, England scored 507 against a scale centre point of 500 and was ranked 10th of 41 participants. In the 2011 PIRLS test of grade 4 pupils, England scored 552 against a scale centre point of 500, and was ranked 11th of 49 participants.

Current issues

Government has highlighted concerns that the UK may be slipping down the international achievement tables. A report produced alongside the Government’s White Paper, The case for change, drew on international comparisons. However, some authors have cast doubt on whether the UK is slipping in comparison to other countries. A pamphlet by the Institute for Public Policy Research looked at the value of international comparisons.

The OECD’s publications suggest policy directions based on correlations between PISA scores and features of education systems, but PISA points out that the correlations do not prove causal connections. While some governments seek to borrow policies from successful systems (in PISA terms), other evidence casts doubt on their relative importance for pupil achievement. A study of the PISA results of second-generation migrants from high-achieving East Asian countries into Australia found that, despite being educated entirely within the Australian system, they performed as well as their peers in their countries of origin (605 points), 102 points higher than the Australian “native group” average. Only half of that difference was explained by the “migrant effect”. This suggests that pupil performance globally may be explained more by social factors than by in-school factors, a finding consistent with a long line of other research in the UK.

Some governments (Germany, Wales) have adopted policy aims defined in terms of PISA performance, leading to more “PISA friendly” curricula and teacher training geared towards PISA tests. The impact of these policies on learning and achievement remains to be seen.

February 2015
