International

Background

Statistics comparing countries with very different education systems can be difficult to interpret reliably. But some institutions do provide comparative figures, and these are used, for example, to compare education spending or levels of achievement across countries.

The best source of information on how the UK education system compares with those of other countries is the series of reports produced by the Organisation for Economic Co-operation and Development (OECD). Its regular publication Education at a Glance contains many comparisons, for example of participation, achievement, spending, lifelong learning and conditions for pupils and teachers, as well as international trends.

The OECD Programme for International Student Assessment (PISA) runs every three years and compares the achievement of 15 year olds (typically 10th grade students) in reading, maths, science and problem solving, with a strong focus on one subject in each cycle. In 2015, 72 countries and territories participated, most using on-line tests. A report of performance in England provides a useful account of PISA methodology, which remains extremely complex and controversial.

Teaching across the OECD is compared by the OECD Teaching and Learning International Survey (TALIS).

The International Association for the Evaluation of Educational Achievement (IEA), an academic organisation, conducts two much longer-established tests. The Trends in International Mathematics and Science Study (TIMSS) provides “reliable and timely data” on the mathematics and science achievement of 4th and 8th grade students. In 2015, 55 countries and jurisdictions participated in TIMSS at grade 4 and 44 at grade 8. The Progress in International Reading Literacy Study (PIRLS) is an international comparative study of the reading literacy of young students. PIRLS studies the reading achievement and reading behaviours and attitudes of fourth-grade students in participating countries; the most recent tests were taken in 2016.

PISA and the IEA studies use rigorous and similar testing methodologies, although TIMSS employs more multiple choice and fewer problem solving questions in maths than PISA. Neither the scores nor the international rankings are consistent between the two sets of tests.

There are several useful websites for international comparison. EURYDICE, the education information network in Europe, provides comparable information on national education systems and policies and publishes studies and analyses. EURYDICE also has its own database, EURYBASE.

The United Nations Educational, Scientific and Cultural Organisation (UNESCO) produces reports on education across the world.

Data

The 2015 England report on PISA warns against ranking countries by test scores, and instead groups countries according to whether the differences in their scores are statistically significant. England’s score of 512 for science is similar to that of nine other countries, including China and South Korea, and below two other groups totalling nine countries. The top group consists of Singapore (556), Japan (538), Estonia (534) and Taiwan (532). Forty countries scored significantly lower than England, including Scotland, Northern Ireland and Wales.
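To illustrate what a “statistically significant difference” means in this context, the sketch below compares two mean scores with a simple two-sample z-test. It is a simplification: the actual PISA analysis uses plausible values and replicate weights, and the standard errors shown are illustrative placeholders rather than published figures.

```python
import math

def significantly_different(mean_a: float, se_a: float,
                            mean_b: float, se_b: float,
                            z_crit: float = 1.96) -> bool:
    """True if two mean scores differ at roughly the 5% level,
    treating the two samples as independent."""
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
    return abs(mean_a - mean_b) > z_crit * se_diff

# England science (512) vs a top-group score of 556
# (standard errors here are illustrative placeholders).
print(significantly_different(512, 2.6, 556, 1.2))  # True: treated as different
# England (512) vs a country scoring 514 with a similar standard error.
print(significantly_different(512, 2.6, 514, 2.5))  # False: treated as similar
```

On this logic, a two-point gap between countries can be “similar” while a forty-point gap is not, which is why the report prefers groups of statistically indistinguishable countries to a single ranked list.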

Like that of many other countries, England’s performance in science, reading and maths has not changed significantly since 2006; the repeated use of test items makes this comparison possible. In science there is a larger than average range of performance, and, as in most comprehensive systems, the within-school range is larger than the between-school range.

In the 2011 TIMSS test of grade 8 pupils, England scored 507 against a scale centre point of 500 and was ranked 10th of 41 participants. In the 2011 PIRLS test of grade 4 pupils, England scored 552 against a scale centre point of 500 and was ranked 11th of 49 participants.

Current issues

The 2015 results were published in December 2016. In response to criticism, the OECD was clearer about their limitations: correlations do not prove causal connections; variations in pupil performance globally may be explained more by social factors than by in-school factors; and PISA does not measure pupil progress. A study of the PISA results of second generation migrants to Australia from high achieving East Asian countries found that, despite being educated entirely within the Australian system, they performed as well as their peers in their countries of origin (605 points), 102 points higher than the Australian “native group” average. Only half of the difference was explained by the “migrant effect”.

The England report states: “… it is not appropriate to treat PISA as a direct indicator of the ‘quality’ of England’s schools. Trends in PISA results should therefore not be taken as providing robust evidence as to the direct impact of any previous or on-going educational reform.”

Perhaps because of these health warnings, Government and media responses to the 2015 results were comparatively muted. In the past, some governments (Germany, Wales) have adopted policy aims defined in terms of PISA performance, leading to more “PISA friendly” curricula and teacher training geared to PISA tests. The impact of these policies on learning and achievement remains to be seen, and the change of approach by PISA may reduce the likelihood of such policies in future.

March 2017

