Introducing East Asian teaching methods into western schools: is it a good idea?

About six months ago I released a paper (https://johnjerrim.com/wp-content/uploads/2013/07/australia_asia_paper.pdf) discussing the reasons for East Asian success in PISA, focusing largely upon the role of home background and culture. I have been somewhat overwhelmed by the number of people who have shown an interest in this paper and contacted me about it since. Today I will present some evidence on the other side of the story – the ‘impact’ of East Asian teaching methods on children’s mathematics test scores.

Over the last two and a half years I have been evaluating the ARK ‘Maths Mastery’ programme along with Anna Vignoles from the University of Cambridge (http://www.educ.cam.ac.uk/people/staff/vignoles/). This introduced a Singaporean-inspired ‘mastery’ teaching approach into a selection of England’s primary and secondary schools, with the impact evaluated via a randomised controlled trial. The Education Endowment Foundation reports can be found here (http://educationendowmentfoundation.org.uk/projects/mathematics-mastery/) and the subsequent academic paper here (www.johnjerrim.com/papers).

The two trials were of reasonable quality, and produced some interesting results. They both pointed towards a small positive effect, though neither quite reached statistical significance independently. When combining the evidence across the two trials, we found children exposed to the programme made around a month more progress in mathematics than those who did not. To put this another way, the programme would move a child at the 50th percentile (i.e. ranked 50th in mathematics within a school of 100 children) up to around 47th.
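
To see how a small effect like this translates into percentile ranks, here is a quick sketch using the standard normal distribution. The effect size of 0.073 standard deviations is an illustrative figure of my choosing, of roughly the order reported, not the exact number from the trials:

```python
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Illustrative effect size in pupil-level standard deviations
# (assumed figure, roughly the order the trials report).
effect_size = 0.073

# A child at the median (z = 0) shifted up by the effect size
# now out-scores this share of their peers:
new_percentile = normal_cdf(effect_size)

# Rank within a school of 100 children (1 = top).
new_rank = round(100 * (1 - new_percentile))
print(new_rank)  # roughly 47th, up from 50th
```

The point of the sketch is simply that an effect of a few hundredths of a standard deviation moves a median child only a handful of places in a school of 100.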

There is of course quite a bit of uncertainty surrounding this result. For instance, it is not clear how far one can extrapolate results from this trial to the wider population, while the confidence intervals suggest that the ‘true’ effect size could be a lot bigger (double) or smaller (essentially zero) than we report. We nevertheless feel that the results provide some interesting insights, particularly given policymakers’ interest in East Asian teaching methods in light of these countries’ strong performance in PISA.

To begin, there is no escaping that the effect size we found was small. This suggests that introducing such methods would be unlikely to springboard England to the top of the PISA rankings. Indeed, as I have noted previously (https://johnjerrim.com/wp-content/uploads/2013/07/australia_asia_paper.pdf), there are likely to be a lot of other factors in these countries at play.

Yet, at the same time, effects of this magnitude are also not trivial, particularly given the low cost per pupil. For instance, effects of a similar magnitude were reported for The Literacy Hour (http://cee.lse.ac.uk/ceedps/ceedp43.pdf) – which many consider to be a good example of a low-cost intervention that was a success. Moreover, our trials only considered the impact of introducing such methods after just one year (the first year such methods were used in these schools). But programmes like Maths Mastery are meant to develop children’s skills over several years, which may result in bigger gains. However, there is currently no empirical evidence available for us to judge whether this is indeed the case or not.

Given the above, our advice is that policymakers should proceed with their investigations into the impact of East Asian teaching methods, while also exercising caution. In particular, the empirical evidence currently available does not have sufficient scope or depth to base national policy upon. But there are perhaps some positive signs. What is now needed is further research establishing the long-run impact of such methods after they have been implemented within schools for several years, and after teachers have more experience with this different approach.

Why do East Asian children do so well at PISA?

It is no secret that East Asian children excel at school. For instance, 78 percent of ethnic Chinese children obtain at least 5 A*-C GCSE grades, compared to a national average of just 60 percent. (https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/280689/SFR05_2014_Text_FINAL.pdf). Yet, despite some very interesting qualitative work by Becky Francis (http://www.theguardian.com/education/2011/feb/07/chinese-children-school-do-well), we still know very little about why this is the case.

I explore this issue in my new paper (https://johnjerrim.com/papers/) using PISA 2012 data from Australia. Just like their counterparts in the UK, Australian-born children of East Asian heritage do very well in school – particularly when it comes to maths. In fact, I show that they score an average of 605 points on the PISA 2012 maths test. This puts them more than two years ahead of the average child living in either England or Australia. They even outperform the average child in perennial top PISA performers like Singapore, Hong Kong and Japan.

As we all know, policymakers frequently tell us that we need to learn lessons from high-performing countries (http://www.theguardian.com/politics/2013/dec/03/gove-defends-education-reforms). Yet, in my opinion, it is actually more insightful to consider what is driving the high performance of East Asian children born and raised within an ‘average performing’ country such as Australia. After all, they clearly excel at the PISA tests, despite having been exposed to a western culture and education system similar to our own.

So what do my results suggest?

First, there does not seem to be a ‘silver bullet’ that explains why East Asian children excel at school. Rather, a combination of inter-linked factors is at play.

Second, I find little evidence that children of East Asian heritage simply put more effort into the PISA test. It thus seems unlikely that their high performance is a statistical artefact, or that they are more motivated to do well in the test than their British or Australian peers.

Third, school selection matters a great deal. This accounts for roughly half the achievement gap between children with East Asian parents versus those with western (either Australian or British) parents. This may partly be a reflection of culture, including the high value East Asian families place upon their children’s education (meaning they send them to the best possible school).

Finally, even after accounting for differences in family background and schools, children with East Asian parents remain one whole school year ahead of their peers with Australian (or British) parents. This is partly due to East Asian parents investing more in out-of-school tuition and instilling a harder work ethic in their children. Out-of-school factors therefore play an important role in explaining why East Asian children do so much better in the PISA test than their British and Australian peers.

What are the implications of these findings for us here in the UK? Well, every time international assessments like PISA are released, we hear about the lessons to be learned from the high performing East Asian economies. This has led to us comparing our curriculum to those in Singapore and Hong Kong (https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/184064/DFE-RR178.pdf), and sending delegations to observe teaching methods in East Asian schools (https://www.gov.uk/government/news/experts-to-visit-shanghai-to-raise-standards-in-maths). Yet many of the key reasons why East Asian children excel are cultural, and therefore beyond the control of schools. Therefore, what PISA 2012 really teaches us is that parents and familial culture matter a great deal, and that our middling performance in such international comparisons captures a lot more than just the ‘performance’ of our education system, teachers and schools.

Find all my latest papers at https://johnjerrim.com/

Minister, it will cost young people a lot of money to attend a high status university

Last week, I had the pleasure of speaking at the Sutton Trust summit about access to ‘high status’ universities. It was great to be able to engage with a number of leading experts in higher education and widening participation over the two days. For the summit, I produced this report (https://johnjerrim.com/wp-content/uploads/2013/07/john-jerrim-report-final.pdf) outlining various issues surrounding widening access in England and the United States. This covers a range of issues, from the link between family background and access to high status institutions, to the thorny issue of tuition fees, living costs, bursaries and debt.

Although the report seems to have been generally well received, it has also touched some sensitive nerves. In particular, the Minister for Universities and Science, David Willetts, stressed that no graduate has to pay back their loan until they earn more than £21,000 – and then only nine pence in the pound over this amount.

This is indeed an incredibly important point, as my colleague from the London School of Economics, Gill Wyness, pointed out during our session at the summit. Such ‘income contingent loans’ help insure young people against the risk associated with investing in a degree, and mean they do not have to turn to wonga.com if they do not immediately find a well-paid job.

However, one cannot escape the fact that at some point this debt has to be repaid. This is particularly relevant for young people attending elite institutions like Oxford, Cambridge and the London School of Economics, whose graduates tend to earn the highest wages. It therefore seems reasonable to ask: how much will attending one of these elite institutions cost?

Unfortunately, this is not an easy question to answer, as it depends upon the specific career path chosen after university. But let’s take secondary school teaching as an interesting example. How much might an Oxbridge graduate who enters this noble profession pay for their degree?

Below is a spreadsheet with some back-of-the-envelope calculations based upon the following (generally conservative) simplifying assumptions:

  • Zero real interest is paid on the loan and there is zero inflation
  • There is no real pay increase in the teaching pay scales and no change to the £21,000 repayment threshold
  • The debt is written off after 30 years (as per the current HE finance system)
  • All costs associated with post-graduate study (including a PGCE) are ignored
  • The teacher chooses to work in an inner-London school throughout their career
  • They complete a 3-year undergraduate course
  • They work full-time continuously between the ages of 21 and 50
  • They take out a £5,000 per year government maintenance loan to cover living costs, in addition to the tuition fee loan
  • Their parents’ household income is roughly £40,000 per year
  • The teacher is promoted every two years (from M1 to M6 on the teacher pay scale) during their first ten years of service. They then remain on pay point M6 for the next 10 years (to age 40) and then move to pay point U1 until age 50[1].
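
These assumptions can be turned into a quick repayment calculation along the following lines. The sketch below uses an illustrative flat salary rather than the actual inner-London pay scale, which is an assumption of mine, but the mechanics (nine pence in the pound above the £21,000 threshold, 30-year write-off, zero real interest) follow the rules described above:

```python
THRESHOLD = 21_000     # repayment threshold (assumed frozen, as above)
RATE = 0.09            # nine pence in the pound above the threshold
WRITE_OFF_YEARS = 30   # outstanding balance written off after 30 years

def years_to_repay(debt, salaries):
    """Return (years taken, total repaid) under zero real interest."""
    outstanding, repaid, years = debt, 0.0, 0
    for salary in salaries[:WRITE_OFF_YEARS]:
        years += 1
        payment = min(outstanding, RATE * max(0.0, salary - THRESHOLD))
        outstanding -= payment
        repaid += payment
        if outstanding <= 0:
            break
    return years, repaid

# A £42,000 debt and an illustrative flat salary of £38,000 per year:
years, repaid = years_to_repay(42_000, [38_000] * 30)
print(years, repaid)  # repaid in year 28, just inside the 30-year cut-off
```

With these figures the graduate clears the debt in year 28 – which is the same qualitative story as the spreadsheet: repayment finishes only just before the write-off point.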

 

[Spreadsheet: Teachers_Payback]

To highlight a few key figures:

  • This student (and prospective teacher) would accumulate a debt pile of around £42,000 by the end of their course.
  • They would repay their debt at age 50, just before the 30-year cut-off point where the outstanding balance is written off.
  • The cost of their university education would therefore equal approximately £14,000 per year (including all tuition, books and living expenses).

This is of course just one single example. Many Oxbridge graduates will go on to enjoy much higher wages, incur substantial real interest on their debt, and will pay back substantially more. Others may decide to live outside of London, follow a different career path or work part-time, with university costing them substantially less.

However, one thing should be abundantly clear. Income contingency may make repayment of the debt manageable and reduce the financial risk of investing in a university degree. But for the majority of young people, the cost of attending one of England’s most prestigious universities is going to be quite high. Indeed, even more stark figures come from this cost calculator (http://www.thecompleteuniversityguide.co.uk/student-loan-repayment-calculator), which shows that even medium-earning graduates may make total repayments of more than £60,000.

Prospective students, as well as the Secretary of State, should remember this fact when thinking about higher education and the student finance system currently in place in England. They should be clear that going to a high status university is likely to cost quite a lot of money upon graduation. Furthermore, the government must start to present figures for the total cost of higher education separately from the funding mechanisms (income contingent loans) that are designed to reduce risk and ease the upfront costs of attending such an institution.

People having a pop at PISA should give it a break…

For those who don’t know, the Programme for International Student Assessment (PISA) is a major cross-national study of 15-year-olds’ academic abilities. It covers three domains (reading, maths and science), and since 2000 has been conducted every three years by the OECD. The study is widely respected, and highly cited, by some of the world’s leading figures – including our own Secretary of State for Education, Michael Gove.

Unfortunately, not everyone agrees that PISA is such an authoritative assessment. Over the last month it has come in for serious criticism from academics, including Svend Kreiner (PDF) and Hugh Morrison (PDF). These interesting and important studies have been followed by a number of media articles criticising PISA – including a detailed analysis in the Times Educational Supplement last week.

As someone who has written about (PDF) some of the difficulties with PISA, I have read these studies (and the subsequent media coverage) with interest. A number of valid points have been raised, pointing to various ways in which PISA could be improved (the need for PISA to become a panel dataset – following children throughout school – raised by Harvey Goldstein is a particularly important one). Yet I have also been frustrated to see PISA described as “useless”.

This is a gross exaggeration. No data or test is perfect, particularly when tackling a notoriously difficult task such as cross-country comparison, and that includes PISA. But to suggest it cannot tell us anything important or useful is wide of the mark. For instance, if PISA really told us nothing about children’s academic ability, it should not correlate highly with our own national test measures. But this is not the case. Figure 1 illustrates the strong (r = 0.83) correlation between children’s PISA maths test scores and their performance in England’s old Key Stage 3 national exams. PISA scores are in fact strongly associated with England’s own measures of pupils’ academic achievement.

Figure 1. The correlation between PISA maths and Key Stage 3 maths test scores


Source: https://www.education.gov.uk/publications/eOrderingDownload/RR771.pdf page 100

To take another example, does the recent criticism of PISA mean we actually don’t know how the educational achievement of school children in England compares to other countries? Almost certainly not. To demonstrate this, it is useful to draw upon another major international study of secondary school pupils’ academic achievement, TIMSS. This has different strengths and weaknesses relative to PISA, and at least partially overcomes some of the recent criticisms. The key question is: does it tell the same broad story about England’s relative position?

The answer is yes – as Figure 2 shows. PISA 2009 maths test scores are plotted along the horizontal axis and TIMSS 2011 maths test scores along the vertical axis. I have fitted a regression line to illustrate the extent to which the two surveys agree over the cross-national ranking of countries. Again, the correlation is very strong (r = 0.88). England is hidden somewhat under a cloud of points, but is highlighted with a red circle. Whichever study we use to look at England’s position relative to other countries, the central message is clear: we are well behind a number of high-performing East Asian nations (the likes of Japan, Korea and Hong Kong), but quite some way ahead of a number of low- and middle-income countries (for example Turkey, Chile and Romania). Our exact position in the rankings may fluctuate a little (due to sampling variation, differences in the precise skills tested and sample design), but the overall message is that we are doing okay – and that other countries are doing a lot better.

Figure 2. The correlation between PISA 2009 and TIMSS 2011 Maths test scores


Source: Appendix 3 of https://johnjerrim.com/wp-content/uploads/2013/07/main_body_jpe_resubmit_final.pdf
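
The cross-survey agreement discussed above is just a Pearson correlation over country mean scores. A minimal sketch of the calculation, using made-up scores for seven hypothetical countries (illustrative only – not the actual PISA or TIMSS figures):

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical country mean scores on two assessments
# (assumed values, chosen to mimic two broadly agreeing surveys):
survey_a = [600, 560, 540, 500, 490, 450, 420]
survey_b = [610, 570, 530, 510, 480, 460, 430]
print(round(pearson_r(survey_a, survey_b), 2))
```

When two assessments rank countries in broadly the same order, r comes out close to 1 – which is exactly the pattern Figure 2 shows for PISA 2009 and TIMSS 2011.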

I think what needs to be realised is that drawing international comparisons is intrinsically difficult. PISA is not perfect, as I have pointed out in the past, but it still contains useful and insightful information. Indeed, there are a number of other areas – ‘social’ (income) mobility being one – where cross-national comparisons rest on a much less solid foundation. Perhaps we in the education community should be a little more grateful for the high-quality data that we have, rather than focusing on the negatives all the time – while of course continuing to look for ways it can be improved.

Confusion in the (social mobility) ranks? Interpreting international comparisons

Readers have probably heard that social mobility is low in the UK by international standards. A number of sensationalist stories have led with headlines such as “Britain has worst social mobility in western world” and “UK has worse social mobility record than other developed countries”.

Leading policymakers have made similar statements. To quote England’s Secretary of State for Education Michael Gove: “Those who are born poor are more likely to stay poor and those who inherit privilege are more likely to pass on privilege in England than in any comparable country”.

But is this really the case? Are we sure social mobility is indeed lower in this country than our international competitors? Or is it the case that, just like global league tables of educational achievement, there remains great uncertainty (and misunderstanding) surrounding cross-national comparisons of social mobility?

The answer can actually be found by looking a little further into the academic research that has been published on the Sutton Trust website. Figure 1 is taken from a Social Mobility Report published on 21 September 2012.

Figure 1: International comparisons of social mobility – Sutton Trust report 21st September 2012

Saving the technical details for another time, the longer the bars in this graph, the less socially mobile a country is. Here we see a familiar story: Britain ties with Italy as the least socially mobile.

Figure 2, however, tells a different story. This is taken from another report published by the Sutton Trust just three days later.

Figure 2: International comparisons of social mobility – Sutton Trust report 24th September 2012

This graph plots a measure of income inequality (horizontal axis) against an economic measure of social mobility (vertical axis). Thus the closer a country is to the top of the graph, the lower its level of social mobility. Now it appears that the UK may actually be more socially mobile than France, Italy and the US, and very similar to countries like Australia, Canada and Germany. Perhaps even more surprisingly, the UK is also similar to Sweden, Finland and Norway. Indeed, the only country from which we can be at all confident the UK significantly differs is Denmark.

Why is there such a contrast between these two sets of results? The trouble is, cross-national studies of social mobility have to rely upon data that are not really cross-nationally comparable. Rather, data of varying quality have been used in each of the different countries. Individuals are interviewed at different ages, using different questionnaires and survey procedures. Indeed, even different statistical analysis methods are used. No wonder, then, that social mobility in the UK can look very different, depending upon which dataset and method of analysis are used.

So, just as global rankings of educational attainment can be misleading, so can those of social mobility. In fact, problems with international comparisons of social mobility are often significantly worse. Yet this does not seem to stop journalists and policymakers making bold claims that “Britain has some of the lowest social mobility in the developed world”. Things are rarely so black and white in the social sciences – and social mobility is no exception. This uncertainty should be recognised when journalists and government officials report on social mobility rankings in the future. Otherwise, I fear for the credibility of this extremely important social issue.