Quantitative Skills conference, 26th March 2012

On Monday, I attended an excellent conference at the British Academy on Quantitative Skills, with a focus on international comparisons. The main question posed by the conference was, ‘How well trained are UK social science students in quantitative skills?’ The resounding answer was ‘Not very well’!

The conference examined some of the reasons for this, starting with secondary education. In a survey of 24 countries, England, Wales and Northern Ireland were the only countries in which fewer than 20% of upper secondary students study maths. Of the minority who do study maths at A-level, a large proportion go on to study STEM (Science, Technology, Engineering and Mathematics) subjects at university. A much smaller proportion choose social science degrees, and herein lies the first challenge to improving quantitative skills at undergraduate level – students come to university having done little or no maths since GCSE. The Nuffield report ‘Is the UK an outlier? An international comparison of upper secondary mathematics education’ is certainly worth a read, and it clearly shows that the way to raise the percentage of students who study maths post-16 is to make it compulsory, as it is in the Czech Republic, Estonia, Finland, Japan, Korea, Russia, Sweden and Taiwan. http://www.nuffieldfoundation.org/uk-outlier-upper-secondary-maths-education

In his fascinating report on undergraduate quantitative methods training in the social sciences (excluding psychology and economics, which historically have had a much greater focus on statistics), Professor John MacInnes highlights some of the other challenges the UK faces.

  • Cost pressures and lack of skilled staff lead to inadequate teaching (Professor MacInnes’s research found that only 10-15% of staff at UK institutions have the skills needed to teach an introductory statistics course!)
  • Teachers face hostility to numbers from students who had a negative experience of maths at school or who haven’t studied any maths since GCSE, and who have never encountered maths in an applied setting
  • Methodology and quantitative skills are divorced from the rest of the substantive curriculum and therefore seen as an optional extra
  • Students aren’t aware of the career advantage of having good quantitative skills

http://www.esrc.ac.uk/_images/Undergraduate_quantitative_research_methods_tcm8-2722.pdf

Though the problem is not a new one, Professor MacInnes is optimistic that change is now possible, and in his role as the ESRC Strategic Advisor on Quantitative Methods Training he is involved in numerous initiatives to improve quantitative skills teaching at UK universities, including the launch of a network of 20 universities selected by the ESRC, HEFCE and the British Academy to run innovative projects aimed at developing students’ skills in using quantitative methods. Read the press release here! http://www.esrc.ac.uk/news-and-events/press-releases/19172/technology-boost-for-maths-skills.aspx

David Willetts, Minister for Universities and Science, gave the keynote address and agreed that raising quantitative skills was a battle worth fighting, not just for the sake of social science research, but for a more statistically literate society. “We need more evidence-based policy!” he cried, though one must resist the urge to hear that as “We need more policy-based evidence!” The David Nutt example springs to mind: the Government didn’t question the quality of his research evidence but sacked him because his findings and recommendations didn’t align with its ideology. http://www.guardian.co.uk/politics/2009/oct/30/drugs-adviser-david-nutt-sacked

In a breakout session led by David Walker of the Guardian, figurehead of the RSS initiative Getstats (http://www.getstats.org.uk/), which is campaigning to make Britain better with numbers, we brainstormed ideas for improving journalists’ statistical literacy. Newspapers regularly misreport research – mistaking correlation for causation, comparing incomparable numbers, and completely misunderstanding probability and risk – and this leads to a misinformed public. A great example of this was the recent article in the Daily Mail which claimed “Each daily serving of unprocessed red meat, equivalent to a helping of beef, lamb or pork about the size of a deck of cards, raised the risk of death 13 per cent, while processed meat increased it by 20 per cent.” So, as Professor MacInnes joked, avoiding processed red meat gives you an 80% chance of being immortal??! http://www.dailymail.co.uk/health/article-2113986/Red-meat-early-death-study-Eating-regularly-increases-risk-death-heart-disease.html#ixzz1qVQi5Jn6
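The joke works because the Mail’s figures are relative risks, not probabilities of dying at all. A minimal sketch in Python makes the distinction concrete (the baseline mortality rate below is made up for illustration, not taken from the study):

```python
# Relative vs. absolute risk, with a made-up baseline: suppose
# 0.8% of people in the study's age range die in a given year
# (a hypothetical figure, not from the study).
baseline_annual_risk = 0.008

# The Daily Mail figure: processed meat raises the relative risk
# of death by 20%, i.e. multiplies the baseline by 1.20.
relative_increase = 0.20
risk_with_meat = baseline_annual_risk * (1 + relative_increase)

print(f"Risk without meat: {baseline_annual_risk:.2%} per year")
print(f"Risk with meat:    {risk_with_meat:.2%} per year")
print(f"Absolute increase: {risk_with_meat - baseline_annual_risk:.2%} points")

# The fallacy in reverse: subtracting 20% from 100% treats a
# relative risk ratio as if it were the probability of dying at
# all. Everyone's lifetime risk of death is 100% -- the study is
# about *when* you die, not *whether* -- hence the immortality joke.
```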

However, the problem of statistically illiterate journalists is not easily solved. Journalists come from a wide variety of educational backgrounds and many will not have had formal statistics training at university. Is the answer more short courses like the BBC’s and RSS’s science training programme for journalists? Does statistics need a popular media spokesperson à la Brian Cox to educate the general public about why stats matter and train them to spot gross errors in what they read? Does the academic statistics community need to do much more to shame newspapers for misreporting their research? If PR offices at universities are responsible for pushing out press releases, what more needs to be done to ensure this isn’t where errors are cropping up? I’m sure all of the above need addressing if we’re to make headway, and the challenge is knowing where to start when there is limited time and money to throw at the problem.

A number of people at the conference bemoaned the ease with which people will expose themselves as statistically illiterate, saying ‘I don’t do maths’ with no embarrassment whatsoever. You don’t hear people saying ‘I don’t do reading’ with the same nonchalance, and while that is hardly comforting, I think it’s misleading to suggest the two are exactly the same. David Walker made the analogy that the Daily Mail wouldn’t comfortably publish grievous errors of spelling and grammar, and questioned why its reporting of numbers should be any different. And yet, as Daniel Kahneman’s excellent book Thinking, Fast and Slow makes apparent, people are not intuitively good statisticians in the way that we are intuitively good communicators. http://www.lrb.co.uk/v34/n06/glen-newey/sheep-dont-read-barcodes Statistics are hard for us: they require overcoming the systematic biases of intuition we’re all prone to, and they cause cognitive discomfort. Any campaign to improve statistical literacy amongst students, journalists and consumers of research needs, I think, not to forget this.
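Kahneman’s point can be made concrete with a classic base-rate problem (my illustration, not an example discussed at the conference). Here is a minimal sketch in Python, using made-up numbers:

```python
# A classic base-rate problem: a condition affects 1 in 1,000
# people, and a test for it catches 95% of true cases but also
# gives 5% false positives. Given a positive result, how likely
# is the condition? (All figures here are made up for illustration.)
prevalence = 0.001          # P(condition)
sensitivity = 0.95          # P(positive | condition)
false_positive_rate = 0.05  # P(positive | no condition)

# Bayes' theorem: P(condition | positive)
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))
p_condition = sensitivity * prevalence / p_positive

print(f"P(condition | positive test) = {p_condition:.1%}")
# Prints about 1.9% -- far below the intuitive answer of ~95%,
# because intuition ignores how rare the condition is to begin with.
```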
