3/12/2012

How to question numbers

Linda Nordling on how questioning numbers in policy statements gives journalists credibility and clout — and reveals new stories.
Journalists are comfortable with words and can usually spot spin in speeches a mile away, yet when it comes to numbers, many develop a mental block. A budget sheet can give many journalists sweaty palms.
Reporting numbers badly means that copy suffers. Silly mix-ups between millions and billions, and other basic errors, damage the reputations of both individuals and publications.
Poor numeracy means journalists fail to hold governments to account over dodgy statistics or unsupported rhetoric. And it encourages even well-meaning data producers to be selective about the numbers they publish. Why publish the ‘complete details’ of the annual budget, they might think, if journalists will only misinterpret them?
But when journalists report numbers well, their stories gain depth, accuracy and influence. Overcoming your fear and distrust of numbers will make you a better journalist. And if, like many journalists in the developing world, you live in a country where the government is starting to pay attention to science and technology (S&T), you and your colleagues will need to step up your game to monitor whether ambitions are realistic and promises are kept.
This guide explains how to get to grips with numbers, focusing on budgets and on how to assess the numerical goals announced for new government projects.
But I’m a science journalist…
You might expect science journalists to be better at handling numbers than your average hack. But even when reading a data-rich scientific paper, most of us rely more than we care to admit on scientists’ conclusions, rather than cross-checking with the data.
That mindset serves journalists poorly when they write about government budgets and initiatives to build scientific capacity in their own countries, which is an increasingly important side of being a science journalist.
This is where journalists in Africa — my ‘beat’ — urgently need to improve in reporting numbers. And probably the same goes for many other developing, and developed, regions of the world.
Watch out for ‘single numbers’
‘Single numbers’ are my biggest gripe in science for development reporting. Let’s say a president has made a speech, announcing an unprecedented push for S&T in her country. An article soon hits the Internet quoting the president talking about S&T in rosy terms like ‘crucial for development’ and ‘becoming a knowledge economy’.
Usually, there is a single headline number, such as “we will spend one per cent of GDP on S&T by 2015”, or “we will spend US$30 million on science in the next budget year”. These figures are easy enough for a journalist to pick up in the hubbub, and are pushed by press officers as flagship ambitions.
But what do they mean? Single figures float around copy like questions, begging to be answered. How much is one per cent of GDP, approximately, and what sort of percentage is being spent this year? Or how does US$30 million next year actually compare with current and previous years?
Single figures, especially when they are sold as ambitious targets set by governments, need other figures to provide context. One per cent might not be an increase at all. Ask press officers or officials for comparative data. Accepting figures uncritically is very poor journalism.
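The context check above can be sketched in a few lines of Python. The GDP and current-spend figures below are invented purely for illustration; the point is the comparison, not the numbers:

```python
# Sketch: putting a 'single number' in context.
# All figures are hypothetical, for illustration only.
gdp = 50_000_000_000          # assumed national GDP, in US$
current_spend = 300_000_000   # assumed current annual S&T spend, in US$

target = 0.01 * gdp           # the announced 'one per cent of GDP'
current_share = current_spend / gdp

print(f"Announced target: US${target:,.0f}")
print(f"Current share of GDP: {current_share:.2%}")
print(f"Implied increase: {target / current_spend:.1f}x current spending")
```

With these assumed figures, the ‘one per cent’ pledge would mean roughly a 1.7-fold increase, which is the kind of comparison that turns a press-release number into a story.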
Do your sums
It may sound obvious, but simply adding things up, and then doing it again to make double sure, can avoid many number-related mistakes.
For example, if 37 per cent of university applicants are male, and 62 per cent female, ask yourself about the missing one per cent. Is it a mistake? Did one per cent not answer the survey? It’s better to double check your figures than to assume either that they were transgender or that your readers won’t spot the inconsistency.
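A check like this is trivial to automate. Using the figures from the example above, a few lines of Python flag the shortfall:

```python
# Sketch: check that reported percentages account for the whole sample
# (figures taken from the survey example in the text)
male, female = 37, 62          # per cent of university applicants
total = male + female
missing = 100 - total

print(f"Accounted for: {total}%")
print(f"Unaccounted for: {missing}%")   # flags the missing one per cent
```

If the unaccounted-for share is not zero, that is your prompt to ask the data producer what happened to the rest.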
And if you break down a budget, make sure the parts add up to the whole. Let’s say the Bill & Melinda Gates Foundation announces US$20 million for hiccups research, and you report figures allocated to each country. If your breakdown adds up to US$18 million, readers will query the remaining US$2 million (and if the parts sum to over US$20 million, you will look foolish).
Check what you’ve missed. Is the omitted US$2 million allocated for non-research activities? If so, be clear. Your story will be more informative than the ‘official line’.
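The same breakdown check can be sketched in Python. The announced total comes from the example; the per-country allocations are hypothetical:

```python
# Sketch: check a budget breakdown against the announced total.
# Total from the example; the country allocations are invented.
announced_total = 20_000_000   # US$
breakdown = {
    "Country A": 8_000_000,
    "Country B": 6_000_000,
    "Country C": 4_000_000,
}

allocated = sum(breakdown.values())
gap = announced_total - allocated

print(f"Allocated: US${allocated:,}")
print(f"Unexplained: US${gap:,}")   # the US$2 million to ask about
```

Any non-zero gap, in either direction, is a question to put to the funder before publication.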
How big is that?
It often helps to put numbers in context. A 20-acre wind farm, for instance, may be difficult to picture; it is easier to say that the farm will cover ‘about ten football pitches’.
When it comes to money, know the difference between a million (1,000,000) and a billion (usually 1,000,000,000, i.e. a thousand million, but some countries use a million million, so check with your editor). Try to express science funding as what it might buy, particularly in local currency.
For example, a US$3 million grant may not sound like a lot in the context of a top US research university, which administers dozens, or even hundreds such grants each year. But in an African university such a grant could double the available research budget.
These basics make it easier to spot unrealistic announcements. South Africa’s Department of Science and Technology, which bankrolls most of the country’s research councils, has a total budget for 2012/13 of just under five billion rand (US$675 million). If you get a press release saying that “the University of the Witwatersrand has received five billion rand for water research”, that is probably too much to be realistic, and there might be a typo in the release, so it is worth checking.
Or if the African Union announces it will build 30 new nanotechnology centres for ten million rand, that is too little (at 330,000 rand per centre — the annual salary for one mid-career researcher).
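The per-centre arithmetic behind that sanity check, using the figures from the example, looks like this:

```python
# Sketch: per-unit sanity check (figures from the nanotechnology example)
total_rand = 10_000_000   # announced budget, in rand
centres = 30              # number of promised centres

per_centre = total_rand / centres
print(f"About {per_centre:,.0f} rand per centre")
```

Dividing a headline figure by the number of promised units is often the quickest way to see whether an announcement is plausible.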