Mathematical models lie behind planning and performance in many areas including education, transport, health and the economy. But how much confidence can the public have in them? David Hand reflects on banking, while Lindsey Davies and Sheila Bird discuss swine flu.
Don’t forget the real world, says David Hand
It is increasingly clear that the recent economic malaise had multiple contributory factors. They include greed and fraud, remuneration structures which favour short-term gain over value creation, the unwillingness of people to heed warnings when things are going their way, ambiguity in the links between the ratings agencies and other bodies, and so on. One factor which has been singled out as a possible contributor is the role of highly sophisticated mathematical models.
Mathematical and statistical models are ubiquitous in banking. They are used at all levels: for pricing options, forecasting likely share movements, balancing portfolios, evaluating the creditworthiness of individual customers, and making operational decisions. Modern banking would be impossible without them.
The Turner report1 is subtle in its choice of words to describe the possible role of mathematics. It refers to a ‘misplaced reliance on sophisticated maths’. This does not suggest that the mathematics per se is wrong. The problem, the report suggests, arises from a failure to take proper account of the context of the mathematics and its limitations when applied to the real world: a lack of proper appreciation that models built on past data may not adequately describe the future, the use of inappropriate measures of risk, and a willingness by people who do not understand the mathematics or its limitations to use its results.
There is certainly some force in these suggestions. The real world involves all the additional aspects I listed above, as well as others. A mathematical model is a simplified abstraction. ‘Sophisticated’ does not necessarily mean it models all the interacting factors - and in the present case it clearly did not.
The mathematics used is sophisticated, but we must always remember that the solution to a mathematical model is not a solution to the real problem, but a solution to a simplified idealisation of the real-world problem. Its adequacy in helping us make decisions about the real world will depend on how well it represents the key factors - and is always vulnerable to the appearance of additional factors, or of changes which we did not foresee.
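The point about models built on past data can be made concrete with a toy calculation. This is an illustrative sketch, not any bank's actual model, and the distributions and thresholds are invented for the example: a risk model calibrated on the assumption of normally distributed returns will badly understate how often extreme losses occur if real returns turn out to be heavy-tailed.

```python
import math
import random

random.seed(42)
n = 100_000

# Under the Gaussian model, a 4-sigma daily loss has probability P(Z < -4),
# roughly one trading day in over a century.
p_normal = 0.5 * (1 + math.erf(-4 / math.sqrt(2)))

# Suppose "reality" is heavy-tailed instead: Student's t with 3 degrees of
# freedom, rescaled to unit variance, simulated here from standard normals.
df = 3
count = 0
for _ in range(n):
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
    t = z / math.sqrt(chi2 / df)       # t-distributed with df = 3
    t /= math.sqrt(df / (df - 2))      # rescale so the variance is 1
    if t < -4:
        count += 1
p_empirical = count / n

print(f"Gaussian model says {p_normal:.2e}; heavy-tailed world gives {p_empirical:.2e}")
```

Both distributions have the same mean and variance, so a model fitted to ordinary day-to-day data could not tell them apart; only the tail behaviour, where the losses live, differs.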
Likewise, effective application of a mathematical model relies on an understanding of its limitations, and on the external environment in which it is legitimate to apply it. If managers applied the models without this understanding, then certainly they share some of the blame.
1 A Turner (2009), The Turner Review: a Regulatory Response to the Global Banking Crisis. Financial Services Authority.
Lindsey Davies counts the ways
The UK has been planning for a potential pandemic for a number of years. Our plans are recognised by the World Health Organization as amongst the strongest in the world, and modelling has been critical to this.
As a planning tool, modelling allows us to consider the likely impact of a pandemic and the resources needed to respond effectively. To take modern-day Britain and unpick how elements of our day-to-day lives could be affected by a pandemic is a hugely complex process.
Healthcare, and caring for those affected, is a significant part of this, but there are broader considerations. How do you keep fresh food in the supermarkets and vital medicines in hospitals if some of your workforce is down with flu? And how do you predict how many people that will be?
Modelling can help inform these critical decisions. But the extent of the uncertainties associated with pandemic flu is a major challenge for emergency planners - and there is no substitute for real-time data.
Evidence comes from previous pandemics, but before a new pandemic there are few data and a wide range of plausible assumptions. Models cannot predict what will happen, but they can map out a range of scenarios of what might happen. This is what we did when we first published our national framework for responding to a flu pandemic four years ago.
What models can do is inform us on which responses are likely to be of some value over the range of possible scenarios. They can show where interventions like travel restrictions are unlikely to have any significant benefit. Or where others – like school closures – may only have significant effects in certain circumstances, or if combined with other measures.
Today ‘real-time modelling’ is being used to integrate the results of observations from across the world to understand the behaviour of the new pandemic virus. It is also being used to construct a range of scenarios that indicate how the pandemic may develop and feed directly into NHS and other national planning.
Up to mid-October, swine flu has been a mild illness for most people. But a small, though significant, minority have gone on to develop complications and some have died. Thanks to the robust plans in place, informed by modelling, we are in a strong position to respond. But we must be ready to adapt our response if flu changes. Constant vigilance is essential.
Sheila Bird considers swine flu
Whereas Scotland organised its swine-fest sagely, England has had to re-learn the salutary lesson that data collection systems need to be professionally well-designed. Improvements in transparency, reporting standards and data collection on swine flu have followed an intervention by the President of the Royal Statistical Society.
Real-time epidemic modelling needs to be informed by really well-collected data. Messing up data collection messes up modelling. Simplifying assumptions have to be invoked which may conflict with nature’s complexity but, if there are no real data, then modellers remain blind to the hidden conflict. In particular, the government should make funds available for confirmatory testing of all hospitalised cases with suspected swine flu.
The UK’s pandemic preparedness has paid off in important respects. Epidemic modelling of historical pandemics emphasised the importance of containment and social distancing, to buy time and delay the epidemic peak. Similarly valuable was the initial containment phase of rapid antiviral treatment for laboratory-confirmed index cases, and prophylaxis of their household and other close contacts. The prescient ordering of sufficient H1N1v vaccine to offer two doses to priority groups from mid-October was also good preparation.
The transition to the ‘treatment only’ phase was well-managed, but not enough representative sampling was done to confirm (by swabs) what proportion of suspect cases actually had swine flu. For the week ending 2 August, for example, at least five times as many people were getting Tamiflu as were getting flu.1 This gave no confidence that the UK is capable of organising a party in a brewery . . .
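Representative swabbing would have allowed exactly this proportion to be estimated with quantified uncertainty. A back-of-envelope sketch, with invented numbers purely for illustration: if 110 of 600 swabbed suspect cases had tested positive, a 95% Wilson score interval for the true proportion with swine flu follows directly.

```python
import math

# Hypothetical survey: 600 suspect cases swabbed, 110 confirmed positive.
swabbed, positive = 600, 110
p_hat = positive / swabbed

# 95% Wilson score interval for the true proportion of suspect cases
# that actually have swine flu.
z = 1.96
denom = 1 + z**2 / swabbed
centre = (p_hat + z**2 / (2 * swabbed)) / denom
half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / swabbed
                               + z**2 / (4 * swabbed**2))
lo, hi = centre - half, centre + half

print(f"estimated proportion {p_hat:.1%}, 95% interval ({lo:.1%}, {hi:.1%})")
```

With a sample of this size the proportion of genuine cases would be pinned down to within a few percentage points, which is all real-time modellers need to correct counts of suspect cases into counts of actual infections.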
Who actually has swine flu?
England’s late recognition of the importance of H1N1v confirmatory testing in hospitalised patients with suspect swine flu has been a swine not only for modellers but also for the Chief Medical Officer, who has had to make expedient provision to double the number of functional critical care beds.
Unvaccinated pregnant women’s risk from swine flu is real: the winter rate in Australia and New Zealand was 26 critical care admissions per 100,000 unvaccinated pregnant women, and there, on a population basis, all winter critical care admissions for confirmed swine flu ran at least five times the rate seen to mid-October in Scotland.
The UK’s pandemic preparedness has been let down by failures to design its essential data collection well, so that key data are not available for England. Real-time epidemic modelling needs to be able to distinguish between two epidemics: actual H1N1v, and phone calls about swine flu.
1 N Hawkes (7 August 2009), Stockpiling antivirals? Straight Statistics http://straightstatistics.org/blog/2009/08/07/stockpiling-antivirals