Monday, 14 September 2020

Panic and pandemic: learning from history

The WHO Collaborating Centre on Global Health Histories, which is supported by the Wellcome Trust, produced a seminar in early March 2020 looking at the unfolding COVID-19 pandemic in the context of recent history.

 

The full seminar can be listened to here: 

COVID-19: Inter-disciplinary approaches (GHH Seminar 144, March 2020)

Speakers: Dr Owain Williams (University of Leeds), Dr Fang Xiaoping (Nanyang Technological University, Singapore), Dr Kate Mason (Brown University)

This briefing draws upon the material and presentations at that seminar. 

 

The COVID-19 pandemic silenced the world’s cities in the first half of 2020. Nobody knew what was going to happen, as industries shut down and people were sent ‘home’ (wherever that might be and whatever hardship going home might entail). Reactions around the world to the disease were mixed; some were extremely comprehensive, some were patently inadequate, and many are still to be assessed. Compliance with public health messages varied just as widely, both within communities and between them.

 

Yet this is by no means the world’s first outbreak of seemingly uncontainable disease. Dr Owain Williams of the University of Leeds describes the quarantined cruise ships sitting in south-east Asia and off the coast of Japan as “a return to the yellow flag days of previous plague events”. China, where COVID-19 first emerged, has experienced both cholera and severe acute respiratory syndrome (SARS) in the past 60 years. How did the country respond to these three epidemics, and what themes emerge?

 

Cholera

The current cholera pandemic (the seventh in global history) broke out in south-east Asia in 1961 [https://www.who.int/news-room/fact-sheets/detail/cholera]. That same year the disease broke out in Guangdong province and rapidly spread along the south-east coast of China. By 1965 cholera in the area was contained – reduced to a very small number of cases, monitored and treated – as the result of a programme of strict quarantine and isolation.

 

“When a cholera pandemic first broke out in Yangjiang County, Guangdong Province, in June 1961, the central government mobilised medical resources and personnel across the country to impose a cordon sanitaire around the affected areas in order to prevent further spread,” explains Fang Xiaoping, Assistant Professor of History in the School of Humanities at Nanyang Technological University, Singapore. “These interventionist response schemes in Guangdong established a general framework for controlling the cholera pandemic.”

 

Fang focuses particularly on the south-eastern province of Zhejiang. A week after the first cholera case was confirmed in Rui’an County in July 1962, the provincial government partitioned the whole of Zhejiang into a series of concentric circles centred on the cholera-affected Wenzhou area. Major observation stations (for identifying confirmed and suspected patients) and temporary joint quarantine stations (into which confirmed and suspected patients were expected to move) formed quarantine ‘rings’ to contain the spread of the disease. The county and city governments divided the area within each ring into further quarantine zones, down to the level of districts and communes.

 

In addition, the provincial government set up a comprehensive inoculation campaign. On 3 August 1962, directives from the provincial party committee ruled that the entire population in each county of Wenzhou prefecture was to be inoculated against cholera before 15 August – which meant that the local health services had to inoculate nearly three million people within only 12 days.

 

It was not always easy, Fang explains. “The emergency inoculation scheme in the summer of 1962 suffered due to the poor coordination of local cadres and chaotic information on inoculation subjects.” But from 1963 onwards it was much more of a concerted campaign, with better-organised cadres and reliable inoculation registers.

 

Two pandemics: building a public health system

Even with these initial stumbling blocks, the cholera outbreak occurred at a time when China’s health system had started to develop its capacity for handling mass outbreaks of disease. “The medical system (which included the epidemic prevention scheme) did not emerge until the mid-1950s, just six to seven years before the outbreak of the global cholera pandemic in 1962,” Fang explains. As a result, the pandemic presented both considerable challenges and an opportunity to restructure and integrate the medical and administrative systems.

 

For instance, officials in Wenzhou City, Rui’an and Pingyang counties were issued with a detailed timetable for reporting to the provincial health department (although initially, the national reporting scheme lacked adequate staffing and equipment). In 1963, hospitals and clinics set up outpatient departments for intestinal disease, in order to identify and test suspected cases of cholera. The government also committed to establishing complete statistical data on actual and potential cholera patients.

 

And, importantly, this was also a period when China was very much isolated from the rest of the world’s health community. “When the pandemic broke out in 1961, the People’s Republic of China was not a member of WHO and it remained isolated from the international epidemic reporting network,” Fang explains – although it did keep itself informed about the global pandemic surveillance network.

 

However, the public health infrastructure in China slowly deteriorated over the following decades, particularly after the death of Chairman Mao in 1976. In the early years of this century, the Chinese government finally began reconstructing the system and setting up new Centers for Disease Control and Prevention (CDCs). “This was an explicit reference to the US CDC in Atlanta, Georgia, and was intended to evoke a highly modern scientific ethos,” explains Dr Kate Mason, Assistant Professor of Anthropology at Brown University. It was, she adds, “serendipitous timing”, because in 2003 the country was hit by a new epidemic. The CDCs suddenly had a very clear purpose and mandate: containing and preventing SARS.

 

SARS

Mason was in fact caught up in the SARS outbreak herself. “On 12 April 2003, I was evacuated from my post teaching English at Georgetown University in Guangzhou. I packed my belongings into two suitcases and a duffel bag, got on a bus, crossed the border into Hong Kong and flew, with my N95 mask on, back to the United States. And at the time I was perplexed by how a virus that up until that point had seemed so thoroughly unimpressive to my friends and colleagues in Guangzhou – which competed with scores of other microbes to cause disease in a tiny minority of Guangzhou’s millions of citizens, and which inspired my neighbours to take little more drastic action than to open the windows or repeatedly wash the floors with vinegar – had nevertheless spurred enough panic back home for my sponsor to demand my evacuation back to the United States and for my family to attempt to quarantine me.”

 

China’s central government finally admitted the scope of the disease and began instituting control measures, including quarantining entire villages, setting up neighbourhood watch systems to identify potential carriers and building new hospitals in a matter of days. “The WHO praised China's control efforts and credited them in part with the success of the global containment effort,” Mason says.

 

Controls, freedoms and trust

These measures in China laid the foundations for the way the country responded to COVID-19. They also expose some of the wider issues involved in controlling the pandemic, both in China and elsewhere. “A debate has emerged about whether the nature of the Chinese state has, in a sense, given it a special ability to deal with pandemic events: whether that degree of state control is in some way a positive in this story of attempts to control coronavirus,” Williams says. “We've seen the large-scale deployment of military personnel by the Chinese state, forced testing and forced quarantine of citizens; and China has been roundly criticised by some people in the human rights and health community for its abuse of its citizens’ human rights.”

 

Mason raises a further issue: there are assumptions about what is permissible, and even expected, in China but far less permissible elsewhere. “One of the big takeaways that the Chinese government got from SARS is that draconian actions are necessary to control a new virus if it does occur and that China will be praised by the international community if it takes such actions – but only if it does so within its own borders.”

 

On the other hand, both Mason and Williams point out that many Chinese people themselves do not agree with the idea that these measures are in fact permissible. “Issues of trust and secrecy and cover-ups came to a head, especially with the death of Dr Li Wenliang (one of the first doctors to recognise and try to warn about COVID-19, before dying of the disease himself),” says Williams. In reality, Mason adds, the idea that the Chinese government operates blanket disease surveillance is also far from accurate. “China’s health system operates as a collection of thousands of little fiefdoms at the local level with very little ability on the part of the central government to compel ongoing action in any one of these. What this means in practice is that the central government does not have a lot of power to make local officials report what they are seeing, when they don't want to or feel unable to do so.”

 

Information in the social media age

There is also one glaringly obvious difference with previous epidemics: the way that information has been consumed, perpetuated and manufactured.

 

“In 2003 people whispered of a strange new virus when the news of SARS was officially released and they complained under their breaths about local government responses,” says Mason. “But this happened at a relatively slow pace through word of mouth. In 2020, with hundreds of millions of Chinese citizens cooped up at home with nothing better to do than to look at their social media accounts, there was an explosion of online information, disinformation, and serious discontent the likes of which the Chinese government has not really ever had to deal with before.” 

 

None of this has been restricted to China as the pandemic has spread across the rest of the world: Williams describes it as “a crisis replete with misinformation, with conspiracy theories abounding everywhere”. In a number of different countries, distrust of government messages – especially when those messages themselves are conflicting and/or hard to understand – has led to people flatly ignoring advice or rules about masks and/or social distancing (the effects of which have been seen, for instance, in the UK).

 

Health systems and preparedness

The unfolding story of COVID-19 isn’t over yet. Nobody knows what kind of ending it will have, or when that will be reached – or indeed whether it will remain a threat in some parts of the world while others are protected against it. 

 

Williams is particularly wary about the future. “There's a political economy of chronic and continued under-investment in health systems and institutions globally, regionally and locally across the world; and the United States is very much not an exception to that story. We are seeing chronic and continued under-investment in vaccines and prophylaxis. We have neglected health and global health at our peril and I think we will pay the price for that very soon. There's a real story here of politics of fear and neglect and misinformation.”

 

Radhika Holmström is a Wellcome Trust-funded writer and communications specialist working with the WHO Global Health Histories project at the University of York.

Monday, 7 September 2020

The many strands of malaria elimination

The WHO Collaborating Centre on Global Health Histories, supported by the Wellcome Trust, has produced a series of seminars and webinars which look at different aspects of malaria control, and at how this can only be achieved through a combination of approaches.

 

The two seminars can be viewed here:

Can malaria be eradicated? The future of malaria control

(GHH seminar 148, 21 July 2020)

Speakers: Dr Ian Graham (Director of BioYork and Weston Chair of Biochemical Genetics at the University of York), Dr Karen Barnes (Professor of Clinical Pharmacology at the University of Cape Town and Co-Chair of the South African Malaria Elimination Committee), Dr Rajitha Wickremasinghe (Professor of Public Health and former Dean of the Faculty of Medicine at the University of Kelaniya in Sri Lanka)

 

Health Diplomacy: The bases for international and global health

(GHH seminar 164, 6 March 2020)

Speakers: Professor Sanjoy Bhattacharya (Professor in the History of Medicine, University of York, UK, and Director, WHO Collaborating Centre for Global Health Histories), Dr Lakshmi C. Somatunga (Deputy Director General (Medical Services I), Ministry of Health, Nutrition and Indigenous Medicine, Sri Lanka)

 

This briefing draws upon the presentations at these seminars, as well as additional conversations with Professor Sanjoy Bhattacharya.

 

“We need to recover the huge amount of information on what makes it possible to implement public health programmes,” says Professor Sanjoy Bhattacharya, director of the Centre for Global Health Histories at the University of York and of the WHO Collaborating Centre for Global Health Histories. “Countries need to recognise the importance of their own implementation histories and experiences, and fund the recording of them, because they are disempowering themselves by not doing so and falling victim to the idea that the only important ideas were at the global rather than the national level. This is a critical history that engages with policy but tries to record all sides of the argument, to see which of those ideas made a difference on the ground.”

 

To date, the world has eradicated one human disease (smallpox). It is becoming increasingly possible that malaria may follow suit. With a new vaccine showing some signs of success in reducing malaria in young children [https://www.who.int/malaria/media/malaria-vaccine-implementation-qa/en/], an increasing number of countries being declared malaria-free and advances in treatment, the discussion about malaria has moved from control to elimination to the possibility of global eradication.

 

However, the disease is still responsible for hundreds of thousands of deaths every year, almost all of them in Africa and especially among young children. And as Bhattacharya points out, when malaria is finally eradicated, this will be the result of what has (and has not) happened in individual countries, specific to their own cultures and needs.

 

Effective public health: Sri Lanka


Dr Rajitha Wickremasinghe, who is Professor of Public Health and former Dean of the Faculty of Medicine at the University of Kelaniya in Sri Lanka, describes the reason why the country was certified malaria-free on 6 September 2016 as “nothing but good public health practice”. Along the way, there were definite peaks and troughs: by 1963 the disease had almost vanished from Sri Lanka but then returned (and during this period Sri Lanka also experienced civil war and a huge amount of violence and unrest).  However, in 2009 the country moved from a ‘control’ programme to an ‘elimination’ one, led by the Ministry of Health and two local NGOs and targeting different malaria species in succession. “The interventions were basically universal access to diagnosis and treatment, and vector control” (the use of insecticide-treated nets and/or residual indoor spraying, to get rid of the mosquitoes which transmit the malaria parasite).

 

Much of this good practice, Wickremasinghe explains, involved comprehensive access to control measures (including access to the education that leads people to adopt those measures). “We have an excellent public health system; we have good road connectivity and a good transport network; we have a literate population; and also there were no counterfeit medicines in the market. Even during the separatist war, malaria commodities were provided to rebel-held areas.”

 

It also involved coordination, and a degree of flexibility over how health bodies spent donated funds. And, significantly, the period when a greater proportion of funding was spent on ‘management and other costs’ – in other words, human resources and technical support – rather than insecticides and spraying materials was when the rate of malaria decreased. This is an important counter to the widespread objection to ‘spending on administration rather than frontline costs’ that many potential donors raise.

 

In reality, Wickremasinghe says, “flexible funding, which enabled them to do things that government regulations did not otherwise provide for, was more effective. A programme must be agile and able to respond to the on-the-ground realities.” An integrated approach is essential, everyone agrees. And that approach has to focus on the people and communities who are at risk of the disease.

 

Experts in their own realities


“Communities are not all the same and of course, because of that, you can't have the same engagement strategy for all communities. There's no ‘one-size-fits-all’,” says Professor Karen Barnes of the University of Cape Town, who is also the Co-Chair of the South African Malaria Elimination Committee. “We've got some good tools that have advanced the control of malaria and aided its elimination in some countries, Sri Lanka included. But each of these tools really depends on community buy-in and community acceptance of what needs to happen. And community engagement must be bidirectional, it's not just a case of the government or the technical experts telling communities what to do; you have to find a way of hearing what the communities think – what they like, what they understand and what they need more of.” 

 

These are the people, she points out, who are “experts in their own realities”. “These are communities that know how bad malaria is. Most people know someone who's died from malaria and they've often had malaria themselves.” (Conversely, in areas where malaria rates are low, community buy-in requires a lot more work: communities have, after all, plenty of other concerns that they worry more about than malaria.)

 

Barnes was involved in the Lubombo Spatial Development Initiative (LSDI), a government initiative between Mozambique, South Africa and Eswatini (Swaziland) aimed at enhancing economic development in the area, a key component of which was malaria control. It used a combination of approaches: indoor spraying (the malaria-transmitting mosquitoes in the area rest predominantly indoors) and/or the use of bed nets to prevent the disease; rapid diagnosis for anyone suspected of having malaria; and combination therapies including the drug artemisinin for treatment. All of these require people who are at risk, or who contract malaria, to take action themselves – not least because they need to finish a full course of treatment, rather than stop as soon as they feel better and save the remaining medication for later. “We also needed community engagement to support the very important surveillance necessary to understand how people respond to treatment. Are they cured? Are there factors around a homestead that might put them at more risk of malaria? Has the number of malaria-transmitting mosquitoes been reduced?”

 

Those experts in their own realities are the people who can bring about real change: for instance, 138 community members in the LSDI volunteered to have window traps on their homes to monitor the number of mosquitoes in their area. However, that only happens if they believe that it’s worth doing and they trust the people doing it; indoor spraying, in particular, involves taking the risk of letting a stranger into the home, and in other areas people may refuse this.

 

In Mozambique, local communities have selected people to be trained as community health workers – either unpaid or with minimal pay. These health workers are trained in carrying out malaria tests, in the correct use of drug treatments, and in recognising when sick people in their community need to make the long journey to the hospital. As a result, many people do not need to leave the local area to travel to a clinic or health centre; the programme is being brought directly to them.   

 

The limitations of treatment


There is also a major concern that the main treatment on offer may become less effective. Dr Ian Graham, Director of BioYork and Weston Chair of Biochemical Genetics at the University of York, has worked on producing hybrid variants of the Artemisia annua plant which can be grown by small-scale farmers. The plant itself has been used as a fever treatment for over 400 years, but it is the specific artemisinin molecule (identified by the Chinese pharmaceutical chemist Tu Youyou, who was awarded the Nobel Prize in 2015 as a result) that is usually highly effective, and that needs to be produced in larger quantities. It is also most effective as part of combination therapy, and it is very important that it continues to be used in that way.

 

“Artemisinin-derived drugs are the most effective treatment for malaria but we already know that the parasite causing the disease can mutate and develop resistance to these drugs, which represents a major public health threat,” Graham explains. “The main way to combat resistance developing is to use artemisinin in combination with partner drugs, as this prevents parasites that manage to mutate and develop resistance to one drug from taking hold. Herbal treatments containing artemisinin, using the drug as monotherapy or not completing a full course of combination therapy treatment can all increase the risk of resistance developing. It is essential that we do everything we can to prevent resistance from becoming established, especially in sub-Saharan Africa where local emergence of resistance has recently been reported.”

 

“Artemisinin resistance is a global threat,” says Barnes. “If artemisinins don't work well, then it’s not just that they clear the parasites more slowly. There’s also an increase in the malaria parasites (called gametocytes) that transmit malaria. That means a potential increase in malaria transmission. There’s also more pressure on the other drugs that are being used in the combination treatment, so these partner drugs start to fail too. There are some places in Southeast Asia where there are quite frightening levels of treatment failure: over half the patients treated were not cured by the recommended artemisinin combinations. And we are now getting isolated reports from elsewhere in the world. The impact on malaria cases and deaths if such artemisinin resistance spreads to the Indian subcontinent or sub-Saharan Africa or South America is quite frightening to think about. We'll stop talking about elimination for a long time if that happens.” “It is important to keep looking for alternatives,” Graham agrees. “We are in an arms race with the malaria parasite, which given the opportunity will almost certainly develop resistance.”

 

 

No scope to stand still


Artemisinin resistance is not the only reason why malaria control or elimination programmes have to be, as Barnes says, “dynamic, not static”. In areas where control programmes lose funding and support, the rate of malaria goes up – even in areas where communities are very happy to play their part.

 

In addition, malaria is by definition a very mobile disease. Mosquitoes do not observe regional or national borders. In any case, the borders in many places, like southern Africa, are extremely porous – they are lines drawn up by previous imperial rulers, rather than any division that local people themselves would recognise. This is why regional collaboration like the LSDI, and collaboration between malaria programmes, is essential. 

 

The global goal of elimination is a genuine, and a realistic, one. However, it will only be achieved if those coordinated, integrated and dynamic programmes are continued – and if they involve the people who are themselves at most risk of malaria.

 

 

 

Radhika Holmström is a writer and communications specialist working with the Global Health Histories project at the University of York. 

 

 

Monday, 17 August 2020

Talking about Ebola

The WHO Collaborating Centre on Global Health Histories, supported by the Wellcome Trust, has produced a series of seminars and webinars looking at the different ‘narratives’ of Ebola, and at how this disease is a prime example of neglect in the global health context.

 

The full seminars can be seen and/or listened to here:

Haemorrhagic fevers: The deepest fears (GHH seminar 41, 12 May 2010)

Speakers: Professor Melissa Leach, Director of the Institute of Development Studies based at the University of Sussex; Asiya Odugleh, Alert and Response Department, WHO 

 

Ebola: exploring the cultural contexts of an epidemic (GHH seminar 91, 8 October 2015)

Speakers: Zsuzsanna Jakab, Regional Director for Europe, WHO; Jeremy Farrar, Director of the Wellcome Trust; Guénaël Rodier, Director of the Division of Communicable Diseases, Health Security and Environment at the Regional Office for Europe, WHO; and Dr João Nunes, lecturer in International Relations, University of York.

 

This briefing draws on the speeches and material presented at those seminars, as well as additional material from and discussions with Dr João Nunes.  

 

Ebola is not new. It was first officially identified and named in 1976 and successive outbreaks since then have killed thousands of people in different African countries. The virus has mutated between different outbreaks, which is one reason why the disease has consistently defied attempts to contain it fully – but this is certainly not the only reason. “The frenzy about Ebola was not accompanied by a systematic engagement with its broader context or the different degrees of vulnerability to it,” says Dr João Nunes, Senior Lecturer in International Relations, University of York.

 

Instead, Ebola is at the centre of myths and misconceptions – most of which portray the disease as somehow a terrifying inevitability, rather than the result of assumptions and decisions about who matters and who does not in terms of global health. “As a multidimensional, complex reality Ebola has been neglected,” Nunes says.

 

The myth of the dark continent

Ebola is undeniably frightening. It is transmitted through bodily fluids, and the symptoms include haemorrhage, kidney failure, lesions, and high temperature; it is highly lethal, with mortality rates of between one in four and nine out of 10; and it progresses very quickly, from first symptoms to death within a couple of weeks. At the time of these seminars there was no approved vaccine or antiviral treatment. On the other hand, it actually kills far fewer people than Lassa fever, which has a minimal mortality rate by comparison but is much more widespread. Professor Melissa Leach, Director of the Institute of Development Studies based at the University of Sussex, points out that Ebola is an “exceptional” or “master status” disease, which inspires a dramatic level of fear – on several levels.

 

Part of the reason for that status is the idea that Ebola is a disease that originated in the dark African forest, perpetuated and spread by people and customs that are different, foreign, other. In fact, according to this narrative, the people who live in the areas where Ebola breaks out actively spread the disease further, through what they eat and what they do – so they are not only victims but also perpetrators.

 

    “Ebola is linked to ‘exotic African practices’,” says Nunes. “It’s heavily racialised; it cannot be separated from the persistent anxiety over certain types of groups.” Alongside that, he adds, there is “the underlying racist narrative which depicts the entire African continent as a homogeneous entity, a place of despair and helplessness. A place where things like Ebola ‘just happen’, because it is not possible to implement effective governance.” 

 

    The myth underpins and justifies an idea that Ebola outbreaks are, fundamentally, unavoidable tragedies. Yet the reality is that the outbreaks happen in regions that have been rendered vulnerable because of political, social and economic decisions. 

 

Customs and practices

Social customs do play a part in increasing the spread of Ebola. Some of these are associated with traditional healing, involving blood and unsterile knives. Others, though, are more about a refusal to maintain distance from the sick person (which has very obvious parallels with the arguments in other parts of the world over masks and/or social distancing during the COVID-19 outbreak). 

 

    And in fact, some practices fit in very well with other medical beliefs about how to contain the spread of an outbreak. Leach points to the Acholi people in Northern Uganda, who isolate patients, encourage people to stay in their own homes, and keep people who have recovered within their homes for a further month. All these measures are completely in line with what Leach describes as “the biomedical cultural model employed by international teams”. 

 

    Guénaël Rodier, former Director of the Division of Communicable Diseases, Health Security and Environment at the Regional Office for Europe, World Health Organization (WHO), also points out that traditional healers and funeral practices are not the only reasons for the spread of Ebola. “More importantly it is amplified by hospitals and the health system. With all large outbreaks, the health system plays a major role in the amplification of the disease.”

 

Ebola stories

Leach identifies four different Ebola ‘stories’ – the versions of the disease and its context that all contribute to how this disease is considered and treated as a global health emergency. They vary according to who is telling the story; how the ‘problem’ is defined; who or what is considered responsible for the problem; whose knowledge is valued; and what strategies are considered useful in tackling the problem. “In looking at stories you can begin to clarify some of the choices and cast sharper light on how to go about some of the practical issues.”

 

    The first is the global threat – the plague which emerges from Africa (that dark continent) and needs to be stopped from spreading across the world. Much of the worry is about how a virus that comes from ‘over there’ can come to affect ‘us’, and indeed people often talk as if the virus has some kind of agency of its own. This is the model that has underpinned much of the international perspectives on outbreaks. “Ebola is an archetype for this ‘outbreak narrative’,” says Leach, pointing out how much of the response to Ebola has been motivated by fear. This is also the story that we have seen played out in fiction and in film (most recently in the film Contagion, which was watched by countless people in lockdown in the spring of 2020, as cities emptied and the death tolls climbed). In the fictional versions there is usually a medical solution where the white-coated (and usually white) scientists finally conquer the threat.

 

The second is that of deadly local disease events, and of mounting a rapid response against them, which has gained more traction in recent years. The emphasis here is on containing a short-term and local – though deadly – outbreak. The local people are again cast as mostly ignorant and/or misguided, perpetrating dangerous practices; and authority (and the solutions) lies with what Leach terms the “standardised technical response package” of isolation, contact tracing and barrier nursing.

            

The third is based on culture and context, and reverses some of the assumptions in the other two models to put the people and communities that are affected by the disease at the centre. Ebola and similar diseases are, after all, not new, and people have built up knowledge and medical/cultural practices that may well overlap with those of other medical disciplines (like the Acholi strategies of social isolation). Even when local practices are in conflict with ‘mainstream’ medical approaches, they cannot be overridden without thinking.

 

Finally, there is a narrative of mysteries and mobility, which has been evolving since around the mid-1990s. This is in some ways a group of different stories, bringing together the environmental-social-animal-disease-ecological systems that are all associated with Ebola – from migration to political systems to climate change. Leach describes it as “a much less coherent narrative” but also “very exciting”, as it draws attention to a whole variety of overlapping issues that drive the spread of the disease.

             

These last two narratives move away from the top-down, standardised responses to Ebola (and other diseases), which many people in the affected areas resist in any case, because these are imposed on them whether or not they agree. The last, importantly, also moves away from seeing Ebola as a solely medical issue. All too often, says Nunes, Ebola “is framed as an African problem that requires surveillance and containment. The focus is on preventing infection and containing disease, which runs the risk of overlooking the broader context that makes the crisis possible in the first place. Crucially, there is almost no attention on the wider social and political context.”

 

Nunes brings in another perspective: that Ebola is part of the “complex issue of neglect in global health”. Neglect, as he defines it, may be the failure to care about an issue at all, or the failure to address it adequately – either because it isn’t considered sufficiently important, or because the action that is taken is not competent and/or adequate. “It’s about a moral landscape and a political arena in which effective political solutions are not imagined or mobilised.”

 

Neglect, he explains, doesn’t somehow emerge on its own; it is the result of the context and the culture, the assumptions about the disease and about the people affected by it. “At the crux of the production of neglect, it is always possible to locate human agency and choices. Issues are rendered invisible because certain actors follow certain purposes. Neglect should not be considered mere invisibility, but rather a process of making something invisible and denying an adequate response.”

 

    And emotion plays a very strong part in this too. Ebola is associated with people that are thought of as “alien, outside the sphere of moral obligation, disgusting, beyond the possibility of any moral improvement”. They have become effectively invisible; their needs and suffering don’t count as much as the needs of people who somehow ‘count more’. It’s not that Ebola gets no attention or sympathy. But it is depicted as a strange and frightening disease – one which hits the headlines briefly and then drops out of public focus. “The same forces that made it trending contributed to its construction as exotic.” It is treated as an emergency – time and time again – when in fact it is the result of a set of endemic, rooted problems.

 

Local resistance

“Ebola is not a rare event,” says Jeremy Farrar, Director of the Wellcome Trust. “It is a series of epidemics – and with each, we have failed to respond. We need to have systems that can prepare for it and act.” “It is possible to control outbreaks without sophisticated tools,” Rodier adds. “It is people-centred. Each contact is a person and needs to be on board, and to break the chain of transmission you need to know who has been in contact and follow them up. It is not highly transmissible.” Yet many people who are all too well aware of the devastation that Ebola can inflict refuse to comply with the advice and the practices imposed by medical staff.

 

That can be partly because other diseases – HIV, malaria, water-borne disease and others – are often endemic in the areas where Ebola breaks out; and in practice, these may be more of a priority for local people. But it’s also the case that some of the things people are asked to do, in order to prevent or treat Ebola, can be distressing and/or can clash with their usual beliefs and practices.

Leach cites an outbreak in Gabon in 2001, where villagers actually mounted an armed resistance to medical teams, because in previous outbreaks sick and dead people had been taken to isolation units and there was a fear that their body parts were being stolen. Rodier gives the example of young male Red Cross workers removing bodies, in a culture where women are usually in charge of healthcare and the care of the dead, and putting them in black bin bags, whereas the colour associated with death is white. Sometimes a team of health workers arrived in a locality ahead of an outbreak that was spreading towards it, before the disease itself, so it looked as if the workers themselves were responsible for the sickness and deaths. People distrust the motives of what they perceive as ‘the Ebola business’; their own ways of doing things are being ignored; and at the same time, the shame and stigma around Ebola and people who survive Ebola remain a very powerful social issue. “When you look at the typology of the resistance, it was perceived as fighting ‘a war against us’, in the belief that the international community was spreading the disease,” Rodier explains. “Sometimes you can understand why they believed it.”

 

Changing and uniting

“If I have a suggestion, it would be about drawing some of these narratives together,” says Leach. They all have elements that can be useful, and in practice they interact and evolve together in any case. Leach suggests, in particular, integrating the ‘outbreak narratives’ with those of longer-term, endemic disease; the global perspective with the local; and sustaining the focus on culture and context, while extending it to include the “environmental dynamics” of disease.


The narratives are changing, too. The world in which Ebola breaks out today is very different from the world of the 1976 outbreak. “The virus has not changed dramatically,” says Farrar. “What changed is the cultural context.” People are moving at an unprecedented rate, moving from villages into cities and from cities across the world. “Societies are different, the way in which societies work together is different, the way individuals behave is different. So we have to think forward not backwards and look at the world as it is today and will be in the future.”

 

    As part of this, health systems and services need to do things differently as well, he argues. “Health systems around the world were established in an era of infectious diseases when relatively young people got ill, came into the hospital, and either recovered or died. The new world is very different. The dreadful tragedy of Ebola gives us an opportunity to redefine how we do things. The 21st century will bring challenges we’ve never faced before but we are also at the dawn of a golden scientific age. We should not be separating, we should be pulling together, including with the clinical solutions.”

 

    “We need alternative framings of Ebola that consider power inequalities, the relations between groups and the production of harm, vulnerability and structural violence in the international sphere. It is frustrating going over the same debate each time instead of thinking more systematically about the status quo that seems to accept that crises will occur and human ingenuity will get us through,” Nunes concludes. “We get unnecessary suffering, unnecessary deaths and millions in wasted resources. We are not united by contagion. We are divided by the global structures and relations that create the privilege of some and the vulnerability of others. Swift decisions are important but short-term policies shouldn’t be the sole focus of global health governance. We need first and foremost a strong political commitment.” 

 


Radhika Holmström is a writer and communications specialist working with the Global Health Histories project at the University of York.

Monday, 10 August 2020

Laying the foundations for eradicating smallpox

The WHO Collaborating Centre on Global Health Histories, supported by the Wellcome Trust, has produced a series of seminars and webinars which discuss the WHO’s successful programme to eradicate smallpox, focusing on a diverse spread of national drives early in the programme which were integral to its eventual success.

The three seminars can be viewed and/or listened to here:

The fruits of a new internationalism?: South Asian governments, the WHO and global smallpox (GHH seminar 26, 2 October 2008)

Speaker: Dr Sanjoy Bhattacharya, now Professor in the History of Medicine and Director of the Centre for Global Health Histories at the University of York and of the WHO Collaborating Centre for Global Health Histories. 

The Creation & Expansion of the Worldwide Smallpox Eradication Programme (GHH seminar 121, March 2019)

Speakers: Lu Chen (University of York), Dr Susan Heydon (University of Otago), Dr Carlos Campani (University of York) and Dr Namrata Ganneri (University of York and SNDT College of Arts & SCB College of Commerce and Science for Women, Mumbai).

Smallpox eradication 40 years on (Cultural Contexts of Health (CCH)* and GHH 138 webinar, 5 November 2019)

Speakers: Dr Namrata Ganneri (University of York, and SNDT College of Arts & SCB College of Commerce and Science for Women, Mumbai), Mr John F Wickett, World Health Organization (retired). 

*The CCH webinars are a subset of the WHO GHH seminars, delivered for the CCH project based in WHO Europe

This briefing draws on the speeches and material presented at those seminars.

 

On 8 May 1980, the World Health Organization (WHO) made a declaration which would have been considered impossible only a few years earlier: the disease smallpox was declared eradicated from the world. To date, no other human disease has been eradicated in this way.

 

It followed a concerted programme coordinated by the WHO to eradicate smallpox, first discussed at the World Health Assembly (WHA) in 1957, and formally proposed and voted on at the WHA of 1958. Most attention has focused on the so-called ‘intensified phase’ of the eradication programme from the late 1960s onwards. Yet this was only made possible by the work that preceded it: a range of very different drives in different countries.

 

Some of those early drives were partial, and/or dependent upon the political and socioeconomic context. All of them took place at a time when technology, understanding and access were very different from today. They cannot be summed up as a simple set of strategies that ‘worked or didn’t work’. But they did provide the initial data about a range of socio-political contexts, and this in turn justified the continuation and extension of the entire WHO programme. They were an essential precursor to the ‘intensified phase’, and specific themes certainly do emerge from their work.

 

The WHO smallpox eradication programme


Over the past 40 years the effects of smallpox have often been downplayed. In reality, the disease, particularly the more lethal form of variola major, had posed a major threat for thousands of years. Those who survived (and many did not: some estimates suggest that variola major had a mortality rate of up to one in two) might be blinded, made sterile or otherwise damaged – and, obviously, often significantly disfigured by the characteristic ‘pockmark’ scars. 

 

The difficulties of eradicating smallpox have also been downplayed. Although smallpox does not have animal hosts, it is highly infectious. Nor was it possible, even in the 1950s, to confine eradication efforts to specific locations. One message that comes through very clearly from the earliest years of the eradication programme is that ‘disease is global’, and that it can be repeatedly reintroduced to areas which have been declared free of infection. The original proposal in 1957 to the WHA which led to the WHO’s smallpox eradication programme (SEP) came from the Soviet Union, which had attempted to eradicate the disease but found that it had been reintroduced several times.

 

The SEP involved a lot of complex negotiations (including over funding and supplies of vaccine and equipment) and encountered a series of unexpected challenges. It gained traction and support from 1967 when the programme was officially ‘intensified’. Yet, points out Professor Sanjoy Bhattacharya, director of the Centre for Global Health Histories at the University of York and of the WHO Collaborating Centre for Global Health Histories, “there is detailed evidence-based research which shows that international engagements between 1958 and 1967 were rich and important”. In four areas in particular, vaccination and surveillance programmes had made significant progress by the time of intensification – and the evidence from those made it easier for international groups of officials to advocate the wider programme.

 

More centralised approaches: China and Brazil 


In China the disease was eradicated without WHO involvement, and before the intensified programme started. China is, however, a very specific case, having withdrawn from WHO membership completely under the Communist government. “Smallpox eradication was carried out within local health structures, and the political and social and geographical and cultural and epidemiological realities,” says Lu Chen of the University of York. 

 

In the 1950s, smallpox was highly endemic in China – not as much as in India but more than in other bordering countries – and was one of the most fatal diseases in the country. The Chinese eradication programme started in 1950, with a mass vaccination programme for the whole of mainland China. The three main rounds of vaccination (1950 to 1953, 1955 to 1958 and 1960) eliminated and/or reduced the disease considerably in a number of areas, but it was repeatedly reintroduced. “It was a continual process of elimination in different areas,” Chen explains. The last case was found in 1965, before the WHO increased communication with China.

 

The Chinese programme was carried out at a time when China was experiencing very low economic growth, and food was scarce. Yet it still finally managed to eliminate smallpox in the country. This was very much part of a wider government commitment to improve public health in general. “Disease and poor sanitation were considered an enemy of progress,” says Chen. “It was highly political and ideological.” There was a widespread health education campaign, using a number of different media and popular art forms. All children were vaccinated periodically from the age of six months to 18 years; and any new cases were to be reported to the authorities within a matter of hours.

 

In the late 1970s China finally engaged with the WHO and submitted a country report; it received certification as smallpox-free in 1979. It was the achievement, as Chen says, of “national and regional health officials, research scientists, local health workers and vaccinators. We need to acknowledge the names unknown; stories untold; and voices unheard.”

 

In Brazil, a centrally-driven programme was actually made possible as the result of a military coup in 1964. The earlier, more dispersed, programmes were centralised as a result – and, importantly, the new government wanted national and international recognition and legitimacy, which an eradication programme could provide. The last case in Brazil was recorded in 1971.

 

Smallpox had posed less of a threat in Brazil than in China, but there were major outbreaks every three to six years, especially in the ports – with a knock-on effect on commerce. “Every disease that disrupted international commerce was a priority for public health,” explains Dr Carlos Eduardo Campani, who is now at the Royal London Hospital. Compared to other diseases, however, smallpox was increasingly considered less important, especially after 1930 when variola minor became the more prevalent form. “It was accepted as minor and curable, and no longer a priority,” says Campani. Smallpox became the responsibility of small local agencies, and vaccination was mainly carried out only in urban areas. Pockets of the disease remained, and it was reintroduced from neighbouring countries. 

 

The Pan American Health Organization (PAHO) launched a continental eradication programme in 1950, but vaccination remained decentralised to local programmes. Even after the WHO programme was launched in 1958, and decisions about smallpox policy were made centrally, Brazil lacked resources and above all the vaccine to combat the disease effectively. When a national campaign against smallpox was launched in 1962, with vaccine provided by the government, it was still hugely inconsistent. “The 26 different states organised their own programmes,” says Campani. “There was a lot of variation in how they approached the problem, and two big problems: it was heavily underfunded – there was no funding for vaccination at all in 1963 – and there was a lot of political instability in that period.” Yet nearly 24m people were in fact vaccinated between 1962 and 1966, when the health ministry took over the programme with the goal of vaccinating 90 per cent of the population, and Brazil started working with WHO technical support and funding.

 

 “When you’re asked about how long you’ve been married, I believe you should date it back to when you started to date. The whole history is important for your marriage,” Campani points out. “And similarly, you can’t lose the story of smallpox eradication before 1967 in Brazil. National particularities must be taken account of. If we lose that, we lose the evidence of how the support was built on the ground for political decisions.”

 

More decentralised approaches: India and Nepal


India is the part of the world where smallpox had been most endemic throughout recorded history. It is the home of variola major, with major epidemics every five to seven years and seasonal peaks between December and May. There were also complex historical, social and religious associations with the disease (see below). Yet the early stages of India’s own eradication programme, launched in 1962, granted only limited WHO involvement. And when the WHO SEP entered its intensified phase, India – unlike nearly all the other 34 countries where smallpox was endemic – did not immediately set up a WHO-assisted campaign.

 

However, work from India made a significant contribution to the SEP. Dr Namrata Ganneri, Commonwealth Rutherford Fellow at CGHH and the History Department, University of York, and SNDT College of Arts & SCB College of Commerce and Science for Women, Mumbai, has charted the contribution of Indian health officials to the WHO SEP: in particular the work of Dr KM Lal, the director of the National Smallpox Eradication Programme (NSEP). Dr Lal presented his findings to the first WHO expert committee in January 1964, which set a target of vaccinating 100 per cent of the population in its first ‘attack’ phase, in all probability drawing on the Indian experience. The WHO records also singled out the use of family registers and independent appraisals of the programme in different parts of India as a template for national control programmes in other countries, along with ‘concurrent evaluation’ (evaluating the programme as it was being carried out).

 

The first pilots for the Indian NSEP were set up in 1959 after a massive outbreak of smallpox (and also cholera) the previous year. Specially recruited teams moved systematically from house to house and from village to village throughout a district in an effort to vaccinate or revaccinate not less than 80 per cent of the population, with the aim of creating herd immunity so that transmission would terminate spontaneously. Alongside this, ‘enumerators’ compiled comprehensive registers for each area, to check that sufficient numbers had been successfully vaccinated. After the first round of vaccinations, local health units were responsible for vaccinating people omitted from the first programme. 
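
As a rough illustration of the reasoning behind a target like 80 per cent – an aside added here for clarity, not drawn from the seminar material – the herd immunity threshold in the simplest epidemiological model depends on the basic reproduction number \(R_0\) of the disease. Taking \(R_0 \approx 5\) for smallpox, a commonly cited estimate used here purely as an illustrative assumption, gives

\[
p_c \;=\; 1 - \frac{1}{R_0} \;\approx\; 1 - \frac{1}{5} \;=\; 0.8,
\]

which is broadly in line with the programme’s aim of vaccinating not less than 80 per cent of the population so that transmission would die out of its own accord.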

 

Finally, there was an injunction to revaccinate every five years, and to vaccinate the contacts of anyone who did contract smallpox – because, despite the programme, there were still repeated outbreaks of the disease, including a major one in the winter of 1963, after which the target was changed to 100 per cent vaccination coverage. The programme was repeatedly assessed and evaluated, and successive reports were produced; and while WHO involvement remained limited, it did at least increase.

 

India did eventually respond to the intensified programme. More vaccine became available, as the result of bilateral agreements between different states; and, importantly, the Indian programme also became more centralised. “With a powerful centralising Prime Minister, assurance of more funding and more vaccine, states started to come on board and the government started to increase its financial input,” says Bhattacharya. “The government cleared many more international personnel to work in India and WHO officials were able to work more efficiently with district officials and at village level.” However, Ganneri points out, the work of the previous years underpinned both the Indian and the global SEP. “The global programme itself was rapidly changing and drawing on experiences from the field, and the Indian experience became central to the WHO eradication strategy. Perhaps it is time to study national stories upwards rather than from the international level down.”

 

Nepal was one of the last areas to eradicate smallpox; it was classified as no longer endemic in 1973 and the last case was in 1975. The WHO’s attitude to Nepal moved “from despair to praise”, says Dr Susan Heydon of the University of Otago, as the country finally succeeded in a mass vaccination, surveillance and control programme.

 

Nepal did engage with the WHO, starting with the smallpox control pilot project known as WHO Project Nepal 9. Cooperation and involvement with Nepal through the WHO’s South-East Asia Regional Office (SEARO) “offered a strategy for Nepal, with its limited resources, towards achieving its own goals for better health services”, Heydon explains. This included support for a smallpox control pilot project starting in early 1962 in the Kathmandu valley, aiming to build a ‘nucleus’ of vaccination activities and to expand ‘as and when possible’ to other areas. Although it had the huge advantages of freeze-dried vaccine and fridges for storage (see below), the project encountered a number of setbacks – outsiders constantly underestimated the enormous logistical difficulties – and the annual field visits from SEARO consistently found that, despite the large numbers of vaccinations and revaccinations, the numbers were simply not enough to achieve control.

 

In 1965 the government of Nepal decided independently to extend the programme, and the following year the government and WHO drew up a revised plan of operation for ‘smallpox eradication and control of other communicable disease’, which superseded the previous project.

 

Alongside this, Heydon highlights three rather different initiatives which also ran before the intensified SEP. One was the 1965 locally initiated and organised Medical Association drive to vaccinate all children in Morang and Sunsari districts, Kosi Zone. Working with the panchayat (district authority) and local structures, this achieved higher vaccination rates and coverage than the WHO pilot project. “The pilot was the largest communicable disease programme then in Nepal, but it achieved only low coverage,” Heydon points out. “This local initiative in 1965 showed how it could be better.”

 

By contrast, the other two initiatives involved “non-expert foreigners” – responding, importantly, to local demands and requests. The first was in the Mount Everest area, where Edmund Hillary’s expedition team met its first case of smallpox near the village of Lukla on 12 March 1963. The epidemic was by then starting to spread between the valley villages. Hillary’s team set out to vaccinate as many people as possible, operating independently and mainly using vaccine obtained from the WHO representative in Kathmandu. They eventually vaccinated around 7,000 people.

The second was in the Lamjung district, where Peace Corps volunteers Don Messerschmidt and Bruce Morrison worked, again through the panchayat, to organise nearly 20,000 vaccinations in early 1964 – although they, like the Everest team members, were not officially health workers. There were effectively no health services in the area, and Messerschmidt and Morrison had considerable difficulty in obtaining sufficient supplies of vaccine (see below).

 

“These early years highlighted many challenges but also offered ways forward and around,” Heydon points out. “The later success built on these foundations. And the goal was worldwide eradication: so small countries matter and need to be part of the history.” 

 

Beliefs about smallpox and vaccination


In a number of countries (such as Brazil at the beginning of the eradication programme there), health officials felt that a smallpox drive was the wrong priority for healthcare resources. As a result, vaccination teams were sometimes refused assistance (and this continued right through the intensified programme). 

 

In China, vaccination was already widely accepted, although there was some resistance in border areas and areas with non-Chinese minority populations. The vaccinators were selected from local cadres, local teachers and medical students (since there were not enough medical personnel). “These people were already familiar and trusted, so it was easier to get people vaccinated,” Chen explains. Alongside this, the government targeted the population – over half of which was still illiterate – through peer pressure from local cadres, broadcast media and traditional folk art performances. The message was highly political and ideological: vaccination was presented as a way to protect against threats from the US, Russia and bacteria. 

 

India, by contrast, presented some very different challenges. The whole issue of smallpox vaccination already had a long and complicated history, and there were a number of beliefs that vaccination itself was dangerous and would inflict damage (for instance, that it caused people’s hands to fall off). There was also a complex range of beliefs specifically about smallpox, particularly in terms of the balance between heat and cool. “For Hindus, smallpox was seen as a visitation from the goddess Sitala (or Mariyamman in South India). Smallpox can also arise from her wrath, but she also has the power to cool the disease and prevent it being fatal,” says Ganneri. In the Ayurvedic medical tradition, too, smallpox is seen as the result of an imbalance between heat and cool. 

 

“It wasn’t a simple opposition of science versus irrationality and/or religion,” explains Bhattacharya. “People could believe in several remedies simultaneously; and indeed many people first had their children vaccinated, and then took them to be blessed in local temples.” “Religion is important but not as important as it may be made out to be,” Ganneri adds. “Traditional ideas about cause and treatment are very important as the backdrop to the eradication programme.” In fact, she points out, the main resistance came not from rural communities but from government officials and the more educated population. For instance, vaccination meant one was unable to work for several days; the vaccinators usually came from the so-called lower castes, which meant they might be refused access to upper-caste households; and so on. “Complex stories need to be unravelled about what we see as resistance and what we see as acceptance. Resistance needs to be understood on its own terms.”

 

“Vaccinators had to convince people repeatedly that vaccination was safe. I think the active voices of local health officials did get across, and this was an important part of negotiations,” says Bhattacharya. “Efforts were made to find out whether others believed those problems would arise more widely and, if so, the best way to negotiate with the elders and/or headmen.”

 

Logistics and delivery


One huge issue was the availability, and also the type, of vaccine. The liquid (glycerinated) form of the vaccine needed to be kept refrigerated at all times, whereas the freeze-dried form only needed to be kept away from direct sunlight. China was in the ideal position here: the country manufactured its own freeze-dried vaccine, and different manufacturing sites covered different regions of the country, so there was no need to import large quantities.

 

Other countries were in a very different position. In Brazil, the early rural projects before 1962 received only the liquid vaccine (freeze-dried was used in a few urban areas). India produced its own liquid vaccine but also received freeze-dried vaccine from the Netherlands and (after complex negotiations) from the Soviet Union. Scandinavian countries were also ready to pass on the technology for producing a vaccine. It was not until the intensified programme got underway that large amounts of Indian-manufactured freeze-dried vaccine became available.

 

Along with the vaccine itself, teams needed fridges to keep it in – and especially in these early years this could be a major sticking point. In Nepal, although the WHO Regional Office for South-East Asia did make provision for refrigeration and also made the freeze-dried vaccine available, many logistical problems remained. “One of the big issues was actually getting started,” says Heydon, pointing out that agencies consistently underestimated the difficulties of getting almost anything achieved. This was a country without roads, and where a phone call between Lamjung and Kathmandu was routed through 14 different operators – which was why, over Christmas 1963, the Peace Corps volunteers Morrison and Messerschmidt simply walked to the capital, taking several days, in order to get hold of more vaccine. John Wickett, who worked with the WHO, recalls how the day-to-day issues of organising and servicing vehicles were absolutely crucial. “To maintain the key strategy of surveillance of outbreaks, you had to have the mobility of staff. If you didn’t have a vehicle or some means of transport you weren’t going to get the outbreaks contained.” That might mean trucks, boats, or even helicopters.

 

Finally, vaccinators were not always well trained in their work, and quite a few refused to adopt newer products or vaccination techniques. In Brazil, some teams initially used ‘jet injectors’ from a fixed location, whereas others used ‘multipuncture’ techniques, going from house to house, until the first method was shown to be both more efficient and much cheaper. 

 

The move towards a global programme


Eradicating smallpox was a huge achievement. It took many attempts, in different parts of the world, with different levels of success: from the government programmes in China to the two-person volunteer drives in Nepal. Some (notably China) did not engage with the WHO SEP at all. Others used a mix of foreign assistance and national or local work. None of them would have been possible if the people at risk of smallpox had refused to be vaccinated. 

 

“The world needed to work together to ensure that smallpox was gone for good,” Bhattacharya concludes. “That is where history can help, to point out the particular conditions existing in different localities where challenges were met and overcome. And there is no doubt that this enables us to prepare for future outbreaks of infectious disease.”

 

Radhika Holmström is a Wellcome Trust-funded writer and communications specialist working with the WHO Global Health Histories project at the University of York.