The United Nations Children’s Fund, or UNICEF, was established shortly after World War II to improve the lives of children worldwide, but it was facing hard times when Jim Grant took over as executive director in 1980. In the poor countries where the agency did most of its work, 15 million children under age five were dying annually, mostly from epidemics of pneumonia, malaria, diarrhea, and other killers rendered largely benign in the West by hygiene, antibiotics, and vaccines. The 1970s recession had hit these countries hard and many had in any case been taken over by cold-war dictators with little apparent concern for their most vulnerable citizens. The leaders of UNICEF’s major donors—British Prime Minister Margaret Thatcher and soon-to-be US President Ronald Reagan—were also no friends of the poor. Contributions to UNICEF were flagging as a mood of cynicism in foreign affairs took hold.
But Grant was relentlessly optimistic. At a retreat several months into what would turn out to be his fifteen-year directorship, he urged his staff to think big. As his colleague and friend Peter Adamson recalled years later:
The phrase [Grant] uses again and again is that he wants UNICEF to shift gears. He feels the organization has been going along nicely in second. Now he wants to see a rapid shift to third, and then fourth…. He doesn’t want to know about a five percent or ten percent a year improvement in UNICEF’s performance…. He wants UNICEF’s impact in the world to increase ten-fold, fifty-fold, a hundred-fold.
But rather than motivating his audience, he succeeded “mainly in mystifying and alarming” it, Adamson continued. Some colleagues even began to fear for the sanity of their new boss.
But in 1982, Grant had a revelation. A group of public health experts, including his friend Jon Rohde, had for years been promoting the idea that a small number of simple medical supplies, including vaccines for measles, diphtheria, tetanus, and polio as well as oral rehydration solution—a salt and sugar mixture that protects children with diarrhea from dehydration—could prevent about half of child deaths in the developing world, if only the international community would pay for them. Even Thatcher and Reagan could not ignore the argument that no child should die for lack of a five-cent vaccine. That year, Grant and his UNICEF colleagues launched the Child Survival Revolution, a worldwide campaign to make these medical supplies widely available in the world’s poorest nations.
Adam Fifield’s A Mighty Purpose: How Jim Grant Sold the World on Saving Its Children movingly recounts Grant’s dramatic life story. He was born in China in 1922, where his father was a missionary doctor and also worked for the Rockefeller Foundation. The foundation’s aims, like those of most charities, were partly political. In order to support the Nationalist government, the foundation established a modern medical school in Beijing. But many of its graduates ended up alienating the very people the foundation hoped to win over. The new doctors disdained the peasants, refused to work in rural areas, and made fortunes in private practice. Some joined the corrupt Kuomintang government. Jim’s father urged the Rockefeller Foundation to support a system of rural health care, but in 1937 the Japanese invaded, and the Grants left for the US.
Jim eventually graduated from the University of California, married a classmate, joined the army, and shipped out to Asia during World War II. On a stopover in Calcutta in 1944, he witnessed the famine in colonial Bengal, where the British administration looked on while three million Indians perished. As the Nobel Prize–winning economist Amartya Sen later demonstrated, Bengal had been experiencing a boom in rice production at the time, but the British were exporting it to the battlefields of Europe and North Africa or diverting it to Indian cities to feed military and civil defense workers. As rice prices soared, traders hoarded it, causing prices to rise even higher. Landless peasants starved because their wages didn’t keep pace with prices. Grant’s commanders urged him not to get involved, but he never forgot the sight of corpses in the streets and emaciated men pleading at the gates of middle-class houses for leftover rice water.
During the 1960s, Grant worked for the US Agency for International Development in Southeast Asia, where he briefly oversaw America’s disastrous aid program in Vietnam. More successfully, he persuaded the Turkish government to promote the use of highly productive Green Revolution wheat varieties among farmers. The Green Revolution led to an enormous increase in food production in Asia, saving millions of lives. At UNICEF, Grant came to believe that vaccines and other low-tech medical supplies could have a similarly spectacular effect on the survival of children.
Not everyone agreed. At the time, most international development experts knew from bitter experience that epidemic diseases are seldom amenable to simple “magic bullet” cures. Although a worldwide vaccine campaign succeeded in eradicating smallpox in 1977, similar campaigns against hookworm, yellow fever, yaws, tuberculosis, and malaria had been costly failures, the last of these particularly so.
Controlling malaria is complicated. Until the 1950s, it usually involved a strategy called “rural uplift,” combining health education, better drainage, improvements in housing, and treatment with quinine and insecticides. These measures had eliminated malaria from some Western countries, including the US, but in an act of technocratic folly, the World Health Organization, with strong Eisenhower administration backing, replaced these programs with a worldwide DDT spraying campaign. Armies of workers in khaki uniforms and pith helmets were recruited to spray the pesticide in villages, homes, and fields from Mexico to Indonesia. This worked magnificently at first, but by the late 1960s, the mosquitoes had become so resistant to DDT that they could breed in a vat of the stuff. In part because of the reduction in rural uplift programs, malaria was killing even more people in the 1970s than it had in the 1950s.
In view of this failure, WHO officials began rethinking their approach to public health. The problem with the Malaria Eradication Program, as they saw it, was that it relied on a single tool, DDT, applied in the same way everywhere, taking no account of local ecology, politics, or social life. Even when eradication worked, as with smallpox, its impact was limited because the poor continued to be afflicted by so many other diseases. WHO’s new strategy, unveiled in 1975, would instead support health services that were “shaped around the life patterns” of the people, not dreamed up by bureaucrats in Washington and Geneva. They called the strategy Primary Health Care, and it was intended to build upon the networks of health clinics and hospitals established in former French, British, and Portuguese colonies.
These clinics were often poorly run and even nonexistent in the rural areas where most of the poor lived, but WHO officials reasoned that with relatively small investments of money and training, new clinics could provide treatment for malaria and other childhood diseases, as well as vaccinations, health education, emergency hospital referrals, and other services, depending upon the needs of the population. In keeping with the ideology of liberation prevalent at the time, WHO officials hoped that Primary Health Care would give communities a voice in how services were delivered. By strengthening the link between governments and their peoples, Primary Health Care might even help build a more just world based on empowerment and democracy.
The chief advocate for this utopian idea was Halfdan Mahler, the Danish director-general of WHO. When he heard about Grant’s plan for a Child Survival Revolution based on promoting and supplying vaccines and other low-tech medical items, he made no secret of his contempt for it. In 1983, Mahler delivered what became known as his “parachute/red herring” speech to a large gathering of public health officials at WHO’s Geneva headquarters. “I am all for impatience if it leads to speedier action,” Mahler began,
But I am all against it if it imposes fragmented action from above. I am referring to such initiatives as the selection by people outside the developing countries of a few isolated elements of primary health care…; or the parachuting of foreign agents into these countries to immunize them from above; or the concentration on only one aspect of diarrheal disease control without thought for the others. Initiatives such as these are red herrings….
Grant went ahead with UNICEF’s Child Survival Revolution anyway, and eventually WHO, unable to find support for Primary Health Care because Western donors found it complicated and vaguely socialist, signed on too.
Tall, with ambassadorial good looks, Grant was trained as a lawyer, but his manner with colleagues, world leaders, and the children he encountered in slums and refugee camps the world over was that of an avuncular family physician. He was also extraordinarily brave. The cold war was a hot one in many poor countries, and UNICEF staff worked courageously to deliver relief to children in war zones on nearly every continent.
Grant visited these programs himself, often at great personal risk. When diplomats visited Sarajevo, Bosnian snipers would sometimes shoot from the rooftops in order to provoke a rain of Serb bullets in return, thus demonstrating the enemy’s brutality. Once Grant’s vehicle was hit by gunfire. In 1995, while being treated for the cancer that would kill him at age seventy-two, he sometimes brought a colleague along to Sloan Kettering. They’d discuss the next board meeting or annual report as radiation was being beamed into him.
Of the scores of people Fifield interviewed about Grant, many recalled his striking blue eyes. When things went wrong, as they inevitably did from time to time, those eyes “could stop you, freeze you where you stood.” It wasn’t anger. He seldom lost his temper; but it sometimes seemed as though an irresistible moral force issued from that stare.
Grant’s charisma was crucial to the Child Survival Revolution. In speeches, public festivities, and private meetings, he persuaded scores of world leaders to champion his health campaigns. UNICEF would provide medical supplies—vaccines or packets of oral rehydration solution—but the leaders had to mobilize their people. During Colombia’s 1984 National Vaccination Crusade, for example, President Belisario Betancur rallied 120,000 volunteers—teachers, health workers, Boy Scouts, church volunteers, and even soldiers—to round up and immunize children all across the country. On just three national vaccination days, each a month apart, 800,000 children were vaccinated.
Everyone wanted to be on the side of children, and Grant worked successfully with some of the world’s most notorious tyrants and war criminals, including El Salvador’s José Napoleón Duarte, Republika Srpska President Biljana Plavšić, and all sides in the Sudanese civil war. Shortly after the Tiananmen massacre, he persuaded Premier Li Peng to iodize all the salt in China. He even convinced Haiti’s notorious Baby Doc Duvalier to sponsor a song and dance festival on the theme of infant diarrhea on the portico of the presidential palace.
UNICEF’s goal was to push vaccination rates to 80 percent of children in all developing countries by 1990. Whether this was actually achieved remains disputed, but there’s no doubt that vaccination rates rose steeply nearly everywhere during the 1980s, while child mortality also plummeted. How much of this was due to UNICEF’s efforts and how much to economic growth and the ending of wars is unknown, but there’s little doubt that at a time when the world seemed to have lost its conscience, the Child Survival Revolution campaigns inspired a new concern about the well-being of vulnerable young people everywhere.
UNICEF’s programs worked best in Asia and Latin America. In Africa, the poorest and most politically troubled world region, rates of child death declined more slowly during the heyday of the Child Survival Revolution than they had in the preceding decades, a sad reality that Fifield neither mentions nor explains. Vaccination rates did rise, and UNICEF funding to the African region increased roughly fivefold over the same period, but neither appears to have had much impact on child survival.1
The reasons varied from place to place, but the main problem was that the diseases that kill most African children—malaria, pneumonia, and diarrhea—weren’t preventable with the vaccines and other cheap medical supplies available at the time. Controlling those diseases would have required environmental, hygiene, and nutrition programs, as well as medicine, health workers, and clinics—that is, something rather like Primary Health Care. During the 1960s, the leaders of many newly independent African states had been attempting to expand health services along just such lines, even before WHO came up with the idea and called it Primary Health Care. But during the 1980s, the IMF and World Bank forced African governments to gut those very programs, even as they generously funded UNICEF’s Child Survival Revolution.
The problems began with the debt crisis of the 1970s. As oil prices soared and global economic crisis spread, many African countries found themselves unable to repay the commercial banks that had lent them money during the boom years of the 1960s. The IMF and World Bank agreed to take over the loans, but they required African governments to reduce funding for social services, including health. Countless doctors, nurses, and community health workers were laid off, nutrition and sanitation programs were shut down, and clinics and hospitals across the continent fell into dereliction. According to World Bank publications, private clinics and episodic programs to distribute medical supplies, including UNICEF’s Child Survival Revolution, would fill the gap.2
It didn’t work. Throughout the 1980s and 1990s, informal drug shops selling medicine that was often substandard or counterfeit sprang up in trading centers across Africa. In once-functioning government clinics, equipment broke down and was never fixed; health worker salaries were delayed or not paid at all; staff were unsupervised and often absent. When they did turn up, they solicited bribes from patients and then fought over the money or stole medicine from the clinic stores.3
In Tanzania, anthropologists surveyed hundreds of women to find out why they had used the fumes of burned elephant dung to treat their dying children. Ninety-nine percent of the women said they had sought treatment at least once from a clinic or drug shop. When the medicine they received failed, as it invariably did, they consulted traditional healers, who prescribed elephant dung.4
We now know not only that Primary Health Care is the solution to these problems, but also that it costs little more than the Child Survival Revolution campaigns. Strong evidence for this comes from a remarkable experiment conducted among the Kassena-Nankana people of northern Ghana. During the 1980s, one child in five died in this parched region, where temperatures routinely hover around 110 degrees Fahrenheit and the brief, intense rains sometimes wash people’s huts away. During the long dry season, sand blown down from the Sahara brought epidemics of meningitis that could kill half the children in a village. Malnutrition was rife, but if people settled near streams and ponds, parasites in the water could blind or cripple them. Measles, malaria, and other scourges of poverty were common too.
In 1993, Fred Binka, a Ghanaian physician, and James Phillips, a demographer with the Population Council, decided to compare a typical Child Survival Revolution program to a slightly more expensive Primary Health Care program. In the Child Survival Revolution intervention, community volunteers—usually peasant farmers—were given six weeks of training to distribute essential medicines to families and educate them about hygiene and vaccination. In the Primary Health Care intervention, Ghana Health Service nurses were moved from clinics in the towns out to the villages where they conducted regular visits to families with pregnant women and small children. Within a couple of years, it was clear that while the nurses saved many lives, the volunteers with their satchels of medicine were having no effect.
In order to understand these results, it’s necessary to appreciate that all public health crises have political and historical dimensions that make some children especially vulnerable. Before colonial times, much of West Africa was governed by indigenous empires, such as Sokoto and Borno in modern-day Nigeria and Akan in modern-day Ghana. They plundered smaller tribes like the Kassena-Nankana for slaves, whom they either kept or sold to European traders. Many modern-day African-Americans can trace part of their ancestry to the region where the Kassena-Nankana now live. Today you can visit the scattered ruins of slave camps and listen to old men perform songs about the milk-heavy breasts of wives left behind.
To cope with the precariousness of their existence, the Kassena-Nankana came to see themselves as inhabiting a borderland between life and death. Every tree, stone, and cave was enchanted by ancestral spirits who could make crops grow and children thrive, or bring sterility, danger, and death. When a child became ill, parents had to rule out the possibility of witchcraft before taking her to a clinic. This involved consulting the village soothsayer—a traditional priest who lived in a hut and might wear, if anything, a goatskin loin cloth, dreadlocks, and perhaps a gourd affixed with Viking-style horns for a hat. The soothsayer would seek advice from the ancestors using incantations, animal sacrifices, and rituals involving stones, seashells, kola nuts, and other sacred objects.5
Saving children’s lives can be simple with the right tools, but sick children die very fast without treatment. These soothsayer rituals might take days, and Binka and Phillips realized that thousands of children were dying as a result. Educating the community was useless, because if a woman dared to visit a clinic without ancestral permission, people would think she was a witch. Her male relatives might beat her, take her children, and evict her from her house.
When the volunteers turned up with supplies of medicine, people ignored them. After all, what did these peasants know? Some were unscrupulous and incompetent, brandishing the certificates the researchers had given them as if they were medical degrees. By contrast, the nurses in the Primary Health Care program were able to link the medical establishment to the realities of poor people’s lives. They exuded authority and confidence, and their methods worked because they knew what they were doing. If people had questions, the nurse could answer them; if the supply of drugs ran out, the nurse could obtain new supplies quickly. Practical, kind, and secure in themselves, the best nurses didn’t need to adopt pretensions or lord their knowledge over the villagers.
Today, the Kassena-Nankana allow children to be treated medically first, eschewing the rituals unless treatment fails. Though the community is still extremely poor, its children are now more likely to survive than those born in Greater Accra, which includes the nation’s capital. This costs the Ghanaian health care system only an additional five dollars per person annually, less than many programs involving medical supplies and volunteers, which frequently rely on expensive expatriate managers.6
About twelve years ago, a delegation of Ethiopian health officials visited Ghana and decided to introduce a similar program in their own country. Child mortality in Ethiopia has since fallen steeply, but the country is politically divided, and progress has been much faster in the areas where ruling party elites and their relatives live.7 In 2010, I visited an area where many people oppose the government, and I wondered whether the slower progress there had less to do with the quality of health services—the clinics seemed well run—than with a general sense of demoralization in a place where ubiquitous government spies watched people’s every move and punished any sign of dissent or initiative.
Thousands of children in this region were dying from kwashiorkor, a mysterious disease associated with malnutrition and characterized by vomiting, diarrhea, swelling, and refusal to eat. Kwashiorkor can be cured with a sweet, vitamin-filled, high-calorie peanut paste called Plumpy’nut, and Ethiopia’s new rural clinics were offering it to the mothers of children diagnosed with the disease. But when a former UNICEF official named Alessandro Conticini found Plumpy’nut packets on sale in local shops, he suspected that mothers were selling them. In impoverished regions the world over, the bond between mothers and children forms slowly; loving a baby who probably won’t survive carries enormous emotional risk, and many poor women neglect sick children, assuming they’ll die no matter what.8
In 2009, Conticini introduced a program to help mothers reconnect with their sick children. In Ethiopia, parents don’t traditionally play with babies or talk to them, and childcare tends to be limited to feeding and cleaning, but Conticini trained community nurses to teach the mothers how to make toys out of sticks, cloth, discarded toothpaste boxes, whatever they could find. A field trial of the program is underway, but when I interviewed mothers who had been through it a few years ago, many told me that the simple act of playing and laughing together had restored their children’s appetite and saved their lives. “I’d prepared the funeral shroud and told the church that he would be buried soon,” said one mother of the healthy toddler playing beside her. “Now I can’t get enough of him. I can’t quite believe he’s alive.”
Programs like these transcend charity by enabling the poor to help themselves, but they require committed people—nurses, doctors, lab technicians—precisely the people the World Bank and IMF forced African governments in the 1980s to stop paying for. Despite a sevenfold increase in global health spending by Western donors since the 1990s, such locally designed programs still receive little support from foreign aid agencies. Most large donors, including the US Agency for International Development, the Bill and Melinda Gates Foundation, and the William J. Clinton Family Foundation, almost exclusively support formulaic programs for distributing medical supplies, carried out in much the same way in the bustling townships of South Africa as in the lonely expanses of the Sahel. The World Bank even has a new set of “Global Practice Units” that aim to find universal solutions to problems of public health, hunger, education, poverty, and trade. Political considerations and the intimate peculiarities of culture seem to have no place in these programs.
Donations of medical supplies probably appeal to philanthropists like Gates and Clinton and also to corporate-friendly Western legislators because they have an eye on African markets for health-related goods. As Western populations age and shrink, and as pharmaceutical innovation founders, Africa, with its many diseases and rising population and GDP, represents a vast potential market for Western firms. But it’s a market currently dominated primarily by generic medicines made by Chinese and Indian companies, whose sales volumes have increased up to eightfold in the past decade.
In some respects, programs that supply and promote equipment and medicine function in much the same way that free pharmaceutical samples given to your doctor by reps from Pfizer and Lilly do. Once you try them, the hope is, you’ll keep buying them. The same goes for entire nations receiving medical donations through foreign aid. There’s nothing wrong with this: vaccines and other items are obviously necessary. But an exclusive focus on capturing markets and distributing technology will never address the fundamental social and political factors responsible for the six million or so needless child deaths that still occur each year, mostly in Africa.
In 2000, world leaders gathered at the United Nations in New York to commit themselves to a fifteen-year program of “Millennium Development Goals,” including reductions in poverty, gender inequality, and child mortality in developing countries. The results of this, likely to be announced in December, are already known. Those African countries that have invested in Primary Health Care, including Ghana, Ethiopia, and Rwanda, have made remarkable progress. Countries like Uganda and Nigeria that relied on the World Bank’s prescription of privatization and episodic, donor-sponsored campaigns to promote the use of particular medical supplies have not.
This would probably not surprise Jim Grant, who did the best job anyone could have during a tragic period in world history. Shortly before his death he wrote that while the specific interventions promoted by the Child Survival Revolution had saved many lives,
a major concern is the sustainability of these achievements…. Mobilization should start with national leaders, but its sustainability depends on continuing community participation…. The most important responsibility of health and other services is to promote the capacity of families and communities to solve their own problems with self-reliance.9
Research for this article was supported by the Open Society Foundations.
1. Kenneth Hill and Agbessi Amouzou, “Chapter 3: Trends in Child Mortality, 1960 to 2000,” in Disease and Mortality in Sub-Saharan Africa, second edition, edited by D.T. Jamison, R.G. Feachem, M.W. Makgoba, et al. (World Bank, 2006).
2. Elliot Berg, Accelerated Development in Sub-Saharan Africa: An Agenda for Action (World Bank, 1981); World Development Report 1993: Investing in Health (Oxford University Press, 1993).
3. Frederick Golooba-Mutebi, “When Popular Participation Won’t Improve Service Provision: Primary Health Care in Uganda,” Development Policy Review, Vol. 23, No. 2 (2005).
4. Don de Savigny, Charles Mayombana, Eleuther Mwageni, Honorati Masanja, Abdulatif Minhaj, Yahya Mkilindi, Conrad Mbuya, Harun Kasale, and Graham Reid, “Care-seeking Patterns for Fatal Malaria in Tanzania,” Malaria Journal, Vol. 3, No. 27 (July 28, 2004).
5. Pierre Ngom et al., “Gate-Keeping and Women’s Health Seeking Behaviour in Navrongo, Northern Ghana,” African Journal of Reproductive Health, Vol. 7, No. 1 (April 2003); F.N. Binka et al., “Rapid Achievement of the Child Survival Millennium Development Goal: Evidence From the Navrongo Experiment in Northern Ghana,” Tropical Medicine and International Health, Vol. 12, No. 5 (May 2007). In the interests of transparency, I wish to declare that I have in the past worked as a consultant for the Doris Duke Charitable Foundation (DDCF), one of the funders of James Phillips’s child survival research in Ghana. In 2012, DDCF supported a workshop I conducted to introduce Ghanaian journalists to various research projects in the region, including that of Drs. Phillips and Binka.
6. In 2001, UNICEF launched a trial of an expanded version of the Child Survival Revolution known as Accelerated Child Survival and Development (ACSD) in three West African countries. In addition to providing the vaccines that were part of the original program, ACSD included malaria medicine, insecticide-treated bednets, and many other things. The results, published in 2010, found that the program had no effect on child survival, probably because the basic primary health care system in the trial areas was so weak. See Jennifer Bryce et al., “The Accelerated Child Survival and Development Programme in West Africa: A Retrospective Evaluation,” The Lancet, Vol. 375, No. 9714 (February 13, 2010).
7. Central Statistical Agency [Ethiopia] and ICF International, Ethiopia Demographic and Health Survey 2011 (Addis Ababa, Ethiopia, and Calverton, Maryland: Central Statistical Agency and ICF International, 2012).
8. See, for example, Nancy Scheper-Hughes, Death Without Weeping: The Violence of Everyday Life in Brazil (University of California Press, 1993).
9. See David C. Taylor and Carl E. Taylor, Just and Lasting Change: When Communities Own Their Futures (Johns Hopkins University Press, 2002).