Chapter 4 - Third world achievements

In 1987, the British journal Nature published a paper describing a synthetic vaccine that appeared to protect monkeys from malaria (Patarroyo et al. 1987). Manuel Patarroyo’s claim to have developed what would be the first vaccine to provide protection against a parasitic disease was greeted with intense skepticism.

The skepticism increased - mixed with criticism about the ethics involved - when, soon afterward, Patarroyo began testing his vaccine on thousands of Colombians without the safeguards the scientific establishment considered necessary.

It didn’t help that Patarroyo hadn’t done what scientists are supposed to do: expose his results to the criticisms of his peers at international meetings. Nor did it help that his approach to the development of his vaccine was thoroughly unorthodox.

The result was an international controversy which “was characterized by a combination of envy of Patarroyo and outright disbelief at his results, and genuine scientific criticism,” wrote Professor F.E.G. Cox of King’s College London in Nature in 1993. Britain’s Medical Research Council refused a request for a field test from one of its own African laboratories. The US Walter Reed Army Institute of Research declared it was unable to duplicate Patarroyo’s vaccine production methods, and WHO studies failed to replicate his results.

Attitudes changed, however, when Patarroyo paid heed to his detractors and published data that conformed to their standards. He also began attending conferences to reply personally to his detractors’ concerns. In person, his warm, open, and modest manner shone through and helped immeasurably to further his cause. Soon Britain’s Medical Research Council acceded to the request of its African scientists and established a vaccine trial in The Gambia. Walter Reed scientists, after working more extensively with Patarroyo’s methods, achieved identical results. And soon it became apparent that the WHO studies may have suffered from similar difficulties in arriving at a correct vaccine formulation.

By the spring of 1993, the word in the world’s scientific literature was that the Colombian’s vaccine might indeed be the real thing.

“[Patarroyo’s] persistence now seems to have been vindicated,” wrote Cox (1993) about the first large-scale trials in Colombia. “The degree of protection achieved is not dramatic. But it is the first time that such a large trial has been undertaken and there can be no doubt that a certain degree of protection was achieved.... From the point of view of public health, fewer days would be lost by sickness if everyone were immunized with a vaccine, even only a partially effective one. Malaria is a complex disease, and it may be that all it will be possible to achieve in the near future will be a tolerable level of malaria with which it is possible to live.”

Part of the doubt about Patarroyo’s vaccine arose because his methodology differed completely from the norm.

A complex disease, malaria poses particularly difficult problems for those trying to combat it. First of all, it can be caused by four different parasites: Plasmodium falciparum, the most deadly, which predominates in tropical Africa and also occurs in Asia and South America; Plasmodium vivax, the most widespread, common in Asia and the Americas; Plasmodium ovale, found mainly in West Africa; and Plasmodium malariae, which occurs patchily throughout the tropics. Secondly, these parasites undergo a series of transformations: first in the mosquito, next in the liver of the person bitten, and finally, in the victim’s red blood cells, where they reproduce.

At each stage, the organism has a different biological identity. A vaccine works by introducing into the body something that the immune system will recognize and respond to by producing antibodies to destroy it. In the case of malaria, the vaccine designer has to decide which of the various biological identities will be represented in the vaccine. According to Geneva-based science writer John Maurice (1993):

The conventional wisdom is that to be effective, a malaria vaccine would have to mobilize the cellular component of the immune system in addition to generating antibodies. Yet Patarroyo made his vaccine from peptides more likely to produce an antibody response. What’s more, many researchers in the mid-1980s were pinning their hopes on a vaccine to attack the parasite in its sporozoite stage, the form in which the organism is injected into the bloodstream by a biting mosquito. But Patarroyo targeted mainly the merozoite form, which develops from the sporozoite and causes the fever and chills typical of the disease. Although his initial attempts to find a merozoite peptide that would completely protect monkeys from malaria were unsuccessful, he eventually came up with a cocktail of three peptides that showed promise, and then by a deft stroke of chemical legerdemain used two sporozoite peptides to link them in a stable form.

Patarroyo attributes his success to attacking the problem through chemical synthesis rather than genetic engineering. The challenge was to synthesize the string of peptides; genetic engineering seemed unworkable because the pieces were too small, so Patarroyo decided to do it chemically. Although some of his peers acknowledged that his solution could work, they thought it would not be achieved until about the year 2025 because of the technical difficulties involved.

Patarroyo, however, had trained at Rockefeller University in the United States with Dr Robert Bruce Merrifield, who, in 1984, won the Nobel Prize in Chemistry “for his development of methodology for chemical synthesis on a solid matrix.” So, Patarroyo said: “Why should we wait so long? The technology is now ready to be used, why not use it right now? And that’s what happened - we made it!”

A Case of Scientific Xenophobia?

Did the fact that Patarroyo’s work was done in the Third World really have anything to do with its initial rejection in the North, or would it have been received with the same skepticism had he been American, British, or Canadian?

According to a trio of authors from scientific centres in Spain, Switzerland, and Tanzania who examined this question in the international journal Vaccine, research strategies and investments are heavily governed by the prevailing sociopolitical context (Alonso et al. 1993). The US military has long been involved in malaria vaccine research, particularly vaccines against the pre-erythrocytic stage, in the hope of reducing the number of casualties during operations in endemic areas. This type of vaccine is unlikely to be useful to people who live in endemic areas of the Third World - particularly in sub-Saharan Africa - although it might be very useful for short-term visitors.

With the end of the cold war came budget cuts for research such as the military’s vaccine work. And because of the limited financial return, the pharmaceutical industry is unlikely to undertake research into vaccines that will protect only people living in endemic areas. Stressing that Patarroyo’s vaccine is unlike those researched by the US military, in that it targets the different stages of the disease-causing organism and could “be a central tool in malaria control programmes in endemic areas,” the authors write:

The history of SPf66 [Patarroyo’s vaccine] is an excellent case study of the problems and position of science within development cooperation, of the sociopolitical context that shapes it, and of the confusion among scientists about their role in policy formation.... His claims have been met with a spectrum of reactions ranging from unquestioning enthusiasm, through scepticism to intellectual racism.

Noting early shortcomings of Patarroyo’s work, the authors said:

These are admittedly sound arguments which could justify a rational scepticism. However, they could equally be applied to many of the pre-clinical, clinical and field trials performed by those scientific establishments that have chosen to ignore and discredit Patarroyo’s claims.

A WHO/PAHO ad hoc committee visited Bogota in June 1990 and concluded that SPf66 merited further study. The committee recommended that independent randomized placebo-controlled trials should be carried out urgently among children living in areas of high transmission, particularly in Africa. Nevertheless, the British Medical Research Council twice refused to carry out such trials of SPf66 in The Gambia. A review of malaria research in Time [31 May 1993] chose to ignore totally Patarroyo’s work. These are just two examples of how the story of SPf66 is governed by the sociopolitical anatomy of our time. The development of a potential malaria vaccine by the South American group, rather than by an established malaria vaccine research centre, represents a blow to the scientific establishment that believes itself to be the trustee of malaria vaccine research. An important question of ethics is surely raised when a potential avenue for the advance of science, and one which could lead to a way of controlling one of the world’s most important parasitic diseases, is deliberately ignored.

Patarroyo’s vaccine breaks new ground in a number of ways: it is the first to have been synthesized chemically, rather than made - like other vaccines - from dead or genetically altered pathogens such as viruses or bacteria. The vaccine is also inexpensive to make: each dose is estimated to cost 30 cents, which makes it ideal for Third World use.

“It’s a pretty major breakthrough,” said Pedro Alonso, an epidemiologist from Barcelona’s Foundation for Biomedical Research who worked on the Tanzanian trial. “It’s not only the first malaria vaccine, but the first against any parasitic disease of humans” (Brown, P. 1994).

Although Patarroyo’s achievement is finally being recognized by the world’s malaria experts, some indicate that the vaccine needs more work before it will be ready for widespread use. Nicholas J. White of the Wellcome-Mahidol University Oxford Tropical Medicine Research Programme in Bangkok, Thailand, wrote in the 20 October 1994 issue of The Lancet:

In another year, we will have the results of the other trials and a clearer idea of the overall efficacy of SPf66. Success - i.e., confirmation that Patarroyo’s vaccine prevents malaria - will raise many important questions. How long will the protection last? Will the vaccine select for “resistant strains”? Is there a danger of destabilizing malaria and increasing mortality in areas of intense stable transmission? Can we improve on the present vaccine, and how about the several other vaccines also in development? We have not reached the end of the journey towards an ideal vaccine against falciparum malaria, but the SPf66 vaccine has held up well so far, and we are still on track.

Odile Puijalon, a molecular biologist working on malaria at the Pasteur Institute in Paris, put Patarroyo’s advance in a broader context: “I think even if this vaccine is not in the end used on a large scale, what we owe Manuel Patarroyo is that he has proved that using long peptides is feasible, and at a reasonable price” (quoted in Brown, P. 1994).

The World’s First Family-Planning Vaccine

The risk of death from pregnancy and childbirth for African and South Asian women is 200 times greater than for women in the industrialized world. In nearly every African country where surveys were conducted, at least half the married women wanted to postpone their next pregnancy or did not want any more children. And in developing countries generally, the high incidence of induced abortion reveals the need for better methods of fertility regulation.

An Indian medical researcher has developed such a method: a safe, long-lasting, and reversible vaccine. Dr Gursaran Talwar began the research in 1975 as Director of India’s National Institute of Immunology, with funding from the Indian government and IDRC. The latest clinical studies show that among 88 women injected with the vaccine, only one pregnancy occurred over 821 menstrual cycles. Without the vaccine, 250 to 300 pregnancies would have been expected in such a group (Newton 1993).
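Readers who want to relate these figures to the standard measure of contraceptive efficacy can do the arithmetic themselves. The short Python sketch below is purely illustrative: the 30% chance of conception per unprotected cycle is an assumed textbook figure, not one reported by the study, and the Pearl index convention of 13 menstrual cycles per year is only one common choice.

    # Illustrative arithmetic for the trial figures quoted above.
    pregnancies = 1      # pregnancies observed in the trial
    cycles = 821         # total menstrual cycles of exposure (88 women)

    # Assumed typical chance of conception per unprotected cycle (~30%);
    # a textbook figure, not one reported by the study.
    expected_without_vaccine = 0.30 * cycles          # about 246

    # Pearl index: pregnancies per 100 woman-years (13 cycles per year).
    pearl_index = pregnancies * 1300 / cycles         # about 1.6

    print(f"Expected pregnancies without protection: ~{expected_without_vaccine:.0f}")
    print(f"Pearl index with the vaccine: {pearl_index:.1f}")

The roughly 246 expected pregnancies this yields is consistent with the 250 to 300 the researchers cite, and a Pearl index below 2 is in the range reported for highly effective reversible methods.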

“We have passed an important milestone and that is to confirm the vaccine works,” said Dr Talwar. “This is the first time we’ve seen a birth-control vaccine that can prevent pregnancy in humans.”

Although approval for widespread use of the vaccine will take considerable time, Dr Talwar hopes it will be approved before the turn of the century.

The vaccine has many advantages over other methods of fertility regulation. Unlike “the pill,” it does not stop ovulation or alter the menstrual cycle - and it does not involve daily medication. Nor does it involve the insertion by a doctor - and the irregular bleeding - associated with IUDs (intrauterine devices). Even the new, implantable devices such as steroid packs cause irregular bleeding.

Dr Talwar’s vaccine works by increasing the body’s production of antibodies to human chorionic gonadotropin (hCG) - the hormone that signals the body to maintain the uterine lining in preparation for implantation of an embryo. With increased antibody production, this hormonal action is blocked and embryo implantation does not take place.

The vaccine does not produce an abortion; rather, it helps prevent pregnancy. Moreover, in doing so, it simply makes more efficient a process that occurs naturally. In unvaccinated women, approximately 50 to 75% of embryos fail to become implanted in the uterus because of the presence of antibodies normally produced by the body against hCG. The vaccine increases the antibody level to a point where no embryos are implanted.

Currently, the vaccine must be administered once a month for 3 months, after which time it is effective for a year. During the initial 3 months, women must use other methods of contraception. Dr Talwar is now working on the development of a contraceptive based on a purified extract of the neem tree for use during the 3-month “waiting period.” Researchers at the University of Alberta are collaborating with him to determine why and how the extract works as it does in trials with rats and monkeys. Dr Talwar is also working on an implant that would extend the vaccine’s protection from 3 months to a year.

As did Patarroyo with his malaria vaccine, Dr Talwar faced disbelief and skepticism when he proposed his pregnancy vaccine in the early 1970s. “People felt it was a fantasy,” he said. “Vaccines were traditionally made for diseases, pestilence, viruses, and bacteria - not birth control.” He also faced the usual skepticism concerning Third World scientists. “For a scientist in a developing country to make an original contribution, he or she has to work five times harder,” he added.

To refute his critics, Dr Talwar has subjected his work to rigorous animal trials over 10 years. An international committee of the Population Council of New York has conducted toxicology studies in Brazil, Chile, Finland, and Sweden, while other studies are being carried out by WHO.

The vaccine will obviously benefit all countries, not just the South. “Many people want alternatives to current methods of contraception,” says Dr Talwar. “There is no dividing line between North and South.”

VACCINATING CHICKENS

An acute and highly contagious viral disease, Newcastle disease either kills the poultry it infects or reduces egg production and retards growth. In 1976, considering Newcastle disease the biggest threat facing the poultry industry in Malaysia, Professor Abdul Latif Ibrahim of Universiti Pertanian Malaysia proposed research to improve protection against the disease. With support from the International Foundation for Science over 4 years, he succeeded in isolating two new clones from the virus and preparing stable new vaccines.

In 1982, Professor Latif was named Dean of the Faculty of Veterinary Medicine and Animal Science at his university, and acquired new funding for his vaccine research. Using the cloning technique, he produced a potent vaccine - which was subsequently manufactured - from the Australian strain of the Newcastle virus. In 1984 he received the first Sven Brohult Award, named after the founding president of the International Foundation for Science, for this work. Professor Latif is currently doing research, financed by the Australian Centre for International Agricultural Research, on an oral vaccine that could be fed to chickens reared in the backyards of rural areas of Malaysia and Southeast Asia.

The Conquest of Yellow Jack

Viral diseases and parasitic infections are common scourges in Third World countries, and colonizing Europeans have often been particularly hard-hit by such maladies. Yellow fever is a case in point.

It has been argued that yellow fever originated in the Caribbean and was exported to Africa. It has also been said that the first Europeans to fall victim to the disease were Christopher Columbus’ crew. In his 1976 book, Plagues and Peoples, William H. McNeill argues the traffic went in the other direction. “Yellow fever announced its successful transfer from West Africa to the Caribbean for the first time in 1648, when epidemics broke out in both Yucatan and Havana.” The transfer, he said, required establishment in the New World of a special species of mosquito - Aedes aegypti - which breeds only in artificial containers such as water casks, rather than in water with a natural bottom of mud or sand.

Whatever its origin, yellow fever produced sudden, serious, and often fatal fevers among Europeans in the Caribbean. In 1762, for example, over half the members of the British army who arrived to lay siege to Havana fell ill within a month. In Santo Domingo 31 years later, 6000 of the English troops sent to help in the revolt against the French Creoles died of yellow fever, while only 100 perished in the actual fighting. The English sailors nicknamed the dreaded disease “Yellow Jack” because of the jaundice it produced.

It was a Cuban doctor, Carlos Finlay, who first identified the cause of yellow fever as an infection produced by a mosquito bite. In Havana in 1881, he presented his hypothesis to the Royal Academy of Medical, Physical and Natural Sciences, and subsequently confirmed it with experimental inoculations on 100 volunteers. A US medical team later validated his discovery, and in 1898 Finlay proposed the chemical destruction of mosquito larvae in water tanks. This method was put into practice in 1901 and backed by the US Army. The last case of yellow fever in Havana was registered that same year.

Thanks to Finlay’s discovery and the later development of a vaccine, yellow fever is no longer a serious threat to travelers from the North to central Africa and to South and Central America.

The Assassin Bug Strikes

A chronic disease in South and Central America is a version of the African sleeping sickness (trypanosomiasis) called Chagas’ disease. In Africa, the disease is spread by the tsetse fly, while in South and Central America it is transmitted by the blood-sucking insect Triatoma infestans, which carries the parasite Trypanosoma cruzi.

Chagas’ disease is found predominantly in young children and can have serious effects, including death. Some 15 to 20 million people are infected with it in Latin America, 10% of whom suffer from it chronically. The chronic phase, which appears only 15 to 20 years after the insect’s bite, affects the heart, oesophagus, lower intestine, and peripheral nervous system.

The disease is named after Carlos Chagas (1879-1934), a Brazilian who studied medicine in Rio de Janeiro. Simon Flexner, the eminent American medical researcher, said Chagas’ work on American trypanosomiasis was the most complete study made of any human disease as of 1920. According to Kean et al. (1978), in Tropical Medicine and Parasitology: Classic Investigations: “We know little more [about it] today. The parasite, the vector, and the disease are brilliantly described in [Chagas’] monograph.”

How Chagas’ Disease Got its Name

Chagas describes in this monograph how, in 1907, he and colleagues discovered the cause of the disease while conducting an antimalarial campaign during construction of a railroad.

We had some knowledge of the existence of a hematophagic insect [called “barbeiro” by the locals] which lives in human dwellings, attacking man at night, after the lights are out, and hiding during the day in the cracks in the wall, in the roofs of the houses, in one word, in all hiding places likely to afford shelter. As a rule, the hematophagic insect is seen in greater abundance in poor dwellings, in huts with unplastered walls and covered with thatch. Their reproduction is considerable there; they are found in immense numbers in the cracks of the walls and constitute a most remarkable threat to the lives of men who live in these dwellings. We have frequently observed men attacked by the hematophagic insect: a few minutes after the lights are extinguished in the homes, they come out of the hiding places in large numbers, to bite people, preferably in the face. When the lights are turned on, the hematophagic insects escape rapidly, thus rendering their capture difficult. The hematophagic insects will stay in the dwellings only as long as man resides there: they will disappear very quickly from deserted huts, certainly due to lack of food.

To this day there are no effective drugs or vaccines for use against Chagas’ disease, so it must be prevented rather than cured. Preventive measures include keeping the blood-sucking insects from invading houses. In Paraguay, a research team funded by IDRC has concentrated on improving housing by plastering walls and filling joints between boards, smoothing roofs and ceilings, and improving ventilation and light through the installation of better doors and windows.

The key to success, however, is community involvement. Rural communities concerned with day-to-day survival do not take seriously the threat of a disease that strikes only 15 to 20 years later. Nonetheless, Chagas’ work has obviously been vital to controlling this scourge.

Will Chagas’ Disease Invade the North?

The North does not usually regard Chagas’ disease as a threat - to many it is just another Third World problem. But it is estimated that about 100 000 Central American immigrants in the United States carry the parasite in their blood, completely unaware of it (Howard Hughes Medical Institute 1993). Spread of the parasite to the United States - and beyond - through blood transfusions is not only possible, but has already occurred. Two cases were reported by 1993. Nor is Chagas’ disease an isolated case: a tropical disease specialist warned a 1988 conference in Washington that many other disease-causing organisms might move northward from their native habitats as a result of global warming (see Chapter 6).

Staggering from Too Much Cassava

A few years after the Second World War, while working in the neurosurgical unit of the University of London’s psychiatric institute, a young Cameroonian doctor named Gottlieb L. Monekosso noticed that many patients who had been prisoners of war in Germany and Japan suffered from impaired vision, hearing, and gait. When he returned to Nigeria, Dr Monekosso said years later, “the very first patients that I saw reminded me of these descriptions; and I thought this is very funny: it looks like the kind of thing that these prisoners of war suffered from. This is happening in peacetime in an African population. I had better look at it and see what it’s all about” (Nichols 1982).

What it was all about was a condition affecting about 80% of women in certain areas of Nigeria - a condition that could produce, in addition to the staggering gait, goitre, liver damage, and nerve disease. Dr Monekosso thought the condition might be caused by the victims’ diets, as it had been among the prisoners of war. Manipulation of their diet had no effect, however. The only thing that relieved the patients was to remove them from their environment and admit them to hospital.

On a visit to London, Dr Monekosso had met a British physician who was studying the role that cyanide in tobacco plays in a rare disease of the nervous system, and he and his colleagues suspected that the Nigerian patients’ malady might be caused by cyanide in cassava.

This root crop, also known as manioc and tapioca, is the dietary staple not only of Nigerians, but of 200 to 300 million people worldwide. Easy to grow even in poor soils, cassava is drought tolerant, resistant to weeds and insects, and can be left in the ground without harvesting for long periods.

However, cassava also contains a substance called linamarin and a related substance, lotaustralin. Under the influence of an enzyme, these substances liberate hydrogen cyanide (prussic acid), one of the most powerful poisons known. The poison acts as a natural insecticide, protecting the plant against pests.

An African Discovery

By now, Dr Monekosso was collaborating with one of his former students, Dr B.O. Osuntokun, who used the cassava research as the basis for his doctoral thesis.

“Between us we eventually set out both the field exercises and the laboratory work which led to what is now accepted as the role of cyanide in cassava diets,” Dr Monekosso said. The condition became known as “tropical ataxic neuropathy.”

The researchers learned that some of the afflicted Nigerian women were eating up to 21 meals of cassava a week, with little other intake of protein. While traditional preparation methods - soaking the root in water before cooking - did much to eliminate the cyanide, they were not enough to prevent poisoning among those with a high intake. Basically, the researchers found, the problem was one of poverty. Unlike wealthier Africans, those affected could not afford high-protein food sources that provide the enzyme that destroys cyanide.

Tropical ataxic neuropathy is now uncommon in Africa. But the work of Dr Monekosso, now Regional Director for the World Health Organization in Africa, and Dr Osuntokun, now Professor of Medicine (Neurology) at the University of Ibadan and one of Africa’s most prominent medical investigators, did more than virtually eliminate a serious disease. It also helped in the understanding of the metabolism of cyanide in the human body - a contribution to knowledge that will benefit people both rich and poor worldwide.

Cassava’s Enormous Potential

While too much cassava can kill, too little could mean starvation for the estimated 200 to 300 million people in more than 80 countries who depend on it for nourishment. Despite its significance, cassava was long neglected by scientists.

In 1971, IDRC and CIDA undertook a cassava research program that produced remarkable results, including the identification of varieties that have quadrupled the national average yield in some countries, improvement of the crop’s normally low protein content, reduction of its cyanide content through breeding programs, and control of some diseases.

The research was motivated by cassava’s enormous potential. It is capable of producing more energy annually per unit of land than any other known staple food crop. As well, it has long been exported for its starch content, the characteristics of which are of special interest to the food, paper, and chemical industries. It has also been used in dried and pelleted form as animal fodder, with an energy value almost identical to that of corn.

As human food, cassava is a subsistence crop: it is highly tolerant to drought, can grow in poor soils, and is generally resistant to weeds and insects. It can be planted and harvested in any season and can be left in the ground for long periods without harvesting, which makes it useful as security against famine.

“Once regarded as a backstop crop to tide the rural poor over in ‘hungry seasons,’ cassava has emerged as a nutritional and commercial mainstay in sub-Saharan Africa,” says the Rockefeller Foundation’s 1993 annual report. “It has become both a dietary staple for almost 200 million people in the region and an important cash crop.” In the 15 African countries that produce 70% of the continent’s cassava harvest, this crop generates income for farmers and creates new jobs in processing, packaging, and marketing cassava products.

FARMING THE SEA

About 50 000 Chileans depend on seaweed as their main source of income. Between 1975 and 1987, the price of Chilean seaweed increased six-fold because of the high-quality agar it produces. Agar is used widely in industry, pharmacology, and food products. In 1987, the International Foundation for Science in Sweden presented the Sven Brohult Award to Professor Bernabe Santelices of Chile’s Catholic University. The award recognizes Santelices’ research, which helps seaweed farmers improve the management, propagation, and cultivation of the crop. His work also protects seaweed - and the marine organisms that use it as a habitat - from overexploitation.

Disease-Causing Snails and Soapberries

An estimated 200 million people in Africa, Asia, South America, and the Caribbean are afflicted by a disease called schistosomiasis (or bilharzia), and another 600 million are at risk. They become infected by bathing or wading in water that contains snails carrying parasites called schistosomes. These parasites cause a skin rash, enter the liver through the bloodstream, and may spread to other parts of the body, causing fever, diarrhea, and lung and central nervous system damage. Schistosomiasis is difficult to treat; the best way to reduce its human toll is to prevent infection by eliminating the snails that carry the parasite.

In 1964, a young Ethiopian scientist, Aklilu Lemma, was sent to the northern part of his country to investigate a serious outbreak of schistosomiasis, and to learn how the disease was transmitted from the snails to people. While there, he noticed that women washing their clothes in a small stream used the berries of a plant (locally called endod) as soap. Endod’s scientific name is Phytolacca dodecandra; in English it is called the African soapberry plant.

“I noticed in areas, particularly where people were washing clothes with these berries of this plant, immediately below there were more dead snails floating than any other place,” Lemma said. “So this aroused some curiosity that perhaps there may be a relationship between the plant and the snails. And so I collected a batch of good snails and I asked one of the ladies to see if she could drop a few of her suds from the washing basin into there. And surely, as soon as she did, it sort of bubbled up and all the snails died” (Nichols 1982).

This discovery led to years of research by Dr Aklilu and others, the results of which were published by WHO. It also gave rise to an experiment in which endod was applied to affected bodies of water and the contact of people suffering from schistosomiasis with the snail-infested water was controlled. The experiment proved successful in significantly reducing the incidence of schistosomiasis.

In 1989, Dr Aklilu and his co-worker, Legesse Wolde-Yohannes, were awarded the Right Livelihood Award (also known as “the alternative Nobel Prize”) for the endod work. “The way I look at this study,” Dr Aklilu has said, “is not this soapberry as such as a plant to be promoted as the solution, but the approach: a locally available product that you can mobilize your own people to either grow or produce without having to drain your hard currency which is so much needed for other purposes, a method whereby the community can actively participate.”

In 1993, WHO planned large-scale African tests of endod as a snail killer and detergent.

The Answer to the Zebra Mussel?

Years after Dr Aklilu’s first encounter with endod in Ethiopia, a fascinating sequel began to take shape far away in the US state of Ohio. In 1990, when Aklilu was awarded an honorary degree by the University of Toledo, a biologist at the university by the name of Harold Lee became interested in the soapberry extract and found that endod could kill an adult zebra mussel 4 to 8 hours after exposure, while itself biodegrading within 24 hours. In 1993, IDRC sponsored a lecture tour by Dr Lee to publicize his findings.

By the time Dr Lee made his discovery, the zebra mussel was a major pest in North America’s Great Lakes system, fouling waterways in Ontario and Quebec and in 18 US states. Introduced by accident in 1985, the mollusc multiplied fearsomely, clogging water-intake equipment and causing enormous damage. It also threatened fisheries because it consumes the algae fish depend on and covers rocks in spawning grounds. In 1990, the US Fish and Wildlife Service estimated that the zebra mussel would cause $2 billion in damage to American fisheries by the year 2000.

The University of Toledo applied for and received a US patent on the use of endod to control zebra mussels; it is seeking a Canadian patent as well. A technology package has been developed and Dr Lee and his colleagues are currently seeking industrial partners with whom to market it. Profits will be shared by the university, Dr Lee, Peter Fraleigh (also of the university), and Dr Aklilu (currently a Visiting Investigator in the Department of Industrial Health at Johns Hopkins University).

Asked by telephone whether benefits might flow to Ethiopia, Dr Lee said he and Dr Fraleigh hope that endod might be made into a cash crop for the country. They also propose to devote at least 5% of their royalties to establishing an endod foundation whose principal goal would be to combat schistosomiasis in Africa. The university will also devote 10% of its royalties to the foundation.

Dreams of Magic Bullets

Cesar Milstein was born in Bahia Blanca, Argentina in 1927. Educated at Colegio Nacional de Bahia Blanca and the University of Buenos Aires, he earned a doctorate at Cambridge University in England. His first job was in Argentina, on the staff of the Instituto Nacional de Microbiologia in Buenos Aires. He was made a Fellow of the British Council a year later, became head of his division in 1961, and in 1963 returned to Cambridge to join the staff of the Medical Research Council’s Laboratory of Molecular Biology.

In 1984, Milstein was awarded the Nobel Prize for Medicine (along with Niels K. Jerne and Georges J.F. Kohler) for an enormously important discovery: how to manufacture monoclonal antibodies. Monoclonal antibodies are a sort of “magic bullet” that can unerringly seek out specific antigens (foreign agents in the body that can cause illness). With the help of these antibodies, disease-causing organisms can be identified, targeted, and killed without harming normal body cells. The antibodies can also be used to make effective vaccines against disease.

Before the Milstein method was developed, antibodies had been widely used in medical diagnosis to identify disease agents, but they were only available in nonspecific mixtures. The antibodies would recognize all sorts of substances in the body as foreign - from germs to chemicals from the environment - but they couldn’t discriminate precisely between them.

Immunologists had long dreamed of finding a way to obtain antibodies so pure that they would be able to recognize just one particular antigen. Dream became reality when the Milstein method was developed in 1975.

According to Dr Milstein, he and his co-worker, Georges Kohler, discovered the method by accident. They were trying to understand how an organism can synthesize millions of different molecules in seemingly endless variety, as the body does with antibodies. Their research was what is called “basic” or “fundamental” research, because it was aimed not at any particular practical end, but simply at the acquisition of knowledge.

In the course of their experiments, they discovered that if they made hybrid cells by fusing mouse cancer (myeloma) cells with antibody-producing cells from a mouse’s spleen, each hybrid would produce only one kind of antibody. They were able to clone these hybrids - reproduce successive generations with identical genetic characteristics. The antibodies secreted by a single clone of hybrid cells are “monoclonal antibodies,” and the experiments provided a practically unlimited supply of antibodies that possessed a specificity previously unobtainable.

Since this discovery, monoclonal antibodies have been used extensively in medicine, including research sponsored by USAID on Chagas’ disease in Venezuela, and to produce an antivenom against Russell’s viper in Burma, where snakebites kill about 1 000 farmers every year. Monoclonal antibodies have also been used to diagnose and study viral and bacterial infections in plants.

Was Dr Milstein’s work really an achievement of Third World science, given that the work was done in Britain? Perhaps not, but one can’t help wondering whether he might not have achieved the same success had his native Argentina been able to provide him with the scientific environment he found in the North.

Milstein himself seemed to encourage such speculation during his presentation as winner of Unesco’s 1984 Carlos J. Finlay Memorial Prize. To illustrate his strongly held opinion that basic science must be supported even in developing countries, Milstein told a story about Bernardo Houssay, the Argentinean physiologist who became Latin America’s first Nobel Laureate in medicine in 1947. Houssay, said Milstein, was once asked by a journalist whether it was too much of a luxury for a country like his to support basic research. “Sir,” replied Houssay, “we are an undeveloped country. We cannot afford to be without basic science.”

PRODUCING NEW CROPS THROUGH RADIATION

In 1959, the Belgian Congo (now Zaire) became the first African country to operate a nuclear reactor. Used primarily for university research and teaching, the reactor was replaced in 1972 and again in 1974, at which time it became the most powerful research reactor in Africa. Housed in what was called the Regional Centre of Nuclear Study in Kinshasa, it was largely designed and built by local staff at one-quarter the turnkey price quoted from abroad. Radiation-induced mutation techniques using the reactor have enabled African countries to develop biological controls for plant diseases and to produce new varieties of soybean, maize, rice, and groundnuts.

A Different Point of View

One of the reasons why Third World scientific and technological advances go largely unnoticed in the North is that often they do not fit easily into what the media in the developed world considers news. This concept of news tends to feature such exploits as space flights, high-technology medical breakthroughs - including organ transplants and genetic engineering - and the wonders of new computer technology. Measured by this yardstick, advances in fields such as tropical or dry-land agriculture may simply be ignored by Northern media.

Yet these little-publicized advances may be far more important to people in the South - who comprise a huge majority of the world’s population - than much of what passes for news in the North. Third World countries are by definition poor in terms of per-capita income. One of the chief concerns of a majority of their people is getting enough to eat. So for the more than 700 million people who do not have access to enough food to meet their needs, advances in agriculture are of enormous importance (Pinstrup-Andersen 1993). One such example concerns a process known as biological nitrogen fixation and its link to food crop production in tropical countries.

Free Fertilizer from the Air

As every home gardener knows, plants grow better when fertilized, and one of the essentials in any fertilizer is nitrogen. When we fertilize our lawns or our garden plants, we usually apply nitrogen in chemical form. Nitrogen, however, is not just available in chemical form. It can be obtained freely from the air and made available to plants by way of bacteria that exist in the root nodules of legumes. This process is called biological nitrogen fixation, and under certain conditions planting legumes and other plants together can eliminate the need for chemical fertilizers.

Chemical fertilizers are expensive to produce and require large inputs of electrical energy, thereby increasing the buildup of greenhouse gases in the atmosphere. They also contribute to water pollution. So when biological nitrogen fixation by legumes replaces fertilizers, the process helps promote food production while, at the same time, reducing global warming and water pollution.

A Tropical Bonanza

Some 40 years ago, a Brazilian scientist by the name of Johanna Dobereiner began developing sound, practical, and cost-effective methods of applying biological nitrogen fixation to agriculture in tropical areas. Her work helped Brazil to produce the world’s second-largest soybean crop (22 million tonnes) without the use of any nitrogen fertilizer.

In her 1989 acceptance speech as the first woman to be awarded the Unesco Science Prize, Dr Dobereiner said: “This means that we obtain annually a quantity of nitrogen from the atmosphere that is worth more than $1.5 billion. More recently we have found still more efficient rhizobial [microbial] strains which transfer most of the fixed nitrogen directly to the grains, and their introduction into commercial inoculants should provide further yield increases of at least 20% without any increase in cost.”

But Dr Dobereiner’s biggest challenge was to extend the biological fixation process to other important food crops, including cereals, and to grasses and sugar cane. She and her colleagues succeeded in identifying seven new nitrogen-fixing bacteria associated with crop plants, some of which enhanced plant yield in the field, especially in warm tropical regions.

Later Dr Dobereiner proved that, like soybeans, certain sugarcane varieties can obtain all their nitrogen through biological nitrogen fixation, and that the process could increase the then-current average Brazilian yield threefold.

Her work at the Brazilian Agricultural Research Corporation (EMBRAPA) enabled sustained production of corn and soybean from poor soil such as the acid savannas of not only Brazil, but of Venezuela, Colombia, Ecuador, and Peru. With the improved technology, these previously unutilized lands - nearly equal in size to all US agricultural land - produce as much corn and soybeans per acre as the best land in the United States.

In presenting the Unesco prize, Federico Mayor said the EMBRAPA work “represents a significant step toward environmentally harmless plant nutrition systems, and the alleviation of the acute world food shortage, particularly in developing countries.”

Brazil is only one of many Third World countries to benefit from biological nitrogen fixation through legumes. In fact, a number of countries are promoting its use as members of the Microbial Resources Centres Network (MIRCEN) (Keya et al. 1986). These centres provide training in the methods of Rhizobium inoculation in Ethiopia, Kenya, Lesotho, Malawi, Rwanda, Sudan, Tanzania, Uganda, Zambia, and Zimbabwe.

Awareness of the advantages of biological nitrogen fixation came only after the fossil fuel shortages of the 1970s - notably in Australia, Brazil, and Uruguay, because of their need to improve pasture land. By then, world food production depended on a 40 million tonne supply of synthetically fixed nitrogen that cost between $8 and $10 billion annually. Not surprisingly, there was concern about the adequacy of the world’s natural gas supply required to produce enough synthetic nitrogen by the end of the century. The environmentally conscious also worried that fertilizer left unused by crops would pollute groundwater and produce other undesirable side effects.

For developing countries, the advantage of having nitrogen fixed by plants is particularly important. First of all, these countries do not have the foreign exchange necessary to buy enough chemical fertilizer, and many of their farmers could not afford the fertilizer even if it were available. There is a much bigger problem, however. Until recently, to meet nutritional needs, Third World farmers have moved into areas with infertile soils not previously used for crop production. This practice cannot continue.

“While cultivated area is still increasing in developing countries, it is doing so at a low and declining rate,” said Per Pinstrup-Andersen, Director General of the International Food Policy Research Institute (IFPRI), recently in Washington. “Area expansion is no longer a feasible option for expanding food production in most of the world, and increased food production will have to come from increased yields.”

Although biological nitrogen fixation is vital to the Third World, it is also becoming increasingly necessary to the world at large.

“Fertilizer nitrogen will become ever more expensive because of the increasing cost of electricity,” predicted C.A. Parker, Professor Emeritus and Honorary Research Fellow in the University of Western Australia’s School of Agriculture in 1986. “Furthermore, soil nitrogen is only temporarily enriched [unlike phosphate] by addition of mineral fertilizer-N. This situation can be resolved by greatly expanded use of legumes” (Parker 1986).

As populations expand and increased food production becomes even more necessary in developing and developed countries alike, the natural fertilizer techniques pioneered by Dr Dobereiner and MIRCEN will benefit us all.

Reclaiming Salty Soils

Soils with a high salt content are a problem worldwide. They cover 10% of the Earth’s surface and occur in about 100 countries and regions. Salt seriously impedes agricultural production: it hardens topsoil, decreases the rate of water infiltration, and interferes with seed germination. In China, some 26.8 million hectares, or about 7% of all cultivated land, are affected by salt, most often in regions with a high potential for agricultural yield.

Beginning in the 1950s, Chinese researchers reclaimed vast areas of salt-affected soils in the Huang-Huai-Hai Plain, using a combination of water conservation and agroforestry methods (Zheng-Ming and Cheng-tao 1986). They tripled grain yields and turned the area into China’s leading cotton-producing region.

As much as one-third of Pakistan’s irrigated agriculture is affected by salty soils, and experts estimate that, if unchecked, the high levels of salinity could lead to abandonment of 25% of the country’s cultivable land, which is sorely needed for food production.

The International Irrigation Management Institute (IIMI) in Sri Lanka says the solution for the Lower Chenab Canal system in the Punjab lies in proper management of the 19th-century irrigation system. The system’s inadequacy forces farmers to use tube-wells, which exhaust the aquifers and cause salinity problems (IIMI 1993). In the meantime, Kauser A. Malik, a Pakistani scientist with the Nuclear Institute for Agriculture and Biology in Faisalabad, has developed a pasture grass to counter the situation (Malik et al. 1986):

Kallar grass is a highly salt tolerant perennial grass that grows well even under water-logged conditions. It is deep-rooted and opens up the impermeable sodic soils; if green manured it provides considerable amounts of stable organic matter to the soil. It harbours nitrogen fixing bacteria in the rhizosphere and obtains 60-80% of its nitrogen requirement from the air... During one summer it can provide 50 tonnes of biomass per hectare even when irrigated with brackish water. It has been shown by us and our colleagues that the biomass can be used as manure, fodder and for making pulp, biogas and alcohol. All these attributes make it a very suitable crop for introduction into saline lands where it can [not] only be used as an energy crop but also improves the soil.

Soil at Risk in the North

Deteriorating soils are also a problem for developed countries. In 1984, a report by Canada’s Senate Standing Committee on Agriculture, Fisheries, and Forestry found that “on lands affected by salinization in the Prairies, crop yields have been reduced by 10 to 75%, even though farmers have increased their use of fertilizer.”

When agriculture first began on the Prairies, the natural high fertility of the soils enabled farmers to produce millions of tonnes of cereal grains and oilseeds, with minimal use of fertilizer. In the early 1900s, farmers began cropping land only in alternate years. This practice, known as “summerfallowing,” was introduced to store scarce water for the cropping year, to control weeds, and to restore soil fertility. Unfortunately, it achieved the opposite results. Organic matter in the soil declined, soil erosion increased, and the soil became saline at an alarming rate. These effects were cumulative, said the report.

“Recent studies have shown that as much as 40 to 60% of the organic matter present in virgin prairie soils has been ‘used up’ by farm production. An equally startling fact is that, although the native soils in parts of the prairies originally released up to 125 pounds of nitrogen per acre [140 kilograms per hectare] per year, the same soil today may deliver as low as 9 pounds per acre [10 kilograms per hectare] if nitrogen fertilizer has not been used. The practical result for the farmer is that he must apply ever-increasing amounts of nitrogen fertilizer in an attempt to hold production at its current level.”
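The bracketed metric equivalents in the quotation follow from a single conversion factor, as this small Python sketch (using only standard unit definitions, added here for illustration) confirms:

    # Convert the report's nitrogen figures from pounds per acre to
    # kilograms per hectare: 1 lb = 0.4536 kg and 1 acre = 0.4047 ha.
    LB_PER_ACRE_TO_KG_PER_HA = 0.4536 / 0.4047   # about 1.12

    for lb_per_acre in (125, 9):                 # values quoted above
        kg_per_ha = lb_per_acre * LB_PER_ACRE_TO_KG_PER_HA
        print(f"{lb_per_acre} lb/acre is about {kg_per_ha:.0f} kg/ha")
    # Prints ~140 and ~10 kg/ha, matching the bracketed figures.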

The Committee found that salinization reduced average Prairie crop yields by at least 50%. It estimated that, at that time, salinization cost CA $260 million in crop production and, given the rate of expansion of saline areas, farmers lost an additional CA $26 million in revenue every year.

Dr Harold Steppuhn, who is with Agriculture Canada’s Research Centre in Swift Current, Saskatchewan, said that the problem is, if anything, worse today than in 1984. With the increased rainfall we have experienced in recent years, more salts have been brought to the soil’s surface by evaporation.

“In some areas we’re monitoring, it’s just about as bad as it’s ever been,” Dr Steppuhn said. “It just might not be noticed so much because when it rains the salts go back in solution and go down a ways, and it doesn’t look as bad.”

Dr Steppuhn knows of no figures showing the cost of salinization to farmers today. He says that although the problem of salinity continues to grow, less attention is being paid to it. Farmers are less interested in the problem of salinity as they turn from wheat production to alternate crops such as canola, flax, peas, and beans. In comparison to wheat, these other crops may be less salt tolerant, thereby creating a new set of problems.

As a result of Canada’s slackening interest in the problem and cutbacks in government personnel, fewer people are working on solving the problem of soil salinity.

Could the North Learn from the South?

Soil salinity problems in the North and the South are not always the same because of different climates, growing seasons, types of salts involved, economics, and land use. However, since both regions suffer similar damage from saline soils, any advances achieved by soil scientists in the South could also be applied in the North. This holds particularly true for biological control methods such as agroforestry, which are becoming more prevalent in the Third World.

Dr Bob Eilers of Agriculture Canada’s Manitoba Land Resources Unit in Winnipeg sees agroforestry as a potentially worthwhile research approach for Canada.

Agroforestry as a means of water management has been investigated in Australia, for example, he notes. Salt-tolerant tree species, planted in areas that have previously been cleared of vegetation and where the groundwater level - and hence the salt content - is high, can use some of the excess water.

Eilers says that Canadian applications would have to be site specific and limited by local conditions. For example, “we pretty much cultivate all parts of the landscape and we don’t have forest lots or woodlots on the prairies. But I don’t even know if there’s really been a search to look at trees and shrubs as a management tool for salinity. I’m not aware of it. I know that it has been done in other places, and it seems to me that there are opportunities here - not for thousands of acres but for local, site-specific applications - where it might be very applicable.”

THAT’S A SUPER BANANA

Bananas and plantains are the staple food of 70 million Africans, as well as a highly popular food for millions of others worldwide. But these crops are threatened with extinction by rampaging diseases called black Sigatoka and Panama disease. The Honduran Foundation for Agricultural Research, with support from IDRC and USAID, recently developed a “superbanana” called Goldfinger. Resistant to these diseases, it could save the banana export industry from collapse. Capable of growing without pesticides and in areas where traditional banana varieties do not flourish, Goldfinger is a highly productive variety that has a unique and attractive flavour, and could extend the banana crop’s range into semitropical or even upland areas. This new variety could ensure reliable food supplies for millions of Africans, Asians, and Latin Americans.

Perfecting Ancient Practices

Agroforestry is a new field of organized science based on ancient traditional Third World farming practices. It combines the growing of trees, shrubs, and other woody perennials with food crops - and sometimes animals - on the same unit of land. It is a system that does not rely on expensive chemical inputs, yet is highly productive, protective of the environment, and - most importantly - sustainable.

Some trees are leguminous and, like Dr Dobereiner’s soybeans, can fix nitrogen. This nitrogen then becomes available to the surrounding crops through the soil. Millet yields in Senegal have been reported to be 250% higher in grain and 350% higher in protein when grown under Acacia albida trees.

Other plants can work in the same way. Indonesians have increased production of rice from 0.7 to 1.8 tonnes per hectare in 2 years by growing it between young forest plants. And in Malaysia, rubber tree growth is accelerated when intercropped with leguminous ground covers.

When used as windbreaks, trees can protect crops and increase their production. In Niger, for example, millet yields increased by 23% when neem trees were used as windbreaks. And in northern Nigeria, farmers found that in addition to crop protection, their shelter belts provided poles, timber, and firewood, and increased fodder supplies for their animals. Together with the increase in soil fertility and reduction in soil erosion, this use of agroforestry netted them between 16 and 21% return on their investment (Spurgeon 1988).

Alley Cropping

One of the most effective agroforestry arrangements is to grow crops between rows of leguminous trees or shrubs, a technique called alley cropping. To prevent the trees from overshadowing the crops, they are pruned frequently, and the cut leaves and branches, spread on the ground, serve as mulch. As the leaves and branches decompose, they contribute nitrogen and phosphorus to the soil, and the trees continuously pump nutrients from deep in the soil and make them accessible to the crops.

Paul Harrison, British author of The Greening of Africa, describes how an Indonesian soil scientist, B.T. Kang, and his colleagues built on traditional methods of African farmers in developing alley cropping at the International Institute of Tropical Agriculture in Ibadan, Nigeria. They found that a tree that originated in southern Mexico, Leucaena leucocephala, was ideal for this purpose: its leaves can provide as much as 166 kilograms of nitrogen, 150 kilograms of potash, and 15 kilograms of phosphorus per hectare (Harrison 1987).

“When incorporated in the soil,” wrote Harrison, “every tonne of leaves produces the same increase in maize yields as 10 kilos or more of chemical nitrogenous fertilizer,” besides greatly increasing the efficiency of chemical fertilizers.
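Harrison’s ratio makes it simple to estimate what a hedgerow’s prunings are worth in fertilizer terms. In the Python sketch below, the five tonnes of prunings per hectare is an assumed illustrative figure for a vigorous Leucaena hedgerow, not a number from Harrison’s book:

    # Rough fertilizer-equivalent of Leucaena prunings, using Harrison's
    # ratio of about 10 kg of nitrogenous fertilizer per tonne of leaves.
    prunings_tonnes_per_ha = 5.0    # assumed annual prunings (illustrative)
    fert_equiv_kg_per_tonne = 10.0  # Harrison's figure, quoted above

    saved_kg_per_ha = prunings_tonnes_per_ha * fert_equiv_kg_per_tonne
    print(f"About {saved_kg_per_ha:.0f} kg/ha of nitrogenous fertilizer replaced")

On those assumptions, a farmer would avoid buying roughly 50 kilograms of nitrogenous fertilizer per hectare each year, before counting the fodder and fuelwood the trees also provide.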

The yield advantage also accumulates with time. In a long-term trial in sandy soil, unfertilized maize mulched with Leucaena prunings yielded 83% more than fertilized maize, and in the following 3 years the alley-cropped plot continued to yield, on average, three times more than the untreated one, except in 1 year of drought. In addition, the yield was more than double the Nigerian average.

A free-lance writer with extensive Third World experience, Harrison was invited by the International Institute for Environment and Development in 1985 to look for success stories in African development projects and write about them. The Greening of Africa was the result. In it he called agroforestry “arguably the single most important discipline for the future of sustainable development in Africa.”

How the North Benefits

Much of the scientific development of agroforestry was carried out by the International Centre for Research in Agroforestry (ICRAF), which itself arose from an IDRC study. These ancient practices are now being revived, improved, and adapted in Canada by scientists at the University of Guelph, with some of the projects funded by IDRC. The researchers determined that growing trees such as poplar with crops such as barley is profitable. And although planting corn with red oak or black walnut trees, for example, reduces the corn yield slightly, the walnut trees produce a valuable nut crop and the oaks help offset some of the CA $100 million spent annually on imports from the United States. Recently, as well, farmers in Vineland, Ontario, have begun planting vegetables between rows of young peach trees.

Peter Williams, a research associate who works with the program at the University of Guelph, says agroforestry practices have been carried out in North America for many years, and were not so much adopted from the South as stimulated by the work going on there.

“There’s a lot of traditional things that farmers do here that are agroforestry, like growing vegetables in apple orchards,” he explained. The planting of corn among black walnut trees was a practice developed in Missouri, for example, and was probably a spinoff of pecan production in the southeastern United States.

“Pecan trees have always been grown with wide spacing and farmers have always grown cover crops or whatever between them because there’s so much real estate there that they have to make some money out of it,” added Williams. Just as the beginning of international interest in agroforestry saw much documentation of developing-country systems, the same is now happening in the North, Williams said. The South, therefore, provided the impetus for increased research in the North.

Work on what was called forest range management in the southeastern United States dates back 20 or 30 years, and work with walnuts in Missouri began more than 15 years ago. But the recent interest in agroforestry in the North gave researchers “a forum for their work,” where previously they were probably regarded as a “fringe element.”

“Agroforestry started out as a real bandwagon in the 1980s, but now that part is over, and people are accepting it,” said Williams. “In the United States they are rewriting the farm bill and they’re incorporating a lot of agroforestry into it. Similar things are happening in Canada. Federal and provincial programs are using the word agroforestry and are working it into their programs. It’s a great label. It gives credibility to [assorted] practices such as constructing windbreaks.”

Agroforestry research in the North and South is mutually beneficial. The University of Guelph has had projects in Indonesia, Sri Lanka, southern Africa, and Argentina, and scientists from the Third World regularly visit these projects to gather information valuable to their own countries.

“I’ve been doing some work in Argentina and what I learned there helped me with my work here,” Williams said. “It’s a really good interchange.”

These accomplishments in science and technology offer only a sampling of the research that has been carried out in the Third World for many years. They indicate the scope of research activities, and although the emphasis often differs from similar activities in the North, the North can nevertheless benefit from them. The next chapter examines these differences and benefits more closely.