Abstract
In sub-Saharan Africa, arid soil coupled with a rapidly increasing demand for food has driven the development of small- and large-scale irrigation schemes. Irrigation development has the potential to either increase or decrease local malaria infection rates. This paper uses two conflicting case studies to identify four factors that largely control the effect of irrigation development on malaria infection rates: 1) the baseline (pre-irrigation) characteristics of malaria transmission, 2) the baseline length of seasonal malaria outbreaks as determined by temperature and rainfall patterns, 3) the local population composition and distribution and 4) the effect of the irrigation scheme on the population’s socioeconomic status. As governments, farmers and citizens increasingly fund and support irrigation development as a way to increase food security and promote economic growth, steps should be taken to foresee the effects that proposed projects may have on local malaria infection rates.
Introduction
Malaria kills more than one million children annually. The parasitic disease also hinders the physical and social development of children, decreases worker productivity and household income, suppresses economic growth, affects the movement of people and contributes to the burden of other ailments within populations. Malaria remains both a cause and a result of poverty, and the poorest 40% of the world’s people are chronically at risk of contracting the disease (see appended Figure 1). Unrestricted access to insecticide-treated bednets (about US$10 per net) has decreased all-cause under-5 mortality by 60% within some African countries (1).
Fully 90% of worldwide malaria deaths occur in sub-Saharan Africa, where arid soil coupled with a rapidly increasing demand for food has driven the development of small- and large-scale irrigation schemes. Nearly 50% of sub-Saharan Africa’s land receives too little precipitation to sustain rain-fed agriculture and, as of 2001, irrigation schemes existed on only 4% of Africa’s arable land. Between 1990 and 2020, the area of irrigated land in sub-Saharan Africa is projected to increase by 33% (2). Small- and large-scale irrigation development therefore has the potential to influence local malaria infection rates.
Analyzing the effects of irrigation development on local malaria infection rates demands that we consider many geographic and demographic factors. I will approach the issue by 1) providing a brief epidemiological background of malaria as it pertains to irrigation development, 2) considering two sharply conflicting case studies which demonstrate that irrigation development can decrease or increase local malaria infection rates and 3) identifying and offering a potential resolution to the controversy which emerges from the aforementioned case studies. Irrigation development affects local malaria rates differently depending primarily upon the baseline (pre-irrigation) characteristics of malaria transmission, the baseline length of seasonal malaria outbreaks as determined by temperature and rainfall patterns, the local population composition and distribution and the effect of the irrigation scheme on the population’s socioeconomic status.
Epidemiology of Malaria and Irrigation Development
Humans and mosquitoes play essential roles in the malaria cycle (see Figure 2). The disease is transmitted to humans when an infected female Anopheles mosquito takes a blood meal and parasites enter the human’s bloodstream. Of the four types of human malaria parasite, Plasmodium falciparum is the most deadly and also the most common in sub-Saharan Africa.
Irrigation systems in Africa mainly support the growth of rice, sugar cane and cotton. Rice is grown in flooded fields, which provide ideal breeding sites for mosquitoes and often significantly increase the Anopheles population. Cotton, sugar cane and wheat irrigation schemes can likewise increase Anopheles populations when systems are not properly maintained and operated (2). An increase in the Anopheles mosquito population density has the potential to cause a surge in infection rates.
Within sub-Saharan Africa, the prevalence of malaria varies greatly, but a strong correlation exists between a region’s annual rainfall and the length of its longest season of transmission (see Figure 3). Much of sub-Saharan Africa experiences a hot, dry season, in which malaria transmission rates are significantly lower than during the cooler, rainy season. This is largely because P. falciparum dies inside the mosquito when exposed to extreme heat. In fact, a study in The Gambia found that when daytime temperatures reached 40°C, few infective mosquitoes were captured (3).
Interestingly, irrigation development can either shorten or extend the length of a population’s longest malaria transmission season. For instance, the introduction of basic treadle pump irrigation, in which a simple foot-operated suction pump, often built from bamboo, draws water from shallow aquifers, raised annual incomes by more than US$100 per year for the average participating household in eastern India and Bangladesh. The length of the longest transmission season significantly decreased in areas using the pumps, while annual rainfall patterns remained relatively constant. Furthermore, and more importantly, overall local malaria infection rates decreased nearly threefold in the regions that economically benefited from using the treadle pumps (4).
By contrast, the infamous Gezira-Managil irrigation scheme of Sudan extended the length of the region’s longest malaria transmission season by permitting the irrigation of wheat throughout the dry season and thus creating new mosquito breeding grounds. Before irrigation development, the land of Gezira was mostly left unsown during winter months and malaria was not a problem during any part of the year. However, the Gezira scheme led to a staggering 20% increase in the local malaria infection rate, and the disease began to plague the population year-round (5).
The Gezira scheme, introduced into an area of low baseline malaria transmission, draws attention to the epidemiological role that a lack of immunity to P. falciparum can play. When irrigation development occurs in low-malaria or malaria-free areas, such as some of the desert fringes and highlands of sub-Saharan Africa (see Figure 3), malaria infection rates often increase because the population lacks immunity to the parasites (2). While an increase in Anopheles population density instigated a malaria problem in Gezira, large Anopheles populations can also coincide with lower local infection rates, due to factors that the first case study will now explain.
Case Study One, Lower Moshi, Tanzania: Irrigation development associated with lower rural malaria infection rate
A study conducted by Ijumba et al. (6) examined three relatively isolated villages located on the foothills of Kilimanjaro in the Lower Moshi area of Tanzania. Each rural village practiced a single type of agriculture and the study found that the two villages which utilized irrigation had the lowest malaria infection rates. Infection rates in children under 5 years of age were determined for the three villages which practiced 1) traditional, rain-fed maize agriculture (TMA), 2) irrigated sugarcane agriculture (ISA) and 3) irrigated rice agriculture (IRA). Within the ISA and IRA villages, the prevalence of malaria was lower than in the TMA village. A significantly higher Anopheles mosquito density was recorded in the two villages with irrigation, and the mosquito density was especially high in the flood-irrigated IRA village. The annual infection rates were recorded as follows: TMA 29.4% infected, ISA 16.9% and IRA 12.5% (6).
During the rainy season, malaria rates significantly increased in the TMA village, slightly increased in the ISA village and did not increase in the IRA village. While the community with rice irrigation had a relatively constant incidence of malaria year-round, the malaria pattern of the maize-producing community was closely correlated with the region’s seasonal rainfall distribution (6).
Several factors may explain why the IRA village, despite its high Anopheles population density, demonstrated resistance to the seasonal malaria outbreaks that negatively affected the other two villages. Numerous studies have shown that ricefield irrigation communities in sub-Saharan Africa and Southeast Asia often have relatively high levels of access to anti-malarial medicines, insecticide-treated bednets and medical care (2). Rural ricefield irrigation development often economically benefits the population near the irrigation scheme. In the Lower Moshi case study, for instance, the rice-producing village was indeed the wealthiest of the three studied and 65% of people living in the IRA village were rice farmers. Improved nutritional status could also have played a role in the IRA community’s ability to resist seasonal malaria outbreaks and the disease in general. The children of the IRA village were heavier and taller than the children of the maize-producing TMA village and likely had access to more nutritious foods (6).
While the IRA village was found to be significantly wealthier than the TMA village, a drastic economic divide was not found when comparing the IRA and ISA villages. However, the ISA village used considerably fewer insecticide-treated bednets than the IRA village. In addition, migrant workers from outside regions may have increased the malaria rate in the sugarcane-producing village. The study found that migrants, who were employed by the local sugarcane industry, composed more than 60% of the ISA village’s population. Immigration can significantly contribute to malaria transmission, especially when immigrants introduce new strains of parasites that are resistant to anti-malarial drugs and foreign to the local population (7).
The comparative analysis of the three villages in the Lower Moshi area is not without flaws. First, the TMA population, which was interpreted as having high malaria rates due to its relatively poor socioeconomic status, is located near a dam; the dam may contribute to the local Anopheles population density and thus inflate the TMA village’s infection rate. Second, without local baseline infection rate information, the effect that the introduction of irrigation development had on malaria rates in the IRA and ISA villages cannot be ascertained; perhaps malaria rates were lower in the IRA and ISA communities even before irrigation development occurred. However, the Lower Moshi study remains useful because it demonstrates that, on a regional scale, a high density of potentially infective Anopheles mosquitoes can be negatively correlated with the incidence of malaria infection; here, irrigation development was associated with lower local infection rates.
Case Study Two, Kumasi, Ghana: Irrigation development associated with higher urban malaria infection rate
In Ghana, malaria accounts for 45% of hospital visits and nearly 25% of under-5 mortality (8). A study conducted by Afrane et al. (8) in Ghana’s second largest city classified ten select areas of Kumasi into three land types and examined the rate of infection within each land type. Malaria rates were determined through adult sampling and household surveys in 1) urban areas without agriculture (UW), 2) urban areas with agriculture (UA), and 3) peri-urban areas with some rain-fed agriculture (PU). As in the Lower Moshi case, denser Anopheles populations were found near the irrigation schemes. In Kumasi, however, a significantly higher incidence of malaria occurred near the dense Anopheles populations: five times more malaria occurred on UA land than on UW and PU land during Kumasi’s dry season (8).
The positive correlation between Anopheles density and malaria prevalence may be due to socioeconomic factors. For instance, people living on UW and PU land owned far more mosquito screens than people living on UA land. The mosquito-screened windows and doors, which the researchers studied in detail, offered significant protection against malaria for all age groups (8).
Additionally, people who contracted malaria while living on UA land were, by and large, residents who had no involvement with the nearby irrigation development. Urban malaria is more complex and heterogeneous than rural malaria because the urban poor often live within isolated pockets of malaria. The urban poor often suffer when living near irrigation development because they do not benefit from the agricultural operations and therefore do not gain access to anti-malarial measures (9).
Hydro-geographic factors could also help explain the positive correlation between Anopheles density and malaria prevalence in Kumasi, where many farmers involved with irrigation occupy lowlands and obtain water by redirecting streams into manmade wells. The wells undoubtedly provide ideal breeding grounds for Anopheles mosquitoes. However, because the irrigated urban agriculture of Kumasi is situated in the low-lying, wettest parts of the city, the lack of known baseline malaria infection characteristics is especially detrimental to this case study: the UA land would almost certainly have had the greatest vector potential even if no irrigation development had occurred on any of the studied areas. Additionally, the analysis of the data failed to consider the location of any study area in relation to the Subin, an urban river which runs through Kumasi (10).
A demographic factor which likely affects local malaria infection rates in Kumasi is immigration. Kumasi, with its population of 1.2 million people, is central to a growing timber industry, as evidenced by the increasing number of sawmills, plywood plants and furniture factories within the city. The industrial growth and its associated labor force have attracted many immigrants from the Ashanti region of Ghana (11). Ashanti has a high incidence of malaria, and drug-resistant strains of Plasmodium falciparum thrive in the region. In 2004, a study conducted during the rainy season found that 48.8% of the population living in Ashanti was infected with P. falciparum (12). Perhaps the low-income community near the irrigation scheme attracted more migrants than the higher-income areas considered in the case study.
A thorough demographic analysis is required to determine the effect of migration on malaria transmission in the land-type areas of Kumasi. Unlike the Lower Moshi study, the Kumasi study did not consider migration or attempt to determine the composition of any sampled populations. However, despite its several shortcomings, the Kumasi case study demonstrates that, on a regional scale, a high density of infective Anopheles mosquitoes can be positively correlated with the incidence of malaria infection; here, irrigation development was associated with higher local infection rates.
Controversy
Under what circumstances does irrigation development decrease local malaria infection rates? Investigation of this controversial question through the two conflicting case studies revealed the complexity of the inquiry. Without knowing a population’s baseline malaria infection characteristics, the question proves impossible to answer in retrospect.
Research strongly suggests that the geographic and demographic characteristics of an area are paramount in determining how the introduction of irrigation development will affect the local population’s malaria infection rate. Given that malaria kills nearly one million African children each year (1) and that (between 1990 and 2020) the area of irrigated land in sub-Saharan Africa is projected to increase by 33% (2), the ability to make logical and accurate predictions regarding how an irrigation development project will alter malaria infection rates within a population could save thousands of lives within sub-Saharan Africa alone.
Possible Resolution of Controversy
When an irrigation scheme is planned for an area, malaria-related irrigation development research must be considered. Determining whether or not an added source of stagnant water is likely to induce a decrease in local malaria infection rates requires consideration of the aforementioned geographic and demographic factors: 1) the baseline characteristics of malaria transmission, 2) the baseline length of seasonal malaria outbreaks as determined by temperature and rainfall patterns, 3) the local population composition and distribution and 4) the effect of the irrigation scheme on the population’s socioeconomic status.
When the baseline rate of malaria infection within a community is very low, as it was in Gezira, Sudan, the local population generally has limited immunity to P. falciparum, and an increase in the Anopheles mosquito population density should be expected to raise local infection rates, as indeed occurred in Gezira. Even if irrigation development economically empowers a population with a low baseline infection rate and allows individuals with limited immunity to access anti-malarial measures, the incidence of malaria cannot decrease significantly below the already low baseline rate.
The seasonal distribution of malaria outbreaks (see Figure 3) within a population plays a key role in determining whether or not a local infection rate decrease will occur. If the irrigation scheme will provide mosquitoes with stagnant water during a naturally dry, low malaria season and individuals employ the same protective measures against Anopheles bites before and after the irrigation development occurs, then infection rates may be expected to increase. This phenomenon may have occurred in Kumasi, where the second case study found the local rate of malaria infection near the urban irrigation scheme to be especially high during the dry season compared to non-irrigated areas. The local infection rates were more consistent across Kumasi during the cooler, wet season (8).
Alternatively, if accompanied by increased access to anti-malarial measures, irrigation development can decrease dry season infection rates despite the increased volume of stagnant water and higher than baseline Anopheles density. This phenomenon occurred when treadle pump irrigation was introduced to regions of India and Bangladesh (4).
The local population composition and distribution also play a role in determining how irrigation development will affect local malaria infection rates. For instance, if an irrigation scheme is likely to attract migrant workers or is introduced into an area with migrant workers, drug-resistant strains of P. falciparum may increase local infection rates (7).
In addition, the identity of the population living near the irrigation scheme must be considered. In Lower Moshi, Tanzania, over half of all community members were farmers. In Kumasi, however, while urban agriculture produces 90% of the lettuce, cabbage and spring onions eaten in the city, those living near the irrigation schemes were not involved with the agriculture. This dichotomy has socioeconomic significance, as discussed below. Of all the important geographic and demographic considerations, population composition may prove the most difficult to account for when predicting the effect of a proposed irrigation development.
The effect that irrigation development will have on a community’s socioeconomic status is perhaps the single most critical determinant of whether or not the scheme will decrease local malaria infection rates. For instance, if the population living in the vicinity of a proposed irrigation scheme will fail to benefit from the agriculture, then local malaria rates should not be expected to decrease. However, if the population would benefit from the increase in agricultural production, then local malaria rates may be expected to decrease due to improved access to anti-malarial measures and a more nutritious diet.
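These four considerations can be read as a simple qualitative screening checklist. The sketch below is an illustration only: hypothetical Python in which every name, field and weight is invented for this paper rather than drawn from any cited study. It merely makes explicit the direction of effect that the preceding discussion attributes to each factor.

```python
from dataclasses import dataclass

@dataclass
class IrrigationProposal:
    """Qualitative inputs for one proposed scheme (all names are illustrative)."""
    baseline_transmission: str              # "low" (e.g. desert fringe, highland) or "endemic"
    extends_dry_season_breeding: bool       # will stagnant water persist through the dry season?
    residents_benefit_economically: bool    # do nearby residents share in the agricultural income?
    attracts_migrant_workers: bool          # in-migration of workers, possibly with new parasite strains

def expected_malaria_trend(p: IrrigationProposal) -> str:
    """Return the qualitative direction of change suggested by the four factors.

    The weights are arbitrary and chosen only to reflect the argument in the text,
    not any measured effect sizes.
    """
    risk = 0
    # Factor 1: low baseline transmission implies little acquired immunity (Gezira).
    if p.baseline_transmission == "low":
        risk += 2
    # Factor 2: new dry-season breeding sites lengthen the transmission season.
    if p.extends_dry_season_breeding:
        risk += 1
    # Factor 3: migrant influx can import drug-resistant strains.
    if p.attracts_migrant_workers:
        risk += 1
    # Factor 4: economic benefit improves access to bednets, drugs and nutrition
    # (Lower Moshi, treadle pumps), which can offset the added vector density.
    if p.residents_benefit_economically:
        risk -= 2
    if risk > 0:
        return "infection rates likely to increase"
    if risk < 0:
        return "infection rates may decrease"
    return "no clear direction; baseline field data needed"

# Example: a Gezira-like scheme (low baseline immunity, dry-season water, little local benefit).
print(expected_malaria_trend(IrrigationProposal("low", True, False, True)))
```

Such a sketch cannot substitute for field measurement of baseline transmission; it only shows how the four factors combine in the argument above.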
As African governments, farmers and citizens increasingly fund and support irrigation development, steps should be taken to foresee the effects that proposed schemes may have on local malaria infection rates. The primary purpose of introducing irrigation into an area is to produce more food, not to decrease the prevalence of malaria. However, basic research into an irrigation development proposal, including examination of baseline infection characteristics, may offer qualitative yet life-saving insight.
References
1. Sachs, J. and Malaney, P. (2002) “The Economic and Social Burden of Malaria,” Nature, Vol. 415, No. 7.
2. Ijumba, J. and Lindsay, S. (2001) “Impact of irrigation on malaria in Africa: paddies paradox,” Medical and Veterinary Entomology, Vol. 15, No. 1.
3. Lindsay, S.W. et al. (1991) “Ability of Anopheles gambiae mosquitoes to transmit malaria during the dry and wet seasons in an area of irrigated rice cultivation in The Gambia,” Journal of Tropical Medicine and Hygiene, Vol. 94, No. 1.
4. Shah, T. et al. (2000) Pedaling out of Poverty: Social Impact of a Manual Irrigation Technology in South Asia. International Water Management Institute.
5. Keiser, J. et al. (2005) “Effect of irrigation and large dams on the burden of malaria on a global and regional scale,” American Journal of Tropical Medicine and Hygiene, Vol. 72, No. 4.
6. Ijumba, J. et al. (2002) “Irrigated crop production is associated with less malaria than traditional agricultural practices in Tanzania,” Transactions of the Royal Society of Tropical Medicine and Hygiene, Vol. 96, No. 1.
7. Oaks, S.C. et al. (1991) “Malaria: Obstacles and Opportunities,” A Report for the Committee for the Study on Malaria Prevention and Control.
8. Afrane, Y. et al. (2004) “Does irrigated urban agriculture influence the transmission of malaria in the city of Kumasi, Ghana?” Acta Tropica, Vol. 89.
9. Klinkenberg, E. et al. (2006) “Urban malaria and anemia in children: a cross-sectional survey in two cities of Ghana,” Tropical Medicine and International Health, Vol. 11, No. 5.
10. Obiri-Danso, K. et al. (2005) “Aspects of health-related microbiology of the Subin, an Urban River in Kumasi, Ghana,” Journal of Water and Health, Vol. 3.
11. Konadu, K. (1991) “Reflections on the Absence of Squatter Settlements in West African Cities: The Case of Kumasi, Ghana,” Urban Studies, Vol. 28.
12. Marks, F. et al. (2004) “High prevalence of markers for sulfadoxine and pyrimethamine resistance in Plasmodium falciparum in the absence of drug pressure in the Ashanti region of Ghana,” Antimicrobial Agents and Chemotherapy, Vol. 49, No. 3.
13. Centers for Disease Control and Prevention (2004) “Worldwide Distribution of Malaria.”
14. Roll Back Malaria (2001) “Malaria cycle, figure adapted by author,” <www.unep.no/aeo/images/fig2a1s.gif>
15. Left panel: Africa Environment Outlook, United Nations Environment Programme (2002) “Annual Rainfall in Africa.” Right panel: Roll Back Malaria, United Nations (2001) “Length of longest malaria transmission seasons in Africa,” <http://www.rbm.who.int>
Eline Boelee
Nice paper Nicholas, could you send me a pdf?
I have been working on this topic for some time too. A recent related paper has just been published online: Kibret S, Alemu Y, Boelee E, Tekie H, Alemu D, Petros B (in press) The impact of a small-scale irrigation scheme on malaria transmission in Ziway area, Central Ethiopia. Tropical Medicine and International Health. doi:10.1111/j.1365-3156.2009.02423.x
Regards, Eline