Universal Health Coverage: The Promise of Definite Results in an Indefinite World

November 18, 2013

By Jeremiah Norris

In the run-up to the Affordable Care Act (ACA) in 2010, there was frequent comment across media channels that the United States was the only developed country without a universal health coverage system. The British National Health Service (NHS) was cited as a template and presented as a cohesive, single-payer system providing comprehensive services, free at the point of service. Though the NHS has undergone vast changes, its original form was held in amber by the same media outlets, which declined to mention how it had evolved into a pluralistic system reflective of its contemporary environment. Had US policy-makers taken note, they would have understood that what is enacted today can be dramatically altered tomorrow by emerging societal imperatives.

During the premiership of Margaret Thatcher (1979-1990), the government established the “internal market,” under which these reforms were enacted:

  • the NHS can purchase care for the citizenry from independently-run hospitals;
  • inefficient hospitals can be driven out of the NHS system due to competition;
  • private chains now run more than 100 hospitals that were formerly NHS;
  • General Practitioners, dentists, etc. now bill their private services to the NHS;
  • patients are assessed co-payments at points of service, e.g., optometry;
  • and there must be competing bids on specific NHS services.

The British United Provident Association (BUPA) puts to rest the notion that the NHS is a single-payer system. BUPA, a private health insurance carrier and provider of healthcare services, now enrolls 23% of the British population. Although this cohort can opt out of the NHS, its members nonetheless continue to pay the general taxes that support it. In effect, the NHS is used by BUPA members as a secondary carrier, often for high-end catastrophic care.

A significant portion of the U.S. population is already in single-payer systems: Medicare (52.3 million), Medicaid (57.5 million), and the Veterans Administration (8.8 million), for a total of 118.6 million patients consuming some 48% of national health expenditures.

Two key drivers of the ACA were the 47 million uninsured and pre-existing conditions. In 2012, the Federal Poverty Level (FPL) was $23,050; the FPL doesn’t include the value of food stamps, subsidized housing, education, or healthcare. The Kaiser Family Foundation determined that 11.3 million of the uninsured were at 250%-400% of the FPL. For whatever reason, they chose not to purchase health insurance; in many cases employer-sponsored coverage was offered at their places of work, but they declined to pay a portion of the monthly premium. Of the 35.3 million at 100%-249% of the FPL, many were eligible for Medicaid but failed to enroll due to stigma, a reluctance to fully disclose their assets, or simply not knowing that they were eligible for its benefits.

On pre-existing conditions, this country faced a similar problem with immunizations more than two decades ago. No vaccine is entirely free of adverse reactions in an extremely small cohort of those immunized. This led to many highly expensive court cases, forcing several US vaccine manufacturers out of the business.

The National Childhood Vaccine Injury Act of 1986 created the National Vaccine Injury Compensation Program (VICP), which began operating in 1988. It is a no-fault alternative to the traditional tort system for resolving vaccine claims, providing compensation to people found to be injured by certain vaccines; the U.S. Court of Federal Claims decides who will be paid. The program is funded by a $0.75 excise tax on single-antigen vaccine doses, while trivalent vaccines are taxed at $2.25 per dose.
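
The rule behind those two figures (my gloss; the excise tax is levied per disease a vaccine covers) can be written as:

$$\text{excise tax per dose} = \$0.75 \times n_{\text{diseases}}, \qquad \text{so a trivalent dose carries } 3 \times \$0.75 = \$2.25.$$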

Pre-existing conditions represent an issue that could have been addressed in the same manner as vaccine injuries, by building on the 35 states that now run High Risk Insurance Plans for persons who cannot otherwise buy this form of insurance. A national risk pool could be formed, one that removes the expensive tort system from the fiscal liabilities associated with adverse outcomes. Every health policy could carry a tax, assessed at points of service, to cover pre-existing conditions, with the government serving as the payer of last resort for catastrophic care.

Programs enacted by one Administration tend to be changed by political imperatives in follow-on Administrations. When Social Security was enacted in 1935, and Medicare in 1965, they were designed for a specific population group, mainly those aged 65 and older. During subsequent Administrations, key amendments were enacted: 1) Disability Insurance; 2) Supplemental Security Income; 3) End Stage Renal Disease coverage; and 4) the Medicare Prescription Drug Benefit. Millions of deserving beneficiaries now enter these programs well under the age of 65, some as infants.

Now it is posited that entitlement programs like Social Security and Medicare have to be reformed due to future unfunded liabilities, with no thought to the fact that, had they sustained their original purposes, these institutions would be solvent into the foreseeable future.

Like the NHS, which promised definite results in 1948, the ACA will soon encounter an indefinite world of political expediency shaped by future Administrations. As in the past, their options for continuing any previously funded program will be limited when they face uncharted obligations. They will have to raise revenues, mainly through increased premiums or higher deductibles and co-pays, or reduce benefits; usually both.

The World Bank’s Fierce Grasp of Yesterday

November 4, 2013

by Jeremiah Norris

Advocates of Universal Health Coverage (UHC) believe that the key to success can be seen in the World Bank’s new vision, as stated by its President in his address to the World Health Assembly in May 2013. He said: “We must be the generation that delivers universal health coverage,” while warning that UHC could become a “toothless slogan.”

Senior policy-makers at the Bank, and its sister organization, the IMF, should take this warning seriously. In 1999, the Bank conducted an audit of 107 of its health projects, finding: 1) “the Bank does not adequately assess borrower capacity to implement planned project activities; 2) notably lacking is an adequate assessment on demand for health services; 3) we know little about what the Bank has ‘bought’ with its investments”.

In 2009, it conducted another internal audit, based on the corrective measures it had set in place, finding that one-third of World Bank health, nutrition and population program loans met with unsatisfactory outcomes. The Bank had spent $17 billion on these initiatives.

Subsequently, the Guardian Weekly drilled deeper into the Bank’s audit, determining that: “despite the Bank’s raison d’être to end poverty, that was the specific objective of only 6% of projects and a secondary objective of 7%. Even when this was a stated objective, there was little monitoring of outcomes. Where it was done, few projects had achieved that goal … and much of the spending aids [the] richest 20% of people”.

The Bank conducts annual reviews through its Independent Evaluation Group (IEG). In a 2006 evaluation of a subsample of 25 Bank-assisted countries for which outcomes had been assessed, “only 11 reduced the incidence of poverty between 1990s and the early 2000s, while poverty either stagnated or increased in the remaining 14 countries”.

In 2007, the IMF published a report on health aid and infant mortality. It found that “despite the vast empirical literature considering the effect of foreign aid on growth, there is little systematic empirical evidence on how overall aid affects health, and none (to our knowledge) on how health aid affects health”.

The Bank’s funding is directed mainly to the public health sector of developing countries. Here it is burdened by high transaction costs, lack of ownership, and interest payments, and it competes with private resource flows that often carry none of these operational liabilities. Official Development Assistance (ODA) was once the dominant form of assistance to developing countries, but today, according to Hudson’s Index of Global Philanthropy and Remittances, “Nearly 80% of all DAC donors’ total economic engagement with the developing world is through private financial flows.”

An October report by the Bank on global remittances estimated that they are “expected to reach $414 billion in 2013, and $540 billion by 2015”. These flows are often used to purchase health and educational benefits. Sooner or later, every remitted hard currency ends up in a country’s central bank, where it is used to purchase goods and services that can only be procured with scarce foreign exchange.

In the post-colonial world of the early 1960s, most countries emerging into independence had virtually no private health sectors. In that environment, it was perfectly logical for the World Bank to lend into their public sectors. That world has changed dramatically. In the largest developing countries, private expenditure now accounts for a substantial share of total health spending: 69% in Bangladesh; 81% in Cambodia; 64% in China; 76% in India; 65% in Indonesia; 55% in Mexico; 75% in Nigeria; 73% in Pakistan; 62% in South Africa; 70% in Uganda; and 73% in Viet Nam.

The successful implementation of UHC is said to depend upon: “1) removal of direct out-of-pocket payments; 2) [a migration from] employment-based and contributory insurance models … to the public health sector; and 3) financial support through general government revenues”. Given the choice between low-cost private resources for healthcare and heavily burdened ODA for UHC, it shouldn’t be surprising that ministries of finance would choose the resource flow that provides them with increased levels of foreign exchange at the least cost to their societies, while at the same time improving their international credit ratings for borrowing.

The Bank’s president is prescient in his caution that UHC could end up as a slogan. Yet when the phrase was invoked in his new vision, To End Extreme Poverty by 2030, at the Bank’s Annual Meeting in October, it indemnified donors from performance accountability. Who could be against such a laudable goal? When cast against an ocean of inconsequential Bank and IMF audits, and against their leadership in forgiving $93 billion of debt for 40 poor countries through the Jubilee Program (the same countries that would be the main subjects of UHC), the Bank’s embrace of these concepts can be seen as a continuing tight grip on slogans, a retracing search for yesterday’s icebergs.

WHO’s Global Health Programs vs Independent Initiatives

October 8, 2013

By Jeremiah Norris

At Alma-Ata, Kazakhstan, in 1978, WHO was charged with a new program initiative, “Health for All by the Year 2000” (HFA). Broadly, its aims were to “bring health within reach of everyone in a given country. By ‘health’, it is meant a personal state of well being, not just the availability of health services. [It implies] a state of health that enables a person to lead a socially and economically productive life.”

Subsequently, bilateral and multilateral agencies, as well as numerous non-governmental organizations, supported its goals and funded them extensively through 2000 with billions of donor dollars. In 2001, WHO published “Macroeconomics and Health: Investing in Health for Economic Development” (MH). It was written by an impressive list of Commissioners, headed by Prof. Jeffrey Sachs, then with Harvard University, and funded at $2.5 million by WHO. The basic thrust of this new initiative was a formulaic expression: by “expanding coverage for essential health services to the world’s poor through scaling up resources,” poverty would be reduced, economic development accelerated, and global security advanced. This could save at least 8 million lives each year by the end of the decade, and, assuming that each DALY saved “gives an economic benefit of 1 year’s per capita income of a projected $535 in 2015, the direct economic benefit … would be $186 billion per year, and plausibly several times more”.
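
A back-of-the-envelope reading of those figures (my arithmetic, not the Commission’s) shows the scale of the assumption: dividing the claimed benefit by the value assigned to each DALY gives the number of DALYs assumed saved each year, and hence the DALYs attributed to each life saved:

$$\frac{\$186\ \text{billion per year}}{\$535\ \text{per DALY}} \approx 348\ \text{million DALYs per year}, \qquad \frac{348\ \text{million DALYs}}{8\ \text{million lives}} \approx 43\ \text{DALYs per life saved}.$$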

A critical component of HFA was Multisectoral Planning, in which WHO would engage all line ministries in its global effort. It re-emerged in MH as a recommendation for the formation of National Macroeconomic Commissions, co-chaired by ministries of finance and health. There isn’t any public record of these Commissions having been formed. MH contains no reference to HFA at all: not in the main text; not in Appendix 1, which lists the Commissioners, their biographical sketches, and the reports and working papers; not in Appendix 2; not in the References; not even in the Glossary!

MH calculated an economic benefit for the 8 million lives saved annually by essential health interventions, but the Commissioners walked away from the downstream implications by assuming that those saved would go on to a life of suspended animation, free of noncommunicable diseases (NCDs) and the costs associated with them. Once a child makes it beyond age 5, the unerring arrow of a costly NCD morbidity will strike somewhere along life’s path. Not much else in healthcare is certain; this is.

WHO never published a final report on HFA. Such a report would have generated a wealth of ‘lessons learned’; applied to the Commissioners’ 2001 work, this valuable resource could have substantially grounded their proposal in a legitimacy otherwise difficult to obtain.

Notwithstanding the lack of progress on these two initiatives, the 2005 World Health Assembly resolution and the 2012 UN General Assembly resolution supported the concept of Universal Health Coverage (UHC). Proponents of this concept believe that universal health coverage means “that people should have access to all the services they need, of good quality, without suffering financial hardship when paying for them”.

A key financial barrier to this coverage is co-payments by patients.  At the 66th World Health Assembly in May 2013, the President of the World Bank said: “anyone who has provided health care to poor people knows that even tiny out-of-pocket charges can drastically reduce their use of needed services. This is both unjust and unnecessary”.

His statement contravenes the Bank’s own report of 20 years ago, “Investing in Health”, which proposed the implementation of user fees, a policy adopted by many Member States. It is also at odds with WHO’s ranking of Members on healthcare: in every category, “Singapore is either the very best in the world or near the very top. Its out-of-pocket healthcare costs are the highest in the world”.

The predominant form of health financing in most developing countries is private. Out-of-pocket payments include co-payments by those covered by health insurance plans, by government retirees receiving medical care through social security-type systems (65% of total public health expenditures in Bolivia), and by members of private health plans. Their removal under UHC would require approval from ministries of finance, forcing them to find additional funding for public health through new taxes.

Alongside these two global initiatives, other programs were launched independent of direct WHO management. Two of the largest are PEPFAR (the President’s Emergency Plan for AIDS Relief), launched in 2003 with a $15 billion commitment from the U.S. and renewed in 2008 with an additional $53 billion, and the Global Fund to Fight AIDS, Tuberculosis and Malaria, “which in its recent replenishment was able to obtain $18.9 billion in new pledges”. Together, these two organizations expect to have 10 million AIDS patients under ARV treatment by the end of 2013. For the first time in the global fight against AIDS, the number of new infections has declined.

Rather than leading the way forward on global health, the two WHO initiatives followed past remedies: pour in money and stir. Money was used as a product for consumption when experience informed PEPFAR and the Global Fund that it needed to be used as a medium with a command power to create something of value between partners.

Fire in the Blood: Entitled to Its Own Opinions, Not Its Own Facts

May 3, 2013

Fire in the Blood, a film by Punjabi-Irish writer-director Dylan Mohan Gray, was first released at the Sundance Film Festival in January 2013. Since then, it has made blood boil on at least several continents. In Global Health Check in February 2013, Araddhya Mehtta blogged that the documentary, covering Africa before HIV/AIDS mass treatment programs got underway, “tells a harrowing story of inhumanity and heroism” and details how “millions upon millions of people, primarily in Africa were left to die horrible, painful deaths, while the drugs which could have saved them were being safely and cheaply produced and distributed just a short airplane ride away.”

She goes on to tell how the film accuses multinational drug companies and Western Governments of collaborating “to keep low-cost generic AIDS drugs out of the hardest hit countries.” According to Mehtta, film director Gray described the story as a “real-life David vs. Goliath tale, full of incredibly interesting, daring, courageous mavericks, who took on the world’s most powerful companies and governments to do what virtually everyone else at the time said was impossible (i.e. mass treatment of HIV/AIDS in Africa), and against all odds they won…”

Mehtta and Gray fail to mention some pertinent facts that present an altogether different picture.

First, throughout the 1990s the global health community and the Clinton Administration did not fund any overseas HIV/AIDS treatment programs. They advocated only preventive programs, such as testing, counseling, and condoms. The notion that Africans would agree to testing for the HIV/AIDS virus, with its associated stigma and no available treatment, was always puzzling to me. President Thabo Mbeki of South Africa was in denial of the entire AIDS epidemic. According to Rory Carroll, “He questions the link between viruses and AIDS, and believes that the correlation between poverty and the AIDS rate in Africa was a challenge to the viral theory of AIDS.” So it’s hard to see who the real “courageous mavericks” were in the 1990s and early 2000s.

It wasn’t until George W. Bush came into office and completely reversed the U.S. Government’s position against treating AIDS that the global health community changed its tune. President Bush ushered in treatment and created the well-known PEPFAR program in early 2003, which for the first time combined prevention and treatment for patients in poor countries hardest hit by AIDS. Only then, in December 2003, did the WHO initiate its own program that included treatment for the first time.

Secondly, for Mehtta to proclaim, without any examination of the evidence, that there were “safely and cheaply produced” drugs just a short airplane ride away is misleading. She seems to be referring to Indian companies that were producing knock-off HIV/AIDS drugs for export. It’s safe to say that the drugs were cheap, but not that they were safe, since India required no bioequivalence testing on drugs for export. The chickens came home to roost in May 2004, when the WHO recalled 36 Indian drugs from the market, mainly in Africa, because the Indian companies could not provide proof of bioequivalence. And while India was the epicenter of low-cost ARV production, it had as difficult a time providing treatment to its own patients as to those in the global arena. In October 2006, the International Treatment Preparedness Coalition criticized India for having 785,000 patients eligible for ARV treatment while only 6% of them were receiving it.

What Mehtta and the film do not acknowledge is the role of the U.S. Government’s FDA, which in May 2004, in an effort to increase the supply of AIDS drugs for poor countries, offered to accept ARV drug files for review from companies in any country that wanted FDA certification confirming that their drugs were valid ARV generics. In turn, U.S. companies, also wanting to increase the supply of AIDS drugs for poor countries, did not challenge the Indian companies for violating their patents. This made it possible for the U.S. Government to purchase these drugs and expand distribution, even though the price differentials between the Indian and U.S. ARVs were not that significant.

This successful and innovative program between the U.S. Government and companies helped increase the number of AIDS patients being treated from 400,000 in 2004 to more than 8 million by the end of 2012. Eighty percent of these patients were receiving Indian drugs, but this time they were drugs that had been tested for safety and efficacy. And finally, even before President Bush allowed treatment of AIDS patients in U.S. programs, and the global health community followed suit, U.S. companies spun into action with multi-country and single-country HIV/AIDS programs for vulnerable populations overseas. From 2000 through 2011, companies donated some $76 billion in drugs, often ARVs, and $9 billion in cash donations, including capacity building and physical infrastructure. These contributions are greater than the combined health budgets of the World Bank, WHO, and USAID over the same period.

The story of HIV/AIDS deserves a better documentary than Fire in the Blood if we are ever to learn from our mistakes for future pandemics. Yes, the story of HIV/AIDS is a sad story of unnecessary human suffering, but the film is missing the chapter on how global health experts endorsed only preventive programs, without treatment, for over a decade. Yes, ARVs were blocked from reaching Africa, but primarily by health policy makers, not by a conspiracy of governments and companies. Yes, it did take some courageous mavericks to set things right, but not the ones generally mentioned: rather President Bush, the U.S. FDA, Dr. Paul Farmer, who was the first to demonstrate that AIDS treatment could work in resource-constrained environments, and NGOs that stood their ground on the importance of both prevention and treatment. Documentary filmmakers should at least provide a balanced story and “do no harm” when they turn their cameras to life-and-death matters.

The Proposal for a Global R&D Convention: A Challenge to its Premises

December 4, 2012

On May 26, 2012 the World Health Assembly (WHA) adopted a resolution calling for an inter-governmental meeting to examine the proposals made in April by the Consultative Expert Working Group on Research and Development: Financing and Coordination (CEWG) to initiate a Global R&D Convention. It called for “open approaches to R&D, pooled funds, direct grants to companies in developing countries, prizes for milestones and end products, and patent pools”. The main recommendation of the CEWG was, however, more far-reaching: “to start multilateral negotiations for the possible adoption of a binding convention on health R&D”.

The concept of a Global Convention rests on three main premises:

  • 1) “the current R&D model, based on patents and market-oriented research, fails to generate new health technologies to face global challenges arising from existing health needs, particularly in developing countries;
  • 2) it is necessary to secure product access and affordability by delinking R&D costs from the prices of products; and
  • 3) voluntary financing cannot be the main or unique source of funding—a better, more sustainable and predictable financing model is needed”.

In April 2012, Médecins Sans Frontières (MSF) stated: “the current R&D system is driven by market forces, not health needs, and relies overwhelmingly on the patent system to recoup costs via high prices”.

How relevant are these premises to the issues raised by the Millennium Development Goals (MDGs), for which WHO was designated by the UN as the operative agency at their adoption in September 2000?

Ten years ago, researchers showed that “from 1975-1999, only 1.3 percent of new drugs and medicines were developed for neglected tropical diseases and tuberculosis despite the fact that these diseases accounted for 12 percent of the global burden of disease”.

Yet, when the eight MDGs were adopted in September 2000, WHO dismissed both neglected tropical diseases and TB. Goal #6, the only one with a specific disease target, read: “Combat HIV/AIDS, Malaria and other diseases”. The urgency now expressed by the CEWG over the high burden of tropical diseases was curiously absent in 2000, when WHO relegated them to a global potpourri of ‘other diseases’.

Is the low percentage of new drugs and medicines as important a factor in subsequent access and affordability, or is the number of drugs that actually came on the market in this period to diminish tropical diseases more important? Let’s look at a few examples.

In 1978, an R&D company introduced Ivermectin to combat an age-old scourge: onchocerciasis, or river blindness, and made this therapy available at no cost in perpetuity. The disease is the world’s second leading infectious cause of blindness and leads to reduced agricultural output. A World Bank evaluation looked at riverine communities that had left the land due to endemic blindness; once the disease “was controlled, 25 million hectares of land were returned to agricultural production, enough to feed 18 million people”.

In the 1980s, USAID sponsored a program to introduce Oral Rehydration Salts (ORS) as the treatment of choice for diarrheal disease. The product was developed by an R&D company and was considered by some to be “possibly the most important advance of this century”.

Although the R&D companies patented their respective therapies, neither enforced them when other non-R&D companies massively produced these products. Today, ORS is the preferred treatment for diarrheal diseases, saving the lives of millions of poor children.

In 1987, when the global AIDS epidemic was beginning its ascendancy and no interventions were in sight, one R&D company developed the first effective therapy: AZT. This sparked a revolution in product innovation by others: today MSF records 26 different therapeutic classes of ARVs, all of them covered by extant patents. Most are being produced by Indian firms as generics without fear of legal challenge from their rights-holders, and MSF has designated India the “pharmacy to the developing world”. WHO records that in 2012 more than 8 million AIDS patients were under life-extending ARV treatment, up from fewer than 50,000 in 2003.

In 2006, the Congressional Budget Office published a study on the average cost and time needed to successfully develop a new molecular entity: “In 2000, the total cost was $802 million, with an average time of 11.8 years to bring a product to market”. By 2012, that average cost had risen to $1.3 billion and the time to more than 12 years.
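
As a rough illustration (my arithmetic, not the CBO’s), the rise from $802 million to $1.3 billion over those twelve years corresponds to a nominal compound growth rate of roughly

$$\left(\frac{1.3}{0.802}\right)^{1/12} - 1 \approx 4.1\%\ \text{per year},$$

before any adjustment for inflation.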

Not all new drugs that reach the market after FDA approval actually find a market. In 2006, one R&D company introduced a new inhaler for asthmatics; by 2011, it had to conclude there was no market for it, writing down $2.3 billion in losses. An R&D Convention would most certainly produce some losers; how would they be covered?

The first order of business for the CEWG should be to ask: in a global health community increasingly focused on evidence-based programming, can the concept of an R&D Convention be sustained by its founding premises?

If the answers provide confidence to continue with the concept, then advocates need to propose a new regulatory agency with extraterritorial enforcement authority. It would have to address this question: is there a scientific justification for retaining FDA/EMEA pharmaceutical quality requirements more stringent than those of the proposed R&D Convention? Investors, public and private, would want assurances of this nature before they expend untold billions in the absence of a proof of concept.

Universal Health Coverage: A Reprise of Health for All by the Year 2000

November 9, 2012

WHO inaugurated the concept of Health for All by the Year 2000 at Alma-Ata, Kazakhstan, in 1978. According to the Development Assistance Committee (DAC), during the period 1990-2000 donor contributions totaled some $60 billion: $23 billion in health; $7 billion in population and family planning; and $30 billion in water supply and sanitation. More money was spent in the interim period of 1978-1990, which would bring total estimated spending to roughly $90 billion. These initiatives were launched by WHO in disregard of a large body of excellent research into the relationship between health, poverty, and growth in developing countries, research showing that economic development is the main driver of improved health and that pouring money into public health spending rarely solves the problem.

To show progress on the money spent, WHO expended another $2.5 million to produce a new report in 2001, Macroeconomics and Health: Investing in Health for Economic Development, under the direction of Prof. Jeffrey Sachs. The main text is entirely bereft of any reference to Health for All; there is no mention of it even in the appendices or the glossary. Nothing to inform the global health community of how the money was spent.

The basic thrust of the 2001 report was a formulaic expression: pour money in and stir. By expanding coverage for essential health services to the world’s poor through scaling up resources, poverty would be reduced, economic development accelerated, and global security advanced. This “could save at least 8 million lives each year by the end of the decade”.

However, previous research has clearly shown that pouring money into health is not the answer. That research could have provided more balance to the new initiative, lending a sense of imputed legitimacy to the propositions being advanced.

  1. In the World Bank Development Research Group’s Child Mortality and Public Spending on Health: How Much Does Money Matter?, the authors found that the major drivers of reductions in infant mortality are economic and educational; public health investments account for 5% of the decline;
  2. In Bulletin of the World Health Organization, the author wrote: “a 1997 examination of cross-national variation in child and infant mortality found that 95% of the differences could be explained by differences in income, income distribution, women’s education, ethnicity, and religion”;
  3. Furthermore, “public spending on health was statistically insignificant at conventional levels and total public spending explained less than one-tenth of 1 percent of the observed differences”. (for more details see here, here, and here)

None of these facts have deterred 110 civil society organizations in 40 countries from sending an open letter on November 2 to the World Bank, calling on it to advance Universal Health Coverage. They stated that what underlies all the letter’s demands “is that strong and equitable health systems are the key to achieving universal coverage”. The letter’s “first ask is for the Bank to help countries remove out-of-pocket fees”.

Within these two sentences, the organizations manage to establish a principle of equity, then subordinate it to a demand that the Bank remove a country’s sovereign right to impose fees on health services. They are challenging the Bank’s new president, Dr. Jim Kim, to change its traditional role as a bank into that of an implementing institution.

The removal of out-of-pocket payments would likely bankrupt many private health care systems in the developing world. In the largest countries by population, the private sector is the dominant provider of health care as a percentage of total national expenditures: 70% in Bangladesh; 82% in Cambodia; 65% in China; 83% in the Democratic Republic of the Congo; 60% in Egypt; 77% in India; 67% in Indonesia; 64% in Kenya; 55% in Mexico; and so on. In these countries, patients are exercising one of the most fundamental precepts of democracy: choice. Why should the World Bank be complicit in silencing their voices?

The civil society organizations advocating Universal Health Coverage cite Sierra Leone, a failed state, as the exemplar. There, the World Bank “played a helpful role, alongside other donors, in providing financial support to the country’s successful free care policy for pregnant women and children”!

Like Health for All, Universal Health Coverage requires that donors support the initiative with ever-increasing resource flows. However, the donors are deep in debt, trying to forestall a journey over the fiscal cliff in their own health care systems. If the United States were to remove out-of-pocket payments in Medicare and Medicaid, financial chaos would result, affecting 16% of the US economy alone while roiling through the service sector with untoward macroeconomic consequences.

The failure of Health for All and of the 2001 WHO report on macroeconomics and health reinforces the notion among skeptics that donors have a major influence on allocating resources to new program initiatives but only a limited ability to stay the course of their actions. Neither initiative has earned a footnote from the 110 civil society organizations now advancing the notion of Universal Health Coverage.

What Wind Blew those NCDs Hither?

October 25, 2012

On September 19, the Center for Global Development (CGD) asked: One Year Later, What Happened to Noncommunicable Diseases (NCDs)? It cited the UN General Assembly’s 2011 resolution adopting a 13-page “political declaration” to address the prevention and control of non-communicable diseases worldwide. Yet no measurable goals to reduce NCDs, such as targets for reducing global mortality or increasing access to medicines, were agreed upon until a year later, at the 2012 World Health Assembly (WHA).

Earlier, it seemed NCDs were gaining traction. A February 2011 report by the Center for Strategic & International Studies (CSIS) positioned this emerging issue on the global health agenda, stating the need to focus and leverage existing assets. In April 2011, the First Global Ministerial on Healthy Lifestyles and Noncommunicable Diseases convened in Moscow to galvanize support and provide policy guidance for the forthcoming UN High-Level Meeting on NCDs in September 2011. Subsequently, WHO drafted the ‘Moscow Declaration’, placing itself at the global epicenter of NCD prevention and control.

More than a year has passed, and the upward trajectory of action has not been perceptible. The CGD looked at this stagnation and asked whether it was caused by the bad economic climate or by a lack of political attention. The answer is neither. Rather, those advocating for NCDs can be said to have based their assessments on the Columbus Effect: the donor community’s “discovery” of NCDs. Donors are giving them attention as if NCDs were a newly discovered continent. Yet reliable sources identified their emergence decades ago, and developing countries themselves invest heavily in building and operating hospital systems to address them.

Since 1984, the World Bank has reported that developing countries expend the majority of their national health resources on hospital-based services, largely for patients above the age of 15 with chronic conditions. Many of these facilities have earned accreditation from the highly reputable Joint Commission International (JCI), which has accredited more than 300 public and private health care organizations in 39 countries. Most are in aid-assisted countries: 45 are in Turkey, 25 in Brazil, and 17 in India. Others with more than two facilities are located in Bangladesh, China, Costa Rica, Ecuador, Egypt, Ethiopia, Indonesia, Jordan, South Korea, Lebanon, Malaysia, Mexico, Pakistan, the Philippines, Thailand, Viet Nam, and Yemen.

Developing countries have also become health care attractions for citizens of donor countries. Medical tourism is a major multi-billion-dollar industry in Malaysia, India, and Thailand, each one an aid-assisted country. The largest provider, Apollo Hospitals, is based in India and frequently collaborates with Johns Hopkins International. Medical tourism is India’s largest service sector, with estimated revenue of $35 billion, constituting 5.2% of GNP and employing 4 million people. By the end of 2012, it was expected to be growing at 15% per annum, on the way to revenues of $78.6 billion, 6.1% of GDP, and 9 million employees.

While donors were pouring more resources into communicable diseases, they failed to notice that recipient countries were expending large portions of their national health allocations on hospitals for chronic care. In 1984, the World Bank recorded that Malawi was spending 81% of its total public recurrent health expenditures on hospital care; the figure was 75% in Jordan; 74% in Lesotho; 73% in Kenya; 72% in Jamaica; 71% in the Philippines; 71% in Sri Lanka; 70% in Somalia; 68% in Brazil; 67% in Colombia; and 54% in Zimbabwe. Of the countries listed above, “all use at least 70% of their national health resources on adult and elderly patients”.

While developing countries have long been stepping up to the plate in combating these diseases, the WHO, in contrast, has recommended that the donor response to NCDs be limited to only four conditions: cardiovascular diseases, cancer, diabetes, and chronic respiratory diseases.

These limitations are contrary to the extant clinical standards of most developing countries, which have constitutional guarantees of open and free access to healthcare. Most importantly, they reflect donor priorities, attempting to force-fit them onto efforts already under way in the countries themselves. They represent the values of ‘discoverers’ ring-fencing their newly found possessions around indigenous institutions.

In the end, hospitals will continue to absorb the largest share of national health expenditures, independently of anything the donor health community does with its recent ‘discovery’ of these diseases. If WHO is successful in guiding donor support for NCDs, then it will have to post this notice in public hospitals:

If you have a cardiovascular disease, cancer, diabetes, or a chronic respiratory disease, with one of the four designated risk factors, welcome! Otherwise, please move on to one of our nation’s local hospitals, which offer comprehensive NCD prevention, care and treatment.

Such a policy outcome from the WHO recommendations has no clear or fair rationale. It is unlikely to resonate with the professional medical societies in the developing world which, in most countries, are under the purview of ministries of higher education and provide clinical staffing for NCDs to their major hospitals. Perhaps NCDs would not have blown away, had collaborative strategies been discussed at first ‘discovery’ by donors.