Making Patients Pay
U.S. Health System Puts Profits First
This article is from the May/June 2001 issue of Dollars & Sense: The Magazine of Economic Justice, available at http://www.dollarsandsense.org/archives/2001/0501frank.html
When Cynthia Herdrich visited her primary-care physician complaining of abdominal pain, the doctor scheduled an ultrasound for eight days later and sent her home. Shortly thereafter, Herdrich's appendix ruptured, causing a life-threatening infection and requiring emergency surgery. Only during the subsequent malpractice suit did Herdrich discover that her insurer, Carle Care HMO of Urbana, Illinois, instructed its doctors to delay, as a matter of course, diagnostic tests for seven days in the hope that either the symptoms or the patient would go away. Physicians who saved the plan money received year-end bonuses.
Herdrich's case, which became national news when the Supreme Court ruled that she could not sue the health maintenance organization (HMO), exemplified everything Americans have come to hate and fear about the medical system: the greedy insurers concerned more with saving money than saving lives; the deceitful providers secretly committed to withholding care to boost their own incomes; the maddening laws allowing insurance companies to dictate medical decisions, but shielding them from legal liability when things go wrong. Above all, though, Herdrich's case—in which the HMO allowed a serious but uncomplicated condition to progress to a dangerous situation requiring lengthy hospitalization and extensive surgery—highlights the sheer wastefulness of managed care.
The Bad Old Days of Fee-For-Service Medicine
It wasn't supposed to be like this. When large employers began contracting with managed-care firms (insurance plans that "manage" patient treatment, usually by requiring that procedures be ordered only through a primary-care doctor and pre-approved by the insurer) in the late 1980s, boosters argued that the HMOs would restrain costs and wring waste out of the medical system by emphasizing preventive care and proven remedies.
Few denied that the system needed fixing. By the end of the 1980s, the U.S. health-care system was the world's most costly, even though millions of Americans went without medical care altogether. Most people obtained medical insurance privately through their employer. Medicare and Medicaid, federal programs enacted in 1965, covered the elderly and the very poor. Until recently, private insurance plans, as well as the federal health programs, were organized as fee-for-service plans. Consumers chose a doctor, the doctor prescribed treatments and set a fee, and the insurer paid the bills.
Under the fee-for-service system, physicians were quick to recognize that their incomes rose directly with the number of services they performed. American doctors performed invasive tests and procedures at rates far exceeding international norms. Caesarean sections, rare in Europe, accounted by 1990 for nearly a quarter of all births in the United States. American gastroenterologists treated ulcers with invasive tests, ineffectual antacids, and frequent office visits long after doctors abroad had shown that most ulcers were easily treatable with antibiotics. U.S. surgeons performed hysterectomies and tonsillectomies at rates far above their international counterparts. In the 1980s, John Wennberg, author of the annual Dartmouth Atlas of Health Care, found that in some regions, two-thirds of all children had had their tonsils removed.
"Kickbacks" for diagnostic tests and referrals encouraged frequent and extensive testing. Physicians often sent patients for X-rays and other tests at facilities they themselves owned. In 1989, Congress directed Medicare to withhold payment for tests using equipment owned by the referring doctor, but such self-referrals continued to plague the industry, driving up costs and feeding a widespread disgust with the medical profession.
From 1960 to 1990, increases in physicians' incomes outpaced inflation virtually every year and overall health-care expenditures rose two to three times faster than the nation's income. The average medical doctor in the United States earns $200,000—seven times the average annual wage. Doctors' groups like the American Medical Association (AMA) opposed virtually every effort to reform the system. In a 1993 survey commissioned by the AMA, 44% of those polled agreed that "doctors think they are better than other people," 69% said doctors were too interested in money, and 67% believed doctors' fees were too high. Seventy percent said they were beginning to "lose faith in doctors."
The Managed Care Revolution
By the close of the 1980s, medical-insurance premiums were rising by 15% per year. Critics noted that the United States spent more money per person on health care than any other country, yet in many cases fared worse on measures of health like life expectancy and infant mortality. Over 40 million U.S. residents, most of them working people, have no medical coverage at all because their employers do not provide coverage and they cannot afford to pay the skyrocketing premiums. Bill Clinton's 1992 presidential campaign promised health-care reform. With the defeat of the Clinton health plan, however, "market forces" swept through the industry, hawking managed care to employers incensed by the soaring cost of medical insurance.
Employers balked at a system under which doctors raised their fees, insurers raised their premiums, and businesses paid the freight. Many firms cut employee health coverage altogether or passed costs on to their employees, but this strategy had limitations. Federal legislation generally requires that firms provide fringe benefits like health insurance to all employees—executives and factory operatives alike—or to none. Furthermore, employment-based medical coverage tied workers to their jobs. Despite spiraling costs, employer organizations like the National Association of Manufacturers consistently oppose national health programs, even as they object to the cost of private insurance.
As a group, employers tend to place great faith in profit-driven enterprise to solve social problems, so they were particularly receptive to promises by insurers that managed care could trim the fat from the medical system. In 1985, fewer than one in ten U.S. residents was enrolled in a managed-care plan. Today 90% of those with employment-based health insurance have managed-care plans. Also known as HMOs, managed-care firms originated as non-profit pre-paid plans maintaining their own clinics and salaried physicians. Today, most HMOs are operated by for-profit insurance firms like Aetna, whose US Healthcare HMO enrolls 10% of HMO members nationally.
The appeal of managed care to employers is simple. The HMO contracts with a group (or "network") of physicians, diagnostic labs, and hospitals to provide care for all of its members. Providers in the network, guaranteed the business of the HMO members, agree to accept sharply discounted reimbursement rates. HMO members are generally not allowed to visit providers outside the network. In addition, members must obtain their primary-care doctor's consent before using any other provider—even emergency rooms. The HMO saves money both by negotiating discounted fees and by restricting the services that members use.
Doctors, labs, and hospitals need insured patients, but insurers also need providers. To strengthen their respective bargaining positions, both providers and insurers are merging, combining, and reorganizing at a furious pace. Insurers were first to recognize the advantages of concentration. The early 1990s saw a wave of mergers and acquisitions among health insurers that left large regions of the country with only two or three competing health plans. Their superior bargaining power allowed insurers to negotiate sharp reductions in fees, which were passed on to employers in the form of lower premiums. In 1994, the average health-insurance premium fell for the first time in years; premiums increased at or below the inflation rate for the rest of the 1990s.
Hospitals, facing lower reimbursement rates, cut staff and beds for traditional inpatient care while expanding facilities for expensive services like outpatient surgery. Still, hospitals throughout the country suffered operating losses. Large urban hospitals in low-income areas were especially hard-hit, because federal law requires hospitals to treat anyone who shows up at the emergency room. For-profit hospital chains—promising deep pockets, access to capital markets, and the market clout to bargain with insurers—moved in quickly, buying up scores of non-profit community hospitals.
Founded in 1988, HCA Healthgroup, the nation's largest for-profit chain, now owns around 300 hospitals. Tenet Healthcare, owner of 110 hospitals in 17 states, has recently gone on an acquisition spree, targeting hospitals, says the Wall Street Journal, "in financial trouble, a plight that characterizes more than one-third of the nation's 5000 hospitals."
In many cases, the sale of non-profit hospitals enriches the physicians and hospital board members who broker the deals. David Himmelstein, of the Ad Hoc Committee to Defend Health Care, says "it's not clear to what extent for-profit hospital conversions are driven by economic factors like access to capital markets and to what extent the managers of non-profits are virtually bribed to convert and get a huge payoff."
Where hospitals have remained non-profit, they have merged and combined with physician groups and laboratories to form powerful negotiating blocs. Partners Healthcare of Massachusetts, for example, includes hospitals, diagnostic labs, and physician groups that together treat around one-quarter of Boston-area patients. Last fall, Partners flexed its considerable muscle by announcing that its providers would no longer treat members of Tufts HMO—which itself enrolls nearly 20% of the state's insured—unless Tufts substantially raised reimbursement rates. The HMO swiftly capitulated. Himmelstein points out that, under pressure from insurers, non-profit hospitals "over the long-term have to duplicate what the for-profits do."
The solo physician practice and free-standing community hospital are rapidly vanishing, replaced by behemoths like Partners. Managed care has also spawned a new breed of for-profit firms like PhyCor, Inc., a national chain of physician groups that, in return for a management fee of 10-15%, negotiates fees and handles insurance paperwork for the doctors.
Consolidation and integration riddle the medical industry with conflicts of interest. HCA is under federal investigation for investing in physician practices in exchange for patient referrals, a violation of Medicare anti-kickback rules. Pharmaceutical giant AstraZeneca, a manufacturer of chemotherapy drugs, owns Salick Health Care, which runs cancer treatment clinics throughout the country.
Analysts estimate that managed-care insurers spend 80-85% of premiums reimbursing providers. The remaining 15-20% covers the insurers' administrative and marketing costs and, of course, their profits. When we add the profits of drug companies, hospitals, labs, and physician groups, the costs of marketing lucrative out-of-pocket services, and the costs of sending bills and processing referral forms—it is likely that as much as one-third of "health-care" spending is skimmed off as profit or as sheer waste. This may explain why Americans spend nearly twice as much per capita on health care as the Japanese yet live, on average, four years less.
In the ideal of managed care, advanced by health-care economists like Alain Enthoven and Paul Ellwood, cost-conscious payers were supposed to work with primary care doctors to design cost-effective treatment plans, avoiding the over-treatment endemic to fee-for-service medicine. Managed-care companies would have the authority and the incentive to practice preventive medicine, encouraging members to join health clubs, quit smoking, and get regular check-ups. Primary-care doctors, long denigrated by a profession that exalted specialists, would be newly empowered to resist unnecessary treatments.
But pressure to cut expenditures instead thrust primary-care physicians, with little preparation, into a financial and bureaucratic morass. Having agreed to lower payments and facing mounting administrative costs, physicians simply cut the one variable still in their control: time spent with patients. The average office visit, never a leisurely affair, now lasts just around ten minutes. The time needed to oversee complex cases, or even to check that patients received routine care and tests, never materialized. Studies indicate that primary and preventive care have deteriorated under managed care. Commenting in his most recent survey of the U.S. health system, John Wennberg complained that "there are no systems in place for doctors to take care of patient populations … primary care is chaotically organized."
A recent study for the health policy journal Milbank Quarterly concluded that "whether the care is preventive, acute or chronic, [the U.S. health-care system] frequently does not meet professional standards."
Even well-organized preventive care cannot overcome the reality that the medical industry basically exists to care for the sick, not the healthy. Two-thirds of all medical expenses are incurred by 10% of the population. In every age group, 1% of the population accounts for about 30% of expenditures. Cutting expenditures inevitably entails reducing what the industry euphemistically calls "medical losses"—payments to treat the sick. Insurers have thus introduced rules intended to discourage treatment. Some are explicit, limiting hospitalization after an uncomplicated birth to 24 hours, or delaying diagnostic tests except in medical emergencies. More often, though, insurers impede care by requiring doctors to get authorization for all tests and procedures or by threatening to drop doctors whose costs they deem excessive.
Most controversial are reimbursement agreements that place the economic interests of physicians in direct conflict with their responsibility to patients. Many insurers pay doctors a fixed annual fee supplemented by a year-end bonus if the physician "spends" less than the average on patient care. Others employ "capitation" agreements, under which primary-care physicians receive a fixed fee for each covered member. Because the annual payment represents a budget that must cover all the expenses associated with those plan members, including office visits, diagnostic tests, and treatment by specialists, capitation shifts the financial risk of treating very sick patients from the insurer to the physician. One or two complex cases, if fully treated, can substantially cut into a practice's income.
Many insurers bar participating physicians from disclosing the terms of their contracts, either privately to patients or in public venues. In 1995, David Himmelstein, well-known critic of for-profit HMOs, was famously dropped from US Healthcare after criticizing the HMO on the Phil Donahue Show. Subsequently, Massachusetts and a few other states passed legislation prohibiting such "gag clauses" in HMO contracts, but the practice remains legal in most states.
Last fall, newspapers reported that urban emergency rooms across the country were turning ambulances away at record rates, the fallout from several years of cutbacks in hospital staff and facilities. Non-profit hospitals, having discounted rates to secure the business of managed-care firms, are under intense pressure to cut costs and trim staff, just like the for-profit chains. In a move echoed by strapped non-profits across the country, Beth Israel Deaconess Hospital of Boston, a Harvard Medical School teaching hospital, announced plans to partner with a pharmaceutical company that would exploit discoveries by the hospital research staff. While conceding the potential for conflicts of interest, hospital officials insisted that they had to find some source of revenue to stanch annual losses in the hundreds of millions of dollars.
Defenders of for-profit medicine contend that the marketplace is working well. Managed care, they point out, did succeed in restraining medical inflation during much of the 1990s. And while critics maintain that cost reductions have come at the expense of patient care, industry defenders counter that offices established recently in several states to investigate consumer complaints about managed care are languishing for lack of business. Spokespeople for managed care point to surveys in which 80% of those polled express satisfaction with their health-care plan. They insist that the industry is getting a bum rap—that criticisms are based more on perception than reality.
But in a sensitive field like medicine, perception and reality cannot be so neatly separated. A person who is sick or injured has neither the knowledge nor the emotional distance to function as a rational consumer. Patients must rely absolutely on the disinterest and integrity of those who make decisions about their care. When the quest for profit pervades medical care, whether driven by physician-entrepreneurs or by managed-care shareholders, it so poisons public trust that even reasonable treatment restrictions (like limiting use of experimental bone-marrow transplants) invite cynicism and lawsuits.
An ideal health-care system would certainly require rationing of care. America probably devotes too many resources to heroic end-of-life interventions and not enough to long-term and rehabilitative care, too much to designer drugs and too little to childhood vaccinations, too much to the rich and too little to the uninsured. How to allocate scarce medical resources is a topic worthy of debate, but that debate needs to take place in public, not in corporate boardrooms and secret negotiations between insurers and providers.