By Ben Menzies
As the notion of a single-payer health care system has returned to center stage in American politics, proponents have pointedly eschewed any element of cost-sharing in their legislative proposals. Cost-sharing provisions require the insured to pay some portion of the cost of services received, often in the form of a fee or copay. Without cost-sharing, health care would be provided at no cost to all patients, regardless of income.
Senator Bernie Sanders’ Medicare For All Act of 2017 is explicit, directing the Secretary of Health and Human Services to “ensure that no cost-sharing, including deductibles, coinsurance, copayments, or similar charges, be imposed on an individual for any benefits provided under this Act” (although the bill does allow for fees of up to $200 per year for certain prescription drugs).
Representative Pramila Jayapal’s recent Medicare For All Act of 2019 goes even further, retaining the explicit prohibition on cost-sharing from Sanders’s bill, but eliminating the exception for prescription drugs. In an interview, Rep. Jayapal indicated she views the exclusion of cost-sharing as an essential element of the bill, arguing, “health care is a human right… so you can’t say it’s a right just for the people who are not wealthy. You have to say that it’s a right for everybody. Everybody should have the right to go to the doctor without that cost sharing.”
Even California’s recent proposal for a state-level single-payer system, the Healthy California Act, explicitly prohibited any cost-sharing despite the significant fiscal cost of such a system for a state famous for its recurring budget nightmares.
Cost-sharing mainly works to reduce consumption by forcing consumers to think twice before accessing care, and evidence suggests that even minor fees produce this response. The high underlying cost of health care makes it essentially impossible for consumers to pay a significant portion of the cost themselves. Cost-sharing can therefore be effective at reducing consumption, but it makes little sense as a major source of revenue for the program.
Critics of single-payer often claim such a system could lead to “rationing” of health care, but rationing is already present in our health care system. Perhaps the greatest potential benefit of a single-payer universal national health insurance system is reducing the cost of health care, particularly for those with lower incomes or high medical costs (or both). The high cost of health care, even for those who have health insurance, is a major deterrent to accessing care, and a wealth of research has demonstrated how this deterrent effect harms the health outcomes of those unable to pay or unwilling to risk an unexpected financial disaster. The high cost of health care, therefore, is a form of price-based rationing that enables more affluent people to consume health care more easily than others who are less able or willing to pay. It also incentivizes health care providers and insurers to allocate more resources to those most able to pay for them. Even relatively affluent people are not exempt from this rationing, as health care’s high costs can suddenly overwhelm other expenses and create significant hardship for all but the wealthiest households. By providing health care at zero cost to the consumer, the single-payer system envisioned by Medicare For All advocates who exclude cost-sharing would eliminate this form of rationing, seemingly guaranteeing equal access for all.
By removing the deterrent effect of price-based rationing, a universally free health care system would almost inevitably incentivize greater consumption of health care among all people. After all, if making health care free failed to increase its consumption, that would suggest little underlying unmet demand for health care in the status quo (quite obviously not the case). It is far from obvious, however, that the American health care system can accommodate a significant increase in utilization. Indeed, the American health care system currently suffers from a host of shortages – from qualified nursing personnel to quality hospitals in disadvantaged communities to primary care physicians. These shortages are particularly acute in lower-income communities due to the limited incentive to serve people unable to pay higher costs for services.
However, reducing price-based barriers to accessing these services is not likely to increase the supply of actual resources, making care affordable but still practically inaccessible to many. In a prime example of price-based rationing in American health care, specialist care is particularly inaccessible for people with lower incomes or who live in disadvantaged communities, despite the general availability of these services to individuals able to pay for them. Because there is little market incentive at present to provide specialist care to people with few resources, the supply of specialized care remains restricted. To the extent that removing price-based barriers increases utilization of specialist care by people currently unwilling or unable to pay those costs, the existing supply of specialist care is likely to be insufficient to meet new demand, especially if people currently able to access specialist care use it at a higher rate once cost barriers are removed. Insufficient existing supply combined with increased demand creates the standard conditions for a shortage, which leads to long wait times or other administrative barriers to care (also known as rationing). Just as in the current system, individuals with higher incomes are likely to navigate the system with greater ease, making it likely that shortage-induced rationing will disproportionately harm lower-income people even without an explicit price-based rationing system.
A single-payer system would have advantages in overcoming these challenges compared with the status quo, but a limited cost-sharing system targeted only at relatively affluent households consuming unnecessary care could be an effective tool in managing the transition to a more just health care system. As the primary buyer of health care, the government would have greater flexibility in managing the provision of care using tactics like paying higher rates to providers serving lower-income populations while cutting rates in affluent communities, and the federal government could increase existing efforts to expand the supply of health care by providing subsidies for medical training and education.
Inevitably these actions will require time to take effect, as training new nurses and physicians or constructing new facilities will take years. Unfortunately, the greatest potential for shortages is in the immediate term, as people react to newly free health care by consuming more of it, and the sheer administrative effort required to remake the American health care system is almost certain to create unexpected disruptions in supply as the government works out the kinks.
Aside from a pure nationalization of the health care supply (such as the United Kingdom’s National Health Service, something not contemplated even in the most ambitious current American proposals), the only way to manage the risk of this immediate-term shortfall is to control demand, and a modest price signal is the best available option. Controlling demand otherwise requires either explicit rationing, which would be highly controversial and could lead to unappealing outcomes as people are barred from care, or some other mechanism for disincentivizing some individuals from consuming certain forms of care. Targeting the cost-sharing so that only affluent households are subject to it, and only in modest amounts, would avoid recreating the cost-based inequities of the existing system. Indeed, this kind of price-based rationing targeted only at the affluent appears to be the most effective way to prevent existing inequities from simply reproducing themselves amid the shortages that newly free care would create.
Opponents of cost-sharing in single-payer raise some important arguments. Fees, even in much smaller amounts than are currently typical, can impose substantial financial hardship on lower-income people, especially those with significant and/or ongoing health care needs. While policymakers can avoid this outcome by simply exempting households below a certain income from cost-sharing, opponents argue this would imperil the central premise of Medicare For All – its universality. Drawing on the example of Social Security, which has largely withstood attempts to cut its benefits or curtail its universality, some argue that creating a universal benefit owed to all as a “right” (e.g. Rep. Jayapal’s language) gives the benefit greater political appeal by recruiting affluent people into its coalition of beneficiaries. On the other hand, means-tested programs that exclude more-affluent people with greater political power may make easy targets for cuts.
This logic is intuitive, but there is little empirical evidence for it. During the recent debate over repealing the Affordable Care Act, opponents emphasized the proposed cuts to Medicaid in their successful effort to prevent repeal. Medicaid, a means-tested program with strict eligibility criteria that differ from state to state, seems like exactly the type of program that should have been vulnerable to cuts, yet polling has consistently shown strong public support for Medicaid. Research suggests Medicaid’s resilience is not unusual globally, as countries with more universal benefits tend to cut benefits just as much and as often as countries with more targeted benefits. Regardless, even a single-payer system with targeted cost-sharing would leave large numbers of Americans significantly better off, potentially generating a much larger constituency than existing welfare programs, which tend to target only small shares of the population.
Another argument against cost-sharing is that it causes people to reduce their consumption of health care regardless of its value, resulting in untreated health problems and lower overall quality of life. Evidence suggests this is somewhat true: most consumers lack the expertise to distinguish necessary from unnecessary care and thus simply cut their consumption across the board to minimize costs. This response may be unavoidable to a certain extent, as cost-sharing is intended to disincentivize consumption and consumers seem unlikely to become savvy medical experts anytime soon, but thoughtful policy design can mitigate it.
For instance, exempting emergency procedures and high-value preventive care from cost-sharing altogether could encourage utilization of these services while still discouraging unnecessary consumption. Fees could be set low enough to create only a trivial cost, making it unlikely that people will forgo important care for long. Even without cost-sharing, people will continue to face barriers to health care (particularly the time and expertise required to navigate the system) that force the same kind of inefficient self-rationing. In fact, these non-price barriers seem likely to grow without cost-sharing, as increased utilization should lengthen waiting times and intensify other forms of non-price rationing.
As single-payer proponents debate the details of the system they will push should a Democrat win the presidency in 2020, it is essential that advocates engage meaningfully with the inherent tradeoffs in designing a universal health care system. Rather than retreating into slogans and theoretical arguments, proponents should consider following the examples of many successful single-payer systems by designing a limited cost-sharing mechanism. If properly designed, such a mechanism can enhance, rather than inhibit, the promise of providing health care to all who need it – especially those most harmed by the current system.
Ben Menzies is a Master of Public Policy candidate at the Goldman School of Public Policy and Editor in Chief of the Berkeley Public Policy Journal.