
As noted last month, prior authorizations can be an important means of identifying clinically inappropriate or duplicative care. They are also increasingly employed as gatekeeping mechanisms to control healthcare spending by limiting utilization. This month, in the second installment of our conversation about prior authorizations, we examine how the U.S. healthcare system has given rise to the use of prior authorizations, particularly in commercial health insurance.

Background

The contradiction has become a cliché in health policy: In study after study comparing healthcare across high-income countries, the U.S. consistently ranks lowest on key health metrics, even though it outspends all other countries, whether measured in per capita expenditures or as a percentage of gross domestic product (GDP).

Less frequently discussed is the fact that, despite the nation’s outsized expenditures, physicians in the U.S. are more likely than those in other high-income countries to report that restrictions on coverage resulted in difficulties accessing medication or treatment for patients.

At first glance, such findings support patient advocates’ and public health officials’ concerns about access to care. Considered in the context of expenditures and outcomes, however, they point toward another issue: providers frustrated that efforts to control healthcare spending are being made at the expense of their autonomy.

The $4.3 trillion spent annually on healthcare in the U.S. represents more than 18 percent of GDP, and it is only expected to grow. The interplay of multiple factors, such as new technologies and an aging population, continues to propel healthcare spending, which is predicted to top $16 trillion by 2030.

At its most basic, spending is a function of utilization and price. Controlling it requires a change in at least one of these variables. It looks easy enough as a mathematical equation, but reining in spending is a complicated proposition.
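
Expressed as an identity, total spending is simply the sum, across all services, of the price paid for each service multiplied by how often that service is used (a simplification that sets aside the ways price and utilization influence each other):

\[
\text{Total spending} = \sum_{\text{services}} \text{price per service} \times \text{units of service used}
\]

Prior authorizations operate on the utilization term of that equation; price transparency rules and negotiated rates operate on the price term.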

Gerard Anderson, a professor of international health at the Johns Hopkins University Bloomberg School of Public Health, has famously and repeatedly argued that prices drive healthcare spending. However, data shows that some healthcare resources are overutilized. As Applied Policy has previously observed, healthcare providers sometimes continue certain treatments and procedures even after they have been disproven by research.

Uncomfortable discussions

Discussions of managing utilization in healthcare can be difficult. They occur at the intersection of health and money, and they can carry life-or-death consequences.

The conversation is complicated by the fact that we have come to expect not only access to care, but access to the best of care—or at least the best advertised care. Social media, disease awareness campaigns, and direct-to-consumer advertising of pharmaceuticals have increased public awareness of health conditions. They have also heightened expectations of treatment.

Demand for healthcare is theoretically infinite, but resources are limited.

Prioritization of needs is already a feature of some of our most critical encounters with medicine. Patients are triaged in emergency departments, and the Society of Critical Care Medicine has developed guidelines for ICU admission. While we have come to accept such rankings by clinicians, healthcare consumers are less accepting of prioritization by need in non-emergent situations and are resistant to prioritization on the basis of cost by non-clinicians.

The purchase and provision of care in the U.S.

The federal government can shape national healthcare spending through legislation, regulation, and the power of the purse. Through Medicare, Medicaid, and the Children’s Health Insurance Program (CHIP) alone, the Centers for Medicare & Medicaid Services (CMS) is the country’s single largest payer for healthcare.

However, purchasing power is not exclusive to government. The majority of Americans with health insurance opt for coverage through private plans, with most enrolling in an employer-sponsored plan. According to the U.S. Census Bureau, employer-based insurance covered 53.3 percent of the U.S. population for at least some portion of 2021.

This reliance on employer-based insurance is unique among high-income countries. The model became entrenched during the Second World War when, acting upon authority granted to him under the Stabilization Act of 1942, President Franklin D. Roosevelt prohibited private employers from raising workers’ wages. Competing for workers in a tight labor market, companies recognized that they could entice prospective employees by offering fringe benefits, including health insurance. In the years after the war, the National Labor Relations Board ruled that health insurance was subject to collective bargaining. By the time the Internal Revenue Service confirmed in 1954 that health insurance benefits were not taxable, an American “system” of health insurance had all but been established.

Today, Americans expect to be offered health insurance through their employer. Despite rising costs and the potential to redirect employees to healthcare exchanges under the Affordable Care Act (ACA), employers do not seem inclined to stop offering it. “They trust us to make the decision for them,” one human resources executive explained when interviewed for a recent Commonwealth Fund study on the future of employer-sponsored health insurance.

Collectively, private payers and the employers who contract with them exert a $1.2 trillion influence on national healthcare spending. However, this potential is diffused through competition and opaque pricing. As some have observed, “in multipayer systems the relationship between funding and choices is more fluid compared to single-payer systems.”

Limited transparency in healthcare pricing contributes to several problems. By insulating patients from the cost of care, opaque pricing raises the risk of moral hazard. It also reinforces silos within the healthcare sector. A physician contracted with a given health plan may chafe at its requirement for a prior authorization before a service or procedure, yet often will not know what reimbursement rates the plan has negotiated with the local hospital or imaging facility where the service will be rendered.

In fact, the RAND Corporation found that private health plans in the U.S. paid hospitals 247 percent of the amount Medicare would have paid for the same services in 2020. These additional costs are passed on to the employers and individuals who purchase insurance, calling into question the “efficiency of the employer market.”

While in office, President Trump issued an executive order aimed at increasing transparency in healthcare pricing. The two rules which resulted from this action were specific to health plans and hospitals, with the Trump Administration touting their importance in empowering consumers and increasing competition—a hallmark of free market healthcare models.

Not everyone supports government efforts to compel price transparency. AHIP, a trade group representing health insurance plans, argues that “[P]ublicly disclosing competitively negotiated, proprietary rates will reduce competition and push prices higher – not lower – for consumers, patients, and taxpayers.” While noting that its members are committed to providing patients with information, the American Hospital Association contends the complex nature of “hospital pricing and rate negotiations does not translate easily into a single, fixed rate per service.”

Cost-effectiveness and value

Of course, it is not merely a matter of prices. Inherent in any economic exchange is the expectation of value: Are the goods and services being purchased worth the money being spent?

In healthcare, this begins with an assessment of whether drugs and treatments are efficacious. While the Food and Drug Administration (FDA) can conclude that a drug is safe and effective, it does not make pricing decisions. And the randomized controlled trials (RCTs) conducted in drug or device development are of comparatively short duration, whereas a longer time horizon is typically needed to assess cost-effectiveness and, ultimately, value.

The Centers for Disease Control and Prevention describes cost-effectiveness analysis as the comparison of “an intervention to another intervention (or the status quo) by estimating how much it costs to gain a unit of a health outcome, like a life year gained or a death prevented.” It can be objectively calculated.
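
A common way to express that comparison is the incremental cost-effectiveness ratio (ICER), which divides the additional cost of an intervention by the additional health it produces relative to a comparator. (The notation below is the standard textbook formulation rather than language drawn from the CDC.)

\[
\text{ICER} = \frac{C_{\text{intervention}} - C_{\text{comparator}}}{E_{\text{intervention}} - E_{\text{comparator}}}
\]

Here C is cost and E is a health outcome, such as life years or quality-adjusted life years (QALYs) gained. An ICER of $50,000 per life year gained, for example, means that each additional year of life secured by the new intervention costs $50,000 more than the alternative.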

Value, on the other hand—and as understood in an economic context—is a measure of how much a consumer is willing to pay for a good or service, or, for the purposes of this discussion, a drug or treatment.

Determinations of value in healthcare spending are complicated by the unique nature of healthcare purchasing. Unlike consumers in other sectors, patients do not bear the full cost of the care they receive. The use of prior authorizations stems from the assumption that the health insurance company, as payer, is positioned to assess the value of care outside of the patient/provider relationship.

Commercial payers make these assessments within the context of regulatory compliance and their own financial interests. The ACA mandates that insurance plans cover a comprehensive set of essential health benefits, limiting insurers’ ability to tailor coverage to specific consumer needs. It also limits what health insurance companies can spend on administrative costs or keep as profits. Under the ACA, companies are required to spend at least 80 percent of premium dollars (85 percent for large-group plans) on medical care. Under this “80/20” rule, also known as the medical loss ratio (MLR) requirement, companies that fail to meet the threshold must provide rebates to enrollees.
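
As a simplified illustration of how the rebate works (setting aside the multi-year averaging, credibility adjustments, and allowable deductions for taxes, fees, and quality-improvement spending that apply in practice): an insurer subject to the 80 percent threshold that collects $100 million in premiums but pays only $76 million in medical claims has an MLR of 76 percent and would owe enrollees roughly the 4-percentage-point shortfall:

\[
\text{Rebate} \approx (0.80 - 0.76) \times \$100 \text{ million} = \$4 \text{ million}
\]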

Investors in for-profit insurance companies expect to realize a profit. To reconcile provision of patient care with investor interests, insurers undertake a careful balancing of financial management and risk assessment.

The provision of insurance in any sector is never a blind gamble, and insurers have always used actuarial assessments in evaluating the risks and costs associated with extending coverage. In healthcare, analysis of demographic information, historical claims data, and health trends allows insurers to project future expenses. Since the 1980s, insurers have also employed utilization management tools, including prior authorizations, to shape the provision of care to align with spending projections.
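
As a stylized sketch of such a projection (the figures and trend factor below are hypothetical, and actual actuarial models are far more granular), an insurer might trend forward its historical per-member costs:

\[
\text{Projected claims} \approx \text{members} \times \text{historical cost per member per year} \times (1 + \text{trend})
\]

For example, 100,000 covered lives with historical costs of $6,000 per member per year and an assumed 7 percent trend would be projected to generate roughly $642 million in claims, and utilization management tools are one lever insurers use to keep actual spending near that projection.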

A 1989 report from the Institute of Medicine noted that the rapid increase in the use of utilization management strategies, including prior authorizations, in the 1980s reflected “purchasers’ dismay over continuing rapid rises in health care costs and their perception that much care is unnecessary.”

Over thirty years later, the premise of the argument is unchanged. However, the context has become more complex.

Consider, for example, that the Congressional Budget Office found that per capita use of prescription medicines, the average cost of individual drugs, and spending on prescription drugs as a percentage of national healthcare expenditures all rose between 1980 and 2018. But cost was not the sole focus of the agency’s report. It also emphasized that these increases came with “myriad health benefits” for patients, had helped to reduce spending on services from hospitals and physicians, and “improved the lives of those with chronic conditions and have also extended life.”

Conclusion

Emerging technologies and pharmaceutical advancements are regularly expanding the horizons of medicine. And each scientific innovation carries the potential to impact both patient lives and national healthcare spending.

The debate over the use of prior authorizations will likely intensify as physicians seek to provide patients with the highest standard of care and payers seek to limit their expenses in a competitive market model.

********

APPLIED POLICY WILL CONTINUE ITS EXAMINATION OF PRIOR AUTHORIZATIONS IN THE COMING MONTHS.

  • In June: Prior authorizations in Medicare Advantage and Medicaid programs
  • In July: Prior authorizations, patients, and health equity