May 2015, Vol 8, No 3 - Editorial
David B. Nash, MD, MBA
Editor-in-Chief
American Health & Drug Benefits
Founding Dean
Jefferson College of Population Health
Philadelphia, PA

Much of the focus regarding the implementation of the Affordable Care Act (ACA) has centered on the success (or failure) of healthcare.gov, but other important components of the wide-ranging bill have garnered little attention. As a result, I am forced to ask the following question—“Did the HEN [Hospital Engagement Network] lay a rotten egg?”

In April 2011, “the US Department of Health and Human Services (HHS) joined leaders representing hospitals, employers, health plans, physicians, nurses, and other health professionals, patient advocates, and State and Federal governments to launch the Partnership for Patients, a nationwide public-private initiative to keep patients from being harmed in hospitals and heal without complication.”1 This Partnership for Patients Program (PPP) was created by the Centers for Medicare & Medicaid Services (CMS) through its Innovation Center, based on the direct authority given to CMS via the ACA.

Central to this “partnership” are the 27 HENs. “The HENs work at the regional, state, national, or hospital system level to help identify solutions already working and disseminate them to other hospitals and providers.”1 Among the 27 HENs are groups such as the American Hospital Association at the national level, the Voluntary Hospital Association of America (a group that Jefferson School of Population Health works with very closely), and the Hospital & Healthsystem Association of Pennsylvania (HAP).

Sounds good, doesn’t it? The HENs represent more than 3700 hospitals, and their stated primary goals in the HHS/CMS report were to reduce preventable hospital-acquired conditions by 40% and 30-day readmissions by 20% between 2010 and 2014.1 How did this work? What were the results of this so-called partnership?

How to interpret the results of this report depends on whom you believe, and hence my question at the opening of this editorial. According to the HHS/CMS report, the HEN has been a rousing success!1 For example, the report from CMS (and HHS) claims that obstetric trauma rates have decreased by 15.8% and early elective deliveries have been reduced by 64.5%.1 In addition, “the all-cause 30-day hospital readmission rate among Medicare fee-for-service beneficiaries plummeted further to approximately 17.5 percent in 2013, translating into an estimated 150,000 fewer hospital readmissions between January 2012 and December 2013. This represents an 8 percent reduction in the Medicare fee-for-service all-cause 30-day readmissions rate.”1

At his presentation at the Jefferson School of Population Health monthly Population Health Forum in December 2014, Michael Conseulos, MD, MBA, Senior Vice President for Clinical Integration at HAP, extolled the virtues of Pennsylvania’s participation in the HEN.2 Although HAP is appropriately proud to participate in the HEN, the data show that across multiple dimensions of healthcare performance, including access, affordability, prevention, and equity, the state of Pennsylvania fell from number 14 nationwide in 2009 to number 22 in 2014.2

In a 2014 perspective published in the New England Journal of Medicine, 2 nationally regarded experts, Peter Pronovost, MD, PhD, Director, Armstrong Institute for Patient Safety and Quality, Johns Hopkins Medicine, and Ashish K. Jha, MD, MPH, Director, Harvard Global Health Institute, noted that “the PPP’s weak study design and methods, combined with a lack of transparency and rigor in evaluation, make it difficult to determine whether the program improved care. Such deficiencies result in a failure to learn from improvement efforts and stifle progress toward a safer, more effective health care system.”3

Pronovost and Jha go on to severely criticize the HEN program, noting that “CMS allowed each HEN to define its own performance measures, with little focus on data quality control….CMS also required HENs and participating hospitals to submit a large number of process measures of unknown validity….Finally, CMS made—and presented publicly—inferences about its program’s benefits without having subjected its work to independent evaluation or peer review.”3

From a public policy perspective, we surely have a conundrum here. The information provided in the HHS/CMS report claims dramatic improvements in hospital performance in the 4-year period between 2010 and 2014.1 Did this nearly $1-billion investment,3 made possible by the ACA, actually improve care? (It should also be noted that this $1 billion represents 3 times the annual budget of the Agency for Healthcare Research and Quality,3 the scientific home of most research in quality and safety in our country.)

Pronovost and Jha believe that there were well-regarded research alternatives available to CMS that might have solidified the results and given greater confidence to researchers in our field. For example, Pronovost and Jha called for an interrupted time-series study design with concurrent controls, “rather than having a single pre time period and a single post time period”3 design. More important, however, Pronovost and Jha noted that “failing to attend closely to issues of design, methods, and metrics leaves us with little confidence in an intervention. For the PPP, which required thousands of hours of clinicians’ time and large sums of money, that lack of confidence is particularly unfortunate.”3

So, did the HEN lay a rotten egg? On the one hand, states and major national organizations representing thousands of hospitals working together to improve clinical outcomes sure sounds good to me. On the other hand, the critique by Pronovost and Jha is nearly damning, and calls into serious question the results of this very important, publicly funded program. Where does this leave us?

We need a national conversation about real advances in the quality and safety of the care we deliver. We need a more transparent and robust funding stream, with better peer review, to reduce mistakes and improve outcomes. If we fail, that rotten egg smell will continue to linger in the air.

As always, I remain interested in your views about the quality of research on quality.

References
1. US Department of Health & Human Services. Centers for Medicare & Medicaid Services. New HHS data shows major strides made in patient safety, leading to improved care and savings. May 7, 2014. http://innovation.cms.gov/Files/reports/patient-safety-results.pdf. Accessed April 2, 2015.
2. Conseulos MJ. Transforming healthcare in Pennsylvania: preparing for the future. Presented at the Jefferson School of Population Health Forum; December 10, 2014. http://jdc.jefferson.edu/hpforum/87/.
3. Pronovost P, Jha AK. Did hospital engagement networks actually improve care? N Engl J Med. 2014;371:691-693.

Last modified: June 4, 2015