As I mentioned in my last post, I recently completed a certification in health information technology (HIT) and was struck by how many of the reform principles being used in the broader healthcare system could be applied to clinical trials. In that post I discussed why it is important that clinical trials systems be designed to “run downhill.”
Now I want to explore the trend toward a systems approach in healthcare improvement and the extremely positive impact this approach has had. In addition, I’ll discuss the great potential this approach could have for clinical operations, specifically in terms of patient recruitment.
Read on to discover how a simple change in perspective is improving the healthcare system and how that same perspective can improve clinical operations and patient recruitment.
The Report that Rocked Healthcare
In 1999 the Institute of Medicine published a report called To Err Is Human, breaking the silence surrounding medical error and prompting self-examination by the US healthcare industry. This report found that as many as 98,000 people die each year from medical errors that occur in US hospitals, a figure that outpaces annual deaths from auto accidents, breast cancer, or AIDS.
Prior to this report, the US healthcare system often operated under the assumption that doctors, nurses, and other healthcare professionals do not make mistakes. So when healthcare professionals inevitably did make mistakes, those mistakes were viewed as purely personal failings deserving of blame and shame.
The Institute of Medicine’s report challenged this assumption.
The Institute of Medicine asserted that the problem was not bad people in healthcare but good people working in bad systems that needed improvement. This view of the healthcare system required that professionals tasked with improving that system shift their thinking in two important ways.
Acceptance of Fallibility
First, the Institute of Medicine’s perspective required acceptance of human fallibility. And this acceptance necessitated a new assumption about processes, namely that they would go wrong rather than right. This view advocates the active prevention of mistakes.
As an example, consider the act of baby proofing a house. You actively assess objects in your house, determine potential hazards, and remove those hazards where possible. Healthcare hazards should be proactively identified and removed in the same way.
Change in Attitude About Blame
Second, the Institute of Medicine advocated that we point the finger at systems rather than people when errors occur. Blaming people limits learning, drives self-reporting underground, and makes it more likely that the same errors will be needlessly repeated. Conversely, a culture of “no blame” allows people to learn from mistakes and take action to prevent similar mistakes from occurring again.
But sometimes blame is necessary. The patient safety community has since recognized the importance of individual accountability and shifted from advocacy of no blame to that of a “just culture.” A just culture balances a systems-based perspective with the need for individual accountability. In a just culture, “honest mistakes” are treated differently than acts of recklessness or negligence.
A Systems-Based Approach in Action
Dr. Peter Pronovost, an intensive care physician at Johns Hopkins Hospital, applied a systems approach to hospital infections with astounding success. He challenged the notion that infections were an inevitable cost of treating sick patients and set out to prevent infections. After closely studying hospital infections, he proposed a system to reduce them.
Dr. Pronovost instituted his system in Michigan ICUs, and infection rates decreased by 66%, saving an estimated 1,500 lives and $100 million over 18 months. Time named Dr. Pronovost one of the 100 most influential people in the world, and he was awarded a MacArthur Fellowship as a result of his work.
Not Just for Healthcare
A simpler example outside of healthcare is the ATM. Perhaps you remember the old ATMs that would keep your card until your transaction was complete. Being the fallible humans that we are, we often collected our money and walked away before the ATM spat out the card.
Banks could have just had the attitude that their patrons needed to try harder to remember and perhaps put up a sign reminding patrons about their card. In many instances, this kind of response is what we see in clinical research. For example, I’m sure any site staff or CRAs reading this post are very familiar with the phrase “re-educate the patient.”
Instead, banks took a systems approach. Most ATMs (at least in the US) now require you to swipe your card. Since the card never leaves your hand, it’s far more difficult to leave behind.
Another industry that uses a systems approach to error prevention is aviation, which, by the way, has served as a model for many of the safety reforms we are seeing in healthcare.
A Systems Approach to Clinical Trials
If Dr. Pronovost can have such spectacular results by using a systems approach to tackle hospital infections, imagine what results that same approach could have in clinical trials, not just in reducing error but also in improving efficiency.
Just as healthcare professionals working in the broader healthcare system are not perfect, site staff are not perfect either. As humans are prone to do, they will make mistakes and not always perform at their best. The majority of site staff care about their work, but they are working in a system that needs improvement. And they should be empowered to do good work by systems that facilitate their success.
Rather than assuming that trials will go exactly as planned, we need to assume that things will go wrong (as they often do). With that assumption in place, we can focus on areas that are likely to be problematic and improve these flawed systems.
Certainly, sites should also be accountable. And I have no doubt that sites that perform poorly will be held accountable, namely through a lack of repeat business. But I’d like to see systems that help those who are truly trying to do a good job.
A systems approach could be useful in a variety of areas, including patient compliance, EDC, document management, protocol design, and more. But, not surprisingly, my interest is primarily in how such an approach could improve patient recruitment.
A Systems Approach to Patient Recruitment
Though the exact figures vary, I think we can all agree that far too many research sites struggle with patient recruitment. Here are a couple of statistics taken from the Center for Information and Study on Clinical Research Participation (CISCRP) website:
- “Fifty percent of clinical research sites enroll one or no patients in their studies.”
- “…70% of the research sites under-perform, and somewhere between 15%-20% never even enroll a single patient.”
I often see words like “underperformer” or “laggard” used to describe sites that do not meet sponsor expectations. In some cases, that’s an accurate characterization. For instance, I think it’s accurate to describe the bottom 20% of enrollers as underperformers. But if 70% of sites are not meeting sponsor expectations, those sites are not underperformers. They are the norm.
This distinction might seem semantic, but when viewed from a systems-based perspective, it’s extremely important.
If the majority of sites are not meeting expectations, that’s a big indication that it’s time to take a serious look at our patient recruitment systems. Simply telling sites to do a better job with recruitment is clearly not working. We need to put better systems in place so that more sites can be successful and meet sponsor expectations with regard to patient recruitment.
In a continuation of this post, I explore patient recruitment from a systems perspective. To learn how a simple metaphor can help you meet patient enrollment deadlines, click here.
Comments? I’d love to have sites, sponsors, and CROs chime in with their thoughts. Please share your experience below.