Friday, August 22, 2008

The PDSA Cycle: Spinning in the right direction


Last week we looked at the SOAP format as an effective but underutilized care improvement tool at the level of the individual patient. Let’s now look at the PDSA cycle, a systems improvement tool that underlies most of the complex, designer quality improvement (QI) methodologies that circulate in healthcare (e.g. Six Sigma, Lean, Kaizen, TQM). Like SOAP, the PDSA cycle addresses many of the same “new” concepts that surface in QI, including evidence-based care and continuous, collaborative improvement. When it is used as intended, the results are measurable, significant and sustained. As with SOAP, however, the challenge is to stay on course with the PDSA process; we often overlook key elements, which leads to suboptimal results.

“P” is for plan – the objective of using the PDSA cycle is to identify the critical variable(s) in a system needing improvement and to develop a corrective plan. Rather than being implemented on a large scale, the corrective plan is executed in a pilot area, a controlled environment where data collection, action and oversight are manageable. Identifying the system in need of improvement and its critical variable(s) is the key to a successful plan. This requires that the PDSA team completely understand and visualize the system being addressed, which is only possible if those who use the system, namely frontline staff, are part of the team. Further, since it is the frontline staff who will be directly affected by the proposed systems changes, it is vital for them to help craft the plan if the improvement is to be sustained. Finally, it is critical that frontline staff participate on the PDSA team voluntarily; by including only those who are interested in active participation, collaborative effort and a willingness to engage in the next step of the cycle are optimized.

“D” is for do, or action – this is where the team implements the plan in the pilot area. The key element for successful action is team member accountability: NO ACCOUNTABILITY = NO ACTION = NO CHANGE! It is the responsibility of the PDSA team leader(s) and senior leadership to hold team members accountable to the action plan. This is facilitated by well-defined action items: WHO, WHAT, and BY WHEN? Action is also facilitated by a plan that has been created and agreed upon by the voluntary frontline staff who make up the PDSA team. Action items must be manageable: SMALL ACTIONS TAKEN ALL THE TIME are what create great results. Team members in action must be proactively visible in their efforts so that other staff members in the pilot area are aware of the improvement initiative underway. Not only does this generate overall interest, but it can also generate a willingness among staff to assist team members with their action items.
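For those who like to see structure written down, here is one way to picture a well-defined action item. This is a minimal sketch in Python, purely illustrative; the field names and the example are my own, not part of any formal PDSA toolkit.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative only: a tiny record capturing WHO, WHAT, and BY WHEN.
@dataclass
class ActionItem:
    who: str               # the accountable team member
    what: str              # one small, manageable task
    by_when: date          # the agreed deadline
    done: bool = False

    def is_overdue(self, today: date) -> bool:
        # An item is overdue if it is not done and its deadline has passed.
        return not self.done and today > self.by_when

# Example: one small action from a pilot-area plan (names are made up).
item = ActionItem(who="Charge nurse, pilot unit",
                  what="Post the revised handoff checklist at each station",
                  by_when=date(2008, 8, 29))
print(item.is_overdue(date(2008, 8, 25)))   # False until the deadline passes
```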

“S” is for study – the results of the prior action taken are analyzed for success and for opportunities to make further improvements. WHAT is measured and HOW it is measured are key (sound familiar?). This data drives the continuous improvement process. Wrong data = wrong direction. The team makes an assessment of the collected information relative to the initial plan; any revisions that might further improve the existing plan are discussed and incorporated before proceeding to the next implementation step.

“A” is for act, or action – in this step the PDSA team returns to the pilot area with a revised action plan. Accountability, well-defined action items and proactive visibility again take center stage to bring successful action forward. Data is collected and results are measured so that critical revisions can again be incorporated into the plan in the next step, the “P” at the beginning of the cycle.

The PDSA cycle continues to turn in this fashion until an optimized improvement process has been created in the pilot area. If the PDSA cycle has been successful, not only is there an effective improvement plan, but there is also the necessary visibility and interest among other staff members and leadership for broader implementation. This may require the creation of additional teams made up of volunteer staff and, as always, the continued strong support of leadership.
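To make the turning of the cycle concrete, here is a rough sketch of the loop in code. It is illustrative only; the function names, the score returned by the study step and the stopping threshold are placeholders I have chosen, not part of the method itself.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# A minimal, purely illustrative sketch of the PDSA cycle in a pilot area.
@dataclass
class Plan:
    critical_variable: str                              # what the team believes drives the problem
    actions: List[str] = field(default_factory=list)    # small, well-defined action items

def pdsa_cycle(plan: Plan,
               do: Callable[[Plan], None],              # "D": execute the actions in the pilot area
               study: Callable[[Plan], float],          # "S": measure WHAT matters, HOW it was agreed
               revise: Callable[[Plan, float], Plan],   # "A": act on what was learned
               good_enough: float,
               max_cycles: int = 10) -> Plan:
    """Repeat Do -> Study -> Act (re-plan) until the measured result meets the goal."""
    for _ in range(max_cycles):
        do(plan)
        result = study(plan)
        if result >= good_enough:                       # optimized process reached in the pilot area
            return plan
        plan = revise(plan, result)                     # the revised plan feeds the next "P"
    return plan
```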

Food for thought:

- Think about the planning step: We often don’t spend the necessary time and effort in the planning process, nor do we include the right team members. How often do we bring expert consultants into our organizations to implement their one-size-fits-all improvement plan? How often do our own quality departments assume complete responsibility for improvement initiatives? How often are frontline staff actively included in the process of making the system to be improved visible and understandable? If there is staff participation, is it active, sustained and voluntary? What’s the buy-in of the staff who participate?
- Think about the action step: Absence of accountability plagues healthcare in general. Most improvement initiatives are not voluntary, and in an already stressed work environment, participants understandably try to avoid additional tasks. Again, how often are teams voluntary in nature? How many times do teams become gripe committees, all talk and little action? How frequently do we create well-defined action items that are small and manageable? Do staff in improvement areas really understand what is going on and what the goals are?
- Think about the study step: Many times our data collection and results measurement are poor and misdirected. Many times we take shortcuts in our reassessment or skip it completely. How many times do we “jump” to a solution? How many times do we let an improvement team disband? How many times do the “experts” assume control of the process once the initial plan is underway?

We have seen that both SOAP and the PDSA cycle have inherent weak spots when used individually. What happens when we combine them under an umbrella of universal systems principles? It turns out that we can create a robust model that incorporates all the critical elements and that can serve as a template for any care improvement process. Next week we will look at this combination: the Universal SOAP Cycle.

Saturday, August 16, 2008

SOAP: The Forgotten Systems Improvement Tool


David Dibble, my systems-thinking mentor at New Agreements Healthcare, and I were discussing the challenges of making systems improvement accessible to care providers a few weeks ago. Most of the time, underlying system challenges manifest as symptoms, and it takes some digging to uncover what is actually malfunctioning. I was struck by the similar challenges that care providers face when they are taking care of patients, and it occurred to me that we already use a tool that organizes our care approach in a consistent manner: the SOAP note. Inherent in that four-letter acronym is a very powerful, time-tested template for ensuring that all the necessary bases are covered to correctly diagnose and treat a patient, irrespective of their condition. In many ways, however, we have forgotten this systems-improvement power, and SOAP has instead simply become a way of formatting a patient note. Many of the concepts that are now surfacing as new ideas in care improvement, such as patient-centeredness, data-driven decision making, and a team approach to an evidence-based care plan, have long been embedded in the SOAP acronym. Let’s remind ourselves of this by taking the acronym apart.

“S” is for subjective, or symptoms. The entire care process begins with the care provider listening to the patient describe their complaint. It’s a frontline-driven approach (sound familiar?) – the patient guides the care provider to the health problem. An effective care provider listens with respect and without judgment, and asks questions to elicit the complete picture from the patient’s point of view. If additional information is required, a family member may be asked to contribute to the history. Not only is the patient heard and able to describe what it’s like to be experiencing the problem, but he/she is integral to identifying the next step: differential diagnosis and data collection.

“O” is for objective, or data collection. As we know, the most important elements of data collection are WHAT is measured and HOW it’s measured. The patient’s subjective input and the care provider’s skilled assessment of the described symptoms direct data collection. This includes the physical exam, review of SYSTEMS (interesting how that word shows up!), old records, labs, and studies. We all know what happens when the patient data collected is not focused on the critical elements (remember the Critical 20?) – think about the medical student or intern who orders everything “just to cover the bases.” Not only is it expensive and inefficient, but it frequently sends us all off on wild goose chases after unrelated incidental findings – as Deming would say, “off into the Milky Way we go!” Tight data collection based on the Critical 20 makes it much more likely that we proceed effectively to the next step: assessment and diagnosis.

“A” is for assessment, or likely diagnosis. When the patient and care provider combine the focused symptoms, signs and data, further narrowing of the differential diagnosis is usually possible. We arrive at a working diagnosis, which usually results in a treatment plan. Again, the working diagnosis and treatment plan are only as good as the information collected in the preceding two steps. Think about how many times we go down a less optimal path because we overlooked, or were too rushed to recognize, a critical input.

“P” is for plan, which leads to action. An effective plan incorporates all of the prior steps, resulting in clear action and improvement in the patient’s condition. An effective plan requires excellent communication, interdisciplinary teamwork and patient input (if the patient isn’t part of this, a sustainable improvement is not likely to occur! Remember, change is easy until we’re directly affected by it – nobody is closer to the change than the patient).

Once the plan is activated, effective care providers will ideally begin a SOAP "loop" to evaluate and modify the care process as appropriate. Thus, on a regular basis we check in with the patient to see how they’re doing with the treatment (“S”), we do a physical exam and check labs and studies (“O”), we reassess our care path (“A”), and we continue or modify our plan as necessary (“P”). As the major issues are resolved, we evaluate the patient for any other treatment interventions that might prove important. If we’ve done our job, the patient improves and either continues with the care plan or makes lifestyle adjustments such that he/she remains well (sustained improvement).
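For those who think in loops and flowcharts, the SOAP loop can be sketched out the same way. This is only an illustration; every function name below stands in for a real clinical step and is my own invention, not a prescribed interface.

```python
from typing import Callable

# A rough, illustrative sketch of the SOAP "loop" described above.
def soap_loop(gather_symptoms: Callable[[], str],      # "S": ask the patient how they are doing
              collect_data: Callable[[str], dict],     # "O": exam, labs, studies guided by "S"
              assess: Callable[[str, dict], str],      # "A": re-evaluate the working diagnosis
              update_plan: Callable[[str], str],       # "P": continue or modify the treatment
              improved: Callable[[str], bool],         # stop once the major issues are resolved
              max_visits: int = 12) -> str:
    """Repeat S -> O -> A -> P at each check-in until the patient has improved."""
    plan = ""
    for _ in range(max_visits):
        s = gather_symptoms()
        o = collect_data(s)
        a = assess(s, o)
        plan = update_plan(a)
        if improved(a):
            break
    return plan
```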

Food for thought:
- Think about how easy it is for us to cut patients off when they describe their symptoms. Not only are they frequently frustrated with their care providers and disenfranchised from the care process, but we frequently head down a suboptimal care path by jumping to a solution. Rather than taking the necessary time to be thorough, we end up course-correcting repeatedly in our data collection and treatments, which is an incredible waste of time and resources. Patient and staff satisfaction plummet.
- Think about how we deliver care in silos – every specialist to themselves. Data isn’t shared, it frequently conflicts, and the result is a patient who has no idea what’s going on and a care process that’s suboptimal. We can be almost certain that our care interventions will not be sustained when patients are left on their own. Instead, they continue to cycle through healthcare with the same issues.
- Think about how disjointedly we cycle through the SOAP process to make the necessary adjustments to our care interventions. Frequently the patient is excluded from further significant input once we’ve made the initial “diagnosis,” and our additional data collection often shifts from proactive to reactive, many times at the cost of patient discomfort or harm.
- Think about how the SOAP process relates to improving the systems within which we all work. Who is affected by the broken processes? Who needs to be heard so that the symptoms and relevant data can be collected? Who needs to be part of the solution so that an effective plan resulting in sustainable change is made possible?

Next week we will look at a process improvement tool used in systems improvement at the hospital level and explore its connection to patient care. SOAP is applicable to process improvement, and care providers have a lot more to contribute to it than we might think.

Sunday, August 3, 2008

Rubber Side Down!


I am on a mountain biking trip in Southern Utah this week (Brian Head) - no new post until the week of 8-11. Happy Trails!