Implementation research and practice: a focus on the ‘how’

Enola Proctor defines implementation science as research aimed at accelerating and improving the uptake of evidence-based care. The central question is: "How can we implement evidence-based interventions effectively and efficiently?" Proctor discusses the influence of context and the facilitating and hindering factors within it. The article shows that implementation outcomes are distinct from the effectiveness of the intervention itself. Taxonomies of implementation strategies and how frequently those strategies are used are also addressed. Finally, Proctor argues that the future of implementation science lies in identifying relationships between barriers and strategies, and in applying suitable designs such as stepped-wedge and hybrid methods.


By Enola Proctor. This article is based on her plenary lecture at the international conference on implementation research during our Week of Implementation in February 2018.

The long-recognized, prolonged, and incomplete uptake of scientific discoveries into routine service delivery has sparked an emphasis on implementation science in the United States and globally. In the U.S., research to accelerate and improve the delivery of evidence-based care is commonly referred to as dissemination and implementation research, or implementation science. Implementation research seeks to inform how to deliver evidence-based interventions, programs, and policies in real-world settings so that their benefits can be realized and sustained.

The ultimate aim of implementation research is to build a base of evidence about the most effective processes and strategies for improving service delivery.  Implementation research builds upon effectiveness research, then seeks to discover how to use specific implementation strategies and move those interventions into specific settings, extending their availability, reach, and benefits to clients and communities.

This paper overviews important contextual factors that are key to successful implementation, addresses the outcomes through which implementation is evaluated, and reviews what we know about implementation strategies, the “how to” of adopting and sustaining evidence based interventions.  The paper concludes with research priorities for the field.

Service context

A host of service settings share the goal of delivering high-quality services through efforts to implement evidence-based interventions. These include specialty mental health care, schools, the criminal justice system, hospitals, and community-based health clinics. Providers, system administrators, and service payers share a desire to provide good care. Given the slow adoption of new practices, we ask, "What gets in the way?"

Research has identified a host of barriers to adopting, delivering, and sustaining new, effective interventions. These include provider barriers such as habit, resistance to change, skepticism about new evidence, and time constraints. System challenges include insufficient resources to pay for provider training, inertia, data and record systems that are incompatible with the assessment measures associated with many evidence-based interventions, and restrictions on reimbursement for system improvements. Thus, practice change needs to be aligned with, or focused on improving, the practice infrastructure (Emmons, Weiner, Fernandez, & Tu, 2011) and policy ecology (Raghavan, Bright, & Shadoin, 2008).

Efforts to implement practice improvements also need to engage stakeholders, including service consumers and families, providers, agency administrators, service funders, and sometimes government officials. They play key roles, such as deciding which interventions to adopt and what to pay for, and resisting or, more optimistically, championing or facilitating change. Thus, implementation requires careful assessment of the practice context, considering not only whether better interventions could be implemented but also assessing: Is there a demand to implement? Is there a "push" to get the intervention out? Is there a "pull" for change among stakeholders? Is the practice infrastructure equipped to support the interventions?

Evaluating implementation success

Practice evaluations often yield disappointing results, showing that the expected outcomes are not attained.  This might mean that the interventions were not effective.  However, just as likely, distinct outcomes are required to capture intervention uptake—outcomes that are focused not on the effectiveness of the intervention but on the implementation process itself.  Implementation outcomes differ from clinical outcomes.  Implementation outcomes enable a direct test of whether or not a given intervention is actually adopted and delivered.  Implementation outcomes help to identify the roadblocks in intervention adoption.

Our team developed a taxonomy of eight implementation outcomes: acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, penetration, and sustainability (Proctor et al., 2011). This taxonomy became the framework for two national repositories of measures for implementation research: the Seattle Implementation Research Collaborative, or SIRC (Lewis et al., 2015), and the NIH measures database called GEM (Rabin et al., 2012). These repositories seek to harmonize and increase the rigor of measurement in implementation science (Lewis, Brownson, & Proctor, 2017). Yet measurement remains underdeveloped in implementation science; while an increasing number of scales have been developed to capture implementation outcome constructs, few report reliability or validity. Few measurement issues are more important for implementation science than advancing tools to capture context, process, and outcomes in the field (Lewis et al., 2017).
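
As an illustration only (not a SIRC or GEM instrument), the minimal Python sketch below represents the eight outcomes as a simple record and flags low ratings as possible roadblocks; the 1-5 rating scale, the site, and the values are assumptions introduced here for clarity.

    from dataclasses import dataclass, fields

    # Illustrative only: the eight implementation outcomes (Proctor et al., 2011)
    # as fields of a simple record; the 1-5 ratings below are hypothetical.
    @dataclass
    class ImplementationOutcomes:
        acceptability: int
        adoption: int
        appropriateness: int
        feasibility: int
        fidelity: int
        implementation_cost: int
        penetration: int
        sustainability: int

    site_a = ImplementationOutcomes(4, 3, 4, 2, 3, 2, 1, 2)  # hypothetical site
    roadblocks = [f.name for f in fields(site_a) if getattr(site_a, f.name) <= 2]
    print("Outcomes flagging possible roadblocks:", roadblocks)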

Implementation strategies

Moving effective programs and practices into routine care settings requires the skillful use of implementation strategies, defined as systematic “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice into routine service” (Proctor, Powell, & McMillen, 2013). Implementation strategies are interventions for system change—the how of change—how organizations, communities, and providers can learn to deliver new and more effective practices (Powell et al., 2012; Powell et al., 2015; Waltz et al., 2014; Waltz et al., 2015).

Our teams have developed taxonomies of implementation strategies, starting with a structured literature review to generate a common nomenclature. That review yielded 63 distinct implementation strategies, which fell into six groupings: planning, educating, financing, restructuring, managing quality, and attending to policy context (Powell et al., 2012).

Our team refined that compilation using Delphi techniques and concept mapping to develop conceptually distinct categories of implementation strategies (Powell et al., 2015; Waltz et al., 2014). The refined compilation comprises 73 discrete implementation strategies, organized into nine clusters: (1) changing agency infrastructure, (2) utilizing financial strategies, (3) supporting clinicians, (4) providing interactive assistance, (5) training and educating stakeholders, (6) adapting and tailoring interventions to context, (7) developing stakeholder relationships, (8) using evaluative and iterative strategies, and (9) engaging consumers.

These taxonomies of implementation strategies position the field for more robust research on implementation processes. Because the language used to describe implementation strategies has not yet "gelled" (in fact, it has been described as a "Tower of Babel"), we also developed guidelines for reporting the components of strategies (Proctor et al., 2013) so that readers would have more behaviorally specific information about what a strategy is, who does it, when, and for how long.

Evidence about strategies

Researchers have begun to identify from practice-based evidence the implementation strategies most often used. Using activity logs to track implementation strategies, Bunger and colleagues (2017) found that strategies such as quality improvement tools, using data experts, providing supervision, and sending clinical reminders were frequently used to facilitate delivery of behavioral interventions within a child welfare setting. Among the most frequently employed strategies for implementing psychosocial interventions are: provider training and support, including coaching and provision of technical assistance; iterative quality improvement approaches of trialing, assessing, and revising change; and monitoring and providing feedback on practice change. One study, focused on the initiation of hepatitis C (HCV) treatment within the US Veterans Administration (VA), found that data warehousing techniques (e.g., using a dashboard; 85%) and intervening with patients to promote uptake of and adherence to HCV treatment (71%) were frequently used (Rogal et al., 2017).
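
As a minimal sketch of the activity-log idea described above (not Bunger et al.'s actual instrument; the field names and entries are illustrative assumptions), strategy use can be tallied from simple log records:

    from collections import Counter

    # Illustrative log records; field names and entries are assumptions.
    activity_log = [
        {"week": 1, "strategy": "clinical reminders"},
        {"week": 1, "strategy": "supervision"},
        {"week": 2, "strategy": "quality improvement tools"},
        {"week": 2, "strategy": "supervision"},
        {"week": 3, "strategy": "using data experts"},
    ]

    strategy_counts = Counter(entry["strategy"] for entry in activity_log)
    for strategy, count in strategy_counts.most_common():
        print(f"{strategy}: used {count} time(s)")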

Reflecting the complexity of quality improvement processes, we have learned that there is no "magic bullet" (Powell, Proctor, & Glass, 2013). Most research tests combinations of strategies (Powell et al., 2013). Our study of VA clinics working to implement evidence-based hepatitis C treatment found that implementers used an average of 25 different implementation strategies (Rogal et al., 2017).

A randomized clinical trial across fifteen agencies found that an organization-focused intervention, the ARC model, improved agency culture and climate, stimulated clinicians to enroll in evidence-based practice training, and boosted the clinical effect sizes of various practices (Glisson, Williams, Hemmelgarn, Proctor, & Green, 2016a, 2016b). In a hospital critical care unit, the implementation strategies of developing a team, selecting and using champions, provider education sessions, and audit and feedback increased adherence to phlebotomy guidelines (Steffen et al., 2017).

Experts in implementation science and implementation practice identified as most important the strategies of "using evaluative and iterative strategies" and "training and educating stakeholders." Reported as less helpful were such strategies as "access new funding streams" and "remind clinicians of practices to use" (Waltz et al., 2015). In the VA, initiation of new evidence-based hepatitis C treatments was associated with the use of local consensus discussions, preparing patients to be active participants in their care, fostering collaborative learning environments, facilitation, technical assistance, and changes in the structure and location of clinic services (Rogal et al., 2017). Implementation strategies have been shown to boost clinical effectiveness (Glisson, Schoenwald, Hemmelgarn, et al., 2010), reduce staff turnover (Aarons, Sommerfeld, Hecht, Silovsky, & Chaffin, 2009), and help reduce disparities in care (Balicer et al., 2015).

Future directions: research on implementation strategies

Funding agencies in many countries, and many international entities, have prioritized research that can advance our understanding of how to improve service quality through the implementation of evidence-based care. For some funding agencies, such as the National Institutes of Health, developing and testing the effectiveness of implementation strategies is a top priority. Research needs to address such questions as: What strategies are appropriate for different interventions? What strategies are effective in which organizational and policy contexts? Which strategies are effective in overcoming which barriers? And for attaining which specific implementation outcomes? Are the strategies that are effective for initial adoption also effective for scale-up, spread, and sustained use of interventions?

Table 1 shows the hypothesized fit between barriers and strategies.

Barrier | Strategy
Limited provider knowledge | Training, education, coaching
Overestimates of actual quality | Audit and feedback
Lack of motivation to change | Champions, incentives, penalties
Beliefs and attitudes | Opinion leaders
System barriers | Task shifting, role and process redesign
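
Purely as an illustration of how Table 1 could be used in practice, the short Python sketch below renders the table as a lookup from assessed barriers to candidate strategies; the select_strategies helper and the lower-case barrier labels are assumptions introduced here, not part of the table itself.

    # Illustrative only: Table 1 as a lookup from assessed barriers to
    # candidate strategies; the helper below is a hypothetical convenience.
    BARRIER_TO_STRATEGIES = {
        "limited provider knowledge": ["training", "education", "coaching"],
        "overestimates of actual quality": ["audit and feedback"],
        "lack of motivation to change": ["champions", "incentives", "penalties"],
        "beliefs and attitudes": ["opinion leaders"],
        "system barriers": ["task shifting", "role and process redesign"],
    }

    def select_strategies(assessed_barriers):
        """Return the strategies hypothesized to fit each assessed barrier."""
        return {barrier: BARRIER_TO_STRATEGIES.get(barrier, [])
                for barrier in assessed_barriers}

    print(select_strategies(["limited provider knowledge", "system barriers"]))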

Research on implementation strategies is complicated by the wide variation in their complexity. Some are discrete, involving one process or action, such as reminders, meetings, or checklists, while others are multifaceted (Powell et al., 2012), comprising two or more strategies, such as "training + technical assistance" or "facilitation." Most complex are "blended" strategies, which involve a set of interwoven, packaged strategies, such as the "ARC" organizational intervention (Glisson et al., 2016b).

Research designs for testing strategies include effectiveness, comparative effectiveness, and cost-effectiveness approaches, usually employed in a cluster randomized or stepped-wedge design (Landsverk, Brown, Smith, et al., 2017). Hybrid designs enable simultaneous tests of interventions and implementation strategies (Curran, Bauer, Mittman, Pyne, & Stetler, 2012), while SMART designs enable disentangling the effects of components of multifaceted strategies (Kilbourne, Almirall, Eisenberg, et al., 2014; Kirchner, Waltz, Powell, Smith, & Proctor, 2017).
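
To make the stepped-wedge idea concrete, the sketch below generates an illustrative rollout schedule in which every cluster (e.g., a clinic) starts in the control condition and crosses over to the implementation strategy at a staggered, randomized time point. The numbers of clusters and periods are assumptions for illustration, not taken from the cited studies.

    import random

    def stepped_wedge_schedule(n_clusters=6, n_periods=7, seed=42):
        """Return a cluster-by-period schedule: 0 = control, 1 = strategy active."""
        rng = random.Random(seed)
        clusters = list(range(n_clusters))
        rng.shuffle(clusters)  # randomize the order in which clusters cross over
        schedule = {}
        for step, cluster in enumerate(clusters, start=1):
            # each cluster crosses over at its step and stays exposed afterwards
            schedule[cluster] = [1 if period >= step else 0
                                 for period in range(n_periods)]
        return schedule

    for clinic, row in sorted(stepped_wedge_schedule().items()):
        print(f"clinic {clinic}: {row}")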

Tackling these challenges and deriving answers to the question, "How can we most effectively and efficiently implement evidence-based practices?" is essential if we are to realize the benefits of clinical research and improve the lives of those seeking care.

Enola Proctor is the Shanti K. Khinduka Distinguished Professor at the Brown School of Social Work, Washington University in St. Louis, Missouri, USA. This paper is based on an invited plenary presented at the Improving Implementation Practice conference, VUmc Amsterdam, February 9, 2017.

 

References

Aarons, G. A., Sommerfeld, D. H., Hecht, D. B., Silovsky, J. F., & Chaffin, M. J. (2009). The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology, 77(2), 270–280. doi:10.1037/a0013223

Balicer, R. D., Hoshen, M., Cohen-Stavi, C., Shohat-Spitzer, S., Kay, C., Bitterman, H., Lieberman, N., Jacobson, O. and Shadmi, E. (2015). Sustained reduction in health disparities achieved through Targeted Quality Improvement: One-year follow-up on a three-year intervention. Health Services Research, 50,1891–1909. doi:10.1111/1475-6773.12300

Bunger, A. C., Powell, B. J., Robertson, H. A., MacDowell, H., Birken, S. A., & Shea, C. (2017). Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems, 15(15), 1-12. doi:10.1186/s12961-017-0175-y

Curran, G. M., Bauer, M., Mittman, B., Pyne, J. M., & Stetler, C. (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50(3), 217-226.

Emmons, K. M., Weiner, B., Fernandez, M. E., & Tu, S.-P. (2011). Systems Antecedents for Dissemination and Implementation. Health Education & Behavior, 39(1), 87–105. doi:10.1177/1090198111409748

Glisson, C., Schoenwald, S. K., Hemmelgarn, A., Green, P., Dukes, D., Armstrong, K. S., & Chapman, J. E. (2010). Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology, 78(4), 537-550. doi:10.1037/a0019160

Glisson, C., Williams, N. J., Hemmelgarn, A., Proctor, E. K., & Green, P. (2016a). Increasing clinicians’ EBT exploration and preparation behavior in youth mental health services by changing organizational culture with ARC. Behaviour Research and Therapy, 76, 40-46. doi:10.1016/j.brat.2015.11.008

Glisson, C., Williams, N. J., Hemmelgarn, A., Proctor, E. K., & Green, P. (2016b). Aligning organizational priorities with ARC to improve youth mental health service outcomes. Journal of Consulting and Clinical Psychology, 84(8), 713-725. doi:10.1037/ccp0000107

Kilbourne, A. M., Almirall, D., Eisenberg, D., et al. (2014). Protocol: Adaptive Implementation of Effective Programs Trial (ADEPT): Cluster randomized SMART trial comparing a standard versus enhanced implementation strategy to improve outcomes of a mood disorders program. Implementation Science, 9, 132. doi:10.1186/s13012-014-0132-x

Lewis, C. C., Stanick, C. F., Martinez, R. G., Weiner, B. J., Kim, M., Barwick, M., & Comtois, K. A. (2015). The Society for Implementation Research Collaboration instrument review project: A methodology to promote rigorous evaluation. Implementation Science, 10(2), 1-18. doi:10.1186/s13012-014-0193-x

Landsverk, J., Brown, C. H., Smith, J. D., Chamberlain, P., Curran, G. M., Palinkas, L., Ogihara, M., Czaja, S., Goldhaber-Fiebert, J. D., Vermeer, W., Saldana, L., Rolls Reutz, J. A., & Horwitz, S. M. (2017). Design and analysis in dissemination and implementation research. In Brownson, R. C., Colditz, G. A., & Proctor, E. K. (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice (2nd ed.). New York: Oxford University Press.

Lewis, C. C., Proctor, E. K., & Brownson, R. C. (2017). Measurement issues in dissemination and implementation research. In Brownson, R. C., Colditz, G. A., & Proctor, E. K. (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice (2nd ed.). New York: Oxford University Press.

Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith, J. L., Matthieu, M. M., Proctor E. K., & Kirchner, J. E. (2015). A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10(21). doi:10.1186/s13012-015-0209-1

Powell, B. J., Proctor, E. K., & Glass, J. E. (2013). A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice, 24(2), 192-212. doi:10.1177/1049731513505778

Powell, B., McMillen, C., Proctor, E., Carpenter, C. R., Griffey, R. T., Bunger, A. C., Glass, J. E., & York, J. L. (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69(2), 123-157. doi:10.1177/1077558711430690

Proctor, E. K., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., . . . Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65-76. doi:10.1007/s10488-010-0319-7

Proctor, E. K., Powell, B. J., & McMillen, J. C. (2013). Implementation strategies: recommendations for specifying and reporting. Implementation Science, 8(1). doi:10.1186/1748-5908-8-139

Rabin, B. A., Purcell, P., Naveed, S., Moser, R. P., Henton, M. D., Proctor, E. K., . . . & Glasgow, R. E. (2012). Advancing the application, quality and harmonization of implementation science measures. Implementation Science, 7(119), 1–11. doi:10.1186/1748-5908-7-119

Raghavan, R., Bright, C. L., & Shadoin, A. L. (2008). Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science, 3(1). doi:10.1186/1748-5908-3-26

Rogal, S. S., Yakovchenko, V., Waltz, T. J., Powell, B. J., Kirchner, J. E., Proctor, E. K., … Chinman, M. J. (2017). The association between implementation strategy use and the uptake of hepatitis C treatment in a national sample. Implementation Science, 12(1). doi:10.1186/s13012-017-0588-6

Steffen, K., Doctor, A., Hoerr, J., Gill, J., Markham, C., Brown, S. M., … Spinella, P. C. (2017). Controlling Phlebotomy Volume Diminishes PICU Transfusion: Implementation Processes and Impact. Pediatrics, 140(2), e20162480. doi:10.1542/peds.2016-2480

Waltz, T. J., Powell, B. J., Chinman, M. J., Smith, J. L., Matthieu, M. M., Proctor, E. K., . . . Kirchner, J. E. (2014). Expert Recommendations for Implementing Change (ERIC): Protocol for a mixed methods study. Implementation Science, 9(39), 1-12. doi:10.1186/1748-5908-9-39

 
