A Developmental Systems Guide for Child and Adolescent Behavioral Health Practitioners

19. Getting Evidence-Based Interventions to People: Implementation Science

Cover for chapter nineteen, Getting Evidence-Based Interventions to People: Implementation Science, by Sean E. Snyder, MSW, Courtney Benjamin Wolk, PhD, and John Armstrong, PhD.

Caroline is supervising a team of clinicians who are receiving a deluge of referrals for trauma-based therapy in the wake of increased gun violence in the city. She received some training in trauma-focused cognitive behavioral therapy (TF-CBT). Her clinicians are hesitant to push their clients to talk about their trauma in the systematic, therapeutic way outlined in various EBP protocols for treating child trauma. The team feels they need some support but don’t think that their agency has the resources to help them or the motivation to support therapies for child trauma exposure. Caroline knows that training only solves one part of the problem for her team, and she will look to implement best practices.

In the practice vignette, we are seeing an all-too-common issue: the demand for an evidence-based practice (EBP) outweighs the available supply of providers. This is what we call a problem of implementation. Implementation of an intervention requires more than just a clinical decision; it depends on a multitude of organizational contexts (e.g., the service environment in which a clinician operates) and factors related to the EBP itself (e.g., its adaptability). In previous chapters we looked at the evidence-based selection process for clinicians treating their clients. This chapter will give an overview of the field of implementation science, the mechanics of implementation, and a practice example of translating research into practice.

Overview of Implementation

Implementation practice refers to the use of strategies to change people’s behavior or organizational processes, which can include the rollout of an EBP. Implementation science is considered the scientific study of methods and strategies that facilitate the uptake of evidence-based practice in health care settings (Eccles & Mittman, 2006; Bauer et al., 2015). There is some overlap with quality improvement (QI), but the two differ: QI is aimed at developing a specific solution to a specific problem at the clinical or system level, whereas implementation science tends to start with an EBP that is seen as underutilized despite having an evidence base to support its use (Bauer et al., 2015). Both differ from dissemination practice, the act of sharing information to increase awareness and knowledge of a “thing,” and dissemination science, the scientific study of best practices to disseminate information across audiences and contexts (Bauer et al., 2015). In our vignette, Caroline’s team did not necessarily have an issue related to dissemination; there was some knowledge of an EBP. The issue was getting that knowledge to practice and getting that practice to scale.

The Research Pipeline and Implementation 

Our practice vignette with Caroline highlights aspects of the research pipeline. When an EBP is tested in a lab setting, we refer to that as efficacy research. Effectiveness research occurs when we move the EBP into a community setting after efficacy is established, and implementation research is the last phase of the research pipeline. Translating knowledge to practice can be a long process, and problems with implementation contribute to the 17-year average wait before innovations are used in practice (Bauer & Kirchner, 2020). We run into issues of impact and of efficiency with intervention science, with data showing that approximately 80% of medical research dollars do not make an impact on public health, largely because of lack of proper implementation (Chalmers & Glasziou, 2009). Designing for implementation does not add cost in the long run: studies of global health initiatives show that those designed for proper implementation are cost-efficient compared to initiatives that did not plan for implementation (World Health Organization, 2009).

Implementation Research Outcomes

Implementation outcomes are different from more traditional clinical outcomes (i.e., did the patient/client get better after they received the innovation); they are concerned with the proximal effects of the implementation effort and include constructs of acceptability, appropriateness, feasibility, fidelity, penetration, and sustainability (see Table 19-1; Proctor et al., 2011). It is important to attend to implementation outcomes because they can help clinicians and administrators understand why an EBP did or did not achieve the intended outcome when implemented in their setting. Without this information, one may falsely conclude that an EBP does not work for a particular population or in a particular context. Rather, it may be that the desired outcomes were not achieved because clinicians were not trained and supervised in implementing the EBP with fidelity to the core intervention components.

Caroline’s leadership team is at a crossroads: how can they meet the needs of their clients under time, capacity, and resource constraints? We will discuss how her team works through those constraints in the coming sections. Before diving into the process, let’s look at the key concepts within implementation.

Table 19.1. Implementation Outcomes
Implementation Outcome | Conceptual Meaning | Related Questions
Acceptability | The degree to which stakeholders agree with aspects of the EBP | Does the content of this EBP match our situation? Is the EBP too simple or complex? Is it meant to be delivered to our intended audience?
Appropriateness | The degree to which stakeholders view the EBP as relevant to their need | Do we believe this EBP will address our needs? Do we think it is practical to reach our intended goal?
Feasibility | The degree to which the EBP can actually be carried out in the stakeholders’ setting | Is this EBP truly suitable for us? Can we realistically implement this to create meaningful change?
Fidelity | The degree to which the EBP is implemented as intended | Are we adhering to the methods this EBP requires? If we’re modifying the EBP in any way, what effects might that have on outcomes?
Penetration | The degree to which a practice is used within a service setting and its subsystems | Do clinicians trained in the EBP deliver the EBP with eligible clients?
Sustainability | The degree to which an EBP can be maintained over time | How much maintenance does this EBP require? Do we have the necessary resources to maintain its use? How easily can implementation be incorporated into organizational and clinical routines?

The Mechanics of Implementation

It may be helpful to unpack some of the overlapping terms in the field of implementation. To keep it simple, what we are implementing can be loosely called “the thing,” and the implementation strategies are considered “the how” (Curran, 2020).

Implementing “the Thing”

Implementation science can appear convoluted, with its many terms and 61 identified dissemination and/or implementation theories, frameworks, and models (Tabak et al., 2012), each with its own particular language or overlapping concepts. A common practice in communicating scientific knowledge to lay persons or non-specialists is to use nonscientific language. Geoffrey Curran, an often-cited implementation researcher, offers a deliberately simple model to explain implementation science. He defines the intervention/practice/innovation as the thing. Effectiveness research attempts to see if the thing works, and implementation research attempts to understand how to best do the thing. Implementation strategies are referred to as the stuff we do to help people and settings do the thing. Defining the thing can be one of the most difficult aspects of implementation work. Oftentimes, our things can fit into the 7-Ps defined by Brown and colleagues (2017): programs, practices, principles, procedures, products, pills, and policies. In this chapter, the thing Caroline’s team wants to roll out is the practice of trauma-focused cognitive behavioral therapy (TF-CBT). The stuff Caroline’s implementation support team will do to help her clinicians do TF-CBT will be the implementation strategies discussed later in this chapter.

It is important to note that there isn’t always one perfect EBP, or thing, to implement in a particular scenario. The available efficacy and effectiveness research may not directly map onto your exact client population or setting. However, EBPs can be delivered flexibly, while maintaining fidelity, to allow for personalization to client needs (Kendall, 2022). See, for example, Chorpita et al. (2011) for information on how to select EBPs to meet client needs.

Implementation Frameworks

Frameworks are outlines, overviews, and systems of thinking; they are descriptive and not explanatory. They do not outline the process; rather, they provide the context and architecture around any process or theory (Nilsen, 2015). As mentioned previously, there are numerous dissemination and implementation theories, frameworks, and models. This chapter will highlight those that are readily actionable for readers and have open access supports for implementers to utilize. Some frameworks are determinant frameworks, that is, they help us understand the relevant contextual aspects of implementation. There are also process models, which describe the steps for putting the research into practice; these tend to not have the same degree of explanatory qualities as theories (Nilsen, 2015).

The Consolidated Framework for Implementation Research. The consolidated framework for implementation research (CFIR), described by Damschroder and colleagues (2009), represents an effort to synthesize the health services implementation literature. It provides “an overarching typology – a list of constructs to promote theory development and verification about what works where and why across multiple contexts.” The five major domains described by CFIR include intervention characteristics, outer setting, inner setting, characteristics of individuals involved, and the implementation process. Intervention characteristics are the features of an intervention being implemented into a particular organization; they involve considerations such as whether there is a need to adapt an existing intervention to improve fit. The outer setting includes the economic, political, and social context within which an organization exists, for example, the policies that drive the practices selected for implementation and the characteristics of the community the organization serves. The inner setting includes the structures, organizational norms, and cultural processes within which implementation occurs. The fourth domain relates to the individuals involved with implementing the intervention, including the impact of individual choices and influence on implementation. The final domain is the implementation process; successful implementation requires an active change process to achieve widespread use of interventions with fidelity. The CFIR website includes more information about these constructs and helpful, free tools and resources.

EPIS Framework. One of the most widely used implementation science frameworks is known as the exploration, preparation, implementation, and sustainment (EPIS) framework (Aarons et al., 2011). The EPIS model has been applied to juvenile justice settings (Knight et al., 2015) as well as public mental health and child-welfare service settings (Moullin et al., 2019). The EPIS framework covers the different phases of implementation as well as the contextual factors that can organize implementation activities. These include outer contexts, inner contexts, innovation factors, and bridging factors. The outer context spans factors external to the organization such as the service environment, funding/contracting factors, networks and interorganizational relationships, and client characteristics (Aarons et al., 2011). The inner context considers determinants such as organizational characteristics that span leadership, organizational culture and climate, readiness for change, and quality and fidelity monitoring (Aarons et al., 2011; Moullin et al., 2019), as well as individual characteristics of the organization that span staff and leader attitudes, skills, and demographics. Bridging contextual factors reach across both the outer and inner contexts; intermediaries like purveyor organizations or community-academic partnerships often serve in these roles (Aarons et al., 2011). The innovation factors are related to the EBP that is to be implemented in the organization and the fit and adaptability of the EBP to the organization, provider, and client. We will explore how this process model is used at the end of the chapter.

Theoretical Models

Theories describe and explain how individuals, organizations, and systems change; they can be abstract, specifying how particular relationships lead to particular events (Nilsen, 2015).

Theory of Planned Behavior. The theory of planned behavior (Ajzen, 1991) asserts that motivational factors and perceived behavioral control influence one’s intention to perform a behavior, and this intention will become actual behavior if given the opportunity. Motivational factors include constructs such as attitudes, subjective norms, and social norms. For example, if a person is motivated to use an EBP, believes other clinicians like them use the EBP, believes use of the EBP would be appreciated and rewarded in their organization, and believes they have the requisite knowledge and skill to use the EBP, they will use the EBP if given the opportunity. In the context of implementation, implementation strategies may look to capitalize on available motivation or address its absence.
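For readers who find a schematic summary helpful, the theory’s core relationships are often rendered as a weighted combination of its determinants. The expression below is an illustrative sketch rather than a formula taken from Ajzen (1991); the weights $w_1$, $w_2$, and $w_3$ are hypothetical placeholders that would be estimated empirically and would vary by behavior, population, and setting.

$$\text{Intention} = w_1(\text{Attitude}) + w_2(\text{Subjective Norm}) + w_3(\text{Perceived Behavioral Control})$$

$$\text{Behavior} \approx f(\text{Intention},\ \text{Perceived Behavioral Control})$$

In implementation terms, a strategy that strengthens any of these components, for example, making EBP use visibly valued by peers and leadership, should increase a clinician’s intention to use the EBP and, given the opportunity, actual use.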

Theoretical Domains Framework. Building on the work of Michie and colleagues (2005) and Cane and colleagues (2012), the theoretical domains framework (TDF) aims to group constructs from implementation theories into domains to inform the identification of determinants of behavior. Atkins et al. (2017) also present a guide to using the TDF. The TDF can be helpful in informing the theoretical basis for implementation studies, understanding the scope of potential reasons for slow diffusion of knowledge to practice, and offering methods to make knowledge actionable. The domains discussed in the TDF include knowledge; skills; social/professional role and identity; beliefs about capabilities; optimism; beliefs about consequences; reinforcement; intentions; goals; memory, attention, and decision processes; environmental contexts and resources; social influences; emotion; and behavioral regulation. To view the constructs within these domains, consult the table from the guide offered by Atkins and colleagues (2017). These domains and constructs cue us into what can facilitate or get in the way of implementing an EBP.

Implementation Strategies

Implementation strategies are, as discussed earlier, the stuff we do to help people and settings do the thing (Curran, 2020). That is, implementation strategies are the methods used to help increase adoption of an EBP, to support the implementation of an EBP in an organization, and/or to promote sustainment of the EBP (Proctor, Powell, & McMillen, 2013). There are at least 73 recognized implementation strategies (Powell et al., 2015), including strategies to support dissemination of information about the EBP; strategies that aim to improve the implementation process (e.g., by engaging stakeholders or adapting the EBP); strategies to help integrate the EBP into the practice context (e.g., by modifying record keeping and supervision); strategies to build capacity for the EBP (e.g., through technical assistance and support); and strategies to support scale-up (e.g., training, developing toolkits; see Leeman et al., 2017). In social work organizations, commonly utilized implementation strategies include training/workshops, supervision, policy mandates to use the EBP, and financial strategies (e.g., an enhanced reimbursement rate). The complete list can be found online in Powell et al.’s (2015) supplementary material and tables. As Powell and colleagues discuss, these strategies do not constitute a checklist required for any implementation effort; they merely present the breadth of strategies that can be packaged together in an implementation plan. On the CFIR website, you can find a tool for matching implementation strategies to barriers identified within different CFIR domains (i.e., the CFIR-ERIC matching tool; see also Waltz et al., 2015).

Clinician Exercise

  • Think of a current service problem in your practice setting

  • Is there a practice gap that requires a new intervention roll-out?

  • Or do you have something that works but it’s difficult to sustain and scale?

  • What beyond training would you need to do to improve your practice setting?

Example: EPIS Process

Let’s look at the practice vignette and consider how Caroline’s team may attempt to approach their problem. We will look at the process of implementing TF-CBT in a community mental health clinic through the lens of the EPIS framework. Note that while we use TF-CBT as the example here for illustrative purposes and continuity throughout this chapter, the same process can be applied to other situations and EBPs. For example, a community clinic that is increasingly receiving referrals for preschoolers with behavioral challenges could apply this process to their implementation of parent-child interaction therapy (PCIT; Funderburk & Eyberg, 2011). Or a school seeing an increase in aggression and conduct problems may apply the same process to the implementation of the Coping Power program (Larson & Lochman, 2002).

Exploration Phase

Exploration begins when implementers and an organization identify a need. Following the identification of the need, various EBPs are evaluated to determine those most likely to achieve the desired outcomes. While doing so, implementers and relevant stakeholders consider systemic, organizational, and individual levels of adaptation that may be required to ensure the EBP represents the best possible fit for their organization (Moullin et al., 2019). The exploration phase ends when implementers and stakeholders choose an EBP and outline the initial expected adaptations necessary to ensure effective implementation.

As a leader, Caroline knew that TF-CBT was efficacious and had been widely implemented in different settings. So the thing itself seemed appropriate, but Caroline had to explore aspects of the outer sociopolitical context and funding availability. Would insurance and managed care organizations support the rollout of TF-CBT with an enhanced rate? How would she cover the cost of training? She knew that there was a network in her local department of behavioral health that could offer training and used direct networking to get to the right administrators in that department. She knew this partnership could take time, but it was worth the effort of reaching out and taking things slow.

What about the inner context? Caroline was a champion for this intervention and thought that she could lead the charge. The main barrier in her organization was the climate: folks were feeling burned out and potential individual adopters had little perceived need to change. They felt comfortable with their usual treatment. Caroline had to consider organizational readiness before moving into preparing for potential implementation.

Preparation Phase

Preparation involves a deeper consideration of systemic, organizational, and individual-level needs prior to implementation. Implementers and relevant stakeholders collaborate to determine potential barriers and facilitators to implementation. The implementation team also plans for expected implementation support (e.g., training, coaching, audit and feedback) to create an environment where the EBP is valued and supported at all levels within the organization (Moullin et al., 2019).

Caroline’s networking resulted in a meeting with a key administrator in her public mental health system. They were open to training her team but needed to apply for an internal grant. This process forced Caroline to consider the steps for preparation. She needed to determine the funding stream for the service (continued billing), set up the actual training, identify other champions of the intervention to support group supervision in TF-CBT, build out a fidelity checklist for use in supervision of clinicians, consider appropriate staffing patterns to continue to offer non-trauma psychotherapy services, determine how youth would be referred for TF-CBT, and prepare to evaluate the outcomes of rolling out the service.

Implementation Phase

Implementation occurs when the use of the EBP begins within the organization. Implementers and relevant stakeholders engage in an iterative monitoring of the use of the EBP to assess for unforeseen challenges or needs. Where necessary, adjustments are made to implementation strategies and supports to meet expectations for effective implementation (Moullin et al., 2019).

To support the roll out of TF-CBT, Caroline consulted the go-live checklist. This ensured she was covering all aspects of implementing TF-CBT at her clinic. She made sure there were adequate monitoring and feedback systems in place and planned to pilot cases as a way to get iterative feedback.

Sustainment Phase

Sustainment consists of continuous monitoring of implementation, as well as internal and external factors that affect implementation (Moullin et al., 2019). Successful sustainment occurs when the EBP is properly adapted and ingrained within the organization to achieve the desired impact upon its intended need.

Six months into implementing, Caroline was seeing some success. The main problem was continued staffing. One trained clinician transferred within the clinic, another was promoted to supervisor, and a third left the agency. Caroline had planned for some turnover and was able to work in a second cohort of trainees later in the year. She allocated money in her grant for ongoing consultation with a TF-CBT master trainer for support and generalizable learning from the original workshop training. “If I went back and did it all over again, I wouldn’t change a thing,” Caroline said. “It was a lot of up-front work to make sure we were ready, that it was something we could continue to do; ultimately, we want youth to get access to care and for care to be there for the next family that needs it.”

Clinical Voices: Implementation Science with Courtney Benjamin Wolk, PhD

Courtney Benjamin Wolk, PhD is an Assistant Professor at the Penn Center for Mental Health at the Perelman School of Medicine. She is a licensed clinical psychologist and an implementation scientist. The long-term goal of her research is to develop and evaluate strategies to promote the uptake of evidence-based care into routine practice, with the ultimate goal of improving the effectiveness of mental health services for children and adults in non-specialty mental health settings. She completed both her MA and PhD in Clinical Psychology at Temple University, where she focused on the development and evaluation of cognitive-behavioral therapies (CBT) for child and adolescent anxiety. She completed an APA-accredited pre-doctoral internship in clinical psychology at Children’s National Medical Center in Washington, D.C.

Sean E. Snyder, LCSW: This clinical dialogue is going to focus on implementation science. Dr. Wolk, could you give a little bit of background about yourself to start?

Courtney Wolk, PhD: I’m a clinical psychologist by training, and I identify as an implementation science researcher. I’m currently an Assistant Professor at the Penn Center for Mental Health, which is in the psychiatry department at the University of Pennsylvania’s Perelman School of Medicine. Most of the work that I do right now is focused on how to implement and integrate evidence-based mental health interventions into settings where mental health is not traditionally a part of the service model. I’m really interested in non-specialty mental health settings and some of the implementation challenges that arise in those settings that are unique compared to more traditional behavioral health settings. A lot of the work that I do is situated in either schools or primary care clinics.

Snyder: Wonderful. Now the big word of the day, implementation. The formal chapter will explain the technical definitions related to implementation science, so I would love to hear how you would explain what implementation science is to a layperson who doesn’t know anything about it.

Wolk: Great question. Implementation science is really the scientific study of methods to support the implementation or the use/uptake of evidence-based interventions in real-world settings. We in the field are really interested in how we take practices that we know are effective through established research and help spread them so that people in the community can access those treatments when they need them. We focus on different strategies to help support clinicians and organizations to use evidence-based or best practices.

Snyder: I know from your experience as a clinical psychologist that it’s all too familiar a case where a smart clinician like yourself gets trained in an EBP, then you see that there’s a problem with that EBP not getting to the kids. I want to hear more about your particular story. How did you get into implementation science? When we hear about the career dreams and aspirations of children, we hear them say they want to be a psychologist when they grow up or be an astronaut or a pop star. You don’t hear kids saying they want to be an implementation scientist. How did you get there?

Wolk: It’s actually a lot of what you started to describe about training and the gap you see in practice. As part of my graduate training and my internship, I got trained in all these best-practices and evidence-based interventions for kids, such as CBT for anxiety and PCIT. I saw how effective these treatments were through the research, but more importantly, in the kids that I was working with. I was always struck by hearing from families, time and time again, about how long it had taken them to find an effective treatment for their child. It was pretty common for families to say, “You are the third therapist that we’ve come to,” or “My child’s been on medication for this for three years and nothing has helped us so far.” Then they got to us and in 12 weeks or 16 weeks of working through an evidence-based program with them we could see really dramatic changes in their child’s symptoms, behavior, and their overall functioning. Families were thrilled and would often say, “Why hasn’t anyone done this before, why did it take us so long to find this particular program?” I saw this over and over again and it was often luck that they happened upon an EBP. Or maybe the child had a parent who had some medical training or health education, where they knew what to look for and found their way to the right place.

I was so frustrated that so many of these kids and families were struggling for so long, when we had these treatments that work and that could really help their kids. It was so hard for them to get EBPs and as I thought about what were the next steps in my training, I really wanted to get training to address the questions, “How do we actually help clinicians and systems use these best practices? How do we make sure that people are getting trained in them? How do we make sure that when a family shows up at their local community clinic there’s someone there who is trained in these best practices and that the child can get help right away, instead of having parents spend years searching for someone who can best help their kid?”

That’s how I stumbled into implementation science, which at the time was and still is a relatively new field. It builds upon many established fields, from social psychology to organizational management, to other areas like quality improvement. Implementation science was this new way of thinking about evidence-based practice implementation in community health settings. I got really excited about potentially having some actual tools and methods to guide bringing these practices to families who needed them.

Snyder: I heard you say that it almost came from a place of empathy and sympathy. When you are a community clinician, you can really empathize with the families about what they’re going through because they want the best care for their kids, and we know about the faults of our systems of care. There’s always this constant parity issue between medical care and behavioral health. Here it is more about disparity in the access and actionability of our knowledge. This makes me ask, does implementation science have an inherent social justice spirit?

Wolk: I think that’s absolutely true; I think every child and family and individual deserves to get the best that we have to offer as clinicians. They shouldn’t have to work so hard to find effective treatments when they exist, and we see this, as you mentioned, in other areas of healthcare as well, where there’s this real gap between the development of evidence-based practices and when they actually are diffused widely in the community. This access and actionability is not unique to mental health, but it is particularly problematic for behavioral health. Many of our psychosocial interventions are pretty complex and many of our clinicians in the community don’t have a lot of exposure to them as part of their training. Clinicians want to do what’s best for their clients, but they don’t always have the right tools in their toolkit in terms of some of these particular evidence-based practices.

And so I think we need to do a better job of preparing the workforce, and most importantly, preparing organizations to support and sustain these practices. The hope is that a sustainable model can provide better access to families. Everyone deserves and should have access to effective treatments when they exist.

Snyder: The point of this book as an open access resource is to get people access to evidence-based information and hopefully prepare our workforce a little better. But we know what happens in textbooks needs to be tailored to the clinician’s reality on the ground. And reading this book isn’t enough. There are a lot more steps that go into adopting an EBP. In your practice working with organizations, what do you see as the biggest barriers or challenges for organizations when adopting or sustaining use of an EBP?

Wolk: These are often challenging jobs, where there are really high productivity demands on clinicians in the community, where they have very large caseloads and often have many administrative responsibilities like paperwork and billing. There’s not a lot of time for extra training or extra supervision in these practices, and there’s often not a lot of extra money in these clinics to invest in training or to protect clinician time for the extra supervision needed to learn a new practice. It’s really hard for anyone to absorb a new EBP into their workflow when there’s not a lot of extra time or money to support that.

Snyder: There are different contexts, from the outer context where it is something about funding and contracting (Are they getting reimbursed differently? Is there money available for this?) to inner context factors like the organizational culture and staffing patterns. Those contexts are from the EPIS model. Are there other things from that model that pop out to you in regard to key implementation factors?

Wolk: Leadership is also something that we find time and time again is critically important. With implementation, it is critical to know if there is a champion or a leader in that organization, who believes in evidence-based practice and will help their staff carve out the time to really develop their expertise in these practices and make sure they have the time and flexibility they need to implement them. Another thing that happens a lot is policy mandates about using a specific practice. And it might not always be the right fit at that moment for that particular organization or for the skill set of the clinicians who are working there. Sometimes there’s a mismatch with something that needs to happen or there’s not a lot of time to really develop the infrastructure and to do the trainings that are needed to do it well. Everyone does their best, but these are sometimes complicated interventions to learn and master. We don’t always give people enough time and support to really take that on.

Snyder: And maybe that’s where capacity issues could be addressed through community-academic partnerships or other types of arrangements. I would be interested to hear from your experience, with the University of Pennsylvania as your academic home and Philadelphia as the home for your service contexts: do organizations in Philadelphia reach out to your team? Or is it your team reaching out to organizations? How do these partnerships typically develop?

Wolk: It happens in different ways. Sometimes we have a particular program that we’re really interested in helping to support or spread into the community, and we approach different potential clinics or partner sites to see if they’re interested and if they have the ability to take that on and partner with us. Then we go about figuring out a plan for how to bring it to their site.

Other times it’s more system driven, in that the system has identified a particular need or gap or there’s a mandate that has come down from a funder or other organization that has identified a need to do something different. They might find us and seek our support or our advice about how they could do that. Over the years, our Center has developed some pretty long-standing relationships with school districts and with behavioral health payers in the area, and because we’ve established those relationships and been able to help support some of their previous EBP efforts, they may come to us for advice or support when they have something new that they want to roll out or when they’ve identified that there’s a gap and they want someone to help them  work through what some potential solutions might be.

So sometimes we try to push into sites to see if we can help bring something to them, but oftentimes it works well when they have already identified a need and then we’re able to be there to help support them in making a change.

Snyder: Could you share about a project to give us a sense of what that looks like?

Wolk: I can tell you about our efforts in partnership with Community Behavioral Health in Philadelphia to bring BRIDGE, an evidence-based teacher consultation model, to public schools in Philadelphia. This arose a couple of years ago out of an identified need in the system and the school district to bring some new evidence-based practices to the clinical training of the community providers who are working in schools. BRIDGE was selected by key stakeholders involved with school mental health services in the city. At that time, a payer organization, Community Behavioral Health, wanted to invest in BRIDGE.

Our team at Penn has a history of doing a lot of teacher consultation work in autism support classrooms and had an existing relationship with the developer of BRIDGE, and so we were able to work with them to develop a plan to build capacity. It’s been something we’ve been working towards for a couple of years now, in partnership with them, to implement it system wide.

Once BRIDGE was identified as something helpful and acceptable for Community Behavioral Health and the schools, our team worked with the intervention developer, Lisa Capella at NYU, to make sure we were well trained in BRIDGE, and worked together with her to make some adaptations to the model in our training and implementation plan to really fit the context in Philadelphia. We think it’s really important that we work closely with the developer on that because we don’t want to make any adaptations that would compromise the fidelity of this existing model, which is effective. On the other side, our team has a lot of experience working in the system and in the school district, and so we want to make sure that we customize BRIDGE as much as possible so that it’s a good fit for the clinicians who will be working in the schools.

We’re gearing up in a couple of months to start training clinicians. We’ll be doing a couple days of workshops with them and a lot of consultation and coaching both live in the schools, and as they develop mastery, we will continue to work with them by phone over the course of this whole year. The ongoing consultation is to help them really build their confidence and their comfort with using the BRIDGE intervention strategies to support teachers in the schools that the clinicians are working with. We’ll be starting small with about 15 or 20 clinicians this fall and the hope is that we’ll learn and iterate and continue to refine things in collaboration with the developer. Then we would be able to continue to scale this up over the next few years, with all the clinicians who are in the system, working in K-8 schools in the school district.

Snyder: Everything is iterative! What was really nice to hear in your response is the idea of designing for implementation. Designing for implementation takes time; it is not something where you look up an intervention on the California Clearinghouse and then it happens in a couple of weeks. Designing for implementation requires steps like organizational preparation and consideration of ongoing supports in the actual implementation stage. Or, for instance, with the EPIS model, there is the exploration phase (to make sure the thing is appropriate), the preparation phase (to prep to do the thing), and the implementation phase (where we do the thing). All of that preplanning leads to sustainability, which is the ideal. We want to spread and scale interventions and sustain them to ensure continued access to treatments that work. Designing for implementation also helps ensure its cost effectiveness.

Looking ahead, implementation science has a bright and varied future, because it’s not only related to behavioral health. In fact, a lot of the work of implementation science has been in public health or health care settings. Considering the breadth of implementation science in settings, what’s the future look like for the field?

Wolk: It does cut across many different disciplines. I think there are a lot of opportunities in implementation science to think about how we can develop generalizable knowledge that will work across clinical conditions across different types of sites. That could help streamline the implementation process; right now, the best practice for implementation is to tailor and customize a lot to the particular site, to the population. That can be quite effective, but it also is extremely time and resource intensive. There’s a lot of interest in developing and  harnessing rapid approaches for developing and testing and iterating on potential strategies to support implementation, and from there, we can better understand what things need to be customized and tailored. It’s differentiating between where you need to spend that intensive time and what you can streamline. There could be go-to strategies for particular things that you’re trying to implement or strategies for particular settings or challenges. As a field, we’re starting to develop some of that knowledge, but I think we have a long way to go. We want our work to continually be effective, but we do want to streamline things so that it’s feasible to embark on this work with more and more sites and to bring more things to scale. That is where we can really have a big impact.

Snyder: For the novice implementer or someone who has now caught the fire of implementation, where can they go to learn more?

Wolk: There are some great institutes and resources that are available both for people who want to develop expertise in implementation science and for people who want to practically understand some of the principles and practices to support implementation in the setting that they work in.

The Society for Implementation Research Collaboration website has some nice resources organized, which is a good place to start. There are some websites that are more geared towards practical implementation and some sites for more advanced learning. There are also week-long training institutes, for example, the Implementation Science Institute that runs every year at Penn. It’s a week-long course that will give you a much more in-depth understanding of the field, which may be more appropriate if you’re in a leadership role or if you’re overseeing your organization or system’s efforts to bring evidence-based practices to your setting.

Snyder: Great resources all around. Any last parting words or nuggets of wisdom?

Wolk: A lot of times when people think about implementing an evidence-based practice, they think about training for their staff in that practice. And we know that this is a really important foundational thing that needs to happen. People have to get trained in the practice, but training alone is not sufficient. If you really want people to change their behavior and start integrating evidence-based practices more into their work, you need to go beyond just training and think about ongoing consultation or supervision. You need to think about other ways that the organization can support the use of best practices, so that people are motivated to use them and they have the support and resources that they need to implement them. There’s not an easy fix; a workshop alone is typically not enough because these efforts really require an investment to make sure that you are doing it right and that you can sustain the practice.

Snyder: Yes, that harkens back to designing for implementation, doing things the right way so that people can get the treatments they need and that more and more people can continue to have access to quality care. To wrap up, implementation science really is a social justice vehicle. We need to get things to people who need them and who ultimately deserve them.

Things Clinicians Should Know

Implementation practice: the use of strategies to change people’s behavior or organizational processes.

Implementation science: the scientific study of methods and strategies that facilitate the uptake of evidence-based practice in health care settings.

Quality improvement: the process of developing a specific solution to a specific problem at the clinical or system level.

Implementation frameworks: outlines, overviews, and systems of thinking that are descriptive and not explanatory.

Implementation theories: describe and explain how individuals, organizations, and systems change.

Implementation strategies: ways to help people and settings do an evidence-based practice or innovation.

Open Access Tools

The Consolidated Framework for Implementation Research (CFIR)

EPIS Framework (EPIS)

Orientation to the Science of Dissemination and Implementation (Intro Series)

Society for Implementation Research Collaboration (SIRC)

References

Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and policy in mental health, 38(1), 4–23. https://doi.org/10.1007/s10488-010-0327-7

Ajzen, I. (1991). The theory of planned behavior. Organizational behavior and human decision processes, 50(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T

Atkins, L., Francis, J., Islam, R., O’Connor, D., Patey, A., Ivers, N., Foy, R., Duncan, E. M., Colquhoun, H., Grimshaw, J. M., Lawton, R., & Michie, S. (2017). A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems. Implementation science, 12(1), 77. https://doi.org/10.1186/s13012-017-0605-9

Bauer, M. S., Damschroder, L., Hagedorn, H., Smith, J., & Kilbourne, A. M. (2015). An introduction to implementation science for the non-specialist. BMC psychology, 3(1), 32. https://doi.org/10.1186/s40359-015-0089-9

Bauer, M. S., & Kirchner, J. (2020). Implementation science: What is it and why should I care? Psychiatry research, 283, 112376. https://doi.org/10.1016/j.psychres.2019.04.025

Brown, C. H., Curran, G., Palinkas, L. A., Aarons, G. A., Wells, K. B., Jones, L., Collins, L. M., Duan, N., Mittman, B. S., Wallace, A., Tabak, R. G., Ducharme, L., Chambers, D. A., Neta, G., Wiley, T., Landsverk, J., Cheung, K., & Cruden, G. (2017). An overview of research and evaluation designs for dissemination and implementation. Annual review of public health, 38, 1–22. https://doi.org/10.1146/annurev-publhealth-031816-044215

Cane, J., O’Connor, D., & Michie, S. (2012). Validation of the theoretical domains framework for use in behaviour change and implementation research. Implementation science, 7(1), 37. https://doi.org/10.1186/1748-5908-7-37

Chalmers, I., & Glasziou, P. (2009). Avoidable waste in the production and reporting of research evidence. The Lancet, 374(9683), 86–89. https://doi.org/10.1016/S0140-6736(09)60329-9

Chorpita, B. F., Bernstein, A., & Daleiden, E. L. (2011). Empirically guided coordination of multiple evidence-based treatments: An illustration of relevance mapping in children’s mental health services. Journal of consulting and clinical psychology, 79(4), 470–480. https://doi.org/10.1037/a0023982

Curran G. M. (2020). Implementation science made too simple: A teaching tool. Implementation science communications, 1, 27. https://doi.org/10.1186/s43058-020-00001-z

Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation science, 4(1), 50. https://doi.org/10.1186/1748-5908-4-50

Funderburk, B. W., & Eyberg, S. (2011). Parent–child interaction therapy. In J. C. Norcross, G. R. VandenBos, & D. K. Freedheim (Eds.), History of psychotherapy: Continuity and change (pp. 415–420). American Psychological Association. https://doi.org/10.1037/12353-021

Kendall, P. C. (Ed.). (2022). Flexibility within fidelity: Breathing life into a psychological treatment manual (pp. 42–60). New York: Oxford University Press.

Knight, D., Belenko, S., Robertson, A., Wiley, T., Wasserman, G., Leukefeld, C., DiClemente, R., Brody, G., Dennis, M., & Scott, C. (2015). Designing the optimal JJ-TRIALS study: EPIS as a theoretical framework for selection and timing of implementation interventions. Addiction science & clinical practice, 10(1). https://doi.org/10.1186/1940-0640-10-S1-A29

Larson, J., & Lochman, J. E. (2002). Helping school children cope with anger: A cognitive-behavioral intervention. New York: Guilford.

Michie, S., Johnston, M., Abraham, C., Lawton, R., Parker, D., Walker, A., & “Psychological Theory” Group. (2005). Making psychological theory useful for implementing evidence based practice: A consensus approach. Quality & safety in health care, 14(1), 26–33. https://doi.org/10.1136/qshc.2004.011155

Moullin, J. C., Dickson, K. S., Stadnick, N. A., Rabin, B., & Aarons, G. A. (2019). Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implementation science, 14(1), 1. https://doi.org/10.1186/s13012-018-0842-6

Nilsen P. (2015). Making sense of implementation theories, models and frameworks. Implementation science, 10, 53. https://doi.org/10.1186/s13012-015-0242-0

Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith, J. L., Matthieu, M. M., Proctor, E. K., & Kirchner, J. E. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation science, 10, 21. https://doi.org/10.1186/s13012-015-0209-1

Proctor, E. K., Powell, B. J., & McMillen, J. C. (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation science, 8, 1–11. https://doi.org/10.1186/1748-5908-8-139

Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., Griffey, R., & Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and policy in mental health, 38(2), 65–76. https://doi.org/10.1007/s10488-010-0319-7

Tabak, R. G., Khoong, E. C., Chambers, D. A., & Brownson, R. C. (2012). Bridging research and practice: Models for dissemination and implementation research. American journal of preventive medicine, 43(3), 337–350. https://doi.org/10.1016/j.amepre.2012.05.024

Waltz, T. J., Powell, B. J., Matthieu, M. M., Damschroder, L. J., Chinman, M. J., Smith, J. L., Proctor, E. K., & Kirchner, J. E. (2015). Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation science, 10(1), 109. https://doi.org/10.1186/s13012-015-0295-0
