
Study protocol: Novel Methods for Implementing Measurement-Based Care with youth in Low-Resource Environments (NIMBLE)

Abstract

Background

For youth receiving care in community mental health centers, comorbidities are the rule rather than the exception. Using measurement-based care (MBC), or the routine evaluation of symptoms to inform care decisions, as the foundation of treatment for youth with comorbid problems significantly improves the impact of psychotherapy by focusing care and building engagement and alliance. MBC increases the rate of symptom improvement, detects clients who would otherwise deteriorate, and alerts clinicians to non-responders. Despite its demonstrated utility, MBC is rarely implemented with fidelity; less than 15% of providers report using MBC per recommendations. Previous efforts to support MBC implementation have yielded suboptimal outcomes, in part due to organizations’ challenges with identifying and prioritizing barriers and with selecting and developing strategies to overcome them. New methods are needed for identifying and prioritizing barriers and for matching strategies to barriers, in order to optimize MBC implementation and treatment quality and thereby improve youth mental health outcomes in community settings.

Methods

Pragmatic implementation methods will be piloted in four diverse community mental health centers. Methods include (a) rapid evidence synthesis; (b) rapid ethnography; (c) design kits (e.g., kits with disposable cameras, journals, maps); (d) barrier prioritization; and (e) causal pathway diagramming. These activities will generate actionable barriers; subsequently, we will use facilitated group processes to prioritize barriers and develop causal pathway diagrams to match strategies to barriers to create implementation plans that optimize MBC fidelity (Aim 1). We will track strategy deployment for 6 months, then compare MBC fidelity for another 6 months post-implementation with data from 2 years of historical controls (Aim 2). Finally, we will co-design a toolkit for design kit methods with youth and the practice and scientific communities (Aim 3).

Discussion

Optimizing MBC implementation in community mental health centers could transform youth mental health care by ensuring the most pressing symptoms are targeted early in treatment. The discussion section highlights expected challenges and limits to using the five methods, including recruitment and engagement given the high pressure on community mental health settings.

Trial registration

Clinicaltrials.gov. NCT05644756. Registered on 18 November 2022. This trial was retrospectively registered.

Background

For youth receiving care in community mental health centers (CMHCs), comorbidities are the rule rather than the exception [1, 2]. Using measurement-based care (MBC) as the foundation of treatment for youth with comorbidities significantly improves the therapeutic impact, as it can help facilitate meaningful communication and define treatment focus [3, 4]. MBC, the systematic, routine evaluation of symptoms to inform care decisions, is an evidence-based framework that offers improvements over usual care [5]. Especially in youth, MBC increases the rate of symptom improvement [1], detects clients who would otherwise deteriorate [6], and alerts clinicians to non-responders [6, 7]. Implementing MBC with fidelity requires three elements: (1) Collect, or routine administration of measures for symptoms, outcomes, and processes before therapy sessions; (2) Share, or clinician and client score review; and (3) Act, or collaborative reevaluation of the treatment plan [8]. However, MBC is rarely implemented with fidelity outside the confines of randomized controlled trials: less than 15% of providers report using MBC per recommendations [9], and disparities in implementation exist [9, 10].

Previous efforts to support MBC implementation in practice have yielded suboptimal outcomes in large part because, as is common in “implementation as usual,” strategies to support MBC implementation are not matched to important contextual factors; instead, they are selected based on personal preference, organizational routine, ISLAGIATT (“it seemed like a good idea at the time”), and other criteria. Research has shown that tailoring implementation strategies to address high-priority implementation barriers can increase MBC fidelity, at least with respect to one element: routine administration of measures of symptoms, outcomes, and processes before therapy sessions [11,12,13]. Our experience suggests, however, that for tailored implementation to realize its full potential, new methods are needed for identifying and prioritizing implementation barriers and matching strategies to high-priority barriers [14].

The Novel Methods for Implementing Measurement-Based Care in Low-Resource Environments (NIMBLE) study seeks to tailor implementation plans to improve MBC fidelity by empowering clinic staff at all levels to identify, prioritize, and address barriers within their clinics. NIMBLE is driven by three aims that test and refine methods to tailor implementation plans intended to improve MBC fidelity in community mental health clinics: Aim 1—to co-create tailored plans for improving MBC fidelity in community mental health centers; Aim 2—to evaluate the effect of tailored implementation relative to implementation-as-usual on fidelity to MBC practices, by comparing clinicians’ MBC fidelity after implementation planning with historical controls; Aim 3—to co-design toolkits for IMPACT methods piloted in Aim 1 via user-centered design workshops.

Methods/design

Context

IMPACT Center

The IMPACT Center is a collaboration between the Kaiser Permanente Washington Health Research Institute, the University of Washington, and the University of Michigan. NIMBLE is one of three R34-like IMPACT signature projects. The IMPACT Center, funded by the National Institute of Mental Health ALACRITY mechanism [15], will deliver practical implementation science to solve public health problems. IMPACT will refine practical, relevant, and rigorous implementation strategies, methods, and toolkits to accelerate the ability of under-resourced community settings to implement evidence-based practices with fidelity to improve youth mental health outcomes. Toward this end, the IMPACT Center will overcome three challenges: Challenge I—community settings undertaking EBP implementation face dozens of barriers, and existing methods for prioritizing these barriers do not typically draw on the literature or sufficiently engage practice partners; Challenge II—there is little to no guidance for how to match implementation strategies to prioritized barriers; Challenge III—discrete implementation strategies are rarely optimized and often result in costly and complex activities that tax under-resourced settings. The IMPACT Center has two cores: a methods core that brings together interdisciplinary expertise in implementation methods, and an administrative core that evaluates the center and meaningfully incorporates stakeholders in IMPACT methods and research.

Washington State CBT+ initiative (CBT+)

The Washington State CBT+ initiative (called CBT+) provides an ideal natural laboratory for IMPACT. CBT+ is an academic-community partnership funded by the state’s Division of Behavioral Health and Recovery [16]. Over nearly 15 years, CBT+ has provided training and some organizational support for delivering EBPs for children and adolescents in public mental health, with strong practice community collaboration and leadership. CBT+’s team-based approach to training includes asynchronous and synchronous web-based training, 6 months of phone consultation, and a yearly 1-day advanced training. Training and phone consultation are led by a University of Washington EBP expert or by an experienced CMHC supervisor who co-leads training and aspects of CBT+ using a train-the-trainer approach [17]. Phone consultations focus on clinical support for applying cognitive behavioral therapy (CBT). CBT+ trains approximately 250 clinicians and supervisors yearly and provides organizational support for EBP delivery through ongoing monthly calls for all previously trained supervisors and a yearly in-person supervisor training.

Design overview

NIMBLE will partner with four clinics connected to CBT+ to optimize MBC use. We will actively seek clinics that are diverse in terms of the following: youth served (e.g., approaching clinics that see primarily Latinx youth), rural/urban status, size, and length of time implementing EBPs. Hereafter, we refer to clinic staff (e.g., clinicians, administrators, supervisors) as practice partners and will specify when referring to a specific group of staff. We will begin by collaboratively identifying and prioritizing barriers, then engage practice partners in developing and enacting tailored implementation plans to address prioritized barriers. We will then track each clinic’s strategy deployment and debrief about the use of IMPACT Center methods to evaluate and refine them. We will check in with clinic implementation teams to support their activities and track strategy deployment once per month; 5–6 months is considered sufficient for active implementation [12, 18]. Finally, we will engage teams in a debrief session to assess the acceptability, feasibility, and appropriateness of IMPACT methods. See Fig. 1 for a visual representation of the study design.

Fig. 1 A diagram of study activities. REA, rapid ethnographic assessment

Aim 1: Co-create plans for optimizing MBC implementation for youth with comorbidities

We will use three complementary barrier identification methods to surface locally contextualized barriers: rapid evidence synthesis (RES), rapid ethnographic assessment (REA), and design kits. Following barrier identification, we will use a group-facilitated barrier prioritization process followed by causal pathway diagramming to tailor implementation plans. Methods will be deployed in partnership with clinics over the course of a year. Ideally, methods will be deployed in-person (Site Visit 1) following a 2-h MBC workshop training for clinicians. However, given that our priority is building and maintaining relationships with practice partners, all study activities could be done in a hybrid or completely virtual format.

Barrier identification

Implementation studies usually conduct key informant interviews, focus groups, or surveys to identify barriers [19,20,21,22]. Once barriers are uncovered, researchers typically engage a limited subset of stakeholders in prioritizing them by rating each barrier on feasibility and importance. This approach to barrier identification and prioritization has several limitations: (1) it does not incorporate barriers from the literature; (2) it is subject to issues of recall, bias, and social desirability; (3) it does not sufficiently engage practice partners and youth in evidence-based practices (EBPs) prior to assessment; and (4) it does not capture the lived experience of practice partners or youth and typically does not ask about issues related to workflow, decision support, or informatics.

We propose an approach to barrier identification that rapidly and meaningfully centers the voices of those who have historically been least likely to be a part of the implementation process: clinic staff who are not clinicians, supervisors, or administrators. Our approach to barrier identification is multi-method and multi-informant, combining the scientific literature with promising user-centered design methods (see Fig. 2).

Fig. 2 Barrier identification methods

Rapid evidence synthesis (RES) to identify barriers from the literature (pre-site visits)

Systematic reviews provide invaluable scientific insight for identifying barriers to implementation. They are, however, very resource-intensive, typically costing around $100,000 to fund appropriately and taking 2 years to complete [23, 24]. Nimbler methods for evidence synthesis, like RES, promise to ease the burden of these efforts by reducing the resources needed to complete the review process while maintaining scientific rigor [25]. RES can be done in a matter of weeks rather than months or years, and emerging evidence suggests that the resulting evidence is comparable [26]. Using RES methods may be an important step toward truly being able to incorporate implementation research into practice.

To identify barriers from the literature, we will conduct a rapid review of known barriers to implementing MBC in CMHCs serving youth. We will begin with expert recommendations of key articles to focus the pilot searches. We will then identify key terms using PubMed MeSH terms [27] and key terms and synonyms from established reviews (e.g., measurement-based care, implementation, barrier; [5, 28]) to conduct pilot searches of PsycINFO, PubMed, and Web of Science. These pilot searches will document the yield, sensitivity, and specificity of each search and ensure that exemplar articles are captured within the search. We will then systematically search PsycINFO, PubMed, and Web of Science. The systematic search will be supplemented by targeted searches of relevant journals, reference reviews of relevant conceptual papers and systematic reviews, and an informal review of Google Scholar and ResearchGate. Screening will be done in Rayyan, an online systematic review tool [29]. Studies will be included if they (1) surfaced barriers to MBC, (2) were done in mental or behavioral health settings, and (3) focused on youth mental health treatment. The lead author will screen titles and abstracts. Two team members will conduct independent full-text screening and resolve discrepancies by consensus. The lead team member will abstract study data (e.g., setting, population, barriers identified) and distill identified barriers into the levels of analysis (e.g., individual client, organizational) and categories (e.g., attitudes, concerns about breach of confidentiality) identified in the Lewis et al. review [5]. This list of barriers will serve as a conceptual foundation for each subsequent method.
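
To make the pilot-search step concrete, the sketch below shows how a Boolean query of this kind could be run against PubMed programmatically. It is a minimal illustration, not the study's registered search strategy: the Biopython Entrez call is a standard way to query PubMed, but the exact search string, synonyms, contact address, and record limit are assumptions.

```python
# Illustrative pilot PubMed search for the rapid evidence synthesis, using Biopython's
# Entrez utilities. Query terms, synonyms, and the record limit are assumptions for
# demonstration only.
from Bio import Entrez

Entrez.email = "researcher@example.org"  # hypothetical contact address; NCBI requires one

# Boolean query combining MBC synonyms with implementation/barrier and youth terms
query = (
    '("measurement-based care" OR "routine outcome monitoring" OR "measurement feedback system") '
    "AND (implementation OR barrier*) "
    "AND (youth OR child* OR adolescen*)"
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=200)
record = Entrez.read(handle)
handle.close()

print(f"Pilot search yield: {record['Count']} records")
print("First PMIDs returned:", record["IdList"][:10])
```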

Use rapid ethnographic assessment to build engagement and identify context-specific barriers (REA; Site Visit 1)

Barrier identification methods do not typically include a wide range of stakeholders, and when they do, they often rely on a restrictive survey format or a brief, decontextualized interview or focus group. Rapid ethnography uses observational and interviewing methods that seek to understand the experience of those working within clinics “in a rapid timeframe to promote action” [30]. REA is particularly promising for identifying barriers and understanding an implementation context. Developing and refining REA methods for use in community mental health centers is one of NIMBLE’s primary foci.

Procedure

REA consists of fieldwork (e.g., clinic site visits) to gather data and build relationships, followed by a synthesis of those data [30]. We will use REA methods over 2-day site visits at each clinic. We will conduct two types of ethnographic observations. First, we will use unobtrusive techniques, documenting observations of activities (e.g., clinician documentation) and significant events (e.g., staff meetings) using written and audio-recorded field notes. Second, we will audio-record semi-structured, focused interviews with as many practice partners as possible. Amongst our team, we will (1) record debrief huddles twice at each visit, where we discuss impressions of the established MBC process, clinic climate, and gaps in our knowledge, and (2) document other relevant observations in writing.

Participants

For each clinic, we expect up to 30 clinician and staff interviews: 30–45 min for providers, 15 min for staff. We expect a final sample of N = 120 clinic staff to complete interviews.

Data analysis

Our team will engage in a rapid analysis approach to document the occurrence or presence of barriers at each clinic, noting duration, time, location, and affected persons, situated in organizational, social, and task contexts that capture the lived experience of practice partners. In the month following each clinic visit, we will triangulate data from (1) unobtrusive observations, (2) notes from formal interviews, and (3) audio and written notes from debrief huddles. We will analyze data using the following steps. First, data will be condensed and reduced into a spreadsheet format. Second, data will be iteratively coded into themes using thematic analysis methods. Coded data will be categorized with the explicit purpose of (1) identifying barriers (what is getting in the way of MBC use), (2) identifying facilitators (are there established strengths we can leverage to promote MBC use), and (3) identifying themes that may aid in developing implementation strategies (strengths and possible strategies). These data will be used to generate a list of clinic-specific barriers to be used in a group-facilitated barrier prioritization exercise with clinic partners. This list will also be shared with practice partners and clinic administration. No direct quotes will be used.

Use design kits to identify context-specific barriers (Site Visit 1)

We will develop and use design kits with youth and clinicians to complement REA and collect highly contextualized data that identifies barriers to MBC. Design kits, also called design or cultural probes, are user-centered design packages containing prompts and materials that seek to generate real-time data by asking participants to engage in specific activities [31, 32]. For instance, a design kit activity may ask a clinician to journal about a specific time when discussing measures with a youth client went well. To our knowledge, design kits have never been used in the youth mental health field and have rarely been used in implementation science. Not incorporating user-centered methods and factors in treatment development is a fundamental missed opportunity in psychosocial treatment development, and a problem that implementation science is poised to solve. Because design kits are completed in situ, they promise to unearth important insights about the way practice partners and youth live that are difficult to glean using traditional research methods.

Procedure

Clinicians and youth will have 7 days (1 day per activity) to engage in design kit tasks before returning the design kits in self-addressed, stamped envelopes. Within a week of receiving the data, team members will conduct follow-up phone interviews, inviting participants to interpret and comment on their responses.

Participants

We will approach all practice partners who provide therapy to youth (clinicians, clinical leads, case managers). We expect between 24 and 32 clinic staff across clinics to complete design kits. We will also recruit at least 6–8 youth clients per clinic to complete design kits. We expect between 24 and 32 youths to complete design kits.

Materials

Design kit materials for clinic staff and youth are the same, but the tasks and prompts differ. Each design kit includes a pre-paid return envelope, activity prompt cards, disposable camera, journal, two pens, a small gift (two pieces of chocolate), and stress ball (see Fig. 3). For clinic staff, we focus on prompts that ask clinicians to discuss the role of MBC in their day-to-day workflow. For youth, we intend prompts to be engaging, with some focused specifically on MBC and others focused on the youth’s broader experience in treatment. Design kit prompts are in Supplementary file 1. Design kits were developed in collaboration with the Enhanced Art, Graphics, Literacy, & Editorial Strategies (EAGLES) team at KPWHRI.

Fig. 3 Design kit materials. Materials include seven activity cards and an instruction card, a study information sheet, a journal, two pens, a disposable camera, a stress ball, and two pieces of chocolate

Analysis

Team members will transcribe the written responses into a text document. Text documents and photos will be used as a basis for creating memos documenting barriers and describing their salience, meaning, and importance to the participant. These barriers will also be incorporated into the clinic report and Barrier Prioritization exercise.

Implementation planning

Barrier prioritization that focuses primarily on feasibility may overlook key barriers most closely tied to implementation success [11]. Such approaches tend to prioritize individual provider barriers over acknowledging issues for youth (e.g., preference), supervisors (e.g., leadership EBP priorities), teams (e.g., workflow), clinics (e.g., decision-making support), organizations (e.g., informatics), or systems (e.g., reimbursement).

We propose to first recruit an implementation team of 6–8 practice partners per clinic. We will attempt to recruit practice partners that use MBC or have some influence on MBC. To that end, we will approach clinicians and clinical supervisors, as well as clinic leadership and clinic administrators. Our goal is to ensure that the implementation team provides a representative viewpoint and basis for decision-making. The NIMBLE team will pilot two collaborative methods with implementation teams: barrier prioritization and causal pathway diagramming. Figure 4 shows implementation planning methods.

Fig. 4 Implementation planning methods

Group-facilitated barrier prioritization activity (Site Visit 2)

To prioritize barriers to using MBC with fidelity, IMPACT will pilot a new group-facilitated activity with practice partners. This 2-h group-facilitated activity will occur in-person or virtually using a videoconferencing platform (Site Visit 2).

Procedure

We will engage in member-checking [33] by presenting all identified barriers, discussing the findings related to each barrier, and asking participants to reflect on their experience with each barrier. Practice partners will rate each barrier on a 4-point Likert scale on the following criteria: importance, frequency, and feasibility. Importance refers to how much the barrier gets in the way of MBC. Frequency refers to how often the barrier gets in the way. Feasibility refers to how feasible it would be to address the barrier.

Participants

We will engage implementation teams composed of 5–8 practice partners at each clinic. In total, we expect n = 20–32 clinic staff to prioritize barriers.

Analysis

We will isolate barriers that are above the mean on all three criteria (importance, frequency, feasibility). To do this, we will (1) calculate the mean rating of each criterion for each barrier, and (2) identify barriers that were scored above the mean on all three criteria. We will present the top five most highly ranked barriers to practice partners, who will collaboratively decide on which three to address.
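
As an illustration of this scoring step, the sketch below applies the above-the-mean rule to hypothetical ratings. The column names, example barriers, and the rule used to rank the shortlist (a simple sum of the three criterion means) are assumptions; the protocol does not specify how the top five are ordered.

```python
# Minimal sketch of the barrier prioritization scoring described above, assuming
# long-format ratings with one row per practice partner x barrier and hypothetical
# column names and values.
import pandas as pd

ratings = pd.DataFrame({
    "barrier":     ["inefficient workflow"] * 3 + ["limited time"] * 3 + ["EHR limitations"] * 3,
    "importance":  [4, 3, 4, 3, 3, 2, 4, 4, 3],
    "frequency":   [4, 4, 3, 3, 2, 3, 4, 3, 4],
    "feasibility": [3, 4, 4, 2, 3, 3, 1, 2, 2],
})

criteria = ["importance", "frequency", "feasibility"]

# Step 1: mean rating of each criterion for each barrier
barrier_means = ratings.groupby("barrier")[criteria].mean()

# Step 2: keep barriers scoring above the overall mean on all three criteria
overall_means = barrier_means.mean()
shortlist = barrier_means[(barrier_means > overall_means).all(axis=1)]

# Rank the shortlist (assumption: by the sum of criterion means) and keep up to five
top_five = shortlist.assign(total=shortlist.sum(axis=1)).nlargest(5, "total")
print(top_five)
```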

Tailoring implementation plans using causal pathway diagrams (Site Visit 3)

Following barrier prioritization, we will engage with practice partners to create causal pathway diagrams (CPDs) to compare and select the implementation strategies best suited to address barriers to MBC use. CPDs are a novel visualization tool to improve the selection, design, and optimization of implementation strategies [34]. They support implementers, including researchers and practitioners, in clarifying how implementation strategies work and under what conditions they work. They rely on theory and experience to capture the implementers’ current understanding of the process through which an implementation strategy is thought to address an identified barrier and bring about improved implementation outcomes.

Procedure

For each clinic, we will develop CPDs for the top 3 prioritized barriers. The goal is to assess how well each implementation strategy is matched to a prioritized barrier, based on its mechanism of action, and to clarify the causal chain of events that must take place to achieve MBC fidelity (i.e., how does this strategy work to address this barrier). To illustrate IMPACT’s steps for developing CPDs in Project 1, we have selected an example of the strategy Task Shifting to address a possible barrier to MBC fidelity, Inefficient Workflow; Fig. 5 depicts an empty CPD, and Fig. 6 depicts an example CPD that articulates the strategy, mechanism, and outcomes for a strategy-barrier pair.

Fig. 5 An empty causal pathway diagram

Fig. 6 An example completed causal pathway diagram. PHQ-9, Patient Health Questionnaire-9; MBC, measurement-based care

Our research team will develop a CPD for the top barrier in advance of meeting with each clinic’s implementation team, who participated in the barrier prioritization activity (Site Visit 3). We will use diagramming software to build CPDs in real time. We will begin by re-orienting practice partners to the purpose of NIMBLE. Next, we will present the example CPD, introduce practice partners to the construction process, discuss how the CPD aligns with their experiences, and modify it as needed. We will then guide practice partners in collaboratively constructing CPDs for the other barriers, using plain-language questions to facilitate the 4-step process described below. This process will result in detailed, co-designed implementation plans.

Step 1: For each prioritized barrier, the research team and practice partners will collaboratively select at least one promising strategy from existing compilations [35, 36] and our experience. Strategy selection will be guided by an assessment of the evidence, plausibility, and feasibility of each strategy. For instance, strategies that include major overhauls to a system-wide electronic health record are likely infeasible, so a strategy like that would not be selected. We will use Proctor, Powell, and McMillen’s [36] recommendations for strategy specification to help practice partners operationalize each strategy. We will work together to specify the strategies using the recommendations for articulating the actor, action, target, justification, dose, and timing [36] (a specification sketch follows Step 4 below).

Step 2: We will articulate the mechanism that describes how or why each implementation strategy works. We will use plain language questions (e.g., “How can this strategy change inefficient workflows?”) to engage stakeholders in identifying plausible mechanisms. This step aims to ensure that each strategy is aligned with the barrier.

Step 3: We will explore the presence of preconditions, or factors that are necessary for a part of the causal chain of events to take place. For instance, if the strategy to address inefficient workflow was to task-shift the administration of outcome measures to the front desk team, a necessary precondition is that the relevant administrative person(s) approve the change; knowledge translation would also be required to brief the administering team on the measures and how to discuss them. Preconditions do not always exist, but if they do, we want to account for them to ensure the necessary conditions for an implementation strategy to work are in place.

Step 4: Practice partners will rank-order the CPDs by how convincing, or plausible, they are. These CPDs will help practice partners select the most promising strategies while serving as detailed plans for their work. After selecting promising strategies and generating CPDs for the three most highly ranked barriers at each clinic, we will discuss with our partners how to deploy these strategies and evaluate their impact, focusing on preconditions, moderators, and outcomes of implementation success. We will follow up monthly via e-mail and/or phone to explore any changes in prioritized barriers, strategy deployment, and MBC fidelity.
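
To illustrate the specification step referenced in Step 1, the sketch below encodes the Task Shifting example from the text as a small data structure organized around the actor, action, target, dose, timing, and justification dimensions. The class and the field values are illustrative assumptions rather than the study’s actual planning artifact.

```python
# Hypothetical structure for specifying one candidate strategy along the dimensions
# named above; the content mirrors the task-shifting example in the text, but the
# specific field values are assumptions.
from dataclasses import dataclass

@dataclass
class StrategySpecification:
    name: str
    actor: str           # who delivers the strategy
    action: str          # what is done
    action_target: str   # the barrier / mechanism the action targets
    dose: str            # how much / how often
    temporality: str     # when in the implementation process
    justification: str   # why this strategy should address the barrier

task_shifting = StrategySpecification(
    name="Task shifting of measure administration",
    actor="Front desk staff",
    action="Administer the outcome measure at check-in and enter scores before the session",
    action_target="Inefficient workflow that crowds measure administration out of sessions",
    dose="Every session, for every youth client",
    temporality="Beginning at the start of the active implementation period",
    justification="Frees session time so clinicians can review and act on scores",
)
print(task_shifting)
```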

Participants

We will engage the implementation team of 5–8 practice partners in CPD development. In total, we expect between 20 and 32 practice partners to engage in this activity.

Aim 2: Compare MBC fidelity post-IMPACT Center method deployment versus historical controls

The goal of tailoring implementation is to improve clinicians’ fidelity to MBC. In this case, fidelity will be conceptualized as clinicians’ use of the three main components of MBC—(1) Collect, (2) Share, and (3) Act [8]. We will assess changes in fidelity resulting from implementation tailoring activities through (1) clinician self-report and (2) tracking measure use in the CBT+ Dashboard, an anonymous online system for tracking outcome measure use by clinicians who have undergone CBT+ training. We hypothesize improvement in MBC fidelity at 6 months post-implementation.

Design

For both methods, we will employ a historical control design comparing pre- and post-IMPACT method deployment from the same clinic, given that contextual factors that directly influence EBP practices, including MBC, vary by clinic. For the clinician self-report, we will ask clinicians to self-report their use of MBC with youth clients before, during, and after the active implementation period. For the CBT+ Dashboard data, we will capture data from 2 years prior to (a) increase sample size and ability to detect significant differences, and (b) account for cohort-specific effects that may influence a particular year. We will ask clinic directors whether any clinical changes may have influenced MBC fidelity during this prior period.

Procedures

We will administer the self-report survey through REDCap [37] before active implementation begins. The active implementation period will begin after implementation plans have been generated and the necessary pre-work has been done to create the processes, procedures, or resources specified by those plans. The same survey will be sent to clinicians three times: before the start of active implementation, 3 months into the implementation process, and at the end of the implementation process (6 months). For the CBT+ Dashboard, we will work with the CBT+ program to extract data on nuanced elements of MBC fidelity consistent with the IMPACT Center’s overarching quality model [38, 39] and our emphasis on using MBC to improve treatment quality for youth with comorbid mental health problems. See Fig. 7 for the IMPACT Quality Model (IQM), which was adapted from the seminal paper by Proctor and colleagues [38].

Fig. 7 IMPACT Quality Model (IQM)

Participants

We will ask all clinicians that work with youth and have consented to study activities to complete the clinician self-report. For fidelity monitoring using the CBT+ Dashboard, we will recruit up to five clinicians from each clinic. Each clinician will enter data for 3–5 youth clients into the CBT+ Dashboard. All eligible clinicians from the clinics and their clients from 2 years prior to this study will be historical controls (~3 clinicians per year with data for 2–4 clients, for a total of ~108 control clients).

Measures

The clinician self-report measure is a simple, quick, homegrown self-report tool based upon the Collect, Share, Act conceptualization of MBC fidelity [8]. The creation of this survey was guided by the need for something brief and pragmatic; to our knowledge, no existing self-report measure is (1) brief, (2) pragmatic, and (3) able to assess fidelity to the three main components of MBC [40]. The survey asks clinicians to report (1) their approximate youth caseload in the last 6 months, and the percentage of youth patients with whom they have (2) collected data at least every two sessions, (3) shared data over the treatment course, and (4) acted on these processes to change the direction of treatment. Clinicians will complete this survey over REDCap [37]. We plan to assess the internal reliability, concurrent validity, and convergent validity of the measure once data collection ends [41]. A copy of this measure is available from the first author.
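
As a purely hypothetical illustration of how responses to this survey could be summarized, the sketch below averages the three percentage items into a single fidelity index. The column names, example values, and the choice of a simple mean are assumptions; the protocol does not prescribe a scoring rule.

```python
# Hypothetical scoring sketch for the homegrown Collect-Share-Act self-report survey.
# Item names, example values, and the overall summary (a simple mean of the three
# percentage items) are assumptions, not the study's specified scoring.
import pandas as pd

survey = pd.DataFrame({
    "clinician_id":   ["c01", "c02", "c03"],
    "youth_caseload": [18, 25, 12],   # approximate caseload in the last 6 months
    "pct_collect":    [80, 45, 95],   # % of youth with data collected at least every two sessions
    "pct_share":      [60, 30, 90],   # % of youth with whom scores were shared
    "pct_act":        [40, 20, 75],   # % of youth for whom scores changed the treatment direction
})

fidelity_items = ["pct_collect", "pct_share", "pct_act"]
survey["mbc_fidelity"] = survey[fidelity_items].mean(axis=1)  # assumed overall fidelity index
print(survey[["clinician_id", "mbc_fidelity"]])
```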

For fidelity tracking using the CBT+ Dashboard, clinicians will enter relevant youth and/or caregiver self-report measures into the CBT+ Dashboard when administered. Measures included in the CBT+ Dashboard include the following: a trauma exposure screen and posttraumatic checklist, a general assessment of symptomology (internalizing, externalizing, and attention symptoms), depression, and anxiety, all expected to be administered at baseline and personalized through the course of treatment [16].

All clinic partners involved in rapid ethnography will also complete the Implementation Climate Scale (ICS) at baseline [42]. The ICS is an 18-item survey that assesses the expectations, supports, and rewards associated with evidence-based practice implementation in an organization. The ICS has been used extensively in implementation studies in behavioral/mental health settings and has demonstrated solid internal consistency and structural, convergent, and discriminant validity [42,43,44,45].

Analytic plan

We will use 3-level generalized mixed effects models (fidelity scores nested within clinicians nested within clinics) to assess whether tailored implementation plans led to significant changes in MBC fidelity. The ICS will be treated as a covariate. Only effects at the clinic level are considered fixed, as clinic is a nuisance cluster. Each predictor will be modeled separately, with link functions appropriate to the distributional form of each outcome (e.g., dichotomous or count variables). Model fit and the significance of individual coefficients will be evaluated via likelihood-ratio tests and information criteria (Akaike Information Criterion and Bayesian Information Criterion).
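
The sketch below shows one way such a model could be specified, as a simplified stand-in for the planned analysis. It assumes a hypothetical long-format file and column names, fits a Gaussian linear mixed model rather than the generalized models with link functions described above, and enters clinic as a fixed effect with random intercepts for clinicians.

```python
# Simplified sketch of the planned multilevel analysis. All column names and the input
# file are hypothetical; a Gaussian linear mixed model stands in for the generalized
# models described in the protocol.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mbc_fidelity_long.csv")  # hypothetical file: one row per clinician-assessment

model = smf.mixedlm(
    "fidelity ~ phase + ics + C(clinic)",   # phase: historical control vs. post-implementation
    data=df,
    groups=df["clinician"],                 # repeated fidelity scores nested within clinician
)
result = model.fit(reml=False)              # ML fit so likelihood-ratio tests and AIC/BIC are comparable
print(result.summary())
```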

Aim 3: Co-design toolkits for IMPACT methods piloted in Aim 1 via UCD workshops

Following the implementation process, we will co-design toolkits for IMPACT methods with practice partners, primarily for methods that clinic staff could actually use: barrier prioritization and design kits. Toolkits will consist of a brief history and rationale for a specific method and provide practical considerations, actionable steps, and resources for using that method to achieve a specific goal. Toolkits are intended to be publicly available, practical, and usable. Co-design with practice partners will consist of a focus group followed by a user-centered design group to gather feedback on toolkit structure and content for all available toolkits.

Participants

For UCD with practice partners, we will recruit n = 12 participants for each of two design groups, one for barrier prioritization and one for design kits (total N = 24). Participants who were involved in any prior study activity will be eligible.

Procedure

Methods core team members will work with communications and visual experts to develop an initial version of each toolkit. Once the toolkits have been created, an IMPACT Center UCD expert will lead a 2-h virtual workshop with practice partners for each toolkit. In the first 40 min, workshop participants will be prompted to remember their experience working with the NIMBLE team. For the next 30 min, workshop participants will receive the method toolkit, an overview of workshop objectives, and prompts to inform their engagement with the materials. The next 40 min will consist of facilitated discussion of the reviewed toolkit. Participants will consider toolkit materials using prompts like: “How effectively do you think you could follow [example toolkit] in its current form?” A co-facilitator will capture feedback that informs revisions/refinements to methods or toolkit design. In the final 10 min, we will debrief and engage in member-checking to ensure the accuracy of the feedback we obtained. If conflicting feedback arises, the facilitator will work to understand the conflict source and help participants reach a consensus by clarifying common goals for the toolkit. Following the workshops, the team will revise each toolkit based on the workshop design input and present it to practice partners via email to solicit additional asynchronous feedback for iteration before publishing the toolkits online.

Analysis

Focus group feedback will be formally analyzed and organized by a priori established themes (e.g., acceptability, feasibility) and emergent themes, both populated with direct quotes to ensure the accuracy of data capture and sharing. We will use thematic analysis on transcript data to identify actionable improvements to the barrier prioritization and design kit toolkits. We will administer the Acceptability of Intervention Measure, Intervention Appropriateness Measure, and Feasibility of Intervention Measure to supplement the interpretation of qualitative findings [46, 47].

Discussion

MBC holds promise for improving the impact of psychotherapy for youth, especially in community mental health centers where comorbidities and clinical complexity are common. To overcome suboptimal implementation of MBC, tailored implementation approaches that tackle the critical barriers are needed. The NIMBLE study pilots multiple practice partner-engaged methods that center the voices of those “on the ground” to create nuanced, locally contextualized implementation plans to improve MBC fidelity. To identify barriers, we will use rapid evidence synthesis, design kits, and rapid ethnographic assessment. To prioritize barriers, we propose a novel, group-facilitated barrier prioritization exercise. To co-create implementation plans that are guided by the voices of practice partners, we will use a group-facilitated causal pathway diagramming method. We will evaluate whether fidelity was optimized in these settings as a result of these exercises and engage practice partners to understand their perceptions of our methods and to further improve our methods.

Anticipated challenges and limitations

We anticipate several challenges. Community mental health centers are under enormous pressure and have few resources. The landscape of conducting large-scale implementation studies in community mental health centers has clearly been affected by the sequelae of the COVID-19 pandemic. Turnover is common [48] and clinicians are under unprecedented amounts of stress [49]. We expect difficulty recruiting clinics and clinicians, as well as clinician attrition. Indeed, several clinics that offered letters of support were no longer able to participate when we approached them.

We also anticipate challenges while piloting novel, untested methods. For instance, we anticipate difficulty getting participants to return design kit materials. Additionally, design kits have inherent accessibility limitations; youth who are visually impaired or have disabilities that limit writing are automatically excluded from this method. So while design kits promise to increase equitable implementation practices, researchers should make efforts to incorporate design kit methods that allow for alternatives such as text-to-speech. Over the course of this study, we will make efforts to include other response options for participants, including voice memos and digital text. Finally, given that design kits have never been used in implementation research, it is possible that the data gleaned from them will add little beyond other methods such as REA.

This study has several important limitations. First, we are limited in the types of implementation support we can provide to practice partners. For instance, if practice partners use a specific electronic medical system to do MBC, and many of the identified MBC barriers are related to the interface of that system, we may not be able to make significant changes to those established systems. Second, the very nature of rapid methods means that some steps of scientific rigor are eliminated. For instance, while every effort will be made to gather every relevant study in the RES, it is possible that we will miss some studies.

Conclusion

Implementation science and practice have historically minimized the voices of those outside of positions of power [50]. Practical and freely accessible methods that meaningfully engage practice partners could have a transformative impact on mental healthcare in these clinics and beyond by making it easier to engage in rigorous and effective implementation of MBC and other EBPs.

Availability of data and materials

Not applicable.

Abbreviations

AIM: Acceptability of Intervention Measure
CBT+: The Washington State CBT+ initiative
CMHC: Community mental health center
COVID-19: Coronavirus disease of 2019
CPD: Causal pathway diagram
EAGLES: Enhanced Art, Graphics, Literacy, & Editorial Strategies
EBP: Evidence-based practice
FIM: Feasibility of Intervention Measure
IAM: Intervention Appropriateness Measure
ICS: Implementation Climate Scale
MBC: Measurement-based care
MeSH: Medical Subject Headings
NIMBLE: Novel Methods for Implementing Measurement-Based Care in Low-Resource Environments
PHQ-9: Patient Health Questionnaire-9
REA: Rapid ethnographic assessment
RES: Rapid evidence synthesis
UCD: User-centered design

References

1. Bickman L, et al. Effects of routine feedback to clinicians on mental health outcomes of youths: results of a randomized trial. Psychiatr Serv. 2011;62(12):1423–9.
2. Foa EB, Stein DJ, McFarlane AC. Symptomatology and psychopathology of mental health problems after disaster. J Clin Psychiatry. 2006;67:15–25.
3. Scott K, Lewis CC. Using measurement-based care to enhance any treatment. Cogn Behav Pract. 2015;22(1):49–59.
4. McLeod BD, et al. To utility and beyond! Specifying and advancing the utility of measurement-based care for youth. J Clin Child Adolesc Psychol. 2022;51(4):375–88.
5. Lewis CC, et al. Implementing measurement-based care in behavioral health: a review. JAMA Psychiatry. 2019;76(3):324–35.
6. Shimokawa K, Lambert MJ, Smart DW. Enhancing treatment outcome of patients at risk of treatment failure: meta-analytic and mega-analytic review of a psychotherapy quality assurance system. J Consult Clin Psychol. 2010;78(3):298.
7. Lambert MJ, et al. Is it time for clinicians to routinely track patient outcome? A meta-analysis. Clin Psychol. 2003;10(3):288.
8. Barber J, Resnick SG. Collect, Share, Act: a transtheoretical clinical model for doing measurement-based care in mental health treatment. Psychol Serv. 2022. https://psycnet.apa.org/doiLanding?doi=10.1037%2Fser0000629.
9. Jensen-Doss A, et al. Monitoring treatment progress and providing feedback is viewed favorably but rarely used in practice. Adm Policy Ment Health. 2018;45(1):48–61.
10. Liu FF, et al. Mind the gap: considering disparities in implementing measurement-based care. 2019.
11. Bickman L, et al. Implementing a measurement feedback system: a tale of two sites. Adm Policy Ment Health. 2016;43(3):410–25.
12. Lewis CC, et al. Implementing measurement-based care (iMBC) for depression in community mental health: a dynamic cluster randomized trial study protocol. Implement Sci. 2015;10:127.
13. Lewis CC, et al. Standardized versus tailored implementation of measurement-based care for depression in community mental health clinics. Psychiatr Serv. 2022;73(10):1094–101.
14. Lewis CC, et al. Mediators of measurement-based care implementation in community mental health settings: results from a mixed-methods evaluation. Implement Sci. 2022;17(1):1–18.
15. National Institute of Mental Health. Advanced Laboratories for Accelerating the Reach and Impact of Treatments for Youth and Adults with Mental Illness (ALACRITY) Research Centers (P50 Clinical Trial Optional). 2023. Available from: https://grants.nih.gov/grants/guide/pa-files/PAR-20-293.html.
16. Dorsey S, et al. A statewide common elements initiative for children’s mental health. J Behav Health Serv Res. 2016;43:246–61.
17. Triplett NS, et al. Evaluating a train-the-trainer approach for increasing EBP training capacity in community mental health. J Behav Health Serv Res. 2020;47:189–200.
18. Miller AL, et al. Partnering to translate evidence-based programs to community settings: bridging the gap between research and practice. Health Promot Pract. 2012;13(4):559–66.
19. Field B, et al. Using the Knowledge to Action Framework in practice: a citation analysis and systematic review. Implement Sci. 2014;9(1):1–14.
20. Craig LE, et al. Identifying the barriers and enablers for a triage, treatment, and transfer clinical intervention to manage acute stroke patients in the emergency department: a systematic review using the theoretical domains framework (TDF). Implement Sci. 2016;11(1):1–18.
21. Weatherson KA, Gainforth HL, Jung ME. A theoretical analysis of the barriers and facilitators to the implementation of school-based physical activity policies in Canada: a mixed methods scoping review. Implement Sci. 2017;12(1):1–15.
22. Overbeck G, Davidsen AS, Kousgaard MB. Enablers and barriers to implementing collaborative care for anxiety and depression: a systematic qualitative review. Implement Sci. 2016;11(1):1–16.
23. Ganann R, Ciliska D, Thomas H. Expediting systematic reviews: methods and implications of rapid reviews. Implement Sci. 2010;5:1–10.
24. Khangura S, et al. Evidence summaries: the evolution of a rapid review approach. Syst Rev. 2012;1(1):1–9.
25. King VJ, et al. Paper 2: Performing rapid reviews. Syst Rev. 2022;11(1):151.
26. Khangura S, et al. Rapid review: an emerging approach to evidence synthesis in health technology assessment. Int J Technol Assess Health Care. 2014;30(1):20–7.
27. Balogh SG, et al. Time evolution of the hierarchical networks between PubMed MeSH terms. PLoS One. 2019;14(8):e0220648.
28. Powell BJ, et al. Determinants of implementing evidence-based trauma-focused interventions for children and youth: a systematic review. Adm Policy Ment Health. 2020;47(5):705–19.
29. Ouzzani M, et al. Rayyan—a web and mobile app for systematic reviews. Syst Rev. 2016;5(1):210.
30. Vindrola-Padros C. Rapid ethnographies: a practical guide. Cambridge: Cambridge University Press; 2021.
31. Iversen OS, Nielsen C. Using digital cultural probes in design with children. In: Proceedings of the 2003 Conference on Interaction Design and Children. 2003.
32. Wyche S. Using cultural probes in new contexts: exploring the benefits of probes in HCI4D/ICTD. In: Conference Companion Publication of the 2019 Conference on Computer Supported Cooperative Work and Social Computing. 2019.
33. Henrikson NB, et al. “Not private at all:” comparative perspectives on privacy of genomic data, family history data, health-related data, and other personal data. OBM Genet. 2022;6(4):1–17.
34. Lewis CC, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136.
35. Connors EH, et al. Implementation strategies to promote measurement-based care in schools: evidence from mental health experts across the USA. Implement Sci Commun. 2022;3(1):67.
36. Powell BJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.
37. Harris PA, et al. The REDCap consortium: building an international community of software platform partners. J Biomed Inform. 2019;95:103208.
38. Proctor E, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76.
39. Carroll C, et al. A conceptual framework for implementation fidelity. Implement Sci. 2007;2:1–9.
40. Powell BJ, et al. Toward criteria for pragmatic measurement in implementation research and practice: a stakeholder-driven approach using concept mapping. Implement Sci. 2017;12:1–7.
41. Lewis CC, et al. The psychometric and pragmatic evidence rating scale (PAPERS) for measure development and evaluation. Implement Res Pract. 2021;2:26334895211037390.
42. Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci. 2014;9(1):1–11.
43. Beidas RS, et al. A repeated cross-sectional study of clinicians’ use of psychotherapy techniques during 5 years of a system-wide effort to implement evidence-based practices in Philadelphia. Implement Sci. 2019;14:1–13.
44. Beidas R, et al. The relationship between consumer, clinician, and organizational characteristics and use of evidence-based and non-evidence-based therapy strategies in a public mental health system. Behav Res Ther. 2017;99:1–10.
45. Williams NJ, et al. Linking molar organizational climate and strategic implementation climate to clinicians’ use of evidence-based psychotherapy techniques: cross-sectional and lagged analyses from a 2-year observational study. Implement Sci. 2018;13(1):1–13.
46. Weiner BJ, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12:1–12.
47. Cho E, et al. Assessing provider perceptions of training: initial evaluation of the acceptability, feasibility, and appropriateness scale. Implement Res Pract. 2022;3:26334895221086268.
48. Sklar M, Ehrhart MG, Aarons GA. COVID-related work changes, burnout, and turnover intentions in mental health providers: a moderated mediation analysis. Psychiatr Rehabil J. 2021;44(3):219.
49. Last BS, et al. Economic precarity, financial strain, and job-related stress among Philadelphia’s public mental health clinicians. Psychiatr Serv. 2022;73(7):774–86.
50. Sanchez AL, et al. Power redistribution and upending white supremacy in implementation research and practice in community mental health. Psychiatr Serv. 2023; appi.ps.20220209.


Acknowledgements

We would like to acknowledge Dr. Nora Henrikson and Paula Blasi, MPH, who critically helped form the parameters of the rapid evidence synthesis. We would also like to acknowledge methods core faculty Drs. Pedja Klasnja and Aaron Lyon for their feedback on NIMBLE methods. Finally, we would like to acknowledge the work of Michelle Chan, Rene Hawkes, and Virginia Lopez, who lead the administrative management of the broader IMPACT center.

Funding

This manuscript was funded by the National Institute of Mental Health P50 MH126219. The National Institute of Mental Health had no role in the design, collection, analysis, or interpretation of data.

Author information

Authors and Affiliations

Authors

Contributions

CCL, SD, MDP, and BJW conceived of the study idea and/or study analyses. RGM has been the scientific lead of NIMBLE since 2022, has implemented all study methods, wrote most of this manuscript, and is critically involved with data acquisition, analyses, and interpretation. RDM, LGP, and CB provided critical substantive reviews. AM, KDM, and CB are involved critically in data acquisition, analysis, and interpretation. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Ruben G. Martinez.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Kaiser Permanente Washington Health Research Institute Interregional Institutional Review Board (IRB# 1943932) on October 18, 2022.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Design kit prompts.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Martinez, R.G., Weiner, B.J., Meza, R.D. et al. Study protocol: Novel Methods for Implementing Measurement-Based Care with youth in Low-Resource Environments (NIMBLE). Implement Sci Commun 4, 152 (2023). https://doi.org/10.1186/s43058-023-00526-z
