slide-1
SLIDE 1

Orientation to the Science of Dissemination and Implementation

Rinad S. Beidas, PhD, University of Pennsylvania
Cara C. Lewis, PhD, Kaiser Permanente
Byron J. Powell, PhD, University of North Carolina at Chapel Hill

November 30, 2018, 11th Annual Conference on the Science of Dissemination and Implementation

slide-2
SLIDE 2

Objective

To provide a broad overview of the field of implementation science to whet your appetite for all that is to come.

2

#DIScience2018 @rsbeidas, @caraclewis, @byron_powell

slide-3
SLIDE 3

Brief Introductions

3

slide-4
SLIDE 4

Overview

  • 1. Introduction and Rationale for Implementation Science
  • 2. Introduction to Two Case Studies
  • 3. Implementation Process Models
  • 4. Identifying and Prioritizing Barriers and Facilitators
  • 5. Identifying and Applying Implementation Strategies
  • 6. Evaluating Implementation Efforts
  • 7. Discussion

4

slide-5
SLIDE 5

Orientation to the Science of Dissemination and Implementation

Introduction and Rationale for Implementation Science

slide-6
SLIDE 6

6

[Figure: the research-to-practice pipeline. Original research moves through submission, acceptance, publication, and reviews/guidelines/textbooks in bibliographic databases toward implementation, with losses along the way from negative results, inconsistent indexing, lack of numbers, and reliance on expert opinion (stage losses of 50%, 46%, 18%, and 35%). Time lags at individual stages range from 0.3 to 9.3 years (6 to 13 years in total). Sources: Dickersin, 1987; Koren, 1989; Balas, 1995; Poynard, 1985; Kumar, 1992; Poyer, 1982; Antman, 1992.]

Balas & Boren (2000)

slide-7
SLIDE 7

7


It takes 17 years to turn 14 percent of original research to the benefit of patient care

Balas & Boren (2000)

slide-8
SLIDE 8

From Bench to Bedside?

8

Credit Cynthia Vinson

slide-9
SLIDE 9

Quality Gaps Persist

9

slide-10
SLIDE 10

Prioritization of D&I Science

10

slide-11
SLIDE 11

Dissemination Research – The scientific study of targeted distribution of information and intervention materials to a specific public health or clinical practice audience. The intent is to understand how to best spread and sustain knowledge and the associated evidence-based interventions.

Implementation Research – The scientific study of the use of strategies to adopt and integrate evidence-based health interventions into clinical and community settings to improve patient outcomes and benefit population health.

11

NIH PAR-18-007

slide-12
SLIDE 12

12

Diffusion: “Let it happen”
Dissemination: “Help it happen”
Implementation: “Make it happen”

Greenhalgh et al. (2004); Lomas (1993)

slide-13
SLIDE 13

13

  • Intervention Effectiveness/Process Research
  • Healthcare/Behavioral Economics
  • Medical Anthropology
  • Organization & Management, Marketing
  • Social Psychology
  • Adult Education/Learning
  • Improvement Science

Multidisciplinary Influences

slide-14
SLIDE 14

14

[Figure: the traditional translational pipeline. Efficacy studies ask “Could an intervention work?”; effectiveness studies ask “Does an intervention work?”; implementation research is about making an intervention work. Real-world relevance increases over time as the pipeline moves from generalizable knowledge to local knowledge, across four phases spanning preintervention through implementation practice: Exploration, Preparation, Implementation, and Sustainment.]

4 Phases: Aarons et al. (2011); Brown et al., ARPH 2017
slide-15
SLIDE 15

Credit H. Brown and J.D. Smith

[Figure: in effectiveness research, the intervention itself is evaluated for health outcomes; in implementation research, the system to support adoption and delivery with fidelity is evaluated for the quality, quantity, and speed of delivery.]

Effectiveness vs. Implementation

slide-16
SLIDE 16

16

Proctor et al. (2009 & 2011)

What?

Evidence-based Interventions

Health Outcomes: Satisfaction, Function, Health status/symptoms

*IOM Standards of Care

THE USUAL

slide-17
SLIDE 17

17

Proctor et al. (2009 & 2011)

What?

Evidence-based Interventions

How?

Implementation Strategies

Implementation Outcomes: Acceptability, Appropriateness, Feasibility, Adoption, Fidelity, Penetration, Costs, Sustainment
Service Outcomes*: Efficiency, Safety, Effectiveness, Equity, Patient-centeredness, Timeliness
Health Outcomes: Satisfaction, Function, Health status/symptoms

*IOM Standards of Care
slide-18
SLIDE 18

18

Proctor et al. (2009 & 2011)

What?

Evidence-based Interventions

How?

Implementation Strategies

Implementation Outcomes: Acceptability, Appropriateness, Feasibility, Adoption, Fidelity, Penetration, Costs, Sustainment
Service Outcomes*: Efficiency, Safety, Effectiveness, Equity, Patient-centeredness, Timeliness
Health Outcomes: Satisfaction, Function, Health status/symptoms

*IOM Standards of Care

THE IMPLEMENTATION PATHWAY

slide-19
SLIDE 19

Plain Language

  • The intervention/practice/innovation is THE THING
  • Implementation strategies are the stuff we do to try to help people/places DO THE THING
  • Main implementation outcomes are HOW WELL they DO THE THING

19

Courtesy of Geoff Curran

slide-20
SLIDE 20

20

Proliferation of Frameworks

“Frameworks are like toothbrushes. Everyone has one and no one wants to use anyone else’s.”

Christian Schunn (2001)

Schunn is cited as saying this in Gorman et al. (2003), “Spherical Horses and Shared Toothbrushes”

slide-21
SLIDE 21

21

Common Element: Multiphase

Exploration Phase → Preparation Phase → Implementation Phase → Sustainment Phase

Key transitions: Adoption Decision; Training/Coaching Begins; EBP Delivered with Fidelity

Aarons et al. (2011)

slide-22
SLIDE 22

22

Common Element: Multilevel

Four Levels of Change for Assessing Performance Improvement (assumptions about change):
  • Larger System/Environment: reimbursement, legal, and regulatory policies are key
  • Organization: structure and strategy are key
  • Group/Team: cooperation, coordination, and shared knowledge are key
  • Individual: knowledge, skill, and expertise are key

Shortell, S. M. (2004). Increasing value: a research agenda for addressing the managerial and organizational challenges facing health care delivery in the United States. Medical Care Research and Review, 61(3 suppl), 12S-30S.
Ferlie, E. B., & Shortell, S. M. (2001). Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Quarterly, 79(2), 281-315.
slide-23
SLIDE 23

23

Types of Theories/Frameworks

Nilsen (2015)

slide-24
SLIDE 24

24

slide-25
SLIDE 25

Orientation to the Science of Dissemination and Implementation

Introduction to Two Case Studies

slide-26
SLIDE 26

Case Study 1

26

Naturalistic observational study of the use of evidence-based practice over time in a large public mental health system

slide-27
SLIDE 27

Timeline: 2007 → 2011 → 2012 → 2016

slide-28
SLIDE 28
slide-29
SLIDE 29

29

ORIGINAL PAPER

Applying the Policy Ecology Framework to Philadelphia’s Behavioral Health Transformation Efforts

Byron J. Powell, Rinad S. Beidas, Ronnie M. Rubin, Rebecca E. Stewart, Courtney Benjamin Wolk, Samantha L. Matlin, Shawna Weaver, Matthew O. Hurford, Arthur C. Evans, Trevor R. Hadley, David S. Mandell. Adm Policy Ment Health (2016) 43:909–926. DOI 10.1007/s10488-016-0733-6

Strategies: Training & Consultation; Systematically contracting for EBP delivery; Hosting events highlighting EBP champions; Designating organizations as “EBP agencies”; Enhanced rates for EBP delivery
slide-30
SLIDE 30

30

Timeline of Data Collection

Pre- EPIC

Wave 1: 19 agencies, 23 sites, 130 therapists
Wave 2: 22 agencies, 28 sites, 247 therapists
Wave 3: 21 agencies, 26 sites, 249 therapists

slide-31
SLIDE 31

31

EPIS Framework

Aarons et al. (2011); Beidas et al. (2013)

slide-32
SLIDE 32

Research Questions

32

?

  • Does use of CBT increase over the 5-year period?
  • What drives clinician behavior in a large system implementing EBP?
  • What are stakeholder perspectives on barriers and facilitators to implementation of EBP?
  • Establishing temporal relationships between constructs of interest to move towards mechanisms.

slide-33
SLIDE 33

Case Study 2

33

Randomized trial comparing two approaches to implementing measurement-based care in community mental health

STUDY PROTOCOL Open Access

Implementing measurement-based care (iMBC) for depression in community mental health: a dynamic cluster randomized trial study protocol

Cara C. Lewis1,2*, Kelli Scott1, C. Nathan Marti3, Brigid R. Marriott1, Kurt Kroenke4, John W. Putz5, Peter Mendel6 and David Rutkowski7 Implementation Science Lewis et al. Implementation Science (2015) 10:127 DOI 10.1186/s13012-015-0313-2
slide-34
SLIDE 34

Measurement Based Care (MBC)

34

  • MBC is the systematic evaluation of patient symptoms prior to or during a clinical encounter to inform treatment
  • 3 components of MBC fidelity:
  • 1. Administer measure
  • 2. Review data
  • 3. Discuss
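MBC fidelity here starts with administering and reviewing a measure, in this trial the PHQ-9. As a concrete anchor, here is a minimal scoring sketch using the published PHQ-9 conventions (total = sum of nine items rated 0-3, with standard severity bands); the function names and example responses are illustrative, not drawn from the study.

```python
# Minimal PHQ-9 scoring sketch: total = sum of nine items, each rated 0-3.
# Severity bands follow the published PHQ-9 conventions; the example
# responses below are hypothetical.
def phq9_score(items):
    """Total PHQ-9 score from nine item ratings (each 0-3)."""
    assert len(items) == 9 and all(0 <= i <= 3 for i in items)
    return sum(items)

def severity(score):
    """Map a total score (0-27) to its standard severity band."""
    bands = [(4, "minimal"), (9, "mild"), (14, "moderate"),
             (19, "moderately severe"), (27, "severe")]
    return next(label for upper, label in bands if score <= upper)

responses = [1, 2, 1, 0, 2, 1, 1, 0, 1]   # hypothetical item ratings
total = phq9_score(responses)
print(total, severity(total))  # 9 mild
```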
slide-35
SLIDE 35

iMBC: Implementing Measurement Based Care

35

  • Aim 1: To compare the effect of standardized versus tailored MBC implementation on:
  • 1a: Clinician-level outcomes
  • 1b: Client-level outcomes
slide-36
SLIDE 36

Study Design

Assessment schedule: Baseline Assessment (mid Oct. 2016) → Training Assessment (March 2017) → 5-month follow-up Assessment* (Aug. 2017) → 15-month follow-up Assessment (Jan. 2018)
*Note: 5 to 7 implementation team meetings occur between the Training Assessment and the 5-month follow-up assessment.

Site clusters (setting, size, condition): Rural, Small, Tailored; Urban, Medium, Standard; Rural, Small, Standard; Urban, Small, Standard; Urban, Large, Tailored; Urban, Small, Tailored; Rural, Small, Standard; Urban, Medium, Standard; Urban, Large, Tailored; Rural, Medium, Tailored; Urban, Medium, Standard; Urban, Large, Tailored
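In a cluster randomized design like this one, sites rather than individual clinicians are assigned to conditions. The sketch below illustrates cluster-level assignment with a fixed seed; the site names are hypothetical and this is not the iMBC trial's actual randomization procedure.

```python
# Toy cluster randomization: shuffle the site clusters, then alternate
# conditions so the allocation stays balanced. Illustrative only; the
# site names are hypothetical.
import random

sites = ["Site A", "Site B", "Site C", "Site D", "Site E", "Site F"]

def randomize_clusters(clusters, seed=2016):
    """Assign each cluster to Standardized or Tailored, balanced overall."""
    rng = random.Random(seed)          # fixed seed keeps the draw auditable
    shuffled = clusters[:]
    rng.shuffle(shuffled)
    return {site: ("Standardized" if i % 2 == 0 else "Tailored")
            for i, site in enumerate(shuffled)}

print(randomize_clusters(sites))
```

Because whole sites are randomized, the unit of analysis must account for clustering of clinicians (and clients) within sites.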
slide-37
SLIDE 37

iMBC Conditions

37

Contextual factor, implementation strategy, and condition-specific focus:

  • Resources. Strategy: paper-based PHQ-9 with score entered in EHR for clinician review. Standardized: client completion of paper PHQ-9 and score entered in EHR for review by the clinician. Tailored: client completion of paper PHQ-9 and score entered in EHR for review by the clinician.
  • Networks & Linkages. Strategy: form implementation teams. Standardized: promoting MBC fidelity per the guideline. Tailored: identifying and addressing emergent barriers.
  • Policies & Incentives. Strategy: guideline for PHQ-9 administration frequency. Standardized: each session with the client. Tailored: determined by site.
  • Norms & Attitudes. Strategy: initial MBC training; penetration data provided. Standardized: standardized training material. Tailored: tailored training material targeting identified barriers from the needs assessment; penetration data to inform tailored implementation.
  • Structure & Process. Strategy: progress note modifications; office professional involvement in administration. Standardized: for clinician score review; office professionals provide client PHQ-9 in waiting room. Tailored: for clinician score review; site determines PHQ-9 administration and role of office professionals.
  • Media & Change Agents. Strategy: triweekly consultation with experts. Standardized: consultation focuses on MBC fidelity. Tailored: consultation focuses on targeting identified barriers and tailoring strategies to address contextual barriers identified throughout the course of implementation.

Blended Implementation Protocol: Strategies to Address Determinants Standardized: Best Practices Tailored: Content is Contextualized

slide-38
SLIDE 38

38

slide-39
SLIDE 39

Orientation to the Science of Dissemination and Implementation

Implementation Process Models

slide-40
SLIDE 40

40

Graham et al. (2006)

Process Model Example 1: The KTA Cycle

slide-41
SLIDE 41

41

Process Model Example 2: QIF

Phase One: Initial considerations regarding the host setting Assessment strategies
  • 1. Conducting a needs and resources assessment
  • 2. Conducting a fit assessment
  • 3. Conducting a capacity/readiness assessment
Decisions about adaptation
  • 4. Possibility for adaptation
Capacity-building strategies
  • 5. Obtaining explicit buy-in from critical stakeholders and
fostering a supportive community/organizational climate
  • 6. Building general/organizational capacity
  • 7. Staff recruitment/maintenance
  • 8. Effective pre-innovation staff training
Phase Two: Creating a structure for implementation Structural features for implementation
  • 9. Creating implementation teams
  • 10. Developing an implementation plan
Phase Three: Ongoing structure once implementation begins Ongoing implementation support strategies
  • 11. Technical assistance/coaching/supervision
  • 12. Process evaluation
  • 13. Supportive feedback mechanism
Phase Four: Improving future applications
  • 14. Learning from experience
ORIGINAL PAPER

The Quality Implementation Framework: A Synthesis of Critical Steps in the Implementation Process

Duncan C. Meyers • Joseph A. Durlak • Abraham Wandersman Am J Community Psychol (2012) 50:462–480 DOI 10.1007/s10464-012-9522-x
slide-42
SLIDE 42

42

Process Model Example 3: DIHSRF

Mendel et al. (2008)

slide-43
SLIDE 43

Panelist Question

43

?

What implementation process models have guided your work?

slide-44
SLIDE 44

44

slide-45
SLIDE 45

Orientation to the Science of Dissemination and Implementation

Identifying and Prioritizing Barriers and Facilitators

slide-46
SLIDE 46

Assessing Barriers/Facilitators

Determinants: “Factors that might prevent or enable improvements in practice (barriers, enablers, facilitators, problems & needs, or disincentives or incentives)”

(Flottorp et al., 2013)

Methods

  • Literature search
  • Informal consultation
  • Surveys
  • Interviews, focus groups, ethnographic methods
  • Mixed methods approaches

46

slide-47
SLIDE 47

Determinant Frameworks: Example 1 - CFIR

47

Short Description

  • Consolidates theories & frameworks from multiple disciplines
  • 39 constructs within 5 domains
  • No specific relationships defined

slide-48
SLIDE 48

Determinant Frameworks: Example 2 - TDF

48

DOMAINS

TDF-1 (2005): 1. Knowledge; 2. Skills; 3. Social, professional role and identity; 4. Beliefs about capabilities; 5. Beliefs about consequences; 6. Motivation and goals; 7. Memory, attention and decision processes; 8. Environmental context and resources; 9. Social influences; 10. Emotion; 11. Behavioural regulation; 12. Nature of behaviour

TDF-2 (2012): 1. Knowledge; 2. Skills; 3. Social, professional role and identity; 4. Beliefs about capabilities; 5. Optimism; 6. Beliefs about consequences; 7. Reinforcement; 8. Intention; 9. Goals; 10. Memory, attention and decision processes; 11. Environmental context and resources; 12. Social influences; 13. Emotion; 14. Behavioural regulation

Michie et al. (2005); Cane et al. (2012)

slide-49
SLIDE 49

49

EXPLORATION
Outer context: sociopolitical context (legislation; policies; monitoring and review); funding (service grants; research grants; foundation grants; continuity of funding); client advocacy (consumer organizations); interorganizational networks (direct networking; indirect networking; professional organizations; clearinghouses; technical assistance centers)
Inner context: organizational characteristics (absorptive capacity; knowledge/skills; readiness for change; receptive context; culture; climate; leadership); individual adopter characteristics (values; goals; social networks; perceived need for change)

PREPARATION
Outer context: sociopolitical (federal legislation; local enactment; definitions of “evidence”); funding (support tied to federal and state policies); client advocacy (national advocacy; class action lawsuits); interorganizational networks (organizational linkages; leadership ties; information transmission, formal and informal)
Inner context: organizational characteristics (size; role specialization; knowledge/skills/expertise; values); leadership (culture embedding; championing adoption)

IMPLEMENTATION
Outer context: sociopolitical (legislative priorities; administrative costs); funding (training; sustained fiscal support; contracting arrangements; community-based organizations); interorganizational networks (professional associations; cross-sector; contractor associations; information sharing; cross-discipline translation); intervention developers (engagement in implementation); leadership (cross-level congruence; effective leadership practices)
Inner context: organizational characteristics (structure; priorities/goals; readiness for change; receptive context; culture/climate; innovation-values fit; EBP structural fit; EBP ideological fit); individual adopter characteristics (demographics; adaptability; attitudes toward EBP)

SUSTAINMENT
Outer context: sociopolitical (leadership; policies; federal initiatives; state initiatives; local service system; consent decrees); funding (fit with existing service funds; cost absorptive capacity; workforce stability impacts); public-academic collaboration (ongoing positive relationships; valuing multiple perspectives)
Inner context: organizational characteristics (leadership; embedded EBP culture; critical mass of EBP provision; social network support); fidelity monitoring/support (EBP role clarity; fidelity support system; supportive coaching); staffing (staff selection criteria; validated selection procedures)

Determinant & Process Framework Example: EPIS

ORIGINAL PAPER Advancing a Conceptual Model of Evidence-Based Practice Implementation in Public Service Sectors Gregory A. Aarons • Michael Hurlburt • Sarah McCue Horwitz Adm Policy Ment Health (2011) 38:4–23 DOI 10.1007/s10488-010-0327-7
slide-50
SLIDE 50

50

Implementation Theory Example: Implementation Climate

DEBATE Open Access

The meaning and measurement of implementation climate

Bryan J Weiner1*†, Charles M Belden1†, Dawn M Bergmire2† and Matthew Johnston2† Weiner et al. Implementation Science 2011, 6:78 http://www.implementationscience.com/content/6/1/78 Implementation Science

Are interventions expected, supported, and rewarded?

RESEARCH Open Access

Linking molar organizational climate and strategic implementation climate to clinicians’ use of evidence-based psychotherapy techniques: cross-sectional and lagged analyses from a 2-year observational study

Nathaniel J. Williams, Mark G. Ehrhart, Gregory A. Aarons, Steven C. Marcus and Rinad S. Beidas. Williams et al. Implementation Science (2018) 13:85 https://doi.org/10.1186/s13012-018-0781-2
slide-51
SLIDE 51

Panelist Questions

51

?

How did you assess and prioritize barriers? What specific barriers are you addressing?

slide-52
SLIDE 52

52

slide-53
SLIDE 53

Orientation to the Science of Dissemination and Implementation

Identifying and Applying Implementation Strategies

slide-54
SLIDE 54

Implementation Strategies - Methods or techniques used to enhance the adoption, implementation, sustainment, and scale-up of a program or practice.

54

Proctor, Powell, & McMillen (2013); Powell, Garcia, & Fernandez (2018)

slide-55
SLIDE 55

Types of Strategies

  • Discrete – Single action or process (e.g., reminders, audit and feedback, supervision)
  • Multifaceted – Combination of multiple discrete strategies (e.g., training + consultation), some of which have been protocolized and branded

55

Powell et al. (2012, 2015)

slide-56
SLIDE 56

56

Powell et al. (2012)

slide-57
SLIDE 57

73 Discrete Strategies

57

RESEARCH Open Access

A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project

Byron J Powell1*, Thomas J Waltz2, Matthew J Chinman3,4, Laura J Damschroder5, Jeffrey L Smith6, Monica M Matthieu6,7, Enola K Proctor8 and JoAnn E Kirchner6,9 Implementation Science Powell et al. Implementation Science (2015) 10:21 DOI 10.1186/s13012-015-0209-1

*See Additional File 6 of Powell et al. (2015) for the most comprehensive version of the compilation

SHORT REPORT Open Access

Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study

Thomas J. Waltz1,2*, Byron J. Powell3, Monica M. Matthieu4,5,10, Laura J. Damschroder2, Matthew J. Chinman6,7, Jeffrey L. Smith5,10, Enola K. Proctor8 and JoAnn E. Kirchner5,9,10 Implementation Science Waltz et al. Implementation Science (2015) 10:109 DOI 10.1186/s13012-015-0295-0
slide-58
SLIDE 58

58

Strategy | Review (Number of Trials) | Effect Sizes

  • Printed Educational Materials: 14 randomized trials, 31 ITS. Median absolute improvement 2.0% (range 0% to 11%)
  • Educational Meetings: 81 randomized trials. Median absolute improvement 6% (IQR 1.8% to 15.3%)
  • Educational Outreach: 69 randomized trials. Median absolute improvement in prescribing behaviors 4.8% (IQR 3% to 6.6%); other behaviors 6% (IQR 3.6% to 16%)
  • Local Opinion Leaders: 18 randomized trials. Median absolute improvement 12% (6% to 14.5%)
  • Audit and Feedback: 140 randomized trials. Median absolute improvement 4.3% (IQR 0.5% to 16%)
  • Computerized Reminders: 28 randomized trials. Median absolute improvement 4.2% (IQR 0.8% to 18.8%)
  • Tailored Interventions: 26 randomized trials. Meta-regression using 15 trials; pooled odds ratio 1.56 (95% CI, 1.27 to 1.93, p < .001)

Examples of Cochrane EPOC reviews updated from Grimshaw et al. (2012)
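The “median absolute improvement (IQR)” summaries above are simple order statistics over per-trial effects. A minimal sketch of how such a summary could be computed; the trial values below are hypothetical, not data from any EPOC review.

```python
# Sketch: computing a "median absolute improvement (IQR)" summary from
# per-trial results. The improvement values below are hypothetical.
import statistics

def summarize(improvements):
    """Median and interquartile range of per-trial absolute improvements (%)."""
    q1, _, q3 = statistics.quantiles(improvements, n=4, method="inclusive")
    return statistics.median(improvements), (q1, q3)

trials = [1.0, 2.5, 4.3, 5.0, 7.8, 12.0, 16.0]   # absolute improvement, %
median, (q1, q3) = summarize(trials)
print(f"Median absolute improvement {median}% (IQR {q1}% to {q3}%)")
```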

slide-59
SLIDE 59

Now what?

59

How do we design and tailor strategies?

slide-60
SLIDE 60

60

Identified Barriers → Relevant Implementation Strategies

  • Lack of knowledge → Interactive education sessions
  • Perception/reality mismatch → Audit and feedback
  • Lack of motivation → Incentives/sanctions
  • Beliefs/attitudes → Peer influence/opinion leaders

Discrete Strategy Examples
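The barrier-to-strategy pairings above can be expressed as a simple lookup table. A sketch using the slide's pairings as data; the helper name and fallback message are illustrative, not from the slides.

```python
# Barrier-to-strategy lookup built from the pairings above. The helper
# name and the fallback message are illustrative.
BARRIER_TO_STRATEGY = {
    "Lack of knowledge": "Interactive education sessions",
    "Perception/reality mismatch": "Audit and feedback",
    "Lack of motivation": "Incentives/sanctions",
    "Beliefs/attitudes": "Peer influence/opinion leaders",
}

def match_strategies(identified_barriers):
    """Return a candidate discrete strategy for each identified barrier."""
    return {barrier: BARRIER_TO_STRATEGY.get(barrier, "No mapped strategy")
            for barrier in identified_barriers}

print(match_strategies(["Lack of knowledge", "Staff turnover"]))
```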

slide-61
SLIDE 61

61

Multifaceted Strategy Example

[Figure: a multifaceted strategy for increasing cervical cancer screening operates at multiple levels: health care collaboratives (organizational), provider communication (interpersonal), and education and counseling for women (intrapersonal), working through the physician's motivation, the woman's knowledge, and provider-patient interaction.]

Weiner et al. (2012)
slide-62
SLIDE 62

Unfortunately, we far too often…

62

[Figure: absolute effect size by number of interventions in the treatment group (1, 2, 3, 4, >4; N = 56, 63, 46, 28, and 16, respectively), on a scale from -80% to 80%: adding more intervention components does not yield larger effects.]

Grimshaw et al. (2004); Henggeler et al. (2002); Squires et al. (2014)

  • “Kitchen Sink” approach
  • “ISLAGIATT” approach: “It seemed like a good idea at the time” (Eccles)
  • “Train and Pray” approach
  • “One Size Fits All” approach

slide-63
SLIDE 63

Opportunities to Advance Implementation Science

  • Diversify the strategies tested
  • Study discrete, multifaceted, and tailored strategies
  • Utilize a wider range of designs and methods

63

Brown et al. (2017); Institute of Medicine (2009); Lau et al. (2015); Mazucca et al. (2018); Powell et al. (2014)

slide-64
SLIDE 64

Need to Enhance Methods for Designing and Tailoring

64

Baker et al. (2015); Bosch et al. (2007); Colquhoun et al. (2017); Grol et al. (2013); Powell et al. (2017)

1

slide-65
SLIDE 65

Need to Enhance Methods for Designing and Tailoring

65

Baker et al. (2015); Bosch et al. (2007); Colquhoun et al. (2017); Grol et al. (2013); Powell et al. (2017)

  • Group Model Building
  • Conjoint Analysis
  • Concept Mapping
  • Intervention Mapping
slide-66
SLIDE 66

Specify & Test Mechanisms

66

Lewis et al. (2018); National Institutes of Health (2016); Weiner et al. (2012); Williams et al. (2016)

2

slide-67
SLIDE 67

Specify & Test Mechanisms

67

R13 HS025632 (Lewis, PI)

Workgroup Co-Leads & Key Issues:
  • Strategy → Mechanism → Outcome: Brian Mittman & Byron Powell
  • Causal Theory & Context: Rinad Beidas & Nate Williams
  • Measurement: Bryan Weiner & Cara Lewis
  • Design & Analysis: Greg Aarons & Aaron Lyon

slide-68
SLIDE 68

Improve Description, Tracking, and Reporting

68

  • Poor description, tracking, and reporting:
  • Limits replication in science and practice
  • Precludes answers to how and why strategies work
  • Numerous reporting guidelines exist

Albrecht et al. (2013); Boyd et al. (2018); Bunger et al. (2017); Hoffman et al. (2014); Proctor et al. (2013)

3

slide-69
SLIDE 69

Name it, Define it, Specify it!

69

Proctor, Powell, & McMillen (2013); https://impsciuw.org/implementation-strategies/
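Proctor, Powell, & McMillen (2013) recommend naming and specifying each strategy along dimensions such as actor, action, action target, temporality, dose, implementation outcome affected, and justification. A sketch of one strategy specified as plain data; every field value shown is an illustrative example, not from an actual study.

```python
# One implementation strategy specified along the Proctor et al. (2013)
# reporting dimensions. All field values are illustrative examples.
strategy = {
    "name": "Audit and provide feedback",
    "actor": "Quality improvement coordinator",        # who enacts the strategy
    "action": "Compile each clinician's fidelity data into a report",
    "action_target": "Clinicians delivering the EBP",  # who/what is targeted
    "temporality": "Monthly, starting after initial training",
    "dose": "One feedback report per clinician per month",
    "implementation_outcome": "Fidelity",              # outcome affected
    "justification": "Audit and feedback shows modest median improvements "
                     "across randomized trials (Grimshaw et al., 2012)",
}

for dimension, value in strategy.items():
    print(f"{dimension}: {value}")
```

Recording strategies in this structured form makes tracking and reporting auditable across sites and over time.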

slide-70
SLIDE 70

Increase Economic Evaluations

70

  • In a review of 235 implementation studies, only 10% provided information about implementation costs
  • Severely inhibits decision making regarding strategies
  • Practical tools have been developed (e.g., COINS)
  • Common framework facilitating comparability is needed

Hoomans & Severens (2014); Raghavan et al. (2018); Saldana et al. (2014); Vale et al. (2007)

4

slide-71
SLIDE 71

71

Bunger et al. (2017)

Increase Economic Evaluations

slide-72
SLIDE 72

Panelist Questions

72

?

How have you identified, developed, selected, and tailored implementation strategies? What specific implementation strategies are you applying?

slide-73
SLIDE 73

73

slide-74
SLIDE 74

Orientation to the Science of Dissemination and Implementation

Evaluating Implementation Efforts

slide-75
SLIDE 75

75

Glasgow et al. (1999)

Evaluation Framework: Example 1 – RE-AIM

slide-76
SLIDE 76

76

Proctor et al. (2009)

Evaluation Framework: Example 2 – IR in MH

What? QIs, ESTs → How? Implementation Strategies →

Implementation Outcomes: Acceptability, Adoption, Appropriateness, Feasibility, Fidelity, Costs, Penetration, Sustainability
Service Outcomes*: Efficiency, Safety, Effectiveness, Equity, Patient-centeredness, Timeliness
Patient Outcomes: Satisfaction, Function, Health status/symptoms

*IOM Standards of Care
slide-77
SLIDE 77

77

Implementation Outcomes

ORIGINAL PAPER Outcomes for Implementation Research: Conceptual Distinctions, Measurement Challenges, and Research Agenda Enola Proctor • Hiie Silmere • Ramesh Raghavan • Peter Hovmand • Greg Aarons • Alicia Bunger • Richard Griffey • Melissa Hensley Adm Policy Ment Health (2011) 38:65–76 DOI 10.1007/s10488-010-0319-7 RESEARCH Open Access

Psychometric assessment of three newly developed implementation outcome measures

Bryan J. Weiner1* , Cara C. Lewis2,3,4, Cameo Stanick5, Byron J. Powell6, Caitlin N. Dorsey2, Alecia S. Clary6, Marcella H. Boynton7 and Heather Halko8 Weiner et al. Implementation Science (2017) 12:108 DOI 10.1186/s13012-017-0635-3 SYSTEMATIC REVIEW Open Access

Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria

Cara C. Lewis1,2*, Sarah Fischer1, Bryan J. Weiner3, Cameo Stanick4, Mimi Kim5,6 and Ruben G. Martinez7 Implementation Science Lewis et al. Implementation Science (2015) 10:155 DOI 10.1186/s13012-015-0342-x
slide-78
SLIDE 78

78

Qualitative and Mixed Methods

Best Practices for Mixed Methods Research in the Health Sciences

Commissioned by the Office of Behavioral and Social Sciences Research (OBSSR) Helen I. Meissner, Ph.D., Office of Behavioral and Social Sciences Research By John W. Creswell, Ph.D., University of Nebraska-Lincoln Ann Carroll Klassen, Ph.D., Drexel University Vicki L. Plano Clark, Ph.D., University of Nebraska-Lincoln Katherine Clegg Smith, Ph.D., Johns Hopkins University With the Assistance of a Specially Appointed Working Group ORIGINAL PAPER

Mixed Method Designs in Implementation Research

Lawrence A. Palinkas • Gregory A. Aarons • Sarah Horwitz • Patricia Chamberlain • Michael Hurlburt • John Landsverk Adm Policy Ment Health (2011) 38:44–53 DOI 10.1007/s10488-010-0314-z

Integration is key!

slide-79
SLIDE 79

79

slide-80
SLIDE 80

80

Study Design Resources

QUALITY IMPROVEMENT RESEARCH

Research designs for studies evaluating the effectiveness of change and improvement strategies

M Eccles, J Grimshaw, M Campbell, C Ramsay. Qual Saf Health Care 2003;12:47–52

Study Designs for Effectiveness and Translation Research

Identifying Trade-offs

Shawna L. Mercer, MSc, PhD, Barbara J. DeVinney, PhD, Lawrence J. Fine, MD, DrPH, Lawrence W. Green, DrPH, Denise Dougherty, PhD
slide-81
SLIDE 81

81

Hybrid Effectiveness- Implementation Designs

Effectiveness-implementation Hybrid Designs

Combining Elements of Clinical Effectiveness and Implementation Research to Enhance Public Health Impact

Geoffrey M. Curran, PhD,* Mark Bauer, MD,w Brian Mittman, PhD,z Jeffrey M. Pyne, MD,* and Cheryl Stetler, PhDz

slide-82
SLIDE 82

82

[Figure: the pipeline from Efficacy Studies to Effectiveness Studies to D&I Studies, ending in improved clinical outcomes, quality outcomes, and processes of care.]

The “Gap”

It can take years to decades to translate research findings to practice to benefit patients and providers

Curran et al. (2012)

slide-83
SLIDE 83

83

[Figure: the same pipeline (Efficacy Studies → Effectiveness Studies* → D&I Studies → improved clinical outcomes, quality outcomes, and processes of care), with hybrid designs spanning the effectiveness-to-D&I gap.]

Hybrid Designs Go Here

Curran et al. (2012)

slide-84
SLIDE 84

84

The Continuum: Clinical Effectiveness Research ↔ Implementation Research

  • Hybrid Type 1. Primary Aim: determine effectiveness of a clinical intervention. Secondary Aim: better understand the context for implementation.
  • Hybrid Type 2. Primary Aim: determine effectiveness of a clinical intervention. Co-Primary Aim: determine feasibility and/or (potential) impact of an implementation strategy.
  • Hybrid Type 3. Primary Aim: determine impact of an implementation strategy. Secondary Aim: assess clinical outcomes associated with the implementation trial.

Curran et al. (2012)

slide-85
SLIDE 85

Panelist Questions

85

?

What implementation outcomes are you assessing? How have you utilized mixed methods? What designs have you used?

slide-86
SLIDE 86

86

P2i Evaluation

Aim 1: Does use of CBT increase over the 5-year period in response to the creation of EPIC?

  • Outcome: Use of CBT strategies. Measure/Method: self-reported use of therapy strategies with a representative client, using the Therapist Procedures Checklist-Family Revised

slide-87
SLIDE 87

87

P2i Evaluation

Aim 2: What drives clinician behavior in a large system implementing EBPs?

  • Organizational Culture & Climate: Organizational Social Context Measure
  • Implementation Climate: Implementation Climate Scale
  • Leadership: Implementation Leadership Scale; Multifactor Leadership Questionnaire
  • Attitudes: Evidence-Based Practice Attitudes Scale
  • Knowledge: Knowledge of Evidence-Based Services Questionnaire
  • Use of CBT strategies (outcome): see above

slide-88
SLIDE 88

88

P2i Evaluation

Aim 3: What are stakeholder perspectives on barriers and facilitators to implementation of EBPs?

  • EPIS constructs: semi-structured interviews

slide-89
SLIDE 89

89

iMBC Evaluation

Aim 1: To compare the effect of standardized versus tailored MBC implementation on clinician-level (1a) and client-level (1b) outcomes.

  • Tailored Implementation Strategies: coding of implementation team meetings using reporting recommendation categories
  • Fidelity: electronic health record data capture (self-report & objective data capture)

slide-90
SLIDE 90

90

iMBC Evaluation

Aim 2: To identify contextual mediators of MBC fidelity.

  • Norms: self-report from the Theory of Planned Behavior
  • Attitudes: Monitoring and Feedback Attitudes Scale
  • Climate: Implementation Climate Scale
  • Policies: focus groups
  • Networks & Linkages: Sociometric Questionnaire

slide-91
SLIDE 91

91

slide-92
SLIDE 92

Orientation to the Science of Dissemination and Implementation

Discussion

slide-93
SLIDE 93

93

Discussion & Questions

  • Advice for someone starting off in the field?
  • How best to network and connect in the field?
  • To what do you attribute your success in the field?
  • How to build capacity at your institution?

slide-94
SLIDE 94

Acknowledgments

  • NIMH K23MH099179 (Beidas, PI)
  • NIMH P50MH113840 (Beidas, Mandell, Volpp, MPI)
  • NIMH T32MH109433 (Beidas, Mandell, MPI)
  • NIMH R01MH103310 (Lewis, PI)
  • NIMH R01MH106510 (Lewis, PI)
  • AHRQ R13HS025632 (Lewis, PI)
  • NIMH K01MH113806 (Powell, PI)

94

slide-95
SLIDE 95

Contact Information

Rinad S. Beidas: rbeidas@upenn.edu
Cara C. Lewis: cara.c.lewis@kp.org
Byron J. Powell: bjpowell@unc.edu

95