Systematic Review

A critical synthesis of literature on the promoting action on research implementation in health services (PARIHS) framework

Christian D Helfrich12*, Laura J Damschroder3, Hildi J Hagedorn45, Ginger S Daggett6, Anju Sahay7, Mona Ritchie8, Teresa Damush69, Marylou Guihan10, Philip M Ullrich11 and Cheryl B Stetler1213

Author Affiliations

1 Northwest HSR&D Center of Excellence, VA Puget Sound Healthcare System, Seattle, Washington, USA

2 Department of Health Services, University of Washington School of Public Health, Seattle, Washington, USA

3 HSR&D Center for Clinical Management Research and Diabetes QUERI, VA Ann Arbor Healthcare System, Ann Arbor, Michigan, USA

4 VA Substance Use Disorders Quality Enhancement Research Initiative, Minneapolis VA Medical Center, Minneapolis, Minnesota, USA

5 Department of Psychiatry, School of Medicine, University of Minnesota, Minneapolis, Minnesota, USA

6 VA Stroke QUERI, HSR&D Center of Excellence, Richard L. Roudebush VA Medical Center, Indianapolis, Indiana, USA

7 Chronic Heart Failure QUERI Center, VA Palo Alto Health Care System, Palo Alto, California, USA

8 Mental Health Quality Enhancement Research Initiative, Central Arkansas Veterans Healthcare System, North Little Rock, Arkansas, USA

9 Indiana University Center for Aging Research, Regenstrief Inc., Indianapolis, Indiana, USA

10 Spinal Cord Injury QUERI Research Coordinating Center, Center for Management of Complex Chronic Care (CMC3), Edward Hines, Jr. VA Hospital, Hines, Illinois, USA

11 Spinal Cord Injury QUERI, VA Puget Sound Health Care System, Seattle, Washington, USA

12 Independent Consultant, Amherst, Massachusetts, USA

13 Health Services Department, Boston University School of Public Health, Boston, Massachusetts, USA


Implementation Science 2010, 5:82  doi:10.1186/1748-5908-5-82


The electronic version of this article is the complete one and can be found online at: http://www.implementationscience.com/content/5/1/82


Received: 20 May 2010
Accepted: 25 October 2010
Published: 25 October 2010

© 2010 Helfrich et al; licensee BioMed Central Ltd.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Background

The Promoting Action on Research Implementation in Health Services framework, or PARIHS, is a conceptual framework that posits key, interacting elements that influence successful implementation of evidence-based practices. It has been widely cited and used as the basis for empirical work; however, there has not yet been a literature review to examine how the framework has been used in implementation projects and research. The purpose of the present article was to critically review and synthesize the literature on PARIHS to understand how it has been used and operationalized, and to highlight its strengths and limitations.

Methods

We conducted a qualitative, critical synthesis of peer-reviewed PARIHS literature published through March 2009. We synthesized findings through a three-step process using semi-structured data abstraction tools and group consensus.

Results

Twenty-four articles met our inclusion criteria: six core concept articles from original PARIHS authors, and eighteen empirical articles ranging from case reports to quantitative studies. Empirical articles generally used PARIHS as an organizing framework for analyses. No studies used PARIHS prospectively to design implementation strategies, and there was generally a lack of detail about how variables were measured or mapped, or how conclusions were derived. Several studies used findings to comment on the framework in ways that could help refine or validate it. The primary issue identified with the framework was a need for greater conceptual clarity regarding the definition of sub-elements and the nature of dynamic relationships. Strengths identified included its flexibility, intuitive appeal, explicit acknowledgement of the outcome of 'successful implementation,' and a more expansive view of what can and should constitute 'evidence.'

Conclusions

While we found studies reporting empirical support for PARIHS, the single greatest need for this and other implementation models is rigorous, prospective use of the framework to guide implementation projects. There is also need to better explain derived findings and how interventions or measures are mapped to specific PARIHS elements; greater conceptual discrimination among sub-elements may be necessary first. In general, it may be time for the implementation science community to develop consensus guidelines for reporting the use and usefulness of theoretical frameworks within implementation studies.

Background

Only a small proportion of research findings are widely translated into clinical settings [1], often due to barriers in the local setting [2]. The Promoting Action on Research Implementation in Health Services framework, or PARIHS, is a conceptual framework that posits key, interacting elements that influence successful implementation of evidence-based practices (EBPs) [3-7]. Implementation researchers have widely cited PARIHS or used it as the basis for empirical work [8-11]. This body of research has occurred against the backdrop of broad calls to incorporate theoretical frameworks in quality improvement implementation activities and research [12-14].

It has been over a decade since Kitson and colleagues first described the PARIHS framework, and while several papers have been published that update and propose refinements [4-7,14,15], there has not yet been a literature review to examine how the framework has been used in implementation projects and research. Our interest in PARIHS grew out of its use by numerous researchers involved in the Veterans Health Administration (VA) Quality Enhancement Research Initiative and their expressed need for guidance in how to use it in implementation projects. The purpose of the present article is to critically review and synthesize the conceptual and empirical literatures on PARIHS to: understand how PARIHS has been used; understand how its elements and sub-elements have been operationalized; and highlight strengths and limitations of PARIHS relative to use of the framework to guide an implementation study. We close with a set of recommendations to increase the value of the PARIHS framework for guiding implementation activities and research.

PARIHS framework

PARIHS outlines the determinants of successful implementation of evidence into practice. It was initially published in 1998 as an unnamed framework inductively developed from the authors' experience with practice improvement and guideline implementation efforts [3]. They presented three case examples, with accompanying descriptive analyses, to illustrate its usefulness. Subsequently, two concept analyses were published exploring the maturity, meaning, and characteristics of facilitation [4] and context [5] as they relate to implementation. These concept analyses were based on non-systematic reviews of the literature. The original authors published a refined version of the framework in 2002 based on theoretical insights from these concept analyses [15]. This article contained the first published use of the PARIHS label. A conceptual exploration of evidence was published in 2004, which rounded out the PARIHS team's review of their framework's three core elements [6]. Kitson and colleagues published a further clarification of PARIHS in 2008. This latest paper proposed that PARIHS is best used in a two-step process: first, as a framework to diagnose and guide preliminary assessment of evidence and context; and second, to guide the development, selection, and assessment of facilitation strategies based on the existing evidence base and local context [7].

The framework comprises three interacting core elements: evidence (E) - 'codified and non-codified sources of knowledge' [7] as perceived by multiple stakeholders; context (C) - the quality of the environment or setting in which the research is implemented; and facilitation (F) - a 'technique by which one person makes things easier for others,' achieved through 'support to help people change their attitudes, habits, skills, ways of thinking, and working' [3]. The core assertion is that successful implementation is a function of E, C, and F and their interrelationships. The status of each of these elements can be assessed for whether it will have a weak ('low' rating) or strong ('high' rating) effect on implementation (Figure 1).
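The core papers summarize this assertion as a simple schematic function (the notation follows the original PARIHS authors; it is illustrative shorthand, not a quantitative model):

SI = f(E, C, F)

where SI denotes successful implementation, and E, C, and F are each assessed on a continuum from 'low' to 'high.' The functional form signals interdependence among the three elements rather than an additive or weighted combination.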

Figure 1. Key elements for implementing evidence into practice, from Rycroft-Malone et al. [29].

In the PARIHS framework, evidence consists of four sub-elements, corresponding to four main sources of evidence: research evidence from studies and clinical practice guidelines including, but not limited to, formal experiments; clinical experience or related professional knowledge; patient preferences and experiences; and locally derived information or data, such as project evaluations or quality improvement initiatives [6,7]. A fundamental premise of PARIHS is that while research evidence is often treated as the most heavily weighted source, all four sources have meaning and constitute evidence from the perspective of end users.

Context comprises four sub-elements: receptive context, organizational culture, leadership, and evaluation [5,7]. All four of these sub-elements are defined in PARIHS core papers [5,7], and, for culture, leadership, and evaluation, definitions from the broader literature are cited in a related concept analysis [5]. For example, culture is alternatively described as a 'paradigm,' as 'the way things are done around here,' and as a metaphor for the organization (something the organization is rather than something it possesses); leadership is described as an indicator or reflection of the 'nature of human relationships' in the organization, pertaining to the types of leadership roles enacted and who enacts them [3,5]; and evaluation is described largely in terms of feedback [5] and how performance data are collected and reported [7]. Earlier papers describe each sub-element in terms of 'high' and 'low' ratings, which indicate a more or less favorable context for successful implementation, respectively. Indications for a high rating of context include, for example: clearly defined and acknowledged physical, social, cultural, structural, and/or system boundaries; valuing individual staff and clients; promoting organizational learning; the existence of transformational leadership as well as democratic or inclusive decision making; and the existence of feedback on individual, team, and/or system performance [15,16].

Facilitation includes three sub-elements and an array of mechanisms to influence implementation of evidence into clinical practice. The first sub-element of facilitation focuses on its purpose; e.g., whether facilitation is to support attainment of a specific goal (task-oriented) or to enable individuals or teams to reflect on and change their attitudes and ways of working (holistic-oriented) [15]. In the PARIHS framework, these two purposes are arrayed as endpoints on a continuum. The second and third sub-elements of facilitation are the role of the facilitator(s) and their associated skills and attributes, which are described for each of the two purposes. On the task-oriented end of the continuum, the facilitator might engage in episodic contacts and provide practical, focused help, which requires strong project management/technical skills but a relatively low level of intensity. On the holistic-oriented end of the continuum, the facilitator might focus on building sustained partnerships with teams to assist them in developing their own practice change skills, which requires a relatively high level of intensity.

Methods

We used qualitative, critical synthesis methods for this review because our objectives were descriptive (e.g., describing how PARIHS has been used) and critical (e.g., appraising relative strengths and weaknesses of the framework), rather than meta-analytic (e.g., calculating an average effect size) [17]. We describe our review process below.

Search strategy and selection of publications

Our literature search included three sources. First, we conducted key word searches of the PubMed and CINAHL databases using the terms 'PARIHS' and 'promoting action on research implementation in health services.' We selected PubMed because it represents the preeminent database of peer-reviewed literature in the health fields, and CINAHL because it focuses specifically on nursing literature, where some of the original PARIHS concept papers were published. We used limited key words because this review was focused on the PARIHS model, rather than implementation models generally. Second, we reviewed the reference lists of included articles. Third, we solicited citations from a PARIHS author and other colleagues familiar with this body of research.

We selected articles based on four a priori criteria: published peer-reviewed literature, English language, published prior to March 2009, and explicit reference to the PARIHS framework either by name or citation of core conceptual articles. We did not specify a priori exclusion criteria.

Appraisal and abstraction of articles

We appraised and abstracted included articles in a three-step process. First, each article was read by a primary reviewer who wrote a narrative synopsis using a template (see Additional File 1, Synopsis template). The purpose of the initial synopsis was to provide an overall summary and critique of the article. Second, the completed synopsis was distributed to and reviewed by all co-authors, and discussed and refined on a conference call. Third, one of the co-authors condensed each synopsis using a structured summary table, with a separate table for each article. The purpose of the summary tables was to create a concise, structured appraisal and critique for each article. Some papers were empirical and others were conceptual. Summary tables for empirical articles included the overall method/design, an appraisal of study quality, study outcomes, how PARIHS was proposed to be used and actually used, and an assessment of congruency between PARIHS and study methods (see Additional File 2, Empirical article summary table). These tables also listed how PARIHS elements and sub-elements were defined and measured or operationalized in the study, along with findings, barriers, and enablers to implementation. The summary tables for core concept articles focused on the framework's elements, sub-elements, limitations, recommendations, and other observations (Additional File 3, Core-concept article summary table). These summary tables were reviewed by the primary reviewer for that paper and again by all co-authors, discussed as a group, and affirmed or revised as needed. This collection of empirical and core summary tables constituted the analytic foundation for our meta-summary and synthesis.

Additional file 1. Synopsis template. The synopsis template is a semi-structured form for initial narrative abstraction and critique of the included articles. It included the article abstract and six sections to be filled out by the reviewer, such as aspects of the PARIHS framework said to influence the study.


Additional file 2. Summary table template for empirical articles. The summary table template is a semi-structured tool for article abstraction and critique that was in tabular format and included more discrete data elements than the synopsis template, e.g., broken down by PARIHS element and sub-element. The summary table differed between the core-concept and empirical articles because of the types of publication (e.g., differences in the purposes and methods of the papers). This is the summary table for the empirical articles.


Additional file 3. Summary table template for core concept articles. The summary table template is a semi-structured tool for article abstraction and critique that was in tabular format and included more discrete data elements than the synopsis template, e.g., broken down by PARIHS element and sub-element. The summary table differed between the core-concept and empirical articles because of the types of publication and related content (e.g., differences in the purposes and methods of the papers). This is the summary table for the core concept articles.


Meta-summary and synthesis

Four co-authors reviewed the final set of summary tables and independently highlighted key points per article to create a meta-summary. Key points represented concepts, specific findings related to PARIHS generally and/or to specific elements or sub-elements, observations about the use of the framework, and conclusions. Information highlighted as a key point by at least three of the four co-authors was discussed further at a two-day, in-person working conference. The purpose of the discussion of key points was to explore and summarize similarities and differences across the papers (both empirical and core conceptual) and to develop qualitative themes. Some of the themes were descriptive, e.g., regarding the actual versus articulated use of PARIHS. Other themes were interpretive, e.g., our consensus judgments regarding overall limitations, related issues, and strengths of the framework relative to the ability of researchers to effectively use it to guide an implementation study. We developed implications for using the framework as well as related recommendations based on these synthesized findings. As with the article appraisal, the synthesis and recommendations were discussed with all co-authors and refined until consensus was reached.

Results

Search results

We initially identified 33 unique articles (Figure 2). We excluded an unpublished doctoral dissertation [18], and eight commentaries [19-26]. Commentaries did not reflect planned or actual application or refinement of PARIHS (See Additional File 4, Table of commentaries excluded from the synthesis). We included the remaining 24 articles in our review.

Additional file 4. Commentaries excluded from the synthesis. This is a table of eight papers that were reviewed as part of our literature review and ultimately excluded because we defined them as commentaries that neither presented empirical research related to PARIHS nor conceptual critique or elaboration of the framework. The table includes abstracted data on the purpose of paper; the rationale for using PARIHS; and how PARIHS was to be used.


Figure 2. Flow diagram of literature review.

We characterized six articles as core concept articles (Table 1 Overview of core concept articles for the PARIHS framework). These were written by members of a PARIHS coordinating group (http://www.PARIHS.org/pages/contact_us.html) for the stated purpose of introducing [3] or elaborating on the framework, either as a whole [7,15], or on one of its three core elements [4-6]. The remaining 18 articles (Table 2 Overview of empirical articles included in the synthesis) were a mix of case reports and qualitative or mixed-methods studies [27-33], quantitative studies [9-11,34-36], literature reviews [37-39], and study protocols [40] or frameworks [41]. We refer to these collectively as empirical articles to distinguish them from the core concept articles.

Table 1. Overview of core concept articles for the PARIHS framework

Table 2. Overview of empirical articles included in the synthesis

Two of the empirical articles reported on the same study in which the Context Assessment Instrument (CAI) was developed based on PARIHS [35,36]. We also obtained an unpublished final report for the project [42], which included all of the material in the two articles plus more methodological detail. We combined these sources into a single entry in Tables 3 and 4, yielding 17 study entries.

Table 3. Core concept articles

Table 4. Empirical articles

How and why PARIHS was used in studies

Empirical studies generally used PARIHS as an organizing framework for analyses, such as examining predictors of nurses' research utilization (RU) [9,10,34], or reporting findings, such as highlighting differences between a series of efficacy studies and a planned translational study [40] (Table 2 Overview of empirical articles included in the synthesis).

Stated reasons for using PARIHS included that it acknowledges the complexity of implementation (or knowledge translation) [39]; it includes contextual factors [38]; and that it explicitly includes and describes context and facilitation [30]. Generally, users referred to the intuitive appeal of the three main elements (evidence, context, and facilitation) and PARIHS's explicit acknowledgement of the complex interrelationships among elements and their effects on implementation. Five empirical articles provided no explicit rationale for selecting PARIHS.

How PARIHS elements were operationalized

Three empirical papers described the development of survey instruments based on PARIHS: two reported on the same survey assessing the element of context [35,36], and the third on a survey assessing evidence and context [11]. A series of three studies mapped survey items from secondary datasets to PARIHS elements and tested their association with nurses' RU: one focused on context [34], and two on context and facilitation [9,10]. Except for a study by the PARIHS team [8], the empirical articles were not designed to validate or refine PARIHS.

Among the non-quantitative empirical articles, two provided details of how PARIHS was operationalized: one specified the questions used in a program evaluation [33], and another proposed a PARIHS-based framework to enhance reflective professional practice [41]. The nine remaining empirical articles did not specify how elements and sub-elements were measured or assessed, for example by providing coding definitions or logic models for drawing conclusions about observed relationships.

A critical appraisal of reviewed studies

A key strength of the existing PARIHS literature (Table 3 Core concept articles, and Table 4 Empirical articles) was that several studies used findings to comment on the framework in ways that could help refine or validate it. One example was a suggestion to address underlying motivation for change, such as relative advantage and tension for change [27,28]. Another was a qualitative exploration by the PARIHS team of how the framework fit with empirical findings [29]. A series of three articles attempted to quantify measures of context and facilitation and test quantitative multi-level models using facilitation and context as predictors of RU by nurses [9,10,34].

We identified two major issues with the PARIHS literature through our review. First, none of the studies used PARIHS prospectively to design implementation strategies. With the exception of articles reporting on survey development [11,35,36], all of the empirical studies were retrospective or cross-sectional. The six core concept papers described analyses conducted at a high level, addressing broad concepts, and relied on non-systematic reviews of the literature.

Second, there was a significant lack of detail about how variables were measured [39], how they were mapped to PARIHS elements [38], or how results or conclusions were derived [33]. For example, Sharp and colleagues concluded that good implementation outcomes could be achieved in settings with poor context, but not in settings with both poor context and poor facilitation. However, the authors did not indicate which cases supported those conclusions or what characterized context and facilitation at those sites [30].

A critical appraisal of the PARIHS framework

Several overarching strengths of PARIHS emerged (Table 3 Core concept articles, and Table 4 Empirical articles). First, though studies have not done so to date, the developers describe an explicit method for using PARIHS to guide diagnostic analysis of evidence and context [7], the findings from which are intended to inform facilitation strategies that accomplish implementation.

Second are its flexibility and applicability to a range of settings, as well as perceptions by users that it captures key elements of the implementation experience. This includes PARIHS' expansive acknowledgement of what can and should constitute 'evidence,' and its recognition that implementation is a complex, multi-faceted process that is dynamic and often unpredictable. In addition, several articles reported findings that support specific PARIHS elements or sub-elements, such as Estabrooks and colleagues' finding that measures of facilitation and context are significantly associated with nurses' RU [10].

The primary issue related to the framework was a need for greater conceptual clarity about the definitions of sub-elements and the nature of dynamic relationships among elements and sub-elements. In many cases, sub-elements appear to have significant conceptual overlap. For example, criteria for evaluating receptive context include 'power and authority processes' and whether or not cultural boundaries are clearly defined and acknowledged. These two criteria appear to overlap with the culture and leadership sub-elements, which include being 'able to define culture(s) in terms of prevailing values/beliefs' and 'democratic inclusive decision making processes.' It is not clear what distinguishes receptive context, as a construct, from culture and leadership. Another example is that facilitation is defined solely as a role, and in terms of the individual who fills the role and the relationship they have with those implementing the change. As presently described, this element does not address implementation interventions such as reminders, web-based education, toolkits, social marketing, and audit and feedback that may be undertaken to facilitate implementation, and which could conceivably be undertaken by a number of actors. Although PARIHS acknowledges the dynamic relationships among elements, the elements and sub-elements are described in linear terms, from 'low' to 'high,' with little explicit account of how or in what form dynamics among and across the sub-elements might emerge.

Both a strength and issue for PARIHS was the specification of the outcome 'successful implementation.' It was a strength in that the framework stipulates an outcome where many implementation models do not. However, there was little information in the six core articles about how to conceptualize or define successful implementation, and the empirical articles adopted a range of outcomes. Some articles used a broad outcome of RU [10,39], i.e., the degree to which clinicians apply research knowledge in their practices generally. Others used the degree of implementation or uptake of specific practice changes [30,31].

Discussion

Our objectives in the present synthesis were to understand how PARIHS has been used in implementation studies, how it has been operationalized, and the strengths and limitations of PARIHS and its supporting literature. We found a reasonably large published literature (33 published papers, 18 of which were empirical), but this body of findings reflects many of the current limitations of the broader implementation science literature. These limitations present important opportunities for improvement; we discuss three below.

First, PARIHS was largely used and operationalized as an organizing device or heuristic, usually post hoc. However, PARIHS developers intended the framework to be used to assess evidence and context prior to implementation, and then to use these findings to guide facilitation of implementation. To move the framework forward, we need empirical studies that use PARIHS to prospectively design or comprehensively evaluate implementation activities. Researchers should explain the degree to which intervention design decisions and change strategies are based on PARIHS. The lack of prospective implementation studies is not unique to PARIHS; all but a fraction of published implementation studies fail to explicitly use any theory at all [43,44], so researchers do not appear to be conducting prospective implementation studies based on any conceptual framework; a similar lack of theoretical foundation is reported among studies of organizational factors linked to patient safety [45]. Our findings echo those of Kajermo and colleagues in a recent literature synthesis on use of the BARRIERS scale, which is intended to prospectively identify barriers to research use by nurses [46]. Based on the paucity of prospective studies, they concluded that no further descriptive studies should be done, and that only prospective studies would move the science forward. We extend the same call for studies using PARIHS.

Second, though a strength of the empirical literature was that some studies showed empirical support for PARIHS, this finding needs to be interpreted in light of the overall study designs, which were retrospective case reports or cross-sectional analyses and often lacked key methodological details. Furthermore, authors rarely contrasted findings with previous studies; prior work using PARIHS was cited almost exclusively in the introduction, to set the stage for the study or its conceptual rationale. This, too, may in part be a function of the current development of the implementation science literature and the natural evolution of standards and expectations about what details researchers most need to report. It may be time for something akin to the CONSORT [47] or MOOSE [48] guidelines for reporting results of implementation intervention studies or implementation project evaluations. While implementation science may not be amenable to the same manner of checklists that have been applied to randomized trials and meta-analyses, there are key elements that could be described with sufficient specificity to provide guidance to both journal editors and researchers. These might include an explanation or rationale for mapping study findings to the constructs of the conceptual framework being used; a rationale for excluding certain elements; details about the operationalization of constructs, including coding definitions for qualitative analyses; and discussion of the criteria authors use to draw conclusions about relationships between determinants and implementation outcomes. This might help address a key criticism of efforts to promote more theory-based implementation research, namely that the translation of theory into intervention design is too subjective and opaque [49].

Finally, there are opportunities to improve the conceptual clarity of the framework itself, including refining conceptual definitions to more clearly draw distinctions among related sub-elements, such as receptive context, leadership, and culture. This would support more rigorous studies by making it easier for users to map measures back to PARIHS consistently, derive testable hypotheses using the framework, and design more effective implementation strategies. We have drafted an implementation guide, to be published separately, which discusses in more detail recommendations for those using PARIHS in task-oriented implementation projects and research, or seeking to refine the framework. Below, we briefly discuss three specific opportunities to refine the PARIHS framework.

First, PARIHS acknowledges the dynamic relationships among elements and sub-elements in the framework and the often unpredictable nature of implementation. However, 'dynamic' implies that elements and sub-elements interact, or act as modifiers or contingencies, such that the effects of one depend on the others [50]. As a result, the same implementation intervention may have wildly different effects in different settings [51]. PARIHS would be strengthened further by beginning to describe how those dynamics might emerge and by providing examples that could eventually help identify more generalizable patterns. Identifying and describing all potential interactions is clearly impossible, but currently PARIHS elements are described on a continuum, low to high, that strongly implies linear relationships, which are inconsistent both with the broader concept of PARIHS as a dynamic model and with available evidence. For example, prospective studies find that senior leadership support changes dramatically over time, with senior leaders shifting among roles ranging from institutional mentors for the change to critics of it [52], and that senior leadership support is not always a strong driver, and certainly not always a necessary condition, for implementation [53,54]. It may be possible to identify generalizable contextual interactions, such as senior leadership support being necessary for EBPs that involve coordination across departments or services, require large capital investments, or lack strong professional endorsement.

In part, the lack of specifics about interactions among elements may arise from PARIHS straddling the line between a higher-order planned action (or prescriptive) theory (PAT), for use by change agents to guide their implementation strategy, and a classical (or descriptive/explanatory) model meant to describe or explain how change occurs. The core concept articles explicitly propose that PARIHS be used to guide implementation by assessing evidence and context in order to inform facilitation, strongly positioning PARIHS as a prescriptive model, albeit without the detail of a PAT as described by Graham and Tetroe [53].

Second, we noted that a more explicit definition of 'successful implementation' is needed. This is both a key strength of the framework and an opportunity to strengthen it. A clear definition of successful implementation is critical for moving the implementation science literature forward, and we may do well to draw on the literatures of other disciplines. For example, researchers in education [55] and health promotion [56] have written specifically about criteria for determining when new programs are fully implemented. Likewise, scholars in management have written about conceptual considerations for defining effective implementation of new practices, such as IT systems [57] and banking practices [58], including distinguishing implementation from 'compliant' use that is either incomplete or likely to degrade.

Conceptually, successful implementation might comprise three distinct aspects, identified as part of our aforementioned implementation guide. All represent seemingly necessary conditions for concluding that a project has achieved successful implementation: realization of the implementation plan or strategy; achievement and maintenance of the targeted EBP; and achievement and maintenance of end-point patient or organizational outcomes. These three components reflect a logic model linking an implementation strategy to ultimate outcomes. This definition of successful implementation affords an understanding of when and how an implementation program has delivered the benefits hypothesized. To accomplish that, we need to assess whether the implementation strategy occurred as planned, whether the EBP was established as intended, and whether the desired outcomes followed.

Third, other conceptual models should be drawn on and compared with PARIHS, both to better elaborate its core elements and to better position work using PARIHS in the broader literature. The PARIHS core concept papers make clear that the developers envision PARIHS being used in combination with other conceptual frameworks. Findings in some of the studies suggest the value of making additional attributes of the evidence-based change more explicit, such as those identified in Rogers' Diffusion of Innovations framework [34]. For example, Rogers' innovation attribute of the observability of a new practice (i.e., the extent to which its use by an individual is readily perceived by others in their social network) [2,59] does not appear to have an analogue in PARIHS. These types of comparisons and extensions would help build cumulative knowledge and inform refinements to the framework.

The PARIHS authors continue to revisit and refine the framework, recognize its limitations, and call for further research [7]. We consider this a critical strength of any framework. Researchers [60] and practitioners [61] continue to use PARIHS, and we expect more rigorous studies will be published. In the period since we completed our literature search, we are already aware of at least five new publications citing PARIHS, including two articles presenting validation results for survey instruments based on the framework [62,63]. Several prospective research studies based on the framework are also in progress, by both the PARIHS team (http://www.parihs.org) and other research teams, including one conducting research in Vietnam and several conducting research within the US Veterans Health Administration QUERI program.

Limitations

Our review had two limitations. First, we did not assess the 'gray' or unpublished literature, or publications in languages other than English. As a result, we may have missed important work relating to PARIHS.

Second, we focused exclusively on the PARIHS framework, and not on literature regarding other frameworks that may include similar or related constructs. Doing so was beyond the scope of our synthesis, though we do comment on the need for greater comparison and linkages between PARIHS and other frameworks.

Some may also view our methods as limited because we did not conduct a quantitative meta-analysis. However, we used methods appropriate to our research questions and to the literature being reviewed, which included few quantitative studies. We also took several steps to increase the transparency and reliability of our results.

Summary

The single greatest need for researchers using PARIHS, and other implementation models, is to use the framework prospectively and comprehensively, and to evaluate that use in terms of its perceived strengths and shortcomings for enhancing successful implementation. Ultimately, the proof of any implementation framework is its demonstrated practical usefulness for designing implementation interventions and making implementation more effective under various conditions. Studies using the framework in this way will move the whole field forward.

Researchers using PARIHS in studies or to guide action research should clearly explain how PARIHS is used and how interventions or measures map to specific PARIHS elements. For example, studies of facilitation activities should explain how the facilitation purpose, role, skills, and attributes were defined or taken into account. Other reviews have similarly called for more explicit and detailed explanation of how theory is used in implementation studies [43,44]. It may be time for the implementation science community to develop consensus guidelines for what should be reported.

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

CBS conceived the study. All authors abstracted and reviewed data and provided critical input on findings. CDH wrote the first draft of the paper, and CBS, LJD and HH provided major input and revisions. All authors read, critiqued and approved the final manuscript.

Acknowledgements

This material is based upon work supported by the U.S. Department of Veterans Affairs, Office of Research and Development Health Services R&D Program. We wish to acknowledge the important contributions of Jeffrey Smith to the paper, and the important administrative assistance of Jared LeClerc and Rachel Smith. Also, our thanks to Corrine Voils for providing invaluable feedback on a draft of the paper, and to Lars Wallin and Jacqueline Tetroe for their excellent reviews and suggestions. The views expressed in this article are the authors' and do not necessarily reflect the position or policy of the Department of Veterans Affairs.

References

  1. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, Kerr EA: The quality of health care delivered to adults in the United States. N Engl J Med 2003, 348(26):2635-2645.
  2. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009, 4:50.
  3. Kitson A, Harvey G, McCormack B: Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care 1998, 7(3):149-158.
  4. Harvey G, Loftus-Hills A, Rycroft-Malone J, Titchen A, Kitson A, McCormack B, Seers K: Getting evidence into practice: the role and function of facilitation. J Adv Nurs 2002, 37(6):577-588.
  5. McCormack B, Kitson A, Harvey G, Rycroft-Malone J, Titchen A, Seers K: Getting evidence into practice: the meaning of 'context'. J Adv Nurs 2002, 38(1):94-104.
  6. Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B: What counts as evidence in evidence-based practice? J Adv Nurs 2004, 47(1):81-90.
  7. Kitson A, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A: Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci 2008, 3(1):1.
  8. Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Pineros S, Wallace CM: Role of "external facilitation" in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci 2006, 1:23.
  9. Cummings GG, Estabrooks CA, Midodzi WK, Wallin L, Hayduk L: Influence of organizational characteristics and context on research utilization. Nurs Res 2007, 56(4 Suppl):S24-39.
  10. Estabrooks CA, Midodzi WK, Cummings GG, Wallin L: Predicting research use in nursing organizations: a multilevel analysis. Nurs Res 2007, 56(4 Suppl):S7-23.
  11. Bahtsevani C, Willman A, Khalaf A, Östman M: Developing an instrument for evaluating implementation of clinical practice guidelines: a test-retest study. J Eval Clin Pract 2008, 14(5):839-846.
  12. Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N: Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. J Clin Epidemiol 2005, 58:107-112.
  13. Grol RPTM, Bosch MC, Hulscher MEJL, Eccles MP, Wensing M: Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Q 2007, 85(1):93-138.
  14. The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG): Designing theoretically-informed implementation interventions. Implement Sci 2006, 1(1):4.
  15. Rycroft-Malone J, Kitson A, Harvey G, McCormack B, Seers K, Titchen A, Estabrooks C: Ingredients for change: revisiting a conceptual framework. Qual Saf Health Care 2002, 11(2):174-180.
  16. Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A: Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci 2008, 3:1.
  17. Sandelowski M, Barroso J: Handbook for Synthesizing Qualitative Research. New York: Springer; 2007.
  18. Larkin RM: Challenges to prison-based mental health research: a case study. DNSc dissertation. Columbia University; 2008.
  19. Donaldson NE, Rutledge DN, Ashley J: Outcomes of adoption: measuring evidence uptake by individuals and organizations. Worldviews Evid Based Nurs 2004, 1(Suppl 1):S41-51.
  20. Kavanagh T, Stevens B, Seers K, Sidani S, Watt-Watson J: Examining Appreciative Inquiry as a knowledge translation intervention in pain management. Can J Nurs Res 2008, 40(2):40-56.
  21. Kavanagh T, Watt-Watson J, Stevens B: An examination of the factors enabling the successful implementation of evidence-based acute pain practices into pediatric nursing. Child Health Care 2007, 36(3):303-321.
  22. Larkin ME, Griffith CA, Capasso VA, Cierpial C, Gettings E, Walsh K, O'Malley C: Promoting research utilization using a conceptual framework. J Nurs Adm 2007, 37(11):510-516.
  23. O'Halloran P, Martin G, Connolly D: A model for developing, implementing, and evaluating a strategy to improve nursing and midwifery care. Pract Dev Health Care 2005, 4(4):180-191.
  24. Rycroft-Malone J: The PARIHS framework--a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual 2004, 19(4):297-304.
  25. Wallin L, Profetto-McGrath J, Levers MJ: Implementing nursing practice guidelines: a complex undertaking. J Wound Ostomy Continence Nurs 2005, 32(5):294-300; discussion 300-301.
  26. Walsh K, Lawless J, Moss C, Allbon C: The development of an engagement tool for practice development. Pract Dev Health Care 2005, 4:124-130.
  27. Ellis I, Howard P, Larson A, Robertson J: From workshop to work practice: an exploration of context and facilitation in the development of evidence-based practice. Worldviews Evid Based Nurs 2005, 2(2):84-93.
  28. Owen S, Milburn C: Implementing research findings into practice: improving and developing services for women with serious and enduring mental health problems. J Psychiatr Ment Health Nurs 2001, 8(3):221-231.
  29. Rycroft-Malone J, Harvey G, Seers K, Kitson A, McCormack B, Titchen A: An exploration of the factors that influence the implementation of evidence into practice. J Clin Nurs 2004, 13(8):913-924.
  30. Sharp ND, Pineros SL, Hsu C, Starks H, Sales AE: A qualitative study to identify barriers and facilitators to implementation of pilot interventions in the Veterans Health Administration (VHA) Northwest Network. Worldviews Evid Based Nurs 2004, 1(2):129-139.
  31. Stetler C, Legro M, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Pineros S, Wallace C: Role of "external facilitation" in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci 2006, 1(1):23.
  32. Wallin L, Rudberg A, Gunningberg L: Staff experiences in implementing guidelines for Kangaroo Mother Care--a qualitative study. Int J Nurs Stud 2005, 42(1):61-73.
  33. Conklin J, Stolee P: A model for evaluating knowledge exchange in a network context. Can J Nurs Res 2008, 40(2):116-124.
  34. Wallin L, Estabrooks CA, Midodzi WK, Cummings GG: Development and validation of a derived measure of research utilization by nurses. Nurs Res 2006, 55(3):149-160.
  35. Wright J: Developing a tool to assess person-centred continence care. Nurs Older People 2006, 18(6):23-28.
  36. Wright J, McCormack B, Coffey A, McCarthy G: Evaluating the context within which continence care is provided in rehabilitation units for older people. Int J Older People Nurs 2007, 2(1):9-19.
  37. Brown D, McCormack B: Developing postoperative pain management: utilising the Promoting Action on Research Implementation in Health Services (PARIHS) framework. Worldviews Evid Based Nurs 2005, 2(3):131-141.
  38. Meijers JM, Janssen MA, Cummings GG, Wallin L, Estabrooks CA, Halfens RYG: Assessing the relationships between contextual factors and research utilization in nursing: systematic literature review. J Adv Nurs 2006, 55(5):622-635.
  39. Milner M, Estabrooks CA, Myrick F: Research utilization and clinical nurse educators: a systematic review. J Eval Clin Pract 2006, 12(6):639-655.
  40. Alkema GE, Frey D: Implications of translating research into practice: a medication management intervention. Home Health Care Serv Q 2006, 25(1-2):33-54.
  41. Doran DM, Sidani S: Outcomes-focused knowledge translation: a framework for knowledge translation and patient outcomes improvement. Worldviews Evid Based Nurs 2007, 4(1):3-13.
  42. McCormack B, McCarthy G: Development of the Context Assessment Index (CAI). Republic of Ireland Health Research Board and the Northern Ireland Department of Health, Social Services and Public Safety; 2008.
  43. Davies P, Walker A, Grimshaw J: A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci 2010, 5(1):14.
  44. Davies P, Walker A, Grimshaw J: Theories of behavior change in studies of guideline implementation. Proc Br Psychol Soc 2003, 11.
  45. Hoff T, Jameson L, Hannan E, Flink E: A review of the literature examining linkages between organizational factors, medical errors, and patient safety. Med Care Res Rev 2004, 61(1):3-37.
  46. Nilsson Kajermo K, Bostrom AM, Thompson D, Hutchinson A, Estabrooks C, Wallin L: The BARRIERS scale -- the barriers to research utilization scale: a systematic review. Implement Sci 2010, 5(1):32.
  47. Schulz KF, Altman DG, Moher D: CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. BMC Med 2010, 8:18.
  48. Stroup DF, Berlin JA, Morton SC, Olkin I, Williamson GD, Rennie D, Moher D, Becker BJ, Sipe TA, Thacker SB: Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. JAMA 2000, 283(15):2008-2012.
  49. Bhattacharyya O, Reeves S, Garfinkel S, Zwarenstein M: Designing theoretically-informed implementation interventions: fine in theory, but evidence of effectiveness in practice is needed. Implement Sci 2006, 1(1):5.
  50. Johns G: The essential impact of context on organizational behavior. Acad Manage Rev 2006, 31(2):386-408.
  51. Helfrich CD, Weiner BJ, McKinney MM, Minasian L: Determinants of implementation effectiveness: adapting a framework for complex innovations. Med Care Res Rev 2007, 64(3):279-303.
  52. Van de Ven AH, Polley DE: The Innovation Journey. New York: Oxford University Press; 1999.
  53. Edmondson A: Psychological safety and learning behavior in work teams. Adm Sci Q 1999, 44(2):350-383.
  54. Edmondson A: Speaking up in the operating room: how team leaders promote learning in interdisciplinary action teams. J Manage Stud 2003, 40:1419-1452.
  55. Yin RK: Changing Urban Bureaucracies: How New Practices Become Routinized. Lexington, MA: Lexington Books; 1979.
  56. Goodman RM, Steckler A: A model for the institutionalization of health promotion programs. Fam Community Health 1989, 11(4):63-78.
  57. Klein KJ, Sorra JS: The challenge of innovation implementation. Acad Manage Rev 1996, 21(4):1055-1080.
  58. Nord WR, Tucker S: Implementing Routine and Radical Innovations. Lexington, MA: Lexington Books; 1987.
  59. Rogers EM: Diffusion of Innovations. Fifth edition. New York: Free Press; 2003.
  60. Rycroft-Malone J, Fontenla M, Seers K, Bick D: Protocol-based care: the standardisation of decision-making? J Clin Nurs 2009, 18(10):1490-1500.
  61. Capasso V, Collins J, Griffith C, Lasala CA, Kilroy S, Martin AT, Pedro J, Wood SL: Outcomes of a clinical nurse specialist-initiated wound care education program: using the promoting action on research implementation in health services framework. Clin Nurse Spec 2009, 23(5):252-257.
  62. McCormack B, McCarthy G, Wright J, Coffey A: Development and testing of the Context Assessment Index (CAI). Worldviews Evid Based Nurs 2009, 6(1):27-35.
  63. Helfrich CD, Li YF, Sharp ND, Sales AE: Organizational readiness to change assessment (ORCA): development of an instrument based on the Promoting Action on Research in Health Services (PARiHS) framework. Implement Sci 2009, 4(1):38.