Measuring determinants of implementation behavior: psychometric properties of a questionnaire based on the theoretical domains framework

Abstract

Background

To be able to design effective strategies to improve healthcare professionals’ implementation behaviors, a valid and reliable questionnaire is needed to assess potential implementation determinants. The present study describes the development of the Determinants of Implementation Behavior Questionnaire (DIBQ) and investigates the reliability and validity of this Theoretical Domains Framework (TDF)-based questionnaire.

Methods

The DIBQ was developed to measure the potential behavioral determinants of the 12-domain version of the TDF (Michie et al., 2005). We identified existing questionnaires including items assessing constructs within TDF domains and developed new items where needed. Confirmatory factor analysis was used to examine whether the predefined structure of the TDF-based questionnaire was supported by the data. Cronbach’s alpha was calculated to assess internal consistency reliability of the questionnaire, and domains’ discriminant validity was investigated.

Results

We developed an initial questionnaire containing 100 items assessing 12 domains. Confirmatory factor analysis and Cronbach’s alpha analyses resulted in a final questionnaire of 93 items assessing 18 domains, explaining 63.3% of the variance, with internal consistency reliability values ranging from .68 to .93. Domains demonstrated good discriminant validity, although the domains ‘Knowledge’ and ‘Skills’ and the domains ‘Skills’ and ‘Social/professional role and identity’ were highly correlated.

Conclusions

We have developed a valid and reliable questionnaire that can be used to assess potential determinants of healthcare professionals’ implementation behavior across the theoretical domains of the TDF. Researchers and practitioners can use the DIBQ to identify determinants of implementation behaviors and, on that basis, to develop effective strategies to improve healthcare professionals’ implementation behaviors. Furthermore, the findings provide a novel validation of the TDF and indicate that the domain ‘Environmental context and resources’ might be divided into several environment-related domains.

Background

Much research and funding is invested in developing, piloting, and evaluating evidence-based innovations to promote health. However, the transfer of effective innovations, such as pharmacological and behavior change interventions, into routine healthcare practice often does not happen as desired [1–5]. With the public health impact of these innovations depending on their implementation in practice, it is important to understand healthcare professionals’ (HCP) implementation behaviors and the factors associated with suboptimal use of research evidence [6, 7].

Many factors can potentially influence HCPs’ implementation behaviors. These factors may be related to characteristics of the innovation (e.g., compatibility, complexity), social setting (e.g., norms, support), organizational context (e.g., capacity, resources), innovation strategies (e.g., training, reimbursement), patient (e.g., attitudes, compliance), and the individual HCP (e.g., skills, attitudes) [6, 8–13]. Identifying the key factors associated with HCP implementation behavior can inform the development of strategies to promote evidence-based behavior [6, 14–19].

Research has shown that active implementation strategies, such as educational outreach and reminders, can be effective in enhancing implementation behaviors [20, 21]. However, due to the scarce use of theory to inform the choice and design of implementation strategies [22], there is a lack of understanding of why strategies are effective or not [23]. To enhance the effective development of implementation strategies, therefore, many advocate using a theoretical approach to guide the investigation of implementation determinants [14, 17, 23–25].

Behavior change theories provide testable hypotheses about when and why specific factors will lead to a certain implementation behavior. However, a limitation in the use of these theories to assess and identify factors underlying HCP implementation behavior is the large number of theories that might be used and their overlapping constructs [12, 25–27]. The Theoretical Domains Framework (TDF) [28, 29] is an integrative framework that can be used to overcome this constraint. Within the original TDF [28], constructs from 33 behavior change theories were grouped into 12 domains of behavioral determinants covering the full range of current scientific explanations for human behavior (i.e., ‘Knowledge’, ‘Skills’, ‘Social/professional role and identity’, ‘Beliefs about capabilities’, ‘Beliefs about consequences’, ‘Motivation and goals’, ‘Memory, attention and decision processes’, ‘Environmental context and resources’, ‘Social influences’, ‘Emotion’, ‘Behavioral regulation’, and ‘Nature of the behaviors’). As a consequence, researchers can use this integrative framework instead of having to choose between different theories.

The TDF has instigated a new line of investigation and has been applied in many implementation studies. Specifically, qualitative studies concluded that the TDF was useful for the comprehensive exploration of possible explanations for suboptimal implementation behavior (e.g., [30–35]) and for the identification of suitable theories to further investigate these behaviors [27, 36]. Furthermore, the framework has been used to develop questionnaires assessing potential determinants of implementation behavior [37–39]. So far, however, these questionnaires’ internal consistency reliability was insufficient [37–39], and only one out of three questionnaires was able to measure the theoretical domains independently [39]. Consequently, there is a need for a valid and reliable method to identify theory-based factors influencing HCPs’ implementation behaviors in order to design effective implementation strategies [12].

Recently, the TDF [28] has been validated, leading to the revised TDF including 14 domains [29]. Main differences between the original and the revised framework include the separation of the domain ‘Optimism’ from the domain ‘Beliefs about capabilities’ and the domain ‘Reinforcement’ from the domain ‘Beliefs about consequences’. Moreover, the domain ‘Motivation and goals’ was divided into two separate domains, i.e., ‘Intentions’ and ‘Goals’, and the domain ‘Nature of the behaviors’ was omitted in the revised framework. As a first step in the development of a TDF-based questionnaire for the valid and reliable assessment of factors influencing HCP implementation behavior, we developed a generic questionnaire assessing the 14 domains of behavioral determinants of the revised TDF [29]. Investigation of questionnaire items’ discriminant content validity based on judgments of a sample of experts on behavior change theory resulted in a questionnaire able to assess all domains discriminately, except for the domains ‘Reinforcement’, ‘Goals’, and ‘Behavioral regulation’. Accordingly, the findings suggested that the 12-domain original version of the TDF [28] might be more applicable in developing a TDF-based questionnaire [40].

The main aim of the current study was to develop a questionnaire based on the 12-domain version of the TDF [28] and to test the psychometric properties of this questionnaire in a sample of HCPs. To validate the Determinants of Implementation Behavior Questionnaire (DIBQ), the following research questions were addressed: 1) does confirmatory factor analysis support the predefined structure of the TDF-based questionnaire (i.e., construct validity); 2) is the questionnaire able to measure TDF domains in a reliable way (i.e., internal consistency reliability); and 3) are the domains of the questionnaire independently measurable (i.e., discriminant validity)? Our specific interest is in HCPs’ implementation of physical activity (PA) interventions, which we used in this study as a field of application for the DIBQ.

Methods

Development of the Determinants of Implementation Behavior Questionnaire

We developed a questionnaire that initially included 100 items assessing each of the domains through their related key constructs (see Additional file 1). First, constructs within domains were selected based on:

  1. Their conceptual relatedness to the content of the domain (i.e., Knowledge, Skills, Professional role, and Memory);

  2. Their inclusion in relevant theories frequently used in the field of behavior change (and thus ready access to existing items): the Theory of Planned Behavior [41] (i.e., Perceived behavioral control, Attitude, Subjective norm, and Intention) and Social Cognitive Theory [42] (i.e., Self-efficacy, Outcome expectancies, and Social support);

  3. The existence of validated scales to measure constructs (i.e., Role clarity, Optimism, Emotions, Action planning, Coping planning, Automaticity); and/or

  4. Constructs’ relevance to the implementation of PA interventions in routine healthcare, established by mapping factors resulting from previous research [13, 43] onto the TDF domains (i.e., Reinforcement, Priority, Characteristics of the innovation, Characteristics of the socio-political context, Characteristics of the organization, Characteristics of the participants, Characteristics of the innovation strategy, Descriptive norm).

Second, for each domain a minimum of two and a maximum of 24 items were developed, with an average of four items per construct. Items were related to the target behavior ‘delivering PA interventions following the guidelines’. Items measuring the constructs within the domains ‘Knowledge’, ‘Beliefs about capabilities’, ‘Social influences’, ‘Emotion’, ‘Behavioral regulation’, and ‘Nature of the behaviors’ [37, 41, 42, 44–49] were adapted from previously published questionnaires. The content of these items was based on previous research on factors influencing the implementation of PA interventions in routine healthcare [13, 43]. For instance, items measuring the constructs Self-efficacy [41] and Coping planning [47] were developed so that they addressed HCPs’ reported barriers of lack of time and lack of patient motivation. Items measuring constructs within the domains ‘Skills’, ‘Social/professional role and identity’, and ‘Memory, attention, and decision processes’ were based on the results of the discriminant content validity study [40]. With regard to the domain ‘Beliefs about consequences’, items measuring the constructs Attitude [41] and Outcome expectancies [42] were adapted from previously published questionnaires, whereas items measuring the construct Reinforcement were newly developed (as none could be located in the literature). Regarding the domain ‘Motivation and goals’, items measuring the construct Intention were adapted from a previously published questionnaire [41], while items were newly developed for the construct Priority. Furthermore, new items were created for the domain ‘Environmental context and resources’. New items were developed based on discussions between WAG, MRC, and JMH. These discussions were informed by the academic literature on the concept and definition of specific domains and constructs, the questions to identify behavior change processes formulated by Michie et al. [28], and themes emerging from interviews on the implementation of PA interventions [43]. Finally, the questionnaire was piloted among five fellow researchers and a sample of eight physical therapists. Piloting indicated that the questionnaire was easily understood and well received by the respondents.

Respondents and procedure

We recruited physical therapists delivering PA interventions to a variety of target groups (i.e., people with chronic obstructive pulmonary disease, diabetes, arthritis, or obesity). They were recruited through physical therapist associations and contacted opportunistically via their practice websites. Physical therapists were sent an email including the link to the online questionnaire and were assured that their responses would be confidential and anonymous. They reported their gender, age, practice experience, type of practice/workplace, and the socioeconomic status (SES) of the majority of their intervention participants. Full questionnaire completion was rewarded with a 25 euro voucher. Non-respondents were sent an email with a questionnaire on their demographic characteristics.

Data management

Questionnaires were exported from Qualtrics software, version 45433 [50], to IBM SPSS Statistics version 19.0 [51] for analysis. Responses were scored from 1 (strongly disagree) to 7 (strongly agree). Negatively worded items, such as ‘Delivering [PA intervention] following the guidelines is something I often forget’, were reverse-coded. For the six social support items, respondents could select ‘Not applicable’, because not all physical therapists work together with others in delivering PA interventions, and some are part of the management of their organization and therefore do not receive management support. Responses in this category were recoded as missing.
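For illustration, the recoding steps described above can be expressed as the following minimal R sketch. The data frame, item names, and the ‘Not applicable’ code of 99 are hypothetical (the original processing was done in SPSS), so this shows the logic rather than the actual syntax used.

```
# Hypothetical export: items scored 1 (strongly disagree) to 7 (strongly agree).
dibq <- read.csv("dibq_export.csv")

# Reverse-code a negatively worded item on the 1-7 scale (1 becomes 7, 7 becomes 1).
dibq$forget_item <- 8 - dibq$forget_item

# Recode 'Not applicable' responses (assumed code 99) on the six social support items as missing.
social_support_items <- c("ss1", "ss2", "ss3", "ss4", "ss5", "ss6")
dibq[social_support_items] <- lapply(dibq[social_support_items],
                                     function(x) ifelse(x == 99, NA, x))
```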

Data analyses

Confirmatory factor analyses

Confirmatory factor analysis was used to examine whether the a priori assignment of items to Michie et al.’s [28] TDF domains was supported by the data (i.e., research question 1). To perform the confirmatory factor analysis, we used the oblique multiple group (OMG) method [52, 53], which has been shown to perform at least as well as the better-known confirmatory common factor analysis [54–56]. The OMG method involves calculating correlations between items and domains, from which the following conclusions are drawn: if an item correlates highest with the domain it was assigned to, the item is correctly assigned to that domain (and the predefined structure is confirmed); if an item correlates highest with a domain it was not assigned to, the item is incorrectly assigned (and the predefined structure is not confirmed). In the OMG method, correlations between items and domains are corrected for self-correlation and test length [52].
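To make the core OMG step concrete, below is a minimal base-R sketch, assuming a respondents-by-items score matrix and a named vector giving each item’s assigned domain. It corrects the item–own-domain correlation for self-correlation by excluding the item from its own domain score; the test-length correction applied in the published SPSS macro [62] is not reproduced here, so this illustrates the logic rather than the exact analysis.

```
# Sketch of the OMG assignment check (not the published macro [62]).
# 'scores': numeric matrix, respondents x items, with item names as column names.
# 'assignment': named character vector mapping each item name to its domain.
omg_check <- function(scores, assignment) {
  domains <- unique(assignment)
  # Domain scores: mean of the items assigned to each domain.
  domain_scores <- sapply(domains, function(d)
    rowMeans(scores[, names(assignment)[assignment == d], drop = FALSE], na.rm = TRUE))
  item_domain_cor <- cor(scores, domain_scores, use = "pairwise.complete.obs")
  # Correct for self-correlation: correlate each item with its own domain score
  # computed without that item (possible only if the domain has more than one item).
  for (item in colnames(scores)) {
    own <- assignment[[item]]
    others <- setdiff(names(assignment)[assignment == own], item)
    if (length(others) > 0) {
      item_domain_cor[item, own] <- cor(
        scores[, item],
        rowMeans(scores[, others, drop = FALSE], na.rm = TRUE),
        use = "pairwise.complete.obs")
    }
  }
  # An item supports the predefined structure if it correlates highest with its assigned domain.
  best <- domains[max.col(item_domain_cor)]
  data.frame(item = colnames(scores),
             assigned = assignment[colnames(scores)],
             best = best,
             supported = best == assignment[colnames(scores)])
}
```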

When an item is assigned incorrectly, adjustments should be made. We used the iterative OMG procedure to make adjustments to the structure of our questionnaire. This step-wise procedure involves testing the adjusted assignment obtained from an OMG analysis in a subsequent OMG analysis on the same data set, which will either support the assignment or suggest new adjustments. When, based on these suggestions, a new adjustment is made, this assignment can again be tested on the same data set. The iterative procedure continues until the adjusted assignment is supported by the data (i.e., items correlate highest with the domain they are assigned to and the adjustment leads to a higher total explained variance) or until none of the adjusted assignments is supported by the data and a newly obtained adjusted assignment is equal to one of the previous assignments. Preferably, changes in item assignment can be justified by a theoretical or conceptual link between the incorrectly assigned item and the domain to which it has been reassigned [54].
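The stopping logic of this iterative procedure can be sketched as a small wrapper around the omg_check() helper above. The real procedure also weighs the change in total explained variance and the theoretical justification of each reassignment, which are not encoded in this hedged sketch.

```
# Sketch of the iterative procedure: move unsupported items to the domain they
# correlate highest with, re-test the assignment, and stop when the assignment
# is supported or a previously tried assignment recurs.
iterate_assignment <- function(scores, assignment, max_iter = 10) {
  history <- list()
  for (i in seq_len(max_iter)) {
    res <- omg_check(scores, assignment)
    if (all(res$supported)) return(assignment)        # structure supported by the data
    new_assignment <- setNames(as.character(res$best), res$item)
    if (any(vapply(history, identical, logical(1), y = new_assignment))) break  # assignment recurs
    history[[length(history) + 1]] <- assignment
    assignment <- new_assignment
  }
  assignment
}
```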

In this study, the iterative procedure of adjustment consisted of two iterations. In the first iteration, adjustments were made based on suggestions from the OMG analyses and theoretical or conceptual links between items and domains. In the second iteration, adjustments were again based on suggestions from the OMG analyses and theoretical or conceptual considerations. In addition, we compared poorly fitting domains from the OMG solution to the solution based on exploratory factor analysis (i.e., principal component analysis; PCA [57]) to guide adjustments of the assignment of items to domains. Following the iterative OMG procedure, adjustments were only retained when they were supported by the new results from the OMG analysis. Finally, the variance accounted for by the adjusted predefined components was compared to the variance accounted for by the components resulting from the PCA. Ideally, this difference is small, indicating that the adjusted predefined structure fits the data well.
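The comparison of explained variance can be illustrated with the simple base-R approximation below; it is not the computation performed by the OMG macro. The predefined-structure figure is approximated as the mean squared correlation between each standardized item and its own domain score, and the PCA figure is the proportion of variance of the first k principal components.

```
# Hedged approximation of the variance-accounted-for comparison (illustrative only).
variance_by_structure <- function(scores, assignment) {
  z <- scale(scores)                       # standardize items so each has variance 1
  domains <- unique(assignment)
  domain_scores <- sapply(domains, function(d)
    rowMeans(z[, names(assignment)[assignment == d], drop = FALSE], na.rm = TRUE))
  r <- cor(z, domain_scores, use = "pairwise.complete.obs")
  own <- r[cbind(seq_len(ncol(z)), match(assignment[colnames(scores)], domains))]
  mean(own^2)                              # average item variance explained by its own domain
}

variance_by_pca <- function(scores, k) {
  p <- prcomp(scores, scale. = TRUE)       # requires complete cases
  sum(p$sdev[1:k]^2) / sum(p$sdev^2)       # proportion of variance of the first k components
}
```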

Internal consistency reliability and discriminant validity

Cronbach’s alpha [58] was computed to assess the internal consistency reliability of the items assessing each domain (i.e., research question 2). Two tests of discriminant validity [59] were undertaken to assess whether the DIBQ was able to measure the TDF domains discriminately (i.e., research question 3). First, discriminant validity was assessed by determining whether the bootstrapped 95% confidence interval around Pearson’s correlations between domains included 1.00 [60]. Second, we calculated attenuation-corrected correlations to estimate the ‘true correlation’ between the domains [61].
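The three quantities described above can be written out in base R as follows; the original analyses used SPSS and the R package psy [64], so these are illustrative equivalents rather than the exact procedures used.

```
# Cronbach's alpha for one domain: k/(k-1) * (1 - sum of item variances / variance of total score).
cronbach_alpha <- function(x) {            # x: respondents x items of a single domain
  x <- x[complete.cases(x), , drop = FALSE]
  k <- ncol(x)
  (k / (k - 1)) * (1 - sum(apply(x, 2, var)) / var(rowSums(x)))
}

# Bootstrapped 95% confidence interval around Pearson's r between two domain scores;
# discriminant validity requires that the interval does not include 1.00.
boot_ci_cor <- function(a, b, n_boot = 2000) {
  rs <- replicate(n_boot, {
    idx <- sample(length(a), replace = TRUE)
    cor(a[idx], b[idx], use = "pairwise.complete.obs")
  })
  quantile(rs, c(0.025, 0.975))
}

# Spearman's correction for attenuation: estimate of the 'true' correlation between two domains.
attenuation_corrected <- function(r_ab, alpha_a, alpha_b) {
  r_ab / sqrt(alpha_a * alpha_b)
}
```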

Computational note

The analyses were performed using IBM SPSS Statistics version 19.0 [51]. For the OMG analyses, we used an SPSS macro file obtained from Timmerman and Stuive [62]. Attenuation-corrected correlations were calculated in the R software environment [63] using the R package psy [64].

Ethics

The Medical Ethics Committee of the Leiden University Medical Centre granted ethical approval of this study (reference number NV/CME 09/081).

Results

Characteristics of the respondents

Of the 496 physical therapists who were invited for the study, 274 (55.2%), delivering 15 different PA interventions, completed the questionnaire. After removal of physical therapists reporting no experience with PA intervention delivery, 270 questionnaires were analyzed. Table 1 shows characteristics of respondents and non-respondents. Of the respondents, 58.1% (n = 157) were female; they were on average 39.7 (SD = 12.3) years old and had on average 14.9 (SD = 11.3) years of practice experience. Most of them worked in a group practice (68.5%, n = 185), and most delivered PA interventions either to an equal share of participants with low and high SES (53%, n = 143) or to participants with predominantly low SES (44.8%, n = 121). A total of 68 out of 222 non-respondents (30.6%) filled in the non-respondents questionnaire. Comparisons between respondents and non-respondents indicated that the latter were significantly older and had more practice experience.

Table 1 Demographic characteristics of respondents and non-respondents

Psychometric properties of the questionnaire

Confirmatory factor analysis

OMG analyses showed that the total variance explained by the initial questionnaire was 48.0%. In other words, the initial assignment of the items to the 12 domains of the TDF explained about half of the total variance in item scores. In the first iteration of adjustments, the results of the OMG analysis indicated that model fit could be improved by adjusting the domains ‘Environmental context and resources’ and ‘Beliefs about capabilities’. Based on Fleuren et al.’s [8] categorization of innovation determinants into factors related to the innovation, socio-political context, organization, and innovation strategy, and Chaudoir et al.’s [12] additional category of factors related to the patient, the first adjustment involved dividing the domain ‘Environmental context and resources’ into the domains ‘Innovation’, ‘Socio-political context’, ‘Organization’, ‘Patient’, and ‘Innovation strategy’. This was done in five successive steps (in each step, one new domain was entered), with every step leading to a higher total explained variance, supporting the adjustment. With regard to the domain ‘Beliefs about capabilities’, the constructs Self-efficacy and Perceived behavioral control did not fit well with the conceptually different ‘Optimism’ items, and therefore the ‘Optimism’ items were assigned to a standalone domain. This adjustment was subsequently supported by the results of the re-run OMG analysis.

In the second iteration, further improvement of model fit was informed by comparing the poorly fitting domains from the OMG solution with the solution from the PCA. This led to the assignment of items measuring social support from the management to the domain ‘Organization’, and of ‘Priority’ items to a separate domain. Furthermore, the domain ‘Emotion’ was divided into two domains (i.e., ‘Negative emotions’ and ‘Positive emotions’), and items measuring the domain ‘Memory, attention, and decision processes’ and the construct Automaticity were combined into the ‘Nature of the behaviors’ domain. Again, these adjustments were supported by re-running the OMG analyses.

For each of the resulting 18 domains, Cronbach’s alpha was computed. Investigation of ‘alpha, if item deleted’ values revealed that seven items could be deleted: one item measuring the domain ‘Priority’, one item measuring the domain ‘Innovation’, three items measuring the domain ‘Organization’, one item measuring the domain ‘Socio-political context’, and one item measuring the domain ‘Patient’. After these adjustments, the final questionnaire included 93 items assessing 18 domains (see Table 2). Definitions of these domains are shown in Table 3. In addition, OMG results showed that the total variance explained by the domains increased by more than 15 percentage points, to 63.3%. The variance accounted for by the structure of the questionnaire as we built it differed by 4.7% from the variance accounted for by the components resulting from the PCA. This can be considered a small difference [65], indicating that the predefined (and adjusted) structure fits the data well. A comparison between the initial and the final questionnaire is shown in Table 4.
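As an illustration of this item-removal step, the ‘alpha, if item deleted’ values can be obtained with a small helper that reuses the cronbach_alpha() function sketched in the Methods section; an item whose removal raises alpha above the full-scale value is a candidate for deletion.

```
# 'Alpha if item deleted' for one domain (x: respondents x items of that domain).
alpha_if_item_deleted <- function(x) {
  sapply(colnames(x), function(item)
    cronbach_alpha(x[, setdiff(colnames(x), item), drop = FALSE]))
}
# Compare each value with cronbach_alpha(x) for the full item set.
```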

Table 2 Final questionnaire
Table 3 Domain definitions
Table 4 Comparison between initial and final questionnaire

Internal consistency reliability and discriminant validity

Internal consistency reliability values for the 18 domains of the final questionnaire ranged from .68 for the domain ‘Innovation’ (i.e., the only domain with an alpha < .70) to .93 for the domain ‘Knowledge’. None of the bootstrapped 95% confidence intervals around Pearson’s correlations included 1.00, indicating sufficient discriminant validity (for an overview of all correlations between domains, see Additional file 2). In addition, we found high attenuation-corrected correlations between the domains ‘Knowledge’ and ‘Skills’ (r = .80) and the domains ‘Skills’ and ‘Social/professional role and identity’ (r = .86), which suggests overlap between these domains (see Additional file 3).

Discussion

We developed and tested a questionnaire assessing factors influencing HCPs’ implementation behaviors that was based on a theoretical framework of behavioral determinants [28]. The DIBQ is one of the first TDF-based questionnaires developed in a rigorous manner, and it showed good psychometric properties: construct validity was good, and the majority of domains showed high internal consistency reliability and discriminant validity. While our focus was on the measurement of factors influencing the implementation of PA interventions in primary healthcare, we suggest that the DIBQ can be applied more broadly, as the questionnaire can easily be adapted to other contexts in which implementation research takes place. Consequently, the DIBQ can help resolve previously reported problems with the measurement of theory-based factors underlying HCP behavior [12, 25–27]. This can contribute to the development of effective implementation strategies and, in turn, to the impact of evidence-based interventions.

With regard to the questionnaire’s construct validity, our findings supported most of the predefined structure of the questionnaire, which was based on the 12 domains of the TDF [28]. They correspond with Taylor et al. [39, 68], who found good discriminant validity of TDF domains in a questionnaire measuring influences on patient safety behaviors [39] and in the Determinants of Physical Activity Questionnaire [68]. These results provide an additional level of validation for the content of the TDF, and they confirm the viability of using the framework for the construction of a theory-based questionnaire. Nevertheless, the questionnaire’s construct validity could be enhanced by some adjustments to the content of the domains and by extending the structure of the questionnaire to 18 domains.

The main adjustment we made to the structure of the questionnaire was dividing the domain ‘Environmental context and resources’ into five different environment-related domains: ‘Innovation’, ‘Socio-political context’, ‘Organization’, ‘Patient’, and ‘Innovation strategy’. This adjustment is consistent with leading theoretical models on the introduction of innovations in healthcare [6, 8–12]. Replication of this domain structure in future research may suggest including five different environment-related domains in the TDF. Next, ‘Optimism’ items were separated from the domain ‘Beliefs about capabilities’. This separation makes sense because ‘Optimism’ items were measured as a general disposition (e.g., ‘In my work as a physical therapist, in uncertain times, I usually expect the best’), whereas ‘Beliefs about capabilities’ items concerned capabilities that are required to achieve a specific outcome (e.g., ‘I am confident that I can deliver [PA intervention] following the guidelines’). Furthermore, the adjustment corresponds with the results of the recent validation of the TDF [29]. Items measuring social support from the management were assigned to the domain ‘Organization’, and ‘Priority’ items were separated from ‘Intention’ items. The former adjustment could also be justified by conceptual links between items and domains, and the latter corresponded with the results of the validated TDF [29]. In addition, dividing the domain ‘Emotion’ into the domains ‘Positive emotions’ and ‘Negative emotions’ is supported by previous research indicating that positive and negative affect are two relatively independent constructs that can be measured discriminately [69, 70]. Based on similarities in their content, items measuring the domain ‘Memory, attention, and decision processes’ and Automaticity items were merged into the domain ‘Nature of the behaviors’; the link between automatic behaviors and memory was also highlighted by Wood and Neal [71]. When developing a TDF-based questionnaire, adding questions on attention and decision making to the memory items might decrease the overlap between the domains ‘Memory, attention, and decision processes’ and ‘Nature of the behaviors’. Finally, some items measuring the domains ‘Priority’, ‘Innovation’, ‘Organization’, ‘Socio-political context’, and ‘Patient’ were deleted based on the domains’ Cronbach’s alpha values. An explanation based on the content of these items could not be found; however, the lower internal consistency reliability of these domains might be related to the fact that the items measuring them were all newly developed. This suggests that items measuring the domain ‘Environmental context and resources’ can be improved (see Chaudoir et al. [12] for an overview of measures assessing these environment-related domains).

No adjustments were needed for five out of the 12 domains of the initial questionnaire: ‘Knowledge’, ‘Skills’, ‘Social/professional role and identity’, ‘Beliefs about consequences’, and ‘Behavioral regulation’. This might be explained by the use of previously published questionnaires for the development of the ‘Knowledge’ and ‘Behavioral regulation’ items, and most of the ‘Beliefs about consequences’ items. Furthermore, items measuring the domains ‘Skills’ and ‘Social/professional role and identity’ were validated by the discriminant content validity study [40]. Notably, the ‘Knowledge’ item ‘I know how to…’, the ‘Reinforcement’ items, and the items measuring the construct Action planning performed well, even though they could not be validated in the discriminant content validity study [40]. This might be explained by the divergence in the main aims of the two studies: the focus on differences between individual items when investigating items’ discriminant content validity versus the emphasis on similarities between groups of items when examining a questionnaire’s construct validity. Indeed, in the present study, items that were not validated in the discriminant content validity study were surrounded by other, previously validated items.

Compared to three other studies using a TDF-based questionnaire to identify implementation behavior determinants [37–39], our questionnaire demonstrated high internal consistency reliability for the majority of domains. Explanations for this might be the lower number of items that the previous studies used to measure each domain [37–39] and the development of items for domains instead of for constructs within domains [38, 39]. Furthermore, it is not clear to what extent Beenstock et al. [38] and Taylor et al. [39] used items from previously published questionnaires.

Although OMG analyses revealed sufficient discriminant validity at the item level, attenuation-corrected correlations revealed overlap between the domains ‘Knowledge’ and ‘Skills’ and between ‘Skills’ and ‘Social/professional role and identity’. On the other hand, the bootstrapped 95% confidence intervals around the correlations suggested that the questionnaire was able to measure TDF domains discriminately. Based on these results and the distinct content of the domains, we did not merge them into a single domain. However, high correlations between domains might be problematic when analyzing associations between domains and outcome variables using a multivariate approach.

While our focus was on the measurement of factors influencing HCPs’ implementation of PA interventions, the questionnaire was designed to be easily adaptable, so it can be used in studies investigating implementation behaviors performed by other HCPs in other settings. However, depending on the behavior, the implementing HCP, and the context, it may be necessary to include items for specific barriers and facilitators. For example, time, patient motivation, and financial support may play a role in the delivery of PA interventions by physical therapists, while these factors might be less relevant for other behaviors, HCPs, and settings. Moreover, the validity and reliability of the questionnaire for other behaviors, HCPs, and settings need further investigation.

Some limitations of this study need to be taken into consideration when interpreting the results. First, respondents were physical therapists delivering PA interventions to a variety of target groups, and we did not distinguish between the different PA interventions. Our results suggest sufficient internal validity of the DIBQ; however, a question remains as to whether the structure of the DIBQ holds for every specific PA intervention. In this study, small sample sizes within each PA intervention (sample sizes varied from 4 to 101) hindered the performance of confirmatory factor analysis for each PA intervention separately. A recommendation for future applications of the DIBQ is to replicate the reliability analysis for the target group at hand. Second, the questionnaire assessed TDF domains through their related constructs. However, to develop a questionnaire of an acceptable length to fill in, only a selection of constructs could be measured. Although the selection of key constructs was based on previous research on factors influencing the implementation of PA interventions in primary healthcare [13, 43], it could be that some of the domains’ key constructs are not part of the questionnaire, leading to decreased validity of the measurement of those domains. For example, the construct Intrinsic motivation [72] was not included to measure the domain ‘Motivation and goals’ and the construct Burnout [73] was not included to measure the domain ‘Emotion’, although we know from previous research that these are important determinants of HCPs’ evidence-based practice [74, 75]. Nevertheless, a questionnaire including 93 items might still be too long to fill in. This could also be an explanation for the 55.2% response rate, which was comparable to previously reported response rates of 54% [76] and 57% [77] in surveys among physical therapists, but can be considered low in comparison to Barrett et al. [78], who reached a response rate of 88%. A next step in the development process could be to develop a shorter version of the DIBQ and assess its psychometric properties. One strategy to decrease the number of items would be to select items measuring the domains directly, instead of through their related key constructs. Taking into account the criterion for a reliable component (i.e., at least three items with a loading above .80 [79]), this could decrease the average of four items for each construct to four items for each domain. The results of the discriminant content validity study [40] may guide the selection of items for a shortened version of the questionnaire. Third, comparisons between respondents and non-respondents indicated that the latter were significantly older and had more practice experience, which limits the generalizability of our results. Finally, the methods used to validate our questionnaire were limited to factor analyses and the examination of discriminant validity of the domains, and only internal consistency reliability was assessed. Future research should also investigate the questionnaire’s predictive validity and test–retest reliability.

Conclusions

This study describes the development and initial validation of the DIBQ. The questionnaire showed good construct validity (i.e., research question 1), and the majority of domains showed high internal consistency reliability (i.e., research question 2) and discriminant validity (i.e., research question 3). The questionnaire is therefore a viable instrument for measuring potential determinants of implementation behavior in a theory-based and comprehensive way. The identification of factors influencing implementation behaviors provides important information on which factors should be targeted when designing strategies to promote the effective implementation of interventions [6, 14–19]. This is highly likely to increase the impact of health behavior change interventions. Future studies on the psychometric properties of the questionnaire are warranted and should go beyond construct validity, internal consistency reliability, and discriminant validity. In addition, more research is needed to understand the strengths and limitations of the questionnaire when it is used for other behaviors, among other HCPs, and in other settings.

Consent

In our study, completion of the questionnaire indicated participants’ consent for their participation in the study.

Abbreviations

HCP:

Healthcare professional

TDF:

Theoretical domains framework

PA:

Physical activity

SES:

Socioeconomic status.

References

  1. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE: Knowledge translation of research findings. Implement Sci. 2012, 7: 50-10.1186/1748-5908-7-50.

  2. Glasgow RE, Lichtenstein E, Marcus AC: Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003, 93: 1261-1267. 10.2105/AJPH.93.8.1261.

  3. Haines A, Kuruvilla S, Borchert M: Policy and practice bridging the implementation gap between knowledge and action for health. Bull World Health Organ. 2004, 82: 724-732.

  4. Glasgow R, Klesges L, Dzewaltowski DA, Bull SS, Estabrooks PA: The future of physical activity behavior change research: what is needed to improve translation of research into health promotion practice?. Ann Behav Med. 2004, 27: 3-12. 10.1207/s15324796abm2701_2.

  5. Eccles MP, Armstrong D, Baker R, Cleary K, Davies H, Davies S, Glasziou P, Ilott I, Kinmonth A-L, Leng G, Logan S, Marteau T, Michie S, Rogers H, Rycroft-Malone J, Sibbald B: An implementation research agenda. Implement Sci. 2009, 4: 18-10.1186/1748-5908-4-18.

  6. Grol R, Grimshaw J: From best evidence to best practice: effective implementation of change in patients’ care. Lancet. 2003, 362: 1225-1230. 10.1016/S0140-6736(03)14546-1.

  7. Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S: A conceptual framework for implementation fidelity. Implement Sci. 2007, 2: 40-10.1186/1748-5908-2-40.

  8. Fleuren M, Wiefferink K, Paulussen T: Determinants of innovation within health care organizations: literature review and Delphi study. Int J Qual Health Care. 2004, 16: 107-123. 10.1093/intqhc/mzh030.

  9. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004, 82: 581-629. 10.1111/j.0887-378X.2004.00325.x.

  10. Grol R, Wensing M, Eccles M: Improving Patient Care: The Implementation of Change in Clinical Practice. 2005, Oxford: Elsevier

  11. Rogers EM: Diffusion of Innovations. 1983, New York: The Free Press

  12. Chaudoir SR, Dugan AG, Barr CHI: Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level. Implement Sci. 2013, 8: 22-10.1186/1748-5908-8-22.

  13. Huijg JM, Crone MR, Verheijden MW, van der Zouwe N, Middelkoop BJC, Gebhardt WA: Factors influencing the adoption, implementation, and continuation of physical activity interventions in primary health care: a Delphi study. BMC Fam Pract. 2013, 14: 142-10.1186/1471-2296-14-142.

  14. Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N: Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. J Clin Epidemiol. 2005, 58: 107-112. 10.1016/j.jclinepi.2004.09.002.

  15. Foy R, Eccles M, Grimshaw J: Why does primary care need more implementation research? Fam Pract. 2001, 18: 353-355. 10.1093/fampra/18.4.353.

  16. French SD, Green SE, O’Connor DA, McKenzie JE, Francis JJ, Michie S, Buchbinder R, Schattner P, Spike N, Grimshaw JM: Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the theoretical domains framework. Implement Sci. 2012, 7: 38-10.1186/1748-5908-7-38.

  17. Michie S, Johnston M, Francis J, Hardeman W, Eccles M: From theory to intervention: mapping theoretically derived behavioural determinants to behaviour change techniques. Appl Psychol. 2008, 57: 660-680. 10.1111/j.1464-0597.2008.00341.x.

  18. Presseau J, Johnston M, Francis J, Hrisos S, Stamp E, Steen N, Hawthorne G, Grimshaw J, Elovainio M, Hunter M, Eccles M: Theory-based predictors of multiple clinician behaviors in the management of diabetes. J Behav Med. 2013, epub ahead of print

  19. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, Baker R, Eccles MP: A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013, 8: 35-10.1186/1748-5908-8-35.

  20. Grimshaw JM, Shirran L, Thomas R, Mowatt G, Fraser C, Bero L, Grilli R, Harvey E, Oxman A, O’Brien MA: Changing provider behavior: an overview of systematic reviews of interventions. Med Care. 2001, 39: II2-II45.

  21. Grimshaw JM, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, Whitty P, Eccles MP, Matowe L, Shirran L, Wensing M, Dijkstra R, Donaldson C: Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004, 8: 6-

  22. Davies P, Walker AE, Grimshaw JM: A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci. 2010, 5: 14-10.1186/1748-5908-5-14.

  23. The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG): Designing theoretically-informed implementation interventions. Implement Sci. 2006, 1: 4-

  24. Francis JJ, Tinmouth A, Stanworth SJ, Grimshaw JM, Johnston M, Hyde C, Stockton C, Brehaut JC, Fergusson D, Eccles MP: Using theories of behaviour to understand transfusion prescribing in three clinical contexts in two countries: development work for an implementation trial. Implement Sci. 2009, 4: 70-10.1186/1748-5908-4-70.

  25. Grol RPTM, Bosch MC, Hulscher MEJL, Eccles MP, Wensing M: Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Q. 2007, 85: 93-138. 10.1111/j.1468-0009.2007.00478.x.

  26. Rothman AJ: “Is there nothing more practical than a good theory?”: why innovations and advances in health behavior change will arise if interventions are used to test and refine theory. Int J Behav Nutr Phys. 2004, 1: 11-10.1186/1479-5868-1-11.

  27. Francis JJ, Stockton C, Eccles MP, Johnston M, Cuthbertson BH, Grimshaw JM, Hyde C, Tinmouth A, Stanworth SJ: Evidence-based selection of theories for designing behaviour change interventions: using methods based on theoretical construct domains to understand clinicians’ blood transfusion behaviour. Br J Health Psychol. 2009, 14: 625-646. 10.1348/135910708X397025.

  28. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A: Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005, 14: 26-33. 10.1136/qshc.2004.011155.

  29. Cane J, O’Connor D, Michie S: Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012, 7: 37-10.1186/1748-5908-7-37.

  30. McSherry LA, Dombrowski SU, Francis JJ, Murphy J, Martin CM, O’Leary JJ, Sharp L, ATHENS Group: ‘It’s a can of worms’: understanding primary care practitioners' behaviours in relation to HPV using the theoretical domains framework. Implement Sci. 2012, 7: 73-10.1186/1748-5908-7-73.

  31. Patey AM, Islam R, Francis JJ, Bryson GL, Grimshaw JM: Anesthesiologists’ and surgeons’ perceptions about routine pre-operative testing in low-risk patients: application of the Theoretical Domains Framework (TDF) to identify factors that influence physicians’ decisions to order pre-operative tests. Implement Sci. 2012, 7: 52-10.1186/1748-5908-7-52.

  32. Islam R, Tinmouth AT, Francis JJ, Brehaut JC, Born J, Stockton C, Stanworth SJ, Eccles MP, Cuthbertson BH, Hyde C, Grimshaw JM: A cross-country comparison of intensive care physicians’ beliefs about their transfusion behaviour: a qualitative study using the theoretical domains framework. Implement Sci. 2012, 7: 93-10.1186/1748-5908-7-93.

  33. Curran JA, Brehaut J, Patey AM, Osmond M, Stiell I, Grimshaw JM: Understanding the Canadian adult CT head rule trial: use of the theoretical domains framework for process evaluation. Implement Sci. 2013, 8: 25-10.1186/1748-5908-8-25.

  34. Dyson J, Lawton R, Jackson C, Cheater F: Does the use of a theoretical approach tell us more about hand hygiene behaviour? The barriers and levers to hand hygiene. J Infect Prev. 2011, 12: 17-24. 10.1177/1757177410384300.

  35. Mazza D, Chapman A, Michie S: Barriers to the implementation of preconception care guidelines as perceived by general practitioners: a qualitative study. BMC Health Serv Res. 2013, 13: 36-10.1186/1472-6963-13-36.

  36. Michie S, Pilling S, Garety P, Whitty P, Eccles M, Johnston M, Simmons J: Difficulties implementing a mental health guideline: an exploratory investigation using psychological theory. Implement Sci. 2007, 2: 8-10.1186/1748-5908-2-8.

  37. Amemori M, Michie S, Korhonen T, Murtomaa H, Kinnunen TH: Assessing implementation difficulties in tobacco use prevention and cessation counselling among dental providers. Implement Sci. 2011, 6: 50-10.1186/1748-5908-6-50.

  38. Beenstock J, Sniehotta FF, White M, Bell R, Milne EM, Araujo-Soares V: What helps and hinders midwives in engaging with pregnant women about stopping smoking? A cross-sectional survey of perceived implementation difficulties among midwives in the northeast of England. Implement Sci. 2012, 7: 36-10.1186/1748-5908-7-36.

  39. Taylor N, Parveen S, Robins V, Slater B, Lawton R: Development and initial validation of the influences on patient safety behaviours questionnaire. Implement Sci. 2013, 8: 81-10.1186/1748-5908-8-81.

  40. Huijg JM, Gebhardt WA, Crone MR, Dusseldorp E, Presseau J: Discriminant content validity of a Theoretical Domains Framework questionnaire for use in implementation research. Implement Sci. 2014, 9: 11-10.1186/1748-5908-9-11.

  41. Ajzen I: The theory of planned behavior. Organ Behav Hum Decis. 1991, 50: 179-211. 10.1016/0749-5978(91)90020-T.

  42. Bandura A: Health promotion from the perspective of social cognitive theory. Psychol Health. 1998, 13: 623-649. 10.1080/08870449808407422.

  43. Huijg JM, van der Zouwe N, Gebhardt WA, Crone MR, Verheijden MW, Middelkoop BJC: Introducing physical activity interventions in primary health care: a qualitative study of perceived facilitators and barriers [abstract]. Int J Behav Med. 2012, 19: S1-S341.

  44. van Veldhoven M, Meijman T, Broersen J, Fortuin R: Handleiding VBBA. 2002, Amsterdam: SKB Vragenlijst Services

  45. Frese M: Social support as a moderator of the relationship between work stressors and psychological dysfunctioning: a longitudinal study with objective measures. J Occup Health Psychol. 1999, 4: 179-192.

  46. Gardner B, Abraham C, Lally P, de Bruijn G-J: Towards parsimony in habit measurement: testing the convergent and predictive validity of an automaticity subscale of the self-report habit index. Int J Behav Nutr Phys. 2012, 9: 102-10.1186/1479-5868-9-102.

  47. Sniehotta FF, Schwarzer R, Scholz U, Schüz B: Action planning and coping planning for long-term lifestyle change: theory and assessment. Eur J Soc Psychol. 2005, 35: 565-576. 10.1002/ejsp.258.

  48. Scheier MF, Carver CS, Bridges MW: Distinguishing optimism from neuroticism (and trait anxiety, self-mastery, and self-esteem): a reevaluation of the life orientation test. J Pers Soc Psychol. 1994, 67: 1063-1078.

  49. Cialdini R, Kallgren C, Reno R: A focus theory of normative conduct: a theoretical refinement and reevaluation of the role of norms in human behavior. Adv Exp Soc Psychol. 1991, 24: 202-232.

  50. Qualtrics: Qualtrics software, version 45433. 2013, Provo, Utah, USA: Qualtrics

  51. IBM Corp: IBM SPSS Statistics for Windows, Version 19.0. 2010, Armonk, NY: IBM Corp.

  52. Stuive I, Kiers HAL, Timmerman ME, ten Berge JMF: The empirical verification of an assignment of items to subtests: the oblique multiple group method versus the confirmatory common factor method. Educ Psychol Meas. 2008, 68: 923-939. 10.1177/0013164408315264.

  53. Holzinger K: A simple method of factor analysis. Psychometrika. 1944, 9: 257-262. 10.1007/BF02288737.

  54. Stuive I, Kiers HAL, Timmerman ME: Comparison of methods for adjusting incorrect assignments of items to subtests: oblique multiple group method versus confirmatory common factor method. Educ Psychol Meas. 2009, 69: 948-965. 10.1177/0013164409332226.

  55. Jöreskog K: A general approach to confirmatory maximum likelihood factor analysis. Psychometrika. 1969, 34: 183-202. 10.1007/BF02289343.

  56. Jöreskog K: Testing a simple structure hypothesis in factor analysis. Psychometrika. 1966, 31: 165-178. 10.1007/BF02289505.

  57. Hotelling H: Analysis of a complex of statistical variables into principal components. J Educ Psychol. 1933, 24: 417-441.

  58. Cronbach L: Coefficient alpha and the internal structure of tests. Psychometrika. 1951, 16: 297-333. 10.1007/BF02310555.

  59. Campbell D, Fiske D: Convergent and discriminant validation by the multitrait-multimethod matrix. Psychol Bull. 1959, 56: 81-105.

  60. Anderson JC, Gerbing DW: Structural equation modeling in practice: a review and recommended two-step approach. Psychol Bull. 1988, 103: 411-423.

  61. Spearman C: The proof and measurement of association between two things. Am J Psychol. 1904, 15: 72-101. 10.2307/1412159.

  62. Timmerman M, Stuive I: Multiple Group Method with Corrections for Selfcorrelation and Test Length. 2007

  63. R Development Core Team: R: A Language and Environment for Statistical Computing. 2012, Vienna, Austria: R Foundation for Statistical Computing

  64. Falissard B: Psy: Various Procedures Used in Psychometry. 2012, Vienna, Austria: R Foundation for Statistical Computing, R Package Version 1.1

  65. Cohen J: Statistical Power Analysis for the Behavioral Sciences. 1988, Hillsdale, New Jersey: Lawrence Erlbaum

  66. Wännström I, Peterson U, Asberg M, Nygren A, Gustavsson JP: Psychometric properties of scales in the General Nordic Questionnaire for Psychological and Social Factors at Work (QPS): confirmatory factor analysis and prediction of certified long-term sickness absence. Scand J Psychol. 2009, 50: 231-244. 10.1111/j.1467-9450.2008.00697.x.

  67. American Psychological Association (APA): APA Dictionary of Psychology. 2007, Washington, DC: American Psychological Association

  68. Taylor N, Lawton R, Conner M: Development and initial validation of the determinants of physical activity questionnaire. Int J Behav Nutr Phys Act. 2013, 10: 74-10.1186/1479-5868-10-74.

  69. Watson D, Clark LA, Tellegen A: Development and validation of brief measures of positive and negative affect: the PANAS scales. J Pers Soc Psychol. 1988, 54: 1063-1070.

  70. Crawford JR, Henry JD: The positive and negative affect schedule (PANAS): construct validity, measurement properties and normative data in a large non-clinical sample. Br J Clin Psychol. 2004, 43: 245-265. 10.1348/0144665031752934.

  71. Wood W, Neal DT: The habitual consumer. J Consum Psychol. 2009, 19: 579-592. 10.1016/j.jcps.2009.08.003.

  72. Deci E, Ryan R: Intrinsic Motivation and Self-Determination in Human Behavior. 1985, New York: Plenum Publishing Co.

  73. Maslach C, Jackson SE: The measurement of experienced burnout. J Organ Behav. 1981, 2: 99-113. 10.1002/job.4030020205.

  74. Dannapfel P, Peolsson A, Nilsen P: What supports physiotherapists’ use of research in clinical practice? A qualitative study in Sweden. Implement Sci. 2013, 8: 31-10.1186/1748-5908-8-31.

  75. Forsman H, Rudman A, Gustavsson P, Ehrenberg A, Wallin L: Nurses’ research utilization two years after graduation: a national survey of associated individual, organizational, and educational factors. Implement Sci. 2012, 7: 46-10.1186/1748-5908-7-46.

  76. Shirley D, Van der Ploeg HP, Bauman AE: Physical activity promotion in the physical therapy setting: perspectives from practitioners and students. Phys Ther. 2010, 90: 1311-1322. 10.2522/ptj.20090383.

  77. van der Wees PJ, Zagers CA, de Die SE, Hendriks EJ, Nijhuis-van der Sanden MW, de Bie RA: Developing a questionnaire to identify perceived barriers for implementing the Dutch physical therapy COPD clinical practice guideline. BMC Health Serv Res. 2013, 13: 159-10.1186/1472-6963-13-159.

  78. Barrett EM, Darker CD, Hussey J: Promotion of physical activity in primary care: knowledge and practice of general practitioners and physiotherapists. J Public Health. 2013, 21: 63-69. 10.1007/s10389-012-0512-0.

  79. Stevens J: Applied Multivariate Statistics for the Social Sciences. 1992, Hillsdale, New Jersey: Lawrence Erlbaum


Acknowledgements

This research was funded by ZonMw, The Netherlands Organisation of Health Research and Development.

Author information

Corresponding author

Correspondence to Johanna M Huijg.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

JMH was involved in the design of the study, recruited the respondents, collected, analyzed and interpreted the data, and wrote the initial and subsequent drafts of the manuscript. ED was involved in data analysis and interpretation, and commented on the manuscript. WAG and MRC were involved in the conception and the design of the study, assisted with interpretation of the data, and critically revised the manuscript. MWV, NZ and BJCM were involved in the conception and the design of the study, assisted with interpretation of the data, and commented on the manuscript. All authors read and approved the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Huijg, J.M., Gebhardt, W.A., Dusseldorp, E. et al. Measuring determinants of implementation behavior: psychometric properties of a questionnaire based on the theoretical domains framework. Implementation Sci 9, 33 (2014). https://doi.org/10.1186/1748-5908-9-33

