

Spotlight on Qualitative Research

Content on this page is provided for reference purposes only. It is no longer maintained and may now be outdated.

It is no secret that implementation science (IS) is a challenging field, given the complexity of interventions, implementations, and implementation sites, which involve interpersonal processes and multiple levels of context that influence the effort. Adding to the challenge is the real-world, dynamic nature of implementation research, marked by changes in interventions, implementation strategies, and settings over time. Qualitative methods are well suited to help us sort out this complexity and, in tandem with quantitative methods, make mixed methods a powerful tool for understanding implementation efforts and outcomes.

The use of qualitative methods in IS, however, has not been without its problems. Traditionally, qualitative research involves long-term, labor-intensive engagement of the researcher in a research setting. Implementation science, however, is action-oriented, requiring flexible, adaptive methods, frequently in compressed timeframes. Adapting qualitative methods to the contingencies of implementation science while maintaining methodological rigor and integrity can be difficult. A comprehensive discussion specifically focused on qualitative methods in IS is necessary, but has not yet formally or systematically occurred. To remedy this, NCI’s Implementation Science Team has organized a workgroup to catalyze this discussion: the QUALRIS (QUALitative Research in Implementation Science) group. The group is included on the Research to Reality website as one of its learning communities.

QUALRIS includes thought leaders in implementation science, qualitative methods, or both. Its members are Deborah Cohen (OHSU), Benjamin F. Crabtree (RWJMS), Laura Damschroder (Ann Arbor VA), Alison B. Hamilton (Los Angeles VA), Suzanne Heurtin-Roberts (NCI), Jennifer Leeman (UNC Chapel Hill), Deborah K. Padgett (NYU), Lawrence Palinkas (USC), Borsika Rabin (UC San Diego), and Heather Schacht Reisinger (Iowa VA).

The group has been writing a white paper examining some of the challenges involved in using qualitative methods in implementation science and offering guidance in meeting those challenges. The paper is intended for implementation researchers with limited or no experience in qualitative methodology, orienting readers to qualitative methods and relating how the methods have been used thus far and for what purposes. Using qualitative methods in IS can be challenging for even the most experienced researcher, and we anticipate that these researchers will also find the paper helpful.

QUALRIS is planning to meet in December of this year, just after the 10th annual D&I conference, to put finishing touches on a near-final draft of the paper. We plan to release the paper on NCI’s IS Team website early in 2018. In the meantime, we can share some thoughts on using qualitative methods rigorously and appropriately in IS. The group has drafted guidance in the following domains:

1) Employ qualitative methods relevant to the research questions rather than methods commonly used as “defaults,” such as focus groups or semi-structured interviews;

2) Give increased attention to procedures designed to achieve qualitative analogs to validity and reliability, such as “trustworthiness,” “transferability,” and “auditability,” and provide documentation of adherence to those procedures;

3) Provide rationales for the form and content of interview and focus group guides;

4) Document and explain data analysis logic and procedures;

5) Improve the presentation of qualitative findings in IS publications.

The group also recommends increased qualitative expertise on research teams and increased training in qualitative methods for IS researchers. 

Raising the bar for the use of qualitative methods in IS serves to strengthen the field.  It can also expand our understanding of the implementation process, as qualitative inquiry frequently leads to discovery of new phenomena and to new questions we hadn’t previously thought to ask.   Strong methods, both qualitative and quantitative, will lead us to a stronger implementation science.

We would love to hear about your experiences in using qualitative methods in implementation research.  What has worked well for you?  What problems have you faced?  How did you address those problems?  What are your thoughts about maintaining rigor in qualitative methods while engaged in frequently fast-paced and complex implementation studies?  What questions do you have? Please join in our QUALRIS conversation on R2R!

Relevant Readings:

Cohen, D. J., & Crabtree, B. F. (2008). Evaluative criteria for qualitative research in health care: controversies and recommendations. Annals of Family Medicine, 6, 331-339.

Palinkas, L. A. (2014). Qualitative and mixed methods in mental health services and implementation research. Journal of Clinical Child and Adolescent Psychology, 43(6), 851-861.

Sandelowski, M., & Leeman, J. (2012). Writing usable qualitative health research findings. Qualitative Health Research, 22(10), 1404-1413.

Southam-Gerow, M. A., & Dorsey, S. (2014). Qualitative and mixed methods research in dissemination and implementation science: introduction to the special issue. Journal of Clinical Child & Adolescent Psychology, 43(6), 845-850.



Thanks for posting Suzanne! I was delighted to see the recent publication of "Factors Related to Implementation and Reach of a Pragmatic Multisite Trial: The My Own Health Report (MOHR) Study" this week in the Journal of the American Board of Family Medicine.  

This study does a nice job in articulating a prospective method for assessing and describing contextual factors related to implementation and patient reach of a pragmatic trial in primary care.

I am looking forward to the continuing conversations around what else do we need to move this important area of research forward!


Thanks for your comments, Margaret! The paper you mentioned did show how qualitative methods can document and help us understand change over time in implementation contexts. We could see early attitudes of enthusiasm fading to frustration at some practice sites when difficulties with implementing the MOHR intervention arose. In some sites, frustration evolved into loss of support, and inadequate resources were made available to implement the intervention satisfactorily. In other sites, frustration was channeled into creative solutions, so that implementation strategies changed and the intervention was delivered successfully. This gives insight into interrelated contextual processes and elements influencing mutual change over time, a special strength of qualitative methods.

Actually, the qualitative contextual analysis of the MOHR study reported in this paper was particularly challenging for a number of reasons. The MOHR study was, in part, a motivating factor leading to the QUALRIS project.



Nice piece, Suzanne. It made me think of an observation I just made  yesterday. As a training exercise, I was allowed to watch cognitive testing being done on an NCI survey. While I was there I asked the project lead how often she thought people didn’t pretest  their data collection tools (especially surveys and interviews) and she said she didn’t know, but she hoped not often. As a medical anthropologist with a background in qualitative research, I feel the same way, but my question still hasn’t been answered. Do people recognize the intricacies that make for good interviews and focus groups?

The cognitive testing was similar to interview pretests I’ve done in the past, but it was interesting to see a more formal version. While I was watching, the participant would read each survey question aloud to the interviewer. Then they would answer the question. At the end of each section the interviewer went back and asked what the participant thought the question meant, and used other probes to elicit more information on issues previous participants identified. I was reminded how questions that seem to make sense to researchers in the thick of a discipline may not be understandable or relevant to laypeople. There were several questions that didn’t make sense or were hard for the interviewee to understand on the survey being tested. There were also questions that didn’t seem to have a relevant answer for that participant, like when a participant was asked to calculate a number that wouldn’t have made sense as a response. It reminded me of when I was in Mozambique asking people about malaria and the environment – there was little association of malaria with the environment when I asked directly, but when I rewrote my questions to be more specific participants said that standing water and high grasses increase mosquito populations and make someone more likely to get malaria.

Suzanne, I love your first point about being more thoughtful about matching methods with research questions. I wonder if you have any qualitative methods in mind other than the defaults? I agree that we need to branch out, but I also think we need to make sure that people using qualitative methods really understand methodological requirements necessary to make  qualitative data accurate, relevant, and useful.


Hi Jordan,

No, I don't have any specific methods in mind. I suppose I'm frustrated by the overuse of methods such as focus groups simply because they're something many people have heard of. Frankly, focus groups are frequently used inappropriately and without understanding of what the method can and can't do. For example, a focus group is not a good approach for eliciting a wide range of beliefs or thoughts, due to social pressure to conform in a group. Rather, it's a way to understand what people can generally agree upon, not their differences. That's why they're called "focus" groups: they focus on a few ideas. Basically, the point is that one's method needs to be relevant to one's research question. This is what we're all taught in "research methods 101," but apparently many people new to qualitative methods don't think about this.

Thanks for your comments!





Thank you, Suzanne and others, for opening a space like this to share doubts and reflections. Really needed! The main problem I have when conducting qualitative and collaborative research (what I do as a researcher and social worker) within the framework of implementation science is the prevalence of models for clinical and biomedical interventions. So I have my doubts about whether we are really, as depicted, interdisciplinary on equal terms, when positivist researchers are always some steps ahead; the complexity, density, and lack of fixed design structure found in qualitative research mean that qualitative researchers need to be making decisions in vivo all the time.

I also have problems with the proposal to implement; many times I feel pressure to define and select one intervention in advance, as if it were something absolutely clear, which it is not, considering the epistemological pluralism in the social sciences. Therefore, not only evidence-based practice should count as evidence, but also other approaches with their own epistemological stances, as valid as the post-positivist ones.

I note a demand to have the selection to implement completely “closed and fixed” in advance, which is almost impossible if you are working collaboratively, where agreements and consensus cannot be taken for granted. I wonder if you can develop and decide it through the process, according to the emergent findings you encounter during fieldwork.

Thank you, it's good at least to share problems!



Dear Natalia,

Yes, it's true that there are still difficulties reconciling certain aspects of qualitative methods with the quantitative methods that predominate in biomedical and implementation research. They are based upon differing epistemologies and world views, as you note. There is still some tension between the two approaches and reluctance on the part of some researchers to accept the validity of qualitative research. More research specifically devoted to methodological innovation is definitely needed. Such research, greater familiarity with qualitative methods, and time will likely take care of this.

As for the need to specify one's methods while writing a research proposal, this can be challenging for the qualitative researcher, but the requirement is understandable. It is risky to commit scarce resources to a research study if no methods are specified; how would the funder evaluate the proposal? It is always possible in proposal writing to describe some methods that can be anticipated to be useful for likely contingencies. The proposal writer can describe expected scenarios and methods while acknowledging some of the exploratory nature of a study.

Not all qualitative research is exploratory, however, and clear, specific designs and methods can be proposed for many research projects. This is where the art of proposal writing comes in. It can help to consult with other researchers who have considerable experience in a particular area or specific method. These are some of the difficulties of engaging in mixed-methods and team research, but they are problems that can be resolved in collaboration.

Thanks for sharing your thoughts and concerns.