Venkatesh, V., Brown, S.A., and Sullivan, Y.W. 'Guidelines for Conducting Mixed-methods Research: An Extension and Illustration,' Journal of the AIS (17:7), 2016, 435-495. https://doi.org/10.17705/1jais.00433
77 Pages Posted: 28 Jan 2022
Virginia Polytechnic Institute and State University - Pamplin College of Business
University of Arizona - Department of Management Information Systems
Binghamton University
Date Written: 2016
The objective of this paper is to extend the guidelines of Venkatesh et al. (2013) for mixed methods research by identifying and integrating variations in mixed methods research. By taking into account 14 properties of mixed methods research (e.g., purposes, research questions, epistemological assumptions), our guidelines demonstrate how researchers can flexibly identify the existing variations in mixed methods research and proceed accordingly with a study design that suits their needs. To make the guidelines actionable for various situations and issues that researchers could encounter, we develop a decision tree to map the flow and relationship among the design strategies. We also provide an in-depth illustration of one possible type of mixed methods research in information systems and discuss how to develop and validate meta-inferences as the outcomes of such a study.
Keywords: mixed methods research, meta-inferences, research design, qualitative, quantitative
Background Evidence for Mobile Stroke Units (MSUs) demonstrates that onset to treatment times for intravenous thrombolysis can be reduced and access to mechanical thrombectomy might be improved. Despite growing use of MSUs internationally, to date there have been no studies in NHS England and NHS Wales exploring the acceptability of MSUs to clinicians, patient and public representatives and other key stakeholders, which are important when considering potential feasibility and implementation.
Methods This study used a mixed methods design with a cross-sectional survey and qualitative workshops and interviews between October 2023 and May 2024. Survey data were collected from clinicians involved in emergency stroke care. Qualitative data involved clinical and non-clinical professionals involved in stroke care alongside patient and public representatives with experience of stroke. Survey data were analysed descriptively, while content analysis was used for open-ended questions. Qualitative data were thematically analysed prior to triangulation using a convergent coding matrix.
Results The study results, drawn from 25 survey respondents and 21 participants in qualitative workshops, found that almost all participants had positive affective attitudes to the concept of MSUs. However, several key areas of concern were identified that need to be addressed prior to implementing MSUs. These concerns included how MSUs would be staffed; whether and how telemedicine could contribute; the types of economic impacts; the extent to which triage systems could accurately identify stroke patients for MSUs to attend; where the base location and geographic coverage of MSUs should be; the impact of MSUs on equitable access to stroke care; and how to improve public awareness of MSUs.
Conclusion Whilst MSUs are mostly acceptable to key stakeholders, numerous areas of concern need to be addressed prior to MSU implementation. We recommend further research to address these issues prior to implementation in the NHS.
The authors have declared that no competing interests exist.
Author declarations
The details of the IRB/oversight body that provided approval or exemption for the research described are given below:
Ethical approval was provided via Northumbria University ethics online system (reference: 4117). The study was deemed by the Health Research Authority (HRA) to not require HRA approval. All participants gave written consent prior to any data collection.
Data cannot be shared publicly because participants did not give consent for data sharing. Data are available from Northumbria University, contact via corresponding author, for researchers who meet the criteria for access to confidential data.
Objective: Mixed methods research (MMR) integrates quantitative and qualitative methods throughout the research process to answer complex research questions. MMR designs align with the guiding frameworks of patient-centered care and social determinants of health by effectively examining the role of contextual factors and human experiences in influencing health and rehabilitation outcomes. Reporting standards and critical appraisal tools ensure the quality and transparency of the research process. MMR standards exist, yet there remains a need for reporting guidelines and an appraisal tool that meet field standards, are applicable across rehabilitation fields of study, and can accommodate the range of possibilities for combining research approaches and methods.
Methods: Mixed Methods Reporting in Rehabilitation & Health Sciences (MMR-RHS) was developed using a systematic consensus-building process in accordance with published guidance and was preregistered with the Enhancing the Quality and Transparency of Health Research Network. MMR-RHS evolved through a sequence of steps, including extensive literature review, expert consultation, stakeholder feedback, pilot testing, and tool refinement.
Results: MMR-RHS consists of 20 criteria that align with field standards for rigor and transparency, with an emphasis on integration throughout the research process, a key component of MMR.
Conclusions: A systematic process was utilized to develop the reporting standards and an appraisal tool for MMR in rehabilitation and health science. The tool is comprehensive, includes a set of criteria grounded in MMR literature, and is flexible for application to a range of MMR designs commonly seen in rehabilitation research.
Impact: The MMR-RHS may improve the quality and transparency of MMR by supporting investigators, authors, reviewers, and editors during project development, manuscript preparation, and critical review. The tool may assist readers in critical appraisal, knowledge translation, and application of published MMR findings. Ultimately, the MMR-RHS may help legitimize mixed methods in rehabilitation and health research, an important step toward understanding the complexities of health care, patient outcomes, and evolving societal health needs.
Keywords: Critical Appraisal; Mixed Methods Research; Reporting Standards; Research Reporting; Transparency.
© The Author(s) 2023. Published by Oxford University Press on behalf of the American Physical Therapy Association. All rights reserved. For permissions, please e-mail: [email protected].
Qualitative Research
This webinar reviews four different strategies for integrating qualitative and quantitative data or results that invite a more instrumental role for a qualitative inquiry in contributing analytical insight.
Elizabeth G. Creamer, EdD
Professor emerita, Virginia Tech.
APA Style Journal Article Reporting Standards offer guidance on what information should be included in all manuscript sections for quantitative, qualitative, and mixed methods research and include how to best discuss race, ethnicity, and culture.
Introducing Journal Article Reporting Standards for Race, Ethnicity, and Culture (JARS–REC)
JARS–REC were created to develop best practices related to the manner in which race, ethnicity, and culture are discussed within scientific manuscripts in psychological science.
Quantitative research
Use JARS–Quant when you collect your study data in numerical form or report them through statistical analyses.
Qualitative research
Use JARS–Qual when you collect your study data in the form of natural language and expression.
Mixed methods research
Use JARS–Mixed when your study combines both quantitative and qualitative methods.
Race, ethnicity, culture
Use JARS–REC for all studies for guidance on how to discuss race, ethnicity, and culture.
APA Style Journal Article Reporting Standards (APA Style JARS) are a set of standards designed for journal authors, reviewers, and editors to enhance scientific rigor in peer-reviewed journal articles. Educators and students can use APA Style JARS as teaching and learning tools for conducting high quality research and determining what information to report in scholarly papers.
The standards include information on what should be included in all manuscript sections for:
Additionally, the APA Style Journal Article Reporting Standards for Race, Ethnicity, and Culture (JARS–REC) provide guidance on how to discuss race, ethnicity, and culture in scientific manuscripts. JARS–REC should be applied to all research, whether it is quantitative, qualitative, or mixed methods.
Using these standards will make your research clearer and more accurate as well as more transparent for readers. For quantitative research, using the standards will increase the reproducibility of science. For qualitative research, using the standards will increase the methodological integrity of research.
JARS–Quant should be used in research where findings are reported numerically (quantitative research). JARS–Qual should be used in research where findings are reported using nonnumerical descriptive data (qualitative research). JARS–Mixed should be applied to research that includes both quantitative and qualitative research (mixed methods research). JARS–REC should be applied to all research, whether it is quantitative, qualitative, or mixed methods.
For more information on APA Style JARS:
Many aspects of research methodology warrant a close look, and journal editors can promote better methods if we encourage authors to take responsibility to report their work in clear, understandable ways. —Nelson Cowan, Editor, Journal of Experimental Psychology: General
This video describes and discusses the updated APA Style Journal Article Reporting Standards.
Reporting Qualitative Research in Psychology
Journal article reporting standards for qualitative research
Reporting Quantitative Research in Psychology
Journal article reporting standards for quantitative research
Publication Manual, 7th Edition
The official source for writing papers and creating references in seventh edition APA Style
Email an APA Style Expert if you have questions, feedback, or suggestions for modules to be included in future JARS updates.
Introducing APA Style Journal Article Reporting Standards for Race, Ethnicity, and Culture
These standards are for all authors, reviewers, and editors seeking to improve manuscript quality by encouraging more racially and ethnically conscious and culturally responsive journal reporting standards for empirical studies in psychological science.
APA Style JARS for high school students
In this post, we provide an overview of APA Style JARS and resources that can be shared with high school students who want to learn more about effective communication in scholarly research.
Happy 2022, APA Stylers!
This blog post is dedicated to our awesome APA Style users. You can use the many resources on our website to help you master APA Style and improve your scholarly writing.
APA Style JARS on the EQUATOR Network
The APA Style Journal Article Reporting Standards (APA Style JARS) have been added to the EQUATOR Network. The network aims to promote accuracy and quality in reporting of research.
APA Style JARS: Resources for instructors and students
APA Style Journal Article Reporting Standards (APA Style JARS) are a set of guidelines for papers reporting quantitative, qualitative, and mixed methods research that can be used by instructors, students, and all others reading and writing research papers.
Jennifer P. Wisdom
Psychiatry Department, Columbia University New York State Psychiatric Institute, 1051 Riverside Drive Box 100, New York, NY 10032
Psychiatry Department, Columbia University New York State Psychiatric Institute, New York, NY
Department of Educational Leadership Counseling at Sam Houston State University, Huntsville, TX
Kaiser Permanente Northwest Center for Health Research, Portland, OR
Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles.
All empirical articles ( n = 1,651) published between 2003 and 2007 from four top-ranked health services journals.
All mixed methods articles ( n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p -values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component.
Mixed methods articles comprised 2.85 percent ( n = 47) of empirical articles, quantitative articles 90.98 percent ( n = 1,502), and qualitative articles 6.18 percent ( n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively).
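The comparison above rests on a Pearson chi-square test of independence on a 2×2 table, with Cramér's V as the effect size. The sketch below illustrates that computation in plain Python; the counts passed in are hypothetical, not the study's actual component tallies.

```python
import math

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square (1 df) and Cramer's V for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n          # expected count under independence
        chi2 += (obs - expected) ** 2 / expected
    cramers_v = math.sqrt(chi2 / n)      # min(rows, cols) - 1 == 1 for a 2x2 table
    return chi2, cramers_v

# Hypothetical counts: components present vs. absent in two article types
chi2, v = chi_square_2x2(90, 320, 240, 270)
```

In practice a library routine such as `scipy.stats.chi2_contingency` would be used (with continuity correction as appropriate); the hand-rolled version above only makes the formula visible.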
Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports.
As the health services research field continues to evolve, so too do its methods. Mixed methods research capitalizes on the strengths of both qualitative and quantitative methodologies by combining approaches in a single research study to increase the breadth and depth of understanding ( Johnson, Onwuegbuzie, and Turner 2007 ). Mixed methods can be a better approach to research than either quantitative-only or qualitative-only methods when a single data source is not sufficient to understand the topic, when results need additional explanation, when exploratory findings need to be generalized, or when the complexity of research objectives is best addressed with multiple phases or types of data ( Brannen 1992 ; Creswell and Plano Clark 2011 ). Rigorous mixed methods approaches require that individual components (qualitative or quantitative) adhere to their respective established standards ( Curry, Nembhard, and Bradley 2009 ; Creswell and Plano Clark 2011 ). Despite recent guidelines on frameworks for conducting mixed methods research (e.g., Curry, Nembhard, and Bradley 2009 ; Creswell and Plano Clark 2011 ), a critical challenge has been ensuring that reports from mixed methods studies transparently discuss the methodological components integral to the conduct of the studies. Health services researchers and reviewers need clear guidelines regarding research methodology, including methodological components that should be expected in mixed methods papers to indicate that they are sufficiently rigorous.
Health services research is the study of how social factors, financing systems, organizational structures and processes, health technologies, and personal behaviors affect access to health care, the quality and cost of health care, and ultimately, health and well-being ( Lohr and Steinwachs 2002 ). As a result of the breadth of topics addressed, health services research draws upon methods and concepts from many fields, including medicine, epidemiological and economic studies, and the evaluation of services and interventions ( Field, Tranquada, and Feasley 1995 ). Health services researchers increasingly work in interdisciplinary partnerships (e.g., Aboelela et al. 2007 ) and use innovative methods, including mixed methods, to more fully understand health services phenomena. Mixed methods approaches are also consistent with suggestions to extend scientific and contextual health knowledge beyond randomized trials ( Berwick 2005 ).
Mixed methods research capitalizes on the strengths of both qualitative and quantitative methodology by combining both components in a single research study to increase breadth and depth of understanding ( Johnson, Onwuegbuzie, and Turner 2007 ). Qualitative and quantitative methods can be integrated for different purposes to provide a more comprehensive picture of health services than either method can alone. Mixed methods are appropriate in the following situations: (1) when researchers would like to converge different methods or use one method to corroborate the findings from another about a single phenomenon (triangulation); (2) when researchers would like to use one method to elaborate, illustrate, enhance, or clarify the results from another method (complementarity); (3) when researchers would like to use results from one method to inform another method, such as in creating a measure (development); (4) when researchers would like to use one method to discover paradoxes and contradictions in findings from another method that can suggest reframing research questions (initiation); and (5) when researchers seek to expand the breadth and depth of the study by using different methods for different research components (expansion) ( Greene, Caracelli, and Graham 1989 ). Bryman (2006 ) modified and expanded this list to add that mixed methods can also be useful in obtaining diversity of views, illustrating concepts, and developing instruments.
Quantitative and qualitative research can be distinguished by the philosophical assumptions brought to the study (e.g., deductive versus inductive), the types of research strategies (e.g., experiments versus case studies), and the specific research methods used in the study (e.g., structured survey versus observation) ( Creswell 2008 ). Qualitative health services research, for example, is a method in which the researcher collects textual material derived from speech or observation and attempts to understand the phenomenon of interest in terms of the meanings people bring to it ( Denzin and Lincoln 1994 ; Shortell 1999 ; Giacomini and Cook for the Evidence-Based Medicine Working Group 2000 ; Malterud 2001 ; Bradley, Curry, and Devers 2007 ). Certain characteristics are typical of qualitative research, including a naturalistic setting (as opposed to a laboratory), a focus on participants' perspectives and their meaning, the outcome as a process rather than a product, and data collected as words or images ( Padgett 2008 ).
The National Institutes of Health noted the need for rigor in combining qualitative and quantitative methods to study complex health issues in their recent publication, Best Practices for Mixed Methods in Health Sciences ( Creswell, Klassen, Plano Clark, and Smith for the Office of Behavioral and Social Sciences Research 2011 ). There are several frameworks to guide the rigorous conduct and evaluation of mixed methods research ( Collins, Onwuegbuzie, and Sutton 2006 ; Curry, Nembhard, and Bradley 2009 ; Tashakkori and Teddlie 2010 ; Creswell and Plano Clark 2011 ). Collectively, these frameworks recommend that the conduct of mixed method studies—and reports of mixed method research, including peer-reviewed publication—demonstrates explicit rationales for all decisions regarding study design, including the purpose of including both qualitative and quantitative methods. They specifically advise that each component (qualitative or quantitative) should be conducted with a level of rigor in accordance with established principles in its field, and that researchers be transparent in methodological reporting. For example, sampling design should be specified as identical, parallel, nested, or mixed ( Onwuegbuzie and Collins 2007 ); the level of mixing methods (fully versus partially) should be described, as should time orientation (sequential or concurrent components of research) and emphasis (equal importance of methodological approaches or one more dominant) ( Leech and Onwuegbuzie 2009 ).
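The frameworks cited above ask authors to state a small, fixed set of design dimensions explicitly. As an illustration only, those dimensions can be captured as a structured record; the class and field names below are this sketch's invention, not part of any cited framework, though the option values are the ones named in the text.

```python
# Illustrative record of the design dimensions mixed methods reports are
# asked to state explicitly. Class and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class MixedMethodsDesign:
    sampling_design: str   # "identical" | "parallel" | "nested" | "mixed"
    level_of_mixing: str   # "fully" | "partially"
    time_orientation: str  # "sequential" | "concurrent"
    emphasis: str          # "equal" | "quantitative-dominant" | "qualitative-dominant"

# Example: a partially mixed, sequential, equal-emphasis design with nested sampling
design = MixedMethodsDesign("nested", "partially", "sequential", "equal")
```

Making these choices explicit, whether in prose or in a structured form like this, is what the transparency criteria discussed below evaluate.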
Conducting and evaluating mixed methods research pose unique methodological challenges, particularly related to rigor. Quantitative studies typically rely on quality criteria such as internal validity, generalizability, and reliability ( Campbell 1957 ; Campbell and Stanley 1963 ; Messick 1989 , 1995 ; Onwuegbuzie and Daniel 2002 , 2004 ; Onwuegbuzie 2003 ), whereas qualitative studies have roughly comparable quality criteria of credibility, transferability, and dependability ( Lincoln and Guba 1985 ; Guba and Lincoln 1989 ; Miles and Huberman 1994 ; Maxwell 2005 ; Pope and Mays 2006 ). For example, questions asked when evaluating a qualitative study might include the following: “Were participants relevant to the research question and was their selection well reasoned?” and “Was the data collection comprehensive enough to support rich and robust descriptions of the observed events?” ( Giacomini and Cook for the Evidence-Based Medicine Working Group 2000 ). In addition to determining whether methodological approaches unique to qualitative or quantitative research were employed, an evaluation of a mixed methods study should assess aspects unique to mixed methods, such as how multiple components are integrated and how consistency and discrepancy between findings from each method are managed ( Sale and Brazil 2004 ; O'Cathain, Murphy, and Nicholl 2007 ). Qualitative, quantitative, and mixed methodologists agree that study procedures should be reported transparently, including sufficient detail to allow the reader to make inferences about study quality ( Lincoln and Guba 1985 ; Giacomini and Cook for the Evidence-based Medicine Working Group 2000 ; O'Cathain, Murphy, and Nicholl 2007 ; Armstrong et al. 2008 ; Creswell 2008 ; Curry, Nembhard, and Bradley 2009 ; Leech et al. 2009 ; Teddlie and Tashakkori 2009 ).
Several researchers have proposed specific techniques to assess the overall methodology of mixed methods research and assess the methodological components of the qualitative, quantitative, and mixed portions of the studies (e.g., Pluye et al. 2009 ; O'Cathain 2010 ; Tashakkori and Teddlie 2010 ; Creswell and Plano Clark 2011 ; Leech, Onwuegbuzie, and Combs 2011 ). For example, O'Cathain (2010 ) assessed quality of mixed methods research by evaluating transparency and clarity in reporting planning, design, data, interpretive rigor, inference transferability, reporting quality, synthesizability, and utility. Others have suggested alternative methods for assessing quality, but criteria often are not elucidated or are vague. Further, those frameworks typically address quality of the study design as opposed to the characteristics provided in the published article. By contrast, Sale and Brazil (2004 ) proposed a structured framework for the evaluation of mixed methods publications by identifying key methodological components that should be included for both qualitative and quantitative portions of studies. Despite these advances, we found few published accounts of the rigor of published mixed methods research. Our article has three specific research questions: (1) How has the frequency of mixed methods studies published in health services journals changed over time? (2) How are mixed methods articles being used to elucidate health services? and (3) To what extent do mixed methods reports differ in methodological content compared to qualitative-only or quantitative-only articles?
This systematic review assessed the frequency of mixed methods publications in top health services research journals and compared the frequency of key methodological components in qualitative, quantitative, and mixed method studies. We first reviewed articles in health services research journals to determine the prevalence of mixed methods designs and the presence of key methodological components. Then, we conducted statistical analyses of trends over time in the frequency of mixed methods articles and in the presence of key methodological components of those articles. Because this was an analysis of published data, no ethical oversight was required.
We examined four journals: Health Affairs, Health Services Research , Medical Care , and Milbank Quarterly , which had 5-year impact factors of 2.94–4.71. Journals were selected by reviewing the Institute for Scientific Information (2007 ) rankings for the top 10 journals in health care sciences and services. Of these 10, we included all journals that focused generally on health services research and excluded journals with narrower foci ( Value in Health , Journal of Health Economics , Journal of Pain and Symptom Management , Statistical Methods in Medical Research , Quality and Safety in Health Care , and Quality of Life Research ). Although 2001 marked a turning point in the proliferation of mixed methods studies published in major electronic bibliographic databases such as PubMed ( Collins, Onwuegbuzie, and Jiao 2007 ), we chose to examine articles from 2003 to 2007 because 2003 marks publication of the first edition of Tashakkori and Teddlie's landmark Handbook of Mixed Methods in Social and Behavioral Research , which provided the first comprehensive collection of mixed method theory, methodology, and application. Five years represents a sufficient period of time to examine trends of published articles following the publication of a landmark methodological work.
We reviewed empirical articles to determine whether each represented a quantitative, qualitative, or mixed methods study. This entailed using all the information presented in the abstract and the body of the article to identify the research design either as stated or implied by the author(s). We excluded nonempirical articles (book reviews, literature reviews, commentaries and opinion articles, letters to the editor, policy statements) and articles from a special issue of Milbank Quarterly (Volume 83, Number 4) that included only articles published between 1932 and 1998.
We classified articles as quantitative if they included (1) a primary goal of testing theories or hypotheses about relationships between/among variables, or (2) quantitative data and methodology, such as hierarchical linear modeling, multiple regression, or Markov modeling. We classified articles as qualitative if they included either (1) a primary goal of exploring or understanding the meaning ascribed to a specific phenomenon or experience, or (2) qualitative data such as observations, unstructured or semi-structured interviews, or focus group interviews or methodologies such as thematic analysis. Although more complex definitions of mixed method studies exist (e.g., Johnson, Onwuegbuzie, and Turner 2007 ; Creswell and Plano Clark 2011 ), we classified articles as mixed methods if they integrated or combined both quantitative and qualitative methods in a single study ( Sale and Brazil 2004 ). This definition reflects the general definitions of mixed methods and the lack of consensus on a specific definition across all multidisciplinary mixed methods researchers.
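The study's classification was done by human readers, but the decision rule itself (quantitative signals only, qualitative signals only, both, or neither) is mechanical enough to sketch. The keyword screen below is a hypothetical first-pass triage only, not the authors' procedure; the term lists simply echo the method examples named above.

```python
# Hypothetical first-pass screen for study type based on method keywords.
# The actual study classified articles by close human reading; this only
# illustrates the quantitative / qualitative / mixed decision rule.
QUANT_TERMS = {"regression", "hierarchical linear modeling", "markov", "hypothesis"}
QUAL_TERMS = {"focus group", "semi-structured interview", "thematic analysis",
              "observation"}

def classify(abstract: str) -> str:
    text = abstract.lower()
    has_quant = any(term in text for term in QUANT_TERMS)
    has_qual = any(term in text for term in QUAL_TERMS)
    if has_quant and has_qual:
        return "mixed methods"
    if has_quant:
        return "quantitative"
    if has_qual:
        return "qualitative"
    return "unclassified"  # would go to human review
```

A real pipeline would treat such a screen only as a pre-sort, with ambiguous cases resolved by readers, mirroring the consultation step the authors describe next.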
We used spreadsheets to track classifications, with cells containing articles’ abstracts and our field notes. Two authors read and classified articles in batches of 50 according to type, conferring as needed until agreement was achieved ( n = 300 articles); the remaining articles ( n = 1,351) were each coded by one author. For the few articles for which methodology was ambiguous ( n = 58, 3.5 percent of all empirical articles), classification was resolved in consultation with a third author. Similar methods have been used in other evaluations of mixed methods articles ( Powell et al. 2008 ).
We identified all mixed methods articles ( n = 47) and equal random samples ( n = 47) of quantitative articles (from 1,502 articles) and qualitative articles (from 102 articles) (total n = 141) in the four journals. Random samples of qualitative and quantitative articles were selected using a random number generator and did not adjust for journal or year. We assessed the frequency of key methodological components reported across articles, then compared rates by article type. The methodological components we focused on were drawn from two conceptual frameworks. The first included Sale and Brazil's (2004 ) criteria: (1) internal validity for quantitative findings and credibility for qualitative findings, (2) external validity for quantitative findings and transferability or fittingness for qualitative findings, (3) reliability for quantitative findings and dependability for qualitative findings, and (4) objectivity for quantitative findings and confirmability for qualitative findings (specific criteria are listed in Table 3 ). The second was O'Cathain's transparency criteria for mixed methods studies ( O'Cathain, Murphy, and Nicholl 2007 ; O'Cathain 2010 ), which specify that mixed methods studies should state the (1) priority of methods (primarily quantitative, primarily qualitative, or equal priority), (2) purpose of mixing methods (e.g., triangulation, complementarity, initiation, development, or expansion), (3) sequence of methods (qualitative first, quantitative first, or simultaneous), and (4) stage of integration of both types of data (e.g., data collection, analysis, interpretation). We assessed four additional components of mixed methods studies: (1) whether qualitative and quantitative components were integrated, (2) whether limitations of design were detailed, (3) whether areas of consistency between qualitative and quantitative components were elucidated, and (4) whether areas of inconsistency between components were described.
Key Methodological Components in Mixed Methods, Quantitative, and Qualitative Health Services Research Articles
 | Mixed Methods Studies (n = 47) | | | | Quantitative Studies (n = 47) | | | | Qualitative Studies (n = 47) | | | 
---|---|---|---|---|---|---|---|---|---|---|---|---
Component | Yes | No | N/A | % with Component | Yes | No | N/A | % with Component | Yes | No | N/A | % with Component
Truth value (internal validity) | | | | | | | | | | | | 
Ethical review undertaken | 9 | 37 | 1 | 19.57 | 9 | 37 | 1 | 19.57 | | | | 
Informed consent stated | 5 | 21 | 21 | 19.23 | 5 | 38 | 4 | 11.63 | | | | 
Identifying or controlling for extraneous/confounding variables | 7 | 40 | 0 | 14.89 | 33 | 14 | 0 | 70.21 | | | | 
Confidentiality protected | 3 | 42 | 2 | 6.67 | 2 | 42 | 3 | 4.55 | | | | 
Comparability of control to intervention groups at baseline | 0 | 0 | 47 | 0.00 | 8 | 36 | 3 | 18.18 | | | | 
Control/comparison groups treated similarly | 0 | 0 | 47 | 0.00 | 3 | 40 | 4 | 6.98 | | | | 
Applicability (external validity/generalizability) | | | | | | | | | | | | 
Outcome measures defined | 7 | 0 | 40 | 100.00 | 43 | 3 | 1 | 93.48 | | | | 
Control/comparison group described | 2 | 0 | 45 | 100.00 | 11 | 33 | 3 | 25.00 | | | | 
Data collection instruments/source of data described | 29 | 18 | 0 | 61.70 | 46 | 1 | 0 | 97.87 | | | | 
Statement of purpose/objective | 28 | 19 | 0 | 59.57 | 40 | 7 | 0 | 85.11 | | | | 
Source of subjects stated (sampling frame) | 27 | 19 | 1 | 58.70 | 41 | 6 | 0 | 87.23 | | | | 
Study population defined or described | 24 | 23 | 0 | 51.06 | 43 | 4 | 0 | 91.49 | | | | 
Source of control/comparison group stated | 1 | 1 | 45 | 50.00 | 8 | 36 | 3 | 18.18 | | | | 
Selection of control/comparison group described | 1 | 1 | 45 | 50.00 | 8 | 36 | 3 | 18.18 | | | | 
Data gathering procedures described | 23 | 24 | 0 | 48.94 | 33 | 14 | 0 | 70.21 | | | | 
Description of setting/conditions under which data collected | 22 | 24 | 1 | 47.83 | 32 | 15 | 0 | 68.09 | | | | 
Statistical procedures referenced or described | 19 | 28 | 0 | 40.43 | 45 | 2 | 0 | 95.74 | | | | 
Subject recruitment or sampling selection described | 17 | 30 | 0 | 36.17 | 35 | 12 | 0 | 74.47 | | | | 
Statement about nonrespondents, dropouts, deaths | 16 | 31 | 0 | 34.04 | 21 | 25 | 1 | 45.65 | | | | 
P-values stated | 16 | 31 | 0 | 34.04 | 41 | 6 | 0 | 87.23 | | | | 
Both statistical and clinical significance acknowledged | 13 | 34 | 0 | 27.66 | 41 | 6 | 0 | 87.23 | | | | 
Study design stated explicitly | 11 | 36 | 0 | 23.40 | 26 | 21 | 0 | 55.32 | | | | 
Inclusion/exclusion criteria stated explicitly | 10 | 36 | 1 | 21.74 | 28 | 19 | 0 | 59.57 | | | | 
Missing data addressed | 10 | 37 | 0 | 21.28 | 18 | 29 | 0 | 38.30 | | | | 
At least one hypothesis stated | 10 | 37 | 0 | 21.28 | 23 | 24 | 0 | 48.94 | | | | 
Sample randomly selected | 6 | 39 | 2 | 13.33 | 12 | 35 | 0 | 25.53 | | | | 
Confidence intervals given for main results | 5 | 42 | 0 | 10.64 | 26 | 21 | 0 | 55.32 | | | | 
Power calculation provided | 1 | 46 | 0 | 2.13 | 7 | 40 | 0 | 14.89 | | | | 
Description of intervention | 0 | 2 | 45 | 0.00 | 7 | 36 | 4 | 16.28 | | | | 
Assessment of outcome blinded | 0 | 0 | 47 | 0.00 | 2 | 41 | 4 | 4.65 | | | | 
Consistency (reliability) | | | | | | | | | | | | 
Standardization of observers described | 3 | 44 | 0 | 6.38 | 7 | 40 | 0 | 14.89 | | | | 
Neutrality (objectivity) | | | | | | | | | | | | 
Statement of researcher's assumptions/perspective | 5 | 42 | 0 | 10.64 | 4 | 43 | 0 | 8.51 | | | | 
Truth value (credibility) | | | | | | | | | | | | 
Triangulation of qualitative sources | 25 | 22 | 0 | 53.19 | | | | | 27 | 20 | 0 | 57.45
Triangulation of qualitative methods | 16 | 31 | 0 | 34.04 | | | | | 13 | 34 | 0 | 27.66
Use of exemplars | 13 | 34 | 0 | 27.66 | | | | | 14 | 33 | 0 | 29.79
Ethical review undertaken | 10 | 37 | 0 | 21.28 | | | | | 8 | 30 | 9 | 21.05
Triangulation of investigators | 7 | 40 | 0 | 14.89 | | | | | 3 | 44 | 0 | 6.38
Informed consent stated | 6 | 41 | 0 | 12.77 | | | | | 3 | 35 | 9 | 7.89
Member checks | 4 | 43 | 0 | 8.51 | | | | | 2 | 45 | 0 | 4.26
Confidentiality protected | 4 | 43 | 0 | 8.51 | | | | | 3 | 35 | 9 | 7.89
Consent procedures described | 3 | 44 | 0 | 6.38 | | | | | 2 | 36 | 9 | 5.26
Peer debriefing | 2 | 45 | 0 | 4.26 | | | | | 0 | 47 | 0 | 0.00
Negative case analysis (searching for disconfirming evidence) | 1 | 46 | 0 | 2.13 | | | | | 0 | 47 | 0 | 0.00
Triangulation of theory/perspective | 0 | 47 | 0 | 0.00 | | | | | 4 | 43 | 0 | 8.51
Applicability (transferability/fittingness) | | | | | | | | | | | | 
Statement of purpose/objective | 34 | 13 | 0 | 72.34 | | | | | 36 | 11 | 0 | 76.60
Data gathering procedures described | 25 | 22 | 0 | 53.19 | | | | | 36 | 11 | 0 | 76.60
Description of study context or setting | 20 | 27 | 0 | 42.55 | | | | | 38 | 9 | 0 | 80.43
Phenomenon of study stated | 18 | 29 | 0 | 38.30 | | | | | 24 | 23 | 0 | 51.06
Sampling procedure described | 18 | 29 | 0 | 38.30 | | | | | 22 | 23 | 2 | 48.89
Rationale for qualitative methods | 17 | 30 | 0 | 36.17 | | | | | 12 | 35 | 0 | 25.53
Description of participants/informants | 16 | 31 | 0 | 34.04 | | | | | 25 | 20 | 2 | 55.56
Statement of research questions | 15 | 32 | 0 | 31.91 | | | | | 21 | 26 | 0 | 44.68
Statement of how setting was selected | 15 | 32 | 0 | 31.91 | | | | | 30 | 17 | 0 | 63.04
Data analysis described | 15 | 32 | 0 | 31.91 | | | | | 20 | 27 | 0 | 42.55
Transcription procedures described | 11 | 36 | 0 | 23.40 | | | | | 13 | 28 | 6 | 31.71
Coding techniques described | 9 | 38 | 0 | 19.15 | | | | | 17 | 30 | 0 | 36.17
Justification or rationale for sampling strategy | 8 | 39 | 0 | 17.02 | | | | | 18 | 27 | 2 | 40.00
Audiotaping procedures described | 8 | 39 | 0 | 17.02 | | | | | 12 | 29 | 6 | 29.27
Statement about nonrespondents, dropouts, deaths | 6 | 41 | 0 | 12.77 | | | | | 4 | 34 | 9 | 10.53
Description of raw data | 3 | 44 | 0 | 6.38 | | | | | 4 | 43 | 0 | 8.51
Rationale for tradition within qualitative methods | 2 | 45 | 0 | 4.26 | | | | | 2 | 45 | 0 | 4.26
Data collection to saturation specified | 2 | 45 | 0 | 4.26 | | | | | 1 | 44 | 2 | 2.22
Statement that reflexive journals, logbooks, notes were kept | 2 | 45 | 0 | 4.26 | | | | | 3 | 44 | 0 | 6.38
Consistency (dependability) | | | | | | | | | | | | 
External audit of process | 0 | 47 | 0 | 0.00 | | | | | 0 | 47 | 0 | 0.00
Neutrality (confirmability) | | | | | | | | | | | | 
External audit of data | 2 | 45 | 0 | 4.26 | | | | | 0 | 47 | 0 | 0.00
Bracketing or epoche | 0 | 47 | 0 | 0.00 | | | | | 0 | 47 | 0 | 0.00
Statement of researcher's assumptions or perspective | 0 | 47 | 0 | 0.00 | | | | | 2 | 45 | 0 | 4.26
Integration of qualitative and quantitative components | 40 | 7 | — | 85.11 | | | | | | | | 
Sequence of methods specified | 10 | 37 | — | 27.03 | | | | | | | | 
Areas of consistency between methods stated | 12 | 35 | — | 25.53 | | | | | | | | 
Areas of inconsistency between methods stated | 6 | 41 | — | 12.77 | | | | | | | | 
Stage of integration specified | 5 | 42 | — | 11.90 | | | | | | | | 
Priority of methods specified | 2 | 45 | — | 4.44 | | | | | | | | 
Purpose of mixing methods specified | 2 | 45 | — | 4.44 | | | | | | | | 
Limitations of mixed methods stated | 2 | 45 | — | 4.26 | | | | | | | | 
We assessed components using categories of 0 (not described), 1 (described), or not applicable (e.g., for criteria referencing control groups in a study that had none, or ethical review for a study with no human subjects data) ( O'Cathain, Murphy, and Nicholl 2007 ). We identified only whether the study contained or did not contain each methodological component and did not attempt to assess quality or appropriateness of each component within the context of the study. For example, we assessed whether the publication stated that missing data were addressed but not whether the methods to address missing data were the best methods for that particular research design. Similar to initial article classification, two authors read and coded articles to assess presence/absence of each criterion, with any ambiguity resolved in consultation with a third author.
Once all articles were coded, we conducted a statistical analysis to determine whether there were trends over time in the prevalence of mixed methods articles, using linear regression to test the hypothesis that the prevalence of mixed methods articles increased over time. We also conducted chi-square tests to assess differences between mixed methods, qualitative, and quantitative articles on both quantitative and qualitative criteria: we tested whether each criterion was present in the same proportion of quantitative studies as in the quantitative portion of the mixed methods studies, and in the same proportion of qualitative studies as in the qualitative portion of the mixed methods studies.
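The trend test described above can be sketched directly from the yearly counts later reported in Table 2. This is an illustrative reanalysis, not the authors' code; the ordinary least-squares slope is computed by hand so no statistical library is assumed.

```python
# Illustrative sketch of the trend analysis: yearly prevalence of mixed
# methods articles (from Table 2) regressed on publication year.
years = [2003, 2004, 2005, 2006, 2007]
mixed = [7, 13, 8, 10, 9]            # mixed methods articles per year
total = [288, 326, 307, 356, 374]    # all empirical articles per year
prev = [m / t * 100 for m, t in zip(mixed, total)]  # prevalence, percent

# Ordinary least-squares slope of prevalence on year, computed by hand.
n = len(years)
xbar = sum(years) / n
ybar = sum(prev) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, prev)) / \
        sum((x - xbar) ** 2 for x in years)
print(f"OLS slope: {slope:+.3f} percentage points per year")  # ≈ -0.123
```

The slightly negative slope is consistent with the article's finding of no increase in prevalence over the period.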
In general, coders could easily categorize the type of study. Challenges arose when transparency about methods was inadequate ( N = 58, 3.5 percent of all empirical articles). For example, some papers indicated that data from interviews were included but did not provide details about who was interviewed, what was asked in the interviews, how the interview data were analyzed, or how the interview data were integrated into the overall study.
Research Question 1: How has the frequency of mixed methods studies published in health services journals changed over time?
Table 1 presents a summary of the types of articles published in four major health services research journals from 2003 through 2007. Only 2.85 percent ( n = 47) of empirical articles were mixed methods studies; 6.18 percent ( n = 102) represented qualitative research, and quantitative research accounted for 90.98 percent ( n = 1,502). The journal with the highest proportion of empirical studies employing a mixed methods design was Milbank Quarterly (8.33 percent), followed by Health Affairs (5.60 percent), Health Services Research (3.61 percent), and Medical Care (0.78 percent). A chi-square test showed a significant difference in these proportions (χ2 = 34.67, df = 3, p < .0001).
Type and Design of Empirical Articles Published in Health Services Research Journals from 2003 to 2007, Data Presented by Journal
Journal | Quant | Qual | Mixed | Total
---|---|---|---|---
Health Affairs | 305 | 49 | 21 | 375
 | 81.33% | 13.07% | 5.60% | 
Health Services Research | 428 | 26 | 17 | 471
 | 90.87% | 5.52% | 3.61% | 
Medical Care | 751 | 12 | 6 | 769
 | 97.66% | 1.56% | 0.78% | 
Milbank Quarterly | 18 | 15 | 3 | 36
 | 50.00% | 41.67% | 8.33% | 
Total | 1,502 | 102 | 47 | 1,651
 | 90.98% | 6.18% | 2.85% | 
Note . Mixed, mixed method articles; Qual, qualitative articles; Quant, quantitative articles.
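As a sketch, the chi-square comparison can be recomputed from the Table 1 counts (mixed versus non-mixed articles in each journal). Depending on the exact table specification, the recomputed statistic may differ somewhat from the published χ2 = 34.67, but the conclusion (p < .0001 on 3 df) is unchanged.

```python
# Pearson chi-square computed by hand from the Table 1 counts:
# (mixed, non-mixed) empirical articles in each of the four journals.
obs = {
    "Health Affairs":           (21, 354),
    "Health Services Research": (17, 454),
    "Medical Care":             (6,  763),
    "Milbank Quarterly":        (3,   33),
}
grand = sum(a + b for a, b in obs.values())        # 1,651 articles in total
p_mixed = sum(a for a, _ in obs.values()) / grand  # overall mixed proportion

chi2 = 0.0
for mixed_n, other_n in obs.values():
    row = mixed_n + other_n
    # Expected counts under the null of equal mixed proportions per journal.
    for o, e in ((mixed_n, row * p_mixed), (other_n, row * (1 - p_mixed))):
        chi2 += (o - e) ** 2 / e

print(f"chi-square = {chi2:.2f} on 3 df")  # well above 16.27, the df=3 critical value at p = .001
```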
To detect temporal trends in the frequency of mixed methods research in the health services literature, we collapsed articles across journals and examined them by publication year. Table 2 presents the frequency of each article type for each of the 5 years. The four journals combined published an average of 9.4 mixed methods articles per year, or 2.85 percent of empirical articles annually. A quadratic trend was seen across the 5 years ( R 2 = 0.65), indicating a slight increase in mixed methods articles in the first 2 years and then a decrease for the remaining years.
Type and Design of Empirical Articles Published in Four Health Services Research Journals from 2003 to 2007, Data Presented by Year
Year | Quant | Qual | Mixed | Total |
---|---|---|---|---|
2003 | 260 | 21 | 7 | 288 |
90.28% | 7.29% | 2.43% | ||
2004 | 295 | 18 | 13 | 326 |
90.49% | 5.52% | 3.99% | ||
2005 | 282 | 17 | 8 | 307 |
91.86% | 5.54% | 2.61% | ||
2006 | 321 | 25 | 10 | 356 |
90.17% | 7.02% | 2.81% | ||
2007 | 344 | 21 | 9 | 374 |
91.98% | 5.61% | 2.41% | ||
Total | 1,502 | 102 | 47 | 1,651 |
90.98% | 6.18% | 2.85% |
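The rise-then-fall shape of the quadratic trend can be sketched from the rounded yearly percentages in Table 2. This is an illustration of the curvature only; the fitted R2 from these rounded values need not match the published figure.

```python
# Quadratic fit to the yearly prevalence of mixed methods articles
# (rounded percentages taken from Table 2, 2003-2007).
import numpy as np

years = np.array([2003, 2004, 2005, 2006, 2007])
prev = np.array([2.43, 3.99, 2.61, 2.81, 2.41])  # % mixed methods per year

# polyfit returns coefficients highest degree first: a*x^2 + b*x + c.
a, b, c = np.polyfit(years - 2003, prev, deg=2)
peak = -b / (2 * a)  # vertex of the parabola, in years after 2003
print(f"curvature a = {a:.3f} (negative => rise then fall), peak ≈ 2003 + {peak:.1f}")
```

A negative leading coefficient with a vertex between 2004 and 2005 matches the description of an early increase followed by a decline.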
Research Question 2: How are mixed methods articles being used to elucidate health services research?
Mixed methods articles were categorized into four overlapping categories. Articles on organizational and individual decision-making processes ( n = 18 studies) combined qualitative interviews with quantitative administrative data analyses to assess decision making about processes or impediments to processes. Examples include a study of formulary adoption decisions ( Dranove, Hughes, and Shanley 2003 ) and of states’ decisions to reduce Medicaid and other public program funding ( Hoadley, Cunningham, and McHugh 2004 ).
Sixteen articles described outcomes or effects of policies or initiatives by combining administrative health record or performance data with interviews of health administrators, providers, or executives. Examples include papers describing outcomes of pay-for-performance changes to Medicaid ( Felt-Lisk, Gimm, and Peterson 2007 ; Rosenthal et al. 2007 ) and hospital patient safety initiatives ( Devers, Pham, and Liu 2004 ).
Thirteen measurement development articles employed mixed methods to create measurement tools to assess, for example, caregiver burden ( Cousineau et al. 2003 ), patient activation ( Hibbard et al. 2004 ), and the development of a Healthcare Effectiveness Data and Information Set (HEDIS) smoking measure ( Pbert et al. 2003 ). These studies typically examined qualitative data from individual or focus group interviews first to inform creation and testing of a survey.
Articles on experiences and perceptions were the least common category ( n = 8), typically combining surveys and interviews. These included family physicians’ perceptions of the effect of medication samples on their prescribing practices ( Hall, Tett, and Nissen 2006 ); caregivers’ experiences of the termination of home health care for stroke patients ( Levine et al. 2006 ); and consumer enrollment experiences in the Cash and Counseling program ( Schore, Foster, and Phillips 2007 ).
Only five mixed methods articles (10.64 percent of the mixed methods sample) used the terms “mixed method” or “multimethod” in the abstract or text, although four articles (8.51 percent) referred to “qualitative and quantitative” data.
Research Question 3: Do mixed methods articles report qualitative and quantitative methodology differently than methodology is reported in qualitative-only or quantitative-only articles?
Table 3 presents a summary of the frequency of key methodological components present in quantitative articles, qualitative articles, and mixed methods articles (each n = 47). For quantitative methodological components (32 items), mixed methods articles (M = 7.02 [21.94 percent], SD = 6.24) averaged statistically significantly fewer components ( t (92) = −4.50, p < .00001, Cohen's d effect size = 0.93) than did quantitative articles (M = 15.06 [47.07 percent], SD = 10.53). For qualitative methodological components (35 items), mixed methods articles (M = 7.17 [21.34 percent], SD = 6.36) did not differ statistically significantly ( t (92) = −1.10, p = .14, d = 0.23) from qualitative articles (M = 8.91 [25.47 percent], SD = 8.83). No article met all criteria, and no criterion was met by all articles. For comparative analyses at a statistical significance level of α = 0.05, power to detect a medium difference (Cohen's h = 0.50) and a large difference (Cohen's h = 0.80) was 78 and 99 percent, respectively.
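The reported t statistics and effect sizes can be recovered from the summary statistics alone; a minimal sketch, assuming a pooled-variance two-sample t-test with equal group sizes (n = 47 each, df = 92):

```python
# Recomputing the two-sample comparisons from the reported means and SDs.
import math

def pooled_t_and_d(m1, sd1, m2, sd2, n=47):
    """Pooled-variance t statistic and Cohen's d for two equal-n groups."""
    sp = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)  # pooled SD (equal n)
    t = (m1 - m2) / (sp * math.sqrt(2 / n))    # df = 2n - 2 = 92
    d = abs(m1 - m2) / sp                      # Cohen's d
    return t, d

# Quantitative components: mixed methods vs. quantitative-only articles.
t_quant, d_quant = pooled_t_and_d(7.02, 6.24, 15.06, 10.53)
# Qualitative components: mixed methods vs. qualitative-only articles.
t_qual, d_qual = pooled_t_and_d(7.17, 6.36, 8.91, 8.83)

print(round(t_quant, 2), round(d_quant, 2))  # -4.5 0.93
print(round(t_qual, 2), round(d_qual, 2))    # -1.1 0.23
```

Both results match the published values, which suggests a pooled-variance test was used.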
Of quantitative components, mixed methods studies were most likely to describe sources of data and data collection instruments (61.70 percent of studies), state the purpose/objective of the paper (59.57 percent), state the source of subjects (58.70 percent), and define/describe the study population (51.06 percent). Most mixed methods studies did not include control and intervention groups, which excluded related criteria. Quantitative studies tended to contain more key methodological components, with more than 90 percent of quantitative studies defining outcome measures (93.48 percent), defining/describing the study population (91.49 percent), describing statistical procedures (95.74 percent), and describing data collection instruments or data sources (97.87 percent). Quantitative studies were more likely than the quantitative portion of mixed methods studies to describe study characteristics (e.g., study design, subject recruitment), identify or control for confounding variables, provide probability values or confidence intervals, state hypotheses, or acknowledge both statistical and clinical significance (see Table 3 ).
For qualitative methodological components, mixed methods studies were most likely to state the purpose/objective of the paper (72.34 percent), triangulate qualitative sources (e.g., use both individual and focus group interviews; 53.19 percent), and describe data-gathering procedures (53.19 percent). More than 50 percent of qualitative studies triangulated qualitative sources (57.45 percent), stated the purpose/objective of the paper (57.45 percent), and described the study setting (80.43 percent), how the setting was selected (63.04 percent), the participants (55.56 percent), and data-gathering procedures (76.60 percent). Qualitative studies were more likely than the qualitative portions of the mixed methods studies to describe the study setting, the participants, and the data-gathering procedures, and to justify the sampling strategy.
For criteria regarding method integration, few authors justified the use of mixed methods or clearly described the priority, purpose, and sequence of methods, and the stage of integration. Most articles, however, integrated qualitative and quantitative components (85.11 percent); examination of articles indicated components were most frequently integrated in the interpretation phase. Across all studies, few articles stated that informed consent was obtained, ethical review was undertaken, or that subjects’ confidentiality was protected.
Previous reports indicate mixed methods articles comprised <1 percent of empirical health articles examined in 2000 ( McKibbon and Gadd 2004 ). Since then, however, the National Institutes of Health has increased funding for mixed methods research, with the proportion of funded research projects up to 5 percent of studies in some institutes ( Plano Clark 2010 ). In the United Kingdom, the proportion of funded research that uses mixed methods is at 17 percent and continuing to increase ( O'Cathain, Murphy, and Nicholl 2007 ). We found that the use of mixed methods in articles published in top health services research journals was generally consistent between 2003 and 2007 at approximately 3 percent of all empirical articles, lower than would be expected given the complexity and depth of health services research questions for which mixed methods would be appropriate. The presence of key methodological components was variable across type of article, but the quantitative portion of mixed methods articles included consistently fewer methodological components than quantitative-only studies and the qualitative portion of mixed methods articles included about the same proportion of methodological components as qualitative-only articles. Mixed methods articles also generally did not address the priority, purpose, and sequence of methods or the integration of methods as suggested by experts in mixed methods (e.g., Creswell and Tashakkori 2008 ; O'Cathain 2010 ; Creswell and Plano Clark 2011 ).
Key methodological components that cut across qualitative and quantitative methodologies were often missing from mixed methods publications. Descriptions of sample selection and sampling procedures, the study context, and data-gathering procedures are essential aspects of interpreting study findings, and mixed methods studies should not be exempt from these basic research requirements. Many mixed methods studies did not include the level of detail that would likely be required for a qualitative or quantitative paper to be accepted in these high-ranking journals. Further, the studies appeared not to follow available guidance on the structure and components of mixed methods studies that discuss basic quality criteria, data collection strategies, methods of data analysis, procedures for integration of methods, processes of making inferences from text, and recommendations for adequate reporting of results (e.g., Giacomini and Cook for the Evidence-based Medicine Working Group 2000 ; Curry, Nembhard, and Bradley 2009 ; O'Cathain 2010 ; Tashakkori and Teddlie 2010 ; Creswell and Plano Clark 2011 ). In some ways this finding is not surprising because guidance on mixed methods standards is still emerging. We expect that the National Institutes of Health publication, Best Practices for Mixed Methods in Health Sciences (Creswell, Klassen, Plano Clark, and Smith for the Office of Behavioral and Social Science Research) will lead to increased standardization of mixed methods approaches.
Although they reported more key methodological components on average than the mixed methods articles, quantitative articles in this analysis had some surprising gaps as well, including low reporting of power analyses, how missing data were addressed, and descriptions of control/comparison groups. It should be noted, however, that quantitative articles with large sample sizes do not necessarily need power analyses. With regard to single-method qualitative articles, low proportions described the study context, coding techniques, or data analysis. Few articles with human subjects involvement included statements that the research was conducted with ethical oversight, promised confidentiality, or obtained consent. These findings suggest that the issue of poor transparency in reporting methodology is not limited to mixed methods studies.
The methodological components reported here are not optimal indicators of the quality of mixed methods publications; an article could conceivably have all of these components and yet still be a poor research study. These components are, however, a useful starting point for a systematic evaluation of the rigor of qualitative and quantitative portions of mixed methods studies. Some journals require inclusion of other criteria (e.g., Consolidated Standards of Reporting Trials 2010 ) to guide reporting of highly structured methodologies (e.g., randomized clinical trials); it would be useful to examine researchers’ and editors’ perspectives on the validity of the methodological components in this study for mixed method publications. It is difficult, however, to identify measurable criteria that capture the breadth of study designs in health services. Further, determination of what indicators of rigor would be appropriate might reasonably vary by study design, topic, scope, and even journal, and qualified judgment is required to determine which criteria are appropriate for each study. These findings suggest mixed methods researchers should provide enough detail on methodology and methodological decisions to allow reviewers to judge quality.
Researchers face challenges writing and publishing mixed methods articles, including communicating with diverse audiences who are familiar with only one methodological approach (i.e., quantitative research or qualitative research), determining the most appropriate language and terminology to use, complying with journal word counts, and finding appropriate publishing outlets with reviewers who have expertise in mixed methods research techniques and who are not biased against mixed methods studies ( Leech and Onwuegbuzie 2010 ; Leech, Onwuegbuzie, and Combs 2011 ). Our findings suggest that Sale and Brazil's (2004 ) criteria and existing guidance on conducting mixed methods research (e.g., Collins, Onwuegbuzie, and Sutton 2006 ; Tashakkori and Teddlie 2010 ; Creswell and Plano Clark 2011 ) might be useful frameworks for health services researchers as they work to improve methodological rigor. Journal editors might also encourage the publication of mixed methods projects by (1) publishing guidelines for rigor in mixed methods articles (e.g., Sale and Brazil 2004 ), (2) identifying experienced reviewers who can provide competent and ethical reviews of mixed methods studies, and (3) requiring transparency of methods for all studies so that rigor and quality can be assessed to the same extent they are in quantitative studies. These modifications might require some flexibility in word counts or the allowance of online appendices so that mixed methods researchers can describe fully and concisely both qualitative and quantitative components, methods for integrating findings, and other appropriate details.
In this study, assessment was limited to only published articles. We did not contact authors to determine specific study activities, and studies may have included methodological components (e.g., consenting) not reported in publications. We assessed only whether publications reported the methodological component, but we did not evaluate whether each component was fully and appropriately implemented in the research.
Mixed methods studies have utility in providing a more comprehensive picture of health services than either method can alone. Researchers who use mixed methods techniques should use rigorous methodologies in their mixed methods research designs and explicitly report key methodological components of those designs and methods in published articles. Similarly, journal editors who publish mixed methods research should provide guidance to reviewers of mixed methods articles to assess the quality of manuscripts, and they must be prepared to provide adequate space for authors to report the necessary methodological information. Frameworks are now available to guide both the design and evaluation of mixed methods research studies and published works. Whatever frameworks are used, it is essential that authors who engage in mixed methods research studies meet two primary goals (developed by the American Educational Research Association 2006 ): Mixed methods researchers should (1) conduct and report research that is warranted or defensible in terms of documenting evidence, substantiating results, and validating conclusions; and (2) ensure that the conduct of research is transparent in terms of clarifying the logic underpinning the inquiry.
Joint Acknowledgment/Disclosure Statement : The authors appreciate funding from the National Institute on Drug Abuse (K23 DA020487) and comments and feedback on an earlier draft from the anonymous reviewers, John Creswell, PhD, Alicia O'Cathain, PhD, Hilary Vidair, PhD, Susan Essock, PhD, and Sa Shen, PhD. Portions of this manuscript were presented at the International Mixed Methods Conference in July 2010 in Baltimore, Maryland.
Disclosures : None.
Disclaimers : None.
Additional supporting information may be found in the online version of this article:
Appendix SA1: Author Matrix.
Please note: Wiley-Blackwell is not responsible for the content or functionality of any supporting materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.
COMMENTS
Novel Research Methods Researchers are encouraged to consult with colleagues in their college or department to see if others have received IRB approval for similar procedures, as their insights and experience could be helpful. ... This presentation will clarify the key aspects of ClinicalTrials.gov registration and results reporting, with an ...
Background: Smart speakers, such as Amazon's Echo and Google's Nest Home, combine natural language processing with a conversational interface to carry out everyday tasks, like playing music and finding information. Easy to use, they are embraced by older adults, including those with limited physical function, vision, or computer literacy.
According to Elizabeth (2018: 5), cited by Creswell and Plano (2007: 147), the mixed method is one of the research designs with philosophical assumptions as well as methods of inquiry. As a method, it focuses on collecting, analyzing, and mixing both quantitative and qualitative data in a single study.
Part III: Chapter 7: Reporting the Results of Mixed Method Evaluations. Chapter 7. Reporting the Results of Mixed Method Evaluations. The final task the evaluator is required to perform is to summarize what the team has done, what has been learned, and how others might benefit from this project s experience. As a rule, NSF grantees are expected ...
Describe any limitation of one method associated with the present of the other method Discussion pg. 15-17 Describe any insights gained from mixing or integrating methods Discussion: pg. 15-17 O'Cathain A, Murphy E, Nicholl J. The quality of mixed methods studies in health services research. J Health Serv Res Policy. 2008;13: 92-98.
3. Systematic Development of Standards for Mixed Methods Reporting in Rehabilitation Health Sciences Research. 4. Initial Standardized Framework for Reporting Social Media Analytics in Emergency Care Research. 5. CONFERD-HP : recommendations for reporting COmpeteNcy FramEwoRk Development in health professions. 6.
If you are reporting a mixed methods research, you have to show how you mixed the research during various phases of the research, such as the design, data collection and reporting of the results ...
The seventh edition of the Publication Manual also includes content on mixed methods studies such as standards for journal article reporting, considerations for presenting the sequence of quantitative and qualitative studies, and recommendations for describing the integration of quantitative and qualitative aspects of the research throughout a ...
Good Reporting of A Mixed Methods Study (GRAMMS) Checklist. Guideline. Section: Page. Justification to use a mixed methods approach to the research question. Materials and Methods: p. 6. Articulation of the design in terms of purpose, priority, and sequence of methods. Materials and Methods: pp.6-. 10.
Abstract. Context: Mixed methods research involves the collection, analysis and integration of both qualitative and quantitative data in a single study. The benefits of a mixed methods approach are particularly evident when studying new questions or complex initiatives and interactions, which is often the case in medical education research.
Guidelines for conducting and reporting mixed research in the field of counseling and beyond; 586; Reporting experiments in homeopathic basic research (REHBaR) - a detailed guideline for authors; 587; Guidelines for the design, conduct and reporting of human intervention studies to evaluate the health benefits of foods; 588
Tovin MM, Wormley ME. Systematic Development of Standards for Mixed Methods Reporting in Rehabilitation Health Sciences Research. Phys Ther. 2023. Language: English. PubMed ID: 37672215. Reporting guideline acronym: MMR-RHS. Study design: mixed methods studies. Clinical area: occupational therapy, physiotherapy, rehabilitation medicine.
Methods: A total of 655 individuals were enrolled in this sequential, explanatory mixed-methods study using systematic random sampling between October 2020 and May 2021. Quantitative survey data from 533 female patients presenting for care at KCMC ED or RHC were analyzed to compare sociodemographics and alcohol use practices among pregnant ...
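The systematic random sampling mentioned above can be sketched as follows. This is a minimal illustration of the general technique, not the cited study's actual procedure; the sampling frame and sample size here are hypothetical stand-ins.

```python
import random

def systematic_sample(population, n):
    """Draw a systematic random sample of size n:
    choose a random start within the first interval,
    then take every k-th element thereafter."""
    k = len(population) // n           # sampling interval
    start = random.randrange(k)        # random start in [0, k)
    return [population[start + i * k] for i in range(n)]

# Hypothetical sampling frame of 655 patient records
patients = [f"patient_{i:03d}" for i in range(655)]
sample = systematic_sample(patients, 533)
print(len(sample))  # 533
```

Unlike simple random sampling, every selected unit after the random start is determined by the fixed interval k, which makes the draw easy to execute from an ordered patient list.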
This article provides guidelines for conducting, reporting, and evaluating mixed research studies in 3 sections: research formulation, research planning, and research implementation. To date, no such guidelines are available. Detailed descriptions of each subsection are included.
1. Introduction. Concept mapping is a type of mixed-methods study design in which participants generate and prioritize ideas to develop an understanding of a complex phenomenon [1,2]. Concept mapping research has six phases: preparation, brainstorming (idea generation), structuring of the statements (prioritization and ranking), representation of statements, interpretation of maps, and utilization of maps.
Despite the growth in the number of bibliometric analyses published in the peer-reviewed literature, few articles provide guidance on methods and reporting to ensure reliability, robustness, and reproducibility. Consequently, the quality of reporting in existing bibliometric studies varies greatly. In response, we are developing a preliminary Guidance List for the repOrting of Bibliometric AnaLyses.
1. Describe the justification for using a mixed methods approach to the research question (p. 5)
2. Describe the design in terms of the purpose, priority and sequence of methods (pp. 5-8)
3. Describe each method in terms of sampling, data collection and analysis (p. 6)
4. Describe where integration has occurred, how it has occurred and who has participated in it
Expertise in qualitative and/or mixed methods research is necessary for qualitative and mixed methods design, data analysis, data integration, and reporting of results. Optimally, the individual(s) with qualitative and/or mixed methods expertise should have a track record in successfully conducting and reporting studies using these approaches.
Introduction: A patient registry database is an important tool for addressing a wide range of research questions. Several countries have established nationwide melanoma registry databases. However, there is no report summarising and comparing these databases. This scoping review aims to answer a broad question: how are contemporary nationwide melanoma registry databases conducted across countries?
Methods: This study used a mixed methods design with a cross-sectional survey and qualitative workshops and interviews between October 2023 and May 2024. Survey data were collected from clinicians involved in emergency stroke care.
MMR standards exist; yet there is a need for reporting guidelines and an appraisal tool that meets field standards, is applicable across rehabilitation fields of study, and can accommodate the range of possibilities for combining research approaches and methods. Methods: Mixed Methods Reporting in Rehabilitation & Health Sciences (MMR-RHS) was developed.
In addition to the general JARS-Qual guidelines, the Working Group has developed standards for both qualitative meta-analysis and mixed methods research.
Mixed methods research (JARS-Mixed). Additionally, the APA Style Journal Article Reporting Standards for Race, Ethnicity, and Culture (JARS-REC) provide guidance on how to discuss race, ethnicity, and culture in scientific manuscripts. JARS-REC should be applied to all research, whether it is quantitative, qualitative, or mixed methods.
Study Design. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed methods reports).