
Buttoning up research: How to present and visualize qualitative data


There is no doubt that data visualization is an important part of the qualitative research process. Whether you're preparing a presentation or writing up a report, effective visualizations can help make your findings clear and understandable for your audience. 

In this blog post, we'll discuss some tips for creating effective visualizations of qualitative data. 

First, let's take a closer look at what exactly qualitative data is.

What is qualitative data?

Qualitative data is information gathered through observation, questionnaires, and interviews. It's often subjective, meaning that the researcher has to interpret it to draw meaningful conclusions from it. 

The difference between qualitative data and quantitative data

When researchers use the terms qualitative and quantitative, they're referring to two different types of data. Qualitative data is subjective and descriptive, while quantitative data is objective and numerical.

Qualitative data is often used in research involving psychology or sociology, where a researcher may be trying to identify patterns or concepts related to people's behavior or attitudes. Quantitative data, by contrast, is more common in research involving economics or finance, where the focus is on numerical values such as price points or profit margins.

Before we delve into how best to present and visualize qualitative data, it's important to highlight how to gather this data in the first place.


How best to gather qualitative data

To create an effective visualization of qualitative data, first ensure that the right kind of information has been gathered.

Here are six ways to gather the most accurate qualitative data:

  • Define your research question: What data are you setting out to collect? A qualitative research question is a clear statement about a condition to be improved, a project's area of concern, a troubling question that exists, or a difficulty to be eliminated. It not only defines who the participants will be but also guides the data collection methods needed to achieve the most detailed responses.
  • Determine the best data collection method(s): The data collected should be appropriate for answering the research question. Common qualitative data collection methods include interviews, focus groups, observations, and document analysis. Consider the strengths and weaknesses of each option before deciding which is best suited to answer the research question.
  • Develop a cohesive interview guide: Creating an interview guide allows researchers to ask more specific questions and encourages thoughtful responses from participants. It's important to design questions so that they are centered on the topic of discussion and elicit meaningful insight into the issue at hand. Avoid leading or biased questions that could influence participants' answers, and be aware of cultural nuances that may affect their answers.
  • Stay neutral – let participants share their stories: The goal is to obtain useful information, not to influence the participant's answer. Allowing participants to express themselves freely will help to gather more honest and detailed responses. It's important to maintain a neutral tone throughout interviews and avoid judgment or opinions while they are sharing their story.
  • Work with at least one additional team member when conducting qualitative research: Participants should always feel comfortable while providing feedback on a topic, so it can be helpful to have an extra team member present during the interview process – particularly if this person is familiar with the topic being discussed. This will ensure that the atmosphere of the interview remains respectful and encourages participants to speak openly and honestly.
  • Analyze your findings: Once all of the data has been collected, analyze it to draw meaningful conclusions. Use tools such as qualitative coding or content analysis to identify patterns or themes in the data, then compare them with prior research or other data sources. This will help to draw more accurate and useful insights from the results.
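
To make the coding-and-analysis step above concrete, here is a minimal Python sketch of tallying coded segments to surface candidate themes. The transcript snippets and code labels are hypothetical, invented purely for illustration:

```python
from collections import Counter

# Hypothetical coded interview segments: each snippet of transcript text
# has been assigned one or more short, descriptive codes by the researcher.
coded_segments = [
    ("I never know where to find the export button", ["navigation", "frustration"]),
    ("Support got back to me within an hour", ["support", "satisfaction"]),
    ("The onboarding emails were confusing", ["onboarding", "frustration"]),
    ("I'd pay more if search actually worked", ["search", "frustration"]),
]

# Tally how often each code appears across segments to surface candidate themes.
code_counts = Counter(code for _, codes in coded_segments for code in codes)

for code, count in code_counts.most_common():
    print(f"{code}: {count}")  # "frustration" leads with 3 mentions
```

A real project would have hundreds of segments, but the principle is the same: frequency counts like these become the starting point for comparison against prior research, not the conclusion itself.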

By following these steps, you will be well-prepared to collect and analyze qualitative data for your research project. Next, let's focus on how best to present the qualitative data that you have gathered and analyzed.


How to visually present qualitative data

When it comes to how to present qualitative data visually, the goal is to make research findings clear and easy to understand. To do this, use visuals that are both attractive and informative. 

Presenting qualitative data visually helps to bring the user’s attention to specific items and draw them into a more in-depth analysis. Visuals provide an efficient way to communicate complex information, making it easier for the audience to comprehend. 

Additionally, visuals can help engage an audience by making a presentation more interesting and interactive.

Here are some tips for creating effective visuals from qualitative data:

  • Choose the right type of visualization: Consider which type of visual would best convey the story being told through the research. For example, bar charts or line graphs might be appropriate for tracking changes over time, while pie charts or word clouds could help show patterns in categorical data.
  • Include contextual information: In addition to showing the actual numbers, include relevant background details that help the audience interpret them, such as the sample size, any anomalies that occurred during data collection, or other environmental factors.
  • Make it easy to understand: Always keep visuals simple and avoid adding too much detail or complexity. This will help ensure that viewers can quickly grasp the main points without getting overwhelmed by all of the information.
  • Use color strategically: Color can be used to draw attention to certain elements in your visual and make it easier for viewers to find the most important parts of it. Just be sure not to use too many different colors, as this could create confusion instead of clarity.
  • Use charts or whiteboards: Using charts or whiteboards can help to explain the data in more detail and get viewers engaged in a discussion. This type of visual tool can also be used to create storyboards that illustrate the data over time, helping to bring your research to life.
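
As a small companion to the tips above, here is a minimal Python sketch that turns raw note counts per theme into the percentage shares a pie chart or stacked bar chart would display. The theme names and counts are invented for illustration:

```python
# Hypothetical note counts per theme, e.g. exported from an analysis tool.
theme_counts = {"pricing": 12, "onboarding": 30, "support": 18}

total = sum(theme_counts.values())

# Convert raw counts into percentage shares, ready to feed a pie or bar chart.
theme_shares = {
    theme: round(100 * count / total, 1) for theme, count in theme_counts.items()
}

print(theme_shares)  # {'pricing': 20.0, 'onboarding': 50.0, 'support': 30.0}
```

Percentages like these are what make a categorical visual honest: the audience can see at a glance that half of the coded notes concern onboarding, with the sample size (60 notes here) reported alongside for context.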


Visualizing qualitative data in Notably

Notably helps researchers visualize their data through a flexible canvas, charts, and evidence-based insights. As an all-in-one research platform, Notably enables researchers to collect, analyze, and present qualitative data effectively.

Notably provides an intuitive interface for analyzing data from a variety of sources, including interviews, surveys, desk research, and more. Its powerful analytics engine then helps you to quickly identify insights and trends in your data. Finally, the platform makes it easy to create beautiful visuals that will help to communicate research findings with confidence.

Research Frameworks in Analysis

The canvas in Analysis is a multi-dimensional workspace to play with your data spatially to find likeness and tension. Here, you may use a grounded theory approach to drag and drop notes into themes or patterns that emerge in your research. Canvas tools such as shapes, lines, and images allow researchers to build out frameworks such as journey maps, empathy maps, and 2x2s to help synthesize their data.

Going one step further, you may begin to apply various lenses to this data-driven canvas. For example, recoloring by sentiment shows where pain points may be distributed across your customer journey, while recoloring by participant may reveal whether a single participant is creating a bias toward a particular theme.


Exploring Qualitative Data through a Quantitative Lens

Once you have begun your analysis, you may visualize your qualitative data in a quantitative way through charts. You may choose between a pie chart and a stacked bar chart. From there, you can segment your data to break down each bar in your bar chart, or each slice in your pie chart, one step further.

To segment your data, you can choose between ‘Tag group’, ‘Tag’, ‘Theme’, and ‘Participant’. Each group shows up as its own bar in the bar chart or slice in the pie chart. For example, try grouping data as ‘Participant’ to see the volume of notes assigned to each person. Or, group by ‘Tag group’ to see which of your tag groups have the most notes.

How you’ve grouped or segmented your charts affects the options available to color them. Charts use colors that are a mix of sentiment, tag, theme, and default colors. Consider color as a way of assigning another layer of meaning to your data. For example, choose red for tags or themes that represent areas of friction or pain points, and blue for tags that represent opportunities.


AI Powered Insights and Cover Images

One of the most powerful features in Analysis is the ability to generate insights with AI. Insights combine information, inspiration, and intuition to help bridge the gap between knowledge and wisdom. Even before you have any tags or themes, you may generate an AI Insight from your entire data set. You'll be able to choose one of our AI Insight templates, inspired by trusted design thinking frameworks, to stimulate generative and divergent thinking. With just the click of a button, you'll get an insight that captures the essence and story of your research. You may experiment with a combination of tags, themes, and different templates, or create your own custom AI template. These insights are all evidence-based and centered on the needs of real people. You may package these insights up to present your research by embedding videos and quotes and using AI to generate a unique cover image.


You can sign up to run an end-to-end research project for free and receive tips on how to make the most out of your data. Want to chat about how Notably can help your team do better, faster research? Book some time for a 1:1 demo with your whole team.


The Ultimate Guide to Qualitative Research - Part 3: Presenting Qualitative Data


In the end, presenting qualitative research findings is just as important a skill as mastery of qualitative research methods for data collection and analysis. Simply uncovering insights is not enough; presenting a qualitative analysis holds the challenge of persuading your audience of the value of your research. As a result, it's worth spending some time considering how best to report your research to facilitate its contribution to scientific knowledge.


When it comes to research, presenting data in a meaningful and accessible way is as important as gathering it. This is particularly true for qualitative research, where the richness and complexity of the data demand careful and thoughtful presentation. Poorly written research is taken less seriously and left undiscussed by the greater scholarly community; quality research reporting that persuades its audience stands a greater chance of being incorporated in discussions of scientific knowledge.

Qualitative data presentation differs fundamentally from that found in quantitative research. While quantitative data tend to be numerical and easily lend themselves to statistical analysis and graphical representation, qualitative data are often textual and unstructured, requiring an interpretive approach to bring out their inherent meanings. Regardless of the methodological approach, the ultimate goal of data presentation is to communicate research findings effectively to an audience so they can incorporate the generated knowledge into their research inquiry.

As the section on research rigor will suggest, an effective presentation of your research depends on a thorough scientific process that organizes raw data into a structure that allows for a thorough analysis for scientific understanding.

Preparing the data

The first step in presenting qualitative data is preparing the data. This preparation process often begins with cleaning and organizing the data. Cleaning involves checking the data for accuracy and completeness, removing any irrelevant information, and making corrections as needed. Organizing the data often entails arranging the data into categories or groups that make sense for your research framework.


Coding the data

Once the data are cleaned and organized, the next step is coding, a crucial part of qualitative data analysis. Coding involves assigning labels to segments of the data to summarize or categorize them. This process helps to identify patterns and themes in the data, laying the groundwork for subsequent data interpretation and presentation. Qualitative research often involves multiple iterations of coding, creating new and meaningful codes while discarding unnecessary ones, to generate a rich structure through which data analysis can occur.
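
As a toy illustration of this iterative recoding, the Python sketch below merges near-duplicate codes under one label and discards unnecessary ones in a second pass. All segment IDs, code labels, and merge decisions are invented for the example:

```python
# Hypothetical first-pass codes assigned to transcript segments.
segment_codes = {
    "seg1": ["price too high", "cost"],
    "seg2": ["cost", "hidden fees"],
    "seg3": ["slow support"],
}

# In a later coding pass, near-duplicate codes are merged under one label,
# and codes that proved unnecessary are dropped.
merge_map = {"price too high": "cost", "hidden fees": "cost"}
discard = {"slow support"}  # e.g. out of scope for this research question

recoded = {
    seg: sorted({merge_map.get(code, code) for code in codes if code not in discard})
    for seg, codes in segment_codes.items()
}

print(recoded)  # {'seg1': ['cost'], 'seg2': ['cost'], 'seg3': []}
```

The point is not the code itself but the discipline it models: each iteration leaves an explicit, reviewable record of which codes were merged or dropped and why.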

Uncovering insights

As you navigate through these initial steps, keep in mind the broader aim of qualitative research, which is to provide rich, detailed, and nuanced understandings of people's experiences, behaviors, and social realities. These guiding principles will help to ensure that your data presentation is not only accurate and comprehensive but also meaningful and impactful.


While this process might seem intimidating at first, it's an essential part of any qualitative research project. It's also a skill that can be learned and refined over time, so don't be discouraged if you find it challenging at first. Remember, the goal of presenting qualitative data is to make your research findings accessible and understandable to others. This requires careful preparation, a clear understanding of your data, and a commitment to presenting your findings in a way that respects and honors the complexity of the phenomena you're studying.

In the following sections, we'll delve deeper into how to create a comprehensive narrative from your data, the visualization of qualitative data, and the writing and publication processes. Let's briefly excerpt some of the content in the articles in this part of the guide.


How often do you read a research article and skip straight to the tables and figures? That's because data visualizations representing qualitative and quantitative data have the power to make large and complex research projects with thousands of data points comprehensible when authors present data to research audiences. Researchers create visual representations to help summarize the data generated from their study and make clear the pathways for actionable insights.

In everyday situations, a picture is always worth a thousand words. Illustrations, figures, and charts convey messages that words alone cannot. In research, data visualization can help explain scientific knowledge, evidence for data insights, and key performance indicators in an orderly manner based on data that is otherwise unstructured.


For all of the various data formats available to researchers, a significant portion of qualitative and social science research is still text-based. Essays, reports, and research articles still rely on writing practices aimed at repackaging research in prose form. This can create the impression that simply writing more will persuade research audiences. Instead, framing research in terms that are easy for your target readers to understand makes it easier for your research to become published in peer-reviewed scholarly journals or find engagement at scholarly conferences. Even in market or professional settings, data visualization is an essential concept when you need to convince others about the insights of your research and the recommendations you make based on the data.

Importance of data visualization

Data visualization is important because it makes it easy for your research audience to understand your data sets and your findings. Also, data visualization helps you organize your data more efficiently. As the explanation of ATLAS.ti's tools will illustrate in this section, data visualization might point you to research inquiries that you might not even be aware of, helping you get the most out of your data. Strictly speaking, the primary role of data visualization is to make the analysis of your data, if not the data itself, clear. Especially in social science research, data visualization makes it easy to see how researchers collect and analyze data.

Prerequisites for generating data visualizations

Data visualization is effective in explaining research to others only if the researcher or data scientist can make sense of the data in front of them. Traditional research with unstructured data usually calls for coding the data with short, descriptive codes that can be analyzed later, whether statistically or thematically. These codes form the basic data points of a meaningful qualitative analysis. They represent the structure of qualitative data sets, without which a scientific visualization with research rigor would be extremely difficult to achieve. In most respects, data visualization of a qualitative research project requires coding the entire data set so that the codes adequately represent the collected data.

A successfully crafted research study culminates in the writing of the research paper. While a pilot study or preliminary research might guide the research design, a full research study leads to discussion that highlights avenues for further research. As such, the importance of the research paper cannot be overestimated in the overall generation of scientific knowledge.


The physical and natural sciences tend to have a clinical structure for a research paper that mirrors the scientific method: outline the background research, explain the materials and methods of the study, outline the research findings generated from data analysis, and discuss the implications. Qualitative research tends to preserve much of this structure, but there are so many notable variations from a traditional research paper that it's worth emphasizing the flexibility of the social sciences with respect to the writing process.

Requirements for research writing

While there aren't any hard and fast rules about what belongs in a qualitative research paper, readers expect to find a number of pieces of relevant information in a rigorously written report. The best way to know what belongs in a full research paper is to look at articles in your target journal, or articles on a topic similar to yours, and examine how successfully published papers are written.

It's important to emphasize the more mundane but equally important concerns of proofreading and formatting guidelines commonly found when you write a research paper. Research publication shouldn't strictly be a test of one's writing skills, but acknowledging the importance of convincing peer reviewers of the credibility of your research means accepting the responsibility of preparing your research manuscript to commonly accepted standards in research.

As a result, seemingly insignificant things such as spelling mistakes, page numbers, and proper grammar can make a difference with a particularly strict reviewer. Even when you expect to develop a paper through reviewer comments and peer feedback, your manuscript should be as close to a polished final draft as you can make it prior to submission.

Qualitative researchers face particular challenges in convincing their target audience of the value and credibility of their analysis. Numbers and quantifiable concepts in quantitative studies are relatively easier to understand than their counterparts associated with qualitative methods. Think about how easy it is to make conclusions about the value of items at a store based on their prices, then imagine trying to compare those items based on their design, function, and effectiveness.

Qualitative research involves and requires these sorts of discussions. The goal of qualitative data analysis is to allow a qualitative researcher and their audience to make such determinations, but before the audience can accept these determinations, the process of conducting research that produces the qualitative analysis must first be seen as trustworthy. As a result, it is on the researcher to persuade their audience that their data collection process and subsequent analysis is rigorous.

Qualitative rigor refers to the meticulousness, consistency, and transparency of the research. It is the application of systematic, disciplined, and stringent methods to ensure the credibility, dependability, confirmability, and transferability of research findings. In qualitative inquiry, these attributes ensure the research accurately reflects the phenomenon it is intended to represent, that its findings can be understood or used by others, and that its processes and results are open to scrutiny and validation.

Transparency

It is easier to believe the information presented to you if there is a rigorous analysis process behind that information, and if that process is explicitly detailed. The same is true for qualitative research results, making transparency a key element in qualitative research methodologies. Transparency is a fundamental aspect of rigor in qualitative research. It involves the clear, detailed, and explicit documentation of all stages of the research process. This allows other researchers to understand, evaluate, replicate, and build upon the study. Transparency in qualitative research is essential for maintaining rigor, trustworthiness, and ethical integrity. By being transparent, researchers allow their work to be scrutinized, critiqued, and improved upon, contributing to the ongoing development and refinement of knowledge in their field.

Research papers are only as useful as the audience they reach in the scientific community. To reach that audience, a paper needs to pass the peer review process of an academic journal. However, the idea of having research published in peer-reviewed journals may seem daunting to newer researchers, so it's important to provide a guide on how an academic journal looks at your research paper as well as how to determine the right journal for your research.


In simple terms, a research article is good if it is accepted as credible and rigorous by the scientific community. A study that isn't seen as a valid contribution to scientific knowledge shouldn't be published; ultimately, it is up to peers within the field in which the study is being considered to determine the study's value. In established academic research, this determination is manifest in the peer review process. Journal editors at a peer-reviewed journal assign papers to reviewers who will determine the credibility of the research. A peer-reviewed article that has completed this process and been published in a reputable journal can be seen as credible, novel research with the potential to make a profound contribution to scientific knowledge.

The process of research publication

The process has been codified and standardized within the scholarly community to include three main stages. These stages include the initial submission stage where the editor reviews the relevance of the paper, the review stage where experts in your field offer feedback, and, if reviewers approve your paper, the copyediting stage where you work with the journal to prepare the paper for inclusion in their journal.

Publishing a research paper may seem like an opaque process where those involved with academic journals make arbitrary decisions about the worthiness of research manuscripts. In reality, reputable publications assign a rubric or a set of guidelines that reviewers need to keep in mind when they review a submission. These guidelines will most likely differ depending on the journal, but they fall into a number of typical categories that are applicable regardless of the research area or the type of methods employed in a research study, including the strength of the literature review , rigor in research methodology , and novelty of findings.

Choosing the right journal isn't simply a matter of which journal is the most famous or has the broadest reach. Many universities keep lists of prominent journals where graduate students and faculty members should publish a research paper, but oftentimes this list is determined by a journal's impact factor and its inclusion in major academic databases.




Chapter 20. Presentations

Introduction

If a tree falls in a forest, and no one is around to hear it, does it make a sound? If a qualitative study is conducted, but it is not presented (in words or text), did it really happen? Perhaps not. Findings from qualitative research are inextricably tied up with the way those findings are presented. These presentations do not always need to be in writing, but they need to happen. Think of ethnographies, for example, and their thick descriptions of a particular culture. Witnessing a culture, taking fieldnotes, talking to people—none of those things in and of themselves convey the culture. Or think about an interview-based phenomenological study. Boxes of interview transcripts might be interesting to read through, but they are not a completed study without the intervention of hours of analysis and careful selection of exemplary quotes to illustrate key themes and final arguments and theories. And unlike much quantitative research in the social sciences, where the final write-up neatly reports the results of analyses, the way the “write-up” happens is an integral part of the analysis in qualitative research. Once again, we come back to the messiness and stubborn unlinearity of qualitative research. From the very beginning, when designing the study, imagining the form of its ultimate presentation is helpful.

Because qualitative researchers are motivated by understanding and conveying meaning, effective communication is not only an essential skill but a fundamental facet of the entire research project. Ethnographers must be able to convey a certain sense of verisimilitude, the appearance of true reality. Those employing interviews must faithfully depict the key meanings of the people they interviewed in a way that rings true to those people, even if the end result surprises them. And all researchers must strive for clarity in their publications so that various audiences can understand what was found and why it is important. This chapter will address how to organize various kinds of presentations for different audiences so that your results can be appreciated and understood.

In the world of academic science, social or otherwise, the primary audience for a study’s results is usually the academic community, and the primary venue for communicating to this audience is the academic journal. Journal articles are typically fifteen to thirty pages in length (8,000 to 12,000 words). Although qualitative researchers often write and publish journal articles—indeed, there are several journals dedicated entirely to qualitative research [1] —the best writing by qualitative researchers often shows up in books. This is because books, running from 80,000 to 150,000 words in length, allow the researcher to develop the material fully. You have probably read some of these in various courses you have taken, not realizing what they are. I have used examples of such books throughout this text, beginning with the three profiles in the introductory chapter. In some instances, the chapters in these books began as articles in academic journals (another indication that the journal article format somewhat limits what can be said about the study overall).

While the article and the book are “final” products of qualitative research, there are actually a few other presentation formats that are used along the way. At the very beginning of a research study, it is often important to have a written research proposal not just to clarify to yourself what you will be doing and when but also to justify your research to an outside agency, such as an institutional review board (IRB; see chapter 12), or to a potential funder, which might be your home institution, a government funder (such as the National Science Foundation, or NSF), or a private foundation (such as the Gates Foundation). As you get your research underway, opportunities will arise to present preliminary findings to audiences, usually through presentations at academic conferences. These presentations can provide important feedback as you complete your analyses. Finally, if you are completing a degree and looking to find an academic job, you will be asked to provide a “job talk,” usually about your research. These job talks are similar to conference presentations but can run significantly longer.

All the presentations mentioned so far are (mostly) for academic audiences. But qualitative research is also unique in that many of its practitioners don’t want to confine their presentation only to other academics. Qualitative researchers who study particular contexts or cultures might want to report back to the people and places they observed. Those working in the critical tradition might want to raise awareness of a particular issue to as large an audience as possible. Many others simply want everyday, nonacademic people to read their work, because they think it is interesting and important. To reach a wide audience, the final product can look like almost anything—it can be a poem, a blog, a podcast, even a science fiction short story. And if you are very lucky, it can even be a national or international bestseller.

In this chapter, we are going to stick with the more basic quotidian presentations—the academic paper / research proposal, the conference slideshow presentation / job talk, and the conference poster. We’ll also spend a bit of time on incorporating universal design into your presentations and how to create some especially attractive and impactful visual displays.

Researcher Note

What is the best piece of advice you’ve ever been given about conducting qualitative research?

The best advice I’ve received came from my adviser, Alford Young Jr. He told me to find the “Jessi Streib” answer to my research question, not the “Pierre Bourdieu” answer to my research question. In other words, don’t just say how a famous theorist would answer your question; say something original, something coming from you.

—Jessi Streib, author of The Power of the Past and Privilege Lost 

Writing about Your Research

The journal article and the research proposal.

Although the research proposal is written before you have actually done your research and the article is written after all data collection and analysis is complete, there are actually many similarities between the two in terms of organization and purpose. The final article will probably (depending on how much the research question and focus have shifted during the research itself) incorporate a great deal of what was included in a preliminary research proposal. The average lengths of both a proposal and an article are quite similar, with the “front sections” of the article abbreviated to make space for the findings, discussion of findings, and conclusion.

Section                                                        Proposal   Article
Introduction                                                   20%        10%
  Formal abstract with keywords                                —          300
  Overview                                                     300        300
  Topic and purpose                                            200        200
  Significance                                                 200        200
  Framework and general research questions                     100        200
  Limitations                                                  100        —
Literature Review                                              30%        10%
  Theory grounding/framing the research question or issue      500        350
  Review of relevant literature and prior empirical research   1,000      650
Design and Methodology                                         50%        20%
  Overall approach and fit to research question                250        200
  Case, site, or population selection and sampling strategies  500        400
  Access, role, reciprocity, trust, rapport issues             200        150
  Reflective biography/situation of self                       200        200
  Ethical and political considerations                         200        200
  Data collection methods                                      500        400
  Data management plan                                         200        —
  Timeline                                                     100        —
  Data analysis procedures                                     250        250
  Steps taken to ensure reliability, trustworthiness,
    and credibility                                            100        200
Findings/Discussion                                            0%         45%
  Themes and patterns; examples                                —          3,000
  Discussion of findings (tying to theory and lit review)      —          1,500
Final sections                                                 0%         15%
  Limitations                                                  —          500
  Conclusion                                                   —          1,000
TOTAL WORDS                                                    5,000      10,000

Fig 20.1. Sections to Include in a Research Proposal and a Journal Article, with Default Word Counts

Figure 20.1 shows one model for what to include in an article or research proposal, comparing the elements of each with a default word count for each section. Please note that you will want to follow whatever specific guidelines you have been provided by the venue you are submitting the article/proposal to: the IRB, the NSF, the Journal of Qualitative Research. In fact, I encourage you to adapt the default model as needed by swapping out expected word counts for each section and adding or varying the sections to match expectations for your particular publication venue. [2]

You will notice a few things about the default model guidelines. First, while half of the proposal is spent discussing the research design, this section is shortened (but still included) for the article. There are a few elements that only show up in the proposal (e.g., the limitations section is in the introductory section here—it will be more fully developed in the conclusory section in the article). Obviously, you don’t have findings in the proposal, so this is an entirely new section for the article. Note that the article does not include a data management plan or a timeline—two aspects that most proposals require.

It might be helpful to find and maintain examples of successfully written sections that you can use as models for your own writing. I have included a few of these throughout the textbook and have included a few more at the end of this chapter.

Make an Argument

Some qualitative researchers, particularly those engaged in deep ethnographic research, focus their attention primarily if not exclusively on describing the data. They might even eschew the notion that they should make an “argument” about the data, preferring instead to use thick descriptions to convey interpretations. Bracketing the contrast between interpretation and argument for the moment, most readers will expect you to provide an argument about your data, and this argument will be in answer to whatever research question you eventually articulate (remember, research questions are allowed to shift as you get further into data collection and analysis). It can be frustrating to read a well-developed study with clear and elegant descriptions and no argument. The argument is the point of the research, and if you do not have one, 99 percent of the time, you are not finished with your analysis. Calarco (2020) suggests you imagine a pyramid, with all of your data forming the basis and all of your findings forming the middle section; the top/point of the pyramid is your argument, “what the patterns in your data tell us about how the world works or ought to work” (181).

The academic community to which you belong will be looking for an argument that relates to or develops theory. This is the theoretical generalizability promise of qualitative research. An academic audience will want to know how your findings relate to previous findings, theories, and concepts (the literature review; see chapter 9). It is thus vitally important that you go back to your literature review (or develop a new one) and draw those connections in your discussion and/or conclusion. When writing to other audiences, you will still want an argument, although it may not be written as a theoretical one. What do I mean by that? Even if you are not referring to previous literature or developing new theories or adapting older ones, a simple description of your findings is like dumping a lot of leaves in the lap of your audience. They still deserve to know about the shape of the forest. Maybe provide them a road map through it. Do this by telling a clear and cogent story about the data. What is the primary theme, and why is it important? What is the point of your research? [3]

A beautifully written piece of research based on participant observation [and/or] interviews brings people to life, and helps the reader understand the challenges people face. You are trying to use vivid, detailed and compelling words to help the reader really understand the lives of the people you studied. And you are trying to connect the lived experiences of these people to a broader conceptual point—so that the reader can understand why it matters. (Lareau 2021:259)

Do not hide your argument. Make it the focal point of your introductory section, and repeat it as often as needed to ensure the reader remembers it. I am always impressed when I see researchers do this well (see, e.g., Zelizer 1996 ).

Here are a few other suggestions for writing your article:

  • Be brief. Do not overwhelm the reader with too many words; make every word count. Academics are particularly prone to “overwriting” as a way of demonstrating proficiency. Don’t.
  • When writing your methods section, think about it as a “recipe for your work” that allows other researchers to replicate it if they so wish (Calarco 2020:186). Convey all the necessary information clearly, succinctly, and accurately. No more, no less. [4]
  • Do not try to write from “beginning to end” in that order. Certain sections, like the introductory section, may be the last ones you write. I find the methods section the easiest, so I often begin there. Calarco (2020) begins with an outline of the analysis and results section and then works backward from there to outline the contribution she is making, then the full introduction that serves as a road map for the writing of all sections. She leaves the abstract for the very end. Find what order works best for you.

Presenting at Conferences and Job Talks

Students and faculty are primarily called upon to publicly present their research in two distinct contexts—the academic conference and the “job talk.” By convention, conference presentations usually run about fifteen minutes and, at least in sociology and other social sciences, rely primarily on a slideshow (PowerPoint, or PPT) presentation. You are usually one of three or four presenters scheduled on the same “panel,” so it is an important point of etiquette to ensure that your presentation falls within the allotted time and does not crowd into that of the other presenters. Job talks, on the other hand, conventionally require a forty- to forty-five-minute presentation with a fifteen- to twenty-minute question and answer (Q&A) session following it. You are the only person presenting, so if you run over your allotted time, it means less time for the Q&A, which can disturb some audience members who have been waiting for a chance to ask you something. It is sometimes possible to incorporate questions during your presentation, which allows you to take the entire hour, but you might end up shorting your presentation this way if the questions are numerous. It’s best for beginners to stick to the “ask me at the end” format (unless there is a simple clarifying question that can easily be addressed and makes the presentation run more smoothly, as in the case where you simply forgot to include information on the number of interviews you conducted).

For slideshows, you should allot two or even three minutes for each slide, never less than one minute. And those slides should be clear, concise, and limited. Most of what you say should not be on those slides at all. The slides are simply the main points or a clear image of what you are speaking about. Include bulleted points (words, short phrases), not full sentences. The exception is illustrative quotations from transcripts or fieldnotes. In those cases, keep to one illustrative quote per slide, and if it is long, bold or otherwise highlight the words or passages that are most important for the audience to notice. [5]

Figure 20.2 provides a possible model for sections to include in either a conference presentation or a job talk, with approximate times and approximate numbers of slides. Note the importance (in amount of time spent) of both the research design and the findings/results sections, both of which have been helpfully starred for you. Although you don’t want to short any of the sections, these two sections are the heart of your presentation.

 
Section                                    Conference          Job Talk
                                           Time      Slides    Time      Slides
Introduction                               .5 min    1         1 min     1
Lit Review (background/justification)      1-2 min   1         3-5 min   2
Research goals/questions                   1 min     1         1-2 min   1
Research design/data/methods**             2 min**   1         5 min**   2
Overview                                   1 min     1         3 min     1
Findings/results**                         4-8 min** 4-8       20 min**  4-6
Discussion/implications                    1 min     1         5 min     1
Thanks/References                          1 min     1         1 min     1

Fig 20.2. Suggested Slideshow Times and Number of Slides

Should you write out your script to read along with your presentation? I have seen this work well, as it prevents presenters from straying off topic and keeps them to the time allotted. On the other hand, these presentations can seem stiff and wooden. Personally, although I have a general script in advance, I like to speak a little more informally and engagingly with each slide, sometimes making connections with previous panelists if I am at a conference. This means I have to pay attention to the time, and I sometimes end up breezing through one section more quickly than I would like. Whatever approach you take, practice in advance. Many times. With an audience. Ask for feedback, and pay attention to any presentation issues that arise (e.g., Do you speak too fast? Are you hard to hear? Do you stumble over a particular word or name?).

Even though there are rules and guidelines for what to include, you will still want to make your presentation as engaging as possible in the little amount of time you have. Calarco (2020:274) recommends trying one of three story structures to frame your presentation: (1) the uncertain explanation, where you introduce a phenomenon that has not yet been fully explained and then describe how your research is tackling this; (2) the uncertain outcome, where you introduce a phenomenon where the consequences have been unclear and then you reveal those consequences with your research; and (3) the evocative example, where you start with some interesting example from your research (a quote from the interview transcripts, for example) or the real world and then explain how that example illustrates the larger patterns you found in your research. Notice that each of these is a framing story. Framing stories are essential regardless of format!

A Word on Universal Design

Please consider accessibility issues during your presentation, and incorporate elements of universal design into your slideshow. The basic idea behind universal design in presentations is that to the greatest extent possible, all people should be able to view, hear, or otherwise take in your presentation without needing special individual adaptations. If you can make your presentation accessible to people with visual impairment or hearing loss, why not do so? For example, one in twelve men is color-blind, unable to differentiate between certain colors, red/green being the most common problem. So if you design a graphic that relies on red and green bars, some of your audience members may not be able to properly identify which bar means what. Simple contrasts of black and white are much more likely to be visible to all members of your audience. There are many other elements of good universal design, but the basic foundation of all of them is that you consider how to make your presentation as accessible as possible at the outset. For example, include captions whenever possible, both for images on slides and for any audio or video clips you are including; keep font sizes large enough to read from the back of the room; and face the audience when you are speaking.

Poster Design

Undergraduate students who present at conferences are often encouraged to present at “poster sessions.” This usually means setting up a poster version of your research in a large hall or convention space at a set period of time—ninety minutes is common. Your poster will be one of dozens, and conference-goers will wander through the space, stopping intermittently at posters that attract them. Those who stop by might ask you questions about your research, and you are expected to be able to talk intelligently for two or three minutes. It’s a fairly easy way to practice presenting at conferences, which is why so many organizations hold these special poster sessions.


A good poster design will be immediately attractive to passersby and clearly and succinctly describe your research methods, findings, and conclusions. Some students have simply shrunk down their research papers to manageable sizes and then pasted them on a poster, all twelve to fifteen pages of them. Don’t do that! Here are some better suggestions: State the main conclusion of your research in large bold print at the top of your poster, on brightly colored (contrasting) paper, and paste in a QR code that links to your full paper online (Calarco 2020:280). Use the rest of the poster board to provide a couple of highlights and details of the study. For an interview-based study, for example, you will want to put in some details about your sample (including number of interviews) and setting and then perhaps one or two key quotes, also distinguished by contrasting color background.

Incorporating Visual Design in Your Presentations

In addition to ensuring that your presentation is accessible to as large an audience as possible, you also want to think about how to display your data in general, particularly how to use charts and graphs and figures. [6] The first piece of advice is, use them! As the saying goes, a picture is worth a thousand words. If you can cut to the chase with a visually stunning display, do so. But there are visual displays that are stunning, and then there are the tired, hard-to-see visual displays that predominate at conferences. You can do better than most presenters by simply paying attention here and committing yourself to a good design. As with model section passages, keep a file of visual displays that work as models for your own presentations. Find a good guidebook to presenting data effectively (Evergreen 2018, 2019; Schwabisch 2021), and refer to it often.

Let me make a few suggestions here to get you started. First, test every visual display on a friend or colleague to find out how quickly they can understand the point you are trying to convey. As with reading passages aloud to ensure that your writing works, showing someone your display is the quickest way to find out if it works.

Second, put the point in the title of the display! When writing for an academic journal, there will be specific conventions of what to include in the title (full description including methods of analysis, sample, dates), but in a public presentation, there are no limiting rules. So you are free to write as your title “Working-Class College Students Are Three Times as Likely as Their Peers to Drop Out of College,” if that is the point of the graphic display. It certainly helps the communicative aspect.

Third, use the themes available to you in Excel for creating graphic displays, but alter them to better fit your needs. Consider adding dark borders to bars and columns, for example, so that they appear crisper for your audience. Include data callouts and labels, and enlarge them so they are clearly visible. When duplicative or otherwise unnecessary, drop distracting gridlines and labels on the y-axis (the vertical one). Don’t go crazy adding different fonts, however—keep things simple and clear. Sans serif fonts (those without the little hooks on the ends of letters) read better from a distance. Try to use the same color scheme throughout, even if this means manually changing the colors of bars and columns. For example, when reporting on working-class college students, I use blue bars, while I reserve green bars for wealthy students and yellow bars for students in the middle. I repeat these colors throughout my presentations and incorporate different colors when talking about other items or factors. You can also try using simple grayscale throughout, with pops of color to indicate a bar or column or line that is of the most interest.

These are just some suggestions. The point is to take presentation seriously and to pay attention to the visual displays you are using to ensure they effectively communicate what you want them to communicate. I’ve included a data visualization checklist from Evergreen (2018) here.
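To make a few of these suggestions concrete (a point-first title, direct data labels instead of a y-axis, and grayscale bars with one pop of color), here is a minimal sketch. The chapter assumes you are working in Excel; this sketch uses Python's matplotlib instead simply as an illustration, and the dropout percentages are invented for demonstration, not real findings.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical percentages, invented purely for illustration
groups = ["Wealthy", "Middle", "Working-class"]
dropout_rates = [8, 11, 24]

fig, ax = plt.subplots(figsize=(6, 4))

# Grayscale bars with one pop of color on the bar of interest,
# plus dark borders so the bars appear crisper
colors = ["0.7", "0.7", "#1f77b4"]
bars = ax.bar(groups, dropout_rates, color=colors, edgecolor="black")

# Put the point in the title, not a generic description
ax.set_title("Working-class students are three times as likely to drop out")

# Label the data directly; the y-axis then becomes redundant
ax.bar_label(bars, fmt="%d%%", fontsize=12)
ax.yaxis.set_visible(False)
for spine in ("top", "right", "left"):
    ax.spines[spine].set_visible(False)

fig.savefig("dropout_sketch.png")
```

The same moves translate directly to Excel or any other charting tool: a descriptive title that states the finding, data callouts in place of gridlines and axis labels, and manually chosen, consistent bar colors.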

Ethics of Presentation and Reliability

Until now, all the data you have collected have been yours alone. Once you present the data, however, you are sharing sometimes very intimate information about people with a broader public. You will find yourself balancing between protecting the privacy of those you’ve interviewed and observed and needing to demonstrate the reliability of the study. The more information you provide to your audience, the more they can understand and appreciate what you have found, but this also may pose risks to your participants. There is no one correct way to go about finding the right balance. As always, you have a duty to consider what you are doing and must make some hard decisions.


The most obvious place we see this paradox emerge is when you mask your data to protect the privacy of your participants. It is standard practice to provide pseudonyms, for example. It is such standard practice that you should always assume you are being given a pseudonym when reading a book or article based on qualitative research. When I was a graduate student, I tried to find information on how best to construct pseudonyms but found little guidance. There are some ethical issues here, I think. [7] Do you create a name that has the same kind of resonance as the original name? If the person goes by a nickname, should you use a nickname as a pseudonym? What about names that are ethnically marked (as in, almost all of them)? Is there something unethical about reracializing a person? (Yes!) In her study of adolescent subcultures, Wilkins (2008) noted, “Because many of the goths used creative, alternative names rather than their given names, I did my best to reproduce the spirit of their chosen names” (24).

Your reader or audience will want to know all the details about your participants so that they can gauge both your credibility and the reliability of your findings. But how many details are too many? What if you change the name but otherwise retain all the personal pieces of information about where they grew up, and how old they were when they got married, and how many children they have, and whether they made a splash in the news cycle that time they were stalked by their ex-boyfriend? At some point, those details are going to tip over into the zone of potential unmasking. When you are doing research at one particular field site that may be easily ascertained (as when you interview college students, probably at the institution at which you are a student yourself), it is even more important to be wary of providing too many details. You also need to think that your participants might read what you have written, know things about the site or the population from which you drew your interviews, and figure out whom you are talking about. This can all get very messy if you don’t do more than simply pseudonymize the people you interviewed or observed.

There are some ways to do this. One, you can design a study with all of these risks in mind. That might mean choosing to conduct interviews or observations at multiple sites so that no one person can be easily identified. Another is to alter some basic details about your participants to protect their identity or to refuse to provide all the information when selecting quotes . Let’s say you have an interviewee named “Anna” (a pseudonym), and she is a twenty-four-year-old Latina studying to be an engineer. You want to use a quote from Anna about racial discrimination in her graduate program. Instead of attributing the quote to Anna (whom your reader knows, because you’ve already told them, is a twenty-four-year-old Latina studying engineering), you might simply attribute the quote to “Latina student in STEM.” Taking this a step further, you might leave the quote unattributed, providing a list of quotes about racial discrimination by “various students.”

The problem with masking all the identifiers, of course, is that you lose some of the analytical heft of those attributes. If it mattered that Anna was twenty-four (not thirty-four) and that she was a Latina and that she was studying engineering, taking out any of those aspects of her identity might weaken your analysis. This is one of those “hard choices” you will be called on to make! A rather radical and controversial solution to this dilemma is to create composite characters, characters based on the reality of the interviews but fully masked because they are not identifiable with any one person. My students are often very queasy about this when I explain it to them. The more positivistic your approach and the more you see individuals rather than social relationships/structure as the “object” of your study, the more employing composites will seem like a really bad idea. But composites “allow researchers to present complex, situated accounts from individuals” without disclosing personal identities (Willis 2019), and they can be effective ways of presenting theory narratively (Hurst 2019). Ironically, composites permit you more latitude when including “dirty laundry” or stories that could harm individuals if their identities became known. Rather than squeezing out details that could identify a participant, the identities are permanently removed from the details. Great difficulty remains, however, in clearly explaining the theoretical use of composites to your audience and providing sufficient information on the reliability of the underlying data.

There are a host of other ethical issues that emerge as you write and present your data. This is where being reflective throughout the process will help. How and what you share of what you have learned will depend on the social relationships you have built, the audiences you are writing or speaking to, and the underlying animating goals of your study. Be conscious about all of your decisions, and then be able to explain them fully, both to yourself and to those who ask.

Our research is often close to us. As a Black woman who is a first-generation college student and a professional with a poverty/working-class origin, each of these pieces of my identity creates nuances in how I engage in my research, including how I share it out. Because of this, it’s important for us to have people in our lives who we trust who can help us, particularly, when we are trying to share our findings. As researchers, we have been steeped in our work, so we know all the details and nuances. Sometimes we take this for granted, and we might not have shared those nuances in conversation or writing or taken some of this information for granted. As I share my research with trusted friends and colleagues, I pay attention to the questions they ask me or the feedback they give when we talk or when they read drafts.

—Kim McAloney, PhD, College Student Services Administration Ecampus coordinator and instructor

Final Comments: Preparing for Being Challenged

Once you put your work out there, you must be ready to be challenged. Science is a collective enterprise and depends on a healthy give and take among researchers. This can be both novel and difficult as you get started, but the more you understand the importance of these challenges, the easier it will be to develop the kind of thick skin necessary for success in academia. Scientists’ authority rests on both the inherent strength of their findings and their ability to convince other scientists of the reliability and validity and value of those findings. So be prepared to be challenged, and recognize this as simply another important aspect of conducting research!

Considering what challenges might be made as you design and conduct your study will help you when you get to the writing and presentation stage. Address probable challenges in your final article, and have a planned response to probable questions in a conference presentation or job talk. The following is a list of common challenges of qualitative research and how you might best address them:

  • Questions about generalizability . Although qualitative research is not statistically generalizable (and be prepared to explain why), qualitative research is theoretically generalizable. Discuss why your findings here might tell us something about related phenomena or contexts.
  • Questions about reliability . You probably took steps to ensure the reliability of your findings. Discuss them! This includes explaining the use and value of multiple data sources and defending your sampling and case selections. It also means being transparent about your own position as researcher and explaining steps you took to ensure that what you were seeing was really there.
  • Questions about replicability. Although qualitative research cannot strictly be replicated because the circumstances and contexts will necessarily be different (if only because the point in time is different), you should be able to provide as much detail as possible about how the study was conducted so that another researcher could attempt to confirm or disconfirm your findings. Also, be very clear about the limitations of your study, as this allows other researchers insight into what future research might be warranted.

None of this is easy, of course. Writing beautifully and presenting clearly and cogently require skill and practice. If you take anything from this chapter, it is to remember that presentation is an important and essential part of the research process and to allocate time for this as you plan your research.

Data Visualization Checklist for Slideshow (PPT) Presentations

Adapted from Evergreen (2018)

Text checklist

  • Short, catchy, descriptive titles (e.g., “Working-class students are three times as likely to drop out of college”) summarize the point of the visual display
  • Subtitles and annotations provide additional information (e.g., “note: male students also more likely to drop out”)
  • Text size is hierarchical and readable (titles are largest; axis labels smallest, but at least 20 points)
  • Text is horizontal. Audience members cannot read vertical text!
  • All data labeled directly and clearly: get rid of those “legends” and embed the data in your graphic display
  • Labels are used sparingly; avoid redundancy (e.g., do not include both a number axis and a number label)

Arrangement checklist

  • Proportions are accurate; bar charts should always start at zero; don’t mislead the audience!
  • Data are intentionally ordered (e.g., by frequency counts). Do not leave ragged alphabetized bar graphs!
  • Axis intervals are equidistant: spaces between axis intervals should be the same unit
  • Graph is two-dimensional. Three-dimensional and “beveled” displays are confusing
  • There is no unwanted decoration (especially the kind that comes automatically through the PPT “theme”). Decoration wastes space and confuses the audience.

Color checklist

  • There is an intentional color scheme (do not use default theme)
  • Color is used to identify key patterns (e.g., highlight one bar in red against six others in greyscale if this is the bar you want the audience to notice)
  • Color is still legible when printed in black and white
  • Color is legible for people with color blindness (do not use red/green or yellow/blue combinations)
  • There is sufficient contrast between text and background (black text on white background works best; be careful of white on dark!)

Lines checklist

  • Be wary of using gridlines; if you do, mute them (grey, not black)
  • Allow graph to bleed into surroundings (don’t use border lines)
  • Remove axis lines unless absolutely necessary (better to label directly)

Overall design checklist

  • The display highlights a significant finding or conclusion that your audience can “see” relatively quickly
  • The type of graph (e.g., bar chart, pie chart, line graph) is appropriate for the data. Avoid pie charts with more than three slices!
  • Graph has an appropriate level of precision; drop decimal places you don’t need
  • All the chart elements work together to reinforce the main message

Universal Design Checklist for Slideshow (PPT) Presentations

  • Include both verbal and written descriptions (e.g., captions on slides); consider providing a hand-out to accompany the presentation
  • Microphone available (ask audience in back if they can clearly hear)
  • Face audience; allow people to read your lips
  • Turn on captions when presenting audio or video clips
  • Adjust light settings for visibility
  • Speak slowly and clearly; practice articulation; don’t mutter or speak under your breath (even if you have something humorous to say – say it loud!)
  • Use Black/White contrasts for easy visibility; or use color contrasts that are real contrasts (do not rely on people being able to differentiate red from green, for example)
  • Use easy to read font styles and avoid too small font sizes: think about what an audience member in the back row will be able to see and read.
  • Keep your slides simple: do not overclutter them; if you are including quotes from your interviews, take short evocative snippets only, and bold key words and passages. You should also read aloud each passage, preferably with feeling!

Supplement: Models of Written Sections for Future Reference

Data Collection Section Example

Interviews were semi structured, lasted between one and three hours, and took place at a location chosen by the interviewee. Discussions centered on four general topics: (1) knowledge of their parent’s immigration experiences; (2) relationship with their parents; (3) understanding of family labor, including language-brokering experiences; and (4) experiences with school and peers, including any future life plans. While conducting interviews, I paid close attention to respondents’ nonverbal cues, as well as their use of metaphors and jokes. I conducted interviews until I reached a point of saturation, as indicated by encountering repeated themes in new interviews (Glaser and Strauss 1967). Interviews were audio recorded, transcribed with each interviewee’s permission, and conducted in accordance with IRB protocols. Minors received permission from their parents before participation in the interview. ( Kwon 2022:1832 )

Justification of Case Selection / Sample Description Section Example

Looking at one profession within one organization and in one geographic area does impose limitations on the generalizability of our findings. However, it also has advantages. We eliminate the problem of interorganizational heterogeneity. If multiple organizations are studied simultaneously, it can make it difficult to discern the mechanisms that contribute to racial inequalities. Even with a single occupation there is considerable heterogeneity, which may make understanding how organizational structure impacts worker outcomes difficult. By using the case of one group of professionals in one religious denomination in one geographic region of the United States, we clarify how individuals’ perceptions and experiences of occupational inequality unfold in relation to a variety of observed and unobserved occupational and contextual factors that might be obscured in a larger-scale study. Focusing on a specific group of professionals allows us to explore and identify ways that formal organizational rules combine with informal processes to contribute to the persistence of racial inequality. ( Eagle and Mueller 2022:1510–1511 )

Ethics Section Example

I asked everyone who was willing to sit for a formal interview to speak only for themselves and offered each of them a prepaid Visa Card worth $25–40. I also offered everyone the opportunity to keep the card and erase the tape completely at any time they were dissatisfied with the interview in any way. No one asked for the tape to be erased; rather, people remarked on the interview being a really good experience because they felt heard. Each interview was professionally transcribed and for the most part the excerpts are literal transcriptions. In a few places, the excerpts have been edited to reduce colloquial features of speech (e.g., you know, like, um) and some recursive elements common to spoken language. A few excerpts were placed into standard English for clarity. I made this choice for the benefit of readers who might otherwise find the insights and ideas harder to parse in the original. However, I have to acknowledge this as an act of class-based violence. I tried to keep the original phrasing whenever possible. ( Pascale 2021:235 )

Further Readings

Calarco, Jessica McCrory. 2020. A Field Guide to Grad School: Uncovering the Hidden Curriculum . Princeton, NJ: Princeton University Press. Don’t let the unassuming title mislead you—there is a wealth of helpful information on writing and presenting data included here in a highly accessible manner. Every graduate student should have a copy of this book.

Edwards, Mark. 2012. Writing in Sociology . Thousand Oaks, CA: SAGE. An excellent guide to writing and presenting sociological research by an Oregon State University professor. Geared toward undergraduates and useful for writing about either quantitative or qualitative research or both.

Evergreen, Stephanie D. H. 2018. Presenting Data Effectively: Communicating Your Findings for Maximum Impact . Thousand Oaks, CA: SAGE. This is one of my very favorite books, and I recommend it highly for everyone who wants their presentations and publications to communicate more effectively than the boring black-and-white, ragged-edge tables and figures academics are used to seeing.

Evergreen, Stephanie D. H. 2019. Effective Data Visualization . 2nd ed. Thousand Oaks, CA: SAGE. This is an advanced primer for presenting clean and clear data using graphs, tables, color, font, and so on. Start with Evergreen (2018), and if you graduate from that text, move on to this one.

Schwabish, Jonathan. 2021. Better Data Visualizations: A Guide for Scholars, Researchers, and Wonks . New York: Columbia University Press. Where Evergreen’s (2018, 2019) focus is on how to make the best visual displays possible for effective communication, this book is specifically geared toward visual displays of academic data, both quantitative and qualitative. If you want to know when it is appropriate to use a pie chart instead of a stacked bar chart, this is the reference to use.

  • Some examples: Qualitative Inquiry , Qualitative Research , American Journal of Qualitative Research , Ethnography , Journal of Ethnographic and Qualitative Research , Qualitative Report , Qualitative Sociology , and Qualitative Studies . ↵
  • This is something I do with every article I write: using Excel, I write each element of the expected article in a separate row, with one column for “expected word count” and another column for “actual word count.” I fill in the actual word count as I write. I add a third column for “comments to myself”—how things are progressing, what I still need to do, and so on. I then use the “sum” function below each of the first two columns to keep a running count of my progress relative to the final word count. ↵
  • And this is true, I would argue, even when your primary goal is to leave space for the voices of those who don’t usually get a chance to be part of the conversation. You will still want to put those voices in some kind of choir, with a clear direction (song) to be sung. The worst thing you can do is overwhelm your audience with random quotes or long passages with no key to understanding them. Yes, a lot of metaphors—qualitative researchers love metaphors! ↵
  • To take Calarco’s recipe analogy further, do not write like those food bloggers who spend more time discussing the color of their kitchen or the experiences they had at the market than they do the actual cooking; similarly, do not write recipes that omit crucial details like the amount of flour or the size of the baking pan used or the temperature of the oven. ↵
  • The exception is the “compare and contrast” of two or more quotes, but use caution here. None of the quotes should be very long at all (a sentence or two each). ↵
  • Although this section is geared toward presentations, many of the suggestions could also be useful when writing about your data. Don’t be afraid to use charts and graphs and figures when writing your proposal, article, thesis, or dissertation. At the very least, you should incorporate a tabular display of the participants, sites, or documents used. ↵
  • I was so puzzled by these kinds of questions that I wrote one of my very first articles on it ( Hurst 2008 ). ↵

The visual presentation of data or information through graphics such as charts, graphs, plots, infographics, maps, and animation.  Recall the best documentary you ever viewed, and there were probably excellent examples of good data visualization there (for me, this was An Inconvenient Truth , Al Gore’s film about climate change).  Good data visualization allows more effective communication of findings of research, particularly in public presentations (e.g., slideshows).

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License , except where otherwise noted.

Art of Presentations

[Guide] How to Present Qualitative Research Findings in PowerPoint?

By: Author Shrot Katewa

[Guide] How to Present Qualitative Research Findings in PowerPoint?

As a researcher, it is quite pointless to do the research if we are unable to share the findings with our audience appropriately! Using PowerPoint is one of the best ways to present research outcomes. But, how does one present qualitative research findings using PowerPoint?

In order to present qualitative research findings using PowerPoint, you need to create a robust structure for your presentation, make it engaging and visually appealing, present the patterns along with explanations for them, and highlight the conclusions of your research.

In this article, we will help you understand the structure of your presentation. Plus, we’ll share some handy tips that will make your qualitative research presentation really effective!

How to Create a Structure for your Qualitative Research Presentation?

Creating the right structure for your presentation is key to ensuring that it is correctly understood by your audience.

The structure of your research presentation not only makes it easier for you to create the document but also makes it simple for the audience to understand what will be covered when you present it.

Furthermore, having a robust structure is a great way to ensure that you don’t miss out on any of the points while working on creating the presentation.

But, what structure should one follow?

Creating a good structure can be tricky for some. Thus, I’m sharing what has worked well for me during my previous research projects.

NOTE – It is important to note that although the following structure is highly effective for most research findings presentations, it has been generalized in order to serve a wide range of research projects. You may want to consider points that are specific to the nature of your research project and include them at your discretion.

Here’s my recommended structure to create your Research Findings presentation –

1. Objective of the Research

A great way to start your presentation is to highlight the objective of your research project.

It is important to remember that merely sharing the objective may sometimes not be enough. A short backstory along with the purpose of your research project can pack a powerful punch! It not only validates the reasoning for your project but also subtly establishes trust with your audience.

However, do make sure that you’re not reading the backstory from the slide. Let it flow naturally when you are delivering the presentation. Keep the presentation as minimalistic as possible.

2. Key Parameters Considered for Measurement

Once you’ve established the objective, the next thing you may want to do is share the key parameters considered for the success of your project.

Every research project, including qualitative research, needs to have a few key parameters to measure against the objective of the research.

For example – If the goal of your project is to gather the sentiments of a certain group of people for a particular product, you may need to measure their feelings. Are they happy or unhappy using the product? How do they perceive the branding of the product? Is it affordable?

Make sure that you list down all such key parameters that were considered while conducting the qualitative research.

In general, laying these out before sharing the outcome can help your audience think from your perspective and look at the findings through the correct lens.

3. Research Methodology Adopted

The next thing that you may want to include in your presentation is the methodology that you adopted for conducting the research.

By knowing your approach, the audience can be better prepared for the outcome of your project. Ensure that you provide sound reasoning for the chosen methodology.

This section of your presentation can also showcase some pictures of the research being conducted. If you have captured a video, include that. Doing this provides further validation of your project.

4. Research Outcomes (Presenting Descriptive Analysis)

This is the section that will constitute the bulk of your presentation.

Use the slides in this section to describe the observations, and the resulting outcomes on each of the key parameters that were considered for the research project.

It is usually a good idea to dedicate at least one slide to each parameter. Make sure that you present data wherever possible. However, ensure that the data presented can be easily comprehended.

Provide key learnings from the data, highlight any outliers, and offer possible reasons for them. Try not to go too in-depth with the stats, as this can overwhelm the audience. Remember, a presentation is most helpful when it is used to provide key highlights of the research!

Apart from using the data, make sure that you also include a few quotes from the participants.

5. Summary and Learnings from the Research

Once you’ve taken the audience through the core part of your research findings, it is good practice to summarize the key learnings from each section of your project.

Make sure you touch upon some of the key learnings covered in the research outcomes section of your presentation.

Furthermore, include any additional observations and key points that you may have had which were previously not covered.

The summary slide also often acts as “Key Takeaways” from the research for your audience. Thus, make sure that you maintain brevity and highlight only the points that you want your audience to remember even after the presentation.

6. Inclusions and Exclusions (if any)

This section is optional for some researchers.

However, dedicating a section to inclusions and exclusions in your presentation can be a great value add! This section helps your audience understand the key factors that were excluded (or included) on purpose!

Moreover, it creates a sense of thoroughness in the minds of your audience.

7. Conclusion of the Research

The purpose of the conclusion slide of your research findings presentation is to revisit the objective, and present a conclusion.

A conclusion may simply validate or nullify the objective. It may sometimes do neither. Nevertheless, having a conclusion slide makes your presentation come a full circle. It creates this sense of completion in the minds of your audience.

8. Questions

Finally, since your audience did not spend as much time as you did on the research project, people are bound to have a few questions.

Thus, the last part of your presentation structure should be dedicated to allowing your audience to ask questions.

Tips for Effectively Presenting Qualitative Research Findings using PowerPoint

For a presentation to be effective, it is important that the presentation is not only well structured but also that it is well created and nicely delivered!

While we have already covered the structure, let me share some tips that can help you create and deliver the presentation effectively.

Tip 1 – Use Visuals

Using visuals in your presentation is a great way to keep the presentations engaging!

Visual aids not only make the presentation less boring, but they also help your audience retain the information better!

So, use images and videos of the actual research wherever possible. If these do not suffice or do not give a professional feel, there are a number of resources online from where you can source royalty-free images.

My recommendation for high-quality royalty-free images would be either Unsplash or Pexels . Both are really good. The only downside is that they often do not provide the perfect image that can be used. That said, it can get the job done for at least half the time.

If you are unable to find the perfect free image, I recommend checking out Dreamstime . They have a huge library of images and are much cheaper than most of the other image banks. I personally use Dreamstime for my presentation projects!

Tip 2 – Tell a Story (Don’t Show Just Data!)

I cannot stress enough how important it is to give your presentation a human touch. Delivering a presentation in the form of a story does just that! Furthermore, storytelling is also a great tool for visualization .

Data can be hard-hitting, whereas a touching story can tickle the emotions of your audience on various levels!

One of the best ways to present a story with your research project is to start with the backstory of the objective. We’ve already talked about this in the earlier part of this article.

Start with why this research project is so important. Follow a story arc that builds from a beginning through a middle toward a climax, much like the plot of a soap opera.

Tip 3 – Include Quotes of the Participants

Including quotes of the participants in your research findings presentation not only provides evidence but also demonstrates authenticity!

Quotes function as a platform to include the voice of the target group and provide a peek into the mindset of the target audience.

When using quotes, keep these things in mind –

1. Use Quotes in their Unedited Form

When using quotes in your presentation, make sure that you use them in their raw unedited form.

Editing quotes should be restricted to aiding comprehension and, occasionally, coherence.

Furthermore, when editing quotes, use brackets to insert clarifying words. The standard convention is square brackets for clarifying words and parentheses for adding a missing explanation.

2. How to Decide which Quotes to Consider?

It is important to know which quotes to include in your presentation. I use the following 3 criteria when selecting the quote –

  • Relevance – choose quotes that are relevant and convey the point you want to establish.
  • Length – an ideal quote should be no more than one to two sentences long.
  • Expression – choose quotes that are well expressed and striking in nature.

3. Preserve Identity of the Participant

It is important to preserve and protect the identity of the participant. This can be done by maintaining confidentiality and anonymity.

Thus, refrain from using the participant’s name. Alternatives include codes, pseudonyms (made-up names), or other general non-identifiable descriptors.

Do note, when using pseudonyms, remember to state so in the presentation.

If, however, you do need to use the name of the respondent, make sure that the participant is okay with it and you have adequate permissions to use their name.
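Such a coding scheme can be applied mechanically when preparing slides. The sketch below, with invented names, assigns stable codes (P01, P02, …) and scrubs any participant name that appears inside a quote; a real project would also need to handle nicknames, places, and other indirect identifiers.

```python
# Hypothetical participant names, invented for illustration
participants = ["Maria Lopez", "James Chen", "Aisha Khan"]

# Assign stable, non-identifying codes in enrollment order
pseudonyms = {name: f"P{i:02d}" for i, name in enumerate(participants, start=1)}

def anonymize(quote: str, mapping: dict) -> str:
    """Replace any participant name appearing in a quote with its code."""
    for name, code in mapping.items():
        quote = quote.replace(name, code)
    return quote

line = anonymize("Maria Lopez said the program changed her life.", pseudonyms)
print(line)  # P01 said the program changed her life.
```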

Tip 4 – Make your Presentation Visually Appealing and Engaging

It is quite obvious for most of us that we need to create a visually appealing presentation. But, making it pleasing to the eye can be a bit challenging.

Fortunately, we wrote a detailed blog post with tips on how to make your presentation attractive. It provides you with easy and effective tips that you can use even as a beginner! Make sure you check that article.

7 EASY tips that ALWAYS make your PPT presentation attractive (even for beginners)

In addition to the tips mentioned in the article, let me share a few things that you can do which are specific to research outcome presentations.

4.1 Use a Simple Color Scheme

Using the right colors is key to making a presentation look good.

One of the most common mistakes that people make is to use too many colors in their presentation!

My recommendation would be to go with a monochromatic color scheme in PowerPoint .

4.2 Make the Data Tables Simple and Visually Appealing

When making a presentation on research outcomes, you are bound to present some data.

But when data is not presented properly, it can quickly make your presentation look displeasing!

Using neat-looking tables can transform the way your presentation looks. So don’t just dump the data from Excel into your PowerPoint presentation. Spend a few minutes fixing it!

4.3 Use Graphs and Charts (wherever necessary)

When presenting data, my recommendation would be that graphs and charts should be your first preference.

Using graphs or charts makes the data easier to read, takes less time for the audience to comprehend, and helps to identify trends.

However, make sure that the correct chart type is used when representing the data. The last thing that you want is to poorly represent a key piece of information.

4.4 Use Icons instead of Bullet Points

Consider, for example, a slide that presents each of its key points with an icon beside it.

Such a slide could have been created just as easily using bullet points. However, using icons and representing the information in a different format makes the slide pleasing to the eye.

Thus, always try to use icons wherever possible instead of bullet points.

Tip 5 – Include the Outliers

Many times, as a research project manager, we tend to focus on the trends extracted from a data set.

While it is important to identify patterns in the data and provide an adequate explanation for the pattern, it is equally important sometimes to highlight the outliers prominently.

It is easy to forget that there may be hidden learnings even in the outliers. At times, the data trend may be re-iterating the common wisdom. However, upon analyzing the outlier data points, you may get insight into how a few participants are doing things successfully despite not following the common knowledge.

That said, not every outlier will reveal hidden information. So, do verify what to include and what to exclude.

Tip 6 – Take Inspiration from other Presentations

I admit, making any presentation can be a tough ask, let alone a presentation showcasing qualitative research findings. This is especially hard when we don’t have the necessary presentation-design skills.

One quick way to overcome this challenge is to take inspiration from other similar presentations that we have liked.

There is no shame in being inspired by others. If you don’t have any handy references, you can always Google a few examples.

One trick that almost always works for me is using Pinterest .

But don’t just search directly for a research presentation; you will have little to no success with it. The key is to look for specific examples for inspiration. For example, search for title slide examples or image layout examples for presentations.

Tip 7 – Ask Others to Critique your Presentation

The last tip that I would want to provide is to make sure that you share the presentation with supportive colleagues or mentors to attain feedback.

This step can be critical for ironing out the chinks in the armor. As a research project manager, it is common to get a bit too involved with the project, which can lead to situations wherein you miss out on things.

A good way to overcome this challenge is to get a fresh perspective on your project and the presentation once it has been prepared.

Taking critical feedback before your final presentation can also prepare you to handle tough questions in an adept manner.

Final Thoughts

It is quite important to get it right when working on a presentation that showcases the findings of our research project. After all, we don’t want to put all the hard work into the project only to fail to deliver the outcome appropriately.

I hope you will find the aforementioned tips and structure useful, and if you do, make sure that you bookmark this page and spread the word. Wishing you all the very best for your project!


Data Collection | Definition, Methods & Examples

Published on June 5, 2020 by Pritha Bhandari. Revised on June 21, 2023.

Data collection is a systematic process of gathering observations or measurements. Whether you are performing research for business, governmental or academic purposes, data collection allows you to gain first-hand knowledge and original insights into your research problem .

While methods and aims may differ between fields, the overall process of data collection remains largely the same. Before you begin collecting data, you need to consider:

  • The  aim of the research
  • The type of data that you will collect
  • The methods and procedures you will use to collect, store, and process the data

To collect high-quality data that is relevant to your purposes, follow these four steps.

Table of contents

  • Step 1: Define the aim of your research
  • Step 2: Choose your data collection method
  • Step 3: Plan your data collection procedures
  • Step 4: Collect the data
  • Other interesting articles
  • Frequently asked questions about data collection

Before you start the process of data collection, you need to identify exactly what you want to achieve. You can start by writing a problem statement : what is the practical or scientific issue that you want to address and why does it matter?

Next, formulate one or more research questions that precisely define what you want to find out. Depending on your research questions, you might need to collect quantitative or qualitative data :

  • Quantitative data is expressed in numbers and graphs and is analyzed through statistical methods .
  • Qualitative data is expressed in words and analyzed through interpretations and categorizations.

If your aim is to test a hypothesis , measure something precisely, or gain large-scale statistical insights, collect quantitative data. If your aim is to explore ideas, understand experiences, or gain detailed insights into a specific context, collect qualitative data. If you have several aims, you can use a mixed methods approach that collects both types of data.

  • Your first aim is to assess whether there are significant differences in perceptions of managers across different departments and office locations.
  • Your second aim is to gather meaningful feedback from employees to explore new ideas for how managers can improve.


Based on the data you want to collect, decide which method is best suited for your research.

  • Experimental research is primarily a quantitative method.
  • Interviews , focus groups , and ethnographies are qualitative methods.
  • Surveys , observations, archival research and secondary data collection can be quantitative or qualitative methods.

Carefully consider what method you will use to gather data that helps you directly answer your research questions.

Data collection methods

  • Experiment – When to use: to test a causal relationship. How to collect data: manipulate variables and measure their effects on others.
  • Survey – When to use: to understand the general characteristics or opinions of a group of people. How to collect data: distribute a list of questions to a sample online, in person, or over the phone.
  • Interview/focus group – When to use: to gain an in-depth understanding of perceptions or opinions on a topic. How to collect data: verbally ask participants open-ended questions in individual interviews or focus group discussions.
  • Observation – When to use: to understand something in its natural setting. How to collect data: measure or survey a sample without trying to affect them.
  • Ethnography – When to use: to study the culture of a community or organization first-hand. How to collect data: join and participate in a community and record your observations and reflections.
  • Archival research – When to use: to understand current or historical events, conditions, or practices. How to collect data: access manuscripts, documents, or records from libraries, depositories, or the internet.
  • Secondary data collection – When to use: to analyze data from populations that you can’t access first-hand. How to collect data: find existing datasets that have already been collected, from sources such as government agencies or research organizations.

When you know which method(s) you are using, you need to plan exactly how you will implement them. What procedures will you follow to make accurate observations or measurements of the variables you are interested in?

For instance, if you’re conducting surveys or interviews, decide what form the questions will take; if you’re conducting an experiment, make decisions about your experimental design (e.g., determine inclusion and exclusion criteria ).

Operationalization

Sometimes your variables can be measured directly: for example, you can collect data on the average age of employees simply by asking for dates of birth. However, often you’ll be interested in collecting data on more abstract concepts or variables that can’t be directly observed.

Operationalization means turning abstract conceptual ideas into measurable observations. When planning how you will collect data, you need to translate the conceptual definition of what you want to study into the operational definition of what you will actually measure.

  • You ask managers to rate their own leadership skills on 5-point scales assessing the ability to delegate, decisiveness and dependability.
  • You ask their direct employees to provide anonymous feedback on the managers regarding the same topics.

You may need to develop a sampling plan to obtain data systematically. This involves defining a population , the group you want to draw conclusions about, and a sample, the group you will actually collect data from.

Your sampling method will determine how you recruit participants or obtain measurements for your study. To decide on a sampling method you will need to consider factors like the required sample size, accessibility of the sample, and timeframe of the data collection.
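As a toy illustration, the snippet below draws a simple random sample of 50 from a hypothetical sampling frame of 500 employee IDs; fixing the seed makes the recruitment draw reproducible for the write-up.

```python
import random

# Hypothetical sampling frame: 500 employee IDs
population = [f"EMP{i:04d}" for i in range(500)]

random.seed(42)  # reproducible draw for documentation purposes
sample = random.sample(population, k=50)  # simple random sample, no repeats
```

Other designs (stratified, cluster) need more bookkeeping, but the principle of documenting the frame, the method, and the seed carries over.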

Standardizing procedures

If multiple researchers are involved, write a detailed manual to standardize data collection procedures in your study.

This means laying out specific step-by-step instructions so that everyone in your research team collects data in a consistent way – for example, by conducting experiments under the same conditions and using objective criteria to record and categorize observations. This helps you avoid common research biases like omitted variable bias or information bias .

This helps ensure the reliability of your data, and you can also use it to replicate the study in the future.

Creating a data management plan

Before beginning data collection, you should also decide how you will organize and store your data.

  • If you are collecting data from people, you will likely need to anonymize and safeguard the data to prevent leaks of sensitive information (e.g. names or identity numbers).
  • If you are collecting data via interviews or pencil-and-paper formats, you will need to perform transcriptions or data entry in systematic ways to minimize distortion.
  • You can prevent loss of data by having an organization system that is routinely backed up.
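The anonymization point above can be sketched with a salted hash that swaps names for stable pseudonyms; the salt, participant names, and `P-` prefix are hypothetical, and a real project must also store the salt and any name-to-pseudonym key file securely:

```python
import hashlib

def pseudonymize(name: str, salt: str = "project-salt") -> str:
    """Map an identifying name to a stable, hard-to-reverse pseudonym."""
    digest = hashlib.sha256((salt + name).encode("utf-8")).hexdigest()
    return f"P-{digest[:8]}"

participants = ["Alice Example", "Bob Example"]
key_file = {name: pseudonymize(name) for name in participants}

# The same name always yields the same pseudonym, so records remain linkable
# across interview transcripts without the real name entering the data set.
print(key_file["Alice Example"] == pseudonymize("Alice Example"))  # True
```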

Finally, you can implement your chosen methods to measure or observe the variables you are interested in.

For example, the closed-ended questions in a survey might ask participants to rate their manager’s leadership skills on scales from 1–5. The resulting data are numerical and can be statistically analyzed for averages and patterns.
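A minimal sketch of that kind of analysis, using Python's standard library and hypothetical ratings from ten employees:

```python
from statistics import mean, stdev

# Hypothetical 1–5 ratings of one manager's ability to delegate.
ratings = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

print(round(mean(ratings), 1))   # 3.8 – the average rating
print(round(stdev(ratings), 2))  # sample standard deviation (spread of opinions)
```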

To ensure that high quality data is recorded in a systematic way, here are some best practices:

  • Record all relevant information as and when you obtain data. For example, note down whether or how lab equipment is recalibrated during an experimental study.
  • Double-check manual data entry for errors.
  • If you collect quantitative data, you can assess the reliability and validity to get an indication of your data quality.



Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

When conducting research, collecting original data has significant advantages:

  • You can tailor data collection to your specific research aims (e.g. understanding the needs of your consumers or user testing your website)
  • You can control and standardize the process for high reliability and validity (e.g. choosing appropriate measurements and sampling methods )

However, there are also some drawbacks: data collection can be time-consuming, labor-intensive and expensive. In some cases, it’s more efficient to use secondary data that has already been collected by someone else, but the data might be less reliable.

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

Reliability and validity are both about how well a method measures something:

  • Reliability refers to the consistency of a measure (whether the results can be reproduced under the same conditions).
  • Validity refers to the accuracy of a measure (whether the results really do represent what they are supposed to measure).

If you are doing experimental research, you also have to consider the internal and external validity of your experiment.

Operationalization means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioral avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data , it’s important to consider how you will operationalize the variables that you want to measure.

In mixed methods research , you use both qualitative and quantitative data collection and analysis methods to answer your research question .


Bhandari, P. (2023, June 21). Data Collection | Definition, Methods & Examples. Scribbr. Retrieved June 24, 2024, from https://www.scribbr.com/methodology/data-collection/


Neurol Res Pract

How to use and assess qualitative research methods

Loraine Busetto

1 Department of Neurology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120 Heidelberg, Germany

Wolfgang Wick

2 Clinical Cooperation Unit Neuro-Oncology, German Cancer Research Center, Heidelberg, Germany

Christoph Gumbinger

Associated data.

Not applicable.

This paper aims to provide an overview of the use and assessment of qualitative research methods in the health sciences. Qualitative research can be defined as the study of the nature of phenomena and is especially appropriate for answering questions of why something is (not) observed, assessing complex multi-component interventions, and focussing on intervention improvement. The most common methods of data collection are document study, (non-) participant observations, semi-structured interviews and focus groups. For data analysis, field-notes and audio-recordings are transcribed into protocols and transcripts, and coded using qualitative data management software. Criteria such as checklists, reflexivity, sampling strategies, piloting, co-coding, member-checking and stakeholder involvement can be used to enhance and assess the quality of the research conducted. Using qualitative in addition to quantitative designs will equip us with better tools to address a greater range of research problems, and to fill in blind spots in current neurological research and practice.

The aim of this paper is to provide an overview of qualitative research methods, including hands-on information on how they can be used, reported and assessed. This article is intended for beginning qualitative researchers in the health sciences as well as experienced quantitative researchers who wish to broaden their understanding of qualitative research.

What is qualitative research?

Qualitative research is defined as “the study of the nature of phenomena”, including “their quality, different manifestations, the context in which they appear or the perspectives from which they can be perceived”, but excluding “their range, frequency and place in an objectively determined chain of cause and effect” [ 1 ]. This formal definition can be complemented with a more pragmatic rule of thumb: qualitative research generally includes data in the form of words rather than numbers [ 2 ].

Why conduct qualitative research?

Because some research questions cannot be answered using (only) quantitative methods. For example, one Australian study addressed the issue of why patients from Aboriginal communities often present late or not at all to specialist services offered by tertiary care hospitals. Using qualitative interviews with patients and staff, it found one of the most significant access barriers to be transportation problems, including some towns and communities simply not having a bus service to the hospital [ 3 ]. A quantitative study could have measured the number of patients over time or even looked at possible explanatory factors – but only those previously known or suspected to be of relevance. To discover reasons for observed patterns, especially the invisible or surprising ones, qualitative designs are needed.

While qualitative research is common in other fields, it is still relatively underrepresented in health services research. The latter field is more traditionally rooted in the evidence-based-medicine paradigm, as seen in “research that involves testing the effectiveness of various strategies to achieve changes in clinical practice, preferably applying randomised controlled trial study designs (...)” [ 4 ]. This focus on quantitative research and specifically randomised controlled trials (RCT) is visible in the idea of a hierarchy of research evidence which assumes that some research designs are objectively better than others, and that choosing a “lesser” design is only acceptable when the better ones are not practically or ethically feasible [ 5 , 6 ]. Others, however, argue that an objective hierarchy does not exist, and that, instead, the research design and methods should be chosen to fit the specific research question at hand – “questions before methods” [ 2 , 7 – 9 ]. This means that even when an RCT is possible, some research problems require a different design that is better suited to addressing them. Arguing in JAMA, Berwick uses the example of rapid response teams in hospitals, which he describes as “a complex, multicomponent intervention – essentially a process of social change” susceptible to a range of different context factors including leadership or organisation history. According to him, “[in] such complex terrain, the RCT is an impoverished way to learn. Critics who use it as a truth standard in this context are incorrect” [ 8 ]. Instead of limiting oneself to RCTs, Berwick recommends embracing a wider range of methods, including qualitative ones, which for “these specific applications, (...) are not compromises in learning how to improve; they are superior” [ 8 ].

Research problems that can be approached particularly well using qualitative methods include assessing complex multi-component interventions or systems (of change), addressing questions beyond “what works”, towards “what works for whom when, how and why”, and focussing on intervention improvement rather than accreditation [ 7 , 9 – 12 ]. Using qualitative methods can also help shed light on the “softer” side of medical treatment. For example, while quantitative trials can measure the costs and benefits of neuro-oncological treatment in terms of survival rates or adverse effects, qualitative research can help provide a better understanding of patient or caregiver stress, visibility of illness or out-of-pocket expenses.

How to conduct qualitative research?

Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [ 13 , 14 ]. As Fossey puts it: “sampling, data collection, analysis and interpretation are related to each other in a cyclical (iterative) manner, rather than following one after another in a stepwise approach” [ 15 ]. The researcher can make educated decisions with regard to the choice of method, how they are implemented, and to which and how many units they are applied [ 13 ]. As shown in Fig.  1 , this can involve several back-and-forth steps between data collection and analysis where new insights and experiences can lead to adaptation and expansion of the original plan. Some insights may also necessitate a revision of the research question and/or the research design as a whole. The process ends when saturation is achieved, i.e. when no relevant new information can be found (see also below: sampling and saturation). For reasons of transparency, it is essential for all decisions as well as the underlying reasoning to be well-documented.

Fig. 1: Iterative research process

While it is not always explicitly addressed, qualitative methods reflect a different underlying research paradigm than quantitative research (e.g. constructivism or interpretivism as opposed to positivism). The choice of methods can be based on the respective underlying substantive theory or theoretical framework used by the researcher [ 2 ].

Data collection

The methods of qualitative data collection most commonly used in health research are document study, observations, semi-structured interviews and focus groups [ 1 , 14 , 16 , 17 ].

Document study

Document study (also called document analysis) refers to the review by the researcher of written materials [ 14 ]. These can include personal and non-personal documents such as archives, annual reports, guidelines, policy documents, diaries or letters.

Observations

Observations are particularly useful to gain insights into a certain setting and actual behaviour – as opposed to reported behaviour or opinions [ 13 ]. Qualitative observations can be either participant or non-participant in nature. In participant observations, the observer is part of the observed setting, for example a nurse working in an intensive care unit [ 18 ]. In non-participant observations, the observer is “on the outside looking in”, i.e. present in but not part of the situation, trying not to influence the setting by their presence. Observations can be planned (e.g. for 3 h during the day or night shift) or ad hoc (e.g. as soon as a stroke patient arrives at the emergency room). During the observation, the observer takes notes on everything or certain pre-determined parts of what is happening around them, for example focusing on physician-patient interactions or communication between different professional groups. Written notes can be taken during or after the observations, depending on feasibility (which is usually lower during participant observations) and acceptability (e.g. when the observer is perceived to be judging the observed). Afterwards, these field notes are transcribed into observation protocols. If more than one observer was involved, field notes are taken independently, but notes can be consolidated into one protocol after discussions. Advantages of conducting observations include minimising the distance between the researcher and the researched, the potential discovery of topics that the researcher did not realise were relevant and gaining deeper insights into the real-world dimensions of the research problem at hand [ 18 ].

Semi-structured interviews

Hijmans & Kuyper describe qualitative interviews as “an exchange with an informal character, a conversation with a goal” [ 19 ]. Interviews are used to gain insights into a person’s subjective experiences, opinions and motivations – as opposed to facts or behaviours [ 13 ]. Interviews can be distinguished by the degree to which they are structured (i.e. a questionnaire), open (e.g. free conversation or autobiographical interviews) or semi-structured [ 2 , 13 ]. Semi-structured interviews are characterized by open-ended questions and the use of an interview guide (or topic guide/list) in which the broad areas of interest, sometimes including sub-questions, are defined [ 19 ]. The pre-defined topics in the interview guide can be derived from the literature, previous research or a preliminary method of data collection, e.g. document study or observations. The topic list is usually adapted and improved at the start of the data collection process as the interviewer learns more about the field [ 20 ]. Across interviews the focus on the different (blocks of) questions may differ and some questions may be skipped altogether (e.g. if the interviewee is not able or willing to answer the questions or for concerns about the total length of the interview) [ 20 ]. Qualitative interviews are usually not conducted in written format as this impedes the interactive component of the method [ 20 ]. In comparison to written surveys, qualitative interviews have the advantage of being interactive and allowing for unexpected topics to emerge and to be taken up by the researcher. This can also help overcome a provider or researcher-centred bias often found in written surveys, which, by nature, can only measure what is already known or expected to be of relevance to the researcher. Interviews can be audio- or video-taped, but sometimes it is only feasible or acceptable for the interviewer to take written notes [ 14 , 16 , 20 ].

Focus groups

Focus groups are group interviews to explore participants’ expertise and experiences, including explorations of how and why people behave in certain ways [ 1 ]. Focus groups usually consist of 6–8 people and are led by an experienced moderator following a topic guide or “script” [ 21 ]. They can involve an observer who takes note of the non-verbal aspects of the situation, possibly using an observation guide [ 21 ]. Depending on researchers’ and participants’ preferences, the discussions can be audio- or video-taped and transcribed afterwards [ 21 ]. Focus groups are useful for bringing together homogeneous (to a lesser extent heterogeneous) groups of participants with relevant expertise and experience on a given topic on which they can share detailed information [ 21 ]. Focus groups are a relatively easy, fast and inexpensive method to gain access to information on interactions in a given group, i.e. “the sharing and comparing” among participants [ 21 ]. Disadvantages include less control over the process and a lesser extent to which each individual may participate. Moreover, focus group moderators need experience, as do those tasked with the analysis of the resulting data. Focus groups can be less appropriate for discussing sensitive topics that participants might be reluctant to disclose in a group setting [ 13 ]. Moreover, attention must be paid to the emergence of “groupthink” as well as possible power dynamics within the group, e.g. when patients are awed or intimidated by health professionals.

Choosing the “right” method

As explained above, the school of thought underlying qualitative research assumes no objective hierarchy of evidence and methods. This means that each choice of single or combined methods has to be based on the research question that needs to be answered and a critical assessment with regard to whether or to what extent the chosen method can accomplish this – i.e. the “fit” between question and method [ 14 ]. It is necessary for these decisions to be documented when they are being made, and to be critically discussed when reporting methods and results.

Let us assume that our research aim is to examine the (clinical) processes around acute endovascular treatment (EVT), from the patient’s arrival at the emergency room to recanalization, with the aim to identify possible causes for delay and/or other causes for sub-optimal treatment outcome. As a first step, we could conduct a document study of the relevant standard operating procedures (SOPs) for this phase of care – are they up-to-date and in line with current guidelines? Do they contain any mistakes, irregularities or uncertainties that could cause delays or other problems? Regardless of the answers to these questions, the results have to be interpreted based on what they are: a written outline of what care processes in this hospital should look like. If we want to know what they actually look like in practice, we can conduct observations of the processes described in the SOPs. These results can (and should) be analysed in themselves, but also in comparison to the results of the document analysis, especially as regards relevant discrepancies. Do the SOPs outline specific tests for which no equipment can be observed or tasks to be performed by specialized nurses who are not present during the observation? It might also be possible that the written SOP is outdated, but the actual care provided is in line with current best practice. In order to find out why these discrepancies exist, it can be useful to conduct interviews. Are the physicians simply not aware of the SOPs (because their existence is limited to the hospital’s intranet) or do they actively disagree with them or does the infrastructure make it impossible to provide the care as described? Another rationale for adding interviews is that some situations (or all of their possible variations for different patient groups or the day, night or weekend shift) cannot practically or ethically be observed. In this case, it is possible to ask those involved to report on their actions – being aware that this is not the same as the actual observation. A senior physician’s or hospital manager’s description of certain situations might differ from a nurse’s or junior physician’s one, maybe because they intentionally misrepresent facts or maybe because different aspects of the process are visible or important to them. In some cases, it can also be relevant to consider to whom the interviewee is disclosing this information – someone they trust, someone they are otherwise not connected to, or someone they suspect or are aware of being in a potentially “dangerous” power relationship to them. Lastly, a focus group could be conducted with representatives of the relevant professional groups to explore how and why exactly they provide care around EVT. The discussion might reveal discrepancies (between SOPs and actual care or between different physicians) and motivations to the researchers as well as to the focus group members that they might not have been aware of themselves. For the focus group to deliver relevant information, attention has to be paid to its composition and conduct, for example, to make sure that all participants feel safe to disclose sensitive or potentially problematic information or that the discussion is not dominated by (senior) physicians only. The resulting combination of data collection methods is shown in Fig.  2 .

Fig. 2: Possible combination of data collection methods

Attributions for icons: “Book” by Serhii Smirnov, “Interview” by Adrien Coquet, FR, “Magnifying Glass” by anggun, ID, “Business communication” by Vectors Market; all from the Noun Project

The combination of multiple data source as described for this example can be referred to as “triangulation”, in which multiple measurements are carried out from different angles to achieve a more comprehensive understanding of the phenomenon under study [ 22 , 23 ].

Data analysis

To analyse the data collected through observations, interviews and focus groups, these need to be transcribed into protocols and transcripts (see Fig.  3 ). Interviews and focus groups can be transcribed verbatim, with or without annotations for behaviour (e.g. laughing, crying, pausing) and with or without phonetic transcription of dialects and filler words, depending on what is expected or known to be relevant for the analysis. In the next step, the protocols and transcripts are coded, that is, marked (or tagged, labelled) with one or more short descriptors of the content of a sentence or paragraph [ 2 , 15 , 23 ]. Jansen describes coding as “connecting the raw data with “theoretical” terms” [ 20 ]. In a more practical sense, coding makes raw data sortable. This makes it possible to extract and examine all segments describing, say, a tele-neurology consultation from multiple data sources (e.g. SOPs, emergency room observations, staff and patient interview). In a process of synthesis and abstraction, the codes are then grouped, summarised and/or categorised [ 15 , 20 ]. The end product of the coding or analysis process is a descriptive theory of the behavioural pattern under investigation [ 20 ]. The coding process is performed using qualitative data management software, the most common ones being NVivo, MAXQDA and ATLAS.ti. It should be noted that these are data management tools which support the analysis performed by the researcher(s) [ 14 ].
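The idea that coding "makes raw data sortable" can be illustrated without dedicated software; the segments, sources, and code labels below are hypothetical:

```python
# Hypothetical coded segments: (data source, segment text, assigned codes).
segments = [
    ("interview_01", "The tele-neurology consult started twenty minutes late.",
     ["tele-neurology", "delay"]),
    ("observation_ER", "A nurse sets up the tele-neurology video link.",
     ["tele-neurology"]),
    ("sop_evt", "Door-to-needle time should not exceed sixty minutes.",
     ["delay", "guideline"]),
]

def extract(code, segments):
    """Pull every segment tagged with a code, across all data sources."""
    return [(source, text) for source, text, codes in segments if code in codes]

# All material on tele-neurology consultations, regardless of source:
print(len(extract("tele-neurology", segments)))  # 2
```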

Fig. 3: From data collection to data analysis

Attributions for icons: see Fig. 2, also “Speech to text” by Trevor Dsouza, “Field Notes” by Mike O’Brien, US, “Voice Record” by ProSymbols, US, “Inspection” by Made, AU, and “Cloud” by Graphic Tigers; all from the Noun Project

How to report qualitative research?

Protocols of qualitative research can be published separately and in advance of the study results. However, the aim is not the same as in RCT protocols, i.e. to pre-define and set in stone the research questions and primary or secondary endpoints. Rather, it is a way to describe the research methods in detail, which might not be possible in the results paper given journals’ word limits. Qualitative research papers are usually longer than their quantitative counterparts to allow for deep understanding and so-called “thick description”. In the methods section, the focus is on transparency of the methods used, including why, how and by whom they were implemented in the specific study setting, so as to enable a discussion of whether and how this may have influenced data collection, analysis and interpretation. The results section usually starts with a paragraph outlining the main findings, followed by more detailed descriptions of, for example, the commonalities, discrepancies or exceptions per category [ 20 ]. Here it is important to support main findings by relevant quotations, which may add information, context, emphasis or real-life examples [ 20 , 23 ]. It is subject to debate in the field whether it is relevant to state the exact number or percentage of respondents supporting a certain statement (e.g. “Five interviewees expressed negative feelings towards XYZ”) [ 21 ].

How to combine qualitative with quantitative research?

Qualitative methods can be combined with other methods in multi- or mixed methods designs, which “[employ] two or more different methods […] within the same study or research program rather than confining the research to one single method” [ 24 ]. Reasons for combining methods can be diverse, including triangulation for corroboration of findings, complementarity for illustration and clarification of results, expansion to extend the breadth and range of the study, explanation of (unexpected) results generated with one method with the help of another, or offsetting the weakness of one method with the strength of another [ 1 , 17 , 24 – 26 ]. The resulting designs can be classified according to when, why and how the different quantitative and/or qualitative data strands are combined. The three most common types of mixed method designs are the convergent parallel design, the explanatory sequential design and the exploratory sequential design. The designs with examples are shown in Fig.  4 .

Fig. 4: Three common mixed methods designs

In the convergent parallel design, a qualitative study is conducted in parallel to and independently of a quantitative study, and the results of both studies are compared and combined at the stage of interpretation of results. Using the above example of EVT provision, this could entail setting up a quantitative EVT registry to measure process times and patient outcomes in parallel to conducting the qualitative research outlined above, and then comparing results. Amongst other things, this would make it possible to assess whether interview respondents’ subjective impressions of patients receiving good care match modified Rankin Scores at follow-up, or whether observed delays in care provision are exceptions or the rule when compared to door-to-needle times as documented in the registry. In the explanatory sequential design, a quantitative study is carried out first, followed by a qualitative study to help explain the results from the quantitative study. This would be an appropriate design if the registry alone had revealed relevant delays in door-to-needle times and the qualitative study would be used to understand where and why these occurred, and how they could be improved. In the exploratory design, the qualitative study is carried out first and its results help to inform and build the quantitative study in the next step [ 26 ]. If the qualitative study around EVT provision had shown a high level of dissatisfaction among the staff members involved, a quantitative questionnaire investigating staff satisfaction could be set up in the next step, informed by the qualitative study on which topics dissatisfaction had been expressed. Amongst other things, the questionnaire design would make it possible to widen the reach of the research to more respondents from different (types of) hospitals, regions, countries or settings, and to conduct sub-group analyses for different professional groups.

How to assess qualitative research?

A variety of assessment criteria and lists have been developed for qualitative research, ranging in their focus and comprehensiveness [ 14 , 17 , 27 ]. However, none of these has been elevated to the “gold standard” in the field. In the following, we therefore focus on a set of commonly used assessment criteria that, from a practical standpoint, a researcher can look for when assessing a qualitative research report or paper.

Assessors should check the authors’ use of and adherence to the relevant reporting checklists (e.g. Standards for Reporting Qualitative Research (SRQR)) to make sure all items that are relevant for this type of research are addressed [ 23 , 28 ]. Discussions of quantitative measures in addition to or instead of these qualitative measures can be a sign of lower quality of the research (paper). Providing and adhering to a checklist for qualitative research contributes to an important quality criterion for qualitative research, namely transparency [ 15 , 17 , 23 ].

Reflexivity

While methodological transparency and complete reporting is relevant for all types of research, some additional criteria must be taken into account for qualitative research. This includes what is called reflexivity, i.e. sensitivity to the relationship between the researcher and the researched, including how contact was established and maintained, or the background and experience of the researcher(s) involved in data collection and analysis. Depending on the research question and population to be researched this can be limited to professional experience, but it may also include gender, age or ethnicity [ 17 , 27 ]. These details are relevant because in qualitative research, as opposed to quantitative research, the researcher as a person cannot be isolated from the research process [ 23 ]. It may influence the conversation when an interviewed patient speaks to an interviewer who is a physician, or when an interviewee is asked to discuss a gynaecological procedure with a male interviewer, and therefore the reader must be made aware of these details [ 19 ].

Sampling and saturation

The aim of qualitative sampling is for all variants of the objects of observation that are deemed relevant for the study to be present in the sample “to see the issue and its meanings from as many angles as possible” [ 1 , 16 , 19 , 20 , 27 ], and to ensure “information-richness” [ 15 ]. An iterative sampling approach is advised, in which data collection (e.g. five interviews) is followed by data analysis, followed by more data collection to find variants that are lacking in the current sample. This process continues until no new (relevant) information can be found and further sampling becomes redundant – which is called saturation [ 1 , 15 ]. In other words: qualitative data collection finds its end point not a priori, but when the research team determines that saturation has been reached [ 29 , 30 ].
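The stopping rule can be sketched as a toy loop over hypothetical data-collection rounds, ending when a round contributes no code that has not been seen before:

```python
# Each round of data collection is represented by the set of codes it yielded.
rounds = [
    {"transport", "cost"},          # interviews 1–5
    {"cost", "trust", "language"},  # interviews 6–10
    {"trust", "transport"},         # interviews 11–15: nothing new
]

seen = set()
saturated_after = None
for i, codes in enumerate(rounds, start=1):
    new_codes = codes - seen
    seen |= codes
    if not new_codes:        # no relevant new information found
        saturated_after = i  # -> saturation reached, stop sampling
        break

print(saturated_after)  # 3
```

In practice, of course, saturation is a judgement call about meanings and variants, not a mechanical set comparison.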

This is also the reason why most qualitative studies use deliberate instead of random sampling strategies. This is generally referred to as “ purposive sampling” , in which researchers pre-define which types of participants or cases they need to include so as to cover all variations that are expected to be of relevance, based on the literature, previous experience or theory (i.e. theoretical sampling) [ 14 , 20 ]. Other types of purposive sampling include (but are not limited to) maximum variation sampling, critical case sampling or extreme or deviant case sampling [ 2 ]. In the above EVT example, a purposive sample could include all relevant professional groups and/or all relevant stakeholders (patients, relatives) and/or all relevant times of observation (day, night and weekend shift).

Assessors of qualitative research should check whether the considerations underlying the sampling strategy were sound and whether or how researchers tried to adapt and improve their strategies in stepwise or cyclical approaches between data collection and analysis to achieve saturation [ 14 ].

Piloting

Good qualitative research is iterative in nature, i.e. it goes back and forth between data collection and analysis, revising and improving the approach where necessary. One example of this is pilot interviews, where different aspects of the interview (especially the interview guide, but also, for example, the site of the interview or whether the interview can be audio-recorded) are tested with a small number of respondents, evaluated and revised [ 19 ]. In doing so, the interviewer learns which wording or types of questions work best, or what the best length is for an interview with patients who have trouble concentrating for an extended time. Of course, the same reasoning applies to observations or focus groups, which can also be piloted.

Co-coding

Ideally, coding should be performed by at least two researchers, especially at the beginning of the coding process, when a common approach must be defined, including the establishment of a useful coding list (or tree) and a common meaning of individual codes [ 23 ]. An initial sub-set of transcripts, or all of them, can be coded independently by the coders and then compared and consolidated in regular discussions in the research team. This makes sure that codes are applied consistently to the research data.
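The comparison step of co-coding can be supported by simple tooling. The sketch below assumes each coder produces a set of codes per transcript segment; the segment identifiers and codes are hypothetical:

```python
def disagreements(codes_a, codes_b):
    """Return the segments where two independent coders' code sets differ,
    as material for the consolidation discussion in the research team.
    codes_a, codes_b: {segment_id: set of codes} from each coder."""
    all_segments = set(codes_a) | set(codes_b)
    return {
        seg: (codes_a.get(seg, set()), codes_b.get(seg, set()))
        for seg in all_segments
        if codes_a.get(seg, set()) != codes_b.get(seg, set())
    }
```

The output is not a quality score; it is simply an agenda of segments the coders still need to discuss.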

Member checking

Member checking, also called respondent validation , refers to the practice of checking back with study respondents to see if the research is in line with their views [ 14 , 27 ]. This can happen after data collection or analysis or when first results are available [ 23 ]. For example, interviewees can be provided with (summaries of) their transcripts and asked whether they believe this to be a complete representation of their views or whether they would like to clarify or elaborate on their responses [ 17 ]. Respondents’ feedback on these issues then becomes part of the data collection and analysis [ 27 ].

Stakeholder involvement

In those niches where qualitative approaches have been able to evolve and grow, a new trend has seen the inclusion of patients and their representatives not only as study participants (i.e. “members”, see above) but as consultants to and active participants in the broader research process [ 31 – 33 ]. The underlying assumption is that patients and other stakeholders hold unique perspectives and experiences that add value beyond their own single story, making the research more relevant and beneficial to researchers, study participants and (future) patients alike [ 34 , 35 ]. Using the example of patients on or nearing dialysis, a recent scoping review found that 80% of clinical research did not address the top 10 research priorities identified by patients and caregivers [ 32 , 36 ]. In this sense, the involvement of the relevant stakeholders, especially patients and relatives, is increasingly being seen as a quality indicator in and of itself.

How not to assess qualitative research

The above overview does not include certain items that are routine in assessments of quantitative research. What follows is a non-exhaustive, non-representative, experience-based list of the quantitative criteria often applied to the assessment of qualitative research, as well as an explanation of the limited usefulness of these endeavours.

Protocol adherence

Given the openness and flexibility of qualitative research, it should not be assessed by how well it adheres to pre-determined and fixed strategies – in other words: its rigidity. Instead, the assessor should look for signs of adaptation and refinement based on lessons learned from earlier steps in the research process.

Sample size

For the reasons explained above, qualitative research does not require specific sample sizes, nor does it require that the sample size be determined a priori [ 1 , 14 , 27 , 37 – 39 ]. Sample size can only be a useful quality indicator when related to the research purpose, the chosen methodology and the composition of the sample, i.e. who was included and why.

Randomisation

While some authors argue that randomisation can be used in qualitative research, this is not commonly the case, as neither its feasibility nor its necessity or usefulness has been convincingly established for qualitative research [ 13 , 27 ]. Relevant disadvantages include the negative impact of an overly large sample size as well as the possibility (or probability) of selecting “quiet, uncooperative or inarticulate individuals” [ 17 ]. Qualitative studies do not use control groups, either.

Interrater reliability, variability and other “objectivity checks”

The concept of “interrater reliability” is sometimes used in qualitative research to assess the extent to which the coding of two co-coders overlaps. However, it is not clear what this measure tells us about the quality of the analysis [ 23 ]. This means that such scores can be included in qualitative research reports, preferably with some additional information on what the score means for the analysis, but they are not a requirement. Relatedly, it is not relevant for the quality or “objectivity” of qualitative research to separate those who recruited the study participants from those who collected and analysed the data. Experience even shows that it might be better to have the same person or team perform all of these tasks [ 20 ]. First, when researchers introduce themselves during recruitment, this can enhance trust when the interview takes place days or weeks later with the same researcher. Second, when the audio-recording is transcribed for analysis, the researcher who conducted the interviews will usually remember the interviewee and the specific interview situation during data analysis. This might be helpful in providing additional context for the interpretation of the data, e.g. on whether something might have been meant as a joke [ 18 ].
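For reports that do include such a score, the statistic usually meant is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch for the simplest case (two coders, one code per segment); the example codes are hypothetical, and, as noted above, reporting the score is optional:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: observed agreement between two coders, corrected
    for chance agreement. Assumes one code per segment per coder."""
    n = len(coder_a)
    assert n == len(coder_b) and n > 0
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement: product of each coder's marginal code frequencies
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```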

Not being quantitative research

Being qualitative research instead of quantitative research should not be used as an assessment criterion if it is used irrespectively of the research problem at hand. Similarly, qualitative research should not be required to be combined with quantitative research per se – unless mixed methods research is judged as inherently better than single-method research. In this case, the same criterion should be applied for quantitative studies without a qualitative component.

The main take-away points of this paper are summarised in Table 1. We aimed to show that, if conducted well, qualitative research can answer specific research questions that cannot be adequately answered using (only) quantitative designs. Seeing qualitative and quantitative methods as equal will help us become more aware and critical of the “fit” between the research problem and our chosen methods: I can conduct an RCT to determine the reasons for transportation delays of acute stroke patients – but should I? It also provides us with a greater range of tools to tackle a greater range of research problems more appropriately and successfully, filling in the blind spots on one half of the methodological spectrum to better address the whole complexity of neurological research and practice.

Take-away points

Where qualitative research is useful:

• Assessing complex multi-component interventions or systems (of change)
• What works for whom when, how and why?
• Focussing on intervention improvement

Data collection methods:

• Document study
• Observations (participant or non-participant)
• Interviews (especially semi-structured)
• Focus groups

Data analysis:

• Transcription of audio-recordings and field notes into transcripts and protocols
• Coding of protocols
• Using qualitative data management software

Mixed methods – combinations of quantitative and/or qualitative methods, e.g.:

• quali and quanti in parallel
• quanti followed by quali
• quali followed by quanti

How to assess qualitative research:

• Checklists
• Reflexivity
• Sampling strategies
• Piloting
• Co-coding
• Member checking
• Stakeholder involvement

How not to assess qualitative research:

• Protocol adherence
• Sample size
• Randomisation
• Interrater reliability, variability and other “objectivity checks”
• Not being quantitative research

Acknowledgements

Abbreviations

EVT – Endovascular treatment
RCT – Randomised Controlled Trial
SOP – Standard Operating Procedure
SRQR – Standards for Reporting Qualitative Research

Authors’ contributions

LB drafted the manuscript; WW and CG revised the manuscript; all authors approved the final versions.

Funding: no external funding.

Availability of data and materials

Ethics approval and consent to participate, consent for publication, competing interests.

The authors declare no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Qualitative Data Collection: What it is + Methods to do it


Qualitative data collection is vital in qualitative research. It helps researchers understand individuals’ attitudes, beliefs, and behaviors in a specific context.

Several methods are used to collect qualitative data, including interviews, surveys, focus groups, and observations. Understanding the various methods used for gathering qualitative data is essential for successful qualitative research.

In this post, we will discuss qualitative data and the methods used to collect it.

Content Index

  • What is Qualitative Data?
  • What is Qualitative Data Collection?
  • What is the Need for Qualitative Data Collection?
  • Effective Qualitative Data Collection Methods
  • Qualitative Data Analysis
  • Advantages of Qualitative Data Collection

What is Qualitative Data?

Qualitative data is defined as data that approximates and characterizes; it can be observed and recorded, but it is non-numerical in nature. This type of data is collected through methods such as observation, one-to-one interviews, focus groups, and similar techniques.

Qualitative data in statistics is also known as categorical data – data that can be arranged categorically based on the attributes and properties of a thing or a phenomenon.

It’s pretty easy to understand the difference between qualitative and quantitative data. Qualitative data does not include numbers in its definition of traits, whereas quantitative research data is all about numbers.

  • The cake is orange, blue, and black in color (qualitative).
  • Females have brown, black, blonde, and red hair (qualitative).
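The distinction also shows in how each type is summarised: categories are counted, while numbers are calculated with. A small illustration with made-up values:

```python
from collections import Counter
from statistics import mean

# Qualitative (categorical): summarised by counting category frequencies
hair_colours = ["brown", "black", "blonde", "brown", "red", "brown"]
print(Counter(hair_colours).most_common(1))   # the most frequent category

# Quantitative (numerical): summarised by arithmetic, e.g. an average
heights_cm = [162, 171, 158, 180]
print(mean(heights_cm))
```

Note that averaging hair colours is meaningless, while counting heights loses information; each data type has its own natural summaries.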

What is Qualitative Data Collection?

Qualitative data collection is the gathering of non-numerical information, such as words, images, and observations, to understand individuals' attitudes, behaviors, beliefs, and motivations in a specific context. It is the approach used in qualitative research, which seeks to understand social phenomena through in-depth exploration and analysis of people's perspectives, experiences, and narratives. In statistical analysis, distinguishing between categorical data and numerical data is essential: categorical data involves distinct categories or labels, while numerical data consists of measurable quantities.

The data collected through qualitative methods are often subjective, open-ended, and unstructured and can provide a rich and nuanced understanding of complex social phenomena.

Qualitative research is a type of study carried out with a qualitative approach to explore underlying reasons and to assess how and why a specific program or phenomenon operates the way it does. Researchers can choose from numerous qualitative data collection methods, depending on which they feel are relevant to the study.


Qualitative data collection methods serve the primary purpose of collecting textual data for research and analysis, such as thematic analysis. The collected research data is used to examine:

  • Knowledge around a specific issue or program, and people's experiences.
  • Meaning and relationships.
  • Social norms and contextual or cultural practices that affect people or a cause.

Qualitative data is textual or non-numerical: mostly images, videos, texts, and words written or spoken by people. You can opt for digital data collection methods, like structured or semi-structured surveys, or settle for the traditional approach comprising individual interviews, group discussions, etc.

What is the Need for Qualitative Data Collection?

Data at hand leads to a smooth process and ensures that decisions are made for the betterment of the business; you can only make informed decisions if you have relevant data. With quality data, you improve not only the quality of decision-making but also the quality of the results expected from any endeavor.

Effective Qualitative Data Collection Methods

Qualitative data collection methods are exploratory: they usually focus on gaining insights and understanding underlying reasons by digging deeper. Because qualitative data cannot be quantified, measuring or analyzing it can be difficult. Due to this lack of measurability, qualitative data collection methods are primarily unstructured or, in rare cases, structured only to some extent.

Let’s explore the most common methods used for the collection of qualitative data:


Individual interview

It is one of the most trusted, widely used, and familiar qualitative data collection methods primarily because of its approach. An individual or face-to-face interview is a direct conversation between two people with a specific structure and purpose.

The interview questionnaire is designed in the manner to elicit the interviewee’s knowledge or perspective related to a topic, program, or issue.

At times, depending on the interviewer’s approach, the conversation can be unstructured or informal but focused on understanding the individual’s beliefs, values, understandings, feelings, experiences, and perspectives on an issue.

More often, the interviewer chooses to ask open-ended questions in individual interviews. If the interviewee selects answers from a set of given options, it becomes a structured, fixed response or a biased discussion.

The individual interview is an ideal qualitative data collection method, particularly when researchers want highly personalized information from participants. It is also notable because the interviewer can probe further and ask follow-up questions to gain more insight.

Qualitative surveys

To develop an informed hypothesis, many researchers use qualitative research surveys for data collection or to collect detailed information about a product or an issue. If you want to create questionnaires for collecting textual or qualitative data, ask more open-ended questions.


To answer such qualitative research questions , the respondent has to write his/her opinion or perspective concerning a specific topic or issue. Unlike other collection methods, online surveys have a wider reach. People can provide you with quality data that is highly credible and valuable.

Such surveys can be administered on paper or online.

Focus group discussions

Focus group discussions can also be considered a type of interview, but it is conducted in a group discussion setting. Usually, the focus group consists of 8 – 10 people (the size may vary depending on the researcher’s requirement). The researchers ensure appropriate space is given to the participants to discuss a topic or issue in a context. The participants are allowed to either agree or disagree with each other’s comments. 

With a focus group discussion, researchers learn how a particular group of participants perceives the topic. Researchers analyze what participants think of an issue, the range of opinions expressed, and the ideas discussed. The data is collected by noting down the variations or inconsistencies (if any exist) among the participants, especially in terms of beliefs, experiences, and practices. 

The participants of focus group discussions are selected based on the topic or issues for which the researcher wants actionable insights. For example, if the research is about the recovery of college students from drug addiction, the participants have to be college students who are recovering from drug addiction.

Other parameters such as age, qualification, financial background, social presence, and demographics are also considered, though not primarily, as the group needs diverse participants. Frequently, the qualitative data collected through focus group discussion is more descriptive and highly detailed.

Record keeping

This method uses reliable documents and other sources of information that already exist as the data source. This information can help with the new study. It’s a lot like going to the library. There, you can look through books and other sources to find information that can be used in your research.

Case studies

In this method, data is collected by examining case studies in detail. The method is flexible in that it can be used to analyze both simple and complex topics, and its strength lies in how well it draws conclusions from a mix of one or more qualitative data collection methods.

Observations

Observation is one of the traditional methods of qualitative data collection. Researchers use it to gather descriptive data by observing people and their behavior at events or in their natural settings. In this method, the researcher is fully immersed in the setting, often taking a participatory stance, while taking down notes.

There are two main types of observation:

  • Covert: In this method, the observer is concealed, without letting anyone know that they are being observed. For example, a researcher studying the wedding rituals of nomadic tribes might join them as a guest and quietly observe everything. 
  • Overt: In this method, everyone is aware that they are being watched. For example, a researcher who wants to study the wedding rituals of a nomadic tribe can reveal why he is attending the wedding and even use a video camera to record everything around him. 

Observation is a useful method of qualitative data collection, especially when you want to study an ongoing process, a situation, or reactions to a specific issue among the people being observed.

When you want to understand people’s behavior or their way of interaction in a particular community or demographic, you can rely on the observation data. Remember, if you fail to get quality data through surveys, qualitative interviews , or group discussions, rely on observation.

It is a highly trusted qualitative data collection method, as it requires little to no effort from the participants.


Qualitative Data Analysis

You invested time and money acquiring your data, so analyze it; don't stay in the dark after all your hard work. Qualitative data analysis starts with knowing its two basic approaches, although there are no fixed rules.

  • Deductive Approach: Deductive analysis uses a researcher-defined structure to analyze the qualitative data. This method is quick and easy when the researcher already has a fair idea of what the sample population is likely to say.
  • Inductive Approach: The inductive approach imposes no predefined structure or framework; it is applied when the researcher knows little about the phenomenon under study.
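A deductive pass can be approximated in software once the researcher-defined structure exists. In this sketch, the codebook (keywords per code) and the example responses are hypothetical illustrations, not a real instrument:

```python
def deductive_code(codebook, responses):
    """Tag each response with every code whose keywords appear in it.
    codebook: {code: [keyword, ...]}; responses: list of free-text answers."""
    tagged = []
    for text in responses:
        lowered = text.lower()
        codes = [code for code, words in codebook.items()
                 if any(word in lowered for word in words)]
        tagged.append((text, codes))
    return tagged
```

Keyword matching is crude; in practice a researcher would still read each response, but the sketch shows how a pre-defined structure drives deductive analysis.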

Whether you want to analyze qualitative data from a one-on-one interview or a survey, these simple steps will ensure a comprehensive qualitative data analysis.

Step 1: Arrange your Data

Collected data is mostly unstructured and sometimes unclear, so arranging it is the first stage of qualitative data analysis. Researchers must transcribe the data before analyzing it.

Step 2: Organize all your Data

After transforming and arranging your data, the next step is to organize it. One of the best ways to organize the data is to think back to your research goals and then organize the data based on the research questions you asked.

Step 3: Set a Code to the Data Collected

Setting up appropriate codes for the collected data gets you one step closer. Coding is one of the most effective methods for compressing a massive amount of data. It allows you to derive theories from relevant research findings.

Step 4: Validate your Data

Qualitative data analysis success requires data validation. Data validation should be done throughout the research process, not just once. There are two sides to validating data:

  • The accuracy of your research design or methods.
  • Reliability: how well the approaches deliver accurate data.

Step 5: Concluding the Analysis Process

Finally, present your conclusions in a report. The report should describe your research methods, their pros and cons, and the study's limitations, and should include findings, inferences, and directions for future research.

QuestionPro is a comprehensive online survey software that offers a variety of qualitative data analysis tools to help businesses and researchers make sense of their data. Users can apply many different qualitative analysis methods to learn more from their data.

Users of QuestionPro can see their data in different charts and graphs, which makes it easier to spot patterns and trends. It can help researchers and businesses learn more about their target audience, which can lead to better decisions and better results.


Advantages of Qualitative Data Collection

Qualitative data collection has several advantages, including:


  • In-depth understanding: It provides in-depth information about attitudes and behaviors, leading to a deeper understanding of the research.
  • Flexibility: The methods allow researchers to modify questions or change direction if new information emerges.
  • Contextualization: Qualitative research data is in context, which helps to provide a deep understanding of the experiences and perspectives of individuals.
  • Rich data: It often produces rich, detailed, and nuanced information that cannot be captured through numerical data.
  • Engagement: Methods such as interviews and focus groups involve active engagement with participants, leading to a deeper understanding.
  • Multiple perspectives: This can provide various views and a rich array of voices, adding depth and complexity.
  • Realistic setting: It often occurs in realistic settings, providing more authentic experiences and behaviors.


Qualitative research is one of the best tools for identifying the behaviors and patterns governing social conditions, issues, or topics. It goes a step beyond quantitative data by providing the reasons and rationale behind a phenomenon, which quantitative data alone cannot explain.

QuestionPro supports qualitative data collection through all of these methods. Used correctly, its robust suite can enhance the quality and integrity of the collected data.




Research Method

Home » Data Collection – Methods Types and Examples

Data Collection – Methods Types and Examples


Data Collection

Definition:

Data collection is the process of gathering information from various sources in order to analyze it and make informed decisions based on what is collected. This can involve various methods, such as surveys, interviews, experiments, and observation.

In order for data collection to be effective, it is important to have a clear understanding of what data is needed and what the purpose of the data collection is. This can involve identifying the population or sample being studied, determining the variables to be measured, and selecting appropriate methods for collecting and recording data.

Types of Data Collection

Types of Data Collection are as follows:

Primary Data Collection

Primary data collection is the process of gathering original and firsthand information directly from the source or target population. This type of data collection involves collecting data that has not been previously gathered, recorded, or published. Primary data can be collected through various methods such as surveys, interviews, observations, experiments, and focus groups. The data collected is usually specific to the research question or objective and can provide valuable insights that cannot be obtained from secondary data sources. Primary data collection is often used in market research, social research, and scientific research.

Secondary Data Collection

Secondary data collection is the process of gathering information from existing sources that have already been collected and analyzed by someone else, rather than conducting new research to collect primary data. Secondary data can be collected from various sources, such as published reports, books, journals, newspapers, websites, government publications, and other documents.

Qualitative Data Collection

Qualitative data collection is used to gather non-numerical data such as opinions, experiences, perceptions, and feelings, through techniques such as interviews, focus groups, observations, and document analysis. It seeks to understand the deeper meaning and context of a phenomenon or situation and is often used in social sciences, psychology, and humanities. Qualitative data collection methods allow for a more in-depth and holistic exploration of research questions and can provide rich and nuanced insights into human behavior and experiences.
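As a rough illustration of how such non-numerical data can be handled once gathered, interview excerpts can be tagged with themes. The theme names and keywords below are hypothetical, and real qualitative coding is interpretive and iterative rather than mechanical; this is only a minimal sketch:

```python
# Minimal sketch: tagging interview excerpts with themes via keyword matching.
# The themes and keywords are hypothetical examples, not a standard scheme.
THEMES = {
    "workload": ["overtime", "deadline", "hours"],
    "support": ["mentor", "help", "team"],
}

def code_excerpt(text):
    """Return the set of themes whose keywords appear in the excerpt."""
    lowered = text.lower()
    return {theme for theme, words in THEMES.items()
            if any(word in lowered for word in words)}

excerpt = "My mentor helped me cope with the constant deadline pressure."
print(sorted(code_excerpt(excerpt)))  # both themes match
```

In practice a researcher would assign codes by reading each excerpt in context, but even this toy version shows how coded qualitative data becomes countable and comparable.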

Quantitative Data Collection

Quantitative data collection is a method used to gather numerical data that can be analyzed using statistical methods. This data is typically collected through surveys, experiments, and other structured data collection methods. Quantitative data collection seeks to quantify and measure variables, such as behaviors, attitudes, and opinions, in a systematic and objective way. This data is often used to test hypotheses, identify patterns, and establish correlations between variables. Quantitative data collection methods allow for precise measurement and generalization of findings to a larger population. It is commonly used in fields such as economics, psychology, and natural sciences.
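For instance, responses to a single Likert-scale survey item (hypothetical data) can be summarized with basic statistics:

```python
# Minimal sketch: summarizing quantitative survey data (hypothetical responses).
from statistics import mean, stdev

responses = [4, 5, 3, 4, 2, 5, 4, 3]  # e.g. a 1-5 agreement scale

print(f"n = {len(responses)}")
print(f"mean = {mean(responses):.2f}")     # 3.75
print(f"stdev = {stdev(responses):.2f}")   # sample standard deviation
```

Note that `stdev` computes the sample standard deviation; `pstdev` would be used if the responses were the whole population rather than a sample.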

Data Collection Methods

Data Collection Methods are as follows:

Surveys

Surveys involve asking questions to a sample of individuals or organizations to collect data. Surveys can be conducted in person, over the phone, or online.

Interviews

Interviews involve a one-on-one conversation between the interviewer and the respondent. Interviews can be structured or unstructured and can be conducted in person or over the phone.

Focus Groups

Focus groups are group discussions that are moderated by a facilitator. Focus groups are used to collect qualitative data on a specific topic.

Observation

Observation involves watching and recording the behavior of people, objects, or events in their natural setting. Observation can be done overtly or covertly, depending on the research question.

Experiments

Experiments involve manipulating one or more variables and observing the effect on another variable. Experiments are commonly used in scientific research.

Case Studies

Case studies involve in-depth analysis of a single individual, organization, or event. Case studies are used to gain detailed information about a specific phenomenon.

Secondary Data Analysis

Secondary data analysis involves using existing data that was collected for another purpose. Secondary data can come from various sources, such as government agencies, academic institutions, or private companies.

How to Collect Data

The following are some steps to consider when collecting data:

  • Define the objective : Before you start collecting data, you need to define the objective of the study. This will help you determine what data you need to collect and how to collect it.
  • Identify the data sources : Identify the sources of data that will help you achieve your objective. These sources can be primary sources, such as surveys, interviews, and observations, or secondary sources, such as books, articles, and databases.
  • Determine the data collection method : Once you have identified the data sources, you need to determine the data collection method. This could be through online surveys, phone interviews, or face-to-face meetings.
  • Develop a data collection plan : Develop a plan that outlines the steps you will take to collect the data. This plan should include the timeline, the tools and equipment needed, and the personnel involved.
  • Test the data collection process: Before you start collecting data, test the data collection process to ensure that it is effective and efficient.
  • Collect the data: Collect the data according to the plan you developed in step 4. Make sure you record the data accurately and consistently.
  • Analyze the data: Once you have collected the data, analyze it to draw conclusions and make recommendations.
  • Report the findings: Report the findings of your data analysis to the relevant stakeholders. This could be in the form of a report, a presentation, or a publication.
  • Monitor and evaluate the data collection process: After the data collection process is complete, monitor and evaluate the process to identify areas for improvement in future data collection efforts.
  • Ensure data quality: Ensure that the collected data is of high quality and free from errors. This can be achieved by validating the data for accuracy, completeness, and consistency.
  • Maintain data security: Ensure that the collected data is secure and protected from unauthorized access or disclosure. This can be achieved by implementing data security protocols and using secure storage and transmission methods.
  • Follow ethical considerations: Follow ethical considerations when collecting data, such as obtaining informed consent from participants, protecting their privacy and confidentiality, and ensuring that the research does not cause harm to participants.
  • Use appropriate data analysis methods : Use appropriate data analysis methods based on the type of data collected and the research objectives. This could include statistical analysis, qualitative analysis, or a combination of both.
  • Record and store data properly: Record and store the collected data properly, in a structured and organized format. This will make it easier to retrieve and use the data in future research or analysis.
  • Collaborate with other stakeholders : Collaborate with other stakeholders, such as colleagues, experts, or community members, to ensure that the data collected is relevant and useful for the intended purpose.
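The collect, validate, and analyze steps above can be sketched as a minimal pipeline. The field names and the completeness rule here are hypothetical placeholders, not a prescribed schema:

```python
# Minimal sketch of the collect -> validate -> analyze steps above.
# Field names and the completeness rule are hypothetical examples.
REQUIRED_FIELDS = {"respondent_id", "age", "answer"}

def validate(record):
    """Data-quality check: every required field present and non-empty."""
    return all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS)

def collect_and_clean(records):
    """Keep valid records, count the rest for the data-quality report."""
    valid = [r for r in records if validate(r)]
    rejected = len(records) - len(valid)
    return valid, rejected

raw = [
    {"respondent_id": 1, "age": 34, "answer": "yes"},
    {"respondent_id": 2, "age": None, "answer": "no"},   # incomplete record
    {"respondent_id": 3, "age": 29, "answer": "yes"},
]
clean, rejected = collect_and_clean(raw)
print(len(clean), rejected)  # 2 valid records, 1 rejected
```

Running the validation before analysis, as in the "test the data collection process" and "ensure data quality" steps, keeps errors from propagating into the findings.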

Applications of Data Collection

Data collection methods are widely used in different fields, including social sciences, healthcare, business, education, and more. Here are some examples of how data collection methods are used in different fields:

  • Social sciences : Social scientists often use surveys, questionnaires, and interviews to collect data from individuals or groups. They may also use observation to collect data on social behaviors and interactions. This data is often used to study topics such as human behavior, attitudes, and beliefs.
  • Healthcare : Data collection methods are used in healthcare to monitor patient health and track treatment outcomes. Electronic health records and medical charts are commonly used to collect data on patients’ medical history, diagnoses, and treatments. Researchers may also use clinical trials and surveys to collect data on the effectiveness of different treatments.
  • Business : Businesses use data collection methods to gather information on consumer behavior, market trends, and competitor activity. They may collect data through customer surveys, sales reports, and market research studies. This data is used to inform business decisions, develop marketing strategies, and improve products and services.
  • Education : In education, data collection methods are used to assess student performance and measure the effectiveness of teaching methods. Standardized tests, quizzes, and exams are commonly used to collect data on student learning outcomes. Teachers may also use classroom observation and student feedback to gather data on teaching effectiveness.
  • Agriculture : Farmers use data collection methods to monitor crop growth and health. Sensors and remote sensing technology can be used to collect data on soil moisture, temperature, and nutrient levels. This data is used to optimize crop yields and minimize waste.
  • Environmental sciences : Environmental scientists use data collection methods to monitor air and water quality, track climate patterns, and measure the impact of human activity on the environment. They may use sensors, satellite imagery, and laboratory analysis to collect data on environmental factors.
  • Transportation : Transportation companies use data collection methods to track vehicle performance, optimize routes, and improve safety. GPS systems, on-board sensors, and other tracking technologies are used to collect data on vehicle speed, fuel consumption, and driver behavior.

Examples of Data Collection

Examples of Data Collection are as follows:

  • Traffic Monitoring: Cities collect real-time data on traffic patterns and congestion through sensors on roads and cameras at intersections. This information can be used to optimize traffic flow and improve safety.
  • Social Media Monitoring : Companies can collect real-time data on social media platforms such as Twitter and Facebook to monitor their brand reputation, track customer sentiment, and respond to customer inquiries and complaints in real-time.
  • Weather Monitoring: Weather agencies collect real-time data on temperature, humidity, air pressure, and precipitation through weather stations and satellites. This information is used to provide accurate weather forecasts and warnings.
  • Stock Market Monitoring : Financial institutions collect real-time data on stock prices, trading volumes, and other market indicators to make informed investment decisions and respond to market fluctuations in real-time.
  • Health Monitoring : Medical devices such as wearable fitness trackers and smartwatches can collect real-time data on a person’s heart rate, blood pressure, and other vital signs. This information can be used to monitor health conditions and detect early warning signs of health issues.

Purpose of Data Collection

The purpose of data collection can vary depending on the context and goals of the study, but generally, it serves to:

  • Provide information: Data collection provides information about a particular phenomenon or behavior that can be used to better understand it.
  • Measure progress : Data collection can be used to measure the effectiveness of interventions or programs designed to address a particular issue or problem.
  • Support decision-making : Data collection provides decision-makers with evidence-based information that can be used to inform policies, strategies, and actions.
  • Identify trends : Data collection can help identify trends and patterns over time that may indicate changes in behaviors or outcomes.
  • Monitor and evaluate : Data collection can be used to monitor and evaluate the implementation and impact of policies, programs, and initiatives.

When to use Data Collection

Data collection is used when there is a need to gather information or data on a specific topic or phenomenon. It is typically used in research, evaluation, and monitoring and is important for making informed decisions and improving outcomes.

Data collection is particularly useful in the following scenarios:

  • Research : When conducting research, data collection is used to gather information on variables of interest to answer research questions and test hypotheses.
  • Evaluation : Data collection is used in program evaluation to assess the effectiveness of programs or interventions, and to identify areas for improvement.
  • Monitoring : Data collection is used in monitoring to track progress towards achieving goals or targets, and to identify any areas that require attention.
  • Decision-making: Data collection is used to provide decision-makers with information that can be used to inform policies, strategies, and actions.
  • Quality improvement : Data collection is used in quality improvement efforts to identify areas where improvements can be made and to measure progress towards achieving goals.

Characteristics of Data Collection

Data collection can be characterized by several important characteristics that help to ensure the quality and accuracy of the data gathered. These characteristics include:

  • Validity : Validity refers to the accuracy and relevance of the data collected in relation to the research question or objective.
  • Reliability : Reliability refers to the consistency and stability of the data collection process, ensuring that the results obtained are consistent over time and across different contexts.
  • Objectivity : Objectivity refers to the impartiality of the data collection process, ensuring that the data collected is not influenced by the biases or personal opinions of the data collector.
  • Precision : Precision refers to the degree of accuracy and detail in the data collected, ensuring that the data is specific and accurate enough to answer the research question or objective.
  • Timeliness : Timeliness refers to the efficiency and speed with which the data is collected, ensuring that the data is collected in a timely manner to meet the needs of the research or evaluation.
  • Ethical considerations : Ethical considerations refer to the ethical principles that must be followed when collecting data, such as ensuring confidentiality and obtaining informed consent from participants.
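Reliability in particular is often checked by having two researchers code or rate the same material independently. A crude but common first measure is percent agreement; the codes below are hypothetical:

```python
# Minimal sketch: percent agreement between two independent coders.
# The code labels are hypothetical examples.
def percent_agreement(coder_a, coder_b):
    """Fraction of items on which the two coders assigned the same code."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

a = ["workload", "support", "workload", "other", "support"]
b = ["workload", "support", "other", "other", "support"]
print(percent_agreement(a, b))  # 4 of 5 codes agree -> 0.8
```

Percent agreement does not correct for chance agreement; measures such as Cohen's kappa are typically preferred when reporting reliability formally.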

Advantages of Data Collection

There are several advantages of data collection that make it an important process in research, evaluation, and monitoring. These advantages include:

  • Better decision-making : Data collection provides decision-makers with evidence-based information that can be used to inform policies, strategies, and actions, leading to better decision-making.
  • Improved understanding: Data collection helps to improve our understanding of a particular phenomenon or behavior by providing empirical evidence that can be analyzed and interpreted.
  • Evaluation of interventions: Data collection is essential in evaluating the effectiveness of interventions or programs designed to address a particular issue or problem.
  • Identifying trends and patterns: Data collection can help identify trends and patterns over time that may indicate changes in behaviors or outcomes.
  • Increased accountability: Data collection increases accountability by providing evidence that can be used to monitor and evaluate the implementation and impact of policies, programs, and initiatives.
  • Validation of theories: Data collection can be used to test hypotheses and validate theories, leading to a better understanding of the phenomenon being studied.
  • Improved quality: Data collection is used in quality improvement efforts to identify areas where improvements can be made and to measure progress towards achieving goals.

Limitations of Data Collection

While data collection has several advantages, it also has some limitations that must be considered. These limitations include:

  • Bias : Data collection can be influenced by the biases and personal opinions of the data collector, which can lead to inaccurate or misleading results.
  • Sampling bias : Data collection may not be representative of the entire population, resulting in sampling bias and inaccurate results.
  • Cost : Data collection can be expensive and time-consuming, particularly for large-scale studies.
  • Limited scope: Data collection is limited to the variables being measured, which may not capture the entire picture or context of the phenomenon being studied.
  • Ethical considerations : Data collection must follow ethical principles to protect the rights and confidentiality of the participants, which can limit the type of data that can be collected.
  • Data quality issues: Data collection may result in data quality issues such as missing or incomplete data, measurement errors, and inconsistencies.
  • Limited generalizability : Data collection may not be generalizable to other contexts or populations, limiting the generalizability of the findings.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer



Qualitative Data Collection

May 01, 2012

Presentation Transcript

Qualitative Data Collection JN602 Week 08 Veal Chapter 7, CDS Chapter 6

Outline • Discuss the assumptions of qualitative research • Describe how qualitative researchers respond to the demands of accuracy and replicability • Explain the process of planning and conducting a research interview • Discuss the face-to-face, telephone and computer-assisted interview • Explain the process of planning and conducting a focus group • Describe non-participant and participant observational studies • Identify other special data sources • Describe the ethical issues involved in qualitative research

The nature of qualitative methods (Veal, 2005) • Use of qualitative rather than quantitative information: a large amount of ‘rich’ information about a small number of subjects • Assumes that reality is socially and subjectively constructed • Researcher’s task is to uncover meanings rather than test pre-established hypotheses: usually inductive rather than deductive • Assumes people are best able to describe their own situation, beliefs, motivations etc.

Assumptions of qualitative research (CDS) • Emphasis on understanding • Perspectival view • Discover patterns • Indwelling • Human-as-an-instrument • Hidden tacit knowledge

Accuracy and Replicability • Trustworthiness • Verification • Acknowledging subjectivity and bias • Process and sequence • Interpretation • Referential adequacy • Transparency

Some advantages of qualitative methods (Veal, 2005) • Understanding/explaining personal experiences of individuals • Focus on subjects' own understanding and interpretations • Researcher experiences issues from a participant's perspective • Reports presented in a narrative rather than a statistical form – more interesting/understandable for non-experts • Useful in examining personal changes over time • Focus on human-interest issues that are meaningful to managers

Management contexts for qualitative research • Corporate culture – exploring the way groups or networks of individuals operate in the working environment • Consumer decision-making – exploring how customers actually interact with suppliers and products and services • Work-life balance – examining how employees allocate time and resources across their different life roles

The qualitative research process

Qualitative sample designs • Representative of population • Non-probability methods • No systematic bias • Avoid ‘paralysis by analysis’

The range of qualitative methods (Veal, 2005) • In-depth interviews • Small number of subjects • Checklist rather than a formal questionnaire • Tape-recording + verbatim transcript • 30 minutes to several hours; repeat interviews possible. • Group interviews/focus groups • Conducted with a group • Interaction between subjects + interaction between interviewer and subject.

The range of qualitative methods - continued (Veal, 2005) • Participant observation • Researcher is actual participant along with subjects • Researcher known as researcher, or incognito. • Analysis of texts • Can include print, audio-visual, artefacts • Ethnography • Use a number of the above techniques - borrowed from anthropology • Biographical research • Individual full or partial life histories • In-depth interviews, documentary evidence + subjects' own written accounts.

In-depth interviews (Veal, 2005) • Nature • Lengthy, probing interview • Encourages subject to talk • Often tape-recorded + verbatim transcript • Often uses checklist rather than questionnaire – see Fig. 7.2

In-depth interviews: purpose • For research where: • Number of subjects is small - quantitative research not appropriate; • Information from subjects expected to vary considerably; or • Exploratory/preliminary stage in planning a larger study, possibly quantitative.

Interviewer responses • Whyte’s hierarchy of interviewer responses: • 'Uh-huh‘ • 'That's interesting‘ • Reflection - Repeating the last statement as a question • Probe - Inviting explanations of statements • Back tracking • New topic

Interviewing techniques • Pattern of an Interview • Listening • Questioning • Paraphrasing • Probing • Summarising • Non-verbal behaviour

Pattern of Interview (CDS Fig. 6.2, p. 139)

Phases of the interview • Ritual • Pass time • Reason • Rules • Preview • Activity • Questions • Final comments • Summary • Exit

The interview schedule • A means for structuring interviews • May include: • Open questions • Closed questions • “Laddering” and probing: further exploration for “richer” data
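Such a schedule can be represented as a simple ordered structure mixing the question types described; the question wording and probes below are hypothetical examples:

```python
# Minimal sketch of an interview schedule: open/closed questions with probes.
# Question wording and probe text are hypothetical examples.
schedule = [
    {"type": "closed", "question": "Do you work remotely?", "probes": []},
    {"type": "open", "question": "How has remote work changed your routine?",
     "probes": ["Can you give an example?",        # probing for detail
                "Why does that matter to you?"]},  # "laddering" for richer data
]

for item in schedule:
    print(f"[{item['type']}] {item['question']}")
    for probe in item["probes"]:
        print(f"    probe: {probe}")
```

Keeping probes attached to the open questions they elaborate mirrors how an interviewer uses the schedule: the closed items structure the session, while the probes support laddering toward richer data.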

Focus groups/group interviews (Veal, 2005) • Nature • Interviewer becomes a ‘facilitator’ of the discussion • Group members interact with each other as well as the facilitator • Used when: • an important group may be too small to be picked up in a sample survey • The interaction/discussion process among group members is of interest • Alternative to in-depth interview when multiple interviews not practicable

Focus groups - methods • Group size: 5-12 members • Discussion typically tape-recorded • Written summary prepared from the tape • Facilitator role similar to interviewer in in-depth interview • Facilitator: • leads discussion through the range of topics of interest • ensures that all members ‘have their say’ – avoid domination by one or more group members

Issues in designing focus groups • Logistics • Group Composition • Homogeneity • Representation • Strangers vs acquaintances • Size of group

Conducting the Focus Group • Use pattern of interview as guide • Specific considerations • Facilitator team • Recording • Use of visual aids • Thinking time • Group dynamics

Participant observation • Nature • Researcher participates in the social process being studied – eg. as employee • Observing people in their natural work setting or in lab setting • Purpose • To study situations where complex/detailed information required from a group: • Group dynamics • Inter-personal relationships/dealings

Roles of the Researcher • Complete participant The researcher becomes a member of the subject group. • Complete observer The researcher does not take part in group activities. Researcher located in a hidden position. • Observer as participant The researcher observes but is still uninvolved. Their status known to subject group. • Participant as observer Participate and observe. Full awareness of subject group.

The overlap • Between the collection and analysis of data • Interviewer involvement in the collection process requires reflection and reflexivity

Reflexivity Reflexivity is used to identify areas of potential bias and to manage them so their influence on the research process and findings is minimal. Total objectivity is not possible. Each person’s values, attitudes and beliefs are the result of a combination of factors including historical context, socioeconomic status, culture, gender … Consider the different social standing, cultural background, and ethnicity between researcher and subject/participant. Identify the power differences (political context) and how they can lead to exploitation and misinterpretation of the subject’s reality. (Source: L. Gottschalk)

Casual Telephone Interviewers Required • Are you interested in being paid to discuss important issues with regional & rural residents? • The Centre for Regional Innovation and Competitiveness (CRIC) is a research centre based within the School of Business. We are currently seeking suitable persons to join our team of telephone survey interviewers How to apply: • Please call David Lynch (Research Associate, CRIC) on 03 5327 9487 or email your CV to: [email protected]

What is involved? • No selling involved • Friendly working environment • Casual work with a good hourly rate • Flexible roster – mornings, afternoons & evenings with some weekend shifts available (so you can work around your study commitments!) • Convenient on-campus location • Potential for long-term work interviewing regional and rural residents and businesses • No experience necessary as you will be provided with full training • Play an important role in the University’s research activities

What are we looking for? • You will need: • Great telephone manner and reading skills • Good computer keyboard and typing skills • A genuine interest in regional and rural issues • A good work ethic and attitude • To be reliable when committing to shifts • Work is on a project-by-project basis • Positions available immediately • Applications accepted on an ongoing basis


METHODS USED FOR QUALITATIVE DATA COLLECTION

METHODS USED FOR QUALITATIVE DATA COLLECTION

Data plays an important role in any research or study conducted. It aids in bringing about a breakthrough in the respective field as well as for future researches. The collection of data is carried out in two forms viz: Qualitative Data and Quantitative Data which includes further bifurcation under it. What is Qualitative Data? Qualitative research can be defined as the method of research which focuses on gaining relevant information through observational, open-ended and communication method. They are more exploratory which concentrates on gaining insights about the situation and dig a bit deeper to find the underlying reason. The central idea behind using this method is to find the answer to Why and How rather than How many. Data gathered during a qualitative research is what is termed as qualitative data. What is the purpose? A qualitative data is non-numerical and more textual which comprises mostly of images, written texts, recorded audios and spoken words by people. Moreover, one can conduct qualitative research online as well as offline too. Apart from this, the varied purpose of qualitative research is as follows: - To examine the purpose or reason for the situation - Gain an understanding of the experience of people - Understanding of relations and meaning - Varied norms including social and political as well as contextual and cultural practice which impact the cause.

146 views • 12 slides

Reporting qualitative data

Reporting qualitative data

Reporting qualitative data. The problem: how to convey to the rest of the design team what was learned in qualitative needs assessment?. Representation. No representation is an objective, unbiased report of what’s real

201 views • 19 slides

Data Collection, Measurement, &  Data Quality in Quantitative and Qualitative Research

Data Collection, Measurement, & Data Quality in Quantitative and Qualitative Research

Data Collection, Measurement, & Data Quality in Quantitative and Qualitative Research. Data Collection Methods. Without appropriate data collection methods, the validity of research conclusions is easily challenged. Data Collection Methods. Using New Data Collect own data for the study.

1.17k views • 80 slides

Qualitative data analysis

Qualitative data analysis

Qualitative data analysis. Principles of qualitative data analysis. I mportant for researchers to recognise and account for own perspective Respondent validation Seek alternative explanations

380 views • 21 slides

Divergent Insights- Data Collection Methods in Qualitative Research

Divergent Insights- Data Collection Methods in Qualitative Research

Data Collection is the process of gathering and measuring information on variables of interest. It helps you learn more about your customers, discover market trends, improves the quality of decisions, helps understand the needs, resolve issues and improve the quality of your products or services. Divergent Insights always helps you know your customers by collecting data to improve the business. Visit us to know more: www.divergentinsights.com

113 views • 10 slides

  • Methodology
  • Open access
  • Published: 21 June 2024

The Rapid Implementation Feedback (RIF) report: real-time synthesis of qualitative data for proactive implementation planning and tailoring

  • Erin P. Finley ORCID: orcid.org/0000-0003-4497-7721 1,2,
  • Joya G. Chrystal 1,
  • Alicia R. Gable 1,
  • Erica H. Fletcher 1,
  • Agatha Palma 1,
  • Ismelda Canelo 1,
  • Rebecca S. Oberman 1,
  • La Shawnta S. Jackson 1,
  • Rachel Lesser 1,
  • Tannaz Moin 1,3,
  • Bevanne Bean-Mayberry 1,3,
  • Melissa M. Farmer 1 &
  • Alison Hamilton 1,3

Implementation Science Communications volume 5, Article number: 69 (2024)


Background

Qualitative methods are a critical tool for enhancing implementation planning and tailoring, yet rapid turn-around of qualitative insights can be challenging in large implementation trials. The Department of Veterans Affairs-funded EMPOWER 2.0 Quality Enhancement Research Initiative (QUERI) is conducting a hybrid type 3 effectiveness-implementation trial comparing the impact of Replicating Effective Programs (REP) and Evidence-Based Quality Improvement (EBQI) as strategies for implementing three evidence-based practices (EBPs) for women Veterans. We describe the development of the Rapid Implementation Feedback (RIF) report, a pragmatic, team-based approach for the rapid synthesis of qualitative data to aid implementation planning and tailoring, as well as findings from a process evaluation of adopting the RIF report within the EMPOWER 2.0 QUERI.

Methods

Trained qualitative staff conducted 125 semi-structured pre-implementation interviews with frontline staff, providers, and leadership across 16 VA sites between October 2021 and October 2022. High-priority topic domains informed by the updated Consolidated Framework for Implementation Research were selected in dialogue between EMPOWER 2.0 implementation and evaluation teams, and relevant key points were summarized for each interview to produce a structured RIF report, with emergent findings about each site highlighted in weekly written and verbal communications. Process evaluation was conducted to assess EMPOWER 2.0 team experiences with the RIF report across pre-implementation data collection and synthesis and implementation planning and tailoring.

Results

Weekly RIF updates supported continuous EMPOWER 2.0 team communication around key findings, particularly questions and concerns raised by participating sites related to the three EBPs. Introducing the RIF report into team processes enhanced: team communication; quality and rigor of qualitative data; sensemaking around emergent challenges; understanding of site readiness; and tailoring of REP and EBQI implementation strategies. RIF report findings have facilitated rapid tailoring of implementation planning and rollout, supporting increased responsiveness to sites’ needs and concerns.

Conclusions

The RIF report provides a structured strategy for distillation of time-sensitive findings, continuous team communication amid a complex multi-site implementation effort, and effective tailoring of implementation rollout in real-time. Use of the RIF report may also support trust-building by enhancing responsiveness to sites during pre- and early implementation.

Trial registration

Enhancing Mental and Physical Health of Women Veterans (NCT05050266); https://clinicaltrials.gov/study/NCT05050266?term=EMPOWER%202.0&rank=1

Date of registration: 09/09/2021.


Contributions to the literature

Tailoring implementation strategies for specific site needs is often critical for successful implementation. However, few approaches ensure that implementation teams possess the necessary information to deliver timely, tailored strategies in multi-site trials.

We introduce a practical approach, the Rapid Implementation Feedback (RIF) report, designed to share critical information within implementation and evaluation teams. We illustrate how the RIF report has proven instrumental in fostering effective communication and tailoring within the EMPOWER 2.0 Quality Enhancement Research Initiative (QUERI).

The RIF report offers a method for sharing pertinent and time-sensitive findings, empowering teams to swiftly and effectively tailor implementation in real time.

As implementation science has matured, implementation trials have become increasingly complex, often comparing two or more implementation strategies, integrating multiple quantitative and qualitative methods, and occurring across a dozen or more sites. Such complex initiatives require larger teams of implementation researchers and practitioners, raising challenges for effective and timely communication within teams. Meanwhile, tailoring interventions and implementation rollout to align with the unique strengths and challenges at individual sites – recognized as a valuable and often requisite strategy for achieving implementation and sustainment [1, 2, 3] – requires intensive, flexible, and dynamic engagement with sites. Contextual factors must be assessed, key partners identified, and critical information synthesized and shared to allow for rapid sensemaking and problem-solving.

The growth of implementation science as a field has been accompanied by an acceleration in the variety, rigor, and rapidity of qualitative methods available to support real-world research translation [4, 5]. Effective work in implementation often requires gathering information that is purposeful and systematic, represents a variety of partners and perspectives, and accurately synthesizes diverse viewpoints to support meaningful communication and decision-making at every stage of implementation. Accordingly, an array of methodological strategies for supporting participatory and partner-engaged processes [6, 7], rapid qualitative data collection and analysis [8, 9], and ethnographic and observational approaches [10, 11, 12] have emerged, offering a growing range of qualitative methods to meet the needs of a given study or initiative.

To make use of these methods effectively, work and team processes suitable for the implementation context are needed. The importance of strong communication and relationship networks within implementing sites and teams has been recognized since the early days of the field [13, 14, 15], and recent scholarship has examined how relational communication is embedded within most strategies for implementation [16], trust-building [17], and scale-up and spread [18]. Yet relatively little scholarship has put forward methods for ensuring timely and effective communication within implementation teams, particularly amid efforts to achieve site-level tailoring in real-time. Across eight years of conducting hybrid effectiveness-implementation trials in support of improved care delivery for women Veterans, our team has learned that effective tailoring requires capturing and sharing critical information in an ongoing way [4, 10, 19]. In the first part of this article, we describe the development of a pragmatic, team-based approach for the rapid synthesis of qualitative data to support implementation planning and tailoring: the Rapid Implementation Feedback (RIF) report. In the latter part, we describe findings from a process evaluation of adopting the RIF report within the EMPOWER 2.0 QUERI, outlining how use of this approach has evolved our work.

Background and study overview

Women Veterans represent the fastest-growing proportion of VA healthcare users. Despite substantial VA investment in women’s health, gender disparities persist in certain health outcomes, including cardiovascular and metabolic risk and mental health [20, 21, 22]. In tailoring healthcare delivery for women, prior studies suggest that women Veterans prefer gender-specific care and telehealth options [19, 23]. In response, the VA EMPOWER 2.0 QUERI is conducting a hybrid type 3 effectiveness-implementation trial [24] comparing the impact of Replicating Effective Programs (REP) and Evidence-Based Quality Improvement (EBQI) as strategies for implementing three virtual evidence-based practices (EBPs) for women Veterans in 20 VA sites across the United States: (1) Diabetes Prevention Program (DPP) to reduce risk of progressing to type 2 diabetes [25]; (2) Telephone Lifestyle Coaching (TLC) to reduce cardiovascular risk [26]; and (3) Reach Out, Stay Strong Essentials (ROSE) to prevent postpartum depression [27]. REP combines phased intervention packaging, tailoring, training and technical assistance, and re-customization during maintenance/sustainment [28], while EBQI offers a systematic quality improvement method for engaging frontline providers in improvement efforts via tailoring, multi-level partnership, and ongoing facilitation [29]. We selected these bundled implementation strategies, REP and EBQI, based on their strong evidence for effectively supporting implementation in diverse healthcare settings [28, 30]. Both of these strategies draw upon pre-implementation needs assessment and planned tailoring as key activities for successful implementation, which we postulated would be important based on our experience in the prior EMPOWER QUERI (2015–2020) [19, 30]. These activities were deemed to be non-research by the VA Office of Patient Care Services prior to funding.

To coordinate the separate implementation and evaluation elements of our work, we established distinct-but-overlapping teams under the broader umbrella of EMPOWER 2.0, dedicated to: (1) implementing each of the EBPs (DPP, TLC, ROSE), with these smaller teams led by principal investigators for each EBP; (2) providing REP- or EBQI-consistent implementation support at each site (i.e., “REP team” and “EBQI team” project directors); and (3) executing qualitative and quantitative components of our overall evaluation (described in detail in [24]), in the form of the “qualitative team” and “measures team,” respectively.

EMPOWER 2.0 engagement and outreach

Working in concert across these implementation and evaluation teams, EMPOWER 2.0 followed a standardized process for engaging with sites (Fig. 1). Initial efforts (beginning pre-funding) involved reaching out to partners at the regional Veterans Integrated Service Network (VISN) level to introduce the EBPs, answer questions, and request a list of potential VA medical centers (VAMCs) within the VISN that might be appropriate for implementation. Following EMPOWER 2.0’s cluster-randomized study design, VISNs were assigned to participate in two of the EBPs (either TLC and ROSE or DPP and ROSE; ROSE was offered to all sites in an effort to ensure an adequate number of pregnant Veteran participants) [24]. We extended invitations to identified VAMCs to participate in the two EBPs available in their VISN. If sites expressed interest, we conducted an introductory meeting with providers and leadership from Primary Care, Women’s Health, Mental Health, Whole Health [31], and/or Health Promotion and Disease Prevention, as appropriate to the EBP and each site’s local organization of care. Once a site confirmed their participation, they were randomized to receive either the REP or the EBQI implementation strategy. Following randomization, they were asked to identify a point person for each EBP and key individuals who would be likely to participate in local EBP implementation teams and/or play an important role in supporting implementation (e.g., VAMC leadership). These individuals (e.g., Medical Director, Health Educator) were then invited to participate in pre-implementation interviews prior to initiating REP or EBQI at their site. In each VISN, partners at the VISN level were also invited to participate in pre-implementation interviews, to obtain broader perspectives on the regional women’s health context and priorities.

Fig. 1 EMPOWER 2.0 QUERI site-level outreach, randomization, and engagement

Pre-implementation qualitative interviews

Intended to assess sites’ needs and resources and enable pre-implementation tailoring prior to launch, EMPOWER 2.0 pre-implementation interviews examined baseline care practices for each relevant care condition (prediabetes for DPP; cardiovascular risk for TLC; perinatal mental health for ROSE), as well as updated Consolidated Framework for Implementation Research (CFIR) domains including inner and outer setting, innovation, individuals (e.g., characteristics: motivation) and implementation process [32]. Semi-structured interview guides (previously published [24]) were developed building on prior work in the original EMPOWER QUERI [30] and the Women’s Health Patient-Aligned Clinical Team trial [33]. We have an expert qualitative team, each member of which has master’s- or PhD-level training in qualitative methods and years of experience in conducting team-based qualitative research, including using rapid qualitative analysis approaches [8, 9]. Most team members have worked together on EMPOWER and other projects for over five years.

Between October 2021 and October 2022, the qualitative team completed 125 interviews across 16 sites, with site and VISN-level participants representing a range of roles, including Women Veteran Program Managers, Women’s Health Primary Care Providers, Maternity Care Coordinators, primary care team members, health coaches, and nutritionists. Pre-implementation interviews took an average of 57 days (range 15–108 days) to complete per site, and included 4–13 participants depending on the size and complexity of the care facility.

Developing the Rapid Implementation Feedback (RIF) report

The EMPOWER 2.0 qualitative team has a well-established approach to conducting rapid qualitative analysis [ 8 , 19 ] and strong personnel infrastructure and expertise. Even so, once pre-implementation interviews began, challenges quickly arose in ensuring that findings were being communicated to EMPOWER 2.0 implementation teams for DPP, TLC, and ROSE in a timely and effective manner, particularly given that each team was working with multiple sites concurrently. Key questions included: how do we ensure early findings are shared in time to support pre-implementation tailoring? How do we communicate effectively across the qualitative team conducting interviews and the teams responsible for implementation? And how do we keep qualitative team members up-to-date on implementation, so they are well-informed for interviews?

In responding to these challenges, we developed the Rapid Implementation Feedback (RIF) report to support data distillation and bidirectional feedback across our qualitative and implementation teams. In developing the RIF, the EMPOWER 2.0 implementation teams, which are composed of investigators and project directors for each EBP who provide external implementation support for each site, met with the qualitative interview team and agreed upon high-priority topic domains to be extracted from the interviews. These domains were related to implementation planning and included critical roles for implementation planning and launch; implementation concerns and/or demand for the EBP; and use of data to track women Veterans’ population health needs (see Table 1). These topics reflected both specific CFIR subdomains included in the pre-implementation interview guide (e.g., use of data as an assessment of the CFIR subdomain for information technology infrastructure), as well as higher-level domains combined to aid in prioritizing key issues (e.g., germane responses related to inner setting, individual characteristics, and implementation process were combined into implementation concerns). These topic domains were used to create a RIF report template (see Appendix 1), which was organized under headings by VISN (outer setting), site (inner setting), and EBP [32]; the same domains were selected for all EBPs, ensuring consistency in data distillation across the project. Compiling the RIF report ensured that, for example, all interview data relevant to critical roles for implementation planning for ROSE in Site A were collated and easy to locate.
Thereafter, at the conclusion of an interview, the qualitative team reviewed interview notes and/or Microsoft Teams transcripts and extracted key points relevant to each priority topic; in doing so, team members followed a process similar to that used in developing structured summaries for rapid qualitative analysis [8, 34], but differing by a targeted focus on relatively few domains. For each interview, the analyst would summarize key points related to each RIF domain (e.g., critical roles for implementation planning and launch), as well as any brief or particularly salient quotes; every key point or quote was also labeled with a study identification number indicating the role of the respondent. The resulting key points and quotes were then added to the RIF report, creating a single, up-to-date written resource for implementation teams, which was cumulatively updated over time.

This approach to analysis is distinct in two key ways from the data distillation process typically used in rapid qualitative analysis [8, 34, 35, 36]. First, in rapid qualitative analysis, templated summaries are first created at the level of the individual interview or other data episode, so that each data episode is associated with a summary of contents that can later be compiled into a multi-episode matrix. Second, structured summaries are traditionally intended to capture all of the key findings in a given data episode, and thus are both more comprehensive and less focused than the RIF report. By contrast, the RIF report collapsed two steps (i.e., summary then matrix) into one (i.e., RIF report) to assemble a targeted selection of high-priority data. In addition, because the data for each domain were collated from the beginning into a single document, the process of assessing data heterogeneity (e.g., diversity of opinions) and adequacy (e.g., saturation) for a given site was expedited. Up-to-date findings could be made available to the implementation teams on a consistent basis, despite the fact that the qualitative team was often interviewing among multiple sites concurrently. During this period, EMPOWER 2.0 held a weekly full-team meeting to coordinate implementation and evaluation efforts. The day before this weekly meeting, the updated RIF report was sent to the full EMPOWER 2.0 team in a secure encrypted email, with new additions highlighted for easy reference; the team was also notified if there were no RIF updates for the week. As implementation teams were also working concurrently across multiple sites, the RIF report became a centralized resource for organizing essential information in a dynamic environment.
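As an illustration only (none of this code is from the study), the collapsed summary-then-matrix step can be pictured as filing each key point directly into a cumulative structure keyed by VISN, site, EBP, and priority domain, with new additions tracked for the weekly highlighted update. The class and method names here (`RIFReport`, `add_key_point`, `weekly_update`) are hypothetical:

```python
from collections import defaultdict

class RIFReport:
    """Hypothetical sketch of a cumulative RIF report: key points are filed
    directly under VISN -> site -> EBP -> priority domain, skipping the
    separate per-interview summary step of conventional rapid analysis."""

    def __init__(self):
        # VISN -> site -> EBP -> domain -> list of (respondent_id, key_point)
        nested = lambda: defaultdict(nested)
        self.entries = defaultdict(
            lambda: defaultdict(lambda: defaultdict(lambda: defaultdict(list))))
        self._new_this_week = []  # additions highlighted in the weekly email

    def add_key_point(self, visn, site, ebp, domain, respondent_id, key_point):
        """File a key point (or salient quote) under its site/EBP/domain,
        labeled with the respondent's study ID rather than their name."""
        self.entries[visn][site][ebp][domain].append((respondent_id, key_point))
        self._new_this_week.append((site, ebp, domain, key_point))

    def weekly_update(self):
        """Return and clear the additions to highlight in the weekly update."""
        update, self._new_this_week = self._new_this_week, []
        return update

# Invented example entry, mirroring the domains named in the text:
report = RIFReport()
report.add_key_point("VISN-A", "Site 1", "ROSE",
                     "critical roles for implementation planning and launch",
                     "ID-003", "Maternity Care Coordinator named as point person")
```

Because every domain is collated in one place from the start, checking heterogeneity or saturation for a site is a single lookup rather than a pass over many per-interview summaries.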

Although the brief written RIF expedited communication of time-sensitive information across teams, challenges continued to arise in coordinating activities, tailoring EBPs, and general communication with sites. We therefore added a verbal update to the RIF report (see Fig. 2), summarizing new additions to the RIF as part of our overall EMPOWER 2.0 weekly meeting. Updates were brief, organized by site, and included a summary of interviews conducted that week, along with the roles interviewed and unique findings (e.g., staff turnover issues). Members of the qualitative team also gave feedback on whether saturation had been reached at a site, or if additional interviewing would be helpful in developing a snapshot of key site features, strengths, and potential challenges.

Fig. 2 Core components of the Rapid Implementation Feedback (RIF) report

Process evaluation

To assess whether the RIF was an effective method for communication and coordination, we conducted a process evaluation of EMPOWER 2.0 teams’ experiences of using the RIF report. We reviewed periodic reflections conducted by the first author as part of EMPOWER 2.0’s overall implementation evaluation with 11 members of five internal teams: those responsible for leading DPP, TLC, and ROSE implementation (i.e., PIs and Co-PIs), and for supporting sites using REP and EBQI implementation strategies (i.e., project directors). Periodic reflections [10] are lightly guided discussions conducted by phone or teleconference software, which allow for consistent documentation of implementation activities, processes, and events, both planned and unexpected. We adapted the original periodic reflection template [10] as a discussion guide for EMPOWER 2.0 (previously published [24]). Reflections lasted 15–60 minutes, with length roughly corresponding to the amount of recent implementation activity, and were conducted monthly or bi-monthly with each team.

In examining how the RIF report was working for our teams, we conducted thematic analysis [37] of all periodic reflections (n = 32) completed with EMPOWER 2.0 teams between October 2021, when the RIF was first introduced, and October 2022. All text relevant to the RIF report was extracted and reviewed inductively for key themes associated with perceived impacts of the RIF, resulting in a preliminary set of emergent themes, which were codified into a codebook. All segments of extracted text were then reviewed again and assigned codes as appropriate to their meaning; central findings for each code/theme were then distilled. This preliminary analysis was conducted by the lead author and then presented back to the full EMPOWER 2.0 team to allow for debriefing and member checking [38, 39] over a series of meetings. Team members provided substantive feedback that aided in refining themes, and offered additional reflection and commentary on the RIF report and its role within team processes.
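Thematic coding of this kind is interpretive work done by analysts, not by software, but the bookkeeping side of the codebook step (assigning codes to extracted text segments and tallying themes) can be sketched mechanically. The codes, indicator phrases, and excerpts below are invented purely for illustration:

```python
from collections import Counter

# Invented codebook: theme -> indicator phrases (real coding is interpretive;
# keyword matching here only illustrates the record-keeping).
codebook = {
    "communication": ["real-time feedback", "weekly meeting"],
    "site_readiness": ["enthusiastic", "no one to do it"],
}

def assign_codes(segment, codebook):
    """Assign every code whose indicator phrases appear in the excerpt."""
    text = segment.lower()
    return [code for code, phrases in codebook.items()
            if any(p in text for p in phrases)]

# Invented reflection excerpts:
segments = [
    "I love that the qualitative team is giving us real-time feedback.",
    "Sites saying that they don't have people... no one to do it right now.",
]
theme_counts = Counter(code for s in segments for code in assign_codes(s, codebook))
```

In practice each coded segment would also be reviewed in context and central findings distilled per theme, as the text describes; the tally is only a starting point for that discussion.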

We identified five interconnected impacts associated with introducing the RIF report into EMPOWER 2.0 team processes: enhanced communication across teams; improved quality and rigor of qualitative data; heightened sensemaking around emergent challenges; increased understanding of site readiness; and informed tailoring of REP and EBQI implementation strategies. We describe each of these in turn below.

Enhanced communication across teams

As intended, the RIF was felt to be an effective strategy for improving communication across EMPOWER 2.0’s internal teams. Having the RIF available in written format created an easily accessible resource for implementation teams as they prepared for next steps in engaging with sites, and for qualitative team members as they prepared for upcoming interviews. The verbal RIF update, because it occurred alongside implementation team updates as part of the weekly team call, ensured that information-sharing was bidirectional in real time. The continuous flow of information provided a regular opportunity for answering questions, clarifying areas of potential confusion, and identifying where additional information was needed. Additionally, the RIF served to keep all team members in sync with site-specific information on an ongoing basis.

“I love that the qualitative team is giving us real-time feedback. I don’t think I’ve ever done that except informally. I think that’s been a really nice addition to our meetings.” [EBP 1 lead]

On the whole, the enhanced communication among teams was felt to support team “synergy” and increase synchronization of activities in continued data-gathering and site engagement.

Improved quality and rigor of qualitative data

Although improving rigor was not an explicit goal of developing the RIF report, introducing this structured process was felt to have improved both the quality of data collection and the rigor of early analyses. Because of the improved bidirectional communication occurring as part of the weekly verbal RIF report with implementation teams, qualitative team members felt as though they had an increased understanding of implementation activities and site-level context. This in turn was felt to improve the quality of their interviewing by allowing them to ask more attuned follow-up questions and to prioritize topics that were “meaningful to inform implementation.”

“[We] felt very disconnected in the beginning like we didn’t have any information. Having the weekly calls to talk about these things was really helpful.” [Qualitative team member 1]

Qualitative team members also reported feeling more consistent and “in sync” in their processes for interviewing and preparing the RIF report, as the weekly discussions provided an opportunity for the team to observe, confer, and calibrate regarding the conduct of interviews and the content and level of detail included in ongoing RIF updates.

“It helps us stay impartial as interviewers across stakeholders, across sites, and as we modify the interview guide. It kept all of us…aligned with the parts we need to dig deeper into because they’re RIF/high priority.” [Qualitative team member 2].

In addition, introducing the RIF report was felt to increase the trustworthiness of preliminary analyses and data distillation, because while initial data reviews can be impressionistic or anecdotal, the RIF provided a structured and systematic way of consolidating multi-site data from the first pass. Because the RIF report provided early synthesis, it also aided in generating ideas for targeted analysis and coding conducted as part of evaluation activities in later phases.

Heightened sensemaking around emergent challenges

Arising out of the enhanced team communication, and perhaps supported by the improved quality of information being gathered and distilled by the qualitative team, discussions prompted by the RIF helped the EMPOWER 2.0 team to identify and develop solutions to emergent challenges. As one example, the qualitative team quickly realized that, while it is common practice to keep implementation-focused and evaluation-focused teams distinct in an effort to reduce bias in hybrid trials, sites viewed everyone associated with EMPOWER 2.0 – including interviewers – as an “ambassador” of the project. Interviewers found early on that they were fielding important questions from sites regarding the EBPs and/or implementation plans, and often lacked the information to provide an appropriate response, which placed them in an awkward position. After this issue was raised as part of a weekly RIF update, the teams worked together to develop a living Frequently Asked Questions document to help interviewers answer common questions that were coming up during interviews. This document was later helpful in standardizing communication with sites more generally, serving as a resource for implementation teams as well.

In a second example, a key pre-implementation effort by the EMPOWER 2.0 measures team involved developing a dashboard of population health and performance metrics tailored to provide actionable information to sites on the healthcare needs of their women Veterans. As preparations for site launch continued, and discussions of RIF findings informed ongoing planning efforts, the measures team realized they lacked information on how sites were using existing population health and performance measures. The measures and qualitative teams then worked together to update the interview guide and add priority domains to the RIF report to aid in dashboard development. Having integrated these additions, the qualitative team was able to rapidly confirm the need for a dashboard display of women-only performance measures, and data were used to support tailoring to sites’ needs.

Increased understanding of site readiness

Reflecting the enhanced communication and improved data quality associated with adopting the RIF report, the EMPOWER 2.0 teams were also better able to develop timely assessments of site readiness. The distillation of qualitative interview data provided important contextual information about site-level participants’ level of EBP awareness, motivation, and competing demands prior to implementation planning meetings.

“They just seem generally enthusiastic.” [EBP 2 lead]

“Most of what I was picking up on was people saying, ‘We don’t have anyone to do it.’ Just sites saying that they don’t have people…they don’t want to take it on right now.” [EBP 3 lead]

Armed with this information, implementation teams were able to prepare for engagement and planning efforts with a greater understanding of what the critical issues were likely to be.

Informed tailoring of REP & EBQI strategies

Finally, building on an improved understanding of sites’ pre-implementation readiness, EMPOWER 2.0 teams felt better equipped to engage in planned tailoring of site outreach and implementation activities within the REP and EBQI strategy bundles. For example, when a key leader at one site was revealed to be “not entirely on board” with DPP implementation, the DPP team lead was able to offer targeted outreach to acknowledge and address the concerns expressed. When concerns were raised about staffing and EBP ownership prior to launch of ROSE, the ROSE team lead expressed, “We were prepared for tough conversations.”

“That became our ‘MO’…anything that comes up [in the RIF], we’ll try to address in the kick-off [meeting with sites] to show that we’re helping in addressing their questions.” [EBP 1 lead]

Discussion

The RIF report was developed in response to the challenge, within the EMPOWER 2.0 hybrid type 3 effectiveness-implementation trial, of distilling and sharing critical information among internal teams as they pursued distinct implementation and evaluation tasks with an evolving cast of dynamic sites. Combined, the RIF report’s written and verbal components provide a method and process for rapidly extracting high-priority, actionable data, sharing these data in a focused and digestible way, and supporting team sensemaking and tailoring of implementation approaches in real time.

In evaluating the RIF report process, we found that its key benefits were interconnected and mutually reinforcing. Bidirectional communication increased the quality of qualitative data collection, which in turn improved the depth and salience of the data conveyed to the implementation teams, which in turn increased the teams’ ability to engage in active sensemaking and identify effective strategies for tailoring the implementation approach at each site. The tight informational feedback loop allowed us to be nimble and iterative both in data-gathering (e.g., by adding novel domains to the RIF as needed) and in tailoring (e.g., by allowing us to customize early messaging to address sites’ most pressing concerns).

Tailoring and adaptation of both interventions and implementation strategies have been recognized as essential for the successful translation of research into routine practice [ 40 , 41 , 42 , 43 ]. In response, a variety of qualitative and mixed-methods approaches have been put forward for capturing feedback from diverse partners, including user-centered adaptation [ 44 ], the Method for Program Adaptation through Community Engagement (M-PACE) [ 45 ], the ADAPT guidance [ 46 ], concept mapping [ 47 ], and intervention mapping [ 48 ]. These approaches have strengthened capacity for implementation researchers and practitioners to gather and synthesize often wide-ranging perspectives into actionable guidance for improving the acceptability, feasibility, appropriateness, and compatibility of interventions and implementation strategies. Yet there remains significant opportunity to streamline and systematize methods for tailoring in the context of hybrid type 2 and 3 trials, which often conduct formative evaluation in real time amid simultaneous data collection and implementation activities. In addition to providing a model for how to embed a structured method for data capture, distillation, and sharing within a complex implementation trial, we believe the RIF report offers a pragmatic method to improve both the quality of information synthesis and the ability of teams to engage in timely sensemaking.

Creating an effective internal communication process via the RIF supported tailored delivery of EBPs at each site, which in turn was felt to enhance the relationships between EMPOWER 2.0 QUERI members and site partners. The role of relationships as an underlying and underexplored element within implementation has garnered increasing attention [ 15 ]. Bartley et al. [ 16 ] conducted an analysis of the Expert Recommendations for Implementing Change (ERIC) taxonomy of implementation strategies [ 49 ], and found that nearly half (36 of 73) could be classified as highly or semi-relational in nature. Connelly and collaborators [ 50 ] developed a Relational Facilitation Guidebook based in relational coordination and the principle that high-quality communication and relationships result in improved healthcare quality. Metz and colleagues [ 17 ] have proposed a theoretical model for building trusting relationships to support implementation, drawing on theory and research evidence to identify both technical and relational strategies associated with nurturing trust. There is considerable overlap between Metz et al.’s strategies and the processes supported by adopting the RIF report in EMPOWER 2.0, particularly those related to bidirectional communication, co-learning, and frequent interactions, which in turn enabled greater responsiveness to sites. We found the structured communication offered by the RIF helped to support trust-building both within EMPOWER 2.0 and in our teams’ interactions with sites.

Future teams weighing potential use of the RIF report should first consider whether the RIF report is suitable to their project goals and resources. It may be less suitable for teams whose timelines allow for traditional coding-based or rapid qualitative approaches to data analysis, who do not intend to engage in formative evaluation or planned tailoring, or who have concerns that any modifications to the implementation approach may be incompatible with their trial design. In EMPOWER 2.0, core components for determining fidelity to implementation strategy in both study arms (REP and EBQI) were identified before initiating pre-implementation activities, and both strategies included planned tailoring to address specific conditions at sites (e.g., perceived patient needs, key professional roles and service lines to be involved). We were thus able to ensure that no decisions made in RIF-related or other discussions varied from our trial protocol.

Teams electing to adopt the RIF report may benefit from discussing how best to integrate this method into their workflow, and what specific tailoring of the RIF report is needed to ensure alignment with their implementation, research, and/or evaluation goals. We recommend that teams discuss and come to consensus on four RIF elements: (1) selected high-priority topic domains, e.g., site-level concerns, which may be higher-level or more closely focused on implementation theory constructs, as appropriate to the project; (2) what data sources will be included (e.g., data from provider or leadership interviews, surveys, or periodic reflections); (3) the preferred format for written and verbal RIF reports, including salient categories for organizing information (e.g., by site or professional role); and (4) the preferred frequency of sharing RIF reports. Given the established importance of identifying effective local champions in implementation [ 51 , 52 , 53 , 54 ], identifying critical roles and service lines for implementation planning and launch are domains likely to be of value for many projects, as is the domain of implementation concerns , which encapsulates important doubts or anxieties expressed by respondents that may be addressable by the implementation team. Teams documenting shifts to the implementation approach in response to respondent feedback might also consider adding a tailoring / action items or next steps domain to track decisions made during discussions of RIF findings. With regard to frequency, weekly RIF reports worked well for EMPOWER 2.0 because this tempo aligned with existing meetings and the busy pace of pre-implementation activities, but this frequency may not be necessary for all teams. Dialogue across these issues is likely to be of value for teams in developing a shared understanding of how project goals will be operationalized, and may allow for more agile responses when change is needed or challenges arise.

There are several limitations to the process evaluation described here. First, it should be noted that periodic reflections were conducted by the first author, who has worked with most members of the implementation teams for at least five years. As an ethnographic method occurring repeatedly over time, reflections benefit from the long-term relationship built between discussion lead and participants, and may be subject to less reporting bias than other data collection methods [ 10 ]. Nonetheless, the potential for biased reporting should be acknowledged. We endeavored to ensure the accuracy, completeness, and trustworthiness of findings [ 39 , 55 , 56 ] by engaging in multiple rounds of member checking with the EMPOWER 2.0 team, first in dedicated meetings and later in preparing and revising this manuscript.

In considering the limitations of the RIF report as a methodological approach to support effective distillation and tailoring, it is important to note that this process was developed and executed by a highly trained and experienced team, which likely facilitated qualitative team members in completing the structured reports in a timely and consistent manner. We found that analyses conducted for the RIF report were adequate to support all of the pre-implementation tailoring required for this initiative; however, projects – and particularly projects occurring earlier in the implementation pipeline than this hybrid type 3 trial – may vary in their early-stage analytic needs. Notably, no negative impacts associated with introducing the RIF were identified by team members; this may reflect the fact that the RIF report replaced other rapid qualitative analysis activities (e.g., developing structured summaries for each interview) rather than adding to the team workload. It should be noted that the EMPOWER 2.0 core team also builds on significant experience working together over time, which may have enhanced the quality of communication and coordination emerging from RIF updates. The RIF report may not be relevant or appropriate in implementation efforts where formative evaluation and/or tailoring are not intended or desirable (e.g., in implementation trials assessing the effectiveness of strategies that do not include planned tailoring), although its step-by-step process for synthesizing data relevant to high-priority topics for rapid communication is likely to have broad utility. Future research should consider whether the RIF report has generalizability as a method for use in less complex implementation studies, or by smaller or less experienced teams.

Conclusions

Rapid qualitative methods are a critical tool for enhancing implementation planning, communication, and tailoring, but can be challenging to execute in the context of complex implementation trials, such as those occurring across multiple sites and requiring coordination across implementation and evaluation teams. The RIF report extends rapid qualitative methods by providing a structured process to enhance focused data distillation and timely communication across teams, laying the groundwork for an up-to-date assessment of site readiness, improved identification and sensemaking around emergent problems, and effective and responsive tailoring to meet the needs of diverse sites.

Availability of data and materials

The datasets generated and/or analysed during the current study are not publicly available as participants have not provided consent for sharing; de-identified portions may be available from the corresponding author on reasonable request.

Abbreviations

DPP: Diabetes Prevention Program

EBP: Evidence-Based Practice

EBQI: Evidence-Based Quality Improvement

EMPOWER: Enhancing Mental and Physical Health of Women Veterans through Engagement and Retention

QUERI: Quality Enhancement Research Initiative

REP: Replicating Effective Programs

RIF: Rapid Implementation Feedback

ROSE: Reach Out, Stay Strong Essentials

TLC: Telephone Lifestyle Coaching

VA: Veterans Affairs

VAMC: VA Medical Center

VISN: Veterans Integrated Service Network

References

1. Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, et al. Identifying determinants of care for tailoring implementation in chronic diseases: an evaluation of different methods. Implementation Sci. 2014;9(1):102.

2. Treichler EBH, Mercado R, Oakes D, Perivoliotis D, Gallegos-Rodriguez Y, Sosa E, et al. Using a stakeholder-engaged, iterative, and systematic approach to adapting collaborative decision skills training for implementation in VA psychosocial rehabilitation and recovery centers. BMC Health Serv Res. 2022;22(1):1543.

3. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implementation Science. 2013;8(1). Available from: https://doi.org/10.1186/1748-5908-8-117. Cited 2017 Mar 14.

4. Hamilton AB, Finley EP. Qualitative methods in implementation research: An introduction. Psychiatry Res. 2019;280:112516.

5. Cohen D, Crabtree BF, Damschroder LJ, Hamilton AB, Heurtin-Roberts S, Leeman J, et al. Qualitative Methods in Implementation Science. National Cancer Institute; 2018. Available from: https://cancercontrol.cancer.gov/sites/default/files/2020-09/nci-dccps-implementationscience-whitepaper.pdf

6. Cunningham-Erves J, Mayo-Gamble T, Vaughn Y, Hawk J, Helms M, Barajas C, et al. Engagement of community stakeholders to develop a framework to guide research dissemination to communities. Health Expect. 2020;23(4):958–68.

7. Hamilton AB, Brunner J, Cain C, Chuang E, Luger TM, Canelo I, et al. Engaging multilevel stakeholders in an implementation trial of evidence-based quality improvement in VA women’s health primary care. Behav Med Pract Policy Res. 2017;7(3):478–85.

8. Hamilton AB. Qualitative methods in rapid turn-around health services research. VA HSR&D Cyberseminar Spotlight on Women’s Health; 2013 Dec 11. Available from: http://www.hsrd.research.va.gov/for_researchers/cyber_seminars/archives/780-notes.pdf

9. St. George SM, Harkness AR, Rodriguez-Diaz CE, Weinstein ER, Pavia V, Hamilton AB. Applying Rapid Qualitative Analysis for Health Equity: Lessons Learned Using “EARS” With Latino Communities. Int J Qual Methods. 2023;22:160940692311649.

10. Finley EP, Huynh AK, Farmer MM, Bean-Mayberry B, Moin T, Oishi SM, et al. Periodic reflections: a method of guided discussions for documenting implementation phenomena. BMC Med Res Methodol. 2018;18(1):153.

11. Gertner AK, Franklin J, Roth I, Cruden GH, Haley AD, Finley EP, et al. A scoping review of the use of ethnographic approaches in implementation research and recommendations for reporting. Implementation Research and Practice. 2021;2:263348952199274.

12. Palinkas LA, Zatzick D. Rapid Assessment Procedure Informed Clinical Ethnography (RAPICE) in Pragmatic Clinical Trials of Mental Health Services Implementation: Methods and Applied Case Study. Adm Policy Ment Health. 2019;46(2):255–70.

13. Lanham HJ, McDaniel RR, Crabtree BF, Miller WL, Stange KC, Tallia AF, et al. How Improving Practice Relationships Among Clinicians and Nonclinicians Can Improve Quality in Primary Care. Jt Comm J Qual Patient Saf. 2009;35(9):457–66.

14. Miake-Lye IM, Delevan DM, Ganz DA, Mittman BS, Finley EP. Unpacking organizational readiness for change: an updated systematic review and content analysis of assessments. BMC Health Serv Res. 2020;20(1):106.

15. Finley EP, Closser S, Sarker M, Hamilton AB. Editorial: The theory and pragmatics of power and relationships in implementation. Front Health Serv. 2023;3:1168559.

16. Bartley L, Metz A, Fleming WO. What implementation strategies are relational? Using Relational Theory to explore the ERIC implementation strategies. Front Health Serv. 2022;2:913585.

17. Metz A, Jensen T, Farley A, Boaz A, Bartley L, Villodas M. Building trusting relationships to support implementation: A proposed theoretical model. Front Health Serv. 2022;2:894599.

18. Ketley D. A new and unique resource to help you spread and scale innovation and improvement. NHS Horizons; 2023. Available from: https://horizonsnhs.com/a-new-and-unique-resource-to-help-you-spread-and-scale-innovation-and-improvement/. Cited 2023 Mar 27.

19. Dyer KE, Moreau JL, Finley E, Bean-Mayberry B, Farmer MM, Bernet D, et al. Tailoring an evidence-based lifestyle intervention to meet the needs of women Veterans with prediabetes. Women Health. 2020;60(7):748–62.

20. Goldstein KM, Melnyk SD, Zullig LL, Stechuchak KM, Oddone E, Bastian LA, et al. Heart Matters: Gender and Racial Differences Cardiovascular Disease Risk Factor Control Among Veterans. Women’s Health Issues. 2014;24(5):477–83.

21. Vimalananda VG, Biggs ML, Rosenzweig JL, Carnethon MR, Meigs JB, Thacker EL, et al. The influence of sex on cardiovascular outcomes associated with diabetes among older black and white adults. J Diabetes Complications. 2014;28(3):316–22.

22. Breland JY, Phibbs CS, Hoggatt KJ, Washington DL, Lee J, Haskell S, et al. The Obesity Epidemic in the Veterans Health Administration: Prevalence Among Key Populations of Women and Men Veterans. J Gen Intern Med. 2017;32(S1):11–7.

23. Sheahan KL, Goldstein KM, Than CT, Bean-Mayberry B, Chanfreau CC, Gerber MR, et al. Women Veterans’ Healthcare Needs, Utilization, and Preferences in Veterans Affairs Primary Care Settings. J Gen Intern Med. 2022;37(S3):791–8.

24. Hamilton AB, Finley EP, Bean-Mayberry B, Lang A, Haskell SG, Moin T, et al. Enhancing Mental and Physical Health of Women through Engagement and Retention (EMPOWER) 2.0 QUERI: study protocol for a cluster-randomized hybrid type 3 effectiveness-implementation trial. Implement Sci Commun. 2023 Mar 8;4(1):23.

25. Moin T, Damschroder LJ, AuYoung M, Maciejewski ML, Havens K, Ertl K, et al. Results From a Trial of an Online Diabetes Prevention Program Intervention. Am J Prev Med. 2018;55(5):583–91.

26. Damschroder LJ, Reardon CM, Sperber N, Robinson CH, Fickel JJ, Oddone EZ. Implementation evaluation of the Telephone Lifestyle Coaching (TLC) program: organizational factors associated with successful implementation. Behav Med Pract Policy Res. 2017;7(2):233–41.

27. Zlotnick C, Tzilos G, Miller I, Seifer R, Stout R. Randomized controlled trial to prevent postpartum depression in mothers on public assistance. J Affect Disord. 2016;189:263–8.

28. Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, Stall R. Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implementation Science. 2007;2(1). Available from: https://doi.org/10.1186/1748-5908-2-42. Cited 2017 May 11.

29. Rubenstein LV, Stockdale SE, Sapir N, Altman L, Dresselhaus T, Salem-Schatz S, et al. A Patient-Centered Primary Care Practice Approach Using Evidence-Based Quality Improvement: Rationale, Methods, and Early Assessment of Implementation. J Gen Intern Med. 2014;29(S2):589–97.

30. Hamilton AB, Farmer MM, Moin T, Finley EP, Lang AJ, Oishi SM, et al. Enhancing Mental and Physical Health of Women through Engagement and Retention (EMPOWER): a protocol for a program of research. Implementation Science. 2017;12(1). Available from: https://doi.org/10.1186/s13012-017-0658-9. Cited 2018 Jan 5.

31. Kligler B. Whole Health in the Veterans Health Administration. Glob Adv Health Med. 2022;11:2164957X2210772.

32. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implementation Sci. 2022;17(1):75.

33. Yano EM, Darling JE, Hamilton AB, Canelo I, Chuang E, Meredith LS, et al. Cluster randomized trial of a multilevel evidence-based quality improvement approach to tailoring VA Patient Aligned Care Teams to the needs of women Veterans. Implementation Sci. 2015;11(1):101.

34. Nevedal AL, Reardon CM, Opra Widerquist MA, Jackson GL, Cutrona SL, White BS, et al. Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR). Implementation Sci. 2021;16(1):67.

35. Kowalski C, Nevedal AL, Finley EP, Young J, Lewinski A, Midboe AM, et al. Raising expectations for rapid qualitative implementation efforts: guidelines to ensure rigor in rapid qualitative study design, conduct, and reporting. 16th Annual Conference on the Science of Dissemination and Implementation in Health; 2023 Dec 13; Washington, D.C.

36. Gale RC, Wu J, Erhardt T, Bounthavong M, Reardon CM, Damschroder LJ, et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implementation Sci. 2019;14(1):11.

37. Braun V, Clarke V. Thematic analysis. In: Cooper H, Camic PM, Long DL, Panter AT, Rindskopf D, Sher KJ, editors. APA handbook of research methods in psychology, Vol 2: Research designs: Quantitative, qualitative, neuropsychological, and biological. Washington: American Psychological Association; 2012. p. 57–71. Available from: http://content.apa.org/books/13620-004. Cited 2023 Mar 28.

38. Torrance H. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research. J Mixed Methods Res. 2012;6(2):111–23.

39. Birt L, Scott S, Cavers D, Campbell C, Walter F. Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation? Qual Health Res. 2016;26(13):1802–11.

40. Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implementation Science. 2013;8(1). Available from: https://doi.org/10.1186/1748-5908-8-65. Cited 2017 Sep 5.

41. Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implementation Sci. 2021;16(1):36.

42. Wiltsey Stirman S, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implementation Sci. 2019;14(1):58.

43. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to Improve the Selection and Tailoring of Implementation Strategies. J Behav Health Serv Res. 2017;44(2):177–94.

44. Ware P, Ross HJ, Cafazzo JA, Laporte A, Gordon K, Seto E. User-Centered Adaptation of an Existing Heart Failure Telemonitoring Program to Ensure Sustainability and Scalability: Qualitative Study. JMIR Cardio. 2018;2(2):e11466.

45. Chen EK, Reid MC, Parker SJ, Pillemer K. Tailoring Evidence-Based Interventions for New Populations: A Method for Program Adaptation Through Community Engagement. Eval Health Prof. 2013;36(1):73–92.

46. Moore G, Campbell M, Copeland L, Craig P, Movsisyan A, Hoddinott P, et al. Adapting interventions to new contexts—the ADAPT guidance. BMJ. 2021;374:n1679.

47. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science. 2015;10(1). Available from: https://doi.org/10.1186/s13012-015-0295-0. Cited 2017 Sep 5.

48. Fernandez ME, Ruiter RAC, Markham CM, Kok G. Intervention Mapping: Theory- and Evidence-Based Health Promotion Program Planning: Perspective and Examples. Front Public Health. 2019;7:209.

49. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science. 2015;10(1). Available from: https://doi.org/10.1186/s13012-015-0209-1. Cited 2017 Nov 2.

50. Connelly B, Gilmartin H, Hale A, Kenney R, Morgon B, Sjoberg H. The Relational Facilitation Guidebook [Internet]. Denver-Seattle Center of Innovation for Veteran-Centered and Value-Driven Care; 2023 Feb. Available from: https://www.seattledenvercoin.research.va.gov/education/rc/docs/Relational_Facilitation_Guidebook.pdf. Cited 2023 Mar 29.

51. Bonawitz K, Wetmore M, Heisler M, Dalton VK, Damschroder LJ, Forman J, et al. Champions in context: which attributes matter for change efforts in healthcare? Implementation Sci. 2020;15(1):62.

52. Demes JAE, Nickerson N, Farand L, Montekio VB, Torres P, Dube JG, et al. What are the characteristics of the champion that influence the implementation of quality improvement programs? Eval Program Plann. 2020;80:101795.

53. Flanagan ME, Plue L, Miller KK, Schmid AA, Myers L, Graham G, et al. A qualitative study of clinical champions in context: Clinical champions across three levels of acute care. SAGE Open Medicine. 2018;6:205031211879242.

54. Wood K, Giannopoulos V, Louie E, Baillie A, Uribe G, Lee KS, et al. The role of clinical champions in facilitating the use of evidence-based practice in drug and alcohol and mental health settings: A systematic review. Implementation Research and Practice. 2020;1:263348952095907.

55. Morse JM, Barrett M, Mayan M, Olson K, Spiers J. Verification strategies for establishing reliability and validity in qualitative research. Int J Qual Methods. 2002;1(2):13–22.

56. Abraham TH, Finley EP, Drummond KL, Haro EK, Hamilton AB, Townsend JC, et al. A Method for Developing Trustworthiness and Preserving Richness of Qualitative Data During Team-Based Analysis of Large Data Sets. Am J Eval. 2021;42(1):139–56.

Acknowledgements

All views expressed are those of the authors and do not represent the views of the US Government or the Department of Veterans Affairs. The authors would like to thank the EMPOWER QUERI 2.0 team, the VA Women’s Health Research Network (SDR 10-012), the participating Veteran Integrated Service Networks, and the women Veterans who inspire this work. Dr. Hamilton is supported by a VA HSR&D Research Career Scientist Award (RCS 21-135). Dr. Moin also receives support from the NIH/NIDDK (R01DK124503, R01DK127733, and R18DK122372), NIH/NIDDK Centers for Disease Control and Prevention (U18DP006535), the Patient-Centered Outcomes Research Institute (PCORI; SDM-2018C2-13543), the Department of Veterans Affairs (CSP NODES, CSP#2002), and UCLA/UCOP.

Funding

We would like to acknowledge funding from the VA Quality Enhancement Research Initiative (QUERI; QUE 20–028), the VA QUERI Rapid Qualitative Methods for Implementation Practice Hub (QIS 22–234), and VA Health Services Research & Development (Hamilton; RCS 21–135).

Author information

Authors and affiliations

Center for the Study of Healthcare Innovation, Implementation, and Policy (CSHIIP), VA Greater Los Angeles Healthcare System, Los Angeles, CA, USA

Erin P. Finley, Joya G. Chrystal, Alicia R. Gable, Erica H. Fletcher, Agatha Palma, Ismelda Canelo, Rebecca S. Oberman, La Shawnta S. Jackson, Rachel Lesser, Tannaz Moin, Bevanne Bean-Mayberry, Melissa M. Farmer & Alison Hamilton

Joe R. & Teresa Lozano Long School of Medicine, The University of Texas Health Science Center at San Antonio, San Antonio, TX, USA

Erin P. Finley

David Geffen School of Medicine, University of California Los Angeles, Los Angeles, CA, USA

Tannaz Moin, Bevanne Bean-Mayberry & Alison Hamilton

Contributions

The original Rapid Implementation Feedback (RIF) report format was developed by JC, AG, AH, and EPF, with feedback from EHF, AP, IC, RO, LSJ, RL, TM, BBM, and MF. The analysis for this manuscript was planned by EPF, AH, JC, and AG. Preliminary analysis was conducted by EPF, with refinement and verification of findings provided by all authors during member checking meetings. The first draft was written by EPF, JC, AG, EHF, and AH. All authors reviewed, edited, and approved the final manuscript.

Corresponding author

Correspondence to Erin P. Finley.

Ethics declarations

Ethics approval and consent to participate

This proposal was funded through VA’s Quality Enhancement Research Initiative (QUERI), which uses operational funds to support program improvement. QUERI projects are conducted as quality improvement for the purposes of program implementation and evaluation and are approved as such by the main VA operations partner, which was the VA Office of Patient Care Services for EMPOWER 2.0 (approval received 11/26/2019). All interview participants provide oral, recorded consent for participation.

Consent for publication

Not applicable.

Competing interests

Erin P. Finley and Alison Hamilton are on the editorial board for Implementation Science Communications.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Finley, E.P., Chrystal, J.G., Gable, A.R. et al. The Rapid Implementation Feedback (RIF) report: real-time synthesis of qualitative data for proactive implementation planning and tailoring. Implement Sci Commun 5 , 69 (2024). https://doi.org/10.1186/s43058-024-00605-9

Received: 14 December 2023

Accepted: 09 June 2024

Published: 21 June 2024

DOI: https://doi.org/10.1186/s43058-024-00605-9


Keywords

  • Rapid qualitative methods
  • Implementation strategies
  • Implementation planning
  • Evidence-based practice

Implementation Science Communications

ISSN: 2662-2211


  • Study Protocol
  • Open access
  • Published: 20 June 2024

Improving patients’, carers’ and primary care healthcare professionals’ experiences of discharge communication from specialist palliative care to community settings: a protocol for a qualitative interview study

  • Katharine Weetman 1 , 2 ,
  • John I. MacArtney 2 , 3 ,
  • Catherine Grimley 2 ,
  • Cara Bailey 1 &
  • Jeremy Dale 2  

BMC Palliative Care volume 23, Article number: 156 (2024)


Patients who have benefited from specialist intervention during periods of acute/complex palliative care needs often transition from specialist to primary care once such needs have been controlled. Effective communication between services is central to co-ordination of care and to avoiding the potential consequences of unmet needs, fragmented care, and poor patient and family experience. Discharge communications are a key component of care transitions. However, little is known about the experiences of those receiving these communications, namely patients, carers, and primary care healthcare professionals. This study aims to develop a better understanding of how discharge communications from specialist palliative care services to primary care are experienced by patients, carers, and healthcare professionals, and how these communications might be improved to support effective patient-centred care.

This is a 15-month qualitative study. We will interview 30 adult patients and carers and 15 healthcare professionals ( n  = 45). We will seek a range of experiences of discharge communication by using a maximum variation approach to sampling, including purposively recruiting people from a range of demographic backgrounds from 4–6 specialist palliative care services (hospitals and hospices) as well as 5–7 general practices. Interview data will be analysed using a reflexive thematic approach and will involve input from the research and advisory team. Working with clinicians, commissioners, and PPI representatives we will co-produce a list of recommendations for discharge communication from specialist palliative care.

Data collection may be limited by the need to be sensitive to participants’ wellbeing needs. Study findings will be shared through academic publications and presentations. We will draft principles for how specialist palliative care clinicians can best communicate discharge with patients, carers, and primary care clinicians. These will be shared with clinicians, policy makers, commissioners, and PPI representatives and key stakeholders and organisations (e.g. Hospice UK) and on social media. Key outputs will be recommendations for a specialist palliative care discharge proforma.

Trial registration

Registered in ISRCTN Registry on 29.12.2023 ref: ISRCTN18098027.

Peer Review reports

Introduction

Palliative care provides care and support for people with terminal illnesses and those who are dying at the end of life, as well as for their close persons. In the UK, it is often called generalist palliative care when provided by health and social care professionals in hospitals or the community, and primary palliative care when provided by healthcare professionals in primary care, e.g. community nurses and general practitioners (GPs) [ 1 ]. Those with acute or complex palliative care needs can be referred to specialist palliative care (e.g. hospices), which is not only for people in the last days of life but is available to intervene and support people with life-limiting illnesses whenever they develop complex palliative needs [ 2 , 3 , 4 ]. Patients who have benefited from specialist intervention during periods of acute/complex palliative care needs often transition from specialist to primary care once such needs have been managed and/or controlled, with their day-to-day healthcare needs then managed by general practice and community teams [ 5 ]. A previous systematic review of hospice discharge [ 6 ] estimated discharge rates of 5–23%.

Effective communication between services during care transitions known as “discharge communication” is central to co-ordination of care to avoid the potential consequences of unmet needs, fragmented care, and poor patient and family experience. Miscommunications and unclear information can result in a lack of patient-centred care [ 7 ] and continuity of care [ 8 ], confusion and anxiety [ 9 ], and avoidable crises such as readmission as an emergency [ 10 ]; such readmissions may be unnecessary and/or preventable as they could be avoided or at least reduced with better co-ordinated care transitions [ 11 ] and improved communication and information continuity/sharing [ 12 ]. However, if communication is effective and involves patients in a way that respects their choices and needs, this can lead to benefits such as improved well-being [ 9 ], increased satisfaction [ 13 ], and better understanding of how to manage their symptoms [ 14 , 15 ]. Indeed, it has been good practice for over 20 years in the UK for patients to receive copies of written communications sent between their physicians [ 16 , 17 , 18 , 19 ].

Our recent study looking at hospice discharge in five UK hospices, indicated that hospice patients (and where relevant, their carers) are not consistently receiving or being offered discharge letters [ 5 ]. Although hospice care seeks to provide a holistic service, we also found that there was a focus on physical needs in these letters, with much less focus on psychological/emotional and social needs, with spiritual needs being rarely documented (2.4%) [ 5 ].

Our previous research [ 5 , 20 ] found heterogeneity in the quality of specialist palliative care discharge communications to primary care, along with inconsistencies in copying patients into discharge communication. We also found that little is known about how being discharged from specialist palliative services affects patients’ and carers’ experiences of end of life care, or what information community teams need to manage the patient’s ongoing care.

Improving discharge communication has the potential to increase shared understanding of the patient’s condition, their symptoms, and the planned management of pain, symptoms, and holistic needs. Improved communication should empower those receiving such information to better enact the patient’s chosen advance care plan [ 5 , 20 ], which may in turn improve a patient’s quality of life and experience of death and dying. Therefore, this study aims to understand how discharge communications from specialist palliative care services to primary care are currently experienced by those receiving them, and how they can be improved to ensure better care at this crucial and time-sensitive part of the healthcare journey.

Research question

How can specialist palliative care discharge communications to primary care better support patient and carer needs?

Explore patient and carer experiences of discharge communication from specialist palliative care to identify how it currently supports their needs and how it might be improved.

Investigate how primary healthcare professionals currently experience specialist palliative care discharge communication and how it might be improved to support joined-up care for people with palliative needs.

Synthesise findings to inform recommendations about how discharge communication from specialist palliative care services can be improved.

This is a qualitative study exploring the lived experiences of those receiving specialist palliative care discharge communications. Qualitative methods [ 21 ] are well suited for generating rich data drawing on participants’ accounts, and will allow us to explore experiences in relation to their contextual settings.

Theoretical framework

This qualitative research is positioned within an interpretative paradigm [ 22 ]. Crucial to our approach are reflexive practices, the ways in which both ours and our participants’ bounded and partial positions become knowable [ 23 , 24 ]. A qualitative approach is appropriate for the proposed research as we engage participants about their experiences and explore the meanings and interpretations of the discharge communication event(s).

General practices and specialist palliative care services provided by hospices and hospitals in the West Midlands. The West Midlands in England, United Kingdom (UK), has the largest ethnically diverse population outside of London, distributed across a range of geographical locations from inner city to rural areas [ 25 ]. This is a multicentre project within this geographical region. Participating sites will be sampled for deliberate heterogeneity in the sociodemographic characteristics of the population that each site cares for and seeks to serve, e.g. indices of deprivation, urban or rural setting, and patient mean age group and ethnicity. There will therefore be diversity and variation in practice locality.

Participants

Adult patients and carers who have had recent experience of discharge from specialist palliative care, and primary healthcare professionals including general practice team members and district nurses.

To ensure the findings have sufficient depth and information power [ 26 ], we will purposefully recruit [ 27 ] from 4–6 specialist palliative care services located at either hospice or hospital sites, and 5–7 general practices to provide participant diversity.

We seek to collect 30 recent discharge experiences from patients and/or carers. Participants who are still receiving ongoing community support will remain eligible to take part. Where interviews take place with patients and carers in dyads, this will be counted as a single “experience”, since one person’s account cannot be disaggregated from the other’s. The sampling timeframe was discussed and endorsed by our PPI members, who felt people should be spoken to as soon as possible after discharge to aid memory recall, ensure relevance, and respect that participants may not have long left to live.

We will also interview 15 primary care health professionals with experience of receiving patients discharged from specialist palliative care. We will recruit for healthcare professional diversity with regard to role (general practitioners, district and practice nurses…), setting and locality, specialty/special interest areas, and grade/experience.

The inclusion and exclusion criteria for the selection and screening of all participants are presented in Table  1 below.

The total sample size of n  = 45 has been devised using the principles of the ‘information power matrix’ [ 26 ]. The matrix helps us anticipate the pressures for ‘more’ or ‘fewer’ interviews and provides a balance between fulfilling: (a) the broad explanatory aim of this study, as there has been very little research into experiences of discharge from specialist palliative care (more); (b) a need to explore several specific locales (hospices, hospitals, and general practices) and demographic groups (gender; socio-economic status; ethnicity; sexuality; disability) (more); (c) the specific and applied contribution this study makes to the established theories and literature on discharge communication in other healthcare fields (less); (d) the in-depth quality of dialogue we expect to collect, as accounts of discharge are expected to be detailed and contain experiences of ‘before’ and ‘after’ (less); and (e) the thematic focus of our analytical strategy (less).
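The planned sample composition described above can be tallied in a short illustrative script. The counts and site ranges are taken from the protocol text; the script itself is purely a hypothetical bookkeeping aid, not part of the study’s methods:

```python
# Illustrative tally of the planned interview sample described in the protocol.
# Counts come from the text: 30 patient/carer discharge "experiences" (a dyad
# counts as one experience) and 15 primary care healthcare professionals.
planned_sample = {
    "patient/carer discharge experiences": 30,
    "primary care healthcare professionals": 15,
}

total = sum(planned_sample.values())
print(f"Total planned interviews: n = {total}")  # n = 45, as stated above

# Recruitment sites are sampled in ranges rather than fixed numbers.
site_ranges = {
    "specialist palliative care services": (4, 6),
    "general practices": (5, 7),
}
for site, (low, high) in site_ranges.items():
    print(f"{site}: {low}-{high} sites")
```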

Equality and diversity

This research will, as far as possible, ensure and promote equality and diversity from design through to implementation, delivery, and dissemination. The study will collect data on age, sex, sexuality, religion, disability, and ethnicity. The study will be introduced to eligible participants as they are being discharged from specialist palliative care, to mitigate the barriers that may be experienced from mail-out invitations alone. Professional translators will be provided if required, to ensure that participants can take part in their preferred language and that language is not a barrier to participation.

Recruitment

To ensure patient confidentiality, participating hospices and hospitals will be responsible for screening, identifying, and inviting eligible patient and carer participants for the study. The assessment of capacity will be undertaken by the patient’s clinical team at the site.

Eligible participants will be signposted to the study verbally and provided with an electronic or hard copy invitation. The study invitation will include a copy of the invitation letter, participant information sheet, and consent form. Those eligible may also be contacted retrospectively by the direct care team, and/or provided a reminder after 48 h – this is in acknowledgement that palliative care discharges can be rapid and/or unplanned in practice [ 28 ]. Interested participants will be given a minimum of 24 h to decide if they wish to take part and may directly contact the research team with questions and to agree a time, date, and place for the interview.

All eligible staff at participating primary care sites will be invited to take part in interviews, by the staff member/route of contact for site recruitment or an appropriate colleague (e.g. CRN nurse or GP champion). This study invitation will be electronic and/or hard copy and include a copy of the invitation letter, participant information sheet, and consent form.

Data collection and interview procedure

Interviews will be semi-structured and will seek to elicit participants’ views on discharge from specialist palliative care, any discharge communications or discharge letters they were party to and when these were received, as well as their views on what currently works (or not) and their suggestions for improvements to discharge communications. The interview schedules (Supplementary files 1 and 2) are informed by our previous research involving interviews on discharge communications [ 9 , 29 ].

Each participant will only take part in one interview with the research team, which is expected to take no more than one hour. However, interview timing will be adjusted flexibly to meet the needs and preferences of participants.

Patient and carer: Participants will be offered interviews via online video (e.g. Microsoft Teams or Zoom), phone, or in person at the patient or carer’s home, hospice, or another place of their choice. Given the potential vulnerability of the participant population, infection control measures will be followed for in-person contact, including encouraging COVID-19 lateral flow testing beforehand and the wearing of face masks; any other reasonable requests to reduce the risk of viral infection will be respected and adhered to wherever practicably possible.

Healthcare professionals: Interviews will be flexible and more rapid in recognition of the pressures on general practice (10–30 min); structured topic guides will be used to ensure minimal disruption. Interview dyads or small groups (≤ 3) will be offered, whereby people can be interviewed concurrently; this method more closely resembles an individual interview than a focus group [ 7 ].

All interviews will be recorded and transcribed. Demographic information on participants will be collected at interviews to monitor the purposive sampling objectives. Appropriate steps will be taken to ensure participant confidentiality and interview transcripts will be pseudonymised to include the removal of any direct identifiers.

Consent process

Informed consent for this study will be sought from all participants both verbally and in written form using the study consent form (Supplementary file 3). Before each interview commences, the researcher will review the study materials with the participant, which will have been provided in advance, and invite any questions. The researcher will confirm the participant’s understanding of the study. In all cases, the consent form will be co-signed by the interviewer. The study materials clearly state that participation is voluntary, that participants can decline or withdraw without reason or consequence, and that choosing not to take part will not affect participants’ medical or legal rights in any way.

Inclusion and accessibility

Participants will be asked, when arranging the interview, what their accessibility needs are and how best the research team can accommodate them. The consent process has been deliberately designed to maximise inclusivity and accessibility, because some participants in this study will likely have life-limiting conditions that affect mobility and fine motor skills, such as Motor Neurone Disease, or be recovering from chemotherapy, radiation, or stroke. Members of the research team have lived experience of disability and palliative care (personally and/or professionally).

Efforts to limit unintended exclusion from the study include the following. First, the consent form can be completed in hard copy using a wet ink signature and/or in electronic form; combination completion will also be permitted, as suited to participant preference. For electronic completion, the boxes in the Word document consent form have been formatted so that clicking in the large square box enters a "tick", and the signature in the end declaration can then be electronically signed and/or typed. Tick-box consent for each section supports participants who may struggle with initialling boxes. This is an adaptation of the HRA template but is in alignment with HRA e-consent guidance (2019) [ 30 ].

Second, proxy signing has been designed into this study to allow participation by persons who are able to read and process information and provide informed consent, but may not have the physical ability to complete the written consent form (e.g. due to difficulty with coordination, mobility, or writing). The research team acknowledge that in some circumstances electronic completion of the consent form may be difficult or prohibitive for persons with disabilities or health conditions that affect their mobility or fine motor skills. Therefore, a participant can nominate a proxy, such as a carer or companion (e.g. friend, partner, family member), healthcare professional, or a member of the research team, to tick the boxes and type/sign their name in the signature line for them. The use of proxy signing will be participant led to ensure autonomy and dignity are prioritised at all times, and will be clearly indicated and recorded in all cases that apply.

Study participant support

When interviewing people with serious illnesses or terminal conditions, it is common for them to prefer to have a close person with them during the interview for support [ 31 ]. Study participants will therefore have the option to attend the interview alone or with a carer or close person. The carer/support person can listen only or, if they wish, take part in the interview, in which case a signed consent form will be required. Where patients and carers relating to the same case have both been invited to take part in the study, they can be interviewed together (joint interview or dyad) or separately.

If a participant becomes upset during an interview the researcher will remind the participant that they can have breaks or stop the interview at any time. The participant information sheet provides a list of supportive resources the participant can access and they will also be reminded that they can discuss their concerns with their doctor. No medical advice will be provided at interviews.

Methods for sharing study findings with participants

The results of the study will be shared with all living study participants (unless they would prefer not to receive this) in a lay summary which will be co-produced with our PPI members (see below). If a study participant has died, the study findings can be shared with a nominated person, which will be ascertained as part of the consent process. Participants will also be signposted to any study webpages hosted by the University of Birmingham and University of Warwick, which will be updated as outputs are published and produced e.g. to read the open access peer-reviewed paper.

Payments, rewards, and recognition for study participants

Patients and carers will be provided with a £25 high street shopping voucher as a thank you for participating. The contribution of study participants will be acknowledged collectively in outputs such as publications. Travel has also been costed as necessary. Clinical sites will be reimbursed for staff time and any other costs.

Patient and public involvement

To develop the study proposal and assess the acceptability of the research, we consulted with an existing palliative care PPI group associated with BRHUmB, an NIHR-funded Research Hub for Palliative and End of Life Care in the West Midlands. These PPI members reflect a diverse range of health, social, and cultural needs and have all experienced palliative care as a patient, carer, or volunteer. At an initial meeting with the BRHUmB PPI group, the members recognised the issues around poor communication and how it can affect care following discharge and cause confusion.

Three PPI members with varying backgrounds and experience of palliative care have joined the research advisory team. One member was involved in developing the patient-facing materials and ensuring readability and accessibility. The PPI group will advise on research design, recruitment, and, where they wish, the analysis, writing, and dissemination of findings. They will be invited to a total of 10 meetings during the 15-month study, including PPI meetings, advisory meetings with the research team, and a dissemination workshop.

Interview data will be analysed with reflexive thematic analysis [ 23 , 24 , 32 , 33 ]. This will involve synthesis and interrogation of interview transcripts to identify themes in the data by CG and KW in line with the six stages of reflexive thematic analysis. We will use NVivo software to support the processes of (i) familiarisation, (ii) generating codes and (iii) constructing themes [ 23 ]. KW and CG will draw on existing relevant literature and discuss potential themes with JM, the research team, PPI representatives, and the advisory group as part of the iterative process of (iv) refining codes and (v) defining themes, throughout the ongoing interviews and during the process of (vi) writing-up [ 23 ].
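The six phases of reflexive thematic analysis named above can be sketched as a simple ordered outline. This is only an illustrative restatement of the Braun and Clarke phases as enumerated in the protocol, not the authors’ actual NVivo workflow:

```python
# Illustrative outline of the six phases of reflexive thematic analysis
# (Braun & Clarke), in the order given in the protocol. The helper function
# is a hypothetical sketch, not part of the study's analysis software.
PHASES = [
    "familiarisation",
    "generating codes",
    "constructing themes",
    "refining codes",
    "defining themes",
    "writing-up",
]

def describe_analysis_plan(phases):
    """Return a numbered, human-readable listing of the analysis phases."""
    return [f"({i}) {phase}" for i, phase in enumerate(phases, start=1)]

for step in describe_analysis_plan(PHASES):
    print(step)  # prints "(1) familiarisation" through "(6) writing-up"
```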

Development of recommendations

The final study advisory group will take the form of a hybrid half-day collaboration workshop. In addition to clinical, academic, and PPI group members, we will invite relevant stakeholders and collaborating commissioners to attend. We will share initial findings and pseudonymised excerpts from the interviews, using a Modified Nominal Group Technique [ 34 ] (M-NGT), a collaborative and consensus-building approach, to generate transferable insights that best inform and translate findings to practice and policy. We will also co-produce a list of recommendations for discharge communications from specialist palliative care.
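Nominal Group Technique exercises typically have participants rank candidate ideas, with ranks then aggregated into a consensus ordering. The sketch below is a generic illustration of that rank-aggregation step with entirely hypothetical data and point weights; it does not represent the study’s actual workshop procedure or findings:

```python
# Generic rank-aggregation sketch in the style of a Nominal Group Technique
# consensus exercise. All participant names, items, and weights are
# hypothetical illustrations, not study data.
from collections import defaultdict

# Each participant lists their top recommendations, most important first.
rankings = {
    "participant_A": ["timely letters", "copy patients in", "holistic needs"],
    "participant_B": ["copy patients in", "timely letters", "named contact"],
    "participant_C": ["timely letters", "named contact", "holistic needs"],
}

scores = defaultdict(int)
for ranked in rankings.values():
    for position, item in enumerate(ranked):
        # Earlier position = higher priority: 3, 2, 1 points for a top-three list.
        scores[item] += len(ranked) - position

# Sort items by total points to obtain the group's consensus ordering.
consensus = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(consensus)  # [('timely letters', 8), ('copy patients in', 5), ('named contact', 3), ('holistic needs', 2)]
```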

This study is important because it will be one of the first to provide an in-depth consideration of how discharge communications from specialist palliative care might be improved to support effective patient-centred care at the end of life [ 5 ]. Our use of qualitative methods and inclusive approach to data collection will help to ensure that the experiences of people with life-limiting conditions are not “forgotten” and that palliative and primary care services are able to support their quality of life through provision of timely and holistic care [ 5 , 35 ].

This study will produce evidence that can inform improvement in communication between specialist palliative services, general practice and community teams, and patients and their carers. Informed by our patient and public involvement work, and through listening to peoples’ experiences of discharge from specialist palliative care, this research will focus on making communication better for all involved. It will highlight the need to continue to prioritise discharge communications through evidencing deficits and providing empirically-based recommendations for practice [ 7 ]. Whilst there are specialised discharge templates for emergency care and mental health [ 36 ], they have not yet to our knowledge been developed for specialist palliative care in the same way. As an outcome of this work, we will develop a set of recommendations for palliative care discharge communications, designed to address the variable quality that currently exists. This research will bring improvements to patient and carer experiences as well as inform routes for better communication and integration of care between services.

Limitations

There are several limitations to this study, arising from the need to ensure data are collected in an ethical and appropriate way. Research with people at the end of life, and those who support them, needs to take into account the personal, emotional, and social difficulties people can experience during this time [ 5 ]. Potential participants will be identified by clinicians to ensure that those who are finding this time particularly difficult are not troubled. However, this may mean that those most in need of discharge support are not afforded the opportunity to contribute to the study. We may also need to wait days or weeks to speak to participants, to ensure participation does not place an undue emotional burden upon them. As the study explores participants’ accounts of their experiences of discharge, this delay may lead to poorer recall of events and feelings.

Dissemination and impact strategy

Study findings will be shared through academic publications and presentations. Through our engagement and dissemination activities, we hope to raise awareness of patients’ entitlement to receive a copy of their discharge letter. This study will also inform future research on implementing and evaluating the recommended principles, potentially generating a palliative care discharge proforma. We expect our findings will be submitted to NHS England and the PRSB (Professional Records Standards Body) for consideration in policy, as providing quality care in community settings becomes increasingly important. Such policy recommendations are likely to have a relatively short timeframe to impact and to be supported by professional bodies, Colleges, and organisations that have previously supported similar initiatives, such as the Royal College of Physicians and the Academy of Medical Royal Colleges.

This study will provide important insights into experiences of being discharged from specialist palliative care to primary care, which have so far been significantly underexplored. It will provide an in-depth evidence base from which to develop recommendations and principles of good discharge practice for palliative care. The findings and recommendations will be of relevance not only across the UK but also to those responsible for transferring care from specialist palliative care to primary care in healthcare services around the world.

Availability of data and materials

The datasets generated and/or analysed during the current study will be available upon reasonable request, subject to the following restrictions. The full datasets will not be made available due to the potentially identifiable nature of the in-depth qualitative data. Access to patient-identifiable data will be restricted to members of the study co-ordination team who require it for the performance of their role, inclusive of the research team. However, the data and results may be used for secondary analysis in future research and/or research involving modified or different research questions; this may be undertaken by the research team and other researchers. In the latter case, data will only be shared in secure and pseudonymised form with other researchers. Data will be stored in line with University policy for 10 years and then destroyed.

Abbreviations

ACP: Advance care plan

CRN: Clinical research network

GP: General practitioner

HRA: Health Research Authority

NHS: National Health Service (UK)

UK: United Kingdom

Murray SA, Firth A, Schneider N, et al. Promoting palliative care in the community: production of the primary palliative care toolkit by the European Association of Palliative Care Taskforce in primary palliative care. Palliat Med. 2015;29(2):101–11.


Clark D. From margins to centre: a review of the history of palliative care in cancer. Lancet Oncol. 2007;8(5):430–8.

NIHR. Better endings: right care, right place, right time: an independent review of NIHR research on end of life care services. NIHR Dissemination Centre. 2015. Available from: https://evidence.nihr.ac.uk/wp-content/uploads/2020/03/Better-endings-FINAL-WEB.pdf . Accessed 21 Jul 2021.

Yardley S. Deconstructing complexity: finding opportunity in problems shared. Palliat Med. 2018;32(6):1039–41. Available from: https://journals.sagepub.com/doi/abs/10.1177/0269216318771239 .

Weetman K, Dale J, Mitchell SJ, et al. Communication of palliative care needs in discharge letters from hospice providers to primary care: a multisite sequential explanatory mixed methods study. BMC Palliat Care. 2022;21(1):155. https://doi.org/10.1186/s12904-022-01038-8 .


Wu S, Volker DL. Live discharge from hospice: a systematic review. J Hosp Palliat Nurs. 2019;21(6):482–8.

Weetman K. An investigation of written discharge communication between hospital clinicians, GPs, and patients in the UK [PhD thesis]. University of Warwick; 2020. http://webcat.warwick.ac.uk/record=b3714883~S15 .

Thelen M, Brearley SG, Walshe C. A grounded theory of interdependence between specialist and generalist palliative care teams across healthcare settings. Palliat Med. 0(0):02692163231195989. Available from: https://journals.sagepub.com/doi/abs/10.1177/02692163231195989 .

Weetman K, Dale J, Scott E, et al. Adult patient perspectives on receiving hospital discharge letters: a corpus analysis of patient interviews. BMC Health Serv Res. 2020;20(1):537. https://doi.org/10.1186/s12913-020-05250-1 . Accessed 24 June 2020.

Karasouli E, Munday D, Bailey C, et al. Qualitative critical incident study of patients’ experiences leading to emergency hospital admission with advanced respiratory illness. BMJ Open. 2016;6(2):e009030. Available from: https://bmjopen.bmj.com/content/bmjopen/6/2/e009030.full.pdf . Accessed 2 Nov 2021.

Zhang Y, Luth EA, Phongtankuel V, et al. Factors associated with preventable hospitalizations after hospice live discharge among Medicare patients with Alzheimer's disease and related dementias. J Am Geriatr Soc. Available from: https://agsjournals.onlinelibrary.wiley.com/doi/abs/10.1111/jgs.18505 .

Turbow SD, Ali MK, Culler SD, et al. Association of fragmented readmissions and electronic information sharing with discharge destination among older adults. JAMA Netw Open. 2023;6(5):e2313592-e. https://doi.org/10.1001/jamanetworkopen.2023.13592 . Accessed 19 Sept 2023.

Baxter S, Farrell K, Brown C, et al. Where have all the copy letters gone? A review of current practice in professional-patient correspondence. Patient Educ Couns. 2008;71(2):259–64. https://doi.org/10.1016/j.pec.2007.12.002 . Accessed 10 Jun 2020.

Weetman K, Dale J, Scott E, et al. Discharge communication study: a realist evaluation of discharge communication experiences of patients, general practitioners and hospital practitioners, alongside a corresponding discharge letter sample. BMJ Open. 2021;11(7):e045465. https://bmjopen.bmj.com/content/bmjopen/11/7/e045465.full.pdf . Accessed 26 Jan 2022.

Reilly MO, Cahill M, Perry IJ. Writing to patients: ‘putting the patient in the picture.’ Ir Med J. 2005;98(2):58–60.


Department of Health. Copying letters to patients: good practice guidelines. 2003. Available from: https://webarchive.nationalarchives.gov.uk/20120504030618/ . http://www.dh.gov.uk/prod_consum_dh/groups/dh_digitalassets/@dh/@en/documents/digitalasset/dh_4086054.pdf

National Institute for Health and Care Excellence (NICE). Patient experience in adult NHS services: improving the experience of care for people using adult NHS services. Clinical guideline [CG138]. 2012. Available from: https://www.nice.org.uk/guidance/cg138 . Accessed 25 Feb 2019.

The Academy of Medical Royal Colleges. Please, write to me: writing outpatient clinic letters to patients. 2018. Available from: https://www.aomrc.org.uk/reports-guidance/please-write-to-me-writing-outpatient-clinic-letters-to-patients-guidance/ . Accessed 10/06/20.

Department of Health. The NHS plan: a plan for investment a plan for reform. London: HMSO; 2000. Available from:  https://webarchive.nationalarchives.gov.uk/20121102184216/ . http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_4002960 .

Google Scholar  

Finucane AM, Swenson C, MacArtney JI, et al. What makes palliative care needs complex? A multisite sequential explanatory mixed methods study of patients referred for specialist palliative care. BMC Palliat Care. 2021;20(1):18. https://doi.org/10.1186/s12904-020-00700-3 . Accessed 01/11/21.

Green J, Thorogood N. Qualitative methodology and health research. London: Sage; 2014.

Greene J. Mixed methods in social inquiry. 9th ed. San Francisco: John Wiley & Sons; 2007.

Braun V, Clarke V. Thematic analysis. In: Cooper H, Camic PM, Long DL, Panter AT, Rindskopf D, Sher KJ, editors. APA handbook of research methods in psychology. Research designs: Quantitative, qualitative, neuropsychological, and biological, vol. 2. United States: American Psychological Association; 2012. p. 57–71.

Braun V, Clarke V. Reflecting on reflexive thematic analysis. Qual Res Sport Exerc Health. 2019;11(4):589–97. https://doi.org/10.1080/2159676X.2019.1628806 .

Article   Google Scholar  

Evans N, Meñaca A, Andrew EV, et al. Systematic review of the primary research on minority ethnic groups and end-of-life care from the United Kingdom. J Pain Symptom Manage. 2012;43(2):261–86.

Malterud K, Siersma VD, Guassora AD. Sample size in qualitative interview studies: guided by information power. 2016;26(13):1753-60. Available from: https://journals.sagepub.com/doi/abs/10.1177/1049732315617444 .

Teddlie C, Yu F. Mixed methods sampling: a typology with examples. J Mixed Methods Res. 2007;1(1):77–100 ( https://journals.sagepub.com/doi/10.1177/1558689806292430 ). Accessed 09/07/2020.

Morrison J, Choudhary C, Beazley R, et al. Observational study of survival outcomes of people referred for ‘fast-track’ end-of-life care funding in a district general hospital: too little too late? BMJ Open Qual. 2023;12(2):e002279.

Weetman K, Dale J, Spencer R, et al. GP perspectives on hospital discharge letters: an interview and focus group study. Br J Gen Pract Open. 2020. Available from: https://bjgpopen.org/content/4/2/bjgpopen20X101031 . Accessed 10/06/20.

NHS Health Research Authority (HRA). Informing participants and seeking consent. Last updated on 4 Sep 2019 ed. 2019. Available from https://www.hra.nhs.uk/planning-and-improving-research/best-practice/informing-participants-and-seeking-consent/ .

Parsons JE, Dale J, MacArtney JI, et al. Caring for each other: a rapid review of how mutual dependency is challenged by advanced illness. Int J Care Caring. 2021;5(3):509–27 ( https://bristoluniversitypressdigital.com/view/journals/ijcc/5/3/article-p509.xml ).

Braun V, Clarke V, Hayfield N, et al. Thematic analysis. In: Liamputtong P, et al., editors. Handbook of research methods in health social sciences. Singapore: Springer Singapore; 2019. p. 843–60.

Chapter   Google Scholar  

Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. Available from: ( https://www.tandfonline.com/doi/abs/10.1191/1478088706qp063oa ).

Manera K, Hanson CS, Gutman T, et al. Consensus methods: nominal group technique. In: Liamputtong P, et al., editors. Handbook of research methods in health social sciences. Singapore: Springer Singapore; 2019. p. 737–50.

Wladkowski SP, Wallace CL. The Forgotten and Misdiagnosed Care Transition: Live Discharge From Hospice Care. Gerontol Geriatr Med. 2022;8:23337214221109984.

Professional Records Standards Body. eDischarge summary standard. 2020. Available from: https://theprsb.org/standards/edischargesummary/ . Accessed 24 June 2020.

Download references

Acknowledgements

We would like to acknowledge the work of the PPI representatives in advising on the design of this study. We would also like to thank the CRN for their support and advice regarding sampling and costings.

This project is funded by the National Institute for Health and Care Research (NIHR) under its Research for Patient Benefit (RfPB) Programme (Grant Reference Number NIHR204938). The views expressed are those of the author(s) and not necessarily those of the NIHR or the Department of Health and Social Care. The study proposal was peer reviewed independently as part of the assessment of the funding process for NIHR. As stated on the NIHR funder pages for RfPB, the funder committee members review all stage 1 and stage 2 applications and make funding recommendations based on the quality of applications. During this process, the research team responded to all points raised.

Author information

Authors and Affiliations

Institute of Clinical Sciences, Birmingham Medical School, University of Birmingham, Birmingham, B15 2TT, UK

Katharine Weetman & Cara Bailey

Unit of Academic Primary Care, Warwick Medical School, University of Warwick, Coventry, UK

Katharine Weetman, John I. MacArtney, Catherine Grimley & Jeremy Dale

Marie Curie Hospice West Midlands, Solihull, West Midlands, UK

John I. MacArtney

Contributions

JM and KW first drafted this protocol with input from CG, CB and JD. The study was designed and conceptualised by JM, KW, CB and JD. CG will lead on data collection and analysis for the study, with support from KW and JM. The manuscript has been revised for intellectual content by all co-authors, who approved the final manuscript version.

Corresponding author

Correspondence to Katharine Weetman.

Ethics declarations

Ethics approval and consent to participate

Participants will provide informed verbal consent to participate, and this will be recorded in writing through a consent form. Ethical approval was granted by the Health Research Authority (HRA) and NHS REC in the UK on 05/12/2023 (ref: 23/WM/0250). The University of Birmingham is acting as sponsor for this study. We will also obtain local research approval from each site’s research governance committee. Data protection and confidentiality procedures will be observed.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1.

Supplementary Material 2.

Supplementary Material 3.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Weetman, K., MacArtney, J.I., Grimley, C. et al. Improving patients’, carers’ and primary care healthcare professionals’ experiences of discharge communication from specialist palliative care to community settings: a protocol for a qualitative interview study. BMC Palliat Care 23, 156 (2024). https://doi.org/10.1186/s12904-024-01451-1

Received: 31 January 2024

Accepted: 07 May 2024

Published: 20 June 2024

DOI: https://doi.org/10.1186/s12904-024-01451-1

Keywords

  • Palliative care
  • Hospice care
  • Patient discharge summaries
  • Transitional care
  • Communication

BMC Palliative Care

ISSN: 1472-684X
