
Constant Comparative Method in Qualitative Research


Introduction

This article covers what constant comparison is in qualitative research, an example of a comparative analysis, a step-by-step guide to using the constant comparative method, and the benefits of using it.

Constant comparison is an essential qualitative research method that originally comes from grounded theory analysis. Under the constant comparative method, the goal of qualitative data collection and analysis is to organize information in a way that supports the generation of a coherent theory. In this article, we'll examine constant comparative analysis and its place in qualitative research methods.


Constant comparison in qualitative research is a systematic procedure used for collecting data and analyzing that raw data throughout the research process. This method is closely associated with grounded theory methodology, which aims to construct theories grounded in systematically gathered and analyzed data. Instead of beginning with a hypothesis, research using constant comparison allows for the emergence of concepts and connections as the data is collected.

The essence of constant comparison lies in its iterative process. Researchers collect data, analyze it, and then use what they learn to inform further data collection. This ongoing interaction between data collection and analysis ensures that the emerging theory is deeply rooted in the data itself. The process involves breaking down the data into discrete parts, coding them, and comparing these codes to find similarities and differences. Through this comparison, researchers can identify patterns and establish categories that form the basis of theory development.

The procedure of constant comparison includes several key steps: initial coding, where data is segmented into discrete parts; focused coding, where codes are synthesized and narrowed down; axial coding, where relationships between codes are established; and selective coding, which integrates the codes into a coherent framework that proposes a theory. Each step involves constant revisiting and comparison of data, codes, and categories, allowing for refinement and complexity to be built into the emerging theory. In other words, researchers compare segments of data with other data segments, codes are compared with other codes, and categories are compared with other categories. All this is done with the objective of assessing whether additional data, codes, or categories are contradicting, expanding, or supporting the emerging theory.
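For readers who find a schematic helpful, here is a minimal, purely illustrative sketch (in Python) of the comparison logic described above. It is not taken from any analysis software, and every name in it (Code, assign_segment, group_into_categories, is_similar, are_related) is hypothetical; the comparative judgements themselves remain the researcher's and are only represented here as placeholder functions.

```python
# Illustrative sketch only: a toy model of the constant comparison workflow.
from dataclasses import dataclass, field

@dataclass
class Code:
    label: str                                     # e.g. "communication challenges"
    segments: list = field(default_factory=list)   # data excerpts assigned to this code
    memo: str = ""                                 # analytic memo documenting the rationale

def assign_segment(segment: str, codes: list[Code], is_similar) -> Code:
    """Compare a new data segment against every existing code (constant comparison).

    `is_similar` stands in for the researcher's interpretive judgement; it is the
    step no algorithm can automate.
    """
    for code in codes:
        # Compare the new segment with segments already grouped under this code.
        if any(is_similar(segment, existing) for existing in code.segments):
            code.segments.append(segment)
            return code
    # No fit: the segment suggests a new code (initial/open coding).
    new_code = Code(label=f"new code for: {segment[:30]}...")
    new_code.segments.append(segment)
    codes.append(new_code)
    return new_code

def group_into_categories(codes: list[Code], are_related) -> dict[str, list[Code]]:
    """Focused/axial step: compare codes with codes and cluster related ones into
    categories. Again, `are_related` is the researcher's judgement."""
    categories: dict[str, list[Code]] = {}
    for code in codes:
        placed = False
        for name, members in categories.items():
            if any(are_related(code, member) for member in members):
                members.append(code)
                placed = True
                break
        if not placed:
            categories[code.label] = [code]
    return categories
```

The point of the sketch is the shape of the process: every new segment is compared against what has already been coded, and codes are in turn compared with each other before categories are settled.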

The method demands that researchers remain open to the data, allowing for flexibility and adaptability as new insights are gained. It requires meticulous documentation of how the analysis was conducted, as the rationale behind coding and category development must be transparent and trackable. This transparency is crucial for the credibility and transferability of the research.

Ultimately, constant comparison is a dynamic and rigorous analytical process. It's designed to handle the complexity of qualitative data, providing a structured approach to theory development that is directly informed by the research data. By using this method, qualitative researchers can produce well-grounded, nuanced, and relevant theories that offer deep insights into the phenomena being studied.


An example of comparative analysis using the constant comparative method can be illustrated in a study exploring the experiences of remote workers. Researchers might start by conducting in-depth interviews with a diverse group of individuals who work remotely. Initially, researchers transcribe and examine the interview data line by line, looking for keywords, phrases, or incidents that stand out—this is the open coding phase.

During this phase, a researcher might identify recurring themes such as "work-life balance," "communication challenges," or "adaptation to technology." As more data is gathered, the researcher begins the process of focused coding, where these initial themes are compared against new data, refining the themes to better fit the data set. For example, "communication challenges" might be broken down into "synchronous communication issues" and "asynchronous misunderstandings."

The next step, axial coding, involves examining the relationships between these focused codes. The researcher might find that "synchronous communication issues" often lead to a "sense of isolation," which in turn affects "work-life balance." These connections start to form the basis for a larger understanding of the remote work experience.

Finally, in the selective coding phase, the researcher weaves these relationships into a coherent theory that explains how remote workers manage their professional and personal lives. Perhaps the theory suggests that successful remote work depends on the development of new communication norms and self-regulation strategies to maintain balance.
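Purely as an illustration of how the pieces of this hypothetical study fit together, the open codes, focused codes, axial relationships, and core idea from the example above could be summarised in a structure like the following. The labels simply restate the example; the structure itself is invented for this sketch and makes no claim about how any particular software stores codes.

```python
# Hypothetical summary of the remote-work example, expressed as a nested structure.
remote_work_analysis = {
    "focused_codes": {
        "communication challenges": ["synchronous communication issues",
                                     "asynchronous misunderstandings"],
        "work-life balance": ["self-regulation strategies"],
        "adaptation to technology": [],
    },
    "axial_relationships": [
        ("synchronous communication issues", "leads to", "sense of isolation"),
        ("sense of isolation", "affects", "work-life balance"),
    ],
    "selective_coding_core": ("successful remote work depends on new communication "
                              "norms and self-regulation strategies to maintain balance"),
}
```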

Throughout the study, the constant comparative method ensures that each interview contributes to a deeper understanding of the remote work experience. The researcher continually revisits and compares the data, refining the analysis until theoretical saturation is established and a rich, grounded theory is developed that captures the complexities and nuances of working remotely.

This iterative process of data analysis, a hallmark of the constant comparative method, ensures that the resulting theory is not just a collection of data points but a reflective and comprehensive model of the real-world experiences of individuals.



Using the constant comparative method involves an intricate process of collecting and analyzing qualitative data to build a grounded theory. This method is not a linear journey but rather a cyclical one, where data collection and analysis occur simultaneously, each informing and enhancing the other. Here is a step-by-step guide to understanding how this method unfolds in qualitative research.

The first step is data collection, which is conducted without preconceived theories, allowing the data to guide the researcher. As the data—usually textual data from interviews, observations, or documents—is collected, the researcher begins the process of open coding. In this initial phase, the researcher reads through the data meticulously, identifying, naming, and categorizing phenomena found in the text. These categories are derived directly from the data, not from existing theories or hypotheses. Each piece of data is compared with the rest of the data, identifying similarities and differences, and is coded accordingly.

Following open coding, the researcher moves into the focused coding phase. Here, codes and categories are synthesized, with the researcher homing in on the most significant initial codes to explain larger chunks of data. This step involves a constant back-and-forth between the emerging categories and the data, ensuring that these categories are representative and comprehensive.

Next is the axial coding phase, where the researcher explores the relationships between categories by comparing each category with other categories. This involves a more conceptual level of analysis, looking at how categories can be related to form more abstract concepts. During this phase, categories are organized in a way that showcases their relationships, often indicating causation, intervention, and/or outcomes. The researcher repeatedly sifts through the data, comparing incidents, and refining categories as needed.

Finally, selective coding is where the substantive theory begins to take shape. The researcher integrates the categories to form a cohesive theory that is grounded in the data collected. This theory should provide a detailed understanding of the subject under investigation, based on the relationships between categories established in the axial coding.

Throughout each of these steps, the researcher must remain flexible and responsive to the data. As new data is gathered and analyzed, initial codes and categories may evolve, and the emerging theory may shift. This iterative process is at the heart of the constant comparative method, requiring the researcher to be constantly engaged with the data, comparing new data with existing codes, and revisiting categories in light of new evidence that might contradict, expand, or support the emerging theory.
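As a rough, hypothetical sketch of that cycle (not a prescribed procedure), the interplay between collection, analysis, further sampling, and saturation could be outlined as below; the three functions passed in stand for the researcher's own activities and are not real library calls.

```python
# Illustrative sketch only: the cyclical research process described above,
# in which data collection and analysis run together and further sampling
# is guided by the emerging analysis.

def constant_comparative_cycle(collect_next_batch, analyse_batch, is_saturated):
    """Run collection and analysis concurrently until saturation.

    collect_next_batch(analysis) -> new data guided by the analysis so far,
                                    or None if no further data can be gathered
    analyse_batch(analysis, data) -> updated codes/categories after comparing
                                     the new data with everything coded so far
    is_saturated(analysis) -> True when new data adds nothing to the categories
    """
    analysis = {"codes": [], "categories": {}, "memos": []}
    while True:
        data = collect_next_batch(analysis)        # sampling informed by analysis
        if data is None:
            break
        analysis = analyse_batch(analysis, data)   # code and compare the new data
        if is_saturated(analysis):                 # stop when saturation is reached
            break
    return analysis
```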

In using the constant comparative method, the researcher's role is both systematic and creative, applying rigorous methods to the data while also being open to the insights that emerge. The goal is to ensure that the theory developed is not just data-driven but also analytically rich, providing a meaningful contribution to understanding the phenomenon under study. The end result is a grounded theory that offers a deep, nuanced understanding of the qualitative data, constructed through a methodical process of comparison and analysis.

The constant comparative method offers many benefits in qualitative research, particularly in its capacity to construct theories that are deeply embedded in the data. This method stands in contrast to quantitative analysis, which seeks to test hypotheses through statistical analysis, often missing the nuanced understanding that emerges from qualitative data.

One of the primary benefits of the constant comparative method is its dynamic nature. Unlike quantitative methods, which often require a rigid structure and a predefined hypothesis, the constant comparative method is fluid and adaptable. It allows the theory to emerge from the data itself, ensuring that the resulting theory is organically connected to the realities represented in the data. This flexibility means that the researcher can adjust the focus of the study as new themes and patterns emerge, leading to a more authentic and grounded understanding of the research subject.

Another advantage is the depth of insight that can be achieved. Quantitative analysis can tell us the frequency and correlation of certain phenomena but often fails to explain the "why" behind the numbers. The constant comparative method, by continuously analyzing and synthesizing data, provides a rich, detailed view of the context, processes, and meanings that underpin the data. This depth of analysis is particularly beneficial when exploring complex social phenomena that cannot be easily quantified or reduced to numerical variables.

The method can also enhance the validity of the research. By constantly comparing new data with existing codes and categories, the researcher is continuously validating and refining the emerging theory. This iterative process ensures that the theory is not only grounded in the data but also extensively cross-checked and corroborated throughout the research process. In contrast, quantitative analysis often relies on the statistical significance of results, which may not always capture the complexity of the data.

Additionally, the constant comparative method is inherently reflective. It requires researchers to engage deeply with the data, to think critically about their interpretations, and to be aware of their biases. This reflective stance is less pronounced in quantitative analysis, which often assumes a degree of objectivity in analyzing statistics. In qualitative research using constant comparison, the researcher's subjectivity is not a drawback but a tool for deeper engagement with the data.


Constant comparative method in qualitative analysis


Daniel Turner

It has roots in classical Grounded Theory, but the constant comparison method isn't restricted to Grounded Theory, and is a frequently applied approach to analysing and exploring qualitative data.

It's essentially a common-sense approach for examining qualitative data - to understand your data, or part of it, you need to compare it with something else! That might mean comparing with an interview from another participant (comparing between interviews), comparing between groups of respondents, or even comparing between parts of data assigned to codes or themes. The idea is that comparison can show differences (and similarities) across the data, and the comparisons help you understand the story of why these differences arise.

For Tesch (1990), comparison is the most significant way that researchers create and refine categories and analytic themes:

"Comparing and contrasting is used for practically all intellectual tasks during analysis: forming categories, establishing the boundaries of the categories, assigning the segments to categories, summarizing the content of each category, finding negative evidence, etc. The goal is to discern conceptual similarities, to refine the discriminative power of categories, and to discover patterns." (Tesch 1990)

However, it's also used as a methodology: an approach to analysing data that benefits from on-going sampling and recruitment of new participants to provide points of comparison, and explore certain themes in greater depth. But let's go back to the beginning and see how the terminology came about.

The Constant Comparative Method is a critical part of Glaser and Strauss' (1967) treatise on Grounded Theory, but it actually predates it, in an article attributed to Glaser alone (Glaser 1965). It proposed a way to bridge the differences between a basic comprehensive thematic coding approach and theory generation through analysis. They suggest that going through and methodically creating codes for everything hinders the generation of new hypotheses, yet without coding the analyst "merely inspects his [sic] data for new properties of his theoretical categories and writes memos on these properties". They propose a hybrid model where the analyst essentially re-examines the code each time something is added to it, looking for commonalities and differences. That way, theory is constantly being created, or at least refined, in a more systematic and thorough way.

"Systematizing the second approach [pure grounded theory with no coding] by this method does not supplant the skills and sensitivities required in inspection. Rather the constant comparative method is designed to aid analysts with these abilities in generating a theory which is integrated, consistent, plausible, close to the data" (Glaser 1965)

Now, while the terms 'systematizing', 'consistent' and even the suggestion of coding will be abhorrent to certain practitioners who find them too positivistic, for me the key phrase is the last one: 'close to the data'. For researchers who apply a very pure grounded theory approach, with no coding, no notes, sometimes not even transcripts, there is a distinct possibility that the hypotheses generated connect only with a very abstracted and unevenly absorbed reading of the data. Constant comparison encourages the researcher to stay deeply entwined with the data, and the words of the participants, without relying on their own remembered interpretations. Yet this 'systematizing' approach is not intended to produce a consistent and systematic interpretation:

"the constant comparative method is not designed (as methods of quantitative analysis are) to guarantee that two analysts working independently with the same data will achieve the same result" (Glaser 1965)

Glaser (1965) notes that the focus of constant comparison should be the generation of many, possibly initially conflicting hypotheses on a general issue. If the intention is to create one precise theory, and test it through the data, Glaser recommends analytic induction - a separate approach with a separate aim (which we will leave for a separate article).

Glaser (1965) suggests four stages for constant comparison (which really cover the whole of the analysis process):

1) Comparing incidents applicable to each category
2) Integrating categories and their properties
3) Delimiting the theory
4) Writing the theory

The process above should sound fairly familiar as a typical iterative qualitative approach, with each stage building on the last. But an important part of the process is that word, constant. We often talk about qualitative coding being a cyclical, iterative process. It's rare that you can just read through and code the data once. Approaches like Grounded Theory and Thematic Analysis suggest phases that build on each other (open, then axial coding, for example), and when applying constant comparison, it's important that comparison is a constant and frequently applied part of the process, not just a phase to be done at the end.

It's tempting to limit comparison to natural 'break points' in the analysis, like the end of coding a source or a group of interviews, but really the comparison needs to be an integrated part of the process. Every few sentences there might be a statement that challenges the emerging theory or the definition of a code, and that should invite a process of reflection and comparison with other parts of the data. Quirkos is designed to make that comparison quick, and to keep you close to the data as you review it.


However, you should be able to see that, when applied well, constant comparison can add a lot of extra time to an already slow process. Don't get disheartened though: this careful reading and cross-examination is what makes qualitative research powerful, challenging to the status quo and the researcher's own assumptions, and capable of creating change. But also note that

"Comparison can often be based on memory. Usually there is no need to refer to the actual note on every previous incident for each comparison." Glaser and Strauss (1967)

So it's clearly implied that the researcher should be becoming close and familiar with the data through the process, and this is important for applying the skill with which they will create codes and later themes.

Because constant comparison is so often used to inform further and ongoing recruitment, it is really a methodology and not just an analytical technique, since it informs the whole research design and sampling process. Constant comparison should suggest new people with new experiences that need to be recruited to explore uncertainties, contradictions and refine codes and hypotheses. Therefore, analysis should begin early in the data collection process, and be continual, without pre-defined ideas about sample size.

This links back to the issue of saturation in qualitative research: when adding new participants doesn't seem to be uncovering new findings or theory. Saturation itself has become a contested issue, with some claiming that it is too positivistic and problematic (see, for example, Low 2019). However, I feel that the concept Glaser introduces as theoretical saturation is not necessarily the same as sample saturation, and that alternatives like 'information power' (Malterud et al. 2016) may be just different ways of describing the same issues.

Also, note that the constant comparative technique needn't be limited to just Classical Grounded Theory (CGT). Elements and concepts can be applied in thematic analysis, discourse analysis, and even approaches like IPA in the later stages. Fram (2013) discusses this with some examples in different approaches, but I'm not sure about some of her interpretations and critique of Classical Grounded Theory. As ever in these blog posts, my interpretation (and others') is only part of the understanding. I would highly recommend anyone to read the original paper on constant comparative analysis (Glaser 1965) - it's an easy read, with clear examples, and an almost prescient anticipation of the theoretical and practical issues the modern literature still frets about today!

Also, make sure you don't confuse this with Qualitative Comparative Analysis or QCA (Ragin 1998), which mostly focuses on classifying whole cases, not within-case qualitative analysis.

So! There are many things you should compare, within and between codes and themes, within sources and between sources, across and between groups of respondents (for example by role or demographics like gender or age) and in your own notes and memos. But with all these comparisons, note that it is a constant, continual process, with the aim to develop and write theory, not reduce the analysis to quantitative measures of difference.

Especially when using software tools (CAQDAS or QDAS), it can become easy to rely on numerical counts of the number of 'incidents', codes or themes occurring in a source or group. However, this is not the right focus for a proper qualitative approach; the focus should be comparison of the qualitative data itself - reading and regrouping to create and challenge a constant stream of theory.

Quirkos is designed not to show quantitative summaries of the data by default, and has a specific query comparison mode to show the text side-by-side. Comparison is designed to be quick so that it can be used constantly, unlike other software that requires you to set up a complicated query to run and re-run. But it still allows you to set the parameters to compare codes, groups, individuals, or even between coders working on the same project. Quirkos Cloud also has real live collaboration, so working as a team and constantly comparing your work is greatly simplified.

It has a free trial with no restrictions on the features, so that you can see if it will work for your qualitative approach, whether you end up using a form of constant comparative method or not! You can also watch some tutorials to see how it works with a variety of methods.

References:

Boeije, H. 2002. A Purposeful Approach to the Constant Comparative Method in the Analysis of Qualitative Interviews. Quality & Quantity 36, pp. 391–409. https://doi.org/10.1023/A:1020909529486

Fram, S. M. 2013. The Constant Comparative Analysis Method Outside of Grounded Theory. TQR 18(1). https://files.eric.ed.gov/fulltext/EJ1004995.pdf

Glaser, BG. 1965. The Constant Comparative Method of Qualitative Analysis. Social Problems 12(4), pp. 436-445.

Glaser, BG. & Strauss, AL. 1967. The Discovery of Grounded Theory: Strategies for Qualitative Research. New York: Aldine De Gruyter.

Tesch, R. 1990. Qualitative research: Analysis types and software tools. Falmer, New York.




Grounded theory research: A design framework for novice researchers

Ylona Chun Tie

1 Nursing and Midwifery, College of Healthcare Sciences, James Cook University, Townsville, QLD, Australia

Melanie Birks

Karen Francis

2 College of Health and Medicine, University of Tasmania, Australia, Hobart, TAS, Australia

Background:

Grounded theory is a well-known methodology employed in many research studies. Qualitative and quantitative data generation techniques can be used in a grounded theory study. Grounded theory sets out to discover or construct theory from data, systematically obtained and analysed using comparative analysis. While grounded theory is inherently flexible, it is a complex methodology. Thus, novice researchers strive to understand the discourse and the practical application of grounded theory concepts and processes.

The aim of this article is to provide a contemporary research framework suitable to inform a grounded theory study.

This article provides an overview of grounded theory illustrated through a graphic representation of the processes and methods employed in conducting research using this methodology. The framework is presented as a diagrammatic representation of a research design and acts as a visual guide for the novice grounded theory researcher.

Discussion:

As grounded theory is not a linear process, the framework illustrates the interplay between the essential grounded theory methods and iterative and comparative actions involved. Each of the essential methods and processes that underpin grounded theory are defined in this article.

Conclusion:

Rather than an engagement in philosophical discussion or a debate of the different genres that can be used in grounded theory, this article illustrates how a framework for a research study design can be used to guide and inform the novice nurse researcher undertaking a study using grounded theory. Research findings and recommendations can contribute to policy or knowledge development, service provision and can reform thinking to initiate change in the substantive area of inquiry.

Introduction

The aim of all research is to advance, refine and expand a body of knowledge, establish facts and/or reach new conclusions using systematic inquiry and disciplined methods. 1 The research design is the plan or strategy researchers use to answer the research question, which is underpinned by philosophy, methodology and methods. 2 Birks 3 defines philosophy as ‘a view of the world encompassing the questions and mechanisms for finding answers that inform that view’ (p. 18). Researchers reflect their philosophical beliefs and interpretations of the world prior to commencing research. Methodology is the research design that shapes the selection of, and use of, particular data generation and analysis methods to answer the research question. 4 While a distinction between positivist research and interpretivist research occurs at the paradigm level, each methodology has explicit criteria for the collection, analysis and interpretation of data. 2 Grounded theory (GT) is a structured, yet flexible methodology. This methodology is appropriate when little is known about a phenomenon; the aim being to produce or construct an explanatory theory that uncovers a process inherent to the substantive area of inquiry. 5 – 7 One of the defining characteristics of GT is that it aims to generate theory that is grounded in the data. The following section provides an overview of GT – the history, main genres and essential methods and processes employed in the conduct of a GT study. This summary provides a foundation for a framework to demonstrate the interplay between the methods and processes inherent in a GT study as presented in the sections that follow.

Glaser and Strauss are recognised as the founders of grounded theory. Strauss was conversant in symbolic interactionism and Glaser in descriptive statistics. 8 – 10 Glaser and Strauss originally worked together in a study examining the experience of terminally ill patients who had differing knowledge of their health status. Some of these suspected they were dying and tried to confirm or disconfirm their suspicions. Others tried to understand by interpreting treatment by care providers and family members. Glaser and Strauss examined how the patients dealt with the knowledge they were dying and the reactions of healthcare staff caring for these patients. Throughout this collaboration, Glaser and Strauss questioned the appropriateness of using a scientific method of verification for this study. During this investigation, they developed the constant comparative method, a key element of grounded theory, while generating a theory of dying first described in Awareness of Dying (1965). The constant comparative method is deemed an original way of organising and analysing qualitative data.

Glaser and Strauss subsequently went on to write The Discovery of Grounded Theory: Strategies for Qualitative Research (1967). This seminal work explained how theory could be generated from data inductively. This process challenged the traditional method of testing or refining theory through deductive testing. Grounded theory provided an outlook that questioned the view of the time that quantitative methodology is the only valid, unbiased way to determine truths about the world. 11 Glaser and Strauss 5 challenged the belief that qualitative research lacked rigour and detailed the method of comparative analysis that enables the generation of theory. After publishing The Discovery of Grounded Theory , Strauss and Glaser went on to write independently, expressing divergent viewpoints in the application of grounded theory methods.

Glaser produced his book Theoretical Sensitivity (1978) and Strauss went on to publish Qualitative Analysis for Social Scientists (1987). Strauss and Corbin’s 12 publication Basics of Qualitative Research: Grounded Theory Procedures and Techniques resulted in a rebuttal by Glaser 13 over their application of grounded theory methods. However, philosophical perspectives have changed since Glaser’s positivist version and Strauss and Corbin’s post-positivism stance. 14 Grounded theory has since seen the emergence of additional philosophical perspectives that have influenced a change in methodological development over time. 15

Subsequent generations of grounded theorists have positioned themselves along a philosophical continuum, from Strauss and Corbin’s 12 theoretical perspective of symbolic interactionism, through to Charmaz’s 16 constructivist perspective. However, understanding how to position oneself philosophically can challenge novice researchers. Birks and Mills 6 provide a contemporary understanding of GT in their book Grounded Theory: A Practical Guide. These Australian researchers have written in a way that appeals to the novice researcher. It is the contemporary writing, and the non-partisan approach Birks and Mills take to GT, that supports the novice researcher in understanding the philosophical and methodological concepts integral to conducting research. The development of GT is important to understand prior to selecting an approach that aligns with the researcher’s philosophical position and the purpose of the research study. As the research progresses, seminal texts are referred back to time and again as understanding of concepts increases, much like the iterative processes inherent in the conduct of a GT study.

Genres: traditional, evolved and constructivist grounded theory

Grounded theory has several distinct methodological genres: traditional GT associated with Glaser; evolved GT associated with Strauss, Corbin and Clarke; and constructivist GT associated with Charmaz. 6 , 17 Each variant is an extension and development of the original GT by Glaser and Strauss. The first of these genres is known as traditional or classic GT. Glaser 18 acknowledged that the goal of traditional GT is to generate a conceptual theory that accounts for a pattern of behaviour that is relevant and problematic for those involved. The second genre, evolved GT, is founded on symbolic interactionism and stems from work associated with Strauss, Corbin and Clarke. Symbolic interactionism is a sociological perspective that relies on the symbolic meaning people ascribe to the processes of social interaction. Symbolic interactionism addresses the subjective meaning people place on objects, behaviours or events based on what they believe is true. 19 , 20 Constructivist GT, the third genre developed and explicated by Charmaz, a symbolic interactionist, has its roots in constructivism. 8 , 16 Constructivist GT’s methodological underpinnings focus on how participants construct meaning in relation to the area of inquiry. 16 A constructivist co-constructs experience and meanings with participants. 21 While there are commonalities across all genres of GT, there are factors that distinguish differences between the approaches including the philosophical position of the researcher; the use of literature; and the approach to coding, analysis and theory development. Following on from Glaser and Strauss, several versions of GT have ensued.

Grounded theory represents both a method of inquiry and a resultant product of that inquiry. 7 , 22 Glaser and Holton 23 define GT as ‘a set of integrated conceptual hypotheses systematically generated to produce an inductive theory about a substantive area’ (p. 43). Strauss and Corbin 24 define GT as ‘theory that was derived from data, systematically gathered and analysed through the research process’ (p. 12). The researcher ‘begins with an area of study and allows the theory to emerge from the data’ (p. 12). Charmaz 16 defines GT as ‘a method of conducting qualitative research that focuses on creating conceptual frameworks or theories through building inductive analysis from the data’ (p. 187). However, Birks and Mills 6 refer to GT as a process by which theory is generated from the analysis of data. Theory is not discovered; rather, theory is constructed by the researcher who views the world through their own particular lens.

Research process

Before commencing any research study, the researcher must have a solid understanding of the research process. A well-developed outline of the study and an understanding of the important considerations in designing and undertaking a GT study are essential if the goals of the research are to be achieved. While it is important to have an understanding of how a methodology has developed, in order to move forward with research, a novice can align with a grounded theorist and follow an approach to GT. Using a framework to inform a research design can be a useful modus operandi.

The following section provides insight into the process of undertaking a GT research study. Figure 1 is a framework that summarises the interplay and movement between methods and processes that underpin the generation of a GT. As can be seen from this framework, and as detailed in the discussion that follows, the process of doing a GT research study is not linear, rather it is iterative and recursive.

Figure 1. Research design framework: summary of the interplay between the essential grounded theory methods and processes.

Grounded theory research involves the meticulous application of specific methods and processes. Methods are ‘systematic modes, procedures or tools used for collection and analysis of data’. 25 While GT studies can commence with a variety of sampling techniques, many commence with purposive sampling, followed by concurrent data generation and/or collection and data analysis, through various stages of coding, undertaken in conjunction with constant comparative analysis, theoretical sampling and memoing. Theoretical sampling is employed until theoretical saturation is reached. These methods and processes create an unfolding, iterative system of actions and interactions inherent in GT. 6 , 16 The methods interconnect and inform the recurrent elements in the research process as shown by the directional flow of the arrows and the encompassing brackets in Figure 1 . The framework denotes the process is both iterative and dynamic and is not one directional. Grounded theory methods are discussed in the following section.

Purposive sampling

As presented in Figure 1 , initial purposive sampling directs the collection and/or generation of data. Researchers purposively select participants and/or data sources that can answer the research question. 5 , 7 , 16 , 21 Concurrent data generation and/or data collection and analysis is fundamental to GT research design. 6 The researcher collects, codes and analyses this initial data before further data collection/generation is undertaken. Purposeful sampling provides the initial data that the researcher analyses. As will be discussed, theoretical sampling then commences from the codes and categories developed from the first data set. Theoretical sampling is used to identify and follow clues from the analysis, fill gaps, clarify uncertainties, check hunches and test interpretations as the study progresses.

Constant comparative analysis

Constant comparative analysis is an analytical process used in GT for coding and category development. This process commences with the first data generated or collected and pervades the research process as presented in Figure 1 . Incidents are identified in the data and coded. 6 The initial stage of analysis compares incident to incident in each code. Initial codes are then compared to other codes. Codes are then collapsed into categories. This process means the researcher will compare incidents in a category with previous incidents, in both the same and different categories. 5 Future codes are compared and categories are compared with other categories. New data is then compared with data obtained earlier during the analysis phases. This iterative process involves inductive and deductive thinking. 16 Inductive, deductive and abductive reasoning can also be used in data analysis. 26

Constant comparative analysis generates increasingly more abstract concepts and theories through inductive processes. 16 In addition, abduction, defined as ‘a form of reasoning that begins with an examination of the data and the formation of a number of hypotheses that are then proved or disproved during the process of analysis … aids inductive conceptualization’. 6 Theoretical sampling coupled with constant comparative analysis raises the conceptual levels of data analysis and directs ongoing data collection or generation. 6

The constant comparative technique is used to find consistencies and differences, with the aim of continually refining concepts and theoretically relevant categories. This continual comparative iterative process that encompasses GT research sets it apart from a purely descriptive analysis. 8

Memo writing is an analytic process considered essential ‘in ensuring quality in grounded theory’. 6 Stern 27 offers the analogy that if data are the building blocks of the developing theory, then memos are the ‘mortar’ (p. 119). Memos are the storehouse of ideas generated and documented through interacting with data. 28 Thus, memos are reflective interpretive pieces that build a historic audit trail to document ideas, events and the thought processes inherent in the research process and developing thinking of the analyst. 6 Memos provide detailed records of the researchers’ thoughts, feelings and intuitive contemplations. 6

Lempert 29 considers memo writing crucial as memos prompt researchers to analyse and code data and develop codes into categories early in the coding process. Memos detail why and how decisions were made in relation to sampling, coding, collapsing of codes, making new codes, separating codes, producing a category and identifying relationships abstracted to a higher level of analysis. 6 Thus, memos are informal analytic notes about the data and the theoretical connections between categories. 23 Memoing is an ongoing activity that builds intellectual assets, fosters analytic momentum and informs the GT findings. 6 , 10

Generating/collecting data

A hallmark of GT is concurrent data generation/collection and analysis. In GT, researchers may utilise both qualitative and quantitative data as espoused by Glaser’s dictum: ‘all is data’. 30 While interviews are a common method of generating data, data sources can include focus groups, questionnaires, surveys, transcripts, letters, government reports, documents, grey literature, music, artefacts, videos, blogs and memos. 9 Elicited data are produced by participants in response to, or directed by, the researcher, whereas extant data includes data that is already available, such as documents and published literature. 6 , 31 While this is one interpretation of how elicited data are generated, other approaches to grounded theory recognise the agency of participants in the co-construction of data with the researcher. The relationship the researcher has with the data, how it is generated and collected, will determine the value it contributes to the development of the final GT. 6 The significance of this relationship extends into data analysis conducted by the researcher through the various stages of coding.

Coding is an analytical process used to identify concepts, similarities and conceptual reoccurrences in data. Coding is the pivotal link between collecting or generating data and developing a theory that explains the data. Charmaz 10 posits,

codes rely on interaction between researchers and their data. Codes consist of short labels that we construct as we interact with the data. Something kinaesthetic occurs when we are coding; we are mentally and physically active in the process. (p. 5)

In GT, coding can be categorised into iterative phases. Traditional, evolved and constructivist GT genres use different terminology to explain each coding phase (Table 1).

Table 1. Comparison of coding terminology in traditional, evolved and constructivist grounded theory. Adapted from Birks and Mills. 6

Coding terminology in evolved GT refers to open (a procedure for developing categories of information), axial (an advanced procedure for interconnecting the categories) and selective coding (procedure for building a storyline from core codes that connects the categories), producing a discursive set of theoretical propositions. 6 , 12 , 32 Constructivist grounded theorists refer to initial, focused and theoretical coding. 9 Birks and Mills 6 use the terms initial, intermediate and advanced coding that link to low, medium and high-level conceptual analysis and development. The coding terms devised by Birks and Mills 6 were used for Figure 1 ; however, these can be altered to reflect the coding terminology used in the respective GT genres selected by the researcher.

Initial coding

Initial coding of data is the preliminary step in GT data analysis. 6 , 9 The purpose of initial coding is to start the process of fracturing the data to compare incident to incident and to look for similarities and differences in beginning patterns in the data. In initial coding, the researcher inductively generates as many codes as possible from early data. 16 Important words or groups of words are identified and labelled. In GT, codes identify social and psychological processes and actions as opposed to themes. Charmaz 16 emphasises keeping codes as similar to the data as possible and advocates embedding actions in the codes in an iterative coding process. Saldaña 33 agrees that codes that denote action, which he calls process codes, can be used interchangeably with gerunds (verbs ending in ing). In vivo codes are often verbatim quotes from the participants’ words and are often used as the labels to capture the participant’s words as representative of a broader concept or process in the data. 6 Table 1 reflects variation in the terminology of codes used by grounded theorists.

Initial coding categorises and assigns meaning to the data, comparing incident-to-incident, labelling beginning patterns and beginning to look for comparisons between the codes. During initial coding, it is important to ask ‘what is this data a study of’. 18 What does the data assume, ‘suggest’ or ‘pronounce’; ‘from whose point of view’ does this data come; whom does it represent or whose thoughts are they? 16 What collectively might it represent? The process of documenting reactions, emotions and related actions enables researchers to explore, challenge and intensify their sensitivity to the data. 34 Early coding assists the researcher to identify the direction for further data gathering. After initial analysis, theoretical sampling is employed to direct collection of additional data that will inform the ‘developing theory’. 9 Initial coding advances into intermediate coding once categories begin to develop.

Theoretical sampling

The purpose of theoretical sampling is to allow the researcher to follow leads in the data by sampling new participants or material that provides relevant information. As depicted in Figure 1 , theoretical sampling is central to GT design, aids the evolving theory 5 , 7 , 16 and ensures the final developed theory is grounded in the data. 9 Theoretical sampling in GT is for the development of a theoretical category, as opposed to sampling for population representation. 10 Novice researchers need to acknowledge this difference if they are to achieve congruence within the methodology. Birks and Mills 6 define theoretical sampling as ‘the process of identifying and pursuing clues that arise during analysis in a grounded theory study’ (p. 68). During this process, additional information is sought to saturate categories under development. The analysis identifies relationships, highlights gaps in the existing data set and may reveal insight into what is not yet known. The exemplars in Box 1 highlight how theoretical sampling led to the inclusion of further data.

Box 1. Examples of theoretical sampling.

Thus, theoretical sampling is used to focus and generate data to feed the iterative process of continual comparative analysis of the data. 6

Intermediate coding

Intermediate coding, identifying a core category, theoretical data saturation, constant comparative analysis, theoretical sensitivity and memoing occur in the next phase of the GT process. 6 Intermediate coding builds on the initial coding phase. Where initial coding fractures the data, intermediate coding begins to transform basic data into more abstract concepts allowing the theory to emerge from the data. During this analytic stage, a process of reviewing categories and identifying which ones, if any, can be subsumed beneath other categories occurs and the properties or dimension of the developed categories are refined. Properties refer to the characteristics that are common to all the concepts in the category and dimensions are the variations of a property. 37

At this stage, a core category starts to become evident as developed categories form around a core concept; relationships are identified between categories and the analysis is refined. Birks and Mills 6 affirm that diagramming can aid analysis in the intermediate coding phase. Grounded theorists interact closely with the data during this phase, continually reassessing meaning to ascertain ‘what is really going on’ in the data. 30 Theoretical saturation ensues when new data analysis does not provide additional material to existing theoretical categories, and the categories are sufficiently explained. 6

Advanced coding

Birks and Mills 6 described advanced coding as the ‘techniques used to facilitate integration of the final grounded theory’ (p. 177). These authors promote storyline technique (described in the following section) and theoretical coding as strategies for advancing analysis and theoretical integration. Advanced coding is essential to produce a theory that is grounded in the data and has explanatory power. 6 During the advanced coding phase, concepts that reach the stage of categories will be abstract, representing stories of many, reduced into highly conceptual terms. The findings are presented as a set of interrelated concepts as opposed to presenting themes. 28 Explanatory statements detail the relationships between categories and the central core category. 28

Storyline is a tool that can be used for theoretical integration. Birks and Mills 6 define storyline as ‘a strategy for facilitating integration, construction, formulation, and presentation of research findings through the production of a coherent grounded theory’ (p. 180). Storyline technique is first proposed with limited attention in Basics of Qualitative Research by Strauss and Corbin 12 and further developed by Birks et al. 38 as a tool for theoretical integration. The storyline is the conceptualisation of the core category. 6 This procedure builds a story that connects the categories and produces a discursive set of theoretical propositions. 24 Birks and Mills 6 contend that storyline can be ‘used to produce a comprehensive rendering of your grounded theory’ (p. 118). Birks et al. 38 had earlier concluded, ‘storyline enhances the development, presentation and comprehension of the outcomes of grounded theory research’ (p. 405). Once the storyline is developed, the GT is finalised using theoretical codes that ‘provide a framework for enhancing the explanatory power of the storyline and its potential as theory’. 6 Thus, storyline is the explication of the theory.

Theoretical coding occurs as the final culminating stage towards achieving a GT. 39 , 40 The purpose of theoretical coding is to integrate the substantive theory. 41 Saldaña 40 states, ‘theoretical coding integrates and synthesises the categories derived from coding and analysis to now create a theory’ (p. 224). Initial coding fractures the data while theoretical codes ‘weave the fractured story back together again into an organized whole theory’. 18 Advanced coding that integrates extant theory adds further explanatory power to the findings. 6 The examples in Box 2 describe the use of storyline as a technique.

Box 2. Writing the storyline.

Theoretical sensitivity

As presented in Figure 1 , theoretical sensitivity encompasses the entire research process. Glaser and Strauss 5 initially described the term theoretical sensitivity in The Discovery of Grounded Theory. Theoretical sensitivity is the ability to know when you identify a data segment that is important to your theory. While Strauss and Corbin 12 describe theoretical sensitivity as the insight into what is meaningful and of significance in the data for theory development, Birks and Mills 6 define theoretical sensitivity as ‘the ability to recognise and extract from the data elements that have relevance for the emerging theory’ (p. 181). Conducting GT research requires a balance between keeping an open mind and the ability to identify elements of theoretical significance during data generation and/or collection and data analysis. 6

Several analytic tools and techniques can be used to enhance theoretical sensitivity and increase the grounded theorist’s sensitivity to theoretical constructs in the data. 28 Birks and Mills 6 state, ‘as a grounded theorist becomes immersed in the data, their level of theoretical sensitivity to analytic possibilities will increase’ (p. 12). Developing sensitivity as a grounded theorist and the application of theoretical sensitivity throughout the research process allows the analytical focus to be directed towards theory development and ultimately result in an integrated and abstract GT. 6 The example in Box 3 highlights how analytic tools are employed to increase theoretical sensitivity.

Box 3. Theoretical sensitivity.

The grounded theory

The meticulous application of essential GT methods refines the analysis, resulting in the generation of an integrated, comprehensive GT that explains a process relating to a particular phenomenon. 6 The results of a GT study are communicated as a set of concepts, related to each other in an interrelated whole, and expressed in the production of a substantive theory. 5 , 7 , 16 A substantive theory is a theoretical interpretation or explanation of a studied phenomenon. 6 , 17 Thus, the hallmark of grounded theory is the generation of theory ‘abstracted from, or grounded in, data generated and collected by the researcher’. 6 However, ensuring quality in research requires the application of rigour throughout the research process.

Quality and rigour

The quality of a grounded theory can be related to three distinct areas underpinned by (1) the researcher’s expertise, knowledge and research skills; (2) methodological congruence with the research question; and (3) procedural precision in the use of methods. 6 Methodological congruence is substantiated when the philosophical position of the researcher is congruent with the research question and the methodological approach selected. 6 Data collection or generation and analytical conceptualisation need to be rigorous throughout the research process to secure excellence in the final grounded theory. 44

Procedural precision requires careful attention to maintaining a detailed audit trail, data management strategies and demonstrable procedural logic recorded using memos. 6 Organisation and management of research data, memos and literature can be assisted using software programs such as NVivo. An audit trail of decision-making, changes in the direction of the research and the rationale for decisions made are essential to ensure rigour in the final grounded theory. 6

This article offers a framework to assist novice researchers visualise the iterative processes that underpin a GT study. The fundamental process and methods used to generate an integrated grounded theory have been described. Novice researchers can adapt the framework presented to inform and guide the design of a GT study. This framework provides a useful guide to visualise the interplay between the methods and processes inherent in conducting GT. Research conducted ethically and with meticulous attention to process will ensure quality research outcomes that have relevance at the practice level.

Declaration of conflicting interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.


Use of constant comparative analysis in qualitative research

Affiliation: Distance Learning Centre, South Bank University, London.
  • PMID: 12212430
  • DOI: 10.7748/ns2001.07.15.42.39.c3052

This article describes the application of constant comparative analysis, which is one method that can be used to analyse qualitative data. The need for data analysis to be congruent with the overall research design is highlighted.


The “qualitative” in qualitative comparative analysis (QCA): research moves, case-intimacy and face-to-face interviews

  • Open access
  • Published: 26 March 2022
  • Volume 57, pages 489–507 (2023)


  • Sofia Pagliarin (ORCID: orcid.org/0000-0003-4846-6072),
  • Salvatore La Mendola &
  • Barbara Vis


Qualitative Comparative Analysis (QCA) includes two main components: QCA “as a research approach” and QCA “as a method”. In this study, we focus on the former and, by means of the “interpretive spiral”, we critically look at the research process of QCA. We show how QCA as a research approach is composed of (1) an “analytical move”, where cases, conditions and outcome(s) are conceptualised in terms of sets, and (2) a “membership move”, where set membership values are qualitatively assigned by the researcher (i.e. calibration). Moreover, we show that QCA scholars have not sufficiently acknowledged the data generation process as a constituent research phase (or “move”) for the performance of QCA. This is particularly relevant when qualitative data–e.g. interviews, focus groups, documents–are used for subsequent analysis and calibration (i.e. analytical and membership moves). We call the qualitative data collection process “relational move” because, for data gathering, researchers establish the social relation “interview” with the study participants. By using examples from our own research, we show how a dialogical interviewing style can help researchers gain the in-depth knowledge necessary to meaningfully represent qualitative data into set membership values for QCA, hence improving our ability to account for the “qualitative” in QCA.

1 Introduction

Qualitative Comparative Analysis (QCA) is a configurational comparative research approach and method for the social sciences based on set-theory. It was introduced in crisp-set form by Ragin ( 1987 ) and later expanded to fuzzy sets (Ragin 2000 ; 2008a ; Rihoux and Ragin 2009 ; Schneider and Wagemann 2012 ). QCA is a diversity-oriented approach extending “the single-case study to multiple cases with an eye toward configurations of similarities and differences” (Ragin 2000 :22). QCA aims at finding a balance between complexity and generalizability by identifying data patterns that can exhibit or approach set-theoretic connections (Ragin 2014 :88).

As a research approach, QCA requires researchers first to conceptualise cases as elements belonging, in kind and/or degree, to a selection of conditions and outcome(s) that are conceived as sets. They then assign cases' set membership values in the conditions and outcome(s) (i.e. calibration). Populations are constructed for outcome-oriented investigations, and causation is conceived to be conjunctural and heterogeneous (Ragin 2000: 39ff). As a method, QCA is the systematic and formalised analysis of the calibrated dataset for cross-case comparison through Boolean algebra operations. Combinations of conditions (i.e. configurations) represent both the characterising features of cases and the multiple paths towards the outcome (Byrne 2005).
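To make the set-theoretic vocabulary above concrete, here is a minimal sketch, in Python, of how a calibrated dataset and a case's membership in a configuration could be represented; the case names, condition names and values are purely illustrative assumptions, not data from any of the studies discussed here. In fuzzy-set QCA, membership in a combination of conditions is conventionally the minimum across the conditions, and negation is one minus the membership value.

```python
# Minimal sketch (hypothetical cases, conditions and values): a calibrated
# fuzzy-set dataset and a case's membership in a configuration of conditions.

cases = {
    "case_A": {"EXTERNAL_EVENT": 1.00, "PUBLIC_LED": 0.67, "OUTCOME": 1.00},
    "case_B": {"EXTERNAL_EVENT": 0.67, "PUBLIC_LED": 1.00, "OUTCOME": 0.67},
    "case_C": {"EXTERNAL_EVENT": 0.33, "PUBLIC_LED": 0.67, "OUTCOME": 0.33},
}

def configuration_membership(memberships, present, absent=()):
    """Fuzzy membership in a configuration: the minimum over the conditions
    that must be present and over the negation (1 - x) of those that must be absent."""
    values = [memberships[c] for c in present] + [1 - memberships[c] for c in absent]
    return min(values)

for name, memberships in cases.items():
    score = configuration_membership(memberships, present=("EXTERNAL_EVENT", "PUBLIC_LED"))
    print(f"{name}: membership in EXTERNAL_EVENT*PUBLIC_LED = {score:.2f}")
```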

Most of the critiques of QCA focus on the methodological aspects of "QCA as a method" (e.g. Lucas and Szatrowski 2014), although epistemological issues regarding deterministic causality and subjectivity in assigning set membership values are also discussed (e.g. Collier 2014). In response to these critiques, Ragin (2014; see also Ragin 2000, ch. 11) emphasises the "mindset shift" needed to perform QCA: QCA "as a method" makes sense only if researchers accept "QCA as a research approach", including its qualitative component.

The qualitative character of QCA emerges when recognising the relevance of case-based knowledge or “case intimacy”. The latter is key to perform calibration (see e.g. Ragin 2000 :53–61; Byrne 2005 ; Ragin 2008a ; Harvey 2009 ; Greckhamer et al. 2013 ; Gerrits and Verweij 2018 :36ff): when associating “meanings” to “numbers”, researchers engage in a “dialogue between ideas and evidence” by using set-membership values as “ interpretive tools ” (Ragin 2000 : 162, original emphasis). The foundations of QCA as a research approach are explicitly rooted in qualitative, case-oriented research approaches in the social sciences, in particular in the understanding of causation as multiple and configurational, in terms of combinations of conditions, and in the conceptualisation of populations as types of cases, which should be refined in the course of an investigation (Ragin 2000 : 30–42).

Arguably, QCA researchers should make ample use of qualitative methods for the social sciences, such as narrative or semi-structured interviews, focus groups, discourse and document analysis, because this will help gain case intimacy and enable the dialogue between theories and data. Furthermore, as many QCA-studies have a small to medium sample size (10–50 cases), qualitative data collection methods appear to be particularly appropriate to reach both goals. However, so far only around 30 published QCA studies use qualitative data (de Block and Vis 2018 ), out of which only a handful employ narrative interviews (see Sect.  2 ).

We argue that this puzzling observation about QCA empirical research is due to two main reasons. First, quantitative data, in particular secondary data available from official databases, are more malleable for calibration. Although QCA researchers should carefully distinguish between measurement and calibration (see e.g. Ragin, 2008a , b ; Schneider and Wagemann 2012 , Sect. 1.2), quantitative data are more convenient for establishing the three main qualitative anchors (i.e. the cross-over point as maximum ambiguity; the lower and upper thresholds for full set membership exclusion or inclusion). Quantitative data facilitate QCA researchers in performing QCA both as a research approach and method. QCA scholars are somewhat aware of this when discussing “the two QCAs” (large-n/quantitative data and small-n/more frequent use of qualitative data; Greckhamer et al. 2013 ; see also Thomann and Maggetti 2017 ).
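As an illustration of how the three qualitative anchors are used, the sketch below is a minimal, non-authoritative rendering of a Ragin-style "direct" calibration in Python; the condition (number of stakeholder meetings) and the anchor values are invented for the example, not taken from the studies cited here.

```python
import math

def direct_calibration(raw, full_non, crossover, full_in):
    """Minimal sketch of a direct calibration: map a raw value onto a fuzzy
    membership score using the three qualitative anchors (full non-membership,
    cross-over point of maximum ambiguity, full membership)."""
    upper_log_odds = math.log(0.95 / 0.05)  # log-odds conventionally tied to the full-membership anchor
    lower_log_odds = math.log(0.05 / 0.95)  # log-odds tied to the full non-membership anchor
    deviation = raw - crossover
    scalar = (upper_log_odds / (full_in - crossover) if deviation >= 0
              else lower_log_odds / (full_non - crossover))
    log_odds = deviation * scalar
    return math.exp(log_odds) / (1 + math.exp(log_odds))

# Hypothetical condition "number of stakeholder meetings", with anchors:
# 2 meetings = fully out, 6 = cross-over, 12 = fully in.
for meetings in (1, 4, 6, 9, 15):
    print(meetings, round(direct_calibration(meetings, full_non=2, crossover=6, full_in=12), 2))
```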

Second, the use of qualitative data for performing QCA requires an additional effort on the part of the researcher, because data collected through, for instance, narrative interviews, focus groups and document analysis come in verbal form. Therefore, QCA researchers using qualitative methods for empirical research must first collect data and only then move to their analysis and conceptualisation as sets (analytical move) and their calibration into "numbers" (membership move) for their subsequent handling through QCA procedures (QCA as a method).

Because of these two main reasons, we claim that data generation (or data construction) should also be recognised and integrated in the QCA research process. Fully accounting for QCA as a “qualitative” research approach necessarily entails questions about the data generation process, especially when qualitative research methods are used that come in verbal, and not numerical, form.

This study’s contributions are twofold. First, we present the “interpretative spiral” (see Fig.  1 ) or “cycle” (Sandelowski et al. 2009 ) where data gradually transit through changes of state: from meanings, to concepts to numerical values. In limiting our discussion to QCA as a research approach, we identified three main moves composing the interpretative spiral: the (1) relational (data generation through qualitative methods), (2) analytical (set conceptualisation) and (3) membership (calibration) moves. Second, we show how in-depth knowledge for subsequent set conceptualisation and calibration can be more effectively generated if the researcher is open, during data collection, to support the interviewee’s narration and to establish a dialogue—a relation—with him/her (i.e. the relational move). It is the researcher’s openness that can facilitate the development of case intimacy for set conceptualisation and assessment (analytical and membership moves). We hence introduce a “dialogical” interviewing style (La Mendola 2009 ) to show how this approach can be useful for QCA researchers. Although we mainly discuss narrative interviews, a dialogical interviewing style can also adapt to face-to-face semi-structured interviews or questionnaires.

Figure 1: The interpretative spiral and the relational, analytical and membership moves

Our main aim is to make QCA researchers more aware of “minding their moves” in the interpretative spiral. Additionally, we show how a “dialogical” interviewing style can facilitate the access to the in-depth knowledge of cases useful for calibration. Researchers using narrative interviews who have not yet performed QCA can gain insight into–and potentially see the advantages of–how qualitative data, in particular narrative interviews, can be employed for the performance of QCA (see Gerrits and Verweij 2018 :36ff).

In Sect. 2 we present the interpretative spiral (Fig. 1) and the interconnections between the three moves, and we discuss the limited use of qualitative data in QCA research. In Sect. 3, we examine the use of qualitative data for performing QCA by discussing the relational move and a dialogical interviewing style. In Sect. 4, we examine the analytical and membership moves and discuss how QCA researchers have so far dealt with them when using qualitative data. In Sect. 5, we conclude by putting forward some final remarks.

2 The interpretative spiral and the three moves

Sandelowski et al. (2009) state that the conversion of qualitative data into quantitative data ("quantitizing") necessarily involves "qualitizing", because researchers perform a "continuous cycling between assigning numbers to meaning and meaning to numbers" (p. 213). "Data" are recognised as "the product of a move on the part of researchers" (p. 209, emphasis added) because information has to be conceptualised, understood and interpreted to become "data". In Fig. 1, we tailor this "cycling" to the performance of QCA by means of the interpretative spiral.

Through the interpretative spiral, we show both how knowledge for QCA is transformed into data by means of "moves" and how the gathering of qualitative data is a move in its own right. Our choice of the term "move" is grounded in the need to communicate a sense of movement along the "cycling" between meanings and numbers. Furthermore, the term "move" resonates with the communicative steps that interviewer and interviewee engage in during an interview (see Sect. 3 below).

Although we present these moves as separate, they are in reality interfaces, because they are part of the same interpretative spiral. They can be thought of as moves in a dance; the latter emerges because of the succession of moves and steps as a whole, as we show below.

The analytical and membership moves are intertwined, as shown by the central "vortex" of the spiral in Fig. 1, as they are composed of a number of interrelated steps, in particular case selection, theory-led set conceptualisation, and the definition of the most appropriate set membership scales and of the cross-over and upper and lower thresholds (e.g. crisp-set, 4- or 6-value fuzzy sets; see Ragin 2000:166–171; Rihoux and Ragin 2009). Calibration is the last move of the dialogue between theory (concepts of the analytical move) and data (cases). In the membership move, fuzzy sets are used as "an interpretative algebra, a language that is half-verbal-conceptual and half-mathematical-analytical" (Ragin 2000:4). Calibration is hence a type of "quantitizing" and "qualitizing" (Sandelowski et al. 2009). In applied QCA, set membership values can be reconceptualised and recalibrated. This will for instance be done to solve true logical contradictions in the truth table and when QCA results are interpreted by "going back to cases", hence overlapping with the practices related to QCA "as a method".
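For readers unfamiliar with these scales, here is a minimal sketch, with purely illustrative labels, of how a crisp set and a commonly used four-value fuzzy-set scheme translate verbal assessments into set membership values.

```python
# Minimal sketch (illustrative labels): a crisp set and a commonly used
# four-value fuzzy-set scheme for turning verbal assessments into membership values.

CRISP = {"out": 0, "in": 1}

FOUR_VALUE_FUZZY = {
    "fully out":        0.0,
    "more out than in": 0.33,
    "more in than out": 0.67,
    "fully in":         1.0,
}

def calibrate(assessment, scheme=FOUR_VALUE_FUZZY):
    """Look up the set membership value for a verbal assessment of a case."""
    return scheme[assessment]

print(calibrate("more in than out"))  # -> 0.67
```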

The relational move displayed in Fig. 1 expresses the additional interpretative process that researchers engage in when collecting and analysing qualitative data. De Block and Vis (2018) show that only around 30 published QCA-studies combine qualitative data with QCA, including a range of additional data such as observations, site visits and newspaper articles.

However, a closer look reveals that the majority of the published QCA-studies using qualitative data employ (semi)structured interviews or questionnaires. Footnote 1 For instance, Basurto and Speer ( 2012 ) Footnote 2 proposed a step-wise calibration process based on a frequency-oriented strategy (e.g. number of meetings, amount of available information) to calibrate the information collected through 99 semi-structured interviews. Fischer ( 2015 ) conducted 250 semi-structured interviews by cooperating with four trained researchers using pre-structured questions, where respondents could voluntarily add “qualitative pieces of information” in “an interview protocol” (p. 250). Henik ( 2015 ) structured and carried out 50 interviews on whistle-blowing episodes to ensure subsequent blind coding of a high number of items (almost 1000), arguably making them resemble face-to-face questionnaires.

In turn, only a few QCA-researchers use data from narrative interviews. Footnote 3 For example, Metelits (2009) conducted narrative interviews during ethnographic fieldwork over the course of several years. Verweij and Gerrits (2015) carried out 18 "open" interviews, while Chai and Schoon (2016) conducted "in-depth" interviews. Wang (2016), in turn, conducted structured interviews through a questionnaire, following a similar approach to that of Fischer (2015); however, during the interviews, Wang's respondents were asked to reflexively justify their chosen questionnaire responses, hence moving the structured interviews closer to narrative ones. Tóth et al. (2017) performed 28 semi-structured interviews with company managers to evaluate the quality and attractiveness of customer-provider relationships for maintaining future business relations. Their empirical strategy was, however, grounded in initial focus groups and other semi-structured interviews, composed of open questions in the first part and a questionnaire in the second part (Tóth et al. 2015).

Although no interview is completely structured or unstructured, it is useful to conceptualise (semi-)structured and less structured (or narrative) interviews as the two ends of a continuum (Brinkmann 2014 ). Albeit still relatively rare as compared to quantitative data, the more popular integration of (semi-)structured interviews into QCA might be due to the advantages that this type of qualitative data holds for calibration. The “structured” portion of face-to-face semi-structured interviews or questionnaires facilitates the calibration of this type of qualitative data, because quantitative anchor points can be more clearly identified to assign set membership values (see e.g. Basurto and Speer 2012 ; Fischer 2015 ; Henik 2015 ).

Hence, when critically looking at the "qualitative" character of QCA as a research approach, applied research shows that qualitative methods fit uneasily with QCA. This is because data collection has not been recognised as an integral part of the QCA research process. In Sect. 3, we show how qualitative data, and in particular a dialogical interviewing style, can help researchers to develop case intimacy.

3 The relational move

Social data are not self-evident facts; they do not reveal anything in themselves, and researchers must engage in interpretative efforts concerning their meaning (Sandelowski et al. 2009; Silverman 2017). Stated differently, quantitising and qualitising characterise both quantitative and qualitative social data, albeit to different degrees (Sandelowski et al. 2009). This is an ontological understanding of reality that is held, in different forms, by post-positivist, critical realist, critical and constructivist approaches (but not by positivist scholars; see Guba and Lincoln 2005:193ff). Our position is closest to critical realism, which, in contrast to post-modernist perspectives (Spencer et al. 2014:85ff), holds that reality exists "out there" and that, epistemologically, our knowledge of it, although imperfect, is possible, for instance through the scientific method (Sayer 1992).

The socially constructed, not self-evident character of social data is manifest in the collection and analysis of qualitative data. Access to the field needs to be earned, as do trust and consent from participants, to gradually build and expand a network of participants. More than "collected", data are "gathered", because they imply cooperation with participants. Data from interviews and observations are heterogeneous and need to be transcribed and analysed by researchers, who also self-reflectively experience the entire process of data collection. QCA researchers using qualitative data necessarily have to go through this additional research process, or move, to gather and generate data before QCA as a research approach can even start. Because QCA researchers using qualitative data need to interact with participants to collect their data, we call this additional research process the "relational move".

While we limit our discussion to narrative interviews and select a few references from a vast literature, our claim is that it is the ability of the interviewer to give life to interviews as a distinct type of social interaction that is key for the data collection process (Chase 2005 ; Leech 2002 ; La Mendola 2009 ; Brinkmann. 2014 ). The ability of the interviewer to establish a dialogue with the interviewee–also in the case of (semi-)structured interviews–is crucial to gain access to case-based knowledge and thus develop the case intimacy later needed in the analytical and membership moves. The relational move is about a researcher’s ability to handle the intrinsic duality characterising that specific social interaction we define as an interview. Both (or more) partners have to be considered as necessary actors involved in giving shape to the “inter-view” as an ex-change of views.

Qualitative researchers call this ability “rapport” (Leech, 2002 :665), “contract” or “staging” (Legard et al., 2003 :139). In our specific understanding of the relational move through a “dialogical” Footnote 4 interviewing style, during the interview 1) the interviewer and the interviewee become the “listener” and the “narrator” (Chase, 2005 :660) and 2) a true dialogue between listener and narrator can only take place when they engage in an “I-thou” interaction (Buber 1923 /2008), as we will show below when we discuss selected examples from our own research.

As a communicative style, in a dialogical interview not only can the researcher not disappear behind the veil of objectivity (Spencer et al. 2014), but the researcher is also aware of the relational duality, or "dialogueness", inherent in the "inter-view". Dialogical face-to-face interviewing can be compared to a choreography (Brinkmann 2014:283; Silverman 2017:153) or a dance (La Mendola 2009, ch. 4 and 5) in which one of the partners (the researcher) is the porteur ("supporter") of the interaction. As in a dancing couple, the listener supports, but does not lead, the narrator in the unfolding of her story. The dialogical approach to interviewing is hence non-directive, but supportive. A key characteristic of dialogical interviews is a particular way of "being in the interview" (see Example 2 below), because it requires the researcher to consider the interviewee as a true narrator (a "thou"). Footnote 5

In a dialogical approach to interviews, questions can be thought of as frames through which the listener invites the narrator to tell a story in her own terms (Chase 2005:662). The narrator becomes the "subject of study" who can be disobedient and capable of raising her own questions (Latour 2000:116; see also Lund 2014). This is also compatible with a critical realist ontology and epistemology, which holds that researchers inevitably draw artificial (but negotiable) boundaries around the object and subject of analysis (Gerrits and Verweij 2013). The case-based, or data-driven (ib.), character of QCA as a research approach hence takes on a new meaning: in a dialogical interviewing style, although the interviewer/listener proposes a focus of analysis and a frame of meaning, the interviewee/narrator is given the freedom to re-negotiate that frame of meaning (La Mendola 2009; see Examples 1 and 2 below).

We argue that this is an appropriate way to obtain case intimacy and in-depth knowledge for subsequent QCA, because it is the narrator who proposes meanings that will then be translated by the researcher, in the following moves, into set membership values.

Particularly key to a dialogical interviewing style is question formulation, where the interviewer privileges "how" questions (Becker 1998). In this way, "what" and "why" (evaluative) questions, which ask the interviewee to rationally explain, with hindsight, a process that supposedly developed in a linear way, are avoided. Typifying questions, through which the interviewer gathers general information (e.g. Can you tell me about the process through which an urban project is typically built? Can you tell me about your typical day as an academic?), are also avoided. Footnote 6 "Dialogical" questions can start with "I would like to propose that you tell me about…" and are akin to "grand tour questions" (Spradley 1979; Leech 2002) or questions posed "obliquely" (Roulston 2018), because they aim at collecting stories and episodes in a certain situation or context while allowing the interviewee to be relatively free in answering.

An example taken from our own research on a QCA of large-scale urban transformations in Western Europe illustrates the distinct approach characterising dialogical interviewing. One of our aims was to reconstruct the decision-making process concerning why and how a certain urban transformation took place (Pagliarin et al. 2019 ). QCA has already been previously used to study urban development and spatial policies because it is sensitive to individual cases, while also accounting for cross-case patterns by means of causal complexity (configurations of conditions), equifinality and causal asymmetry (e.g. Byrne 2005 ; Verweij and Gerrits 2015 ; Gerrits and Verweij 2018 ). A conventional way to formulate this question would be: “In your opinion, why did this urban transformation occur at this specific time?” or “Which were the governance actors that decided its implementation?”. Instead, we formulated the question in a narrative and dialogical way:

Example 1 Listener [L]: Can you tell me how the site identification and materialization of Ørestad came about? Narrator [N]: Yes. I mean there’s always a long background for these projects. (…) it’s an urban area built on partly reclaimed land. It was, until the second world war, a seaport and then they reclaimed it during the second world war, a big area. (…) this is the island called Amager. In the western part here, you can see it differs completely from the rest and that’s because they placed a dam all around like this, so it’s below sea level. (…) [L]: When you say “they”, it’s…? [N]: The municipality of Copenhagen. Footnote 7 (…)

In this example, the question posed ("how… [it]… came about?") is open and oriented toward collecting the narrator's specific story about "how" the Ørestad project emerged (Becker 1998), starting at the time point and from the angle chosen by the interviewee. Here, the interviewee decided to start just after the Second World War (although the focus of the research was only from the 1990s onward) and described the area's geographical characteristics as background to the subsequent decision-making processes. It is then up to the researcher to support the narrator in funnelling in on the topics and themes of interest for the research. In the example above, the listener asked "When you say "they", it's…?" to signal to the narrator to be more specific about "they", without, however, presuming to know the answer ("it's…?"). In this way, the narrator is supported in expanding on the role of Copenhagen municipality without being directly asked about it (which nevertheless always remains a possibility to be seized by the interviewer).

The specific “dialogical” way of the researcher of “being in the interview” is rooted in the epistemological awareness of the discrepancy between the narrator’s representation and the listener’s. During an interview, there are a number of “representation loops”. As discussed in the interpretative spiral (see Sect.  2 ), the analytical and membership moves are characterised by a number of research steps; similarly, in the relational move the researcher engages in representation loops or interpretative steps when interacting with the interviewee. The researcher holds ( a ) an analytical representation of her focus of analysis, ( b ) which will be re-interpreted by the interviewee (Geertz, 1973 ). In a dialogical style of interview, the researcher also embraces ( c ) her representation of the ( b ) interviewee's interpretation of ( a ) her theory-led representation of the focus of analysis. Taken together, ( a )-( b )-( c ) are the structuring steps of a dialogical interview, where the listener’s and narrator’s representations “dance” with one another. In the relational move, the interviewer is aware of the steps from one representation to another.

In the following Example 2, the narrator re-elaborated (interpretative step b) the listener's frame of meaning (interpretative step a) by emphasising two development stages of a certain project (an airport expansion in Barcelona, Spain) that the researcher had not previously considered (interpretative step c):

Example 2 [L]: Could you tell me about how the project identification and realisation of the Barcelona airport come about? [N]: Of the Barcelona airport? Well. The Barcelona airport is I think a good thermometer of something deeper, which has been the inclusion of Barcelona and of its economy in the global economy. So, in the last 30 years El Prat airport has lived through like two impulses of development, because it lived, let´s say, the necessary adaptation to a specific event, that is the Olympic games. There it lived its first expansion, to what we today call Terminal 2. So, at the end of the ´80 and early ´90, El Prat airport experienced its first big jump. (...) Later, in 2009 (...) we did a more important expansion, because we did not expand the original terminal, but we did a new, bigger one, (...) the one we now call Terminal 1. Footnote 8

If the interviewee is considered as a “thou”, and if the researcher is aware of the representation loops (see above), the collected information can also be helpful for constructing the study population in QCA. The population under analysis is oftentimes not given in advance but gradually defined through the process of casing (Ragin 2000 ). This allows the researcher to be open to construct the study population “with the help of others”, like “informants, people in the area, the interlocutors” (Lund 2014:227). For instance, in example 2 above, the selection of which urban transformations will form the dataset can depend on the importance given by the interviewees to the structuring impact of a certain urban transformation on the overall urban structure of an urban region.

In sum, the data collection process is a move in its own right within the research process for performing QCA. Especially when the collected data are qualitative, the researcher engages in a relation with the interlocutor to gather information. A dialogical approach emphasises that the quality of the gathered data depends on the quality of the dialogue between narrator and listener (La Mendola 2009). When the listener is open to considering the interviewee as a "thou", and when she is aware of the interpretative steps occurring in the interview, meaningful case-based knowledge can be accessed.

Case intimacy is best developed when the researcher is open to integrating her focus of analysis with fieldwork information and when s/he invites, as in a dance, the narrator to tell his story. However, a dialogical interviewing style is not theory-free, but it is "theory-independent": the dialogical interviewer supports the narration of the interviewee and does not lead the narrator by imposing her own conceptualisations. We argue that such a dialogical I-thou interaction during interviews fosters in-depth knowledge of cases, because the narrator is treated as a subject who can propose his interpretation of the focus of analysis before the researcher frames it within her analytical and membership moves.

However, in practice, there is a tension between the researcher's need to collect data and the "here-and-now interactional event of the interview" (Rapley 2001:310). It is inevitable that the researcher re-elaborates, to a certain degree, her analytical framework during the interviews, because this enables the researcher to get acquainted with the object of analysis and to keep the interview content on target with the research goals (Jopke and Gerrits 2019). But it is this re-interpretation of the interviewee's replies and stories by the listener during the interviews that opens the interviewer's awareness of the representation loops.

4 The analytical and membership moves

Researchers engage in face-to-face interviews as a strategy for data collection by holding specific analytical frameworks and theories. A researcher seldom begins his or her undertakings, even in the exploratory phase, with a completely open mind (Lund 2014:231). This means that the researcher's representations (a and c, see above) of the narrator's representation(s) (b, see above) are related to the theory-led frames of inquiry that the researcher organises to understand the world. These frames are typically also verbal, as “[t]his framing establishes, and is established through, the language we employ to speak about our concerns” (Lund 2014:226).

For the collection of qualitative data in particular, the analytical move is composed of two main movements: during and after the data collection process. During data collection, when adopting a dialogical interviewing style, the researcher should take care to keep the interview theory-independent (see above). First, this means that the interviewee is not asked to rise to the researcher's analytical level. The use of jargon should be avoided, whether in narrative or semi-structured interviews and questionnaires, because it would confine the narrator's representation(s) (b) within the listener's interpretative frames (a), and hence limit the researcher's chance to gain in-depth case knowledge (c). Silverman (2017:154) cautions against "flooding" interviewees with "social science categories, assumptions and research agendas". Footnote 9 In Example 1 above, the use of the words "governance actors" might have misled the narrator, even an expert, since their meaning might not be clear or might not be the same as the interviewer's.

Second, the researcher should neither sympathise with the interviewee nor judge the narrator’s statements, because this would transform the interview into another type of social interaction, such as a conversation, an interrogation or a confession (La Mendola 2009 ). The analytical move requires that the researcher does not confuse the interview as social interaction with his or her analysis of the data, because this is a specific, separate moment after the interview is concluded. Whatever material or stories a researcher receives during the interviews, it is eventually up to him or her to decide which representation(s) will be told (and how) (Stake 2005 :456). It is the job of the researcher to perform the necessary analytical work on the collected data.

After the fieldwork, the second stage of the analytical move is a change of state of the interviewees' replies and stories so that they can subsequently "feed into" QCA. The researcher begins to qualitatively assess and organise the in-depth knowledge, in the form of replies or stories, received from the interviewees through their narrations. This usually involves the (double-)coding of the qualitative material, manually or with dedicated software. The analysis of the qualitative material organises the in-depth knowledge gained through the relational move and sustains the (re)definition of the outcome and conditions, and their related attributes and sub-dimensions, for performing QCA.
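As a minimal sketch of this stage, and purely as an assumption about how such material might be organised (the codes, quotes and condition names below are invented), coded interview segments can be grouped per case and per candidate condition so that the evidence behind each later calibration decision stays traceable.

```python
from collections import defaultdict

# Minimal sketch (hypothetical codes and quotes): coded interview segments are
# grouped per case and per candidate condition, so that the researcher can later
# assess each case's evidence when assigning set membership values.

coded_segments = [
    # (case, candidate condition, code, supporting quote)
    ("case_A", "external_events", "military_site_released", "the army left the site in ..."),
    ("case_A", "public_leadership", "municipal_plan", "the municipality drew up a new plan ..."),
    ("case_B", "external_events", "global_trend", "the wholesale market was dismantled ..."),
]

evidence = defaultdict(lambda: defaultdict(list))
for case, condition, code, quote in coded_segments:
    evidence[case][condition].append({"code": code, "quote": quote})

for case, conditions in evidence.items():
    for condition, items in conditions.items():
        print(case, condition, [item["code"] for item in items])
```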

In recognising the difficulty in integrating qualitative (interview) data into QCA procedures, QCA-researchers have developed templates, tables or tree diagrams to structure the analysed qualitative material into set membership scores (Basurto and Speer 2012 ; Legewie 2017 ; Tóth et al. 2017 ; see also online supplementary material). We call these different templates “ Supports for Membership Representation ” (SMeRs) because they facilitate the passage from conceptualisation (analytical move) to operationalisation into set membership values (membership move). Below, we discuss these templates by placing them along a continuum from “more theory-driven” to “more data-driven” (see Gerrits and Verweij 2018 , ch. 1). Although the studies included below did not use a dialogical approach to interviews, we also examine the SMeRs in terms of their openness towards the collected material. As explained above, we believe it is this openness–at best “dialogical”–that facilitates the development of case intimacy on the side of the researcher. In distinguishing the steps characterising both moves (see Sect.  2 above), below we differentiate the analytical and membership moves.

Basurto and Speer (2012) were the first to develop and present a preliminary but modifiable list of theoretical dimensions for the conditions and outcome. Their interview guideline is purposely designed to obtain responses that identify anchor points prior to the interviews and match fuzzy sets. In our perspective, this contravenes the separation between the relational and analytical moves: the researcher deals with interviewees as "objects" whose shared information is fitted to the researchers' analytical framework. In their analytical move, Basurto and Speer define an ideal and a deviant case, both of them non-observable, to locate their cases by comparison and facilitate the assignment of fuzzy-set membership scores (membership move).

Legewie (2017) proposes a "grid" called Anchored Calibration (AC), building on Goertz (2006). In the analytical move, the researcher first structures (sub-)dimensions for each condition and the outcome by means of concept trees. Each concept is then represented by a gradation, which should form a conceptual continuum (e.g. from low to high) and is organised in a tree diagram to include sub-dimensions of the conditions and outcome. In the membership move, anchor points (i.e. 0, 0.25, 0.75, 1) are assigned to each "graded" concept. The researcher then iteratively matches coded evidence from narrative interviews (analytical move) to the identified anchor points for calibration, thus assigning set membership scores (e.g. 0.33 or 0.67; i.e. membership move). As with Basurto and Speer (2012), the researcher's analytical framework is given priority and tightly structures the collected data. Key to anchored calibration is the conceptual neatness of the SMeR, which is advantageous for the researcher but which, in our perspective, allows only a limited dialogue with the cases and hence limits the development of case intimacy.

An alternative route is the one proposed by Tóth et al. ( 2017 ). The authors devise the Generic Membership Evaluation Template (GMET) as a “grid” where qualitative information from the interviews (e.g. quotes) and from the researcher’s interpretative process is included. In the analytical move, their template clearly serves as a “translation support” to represent “meanings” into “numbers”: researchers included information on how they interpreted the evidence (e.g. positive/negative direction/effect on membership of a certain attribute; i.e. analytical move), as well as an explanation of why specific set membership scores have been assigned to cases (i.e. membership move). Tóth et al.’s ( 2017 ) SMeR appears more open to the interviewees’ perspective, as researchers engaged in a mixed-method research process where the moment of data collection–the relational move–is elaborated on (Tóth et al. 2015 ). We find their approach more effective for gaining in-depth knowledge of cases and for supporting the dialogue between theory and data.
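To give a rough sense of what such a "translation support" might look like in practice, here is a minimal sketch of a SMeR-like record; the field names and values are our own illustrative assumptions, not Tóth et al.'s actual template.

```python
from dataclasses import dataclass, field

@dataclass
class MembershipEvaluation:
    """One record of a SMeR-like grid (field names are illustrative assumptions):
    verbal evidence, the researcher's interpretation and the assigned set
    membership value are kept side by side so the calibration stays traceable."""
    case: str
    condition: str
    quotes: list = field(default_factory=list)  # direct quotes from the interviews
    interpretation: str = ""                    # how the evidence was read (analytical move)
    direction: str = ""                         # e.g. "raises membership" / "lowers membership"
    membership: float = 0.5                     # assigned set membership value (membership move)
    rationale: str = ""                         # why this score was chosen

record = MembershipEvaluation(
    case="case_B",
    condition="external_events",
    quotes=["the wholesale market dismantles and opens an opportunity ..."],
    interpretation="an external trend is present but had only a general influence on redevelopment",
    direction="raises membership, weakly",
    membership=0.33,
    rationale="event present, but its impact on project implementation was limited",
)
print(record.case, record.condition, record.membership)
```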

Jopke and Gerrits ( 2019 ) discuss routines, concrete procedures and recommendations on how to inductively interpret and code qualitative interview material for subsequent calibration by using a grounded-theory approach. In their analytical move, the authors show how conditions can be constructed from the empirical data collected from interviews; they suggest first performing an open coding of the interview material and then continuing with a theoretical coding (or “closed coding”) that is informed by the categories identified in the previous open coding procedure, before defining set membership scores for cases (i.e. membership move). Similar to Tóth et al. ( 2017 ), Jopke and Gerrits’ ( 2019 ) SMeR engages with the data collection and the gathered qualitative material by being open to what the “data” have to “tell”, hence implementing a strategy for data analysis that is effective to gain in-depth knowledge of cases.

Another type of SMeR is the elaboration of summaries of the interview material by unit of analysis (e.g. urban transformations, participation initiatives, interviewees' individual career paths). Rihoux and Lobe (2009) propose so-called short case descriptions (SCDs). Footnote 10 As a possible step within the interpretative spiral available to the researcher, SCDs are concise summaries that synthesise the most important information sorted by certain identified dimensions, which will then compose the conditions, and their sub-dimensions, for QCA. As a type of SMeR, the summaries consist of a change of state of the qualitative material, because they provide "intermediate" information at the threshold between the coding of the interview transcripts and the subsequent assignment of membership scores (the membership move, or calibration) for the outcome and each condition. Furthermore, the writing of short summaries appears to be particularly useful for allowing researchers who have already performed narrative interviews to evaluate whether to carry out QCA as a systematic method for comparative analysis. For instance, similar to what Tóth et al. (2017:200) did to reduce interview bias, in our own research interviewees could cover the development of multiple cases, and the use of short summaries helped us compare information for each case across multiple interviewees and spot possible contradictions.

The overall advantage of SMeRs is helping researchers provide an overview of the quality and “patchiness” of available information about the cases per interview (or document). SMeRs can also help spot inconsistencies and contradictions, thus guiding researchers to judge if their data can provide sufficiently homogeneous information for the conditions and outcome composing their QCA-model. This is particularly relevant in case-based QCA research, where descriptive inferences are drawn from the material collected from the selected cases and the degree of its internal validity (Thomann and Maggetti 2017 :361). Additionally, the issue of the “quality” and “quantity” across the available qualitative data (de Block and Vis 2018 ) can be checked ex-ante before embarking on QCA.
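The following minimal sketch, with invented cases, conditions and counts, illustrates one way such an ex-ante check of the "patchiness" of the evidence could be carried out: counting coded segments per case and condition and flagging empty cells in the case-by-condition grid.

```python
# Minimal sketch (hypothetical data): checking the "patchiness" of the gathered
# evidence before calibration by counting coded segments per case and condition
# and flagging empty cells in the case-by-condition grid.

conditions = ["external_events", "public_leadership", "private_investment"]
evidence_counts = {
    "case_A": {"external_events": 3, "public_leadership": 2, "private_investment": 0},
    "case_B": {"external_events": 1, "public_leadership": 0, "private_investment": 2},
}

for case, counts in evidence_counts.items():
    gaps = [c for c in conditions if counts.get(c, 0) == 0]
    if gaps:
        print(f"{case}: insufficient evidence for {', '.join(gaps)}")
    else:
        print(f"{case}: evidence available for all conditions")
```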

For the membership move, the GMET, the AC, grounded theory coding and short summaries support the qualitative assignment of set membership values from empirical interview data. SMeRs typically include an explanation of why a certain set membership score has been assigned to each case record, and diagrammatically arrange information about the interpretation path that researchers have followed to attribute values. They are hence a true "interface" between qualitative empirical data ("words/meaning") and set membership values ("numbers"). Each dimension included in SMeRs can also be coupled with direct quotes from the interviews (Basurto and Speer 2012; Tóth et al. 2017).

In our own research (Pagliarin et al. 2019 ), after having coded the interview narratives, we developed concepts and conditions first by comparing the gathered information through short summaries—similar to short case descriptions (SCDs), see Rihoux and Lobe ( 2009 )—and then by structuring the conditions and indicators in a grid by adapting the template proposed by Tóth et al. ( 2017 ). One of the goals of our research was to identify “external factors or events” affecting the formulation and development of large-scale urban transformations. External (national and international) events (e.g. failed/winning bid for the Olympic Games, fall of Iron Curtain/Berlin wall) do not have an effect per se, but they stimulate actors locally to make a certain decision about project implementation. We were able to gain this knowledge because we adopted a dialogical interviewing style (see Example 3 below). As the narrator is invited to tell us about some of the most relevant projects of urban transformation in Greater Copenhagen in the past 25–30 years, the narrator is free to mention the main factors and actors impacting on Ørestad as an urban transformation.

Example 3 [L]: In this interview, I would propose that you tell me about some of the most relevant projects of urban transformation that have been materialized in Greater Copenhagen in the past 25–30 years. I would like you to tell me about their itinerary of development, step by step, and if possible from where the idea of the project emerged. [N]: Okay, I will try to start in the 80’s. In the 80’s, there was a decline in the city of Copenhagen. (…) In the end of the 80’s and the beginning of the 90’s, there was a political trend. They said, “We need to do something about Copenhagen. It is the only big city in Denmark so if we are going to compete with other cities, we have to make something for Copenhagen so it can grow and be one of the cities that can compete with Amsterdam, Hamburg, Stockholm and Berlin”. I think also it was because of the EU and the market so we need to have something that could compete and that was the wall falling in Berlin. (…) The Berlin Wall, yes. So, at that time, there was a commission to sit down with the municipality and the state and they come with a plan or report. They have 20 goals and the 20 goals was to have a bridge to Sweden, expanding of the airport, a metro in Copenhagen, investment in cultural buildings, investment in education. (…) In the next 5 years, from the beginning of the 90’s to the middle of the 90’s, there were all of these projects more or less decided. (…) The state decided to make the airport, to make the bridge to Sweden, to make… the municipality and the city of Copenhagen decides to make Ørestad and the metro together with the state. So, all these projects that were lined up on the report, it was, let’s decide in the next 5 years. [L]: So, there was a report that decided at the end of the 80’s and in the 90’s…? [N]: Yes, ‘89. (…) To make all these projects, yes. (…). [L]: Actually, one of the projects I would like you to tell me about is the Ørestad. R: Yes. It is the Ørestad. The Ørestad was a transformation… (…).

The factors mentioned by the interviewee corresponded to the main topics of interest to the researcher. In this example, we can also highlight the presence of a "prompt" (Leech 2002) or "clue" (La Mendola 2009). To keep the narrator focused, the researcher "brings back" (the original meaning of rapporter) the interviewee to the main issues of the inter-view by asking "So, there was a report…".

Following the question formulation as shown in example 3, below we compare the external event(s) impacting the cases of Lyon Part-Dieu in France (Example 4 ) and Scharnhauserpark in Stuttgart in Germany (Example 5 ).

Example 4 [N]: So, Part-Dieu is a transformation of the 1970s, to equip [Lyon] with a Central Business District like almost all Western cities, following an encompassing regional plan. This is however not local planning, but it is part of a major national policy. (…) To counterbalance the macrocephaly of Paris, 8 big metropolises were identified to re-balance territorial development at the national level in the face of Paris. (…) including Lyon. (…) The genesis of Part-Dieu is, in my opinion, a real-estate opportunity, and the fact to have military barracks in an area (…) 15 min away from the city centre (…) to reconvert in a business district. Footnote 11
Example 5 [N]: When the American Army left the site in 1992, the city of Ostfildern consisted of five villages. They bought the site and they said, “We plan and build a new centre for our village”, because these are five villages and this is in the very centre. It’s perfectly located, and when they started they had 30,000 inhabitants and now that it’s finished, they have 40,000, so a third of the population were added in the last 20 years by this project. For a small municipality like Ostfildern, it was a tremendous effort and they were pretty good at it. Footnote 12

In the examples above, Lyon Part-Dieu and Scharnhauserpark are unique cases and developed into areas with different functions (a business district and a mixed-use area), but we can identify a similar event: the unforeseen dismantling of military barracks. Both events were considered external factors, identifiable at a specific point in time, that triggered the redevelopment of the areas. By contrast, in the following illustration of the "Confluence" urban renewal in Lyon, the identified external event relates to a global trend regarding post-industrial cities and the "patchwork" replacement of functions in urban areas:

Example 6 [N]: The Confluence district (…) the wholesale market dismantles and opens an opportunity at the south of the Presqu'Île, so an area extremely well located, we are in the city centre, with water all around because of the Saône and Rhône rivers, so offering a great potential for a high quality of life. However, I say “potential” because there is also a highway passing at the boundary of the neighbourhood. Footnote 13

Although our theoretical framework identified a set of exogenous factors affecting large-scale urban transformations locally, we used the empirical material from our interviews to conceptualise the closing of military barracks and the dismantling of the wholesale market as two different, but similar, types of external events, and considered them to be part of the same "external events" condition. In set-theoretic terms, this condition is defined as a "set of projects where external (unforeseen) events or general/international trends had a large impact on project implementation". The broad set conceptualisation of this condition is possibly not optimal, as it reflects the tension in comparative research between capturing cases' individual histories (case idiosyncrasies) and forming concepts that are abstract "enough" to account for cross-case patterns (see Gerrits and Verweij 2018; Jopke and Gerrits 2019). This is a key challenge of the analytical move.

However, the core of the subsequent membership move is precisely to perform a qualitative assessment that captures these differences by assigning different set-membership values. In the case of Lyon Confluence, where the closing of the wholesale market as an external event did happen but had only a "general" influence on the area's redevelopment, the case was given a set membership value of 0.33 in this condition. In contrast, the case of Lyon Part-Dieu was given a set membership score of 0.67 in the condition "external events" because a French military area was dismantled, but this was combined with a national strategy of the French state to redistribute territorial development across France. According to our analysis of the collected qualitative material, the dismantling of the military area was an advantage, but the redevelopment of Part-Dieu would probably have been affected anyway by the overall national territorial strategy. Footnote 14 Finally, the case of Stuttgart Scharnhauserpark was given full membership (1.00) in the condition, because the US army left the area, an indication of a "fully exogenous" event that truly stimulated urban change in Scharnhauserpark. Footnote 15
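To summarise the calibration decisions just described, here is a minimal sketch that simply restates the membership values reported above for the "external events" condition; the data structure and the wording of the justifications are only illustrative.

```python
# Set membership values in the "external events" condition, as reported in the
# text for Examples 4-6; the structure and justification strings are illustrative.

external_events = {
    "Lyon_Confluence":            (0.33, "wholesale market closed, but only a general influence on redevelopment"),
    "Lyon_Part_Dieu":             (0.67, "military area dismantled, combined with a national territorial strategy"),
    "Stuttgart_Scharnhauserpark": (1.00, "US army left the site: a fully exogenous trigger of urban change"),
}

for case, (score, why) in external_events.items():
    print(f"{case}: {score:.2f} - {why}")
```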

Our calibration (membership move) of the three cases illustrated in Examples 4, 5 and 6 shows that set membership values represent a concept, at times a relatively broad one so as to allow comparison (analytical move), but that they do not replace the specific way (or "meaning") in which the impact of external factors empirically instantiates in each of the cases discussed in the above examples.

In the interpretative spiral (Fig. 1), there is hence, despite our wishes, no perfect correspondence between meanings and numbers (quantitising) or between numbers and meanings (qualitising; see Sandelowski et al. 2009). This is a consequence of the constructed nature of social data (see Sect. 2). When using qualitative data, fuzzy sets are "interpretive tools" to operationalise theoretical concepts (Ragin 2000:162, original emphasis) and hence are approximations to reality. In other words, set membership values are tokens. Here, we agree with Sandelowski et al. (2009), who are critical of "the rhetorical appeal of numbers" (p. 208) and the vagaries of ordinal categories in questionnaires (p. 211ff).

Note that calibration using qualitative data is not blurry or unreliable. On the contrary, its robustness is given by the quality of the dialogue established between researcher and interviewee and by the acknowledgement that the analytical and membership moves are themselves types of representation, a fourth and fifth representation loop. It may hence be that QCA researchers using qualitative data have a different research experience of QCA as a research approach and method than QCA researchers using quantitative data.

5 Conclusion

In this study, we critically observed that, so far, qualitative data have been used in few QCA studies, and that only a handful of these use narrative interviews (de Block and Vis 2018). This situation is puzzling because qualitative research methods can offer an effective route to in-depth case knowledge, or case intimacy, which is considered key to performing QCA.

Besides the higher malleability of quantitative data for set conceptualisation and calibration (here called the "analytical" and "membership" moves), we claimed that the limited use of qualitative data in applied QCA research stems from the failure to recognise that the data collection process is a constituent part of QCA "as a research approach". Qualitative data, such as interviews, focus groups or documents, come in verbal form, hence less "ready" for calibration than quantitative data, and require a research phase of their own for data collection (here called the "relational move"). The relational, analytical and membership moves form an "interpretative spiral" that accounts for the main research phases composing QCA "as a research approach".

In the relational move, we showed how researchers can gain access to in-depth case-based knowledge, or case intimacy, by adopting a "dialogical" interviewing style (La Mendola 2009). First, researchers should be aware of the discrepancy between the interviewee/narrator's representation and the interviewer/listener's. Second, researchers should establish an "I-thou" relationship with the narrator (Buber 1923/2010; La Mendola 2009). As in a dancing couple, the interviewer/listener should accompany, but not lead, the narrator in the unfolding of her story. These are fundamental routes to making the most of QCA's qualitative potential as a "close dialogue with cases" (Ragin 2014:81).

In the analytical and membership moves, researchers code, structure and interpret their data to assign crisp- and fuzzy-set membership values. We examined the variety of templates–what we call Supports for Membership Representation (SMeRs)–designed by QCA-researchers to facilitate the assignment of “numbers” to “words” (Rihoux and Lobe 2009 ; Basurto and Speer 2012 ; Legewie 2017 ; Tóth et al. 2015 , 2017 ; Jopke and Gerrits 2019 ).

Our study did not offer an overarching examination of the research process involved in QCA, but critically focussed on a specific aspect of QCA as a research approach. We focussed on the “translation” of data collected through qualitative research methods (“words” and “meanings”) into set membership values (“numbers”). Hence, in this study the discussion of QCA as a method has been limited.

We hope our paper is a first contribution towards identifying and critically examining the "qualitative" character of QCA as a research approach. Further research could identify other relevant moves in QCA as a research approach, especially when non-numerical data are employed and with regard to internal and external validity. Other moves and steps could also be identified or clearly labelled in QCA as a method, in particular when assessing limited diversity, skewness (e.g. a "data distribution" step) and the management of true logical contradictions (e.g. a "solving contradictions" move). These are all different mo(ve)ments in the full-fledged application of QCA that allow researchers to make sense of their data and to connect "theory" and "evidence".

As also noted by de Block and Vis (2018), QCA researchers are not always clear about what exactly they mean by "in-depth" or "open" interviews and how these informed the calibration process (e.g. Verweij and Gerrits 2015), especially when quantitative data and different coders were also used (e.g. Chai and Schoon 2016).

See online appendix.

We are aware that other studies combining narrative interviews and QCA have been carried out, but here we limit our discussion only to already published articles that we are aware of at the time of writing.

Without going into further details on this occasion, the term “dialogical” explicitly refers to the “dialogical epistemology” as discussed by Buber ( 1923 /2008) who distinguishes between an “I-thou” relation and an “I-it” experience. In this perspective, “dialogical” is considered as a synonym of “relational” (i.e. “I-thou” relation).

See footnote 4.

The interviewer avoids posing evaluative and typifying questions to the narrator, but the former naturally works through evaluative and typifying research questions.

Copenhagen, Interview 5, September 1, 2016.

Barcelona, Interview 1, June 27, 2016. Translated from the original Spanish.

We take the risk of quoting Silverman (2017), although in his article he warned against extracting and using quotes to support researchers' arguments.

Gerrits and Verweij ( 2018 ) also emphasise the usefulness of thick case descriptions.

Lyon, Interview 4, October, 13 2016. Translated from the original French.

Stuttgart, Interview 1, July, 18 2016.

Lyon, Interview 1, October 11, 2016. Translated from the original French.

This consideration also relates to the interdependence, and not necessarily independence, of conditions in QCA, which is a topic that is beyond the scope of this study (see e.g. Jopke and Gerrits 2019 ).

For a discussion regarding the “absence” of possible factors from the interviewees' narrations, we refer readers to Sandelowski et al. ( 2009 ) and de Block and Vis ( 2018 ). In general, data triangulation is a good strategy to deal with partial and even contradictory information collected from multiple interviewees. For our own strategy regarding data triangulation, we also used an online questionnaire, additional literature and site visits (Pagliarin et al. 2019 ).

Basurto, X., Speer, J.: Structuring the calibration of qualitative data as sets for qualitative comparative analysis (QCA). Field Methods 24 , 155–174 (2012)

Becker, H.S.: Tricks of the Trade: How to Think About Your Research While You’re Doing It. University Press, Chicago (1998)

Brinkmann, S.: Unstructured and Semistructured Interviewing. In: Leavy, P. (ed.) The Oxford Handbook of Qualitative Research, pp. 277–300. University Press, Oxford (2014)

Buber, M.: I and Thou. Charles Scribner's Sons, New York (1923/2010)

Byrne, D.: Complexity configurations and cases. Theory Cult. Soc. 22 (5), 95–111 (2005)

Chai, Y., Schoon, M.: Institutions and government efficiency: decentralized Irrigation management in China. Int. J. Commons 10 (1), 21–44 (2016)

Chase, S.E.: Narrative inquiry: multiple lenses, approaches, voices. In: Denzin, N.K., Lincoln, Y.S. (eds.) The Sage Handbook of Qualitative Research, pp. 631–679. Sage, Thousand Oaks, CA (2005)

Collier, D.: Symposium: The Set-Theoretic Comparative Method—Critical Assessment and the Search for Alternatives. ID 2463329, SSRN Scholarly Paper, 1 July. Rochester, NY: Social Science Research Network. Available at: https://papers-ssrn-com.eur.idm.oclc.org/abstract=2463329 (Accessed 9 March 2021). (2014)

de Block, D., Vis, B.: Addressing the challenges related to transforming qualitative into quantitative data in qualitative comparative analysis. J. Mixed Methods Res. 13 , 503–535 (2018). https://doi.org/10.1177/1558689818770061

Fischer, M.: Institutions and coalitions in policy processes: a cross-sectoral comparison. J. Publ. Policy 35 , 245–268 (2015)

Geertz, C.: The Interpretation of Cultures. Basic Books, New York (1973)

Gerrits, L., Verweij, S.: The Evaluation of Complex Infrastructure Projects. A Guide to Qualitative Comparative Analysis. Edward Elgar, Cheltenham UK (2018)

Goertz, G.: Social Science Concepts. A User’s Guide. University Press, Princeton (2006)

Greckhamer, T., Misangyi, V.F., Fiss, P.C.: Chapter 3 the two QCAs: from a small-N to a large-N set theoretic approach, In Fiss, P.C., Cambré, B. and Marx, A. (Eds.), Configurational theory and methods in organizational research (Research in the Sociology of Organizations, Vol. 38), Emerald Group Publishing Limited, Bingley, pp. 49–75. https://doi.org/10.1108/S0733-558X(2013)0000038007 (2013)

Guba, E.G., Lincoln, Y.S.: Paradigmatic controversies, contradictions and emerging confluences. In: Denzin, N.K., Lincoln, Y.S. (eds.) The Sage Handbook of Qualitative Research, pp. 191–215. Sage, Thousand Oaks, CA (2005)

Harvey, D.L.: Complexity and case. In: Byrne, D., Ragin, C.C. (eds.) The SAGE Handbook of Case-Based Methods, pp. 15–38. SAGE Publications Inc, London (2009)


Henik, E.: Understanding whistle-blowing: a set-theoretic approach. J. Bus. Res. 68 , 442–450 (2015)

Jopke, N., Gerrits, L.: Constructing cases and conditions in QCA – lessons from grounded theory. Int. J. Soc. Res. Methodol. 22 (6), 599–610 (2019). https://doi.org/10.1080/13645579.2019.1625236

La Mendola, S.: Centrato e Aperto: dare vita a interviste dialogiche [Centred and Open: Give life to dialogical interviews]. UTET Università, Torino (2009)

Latour, B.: When things strike back: a possible contribution of ‘science studies’ to the social sciences. Br. J. Sociol. 51 , 107–123 (2000)

Leech, B.L.: Asking questions: Techniques for semistructured interviews. Polit. Sci. Polit. 35 , 665–668 (2002)

Legard, R., Keegan, J., Ward, K.: In-depth interviews. In: Richie, J., Lewis, J. (eds.) Qualitative Research Practice, pp. 139–168. Sage, London (2003)

Legewie, N.: Anchored Calibration: From qualitative data to fuzzy sets. In: Forum Qualitative Sozialforschung / Forum: Qualitative Social Research 18 (3), 14 (2017). https://doi.org/10.17169/fqs-18.3.2790

Lucas, S.R., Szatrowski, A.: Qualitative comparative analysis in critical perspective. Sociol. Methodol. 44 (1), 1–79 (2014)

Metelits, C.M.: The consequences of rivalry: explaining insurgent violence using fuzzy sets. Polit. Res. q. 62 , 673–684 (2009)

Pagliarin, S., Hersperger, A.M., Rihoux, B.: Implementation pathways of large-scale urban development projects (lsUDPs) in Western Europe: a qualitative comparative analysis (QCA). Eur. Plan. Stud. 28 , 1242–1263 (2019). https://doi.org/10.1080/09654313.2019.1681942

Ragin, C.C.: The Comparative Method. Moving Beyond Qualitative and Quantitative Strategies. University of California Press, Berkeley and Los Angeles (1987)

Ragin, C.C.: Fuzzy-Set Social Science. University Press, Chicago (2000)

Ragin, C.C.: Redesigning Social Inquiry. Fuzzy Sets and Beyond. University Press, Chicago (2008a)

Ragin, C.C.: Fuzzy sets: calibration versus measurement. In: Collier, D., Brady, H., Box-Steffensmeier, J. (eds.) Methodology Volume of Oxford Handbooks of Political Science, pp. 174–198. University Press, Oxford (2008b)

Ragin, C.C.: Comment: Lucas and Szatrowski in Critical Perspective. Sociol. Methodol. 44 (1), 80–94 (2014)

Rapley, T.J.: The art (fulness) of open-ended interviewing: some considerations on analysing interviews. Qual. Res. 1 (3), 303–323 (2001)

Rihoux, B., Ragin, C. (eds.): Configurational Comparative Methods. Qualitative Comparative Analysis (QCA) and related Techniques. Sage, Thousand Oaks, CA (2009)

Rihoux, B., Lobe, B.: The case for qualitative comparative analysis (QCA): adding leverage for thick cross-case comparison. In: Byrne, D., Ragin, C.C. (eds.) The SAGE Handbook of Case-Based Methods, pp. 222–242. SAGE Publications Inc, London (2009)

Roulston, K.: Qualitative interviewing and epistemics. Qual. Res. 18 (3), 322–341 (2018)

Sandelowski, M., Voils, C.I., Knafl, G.: On quantitizing. J. Mixed Methods Res. 3 , 208–222 (2009)

Sayer, A.: Method in Social Science. A Realist Approach. Routledge, London (1992)

Schneider, C.Q., Wagemann, C.: Set-Theoretic Methods for the Social Sciences. A Guide to Qualitative Comparative Analysis. University Press, Cambridge (2012)

Silverman, D.: How was it for you? The Interview Society and the irresistible rise of the (poorly analyzed) interview. Qual. Res. 17 (2), 144–158 (2017)

Spencer, R., Pryce, J.M., Walsh, J.: Philosophical approaches to qualitative research. In: Leavy, P. (ed.) The Oxford Handbook of Qualitative Research, pp. 81–98. University Press, Oxford (2014)

Spradley, J.P.: The ethnographic interview. Holt Rinehart and Winston, New York (1979)

Stake, R.E.: Qualitative case studies. In: Denzin, N.K., Lincoln, Y.S. (eds.) The Sage Handbook of Qualitative Research, pp. 443–466. Sage, Thousand Oaks, CA (2005)

Thomann, E., Maggetti, M.: Designing research with qualitative comparative analysis (QCA): approaches, challenges, and tools. Sociol. Methods Res. 49 (2), 356–386 (2017)

Tóth, Z., Thiesbrummel, C., Henneberg, S.C., Naudé, P.: Understanding configurations of relational attractiveness of the customer firm using fuzzy set QCA. J. Bus. Res. 68 (3), 723–734 (2015)

Tóth, Z., Henneberg, S.C., Naudé, P.: Addressing the ‘qualitative’ in fuzzy set qualitative comparative analysis: the generic membership evaluation template. Ind. Mark. Manage. 63 , 192–204 (2017)

Verweij, S., Gerrits, L.M.: How satisfaction is achieved in the implementation phase of large transportation infrastructure projects: a qualitative comparative analysis into the A2 tunnel project. Public W. Manag. Policy 20 , 5–28 (2015)

Wang, W.: Exploring the determinants of network effectiveness: the case of neighborhood governance networks in Beijing. J. Public Adm. Res. Theory 26 , 375–388 (2016)


Acknowledgements

The authors would like to thank the two reviewers who provided great insights and careful remarks, thus allowing us to improve the quality of the manuscript. During a peer-review process lasting for more than 2 years, we intensely felt the pushes and slows, and at times the impasses, of a fruitful dialogue on the qualitative and quantitative aspects of comparative analysis in the social sciences.

Open Access funding enabled and organized by Projekt DEAL. This research has been partially funded through the Consolidator Grant (ID: BSCGIO 157789), held by Prof. h. c. Dr. Anna M. Hersperger, provided by the Swiss National Science Foundation.

Author information

Authors and affiliations

Utrecht University School of Governance, Utrecht University, Utrecht, The Netherlands

Barbara Vis

Department of Philosophy, Sociology, Pedagogy and Applied Psychology, Padua University, Padua, Italy

Salvatore La Mendola

Chair for the Governance of Complex and Innovative Technological Systems, Otto-Friedrich-Universität Bamberg, Bamberg, Germany

Sofia Pagliarin

Landscape Ecology Research Unit, CONCUR Project, Swiss Federal Research Institute WSL, Birmensdorf, Zurich, Switzerland


Corresponding author

Correspondence to Sofia Pagliarin.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest to report.



About this article

Pagliarin, S., La Mendola, S. & Vis, B. The “qualitative” in qualitative comparative analysis (QCA): research moves, case-intimacy and face-to-face interviews. Qual Quant 57, 489–507 (2023). https://doi.org/10.1007/s11135-022-01358-0


Accepted: 20 February 2022

Published: 26 March 2022

Issue Date: February 2023

DOI: https://doi.org/10.1007/s11135-022-01358-0


Keywords: Calibration, Data generation, Interviewing, In-depth knowledge, Qualitative data

The Constant Comparative Method of Qualitative Analysis

[This paper was originally published in Social Problems, 12 (1965), pp. 436-445, and later as Chapter V in Glaser, B.G. & Strauss, A.L. (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research. New York: Aldine DeGruyter.]

Barney G. Glaser, Ph.D.

Currently, the general approaches to the analysis of qualitative data are these:

1.) If the analyst wishes to convert qualitative data into crudely quantifiable form so that he can provisionally test a hypothesis, he codes the data first and then analyzes it. He makes an effort to code “all relevant data [that] can be brought to bear on a point,” and then systematically assembles, assesses and analyzes these data in a fashion that will “constitute proof for a given proposition.”i

2.) If the analyst wishes only to generate theoretical ideas—new categories and their properties, hypotheses and interrelated hypotheses—he cannot be confined to the practice of coding first and then analyzing the data since, in generating theory, he is constantly redesigning and reintegrating his theoretical notions as he reviews his material.ii Analysis of this kind serves his purpose, but the explicit coding itself often seems an unnecessary, burdensome task. As a result, the analyst merely inspects his data for new properties of his theoretical categories, and writes memos on these properties.

We wish to suggest a third approach to the analysis of qualitative data—one that combines, by an analytic procedure of constant comparison, the explicit coding procedure of the first approach and the style of theory development of the second. The purpose of the constant comparative method of joint coding and analysis is to generate theory more systematically than allowed by the second approach, by using explicit coding and analytic procedures. While more systematic than the second approach, this method does not adhere completely to the first, which hinders the development of theory because it is designed for provisional testing, not discovering, of hypotheses.iii This method of comparative analysis is to be used jointly with theoretical sampling, whether for collecting new data or for previously collected or compiled qualitative data.

Systematizing the second approach (inspecting data and redesigning a developing theory) by this method does not supplant the skills and sensitivities required in generating theory. Rather, the constant comparative method is designed to aid the analyst who possesses these abilities in generating a theory that is integrated, consistent, plausible, close to the data—and at the same time is in a form clear enough to be readily, if only partially, operationalized for testing in quantitative research. Still dependent on the skills and sensitivities of the analyst, the constant comparative method is not designed (as methods of quantitative analysis are) to guarantee that two analysts working independently with the same data will achieve the same results; it is designed to allow, with discipline, for some of the vagueness and flexibility that aid the creative generation of theory.

If a researcher using the first approach (coding all data first) wishes to discover some or all of the hypotheses to be tested, typically he makes his discoveries by using the second approach of inspection and memo-writing along with explicit coding. By contrast, the constant comparative method cannot be used for both provisional testing and discovering theory: in theoretical sampling, the data collected are not extensive enough and, because of theoretical saturation, are not coded extensively enough to yield provisional tests, as they are in the first approach. They are coded only enough to generate, hence to suggest, theory. Partial testing of theory, when necessary, is left to more rigorous approaches (sometimes qualitative but usually quantitative). These come later in the scientific enterprise (see Chapter X).

The first approach also differs in another way from the constant comparative method. It is usually concerned with a few hypotheses couched at the same level of generality, while our method is concerned with many hypotheses synthesized at different levels of generality. The reason for this difference between methods is that the first approach must keep the theory tractable so that it can be provisionally tested in the same presentation. Of course, the analyst using this approach might, after proving or disproving his hypotheses, attempt to explain his findings with more general ideas suggested by his data, thus achieving some synthesis at different levels of generality.

A fourth general approach to qualitative analysis is “analytic induction,” which combines the first and second approaches in a manner different from the constant comparative method.iv Analytic induction has been concerned with generating and proving an integrated, limited, precise, universally applicable theory of causes accounting for a specific behavior (e.g., drug addiction, embezzlement). In line with the first approach, it tests a limited number of hypotheses with all available data, consisting of numbers of clearly defined and carefully selected cases of the phenomena. Following the second approach, the theory is generated by the reformulation of hypotheses and redefinition of the phenomena forced by constantly confronting the theory with negative cases—cases which do not confirm the current formulation.

In contrast to analytic induction, the constant comparative method is concerned with generating and plausibly suggesting (but not provisionally testing) many categories, properties, and hypotheses about general problems (e.g., the distribution of services according to the social value of clients). Some of these properties may be causes, as in analytic induction, but unlike analytic induction, others are conditions, consequences, dimensions, types, processes, etc. In both approaches, these properties should result in an integrated theory. Further, no attempt is made by the constant comparative method to ascertain either the universality or the proof of suggested causes or other properties. The constant comparative method, unlike analytic induction, is also more likely to be applied in the same study to any kind of qualitative information, including observations, interviews, documents, articles, books, and so forth. As a consequence, the constant comparisons required by both methods differ in breadth of purpose, extent of comparing, and what data and ideas are compared.

Clearly the purposes of both these methods for generating theory supplement each other, as well as the first and second approaches. All four methods provide different alternatives for qualitative analysis. Table I locates the use of the approaches to qualitative analysis and provides a scheme for locating additional approaches according to their purposes. The general idea of the constant comparative method can also be used for generating theory in quantitative research. Then one compares findings within subgroups and with external groups (see Chapter VIII).

Table I. Use of Approaches to Qualitative Analysis (cannot be shown, see PDF Version)

The Constant Comparative Method

We shall describe the constant comparative method in four stages: (1) comparing incidents applicable to each category, (2) integrating categories and their properties, (3) delimiting the theory, and (4) writing the theory. Although this method of generating theory is a continuously growing process—each stage after a time is transformed into the next—earlier stages do remain in operation simultaneously throughout the analysis, and each provides continuous development to its successive stage until the analysis is terminated.

Comparing incidents applicable to each category. The analyst starts by coding each incident in his data into as many categories of analysis as possible, as categories emerge or as data emerge that fit an existing category. For example, the category of “social loss” of dying patients emerged quickly from comparisons of nurses’ responses to the potential deaths of their patients. Each relevant response involved the nurse’s appraisal of the degree of loss that her patient would be to his family, his occupation, or society: “He was so young,” “He was to be a doctor,” “She had a full life,” or “What will the children and her husband do without her?”v

Coding need consist only of noting categories on margins, but can be done more elaborately ( e.g., on cards). It should keep track of the comparison group in which the incident occurs. To this procedure we add the basic, defining rule for the constant comparative method: while coding an incident for a category, compare it with the previous incidents in the same and different groups coded in the same category . For example, as the analyst codes an incident in which a nurse responds to the potential “social loss” of a dying patient, he also compares this incident, before further coding, with others previously coded in the same category. Since coding qualitative data requires study of each incident, this comparison can often be based on memory. Usually there is no need to refer to the actual note on every previous incident for each comparison.
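As a purely illustrative aside, the defining rule above can be sketched in a few lines of Python. The incident texts, group names, and the class and function names are hypothetical stand-ins; the substantive comparing is of course done by the analyst, not by a program.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Incident:
    text: str    # excerpt from field notes or an interview
    group: str   # the comparison group in which the incident occurred

@dataclass
class Category:
    name: str
    incidents: List[Incident] = field(default_factory=list)   # previously coded incidents
    memos: List[str] = field(default_factory=list)            # theoretical notes

def code_incident(incident: Incident, category: Category) -> None:
    # While coding an incident for a category, compare it with the previous
    # incidents, from the same and different groups, coded in the same category.
    for earlier in category.incidents:
        category.memos.append(
            f"Compare '{incident.text}' ({incident.group}) "
            f"with '{earlier.text}' ({earlier.group})"
        )
    category.incidents.append(incident)

social_loss = Category("social loss")
code_incident(Incident("He was so young", "premature-baby service"), social_loss)
code_incident(Incident("What will the children and her husband do without her?", "cancer ward"), social_loss)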

This constant comparison of the incidents very soon starts to generate theoretical properties of the category. The analyst starts thinking in terms of the full range of types or continua of the category, its dimensions, the conditions under which it is pronounced or minimized, its major consequences, its relation to other categories, and its other properties. For example, while constantly comparing incidents on how nurses respond to the social loss of dying patients, we realized that some patients are perceived as a high social loss and others as a low social loss, and that patient care tends to vary positively with degree of social loss. It was also apparent that some social attributes that nurses combine to establish a degree of social loss are seen immediately (age, ethnic group, social class), while others are learned only after time is spent with the patient (occupational worth, marital status, education). This observation led us to the realization that perceived social loss can change as new attributes of the patients are learned. It also became apparent, from studying the comparison groups, under what conditions (types of wards and hospitals) we would find clusters of patients with different degrees of social loss.

As categories and their properties emerge, the analyst will discover two kinds: those that he has constructed himself (such as “social loss” or “calculation” of social loss), and those that have been abstracted from the language of the research situation. (For example, “composure” was derived from nurses’ statements like “I was afraid of losing my composure when the family started crying over their child.”) As his theory develops, the analyst will notice that the concepts abstracted from the substantive situation will tend to be the current labels in use for the actual processes and behaviors that are to be explained, while the concepts constructed by the analyst will tend to be the explanations.vi For example, a nurse’s perception of the social loss of a dying patient will affect (an explanation) how she maintains her composure (a behavior) in his presence.

After coding for a category perhaps three or four times, the analyst will find conflicts in the emphases of his thinking. He will be musing over theoretical notions and, at the same time, trying to concentrate on his study of the next incident, to determine the alternate ways by which it should be coded and compared. At this point the second rule of the constant comparative method is: stop coding and record a memo on your ideas. This rule is designed to tap the initial freshness of the analyst’s theoretical notions and to relieve the conflict in his thoughts. In doing so, the analyst should take as much time as necessary to reflect and carry his thinking to its most logical (grounded in the data, not speculative) conclusions. There is no scheduled routine covering the amount to be coded per day, as there is in predesigned research. The analyst may spend hours on one page or he may code twenty pages in a half hour, depending on the relevance of the material, saturation of categories, emergence of new categories, stage of formulation of the theory, and, of course, the mood of the analyst, since this method takes his personal sensitivity into consideration. These factors are in a continual process of change.

If one is working on a research team, it is also a good idea to discuss theoretical notions with one or more teammates. Teammates can help bring out points missed, add points they have run across in their own coding and data collection, and crosscheck the analyst’s points. They, too, begin to compare the analyst’s notions with their own ideas and knowledge of the data; this comparison generates additional theoretical ideas. With clearer ideas on the emerging theory systematically recorded, the analyst then returns to the data for more coding and constant comparison.

From the point of view of generating theory it is often useful to write memos on, as well as code, the copy of one’s field notes. Memo writing on the field note provides an immediate illustration for an idea. Also, since an incident can be coded for several categories, this tactic forces the analyst to use an incident as an illustration only once, for the most important among the many properties of diverse categories that it indicates. He must look elsewhere in his notes for illustration for his other properties and categories. This corrects the tendency to use the same illustration over and over for different properties.

The generation of theory requires that the analyst take apart the story within his data. Therefore when he rearranges his memos and field notes for writing up his theory, he sufficiently “fractures” his story at the same time that he saves apt illustrations for each idea (see Step 4). At just this point in his writing, breaking down and out of the story is necessary for clear integration of the theory.

Integrating categories and their properties . This process starts out in a small way; memos and possible conferences are short. But as the coding continues, the constant comparative units change from comparison of incident with incident to comparison of incident with properties of the category that resulted from initial comparisons of incidents. For example, in comparing incident with incident we discovered the property that nurses constantly recalculate a patient’s social loss as they learn more about him. From then on, each incident bearing on “calculation” was compared with “accumulated knowledge on calculating”- not with all other incidents involving calculation. Thus, once we found that age was the most important characteristic in calculating social loss, we could discern how a patient’s age affected the nurses’ recalculation of social loss as they found out more about his education. We found that education was most influential in calculations of the social loss of a middle-aged adult, since for a person of this age, education was considered to be of most social worth. This example also shows that constant comparison causes the accumulated knowledge pertaining to a property of the category to readily start to become integrated; that is, related in many different ways, resulting in a unified whole.
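A similarly hedged sketch of the shift described above, from comparing incident with incident to comparing incident with the category's accumulated properties; the property strings and the helper name are invented for illustration only.

calculation_properties = {
    "age is the baseline attribute for calculating social loss",
    "perceived social loss is recalculated as new attributes are learned",
}

def integrate(incident_insight: str, properties: set) -> bool:
    # Compare an insight drawn from a new incident with the accumulated
    # knowledge on the category; fold it in only if it adds something new.
    # The substantive judgment remains the analyst's.
    if incident_insight in properties:
        return False
    properties.add(incident_insight)
    return True

integrate("education weighs most heavily for middle-aged patients", calculation_properties)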

In addition, the diverse properties themselves start to become integrated. Thus, we soon found that the calculating and recalculating of social loss by nurses was related to their development of a social loss “story” about the patient. When asked about a dying patient, nurses would tell what amounted to a story about him. The ingredients of this story consisted of a continual balancing out of social loss factors as the nurses learned more about the patient. Both the calculus of social loss and the social loss story were related to the nurse’s strategies for coping with the upsetting impact on her professional composure of, say, a dying patient with a high social loss (e.g., a mother with two children). This example further shows that the category becomes integrated with other categories of analysis: the social loss of the dying patient is related to how nurses maintain professional composure while attending his dying.vii Thus the theory develops as different categories and their properties tend to become integrated through constant comparisons that force the analyst to make some related theoretical sense of each comparison.

If the data are collected by theoretical sampling at the same time that they are analyzed (as we suggest should be done), then integration of the theory is more likely to emerge by itself. By joint collection and analysis, the sociologist is tapping to the fullest extent the in vivo patterns of integration in the data itself; questions guide the collection of data to fill in gaps and to extend the theory—and this also is an integrative strategy. Emergence of integration schemes also occurs in analyses that are separate from data collection, but more contrivance may be necessary when the data run thin and no more can be collected. (Other aspects of integration have been discussed in Chapter II.)

Delimiting the theory. As the theory develops, various delimiting features of the constant comparative method begin to curb what could otherwise become an overwhelming task. Delimiting occurs at two levels: the theory and the categories. First, the theory solidifies, in the sense that major modifications become fewer and fewer as the analyst compares the next incidents of a category to its properties. Later modifications are mainly on the order of clarifying the logic, taking out nonrelevant properties, integrating elaborating details of properties into the major outline of interrelated categories and—most important—reduction.

By reduction we mean that the analyst may discover underlying uniformities in the original set of categories or their properties, and can then formulate the theory with a smaller set of higher-level concepts. This delimits its terminology and text. Here is an illustration which shows the integration of more details into the theory and some consequent reduction: We decided to elaborate our theory by adding detailed strategies used by the nurses to maintain professional composure while taking care of patients with varying degrees of social loss. We discovered that the rationales which nurses used, when talking among themselves, could all be considered “loss rationales.” The underlying uniformity was that all these rationales indicated why the patient, given his degree of social loss, would, if he lived, now be socially worthless: in spite of the social loss, he would be better off dead. For example, he would have brain damage, or be in constant, unendurable pain, or have no chance for a normal life.

Through further reduction of terminology we were also discovering that our theory could be generalized so that it pertained to the care of all patients (not just the dying ones) by all staff (not just nurses). On the level of formal theory, it could even be generalized as a theory of how the social values of professionals affect the distribution of their services to clients; for example, how they decide who among many waiting clients should next receive a service, and what caliber of service he should be given.
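Reduction, as described in the two paragraphs above, can be pictured as mapping the original categories onto a smaller set of higher-level concepts. The mapping below is only a rough sketch with labels drawn loosely from the examples in the text, not the actual category system of the study.

reduction = {
    "brain-damage rationale": "loss rationale",
    "unendurable-pain rationale": "loss rationale",
    "no-normal-life rationale": "loss rationale",
    "social loss of dying patients": "social value of clients",
    "nurses' distribution of care": "professionals' distribution of services",
}

higher_level_concepts = sorted(set(reduction.values()))
print(higher_level_concepts)   # the smaller, more parsimonious vocabulary of the theory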

Thus, with reduction of terminology and consequent generalizing forced by constant comparisons (some comparisons can at this point be based on the literature of other professional areas), the analyst starts to achieve two major requirements of theory: (1) parsimony of variables and formulation, and (2) scope in the applicability of the theory to a wide range of situations,viii while keeping a close correspondence of theory and data.

The second level for delimiting the theory is a reduction in the original list of categories for coding. As the theory grows, becomes reduced, and increasingly works better for ordering a mass of qualitative data, the analyst becomes committed to it. His commitment now allows him to cut down the original list of categories for collecting and coding data, according to the present boundaries of his theory. In turn, his consideration, coding, and analyzing of incidents can become more select and focused. He can devote more time to the constant comparison of incidents clearly applicable to this smaller set of categories.

Another factor, which still further delimits the list of categories, is that they become theoretically saturated. After an analyst has coded incidents for the same category a number of times, he learns to see quickly whether or not the next applicable incident points to a new aspect. If yes, then the incident is coded and compared. If no, the incident is not coded, since it only adds bulk to the coded data and nothing to the theory.ix For example, after we had established age as the base line for calculating social loss, no longer did we need to code incidents referring to age for calculating social loss. However, if we came across a case where age did not appear to be the base line (a negative case), the case was coded and then compared. In the case of an 85-year-old dying woman who was considered a great social loss, we discovered that her “wonderful personality” outweighed her age as the most important factor for calculating her social loss. In addition, the amount of data the analyst needs to code is considerably reduced when the data are obtained by theoretical sampling; thus he saves time in studying his data for coding.
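The saturation rule in this paragraph amounts to a simple decision: code and compare an incident only if it points to a new aspect or is a negative case. The toy function below, with an invented name and boolean flags, merely restates that decision; it is not an operational test of saturation.

def should_code(points_to_new_aspect: bool, is_negative_case: bool) -> bool:
    # An incident that merely repeats already saturated properties adds bulk
    # to the coded data but nothing to the theory; a negative case (one that
    # contradicts the current formulation) is always coded and compared.
    return points_to_new_aspect or is_negative_case

should_code(points_to_new_aspect=False, is_negative_case=True)   # the 85-year-old woman: code and compare
should_code(points_to_new_aspect=False, is_negative_case=False)  # saturated category: skip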

Theoretical saturation of categories also can be employed as a strategy in coping with another problem: new categories will emerge after hundreds of pages of coding, and the question is whether or not to go back and re-code all previously coded pages. The answer for large studies is “no”. The analyst should start to code for the new category where it emerges, and continue for a few hundred pages of coding, or until the remaining (or additionally collected) data have been coded, to see whether the new category has become theoretically saturated. If it has, then it is unnecessary to go back, either to the field or the notes, because theoretical saturation suggests that what has been missed will probably have little modifying effect on the theory. If the category does not saturate, then the analyst needs to go back and try to saturate it, provided it is central to the theory.

Theoretical saturation can help solve still another problem concerning categories. If the analyst has collected his own data, then from time to time he will remember other incidents that he observed or heard but did not record. What does he do now? If the unrecorded incident applies to an established category, after comparison it can either be ignored because the category is saturated; or, if it indicates a new property of the category, it can be added to the next memo and thus integrated into the theory. If the remembered incident generates a new category, both incident and category can be included in a memo directed toward their place in the theory. However, if it becomes central to the theory, the memo becomes a directive for further coding of the field notes and for returning to the field or library to collect more data.

The universe of data that the constant comparative method uses is based on the reduction of the theory and the delimitation and saturation of categories. Thus, the collected universe of data is first delimited and then, if necessary, carefully extended by a return to data collection according to the requirements of theoretical sampling. Research resources are economized by this theoretical delimiting of the possible universe of data, since working within limits forces the analyst to spend his time and effort only on data relevant to his categories. In large field studies, with long lists of possibly useful categories and thousands of pages of notes embodying thousands of incidents, each of which could be coded a multitude of ways, theoretical criteria are very necessary for paring down an otherwise monstrous task to fit the available resources of personnel, time, and money. Without theoretical criteria, delimiting a universe of collected data, if done at all, can become very arbitrary and less likely to yield an integrated product; the analyst is also more likely to waste time on what may later prove to be irrelevant incidents and categories.

Writing theory . At this stage in the process of qualitative analysis, the analyst possesses coded data, a series of memos, and a theory. The discussions in his memos provide the content behind the categories, which become the major themes of the theory later presented in papers or books. For example, the major themes (section titles) for our paper on social loss were “calculating social loss,” “the patient’s social loss story,” and “the impact of social loss on the nurse’s professional composure.”

When the researcher is convinced that his analytic framework forms a systematic substantive theory, that it is a reasonably accurate statement of the matters studied, and that it is couched in a form that others going into the same field could use—then he can publish his results with confidence. To start writing one’s theory, it is first necessary to collate the memos on each category, which is easily accomplished since the memos have been written about categories. Thus, we brought together all memos on calculating social loss for summarizing and, perhaps, further analyzing before writing about it. One can return to the coded data when necessary to validate a suggested point, pinpoint data behind a hypothesis or gaps in the theory, and provide illustrations.x
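Collating memos by category, as described above, is mechanically simple; the sketch below assumes memos were recorded as (category, text) pairs, with hypothetical contents.

from collections import defaultdict

memos = [
    ("calculating social loss", "Age appears to be the baseline attribute."),
    ("loss rationales", "Rationales explain why the patient would be better off dead."),
    ("calculating social loss", "Education matters most for middle-aged patients."),
]

collated = defaultdict(list)
for category, text in memos:
    collated[category].append(text)

for category, notes in collated.items():
    print(category)            # becomes a major theme (section title) of the written theory
    for note in notes:
        print(" -", note)      # the content behind that theme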

Properties of the Theory

Using the constant comparative method makes probable the achievement of a complex theory that corresponds closely to the data, since the constant comparisons force the analyst to consider much diversity in the data. By diversity we mean that each incident is compared with other incidents, or with properties of a category, in terms of as many similarities and differences as possible. This mode of comparing is in contrast to coding for crude proofs; such coding only establishes whether an incident indicates the few properties of the category that are being counted.

The constant comparison of incidents in this manner tends to result in the creation of a “developmental” theory.xi Although this method can also be used to generate static theories, it especially facilitates the generation of theories of process, sequence, and change pertaining to organizations, positions, and social interaction. But whether the theory itself is static or developmental, its generation, by this method and by theoretical sampling, is continually in process. In comparing incidents, the analyst learns to see his categories in terms of both their internal development and their changing relations to other categories. For example, as the nurse learns more about the patient, her calculations of social loss change; and these recalculations change her social loss stories, her loss rationales, and her care of the patient.

This is an inductive method of theory development. To make theoretical sense of so much diversity in his data, the analyst is forced to develop ideas on a level of generality higher in conceptual abstraction than the qualitative material being analyzed. He is forced to bring out underlying uniformities and diversities, and to use more abstract concepts to account for differences in the data. To master his data, he is forced to engage in reduction of terminology. If the analyst starts with raw data, he will end up initially with a substantive theory: a theory for the substantive area on which he has done research (for example, patient care or gang behavior). If he starts with the findings drawn from many studies pertaining to an abstract sociological category, he will end up with a formal theory pertaining to a conceptual area (such as stigma, deviance, lower class, status congruency, organizational careers, or reference groups).xii To be sure, as we described in Chapter IV, the level of generality of a substantive theory can be raised to a formal theory. (Our theory of dying patients’ social loss could be raised to the level of how professional people give service to clients according to their respective social value.) This move to formal theory requires additional analysis of one’s substantive theory, and the analyst should, as stated in the previous chapter, include material from other studies with the same formal theoretical import, however diverse their substantive content.xiii The point is that the analyst should be aware of the level of generality from which he starts in relation to the level at which he wishes to end.

The constant comparative method can yield either discussion or propositional theory. The analyst may wish to cover many properties of a category in his discussion or to write formal propositions about a category. The former type of presentation is often sufficiently useful at the exploratory stage of theory development, and can easily be translated into propositions by the reader if he requires a formal hypothesis. For example, two related categories of dying are the patient’s social loss and the amount of attention he receives from nurses. This can easily be restated as a proposition: patients considered a high social loss, as compared with those considered a low social loss, will tend to receive more attention from nurses.

i Howard S. Becker and Blanche Geer, “The Analysis of Qualitative Field Data” in Richard N. Adams and Jack J. Preiss (Eds.), Human Organization Research (Homewood, Ill.:Dorsey Press, Inc., 1960), pp.279-89. See also Howard S. Becker, “Problems of Inference and Proof in Participant Observation,” American Sociological Review, (December, 1958), pp. 652-60; and Bernard Berelson, Content Analysis (Glencoe, Ill.: Free Press. 1952), Chapter III, and p. 16.

ii Constantly redesigning the analysis is a well-known normal tendency in qualitative research (no matter what the approach to analysis), which occurs throughout the whole research experience from initial data collection through coding to final analysis and writing. The tendency has been noted in Becker and Geer, op. cit., p. 270; Berelson, op. cit., p. 125; and for an excellent example of how it goes on, see Robert K. Merton, Social Theory and Social Structure (New York: Free Press of Glencoe, 1957), pp. 390-392. However, this tendency may have to be suppressed in favor of the purpose of the first approach; but in the second approach and the approach presented here, the tendency is used purposefully as an analytic strategy.

iii Our other purpose in presenting the constant comparative method may be indicated by a direct quotation from Robert K. Merton—a statement he made in connection with his own qualitative analysis of locals and cosmopolitans as community influentials: “This part of our report, then, is a bid to the sociological fraternity for the practice of incorporating in publications a detailed account of the ways in which qualitative analysis actually developed. Only when a considerable body of such reports are available will it be possible to codify methods of qualitative analysis with something of the clarity with which quantitative methods have been articulated.” Op. cit., p. 390. This is, of course, also the basic position of Paul F. Lazarsfeld. See Allen H. Barton and Paul F. Lazarsfeld, “Some Functions of Qualitative Analysis in Social Research,” in Seymour M. Lipset and Neil J. Smelser (Eds.), Sociology: the Progress of a Decade (Englewood Cliffs, N.J.: Prentice-Hall, 1961). It is the position that has stimulated the work of Becker and Geer, and of Berelson, cited in Footnote 1.

iv See Alfred R. Lindesmith, Opiate Addiction (Bloomington: Principia Press, 1947), pp. 12-14; Donald R. Cressey, Other People’s Money (New York: Free Press of Glencoe, 1953), p. 16 and passim; and Florian Znaniecki, The Method of Sociology (New York: Farrar and Rinehart, 1934), pp. 249-331.

v Illustrations will refer to Barney G. Glaser and Anselm L. Strauss, “The Social Loss of Dying Patients,” American Journal of Nursing, 64 (June, 1964) pp. 119-121.

vi Thus we have studies of delinquency, justice, “becoming,” stigma, consultation, consolation, contraception, etc.; these usually become the variables or processes to be described and explained.

vii See Glaser and Strauss, “Awareness and the Nurse’s Composure,” in Chapter 13 in Awareness of Dying (Chicago: Aldine Publishing Co., 1965).

viii Merton, op. cit., p.260

ix If the analyst’s purpose, besides developing theory, is also to count incidents for a category to establish provisional proofs, then he must code the incident. Furthermore, Merton has made the additional point, in correspondence, that to count for establishing provisional proofs may also feedback to developing the theory, since frequency and cross-tabulation of frequencies can also generate new theoretical ideas. See Berelson on the conditions under which one can justify time-consuming, careful counting; op. cit., pp.128-134. See Becker and Geer for a new method of counting the frequency of incidents; op. cit., pp. 283-87.

x On “pinpointing,” see Anselm Strauss, Leonard Schatzman, Rue Bucher, Danuta Ehrlich and Melvin Sabshin, Psychiatric Ideologies and Institutions (New York: Free Press of Glencoe, 1964), Chapter 2, “Logic, Techniques and Strategies of Team Fieldwork.”

xi Recent calls for more developmental, as opposed to static, theories have been made by Wilbert Moore, “Predicting Discontinuities in Social Change,” American Sociological Review 29 (1964), p. 322; Howard S. Becker, Outsiders (New York: Free Press of Glencoe, 1962), pp. 22-25; and Barney G. Glaser and Anselm Strauss, “Awareness Contexts and Social Interaction,” op. cit.

xii For an example, see Barney G. Glaser, Organizational Careers (Chicago: Aldine Publishing Co., 1967).

xiii “…the development of any of these coherent analytic perspectives is not likely to come from those who restrict their interest exclusively to one substantive area.” From Erving Goffman, Stigma: Notes on the Management of Spoiled Identity (Englewood Cliffs, N.J.: Prentice Hall, 1963), p. 147. See also Reinhard Bendix, “Concepts and Generalizations in Comparative Sociological Studies,” American Sociological Review, 28 (1963), pp. 532-39.



