What is comparative analysis? A complete guide

Last updated: 18 April 2023

Reviewed by: Jean Kaluza

Comparative analysis is a valuable tool for acquiring deep insights into your organization’s processes, products, and services so you can continuously improve them. 

Similarly, if you want to streamline operations, price appropriately, and ultimately become a market leader, you’ll likely need to draw on comparative analyses quite often.

When faced with multiple options or solutions to a given problem, a thorough comparative analysis can help you compare and contrast your options and make a clear, informed decision.

If you want to get up to speed on conducting a comparative analysis or need a refresher, here’s your guide.


What exactly is comparative analysis?

A comparative analysis is a side-by-side comparison that systematically compares two or more things to pinpoint their similarities and differences. The focus of the investigation might be conceptual—a particular problem, idea, or theory—or perhaps something more tangible, like two different data sets.

For instance, you could use comparative analysis to investigate how your product features measure up to the competition.

After a successful comparative analysis, you should be able to identify strengths and weaknesses and clearly understand which product is more effective.

You could also use comparative analysis to examine different methods of producing that product and determine which way is most efficient and profitable.

The potential applications for using comparative analysis in everyday business are almost unlimited. That said, a comparative analysis is most commonly used to examine:

Emerging trends and opportunities (new technologies, marketing)

Competitor strategies

Financial health

Effects of trends on a target audience

Why is comparative analysis so important?

Comparative analysis can help narrow your focus so your business pursues the most meaningful opportunities rather than attempting dozens of improvements simultaneously.

A comparative approach also helps frame data in ways that illuminate interrelationships. For example, comparative research might reveal nuanced relationships or critical contexts behind specific processes or dependencies that wouldn’t be well understood without the research.

For instance, if your business compares the cost of producing several existing products relative to which ones have historically sold well, that should provide helpful information once you’re ready to look at developing new products or features.

Comparative vs. competitive analysis—what’s the difference?

Comparative analysis is generally divided into three subtypes, using quantitative or qualitative data and then extending the findings to a larger group. These include:

Pattern analysis—identifying patterns, or recurrences of trends and behavior, across large data sets.

Data filtering—analyzing large data sets to extract an underlying subset of information. It may involve rearranging, excluding, and apportioning comparative data to fit different criteria (see the short example after this list).

Decision tree—flowcharting to visually map and assess potential outcomes, costs, and consequences.
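To make the first two subtypes concrete, here is a minimal sketch in Python using pandas; the data set, column names, and groupings are invented purely for illustration and are not part of the original guide.

```python
# A hypothetical mini data set of product reviews, used only to illustrate
# "data filtering" and simple "pattern analysis" with pandas.
import pandas as pd

reviews = pd.DataFrame({
    "product": ["A", "A", "B", "B", "B", "A"],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q3", "Q3"],
    "rating": [4.1, 4.3, 3.2, 3.5, 3.9, 4.4],
    "mentions_price": [True, False, True, True, False, False],
})

# Data filtering: extract the subset that fits a chosen criterion.
price_sensitive = reviews[reviews["mentions_price"]]

# Pattern analysis: look for recurring trends, e.g. average rating per product per quarter.
trend = reviews.groupby(["product", "quarter"])["rating"].mean().unstack("quarter")

print(price_sensitive)
print(trend)
```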

In contrast, competitive analysis is a type of comparative analysis in which you deeply research one or more of your industry competitors. In this case, you’re using qualitative research to explore what the competition is up to across one or more dimensions.

For example:

Service delivery—metrics like Net Promoter Score indicate customer satisfaction levels.

Market position—the share of the market that the competition has captured.

Brand reputation—how well-known or recognized your competitors are within their target market.

Tips for optimizing your comparative analysis

Conduct original research

Thorough, independent research is a significant asset when doing comparative analysis. It provides evidence to support your findings and may present a perspective or angle not considered previously. 

Make analysis routine

To get the maximum benefit from comparative research, make it a regular practice, and establish a cadence you can realistically stick to. Some business areas you could plan to analyze regularly include:

Profitability

Competition

Experiment with controlled and uncontrolled variables

In addition to simply comparing and contrasting, explore how different variables might affect your outcomes.

For example, a controllable variable would be offering a seasonal feature, such as a shopping bot to assist with holiday shopping, or raising or lowering the selling price of a product.

Uncontrollable variables include weather, changing regulations, the current political climate, or global pandemics.

Put equal effort into each point of comparison

Most people enter into comparative research with a particular idea or hypothesis already in mind to validate. For instance, you might set out to prove that launching a new service is worthwhile. So, you may be disappointed if your analysis results don’t support your plan.

However, in any comparative analysis, try to maintain an unbiased approach by spending equal time debating the merits and drawbacks of any decision. Ultimately, this will be a more practical and sustainable long-term approach for your business than focusing only on the evidence that favors your argument or strategy.

Writing a comparative analysis in five steps

To put together a coherent, insightful analysis that goes beyond a list of pros and cons or similarities and differences, try organizing the information into these five components:

1. Frame of reference

Here is where you provide context. First, what driving idea or problem is your research anchored in? Then, for added substance, cite existing research or insights from a subject matter expert, such as a thought leader in marketing, startup growth, or investment.

2. Grounds for comparison

Why have you chosen to examine the two things you’re analyzing instead of focusing on two entirely different things? What are you hoping to accomplish?

3. Thesis

What argument or choice are you advocating for? What will be the before and after effects of going with either decision? What do you anticipate happening with and without this approach?

For example, “If we release an AI feature for our shopping cart, we will have an edge over the rest of the market before the holiday season.” The finished comparative analysis will weigh all the pros and cons of choosing to build the expensive new AI feature, including variables like how “intelligent” it will be, what it “pushes” customers to use, and how much work it takes off the plates of the customer service team.

Ultimately, you will gauge whether building an AI feature is the right plan for your e-commerce shop.

4. Organizational scheme

Typically, there are two ways to organize a comparative analysis report. First, you can discuss everything about comparison point “A” and then everything about comparison point “B.” Or, you can alternate back and forth between points “A” and “B,” sometimes referred to as point-by-point analysis.

Using the AI feature as an example again, you could first cover all the pros and cons of building the AI feature, then discuss the benefits and drawbacks of forgoing it. Or you could compare and contrast each aspect one at a time: for example, a side-by-side comparison of shopping with the AI feature versus shopping without it, before moving on to the next point of differentiation.

5. Connect the dots

Tie it all together in a way that either confirms or disproves your hypothesis.

For instance, “Building the AI bot would allow our customer service team to save 12% on returns in Q3 while offering optimizations and savings in future strategies. However, it would also increase the product development budget by 43% in both Q1 and Q2. Our budget for product development won’t increase again until series 3 of funding is reached, so despite its potential, we will hold off building the bot until funding is secured and more opportunities and benefits can be proved effective.”


Original Research Article

Quantitative Methods for the Comparative Analysis of Cities in History

  • 1 Mansueto Institute for Urban Innovation, University of Chicago, Chicago, IL, United States
  • 2 Department of Ecology & Evolution, Department of Sociology, University of Chicago, Chicago, IL, United States
  • 3 Santa Fe Institute, Santa Fe, NM, United States
  • 4 School of Sustainability, Arizona State University, Tempe, AZ, United States
  • 5 ASU–Santa Fe Institute Center for Biosocial Complex Systems, Arizona State University, Tempe, AZ, United States

Comparative studies of cities throughout history are one of the greatest sources of insight into the nature of change in human societies. This paper discusses strategies to anchor these comparisons on well-defined, quantitative and empirical characteristics of cities, derived from theory and observable in the archeological and historical records. We show how quantitative comparisons based on a few simple variables across settlements allow us to analyze how different places and peoples dealt with general problems of any society. These include demographic change, the organization of built spaces, the intensity and size of socioeconomic networks and the processes underlying technological change and economic growth. Because the historical record contains a much more varied and more independent set of experiences than contemporary urbanization, it has a unique power for illuminating present puzzles of human development and testing emergent urban theory.

Introduction

Cities have always held a special fascination for scholars of human societies. Coincident with the advent of the first cities, we observe the appearance of many technologies and adaptations that, in different forms, are still with us today (Adams, 2005). Thus, the experience of living in cities (Wirth, 1938; Lees, 2015) provides a common thread throughout history, connecting common phenomena across different societies and thus also identifying features that are truly contextual.

Performing comparative analyses of different societies is always an exercise fraught with challenges. There is the empirical challenge of identifying cultural, social, political and economic traits, which can be measured in very different settings. But there is another difficulty when doing comparative analysis which habitually goes unnoticed. The identification of common traits is often conditioned on performance measures, such as rates of economic growth or energy use per capita, which convey a sense of what today we find important ( Mcfarlane, 2010 ). Assessing the nature, and even the quality, of ancient societies can easily be biased by using the socioeconomic experience of today's high-income nations and their recent history. Is evidence for improvements in diet or material conditions in ancient societies to be disregarded because these same societies did not experience high (by today's standards) output growth rates? Such a stance crudely disregards many of the extraordinary adaptations and inventions—social, cultural, and technological—of earlier societies. Alternatively, conditioning on environmental stewardship and sustainability leads to the opposite conclusion, ranking smaller scale societies that had less impact on their immediate natural environments as having higher quality than most recent societies.

There is however an alternative to such approaches, which starts with much more basic but also more pervasive features of any settled human society ( Bettencourt, 2013 ; Ortman et al., 2014 ). A number of recent new ideas, supported by extensive empirical analyses, point to certain quantitative comparisons of basic general quantities that may shed light on a number of key puzzles about the organization, sociality and capacity for adaptation of past urban societies ( Bettencourt et al., 2007 ; Fletcher, 2011 ; Bettencourt, 2013 ; Ortman et al., 2014 , 2015 , 2016 ; Cesaretti et al., 2016 ; Hanson and Ortman, 2017 ; Ortman and Coffey, 2017 ). Such puzzles include the relative size, structure and flows between settlements in the same polity, the nature of socioeconomic networks in cities, the spatial organization of settlements, and the nature of change and adaptation in these systems, including processes of economic growth (Economic growth is here understood to be simply an increase, from one period to the next, in a society's material output). What is most important to capture through such comparisons, in our view, is how different societies deal with general problems affecting them all, including energy and resource extraction, and the organization of their socioeconomic networks over space and time ( Bettencourt, 2013 ; Morris, 2013 ; Ortman et al., 2014 ).

As we look back at history from a modern perspective, shaped by an urban planet with large human population and fast economic growth and technological change, these puzzles become especially poignant: Are pre-industrial societies fundamentally different in the way people lived and interacted? Or are these differences primarily connected to issues of scope, scale and technology? Can we identify, clearly and empirically, lines of continuity and divergence in the structure and dynamics of urbanizing societies?

These puzzles cannot be answered simply by using the present as the baseline for comparison: what is needed is a framework that makes comparison between the experiences of the past and life in the present conceptually coherent and empirically consistent ( Bettencourt, 2013 ; Ortman et al., 2014 ). Here, we explore three strategies for quantitative analysis of settlements throughout history. We discuss how these are undertaken methodologically and their promise for generating a more integrated understanding of our social history as well as an appreciation of each society in its own context. A comparison of the past and the present that is based on fundamental processes and features makes it intelligible to use the past and present to discern what the future might be like.

Because we are asking for quantitative ways to perform comparative analysis of cities in history, we need to obtain data that are consistent across places and times. This remains a challenge, not only because empirical evidence in the archeological record is sparse and mostly associated with durable materials, but also because methods and definitions have naturally varied between many different communities, each dedicated to different periods, using different methods of analysis, etc. (Kintigh et al., 2014).

Thus, to go forward and attempt any reasonable synthesis, simplicity and clarity are paramount. Simple quantities such as the area of a settlement, its putative population count (based on independent measures, such as room counts or amounts of debris), and perhaps other basic quantities related to public spaces or monument construction are usually available through the material record, and have now been measured in several instances ( Bettencourt, 2013 ; Ortman et al., 2014 , 2015 , 2016 ; Ortman and Coffey, 2017 ). The analytical advantage of these quantities is that they are reasonably objective and salient features of any human settlement, while leaving plenty of room for varying cultural, political, and economic features of different societies ( Mcfarlane, 2010 ; Lees, 2015 ).

For simplicity then, we ask below what can be inferred from fairly sparse data records, where only a few variables (one, two, …) are available for each site. This approach also allows us to connect to well-known traditions in history, demography and geography (Fujita, 1990; Bairoch, 1991; Zipf, 2012; Morris, 2013; Ober, 2016), before we attempt to take longer steps toward the end of the paper.

One Variable: Demography and the City-Size Distribution

Perhaps the most established way to characterize an urban system quantitatively is by analyzing the statistics of settlement sizes , or equivalently testing the “rank-size” rule ( Henderson, 1974 ; Fujita, 1990 ; Bairoch, 1991 ; Zipf, 2012 ). This is the simplest of all tests of any quantitative expectation for cities. It requires data on only a single variable, such as the population of each settlement. For this reason, studies constructing the settlement size distribution for many societies are numerous and have been undertaken for decades ( Bairoch, 1991 ; Gabaix, 1999 ; Zipf, 2012 ; Swerts and Pumain, 2013 ). In many archeological applications, population is replaced by more directly observable proxies, such as the settlement's area.

The simplest expectation for the rank-size rule, also known as Zipf's law (Krugman, 1996; Zipf, 2012), states that, when cities are rank-ordered from largest (rank = 1) to smallest (rank = number of cities in the system), the size of each city is simply inversely proportional to its rank:

$$\mathrm{size}(\mathrm{rank}) = \frac{\mathrm{size}_{\max}}{\mathrm{rank}^{\,z}}, \qquad (1)$$

where $\mathrm{size}_{\max}$ is the size of the largest city and $z$ is an exponent. The standard rank-size rule applies for $z = 1$. This is equivalent to the probability distribution of city sizes taking the form

$$P(\mathrm{size}) = P_0\, \mathrm{size}^{-(1 + 1/z)},$$

where $P_0$ is a normalizing constant, so that the probability integrates to unity.

Much has been made of the shape of the city size distribution and its meaning. The common exercise deals with the estimation of the rank-size exponent, z , and observing its deviations away from unity. The existence of a distribution of city sizes has been attributed to the (neutral) trade-offs between the benefits and disadvantages accruing from populations agglomerating ( Henderson, 1974 ), between economies of scale and costs of movement ( Fujita, 1990 ), and a stochastic “preferential attachment” growth process ( Simon, 1955 ). Others have shown that, in some circumstances, Zipf's law is not a good description of data at all, and distributions in the lognormal family, in particular, may fit the data better ( Eeckhout, 2004 ).
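In the simplest case, estimating the rank-size exponent z amounts to a linear fit in log-log coordinates. The sketch below is a minimal illustration of that procedure, not the authors' own code; the population figures are made up, and a real analysis would use measured settlement sizes.

```python
# Minimal rank-size fit: estimate z in size(rank) = size_max / rank**z.
import numpy as np

# Hypothetical settlement populations, for illustration only.
populations = np.array([1_200_000, 640_000, 410_000, 300_000, 220_000,
                        180_000, 150_000, 120_000, 95_000, 80_000])

sizes = np.sort(populations)[::-1]        # largest first
ranks = np.arange(1, len(sizes) + 1)      # rank 1 = largest settlement

# Equation (1) is a straight line in log-log space:
# log(size) = log(size_max) - z * log(rank)
slope, intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
z_hat = -slope

print(f"estimated rank-size exponent z ~ {z_hat:.2f}")  # z close to 1 matches Zipf's law
```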

This kind of problem is clearly visible in Figure 1 , and is discussed below.


Figure 1 . The city size distribution of Metropolitan Areas in the USA in 2010. (A) Histogram of city sizes (red dashed line is lognormal distribution fit); (B) Rank-size rule [black line is Equation (1)]. We see that in the USA the rank size rule approximately describes the relative distribution of large cities but fails to account for an overabundance and then deficit of progressively smaller towns. Data is available online at the US Census Bureau, Website: https://www.census.gov/programs-surveys/metro-micro/data.html .

Often patterns of settlement size are called primate if a single city is much larger than all the others and larger than what the rank size rule would predict. This has often been taken to signal political and economic centralization, in some cases beyond the territory of the settlement system, as in the case of empires ( Savage, 1997 ). Primate (or macrocephalous) settlement systems of this kind seem to apply to many cases in history, from the Aztecs to contemporary France or England ( Ortman et al., 2014 ; Bettencourt and Lobo, 2016 ). Likewise, in many other situations there are several large cities of roughly about the same size [perhaps the Maya, contemporary Spain, Italy, or even Germany ( Bettencourt and Lobo, 2016 )]. This is sometimes interpreted as a sign of a not fully integrated political or economic system across settlements, with several large cities competing for the “highest” functions associated with the urban hierarchy, such as the central place of government or the dominant (financial) market ( Harris and Ullman, 1945 ).

Another pattern is a deficit of small settlements relative to what the rank-size rule would predict. This is a common occurrence for most contemporary, highly urbanized settlement systems. To appreciate this, consider that for a system whose largest city has a million people [like Rome, per Hanson and Ortman (2017)], the rank-size rule predicts 10,000 towns of 100 people and 1,000,000 settlements of a single person.

For contemporary systems, with the largest cities in the region of 20–30 million, this would predict far too many small settlements, which are demonstrably not there. This means in practice that the rank-size rule cannot apply across the entire set of settlement sizes, especially for very small ones. For intermediate settlement sizes, some quantitative geographers would attribute this deficit to issues related to the definition of small settlements, many of which they would separate from the orbit of larger places. Varying spatial definitions of cities, usually through different criteria of spatial clustering, can indeed yield more “Zipfian” city size distributions. This in turn raises issues of settlement definition, especially for large cities, which are often surrounded by many commuting towns, giving rise to integrated labor markets known in modern settings as metropolitan areas (OECD, 2012).

Despite these interesting interpretations, there seems to be no strong connection between the relative size of settlements in an urban system and its overall performance, for example in terms of rates of economic or demographic change ( Berry, 1961 ).

The sure lesson that can be derived from the observation of the relative sizes of settlements is very simple.

Mechanically, the size of each city measures simply the integrated growth (including periods of decline) over its history, which is essentially a measure of its average demographic growth rate over a long period of time. The simplest rank-size rule states, from this perspective, that all settlements grow at the same rate (Gabaix, 1999) (if they were created at the same time), another approximate statistical regularity known as Gibrat's law. Note that this does not have to mean that demographic growth rates are the same for all cities at all times, but simply that over a long period of time these rates converge to the same number, presumably as the result of balances between births, deaths and migration between these towns and cities.

Thus, by comparing the relative size of different settlements, historians and archeologists should be asking whether these were part of the same “demographic” system, connected by mutual migration flows and other networks of exchange and trade. If so, observing something close to the settlement size distribution predicted by the rank-size rule would imply the same average population growth rates for all places, big and small. Then, for example, if mortality rates were higher in larger cities, this would imply a correspondingly larger rate of immigration from smaller places to larger cities ( Dyson, 2011 ; Bocquier and Costa, 2015 ).

If it is possible to measure the size distribution of the same settlements at two or more times, then we can moreover compare their relative growth rates during the intervening periods, giving us an empirical basis to rank their relative (demographic) success.
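As a simple illustration of this last point, average growth rates between two observation times can be computed directly from the two population estimates; the settlement names and counts below are hypothetical.

```python
# Average annual growth rate per settlement between two observation times:
# r = ln(N_later / N_earlier) / (years between observations).
import numpy as np

pop_earlier = {"site_A": 2_000, "site_B": 12_000, "site_C": 500}   # hypothetical
pop_later   = {"site_A": 2_600, "site_B": 15_500, "site_C": 450}   # hypothetical
years_between = 100

for site in pop_earlier:
    r = np.log(pop_later[site] / pop_earlier[site]) / years_between
    print(f"{site}: {100 * r:+.2f}% per year")
```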

Some additional issues are worth flagging here. For human settlements, spatial areas are not typically proportional to population sizes. How physical space is used socially can be modulated by cultural and physical infrastructure, as well as by technology (Wirth, 1938; Adams, 2005; Bettencourt, 2013). Furthermore, areas can also be measured in different ways, as the surface within the convex hull of the settlement's putative boundaries, $A$, or as the actual built-up area of buildings, streets and other structures, $A_n$. We discuss these issues next.

Two Variables: Settlement Scaling, Density, and Agglomeration

What was life like in the ancient city of Ur? Or in the great city of Teotihuacan? We will never really know for sure, of course. One of the main objectives of archeological and historical research is to reconstruct what social and economic life might have been like from fragmentary information, much of it about the built environment. This is a very difficult type of inference that requires a testable theory of how properties of social and economic life relate to variations in specific characteristics of the built environment.

Settlement scaling theory attempts to do precisely this (Ortman et al., 2014, 2015, 2016; Ortman and Coffey, 2017). Developed originally to explain urban scaling properties in contemporary cities (Bettencourt et al., 2007; Bettencourt, 2013; Ortman et al., 2014), its ingredients are very general, leading to the exciting prospect of applying its core ideas to settlements in history. The empirical observations on which it is based, as well as its core models, indicate that several basic social, economic, and infrastructural properties of settlements are interrelated, and can thus be predicted on the basis of comparative analyses of their built environment and estimates of their population size. Empirically, scaling analysis is also very simple, requiring only pairs of variables for each settlement, and the analysis of a familiar xy plot (Figure 2).


Figure 2. Economic and demographic growth in modern cities, such as those of China (shown), is a property of the urban system. (A) Shows about 20 years of data for Chinese prefectural cities (colored dots) and the scaling of GDP with population size (solid line shows the scaling relation). (B) Shows the same scaling after system-wide growth (yellow squares) is subtracted. This growth is shown in (C) for both GDP (red) and population (yellow), and versus each other in (D). Data are from Chinese City Statistical Yearbooks compiled by the Chinese National Bureau of Statistics (see Zünd and Bettencourt, 2019). The same data are compiled and translated into English and made available online at https://www.china-data-online.com/member/city/ (requires subscription).

If we have two variables for each settlement, we can ask, for example, how their built-up area depends on their population size: all we have to do is plot one quantity against the other. The answer tends to be non-linear, but well described by scale-invariant functions (power laws), such as

$$A = a(t)\, N^{\alpha},$$

where $N$ is the settlement's population, $A$ its area, and $a(t)$ a (time-dependent) prefactor. This can be made linear by a simple transformation to logarithmic variables, or a log-log xy plot (Figure 2).

Theoretical considerations derive the value of the prefactor $a(t)$ and predict the exponent for area to lie in the range $2/3 \leq \alpha \leq 5/6$, depending on the type of settlement and how area is measured (Bettencourt, 2013; Ortman et al., 2014). These expectations are confirmed by empirical analysis of many settlement systems, including the pre-Columbian Basin of Mexico (Ortman et al., 2015, 2016), classical Rome (Hanson and Ortman, 2017), Medieval Europe (Cesaretti et al., 2016), and of course contemporary urban areas (Bettencourt et al., 2007; Bettencourt, 2013).
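In practice, the exponent and prefactor are usually estimated with an ordinary least-squares fit on log-transformed variables. The sketch below illustrates that procedure with invented population and area figures; it is not the authors' code, just a minimal example of the kind of fit described in the text.

```python
# Fit A = a * N**alpha by least squares on log-transformed data.
import numpy as np

# Hypothetical settlement data, for illustration only.
N = np.array([500, 1_200, 3_000, 8_000, 20_000, 55_000])   # population estimates
A = np.array([9.0, 17.5, 36.0, 78.0, 160.0, 340.0])        # settled area (hectares)

alpha, log_a = np.polyfit(np.log(N), np.log(A), 1)
print(f"alpha ~ {alpha:.2f}, prefactor a ~ {np.exp(log_a):.3f}")
# The text quotes a theoretical range of roughly 2/3 <= alpha <= 5/6 for area.
```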

The same theoretical framework predicts scaling relations and exponent numerical values for many other quantities (see e.g., Figure 2 ), including the number of socioeconomic interactions in a settlement, its division of labor, its rate of socioeconomic production and many detailed characteristics of the built environment, such as street length and width and associated transportation costs ( Bettencourt, 2013 ).

In this way, a very straightforward two-variable scaling analysis can reveal commonalities of settlements as socioeconomic networks self-consistently embedded in built spaces. An expansion of this type of analysis to other settlement systems promises to reveal common quantitative patterns of basic settlement organization and socioeconomic capacity in societies through space and time.

It is also from the perspective provided by these observations and associated theoretical frameworks that we may appreciate any exceptions. For example, an interesting set of questions has been raised by Fletcher about “low density urbanism” ( Fletcher, 2011 ), specifically in the context of Mayan settlements and Angkor Wat, which appear to show an expansion of their area with population with an exponent, α > 1. Therefore, such settlements would become less dense the larger they are, not realizing agglomeration effects typical of other cities. A similar, but perhaps more expected pattern also applies to mobile hunter-gatherer camps, but with greater variability. These patterns also vary in time in specific ways, to which we now turn.

Technological and Economic Change

In modern societies, cities have been a necessary condition for economic growth ( Jones and Romer, 2009 ).

We say necessary because the existence and expansion of cities is not always sufficient for income growth at the national level: there are many episodes, some shorter and some longer, of urbanizing societies experiencing no (economic) growth ( Inoue et al., 2015 ). Nevertheless, the association between higher levels of urbanization and larger GDP per capita is one of the strongest empirical results in studies of economic change and international development.

Much work has been done to try to elucidate this connection and better understand the mechanisms of technological change and economic growth generated by urban environments (Lucas, 1988; Jones and Romer, 2009). However, if the judgment of success is predicated on predicting quantitatively precise growth rates, it remains fair to say that the problem is not yet well understood.

Many studies in economic history have also shed light on the circumstances that led to sustained economic growth after the industrial revolution, calling our attention to macroeconomic factors such as the availability of energy on a large scale, political and economic institutions, and the advent of modern science (Morris, 2013). The study of socioeconomic development in the past has also highlighted the role of urbanization (Algaze, 2008; Cowgill, 2015; Ober, 2016; Harper, 2017; Manning, 2018), as have historical experiences of urbanization without growth (Jedwab and Vollrath, 2015). As useful as detailed case studies and historical examinations are, comparative analyses have been hampered partly because of a perceived lack of common empirical evidence within regions and across eras and geographies.

There are, however, a set of facts that may be useful for framing the study of urbanization's role across time: i) sustained economic growth is a system-level property (see Figure 2 for China); ii) growth volatility reduces rates of economic growth; iii) very small rates of systemic economic growth are not perceptible over a human lifetime; and iv) the accumulation of material wealth resulting from low levels of growth is vulnerable to exogenous shocks (such as disease or changes in climate). As a consequence, growth can go unnoticed and remain accidental. This is not to say that people were not keenly aware of times of prosperity or famine, resulting from conquest or good harvests; it simply means that the concept of long-run intensive economic growth would have been very hard to perceive and nurture in pre-industrial societies.

The first point may not be obvious, as we often think of rich and poor settlements, even within the same nation or polity. It is, however, generally true that the type of sustained and fast economic growth observed in modern settlement systems is a system-level property (so that information, ideas, resources and individuals can flow among settlements), with all cities experiencing about the same annual rate of growth over long periods of time (see Figure 2).

The happy consequence of this observation is that studying systemic economic growth in history may require only a number of point assessments, which should agree in magnitude whether they were measured in small towns or larger cities. This also means that golden ages often associated with large cities, such as classical Athens or Rome, whether triggered by a technological innovation or by conquest and theft, may not be sustainable unless they induce economic growth across their settlement systems ( Ober, 2016 ). This means, for example, that we should see the living experience of primary producers living in small settlements change so as to enjoy some of the products of large cities and vice-versa in a virtuous cycle of exchange and common development. We know of course that prior to the industrial revolution such periods, if they existed at all, were not associated with large growth rates, and were typically localized in space and time.

The second and third properties of economic growth follow from its character as a stochastic (fluctuating) process. This is a very general feature of collective dynamics of growth, from population biology to financial markets (Bettencourt, 2018). Without going into detailed models for those contexts, quantities such as the resources available to a society ('wealth'), $W(t)$, are expected to grow approximately as

$$\frac{dW}{dt} = \left[\eta + \epsilon(t)\right] W(t),$$

where $\eta$ and $\epsilon$ are the average growth rate (an approximate constant in time, say 1% a year) and the corresponding stochastic variations, respectively.

Writing the variance of $\epsilon$ as $\sigma^2$ (also known as the square volatility) allows us to integrate the equation in time to give

$$W(t) = W(0)\, e^{\left(\eta - \frac{\sigma^2}{2}\right) t \,+\, \sigma \sqrt{t}\, \xi(t)},$$

where $\xi(t)$ is an approximately normal variable with zero mean and unit variance. The actual growth rate, $\eta - \sigma^2/2$, that results is the geometric mean (not the arithmetic mean!) of growth rates, as is well-known in population biology. This is reduced from the average growth rate by a term proportional to the square volatility, $\sigma^2$, that is, half the variance of the growth rate, due to its fluctuations over time. Thus, high variability can render any small growth rate zero or even negative (see Figure 3). This means that innovations to reduce instability in the economy are, in the beginning, almost as important as having a positive growth rate in the first place.


Figure 3 . Production and volatility measured by lead emissions [measured in Greenland ice cores, McConnell et al. (2018) , data available online at https://www.pnas.org/content/suppl/2018/05/09/1721818115.DCSupplemental ]. (A) Shows estimated emissions over a long historical period. (B) Shows the corresponding growth rate in emissions (orange is a running average). Red vertical lines delimit the period between 150BC and 150AD, associated with a rise of the Roman Empire ( Delile et al., 2014 ). For this period the effective growth rate is very small due to high volatility. The annual average growth rate is about η ≃ 0.17%.

With all that said, the final argument we wish to emphasize here is that the growth rate for any preindustrial economy over any extended time period (say decades) was likely very small. Figure 3, based on lead emissions, suggests a value of about 0.17%, certainly lower than 0.3% a year. Even at 0.3% a year, this translates into a doubling time for the economy of roughly 240 years (and longer at lower rates). This time scale is too long to be felt by anyone—on average at least—in their own lifetimes. Thus, even if slow economic growth was present in preindustrial societies, it was likely too slow for society to become conscious of it and take measures that could sustain it. The perception would then be one of effectively zero growth, where any positive period would be quickly reversed by fluctuations.
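The effect of volatility on the realized (geometric-mean) growth rate is easy to see in a simulation. The sketch below uses arbitrary parameter values, chosen only to illustrate how a 1% average rate with high volatility yields a realized rate close to η - σ²/2; it is not an attempt to reproduce the lead-emission analysis of Figure 3.

```python
# Simulate many growth trajectories W(t+1) = W(t) * (1 + eta + eps),
# eps ~ Normal(0, sigma**2), and compare realized vs. average growth rates.
import numpy as np

rng = np.random.default_rng(0)
eta, sigma, T, n_paths = 0.01, 0.12, 500, 2_000   # arbitrary illustrative values

factors = 1.0 + eta + rng.normal(0.0, sigma, size=(n_paths, T))
wealth = np.cumprod(factors, axis=1)

realized = np.log(wealth[:, -1]).mean() / T       # realized (geometric-mean) rate
print(f"average growth rate eta:     {eta:.4f}")
print(f"realized growth rate:        {realized:.4f}")
print(f"eta - sigma**2 / 2 (theory): {eta - sigma**2 / 2:.4f}")
```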

Even if the change in material output of societies in the past had been exponential in nature, the accumulation of wealth could have been greatly set back by disease, climate change or war. And even if the underlying social processes by which agglomerated populations learn, innovate, and become more productive are the same across eras, societies' abilities to deal with the plague or a sharp reduction in rainfall are importantly determined by science and technology.

In conclusion, processes of human development and economic growth recognizable to us today were probably at play throughout history, and certainly in most urban societies. However, even in the best of times such rates of change may have been too local, too volatile and too short-lived to be acted upon and sustained, intentionally, over the long term. The search for some of the tell-tale signs of these episodes, especially in the systemic change in living conditions across settlement sizes may give us precious new insights into the actual time dependent variability of these effects, and on the human experience in cities during long periods of very slow growth and even decay.

The history of cities presents us with a bewildering variety of social, economic, political and cultural ways in which human settlements can exist. Making sense of this variation, while at the same time extracting what may be essential across time and space, is a necessary goal not only for a “science of cities” but for a science of human sociality. We have shown how an approach to comparative analysis based on common but determinant variables for human settlements—including population, area and measures of the built environment—has the power to support an analytical narrative relating the earliest settlements in history to contemporary cities and presumably their future forms.

The ambition to develop a theoretical and empirical basis for the study of human settlements through history may invoke in the reader common criticisms of any cross-cultural comparative analysis, specifically that some societies will be judged to be better or worse, and that contemporary high-income societies along with their economic and political systems will be used as standards for evaluation. None of this follows from the strategy proposed here, except the fairly mechanical features that some societies are larger than others—in both their creative and destructive capacity—and so must possess knowledge of, and access to, different types of resources, in ways that are sometimes sustainable and sometimes exploitative. Only by learning formally about this variation can we come to appreciate the range of the human experience in cities.

The approach proposed here then simply connects social and cultural life to some of its most basic material underpinnings, common to all societies in all places. This includes the fact that people exist in space and that their interactions must be structured over space and time in ways that are compatible with their collective socioeconomic capacity. Evidence from historical and archeological sources has the singular potential to illuminate these issues in ways that contemporary evidence cannot.

Data Availability Statement

Publicly available datasets were analyzed in this study. This data can be found here: https://www.dropbox.com/sh/xuuuy0oedvfekbp/AADhWggmVuiMOU2txx-Kc2aNa?dl=0

Author Contributions

LB has conceptualized the study and analyzed the data. LB and JL wrote the manuscript.

Funding

This work is partially supported by OPEN-AIRE funding linked to the Marie Curie Project Past-people-nets 628818, conducted by Francesca Fulminante (2014–2016). Publication of this research has been supported by a grant from the James S. McDonnell Foundation (#220020438).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We are thankful to the speakers and participants of two workshops, Urbanization without Growth in the Ancient World? (at the British Academy in Rome, July 2017) and at the Roman Archeology Conference (RAC/TRAC) 2018, where some of the material presented here was originally developed.

Adams, R. M. (2005). The Evolution of Urban Society: Early Mesopotamia and Prehispanic Mexico . New Brunswick: AldineTransaction.


Algaze, G. (2008). Ancient Mesopotamia at the Dawn of Civilization: The Evolution of an Urban Landscape . Chicago: University of Chicago Press.

Bairoch, P. (1991). Cities and Economic Development: From the Dawn of History to the Present. Chicago: University of Chicago Press.

Berry, B. J. L. (1961). City size distributions and economic development. Econ. Dev. Cult. Change 9, 573–588. doi: 10.1086/449923


Bettencourt, L. M. A. (2013). The origins of scaling in cities. Science 340, 1438–1441. doi: 10.1126/science.1235823


Bettencourt, L. M. A. (2018). Urban Growth and the Emergence Statistics of Cities .

Bettencourt, L. M. A., and Lobo, J. (2016). Urban scaling in Europe. J. R. Soc. Interface 13:20160005. doi: 10.1098/rsif.2016.0005

Bettencourt, L. M. A., Lobo, J., Helbing, D., Kühnert, C., and West, G. B. (2007). Growth, innovation, scaling, and the pace of life in cities. Proc. Natl. Acad. Sci. U.S.A. 104, 7301–7306. doi: 10.1073/pnas.0610172104

Bocquier, P., and Costa, R. (2015). Which transition comes first? Urban and demographic transitions in Belgium and Sweden. Demogr. Res. 33, 1297–1332. doi: 10.4054/DemRes.2015.33.48

Cesaretti, R., Lobo, J., Bettencourt, L. M. A., Ortman, S. G., and Smith, M. E. (2016). Population-area relationship for medieval European cities. PLoS ONE 11:e0162678. doi: 10.1371/journal.pone.0162678

Cowgill, G. L. (2015). Ancient Teotihuacan: Early Urbanism in Central Mexico. New York, NY: Cambridge University Press.

Delile, H., Blichert-Toft, J., Goiran, J. P., Keay, S., and Albarede, F. (2014). Lead in ancient Rome's city waters. Proc. Natl. Acad. Sci. U.S.A. 111, 6594–6599. doi: 10.1073/pnas.1400097111

Dyson, T. (2011). The role of the demographic transition in the process of urbanization. Populat. Dev. Rev. 37, 34–54. doi: 10.1111/j.1728-4457.2011.00377.x

Eeckhout, J. (2004). Gibrat's law for (all) cities. Am. Econ. Rev. 94, 1429–1451. doi: 10.1257/0002828043052303

Fletcher, R. (2011). Low-Density, Agrarian-Based Urbanism .

Fujita, M. (1990). Urban Economic Theory: Land Use and City Size . Cambridge: Cambridge University Press.

Gabaix, X. (1999). Zipf's law for cities: an explanation. Q. J. Econ. 114, 739–767. doi: 10.1162/003355399556133

Hanson, J. W., and Ortman, S. G. (2017). A systematic method for estimating the populations of Greek and Roman settlements. J. Roman Archaeol. 30, 301–324. doi: 10.1017/S1047759400074134

Harper, K. (2017). The Fate of Rome: Climate, Disease & the End of an Empire . Princeton, NJ: Princeton University Press.

Harris, C. D., and Ullman, E. L. (1945). The nature of cities. Ann. Am. Acad. Polit. Soc. Sci. 242, 7–17. doi: 10.1177/000271624524200103

Henderson, J. V. (1974). The sizes and types of cities. Am. Econ. Rev. 64, 640–656. doi: 10.2105/AJPH.64.7.656

Inoue, H., Álvarez, A., Anderson, E. N., Owen, A., Álvarez, R., Lawrence, K., et al. (2015). Urban scale shifts since the bronze age: upsweeps, collapses, and semiperipheral development. Soc. Sci. Hist. 39, 175–200. doi: 10.1017/ssh.2015.50

Jedwab, R., and Vollrath, D. (2015). Urbanization without growth in historical perspective. Explorat. Econ. Hist. 58, 1–21. doi: 10.1016/j.eeh.2015.09.002

Jones, C., and Romer, P. (2009). The New Kaldor Facts: Ideas, Institutions, Population, and Human Capital .

Kintigh, K. W., Altschul, J. H., Beaudry, M. C., Drennan, R. D., Kinzig, A. P., Kohler, T. A., et al. (2014). Grand challenges for archaeology. Proc. Natl. Acad. Sci. U.S.A. 111, 879–880. doi: 10.1073/pnas.1324000111

Krugman, P. (1996). Confronting the mystery of urban hierarchy. J. Jpn. Int. Econ. 10, 399–418. doi: 10.1006/jjie.1996.0023

Lees, A. (2015). “The city: a world history,” in The City: A World History (New York, NY: Oxford University Press).

Lucas, R. E. (1988). On the mechanics of economic development. J. Monet. Econ. 22, 3–42. doi: 10.1016/0304-3932(88)90168-7

Manning, J. G. (2018). The Open Sea: The Economic Life of the Ancient Mediterranean World from the Iron Age to the Rise of Rome . Princeton, NJ: Princeton University Press.

McConnell, J. R., Wilson, A. I., Stohl, A., Arienzo, M. M., Chellman, N. J., Eckhardt, S., et al. (2018). Lead pollution recorded in greenland ice indicates european emissions tracked plagues, wars, and imperial expansion during antiquity. Proc. Natl. Acad. Sci. U.S.A. 115, 5726–31. doi: 10.1073/pnas.1721818115

Mcfarlane, C. (2010). The comparative city: knowledge, learning, urbanism: the comparative city: knowledge, learning, urbanism. Int. J. Urban Region. Res. 34, 725–742. doi: 10.1111/j.1468-2427.2010.00917.x

Morris, I. (2013). The Measure of Civilization: How Social Development Decides the Fate of Nations . Princeton, NJ: Princeton University Press.

Ober, J. (2016). The Rise and Fall of Classical Greece . Princeton, NJ: Princeton University Press.

OECD (2012). Redefining “Urban”: A New Way to Measure Metropolitan Areas. Paris: OECD.

Ortman, S. G., Cabaniss, A. H. F., Sturm, J. O., and Bettencourt, L. M. A. (2014). The pre-history of urban scaling. PLoS ONE 9:e87902. doi: 10.1371/journal.pone.0087902

Ortman, S. G., Cabaniss, A. H. F., Sturm, J. O., and Bettencourt, L. M. A. (2015). Settlement scaling and increasing returns in an ancient society. Sci. Adv. 1:e1400066. doi: 10.1126/sciadv.1400066

Ortman, S. G., and Coffey, G. D. (2017). Settlement scaling in middle-range societies. Am. Antiquity 82, 662–682. doi: 10.1017/aaq.2017.42

Ortman, S. G., Davis, K. E., Lobo, J., Smith, M. E., Bettencourt, L. M., and Trumbo, A. (2016). Settlement scaling and economic change in the Central Andes. J. Archaeol. Sci. 73, 94–106. doi: 10.1016/j.jas.2016.07.012

Savage, S. H. (1997). Assessing departures from log-normality in the rank-size rule. J. Archaeol. Sci. 24, 233–244. doi: 10.1006/jasc.1996.0106

Simon, H. A. (1955). On a class of skew distribution functions. Biometrika 42, 425–440. doi: 10.1093/biomet/42.3-4.425

Swerts, E., and Pumain, D. (2013). A statistical approach to territorial cohesion: the Indian city system. L'Espace Géogr. 42, 75–90. doi: 10.3917/eg.421.0077

Wirth, L. (1938). Urbanism as a way of life. Am. J. Sociol. 44, 1–24. doi: 10.1086/217913

Zipf, G. K. (2012). Human Behavior and the Principle of Least Effort: An Introduction to Human Ecology . Mansfield Centre, CT: Martino Pub.

Zünd, D., and Bettencourt, L. M. A. (2019). Growth and development in prefecture-level cities in China. PLoS ONE 14:e0221017. doi: 10.1371/journal.pone.0221017

Keywords: urbanization, scaling, Zipf's law, economic growth, data

Citation: Bettencourt LMA and Lobo J (2019) Quantitative Methods for the Comparative Analysis of Cities in History. Front. Digit. Humanit. 6:17. doi: 10.3389/fdigh.2019.00017

Received: 16 January 2019; Accepted: 16 October 2019; Published: 01 November 2019.


Copyright © 2019 Bettencourt and Lobo. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Luís M. A. Bettencourt, bettencourt@uchicago.edu ; José Lobo, jose.lobo@asu.edu

This article is part of the Research Topic

Where Do Cities Come From and Where Are They Going To? Modelling Past and Present Agglomerations to Understand Urban Ways of Life

What Is Comparative Analysis and How to Conduct It? (+ Examples)

Appinio Research · 30.10.2023 · 36min read


Have you ever faced a complex decision, wondering how to make the best choice among multiple options? In a world filled with data and possibilities, the art of comparative analysis holds the key to unlocking clarity amidst the chaos.

In this guide, we'll demystify the power of comparative analysis, revealing its practical applications, methodologies, and best practices. Whether you're a business leader, researcher, or simply someone seeking to make more informed decisions, join us as we explore the intricacies of comparative analysis and equip you with the tools to chart your course with confidence.

What is Comparative Analysis?

Comparative analysis is a systematic approach used to evaluate and compare two or more entities, variables, or options to identify similarities, differences, and patterns. It involves assessing the strengths, weaknesses, opportunities, and threats associated with each entity or option to make informed decisions.

The primary purpose of comparative analysis is to provide a structured framework for decision-making by:

  • Facilitating Informed Choices: Comparative analysis equips decision-makers with data-driven insights, enabling them to make well-informed choices among multiple options.
  • Identifying Trends and Patterns: It helps identify recurring trends, patterns, and relationships among entities or variables, shedding light on underlying factors influencing outcomes.
  • Supporting Problem Solving: Comparative analysis aids in solving complex problems by systematically breaking them down into manageable components and evaluating potential solutions.
  • Enhancing Transparency: By comparing multiple options, comparative analysis promotes transparency in decision-making processes, allowing stakeholders to understand the rationale behind choices.
  • Mitigating Risks : It helps assess the risks associated with each option, allowing organizations to develop risk mitigation strategies and make risk-aware decisions.
  • Optimizing Resource Allocation: Comparative analysis assists in allocating resources efficiently by identifying areas where resources can be optimized for maximum impact.
  • Driving Continuous Improvement: By comparing current performance with historical data or benchmarks, organizations can identify improvement areas and implement growth strategies.

Importance of Comparative Analysis in Decision-Making

  • Data-Driven Decision-Making: Comparative analysis relies on empirical data and objective evaluation, reducing the influence of biases and subjective judgments in decision-making. It ensures decisions are based on facts and evidence.
  • Objective Assessment: It provides an objective and structured framework for evaluating options, allowing decision-makers to focus on key criteria and avoid making decisions solely based on intuition or preferences.
  • Risk Assessment: Comparative analysis helps assess and quantify risks associated with different options. This risk awareness enables organizations to make proactive risk management decisions.
  • Prioritization: By ranking options based on predefined criteria, comparative analysis enables decision-makers to prioritize actions or investments, directing resources to areas with the most significant impact.
  • Strategic Planning: It is integral to strategic planning, helping organizations align their decisions with overarching goals and objectives. Comparative analysis ensures decisions are consistent with long-term strategies.
  • Resource Allocation: Organizations often have limited resources. Comparative analysis assists in allocating these resources effectively, ensuring they are directed toward initiatives with the highest potential returns.
  • Continuous Improvement: Comparative analysis supports a culture of continuous improvement by identifying areas for enhancement and guiding iterative decision-making processes.
  • Stakeholder Communication: It enhances transparency in decision-making, making it easier to communicate decisions to stakeholders. Stakeholders can better understand the rationale behind choices when supported by comparative analysis.
  • Competitive Advantage: In business and competitive environments , comparative analysis can provide a competitive edge by identifying opportunities to outperform competitors or address weaknesses.
  • Informed Innovation: When evaluating new products , technologies, or strategies, comparative analysis guides the selection of the most promising options, reducing the risk of investing in unsuccessful ventures.

In summary, comparative analysis is a valuable tool that empowers decision-makers across various domains to make informed, data-driven choices, manage risks, allocate resources effectively, and drive continuous improvement. Its structured approach enhances decision quality and transparency, contributing to the success and competitiveness of organizations and research endeavors.

How to Prepare for Comparative Analysis?

1. Define Objectives and Scope

Before you begin your comparative analysis, clearly defining your objectives and the scope of your analysis is essential. This step lays the foundation for the entire process. Here's how to approach it:

  • Identify Your Goals: Start by asking yourself what you aim to achieve with your comparative analysis. Are you trying to choose between two products for your business? Are you evaluating potential investment opportunities? Knowing your objectives will help you stay focused throughout the analysis.
  • Define Scope: Determine the boundaries of your comparison. What will you include, and what will you exclude? For example, if you're analyzing market entry strategies for a new product, specify whether you're looking at a specific geographic region or a particular target audience.
  • Stakeholder Alignment: Ensure that all stakeholders involved in the analysis understand and agree on the objectives and scope. This alignment will prevent misunderstandings and ensure the analysis meets everyone's expectations.

2. Gather Relevant Data and Information

The quality of your comparative analysis heavily depends on the data and information you gather. Here's how to approach this crucial step:

  • Data Sources: Identify where you'll obtain the necessary data. Will you rely on primary sources , such as surveys and interviews, to collect original data? Or will you use secondary sources, like published research and industry reports, to access existing data? Consider the advantages and disadvantages of each source.
  • Data Collection Plan: Develop a plan for collecting data. This should include details about the methods you'll use, the timeline for data collection, and who will be responsible for gathering the data.
  • Data Relevance: Ensure that the data you collect is directly relevant to your objectives. Irrelevant or extraneous data can lead to confusion and distract from the core analysis.

3. Select Appropriate Criteria for Comparison

Choosing the right criteria for comparison is critical to a successful comparative analysis. Here's how to go about it:

  • Relevance to Objectives: Your chosen criteria should align closely with your analysis objectives. For example, if you're comparing job candidates, your criteria might include skills, experience, and cultural fit.
  • Measurability: Consider whether you can quantify the criteria. Measurable criteria are easier to analyze. If you're comparing marketing campaigns, you might measure criteria like click-through rates, conversion rates, and return on investment.
  • Weighting Criteria: Not all criteria are equally important. You'll need to assign weights to each criterion based on its relative importance. Weighting helps ensure that the most critical factors have a more significant impact on the final decision.

4. Establish a Clear Framework

Once you have your objectives, data, and criteria in place, it's time to establish a clear framework for your comparative analysis. This framework will guide your process and ensure consistency. Here's how to do it:

  • Comparative Matrix: Consider using a comparative matrix or spreadsheet to organize your data. Each row in the matrix represents an option or entity you're comparing, and each column corresponds to a criterion. This visual representation makes it easy to compare and contrast data (a minimal sketch of such a matrix follows this list).
  • Timeline: Determine the time frame for your analysis. Is it a one-time comparison, or will you conduct ongoing analyses? Having a defined timeline helps you manage the analysis process efficiently.
  • Define Metrics: Specify the metrics or scoring system you'll use to evaluate each criterion. For example, if you're comparing potential office locations, you might use a scoring system from 1 to 5 for factors like cost, accessibility, and amenities.
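As an illustration of the comparative matrix, weighting, and scoring ideas above, here is a minimal sketch in Python using pandas. The options, criteria, scores, and weights are hypothetical placeholders, not a prescribed template.

```python
# A weighted comparative matrix: rows are options, columns are criteria (scored 1-5).
import pandas as pd

scores = pd.DataFrame(
    {"cost": [4, 2, 3], "accessibility": [3, 5, 4], "amenities": [2, 4, 5]},
    index=["location_A", "location_B", "location_C"],   # hypothetical options
)
weights = pd.Series({"cost": 0.5, "accessibility": 0.3, "amenities": 0.2})  # sum to 1

# Multiply each criterion score by its weight and total across criteria per option.
weighted_total = scores.mul(weights, axis=1).sum(axis=1)
print(weighted_total.sort_values(ascending=False))      # highest weighted score first
```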

With your objectives, data, criteria, and framework established, you're ready to move on to the next phase of comparative analysis: data collection and organization.

Comparative Analysis Data Collection

Data collection and organization are critical steps in the comparative analysis process. We'll explore how to gather and structure the data you need for a successful analysis.

1. Utilize Primary Data Sources

Primary data sources involve gathering original data directly from the source. This approach offers unique advantages, allowing you to tailor your data collection to your specific research needs.

Some popular primary data sources include:

  • Surveys and Questionnaires: Design surveys or questionnaires and distribute them to collect specific information from individuals or groups. This method is ideal for obtaining firsthand insights, such as customer preferences or employee feedback.
  • Interviews: Conduct structured interviews with relevant stakeholders or experts. Interviews provide an opportunity to delve deeper into subjects and gather qualitative data, making them valuable for in-depth analysis.
  • Observations: Directly observe and record data from real-world events or settings. Observational data can be instrumental in fields like anthropology, ethnography, and environmental studies.
  • Experiments: In controlled environments, experiments allow you to manipulate variables and measure their effects. This method is common in scientific research and product testing.

When using primary data sources, consider factors like sample size, survey design, and data collection methods to ensure the reliability and validity of your data.

2. Harness Secondary Data Sources

Secondary data sources involve using existing data collected by others. These sources can provide a wealth of information and save time and resources compared to primary data collection.

Here are common types of secondary data sources:

  • Public Records: Government publications, census data, and official reports offer valuable information on demographics, economic trends, and public policies. They are often free and readily accessible.
  • Academic Journals: Scholarly articles provide in-depth research findings across various disciplines. They are helpful for accessing peer-reviewed studies and staying current with academic discourse.
  • Industry Reports: Industry-specific reports and market research publications offer insights into market trends, consumer behavior, and competitive landscapes. They are essential for businesses making strategic decisions.
  • Online Databases: Online platforms like Statista, PubMed, and Google Scholar provide a vast repository of data and research articles. They offer search capabilities and access to a wide range of data sets.

When using secondary data sources, critically assess the credibility, relevance, and timeliness of the data. Ensure that it aligns with your research objectives.

3. Ensure and Validate Data Quality

Data quality is paramount in comparative analysis. Poor-quality data can lead to inaccurate conclusions and flawed decision-making. Here's how to ensure data validation and reliability:

  • Cross-Verification: Whenever possible, cross-verify data from multiple sources. Consistency among different sources enhances the reliability of the data.
  • Sample Size: Ensure that your sample is large enough to support meaningful analysis. A small sample may not accurately represent the population.
  • Data Integrity: Check for data integrity issues, such as missing values, outliers, or duplicate entries. Address these issues before analysis to maintain data quality.
  • Data Source Reliability: Assess the reliability and credibility of the data sources themselves. Consider factors like the reputation of the institution or organization providing the data.

4. Organize Data Effectively

Structuring your data for comparison is a critical step in the analysis process. Organized data makes it easier to draw insights and make informed decisions. Here's how to structure data effectively:

  • Data Cleaning: Before analysis, clean your data to remove inconsistencies, errors, and irrelevant information. Data cleaning may involve data transformation, imputation of missing values, and removing outliers.
  • Normalization: Standardize data to ensure fair comparisons. Normalization adjusts data to a common scale, making it possible to compare variables with different units or ranges (see the sketch after this list).
  • Variable Labeling: Clearly label variables and data points for easy identification. Proper labeling enhances the transparency and understandability of your analysis.
  • Data Organization: Organize data into a format that suits your analysis methods. For quantitative analysis, this might mean creating a matrix, while qualitative analysis may involve categorizing data into themes.
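As an illustration of the normalization step mentioned above, the following sketch rescales two hypothetical metrics with very different units onto a common 0-to-1 range using min-max scaling; min-max is only one of several reasonable choices (z-scores are another).

# Min-max normalization: rescale each metric to a 0-1 range so that variables
# with different units (dollars vs. ratings) can be compared fairly.
# The numbers below are purely illustrative.
def min_max(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

monthly_cost = [2400, 3100, 1800]   # dollars
satisfaction = [3.9, 4.6, 4.1]      # 1-5 rating
print(min_max(monthly_cost))        # roughly [0.46, 1.0, 0.0]
print(min_max(satisfaction))        # roughly [0.0, 1.0, 0.29]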

By paying careful attention to data collection, validation, and organization, you'll set the stage for a robust and insightful comparative analysis. Next, we'll explore various methodologies you can employ in your analysis, ranging from qualitative approaches to quantitative methods and examples.

Comparative Analysis Methods

When it comes to comparative analysis, various methodologies are available, each suited to different research goals and data types. In this section, we'll explore five prominent methodologies in detail.

Qualitative Comparative Analysis (QCA)

Qualitative Comparative Analysis (QCA) is a methodology often used when dealing with complex, non-linear relationships among variables. It seeks to identify patterns and configurations among factors that lead to specific outcomes.

  • Case-by-Case Analysis: QCA involves evaluating individual cases (e.g., organizations, regions, or events) rather than analyzing aggregate data. Each case's unique characteristics are considered.
  • Boolean Logic: QCA employs Boolean algebra to analyze data. Variables are categorized as either present or absent, allowing for the examination of different combinations and logical relationships.
  • Necessary and Sufficient Conditions: QCA aims to identify necessary and sufficient conditions for a specific outcome to occur. It helps answer questions like, "What conditions are necessary for a successful product launch?"
  • Fuzzy Set Theory: In some cases, QCA may use fuzzy set theory to account for degrees of membership in a category, allowing for more nuanced analysis.

QCA is particularly useful in fields such as sociology, political science, and organizational studies, where understanding complex interactions is essential.
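As a simplified illustration of the Boolean, case-by-case logic described above, the sketch below groups a handful of hypothetical cases into a small truth table of present/absent condition combinations and lists the outcomes observed for each combination; the conditions and outcomes are invented for illustration only.

from collections import defaultdict

# Hypothetical crisp-set data: 1 = condition or outcome present, 0 = absent.
cases = [
    {"strong_team": 1, "large_budget": 1, "successful_launch": 1},
    {"strong_team": 1, "large_budget": 0, "successful_launch": 1},
    {"strong_team": 0, "large_budget": 1, "successful_launch": 0},
    {"strong_team": 0, "large_budget": 0, "successful_launch": 0},
]

# Truth table: each configuration of conditions maps to the outcomes observed for it.
truth_table = defaultdict(list)
for case in cases:
    config = (case["strong_team"], case["large_budget"])
    truth_table[config].append(case["successful_launch"])

for (team, budget), outcomes in sorted(truth_table.items(), reverse=True):
    print(f"strong_team={team}, large_budget={budget} -> outcomes {outcomes}")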

Quantitative Comparative Analysis

Quantitative Comparative Analysis involves the use of numerical data and statistical techniques to compare and analyze variables. It's suitable for situations where data is quantitative, and relationships can be expressed numerically.

  • Statistical Tools: Quantitative comparative analysis relies on statistical methods like regression analysis, correlation, and hypothesis testing. These tools help identify relationships, dependencies, and trends within datasets.
  • Data Measurement: Ensure that variables are measured consistently using appropriate scales (e.g., ordinal, interval, ratio) for meaningful analysis. Variables may include numerical values like revenue, customer satisfaction scores, or product performance metrics.
  • Data Visualization: Create visual representations of data using charts, graphs, and plots. Visualization aids in understanding complex relationships and presenting findings effectively.
  • Statistical Significance: Assess the statistical significance of relationships. Statistical significance indicates whether observed differences or relationships are likely to be real rather than due to chance.

Quantitative comparative analysis is commonly applied in economics, social sciences, and market research to draw empirical conclusions from numerical data.
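For instance, a minimal sketch using Python's scipy library (with invented satisfaction and revenue figures) can test whether two metrics are correlated and whether the relationship is statistically significant:

from scipy import stats

# Invented figures for five business units.
customer_satisfaction = [3.2, 3.8, 4.1, 4.5, 4.9]
revenue_growth_pct = [1.0, 2.5, 3.1, 4.8, 6.2]

r, p_value = stats.pearsonr(customer_satisfaction, revenue_growth_pct)
print(f"Pearson r = {r:.2f}, p-value = {p_value:.3f}")
# A small p-value (commonly below 0.05) suggests the observed relationship
# is unlikely to be due to chance alone.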

Case Studies

Case studies involve in-depth examinations of specific instances or cases to gain insights into real-world scenarios. Comparative case studies allow researchers to compare and contrast multiple cases to identify patterns, differences, and lessons.

  • Narrative Analysis: Case studies often involve narrative analysis, where researchers construct detailed narratives of each case, including context, events, and outcomes.
  • Contextual Understanding: In comparative case studies, it's crucial to consider the context within which each case operates. Understanding the context helps interpret findings accurately.
  • Cross-Case Analysis: Researchers conduct cross-case analysis to identify commonalities and differences across cases. This process can lead to the discovery of factors that influence outcomes.
  • Triangulation: To enhance the validity of findings, researchers may use multiple data sources and methods to triangulate information and ensure reliability.

Case studies are prevalent in fields like psychology, business, and sociology, where deep insights into specific situations are valuable.

SWOT Analysis

SWOT Analysis is a strategic tool used to assess the Strengths, Weaknesses, Opportunities, and Threats associated with a particular entity or situation. While it's commonly used in business, it can be adapted for various comparative analyses.

  • Internal and External Factors: SWOT Analysis examines both internal factors (Strengths and Weaknesses), such as organizational capabilities, and external factors (Opportunities and Threats), such as market conditions and competition.
  • Strategic Planning: The insights from SWOT Analysis inform strategic decision-making. By identifying strengths and opportunities, organizations can leverage their advantages. Likewise, addressing weaknesses and threats helps mitigate risks.
  • Visual Representation: SWOT Analysis is often presented as a matrix or a 2x2 grid, making it visually accessible and easy to communicate to stakeholders.
  • Continuous Monitoring: SWOT Analysis is not a one-time exercise. Organizations use it periodically to adapt to changing circumstances and make informed decisions.

SWOT Analysis is versatile and can be applied in business, healthcare, education, and any context where a structured assessment of factors is needed.

Benchmarking

Benchmarking involves comparing an entity's performance, processes, or practices to those of industry leaders or best-in-class organizations. It's a powerful tool for continuous improvement and competitive analysis.

  • Identify Performance Gaps: Benchmarking helps identify areas where an entity lags behind its peers or industry standards. These performance gaps highlight opportunities for improvement.
  • Data Collection: Gather data on key performance metrics from both internal and external sources. This data collection phase is crucial for meaningful comparisons.
  • Comparative Analysis: Compare your organization's performance data with that of benchmark organizations. This analysis can reveal where you excel and where adjustments are needed.
  • Continuous Improvement: Benchmarking is a dynamic process that encourages continuous improvement. Organizations use benchmarking findings to set performance goals and refine their strategies.

Benchmarking is widely used in business, manufacturing, healthcare, and customer service to drive excellence and competitiveness.
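As a simple illustration, the sketch below computes the gap between an organization's metrics and a benchmark for each metric; all figures are invented, and a real benchmarking exercise would use carefully sourced industry data.

# Hypothetical performance data versus an industry benchmark (all numbers invented).
own = {"on_time_delivery_pct": 87.0, "defect_rate_pct": 2.4, "cost_per_unit": 11.8}
benchmark = {"on_time_delivery_pct": 95.0, "defect_rate_pct": 1.1, "cost_per_unit": 10.2}

for metric, own_value in own.items():
    gap_pct = 100 * (own_value - benchmark[metric]) / benchmark[metric]
    print(f"{metric}: own={own_value}, benchmark={benchmark[metric]}, gap={gap_pct:+.1f}%")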

Each of these methodologies brings a unique perspective to comparative analysis, allowing you to choose the one that best aligns with your research objectives and the nature of your data. The choice between qualitative and quantitative methods, or a combination of both, depends on the complexity of the analysis and the questions you seek to answer.

How to Conduct Comparative Analysis?

Once you've prepared your data and chosen an appropriate methodology, it's time to dive into the process of conducting a comparative analysis. We will guide you through the essential steps to extract meaningful insights from your data.


1. Identify Key Variables and Metrics

Identifying key variables and metrics is the first crucial step in conducting a comparative analysis. These are the factors or indicators you'll use to assess and compare your options.

  • Relevance to Objectives: Ensure the chosen variables and metrics align closely with your analysis objectives. When comparing marketing strategies, relevant metrics might include customer acquisition cost, conversion rate, and retention.
  • Quantitative vs. Qualitative: Decide whether your analysis will focus on quantitative data (numbers) or qualitative data (descriptive information). In some cases, a combination of both may be appropriate.
  • Data Availability: Consider the availability of data. Ensure you can access reliable and up-to-date data for all selected variables and metrics.
  • KPIs: Key Performance Indicators (KPIs) are often used as the primary metrics in comparative analysis. These are metrics that directly relate to your goals and objectives.

2. Visualize Data for Clarity

Data visualization techniques play a vital role in making complex information more accessible and understandable. Effective data visualization allows you to convey insights and patterns to stakeholders. Consider the following approaches:

  • Charts and Graphs: Use various types of charts, such as bar charts, line graphs, and pie charts, to represent data. For example, a line graph can illustrate trends over time, while a bar chart can compare values across categories.
  • Heatmaps: Heatmaps are particularly useful for visualizing large datasets and identifying patterns through color-coding. They can reveal correlations, concentrations, and outliers.
  • Scatter Plots: Scatter plots help visualize relationships between two variables. They are especially useful for identifying trends, clusters, or outliers.
  • Dashboards: Create interactive dashboards that allow users to explore data and customize views. Dashboards are valuable for ongoing analysis and reporting.
  • Infographics: For presentations and reports, consider using infographics to summarize key findings in a visually engaging format.

Effective data visualization not only enhances understanding but also aids in decision-making by providing clear insights at a glance.
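For example, a minimal matplotlib sketch (with invented conversion rates) turns a side-by-side comparison of three campaigns into a bar chart:

import matplotlib.pyplot as plt

# Invented conversion rates for three marketing campaigns.
campaigns = ["Campaign A", "Campaign B", "Campaign C"]
conversion_rate_pct = [2.1, 3.4, 2.8]

plt.bar(campaigns, conversion_rate_pct)
plt.ylabel("Conversion rate (%)")
plt.title("Conversion rate by campaign")
plt.show()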

3. Establish Clear Comparative Frameworks

A well-structured comparative framework provides a systematic approach to your analysis. It ensures consistency and enables you to make meaningful comparisons. Here's how to create one:

  • Comparison Matrices: Consider using matrices or spreadsheets to organize your data. Each row represents an option or entity, and each column corresponds to a variable or metric. This matrix format allows for side-by-side comparisons.
  • Decision Trees: In complex decision-making scenarios, decision trees help map out possible outcomes based on different criteria and variables. They visualize the decision-making process.
  • Scenario Analysis: Explore different scenarios by altering variables or criteria to understand how changes impact outcomes. Scenario analysis is valuable for risk assessment and planning.
  • Checklists: Develop checklists or scoring sheets to systematically evaluate each option against predefined criteria. Checklists ensure that no essential factors are overlooked.

A well-structured comparative framework simplifies the analysis process, making it easier to draw meaningful conclusions and make informed decisions.

4. Evaluate and Score Criteria

Evaluating and scoring criteria is a critical step in comparative analysis, as it quantifies the performance of each option against the chosen criteria.

  • Scoring System: Define a scoring system that assigns values to each criterion for every option. Common scoring systems include numerical scales, percentage scores, or qualitative ratings (e.g., high, medium, low).
  • Consistency: Ensure consistency in scoring by defining clear guidelines for each score. Provide examples or descriptions to help evaluators understand what each score represents.
  • Data Collection: Collect data or information relevant to each criterion for all options. This may involve quantitative data (e.g., sales figures) or qualitative data (e.g., customer feedback).
  • Aggregation: Aggregate the scores for each option to obtain an overall evaluation. This can be done by summing the individual criterion scores or applying weighted averages.
  • Normalization: If your criteria have different measurement scales or units, consider normalizing the scores to create a level playing field for comparison.

5. Assign Importance to Criteria

Not all criteria are equally important in a comparative analysis. Weighting criteria allows you to reflect their relative significance in the final decision-making process.

  • Relative Importance: Assess the importance of each criterion in achieving your objectives. Criteria directly aligned with your goals may receive higher weights.
  • Weighting Methods: Choose a weighting method that suits your analysis. Common methods include expert judgment, analytic hierarchy process (AHP), or data-driven approaches based on historical performance.
  • Impact Analysis: Consider how changes in the weights assigned to criteria would affect the final outcome. This sensitivity analysis helps you understand the robustness of your decisions.
  • Stakeholder Input: Involve relevant stakeholders or decision-makers in the weighting process. Their input can provide valuable insights and ensure alignment with organizational goals.
  • Transparency: Clearly document the rationale behind the assigned weights to maintain transparency in your analysis.

By weighting criteria, you ensure that the most critical factors have a more significant influence on the final evaluation, aligning the analysis more closely with your objectives and priorities.
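Putting the scoring and weighting steps together, here is a minimal sketch that aggregates weighted criterion scores into an overall score for each option; the vendors, criteria, weights, and 1-to-5 scores are hypothetical.

# Hypothetical weighted scoring: weights sum to 1.0, scores use a 1-5 scale.
weights = {"cost": 0.5, "quality": 0.3, "support": 0.2}
scores = {
    "Vendor A": {"cost": 4, "quality": 3, "support": 5},
    "Vendor B": {"cost": 3, "quality": 5, "support": 4},
}

for option, criterion_scores in scores.items():
    weighted_total = sum(weights[c] * s for c, s in criterion_scores.items())
    print(f"{option}: weighted score = {weighted_total:.2f}")

Re-running the calculation with different weights is a simple form of the sensitivity analysis described above: if the ranking flips under small weight changes, the decision is not robust.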

With these steps in place, you're well-prepared to conduct a comprehensive comparative analysis. The next phase involves interpreting your findings, drawing conclusions, and making informed decisions based on the insights you've gained.

Comparative Analysis Interpretation

Interpreting the results of your comparative analysis is a crucial phase that transforms data into actionable insights. We'll delve into various aspects of interpretation and how to make sense of your findings.

  • Contextual Understanding: Before diving into the data, consider the broader context of your analysis. Understand the industry trends, market conditions, and any external factors that may have influenced your results.
  • Drawing Conclusions: Summarize your findings clearly and concisely. Identify trends, patterns, and significant differences among the options or variables you've compared.
  • Quantitative vs. Qualitative Analysis: Depending on the nature of your data and analysis, you may need to balance both quantitative and qualitative interpretations. Qualitative insights can provide context and nuance to quantitative findings.
  • Comparative Visualization: Visual aids such as charts, graphs, and tables can help convey your conclusions effectively. Choose visual representations that align with the nature of your data and the key points you want to emphasize.
  • Outliers and Anomalies: Identify and explain any outliers or anomalies in your data. Understanding these exceptions can provide valuable insights into unusual cases or factors affecting your analysis.
  • Cross-Validation: Validate your conclusions by comparing them with external benchmarks, industry standards, or expert opinions. Cross-validation helps ensure the reliability of your findings.
  • Implications for Decision-Making: Discuss how your analysis informs decision-making. Clearly articulate the practical implications of your findings and their relevance to your initial objectives.
  • Actionable Insights: Emphasize actionable insights that can guide future strategies, policies, or actions. Make recommendations based on your analysis, highlighting the steps needed to capitalize on strengths or address weaknesses.
  • Continuous Improvement: Encourage a culture of continuous improvement by using your analysis as a feedback mechanism. Suggest ways to monitor and adapt strategies over time based on evolving circumstances.

Comparative Analysis Applications

Comparative analysis is a versatile methodology that finds application in various fields and scenarios. Let's explore some of the most common and impactful applications.

Business Decision-Making

Comparative analysis is widely employed in business to inform strategic decisions and drive success. Key applications include:

Market Research and Competitive Analysis

  • Objective: To assess market opportunities and evaluate competitors.
  • Methods: Analyzing market trends, customer preferences, competitor strengths and weaknesses, and market share.
  • Outcome: Informed product development, pricing strategies, and market entry decisions.

Product Comparison and Benchmarking

  • Objective: To compare the performance and features of products or services.
  • Methods: Evaluating product specifications, customer reviews, and pricing.
  • Outcome: Identifying strengths and weaknesses, improving product quality, and setting competitive pricing.

Financial Analysis

  • Objective: To evaluate financial performance and make investment decisions.
  • Methods: Comparing financial statements, ratios, and performance indicators of companies.
  • Outcome: Informed investment choices, risk assessment, and portfolio management.

Healthcare and Medical Research

In the healthcare and medical research fields, comparative analysis is instrumental in understanding diseases, treatment options, and healthcare systems.

Clinical Trials and Drug Development

  • Objective: To compare the effectiveness of different treatments or drugs.
  • Methods: Analyzing clinical trial data, patient outcomes, and side effects.
  • Outcome: Informed decisions about drug approvals, treatment protocols, and patient care.

Health Outcomes Research

  • Objective: To assess the impact of healthcare interventions.
  • Methods: Comparing patient health outcomes before and after treatment or between different treatment approaches.
  • Outcome: Improved healthcare guidelines, cost-effectiveness analysis, and patient care plans.

Healthcare Systems Evaluation

  • Objective: To assess the performance of healthcare systems.
  • Methods: Comparing healthcare delivery models, patient satisfaction, and healthcare costs.
  • Outcome: Informed healthcare policy decisions, resource allocation, and system improvements.

Social Sciences and Policy Analysis

Comparative analysis is a fundamental tool in social sciences and policy analysis, aiding in understanding complex societal issues.

Educational Research

  • Objective: To compare educational systems and practices.
  • Methods: Analyzing student performance, curriculum effectiveness, and teaching methods.
  • Outcome: Informed educational policies, curriculum development, and school improvement strategies.

Political Science

  • Objective: To study political systems, elections, and governance.
  • Methods: Comparing election outcomes, policy impacts, and government structures.
  • Outcome: Insights into political behavior, policy effectiveness, and governance reforms.

Social Welfare and Poverty Analysis

  • Objective: To evaluate the impact of social programs and policies.
  • Methods: Comparing the well-being of individuals or communities with and without access to social assistance.
  • Outcome: Informed policymaking, poverty reduction strategies, and social program improvements.

Environmental Science and Sustainability

Comparative analysis plays a pivotal role in understanding environmental issues and promoting sustainability.

Environmental Impact Assessment

  • Objective: To assess the environmental consequences of projects or policies.
  • Methods: Comparing ecological data, resource use, and pollution levels.
  • Outcome: Informed environmental mitigation strategies, sustainable development plans, and regulatory decisions.

Climate Change Analysis

  • Objective: To study climate patterns and their impacts.
  • Methods: Comparing historical climate data, temperature trends, and greenhouse gas emissions.
  • Outcome: Insights into climate change causes, adaptation strategies, and policy recommendations.

Ecosystem Health Assessment

  • Objective: To evaluate the health and resilience of ecosystems.
  • Methods: Comparing biodiversity, habitat conditions, and ecosystem services.
  • Outcome: Conservation efforts, restoration plans, and ecological sustainability measures.

Technology and Innovation

Comparative analysis is crucial in the fast-paced world of technology and innovation.

Product Development and Innovation

  • Objective: To assess the competitiveness and innovation potential of products or technologies.
  • Methods: Comparing research and development investments, technology features, and market demand.
  • Outcome: Informed innovation strategies, product roadmaps, and patent decisions.

User Experience and Usability Testing

  • Objective: To evaluate the user-friendliness of software applications or digital products.
  • Methods: Comparing user feedback, usability metrics, and user interface designs.
  • Outcome: Improved user experiences, interface redesigns, and product enhancements.

Technology Adoption and Market Entry

  • Objective: To analyze market readiness and risks for new technologies.
  • Methods: Comparing market conditions, regulatory landscapes, and potential barriers.
  • Outcome: Informed market entry strategies, risk assessments, and investment decisions.

These diverse applications of comparative analysis highlight its flexibility and importance in decision-making across various domains. Whether in business, healthcare, social sciences, environmental studies, or technology, comparative analysis empowers researchers and decision-makers to make informed choices and drive positive outcomes.

Comparative Analysis Best Practices

Successful comparative analysis relies on following best practices and avoiding common pitfalls. Implementing these practices enhances the effectiveness and reliability of your analysis.

  • Clearly Defined Objectives: Start with well-defined objectives that outline what you aim to achieve through the analysis. Clear objectives provide focus and direction.
  • Data Quality Assurance: Ensure data quality by validating, cleaning, and normalizing your data. Poor-quality data can lead to inaccurate conclusions.
  • Transparent Methodologies: Clearly explain the methodologies and techniques you've used for analysis. Transparency builds trust and allows others to assess the validity of your approach.
  • Consistent Criteria: Maintain consistency in your criteria and metrics across all options or variables. Inconsistent criteria can lead to biased results.
  • Sensitivity Analysis: Conduct sensitivity analysis by varying key parameters, such as weights or assumptions, to assess the robustness of your conclusions.
  • Stakeholder Involvement: Involve relevant stakeholders throughout the analysis process. Their input can provide valuable perspectives and ensure alignment with organizational goals.
  • Critical Evaluation of Assumptions: Identify and critically evaluate any assumptions made during the analysis. Assumptions should be explicit and justifiable.
  • Holistic View: Take a holistic view of the analysis by considering both short-term and long-term implications. Avoid focusing solely on immediate outcomes.
  • Documentation: Maintain thorough documentation of your analysis, including data sources, calculations, and decision criteria. Documentation supports transparency and facilitates reproducibility.
  • Continuous Learning: Stay updated with the latest analytical techniques, tools, and industry trends. Continuous learning helps you adapt your analysis to changing circumstances.
  • Peer Review: Seek peer review or expert feedback on your analysis. External perspectives can identify blind spots and enhance the quality of your work.
  • Ethical Considerations: Address ethical considerations, such as privacy and data protection, especially when dealing with sensitive or personal data.

By adhering to these best practices, you'll not only improve the rigor of your comparative analysis but also ensure that your findings are reliable, actionable, and aligned with your objectives.

Comparative Analysis Examples

To illustrate the practical application and benefits of comparative analysis, let's explore several real-world examples across different domains. These examples showcase how organizations and researchers leverage comparative analysis to make informed decisions, solve complex problems, and drive improvements:

Retail Industry - Price Competitiveness Analysis

Objective: A retail chain aims to assess its price competitiveness against competitors in the same market.

Methodology:

  • Collect pricing data for a range of products offered by the retail chain and its competitors.
  • Organize the data into a comparative framework, categorizing products by type and price range.
  • Calculate price differentials, averages, and percentiles for each product category.
  • Analyze the findings to identify areas where the retail chain's prices are higher or lower than competitors.

Outcome: The analysis reveals that the retail chain's prices are consistently lower in certain product categories but higher in others. This insight informs pricing strategies, allowing the retailer to adjust prices to remain competitive in the market.
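A minimal sketch of the underlying calculation, with invented prices for a single product category, might look like this:

from statistics import mean, median

# Invented prices (in dollars) for one product category: own chain vs. a competitor,
# matched item by item.
own_prices = [19.99, 24.99, 14.49, 29.99]
competitor_prices = [21.49, 23.99, 15.99, 27.49]

differentials = [own - comp for own, comp in zip(own_prices, competitor_prices)]
print(f"Average own price: {mean(own_prices):.2f}")
print(f"Average competitor price: {mean(competitor_prices):.2f}")
print(f"Median price differential: {median(differentials):+.2f}")
# A negative differential means the chain is cheaper than the competitor for that item.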

Healthcare - Comparative Effectiveness Research

Objective: Researchers aim to compare the effectiveness of two different treatment methods for a specific medical condition.

  • Recruit patients with the medical condition and randomly assign them to two treatment groups.
  • Collect data on treatment outcomes, including symptom relief, side effects, and recovery times.
  • Analyze the data using statistical methods to compare the treatment groups.
  • Consider factors like patient demographics and baseline health status as potential confounding variables.

Outcome: The comparative analysis reveals that one treatment method is statistically more effective than the other in relieving symptoms and has fewer side effects. This information guides medical professionals in recommending the more effective treatment to patients.
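A heavily simplified sketch of the statistical comparison (invented symptom-relief scores, no adjustment for confounders) could use a two-sample t-test from scipy:

from scipy import stats

# Invented symptom-relief scores (higher = better) for two treatment groups.
treatment_a = [7.1, 6.8, 7.5, 6.9, 7.3, 7.0]
treatment_b = [6.2, 6.5, 6.0, 6.4, 6.1, 6.6]

t_stat, p_value = stats.ttest_ind(treatment_a, treatment_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the difference between groups is unlikely to be due to chance;
# a real trial would also adjust for demographics and baseline health status.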

Environmental Science - Carbon Emission Analysis

Objective: An environmental organization seeks to compare carbon emissions from various transportation modes in a metropolitan area.

  • Collect data on the number of vehicles, their types (e.g., cars, buses, bicycles), and fuel consumption for each mode of transportation.
  • Calculate the total carbon emissions for each mode based on fuel consumption and emission factors.
  • Create visualizations such as bar charts and pie charts to represent the emissions from each transportation mode.
  • Consider factors like travel distance, occupancy rates, and the availability of alternative fuels.

Outcome: The comparative analysis reveals that public transportation generates significantly lower carbon emissions per passenger mile compared to individual car travel. This information supports advocacy for increased public transit usage to reduce carbon footprint.
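A stripped-down sketch of the per-passenger-mile calculation is shown below; the fuel-use, emission-factor, and occupancy figures are illustrative placeholders rather than measured values.

# mode: (gallons of fuel per vehicle-mile, kg CO2 per gallon, average passengers)
modes = {
    "car": (0.04, 8.9, 1.5),
    "bus": (0.17, 10.2, 20.0),
}

for mode, (gallons_per_mile, kg_co2_per_gallon, passengers) in modes.items():
    kg_per_passenger_mile = gallons_per_mile * kg_co2_per_gallon / passengers
    print(f"{mode}: {kg_per_passenger_mile:.3f} kg CO2 per passenger-mile")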

Technology Industry - Feature Comparison for Software Development Tools

Objective: A software development team needs to choose the most suitable development tool for an upcoming project.

  • Create a list of essential features and capabilities required for the project.
  • Research and compile information on available development tools in the market.
  • Develop a comparative matrix or scoring system to evaluate each tool's features against the project requirements.
  • Assign weights to features based on their importance to the project.

Outcome: The comparative analysis highlights that Tool A excels in essential features critical to the project, such as version control integration and debugging capabilities. The development team selects Tool A as the preferred choice for the project.

Educational Research - Comparative Study of Teaching Methods

Objective: A school district aims to improve student performance by comparing the effectiveness of traditional classroom teaching with online learning.

  • Randomly assign students to two groups: one taught using traditional methods and the other through online courses.
  • Administer pre- and post-course assessments to measure knowledge gain.
  • Collect feedback from students and teachers on the learning experiences.
  • Analyze assessment scores and feedback to compare the effectiveness and satisfaction levels of both teaching methods.

Outcome: The comparative analysis reveals that online learning leads to similar knowledge gains as traditional classroom teaching. However, students report higher satisfaction and flexibility with the online approach. The school district considers incorporating online elements into its curriculum.

These examples illustrate the diverse applications of comparative analysis across industries and research domains. Whether optimizing pricing strategies in retail, evaluating treatment effectiveness in healthcare, assessing environmental impacts, choosing the right software tool, or improving educational methods, comparative analysis empowers decision-makers with valuable insights for informed choices and positive outcomes.

Conclusion for Comparative Analysis

Comparative analysis is your compass in the world of decision-making. It helps you see the bigger picture, spot opportunities, and navigate challenges. By defining your objectives, gathering data, applying methodologies, and following best practices, you can harness the power of comparative analysis to make informed choices and drive positive outcomes.

Remember, comparative analysis is not just a tool; it's a mindset that empowers you to transform data into insights and uncertainty into clarity. So, whether you're steering a business, conducting research, or facing life's choices, embrace comparative analysis as your trusted guide on the journey to better decisions. With it, you can chart your course, make impactful choices, and set sail toward success.

How to Conduct Comparative Analysis in Minutes?

Are you ready to revolutionize your approach to market research and comparative analysis? Appinio, a real-time market research platform, empowers you to harness the power of real-time consumer insights for swift, data-driven decisions. Here's why you should choose Appinio:

  • Speedy Insights: Get from questions to insights in minutes, enabling you to conduct comparative analysis without delay.
  • User-Friendly: No need for a PhD in research – our intuitive platform is designed for everyone, making it easy to collect and analyze data.
  • Global Reach: With access to over 90 countries and the ability to define your target group from 1200+ characteristics, Appinio provides a worldwide perspective for your comparative analysis.


Comparative Analysis: What It Is & How to Conduct It

Comparative analysis compares your site or tool to those of your competitors, so you know exactly what they have to offer.

When a business wants to launch a marketing campaign or grow, a comparative analysis can provide the information it needs to make crucial decisions. The analysis gathers different data sets and compares the available options so the business can choose well for its customers and itself. If you or your business want to make better decisions, learning about comparative analyses could be helpful.

In this article, we'll explain comparative analysis and its importance. We'll also cover how to conduct a good in-depth analysis.

What is comparative analysis?

Comparative analysis is a way to look at two or more similar things to see how they are different and what they have in common. 

It is used in many fields to help people better understand the similarities and differences between products, and it can help businesses make sound decisions about key issues.

One meaningful application is to scientific data: information gathered through scientific research for a specific purpose.

When applied to scientific data, comparative analysis helps determine how consistent and reliable the data are, and it helps scientists confirm that their data are accurate and valid.

Importance of comparative analysis 

Comparative analyses are important if you want to understand a problem better or find answers to important questions. Here are the main goals businesses want to reach through comparative analysis.

  • It is part of the diagnostic phase of business analytics. It can answer many of a company's most important questions and help you figure out how to fix problems at the company's core to improve performance and profitability.
  • It encourages a deep understanding of the opportunities that apply to specific processes, departments, or business units. It also ensures that you're addressing the real reasons for performance gaps.
  • It is widely used because it clarifies the challenges an organization has faced in the past and the ones it faces now. The method gives objective, fact-based information about performance and ways to improve it.

How to successfully conduct it

Consider using the advice below to carry out a successful comparative analysis:

Conduct research

Before doing an analysis, it's important to do thorough research. Research not only gives you evidence to back up your conclusions, but it might also show you something you hadn't thought of before.

Research could also tell you how your competitors might handle a problem.

Make a list of what's different and what's the same

When comparing two things in a comparative analysis, you need to make a detailed list of the similarities and differences.

Try to figure out how a change to one thing might affect another, such as how increasing the number of vacation days affects sales, production, or costs.

A comparative analysis can also help you find outside causes, such as economic conditions or environmental problems.

Describe both sides

A comparative analysis may aim to show that one argument or idea is better, but it must cover both sides equally, presenting the main arguments and claims for each.

For example, to compare the benefits and drawbacks of starting a recycling program, one might examine both the positive effects, such as corporate responsibility, and the potential negative effects, such as high implementation costs, to make wise, practical decisions or come up with alternate solutions.

Include variables

A thorough comparative analysis is usually more than just a list of pros and cons because it considers the variables that affect both sides.

Variables can include factors you can't control, like how summer weather affects shipping speeds, and factors you can, like whether to work with a local shipper.

Do analyses regularly

Comparative analyses are important for any business practice. Consider the different areas and factors that a comparative analysis looks at:

  • Competitors
  • Stock performance
  • Financial position
  • Profitability
  • Dividends and revenue
  • Research and development

Because a comparative analysis can help more than one department in a company, doing them often can help you keep up with market changes and stay relevant.

We've talked about how useful comparative analysis is for your business, but everything has two sides: it's a good shortcut, yet you should still do your own user interviews or user tests if you can.

We hope you enjoy doing comparative analyses! The point of learning from competitors is to add your own ideas, so you are not just following but also learning and creating.

QuestionPro can help you with your analysis process, create and design a survey to meet your goals, and analyze data for your business’s comparative analysis.

At QuestionPro, we give researchers tools for collecting data, like our survey software and a library of insights for all kinds of long-term research. If you want to book a demo or learn more about our platform, just click here.


Transl Behav Med. 2014 Jun; 4(2)


Using qualitative comparative analysis to understand and quantify translation and implementation

Heather Kane

RTI International, 3040 Cornwallis Road, Research Triangle Park, P.O. Box 12194, Durham, NC 27709 USA

Megan A Lewis

Pamela A Williams, Leila C Kahwati

Understanding the factors that facilitate implementation of behavioral medicine programs into practice can advance translational science. Often, translation or implementation studies use case study methods with small sample sizes. Methodological approaches that systematize findings from these types of studies are needed to improve rigor and advance the field. Qualitative comparative analysis (QCA) is a method and analytical approach that can advance implementation science. QCA offers an approach for rigorously conducting translational and implementation research limited by a small number of cases. We describe the methodological and analytic approach for using QCA and provide examples of its use in the health and health services literature. QCA brings together qualitative or quantitative data derived from cases to identify necessary and sufficient conditions for an outcome. QCA offers advantages for researchers interested in analyzing complex programs and for practitioners interested in developing programs that achieve successful health outcomes.

INTRODUCTION

In this paper, we describe the methodological features and advantages of using qualitative comparative analysis (QCA). QCA is sometimes called a “mixed method.” It refers to both a specific research approach and an analytic technique that is distinct from and offers several advantages over traditional qualitative and quantitative methods [ 1 – 4 ]. It can be used to (1) analyze small to medium numbers of cases (e.g., 10 to 50) when traditional statistical methods are not possible, (2) examine complex combinations of explanatory factors associated with translation or implementation “success,” and (3) combine qualitative and quantitative data using a unified and systematic analytic approach.

This method may be especially pertinent for behavioral medicine given the growing interest in implementation science [ 5 ]. Translating behavioral medicine research and interventions into useful practice and policy requires an understanding of the implementation context. Understanding the context under which interventions work and how different ways of implementing an intervention lead to successful outcomes are required for “T3” (i.e., dissemination and implementation of evidence-based interventions) and “T4” translations (i.e., policy development to encourage evidence-based intervention use among various stakeholders) [ 6 , 7 ].

Case studies are a common way to assess different program implementation approaches and to examine complex systems (e.g., health care delivery systems, interventions in community settings) [ 8 ]. However, multiple case studies often have small, naturally limited samples or populations; small samples and populations lack adequate power to support conventional, statistical analyses. Case studies also may use mixed-method approaches, but typically when researchers collect quantitative and qualitative data in tandem, they rarely integrate both types of data systematically in the analysis. QCA offers solutions for the challenges posed by case studies and provides a useful analytic tool for translating research into policy recommendations. Using QCA methods could aid behavioral medicine researchers who seek to translate research from randomized controlled trials into practice settings to understand implementation. In this paper, we describe the conceptual basis of QCA, its application in the health and health services literature, and its features and limitations.

CONCEPTUAL BASIS OF QCA

QCA has its foundations in historical, comparative social science. Researchers in this field developed QCA because probabilistic methods failed to capture the complexity of social phenomena and required large sample sizes [ 1 ]. Recently, this method has made inroads into health research and evaluation [ 9 – 13 ] because of several useful features as follows: (1) it models equifinality, which is the ability to identify more than one causal pathway to an outcome (or absence of the outcome); (2) it identifies conjunctural causation, which means that single conditions may not display their effects on their own, but only in conjunction with other conditions; and (3) it implies asymmetrical relationships between causal conditions and outcomes, which means that causal pathways for achieving the outcome differ from causal pathways for failing to achieve the outcome.

QCA is a case-oriented approach that examines relationships between conditions (similar to explanatory variables in regression models) and an outcome using set theory, a branch of mathematics or of symbolic logic that deals with the nature and relations of sets. A set-theoretic approach to modeling causality differs from probabilistic methods, which examine the independent, additive influence of variables on an outcome. Regression models, based on underlying assumptions about sampling and distribution of the data, ask “what factor, holding all other factors constant at each factor’s average, will increase (or decrease) the likelihood of an outcome.” QCA, an approach based on the examination of set, subset, and superset relationships, asks “what conditions, alone or in combination with other conditions, are necessary or sufficient to produce an outcome.” For additional QCA definitions, see Ragin [ 4 ].

Necessary conditions are those that exhibit a superset relationship with the outcome set and are conditions or combinations of conditions that must be present for an outcome to occur. In assessing necessity, a researcher “identifies conditions shared by cases with the same outcome” [ 4 ] (p. 20). Figure  1 shows a hypothetical example. In this figure, condition X is a necessary condition for an effective intervention because all cases with condition X are also members of the set of cases with the outcome present; however, condition X is not sufficient for an effective intervention because it is possible to be a member of the set of cases with condition X, but not be a member of the outcome set [ 14 ].

Figure 1. Necessary and sufficient conditions and set-theoretic relationships

Sufficient conditions exhibit subset relationships with an outcome set and demonstrate that “the cause in question produces the outcome in question” [ 3 ] (p. 92). Figure  1 shows the multiple and different combinations of conditions that produce the hypothetical outcome, “effective intervention,” (1) by having condition A present, (2) by having condition D present, or (3) by having the combination of conditions B and C present. None of these conditions is necessary and any one of these conditions or combinations of conditions is sufficient for the outcome of an effective intervention.
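A minimal crisp-set sketch of these set relationships, using hypothetical cases that mirror the Figure 1 example (condition X necessary but not sufficient), checks necessity and sufficiency directly:

# Hypothetical crisp-set data: 1 = in the set, 0 = out of the set.
cases = [
    {"condition_x": 1, "effective": 1},
    {"condition_x": 1, "effective": 1},
    {"condition_x": 1, "effective": 0},
    {"condition_x": 0, "effective": 0},
]

# Necessity: every case with the outcome also exhibits the condition
# (the outcome set is a subset of the condition set).
necessary = all(c["condition_x"] == 1 for c in cases if c["effective"] == 1)
# Sufficiency: every case with the condition also exhibits the outcome
# (the condition set is a subset of the outcome set).
sufficient = all(c["effective"] == 1 for c in cases if c["condition_x"] == 1)
print(f"X necessary for an effective intervention: {necessary}")
print(f"X sufficient for an effective intervention: {sufficient}")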

QCA AS AN APPROACH AND AS AN ANALYTIC TECHNIQUE

The term “QCA” is sometimes used to refer to the comparative research approach but also refers to the “analytic moment” during which Boolean algebra and set theory logic is applied to truth tables constructed from data derived from included cases. Figure  2 characterizes this distinction. Although this figure depicts steps as sequential, like many research endeavors, these steps are somewhat iterative, with respecification and reanalysis occurring along the way to final findings. We describe each of the essential steps of QCA as an approach and analytic technique and provide examples of how it has been used in health-related research.

Figure 2. QCA as an approach and as an analytic technique

Operationalizing the research question

Like other types of studies, the first step involves identifying the research question(s) and developing a conceptual model. This step guides the study as a whole and also informs case, condition (c.f., variable), and outcome selection. As mentioned above, QCA frames research questions differently than traditional quantitative or qualitative methods. Research questions appropriate for a QCA approach would seek to identify the necessary and sufficient conditions required to achieve the outcome. Thus, formulating a QCA research question emphasizes what program components or features—individually or in combination—need to be in place for a program or intervention to have a chance at being effective (i.e., necessary conditions) and what program components or features—individually or in combination—would produce the outcome (i.e., sufficient conditions). For example, a set theoretic hypothesis would be as follows: If a program is supported by strong organizational capacity and a comprehensive planning process, then the program will be successful. A hypothesis better addressed by probabilistic methods would be as follows: Organizational capacity, holding all other factors constant, increases the likelihood that a program will be successful.

For example, Longest and Thoits [ 15 ] drew on an extant stress process model to assess whether the pathways leading to psychological distress differed for women and men. Using QCA was appropriate for their study because the stress process model “suggests that particular patterns of predictors experienced in tandem may have unique relationships with health outcomes” (p. 4, italics added). They theorized that predictors would exhibit effects in combination because some aspects of the stress process model would buffer the risk of distress (e.g., social support) while others simultaneously would increase the risk (e.g., negative life events).

Identify cases

The number of cases in a QCA analysis may be determined by the population (e.g., 10 intervention sites, 30 grantees). When particular cases can be chosen from a larger population, Berg-Schlosser and De Meur [ 16 ] offer other strategies and best practices for choosing cases. Unless the number of cases relies on an existing population (i.e., 30 programs or grantees), the outcome of interest and existing theory drive case selection, unlike variable-oriented research [ 3 , 4 ] in which numbers are driven by statistical power considerations and depend on variation in the dependent variable. For use in causal inference, both cases that exhibit and do not exhibit the outcome should be included [ 16 ]. If a researcher is interested in developing typologies or concept formation, he or she may wish to examine similar cases that exhibit differences on the outcome or to explore cases that exhibit the same outcome [ 14 , 16 ].

For example, Kahwati et al. [ 9 ] examined the structure, policies, and processes that might lead to an effective clinical weight management program in a large national integrated health care system, as measured by mean weight loss among patients treated at the facility. To examine pathways that lead to both better and poorer facility-level weight loss, 11 facilities from among those with the largest weight loss outcomes and 11 facilities from among those with the smallest were included. By choosing cases based on specific outcomes, Kahwati et al. could identify multiple patterns of success (or failure) that explain the outcome rather than the variability associated with the outcome.

Identify conditions and outcome sets

Selecting conditions relies on the research question, conceptual model, and number of cases, similar to other research methods. Conditions (or “sets” or “condition sets”) refer to the explanatory factors in a model; they are similar to variables. Because QCA research questions assess necessary and sufficient conditions, a researcher should consider which conditions in the conceptual model would theoretically produce the outcome individually or in combination. This helps to focus the analysis and number of conditions. Ideally, for a case study design with a small (e.g., 10–15) or intermediate (e.g., 16–100) number of cases, one should aim for fewer than five conditions because in QCA a researcher assesses all possible configurations of conditions. Adding conditions to the model increases the possible number of combinations exponentially (i.e., 2^k, where k = the number of conditions). For three conditions, eight possible combinations of the selected conditions exist as follows: the presence of A, B, C together, the lack of A with B and C present, the lack of A and lack of B with C present, and so forth. Having too many conditions will likely mean that no cases fall into a particular configuration, and that configuration cannot be assessed by empirical examples. When one or more configurations are not represented by the cases, this is known as limited diversity, and QCA experts suggest multiple strategies for managing such situations [ 4 , 14 ].

For example, Ford et al. [ 10 ] studied health departments’ implementation of core public health functions and organizational factors (e.g., resource availability, adaptability) and how those conditions lead to superior and inferior population health changes. They identified three core public health functions (i.e., assessment of environmental and population public health needs, capacity for policy development, and authority over assurance of healthcare operations) and operationalized them for their study by using composite measures of varied health indicators compiled in a UnitedHealth Group report. In this examination of 41 state health departments, the authors found that all three core public health functions were necessary for population health improvement. The absence of any of the core public health functions was sufficient for poorer population health outcomes; thus, only the health departments with the ability to perform all three core functions had improved outcomes. Additionally, these three core functions in combination with either resource availability or adaptability were sufficient combinations (i.e., causal pathways) for improved population health outcomes.

Calibrate condition and outcome sets

Calibration refers to “adjusting (measures) so that they match or conform to dependably known standards” and is a common way of standardizing data in the physical sciences [ 4 ] (p. 72). Calibration requires the researcher to make sense of variation in the data and apply expert knowledge about what aspects of the variation are meaningful. Because calibration depends on defining conditions based on those “dependably known standards,” QCA relies on expert substantive knowledge, theory, or criteria external to the data themselves [ 14 ]. This may require researchers to collaborate closely with program implementers.

In QCA, one can use “crisp” set or “fuzzy” set calibration. Crisp sets, which are similar to dichotomous categorical variables in regression, establish decision rules defining a case as fully in the set (i.e., condition) or fully out of the set; fuzzy sets establish degrees of membership in a set. Fuzzy sets “differentiate between different levels of belonging anchored by two extreme membership scores at 1 and 0” [ 14 ] (p.28). They can be continuous (e.g., 0, 0.1, 0.2, …) or have qualitatively defined anchor points (e.g., 0 is fully out of the set; 0.33 is more out than in the set; 0.66 is more in than out of the set; 1 is fully in the set). A researcher selects fuzzy sets and the corresponding resolution (i.e., continuous, four cutoff points, six cutoff points) based on theory and meaningful differences between cases and must be able to provide a verbal description for each cutoff point [ 14 ]. If, for example, a researcher cannot distinguish between 0.7 and 0.8 membership in a set, then a more continuous scoring of cases would not be useful; rather, a four-point scheme may better characterize the data. Although crisp and fuzzy sets are the most commonly used, new multivariate forms of QCA are emerging, as are variants that incorporate elements of time [ 14 , 17 , 18 ].

Fuzzy sets have the advantage of maintaining more detail for data with continuous values. However, this strength also makes interpretation more difficult. When an observation is coded with fuzzy sets, a particular observation has some degree of membership in the set “condition A” and in the set “condition NOT A.” Thus, when doing analyses to identify sufficient conditions, a researcher must make a judgment call about what membership benchmark constitutes a threshold for recommending policy or programmatic action.

In creating decision rules for calibration, a researcher can use a variety of techniques to identify cutoff points or anchors. For qualitative conditions, a researcher can define decision rules by drawing from the literature and knowledge of the intervention context. For conditions with numeric values, a researcher can also employ statistical approaches. Ideally, when using statistical approaches, a researcher should establish thresholds using substantive knowledge about set membership (thus, translating variation into meaningful categories). Although measures of central tendency (e.g., cases with a value above the median are considered fully in the set) can be used to set cutoff points, some experts consider the sole use of this method to be flawed because case classification is determined by a case’s relative value in regard to other cases as opposed to its absolute value in reference to an external referent [ 14 ].

For example, in their study of the National Cancer Institute’s Community Clinical Oncology Program (NCI CCOP), Weiner et al. [ 19 ] had numeric data on their five study measures. They transformed their study measures by using their knowledge of the CCOP and by asking NCI officials to identify three values: full membership in a set, a point of maximum ambiguity, and nonmembership in the set. For their outcome set, high accrual in clinical trials, they established an accrual of 100 patients enrolled as fully in the set of high accrual, 70 as a point of ambiguity (neither in nor out of the set), and 50 and below as fully out of the set because “CCOPs must maintain a minimum of 50 patients to maintain CCOP funding” (p. 288). By using QCA and operationalizing condition sets in this way, they were able to answer what condition sets produce high accrual, not what factors predict more accrual. The advantage is that, by using this approach and analytic technique, they were able to identify sets of factors that are linked with a very specific outcome of interest.
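
As a rough illustration of anchor-based fuzzy-set calibration, the sketch below applies the commonly described direct method (a logistic transformation around three anchors) to the accrual thresholds reported in the Weiner et al. example. The function and its implementation details are illustrative assumptions, not the authors' actual procedure or any QCA software's code.

```python
import math

def calibrate_fuzzy(value, full_out, crossover, full_in):
    """Direct-method calibration: map a raw value to fuzzy membership (0-1)
    using three substantively chosen anchors."""
    if value >= crossover:
        log_odds = 3.0 * (value - crossover) / (full_in - crossover)
    else:
        log_odds = 3.0 * (value - crossover) / (crossover - full_out)
    # Logistic transform: roughly 0.95 at the full-membership anchor,
    # 0.5 at the crossover, and 0.05 at the full non-membership anchor.
    return 1.0 / (1.0 + math.exp(-log_odds))

# Anchors from the accrual example: 50 = fully out, 70 = maximum ambiguity, 100 = fully in.
for accrual in [45, 50, 70, 85, 100, 120]:
    print(accrual, round(calibrate_fuzzy(accrual, full_out=50, crossover=70, full_in=100), 2))
```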

Obtain primary or secondary data

Data sources vary based on the study, the availability of the data, and the feasibility of data collection; data can be qualitative or quantitative, a feature useful for mixed-methods studies, and systematically integrating these different types of data is a major strength of this approach. Qualitative data include program documents and descriptions, key informant interviews, and archival data (e.g., program documents, records, policies); quantitative data include surveys, surveillance or registry data, and electronic health records.

For instance, Schensul et al. [ 20 ] relied on in-depth interviews for their analysis; Chuang et al. [ 21 ] and Longest and Thoits [ 15 ] drew on survey data for theirs. Kahwati et al. [ 9 ] used a mixed-method approach combining data from key informant interviews, program documents, and electronic health records. Any type of data can be used to inform the calibration of conditions.

Assign set membership scores

Assigning set membership scores involves applying the decision rules that were established during the calibration phase. To accomplish this, the research team should then use the extracted data for each case, apply the decision rule for the condition, and discuss discrepancies in the data sources. In their study of factors that influence health care policy development in Florida, Harkreader and Imershein [ 22 ] coded contextual factors that supported state involvement in the health care market. Drawing on a review of archival data and using crisp set coding, they assigned a value of 1 for the presence of a contextual factor (e.g., presence of federal financial incentives promoting policy, unified health care provider policy position in opposition to state policy, state agency supporting policy position) and 0 for the absence of a contextual factor.
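
A minimal sketch of how crisp-set decision rules might be applied in code, using hypothetical contextual factors loosely modeled on the example above; the case data, factor names, and helper function are invented for illustration only.

```python
# Hypothetical data extracted from archival review for two cases.
cases = {
    "case_1": {"federal_incentives": "present", "provider_opposition": "absent"},
    "case_2": {"federal_incentives": "absent", "provider_opposition": "present"},
}

def crisp_score(raw_value):
    """Crisp-set decision rule: 1 if the contextual factor is present, 0 if absent."""
    return 1 if raw_value == "present" else 0

membership = {
    case: {condition: crisp_score(value) for condition, value in factors.items()}
    for case, factors in cases.items()
}
print(membership)
```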

Construct truth table

After completing the coding, researchers create a “truth table” for analysis. A truth table lists all of the possible configurations of conditions, the number of cases that fall into that configuration, and the “consistency” of the cases. Consistency quantifies the extent to which cases that share similar conditions exhibit the same outcome; in crisp sets, the consistency value is the proportion of cases that exhibit the outcome. Fuzzy sets require a different calculation to establish consistency and are described at length in other sources [ 1 – 4 , 14 ]. Table  1 displays a hypothetical truth table for three conditions using crisp sets.

Table 1. Sample of a hypothetical truth table for crisp sets (1 = fully in the set; 0 = fully out of the set)
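
The sketch below shows one way a crisp-set truth table, including its consistency column, could be assembled from coded data: cases are grouped by configuration, and consistency is computed as the proportion of cases in each row that exhibit the outcome. The data are hypothetical and the code is only an illustration, not a reimplementation of any QCA package.

```python
import pandas as pd

# Hypothetical crisp-set membership scores (1 = in the set, 0 = out) plus the outcome.
data = pd.DataFrame(
    [
        {"A": 1, "B": 1, "C": 0, "outcome": 1},
        {"A": 1, "B": 1, "C": 0, "outcome": 1},
        {"A": 1, "B": 0, "C": 1, "outcome": 0},
        {"A": 0, "B": 1, "C": 1, "outcome": 1},
        {"A": 0, "B": 1, "C": 1, "outcome": 0},
    ]
)

truth_table = (
    data.groupby(["A", "B", "C"])["outcome"]
    .agg(n_cases="count", consistency="mean")  # crisp consistency = share of cases with the outcome
    .reset_index()
)
print(truth_table)
```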

QCA AS AN ANALYTIC TECHNIQUE

The research steps to this point fall under QCA as an approach to understanding social and health phenomena. Analysis of the truth table is the sine qua non of QCA as an analytic technique. In this section, we provide an overview of the analysis process; analytic techniques and emerging forms of analysis are described in multiple texts [ 3 , 4 , 14 , 17 ]. The use of computer software to conduct truth table analysis is recommended, and several software options are available, including Stata, fsQCA, Tosmana, and R.

A truth table analysis first involves the researcher assessing which (if any) conditions are individually necessary or sufficient for achieving the outcome, and then second, examining whether any configurations of conditions are necessary or sufficient. In instances where contradictions in outcomes from the same configuration pattern occur (i.e., one case from a configuration has the outcome; one does not), the researcher should also consider whether the model is properly specified and conditions are calibrated accurately. Thus, this stage of the analysis may reveal the need to review how conditions are defined and whether the definition should be recalibrated. Similar to qualitative and quantitative research approaches, analysis is iterative.

Additionally, the researcher examines the truth table to assess whether all logically possible configurations have empiric cases. As described above, when configurations lack cases, the problem of limited diversity occurs. Configurations without representative cases are known as logical remainders, and the researcher must consider how to deal with them. The treatment of logical remainders depends on the particular theory guiding the research and the research priorities. How a researcher manages the logical remainders has implications for the final solution, but none of the solutions based on the truth table will contradict the empirical evidence [ 14 ]. To generate the most conservative solution term, a researcher makes no assumptions about truth table rows with no cases (or very few cases in larger-N studies) and excludes them from the logical minimization process. Alternatively, a researcher can choose to include (or exclude) rows with no cases in the analysis, which would generate a solution that is a superset of the conservative solution. Choosing inclusion criteria for logical remainders also depends on theory and on what may be empirically possible. For example, in studying governments, it would be unlikely to have a case that is a democracy (“condition A”) but has a dictator (“condition B”). In that circumstance, the researcher may choose to exclude that theoretically implausible row from the logical minimization process.
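
Continuing the hypothetical three-condition example, the short sketch below flags logical remainders (configurations with no empirical cases) so the researcher can decide how to treat them before minimization; the observed configurations are invented, and this is not any package's actual routine.

```python
from itertools import product

conditions = ["A", "B", "C"]

# Hypothetical configurations that had at least one empirical case.
observed = [(1, 1, 0), (1, 0, 1), (0, 1, 1)]

all_rows = list(product([1, 0], repeat=len(conditions)))
remainders = [dict(zip(conditions, row)) for row in all_rows if row not in observed]
print(remainders)  # logical remainders: configurations with no cases
```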

Third, once all the solutions have been identified, the researcher mathematically reduces the solution [ 1 , 14 ]. For example, if the list of solutions contains two otherwise identical configurations, except that in one configuration A is absent and in the other A is present, then A can be dropped from those two solutions. Finally, the researcher computes two parameters of fit: coverage and consistency. Coverage determines the empirical relevance of a solution and quantifies the variation in causal pathways to an outcome [ 14 ]. The higher the coverage of a causal pathway, the more common the solution is and the more of the outcome it accounts for. However, maximum coverage may be less critical in implementation research because understanding all of the pathways to success may be as helpful as understanding the most common pathway. Consistency assesses whether the causal pathway produces the outcome regularly (“the degree to which the empirical data are in line with a postulated subset relation,” p. 324 [ 14 ]); a high consistency value (e.g., 1.00 or 100 %) would indicate that all cases in a causal pathway produced the outcome. A low consistency value would suggest that a particular pathway did not produce the outcome on a regular basis and thus, for translational purposes, should not be recommended for policy or practice changes. A causal pathway with high consistency and coverage values indicates a result useful for providing guidance; a pathway with high consistency but a lower coverage score also has value, showing a causal pathway that successfully produced the outcome but did so less frequently.
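
For fuzzy sets, consistency and coverage are commonly computed in the QCA literature from sums of minimum membership scores: consistency is the sum of min(x, y) divided by the sum of x, and coverage is the sum of min(x, y) divided by the sum of y, where x is membership in the causal pathway and y is membership in the outcome. The sketch below applies these standard formulas to invented membership scores and is illustrative only.

```python
def consistency(pathway, outcome):
    """Degree to which pathway membership is a subset of outcome membership (sum of min / sum of x)."""
    return sum(min(x, y) for x, y in zip(pathway, outcome)) / sum(pathway)

def coverage(pathway, outcome):
    """Share of outcome membership accounted for by the pathway (sum of min / sum of y)."""
    return sum(min(x, y) for x, y in zip(pathway, outcome)) / sum(outcome)

# Hypothetical fuzzy membership scores for five cases.
pathway_membership = [0.9, 0.8, 0.7, 0.2, 0.1]
outcome_membership = [1.0, 0.9, 0.6, 0.3, 0.4]

print(round(consistency(pathway_membership, outcome_membership), 2))  # ~0.96
print(round(coverage(pathway_membership, outcome_membership), 2))     # ~0.81
```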

For example, Kahwati et al. [ 9 ] examined their truth table and analyzed the data for single conditions and combinations of conditions that were necessary for higher or lower facility-level patient weight loss outcomes. The truth table analysis revealed two necessary conditions and four sufficient combinations of conditions. Because of significant challenges with logical remainders, they used a bottom-up approach to assess whether combinations of conditions yielded the outcome. This entailed pairing conditions to ensure parsimony and maximize coverage. With a smaller number of conditions, a researcher could hypothetically find that more cases share similar characteristics and could assess whether those cases exhibit the same outcome of interest.

At the completion of the truth table analysis, Kahwati et al. [ 9 ] used the qualitative data from site interviews to provide rich examples to illustrate the QCA solutions that were identified, which explained what the solutions meant in clinical practice for weight management. For example, having an involved champion (usually a physician), in combination with low facility accountability, was sufficient for program success (i.e., better weight loss outcomes) and was related to better facility weight loss. In reviewing the qualitative data, Kahwati et al. [ 9 ] discovered that involved champions integrate program activities into their clinical routines and discuss issues as they arise with other program staff. Because involved champions and other program staff communicated informally on a regular basis, formal accountability structures were less of a priority.

ADVANTAGES AND LIMITATIONS OF QCA

Because translational (and other health-related) researchers may be interested in which intervention features—alone or in combination—achieve distinct outcomes (e.g., achievement of program outcomes, reduction in health disparities), QCA is well suited for translational research. To assess combinations of variables in regression, a researcher relies on interaction effects, which, although useful, become difficult to interpret when three, four, or more variables are combined. Furthermore, in regression and other variable-oriented approaches, independent variables are held constant at the average across the study population to isolate the independent effect of that variable, but this masks how factors may interact with each other in ways that impact the ultimate outcomes. In translational research, context matters and QCA treats each case holistically, allowing each case to keep its own values for each condition.

Multiple case studies or studies with the organization as the unit of analysis often involve a small or intermediate number of cases. This hinders the use of standard statistical analyses; researchers are less likely to find statistical significance with small sample sizes. However, QCA draws on analyses of set relations to support small-N studies and to identify the conditions or combinations of conditions that are necessary or sufficient for an outcome of interest and may yield results when probabilistic methods cannot.

Finally, QCA is based on an asymmetric concept of causation , which means that the absence of a sufficient condition associated with an outcome does not necessarily describe the causal pathway to the absence of the outcome [ 14 ]. These characteristics can be helpful for translational researchers who are trying to study or implement complex interventions, where more than one way to implement a program might be effective and where studying both effective and ineffective implementation practices can yield useful information.

QCA has several limitations that researchers should consider before choosing it as a potential methodological approach. With small- and intermediate-N studies, QCA must be theory-driven and circumscribed by priority questions. That is, a researcher ideally should not use a “kitchen sink” approach to test every conceivable condition or combination of conditions because the number of combinations increases exponentially with the addition of another condition. With a small number of cases and too many conditions, the sample would not have enough cases to provide examples of all the possible configurations of conditions (i.e., limited diversity), or the analysis would be constrained to describing the characteristics of the cases, which would have less value than determining whether some conditions or some combination of conditions led to actual program success. However, if the number of conditions cannot be reduced, alternate QCA techniques, such as a bottom-up approach to QCA or two-step QCA, can be used [ 14 ].

Another limitation is that programs or clinical interventions involved in a cross-site analysis may have unique features that do not seem comparable. Cases must share some degree of comparability to use QCA [ 16 ]. Researchers can manage this challenge by taking a broader view of the program(s) and comparing them on broader characteristics or concepts, such as high/low organizational capacity, established partnerships, and program planning, if these would provide meaningful conclusions. Taking this approach will require careful definition of each of these concepts within the context of a particular initiative. Definitions may also need to be revised as the data are gathered and calibration begins.

Finally, as mentioned above, crisp set calibration dichotomizes conditions of interest; this form of calibration means that in some cases, the finer grained differences and precision in a condition may be lost [ 3 ]. Crisp set calibration provides more easily interpretable and actionable results and is appropriate if researchers are primarily interested in the presence or absence of a particular program feature or organizational characteristic to understand translation or implementation.

QCA offers an additional methodological approach for researchers to conduct rigorous comparative analyses while drawing on the rich, detailed data collected as part of a case study. However, as Rihoux, Benoit, and Ragin [ 17 ] note, QCA is not a miracle method, nor a panacea for all studies that use case study methods. Furthermore, it may not always be the most suitable approach for certain types of translational and implementation research. We outlined the multiple steps needed to conduct a comprehensive QCA. QCA’s capacity to examine causal complexity and equifinality could be helpful to behavioral medicine researchers who seek to translate evidence-based interventions in real-world settings. In reality, multiple program models can lead to success, and this method accommodates a more complex and varied understanding of these patterns and factors.

Implications

Practice : Identifying multiple successful intervention models (equifinality) can aid in selecting a practice model relevant to a context, and can facilitate implementation.

Policy : QCA can be used to develop actionable policy information for decision makers that accommodates contextual factors.

Research : Researchers can use QCA to understand causal complexity in translational or implementation research and to assess the relationships between policies, interventions, or procedures and successful outcomes.


Qualitative vs. Quantitative Research | Differences, Examples & Methods

Published on April 12, 2019 by Raimo Streefkerk . Revised on June 22, 2023.

When collecting and analyzing data, quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings. Both are important for gaining different kinds of knowledge.

Common quantitative methods include experiments, observations recorded as numbers, and surveys with closed-ended questions.

Quantitative research is at risk for research biases including information bias, omitted variable bias, sampling bias, or selection bias.

Qualitative research is expressed in words. It is used to understand concepts, thoughts or experiences. This type of research enables you to gather in-depth insights on topics that are not well understood.

Common qualitative methods include interviews with open-ended questions, observations described in words, and literature reviews that explore concepts and theories.


Quantitative and qualitative research use different research methods to collect and analyze data, and they allow you to answer different kinds of research questions.

Qualitative vs. quantitative research

Quantitative and qualitative data can be collected using various methods. It is important to use a data collection method that will help answer your research question(s).

Many data collection methods can be either qualitative or quantitative. For example, in surveys, observational studies or case studies , your data can be represented as numbers (e.g., using rating scales or counting frequencies) or as words (e.g., with open-ended questions or descriptions of what you observe).

However, some methods are more commonly used in one type or the other.

Quantitative data collection methods

  • Surveys : A list of closed-ended or multiple-choice questions distributed to a sample (online, in person, or over the phone).
  • Experiments : Situation in which different types of variables are controlled and manipulated to establish cause-and-effect relationships.
  • Observations : Observing subjects in a natural environment where variables can’t be controlled.

Qualitative data collection methods

  • Interviews : Asking open-ended questions verbally to respondents.
  • Focus groups : Discussion among a group of people about a topic to gather opinions that can be used for further research.
  • Ethnography : Participating in a community or organization for an extended period of time to closely observe culture and behavior.
  • Literature review : Survey of published works by other authors.

A rule of thumb for deciding whether to use qualitative or quantitative data is:

  • Use quantitative research if you want to confirm or test something (a theory or hypothesis )
  • Use qualitative research if you want to understand something (concepts, thoughts, experiences)

For most research topics you can choose a qualitative, quantitative or mixed methods approach . Which type you choose depends on, among other things, whether you’re taking an inductive vs. deductive research approach ; your research question(s) ; whether you’re doing experimental , correlational , or descriptive research ; and practical considerations such as time, money, availability of data, and access to respondents.

Quantitative research approach

You survey 300 students at your university and ask them questions such as: "On a scale from 1-5, how satisfied are you with your professors?"

You can perform statistical analysis on the data and draw conclusions such as: “on average students rated their professors 4.4”.

Qualitative research approach

You conduct in-depth interviews with 15 students and ask them open-ended questions such as: “How satisfied are you with your studies?”, “What is the most positive aspect of your study program?” and “What can be done to improve the study program?”

Based on the answers you get you can ask follow-up questions to clarify things. You transcribe all interviews using transcription software and try to find commonalities and patterns.

Mixed methods approach

You conduct interviews to find out how satisfied students are with their studies. Through open-ended questions you learn things you never thought about before and gain new insights. Later, you use a survey to test these insights on a larger scale.

It’s also possible to start with a survey to find out the overall trends, followed by interviews to better understand the reasons behind the trends.

Qualitative or quantitative data by itself can’t prove or demonstrate anything, but has to be analyzed to show its meaning in relation to the research questions. The method of analysis differs for each type of data.

Analyzing quantitative data

Quantitative data is based on numbers. Simple math or more advanced statistical analysis is used to discover commonalities or patterns in the data. The results are often reported in graphs and tables.

Applications such as Excel, SPSS, or R can be used to calculate things like:

  • Average scores ( means )
  • The number of times a particular answer was given
  • The correlation or causation between two or more variables
  • The reliability and validity of the results
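
As a small illustration of the computations listed above, the following sketch uses pandas on invented survey data; the column names and values are hypothetical.

```python
import pandas as pd

# Hypothetical survey responses: 1-5 satisfaction ratings and weekly study hours.
survey = pd.DataFrame(
    {
        "satisfaction": [4, 5, 3, 4, 4, 2, 5],
        "study_hours": [10, 14, 6, 9, 11, 4, 15],
    }
)

print(survey["satisfaction"].mean())                       # average score (mean)
print(survey["satisfaction"].value_counts())               # how often each answer was given
print(survey["satisfaction"].corr(survey["study_hours"]))  # correlation between two variables
```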

Analyzing qualitative data

Qualitative data is more difficult to analyze than quantitative data. It consists of text, images or videos instead of numbers.

Some common approaches to analyzing qualitative data include:

  • Qualitative content analysis : Tracking the occurrence, position and meaning of words or phrases
  • Thematic analysis : Closely examining the data to identify the main themes and patterns
  • Discourse analysis : Studying how communication works in social contexts

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

In mixed methods research , you use both qualitative and quantitative data collection and analysis methods to answer your research question .

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis , use quantitative methods . If you want to explore ideas, thoughts and meanings, use qualitative methods .
  • If you want to analyze a large amount of readily-available data, use secondary data. If you want data specific to your purposes with control over how it is generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables , use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

There are various approaches to qualitative data analysis , but they all share five steps in common:

  • Prepare and organize your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis , thematic analysis , and discourse analysis .
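
As a deliberately simplified sketch of steps 3–5 above, the code below assigns keyword-based codes to a few invented interview excerpts and tallies recurring themes. Real qualitative coding is interpretive and typically done with dedicated software, so this is only a toy illustration with a hypothetical codebook.

```python
from collections import Counter

# Hypothetical interview excerpts and a simple keyword-based coding system.
excerpts = [
    "The professors are supportive but the workload is heavy.",
    "I enjoy the program, although the workload can be stressful.",
    "Supportive staff made the first year much easier.",
]
codebook = {
    "support": ["supportive", "staff", "help"],
    "workload": ["workload", "stressful", "heavy"],
}

def assign_codes(text):
    """Return the codes whose keywords appear in the excerpt."""
    text = text.lower()
    return [code for code, keywords in codebook.items() if any(k in text for k in keywords)]

theme_counts = Counter(code for excerpt in excerpts for code in assign_codes(excerpt))
print(theme_counts)  # recurring themes across excerpts
```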

A research project is an academic, scientific, or professional undertaking to answer a research question . Research projects can take many forms, such as qualitative or quantitative , descriptive , longitudinal , experimental , or correlational . What kind of research approach you choose will depend on your topic.


Comparative Analysis of Qualitative And Quantitative Research

Subhajit Panda

2019, M.Lib.I.Sc. Project, Panjab University, under guidance of Dr. Shiv Kumar

There is no hard and fast rule for qualitative versus quantitative research, and the distinction is often taken for granted. It is claimed here that the divide between qualitative and quantitative research is ambiguous, incoherent, and hence of little value, and that its widespread use could have negative implications. This conclusion is supported by a variety of arguments. Qualitative researchers, for example, hold varying perspectives on fundamental problems (such as the use of quantification and causal analysis), which makes the distinction itself shaky. In addition, many elements of qualitative and quantitative research overlap significantly, making it difficult to distinguish between the two. In field research especially, as the study points out, the qualitative and quantitative approaches cannot be clearly separated in practice. The distinction may limit innovation in the development of new research methodologies and may cause confusion and wasteful activity. As a general rule, it may be preferable not to conceptualise research approaches at such abstract levels as is done with qualitative and quantitative methodologies. Discussions of the benefits and drawbacks of specific research methods, rather than of these general categories, are recommended.


Business intelligence and business analytics in tourism: insights through Gioia methodology

  • Open access
  • Published: 12 April 2024


  • Montserrat Jiménez-Partearroyo   ORCID: orcid.org/0000-0003-2070-258X 1 ,
  • Ana Medina-López   ORCID: orcid.org/0000-0003-1956-1839 2 &
  • Sudhir Rana   ORCID: orcid.org/0000-0002-9910-1930 3  


Although Business Intelligence (BI) and Business Analytics (BA) have been widely adopted in the tourism sector, comparative research using BI and BA remains scarce. To fill this gap in the literature, the present study explores how BI and BA contribute to strategic innovation, address operational challenges, and enhance customer engagement. To this end, using a dual-method approach that incorporates both quantitative and qualitative methodologies, we first conduct a bibliometric analysis using SciMAT. This sets the stage for the subsequent application of the Gioia methodology. Specifically, we perform an in-depth qualitative examination of a total of 12 scholarly articles on the tourism sector, evenly split between BI and BA. Upon synthesizing the findings on the roles of BI and BA, we outline distinct pathways through which they influence tourism sector management solutions. Based on the obtained evidence, we argue that, while BI focuses on technological advancement and operational integration, BA is more aligned with predictive analytics and data-driven customer engagement. These insights provide managers with a better understanding of the roles of BI and BA, serving as a guide for their strategic applications, from improving service quality to innovating in customer engagement. The novelty of this approach lies in its use of the Gioia methodology, in a comparative analysis to evaluate the separate yet complementary roles of BI and BA, and in enhancing tourism industry practices.


Introduction

Technological innovation improves efficiency, provides society with new and enhanced goods and services through economic reforms, and improves their living conditions (Wen & Okolo, 2023 ). In recent decades, the most extensively advancing technological branch is related to digital technologies.

As argued by Rawal et al. ( 2023 ), technological advances, in particular digitalization, improve efficiency and quality of service in various sectors. In this context, a particularly important role is played by data analysis and visualization. Rawal et al. ( 2023 ) noted the importance of proper data analysis and visualization to enhance customer experiences and decision-making processes, particularly in the service sector. By extrapolating from their conclusions, the transformative effect of Business Intelligence (BI) and Business Analytics (BA) in the tourism sector becomes apparent.

In the tourism sector, digital technology has given rise to the production of large amounts of data. When processed using advanced BA and BI tools, these data transform into key strategic assets for decision making in tourism as a highly dynamic sector. Although complex, this process facilitates the identification of new opportunities and generation of knowledge for tourism companies seeking superior performance and differentiation in the market. In this respect, Saura et al. ( 2023a ) argued that the decision of shifting to these data applications can bring the same results as open innovation models, which are essential for the development of successful innovation strategies.

It can reasonably be expected that BI and BA may become key channels for making this happen, leading the way in turning data into a valuable resource. As the tourism industry moves towards a data-reliant future, BI and BA frequently come together in academic discussions. However, how they are used in tourism requires further investigation.

In this connection, the available literature posits that, although BI and BA are both examined in academic research, a certain ambiguity remains regarding how they are applied in scholarly work. Indeed, the two terms are frequently used interchangeably, which overlooks the clear differences between BI’s traditional focus on descriptive analytics and BA’s emphasis on prescriptive and predictive analytics. This is evident in articles that, despite their titles suggesting a focus on BA, predominantly focus on BI (or vice versa), or that discuss both concepts without detailing their differences (Chen et al., 2023 ; Yiu et al., 2021 ). This impedes theoretical understanding and hinders an accurate use of these data-focused methods in the practical domain of tourism management.

Seeking to bridge this gap between theory and practice, in this study we initiate a process of clarification. To this end, we first conduct a deep bibliometric study, which is then followed by applying the Gioia qualitative methodology (Gioia et al., 2013 ) to distill the essence of BI and BA from secondary data comprising a handpicked selection of 12 research articles. Based on the findings, we identify the core themes and then define the conceptual distinctions between BI and BA within the tourism industry. This innovative approach, which uses real-life data, allows for a thorough synthesis that extends beyond the limited scope of single case studies and thus provides a broad understanding of the conceptual domain.

Our principal aim in this study is to reveal patterns, follow thematic paths through the conceptual mix-up, and obtain a detailed understanding of the roles of BI and BA in creating a more knowledgeable, experience-led, and well-managed tourism sector. By addressing widespread conceptual misunderstandings, we aim to develop a structured framework for future research, strengthening the practical impact of BI and BA in building a robust and competitive tourism industry. Accordingly, our main research question can be formulated as follows: RQ1– In what ways do Business Intelligence and Business Analytics manifest as distinct yet interrelated elements in the tourism sector?

Our results bridge the gap in the extant literature by illuminating BI’s role in technological and operational integration and BA’s focus on predictive analytics and the enhancement of customer interaction in the tourism sector. Our main findings associate BI’s tools with technological and operational integration and BA’s with predictive analytics and enhanced customer interaction. Our comparative study highlights fundamental differences in how BI and BA approach strategic innovation and operational issues.

The remainder of this paper is organized as follows. In the next section, we thoroughly describe the core concepts of BI and BA, which is followed by sizing the database available for our study. This involves conducting a bibliometric analysis using the SciMAT software. We then perform a purely qualitative analysis using the Gioia methodology. In our bibliometric research, we identify pertinent research clusters and, following previous studies, detect emerging trends within those clusters (Tanwar & Khindri, 2024 ). The strength of the Gioia methodology lies in its successive phases of exploration, which help to conceptualize the central and intersecting themes emerging from the contextually rich and profound aspects of the phenomena under examination. This process enables us to properly conceptualize the fundamentals related to the individual and synergistic functions of BI and BA.

Next, upon presenting the theoretical framework that encompasses both quantitative and qualitative approaches, the empirical section describes the analysis conducted using real data and outlines the significant findings. The results are then presented, analyzed, and discussed, followed by a review of the study’s limitations and an outline of future research directions.

Theoretical framework

In order to establish a robust context for understanding the impact of BI and BA within the tourism sector, which is particularly sensitive to information and data analysis dynamics, the initial step is to explore the convergence and divergence of these two concepts. BI and BA represent a paradigm shift towards data-driven decision-making that marks the evolution of the tourism industry. In this section, we review the theoretical framework that contains the definitions and applications of BI and BA and thus lays a solid foundation for understanding these two constructs in the context of tourism.

A review of the literature suggests that BI has diverse definitions, and a consensus definition has not yet been established (Chee et al., 2009 ). Yet, a common thread across these definitions is the perception of BI as a system that assists decision makers in making informed choices about the business’ direction. In addition, BI is viewed as a system encompassing technologies, tools, and software designed to gather data, automate processes, and generate information transformed into knowledge for making quality decisions (Nyanga et al., 2020 ).

According to the definition proposed by Shende and Panneerselvam ( 2018 ), BI refers to the application of technologies and practices for collecting, integrating, analyzing and presenting business information. Such transformation of data into knowledge within BI supports enhanced decision making. BI systems, or data-driven Decision Support Systems (DSS), favor gaining business advantages with robust BI tools over decisions based on mere intuition. What is essential here is establishing a data- or fact-based decision-making framework via a strong computer system that instills confidence in any decision that is being made. According to Shende and Panneerselvam ( 2018 ), BI has six major components (see Table  1 ).

Furthermore, Shende and Panneerselvam ( 2018 ) define BA techniques as the exploration of historical data from numerous sources using statistical and quantitative analysis, data mining, predictive modelling, among other approaches, in order to identify trends and understand information that can drive business change and support sustainable practices. In this sense, BA involves the use of statistical tools and technologies for pattern identification, variability analysis, relationship identification, and future insights prediction. Similarly to BI, BA also has six major components, which are listed in Table  2 .

Therefore, BI and BA are described as tools supporting process optimization in contemporary organizations. While both BI and BA enhance data-driven decision making and strategic planning, the two overlap in their shared use of data mining, a technique used to extract patterns and insights from data. In addition, both fields emphasize predictive analytics, which aims to forecast future trends based on historical data. Statistical analysis, which involves employing statistical methods to understand the data, is another common thread.

However, the two aspects where BI and BA diverge are their focus and application. The first divergence between BI and BA is observed when examining their specific components. On the one hand, BI primarily analyzes historical data to inform decision making and understand past performance (Chen et al., 2012 ; Vercellis, 2011 ); accordingly, BI uses OLAP to analyze data using advanced tools, exploring dimensions such as time or hierarchies. It also focuses on data mining and predictive analytics related to corporate performance, enables real-time distribution of metrics, relies on data warehouses for integration, and collects various types of data.

On the other hand, BA involves forward-looking use of data to forecast future trends and outcomes, thereby supporting strategic decisions (Shende & Panneerselvam, 2018 ). Furthermore, BA involves exploring historical data from various sources using statistical and quantitative analysis, data mining, and predictive modeling, with a particular emphasis on pattern identification, variability analysis, and a future-oriented approach to sustainable knowledge and practices. The shift from BI to BA is propelled by the growing volume and complexity of data and the need for real-time insights in an ever-evolving business landscape (Iovan & Ivanus, 2014 ).

The dual methodology approach

After establishing a clear distinction between the two key concepts under examination, BI and BA, in this section we outline the methodology we use to achieve the goals set forth in the present study. As noted in the introduction, we use a dual methodological approach: first, we conduct a bibliometric study that allows for quantification/qualification of scientific publications related to our subject and, second, we perform a purely qualitative analysis. Using such a mixed methodological approach, as argued by Matenda and Sibanda ( 2023 ) in a study on BRICS economies, makes it possible to gain a more comprehensive understanding of the studied phenomena by combining the numerical depth of quantitative methods and the contextual richness of qualitative analysis, which jointly provide a more holistic understanding of the research topic. In this way, both approaches complement each other by providing a dual perspective, and their fundamentals are described below.

The quantitative-qualitative approach: a bibliometric assessment with SciMAT

In this review, we systematically examine the corpus of literature on BI and BA in tourism, integrating a bibliometric analysis using the SciMAT software to map the intellectual structure and thematic evolution of the field. Using statistical methods to evaluate publication patterns and trends in this field of study provides an understanding of the evolution and impact of research and topics over time. As argued by Ribeiro-Navarrete et al. ( 2023 ), a bibliometric study employs bibliographic analysis, co-citation, and co-occurrence of keywords. This approach enables identifying relevant authors, leading journals, and key topics in the academic literature related to the phenomenon under study.

The SciMAT software tool, as described by Cobo et al. ( 2012 ), enables performing science mapping analysis within a longitudinal framework. This approach not only catalogues existing knowledge but also goes deeper by combining performance analysis with science mapping. This method is also instrumental in detecting and visualizing conceptual subdomains within the research field, be they specific themes or broader thematic areas. The resulting analysis hinges on co-word analysis within a longitudinal framework, a technique that uncovers the various themes addressed by the field over a specified period. The richness of this analysis is augmented by incorporating performance measurement. These measures fall into two distinct categories: quantitative and qualitative. While the former focus on assessing the productivity of the identified themes and thematic areas, the latter aim to gauge the (perceived) quality based on the bibliometric impact of these themes and thematic areas (Cobo et al., 2011 ).
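
To give a rough sense of what co-word analysis involves, the sketch below counts keyword co-occurrences across a few invented article keyword lists. SciMAT's actual processing (normalization, clustering, strategic diagrams, evolution maps) is far more elaborate, so this is only a conceptual illustration.

```python
from collections import Counter
from itertools import combinations

# Hypothetical author keywords for four articles.
articles = [
    {"tourism", "business intelligence", "dashboards"},
    {"tourism", "business analytics", "predictive analytics"},
    {"business intelligence", "data warehouse", "tourism"},
    {"business analytics", "predictive analytics", "customer engagement"},
]

co_occurrence = Counter()
for keywords in articles:
    for pair in combinations(sorted(keywords), 2):
        co_occurrence[pair] += 1

for pair, count in co_occurrence.most_common(5):
    print(pair, count)
```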

The tourism industry has a complex structure that consists of the accommodation sector, attractions sector, transport sector, travel organizers’ sector, and destination organization sector (Middleton & Hawkins, 1998 ); along with the above, there are also restaurant services, food and beverage activities, and various auxiliary services involved. All these sectors are interconnected, supporting and complementing one another; without any one of them, the tourism industry would not be complete (Ramayah et al., 2011 ; Xu, 2010 ). Accordingly, the main features of BI and BA applied to the tourism industry should include data analysis, reports, dashboards, data visualization, performance metrics, key performance indicators, predictive analytics, trend indicators, strategic planning tools, profitability analysis, benchmarking, budgeting, and forecasting (Ibrahim & Handayani, 2022 ). These tools are essential for decision making in the tourism sector, where the adoption of AI and Big Data analytics can provide a competitive advantage (Stroumpoulis et al., 2022 ). BI and BA also enable cross-process knowledge extraction and decision support in tourism destinations (Höpken et al., 2015 ) and data-driven management for new business in tourism (Ferreira & Pedrosa, 2022 ; Höpken et al., 2015 ).

Taking all this into account and for our specific research purposes, we selected the WoS, specifically the Web of Science Core Collection, which includes a collection of high-quality scientific journals within several large-scale databases. Another advantage of this database is that it furnishes the essential data required for conducting analysis in SciMAT, including keywords and abstracts. As highlighted by Shu et al. ( 2020 ), the WoS is renowned as one of the most pivotal databases globally, with its significance defined by its inclusion of excellent research spanning many countries. Similar arguments for using the WoS, based on its high performance and the breadth of results it yields across multiple disciplines, were made by Harzing and Alakangas ( 2016 ) and Vera-Baceta et al. ( 2019 ).

Our search within WoS was guided by specific criteria, with the terms “tourism,” “Business Intelligence,” and “Business Analytics” forming the core of the search and selection process. This initial search yielded a total of 77 documents. However, following a meticulous refinement and filtering of the database, a total of 50 articles were retained as the primary subject of analysis. The included results came from Social Sciences Citation Index (SSCI), Conference Proceedings Citation Index – Social Science & Humanities (CPCI-SSH), Book Citation Index – Science (BKCI-S), Book Citation Index – Social Sciences & Humanities (BKCI-SSH), and Emerging Sources Citation Index (ESCI). The included articles spanned the years 2000 to 2023, thereby offering a comprehensive view of the evolution of the field. In order to conduct a more nuanced evolutionary analysis, we divided this timeline into two distinct periods: (1) pre-pandemic (2000–2020), with 38 papers found and (2) post-pandemic (2021–2023), with a total of 12 articles. This splitting of the data was essential, as the application of data-driven analysis in tourism underwent a significant transformation in response to the pandemic.

More specifically, Zamyatina ( 2023 ) provides critical insights into this transformative shift, highlighting a notable change in consumer preferences. The pandemic period, as Zamyatina ( 2023 ) observed, witnessed a dramatic alteration in consumer priorities, with a newfound emphasis on intangible luxury. This reflected a broader understanding in which travel was increasingly considered an essential component of a healthy lifestyle. Chen ( 2021 ) corroborated this observation, noting that user-generated content during the pandemic predominantly focused on pandemic-related keywords. This shift embodies a profound change in public perceptions, notably marked by increased dissatisfaction among tourism service consumers due to quarantine restrictions and health test mandates.

In another relevant study, Ionescu et al. ( 2022 ) addressed the significant compression that the tourism sector experienced since the onset of the COVID-19 pandemic, which destabilized both tourist flows and the economic agents within the industry. As a solution to this situation, Ionescu et al. ( 2022 ) proposed a decision support model designed to aid in the recovery of tourism in Europe, with an emphasis on the growing need for data-driven tools within the industry. Rahim et al. ( 2021 ) discussed the profound impact of COVID-19 on tourism, noting that the pandemic adversely affected both demand and supply in various aspects. On exploring the increasing use of virtual tourism platforms, such as virtual reality (VR) and augmented reality (AR), as a response to the pandemic, Rahim et al. ( 2021 ) suggested that these technologies would come into wider use for data collection and analysis in the future, indicating a potential shift in how the tourism sector was likely to adapt to and overcome challenges.

The pure qualitative approach: insights from the Gioia methodology

Along with the bibliometric research, we complemented our methodology with a pure qualitative analysis of relevant publications. To this end, we used the Gioia methodology, a renowned qualitative approach in organizational research devised by Gioia et al. ( 2013 ). The core of this systematic qualitative approach is theory-building from empirical data. This method is particularly suited for exploring specific contexts, such as the intersection of BI and BA in tourism (Gioia et al., 2013 ). The Gioia methodology involves a dual-layered coding process, beginning with first-order coding to capture participants’ perspectives and progressing to second-order coding for broader, theoretical categorization (Gehman et al., 2018 ; Gioia et al., 2013 , 2022 ; Magnani & Gioia, 2023 ). To date, the Gioia methodology has been extensively used in the literature (Lacoste et al., 2022 ; Niittymies, 2020 ; Riviere et al., 2018 ; Visvizi et al., 2022 ), and many of these were reviewed by Magnani and Gioia ( 2023 ).

In the present study, we adapted the Gioia methodology for secondary data analysis, with a particular focus on previously collected and published scientific academic articles. This departure from conventional primary data collection methods, such as interviews or surveys, allowed us to include a wider range of studies for an extensive comparative analysis in the fields of BI and BA. This adaptation also enhanced the robustness and validity of our theoretical framework and helped us to better align our results with the findings previously reported by Taquette and Minayo ( 2017 ) and Mwita ( 2022 ), who highlighted the scientific validity and flexibility of using qualitative approaches with secondary data.

Overall, the Gioia methodology enables transitioning from concrete studies to abstract theoretical insights, enriching qualitative research. Key stages of this methodology as used in the present study are outlined below.

Data Collection : A distinctive aspect of how we adapted the Gioia methodology was our analysis of secondary data, which involved an exhaustive review and synthesis of the available literature. Instead of using conventional primary data collection methods, we thoroughly reviewed a wide range of published studies on the topic. This approach allowed for a broader exploration of information sources, aligning with the flexibility inherent in the Gioia methodology (Taquette & Minayo, 2017 ; Mwita, 2022 ).

Data Analysis : At this stage, which is the crux of the Gioia methodology, we systematically identified, coded, and analyzed qualitative data. Data analysis involved a detailed examination of texts to discover underlying patterns, themes, and dimensions (Gehman et al., 2018 ).

Conceptual Framework Development : Based on the insights derived from our data analysis, we created a detailed and data-driven conceptual framework. This framework was meant to aid in comprehending how BI and BA phenomena are constructed and perceived in the tourism sector (Mugizi, 2019 ).

Theory Generation : At this stage, our goal was to generate a rich, well-grounded theory. By interpreting the data within its conceptual framework, we aimed to produce a theory that would provide a deep understanding of the subject matter (Olbrich & Mueller, 2013 ; Tucker, 2016 ).
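To make the coding logic concrete, the following minimal Python sketch illustrates the kind of two-level data structure the Gioia methodology produces, rolling first-order concepts up into second-order themes and aggregate dimensions. The concepts, themes, and dimensions shown are hypothetical placeholders, not the actual codes generated in this study.

```python
from collections import defaultdict

# Hypothetical first-order concepts (informant-centric terms from the reviewed
# articles), each mapped to the second-order theme assigned during coding.
first_to_second = {
    "real-time dashboards for hotel occupancy": "Operational monitoring",
    "forecasting seasonal tourist demand": "Demand forecasting",
    "mining online reviews for satisfaction drivers": "Customer insight",
    "personalised package recommendations": "Customer insight",
}

# Second-order themes are then grouped into aggregate dimensions, the most
# abstract layer of the data structure (cf. Gioia et al., 2013).
second_to_aggregate = {
    "Operational monitoring": "Operational enhancement",
    "Demand forecasting": "Strategic decision making",
    "Customer insight": "Customer-centric innovation",
}

def build_data_structure(first_to_second, second_to_aggregate):
    """Roll the two coding layers up into aggregate dimensions."""
    structure = defaultdict(lambda: defaultdict(list))
    for concept, theme in first_to_second.items():
        dimension = second_to_aggregate[theme]
        structure[dimension][theme].append(concept)
    return structure

if __name__ == "__main__":
    for dimension, themes in build_data_structure(first_to_second, second_to_aggregate).items():
        print(dimension)
        for theme, concepts in themes.items():
            print(f"  {theme}: {concepts}")
```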

Our use of the Gioia methodology aligned with major principles of qualitative research, including purposive sampling and autonomy in sample size determination (Subedi, 2021). Our approach was also consistent with Tipton et al.’s (2014) principles of sample selection, reinforcing the importance of thoughtful sampling for representativeness.

Our approach facilitated an in-depth exploration of BI and BA in tourism, ensuring that the results are comprehensive, credible, and reflective of current academic research. It also incorporated insights from Dufour and Richard’s (2019) work, which underlined the importance of selecting analytical approaches appropriate to specific research contexts. More specifically, in a comparative study of the same dataset, Dufour and Richard (2019) used the Grounded Theory (GT) and General Inductive Approach (GIA) to highlight the strengths and limitations of each method, leaving the researcher responsible for selecting the most appropriate methodological framework. Reviewing the results of this comparison enhanced our methodology by providing alternative data analysis perspectives in tourism research.

Article selection and analysis process

The Gioia methodology integrates principles of sample selection in randomized experiments, emphasizing strategic and thoughtful sample selection in qualitative research. This approach guided our choice of 12 scholarly articles, providing an overview of BI and BA in the tourism industry and allowing for a detailed analysis of each article.

The selected articles were systematically identified from the WoS Core Collection database. Our decision to use the WoS was underpinned by the consideration that business scholars prefer the WoS and Scopus as their primary databases, while authors and academic institutions consider the journals, books, and conferences indexed in these two databases as “quality publications” (Rana et al., 2023).

Another important characteristic of the WoS is its multidisciplinary and international coverage (Vieira & Gomes, 2009). After a careful keyword search, a total of 25 papers were appraised and carefully read by the authors. We found that the literature sometimes conflates Artificial Intelligence (AI) with BI. After several rounds of discussion and re-reading, 12 scientific articles on BI and BA in tourism were selected as the most relevant for further analysis. The articles were chosen based on their relevance, currency, and presence in high-impact journals, while ensuring a broad spectrum of perspectives.

Criteria for article inclusion

The inclusion criteria centered on whether an article addressed six specific research questions; we also examined whether the paper provided definitions of BI/BA and discussed the influences, advantages, challenges, and contributions of BI/BA to tourism. Articles that did not meet these criteria were replaced with randomly drawn alternatives, thus ensuring methodological robustness and thematic consistency.

Article selection for this study aligned with the qualitative research criteria set by Laumann (2020), using keywords relevant to BI, BA, and tourism. Our purposive approach was aimed at finding articles that made meaningful contributions to understanding the intersection of BI and BA in tourism, ensuring comprehensive topic coverage. The process was designed to minimize selection bias, provide diverse temporal coverage, and ensure that each selected article substantially contributed to our understanding of BI and BA in the tourism sector. This approach underscored the integrity and validity of our research. The process itself included identifying, categorizing, and relating themes to our study’s research questions, using a systematic approach to ensure comprehensive theme extraction. The findings from the selected articles were then synthesized to provide a holistic understanding of the research topic. This involved integrating insights from different studies, identifying commonalities and differences, and linking them to the overarching research question.

Once six articles from each thematic area had been randomly chosen, their content was examined to determine whether they could address our main research question: In what ways do Business Intelligence and Business Analytics manifest as distinct yet interrelated elements in the tourism sector? To this end, a screening questionnaire was designed, and each candidate publication was assessed on how it addressed the questions listed below.

How is Business Intelligence (BI/BA) defined and used in this specific study within tourism?

What types of tourism business decisions are influenced by Business Intelligence (BI/BA) in this case?

What are the identified advantages and challenges of using Business Intelligence (BI/BA) in this context?

How does this study contribute to the general understanding of Business Intelligence (BI/BA) in tourism?

What practical and theoretical implications can be drawn from this study for the implementation of Business Intelligence (BI/BA) in tourism?

What are the specific Business Intelligence (BI/BA) tools used in this study, and how do they contribute to the presented results and conclusions?

Any selected article that did not address one or more of these questions was excluded from the dataset and randomly replaced by another publication, which was then screened as described above. The final sample comprised 12 articles spanning the period 2008–2023, sourced from different scientific journals. Tables 3 and 4 list the selected articles related to BI and BA.
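For illustration only, the following Python sketch mimics the screening-and-replacement rule described above: an article remains in the sample only if it addresses all six questions, and any article that fails is replaced by a randomly drawn candidate, which is then screened in the same way. The article representation and helper names are assumptions made for the sketch, not the authors’ actual tooling.

```python
import random

# The six screening dimensions, one per question in the questionnaire above.
SCREENING_QUESTIONS = [
    "definition_and_use", "decisions_influenced", "advantages_and_challenges",
    "contribution_to_understanding", "implications", "tools_and_results",
]

def answers_all_questions(article: dict) -> bool:
    """An article qualifies only if it addresses every screening question."""
    return all(article.get(q, False) for q in SCREENING_QUESTIONS)

def screen_sample(initial_sample: list, candidate_pool: list, rng=random) -> list:
    """Replace non-qualifying articles with randomly drawn, re-screened candidates."""
    pool = list(candidate_pool)
    final_sample = []
    for article in initial_sample:
        # Keep drawing replacements until a qualifying article is found
        # or the candidate pool is exhausted.
        while not answers_all_questions(article) and pool:
            article = pool.pop(rng.randrange(len(pool)))
        if answers_all_questions(article):
            final_sample.append(article)
    return final_sample
```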

The systematic approach outlined in this section ensures an empirical grounding and a theoretical framework that aptly reflect the complexity of BI and BA in the tourism sector. Using this methodology, we conducted an in-depth exploration of BI and BA, thus laying the groundwork for a more comprehensive understanding of the strategic, operational, and customer-centric implications of these two constructs.

In this section, we first report the results of the bibliometric study, followed by the findings obtained through the application of the Gioia methodology. To enhance the data analysis and the presentation of the results, an exhaustive examination of the findings derived from the Gioia methodology was conducted. Each identified theme was detailed comprehensively, using insights from the selected articles to highlight its significance in understanding the roles of BI and BA within the tourism industry.

The output included visual representations such as thematic maps and comparative tables for clarity. These visual aids were thoroughly interpreted and linked to our research objectives. A thematic map was created to demonstrate the interconnected aspects of BI and BA, while comparative tables were made to delineate their unique and shared applications in the tourism sector. This refined approach to data presentation aimed to elucidate the study’s primary findings, aligning them with the research objectives and thus providing a holistic and accessible understanding of how BI and BA influence the tourism industry.

In-depth bibliometric analysis findings

Despite the relatively modest size of the analyzed sample, we identified a considerable array of themes and concepts, thereby facilitating a thorough analysis using the SciMAT tool. Tables 5 and 6 present the journals in which two or more articles on the studied topic were identified.

Applying the SciMAT tool yielded a set of clusters, which can be understood as conglomerates of related scientific aspects. In co-word analysis, the clusters represent groups of textual information, that is, semantic or conceptual groups of topics treated by the research field. The detected clusters can therefore be used to quantify the research field by means of a performance analysis. Co-word analysis, a powerful technique for discovering and describing the interactions between different fields in scientific research (Cobo et al., 2011), allowed us to identify the main concepts treated by the field.

Figure 1 shows the results of our keyword analysis. The circles represent each sub-period, with the number of keywords in that sub-period indicated inside the circle (37 in the first period and 23 in the second). The arrow between consecutive sub-periods represents the number of keywords shared between them (20) and, in parentheses, the Stability Index (overlap fraction, 0.87). The upper incoming arrow represents the number of new keywords in the sub-period (2), and the upper outgoing arrow represents the keywords that are not present (i.e., discontinued) in the next sub-period (17) (Cobo et al., 2011). The significant overlap between the two sub-periods indicates that most concepts remain relevant across both periods.
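The overlap figures reported for Fig. 1 can be illustrated with a short sketch. The exact definitions follow Cobo et al. (2011) as implemented in SciMAT; here we assume, purely for illustration, that the stability index (overlap fraction) equals the number of shared keywords divided by the size of the smaller keyword set, which reproduces the reported 20/23 ≈ 0.87. Small differences between the derived counts and those in the figure may stem from SciMAT’s exact keyword-counting rules.

```python
def overlap_summary(n_period1: int, n_period2: int, n_shared: int) -> dict:
    """Summarise keyword continuity between two consecutive sub-periods."""
    return {
        "shared": n_shared,
        "discontinued": n_period1 - n_shared,  # present only in the first sub-period
        "new": n_period2 - n_shared,           # present only in the second sub-period
        # Assumed definition: shared keywords over the smaller keyword set.
        "stability_index": round(n_shared / min(n_period1, n_period2), 2),
    }

# Counts taken from the description of Fig. 1.
print(overlap_summary(n_period1=37, n_period2=23, n_shared=20))
```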

[Figure 1. Overlapping map. Source: Authors’ compilation based on data from the WoS Core Collection]

Figure  2 shows the results of co-word analysis where clusters of keywords (and their interconnections) were obtained. These clusters were considered as themes. Each research theme obtained in this process was characterized by two parameters: “density” (which can be understood as a measure of the theme’s development) and “centrality” (which can be understood as a measure of the importance of a theme in the development of the entire research field analyzed).

[Figure 2. Strategic diagram (Period 1)]

In Fig.  2 , the themes are placed in four groups according to the quadrant where they are located. Overall, a strategic diagram in a two-dimensional space can be built by plotting themes according to their centrality and density rank values. In the present study, strategic diagrams are shown in Figs.  2 and 3 for each period, respectively; the numbers within the circles correspond to the number of articles included in the analysis.

[Figure 3. Big Data–Networks graph (Period 1). Source: Authors’ compilation based on data from the WoS Core Collection]

Within a theme, the keywords and their interconnections draw a network graph, called a thematic network. Each thematic network is labelled using the name of the most significant keyword in the associated theme (usually identified by the most central keyword of the theme). Figures  4 and 5 (Period 1) and 6 (Period 2) show the network graphs constructed with the central themes for each period.
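As an illustration of how a thematic network is labelled by its most central keyword, the sketch below builds a toy co-word network and selects the keyword with the highest degree centrality as the theme label. The co-occurrence pairs and weights are hypothetical and do not reproduce the networks computed by SciMAT in this study.

```python
import networkx as nx

# Hypothetical keyword co-occurrence counts for one theme.
co_occurrences = [
    ("big data", "decision making", 5),
    ("big data", "hospitality", 4),
    ("big data", "smart destination", 3),
    ("decision making", "hospitality", 2),
]

g = nx.Graph()
for kw1, kw2, weight in co_occurrences:
    g.add_edge(kw1, kw2, weight=weight)

# Label the theme with the most central keyword of the network.
centrality = nx.degree_centrality(g)
theme_label = max(centrality, key=centrality.get)
print(f"Theme label: {theme_label}")
```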

[Figure 4. Technologies–Networks graph (Period 1). Source: Authors’ compilation based on data from the WoS Core Collection]

[Figure 5. Strategic diagram (Period 2). Source: Authors’ compilation based on data from the WoS Core Collection]

Coming back to Fig.  2 , following Cobo et al. ( 2011 ), the themes in the upper-right quadrant are both well developed and important for the structuring of a research field. These are generally known as the motor-themes of the specialty, as they present strong centrality and high density. The placement of themes in this quadrant implies that the corresponding themes are related externally to concepts applicable to other themes that are conceptually closely related. Themes in the upper-left quadrant have well developed internal ties, but unimportant external ties; therefore, they are of only marginal importance for the field. These themes are very specialized and peripheral in character. Themes in the lower-left quadrant are both weakly developed and marginal. The themes of this quadrant have low density and low centrality, mainly representing either emerging or disappearing themes. In the lower-right quadrant, we find themes that can be considered foundational and cross-cutting due to their low density and high centrality.
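The quadrant logic described above can be sketched as follows: each theme is placed in the strategic diagram by comparing its centrality and density against the median values of the theme set. The theme names and values below are illustrative only and are not the measures computed by SciMAT for our dataset.

```python
from statistics import median

# Hypothetical centrality/density values for a handful of themes.
themes = {
    "Big Data": {"centrality": 0.9, "density": 0.8},
    "Online Reviews": {"centrality": 0.4, "density": 0.7},
    "Culture": {"centrality": 0.8, "density": 0.2},
    "Smart Destination": {"centrality": 0.3, "density": 0.3},
}

def classify(themes: dict) -> dict:
    """Assign each theme to a strategic-diagram quadrant relative to the medians."""
    c_mid = median(t["centrality"] for t in themes.values())
    d_mid = median(t["density"] for t in themes.values())
    labels = {}
    for name, t in themes.items():
        high_c, high_d = t["centrality"] >= c_mid, t["density"] >= d_mid
        if high_c and high_d:
            labels[name] = "motor theme (upper right)"
        elif high_d:
            labels[name] = "specialised / peripheral theme (upper left)"
        elif high_c:
            labels[name] = "basic / cross-cutting theme (lower right)"
        else:
            labels[name] = "emerging or declining theme (lower left)"
    return labels

for name, label in classify(themes).items():
    print(f"{name}: {label}")
```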

In Period 1 (2000–2020), Big Data emerged as a central, well-developed theme (Fig.  2 ), signifying its rise as a key instrument in decision making and business strategy within the tourism industry (Fig.  4 ). The graph depicts “Big Data” as intricately linked to both BI and BA, signifying its integral role in the development and application of these disciplines in tourism. The observed connections between “Big Data” and other thematic areas such as “Hospitality”, “Decision Making”, and “Smart Destination” imply that the analysis and use of extensive data sets are essential for informed decision-making processes and the enhancement of smart destinations in the hospitality industry. The use of technologies also appears to be a secondary central theme (see Fig.  1 ), increasingly standing out for its connections to innovation, use of information systems, assessment of consumer experiences and achievement of smart cities (see Fig.  4 ). Online Reviews also emerged as a relevant theme, which may point to the growing impact of user opinions. As a cross-cutting theme, the influence of culture in the tourism industry remains consistent.

In Period 2 (2021–2023), the results shown in the strategic diagram (Fig.  3 ) indicate that Decision Support Systems gained prominence, exhibiting an increase in centrality and suggesting a greater reliance on analytical tools for making strategic decisions. Furthermore, Business Intelligence maintained its position as an established theme, reflecting its ongoing impact on deriving actionable insights from large volumes of data. The interconnection between Decision-Support Systems (Fig.  6 ) and other concepts such as Hospitality, Framework, Experiences and Tourism Industry highlights the integration of advanced analytics across all facets of tourism.

[Figure 6. Decision-support-systems–Networks graph (Period 2). Source: Authors’ compilation based on data from the WoS Core Collection]

We also constructed a longitudinal theme map by grouping the two periods (Fig. 7). Continuous lines in the map represent a conceptual link, while broken lines link themes that share keywords other than their names. Finally, the diameter of each sphere corresponds to the number of documents retrieved for the theme.

[Figure 7. Longitudinal theme map. Source: Authors’ compilation based on data from the WoS Core Collection]

Figure 7 suggests that, with the transition from Period 1 to Period 2, there was a shift from a focus on Big Data and technologies towards a greater emphasis on the practical application of these data through decision support systems and BI strategies. The increasing complexity of the networks and the emergence of new nodes depict a tourism industry in constant adaptation to the innovations and demands of a more connected and data-driven era. Indeed, tourism is increasingly informed by personalized experiences and optimized management, guided by the intelligence gleaned from the analysis of complex data and diverse sources of information, including online reviews and cultural factors.

The results of our bibliometric literature review revealed the complexity of BI and BA within the tourism industry, marking out their characteristics and the areas where they converge and complement each other. In reviewing the current body of literature, we observed a shortage of research on the subtle roles of BI and BA in tourism. Seeking to bridge this gap, we thoroughly analyzed 12 key articles on BI and BA in tourism. The results of our Gioia analysis are discussed in the next section.

Analyzing BI in tourism with the Gioia methodology

Our qualitative analysis of the data using the Gioia methodology revealed a modulation in strategic decision-making processes within the tourism sector attributable to BI. The results indicated a profound impact of BI on strategic, operational, and competitive dimensions of tourism. These findings highlighted how BI facilitates a more dynamic approach to demand forecasting and decision making that accommodates the rapid changes in the tourism environment. Moreover, the results showed how BI promotes operational improvements, placing a premium on customer-centric strategies to enhance the tourism experience. The integrative function of BI emerged as a key driver for knowledge management, fostering competitive practices well aligned with global sustainability goals. The visual representation in Fig.  8 and accompanying analysis provide further detail on how BI acts as a linchpin to drive end-to-end growth, equipping the travel industry with the tools for strategic foresight, operational optimization, and a sustainable future.

[Figure 8. Data structure for BI obtained using the Gioia methodology (Gioia et al., 2013). Source: Authors’ compilation from selected articles]

Figure  8 shows the data structure as the result of the application of the Gioia methodology to BI in tourism and demonstrates BI’s substantial impact on strategic planning, operational enhancements, and sustainable competitive practices.

Based on these findings, it can be inferred that BI is instrumental in refining tourism demand forecasts and shaping adaptable decision-making processes. BI streamlines operations, placing customer preferences at the forefront, thus optimizing the overall tourist event experience. Furthermore, BI’s integrative capabilities contribute to knowledge management, aiding the tourism industry’s progression towards more responsible and sustainable practices. In summary, our qualitative results allow us to conclude that BI is a pivotal element in steering the tourism industry towards comprehensive growth, underlining its critical role in operational optimization, strategic foresight and sustainability.

Analyzing BA in tourism with the Gioia methodology

Our qualitative analysis on the influence of BA on decision-making in the tourism sector using the Gioia methodology was somewhat more extensive due to the increased complexity of the articles included in the sample. The results of this analysis unraveled the intricate web of applications and implications of BA in the tourism sector (see Fig.  9 ). The identified aggregate dimensions provided a multifaceted panorama of how BA reshapes tourism.

[Figure 9. Data structure for BA obtained using the Gioia methodology (Gioia et al., 2013). Source: Authors’ compilation from selected articles]

The strategic application of BA is the cornerstone supporting innovation and addressing challenges within the sector. Adapting BA strategically improves decision making and paves the way for significant advancements and effective management of challenges. This strategic foundation drives innovation in customer engagement, which emerges as a key outcome.

By leveraging data to enhance interactions and experiences, BA enables a more personalized and dynamic approach to customers. Furthermore, the path to innovation is marked by both challenges and opportunities in data use; these critical elements directly influence how tourism organizations strategically apply BA and make progress in customer engagement. The successful management of data privacy and the development of specialized skills are paramount to the efficacy of BA. The culmination of these dimensions is observed in the evolution and continuous improvement of tourism services: the successful integration of BA strategies, coupled with innovation in customer engagement and the overcoming of challenges, leads to a tangible transformation in the services offered, reflecting the vital role of BA in the sector’s ongoing innovation. Together, these intertwined dimensions highlight a narrative of progress and adaptation in tourism, in which BA acts as a catalyst for strategic change, customer-centered innovation, the overcoming of challenges, and service evolution. Taken together, the results of our qualitative analysis of BA using the Gioia methodology reflect the significant and multifaceted influence of BA in tourism, paving the way for greater personalization, enhanced engagement, and more informed decision making within a constantly evolving and adapting tourism sector.

In recent years, surpassing their status as mere technological tools, BI and BA have emerged as strategic assets. In the present study, we conducted an in-depth exploration of the intersection of BI and BA within the tourism sector, and the results of our analysis contribute to the current understanding of distinct yet complementary roles of BI and BA in enhancing strategic orientation, customer engagement, operational efficiency and sustainability within the tourism sector.

Our results are largely consistent with the findings previously reported in the literature on the use of digital technologies for efficiency, innovation, and sustainability. For instance, similarly to our conclusions, Lyulyov et al.’s (2024) study on the integration of e-governance and e-business in sustainable development argued that enhancing efficiency and fostering innovation are key areas that need to be pursued. Both e-governance and e-business leverage digital technologies to optimize processes, improve decision making, and promote sustainable practices. This synergy is evident in their alignment with global sustainability goals and in their contribution to a more informed, efficient, and customer-centric approach in their respective sectors.

Furthermore, in our analysis of BI and BA in tourism using the SciMAT tool, we found a significant change between publications from before 2020 and those from 2020 onwards, reflecting the tourism sector’s adaptation to digitalization. This result is broadly similar to Ribeiro-Navarrete et al.’s (2023) observations on the evolution of academic production, with a significant difference between pre-pandemic and post-pandemic publications in the literature on the digitalization of cooperatives.

Applying BI and BA in the tourism sector opens new avenues to enhance service quality and customer interaction. The proposed perspectives suggest a more tailored approach, where BI and BA are used not just as tools for data processing and analysis, but rather become integral to creating a dynamic, responsive, and customer-centric tourism experience. By integrating these technologies, tourism businesses can gain deeper insights into customer preferences and behaviors, allowing for a more personalized service delivery and a more engaging customer journey. Furthermore, the use of BI and BA can lead to the development of innovative strategies to attract and retain customers, offering a competitive edge in an increasingly digital marketplace. Therefore, the implementation of these technologies should be viewed not just as a technical upgrade, but as a strategic move towards a more informed, agile, and customer-oriented tourism industry.

Furthermore, if we compare the results of the present study with Saura et al.’s (2023b) study on advances in digital technology, data-driven decision making, the combination of methodologies, and the improvement of interaction and customer satisfaction, we find that, although the central theme in both studies is the use of digital technologies and data analysis to improve services, our article focuses specifically on the tourism sector, with particular emphasis on the roles of BI and BA, while Saura et al. (2023b) take a broader perspective on digital service models across industries. In addition, the present paper concentrates on distinguishing the roles and impacts of BI and BA in tourism, whereas Saura et al. (2023b) analyze the broader application of digital technologies to improving the customer experience in services. Finally, the present study incorporates and details, in a novel way, the use of the Gioia methodology for qualitative analysis applied to BI and BA in the tourism sector.

Our results reveal that BI is focused on technological integration and operational enhancement and is thus concerned with the systematic merging of various technological tools and platforms to streamline data collection, analysis, and reporting. This integration, as also demonstrated in several previous studies (e.g., Khan & Quadri, 2012; Viitanen & Pirttimaki, 2006), involves harmonizing disparate systems, including but not limited to customer relationship management (CRM) software, booking engines, and social media analytics tools, in order to create a cohesive BI framework. In turn, BA excels in predictive analysis and direct customer engagement through data, using the insights gained from BI to improve the efficiency and effectiveness of operational processes in tourism. As also indicated in previous research (DuttaRoy & DuttaRoy, 2016; Vahn, 2014), this could include optimizing resource allocation, enhancing customer service, tailoring marketing strategies based on customer data, and improving overall business performance. Accordingly, our findings reinforce the extant literature while providing new insights into the strategic application of BI and BA in tourism.

Specifically, we identified that strategic orientation in both fields emphasizes the importance of long-term development and market adaptation, with BA focusing more on strategic decision-making and innovation. This aligns with previous studies emphasizing the importance of strategic planning in the evolution of tourism (e.g., Vukotić & Vojnović, 2016 ). With regard to customer engagement, BA stands out for its ability to personalize customer experiences through detailed data analysis, whereas BI focuses on optimizing systems that influence customer service—an approach also observed in previous research (Shende & Panneerselvam, 2018 ).

Regarding operational and service enhancement, BI and BA share the common goal of improving service quality, albeit with a different focus: while BI focuses on operational processes, BA prioritizes data-driven service development. This reflects trends noted in previous studies on the importance of operational efficiency and service innovation in tourism (Hollebeek & Rather, 2019 ; Jiménez-Zarco et al., 2011 ).

Furthermore, sustainability emerges as a critical issue in the current industry, with BI facilitating sustainable practices through effective knowledge management. While none of the papers in our dataset refers directly to sustainability in conjunction with BA, our results may suggest that BA supports more dynamic and innovative approaches to sustainability. Very few other publications are currently available in this context; however, some articles support these conclusions, such as the few publications suggesting that the vast amount of data generated on social media by tourists regarding their travel experiences could serve as a valuable source of open innovation (Saura et al., 2023a). For instance, the use of social Big Data might facilitate innovation processes supporting the development of sustainable tourism experiences in a destination (Del Vecchio et al., 2018b).

Overall, an increasing number of publications point to the relevance of advanced data analysis technologies for understanding and responding to complex challenges, whether they study the economic impacts of extreme weather phenomena, as in Saura et al. (2023c), or the evolution of practices in the tourism sector, as we did in the present study. In both cases, the ability to process and analyze large volumes of data becomes an essential tool for innovation and strategic adaptation in an ever-changing world.

Conclusions

In the present study, we used a dual research methodology to explore how BI and BA serve as distinct yet interconnected facets in the tourism industry. The major research question addressed in this study was as follows: In what ways do Business Intelligence and Business Analytics manifest as distinct yet interrelated elements in the tourism sector? Through an in-depth analysis of 12 scientific articles extracted from the WoS Core Collection database, significant insights were garnered. The selected papers were subjected to a comprehensive bibliometric analysis that was initially conducted to determine the current landscape of the field. The findings presented here offer a deep understanding of the roles and interplay between BI and BA in tourism.

While sharing common goals, BI and BA differ in their approaches to those goals: BI integrates and optimizes systems for improved efficiency and decision support, whereas BA is more dynamic, focusing on leveraging data for direct customer engagement and innovative service transformation. These results highlight a complementary relationship in which BI provides a robust infrastructure and BA offers agility and responsiveness in the data-driven landscape of tourism. The selected methodological framework allowed us to draw upon a broad and diverse range of existing academic insights, enhancing the understanding of the current state of BI and BA in tourism.

The results of the present study advance the theoretical framework of tourism management by integrating and differentiating the roles of BI and BA. Our findings also add depth to existing management theories by showcasing how data-driven approaches can be applied to improve decision making, strategic planning and customer relations in tourism.

Furthermore, in the present study we addressed a gap in the literature, namely the scarcity of comparative analyses of BI and BA within the tourism sector. By exploring the unique and combined effects of BI and BA, we highlighted areas previously unexamined in academic research. Accordingly, our findings provide a more holistic understanding of how data-driven strategies shape tourism industry practices. Viewed from this perspective, several methodological aspects should be noted. Regarding data-driven inquiry, we used an inductive approach, which allowed themes and theories to emerge naturally from the data. With regard to narrative analysis, our data were organized into a coherent narrative, transitioning from raw data to more abstract themes and theoretical constructs. In terms of systematic thematic coding, several initial themes were identified directly from participant terms and then developed into broader themes. The methodology also supported detailed descriptions of phenomena, capturing the complexity of the qualitative data. In addition, our analysis was rooted in the perspectives of the original scientific studies: before moving to researcher-driven interpretations, we engaged in reflexivity, critically examining our role and potential biases in the data analysis process. Visual models were extracted to illustrate the relationships between concepts and themes. Finally, this approach facilitated the development of new theoretical insights by organizing the data into themes and dimensions.

Through a detailed exploration of the role of BI/BA in tourism decision making using the Gioia methodology, we identified key themes, which are presented in the Results section with visual aids such as thematic maps and comparative tables for clarity. These tools directly linked our findings to our research objectives. In our analysis of BI, we observed its notable influence on strategic and operational decision making in the tourism sector. According to our results, BI enhances demand forecasting and adaptability, especially in response to rapid changes in the tourism environment. The data also revealed the important role of BI in promoting operational improvements and customer-centric strategies, which are essential for enhancing the tourism experience. Of note, BI’s integrative function emerged as a key factor in knowledge management, supporting sustainable and competitive practices within the industry.

Similarly, the results of our analysis of BA highlighted its strategic application in fostering innovation and effectively managing challenges in the tourism sector. Indeed, BA enables a more personalized approach to customer interaction, leveraging data to improve customer experiences. Identifying challenges and opportunities in data use proved critical in influencing BA’s strategic application in tourism organizations. Accordingly, the successful integration of BA strategies results in a continuous service improvement, thereby underscoring BA’s vital role in the ongoing innovation and transformation of tourism services.

Taken together, the results of the present study highlight the multifaceted and significant impact of BI and BA in shaping the strategic, operational, and customer-centric aspects of the tourism industry.

Practical implications

A major distinction between the use of BI and BA tools in the tourism sector concerns their different focus with respect to management and the organization’s strategy. To start with, BI helps to understand past performance and manage present performance based on concrete facts; accordingly, using BI, one can better understand past trends and obtain indicators such as sales performance, customer behaviors, hotel occupancy, and so forth. These indicators can meaningfully inform decision making to optimize hotel operations or improve marketing. The tools used within BI typically rely on structured data and software for creating dashboards, reports, and data visualizations. In turn, BA uses both structured and semi-structured data, such as customer reviews and social media comments, with more emphasis on complex analyses, be they predictive or prescriptive. BA is essential for anticipating changes in the market and for strategic planning; for instance, it can predict demand for certain tourist destinations or market trends, thereby facilitating proactive and strategic decisions. BA also incorporates advanced tools such as statistical analysis, data mining, and machine learning.
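To illustrate this contrast in a simplified way, the sketch below computes a BI-style descriptive indicator (summarizing past occupancy) and a BA-style predictive step (a simple trend forecast for the coming season). The occupancy figures are invented for illustration and do not come from the reviewed studies; a plain least-squares trend stands in for the richer statistical, data-mining, and machine-learning tools mentioned above.

```python
import numpy as np

# Hypothetical July occupancy rates (%) for a hotel over the past four seasons.
years = np.array([2020, 2021, 2022, 2023], dtype=float)
july_occupancy = np.array([78.0, 83.0, 87.0, 90.0])

# BI-style descriptive reporting: summarise what already happened.
print(f"Mean July occupancy 2020-2023: {july_occupancy.mean():.1f}%")
print(f"Year-on-year change 2022->2023: {july_occupancy[-1] - july_occupancy[-2]:+.1f} pp")

# BA-style predictive step: fit a simple linear trend and forecast next July.
slope, intercept = np.polyfit(years, july_occupancy, deg=1)
forecast_2024 = slope * 2024 + intercept
print(f"Forecast July 2024 occupancy: {forecast_2024:.1f}%")
```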

Hence, for industry practitioners, an integration of BI and BA offers a comprehensive approach: BI for foundational data management and technological infrastructure, on the one hand, and BA for dynamic, data-driven decision making, on the other. Tourism businesses can leverage BI for operational efficiencies and BA for customer-centric strategies and service innovation. Our findings also suggest that, in order to fully leverage BI and BA for effective tourism management, investment in technological infrastructure is needed. In this respect, our findings offer a strategic roadmap for tourism professionals to integrate BI and BA effectively, providing a guide for implementing these technologies to maximize service quality and innovation in customer strategies.

Furthermore, Gioia’s methodological diagram for BI in tourism reveals a structured approach through which BI empowers the tourism sector. It highlights how strategic integration and forecasting, driven by BI, are critical to adapting to the dynamic nature of tourism, with the integration of data from various sources so as to improve demand analysis and agility in decision making. Technological advancements facilitated by BI are a breakthrough in data integration and a means to promote operational excellence through customer-centric strategies, allowing tourism companies to stay competitive. The operational efficiency improvements highlighted in the model reflect the role of BI in streamlining processes, optimizing resource allocation, and overall enrichment of visitor experiences. BI’s contribution to sustainability and knowledge management indicates its potential to foster sustainable tourism practices, improving the overall competitiveness of the sector. Said differently, BI acts as a conduit for innovation, strategic foresight, and sustainable development within the tourism industry, which makes its applications far-reaching, beyond mere data analysis, and capable of re-shaping the very structure of tourism sector management solutions.

Likewise, the concept diagram resulting from applying the Gioia methodology to BA in the tourism sector clarifies the channels through which BA influences management solutions within the sector. Specifically, it shows how BA facilitates strategic decision making, enabling predictive modelling and demand forecasting that support robust marketing strategies and service personalization. Data gleaned from online reviews and comments are key to refining customer satisfaction analysis, thus leading to improvements in service quality. In addition, the diagram indicates the importance of addressing data privacy and the need for expertise in handling data in order to use and manage Big Data effectively. Collectively, these pathways contribute to the evolution of tourism services through data analytics, which manifests itself in more personalized service offerings and recommendation systems. This progression shows BA both as an analytical toolset and as an integral component in creating a more responsive and data-driven tourism industry that prioritizes customer engagement and operational excellence.

Theoretical implications

The present study contributes to the theoretical understanding of how BI and BA intersect and complement each other in the tourism sector, offering novel insights into their distinct yet synergistic roles. Our results broaden the scope of tourism management theories to include data-driven approaches for both operational and strategic planning. A further contribution is that, by clearly delineating the distinct roles and applications of BI and BA, the present study extends existing theoretical models of tourism management.

Yet another novel aspect of this study is the innovative use of secondary data from scientific articles in the Web of Science database, applied through the Gioia methodology. This approach extends the traditional use of this methodology by setting a precedent of using extensive academic databases to synthesize and analyze existing knowledge.

Collectively, our findings emphasize the importance of BI and BA in strategic orientation, customer engagement, and sustainability, thereby enriching the theoretical framework within which tourism management operates. Furthermore, our methodological innovation of data use adds a new dimension to academic research, offering a robust, comprehensive approach to understanding complex interdisciplinary phenomena in the tourism industry.

As argued by Saura et al. (2024), the adaptability and flexibility of small- and medium-sized enterprises, coupled with technology, enable them to respond quickly to unforeseen challenges, emphasizing the importance of cybersecurity, government support, and digital tools in improving business processes. Considering these conclusions in the context of the present study reveals a broader theoretical implication: in the modern business landscape, regardless of the specific industry, the effective use of technology and data analysis is indispensable for managing complexity and implementing innovation. This translates into the need for the continuous adaptation and integration of digital tools into business strategies to improve efficiency, customer engagement, and overall resilience. These insights suggest a growing reliance on data-driven decision making and technological advancement across sectors, reshaping theoretical frameworks and business practices alike.

Similar theoretical implications concerning the key role of advanced data analytics and technology in revolutionizing business strategies were drawn by Saura et al. (2024) in a study on data-mining analytics of Twitter-based user-generated content in operations management (OM), suggesting a transformative impact of digital innovation in this field. These conclusions resonate with the present study’s focus on the synergistic roles of BI and BA in improving tourism management through data-driven approaches. Both studies highlight how new digital technologies can revolutionize operational and strategic planning, from automation and forecasting to customer engagement and sustainability in tourism. The use of various data sources, such as Twitter UGC and academic databases, also reflects a broader business trend: the imperative to use technology and data analytics to manage complexity and drive innovation across industries. This convergence suggests a shift towards data-driven decision making and technological advancement, reshaping theoretical frameworks and business practices.

Author information

Authors and affiliations.

Business Administration Department, Faculty of Economics and Business Sciences, Rey Juan Carlos University, Paseo de los Artilleros s/n. 28032, Vicálvaro, Madrid, Spain

Montserrat Jiménez-Partearroyo

Financial Economics and Accounting Department, Faculty of Economics and Business Sciences, Rey Juan Carlos University, Paseo de los Artilleros s/n. 28032, Vicálvaro, Madrid, Spain

Ana Medina-López

College of Healthcare Management and Economics, Gulf Medical University, Ajman, UAE

Sudhir Rana

Corresponding author

Correspondence to Ana Medina-López .

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Jiménez-Partearroyo, M., Medina-López, A. & Rana, S. Business intelligence and business analytics in tourism: insights through Gioia methodology. Int Entrep Manag J (2024). https://doi.org/10.1007/s11365-024-00973-7

Accepted: 22 March 2024

Published: 12 April 2024

DOI: https://doi.org/10.1007/s11365-024-00973-7

Keywords: Business intelligence, Business analytics, Tourism industry, Gioia methodology, Comparative analysis, Strategic innovation, Operational challenges, Customer engagement