
Quantitative vs. Qualitative Research

Quantitative research, in general, leverages statistics as the basis for making generalizations about the issue at hand. Qualitative research, on the other hand, relies on qualitative inquiry built on small data, context, and human judgment.

What’s quantitative data?

The characteristics of quantitative research contribute to methods that use statistics as the basis for making generalizations about something. These generalizations are constructed from data that is used to find patterns and averages and to test causal relationships.

Quantitative data has become extremely important, especially for improving business processes.

When dealing with quantitative data, it’s critical to have a selection pipeline that determines which data makes sense and can drive the business.

In other words, a lot of time will be spent building and curating the dataset, which will be used as the foundation to analyze the business.

Otherwise, the risk is relying on unreliable data, which only adds noise for the business.

Many tech companies, like Google, Amazon, Netflix, and Microsoft, leverage data in their business processes.

Some examples of the processes that quantitative data drives include:

  • Inventory management.
  • Orders’ fulfillment.
  • Product recommendation.
  • Indexing and ranking.
  • Spam detection.
  • A/B testing (sketched below).
  • Content recommendation.

In other words, there are tons of practical use cases for which data can be used to improve business processes.
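
Take A/B testing from the list above. At its core, it is a statistical comparison between two variants. Here is a minimal sketch of one common way to run the significance check, a two-proportion z-test on conversion rates; the traffic and conversion numbers are made up for illustration:

```python
# Minimal A/B-test sketch: two-sided, two-proportion z-test on conversion rates.
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical experiment: variant B converts at 5.8% vs. 5.0% for A,
# with 10,000 visitors in each arm.
p = ab_test_p_value(conv_a=500, n_a=10_000, conv_b=580, n_b=10_000)
print(f"p-value: {p:.3f}")  # ≈ 0.012: unlikely the lift is pure noise
```

A small p-value tells you the measured lift is probably real; whether the metric being lifted is the right thing to optimize remains a qualitative call.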

It’s also important to balance that with qualitative data.

What’s qualitative data?

Qualitative research is performed by businesses that acknowledge the human condition and want to learn more about it. Some of the key characteristics of qualitative research that enable practitioners to perform qualitative inquiry comprise small data, the absence of a definitive truth, the importance of context, and the researcher’s skills and areas of interest.

Qualitative data is extremely important as it can change the nature of our quantitative understanding.

For instance, while tech companies leverage quantitative data to improve their processes, much of it is informed by qualitative understanding, which makes that quantitative data far more valuable.

Indeed, the risk of quantitative data is over-generalization, ultimately leading to abstract scenarios that do not exist in the real world.

In addition, quantitative data is skewed toward things that can be measured, thus attributing far more importance to what can be easily measured than to what can’t.

Take the case of digital marketing campaigns, where you can easily track clicks, thus attributing more importance to platforms like Google Ads, which are easily tracked.

Yet, you might realize that people are clicking on your ad campaigns thanks to your strong brand, which can’t be directly measured.

Thus, only through a qualitative judgment of your business do you find out that branding drives your performance campaigns.
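
To see the skew numerically, consider this toy example, where a last-click dashboard credits paid ads with every tracked conversion, even though a share of those clicks was actually driven by the brand; all numbers, including the brand-driven share, are hypothetical:

```python
# Toy last-click attribution example: the dashboard over-credits the
# trackable channel because the brand effect is invisible to it.
tracked_ad_conversions = 700  # conversions the ads dashboard can measure
brand_driven_share = 0.40     # hypothetical: 40% of clickers already knew the brand

dashboard_credit = tracked_ad_conversions
brand_aware_credit = tracked_ad_conversions * (1 - brand_driven_share)
print(f"Dashboard credits ads with {dashboard_credit} conversions; "
      f"a brand-aware estimate is closer to {brand_aware_credit:.0f}.")
```

No dashboard reports brand_driven_share; estimating it takes qualitative work, such as customer surveys or simply knowing how people found you.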

This is one of the many examples of how qualitative data can inform quantitative data.

Other examples include:

  • Data selection.
  • Data curation.
  • Data cleaning.
  • Validation workflows (sketched below).
  • Understanding changing contexts in which quantitative data no longer makes sense.

All of the above help make quantitative data much more valuable by removing a substantial amount of noise.
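
As a concrete example, here is a minimal sketch of a validation workflow, where qualitative judgment is encoded as explicit rules that keep noisy records out of the dataset; the field names, rules, and thresholds are illustrative assumptions:

```python
# Sketch of a validation workflow: human judgment encoded as explicit rules.
RULES = [
    ("non-negative price", lambda r: r["price"] >= 0),
    ("plausible quantity", lambda r: 0 < r["quantity"] <= 10_000),
    ("known channel",      lambda r: r["channel"] in {"web", "store", "partner"}),
]

def validate(records):
    """Split records into clean rows and flagged (row, failed-rules) pairs."""
    clean, flagged = [], []
    for record in records:
        failures = [name for name, check in RULES if not check(record)]
        if failures:
            flagged.append((record, failures))  # route to human review
        else:
            clean.append(record)
    return clean, flagged

orders = [
    {"price": 19.9, "quantity": 2, "channel": "web"},
    {"price": -5.0, "quantity": 1, "channel": "web"},      # bad entry
    {"price": 9.9,  "quantity": 3, "channel": "unknown"},  # changed context
]
clean, flagged = validate(orders)
print(f"{len(clean)} clean, {len(flagged)} flagged for review")
```

The rules themselves are qualitative calls: someone who understands the business decides what a plausible order looks like, and revisits the rules when the context changes.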

Quantitative vs. Qualitative Research

Dealing with data is extremely hard.

It’s one of the hardest things in business.

And as most businesses now have a lot of data available, it’s easy to fall into the trap of misusing it.

For that, it’s critical to establish proper business processes, so that it’s clear to the internal team when to use quantitative data, qualitative data, or both.

Quantitative research, if used in the proper context, can be incredibly effective.

For instance, companies like Amazon have been using quantitative research to drastically improve – over time – their business processes, from inventory management to order fulfillment.

This is part of Jeff Bezos’ “Day One” Mindset.

In a letter to shareholders in 2016, Jeff Bezos addressed a topic he had been thinking about quite profoundly over the decades as he led Amazon: Day 1. As Jeff Bezos put it, “Day 2 is stasis. Followed by irrelevance. Followed by excruciating, painful decline. Followed by death. And that is why it is always Day 1.”

This forced Amazon to understand how to leverage both quantitative data (for business processes) and qualitative data (for discovery).

This is what Bezos labeled customer obsession.

In the Amazon Shareholders’ Letter for 2018, Jeff Bezos analyzed the Amazon business model and focused on a few key lessons that Amazon as a company has learned over the years. These lessons are fundamental for any entrepreneur, in a small or large organization, to understand the pitfalls to avoid when running a successful company.

As Jeff Bezos recounted in 2006:

Many of the important decisions we make at Amazon.com can be made with data. There is a right answer or a wrong answer, a better answer or a worse answer, and math tells us which is which. These are our favorite kinds of decisions.

As Jeff Bezos also highlighted at the time:

As our shareholders know, we have made a decision to continuously and significantly lower prices for customers year after year as our efficiency and scale make it possible.

Indeed, this was the core tenet of Amazon’s flywheel.

And Jeff Bezos also explained:

This is an example of a very important decision that cannot be made in a math-based way. In fact, when we lower prices, we go against the math that we can do, which always says that the smart move is to raise prices.

This is a critical point to understand, as Amazon has learned how to integrate quantitative and qualitative understanding within its business processes over the years.

Indeed, as Jeff Bezos further explained:

We have significant data related to price elasticity. With fair accuracy, we can predict that a price reduction of a certain percentage will result in an increase in units sold of a certain percentage. With rare exceptions, the volume increase in the short term is never enough to pay for the price decrease. 

In other words, by using statistical tools like price elasticity, you can have a short-term quantitative understanding.
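
To make that arithmetic concrete, here is a minimal sketch of the short-term math Bezos describes; the price, cost, and elasticity figures are made-up illustrative numbers, not Amazon data:

```python
# Toy illustration of why the short-term math "always says raise prices".
def short_term_profit_change(price, unit_cost, price_cut_pct, elasticity):
    """Contribution-profit change implied by a simple elasticity estimate."""
    new_price = price * (1 - price_cut_pct)
    volume_multiplier = 1 + (-elasticity) * price_cut_pct  # predicted volume gain
    old_margin = price - unit_cost
    new_margin = new_price - unit_cost
    return (new_margin * volume_multiplier) / old_margin - 1

# A 10% price cut, demand elasticity of -1.5, and thin retail margins:
change = short_term_profit_change(price=10.0, unit_cost=8.0,
                                  price_cut_pct=0.10, elasticity=-1.5)
print(f"Short-term profit change: {change:+.1%}")  # ≈ -42.5%
```

On these numbers, the price cut destroys about 42.5% of short-term contribution profit, which is exactly why the math says to raise prices; the model simply has no term for the long-term loyalty the lower price buys.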

But it tells you nothing about its potential long-term effects.

This is where you understand the limitations of statistical tools.

Jeff Bezos explained it extremely well:

However, our quantitative understanding of elasticity is short-term. We can estimate what a price reduction will do this week and this quarter. But we cannot numerically estimate the effect that consistently lowering prices will have on our business over five years or ten years or more. 

By understanding the drawbacks and limitations of quantitative methods, you know when human judgment needs to kick in.

The Importance of Human Judgment

Jeff Bezos articulated it incredibly well when he said, back in 2006:

Our judgment is that relentlessly returning efficiency improvements and scale economies to customers in the form of lower prices creates a virtuous cycle that leads over the long term to a much larger dollar amount of free cash flow, and thereby to a much more valuable Amazon.com.

This is a great point to emphasize, as most long-term decisions with second-order effects require a different thinking approach.

Indeed, most of Amazon’s successful long-term projects that really moved the needle were the result of human judgment, as Jeff Bezos further articulated:

We’ve made similar judgments around Free Super Saver Shipping and Amazon Prime, both of which are expensive in the short term and—we believe—important and valuable in the long term.

Balancing Data with Human Intuition

That is why it’s critical to know when human judgment needs to kick in.

This usually happens when we need to balance short-term decisions with long-term ones.

While quantitative data is extremely useful for telling us the short-term consequences of a decision, it might not tell us anything about long-term ones.

This is true for both positive and negative cases.

Imagine, for instance, a case where quantitative data tells you that a decision is sound in the short term, yet it carries a lot of hidden costs in the long run.

For instance, consider a company that invests all its marketing dollars in performance marketing campaigns without ever building a solid brand.

Quantitative judgment tells you that performance marketing campaigns work exceptionally well.

However, unless you also build a strong brand, you won’t survive in the long term.

Only a deep understanding of the business can help you deal with that.

And the opposite is true.

Imagine you don’t spend resources building your brand because you don’t see short-term results.

From an intuitive standpoint, you know that your competitive moat will depend on your ability to build a brand.

Yet if you were to follow the short-term understanding of your business through quantitative research alone, you would end up destroying it in the long run.

Second-Order Effects and Systems Thinking

Thus, to properly balance quantitative and qualitative data, and short-term with long-term thinking, you should practice second-order thinking.

Second-order thinking is a means of assessing the implications of our decisions by considering future consequences. It is a mental model that considers all future possibilities, encourages individuals to think outside the box so they can prepare for any eventuality, and discourages the tendency to default to the most obvious choice.

This implies asking what the potential long-term scenarios of an action are.

Thus, rather than playing the short-term game by asking only what the short-term consequences of an action are, you can look into its potential long-term implications.

For instance, in business, you might optimize for the bottom line in the short term.

Yet you might be giving up innovation, which, in turn, might make your business lose competitiveness in the long run.

So here, the core question is: am I optimizing for short-term profitability while giving up long-term competitiveness?

If that is the case, you want to keep focusing on profitability, yet allocate part of the company’s resources to innovation bets that can keep your business as relevant as possible in the future.
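
As a toy illustration of this trade-off, compare a business that distributes all of its profit with one that diverts 10% into innovation bets that keep it competitive; every parameter below is a made-up assumption, not a forecast:

```python
# Toy second-order-effects model: short-term profit maximization vs.
# allocating 10% of profit to innovation bets. All parameters are assumptions.
def cumulative_profit(years: int, start: float = 100.0,
                      innovation_share: float = 0.0,
                      drift_without: float = -0.05,
                      drift_with: float = 0.03) -> float:
    """Sum of distributable profit over `years` under a simple compounding drift."""
    drift = drift_with if innovation_share > 0 else drift_without
    profit, total = start, 0.0
    for _ in range(years):
        total += profit * (1 - innovation_share)  # what the bottom line keeps
        profit *= 1 + drift                       # competitiveness compounds
    return total

print(f"10 years, no bets:  {cumulative_profit(10):.0f}")   # ≈ 803
print(f"10 years, 10% bets: {cumulative_profit(10, innovation_share=0.10):.0f}")  # ≈ 1032
```

Under these assumptions, the “worse” short-term choice wins decisively over ten years. The specific numbers don’t matter; the point is that second-order effects only become visible when you extend the time horizon.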

Finding the balance between the short-term and the long-term is exceptionally challenging.

However, it is one of the most critical aspects of any business that wants to stay relevant in the long term.

Read Next: Qualitative Data, Quantitative Data.
