Beyond Web Analytics: 6 Crucial Qualitative Research Methods You Need in Your Toolbox

If you’re like most public sector web analysts, you probably agree that web analytics is a highly awesome tool in our toolbox…

But web analytics can’t tell us the whole story.

Because web analytics is great at telling us “what”, i.e. what happened on our website…

But it can’t tell us why – most importantly, why are some visitors able to get what they need from our site…while others struggle?

This is a crucial question, and one that can’t be answered with web analytics data alone.

That’s where qualitative research methods come in…

In this guide, I’m going to explain: 

  • What (online) qualitative research is
  • Why it’s so important
  • A 6-step online research framework you can use to continuously improve your content

What is Online Qualitative Research?

Let’s start with a simple definition of qualitative research first:

Qualitative Research

...is primarily exploratory research. It is used to gain an understanding of underlying reasons, opinions, and motivations …and involves collecting and analyzing non-numerical data.

And here’s a simple explanation of qualitative methods from Wikipedia:

Qualitative methods examine the why and how of decision making.

That’s a great distinction: when we do qualitative research for our web content, we want to find out why people made the decisions they did (why did they visit the website? Why did they follow the site path that they did?) 

So for our purposes, qualitative research tries to get at the reasons/opinions/motivations of our site visitors by collecting and analyzing non-numerical data (i.e. data that consists of words, not numbers).

Why is Online Qualitative Research So Important?

I gave part of this answer away in the intro…

Qualitative research is important because if we only used web analytics, we would only understand what is happening on our websites.

We wouldn’t be able to dig deeper and gain insight into why visitors are doing what they’re doing.

But there’s a related point that’s also hugely important: as the U.K.’s Government Digital Service (GDS) says:

“Find what works, not what’s popular.”

As the GDS says, in the context of public sector services, something “works” only if “…people who need it can use it to get the right outcome for them.”

Think about that for a moment...

As web analysts, we’re often asked to produce reports and dashboards with “volumetrics”, i.e. metrics like “pageviews” and “visits” that typically have the biggest numbers.

But in digital performance measurement, size doesn’t matter…

What our metrics should be telling us is what works and (equally important) what doesn’t work.

Quantitative research can do that quite well. By setting up “goals” in Google Analytics (“success events” in Adobe Analytics) we can tell how many people achieved specific outcomes (e.g. downloaded a form; completed an online registration).

But we can’t tell why some people were able to achieve those outcomes, while others were not.

That’s where qualitative research comes in – it can tell us the “why”. And online qualitative research can tell us the “why” quickly, cheaply, and at scale.

Since you’re reading this article, I’ll assume you’re already a convert. So let’s jump right into a proven framework for doing killer online research.

6-Step Qualitative Research Framework 

The CO (conversion optimization) agency ConversionXL has developed an evidence-based methodology for generating the insights necessary to increase conversion rates on any website.

You can think of this methodology as a “toolkit” of research methods. Each of these methods gives you deeper insight into the needs and behaviours of your users.

In his article explaining the ResearchXL methodology, the founder of ConversionXL stresses that conversion optimization is not simply a bunch of tactics – it’s a process. As they say:

"...amateurs focus on tactics (make the button bigger, write a better headline, give out coupons etc.) while pros have a process they follow."

Peep Laja, ConversionXL


And Peep hits on a crucial point when he says the most important thing about conversion optimization is “the discovery of what matters…If you figure it out, you know WHAT to optimize, and WHERE.”

Combining this insight with that of the U.K.’s GDS, we can say that:

Qualitative research helps us to discover what matters and why it matters.

In the sections below I describe each of the methods in the ResearchXL methodology – what they are, what they’re good for, and how you can use them to improve your digital presence:

(Note that I’m focused in this article on qualitative methods, but the ResearchXL methodology also includes quantitative methods, such as web analytics analysis. I’ll briefly cover those methods, as well.)

Heuristic Analysis

Heuristic Analysis

“Heuristic analysis” (or “heuristic evaluation”) is a fancy term for the process of reviewing a website (or content grouping, app, etc.) and comparing it against accepted usability principles, to determine what improvements could be made.

To put that in more pedestrian terms: as a web analyst (or usability designer, etc.), you know a good website when you see one. Are the web pages you’re reviewing laid out well? Could a visitor easily accomplish their goal(s)?

The Interaction Design Foundation has a good list of steps for conducting heuristic evaluation:

  1. Establish an appropriate list of heuristics. Start with Jakob Nielsen’s 10 usability heuristics and Ben Shneiderman’s ‘Eight Golden Rules of Interface Design’, and add criteria that are relevant to whatever you’re evaluating.

ConversionXL assesses each page of a website using these criteria:

  • Relevancy: does the page meet user expectations – both in terms of content and design? How can it match what they want even more?
  • Clarity: Is the content / offer on this page as clear as possible? How can we make it clearer, simpler?
  • Value: is it communicating value to the user? Can we do better? Can we increase user motivation?
  • Friction: what on this page is causing doubts, hesitations and uncertainties? What makes the process difficult? How can we simplify? We can’t eliminate friction entirely, we can only minimize it.
  • Distraction: what’s on the page that is not helping the user take action? Is anything unnecessarily drawing attention? If it’s not adding motivation, it’s adding friction – and thus it might be a good idea to get rid of it.
  2. Select your evaluators. Evaluators should be usability experts (and not your end users).
  3. Brief your evaluators. Evaluators should be told exactly what they’ll need to do during the evaluation, and all evaluators should be briefed using a standardized process.
  4. First evaluation phase. This typically takes two hours, during which the evaluators can freely use the website/content/app/etc. and identify the specific elements they want to evaluate.
  5. Second evaluation phase. In this pass, the evaluators apply the chosen heuristics to the specific elements they identified during the first phase.
  6. Record problems. Evaluators should be as specific as possible in recording problems, and all evaluators should follow the same process. (Peep Laja of ConversionXL calls this stage identifying “areas of interest”, as opposed to “problems”.)
  7. Debriefing session. During this session the evaluators come together to collate their findings and compile a complete list of problems (as well as suggested solutions).
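To give a feel for the debriefing step, here’s a minimal Python sketch of how findings from several evaluators can be tallied so the most frequently flagged issues rise to the top. The pages, heuristics, and counts below are hypothetical examples, not part of the IDF process itself:

```python
from collections import Counter

# Hypothetical findings recorded by three evaluators as
# (page, heuristic violated) pairs during their evaluation passes.
findings = [
    ("home", "Clarity"), ("home", "Clarity"), ("home", "Distraction"),
    ("apply-form", "Friction"), ("apply-form", "Friction"), ("apply-form", "Friction"),
]

# Count how often each (page, heuristic) pair was flagged: issues flagged
# independently by several evaluators are strong candidates to fix first.
tally = Counter(findings)
for (page, heuristic), count in tally.most_common():
    print(f"{page}: {heuristic} flagged {count} time(s)")
```

A simple tally like this keeps the debriefing focused on consensus problems rather than one evaluator’s pet peeves.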

Technical Analysis

Technical analysis involves fixing the technical “bugs” that are stopping your visitors from getting the outcomes they want.

Put another way: is there anything wrong with the way your content renders in different browsers or on different devices, and is this preventing your visitors from getting what they want? And does your content load quickly?

Technical analysis consists of these three steps:

  1. Cross-browser testing involves confirming that your content works on every major browser.
  2. Cross-device testing involves confirming that your content works on every major device.
  3. Speed analysis is exactly what it sounds like: how fast are your web pages loading?

Page load speed might sound like a trivial thing, but studies have shown that if your pages take more than three seconds to load, you could be losing nearly half of your visitors. (Think about that!)
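If you don’t have a dedicated speed tool handy, you can get a rough baseline with a few lines of Python. This is a sketch using only the standard library; note that it measures the time to fetch the HTML from the server, not full browser rendering time, which will be longer:

```python
import time
import urllib.request

def time_page_load(url: str, timeout: float = 10.0) -> float:
    """Return the seconds taken to fetch a page's HTML over the network.

    This captures server response + download time only; real users also
    wait for scripts, styles, and images to render in the browser.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()
    return time.perf_counter() - start

# Example usage (requires network access; the URL is just an illustration):
# seconds = time_page_load("https://example.com/")
# print(f"Fetched in {seconds:.2f}s -- flag anything near the 3-second mark")
```

For rendering-level metrics you’d want a browser-based tool, but even a crude fetch timer like this can catch a slow server before your visitors do.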

Web Analytics Analysis

Web analytics analysis is all about ensuring your web analytics is set up properly – especially set up to help you understand if your digital presence is contributing to your organization’s objectives.

(Note: if you haven’t set up a performance framework for your digital presence, and you haven’t established KPIs, now is the time to do it!)

Web analytics analysis involves three steps:

  1. Analytics “health check” is a series of analytics and instrumentation checks that answers the following questions:
  • Am I collecting all of the data I need?
  • Can I trust the data I’m collecting?
  • Is anything broken or tracking/reporting incorrectly? Why?

For a detailed tutorial on this step, check out ConversionXL’s Google Analytics Health Check.

(And check out the References & Further Reading section at the end of this article for tutorials on how to set up Google Analytics the right way.)

  2. Set up measurement of KPIs. Once you’ve got the basic configuration of your analytics software done, you need to ensure you’re tracking key performance indicators (KPIs).

As I describe in my guide on creating a performance measurement framework for your content, your KPIs should flow from (a) the objectives of your digital presence and (b) your content goals.

  3. Identify “leaks”. These are “leaks” in your goal funnels. Allow me to explain with an example:
  • Your content goals should include “effectiveness” goals. An example of an effectiveness goal is “Visitors are able to complete their task during their first session”.
  • If that’s your effectiveness goal, your KPI will have to specify exactly which task you’ll be measuring.
  • Completion of that specific task can be set up as a “Goal” in Google Analytics (or “Success Event” in Adobe Analytics).
  • Once that “Goal” is set up, Google Analytics will allow you to see the “funnel” (aka path) that visitors took to complete the goal. (Adobe Analytics treats this slightly differently – the software doesn’t include a goal funnel report, but there are many “pathing” reports that are comparable.)
  • So the last step in web analytics analysis is to identify “leaks” in the goal funnel (or “paths” in Adobe Analytics).

In other words: where are people “dropping out” of the funnels/paths?

Once you’ve identified those drop-off points, you can use some of the qualitative techniques outlined in this article to find out why your visitors are dropping out.
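To make the idea of a “leak” concrete, here’s a minimal Python sketch that computes the drop-off between consecutive funnel steps. The step names and counts are made up, standing in for an export from a goal-funnel report:

```python
# Hypothetical goal funnel: (step name, number of visitors who reached it).
funnel = [
    ("Landing page", 10000),
    ("Form started", 4000),
    ("Form submitted", 1200),
    ("Confirmation", 1100),
]

# Drop-off rate between each pair of consecutive steps:
# the share of visitors who reached a step but never reached the next one.
drop_offs = [
    (step, next_step, 1 - continued / entered)
    for (step, entered), (next_step, continued) in zip(funnel, funnel[1:])
]

for step, next_step, rate in drop_offs:
    print(f"{step} -> {next_step}: {rate:.0%} drop-off")
```

In this made-up example the biggest leaks are at the start of the form, which is exactly where you’d then aim your qualitative research.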

Mouse Tracking Analysis

Mouse tracking allows us to see where a visitor clicked on, and scrolled down, a web page:

  1. Click maps show where visitors have clicked on a web page. Click maps are great for seeing patterns in how visitors “interact” with your pages, and can give you insight into what elements on a page get visitors’ attention.

As an example, picture a click map of a SERP (search engine results page): the red colour shows where people clicked the most, with fewer clicks as you move away from the red into orange, then yellow, then green.

  2. Scroll maps show you how far visitors scrolled down a page: shaded colours indicate how far down the page various percentages of visitors scrolled.
  3. User session video replays are exactly as they sound: videos of individual visits that you can watch, to see exactly what a visitor did on a page, how long it took them to do it, etc.

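Under the hood, a scroll map is just a binning of per-visit scroll depths. Here’s a minimal Python sketch with hypothetical depth data:

```python
# For each visit, the deepest point reached on the page, as a % of page
# height. These ten values are hypothetical, standing in for tracked events.
max_scroll_depths = [100, 80, 75, 50, 45, 30, 25, 20, 10, 5]

def share_reaching(depth: int) -> float:
    """Share of visits that scrolled at least `depth`% of the way down."""
    reached = sum(1 for d in max_scroll_depths if d >= depth)
    return reached / len(max_scroll_depths)

# A scroll map colours the page according to these shares at each depth.
for depth in (25, 50, 75, 100):
    print(f"Reached {depth}% of the page: {share_reaching(depth):.0%} of visits")
```

If only a small share of visits ever reach the content that matters most, that’s a strong hint to move it further up the page.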

Qualitative Surveys

Qualitative surveying is an exploratory research method in which open-ended questions are used to probe a topic (as opposed to quantitative surveys, which typically use close-ended questions, and are meant to provide statistically-significant results for a large population).

Qualitative surveys can provide you with valuable data on:

  • Specific obstacles that are preventing your visitors from accomplishing their goals.
  • Ideas for changes to make to your site/content to remove those obstacles.
  • Exactly who your “customer” is (or, in the case of the public sector, who your primary target audience or client is).

The key point here is to understand the characteristics of your most successful visitor segments. In other words, what differentiates the people who are getting what they want from your web content, versus the people who are not?

You might think you know the characteristics of your visitors, but do you really know?


Here are the two major types of online qualitative surveys:

  • On-site surveys can be served to visitors as a “pop-up” that is triggered when the visitor has been on a page for a specific period of time, or is about to exit the page (or other criteria that you set).

The best uses of on-site surveys are to find out:

  • Why your visitors are on your site (i.e. what they want to accomplish); and
  • What problems they’re having accomplishing their goals on the site.

You can take that information and use it to remove the obstacles that visitors have told you are getting in their way.

  • Customer/client surveys can be used to understand the characteristics of clients who successfully use your site/content.

In both the private and public sector, we often have the contact details of clients (at least their email address). By contacting clients who achieved their goal(s) while using your site (e.g. applying for a program), you can learn how and why they were able to successfully use your site.

When formulating the questions to ask clients, remember that you want actionable data. So don’t ask “nice to know” questions – for every survey question you consider asking, put it through this filter:

“What will I do with the data from this question?” 

If you don’t have a good answer, eliminate that survey question.

Here are 4 types of questions to ask in a client survey:

  • Who they are. Ask questions that encourage the client to “self-identify” (very useful for putting together personas).
  • User intent. What specific problem were they trying to solve when they came to your site?
  • Navigation process. Did they consult other sites before coming to yours? Was there anything else they did/steps they followed before coming to your site?
  • Friction. Ask questions that get at “FUD” – any fears, uncertainties, and doubts they had before achieving their desired outcome on your site.

User/Usability Testing

User (aka usability) testing is about task completion. Can a site visitor achieve his or her specified goal efficiently, effectively, and with a high level of satisfaction?

Here’s another definition, by usability guru Steve Krug:

Steve Krug, Author of Don’t Make Me Think


“Usability really just means making sure that something works well: that a person of average (or even below average) ability and experience can use the thing – whether it’s a web site, remote control, or revolving door – for its intended purpose without getting hopelessly frustrated.”

So usability is about a person being able to use something to achieve an outcome or a goal.

The usability group in the U.K.’s Government Digital Service (yes, them again) has a great line related to this:

“Effectiveness for all users takes priority over efficiency or satisfaction for some users.”

That’s an interesting position to take: that it’s more important for a website to help all visitors achieve their goal, than it is to provide an efficient or satisfactory experience for some visitors. 

In other words: are we helping every visitor get done what they want to do?

There are a ton of ways you can do user testing. Here’s a great list from Nielsen Norman Group, with a quick description of the key methods:

  • Usability Lab Studies: participants are brought into a lab, one-on-one with a researcher, and given a set of scenarios that lead to completion of tasks and usage of a product or service
  • Ethnographic Field Studies: researchers meet with, and study, participants in their natural environment, where they would most likely encounter the product or service in question
  • Eyetracking: a device is configured to precisely measure where participants look as they perform tasks or interact naturally with websites, applications, etc.
  • Moderated Remote Usability Studies: usability studies conducted remotely with the use of tools such as screen-sharing software and remote control capabilities.
  • Unmoderated Remote Panel Studies: a panel of trained participants who have video recording and data collection software installed on their own personal devices uses a website or product while thinking aloud, having their experience recorded for immediate playback and analysis by the researcher or company.
  • Clickstream Analysis: analyzing the record of screens or pages that users click on and see, as they use a site or software product.
  • A/B Testing: a method of scientifically testing different designs on a site by randomly assigning groups of users to interact with each of the different designs, and measuring the effect of these assignments on user behavior.
  • Unmoderated UX Studies: an automated method that uses a specialized research tool to capture participant behaviors (through software installed on participant computers/browsers) and attitudes (through embedded survey questions), usually by giving participants goals or scenarios to accomplish with a site or prototype.
  • True-Intent Studies: a method that asks random site visitors what their goal or intention is upon entering the site, measures their subsequent behavior, and asks whether they were successful in achieving their goal upon exiting the site.
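As a side note on the A/B testing method above: whether a difference between two variants is real or just noise can be checked with a standard two-proportion z-test. Here’s a minimal Python sketch with hypothetical conversion counts:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z-statistic comparing two conversion rates, using the pooled
    standard error. |z| > 1.96 is significant at the 5% level."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical example: variant B converted 260 of 2000 visitors,
# versus variant A's 200 of 2000.
z = two_proportion_z(200, 2000, 260, 2000)
print(f"z = {z:.2f}")
```

A dedicated testing tool will do this for you (and handle sample-size planning), but the underlying arithmetic really is this small.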

For an excellent (free!) online manual on how to do user research, check out the U.K. Government Digital Service’s user research manual.


Wrapping Up

Web analytics is an extremely important tool in any web analyst’s toolbox - but when we’re analysing web analytics data, we’re really only able to answer the ‘what’, i.e. ‘what did our visitors do on our site?’

To answer why visitors took the actions they did (or did not) on our site, we need to use qualitative research techniques. Qualitative research is used to gain an understanding of underlying reasons, opinions, and motivations, and involves collecting and analyzing non-numerical data. Online qualitative research has the benefit of being relatively inexpensive, quick, and scalable.

There are dozens of online qualitative research techniques. In this article, I introduced an excellent framework created by the conversion optimization agency ConversionXL that groups the most important research methods. The framework consists of these categories and methods:

  • Heuristic analysis is the process of reviewing a website (or content grouping, app, etc.) and comparing it against accepted usability principles, to determine what improvements could be made
  • Technical analysis identifies the technical “bugs” that are stopping your visitors from getting the outcomes they want
  • Web analytics analysis ensures your web analytics is set up properly – especially set up to help you understand if your digital presence is contributing to your organization’s objectives.
  • Mouse tracking analysis allows us to see where a visitor clicked on, and scrolled down, a web page.
  • Qualitative surveying is an exploratory research method in which open-ended questions are used to probe a topic (as opposed to quantitative surveys, which typically use close-ended questions, and are meant to provide statistically-significant results for a large population).
  • User/usability testing is about task completion. It answers the question ‘Can a site visitor achieve his or her specified goal efficiently, effectively, and with a high level of satisfaction?’

About the Author: Maurice

I've been working in digital marketing for 15 years, with a specialty in web analytics and all things performance measurement. I'm a researcher by avocation and love building frameworks (how nerdy is that!)
