
The Ultimate Guide to Heat & Scroll Maps

 

A heat map shows, in aggregate, how people interact with a page. Heat maps are extremely helpful for figuring out what to test, understanding how a test affected customer behavior, and assessing how well an interface connects your business’s goals to your customers’ goals.

Heat maps also usually come with scroll maps, which show what percentage of users scrolled to each portion of a page. Scroll maps show where people stopped reading. Could there be something wrong with your pitch? Has the reader made a purchasing decision by this point? Scroll maps are great for answering these sorts of questions.

Heat Maps

First, you need to make a heat map. Here’s how you do that:

  1. Get a tool for creating heat maps. Crazy Egg and Hotjar are great candidates.
  2. The tool gives you a snippet of JavaScript tracking code so it can detect where people are clicking and scrolling. Install that on every page you want to track (a sketch of what the install looks like follows this list).
  3. Go back to your tool. Enter the URL of the page you want to generate a heat map for.
  4. Wait a week or two.
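For reference, most heat mapping snippets take roughly this shape: a tiny inline script that loads the tool’s real tracker asynchronously. The sketch below is illustrative only; the URL and site ID are made up, and your tool’s actual snippet will differ, so paste whatever your tool gives you verbatim, just before the closing </head> tag.

```js
// Illustrative stand-in for a heat mapping tool's tracking snippet.
// The URL and site ID are hypothetical; use the exact snippet your tool provides.
(function () {
  var script = document.createElement('script');
  script.async = true;
  script.src = 'https://tracker.example-heatmap-tool.com/tag.js?site=YOUR_SITE_ID';
  document.head.appendChild(script);
})();
```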

You now have a heat map. It’s that easy. I didn’t omit any steps. It’s highly unlikely you’ll encounter bugs in the process.

Here’s what a heat map looks like:

Heat map 1

Here’s another:

Heat map 2

These are real-world heat maps from a Draft Revise client, KeySmart, who graciously shared them with us. Let’s dive in!

How to analyze these heat maps

There is bound to be noise like this in your heat map:

Heat map 3

Pay attention to what’s lighting up – as well as what’s being ignored. You’re looking for broad-scale trends that show a clear path forward.

Avoid reading the tea leaves of random fields of clicks that don’t make any sense:

Heat map 4

Instead, focus on buttons that see zero activity, and try to figure out why:

Heat map 5

Let’s take the first heat map, which is of a product page for a KeySmart.

Heat map 1

Here are a few things to note:

  • Lots of people are playing with the add-on pull-down. That seems like an opportunity to make all three add-ons more immediately legible and understandable – or to move them later in the checkout process, upselling people after they’ve selected a model of KeySmart.
  • Not many people are customizing the color of their KeySmart. Either they don’t want to or they don’t know to. Future tests could rework the layout and behavior of the color selector.
  • There’s very little activity outside of these two areas – including on the navigation. Future tests could enclose this page (removing the header & footer navigation), or remove any elements below the checkout button (depending on whether people are scrolling down a lot to get a sense of the page’s credibility).
  • To the previous point, people are clicking very infrequently on the expanders for features, various FAQ items, etc. Either people don’t care (possible!), they don’t know to click on these, or they’ve already made a purchasing decision. Future surveying and testing would confirm their behavior in practice, allowing us to create a layout that better meets their needs.

Now, let’s take the second heat map, which is of KeySmart’s home page. Here are my takeaways:

  • Nobody seems to care about their social media links. Move them to the footer; they distract from generating revenue.
  • The currency selector seems to be getting a lot of activity. Do what you can to geolocate the customer, so you don’t have to provide a pull-down for language and a currency selector. This should keep people more focused on the value that KeySmart is able to provide.
  • The old primary call to action, “Get a KeySmart Extended”, is seeing very little activity. Same with the second video:
    Heat map 6
    Future tests could automatically play the video, provide a different video, or even remove the primary CTA in favor of “I Need a KeySmart” below. (My take: adding “I Need a KeySmart” & “I Have a KeySmart” was the result of a successful test, which probably diverted attention from the rest of the masthead. Time to clean up the page.)
  • People are dropping off after they see the grid of products below the shipping callout. Nobody is clicking on the video or noticing the callout for “the compact solution to your bulky key ring.” A future test could even remove these products, and move the value- and benefits-focused text further up the page as a result.

How to compare control & variant on a heat map

If you’re running an experiment, try to get separate heat maps segmented by each variation in the test. This allows you to compare your control & variants really elegantly.

Again, here you need to pay attention to big shifts. This was readily apparent in the test we ran that added two new CTA buttons below the masthead: people just zoomed in on those and proceeded to ignore the primary CTA. It generated more revenue for the business, which is great – so what now? Do we remove the old CTA? Do we pursue a bolder rework of the masthead?

The heat maps from this particular test showed the knock-on effects that a test can provide, especially when messing with such significant elements as the primary call to action. Always vet your control against your variations to see not only whether the test won, but how, and what ramifications it has on your customers’ real-world behavior.

Examples of design insights that inform future tests

As usual, I’m not here to recommend experiments that are guaranteed successes. Instead, I want to provide a playbook. What should you do when you notice something happening in your heat maps? Here are a few of the most common situations I see, some potential explanations for them, and a brief description of what to do next.

For SaaS

  • If 95% of your clicks are to log in: Get more qualified, wallet-out traffic to your site, so you have enough traffic to reach statistical significance when you run A/B tests.
  • If nobody is clicking your primary call to action: Well, what are they doing instead? If the bounce rate is too high, try testing a rework of your pitch – and maybe even your value proposition.
  • If everyone is beelining for your pricing page: That might be a good thing, if it moves people down your funnel. But you might also be missing an opportunity to give your pitch and actually sell the customer on what you offer. Usually in sales, your price comes last – so if they just click on the pricing link in your header, they’re either already sold on the notion (unlikely) or they want to know how much you cost before they invest themselves in reading the pitch (probable). Removing pricing entirely is a bad idea because it conveys you’re probably too expensive for them – bad for B2C especially. Burying pricing may be interesting for a test. Reworking the pricing page to continue the sales process is what I generally recommend, but it tends to have a lower overall success rate than we’d like. There are lots of options, and you need to tread carefully on them, because it depends heavily on who you’re selling to and how price-conscious they may be.
  • If people are beelining for the tiny free plan callout on your pricing page: Cut the free plan unless you have a concrete and proven strategy for laddering people out of it. (For example, Slack has strict limits that move teams to paid plans frequently.) Free plans are a bear to support and likely a huge drain on the bottom line of your business.
  • If people are going nuts on your footer links, but ignoring your core value prop: One option is to rework the value prop. Another is to kill the footer, so you force them to read the value prop. A third is to rework the footer, so people only go to links like support, contact, or terms & conditions.

For ecommerce

  • If nobody is using your navigation: First, go to Baymard Institute’s site and read everything they have about ecommerce navigation. Then, explore why this might be happening. Are people buying straight from the home page? What traffic sources are people coming in from – are they searching for your products with Google? If so, is there something amiss with your search? Look more deeply into how people are browsing using Google Analytics – what paths are they taking, and does that fit your business’s goals? Only then can you figure out what to modify for a test.
  • If nobody is using your filters or categories: It could be that your filters or categories aren’t the right ones to fit actual customer use. I’d run a few usability tests that ask people to find and purchase the right thing, and see how they end up behaving.
  • If too many people are clicking around, but too few people are buying: Have you done a good job selling your product? If it’s particularly high-involvement – like, say, a fancy artwork on 1stdibs – then it may be that people are more liable to window-shop. In KeySmart’s case, they’re selling a $20 keychain. You have $20. Most of KeySmart’s customers have $20. If nobody ended up buying KeySmarts, then you absolutely need to do a better job of selling the thing.
  • If people are spending too many clicks on fiddly customization options: Simplify your customization – or run a test that defers it until later in the checkout. Do what you can to gradually engage the customer, so they don’t feel overwhelmed from the get-go.
  • If people are clicking on your navigation from your product page: Are they entering your product page through other traffic sources, like social media or Google? If so, run a test that explains the product – and assumes that the customer knows nothing about it. Are people entering from the home page, and bouncing around? Perhaps they’re considering all of their options, which means you need to do a better job of clarifying your product line earlier. In that case, I’d run a home page test that lays out all of the product offerings, and suggests specific use cases for each one.

Heat maps for smartphones & tablets

You need to run heat maps for all three platforms: smartphones, tablets, and desktop computers. Why?

  • Because the interaction model will be vastly different. If you have a different menu system, flow, and even checkout process, you’re going to want to see how use changes on a smartphone.
  • Because the heat map may yield noise otherwise. People could be tapping on entirely different elements than on your desktop layout, which could show as noise on the heat map itself.
  • Because the whole game changes on smartphones. If you run a B2B SaaS or ecommerce site, you know that engagement, transactions, and revenue all crater on smartphones. The game gets harder, and the rules change. You always need to play on your customers’ turf if you want to win.

So, run three sets of heat and scroll maps, on desktop, smartphone, and tablet, in order to determine how behavior differs on each platform.

Scroll Maps

Here’s an example of a scroll map, from former Draft Revise client KeySmart:

Scroll map

Red areas correspond to 100% of customers viewing those elements; the map then fades to yellow (75%), green (50%), blue (25%), and black (0%).

Let’s talk about when scroll maps are useful, and how to meaningfully act on them.

When are scroll maps useful?

Scroll maps matter in two main ways:

  • When correlating engagement in corresponding heat maps. Usually, scroll maps are run at the same time as heat maps, acting on the same data. That lets you check whether an element that looks dimmer than it should on the heat map is actually being reached by a high percentage of people.
  • On long-form pages. Are you running a long-form page that you want customers to read, not just skim? Scroll maps tell you when customers get bored and leave – which lets you isolate any elements that need to be reworked to reduce bounce rates.

Now, let’s talk about how to create and analyze scroll maps.

How to create scroll maps

Scroll maps are usually created by default in your heat mapping tool, but some do not create scroll maps at all. Here’s a run-down of the current state of the art:

  • Hotjar: Creates scroll maps alongside heat maps.
  • Crazy Egg: Creates scroll & confetti maps alongside heat maps.
  • Mouseflow: Creates scroll maps alongside heat maps.
  • Lucky Orange: Creates scroll & confetti maps alongside heat maps.

If you’re looking for a tool to get started, I typically use and recommend Hotjar to newcomers.

Once you’ve signed up, your heat map tool gives you a snippet of JavaScript to install on your store. Install that, enter the URL of the page you want to generate heat & scroll maps for, and wait a little bit. Creating scroll maps is no harder than creating heat maps.

How to analyze scroll maps

So, you have a scroll map. What do you look at?

First, take a look at the most conversion-focused elements on the page:

  • Price
  • Calls to action
  • Risk-reducing text & objection busters

Audit each of these, isolate where they are on each platform, and figure out what percentage of your customers are scrolling to each element. Scroll mapping tools often let you hover over the map to see exactly how many customers are scrolling to each element.

If your most important elements are hidden below the fold, 1) they probably shouldn’t be, and 2) it’s likely that your scroll maps will uncover some significant issues with the page’s ability to move customers to the next step.

Then, take a look at any significant drop-off points, where over 10% of your customers exit the page within the span of a single element. If most of your customers are dropping off at a given point, that’s probably a sign that you need to rework the page in order to keep their attention – if that’s a clear goal of the page, of course.
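If your tool exports scroll-reach percentages but doesn’t flag drop-offs for you, it’s simple to spot them yourself. Here’s a minimal sketch with made-up numbers for a hypothetical product page; it flags any element whose reach falls more than 10 points below the element above it.

```js
// Hypothetical scroll-reach data: the percentage of visitors who scrolled far
// enough to see each element, listed from top of page to bottom.
const scrollReach = [
  { element: 'Masthead', percentReached: 100 },
  { element: 'Add-on pull-down', percentReached: 86 },
  { element: 'Feature expanders', percentReached: 71 },
  { element: 'Shipping callout', percentReached: 44 },
  { element: 'Footer', percentReached: 31 },
];

// Flag any element where reach drops by more than 10 points versus the element above it.
for (let i = 1; i < scrollReach.length; i++) {
  const drop = scrollReach[i - 1].percentReached - scrollReach[i].percentReached;
  if (drop > 10) {
    console.log(`Significant drop-off before "${scrollReach[i].element}": ${drop} points`);
  }
}
```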

Time-scale analysis of scroll maps

You should run heat & scroll maps for all of your funnel’s key pages every 2 months at the least.

Collect your pages’ scroll maps over time, to determine whether more people are scrolling to key elements of the page – and whether the percentages for each conversion-focused element are going up over time.

Confetti Maps

Unlike heat maps, confetti maps show specific clicks on a page. Also called click maps, they’re higher resolution than heat maps, and they require deeper, more quantitative analysis.

Confetti maps are good for:

  • Analyzing how many people click in a general region versus on a specific element – which is good for understanding bugs in your funnel around click regions & tap targets.
  • Understanding what percentage of visitors hit a given goal, without needing to resort to quantitative analysis in GA.
  • Breaking down clicks by specific referrers.
  • Sharing with other team members who might be more data-driven.

Hotjar lets you hover over specific elements in the DOM to assess who’s clicking where – but it measures click events per DOM element, meaning you don’t have a clear sense of where people are tapping, especially on large elements.

Optimizely doesn’t have confetti maps as a feature at all. Neither does Hotjar, for some reason. So you’ll need to go to Crazy Egg to do it.

Crazy Egg works the same way as Hotjar: sign up, install the tracking snippet, get a heat map running, and wait a day or two for your findings. You get a confetti map along with the heat map. This costs you all of $29.

What to Do First: Customers not hitting the target

One of the biggest issues in conversion-driven web development is figuring out the proper size of your tap targets. For example, let’s say you have a 3-column plan grid that contains CTA buttons at the bottom of each column. Only the buttons are clickable.

It is very common to run a confetti map on this page and realize, to your infinite horror, that people are clicking all over the page to no effect. The answer, of course, is to make each column clickable to sign up for each respective plan.
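As a sketch of that fix, assuming each plan column is a .plan element containing one CTA link with a .plan-cta class (both class names are hypothetical; adapt them to your own markup), you can forward clicks anywhere in the column to that column’s button:

```js
// Make each pricing column act as one big tap target for its own CTA.
document.querySelectorAll('.plan').forEach(function (column) {
  const cta = column.querySelector('a.plan-cta');
  if (!cta) return;

  column.style.cursor = 'pointer';
  column.addEventListener('click', function (event) {
    // Don't double-fire when the visitor clicks the CTA itself.
    if (event.target.closest('a.plan-cta')) return;
    cta.click();
  });
});
```

Make sure the new column still reads as clickable, though; a confetti map on the variant will tell you whether people actually take advantage of the larger target.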

You’re looking at what the customer is doing and updating your page’s behavior to match their expectations. In doing so, you’re making it easier for the customer to move down-funnel and give you money.

Here are some other common things to pay attention to when analyzing a confetti map:

  • People clicking modules on your home page, but not exactly on the CTA button.
  • People trying to swipe left or right on images in an unswipeable gallery.
  • People tapping outside too-small elements on a page, especially around product customization & checkout.
  • Header navigation misfires on links with zero padding, especially on desktop.

The best thing about confetti maps: you can count specific clicks on a map and determine precisely how many people are likely to benefit from the change. Yes, this means you’re counting a lot of small dots. It’s good intern work. But it’s also terrific for making a specific case for a usability improvement.

The Level-Up Move: Break down confetti by referrer

Next, you’ll want to see if behavior changes based on where customers come from – especially on pages that are receiving outsize traffic for whatever reason (you’re sending it there via ad campaigns, it’s your home page, etc).

Confetti maps can color each click by referrer. This is great for understanding how various ad campaigns may be affecting customer behavior. If you run a site that gets lots of its traffic from affiliate campaigns, it’s useful for seeing whether customers behave differently when they come from a specific site – and whether that behavior maps naturally to conversion. The same goes for guest blog posts, podcasts, etc. Not all traffic is created equal.

For example, if you find out that specific traffic sources are more likely to beeline for pricing and they’re unlikely to buy, you may want to try a few things:

  • Personalize the site for them so that pricing is hidden (forcing them to contact you).
  • Put pricing upfront and encourage impulse purchases (this depends heavily on your business’s AOV & LTV, of course).
  • Or place your marketing efforts in traffic sources that perform better for the business.

The overarching process is as follows:

  1. Match traffic sources to their conversion metrics on GA.
  2. Break out your confetti maps by the same traffic sources.
  3. Determine if there’s a difference in behavior between traffic sources.
  4. Determine if there’s a correlation between the likelihood of conversion and the difference in behavior.
  5. If there is a correlation, make a tactical recommendation to increase the conversion rate, either by personalizing the traffic source or by reducing the amount of low-quality traffic.
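For step 2, most heat mapping tools can do the breakdown for you, but here’s a rough sketch of how you might derive a traffic-source label client-side and hand it to whatever segmentation or tagging hook your tool exposes. The priority order and the 'direct' fallback are assumptions; adjust them to match how you define traffic sources in GA.

```js
// Derive a coarse traffic-source label for the current visit.
// Prefers explicit utm_source tagging, then the referring hostname, then 'direct'.
function trafficSource() {
  const utmSource = new URLSearchParams(window.location.search).get('utm_source');
  if (utmSource) return utmSource;

  if (document.referrer) {
    try {
      return new URL(document.referrer).hostname;
    } catch (e) {
      // Malformed referrer; fall through to 'direct'.
    }
  }
  return 'direct';
}

// Logged here for illustration; in practice, pass it to your tool's tagging hook.
console.log('Traffic source for this session:', trafficSource());
```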

Other kinds of confetti maps

Why just break out confetti colors by referrers? Other breakdowns are similarly valuable:

  • Mobile vs. desktop vs. tablet. (You might want 3 different page layouts to reflect the differences!)
  • Countries & continents, especially if you have language & currency selectors on your store.
  • Specific ad campaigns.
  • Social network referrals, especially if you have strong Facebook or Instagram acquisition strategies.
  • Page load times. Confetti could be blue when the DOM loads quickly, fading to red as the page takes progressively longer to load. (See the sketch after this list.)
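As a rough sketch of how that load-time signal could be captured in the browser, the Navigation Timing API reports how long the DOM took to become interactive; you could bucket that number and pass the bucket to your tool’s segmentation hook. The bucket thresholds below are arbitrary examples.

```js
// Bucket the page's DOM-ready time so a confetti map could be segmented by load speed.
window.addEventListener('load', function () {
  const [nav] = performance.getEntriesByType('navigation');
  if (!nav) return; // Older browsers may not expose Navigation Timing Level 2.

  const domReadyMs = nav.domContentLoadedEventEnd; // milliseconds since navigation start
  const bucket =
    domReadyMs < 1000 ? 'fast (<1s)' :
    domReadyMs < 3000 ? 'medium (1-3s)' :
    'slow (3s+)';

  console.log('DOM ready in', Math.round(domReadyMs), 'ms; bucket:', bucket);
});
```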

Clicktale allows you to segment in these ways, but it’s also a more enterprisey, “call us” type service. There will be premeetings, and meetings, before you can kick its tires. My deepest apologies for this.

GA lets you segment like this, and it offers click maps, but only on elements that are already clickable. This sucks for many reasons, chief among them that you can’t spot major usability flaws around elements that aren’t already interactive. As a result, I strongly recommend seeking third-party tools for both confetti & heat maps.

Troubleshooting

Let’s go over some of the most common issues when running heat maps, and the things you can do to address each one of them.

When the screenshot doesn’t underlay

Heat maps are generally overlaid on top of a static screenshot of the page. Most heat mapping tools take this screenshot using their own scripts; some use a third-party API like BrowserStack to do it.

Sometimes, the screenshot doesn’t underlay at all.

For Hotjar, this is best solved by pinging their support and having them fix it on their end. (As of this writing, Hotjar doesn’t allow you to underlay your own screenshot.)

For Crazy Egg, you can retake the page, which usually solves the problem.

Excessive noise

Sometimes, heat maps yield excessive noise: either a bunch of taps uniformly across the heat map, or pockets of taps in places that make no sense.

Noise uniformly across the heat map

This is likely the result of aggressive sideloading of elements changing the DOM tree in the background of the customer’s session, such that their taps keep registering in unexpected places as they continue browsing and loading new elements.

This is especially unfortunate, since that background noise is likely made up of legitimate taps that should have registered on the elements you’d expect. The noise reduces the quality of the signal you get from the heat map, and it lengthens the time it takes to get any meaningful, actionable results.

In order to address this, you’ll either need to change the way that you sideload assets (which is non-negligibly difficult, and could severely harm the page’s load & render times), or you’ll want to cross-check your results with a different heat map provider.
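To confirm whether sideloaded assets are the culprit before you commit to reworking them, you can measure how long the DOM keeps mutating after the page loads. Here’s a small diagnostic sketch using a MutationObserver; if it keeps logging for several seconds after load, late-arriving elements are probably shifting the layout underneath your visitors’ clicks.

```js
// Diagnostic: log DOM mutations that happen after the page has finished loading.
window.addEventListener('load', function () {
  const loadedAt = performance.now();
  const observer = new MutationObserver(function (mutations) {
    const secondsAfterLoad = ((performance.now() - loadedAt) / 1000).toFixed(1);
    console.log(mutations.length + ' DOM mutation(s) ' + secondsAfterLoad + 's after load');
  });
  observer.observe(document.body, { childList: true, subtree: true });

  // Stop watching after ten seconds; by then the page should have settled.
  setTimeout(function () { observer.disconnect(); }, 10000);
});
```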

Pockets of noise in unexpected places

This is usually the result of people tapping on an element that clearly exists on the page, but the heat map registering those taps somewhere else.

You might be able to look at the heat map – especially by hovering over it within your heat mapping tool – and determine where the customer truly intended to tap. But even then, that doesn’t make the heat map very shareable, and it remains a little confusing for you.

The #1 issue that causes this is absolutely positioned elements. If you have any major elements that are absolutely positioned on the page, taps could register in unexpected places, depending on the customer’s viewport width.

This also happens quite often with sticky headers & footers. Taps could be registered in a column down the page, in line with the middle of the element you’re tracking.

In either situation, you’ll either need to figure out how to change the positioning of the element (some absolute positioning can be replaced with margins or relative positioning; in general, absolute positioning should only be used when necessary), or you’ll need to find a heat mapping tool that can follow sticky elements more gracefully.
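If you’re not sure which elements are to blame, a quick console audit can list everything that’s absolutely, fixed, or sticky positioned. This is only a diagnostic sketch: it surfaces candidates, and you still have to decide which ones are worth repositioning.

```js
// List elements whose computed position is absolute, fixed, or sticky:
// the usual suspects when taps register in the wrong place.
const suspects = [];
document.querySelectorAll('*').forEach(function (el) {
  const position = getComputedStyle(el).position;
  if (position === 'absolute' || position === 'fixed' || position === 'sticky') {
    suspects.push({ tag: el.tagName.toLowerCase(), id: el.id || null, position: position });
  }
});
console.table(suspects);
```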

Hotjar does this well enough; in my experience, Crazy Egg does not.

Elements are nudged in one direction

Sometimes, taps appear uniformly nudged a few pixels away from their corresponding elements.

If shifted to the side, this is usually the result of a breakpoint issue; the heat map is rendering at a specific width, while most customers are accessing the page at a different width.

If shifted vertically, this could be the result of an injected header or footer banner, or a sticky header or footer, pushing the content in the wrong direction.

In either situation, you may need to ask the heat mapping provider to retake the underlying screenshot. If the issue is with injected elements affecting the page render, then you should probably change the timing of when those elements are injected, so they land before the full page renders and document.ready fires.

Mobile heat maps don’t fire

If your desktop heat maps are showing new taps correctly, but your tablet & smartphone heat maps don’t show anything, then you could have some structural issues with your theme. Perhaps it’s rendering the whole experience as an AJAX-powered single-page app, or perhaps an overlay is preventing taps from registering.

In either case, this usually indicates a significant issue with the DOM that might require a re-theme. Sometimes, though, you can get off easy and discover that the heat mapping tool’s snippet simply isn’t loading on mobile. Check that first, and then look into the way that the page is being rendered for customers.

Remember that every addition of complexity in your theme is a conscious choice. The more complicated you’ve made things, the harder it will be to track customers’ behavior and make meaningful changes.

Time-scale analysis

You should run heat & scroll maps of every key page in your funnel at least once every month. That includes:

  • Home page
  • Category page
  • 2-3 product detail pages
  • Cart
  • Checkout
  • Thank you/order confirmation
  • Any landing pages that you’re directing traffic to

You can get a lot of insights looking at just one heat map, of course. But heat maps gain a whole new layer of value when you are able to track how they change over time. Let’s talk about how you can learn from many heat maps in aggregate, as you continue to make revenue-generating design decisions that change both your store and your customers’ behavior.

How to analyze time-scale changes

There are two main ways you can assess time-scale changes in heat maps: either through image diffs, or through overlaying.

In programming parlance, a diff is how you compare two blocks of text to determine what’s been added, removed, and changed. Diffs are commonly tracked per line, but you can always get more granular to assess changes between specific blocks of text.

And you can run diffs for images, too! There are a handful of great tools for the purpose, where you can look at two images at the same time and make comparisons. We use Kaleidoscope here at Draft. Obviously, you’ll want to compare the same kind of map (heat, scroll, confetti, etc), page, and platform (smartphone, desktop, tablet).
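Kaleidoscope is a visual tool, but you can also script the comparison. Here’s a minimal Node sketch using the pngjs and pixelmatch npm packages (you’ll need to install both first); the file names are hypothetical exports of the same page’s heat map from two different months, and both exports need identical pixel dimensions.

```js
// Minimal image diff between two heat map exports of the same page & platform.
// Requires: npm install pngjs pixelmatch
const fs = require('fs');
const { PNG } = require('pngjs');
const pixelmatch = require('pixelmatch');

const before = PNG.sync.read(fs.readFileSync('heatmap-january.png')); // hypothetical file names
const after = PNG.sync.read(fs.readFileSync('heatmap-march.png'));
const { width, height } = before;
const diff = new PNG({ width, height });

// Counts differing pixels and writes a highlighted diff image you can share.
const changed = pixelmatch(before.data, after.data, diff.data, width, height, { threshold: 0.1 });
fs.writeFileSync('heatmap-diff.png', PNG.sync.write(diff));
console.log(changed + ' pixels changed between the two heat maps');
```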

Assessment & synthesis

In a diff like that, you might see differing interactions on the image carousel, configurator, fit & care tabs, and specific colorways. You can then vet different pairs of images across a longer period of time to determine whether there’s a broader arc in customer behavior.

You should also check specific heat maps against your annotations in Google Analytics to determine what changes you made and when, so you can compare any heat map changes against the way the store has evolved over time.

Once you’ve determined how customer behavior has changed, it’s time to synthesize your findings into new design decisions – and continue attempting to match both long-term changes in customer interaction, and short-term adaptations to your store’s usability.

Wrapping Up

With this guide, you should now have a sense of why heat maps are important, how to run & read one, and the different kinds of design insights you can garner from them.
