Do Users Understand Mobile Menu Icons?

Call it a burger, a hamburger, a navicon or whatever – it’s that icon made up of three bars that indicates a menu.

Indicates to whom, exactly? As with virtually every other element I add to a website, I just assume that users will know.

This interesting Twitter conversation made me think otherwise.

 

So I ran a small A/B test using Optimizely*.  The results were a little underwhelming but interesting.

Original Layout


Icon only, no text or borders.

Variation 1


Same as control, but with MENU written below.

Variation 2


Same as control, but with a rounded border.
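(For the technically minded: Optimizely variations are typically just small JavaScript edits layered on top of the existing page. A rough sketch of what these two variations boil down to is below – the “.menu-toggle” selector is a stand-in, not the site’s actual markup.)

```ts
// Rough sketch of the variation code, not the actual experiment code.
// ".menu-toggle" is a stand-in selector for the site's menu button.
const toggle = document.querySelector<HTMLElement>('.menu-toggle');

// Variation 1: keep the icon, add the word MENU underneath it.
if (toggle) {
  const label = document.createElement('span');
  label.textContent = 'MENU';
  label.style.cssText = 'display:block;font-size:10px;text-align:center;';
  toggle.appendChild(label);
}

// Variation 2 (served instead of the block above): add a rounded border
// so the icon reads more like a button.
// if (toggle) {
//   toggle.style.border = '1px solid currentColor';
//   toggle.style.borderRadius = '4px';
//   toggle.style.padding = '4px 8px';
// }
```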

Results

[Results table: click rate for each variation, with confidence intervals]

The differences are negligible: slightly more users clicked the Icon + Border variation than the icon alone.

I suspect this is because the border draws the eye a bit more.

The experiment was targeted at all mobile browsers, and ran across all pages on the site.

What’s surprising is the low number of clicks – only 2% of users actually open the menu. That’s indicative of the kind of site this is (blog-type content), as opposed to a more highly engaged, service-type site (e.g. Facebook).

So What?

In my opinion, discerning the overall intent of the user is far more important (and challenging) than the exact specification of the menu icon.

If the user has a reason to go looking for something, they will be determined enough to spend the extra millisecond finding whatever looks like a menu.

Desktop Behavior is Different

Last year I did a similar test on the desktop layout of the site.

I tested 4 variations of the content inside a blue button:

  1. (Baseline) The word “MENU”
  2. The word “MORE”
  3. Hamburger icon + “MENU”
  4. Hamburger icon + “MORE”

The results were far more significant.

Variations 2, 3, and 4 performed far worse than the baseline (18%, 31%, and 43% worse, respectively).

Adding the bars icon appeared to confuse users and over-complicate a simple button.

An Expensive Test

* Optimizely’s billing and pricing are painful. They have monthly pricing based on a fixed number of visits to your A/B test page(s), and it’s not cheap.

They charge you when you go over the monthly allocation, but don’t roll over any unused visits. It’s a lot like government tax departments that charge you interest for late payments, but don’t appear to do the same when they owe you a refund.

With Optimizely I have a 2,000-visit-per-month plan.

My experiment quickly hit 19,684 visits before I was able to pause it. Bill? An extra $117 for those extra 17,684 visits – ouch!

Even 20,000 visits to a test don’t give you a great deal of statistical confidence when only around 2% of those visitors ever interact with the element being tested.
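To put a rough number on it: splitting roughly 19,700 visits three ways gives about 6,500 visits per variation, and at a ~2% click rate a simple 95% confidence interval is already about a third of a percentage point wide on either side. The sketch below uses those illustrative figures, not the exact per-variation counts.

```ts
// Back-of-the-envelope 95% confidence interval for a ~2% click rate.
// Illustrative figures (~19,700 visits split three ways), not the exact counts.
function waldInterval(clicks: number, visitors: number, z = 1.96): [number, number] {
  const p = clicks / visitors;                    // observed click rate
  const se = Math.sqrt((p * (1 - p)) / visitors); // standard error of a proportion
  return [p - z * se, p + z * se];
}

const [low, high] = waldInterval(130, 6500);      // ~130 menu opens out of ~6,500 visits
console.log(`95% CI: ${(low * 100).toFixed(2)}%–${(high * 100).toFixed(2)}%`);
// Roughly 1.66%–2.34% – a ±17% relative band around the 2% rate, so the small
// differences between the three variations sit well inside the noise.
```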

UPDATE: See another, larger A/B test on the hamburger icon here.

James Foster
James makes websites; you can follow him on @exisweb.
Filed under: AB Tests, Responsive Design
Updated: February 18, 2014

40 Comments

  1. Curious why 20,000 visits doesn’t give you confidence. Is it the low number of interactions with the burger? That should be plenty for relevant data. 1000 samples should give you like 95% confidence iirc.

    • I guess it’s because of the low number of interactions… and maybe I’m just suspicious by nature.

  2. Interesting! Esp the pricing :)
    On a serious note, I once ran a qualitative test on 20+ people using a complex desktop application. Main finding: as long as there was a single button, it did not matter at all what the icon was – people found it and clicked it. When there was more than one button, some people inevitably got confused by some icons, and did better with text+icon or text alone. So overall my study supports your findings, but things get orders of magnitude more complicated with more icons.

    • Very interesting. One thing I’ve found with A/B testing is: assume nothing. Sometimes results defy explanation.

  3. I wonder how the Icon + Border + MENU would perform in this test. My gut says this would be the best configuration. Thanks for running the test, good stuff!

    • Yeah I’m for that. I’ll try it on my next test, and this time I’ll gather some more analytics data, and hopefully demographic data as well.

  4. I highly recommend that you try some manually coded A/B testing and use Google Analytics to capture and assess the data. Free is always better. I cringed when you said how much you had to pay!

    • You cringed, I nearly fell off my chair. Why you can’t set a visitor cap on Optimizely is beyond me – surely it would be a simple feature to add.

      • Maybe it’s too cynical, but I’m guessing because it brings them more revenue to not allow a cap?

        • I suspect that’s the case. It’s not very nice to log in and discover your experiment has gone way over on the visitor numbers.

          I don’t use Optimizely anymore for that reason. It is a great tool for setting up A/B tests quickly, which is probably why I haven’t run many more tests since dropping it – they take so long to set up by hand.
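          For what it’s worth, a hand-rolled version of the Google Analytics approach suggested above looks roughly like the untested sketch below – “.menu-toggle”, the storage key and the event names are placeholders, and it assumes classic analytics.js (the global ga) is already loaded.

```ts
// Untested sketch of a hand-rolled A/B test reported through Google Analytics.
// Assumes classic analytics.js (the global "ga") is already on the page;
// ".menu-toggle", the storage key and the event names are all placeholders.
declare function ga(command: string, ...fields: unknown[]): void;

// Assign each visitor to a variation once and remember the choice.
let variant = localStorage.getItem('menuTestVariant');
if (!variant) {
  variant = Math.random() < 0.5 ? 'icon-only' : 'icon-with-border';
  localStorage.setItem('menuTestVariant', variant);
}

// Apply the variation under test.
const btn = document.querySelector<HTMLElement>('.menu-toggle');
if (btn && variant === 'icon-with-border') {
  btn.style.border = '1px solid currentColor';
  btn.style.borderRadius = '4px';
}

// Record menu opens as events, segmented by variation.
btn?.addEventListener('click', () => {
  ga('send', 'event', 'menu-test', 'open-menu', variant);
});
```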

  5. Very interesting. Thanks for sharing.

    I’m wondering why some versions performed better than the others:
    - Lack of affordance in baseline version and version 1
    - Users don’t recognize the Hamburger icon as the menu
    - The border in version 2 improved the visibility of the icon
    …?

    • IMO the border improves visibility and makes it look like a button.

  6. If you have to write a description of what an icon does below it, the icon is so bad it shouldn’t be used! Simple.

    • Agreed – and it just adds more noise.

  7. I was following this conversation on Twitter (I saw it on Brad Frost’s timeline) and have found that in a few cases we have had to implement additional text next to the icon because the site’s user base was not entirely savvy with the icon. Fortunately it didn’t take much away from the design though.

    Great to see some data from your test though.

    • Thanks I’m going to do some more tests soon.

  8. It seems as though the flat design’s lack of clickable affordance is more detrimental than the lack of semiotic meaning. (In layman’s terms: the border made the icon look more clickable, and labeling it “menu” gave the icon more meaning, but it was the clickability that led to more clicks.)

    The dangers of flat UI design are real!

    • I think the biggest confusion with flat design (I’m looking at you, iOS 7) is where you have a word that is clickable. I could test the word “menu” all alone in my next test. Would be interesting.

  9. I just ran a lab-based test a couple of days ago on a similar design (flat burger icon, no border, no label) and I was surprised to observe that 12 out of 14 users struggled to find a way to navigate and needed multiple attempts before finding the menu.
    Interestingly, the search icon (also flat, no border, no label) was found effortlessly.
    Every design is a world apart and you can’t draw general conclusions so easily, but I do believe this was a clear case of users not recognizing the icon (and they were quite advanced users, BTW).

  10. We have tried an on-screen tooltip on the menu icon for first-time visitors. That works.

    • By tooltip do you mean some kind of overlay? Obviously on a touch device there is no hover / tooltip.

  11. Nice article. I’m actually surprised nobody has done this yet. We’ve had requests from clients to use the ‘hamburger’ menu because “everyone knows what that means” but I’ve never been able to tell them, actually no, you might be wrong there because…

    I’m particularly curious as to whether comprehension of it is linked to visitors age. My hypothesis (not really based on any solid reasoning) is that younger users will have more success with the hamburger whereas older users will not recognize it as often so may struggle with it. But as I say, that is just my assumption.

    • I’m sure there have been plenty of tests. I’ve done a lot of tests over the last few years, but never shared or blogged about them.

      I’m just realizing how useful this information is to the wider community, so from here on in, I’m going to write up all the A/B tests I do.

      As for the age thing, I’ve seen that from lab-type usability tests on a small sample.

  12. Found your article via webdesigntuts+. Great job! Very interesting and a good hint for practical work.

    • Glad it was useful.

  13. This is interesting. Thanks for sharing. For your next test, how about reversing out the menu so that the burger layers are light shapes surrounded by a dark rectangle? This will give the menu button more visual mass.

    Also, for another version, how about adding dots to the left side of the menu to create something that looks more like a bulleted list?

    These will probably make nothing more than marginal differences, but who knows, eh?

    While it’s a little old, I have successfully conducted A/B testing on WordPress sites using the following plugin:

    http://wordpress.org/plugins/maxab/.

    Nope. I’m not the author! It should provide similar data to that offered by Optimizely.

    • “adding dots to the left side of the menu” – do you mean on the hamburger icon itself (for example, like the Notifications icon in OS X Mavericks)?

      The inverse hamburger is an interesting one. I might look at testing it further on down the track.

      Thanks for the heads-up on the A/B testing plugin. There are a few options that I will explore. Those kinds of plugins are great for testing alternative versions of an individual page (typically a signup page), but not so good for small UI tweaks across a whole site.

  14. Thanks for this blog. We implemented a responsive site last year. At phone breakpoints we used the hamburger. It never tested well. Since I wasn’t directly on the team my advice to replace it was ignored. This may help justify fixing the feature.

    • You’re welcome. I’m in the middle of another bigger test, and it’s not looking good for the hamburger… results soon.

  15. Have you realized that the confidence intervals in the table overlap? That means that, statistically speaking, you cannot conclude that A is better than B.

    • Which is why I’ve gone on to do a second, and now a third, larger test.
