In case you missed this post about an A/B test on mobile menu icons, make sure you check out the comments. There are some very interesting insights about A/B testing and its shortcomings.
The post went a tiny bit viral, and suddenly it wasn’t just my mother reading this blog.
Three things I learned:
1. This icon has lots of names: hamburger, sandwich, and even hotdog?! What it actually is is a list icon; we’ve just co-opted it to mean a navigation menu.
2. When something gets noticed, some people get a little mean (source).
3. One commenter said I was the Dunning–Kruger effect in action. That’s the cognitive bias where people with limited knowledge of a subject overestimate their competence. In other words, when you try to sound clever but are actually a dumbass.
Thanks for the vote of no-confidence.
In this hyper-connected world full of rockstar developers and super-smart designers, I’m humbled on a minute-by-minute basis. I might need to start sticking positive affirmation stickers on my laptop.
The Final Hamburger A/B Test
I do enjoy A/B testing, but conclude what you want. I’m not an expert, and I’m not advising anything; I’m just sharing what happened on a single website.
Using a commercial A/B testing service can get very expensive very quickly, and well beyond the budgets of small-time web designers and developers. So, hopefully, these posts are helpful for some of you.
- Bordered list icon (hamburger).
- Bordered word “menu”.
240,000 unique mobile visitors were served the A/B test.
| Variation | Unique Visitors | Unique Clicks |
|---|---|---|
The test was large enough to achieve statistical significance.
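For readers who want to check significance on their own numbers, a standard approach for comparing two click-through rates is a two-proportion z-test. Below is a minimal sketch in Python; the visitor and click counts in the usage line are made up for illustration and are not the results from this test.

```python
import math

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pooled click rate under the null hypothesis that both variants perform the same
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers only, to show the call shape
z, p = two_proportion_z_test(clicks_a=2400, n_a=120_000, clicks_b=2150, n_b=120_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below your chosen threshold (commonly 0.05) suggests the difference in click rates is unlikely to be chance, assuming the test was run cleanly.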
Where things get interesting is when we break down the data a little:
| Unique Visitors | Hamburger Clicks | % | Menu Clicks | % |
|---|---|---|---|---|
Android users show very little preference either way, but their lack of engagement is disturbing.
Hamburger icons may appear to be ubiquitous, but they are not the only option.
There are some much more important caveats:
- These are the results from one website (see more about demographics here).
- The test was done using some in-house code, so I cannot guarantee the perfect execution of code across all devices. I do not have the time or capacity to rigorously test code like the big commercial A/B testing services such as Optimizely do. Bear in mind that running this test with Optimizely would have cost $859 (I kid you not).
- I can’t measure intent with this test. I’m measuring clicks on a webpage. Maybe the user thought “menu” meant a list of food to order. Maybe they wondered what the hamburger icon was and tapped it. Who knows. A/B testing cannot tell you this.
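The in-house code mentioned above isn’t published, but the core of any A/B split is deterministic bucketing: the same visitor must always see the same variant. A minimal sketch of that idea, assuming a stable visitor ID (for example, one stored in a cookie), might look like this. This is not the author’s actual implementation.

```python
import hashlib

VARIANTS = ("hamburger", "menu")  # the two treatments in this test

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor: the same ID always gets the same variant."""
    # Hash the ID so assignment is stable across page loads and roughly 50/50 overall
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]
```

Each click would then be logged alongside the assigned variant, which is what produces the click counts in the tables above.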