Thursday 28 March 2013

Are homepage carousels effective? (AKA the Brad Frost Carousel Challenge)

After reading Brad Frost's blog post about the effectiveness of carousels a couple of months ago, I decided to take up his challenge and see just how well the carousel on the University homepage was performing.

We put a lot of effort into producing a steady stream of features to go into the carousel, but does anyone actually read them?

The University of York homepage, complete with carousel in the top right

Our carousel is of the auto-advancing variety, with controls for pagination and pause / resume provided below. We didn't previously set a limit on the number of items that could be displayed in the carousel, but there were often eight or nine items to cycle through at any one time.


The results are in


We started tracking clicks on the carousel slides towards the end of February, along with clicks on the pagination and play/pause controls below. Technical details of the tracking setup can be found towards the end of this post.

For the first two weeks of tracking (20 February to 5 March), the distribution of clicks on the carousel slides looked like this:

Click distribution on carousel slides, 20 Feb to 5 March

It didn't come as too much of a surprise that the slide in first position got the most attention, but I wasn't expecting to see such a large skew. Around half of the clicks were on the first slide, with numbers dropping off rapidly after that. The poor slides in position eight were barely ever seeing the light of day.

The pause / resume button is rarely used, with fewer than 300 events registered (257 pauses, 36 resumes) since we've been tracking it. This suggests that the button is probably not noticeable enough.


Fewer carousel slots = more clicks?


Given that the items in positions outside the first few slots weren't getting much attention, we decided to reduce the number of items we have in the carousel to a maximum of five.


Total numbers of carousel clicks, 20 Feb - 27 March

We reduced the number of slots to five around 15 March. Since then we've seen consistently more clicks on the carousel than we did when there were more features to choose from.

A possible explanation is that some of the stories that we've had in the carousel over the last couple of weeks have been a bit more high-profile than usual (the announcement of our new Vice-Chancellor, BBC Question Time being broadcast on campus and our China graduation ceremony), which may have generated more interest anyway.

A more interesting theory is that we're seeing the paradox of choice in action - that giving people more choices creates anxiety and results in them not choosing anything. By reducing the number of items available to choose from, it's easier to make a choice and click on something.

We'll come back to this in a few weeks and see if we're still generating more clicks with fewer features.


The technical bit


If you're interested in how we're tracking this, read on.

Clicks on each of the carousel slides are tracked as events in Google Analytics. Analytics requires three values when tracking an event (category, action and label), which we populate as follows:

  • Event Category: Carousel
  • Event Action: slide-N-clicked
  • Event Label: Destination URL

So an example event might look like:
'Carousel', 'slide-1-clicked', 'http://www.york.ac.uk/50/events/china/'

The call for each event is automatically assembled with jQuery, so there's no manual tagging required each time a new item is added to the carousel.
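A minimal sketch of how this can be wired up, using the ga.js _gaq syntax that was current at the time. The selectors here are illustrative rather than our exact markup:

// Attach one handler to every link in the carousel - no per-item tagging needed.
// '#carousel .slide a' is an assumed selector; adjust to match the real markup.
$('#carousel .slide a').on('click', function () {
  var position = $(this).closest('.slide').index() + 1; // 1-based slide position
  // Category, action and label, as described above
  _gaq.push(['_trackEvent', 'Carousel', 'slide-' + position + '-clicked', this.href]);
});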

Categorising events in this way seems to work well as it allows us to view the data in quite a lot of different ways within Analytics.

Something to look out for


Something that tripped me up at first was that not all of the clicks on the carousel were registering with Analytics; the tracking requests were showing up as 'cancelled' in Chrome's inspector panel.

Look out for cancelled requests (shown in red) in Chrome's inspector panel

The simple solution to this is to delay the request by a fraction of a second so that the event has time to register before the destination page starts to load. See Google's documentation on tracking outbound links for how to do this.
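For reference, the delayed-redirect pattern from Google's outbound link documentation looks roughly like this. Again, the selector and the 100ms delay are illustrative assumptions:

$('#carousel .slide a').on('click', function (e) {
  e.preventDefault(); // stop the browser navigating straight away
  var url = this.href;
  var position = $(this).closest('.slide').index() + 1;

  _gaq.push(['_trackEvent', 'Carousel', 'slide-' + position + '-clicked', url]);

  // Give the tracking request a fraction of a second to complete
  // before loading the destination page
  setTimeout(function () {
    document.location.href = url;
  }, 100);
});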

3 comments:

  1. Great post Paul, I've been interested in this topic myself. You might want to try disabling the auto-forward completely too and see the results.

    Have a read of this:
    http://www.nngroup.com/articles/auto-forwarding/

  2. Thanks Paul, really interesting! What a great little bit of research. Very interested in the paradox of choice thing - will be good to see if this continues.

  3. Do you have data on what percentage of site visitors clicked the slideshow at all?
