Today, I want to spend a little time on what for some might be “the basics”, but is an often overlooked and always important part of analytics. Generally when people think of analytics, they think of two things: reports or insights. I’d like to place a third item on that list today: optimization. While technically this is just another way to get insights, it is different enough that I think it deserves its own separate attention (and this blog post!).
Website optimization is the practice of letting site visitors choose which version of a page or page element is best. This isn’t by voting, but by their unbiased actions. Website optimization comes in two basic forms: A/B testing and multivariate testing. In A/B testing you generally create two entire pages and randomly show one or the other to each visitor as they come to the site. One design may have a big product photo and the other may have a chart showing how the product delivers value. In A/B testing, the winner is determined by seeing which of these two designs produces more sales, leads or whatever other conversion you might have.
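The mechanics above can be sketched in a few lines of Python. This is a hypothetical illustration, not code from any particular testing tool: each visitor is deterministically assigned to version A or B (so repeat visits see the same page), and the winner is simply the version with the higher conversion rate.

```python
import random

def assign_version(visitor_id, versions=("A", "B")):
    # Seed on the visitor id so the same visitor always sees the same version.
    rng = random.Random(visitor_id)
    return rng.choice(versions)

def conversion_rate(conversions, visitors):
    # Fraction of visitors who completed the goal (sale, lead, etc.).
    return conversions / visitors if visitors else 0.0

# Made-up numbers: version B converts at 4.0% vs. 2.5% for version A.
rate_a = conversion_rate(25, 1000)
rate_b = conversion_rate(40, 1000)
winner = "A" if rate_a > rate_b else "B"
```

In practice you would also check that the difference is statistically significant before declaring a winner, which is exactly the leg-work tools like Google Website Optimizer handle for you.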
The second major optimization testing practice is multivariate testing. In this scenario, individual elements within a single page are changed randomly. Generally this is two or more elements, such as a headline, body text, photo/image, call-to-action or really anything else. The chart on the right shows two elements (star and circle) being varied based on color, position and inclusion. These tests are more complicated to set up and also require a great deal more traffic to determine a winner. Fortunately tools like Google Website Optimizer or Omniture’s Test & Target take a lot of the leg-work out of configuring these tests.
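To see why multivariate tests need so much more traffic, consider a hypothetical test crossing three page elements with two variants each. The number of combinations multiplies, and every combination needs enough visitors to measure:

```python
from itertools import product

# Illustrative variants only -- the element names are assumptions, not
# from any real test.
headlines = ["Save time", "Save money"]
images = ["product photo", "value chart"]
ctas = ["Buy now", "Learn more"]

# Every combination becomes its own page version to test.
combinations = list(product(headlines, images, ctas))
# 2 x 2 x 2 = 8 versions, each needing its own share of traffic.
```

Add one more element with three variants and you are at 24 versions, which is why we limit the number of variables when traffic is scarce.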
Here are a few examples of where DigiKnow has used this recently, and the results we achieved.
A manufacturer had been using their website to generate leads for their direct-sales product. Over time, they had continued to see their conversion rates from AdWords and other sources heading in the wrong direction. The DigiKnow analytics team was brought in to review their process and help identify areas that could be improved. After reviewing their Google Analytics and overall configuration, we recommended using Google Website Optimizer targeted at the paid search traffic. Although we considered many optimization elements on the site, because of the desired turnaround for answers and the volume of traffic running through the test, we limited the number of variables in each experiment. In some cases we optimized headlines, in others image variations, and in still others call-to-action/response mechanisms.
The result: over the course of 6 months, their paid search conversion rate was up 60% from its low point, and in certain cases was even much higher. The best example within their test set came from testing a customer testimonial against a product chart image: the winning configuration converted at nearly twice the site average. This is just one example of the quick wins that were derived from this program.
My second case is a little different from the first one in that we didn’t use an off-the-shelf optimizer tool. Our client in this case had a small product information (not sales) website and wanted some insights into which products were the most popular. When we reviewed their analytics, we weren’t too surprised when the popularity of the products matched the order of the products within the navigation. The most popular item was first on the list, second most popular was second, etc. We recommended that we randomize the product order (within two product categories) to determine the true popularity. Because the navigation appeared in three separate places on the site, we ended up implementing custom programming to handle the randomization, then used Google Analytics to assess the data.
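A minimal sketch of that custom randomization might look like the following. This is an assumption about the approach, not the client’s actual code: the product list is shuffled once per visitor, seeded on a visitor identifier, so all three navigation placements show the same order for the duration of that visit.

```python
import random

def randomized_order(products, visitor_id):
    # Seeding on the visitor id keeps the order stable across the three
    # navigation placements (and across page views) for one visitor,
    # while still varying it between visitors.
    rng = random.Random(visitor_id)
    shuffled = list(products)
    rng.shuffle(shuffled)
    return shuffled

# Hypothetical product names for illustration.
group_one = ["Product A", "Product B", "Product C"]
order = randomized_order(group_one, visitor_id=123)
```

With the order varying per visitor, each product accumulates impressions in every position, and the analytics data can separate true popularity from position bias.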
The results: As expected, we found that in the first product group, the popularity of the products was directly dependent on their order in the list. The first product got the most traffic, the second the next most, and the third the least. What was more interesting was that in the second group (always listed after the first) this was not the case. After people looked past the first group, they were just as likely to click the fourth link as the seventh (4 products in group #2). We also looked at which individual products performed above or below the mean for their position and were able to determine which were truly most popular. One interesting insight: a product in Group #2 that was below average in most positions was actually the most popular in the last position. Somehow the product image must have drawn the eye most when it was at the end of the list.
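The position-vs-popularity analysis can be sketched like this. The click counts below are made-up numbers purely for illustration; the idea is that once the order is randomized, a product’s clicks in a given position can be compared against the average for that position:

```python
# Hypothetical observed clicks, keyed by (product, position).
clicks = {
    ("P4", 1): 120, ("P5", 1): 100, ("P6", 1): 95, ("P7", 1): 85,
    ("P4", 4): 30,  ("P5", 4): 25,  ("P6", 4): 20, ("P7", 4): 45,
}

def position_mean(position):
    # Average clicks received by any product when shown in this position.
    vals = [c for (prod, pos), c in clicks.items() if pos == position]
    return sum(vals) / len(vals)

def over_performs(product, position):
    # A product "beats" a slot when it exceeds that slot's mean.
    return clicks[(product, position)] > position_mean(position)
```

In this toy data, "P7" underperforms in the first position but beats the mean in the last one, mirroring the surprising pattern we saw with the real Group #2 product.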
To further this experiment, we are updating the product images of the underperforming products to see if we can increase their appeal relative to the other products or if they are ultimately just less interesting overall.
These are just two examples of how optimization can quickly test the assumptions that companies (and agencies) make when building websites.
Optimization lets you know what works, rather than just guessing.