Commentary

Case Study: Testing On The Fly

Since "Testing" was a significant subject at last week's Conference by Mediapost Communications, this timely report from Marketing Sherpa about Case Study #CS31905: Landing Page Testing and Optimization, the Intuit Quick Books' success, in a recent "battlefield testing" atmosphere, may be a useful add-on for the readers of the Research Brief.

Reporter David Kirkpatrick says at the outset, "... find out how one B2B marketer went through a cycle of four tests on its main landing page to optimize revenue-per-visitor, found a winner, and continued to test to refine those results. This advanced look at optimization includes in-depth results on four tested landing pages... "

Marketing testing, says the report, is much more effective when it is part of a complete testing-and-optimization cycle. Learning what works should lead to new ideas to test against the previous winners. Continuing this cycle leads to fully optimizing the marketing effort. Intuit is one such company that seeks to improve its marketing through use of this virtuous circle.

Sunil Kaki, Senior Marketing Manager for QuickBooks, explaining why Intuit's testing is ongoing, says, "... even [tested] winners have a lifecycle of being impactful. I think they are good for maybe a quarter, but not more than that because the winning results decay over time. We want to have a fresh winner almost every three months... "

Ideally, testing happens in a controlled environment where one variable can be changed while the rest remain static, but this testing cycle was happening in real time on a key Intuit landing page. In real-world testing on active landing pages, many variables change. During these tests, discounts were applied at the corporate level. Since time was limited, micro tests of social elements were added to the test plan to capture learning within the overall test. Adding to this battlefield atmosphere was the fact that the QuickBooks product changed from a 2010 version to a 2011 version in the middle of testing.

The complete cycle of four tests looked at various aspects of the landing page, including ideas around social media and how the desktop version of the software coexisted with the software-as-a-service version on the same page.

Kaki pointed out that traffic to the tested landing page was affected by Intuit promotions, outside events happening at the industry level, and retail partners running promotions that drove website traffic. And, even though Intuit began seeing results from the first test, the control was kept consistent throughout the entire cycle.

The overall objective for the entire testing and optimization cycle was to determine the ideal landing page to increase revenue per visitor through an increased conversion rate and/or increased revenue per order.
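To make that yardstick concrete, here is a minimal sketch of the arithmetic in Python; the function names and sample figures are hypothetical illustrations, not data from the case study. Revenue per visitor is conversion rate multiplied by average order value (equivalently, total revenue divided by visitors), and each treatment is reported as a percentage lift over the control:

    # Minimal sketch of the revenue-per-visitor (RPV) metric used to judge treatments.
    # All figures below are hypothetical examples, not data from the case study.

    def revenue_per_visitor(visitors, orders, revenue):
        """RPV = conversion rate * average order value (equivalently, revenue / visitors)."""
        conversion_rate = orders / visitors
        average_order_value = revenue / orders
        return conversion_rate * average_order_value

    def lift_vs_control(treatment_rpv, control_rpv):
        """Percentage difference of a treatment's RPV versus the control's RPV."""
        return (treatment_rpv - control_rpv) / control_rpv * 100

    control = revenue_per_visitor(visitors=10_000, orders=200, revenue=40_000.0)
    treatment = revenue_per_visitor(visitors=10_000, orders=215, revenue=43_000.0)

    print(f"Control RPV:     ${control:.2f}")    # $4.00
    print(f"Treatment RPV:   ${treatment:.2f}")  # $4.30
    print(f"Lift vs control: {lift_vs_control(treatment, control):+.2f}%")  # +7.50%

A treatment can win either by converting more visitors, by shifting buyers to a higher-priced product, or both, which is why the results below are expressed as a single revenue-per-visitor percentage versus the control.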

For Test #1, Kaki said the main goal was to find out if there were "broad strokes" that could be taken to improve the page, and to uncover layout concepts that worked. The focus was on product and information presentation to help determine the ideal layout, so three different layout treatments, along with the control, were used to determine how much information was needed for a user to make a decision to purchase. Winning pages were determined by revenue per visitor.
Results for Test #1 (results versus control; for this testing and optimization cycle, the control remained the same landing page):

  • Treatment 1: +7.24%
  • Treatment 2: -5.26%
  • Treatment 3: +5.26%

Although the short form Treatment 1 "won" the first test, the goal for Test #2 was to build on the conversion performance of the FAQs treatment by increasing overall revenue through a more visible option to purchase the higher-priced "Premier" product, with radio buttons, tabbed layout and a Facebook "Like" button.

The second test found the "big win" for the total testing and optimization cycle. Kaki said adding the social element was very interesting because making social media part of the marketing effort was new for Intuit. Test 2 treatments involved simplifying page objectives and providing the visitor with enough information to make a purchase decision.
Results for Test #2 (results versus control):

  • Treatment 1: +22.52%
  • Treatment 2: +13.73%
  • Treatment 3: -6.05%

Test #3 only included two treatments, and refined the two high-performing pages from Test #2. This test also featured the worst "winning" results of any in the entire cycle.

Results for Test #3 (difference versus control):

  • Treatment 1: +2.02%
  • Treatment 2: -7.80%

Test #4 sought to target the user experience on the landing page by taking a deeper look at keyword targeting and operating system (OS) detection. In addition, a treatment similar to the original "Choose Your Version" layout was tested with different QuickBooks products included.

The final round of testing and optimization included the control and five treatments, with the fifth treatment added halfway through the test; it brought back the original first treatment from the first three tests by removing two elements that had been added to that page for the final test. Similar to Test #3, this test refined earlier results by looking for incremental increases in revenue-per-visitor.

Results for Test #4 (difference versus control):

  • Treatment 1: -5.05%
  • Treatment 2: -10.44%
  • Treatment 3: -8.75%
  • Treatment 4: -7.41%
  • Treatment 5: +3.37%

In concluding, Kaki said one major result Intuit found in the testing and optimization cycle was "how big of an influence a landing page could have for our product mix... critical, because the products are priced differently. If we are able to influence a higher-priced product... and take away units from lower (priced) products... " it develops a very high level of revenue-per-visitor.

Please visit Marketing Sherpa here for additional details about the study.
