You create a great email program; you optimize creative, timing, and targeting; the recipient finds it interesting and clicks through to the Web site. Touchdown! Or is it? Just how well is the user rewarded for that effort once he reaches the Web site? Landing pages and direct-response Web experiences can make or break the email marketer. Most companies develop site experiences based on marketing intuition instead of empirical data about how a page, a group of pages, or a task performs on the site. While there are many issues involved in tracking discrete experiences back to the individual user and then providing path and page analysis, there are several approaches to optimizing your pages. The choice depends somewhat on your appetite for budget and detail.
I interviewed Michael Zell, director of advanced optimization in Avenue A / Razorfish's Web Optimization group, to provide an overview of what this involves and to give you a brief "window" into his world.
What is advanced optimization and how does it fit into an interactive agency?
Michael Zell: We created a group within the agency focused on maximizing the performance of our clients' Web sites. My team works across the agency's offices to deploy this capability, which is part granular data analysis and part usability and design expertise. We primarily use Atlas' site optimization tools as our data source, but we will also look at data from other Web analytics packages when necessary (for example, to understand where those packages may not be providing enough information to derive actionable findings and recommendations). We also deploy site-side surveys to understand user attitudes alongside the behavioral analysis. We integrate this behavioral and attitudinal data to identify friction points and barriers to conversion and to provide evidence-based site recommendations that optimize performance.
How are people optimizing sites today?
Optimization generally falls into two categories: quantitative and qualitative. Quantitative optimization typically involves analysis of Web analytics data and testing (either A/B or multivariate). Multivariate testing is beginning to take off as the tools get better, but few companies have the resources to manage it effectively and in a timely manner. Qualitative optimization is done primarily through focus groups, usability studies, and expert heuristic review.
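To make the A/B side of that quantitative bucket concrete, here is a minimal sketch of how a landing-page test might be read out: a two-proportion z-test comparing conversion rates on two variants. The visitor and conversion counts are made up for illustration; a real program would also plan sample sizes and traffic splits up front.

# Minimal A/B test readout: compare conversion rates on two landing-page
# variants with a two-proportion z-test (illustrative numbers only).
from statistics import NormalDist

def ab_test(conv_a, n_a, conv_b, n_b):
    """Return B's lift over A and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return (p_b - p_a) / p_a, p_value

lift, p = ab_test(conv_a=480, n_a=10_000, conv_b=545, n_b=10_000)
print(f"Lift: {lift:.1%}, p-value: {p:.3f}")  # Lift: 13.5%, p-value: 0.037

If the p-value is small (conventionally below 0.05), the difference between the variants is unlikely to be noise; otherwise, keep the test running or call it a wash.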
What are the greatest flaws in how people are optimizing Web experiences today?
First, I think many people are trying to test their way into optimal designs or user experiences. Testing is almost always a good approach to finding what works and what doesn't, but sometimes performance can only be meaningfully improved through changes to the information architecture or a more fundamental redesign of a site or process.
Second, while behavioral analysis and Web analytics are good tools for identifying key drop-off pages within a Web site, they cannot, by themselves, identify specific problem areas within those pages. Nor does friction in key user paths reveal what might be missing from a user experience, which is why qualitative tools such as focus groups or site-side surveys are effective complements and should also be considered. For example, we worked with a financial services company that had a good application process, but many users wanted a better card-comparison tool to help in the decision-making process. An online survey uncovered this as an important unmet user need, and coupled with the behavioral data analysis we performed, it gave us a holistic view of the site's users' needs.
Third, many companies don't understand their performance relative to their industry and other industries.
How difficult is it to do this type of analysis?
While the approach is straightforward, getting the right data and analysis can be challenging. Another big challenge is finding and training people who have a combination of data analysis and design skills. Lastly, many companies have the tools to collect data but don't have the staff to analyze and interpret it.
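Zell's point about drop-off pages is worth pausing on, because it's the kind of analysis most Web analytics exports already support. The sketch below computes step-to-step fall-off through a hypothetical application funnel; the step names and visitor counts are invented for illustration.

# Funnel drop-off report: unique visitors reaching each step and the
# fall-off from the previous step (hypothetical data).
funnel = [
    ("Landing page",         10_000),
    ("Card comparison",       4_200),
    ("Application start",     1_900),
    ("Application complete",    760),
]

previous = None
for step, visitors in funnel:
    if previous is None:
        print(f"{step:<22} {visitors:>6,}")
    else:
        drop = 1 - visitors / previous
        print(f"{step:<22} {visitors:>6,}  ({drop:.0%} drop-off from prior step)")
    previous = visitors

A report like this tells you where users leave but, as Zell notes, not why; that's where surveys and qualitative work come in.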
What type of return or learnings can you expect?
We are able to provide validation about customers or site experiences that companies may have debated but could not measure or understand, and we use this data to help clients and our client-service teams prioritize design changes. Many companies are paralyzed by too much data or by internal debate about what should be changed and how; using real user data to generate insights and recommendations moves decisions forward, especially in large organizations.
****
One final thought: without a good grasp of what justifies "optimization," the how-tos will never meet your expectations.