How Much Do You Trust Your Tools?

My last couple of articles were about balancing technology and experience.  My theme was that technology is black and white, with inherent limitations and strengths.  As a result, there will always be a need for strong best practices and experienced search specialists to get more blood out of the proverbial [search] stone.  This is not all that different from another recent example of something many thought technology would replace -- the paper industry.  At one time, many felt that computers and the Internet would make paper obsolete, but in reality we need and use more paper than ever.

To better illustrate my thinking, let's go through a couple of examples of how experience needs to augment technology.

The first, and most obvious to me, is how your budget gets spent.  Just setting budget caps and letting tools pace your money is not sufficient.  Your budget will not be spent wisely.  The engines simply try to pace out your budget evenly over a given period of time, which has negative implications for share of voice and for capitalizing on volume fluctuations.  Simply asking the engine reps for guidance here is usually of no help either, as you get back responses like "lift your budget caps."  While this can be effective, I don't think I need to belabor the point on why one needs to tread carefully going down that path.

To do this right, an experienced search marketer will look at industry growth patterns, categorical trends, changes in consumer search behavior, and advertising-specific historical data to come up with a guide as to how much budget is required and at what pace it should be spent.  Once you have an idea of what you think should happen, you can begin to optimize the pacing yourself versus relying on engine or third-party technologies to do this for you.
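As a rough sketch of what that kind of pacing looks like in practice (the weights and dollar figures below are purely illustrative, not from any client data), a budget can be allocated in proportion to expected demand instead of evenly:

```python
# Hypothetical sketch: pacing a budget by expected-demand weights instead of
# letting the engine spread it evenly.  Weights are illustrative placeholders.

def pace_budget(total_budget, daily_weights):
    """Allocate a budget across days in proportion to expected demand."""
    total_weight = sum(daily_weights)
    return [round(total_budget * w / total_weight, 2) for w in daily_weights]

# Even pacing spends the same every day; weighted pacing shifts budget
# toward the days where search volume is expected to spike.
weights = [1.0, 1.0, 1.5, 2.0, 1.5, 0.8, 0.7]  # e.g., a midweek volume spike
print(pace_budget(700.00, weights))
```

With flat weights this degenerates to even pacing, which is exactly what the engines do by default; the value comes from feeding in the trend and seasonality data described above.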

Per my last article on best practices, I want to hammer home again the importance of structure.  If you have your campaign structured right, it can be very easy to pull out a single keyword or a handful of keywords to test an uncapped budget scenario while minimizing the risk of overspending.  After short test periods, keywords can be regrouped and budget caps revised to take advantage of efficiency gains, seasonality, and/or volume spikes as they arise.

A second strong example revolves around your effective CPC.  Increasingly we find ourselves distrusting a certain bid tool [I won't call it out here, but it is one of the larger ones that was purchased by an even larger company].  The consensus is that it is not spending our budget as efficiently as it could because the tool is often overbidding.  You have the three CPC data points (Max, Min, Engine) that have to be entered, but what we question is the ratio of the Max to the actual Engine CPC.  We have played around with a variety of bid rules to manage this, and after much testing it seems one can almost always lower bids without taking a dip in position.  Try this for yourselves: just lower your CPCs by $0.05 across the board and see what happens.  If any readers are brave enough to try it, I'd love to see results posted below...
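The across-the-board test is simple enough to script.  A minimal sketch follows -- the $0.05 step is from the test described above, but the keywords, bids, and floor value are hypothetical:

```python
# Hypothetical sketch of the across-the-board test: lower every max CPC
# bid by a fixed step, with a floor so no bid drops to zero or below.

def lower_bids(bids, step=0.05, floor=0.01):
    """Return a copy of keyword -> max CPC with each bid reduced by `step`."""
    return {kw: round(max(cpc - step, floor), 2) for kw, cpc in bids.items()}

current = {"running shoes": 1.25, "trail shoes": 0.90, "cheap sneakers": 0.04}
print(lower_bids(current))
# After a short test window, compare position and click volume against
# the old bids before deciding whether to cut another $0.05.
```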

As a result, we have been testing different formulas to guide our bid rules for lowering CPC bids without negatively impacting the volume and performance of the campaign.

What the team here did was create two formulas -- one for Google/AOL and the other for Yahoo/Bing.  The main difference between them is that the Google/AOL formula takes Quality Index into consideration, while the Yahoo/Bing formula uses position.  Each takes the weighted difference of Max Bid and Average CPC in combination with QI or position (i.e., the higher the QI, the lower the bid can go) and then measures the gap between the max bid and the actual CPC.

What we have been able to do for clients is create efficiencies because we leveraged the technology to help us test and execute, but we didn't rely on the technology to spend and bid for us.  For one recent client, applying our custom bid formulas, testing appropriate budget allocations, and strategically structuring the campaign drove overall clicks up 30+%, CPC down 20+%, and CTR up 10+%, all on relatively flat spend.

Now ask yourself: how much do you trust your tools to manage your campaigns for you?

2 comments about "How Much Do You Trust Your Tools?".
  1. Scott Brinker from ion interactive, inc., April 5, 2010 at 2:12 p.m.

    Excellent post. Love the focus on harnessing the intersection of human intelligence and computer software. It's not an either-or proposition, but the synergies between the two -- especially in marketing -- have only started to be explored.

  2. Kevin Pike from Kevin Pike, April 5, 2010 at 6:20 p.m.

    For me, with any online tool it's about "levels" of trust. At what level can you trust the tool to provide cost benefit savings vs. the cost of human analysis.

    I have worked with several PPC bid management tools in the past trying to lower my time expense, but it's bit me in the ass more than once.

    Basically anytime you try to assign one formula or set of rules to an entire account or campaign you already failed. You will always have to have a human involved to keep the BS meter turned down to a minimum.
