Last week I was fortunate to speak at SMX East in New York, and I attended most of the sessions in the Technical SEO track as well. Honestly, it was one of the best SEO tracks I've ever attended at any conference. The caliber of technical issues addressed was top-notch.
Normally, I live-blog these sessions, but they were so packed with good, detailed information, I had to hold off and cover the sessions when I had more time to digest all of the data! So here are my key takeaways from those sessions. And if you're like me, you'll eat this stuff up!
I spoke on the schema.org panel with Topher Kohan of CNN. Topher's a great resource to follow on semantic Web markup, because he's been working with microdata, microformats and RDFa for several years. It's also helpful to have the perspective of a large-scale site to understand the best ways to implement markup across your website smoothly and efficiently.
The two big questions about schema.org seem to be:
1) How soon do I need to move to this format?
2) When will new rich snippets begin appearing from schema.org markup?
Topher emphasized that while the engines are encouraging use of schema.org, he wasn't planning to make it his top priority yet, since there's currently no proof that schema.org markup improves rankings or produces additional rich snippets. But he did encourage everyone to get schema.org on the radar and perhaps implement it during the next redesign. For a large organization like CNN, if the ROI isn't immediately apparent, it's tough to justify immediate implementation. Jack Mendel from Google was also on the panel and couldn't commit to a timeframe for additional rich snippet support from Google either. So I agree with Topher: if you have a redesign forthcoming, go ahead and implement schema.org where you can. Otherwise, get it on your development schedule as time allows.
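For anyone who wants to start experimenting before that redesign, schema.org markup uses HTML5 microdata attributes. Here's a minimal example marking up an article; the item types and property names come from the schema.org vocabulary, but the content itself is made up for illustration:

```html
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Schema.org Takeaways From SMX East</h1>
  By <span itemprop="author">Jane Doe</span>
  <time itemprop="datePublished" datetime="2011-09-22">September 22, 2011</time>
  <div itemprop="articleBody">...</div>
</article>
```

The `itemscope` and `itemtype` attributes declare what the block describes, and each `itemprop` labels a piece of data the engines can extract.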
Real Answers for Technical SEO Problems
Todd Nemet of Nine by Blue led this session, and he was FANTASTIC. He presented quite a few actual examples of code and how he used back-end tech work to decipher and solve SEO problems. Simply fascinating.
One of my favorite points Todd made was about how the engine bots crawl your load-balanced servers and how to detect which servers are being crawled most often. He recommended using the round-robin approach with load-balanced servers. He also demonstrated how to check server latency in a load-balancing setup and how many data packets are being lost. In both cases, involve your network engineer to repair network problems to ensure better indexing and load times.
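One way to see how crawl activity is distributed across your load-balanced servers, assuming each backend keeps its own access log, is to tally bot hits per server. This is just a sketch; the server names and log excerpts below are hypothetical, and in practice you'd read each server's real log file from disk:

```python
from collections import Counter

# Hypothetical: one access-log excerpt per backend server.
logs = {
    "web1": [
        '66.249.66.1 - - [22/Sep/2011:10:00:01] "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
        '66.249.66.1 - - [22/Sep/2011:10:00:05] "GET /news HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    ],
    "web2": [
        '10.0.0.7 - - [22/Sep/2011:10:00:02] "GET / HTTP/1.1" 200 "Mozilla/5.0"',
    ],
}

def googlebot_hits_per_server(logs):
    """Count log lines mentioning Googlebot on each backend server."""
    counts = Counter()
    for server, lines in logs.items():
        counts[server] = sum(1 for line in lines if "Googlebot" in line)
    return counts

print(googlebot_hits_per_server(logs))  # e.g. Counter({'web1': 2, 'web2': 0})
```

If one backend is getting far fewer bot hits than the others, that's a signal to hand your network engineer.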
Another fantastic suggestion for doing SEO work on the back end was to check server logs to see if your site is being scraped, who is crawling your site and how often. If you search your log files for bot information, such as "Googlebot," you'll see each entry where the bot crawled the site, which page it requested, and a timestamp of when the page was crawled. Log files are also a great way to spot status code problems, like temporary 302 redirects (or redirects that return a 200), so check for those in the log files and correct them with permanent 301s instead.
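As a quick sketch of that log check, here's how you might filter Googlebot entries and flag any 302 responses that are candidates for a permanent 301. The log lines are a simplified, hypothetical format; adjust the regular expression to match your server's actual log layout:

```python
import re

# Hypothetical simplified log lines: IP, timestamp, request, status, user agent.
log_lines = [
    '66.249.66.1 [22/Sep/2011:10:01:00] "GET /old-page HTTP/1.1" 302 "Googlebot/2.1"',
    '66.249.66.1 [22/Sep/2011:10:02:00] "GET /news HTTP/1.1" 200 "Googlebot/2.1"',
    '192.168.1.5 [22/Sep/2011:10:03:00] "GET /news HTTP/1.1" 200 "Mozilla/5.0"',
]

LINE_RE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def temporary_redirects_seen_by_googlebot(lines):
    """Return paths where Googlebot received a 302 (candidates for a 301)."""
    flagged = []
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if m and m.group("status") == "302":
            flagged.append(m.group("path"))
    return flagged

print(temporary_redirects_seen_by_googlebot(log_lines))  # ['/old-page']
```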
Making Data From Google Webmaster Central & Bing Webmaster Tools Actionable
The best speaker by far on this panel was Myron Rosmarin of Rosmarin Search Marketing, although I have to give kudos to Duane Forrester of Bing for being a really entertaining presenter as well. Myron's presentation was entitled "Stop Running Reports. Start Providing Analysis," and that title alone hooked me! So many SEOs focus on reports, but do they really provide actionable insights?
First, Myron identified the three major areas that Google Webmaster Tools and Bing Webmaster Tools focus on: diagnostics, performance and control panel. While performance data answers the question "How are we doing?", diagnostic reports answer "Is there a problem?" Similarly, reporting pulls data together, but analysis makes sense of the data.
Myron then gave six guiding principles to convert diagnostic data (available in Google and Bing Webmaster Tools) into action items:
1) Always provide a baseline. Numbers don't mean much without a point of comparison.
2) Analyze at an appropriate level of granularity.
3) Pull data at appropriate time intervals. There are several ways to define an appropriate time interval, but if you're using either engine's version of Webmaster Tools, remember that the data is not stored there infinitely.
4) This leads us to the fourth principle: Download everything! The data in these tools has a limited shelf life -- don't lose it.
5) Not everything is a problem. Some problems are short-lived. Be sure you address that in your analysis.
6) Finally, remember it's mind over matter. Myron reminded us all of the Serenity Prayer: there are some things you can change and some you can't. Have the wisdom to know the difference. Great words for any SEO to live by!
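Since the data in these tools has a limited shelf life, a simple habit is to datestamp each downloaded export as you file it away. Here's a minimal sketch of that idea; the file names and folder layout are my own invention, not anything from either tool:

```python
import shutil
from datetime import date
from pathlib import Path

def archive_export(downloaded_csv, archive_dir="wmt-archive"):
    """Copy a freshly downloaded Webmaster Tools CSV into a datestamped
    archive folder so the data outlives the tools' retention window."""
    src = Path(downloaded_csv)
    dest_dir = Path(archive_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / f"{date.today():%Y-%m-%d}_{src.name}"
    shutil.copy2(src, dest)
    return dest

# Example: archive today's (made-up) crawl-errors export.
Path("crawl_errors.csv").write_text("url,error\n/old-page,404\n")
archived = archive_export("crawl_errors.csv")
print(archived)
```

Run something like this on a schedule and you'll have the historical baseline Myron's first principle calls for, even after the tools have discarded the older data.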