In this post on the basics of search, Matt Cutts tells us that Google's algorithms decide what to index by asking more than 200 questions, such as how many times a page contains your keywords, whether the words appear in the title and the URL, and whether the page includes synonyms for those words. Knowing how search works can help SEO and paid search professionals build better campaigns.
Nichola Stott explains ways to increase click-throughs in natural search results. For example, to optimize the meta-description, short and snappy sometimes works better than the full 152 characters. There's also no need to front-load keywords in your meta-description, though it is a good idea to ensure your lead keyword for the page is included. She explains that the objective is to make the description readable and impactful.
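Stott's guidelines lend themselves to a quick sanity check. Here is a minimal sketch, assuming you have the description text and the page's lead keyword as plain strings; the 152-character ceiling comes from the article above, and the function name is purely illustrative:

```python
def check_meta_description(description: str, lead_keyword: str, max_len: int = 152) -> list[str]:
    """Flag common meta-description issues: over-length text and a missing lead keyword."""
    issues = []
    if len(description) > max_len:
        issues.append(f"too long: {len(description)} > {max_len} characters")
    if lead_keyword.lower() not in description.lower():
        issues.append(f"lead keyword '{lead_keyword}' missing")
    return issues

# A short, snappy description that still includes the lead keyword passes cleanly.
print(check_meta_description(
    "Compare PPC ad platforms and pick the right one for your budget.",
    "PPC",
))  # → []
```

Note that the check is deliberately loose about keyword placement, per Stott's point that front-loading keywords is unnecessary as long as the lead keyword appears somewhere in a readable description.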
Get brand face time with consumers, determine the audience, and learn your competitors' strategies. Knowing these PPC points can help make writing ad text for the Google Content Network a little easier. And while the network can deliver many impressions, getting a click can be challenging. So Erin suggests putting your brand name in the headline or ad itself to capitalize on all of those impressions.
While Mark Jackson's reflections on an SEO conference provide a good foundation for this primer of SEO basics, scroll halfway down to get to the meat. That's where he discusses competitive analysis and knowing the correct title tags to use. Still, Jackson's advice won't work if your content lacks focus, so understand and identify your audience first.
Mike Tekula provides an interesting view on hacking Google Analytics to suit your needs. No platform provides everything, and sometimes you need a few more features than the original tool offers. Tekula tells us about a few clever SEOs who take on complicated projects. They walk through how to customize filters for ranking and tracking, enhance interaction and surface more data through Greasemonkey userscripts, and more. And while Tekula makes several outstanding points, he probably missed my article explaining how Google Analytics now tracks more than the last referring source.
Cybercriminals rely on blackhat SEO tactics to "poison" and manipulate search engine results to make links appear higher than legitimate ones, according to Patrik Runald. Links infected with malware and Trojans can appear near the top of the search results, generating a greater number of clicks to malicious Web sites. This trend is becoming more common. The average number of malicious sites in Google searches using Google trending topics in 2009 rose to 13.7% for the top 100 results, Runald writes. This means for every 100 results, around 14 of the suggested links may link to a malicious site ...
Advertisers have lots to say to potential customers, but Jeremy Hull suggests not cramming everything into one paid search ad. Hull provides advice on writing paid search ads, from keeping it simple to knowing the audience and speaking their language. Hull reminds us that Ernest Hemingway, author and journalist, once wrote a story using just six words, and is said to have called it his best work. The story: "For sale: baby shoes, never worn."
Just because Google's bots crawl a Web page, it doesn't mean that page got indexed. So Tom Critchlow provides a nontechnical step-by-step guide to help you understand what pages Google crawls on your site and how to compare that to the pages getting traffic. He notes that site reviews are a good way to spot search pages being crawled, and demonstrates how to back up the claim with the data to show the client.
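The crawl-versus-traffic comparison Critchlow describes boils down to a set difference. Here is a minimal sketch, assuming you have already exported crawled URLs (say, from server logs) and traffic-receiving URLs (say, from an analytics report); the URLs and variable names are hypothetical:

```python
# Hypothetical exports: paths Googlebot requested vs. paths earning search traffic.
crawled = {"/", "/products", "/products/widgets", "/about", "/old-promo"}
getting_traffic = {"/", "/products", "/about"}

# Pages that are crawled but earn no traffic: candidates for consolidation,
# better internal linking, or improved on-page targeting.
crawled_no_traffic = sorted(crawled - getting_traffic)

# Pages earning traffic with no observed crawl: often a gap in your log
# sample rather than a real indexing problem.
traffic_no_crawl = sorted(getting_traffic - crawled)

print(crawled_no_traffic)  # → ['/old-promo', '/products/widgets']
print(traffic_no_crawl)    # → []
```

The same two lists, shown side by side, are the kind of data Critchlow suggests bringing to a client to back up a site review.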
Bill Slawski points to an older patent from Google that explores ways to index the "deep" Web when explaining a project by the Regents of the University of California, which represents a group of universities across the state. Supported by government funding, people at the universities have begun work to improve the types of pages search engines find, crawl, index and serve up to people searching for information on the Web. Slawski tells us why indexing deep pages is important, and points to a simplified whitepaper for those who want to take a deeper dive.
Will Alexa remain relevant in 2010, as search engines continue to tweak algorithms? That's a question Aaron Wall decided to research, though he claims to have little faith in the ranking tool. Those who install the Alexa toolbar can see additional query data that reveals paid search opportunities for their site, he explains. Alexa's features provide downstream and upstream traffic sources, percentage of search traffic, sub-domains, top search queries, demographics, and more.