Inviting The Crawlers To Dig Deeper On Your Site

If you're struggling to get the deeper content on your site indexed, Ann Smarty has some tips for enticing the engines' crawlers to dig in and spend more time. First, get an accurate count of your indexed pages: go beyond the standard "site:" search operator and check how many pages are indexed in each directory and subdirectory. Also check which pages were indexed most recently.
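One way to run those per-directory checks systematically is to generate a "site:" query for each directory and run them by hand in the engine. A minimal sketch, with a hypothetical domain and directory names:

```python
def site_queries(domain, directories):
    """Build one 'site:' query per directory so each directory's
    indexed-page count can be compared against its actual page count."""
    return [f"site:{domain}/{directory}" for directory in directories]

# Hypothetical domain and directories, for illustration only.
for query in site_queries("example.com", ["blog/", "products/", "members/"]):
    print(query)  # e.g. site:example.com/blog/
```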

Armed with this info, you can try to identify patterns among the pages that weren't indexed. For example, are there internal pages with inadequate link juice? Member profile pages that don't really need to be indexed? Perhaps there are duplicate-content issues between the product description pages and the shopping cart. If the problem is a lack of link juice, Smarty suggests finding quality deep-linking directories to build up deep links to your subdirectory pages.
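Pattern-spotting like this is easy to automate: diff the full URL list (say, from your sitemap) against the URLs you've confirmed are indexed, and group the gap by directory. A sketch, with hypothetical URLs:

```python
from collections import Counter
from urllib.parse import urlparse

def unindexed_by_directory(site_urls, indexed_urls):
    """Group the pages missing from the index by top-level directory,
    so patterns (e.g. every /members/ page skipped) stand out."""
    missing = set(site_urls) - set(indexed_urls)
    counts = Counter()
    for url in missing:
        segments = urlparse(url).path.strip("/").split("/")
        counts[segments[0] if segments[0] else "(root)"] += 1
    return counts

# Hypothetical URL lists: one from a sitemap, one from index checks.
site = ["http://example.com/blog/a",
        "http://example.com/members/bob",
        "http://example.com/members/ann"]
indexed = ["http://example.com/blog/a"]
print(unindexed_by_directory(site, indexed))  # Counter({'members': 2})
```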

Lastly, you want to make sure that you're not forcing the crawlers to spend too much time on bulky content. "Googlebot works on a budget," Smarty says. "If you keep it busy crawling huge files, or waiting for your page to load, or following duplicate content URLs, you might be missing the chance to show it your other pages."
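One common way to stop the crawler from burning its budget on duplicate-content URLs is to disallow them in robots.txt. A sketch, with hypothetical paths for a cart that duplicates product pages and a directory of huge files:

```
User-agent: *
# Hypothetical cart URLs that duplicate the product description pages
Disallow: /cart/
# Hypothetical directory of large downloadable files
Disallow: /downloads/
```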

Read the whole story at Search Engine Journal »
Tags: search