Buggy, bloated or otherwise compromised source code can actually be the cause of a site's drop in rankings, or its failure to rank at all, on the search engines. So Stoney deGeyter offers a rundown of the most common code problems and their solutions, starting with errors that keep the engines from crawling your page as a whole.
"There are numerous coding errors that can essentially stop the spiders
from grabbing, indexing and evaluating your copy properly," deGeyter says. "For example, if you forget to close your
tag the search engines
may not take your body copy into
consideration, not knowing that it's actual content. By using validated code you are 100% certain to eliminate these kinds of potential problems."
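To see why an unclosed tag can hide your body copy from a parser, here is a minimal Python sketch. It is not from deGeyter's article: the unclosed <title> tag and the ContentExtractor class are hypothetical examples of the general problem, and real search-engine crawlers are far more forgiving than this toy parser.

    # Toy illustration (not how actual spiders work): a naive parser that
    # buckets page text as "title" or "body" based on which tags are open.
    # With an unclosed <title>, every bit of text lands in the title bucket
    # and the body copy effectively disappears.
    from html.parser import HTMLParser

    class ContentExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title_text = []
            self.body_text = []

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            text = data.strip()
            if not text:
                return
            if self.in_title:
                self.title_text.append(text)
            else:
                self.body_text.append(text)

    # Hypothetical markup: the first page never closes its <title> tag.
    broken = "<html><head><title>My Page</head><body>Real body copy here.</body></html>"
    valid = "<html><head><title>My Page</title></head><body>Real body copy here.</body></html>"

    for label, markup in [("broken", broken), ("valid", valid)]:
        parser = ContentExtractor()
        parser.feed(markup)
        print(label, "-> body copy seen:", " ".join(parser.body_text))

Running the sketch, the "valid" page yields its body copy while the "broken" page yields none, which is the kind of misreading validated code is meant to rule out.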
Read the whole story at Search Engine Guide »