Around the Net

SEO Guidelines For Robots.txt

Adam Audette presents another view on robots.txt usage. He lays out the two typical opposing schools of thought before offering his own insight and five best practices. The advice ranges from never using the robots.txt file to handle duplicate content to never using robots.txt to keep URLs out of Google's search results.
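The point about not using robots.txt to keep URLs out of search results comes down to how the file works: a Disallow rule blocks crawling, not indexing, so a blocked URL can still show up in Google's results if other pages link to it. A minimal sketch illustrates this (the paths here are hypothetical, not from Audette's article):

```text
# robots.txt — blocks crawling only; the URL may still be
# indexed (without a snippet) if external pages link to it
User-agent: *
Disallow: /private/

# To reliably keep a page out of Google's results, leave it
# crawlable and add a robots meta tag to the page itself:
#   <meta name="robots" content="noindex">
```

Note the interaction: if the page is disallowed in robots.txt, the crawler never fetches it and so never sees the noindex tag, which is why the two directives should not be combined.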

Read the whole story at Rimm-Kaufman Group »
