Around the Net

Options For Tracking Referrer In URLs

Nathan Buggia outlines a few possible ways to track the source of clicks or requests. One is to append tracking parameters to URLs, but that approach can lead to "significant issues with search engines," which he explains in the post.

For instance, he believes it can cause duplicate content issues when a search engine bot finds multiple valid URLs that point to the same page. It can also cause ranking issues if links to a page don't all point to the same URL. Robots.txt is one relatively simple way to keep search engines from indexing URLs that contain tracking parameters, he adds; a rough sketch follows.
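As a minimal illustration of that robots.txt approach (the parameter name "source" is hypothetical, and major crawlers such as Googlebot and Bingbot honor the * wildcard in Disallow rules), a site might block crawling of any URL carrying the tracking parameter:

    User-agent: *
    # Block URLs where the tracking parameter appears as the first query parameter
    Disallow: /*?source=
    # Block URLs where it appears after other parameters
    Disallow: /*&source=

Note that blocked URLs may still appear in results if linked externally; this only prevents crawling of the parameterized variants.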

Read the whole story at Jane and Robot »
