SEO Basics - Making Sure Your Site Is Ready For Year-End
Hope everyone had a great Turkey Day! Now that you've all recovered from your tryptophan turkey comas, I hope you're all geared up and ready for the giving season. (Just starting out? Check out last month's post for year-end best practices.) But before we head into the biggest time of year for nonprofits, I thought it might be a good idea to make sure everyone has their bases covered as far as search engine optimization (SEO) goes.
Organic traffic is likely one of the largest sources of visitors to your site, so we want to catch any issues that could easily be fixed. (Not sure about the difference between organic and paid traffic? Paid traffic comes from visitors who clicked one of the ads along the top, sides, and sometimes bottom of the search engine results page. Organic traffic comes from visitors who clicked a link in the normal, non-paid search results.) Let's make sure search engines can easily find all of the pages on your site and that there aren't any health issues with your site (like broken links).
Robots.txt
A robots.txt file tells search engines like Google which parts of your site their bots can and cannot crawl. For example, if part of your site requires users to log in, you may not want those pages showing up in organic search results. Or you may have a subdirectory used to preview how a certain ad or picture would display on your site that you also wouldn't want indexed.
Creating a robots.txt file isn't hard. Start by gathering a list of any subdirectories you don't want search bots to crawl (if there aren't any, that's fine too). Then specify the bots you wish to address. The most common choice is an asterisk (*), which addresses all bots that could potentially crawl your site. You can also address specific bots separately, in case you want Google to find certain things and Bing to find others, but that's a little more in-depth.
Next, add any subdirectories you don't want bots crawling to the Disallow: list as you see in the example below:
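Here's a sketch of what that looks like. The subdirectory names below are made up for illustration; swap in your own:

```
User-agent: *
Disallow: /members-only/
Disallow: /ad-preview/
```

Each Disallow line blocks one subdirectory (and everything beneath it) for the bots named in the User-agent line.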
Any subdirectory that you don't include in the disallow list will be assumed as allowable. If you don't have anywhere on your site that you wouldn't want bots crawling, the basic robots.txt to allow all pages looks like this:
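That allow-everything file is just two lines: address all bots, and leave the Disallow value empty.

```
User-agent: *
Disallow:
```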
Once you have your file, upload it to your site's root directory so it lives at www.examplenonprofit.org/robots.txt, and you should be good to go!
XML Sitemaps
An XML sitemap is like a table of contents for search engines: it helps them find and crawl all of the pages on your site. Why is this important, you may ask? Because whenever a Google bot crawls a page of your site, it adds that page to Google's index, which is what allows the page to show up in organic search results. Google will crawl your site even without an XML sitemap, but if you create one and tell Google where it is, Google will very likely crawl your site faster. The internet is a large place (a huge series of tubes, I'm told), so helping Google find all of your site's pages will only help you.
XML Sitemaps are usually created by a tool that crawls your site, such as GSite Crawler. This particular free tool will crawl your site, and then export an XML file with a list of all the URLs on your entire site (you can also tell it to read your robots.txt, so it won't index anything you want hidden). Then you simply upload that file to your root directory at www.examplenonprofit.org/sitemap.xml, and show Google and Bing where it's located (via Webmaster Tools, but more on that in a minute). Many site crawler tools will also give you a list of broken links on your site, which is incredibly helpful, as you never want a user going to a 404 error page. Here's an example of how an uploaded sitemap made by GSite Crawler looks.
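As a rough sketch (the URLs here are made up), a generated sitemap follows the standard sitemap protocol and looks something like this:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.examplenonprofit.org/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.examplenonprofit.org/donate</loc>
  </url>
</urlset>
```

Each url entry needs a loc (the page's address); tags like priority are optional hints, and a crawler tool will fill all of this in for you automatically.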
Video Sitemaps
Video sitemaps work much like XML sitemaps, but they index all of the videos on your site instead. Since search engines have been placing more and more emphasis on videos and ranking them well in organic search results, it's important to make sure they know where all of your videos are.
Unfortunately, I haven't been able to find a good video sitemap creation tool (if you know of one, please leave it in the comments!), which means you'll have to create your video sitemap manually. If you want to learn how (or want to hand it off to your favorite coder), here's Google's blog post explaining it. Even though creating them can be a pain, it's definitely worth it if you have a fair amount of videos on your site. I've seen the organic rankings for videos rise dramatically after implementing a video sitemap.
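For reference, a hand-built video sitemap uses Google's video sitemap schema on top of the regular sitemap format. A single-video sketch (with made-up URLs and titles) looks roughly like this:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.examplenonprofit.org/our-story</loc>
    <video:video>
      <video:thumbnail_loc>http://www.examplenonprofit.org/thumbs/our-story.jpg</video:thumbnail_loc>
      <video:title>Our Story</video:title>
      <video:description>A short video about our mission.</video:description>
      <video:content_loc>http://www.examplenonprofit.org/videos/our-story.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

You'd repeat a url entry like this for each page on your site that hosts a video; Google's post linked above covers the full list of tags.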
Webmaster Tools Accounts (Google & Bing)
A webmaster tools account is a wonderful thing. Both Google and Bing have webmaster tools sites, so you'll need to create an account with both to cover all your bases. (And since Bing powers Yahoo's search results now, a Bing webmaster tools account will cover both Bing and Yahoo.) Webmaster tools accounts can be very helpful in a lot of ways. Here are just a few examples of things you can do in them:
- Show Google or Bing where your XML Sitemap or Video Sitemap is, and ask them to crawl or re-crawl it (short of calling Google or Bing, webmaster tools is the best way to communicate with search engines about indexing your site).
- Spot potential health issues with your site, like crawl errors (think of it as a doctor's check-up for your website)
- Demote organic sitelinks (Organic sitelinks are additional links that occasionally show up below your organic search listing. Although you can't choose organic sitelinks, you can demote certain links that you'd prefer not be shown)
- See the top sites that link to your site, and your most linked content
If you don't have a webmaster tools account already, it's not hard to create one. Both Google and Bing will require that you verify that you own the site, usually by placing a meta tag on your homepage or uploading a small verification file. Then you're off and running!
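As an example, Google's meta tag method looks like the snippet below. The content value here is a placeholder; you'll paste in the unique string the tool gives you, inside the head section of your homepage:

```
<head>
  <!-- Paste the tag exactly as your webmaster tools account provides it -->
  <meta name="google-site-verification" content="your-unique-verification-string" />
</head>
```

Once the tag is live, click the verify button in your account and it will check your homepage for the tag.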
Have questions? Leave a comment below.