If Google cannot accurately spider your website, you have no chance of ranking well in its search results. Between widgets, Flash, and JavaScript, it can be hard for a designer to appreciate how Google actually sees a site. Here are two common techniques for viewing your website as the Googlebot.
Recently, I realized that my Disqus comment plugin's spam control was not controlling the spam that the Googlebot would see. Disqus removed the spam for JavaScript-enabled humans; JavaScript-disabled bots, however, continued to see the spam-filled comments. As a JavaScript-enabled human myself, I would never have known the spam problem existed without these techniques.
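The same check can be automated: fetch the raw HTML (which is what a JavaScript-disabled client receives) and scan it for content that a JavaScript-enabled browser would never render. A minimal sketch, where the keyword list is purely illustrative:

```python
def visible_spam(html, keywords=("viagra", "casino")):
    """Return the keywords that appear in raw HTML, i.e. content a
    JavaScript-disabled crawler would see. The keywords are illustrative;
    use terms from the spam actually hitting your comments."""
    lower = html.lower()
    return [k for k in keywords if k in lower]
```

Feed it the body of a plain HTTP fetch of your page (no JavaScript execution) and anything it returns is visible to bots even if your browser never shows it.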
Use Google’s Cache
Search for your site and click the Cached link beside the result. This shows the page as the Googlebot last saw it. Here’s an example…
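You can also jump straight to the cached copy without searching first. A small sketch, assuming the `webcache.googleusercontent.com` endpoint that the Cached link itself points to:

```python
from urllib.parse import quote

def cache_url(page):
    """Build the URL of Google's cached copy of a page, e.g. for
    'example.com'. Assumes the webcache.googleusercontent.com endpoint
    used by the Cached link beside search results."""
    return ("http://webcache.googleusercontent.com/search?q=cache:"
            + quote(page, safe=""))
```

Opening the resulting URL in a browser shows the snapshot Google is holding for that page.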
Use Google Webmaster Tools
An even more accurate method is to ask Google to spider a page and show you the raw output, using an option in Google’s Webmaster Tools suite.
1. Log into Webmaster Tools
2. Select a site
3. Expand the Diagnostics link
4. Click the Fetch as Googlebot link
5. Input the link and wait for the fetch to complete
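In between fetches, you can approximate what the crawler receives by requesting the page yourself with the Googlebot User-Agent and simply not executing any JavaScript in the response. A minimal sketch using the standard library (the UA string is Google's published desktop crawler identifier):

```python
from urllib.request import Request, urlopen

# Google's published crawler User-Agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def googlebot_request(url):
    """Build a request that identifies itself as the Googlebot.
    Pass the result to urlopen() and inspect the raw HTML; since no
    JavaScript runs, this is close to what the bot sees."""
    return Request(url, headers={"User-Agent": GOOGLEBOT_UA})

# Usage (network call, so not run here):
#   html = urlopen(googlebot_request("http://example.com/")).read()
```

Note this is only an approximation: a server that varies its response by IP rather than User-Agent will still treat you as a browser, which is exactly why the Fetch as Googlebot report above remains the authoritative view.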
Using these techniques you can see exactly how Google views your webpages, which is essential for helping Google correctly spider and index your sites.