Luckily, professional SEOs no longer have to rely solely on a website’s source code. SEO tools have come a long way, and it is easier than ever to view a website the same way a search engine does. Below are five tools that make this task quick and effective.
Google Cached Version
SEOs have long used the cached version of their website to see how Google views it. To view the cached version of any page, simply type “cache:www.yourdomain.com/” into the Google search box. Once the cached version of your webpage loads, click on “Text-only version”. This will let you quickly identify whether any of your text or links are not being crawled, as well as any hidden text.
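If you would rather pull the text-only cache into a script than a browser, Google has historically exposed cached copies through the webcache.googleusercontent.com endpoint. The minimal sketch below assumes that endpoint and its strip=1 (text-only) parameter still behave this way; Google may block or rate-limit programmatic access, so treat this as a convenience, not a guarantee.

```python
import requests

def fetch_google_cache(url: str) -> str:
    # Assumption: the historical text-only cache endpoint and its
    # strip=1 parameter are still available for this URL.
    cache_url = (
        f"https://webcache.googleusercontent.com/search?q=cache:{url}&strip=1"
    )
    response = requests.get(
        cache_url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10
    )
    response.raise_for_status()
    return response.text

# Placeholder domain; substitute your own page.
print(fetch_google_cache("www.yourdomain.com/")[:500])
```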
SEO-Browser.com
While checking Google’s text-only cached version of a page can be very helpful, it is difficult to do for an entire website: clicking a link in the cached version takes you to the live page, making it hard to examine the whole site quickly. For this reason I often go to seo-browser.com. It is essentially the same as viewing the text-only cached version, but it gives you a better understanding of how the pages interact with each other. Simply type your URL into the search box on the homepage and click “Simple”.
Google Webmaster Tools
Google Webmaster Tools has two great features that help you see how Googlebot views your website. For a quick reference you can use the Site Preview Tool. It shows the same thing as clicking the magnifying glass next to your listing in the SERPs, but it is easier to use because it places your actual page and the previewed page side by side and doesn’t highlight snippets of text from your site.
For a more in-depth report, perform a “Fetch as Googlebot” within Webmaster Tools. This is similar to viewing your source code, but it goes one step further by showing you how Google views your source code.
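You can roughly approximate that report from a script. The sketch below fetches a page with a Googlebot-style user agent and prints the status code, response headers, and raw, unrendered HTML. The exact user-agent string and the example URL are assumptions, and a plain HTTP client will never perfectly mirror Googlebot’s real crawling behavior.

```python
import requests

# Googlebot's published user-agent string; Google revises it over time,
# so treat the exact value as an assumption.
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def fetch_as_bot(url: str) -> None:
    """Print roughly what a Fetch as Googlebot report contains:
    the status code, the response headers, and the raw HTML."""
    response = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)
    print(response.status_code, response.reason)
    for name, value in response.headers.items():
        print(f"{name}: {value}")
    print()
    print(response.text[:1000])  # start of the raw, unrendered source

fetch_as_bot("http://www.yourdomain.com/")  # placeholder URL
```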
Web Developer Toolbar
The Web Developer Toolbar is a well-known Firefox extension that includes three features essential for seeing a website the way search engines do:
- Turning off cookies (search engine crawlers don’t carry them)
- Turning off JavaScript (search engines struggle to read it)
- Turning off Meta redirects (search engines don’t blindly follow them)
By simply disabling cookies and JavaScript while browsing a website, you will gain a better understanding of what search engines see while still keeping the overall look and feel of the site as you browse (see the sketch below).
It is also helpful to turn off Meta redirects in your browser, since search engines will typically crawl a page containing a Meta redirect before moving on, whereas some Meta redirects send visitors away so quickly that they may never realize the intermediate page exists.
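You can get a similar cookie-free, script-free view programmatically. This sketch uses Python’s requests library, which sends no cookies by default and never executes JavaScript, plus a small standard-library parser that drops script and style blocks. The target URL is a placeholder, and this only roughly approximates a crawler’s text extraction.

```python
from html.parser import HTMLParser
import requests

class TextOnly(HTMLParser):
    """Collect visible text, skipping <script> and <style> blocks,
    to approximate a text-only crawler's view of a page."""

    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.chunks.append(data.strip())

# requests sends no cookies by default and never runs JavaScript,
# so the response is close to what a crawler initially receives.
html = requests.get("http://www.yourdomain.com/", timeout=10).text
parser = TextOnly()
parser.feed(html)
print("\n".join(parser.chunks))
```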
User Agent Switcher
A user agent switcher, or spoofer, lets you change the user agent string, which tells the websites you visit which browser you are using. In other words, even if you are browsing the web in the latest version of Firefox, you can change the user agent to make it appear as if you are using IE6. My favorite is a Firefox extension from Chris Pederick, the creator of the Web Developer Toolbar mentioned above.
The best part of this extension is the ability to set the user agent string to that of any major search engine bot. This lets you quickly determine whether a website is cloaking, that is, presenting search engines with different content than users see.
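The same check can be scripted: fetch a page twice, once with a browser user agent and once with a bot user agent, and compare the responses. In this sketch the user-agent strings and URL are assumptions, and a mismatch is only a hint, since pages with timestamps, ads, or session tokens will differ between any two fetches anyway.

```python
import hashlib
import requests

# Assumed user-agent strings; the Googlebot value follows Google's
# published format but may change over time.
BROWSER_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Gecko/20100101 Firefox/115.0"
)
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def fingerprint(url: str, user_agent: str) -> str:
    """Return a hash of the response body fetched with the given user agent."""
    body = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

url = "http://www.yourdomain.com/"  # placeholder URL
if fingerprint(url, BROWSER_UA) != fingerprint(url, GOOGLEBOT_UA):
    # Dynamic content can also cause this, so verify by eye before
    # concluding that the site is cloaking.
    print("Responses differ by user agent -- possible cloaking.")
else:
    print("Identical responses for both user agents.")
```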
Conclusion
These tools and methods each have strengths and weaknesses, so it is important not to rely on any single one when reviewing a website. By using all, or a combination, of the above tools, you will be able to spot problems in a website that simple browsing can’t reveal.
Via: SEO Blog