Search engine optimization tips

First recommendation: use readable and meaningful parameter names and values. It is common for a dynamic URL to look like http://www.mysite.com/somepage.asp?ID=4532&CATID=743994. Although LinkFreeze can optimize this URL for search engine indexing, numeric IDs like these are a poor choice, because search engines respect keywords they find in file paths and names. For example, if your site provides geographical information, a URL could look like http://mysite.com/info.asp?country=UK&city=London. After optimizing with LinkFreeze this URL becomes http://www.mysite.com/info~country~UK~city~London.asp, and search engines will pick up 5 keywords from it. Don't try to spam search engines with unrelated keywords in the URL; most search engines have spam protection.
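The rewrite above can be sketched in a few lines. This is only an illustration of the transformation, assuming the path has a file extension; the actual LinkFreeze rule format is configurable and may differ:

```python
from urllib.parse import urlsplit, parse_qsl

def freeze_url(url):
    """Illustrative sketch: rewrite a dynamic URL into a static-looking one,
    folding query parameters into the file name with '~' separators, as in
    the example above. Assumes the path ends in a file extension."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query)
    if not query:
        return url  # nothing to rewrite
    base, dot, ext = parts.path.rpartition(".")
    pairs = "~".join(f"{name}~{value}" for name, value in query)
    return f"{parts.scheme}://{parts.netloc}{base}~{pairs}{dot}{ext}"
```

With the URL from the example, freeze_url("http://mysite.com/info.asp?country=UK&city=London") yields http://mysite.com/info~country~UK~city~London.asp.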


Return session-less content. This can be a problem for online forums and other applications that require a logged-on user to work. If your sessions use cookies, search engines will simply ignore them. The worst situation is when your application stores the session identifier in the URL: search engines may then index these URLs, session identifier included. This produces a lot of garbage in backward links to your site and may also confuse your web application engine. It is recommended to detect search engine crawlers by their User-Agent header and remove all session-dependent content from the pages served to them.
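A minimal sketch of this interception, assuming a handful of well-known crawler User-Agent substrings and a hypothetical link_for helper that decides whether to append a session id to a URL:

```python
import re

# A few well-known crawler User-Agent substrings; real deployments
# maintain a much longer, regularly updated list.
CRAWLER_PATTERN = re.compile(r"googlebot|bingbot|slurp|baiduspider|yandex", re.I)

def is_crawler(user_agent):
    """Return True if the User-Agent header looks like a search engine crawler."""
    return bool(CRAWLER_PATTERN.search(user_agent or ""))

def link_for(url, session_id, user_agent):
    """Hypothetical helper: append the session id to outgoing links only
    for human visitors, so crawlers see session-less URLs."""
    if is_crawler(user_agent):
        return url
    return f"{url}?sessionid={session_id}"
```

This way a crawler indexes /forum.asp, while a logged-on visitor keeps receiving /forum.asp?sessionid=... links.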


Don't use JavaScript, Flash or Java applets for site navigation. LinkFreeze, just like search engines, can't extract links from JavaScript, Flash, Java applets, ActiveX controls, etc. If you have such navigation on your site, you will need to provide a site map page with normal HTML links for search engine indexing.
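Such a site map page is just plain anchor tags. A small sketch of generating one from a list of (url, title) pairs (the pairs themselves are made up for illustration):

```python
from html import escape

def sitemap_html(links):
    """Render a plain-HTML site map from (url, title) pairs so crawlers
    that ignore JavaScript navigation can still reach every page."""
    items = "\n".join(
        f'<li><a href="{escape(url, quote=True)}">{escape(title)}</a></li>'
        for url, title in links
    )
    return f"<ul>\n{items}\n</ul>"
```

Each entry becomes an ordinary <a href> link that both LinkFreeze and search engine crawlers can follow.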


Ensure there is only one way to retrieve the same information from the site. For example, you may have one URL to display an article, another to return a printer-friendly version, and more URLs to rate the article. This spams the search engine with the same content under different URLs. Sometimes a search crawler may even fall into an infinite loop, indexing the same information again and again while its URL keeps changing. In this case you will need to tell search engines what they should not index, using a robots.txt file or a META tag directive. Here is an example of META tag usage: <META NAME="Robots" CONTENT="noindex">.
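For the robots.txt route, a fragment like the following keeps crawlers out of duplicate views while the canonical article URLs stay indexable. The paths here are hypothetical, chosen to match the printer-friendly and rating examples above:

```
# Hypothetical example: block duplicate views of the same articles
User-agent: *
Disallow: /print/
Disallow: /rate/
```

The META tag variant belongs in the <HEAD> of each individual page you want excluded, whereas robots.txt lets you exclude whole URL patterns in one place.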