WWW in URLs

The WWW:

First, www.example.com is an actual subdomain of example.com, even though it is usually defined by default to serve exactly the same content as the main domain.

The problem is that both will always work: if nothing is done, one can load example.com and www.example.com interchangeably.

Here comes the Search Engine Optimization issue: those two are duplicates. Worse, by default every URL on a site has a duplicate, because you can always keep or drop the www.
The www subdomain is served by the same hosting as the bare domain.

Some bots try dropping the www when loading pages, even if they followed a link that included it, and people post links in both forms too. In the end, if you do nothing, you are very likely to find duplicate indexing in search engine results and to end up with a lower PageRank.

The solution is to redirect with a 301 HTTP status so that only one of the two forms is ever used.
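As a sketch, the canonicalization logic is simple enough to express as a small function. The hostnames below are illustrative placeholders:

```python
def canonical_redirect(host, path, canonical_host="www.example.com"):
    """Return a (status, location) pair describing a 301 redirect to the
    canonical host, or None if the request already uses it.
    canonical_host is an illustrative placeholder, not a real site."""
    if host == canonical_host:
        return None
    return (301, "http://" + canonical_host + path)
```

In practice this redirect is usually configured directly in the web server (an Apache rewrite rule or an nginx server block) rather than in application code, but the logic is the same: one host is canonical, every other host variant answers with a 301 pointing to it.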

What to choose?

The Internet is all about standards, and search engine bots follow standards, or are at least built according to them. So the obvious choice here is to always keep the www.

And it is not only about bots understanding that this is a main domain rather than a subdomain (and thus more important?).
Did I say standards? www.example.com vs example.com: which one do you think will carry more weight?

Almost every form that auto-creates links does so with a regex keyed on the www: no www, no automatically active link.
One would have to post http://example.com, which you will admit is less likely.
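To illustrate, here is a toy auto-linker of the kind many comment systems use. The regex is a hypothetical, simplified pattern keyed on the leading www, for illustration only:

```python
import re

# Simplified illustrative pattern: only URLs starting with "www." are
# recognised, so "example.com" on its own is never linked.
WWW_PATTERN = re.compile(r'\bwww\.[\w.-]+\.\w+\S*')

def autolink(text):
    """Wrap www.* URLs in an anchor tag; leave everything else untouched."""
    return WWW_PATTERN.sub(
        lambda m: '<a href="http://%s">%s</a>' % (m.group(0), m.group(0)),
        text,
    )
```

With such a linkifier, a comment containing www.example.com becomes a clickable link, while the same comment written with a bare example.com stays plain text.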

Some sites even add the www to their subdomains, but URLs can become rather long that way, which is something to watch as well.


Term Targeting tool in SEO

With the Term Targeting tool from SEOmoz, one can determine how well a particular keyword is targeted by a page. The tool assigns a letter and percentage grade reflecting the strength of the keyword usage, based on an analysis of the frequency and location of the keyword throughout the page. It uses a formula created in-house at SEOmoz. Many clients have been successful with this methodology, but search engine optimization practice and keyword usage differ from site to site, so results can vary. The tool is best used as an estimate of keyword inclusion, not as a strict set of rules for how other websites or pages should use keywords. Its purpose is to examine a web page and determine which keywords appear prominently, applying different weights to different parts of the HTML.

A keyword in an h1 tag is given more weight than a keyword in the main body copy. SEOmoz created this tool for its premium members, and it may later be opened to all users. It analyzes the content of any given page and extracts the terms and phrases the page targets in search engines. The premise is good and the analysis can be run at any time; the tool has benefited many companies in projects with different clients. Use it to gauge the strength of your keyword usage. Keep in mind, however, that the weightings it uses are the ones Rand Fishkin and his crew came up with, which makes its reports particularly valuable to SEO beginners.
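The weighting idea can be sketched as a small scoring function. The section names and weight values below are made up for illustration; they are not SEOmoz's actual formula:

```python
import re

# Hypothetical weights: a keyword in an <h1> counts for more than the
# same keyword in the body copy. Illustrative values only.
WEIGHTS = {"h1": 3.0, "title": 2.0, "body": 1.0}

def keyword_score(keyword, sections):
    """Weighted occurrence count of `keyword` across page sections.
    `sections` maps a section name (e.g. "h1", "body") to its text."""
    kw = keyword.lower()
    score = 0.0
    for name, text in sections.items():
        words = re.findall(r"\w+", text.lower())
        score += WEIGHTS.get(name, 1.0) * words.count(kw)
    return score
```

For example, a page with the keyword once in its h1 and twice in its body would score 3.0 + 2.0 = 5.0 under these toy weights, capturing the idea that placement matters as much as raw frequency.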

Spider Simulator

Today, a lot of the content displayed on web pages may not actually be visible to search engines: Flash-based content, content generated through JavaScript, and even content displayed as images. In short, anything rendered on the client side may not be visible to search engines.

Spider Simulator is a tool that simulates a search engine by displaying the contents of a web page exactly the way a search engine sees it. It can also list the page's hyperlinks, which can be very important. The simulator provides a report on the aspects of the page used to determine how highly it ranks in a search engine, and also gives some general usability feedback.

You can fetch a page through this tool to get an idea of what search engines see when they spider your site. It is very useful for seeing what a search engine encounters first on the page, though the output may look a bit ugly, since it shows what the page looks like without any HTML markup. In fact, this is a good indicator of how soundly the page is designed: it should remain readable and usable even in this stripped form. Ultimately, the tool can help your SEO and help you set up your website to drive traffic to it.
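A minimal text-and-link extractor along these lines can be built with Python's standard html.parser module. This is a toy sketch of the idea, not the actual Spider Simulator:

```python
from html.parser import HTMLParser

class SpiderSimulator(HTMLParser):
    """Collect visible text and hyperlinks from HTML, roughly as a
    crawler's text extractor would. A toy sketch, not a real spider."""

    def __init__(self):
        super().__init__()
        self.text = []
        self.links = []
        self._skip = 0  # nesting depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        # Keep only text outside script/style blocks.
        if not self._skip and data.strip():
            self.text.append(data.strip())

def simulate(html):
    """Return (visible_text, links) for an HTML string."""
    parser = SpiderSimulator()
    parser.feed(html)
    return " ".join(parser.text), parser.links
```

Run against a page, this shows the markup-free view the article describes: headings and body text flattened to plain words, plus the list of hyperlinks a spider would follow.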



Thanks – Cybersoft Team
