We receive many questions about dynamic websites. Site owners want to know how to prepare their dynamic website for search engines, and in this article we answer that question.
Let's face it: the technology behind websites can be daunting, and it can be difficult at best to understand how your site works. That's why we wrote this article – to help you understand at least one small part of your site, and how it can (and does) impact your site's overall search engine marketing performance.
Just what is a dynamic website?
First, let's define what we mean by a dynamic website. For our purposes, a dynamic site is one that generates non-static URLs – URLs assembled from query-string variables – when building and displaying its pages.
For example, a typical URL on a dynamic site would look something like this:

www.example.com/index.php?variable1=ID1&variable2=ID2&variable3=ID3
Some dynamic sites use more variables, some fewer, but essentially a “true” dynamic site must use variables like those shown above to render its pages.
Why is this a problem?
The problem with these kinds of URLs is that search engine crawlers have trouble indexing pages whose URLs carry long strings of variables.
In other words, a crawler will not be able to fully index the URL example above. While it is true that today's more sophisticated crawlers can index some dynamic pages, in general they cannot get past more than one or two variables in the URL. In the example above, that means the crawler may only be able to index pages using “variable1” and “variable2”. Beyond that, the crawler likely will not index the page at all, and those deeper pages will not show up in search engine results.
As you can probably guess, this is a problem, because in many cases (such as e-commerce sites) the “meat” of the site – the product pages – lives beyond those first two variables. In other words, most of the site cannot get indexed (or ranked) because of those long, dynamically generated URLs.
Some CMS vendors have attempted to address the issue by implementing coding changes to help make the URLs more search engine friendly, but in many cases the result is not ideal.
That is because they take a URL like this:

www.example.com/index.php?variable1=ID1&variable2=ID2&variable3=ID3

and turn it into:

www.example.com/index.php/variable1/ID1/variable2/ID2/variable3/ID3

by replacing all the “?”, “=”, and “&” characters with “/”.
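To make that substitution concrete, here is a minimal sketch of the naive rewrite; the URL is illustrative, not taken from any real CMS:

```python
# Naive "replace the symbols with slashes" rewrite described above.
# The URL below is illustrative only.
dynamic_url = "www.example.com/index.php?variable1=ID1&variable2=ID2&variable3=ID3"

static_url = dynamic_url.replace("?", "/").replace("&", "/").replace("=", "/")
print(static_url)
# www.example.com/index.php/variable1/ID1/variable2/ID2/variable3/ID3
```

Notice that the query string becomes six extra path segments, pushing the page deep into the folder hierarchy.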
While this does make the URL more search engine friendly, because it creates a static-looking URL, it poses another problem for search engine rankings: the page now sits too deep in the folder hierarchy to be worth much to the search engines.
The logic goes back to the “old days” of the web, where websites were hand coded and presented as static HTML or HTM pages. Back then, webmasters would still use a folder structure to organize their sites, but they placed their most important content in the shallower folders, and less important content in deeper folders.
When we say “shallower” and “deeper” we are referring to folder depth. A folder is the text between the slashes in the URL. In the example above, that makes index.php the first, and shallowest, folder, while ID3 is the seventh, and deepest.
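One loose way to see those depths, assuming we count every path segment after the domain as one folder level (names are illustrative):

```python
# Count folder depth the way the article does: one level per path
# segment after the domain. The URL is illustrative only.
url = "www.example.com/index.php/variable1/ID1/variable2/ID2/variable3/ID3"

folders = url.split("/")[1:]        # drop the domain itself
print(len(folders))                 # 7 levels deep; ID3 is the deepest
print(folders.index("ID1") + 1)     # ID1 sits at the third level
```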
So as you can see in the above example, product pages found in “ID3” likely will not get indexed by search engine crawlers because of their folder depth. In fact, most content found below “ID1” (the third folder depth) will not be indexed.
This is because the search engine crawlers expect the important content to be found within the first two or three subfolders.
Why is this important to SEO?
This is important because, as search engine optimizers, we need as much content as possible to be found and indexed. In general, larger sites outperform smaller ones, so we want all of that content found and indexed – all of it affects how the site will perform on the search engines.
For one, every page indexed increases the chance of a visitor coming to the site, because every page can potentially rank for any combination of keywords found on that page. If you have a 400-word page, how many keyword combinations do you think that works out to? Now multiply that by 1,000, or 10,000, or even 100,000 pages. How many potential keyword combinations are there now?
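As rough, back-of-the-envelope arithmetic – assuming a “keyword combination” here means a contiguous two- or three-word phrase – the numbers add up quickly:

```python
# Hypothetical count of contiguous 2- and 3-word phrases on one page.
words_per_page = 400
bigrams = words_per_page - 1     # 399 two-word phrases
trigrams = words_per_page - 2    # 398 three-word phrases
per_page = bigrams + trigrams    # 797 candidate phrases per page

print(per_page * 10_000)         # 7,970,000 across a 10,000-page site
```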
The second reason for having all those pages indexed is that they influence your link popularity. The search engines today are driven by link popularity; at its most basic, link popularity equates to the number of links a site has, and a site with more links can rank higher than one with fewer.
That means the more pages you have indexed (all presumably linked from elsewhere on the site, including the home page), the more internal link popularity the site has. More pages also increase the chances of other sites linking to yours via those pages.
For example, let's say you are the only site on the web selling blue widgets. By offering good static URLs to the search engines, you create an opportunity for other sites that support blue widgets to link to your blue widget pages, which in turn increases the site's overall link popularity.
My site has dynamic URLs. How do I fix it?
There are several ways. The simplest is to check with your web development team or CMS provider to see whether URL rewriting is part of the system. Many commercial and open-source CMSes have options or modules, already installed or easily installable, that change dynamic URLs on the fly into static-appearing ones.
Keep in mind what we mentioned earlier, however: simply changing a long dynamic URL into a static URL that places your content more than three folders deep is not an effective choice. You would open up more of your content to crawlers, but it would not get as much of a boost in the search engines as it could because of that folder depth. In other words, folder depth is key when changing from dynamic to static URLs.
If such a module is not available for your chosen content management system, there are still options. There are many free and paid software packages that allow you to rewrite URLs “on the fly.” These include mod_rewrite for sites hosted on Apache, or ISAPI Rewrite for Windows-based hosting. Which one applies will obviously depend on your hosting.
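As a sketch of what such a rewrite can look like – assuming Apache with mod_rewrite enabled, and using illustrative names rather than any real CMS's variables – an .htaccess rule can present a shallow static URL while still passing the variable to the script:

```apache
# Hypothetical .htaccess sketch (assumes mod_rewrite is enabled).
RewriteEngine On
# Serve a shallow URL such as /products/ID3 from the dynamic script;
# "products" and "variable3" are illustrative names.
RewriteRule ^products/([A-Za-z0-9]+)/?$ index.php?variable3=$1 [L]
```

With a rule like this, crawlers see a two-level URL, while the script behind it still receives its query variable.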
Our goal here was not to scare you, but to bring to light one of the most misunderstood barriers to search engine indexing and ranking. Sometimes the fix is as easy as turning on a bit of code; other times it can be a little more complex, such as rewriting or adding code.
But in the end, the best way to get a dynamic site fully indexed is to first fix the URLs by having them rewritten into static-appearing URLs.
Search Engine Marketing and Brand Development Strategist
Get In Position