How Smart Web Designers Screw Up Your SEO
Talented web designers create attractive, well-built web sites, but all too often they unknowingly wreck the site's chances of ranking highly in search engines along the way. Here's how it happens and what to do about it.
Many sites on the web are amazing – a real tribute to their designers: attractive, functional and compelling for visitors. But look a little deeper and a consistent problem with search engine ranking appears across many of them. The snazzy site's creators are good at their job, which is site creation. They also generally think they understand search engine optimisation, but they mishandle their clients' SEO so badly that the optimisation effort is multiplied through re-work and necessary architectural changes. The main issues are URL manipulation, duplicate content and a serious downside of popular shopping cart software products. Related issues are potentially endless, particularly with future site changes and overhauls that abandon URLs carrying desirable search engine clout.
The Cause Leading to the Effect. Since people in business generally have a skill base that doesn't include web site design, they dip into the sizable pool of inexpensive web design talent. They've heard of SEO, and their chosen web design company – which produces dazzling samples of work, complete with shopping carts – says it will build the site in line with SEO principles. Great! The designers deliver a great-looking site that works superbly: the shopping cart takes orders, customers demonstrably part with their funds, and products are easy to add and remove through an external interface to the database. The customer is pleased, pays the bill, and agrees an ongoing fee structure for amendments and changes. Then they start a PPC campaign – and realise its cost is about the same as the rent on their high street premises, with a huge increase at Christmas time. They realise they now have two landlords: their High St premises owner and Google (and/or Yahoo, MSN, etc.). They thought the web would be free for their new online company; instead, like their real estate counterparts, they have an expensive landlord in the search masters, led by the 'benevolent' Google. But no matter – they just have to wait a while until the organic results rank the site highly through the efforts of the clever people who created it. Just wait a few weeks… months… years. Here's why it's actually going to be years… decades… never. And here's how to make it, realistically, a few months.
Unfriendly URLs. The URL problem is not limited to shopping cart software like osCommerce and others that make use of session IDs, although they are the default offenders. Some web design companies compound the problem by using session IDs outside the shopping cart, or by reusing the cart's session IDs throughout their design. Session IDs are a handy means of keeping state and identity across a particular user's sequence of pages within a domain during a session. osCommerce – the main fully featured shopping cart, free and hence attractive – appends a session ID to every page. The ID is unique to each user session, so if the user closes the browser and starts a new session on the site, the ID will be different. See an example of this at naturalfigures(dot)co(dot)uk: go to any category and note the session ID appended to the URL, then close the browser, open the same URL again and note that the session ID has changed for the same pages. What's the problem with this? When the Google bot or any other search engine's bot comes along to examine the page, it sees the page with the appended session ID and indexes it under that URL. The next time it visits, it lands on the same page and sees the same content – but this time under a different apparent URL, the same address with a different session ID appended. What's this? Duplicate content! Most web designers have little understanding of why this would ever be a problem.
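osCommerce itself has an admin-level mitigation ("Prevent Spider Sessions", which suppresses session IDs for known crawler user agents), and a belt-and-braces fix can also be sketched at the server. The following is only an illustrative .htaccess sketch, assuming Apache with mod_rewrite enabled and osCommerce's default osCsid query parameter; a production rule would need careful testing around combinations of other query parameters:

```apache
# Illustrative sketch only: 301-redirect any URL carrying an osCommerce
# session ID (the default osCsid query parameter) to the same URL with
# that parameter stripped, so search engines index one canonical URL.
RewriteEngine On
# Match a query string containing osCsid, capturing what comes before
# and after it so any other parameters can be preserved.
RewriteCond %{QUERY_STRING} ^(.*?)&?osCsid=[0-9a-f]+(.*)$ [NC]
RewriteRule ^(.*)$ /$1?%1%2 [R=301,L]
```

A crawler that requests a session-ID URL then receives a permanent redirect to the clean address, and the apparent duplicate content disappears.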
A similar duplicate content issue exists in the way most web designers point internal links at some start file like index.htm. Back to the home page? Go to thedomain.com/index.htm. But this is the same content as thedomain.com. And there's more: not only are those pages the same, but http://thedomain.com and http://www.thedomain.com also serve the same content. To see that search engines treat these as different URLs, try it with xe(dot)com and note the different PageRank scores for the two versions. These problems are easy to fix; web designers are generally just oblivious to them.
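Both duplicates can be collapsed with permanent redirects in the site's .htaccess file. A minimal sketch, assuming Apache with mod_rewrite enabled and using thedomain.com purely as a placeholder:

```apache
RewriteEngine On

# Collapse the bare host onto the www host with a permanent (301)
# redirect, so http://thedomain.com/... and http://www.thedomain.com/...
# are seen by search engines as one URL.
RewriteCond %{HTTP_HOST} ^thedomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.thedomain.com/$1 [R=301,L]

# Send the start-file URL to the root, so /index.htm and / are one URL.
RewriteRule ^index\.htm$ http://www.thedomain.com/ [R=301,L]
```

Internal "home" links should then point at / rather than /index.htm, so the redirect acts as a safety net rather than a crutch.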
Site Redesigns, Wasted Pages. Occasionally, like your living room, a site needs an overhaul. Or a web designer may believe that the way to higher rankings for their client is a redesign, because they've heard that page names should have hyphens, not underscores – or underscores, not hyphens (it doesn't matter a hoot). In the redesign, many web designers destroy any search engine clout the site currently enjoys and end up with a net negative effect. Oh well. At least it looks much nicer after the redesign.
What are web designers missing? As search engines traverse a page they analyse it and index it, assuming it doesn't offend them in some way (cloaking, duplicate content, wrongly defined redirects, etc.). It's indexed. Got that? Indexed. That is, the page – referred to by its URL – now exists in a database patrolled by Google's armed guards. When web designers change a site's design and invent new page names without properly redirecting from the old ones, Google sees another shiny new page, notes that it has exactly the same content as a page on the same domain it already has indexed, and indexes the new page too. Now the site is devalued in the eyes of the search engine because it clearly duplicates content. This is nowhere near as serious as duplicate content across distinct domains, but it is still a red flag within a domain. But wait – it's not duplicate content! The old page has been changed; sure, it still exists, because there may be external links to it, but there are no internal links to it any more – it has been replaced by the new page. But did anyone tell Google about that? What?! How do you tell Google about anything? By a properly defined 301 redirect in the .htaccess file. Hmmm. Try that on your web designer – at the slightest questioning lift of the eyebrows, run. And the problem doesn't end there: since this is a new page, it lacks the establishment of the old one. The search engine doesn't know it's a replacement; it just sees a new page, something that has to earn its place through time and new internal and external links. The .htaccess 301 redirect resolves all of this.
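That .htaccess redirect can be sketched briefly. The old and new page names below are invented for illustration; assume Apache, with mod_alias available for the simple form and mod_rewrite for the pattern form:

```apache
# Simple form (mod_alias): one retired page redirected permanently
# to its replacement.
Redirect 301 /widgets_list.htm /widget-list.htm

# Pattern form (mod_rewrite): a whole renamed section in one rule.
RewriteEngine On
RewriteRule ^old_shop/(.*)$ /new-shop/$1 [R=301,L]
```

The 301 status tells the search engine the move is permanent, so the old page's established standing is carried over to its replacement rather than thrown away.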
The Solution. A popular web presence is no longer the breeze it used to be. Everyone flocks to the web – so how do the search engines sort the wheat from the chaff? The solution to this and much more is to design pages from the start with SEO principles in mind. But that has become a buzz-phrase. Web designers need to understand how search engines see pages as well as how humans see them. Let's face it – if SEOs designed all the web sites, the web wouldn't be pretty. Both skills are needed. For proof, see the site cited in the bio box for this article – a site which, to prove an SEO point, is distinctively un-pretty. But the SEOs have the upper hand: they know they aren't designers, and they know they need clever artistic designers to build something that is not just functional but also attractive. The converse is not generally true. Web designers in general don't really understand search engine optimisation, despite their salespeople's sales-oriented claims. They think they know the SEO science. We've yet to find a web designer who does.
Source: Free Articles from ArticlesFactory.com
ABOUT THE AUTHOR
By Baron Turner of TurnerDow Search Engine Optimisation. There’s Only One High St Now! Be on it with TurnerDow SEO