Is SEO Sleazy and Unethical?
From a usability perspective, designing a website to be read by a search engine will inevitably lead to poor readability and usability. SEO seems like a way of hacking away at the structure of your website, implementing keyword schemes, and accepting reciprocal links from companies and individuals you wouldn't touch with a ten-foot barge pole in real life. Sounds kind of sleazy to me.
The SEO headache
I'm suffering a haemorrhage trying to decide whether SEO is sleazy. I have recently read a lot of articles and listened to a lot of podcasts on SEO, and I have come to my own conclusion: there's a correct balance, standing on a very fine line in a real grey area:
A lot of web design purists, including Dave Shea, have commented that SEO is sleazy and that hacking away at your pages to improve indexing should be avoided. The usual slant is that if you have good content that is updated often, your pages will rank fine. It's the old "content is king" attitude, and I agree with them.
A lot of web designers use keyword-heavy pages and spend a lot of time hacking away at their pages with SEO techniques to improve their rankings. The usual squawk from this lot is that higher placement equals more custom. I agree with them too.
You see my problem?
I agree that good, clean, well-written content is best from a readability point of view. But if nobody's reading it because the engines aren't indexing it well, or aren't connecting any keywords to your posts, then why bother writing content that nobody's going to see?
But I cannot agree with letting the back streets of Amsterdam link to my website just to improve my ranking in a Googly search engine. It seems kind of crazy; after all, I don't want people finding my website after typing in SEX, BLOW, DESIGN. I think those users will also be greatly disappointed when they arrive at DesignBit and find the post about creating a semantic web page rather than the behind-closed-doors activities they desired.
SEO sleaze conclusion
The fine line, I believe, is where SEO starts to interfere with usability. If the navigation contains too many keywords, the user will probably get lost; and if the content is stuffed with keywords like Mr T's jewellery collection, the page will be unreadable. You end up with a document you can't read, but that doesn't matter, because you couldn't navigate to it anyway.
For a large community website, natural SEO, using natural and readable website-theme keywords across hundreds of posts, will of course bring great results. But that's no good to the Average Joe small-to-medium enterprise that only has a few pages describing services and prices. So I do not see a problem with Joe optimising his content so Googly can see his website a bit better. But if Joe decides to try and trick Googly with backdoor techniques, the company could be tarnished with a ban from the index.
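As a rough sketch of what I mean by Joe's kind of harmless optimisation (the business name and keywords here are made up for illustration), natural SEO is really nothing more exotic than descriptive, semantic markup:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- A descriptive title and meta description: readable by humans,
       and exactly what the engines use for result listings. -->
  <title>Joe's Plumbing — Boiler Repairs and Servicing in Leeds</title>
  <meta name="description"
        content="Joe's Plumbing offers boiler repairs, servicing
                 and emergency call-outs across Leeds. Prices listed.">
</head>
<body>
  <!-- One main heading that says what the page is about,
       rather than a keyword-stuffed mess. -->
  <h1>Boiler Repairs and Servicing in Leeds</h1>
  <p>We repair and service all major boiler makes, with fixed
     prices and same-day emergency call-outs.</p>
  <!-- Navigation stays readable: plain labels, not keyword chains. -->
  <nav>
    <a href="/services">Services</a>
    <a href="/prices">Prices</a>
    <a href="/contact">Contact</a>
  </nav>
</body>
</html>
```

Nothing here tricks anybody: the same words that help a visitor understand the page are the words the engine indexes, which is the whole point of the "content is king" camp.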
I think SEO is still in its infancy, and that it will remain there: Googly's algorithms will forever be changing, and in turn so will the techniques of the SEO-crazed designers. Every algorithm change will mean the SEO zombies have to rework their sites to keep up. The "content is king" web designer will not have to change their site's code every time Googly throws a tantrum, and for that very, very lazy reason I think I'll fly with the natural SEO bunch. Cheers, fellows!