One of the least understood and least utilized areas of SEO is probably page structure. Webmasters will use an editor such as Frontpage, Dreamweaver, etc. to create their pages, making sure to get the right keywords in the title tags, spread through the pages, in links, alt tags, and so on, all the while forgetting one important fact: what the page looks like to the search engines.
Unfortunately, fairly few webmasters actually know how to code in HTML. They use a WYSIWYG editor (What You See Is What You Get) such as one of those mentioned above. There is nothing wrong with that; in fact, it is how I originally started. The problem is that most of these editors produce horrible, and I mean HORRIBLE, code.
This is often referred to as "code bloat" and while it is not NECESSARILY bad, it is not good either. First, this "code bloat" increases the overall size of each page. For some pages, this is not a big deal, but for pages that are already fairly large in file size, it can make a big difference.
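To make the idea concrete, here is a small sketch comparing an invented fragment of the kind of markup a WYSIWYG editor might emit against a hand-coded equivalent. The actual tags are hypothetical examples, not output from any specific editor, but the pattern of redundant nested font and formatting tags is typical:

```python
# Hypothetical WYSIWYG-style output: redundant nested <font> tags and
# unnecessary attributes wrapping a heading and one short paragraph.
bloated = (
    '<font face="Arial"><font size="2"><b><font color="#000000">'
    'Welcome to our site'
    '</font></b></font></font>'
    '<p align="left"><font face="Arial"><font size="2">'
    'We sell widgets.</font></font></p>'
)

# The same content, hand-coded.
clean = (
    '<h1>Welcome to our site</h1>'
    '<p>We sell widgets.</p>'
)

# The bloated version is several times the size for identical content.
print(len(bloated), "bytes bloated vs", len(clean), "bytes clean")
```

Multiply that ratio across every heading, paragraph, and table cell on a page and the extra bytes add up quickly.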
I once went through a page that was roughly 50kb in size and removed all of the offending code bloat caused by a WYSIWYG editor. Once finished, the size of the page was reduced to roughly 30kb. That means about 40% of the page's size was extraneous code! Now, you may be asking yourself, "Self … that's a lot, but what's the big deal?" Well, the issue is that at 50kb, a visitor on dial-up will have to wait an average of 10-12 seconds for the page to load, while at 30kb that same visitor will only have to wait 7-8 seconds for the page to fully load.
That 3-5 seconds may not seem like a lot, but it could mean the difference between the visitor staying on your site or leaving to visit a competitor.
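The arithmetic behind those wait times can be sketched as follows, assuming roughly 4 KB/s of effective throughput on a 56k dial-up modem (a plausible real-world figure once protocol overhead and line quality are factored in; the exact rate varies from connection to connection):

```python
# Assumed average effective dial-up throughput, in kilobytes per second.
EFFECTIVE_KBPS = 4.0

def load_time_seconds(page_size_kb):
    """Seconds to download a page of the given size at dial-up speed."""
    return page_size_kb / EFFECTIVE_KBPS

print(load_time_seconds(50))  # the original, bloated page
print(load_time_seconds(30))  # after stripping the WYSIWYG bloat
```

At that rate the cleanup saves the visitor about five seconds per page load, which lines up with the ranges above.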
Now, there are definitely other reasons to avoid "code bloat". That brings us to SEO. Remember, while your pages may look pretty when viewed in a browser, that is NOT how a search engine sees them. It sees only HTML. Have you ever pulled up a page to look at its HTML only to go "Whoa … I can't make heads or tails of this!"? Well, put yourself in the place of a search engine spider.
In order to know what your site is about and to properly index and rank your site and all its pages, a spider needs to be able to properly read your pages. This means the less code the spider has to sift through to find your content, the lower the chance that something will cause it to get "tripped up".
Over the past 9 years, I have seen the search engines get MUCH better at spidering sites, but if you think it is impossible for an error on a page to cause a spider to not be …