Top 10 Tips For Better SEO Website Design

1. Write Good Page Titles and Headings

Make sure you use keywords that people actually search for in your page’s heading tags (h1 through h6). Also be sure the h1 tag on each page accurately describes the page’s content. Good headings are good for your users and good for the search engines.
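As a sketch (the wallet shop is a made-up example), the heading structure of a page might look like this:

```html
<!-- Exactly one h1 that accurately describes the page -->
<h1>Handmade Leather Wallets</h1>

<!-- Lower-level headings (h2, h3, ...) break the content into sections -->
<h2>How Our Wallets Are Made</h2>
<h3>Choosing the Leather</h3>
```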

2. Utilize Title Tags to Their Utmost

You want your title tag to contain the keywords that matter most to your SEO efforts. Make sure those keywords appear near the front of the title, since Google only displays roughly the first sixty to seventy characters. Don’t just list a bunch of keywords; take a targeted approach that makes sense to your user. Include a clear call to action!
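For instance, a title tag for a hypothetical wallet shop might keep the keywords up front and stay under the length Google displays:

```html
<!-- Keywords first, under ~70 characters, with a call to action -->
<title>Handmade Leather Wallets - Shop Custom Designs Today</title>
```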

3. Use Natural Language in Your Content

Don’t try too hard to stuff keywords into your page just to make it more “keyword dense.” The search engines are getting better and better at identifying natural language, and they can spot keyword-stuffed content far more easily than they could in the past. In other words, don’t use the same keyword phrase over and over and over and over and–you get the idea. Try including semantically related words in your writing instead.

4. Create a Smart Internal Linking Structure

Use keywords and user-friendly descriptions in your link text to help your users (and the spiders) navigate from page to page on your site. Position your navigation so it is simple for users to know where they are and where they can go.
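A minimal sketch (the URLs and anchor text are made up) of descriptive internal links:

```html
<ul id="navigation">
  <!-- Descriptive anchor text tells users and spiders what the target page is about -->
  <li><a href="/wallets/">Handmade Leather Wallets</a></li>
  <li><a href="/care-guide/">Leather Care Guide</a></li>
</ul>
<!-- Avoid vague anchors like <a href="/page2.html">click here</a> -->
```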

5. Include a Sitemap File for the Search Engines

The best location for your sitemap.xml is the root of your site; that is the standard place, and it is where the spiders will typically look. You may choose something else, but keep it simple. Also consider creating an HTML version of your sitemap for your users. You might put it at yourdomain.ext/site_map/ and include the same content you have in the XML version. Then go submit your sitemap to Google using Webmaster Tools.
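A bare-bones sitemap.xml following the sitemaps.org protocol (the domain is a placeholder) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/site_map/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```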

6. Protect Your Site From 404 Errors!

You can use your new Google Webmaster Tools account to track the 404 errors Googlebot identifies while spidering your site. It’s essential that links coming into your site, as well as your site’s own internal links, do not produce 404 errors for your visitors. By creating a redirect strategy to handle old pages, you can avoid this problem and keep your visitors happier. We’ll post a full article on a great way to handle this with PHP.
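Until that article is up, here is a minimal Apache sketch (the paths are hypothetical) that redirects a moved page and serves a friendly error page for anything that still 404s:

```apache
# Permanently (301) redirect a page that has moved, so old links keep working
Redirect 301 /old-page.html http://www.example.com/new-page/

# Show a helpful custom page for any URL that still cannot be found
ErrorDocument 404 /404.html
```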

7. Use Static (Pretty) URLs

This is a great usability feature for your site. A “pretty” URL looks like this: http://en.wikipedia.org/wiki/Search_engine_optimization. You can see from the URL what the page is about, and you may even get some keywords into it. The alternative is a dynamic URL that might look like this: http://www.google.com/search?aq=f&sourceid=chrome&ie=UTF-8&q=dynamic+url. Most people prefer the first type–the pretty kind–because it is easy to read. We have a framework that helps us build sites with pretty URLs, and we’ll post a full article on it at some point so you can see how we do it. Until then, roScripts has an article on how to do it with Apache and PHP, and SitePoint has an article on how to do it with simple HTML.
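As a rough sketch of the Apache approach (product.php and the slug parameter are hypothetical names), a mod_rewrite rule in .htaccess can map a pretty URL onto a dynamic script:

```apache
RewriteEngine On
# /wallets/brown-leather/ is quietly served by /product.php?slug=brown-leather
RewriteRule ^wallets/([a-z0-9-]+)/?$ /product.php?slug=$1 [L,QSA]
```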

8. Design For Accessibility

With the Web 2.0 craze came a huge push by web developers to use XHTML and CSS for page markup and to follow strict compliance guidelines from the W3C. Thank goodness! Now if we could just get Microsoft on board, we could quit worrying about building websites to be “cross browser/cross platform” and just focus on content. Pages built to standard are often much lighter (in data size) and therefore load more quickly. Google gives a fast-loading site a bit of preference, plus you’ve only got your user’s attention for so long. Make your site accessible and your users will see what you expect them to see, and the spiders will index your pages more easily.

Don’t forget cell phones and mobile devices. iPhones account for a large share of mobile web traffic, and you might consider user agent detection to provide content formatted for individual devices. One of my favorite tech resource sites, quirksmode, has a good article on JavaScript browser detection. You can also detect user agent info on the server before outputting headers to your visitor’s browser, so you can immediately deliver properly formatted content.
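A minimal sketch of the idea (isMobileAgent is a hypothetical helper, not the quirksmode code): test the user agent string for common mobile tokens and branch on the result.

```javascript
// Hypothetical helper: classify a user agent string so the page (or server)
// can choose a layout formatted for that device.
function isMobileAgent(userAgent) {
  // Common mobile tokens; extend the list for the devices you care about
  return /iPhone|iPod|Android|BlackBerry|Opera Mini|Windows CE/i.test(userAgent);
}

var iphone = "Mozilla/5.0 (iPhone; U; CPU like Mac OS X) AppleWebKit/420";
var desktop = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36";

console.log(isMobileAgent(iphone));  // true
console.log(isMobileAgent(desktop)); // false
```

In the browser you would pass navigator.userAgent; on the server you would read the User-Agent request header instead.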

9. Use “Spider Visible” Content

Consider using text instead of graphics for your navigation, page titles, and other page elements. Google cannot process the content of some rich media files or dynamic pages, and some search engines still have a hard time with Flash. A few Flash elements on your site can come in very handy; however, Flash-only sites are often difficult to maintain and hard for some search engines to index.

Speaking of dynamic content, don’t create your site with blocks of text produced by JavaScript. Because most spiders won’t bother parsing your JavaScript, any text produced dynamically that way will likely be skipped. Instead, make the content visible by default in the markup, hide it with JavaScript when the page loads, and then show it again with JavaScript on some event (e.g. onclick, onmouseover).
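As a sketch of that pattern (the ids are made up): the text lives in the markup so spiders can index it, JavaScript hides it once the page loads, and a click reveals it again.

```html
<a href="#" id="toggle">Show details</a>
<div id="details">This text is in the markup, so spiders can index it.</div>

<script type="text/javascript">
  // Only hide the content once we know JavaScript is actually available
  var details = document.getElementById('details');
  details.style.display = 'none';
  document.getElementById('toggle').onclick = function () {
    details.style.display = 'block';
    return false; // keep the link from navigating
  };
</script>
```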

10. Don’t Try to Trick The Spiders

We figured we would include at least one “don’t” in our list of “dos.” Google warns against shadow domains, doorway pages, spyware, and scumware. Don’t use them! They won’t do anything for the long-term success of your business: they don’t work, and they will get you banned from the search engines (at least the ones that matter).
