Website Architecture for Search Engines


Joe Spencer on technical search-engine optimization of websites, speaking at the SEO-Conference of Reto Hartinger’s Internet Briefing.
Twitter: @joespencer


HTML Code Requirements

– The content area should be positioned high in the HTML code
– Validate the markup with the W3C validator (validator.w3.org)
– All HTML should be lower case
– Remove comments
– Avoid frames and iFrames
– Use external CSS and JavaScript files
– Uncompressed files should be less than 25kB
– Use Gzip for compression and quicker delivery / download times
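On an Apache server, for example, Gzip compression can be enabled with mod_deflate; this is a minimal sketch, and the MIME types should be adjusted to the assets your site actually serves:

```apache
# .htaccess sketch: compress text-based assets before delivery
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```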

JavaScript Code Requirements
– Avoid inline JavaScript; keep scripts in external files

Max Number of HTTP Requests
– When a page loads, the browser sends an HTTP request to the server for each resource. Keep it to fewer than 20 requests per page

URL Structure
– Avoid dynamic URLs
– Use lower-case characters
– Use a dash (-) rather than underscores (_)
– Directories should contain an index.html or default.html file for the default page. Avoid using intro.html or other generic names
– Use URL rewrites for creating search engine friendly URLs
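With Apache's mod_rewrite, for instance, a keyword-friendly URL can be served by the underlying dynamic script; this is a sketch, and the page name and query parameter are placeholders:

```apache
# .htaccess sketch: map the friendly URL /seo-conference
# to the dynamic script that actually renders the page
RewriteEngine On
RewriteRule ^seo-conference/?$ index.php?page=1 [L]
```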

Flat URL Structure
– Don’t use more than 2-3 levels

URL Rewrites
– Allows the placement of targeted keywords in the URLs
– Ensure that each page loads from a single URL; otherwise this creates URL canonicalization issues

Languages can be handled in many ways:
– Default language: http://www.mydomain.com
– English: http://www.mydomain.com/en

URL Canonicalization
– Often the same page can be indexed under several different URLs:
www.domain.com
domain.ch
www.domain.com/index.html
domain.com/index.html

These duplicates cause duplicate-content issues and dilute PageRank across multiple URLs.

Common dynamic URL example
http://www.domain.com/index.php?&page=1
http://www.domain.com/index.php?page=1&parameter=123

URL canonicalization tag sets the preferred domain.
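The tag goes into the head of each duplicate variant and points to the preferred URL, for example (using the domain from above):

```html
<!-- In the <head> of every duplicate, name the preferred URL -->
<link rel="canonical" href="http://www.domain.com/" />
```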

Controlling Robots
– The robots.txt file lets you disallow spiders from indexing parts of a website, e.g. archives that may cause duplicate-content issues
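A minimal robots.txt along those lines might look like this; the /archives/ path is only an assumed example of a section you want to keep out of the index:

```
User-agent: *
Disallow: /archives/
```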

Robots meta tags:
– noindex, nofollow
– noindex
– nofollow

HTML sitemap: noindex/follow
About us: noindex/nofollow
Privacy policy: noindex/follow
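As meta tags in the page head, those settings would look like this:

```html
<!-- Privacy policy: keep the page out of the index, but follow its links -->
<meta name="robots" content="noindex, follow" />

<!-- About us: neither index the page nor follow its links -->
<meta name="robots" content="noindex, nofollow" />
```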

NoFollow can also be applied to individual links to external websites (rel="nofollow") or to content that is not very important.

The navigational footer links on Google's own sites show various examples of nofollow in use.
NoFollow is recommended for navigational links, cross-domain links, advertisements, and external links.
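On a single link, the attribute looks like this (the target URL is a placeholder):

```html
<!-- Tell spiders not to pass link value to this external site -->
<a href="http://www.example.com/" rel="nofollow">External site</a>
```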

HTTP Headers for SEO
– HTTP 301: Moved permanently, passes links to new site; avoid multiple redirects
– HTTP 302: Moved temporarily; the content will return to the original URL, and link value is not passed on
– HTTP 404: Page not found, include a search feature and links to main content
– HTTP 503: Service not available, use during maintenance or releases with timeframe
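On Apache, a 301 can be set up in the server configuration, for example; this is a sketch, and both paths are placeholders:

```apache
# .htaccess sketch: permanently redirect an old page to its new URL
# so that link value passes to the new location
Redirect 301 /old-page.html http://www.domain.com/new-page.html
```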

HTTP 302 can be used to promote short URLs such as www.internet-briefing.ch/seoconf without creating that actual file structure and without moving the link permanently to the target page.

HTTP checking tools return the HTTP header responses if you want to verify your current code.
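The core of such a tool is simple: read the status line of the raw HTTP response and report what the code means for SEO. A minimal sketch in Python (the status-code summaries paraphrase the list above):

```python
# Sketch of an HTTP checker: parse the status line of a raw
# HTTP response and classify the code for SEO purposes.
SEO_NOTES = {
    301: "Moved permanently - link value passes to the new URL",
    302: "Moved temporarily - link value stays with the old URL",
    404: "Not found - offer search and links to main content",
    503: "Service unavailable - use during maintenance windows",
}

def status_code(raw_response: str) -> int:
    """Extract the code from a status line like 'HTTP/1.1 301 Moved Permanently'."""
    status_line = raw_response.splitlines()[0]
    return int(status_line.split()[1])

code = status_code("HTTP/1.1 301 Moved Permanently\r\nLocation: http://www.domain.com/\r\n\r\n")
print(code, SEO_NOTES.get(code, "No SEO note for this code"))
```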

Joe Spencer supports the SEO-Network.
The SEO-Conference slides are available here.


About Author

Walter Schärer blogs about the latest Internet trends in online marketing, social media, blogs, web analytics, SEO, mobile, and more.
