4.6.1 SEO and Usability Considerations
As a whole, technology should act only as an enabler; it should never be a site's
main focus. Here are some technical considerations vital to a good website:
URL rewriting: it is vital that the important URLs in your site are indexable by
search engines. Ensure that URL rewriting is enabled according to the guidelines
in this chapter, and that it can handle the extra dynamic parameters that search
engines may append for tracking purposes, as sketched below.
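To illustrate, here is a minimal sketch using Python's Flask framework (Flask and
the route and parameter names are illustrative assumptions, not part of this
chapter's guidelines): the clean, search-friendly URL maps to the page, and any
extra tracking parameters appended to the query string are tolerated rather than
breaking the route.

    from flask import Flask, request

    app = Flask(__name__)

    # The clean, indexable URL (e.g. /products/red-shoes) maps to this handler.
    @app.route("/products/<slug>")
    def product(slug):
        # Extra dynamic parameters, e.g. ?utm_source=search, simply appear
        # in request.args; they never change which page is served.
        tracking = dict(request.args)
        return f"Product page for {slug} (tracking: {tracking})"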
GZIP compression: compressing pages before they are sent helps to speed up
download times, improving the user experience.
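In practice the web server itself usually applies GZIP compression, but the
principle can be sketched in application code (again assuming Flask): compress
the response only when the browser advertises gzip support.

    import gzip
    from flask import Flask, make_response, request

    app = Flask(__name__)

    @app.route("/page")
    def page():
        resp = make_response("<html><body>" + "Lots of text. " * 500
                             + "</body></html>")
        # Only compress when the client declares gzip support.
        if "gzip" in request.headers.get("Accept-Encoding", ""):
            resp.set_data(gzip.compress(resp.get_data()))
            resp.headers["Content-Encoding"] = "gzip"
            resp.headers["Vary"] = "Accept-Encoding"
        return resp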
Server-side form validation: form validation is the process whereby the data
entered into a form is checked against preset conditions (e.g. ensuring that the
name and email address fields are filled in).
Client-side validation can alert a visitor to an incorrectly completed form most
quickly, but it relies on JavaScript, which is not available to all visitors;
server-side validation is the most reliable, since it cannot be bypassed by the
visitor's browser. It is also important to collect all of the failed checks and
present the appropriate error messages neatly above the form the user is trying
to complete. Correctly entered data should not be lost, but repopulated in the
form, to save the visitor time and reduce frustration.
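A minimal server-side sketch (Flask, the field names and checks are assumptions)
showing both practices: every failed check is collected rather than stopping at
the first, the error messages appear above the form, and the visitor's entries
are repopulated.

    from flask import Flask, request, render_template_string

    app = Flask(__name__)

    FORM = """
    {% for e in errors %}<p class="error">{{ e }}</p>{% endfor %}
    <form method="post">
      Name: <input name="name" value="{{ name }}">
      Email: <input name="email" value="{{ email }}">
      <button type="submit">Send</button>
    </form>
    """

    @app.route("/contact", methods=["GET", "POST"])
    def contact():
        name = request.form.get("name", "")
        email = request.form.get("email", "")
        errors = []
        if request.method == "POST":
            # Collect every failed check so all problems are reported at once.
            if not name.strip():
                errors.append("Please fill in your name.")
            if "@" not in email:
                errors.append("Please enter a valid email address.")
            if not errors:
                return "Thank you!"
        # Re-render with all error messages above the form and the
        # previously entered values repopulated.
        return render_template_string(FORM, errors=errors, name=name,
                                      email=email)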
International character support: the Internet has afforded the opportunity to
conduct business globally, but this means that websites need to make provision
for non-English visitors. It is advisable to support international characters via
UTF-8 encoding, both on the website itself and in the form data submitted to it.
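A short sketch (Flask assumed) of declaring UTF-8 explicitly on a response;
declaring the same charset in the page's markup encourages browsers to submit
form data in UTF-8 as well.

    from flask import Flask, make_response

    app = Flask(__name__)

    @app.route("/")
    def home():
        # Non-English text renders correctly once the charset is declared.
        resp = make_response("<html><head><meta charset='utf-8'></head>"
                             "<body><p>Café, señor, 日本語</p></body></html>")
        resp.headers["Content-Type"] = "text/html; charset=utf-8"
        return resp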
Search-friendly sessions: sessions can be used to recognise individual visitors
on a website, which is useful for click-path analysis. Cookies can be used to
maintain sessions, but URL rewriting can compensate for visitors who do not
have cookies enabled. This means that as visitors move through a website,
their session information is stored in a dynamically generated web address.
Search engine spiders do not support cookies, so many websites will attempt
URL rewriting to maintain the session as the spider crawls the website.
However, spiders dislike these URLs: because a new session ID is generated on
each visit, the same page appears at a different address every time, creating a
moving target for the robot and hindering crawling and indexing.
The work-around: detect whether a visitor to the site is a person or a robot,
and do not rewrite URLs for search engine robots.
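A simplified sketch of this work-around (Flask assumed; the user-agent
substrings are illustrative, and real robot detection is more involved): links
carry a session ID only for visitors who are not recognised as robots.

    from flask import Flask, request

    app = Flask(__name__)

    # Illustrative user-agent fragments for common search engine robots.
    BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")

    def is_robot():
        ua = request.headers.get("User-Agent", "").lower()
        return any(sig in ua for sig in BOT_SIGNATURES)

    @app.route("/catalogue")
    def catalogue():
        if is_robot():
            # Robots get clean, stable URLs with no session ID appended.
            next_url = "/catalogue/page/2"
        else:
            # Cookieless human visitors carry the session in the URL
            # (the session ID here is a placeholder).
            next_url = "/catalogue/page/2;jsessionid=abc123"
        return f'<a href="{next_url}">Next page</a>'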
Discussion
Why does URL rewriting create a moving target for a search engine spider?