Web Development and Design › developing – technology gives it life

4.6.1 SEO and Usability Considerations

As a whole, technology should only act as an enabler. It should never be a site's main focus. Here are some technical considerations vital to a good website:

URL rewriting: it is vital that important URLs in your site are indexable by the search engines. Ensure that URL rewriting is enabled according to the guidelines in this chapter. URL rewriting should be able to handle extra dynamic parameters that might be added by search engines for tracking purposes.

GZIP compression: this helps to speed up the download times of a web page, improving user experience.

Server-side form validation: form validation is the process whereby the data entered into a form is verified in order to meet certain preset conditions (e.g. ensuring that the name and email address fields are filled in). Client-side validation relies on JavaScript, which is not necessarily available to all visitors. Client-side validation can alert a visitor to an incorrectly filled-in form most quickly, but server-side validation is the most accurate. It is also important to have a tool that collects all of the failed tests and presents appropriate error messages neatly above the form the user is trying to complete. This ensures that correctly entered data is not lost, but is repopulated in the form to save time and reduce frustration.

International character support: the Internet has afforded the opportunity to conduct business globally, but this means that websites need to make provision for non-English visitors. It is advisable to support international characters via UTF-8 encoding, both on the website itself and in the form data submitted to it.

Search-friendly sessions: sessions can be used to recognise individual visitors on a website, which is useful for click-path analysis. Cookies can be used to maintain sessions, but URL rewriting can be used to compensate for users who do not have cookies activated. This means that as visitors move through a website, their session information is stored in a dynamically generated web address. Search engine spiders do not support cookies, so many websites will attempt URL rewriting to maintain the session as the spider crawls the website. However, these URLs are not liked by search engine spiders (as they appear to create a moving target for the robot) and can hinder crawling and indexing. The work-around: use technology to detect whether a visitor to the site is a person or a robot, and do not rewrite URLs for the search engine robots.

Discussion: why does URL rewriting create a moving target for a search engine spider?
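The point about tolerating extra dynamic parameters can be sketched in Python. This is a minimal illustration, not a real routing framework: the idea is that the server resolves a rewritten URL by its path alone, so tracking parameters a search engine appends (the `utm_` prefix here is one common convention) do not break the page.

```python
from urllib.parse import urlsplit, parse_qs

def route(url):
    """Resolve a rewritten URL to its page, tolerating extra tracking
    parameters (e.g. utm_source) that may be appended to the address."""
    parts = urlsplit(url)
    params = parse_qs(parts.query)
    # Routing keys off the path only; tracking parameters are ignored.
    return parts.path, {k: v for k, v in params.items() if not k.startswith("utm_")}

# The same page is served with or without the appended tracking parameter.
print(route("/shoes/red?utm_source=google&page=2"))
print(route("/shoes/red?page=2"))
```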
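The effect of GZIP compression can be demonstrated with Python's standard `gzip` module. The HTML payload below is a hypothetical sample; the point is that markup is highly repetitive, so the compressed body sent over the wire is much smaller than the original.

```python
import gzip

# A repetitive HTML payload, typical of real markup (hypothetical sample page).
html = ("<html><head><title>Example</title></head><body>"
        + "<p>Welcome to our site.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)

# The compressed body is far smaller, which speeds up download times;
# the browser decompresses it transparently.
print(len(html), "bytes raw ->", len(compressed), "bytes compressed")
```

In practice the web server (Apache's mod_deflate, nginx's gzip directive) does this automatically once enabled; the snippet only shows why it pays off.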
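The server-side validation approach described above — collect every failed check, present the messages together, and return the submitted values so the form can be repopulated — can be sketched as follows. Field names, messages, and the email pattern are illustrative, not a prescribed implementation.

```python
import re

def validate_form(data):
    """Server-side validation: gather ALL failed checks so every error
    message can be shown together above the form, rather than stopping
    at the first failure. Field names are illustrative."""
    errors = []
    if not data.get("name", "").strip():
        errors.append("Please fill in your name.")
    email = data.get("email", "").strip()
    if not email:
        errors.append("Please fill in your email address.")
    elif not re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", email):
        errors.append("Please enter a valid email address.")
    # Return the submitted values too, so correctly entered data is not
    # lost but repopulated in the form.
    return errors, data

errors, values = validate_form({"name": "Ada", "email": "not-an-email"})
print(errors)   # one message per failed check
print(values)   # echoed back to repopulate the form
```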
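A small sketch of UTF-8 support for submitted form data, using Python's standard library. Browsers percent-encode non-ASCII form values when submitting them; decoding them as UTF-8 on the server round-trips international characters intact (the name below is just sample data).

```python
from urllib.parse import quote, unquote

name = "Björk Guðmundsdóttir"   # sample value with non-English characters

# As submitted in a URL or form body: percent-encoded UTF-8 bytes.
encoded = quote(name)

# Decoding on the server recovers the original characters exactly.
decoded = unquote(encoded)
print(encoded)
print(decoded)
```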
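The work-around for search-friendly sessions can be sketched as a user-agent check: append the session identifier to the URL only for human visitors, and serve spiders the stable, session-free address. The spider list and the `;jsessionid=` suffix (a convention from Java servlet containers) are illustrative assumptions; real detection would be more robust.

```python
# Illustrative substrings only, not an exhaustive or authoritative list.
KNOWN_SPIDERS = ("googlebot", "bingbot", "slurp")

def rewrite_url(path, session_id, user_agent):
    """Append the session ID to the URL only for human visitors;
    robots get the stable URL so crawling is not hindered."""
    if any(bot in user_agent.lower() for bot in KNOWN_SPIDERS):
        return path
    return f"{path};jsessionid={session_id}"

# A spider sees a stable address...
print(rewrite_url("/products", "abc123", "Mozilla/5.0 (compatible; Googlebot/2.1)"))
# ...while a cookieless human visitor carries the session in the URL.
print(rewrite_url("/products", "abc123", "Mozilla/5.0 (Windows NT 10.0)"))
```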