Website Redesign and SEO

A website redesign is usually undertaken when a site's design needs a refresh to meet current demands and market expectations. A redesign can make an old site look decent, but it can also make a high-traffic site invisible.

A website redesign should always keep search engines in mind. Since a redesign can change the URLs of a site's pages, it can affect how search engines interpret the new structure the website is offering.

More often than not, a website redesign can be disastrous for a site's search engine presence and SEO (Search Engine Optimization). The causes range from coding errors and SEO-unfriendly design practices to even more damaging mistakes (e.g., content duplication, URL rewriting without redirection, and information architecture changes that abandon search-engine-friendly techniques).

The redesign process should start with a collaborative call between the web developers and designers, the SEO team, and the website's owner(s) as decision maker(s).

Web Design

Architecture

A site redesign is an opportunity to upgrade the site toward the results you want: new and improved features, a better user experience, a more secure environment, faster load times, and so on.

When a website is redesigned, its structure should not depart much from the original. For search engines to easily recognize existing inbound links and URLs, most of the website's structure should remain the same. If major changes are necessary, redirects should be put in place.
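As a rough illustration, one way to gauge how much structure a redesign preserves is to diff the URL inventories of the old and new sites. This is only a sketch, and the two file names are hypothetical exports (e.g., from a crawler or a sitemap), one URL per line:

    # Diff the old and new URL inventories to measure how much of the
    # site structure the redesign preserves. File names are hypothetical.
    def load_urls(path):
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    old_urls = load_urls("old_site_urls.txt")
    new_urls = load_urls("new_site_urls.txt")

    print("Unchanged URLs:", len(old_urls & new_urls))
    print("URLs that disappeared (each one needs a permanent redirect):")
    for url in sorted(old_urls - new_urls):
        print("  ", url)

Every URL in the "disappeared" list is a candidate for the permanent redirects discussed below.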

Content Duplication

Search engines visit websites to index them, detect changes, re-index when necessary, and rank them. While a website is in its development environment (beta), it should be hidden from search engines so they do not index the site in an unfinished state, which can severely hurt SEO. The site should only be relaunched when the new design elements are ready for both search engines and the public.
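A minimal pre-launch check might fetch each staging page and confirm it is blocked from indexing. This sketch assumes a hypothetical staging host (beta.example.com), assumes the block is done with either a Robots meta tag or an X-Robots-Tag header, and uses the third-party requests library:

    import requests

    # Hypothetical staging URLs that should NOT be indexable before launch.
    STAGING_PAGES = ["https://beta.example.com/", "https://beta.example.com/products/"]

    for url in STAGING_PAGES:
        resp = requests.get(url, timeout=10)
        html = resp.text.lower()
        # Blocked if the page carries a robots noindex meta tag...
        meta_blocked = 'name="robots"' in html and "noindex" in html
        # ...or the server sends an X-Robots-Tag: noindex header.
        header_blocked = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
        state = "blocked from indexing" if (meta_blocked or header_blocked) else "WARNING: indexable"
        print(url, "->", state)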

Duplicate content occurs when a site has similar content on two or more of its web pages. Another form of content duplication is creating new URLs without properly redirecting the old URLs via a 301 permanent redirect. Either case leaves search engines unsure which page should rank, and it may draw a penalty.

Besides redirecting old URLs, a website should also remove any live copies on its server(s) that are visible to search engines.

A good rule for the redesign process: no one should use the word delete in reference to site content; a permanent redirect is a must.
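To verify the redirects before and after launch, a small script can request each old URL and confirm it answers with a 301 pointing at the new location. This is a sketch with a hypothetical redirect map, again using the third-party requests library:

    import requests

    # Hypothetical mapping of old URLs to their new homes after the redesign.
    REDIRECT_MAP = {
        "https://www.example.com/old-about.html": "https://www.example.com/about/",
        "https://www.example.com/old-products.html": "https://www.example.com/products/",
    }

    for old, new in REDIRECT_MAP.items():
        # Don't follow the redirect; inspect the raw response instead.
        resp = requests.get(old, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        ok = resp.status_code == 301 and location == new
        print(old, "->", resp.status_code, location, "OK" if ok else "FIX ME")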

Content Restrictions

Once a website is online, search engine spiders crawl it and index its pages. It is crucial for a website to identify which of its pages should be crawled and which should not.

One way to tell search engines which files and folders on a website to index and which to avoid is the Robots meta tag. But since not all search engines read meta tags, the Robots meta tag can simply go unnoticed, making it an unreliable way to instruct web crawlers/spiders. A better way to state your intentions to search engines is a Robots file (robots.txt).
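Python's standard library ships a robots.txt parser, which makes it easy to sanity-check the rules before publishing them. The rules below are a hypothetical example for a site that wants its admin and beta areas kept out of the index:

    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt rules: keep /admin/ and /beta/ out of the
    # index for every crawler that honors robots.txt.
    RULES = [
        "User-agent: *",
        "Disallow: /admin/",
        "Disallow: /beta/",
    ]

    parser = RobotFileParser()
    parser.parse(RULES)

    print(parser.can_fetch("*", "https://www.example.com/products/"))  # True
    print(parser.can_fetch("*", "https://www.example.com/beta/home"))  # False

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism.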

W3C And Code Validation

Take advantage of a website redesign to address code issues and how well the site adheres to W3C recommendations and Section 508 compliance factors. Since search engines value W3C recommendations, complying with them makes the pages easier to index successfully.
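That check can be scripted. The W3C's Nu HTML Checker exposes an HTTP interface that accepts a page's markup and returns its findings as JSON; the endpoint and parameters below follow the validator's public documentation, but treat them as an assumption to verify before relying on this sketch:

    import requests

    def validate_html(html):
        """Post markup to the W3C Nu validator and return its messages."""
        resp = requests.post(
            "https://validator.w3.org/nu/",
            params={"out": "json"},
            headers={"Content-Type": "text/html; charset=utf-8"},
            data=html.encode("utf-8"),
        )
        resp.raise_for_status()
        return resp.json().get("messages", [])

    page = "<!DOCTYPE html><html><head><title>t</title></head><body></body></html>"
    for msg in validate_html(page):
        print(msg["type"], msg.get("message"))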

Tracking

Websites need tracking code to report on their condition on the World Wide Web. Tracking code records user activity and reports it to a web analytics tool. After a website redesign, any analytics tracking code that was in use should be placed back in the page source before the site goes live. Additionally, any conversion pages should have the appropriate conversion tracking code added.
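One simple safeguard is a script that fetches each key page and confirms the tracking snippet survived the rebuild. A minimal sketch, assuming a Google Analytics gtag.js install and hypothetical URLs (adjust the marker to whatever analytics tool the site actually uses):

    import requests

    TRACKING_MARKER = "gtag("  # assumption: gtag.js; use your tool's signature
    PAGES = ["https://www.example.com/", "https://www.example.com/contact/"]

    for url in PAGES:
        html = requests.get(url, timeout=10).text
        found = TRACKING_MARKER in html
        print(url, "-> tracking code", "present" if found else "MISSING")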

To truly assess the success of the redesign from an SEO and sales standpoint, make sure site statistics are recorded and monitored closely post-launch. This data can be used to spot any problems once the site is relaunched.

The necessary data includes:

  • Pages indexed by search engines.
  • Ranking report.
  • Analytics data such as bounce rate, time on site, pages per visit, goal completions, etc.
  • Page load time test (a spot-check sketch follows this list).
  • Search engine spider crawl status.
  • A copy of what Google sees (via Fetch as Googlebot) and the average download and crawl times reported in Google Webmaster Tools.
  • W3C code validation report.
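For the page load time test, a quick spot check can be scripted and rerun with identical URLs before and after launch so the numbers are comparable. This sketch measures network fetch time only, not full browser rendering, and the URLs are hypothetical:

    import time
    import requests

    PAGES = ["https://www.example.com/", "https://www.example.com/products/"]

    for url in PAGES:
        start = time.perf_counter()
        resp = requests.get(url, timeout=30)
        elapsed = time.perf_counter() - start
        print(f"{url}: HTTP {resp.status_code} in {elapsed:.2f}s ({len(resp.content)} bytes)")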

Conclusion

A website redesign can make an old site look fresh, or it can make a high-traffic site invisible to search engines. Careful steps should be taken to avoid an unnecessary decrease in page rank and visibility.

The recorded data provides important information about how well or poorly a website redesign fares in the eyes of the search engines. Minding all the opportunities that a redesign presents from an SEO and usability standpoint can lead to a successful launch and a fruitful post-launch environment.