How to maintain traffic during a re-design

Worried about traffic loss after a re-design? No? [slap] Well you should be. Whenever a website goes through a re-design or a site migration, for whatever reason, there is always a risk of losing traffic.

If you push that shiny new website live without the necessary care and attention, those precious keyword rankings could suddenly disappear. What is the point of having a lovely new website if no-one can see it!?


To hold on to those keyword positions and to minimise the risk of traffic and revenue loss, here are some best practices that have helped me through a number of re-designs.

Plan the Process

To ensure that no elements of the migration process get overlooked, it is best practice to devise a thorough migration plan. This will help us to estimate the time, effort and resources needed to complete the process. Establishing objectives will allow us to measure the progress and success of the migration.

Objectives

  • Minimise traffic loss
  • Maintain key rankings
  • Maintain Domain Authority and Page Authority (link equity)

Time and effort

  • Assess how much time and effort is needed (recommend a budget).
  • Communicate the challenges, risks and effort to the client.
  • Share the migration plan with the client and team so that each party knows their responsibilities and deadlines.
     

Clean up your old Site

The majority of our web design projects here at BLISS involve a major re-design and changes to the URLs, site architecture and content. This is a perfect opportunity to clean up the legacy site and ensure that existing errors such as 404s and redirect chains are not replicated on the new site.

To do’s on the legacy site

  • Crawl the legacy site - I use the crawler application Screaming Frog.
  • Identify and record 301s - important, as this will reduce redirect chains later on.
    Redirect chains can slow page speed, leak link equity, create a poor user experience and reduce conversion rates.
  • Export pages - export all URLs that have received inbound links (so you keep control of the link equity). Open Site Explorer and Majestic SEO do this nicely; Webmaster Tools is free but provides much less data.
    This exercise may also highlight some juicy 404 pages that are receiving inbound links; these should be 301 redirected to the relevant page on the new site (see the sketch after this list).
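
If your crawler can export to CSV, a short script makes it easy to pull out the problem URLs before you start mapping. Here is a minimal sketch, assuming a hypothetical export file called legacy_crawl.csv with "Address" and "Status Code" columns (your crawler's column names may differ):

```python
import csv
from collections import defaultdict

# Group crawled URLs by HTTP status code so 404s and 301s
# can be reviewed (and fixed) before the migration.
pages_by_status = defaultdict(list)

with open("legacy_crawl.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Column names are assumptions; adjust to match your crawler's export.
        pages_by_status[row["Status Code"]].append(row["Address"])

for status in ("404", "301"):
    print(f"{status}: {len(pages_by_status[status])} URLs")
    for url in pages_by_status[status]:
        print(f"  {url}")
```

The 404 and 301 lists this produces feed straight into the URL mapping spreadsheet described further down.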

Tips to help you maintain traffic

  • Clean up the site - export the 404s and fix them before the migration. Again, I use Moz to identify these.
  • Take a snapshot of the current performance, page load times and rankings. This will help you measure how successful the migration was, and no doubt your client/CEO will ask for this information (a quick way to record page load times is sketched below).
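
Rankings and analytics data will come from your usual tools, but response times are easy to record yourself. Here is a minimal sketch, assuming the requests library is installed and a hypothetical list of key URLs; it times each response and writes the results to a CSV you can compare against after launch:

```python
import csv
from datetime import date

import requests

# Hypothetical list of the pages whose performance you care about most.
KEY_URLS = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

with open(f"performance-snapshot-{date.today()}.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "status_code", "response_time_seconds"])
    for url in KEY_URLS:
        resp = requests.get(url, timeout=30)
        # elapsed measures time to the response, not a full page render,
        # but it gives a consistent baseline to compare before and after.
        writer.writerow([url, resp.status_code, round(resp.elapsed.total_seconds(), 3)])
```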

 
Map Legacy URLs to the new site

During this stage, every URL on the legacy site must be redirected to a relevant page on the new site (where the URL has changed). This is usually done manually, although for large sites such as e-commerce stores some automation can speed things up. Take particular care over the high-ranking pages; redirecting only a portion of the legacy URLs will affect the site's overall domain authority.


URL Mapping Process (Checklist)

  • Put all the legacy URLs (identified earlier) into a spreadsheet.
  • Populate the associated page titles and meta descriptions - Screaming Frog does this automatically.
  • Include the 404s identified earlier that need to be redirected.
  • Identify the relevant URLs on the new site and populate a second column, called "destination", next to the legacy URLs. For large sites this step may require automation.
  • Any matched URLs can be removed from the spreadsheet as they already exist on the new site.
  • All remaining URLs in the spreadsheet need to be 301'd to their destination URLs (a sketch for generating the redirect rules follows this list).
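
Once the spreadsheet is complete, the redirect rules themselves can be generated rather than hand-written. Here is a minimal sketch, assuming a hypothetical redirect-map.csv with "legacy_url" and "destination" columns and an Apache server; it writes one Redirect 301 line per row (nginx or IIS rules could be produced the same way):

```python
import csv
from urllib.parse import urlparse

# Read the mapping spreadsheet (exported as CSV) and emit Apache redirect rules.
# Column names are assumptions; adjust them to match your own spreadsheet.
with open("redirect-map.csv", newline="", encoding="utf-8") as src, \
     open("redirects.conf", "w", encoding="utf-8") as out:
    for row in csv.DictReader(src):
        legacy_path = urlparse(row["legacy_url"]).path
        destination = row["destination"]
        # Permanently redirect (301) the legacy path to the new destination.
        out.write(f"Redirect 301 {legacy_path} {destination}\n")
```

Include the resulting file in the server configuration, or translate the same mapping into whatever your CMS or platform uses to manage redirects.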
     

Test & Monitor

  • Testing to ensure everything has gone according to plan is a critical part of the process. It is also important that the test environment is not accessible to any search engine (password-protect the environment and only allow access from certain IP addresses).
  • Generate a robots.txt file. Allow access to crawlers and point them towards the XML sitemap. This is also a great opportunity to make crawling more efficient by excluding parts of the site that contain duplicate content or multiple URLs with similar content, such as internal search and pagination pages.
  • Generate XML and HTML sitemaps
  • Test 301s - check that each legacy URL redirects to the correct destination (see the sketch after this list).
  • Monitor the site and robots.txt file
  • Monitor indexation in Webmaster Tools once the site is live
  • Final crawl and monitor crawl errors
  • Update the most valuable inbound links at the source (rather than relying on a 301) - contact the webmasters and ask them to change the link
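
Checking every redirect by hand quickly gets tedious. Here is a minimal sketch, assuming the requests library and the same hypothetical redirect-map.csv used above; it requests each legacy URL without following redirects and flags anything that does not return a 301 pointing at the expected destination:

```python
import csv
from urllib.parse import urljoin

import requests

# Verify each legacy URL returns a 301 pointing at the mapped destination.
# Column names are assumptions; adjust to match your redirect-map spreadsheet.
with open("redirect-map.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        legacy_url, expected = row["legacy_url"], row["destination"]
        resp = requests.get(legacy_url, allow_redirects=False, timeout=30)
        # Location headers can be relative, so resolve them against the legacy URL.
        location = urljoin(legacy_url, resp.headers.get("Location", ""))
        if resp.status_code != 301 or location != expected:
            print(f"CHECK: {legacy_url} -> {resp.status_code} {location} (expected 301 {expected})")

print("Done - anything printed above needs attention.")
```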
     

Success or fail?

Reindexing and crawling a new site can take time, so it is important to wait a number of weeks before you report on the success of the migration. Rankings may fluctuate even more than usual for the first three weeks and may even drop initially.

Quick final checks

  • Webmaster Tools - check the number of URLs indexed, and impressions

  • Check domain authority and page authority (these may take over a month to update)
  • Check site speed and how it compares to the legacy site (see Conor's article about how to speed up your page load times)

To help you along the way Sam has put together a handy little checklist for you to download (JPG [0.5mb]|PDF [0.7mb]) and print...

Website Migration Checklist