The Ultimate SEO Checklist for Website Redesigns

Redesign your site and maintain your Google rankings!

How to retain search engine rankings when redesigning a website

A website redesign/migration may be just what a site needs to get that traffic converting, but how do you maintain traffic after a website has been rebuilt? This SEO checklist takes you through all the steps and SEO considerations needed for a smooth transition to a new business website.

Use the links below to easily navigate this article:

A few redesign SEO considerations before we start
  1. How might website redesigns negatively affect SEO?
  2. Why does it matter if you switch domains during a redesign/migration?
  3. If you choose to change the domain for a redesign
  4. Pages also have a reputation (and you could lose it)
Redesign SEO: Let’s get our hands dirty!
    1. It’s time for an SEO audit of the current website
    2. Look for your top performing content
    3. Discover your backlinks and page authority
    4. It’s time for a backlink audit
    5. Consolidate your data
    6. Make a back-up of the current website
    7. Build a virtual URL structure
    8. Start building the new website
    9. Content, titles and meta descriptions
    10. Preparing your 301 redirects
    11. Adding 301 redirects to the .htaccess file
    12. Check the new website for errors
    13. Create an XML sitemap
    14. Make sure the new website’s content only has one version
    15. You can squeeze a bit more “juice” from old backlinks
    16. …and relax!

A few redesign SEO considerations before we start

1. How might website redesigns negatively affect SEO?

How a website redesign affects SEO depends on how it is carried out. Before redesigning or rebuilding a website, there are a few questions to consider: will you stay on the current domain, and which pages have built up authority that a careless migration could lose? The points below cover both.

2. Why does it matter if you switch domains during a redesign?

If the site has been around for some time, the domain will have built a reputation. Depending on how the website has been managed, this could be positive or negative.

Generally, the age of a domain works in your favour. Search engines don’t easily trust new domains, as trust has to be earned over time, and showing an unproven domain to their users is a risk they prefer to avoid.

Older domains, however, especially those that have been well maintained, are seen as trustworthy and are easier to rank. For this reason, consider staying with the current domain if you can.

3. If you choose to change the domain for a redesign

If you do choose to change the domain, I recommend seeking professional advice before you buy. Some available domains may not be as new as you think and could have a poor reputation.

If your chosen domain has been used for malicious or “spammy” purposes, it may be blacklisted or still have associated penalties. For this reason, a background check should always be carried out before committing to a new domain.

Even if the new domain is legit, there may be an initial drop in traffic if you switch domains.

Want help choosing a domain?

It's important your domain doesn't have a history of spam or malware.
If you are unsure, get in touch and I will check it for you for free.
Interested? Get in touch for a free chat.

(There's no obligation and we both might learn something!)

4. Pages also have a reputation (and you could lose it when you migrate)

It’s not just domains that build authority over time; individual pages do too. Many factors affect page authority, such as the backlinks pointing to the page, the relevance and depth of its content, and how long it has been live and indexed. Lose those URLs during a migration without redirecting them and the authority goes with them.

Redesign SEO: Let’s get our hands dirty!

1. It’s time for an SEO audit of the current website

Before proceeding with a website redesign, it's important to plan ahead. Let’s start by crawling the current site. For this, we will use the “Screaming Frog SEO Spider”. You can use the software for free, but many of its features are restricted, so I recommend paying for a licence (read ahead before you buy; we are not affiliated with Screaming Frog).

Once you have the full version of Screaming Frog SEO Spider, set the mode to “Spider”, type the current domain into the box and click “Start” (you may need to exclude URL query strings and ignore "robots.txt" in the settings). Make sure you are in the “Internal” tab and select “HTML”. Export the crawl and name it “old-site-keep”. Open it and save a copy as an Excel workbook or equivalent. Rename this file: “old-site-edit” — we will use this later.

To exclude URL query strings in Screaming Frog SEO Spider, head to: Configuration > Spider

This image demonstrates how to exclude URL query strings in Screaming Frog SEO Spider

To ignore robots.txt, go to: Configuration > robots.txt > Settings

This image demonstrates how to ignore robots.txt inside Screaming Frog SEO Spider

Then, run a scan of the website.

Check the following: page titles, meta descriptions, H1s, status codes, canonical tags and robots directives.

This data will also help you to recognise the current site’s strengths and weaknesses.

Confused over H1s and robots?

Smart Sprout will lend a hand. Smart Sprout offers business mentoring sessions, white-label SEO services and much more...
Interested? Get in touch for a free chat.

(Don't worry. This link won't take you off the page)

Does the website have duplicate pages?

Many websites automatically serve the same content at multiple URLs, and Google sees these as duplicates. Ideally, all versions will 301 redirect to a single URL structure. If the website has SSL installed, all non-secure URLs should be redirected to “https://”, or this could lead to insecure content warnings when users visit the website.

If you leave multiple versions of the same content, search engines may not understand which version to index, and other sites could be linking to the wrong one. Although this is unlikely to lead to penalties, it does mean backlink authority may be spread over multiple pages unnecessarily.

Here are a few examples of common URL structure variations:

  http://yoursite.com
  http://www.yoursite.com
  https://yoursite.com
  https://www.yoursite.com

We recommend choosing one of the secure (“https://”) URLs as the primary structure.
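
If the site runs on an Apache server, the non-secure “http://” versions can be sent to “https://” with a few lines of mod_rewrite in the “.htaccess” file. This is only a minimal sketch; check it against your own hosting setup before using it:

  # Send any request that arrives over plain http to its https equivalent
  RewriteEngine On
  RewriteCond %{HTTPS} off
  RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Step 14 covers the full site-wide redirect, including the “www”/non-“www” choice.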

2. Look for your top performing content

If analytics software is installed (such as Google Analytics), take a look at the pages report (or equivalent). Export the data to a spreadsheet and look for data such as pageviews (or sessions), bounce rate, average time on page and goal completions/conversions.

Colour code the rows so you can easily identify the best and worst performing pages.

Important! Look at the purpose of the page before you write it off. “Download” and “contact” pages often have relatively high bounce rates and low time-on-page. However, this does not necessarily indicate poor performance.

3. Discover your backlinks and page authority

As you may already know, backlinks pass on authority from external websites and you can retain most of it (roughly 85%) during a redesign by redirecting pages that have external inbound links.

  1. Head over to moz.com and sign up for a free account
  2. Go to MOZ’s Open Site Explorer and scan the website
  3. Select “Top Pages” and request a CSV — we’ll use this later
  4. Click on “Inbound Links” in the left-hand column — we want a list of all the external links pointing to the website
  5. Under “Target”, select “this root domain” — again select “request CSV” and click the link to download your exports
This image demonstrates how to download a CSV from MOZ after locating inbound links

If, during the audit, you found multiple versions of your domain that are not redirecting, check these in Open Site Explorer as well. You may find backlinks you didn’t know you had.

If you are using the free version of Screaming Frog SEO Spider and you are unable to access all of your pages or backlinks due to limitations, contact me and I can do this for you.

4. It’s time for a backlink audit

Follow these steps to prepare your files:

  1. Rename your inlinks CSV to “inlinks-keep.csv”
  2. Open “inlinks-keep.csv” and click file > save as
  3. Call this one “inlinks-edit” and save as an Excel workbook (or equivalent)

5. Consolidate your data

  1. Open “old-site-edit.xlsx” (Screaming Frog export). For now, hide all columns except “Address”, “Status Code” and “Title 1”.
  2. Referencing your pages export from Google Analytics (or equivalent), highlight your top performing pages on the “old-site-edit” spreadsheet
  3. Open the “Top Pages” export (OSE) and continue highlighting the strongest pages on “old-site-edit”
  4. Now, open the “Inlinks” export (OSE) and in “old-site-edit” highlight any remaining pages that have legitimate backlinks
  5. Finally, highlight any action pages such as your contact and download pages
  6. Insert an empty column before “Address” and label it “301”
  7. Next to each highlighted row type “Yes”
  8. Save your document

Need some help?

Smart Sprout will lend a hand. Smart Sprout offers business mentoring sessions, white-label SEO services and much more...
Interested? Get in touch for a free chat.

(Don't worry. This link won't take you off the page)

6. Make a back-up of the current website

When the new website is built, you may want to look back at the old site to compare. Using HTTrack, you can make a duplicate of the website and host it on a local drive for future reference. This helps you to view the pages, content and URL structure of the old website whenever needed.

  1. Download HTTrack
  2. Install HTTrack on your computer
  3. Run HTTrack
  4. Click “Next”. Then, name your project
  5. Click “Next” again and enter your domain name
  5. Click “Next” and “Finish” to continue with the default settings
  7. When the website has been downloaded, click “Browse Mirrored Website” to open the local copy of the site

7. Build a virtual URL/website structure

Before you begin the website redesign, it is a good idea to map out the site structure. Though it is good to have this in a spreadsheet, you may also want to visualise it on a sheet of paper. A spider diagram or flowchart may work well.

Create a new spreadsheet and call it “new-site-edit” — work with the developer to create a logical layout which prioritises the top pages.

8. Start building the new website

If you haven’t done this already, discuss your plans with the web developer and make sure you are on the same page. The web developer may be used to doing things differently, but it is important that no shortcuts are taken.

Build the new website on a test server. You don’t want the current site to experience much downtime as it may be de-indexed by Google and other search engines.

Before you start adding content to the website, block all content in "robots.txt" — that way you won’t have any duplicate content issues when the new site goes live.

With the developer, discuss the structure of the site. It is important not only that your top content is matched, but also that these pages aren’t too many clicks away from the home page. They should also sit within a logical structure.

If the current homepage is directly linking to your top pages, I recommend you do the same thing.

This image details the text used to block all content in robots.txt
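
Blocking every crawler from the whole test site typically takes just two lines:

  # Tell all crawlers not to crawl anything on this (test) site
  User-agent: *
  Disallow: /

Remember to remove this block when the new site goes live (step 12 includes a check for this).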

9. Content, titles and meta descriptions

Look at the old content and decide what you are going to copy across and what you will improve. This step is vital to maintaining your search engine rankings. Any content you change must cover its topic in depth. To be on the safe side, any new content should be over 500 words, easy to read and well structured. Google measures bounce rate and time on page, so writing relevant content alone isn’t enough: it has to be interesting and it has to invite clicks and actions.

Titles and meta descriptions should be carefully thought out. It is a good idea to put the main keyword at the start of the title and near the start of the meta description. Make descriptions enticing and relevant to users; if you don’t get the clicks, the website will likely lose its position.

If in any doubt about content relevancy, or about titles and meta descriptions, copy them from the old site — you can always experiment later.

10. Preparing your 301 redirects

When the website is built and ready to launch, scan the site on the test server using Screaming Frog SEO Spider. On the “Internal” tab, select “HTML”, then export the data to CSV.

Open the CSV file and the “old-site-edit” spreadsheet. Create a column next to the address column in “old-site-edit” and paste in the new replacement URLs next to the old top content — make sure the old and new URLs are matched like for like.

You now have all the data you need to create your redirects in the “.htaccess” file.

Open a txt document using notepad (or equivalent) and begin writing out your redirects, ready to be placed in the “.htaccess” file.

They should look something like this, using Apache’s standard “Redirect” directive (the paths and domain in the example below are placeholders, and the spaces are important):
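
  # Old URL path first, then the full new URL
  Redirect 301 /old-page/ https://yoursite.com/new-page/
  Redirect 301 /old-services.html https://yoursite.com/services/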

Place each redirect on a new line.

FTP and .htaccess files got you worried?

Smart Sprout will lend a hand. Smart Sprout offers business mentoring sessions, white-label SEO services and much more...
Interested? Get in touch for a free chat.

(Don't worry. This link won't take you off the page)

11. Adding 301 redirects to the .htaccess file

When your new website is ready to launch, you can add your list of redirects to the “.htaccess” file.

  1. Log in to the website server via FTP
  2. Download a copy of the “.htaccess” file
  3. Make another copy and keep it safe
  4. Rename your downloaded “.htaccess” file to “htaccess.txt”
  5. Open “htaccess.txt” in Notepad++ (or an equivalent text editor)
  6. Take your list of formatted redirects and paste them into “htaccess.txt”
  7. Upload the “htaccess.txt” file via FTP
  8. Rename the current “.htaccess” file to “.htaccess-old”
  9. Rename “htaccess.txt” to “.htaccess”

When you launch the website, check that each of these redirects is functioning. Switch Screaming Frog SEO Spider to “List” mode, add your original list of top URLs, run the crawl and check that each of them returns the status code “301”.

12. Check the new website for errors

  1. Change the mode back to “spider”
  2. Crawl the whole site — look to see if there are any errors such as missing/duplicate titles, H1s, or meta descriptions.
  3. Go to “Reports” and export: “Redirect chains”, “Canonical errors”, and “Insecure content” reports
  4. Click the response codes. Select “404” and export this, too
  5. Look over the exported reports for issues — fix anything that is broken
  6. Check that “robots.txt” is no longer blocking search engines from crawling the website
This image demonstrates how to export reports from Screaming Frog SEO Spider

13. Create an XML sitemap

Google and other search engines generally do a good job of finding content, but they recommend using an XML sitemap to help them navigate more easily.

You can create a sitemap using the Screaming Frog SEO Spider:

  1. Scan the website
  2. Select sitemap > Create XML sitemap
  3. Go through the steps and export a sitemap
  4. Upload the XML sitemap to the root folder of your website
  5. Head over to Google Search Console and submit your sitemap
This image demonstrates how to create an XML sitemap
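
The exported file is plain XML. As a rough idea of what Screaming Frog generates for you, a minimal sitemap looks something like this (the URLs are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://yoursite.com/</loc>
    </url>
    <url>
      <loc>https://yoursite.com/contact</loc>
    </url>
  </urlset>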

It is also a good plan to have an HTML sitemap to help users (and search engines) navigate the website.

14. Make sure the new website’s content only has one version

As mentioned earlier in the old-site audit, a lot of websites naturally generate duplicate versions of their content.

The following variations are common:

  http://yoursite.com
  http://www.yoursite.com
  https://yoursite.com
  https://www.yoursite.com

If these URL structures all lead to the same content without redirecting to the canonical (main/chosen) URL, Google recommends creating a site-wide 301 redirect to prevent duplicate content issues and indexation of the wrong URLs. They also recommend consistency in your internal linking: if the main home page is “https://yoursite.com”, link to “https://yoursite.com/contact” rather than “https://www.yoursite.com/contact”.

Also, if SSL is installed site-wide (recommended), make sure that any non-secure links are redirected to “https://”, or users may see insecure content warnings and could be blocked from accessing content on the website.

Ideally, choose one of the URL structures above (we recommend a secure “https://” version) as the canonical and redirect the rest to it site-wide.
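
On an Apache server, one common way to do this is a small mod_rewrite block in the “.htaccess” file. This is only a sketch, assuming “https://yoursite.com” (without “www”) is the chosen canonical; adapt the domain and test it on your own hosting before relying on it:

  # Send every "www." and every plain-http request to the
  # single canonical structure https://yoursite.com/...
  RewriteEngine On
  RewriteCond %{HTTPS} off [OR]
  RewriteCond %{HTTP_HOST} ^www\. [NC]
  RewriteRule ^ https://yoursite.com%{REQUEST_URI} [L,R=301]

If your canonical version does use “www”, swap the host in the final rule accordingly.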

15. You can squeeze a bit more “juice” from old backlinks

The website is built, and you have done almost everything you can to pass the authority of the old site on to the new one. However, backlinks only pass roughly 85% of their page authority through a 301 redirect. With that in mind, it is a good idea to look over these backlinks and highlight the ones you think offer the most authority. Contact the site admins, let them know you have a new website and ask if they would mind linking directly to the new URL. That way, you get 100% of the “link juice”.

16. …and relax!

Great! You have completed the necessary steps to keep your rankings. Now it’s up to the search engines to index your new content. There are no guarantees that Google will like all of the changes to your site, so track your rankings closely (I recommend SERP Lab), keep an eye on your analytics data and be ready to adapt.

To infinity and beyond!

Do you want to take this website to the next level? Smart Sprout offers business mentoring sessions, white-label SEO services and more...
Interested? Get in touch for a free chat.

(There's no obligation and we both might learn something!)