Preserving SEO Quality During A Site Migration or Site Move

Moving a site, whether to a new code base or an entirely different domain, comes with a large set of challenges. As an SEO, one of my biggest concerns is making sure that everything with SEO value is preserved during the move.


This includes the usual on-page elements:

- URL
- <title>
- <h1>
- Meta description
- Meta robots (noindex, nofollow)
- Canonical tag
- Breadcrumbs

There are a lot of steps involved in a move like this, especially considering all of the various back-end changes. For this example I am going to use transferring a website to a new code base.

At the heart of any major migration like this are URLs. Since things like PageRank, links, and cached information are handled at the URL level, making sure that your top URLs are accounted for is a must. For this first part of the tutorial I am going to focus on compiling a spreadsheet of your top URLs.

In order to preserve SEO value you need to compile a list of your top-performing URLs. There are a few ways to do this. Here are the tools you will be using:

- Google Analytics

- Google Webmaster Tools

- SEOMoz (if you have an account)

Let’s start with analytics.

 

1. In Google Analytics you can export a CSV of your top-performing URLs based on site traffic. This is useful because it shows you the most visited pages on your site. Once you have this information you can use it for benchmarking, reporting, or various other SEO tasks.

To start go to

Content >>> Overview >>> Site Content >>> All Pages

For numbers’ sake, set your date range back about a year; this gives you more data to work with and a more accurate picture. At the bottom of the page, extend the results shown to 5000. Depending on the size of your website, you may not need all of these results.

Once you have all the results, export the URLs as a CSV. Now you have a list of your top 5000 URLs based on traffic.

What I normally do with this list is immediately run it through a crawler such as Screaming Frog or something similar. This way I can get an idea of the server responses that my top pages are serving.
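If you want a quick sanity check before (or alongside) a full crawl, a few lines of Python can fetch each URL and record its server response. This is only a rough sketch under stated assumptions: the export is saved as top_urls.csv with the page path in the first column (the actual Analytics export layout may differ), www.example.com stands in for your domain, and it uses the third-party requests library.

import csv
import requests

# Rough sketch: record the server response for each exported URL.
# Assumes top_urls.csv holds the page path in its first column.
with open("top_urls.csv", newline="") as f, open("url_status.csv", "w", newline="") as out:
    reader = csv.reader(f)
    next(reader, None)  # skip the header row
    writer = csv.writer(out)
    writer.writerow(["url", "status"])
    for row in reader:
        url = row[0]
        if not url.startswith("http"):
            # Analytics exports paths, not full URLs (placeholder domain)
            url = "http://www.example.com" + url
        try:
            status = requests.head(url, allow_redirects=False, timeout=10).status_code
        except requests.RequestException:
            status = "error"
        writer.writerow([url, status])

The resulting url_status.csv gives you a quick picture of which top pages are serving 200s, redirects, or errors before the migration even starts.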

In the next post I will show you what to do with this list during the migration.


3 Alternative Uses For Screaming Frog

1. Validate Pages That Don’t Have Ad Tags

When you’re working at a really large tech company that serves ad tags from a large array of ad networks, it can be quite cumbersome to deal with ad-serving errors, or even just misformatted ads. Screaming Frog makes it easy to validate on-page ad tags and alleviate some of these issues. To do it, simply place the ad ID tag under:

Configuration >>> Custom

From there you should see the custom filter screen.


Paste your ad tag code in the “Filter 1” field.

In the right-hand field select “Does Not Contain”.

Begin your crawl. When the crawl is finished, click on the “Custom” tab.

From there you should see all of the pages that DO NOT have the ad tracking code. This is handy if you’re targeting specific pages on your website.
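If you ever need to replicate this check outside of Screaming Frog, the same “Does Not Contain” idea is a few lines of Python. This is a minimal sketch: the ad tag string and URL list below are placeholders for your own, and it uses the third-party requests library.

import requests

AD_TAG = "div-gpt-ad-12345"  # placeholder: your real ad tag snippet
pages = ["http://www.example.com/", "http://www.example.com/about"]  # placeholder URLs

# Flag any page whose HTML does not contain the ad tag snippet.
for url in pages:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        print(url, "-> fetch error")
        continue
    if AD_TAG not in html:
        print(url, "-> missing ad tag")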

2. Validate Google Analytics Tag

In the same way that you can use Screaming Frog to validate ad tags, you can do the same for Google Analytics. Simply put the Analytics tracking ID in the “Filter 1” field. This ID is usually an 8-digit number that you can find in your Google Analytics account. To find it, go to your website’s analytics page and navigate to:

Admin >>> Tracking Info

From there copy and paste the ID into the screaming frog field.

This is particularly handy if you don’t use a conventional CMS like WordPress.

3. Recheck Google Webmaster Tool Errors

Google Webmaster Tools provides a broad sweep of crawl errors which can be exported into a CSV file. From there you can recrawl the URLs in that CSV file and validate those errors.

In my experience, many of the errors that Google reports are false positives, so it’s always useful to double-check these issues before devoting dev time. Besides just validating server responses, this is also a great way to troubleshoot other on-page elements. To do this, use Screaming Frog’s “List” feature and you’re good to go. Once the URLs are crawled you can analyze them further or export the data into another CSV file.
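As a rough sketch of the same recheck in Python: re-request each URL from the exported errors file and flag anything that now answers with a 200, since those are the likely false positives. The file name and column layout are assumptions (Webmaster Tools exports vary), and requests is a third-party library.

import csv
import requests

# Sketch: recheck a Webmaster Tools crawl-error export and flag
# URLs that now return 200 -- the likely false positives.
with open("crawl_errors.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader, None)  # skip the header row
    for row in reader:
        url = row[0]  # assumes the URL is in the first column
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            continue
        if status == 200:
            print(url, "-> now returns 200, probably a false positive")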

Hope this helps

Carl


Review – The Little Book Of WordPress SEO By Kalin Nacheff


 

One of my former colleagues, Kalin Nacheff, just finished writing his comprehensive guide, WordPress SEO. I highly recommend it for anyone who is using the WordPress platform to manage their website and wants to improve their search engine optimization. One of the biggest benefits of Kalin’s guide is that it is tailored specifically to WordPress: he goes line by line through the different setting changes, plugin downloads, server configurations, and so on that you need to make in order to improve your rankings.

Having worked closely with Kalin for a while now, I can say this guide truly represents the distillation of his SEO knowledge. It is a clear, no-nonsense walkthrough for anyone who wants to cut the fluff and get straight to the point with SEO.

In particular, I believe this book will be most helpful for anyone who is starting out with a brand new website. WordPress SEO has a breadth of information on how to properly set up an SEO-optimized site from the beginning. Kalin’s book covers everything you need for the initial setup of your own blog, and what’s most beneficial is that it’s laid out in a way that ANY person can follow; you don’t need to be an SEO guru to get the most out of it.

One final note: Kalin doesn’t get caught up in the theory or the obscure aspects of SEO the way a lot of agencies and marketers do. The real benefit, I believe, is that he has done all the research, sifted through the many plugins, site configurations, and more, and simply lists them out, in order, in a way that is easy to understand.

Truly a worthwhile read.

Kalin’s Personal Website

Buy WordPress SEO Book

-Carl


Pubcon 2012 – Google Algorithm Updates

These are the changes to the algorithm that were announced.

Social is the next big signal that will help determine rankings. In a case study, SearchMetrics found that a Google+ “+1” has 50 times more importance than a Facebook “share”, while Pinterest and Twitter showed almost no improvement in rankings. I tested the Google+ claim and it appears to be true.

- Google+ – 1 share
- Facebook – 50 likes
- Pinterest – no index
- Twitter – no index


- Quality of backlinks and keywords in UGC content are now more important than ever.

- Facebook will launch a search engine in early 2013. If they do, they could capture 22% of the search market and become the second-largest search engine.

- Social signals reveal high-velocity content and show what’s trending in search.

- Google timestamps every +1 and follows the clicks (QDF); pages that get a lot of +1s in the same hour tend to rank higher.

- It takes only 1 Google+ “share” to get a page ranked and indexed.

- It takes 25-50 Facebook shares to get a similar page ranked and indexed.

- All pages that had a +1 showed stable rankings in the long term.

- There is now a new Googlebot for crawling mobile phones, tablets, etc. More data to analyze.

- Google will now message you if manual action was taken on your website to reduce rankings.

- It was revealed that duplicate content doesn’t affect rankings, but it does affect crawling and indexing.

International SEO Takeaways – PubCon 2012

Just attended PubCon; here are some notes that I took on international SEO. They are a little scatterbrained at the moment, but I plan to clean them up in the coming weeks.

International SEO Notes PubCon 2012 

• Yandex is now the largest search engine in Europe
• Baidu’s search engine pushes almost all of the organic results below the fold, and paid ads get the top spot! This doesn’t apply to long-tail search terms, however
• China has the lowest connection speeds in the world
• One of the best ways to rank in China is to integrate
• The Korean search engine Naver looks at how long a user stays on your site and how you rank amongst your competitors to determine your final rankings
• Yandex updates their index every 2 weeks; your results can fluctuate during that time
• Press releases work really well in Russia; numerous case studies have shown improved rankings

The 3 Fundamentals Of SEO

 


There are three components of SEO that you should be aware of.

1. Crawling
2. Indexing
3. Ranking

Crawling

Crawling is the act of having spiders (oftentimes called crawler bots) scan various parts of your website. It is important to design websites that are as crawler-friendly as possible. The bots are always crawling the web for new pages and content. A bot has various ways to crawl your website, which will be discussed in more detail within this guide.
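To make the idea concrete, here is a toy spider in Python: fetch a page, pull out its links, and queue up any new same-site links it finds. This is only a sketch; a real crawler also respects robots.txt, throttles itself, and handles many more edge cases. The start URL is a placeholder.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

start = "http://www.example.com/"  # placeholder start URL
seen, queue = {start}, deque([start])
while queue and len(seen) < 50:  # stop after discovering 50 pages
    url = queue.popleft()
    try:
        html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    except OSError:
        continue
    parser = LinkParser()
    parser.feed(html)
    for href in parser.links:
        link = urljoin(url, href)  # resolve relative links
        if urlparse(link).netloc == urlparse(start).netloc and link not in seen:
            seen.add(link)
            queue.append(link)
print(len(seen), "pages discovered")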

Indexing

Once your website is crawled, search engines have to store the pages in their database. Most search engines have two databases: a main and a supplemental database. Which database your web pages are stored in can affect your rankings.

Rankings

Often confused as the main component of SEO, rankings are where your web pages show up in search results pages. There are many different factors that affect your rankings, which will be discussed in more detail later. External factors beyond what you can edit will also affect your rankings, including a user’s location, cookies, login status, Google+, session IDs, etc. In this guide we focus on the factors you can directly control.


Taxonomy For SEO Purposes – Content Sites


Taxonomy is the process of creating different category structures for your web pages. Although it is more for users than for search engines, proper taxonomy can help spiders understand your site hierarchy, and it is also a great way to build PageRank among relevant pages through proper internal linking and relevant content.

 

Setting Up Taxonomy 


1. Do keyword research to find the most logical category names. These names should also make sense for the users.
2. When creating categories and subcategories, make sure you have enough pages to populate those categories. If you don’t, consider moving those items to other places.
3. Ideally you want to limit content crossover; an article should appear under only one category, or spiders might register it as duplicate content (see the sketch after this list).
4. On individual article pages, include the category keyword. For instance, if you have a tennis shoe article page, be sure to include the word “shoe” in the name.
5. Go through all your articles and add category keywords where possible.
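Point 3 is easy to audit if you can dump your article-to-category assignments from your CMS. Here is a minimal sketch in Python; the hard-coded mapping is placeholder data standing in for whatever your CMS exports.

# Sketch: flag articles assigned to more than one category,
# a potential duplicate-content risk. Mapping is placeholder data.
article_categories = {
    "/articles/tennis-shoe-review": ["shoes", "tennis"],
    "/articles/marathon-training": ["running"],
}

for article, cats in article_categories.items():
    if len(cats) > 1:
        print(article, "-> appears under multiple categories:", cats)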

 


Filtering Google Search Results For More Focused Results

Currently I have been working a lot in Adobe Flash Builder and developing apps in ActionScript. One of my projects requires me to research other Flash websites. Luckily, Google is here to help.

For anyone who needs to do research, Google itself is pretty darn good, but for more specific results Google has a slew of advanced searches you can do.

1. Go to Advanced Search

This can all be accessed under the “advanced search” tab by clicking on the gear icon. From this tab you can filter the results by a slew of different criteria; for my purpose I was interested in looking for .swf files, so I filtered by “File Type”.

2. Filter Search Results

The power of Advanced Search is being able to filter by parameters such as file type, which gets you more relevant and specific results. In my example I filtered by .swf, but you can also filter by date updated and more.

3. No luck? Try again!

The beauty of this tool is that if you don’t immediately find what you’re looking for, you can retry with different filters. For instance, if you were trying to find the owner’s manual for a Subaru car and you couldn’t find it as a .pdf, there is always a possibility that someone embedded it as a .swf or even an Excel file. Just keep trying.
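For what it’s worth, you can also skip the Advanced Search form entirely and type the filter straight into the search box with the filetype: operator. For the Subaru example above, the queries would look something like:

subaru owners manual filetype:pdf
subaru owners manual filetype:swf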

- Carl


What Bing’s Disavow Feature Implies for Bing Search Results

On June 27th, Bing announced a new “disavow” feature for backlinks. If you already have a Bing Webmaster account, you can access the feature through your “Configure My Site” option. There are two things that I believe this implies for Bing’s results.

1. They’re Being More Aggressive Towards Spammy and Bad Neighborhoods

Allowing users to disavow certain links is similar to what social media sites and blogs have been doing with their comments. I assume that, just as a comment can be removed after it receives enough “spam” flags, Bing is looking to single out domains that have received a large number of “disavows”.

2. Almost Everyone Will Need to Start Using It

Save for a few high-authority sites, I predict that this new feature will create upward pressure on all Bing webmasters to start disavowing links. Once a domain receives enough “disavows” that Bing identifies it as a “spammy” site, it’s easy to assume the Bing spiders will crawl all of its outbound links to see where they’re pointing. How much this would negatively affect sites that have those backlinks, I am not sure, but since Bing is giving you the option of disavowing links, I predict it will force webmasters to seriously audit their backlinks.

 



2 Quick Mobile SEO Tips

1. Don’t Block The Desktop Googlebot In The Robots File

This is a common mistake many mobile site developers make. Google has officially said that blocking the desktop crawler is not necessary for proper mobile indexing and ranking. The crawlers are smart enough to tell the difference, and you might miss out on indexing opportunities if you were to block it.

2. Create And Submit A Mobile-Specific Sitemap To Google Webmaster Tools Listing The Crawlable Web Pages (Not JavaScript)

Just like it’s important to maintain and submit a regular sitemap for your website, a mobile XML sitemap is another great way to give spiders direction on which pages are mobile versus desktop.

Here is an example of one:
http://www.google.com/mobilesitemap.xml.
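Inlined, the mobile sitemap format looks roughly like this; example.com is a placeholder, so check Google’s documentation below for the exact, current schema:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://mobile.example.com/article.html</loc>
    <mobile:mobile/>
  </url>
</urlset>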

More information from Google here:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=34648

- Carl
