#does this mean that matt and gary will also migrate
Explore tagged Tumblr posts
EVERYONE AT TOMBLR DO YOU THINK THE GREAT TWITTER TO TUMBLR MIGRATION WILL INVOLVE TOM
DO YOU THINK WE WILL BE FOUND OUT FOR OUR CRIMES
#(tom voice) i am at tumblr and am very concerned#does this mean that matt and gary will also migrate#DIDNT MATT HAVE A TUMBLR ONCE#we will never be safe#tom scott#toot!!#raggedy
Nine link building myths you should ignore in 2019 (Search Engine Watch)
Almost anyone running a B2B or B2C business knows that Google and other search engines value quality links and treat them as one of the top ranking factors.
So, if you want your website to outrank your competition on search engines, a proper link building strategy is non-negotiable.
However, if you’re going to implement link building in your 2019 digital marketing strategy, you have to do it the right way.
Search engines shroud their algorithms in secrecy, so the SEO and link building industry is flooded with many myths that will never get you results but can get you into a lot of trouble.
To avoid investing resources into wasted link building efforts, pay attention to these nine link building myths that won’t get you anywhere in 2019.
1. Guest posting is dead
This myth started to get really popular in 2014 when Google’s Matt Cutts said,
“Okay, I’m calling it: if you’re using guest blogging as a way to gain links in 2014, you should probably stop. Why? Because over time it’s become a more and more spammy practice, and if you’re doing a lot of guest blogging then you’re hanging out with really bad company. So stick a fork in it: guest blogging is done; it’s just gotten too spammy.”
Because of how direct and stern this warning from Cutts was, it’s understandable that many people believe guest blogging is genuinely dead.
However, Cutts later clarified the statement: what he meant was that spammy guest posts written purely for SEO were dead.
This means that publishing relevant and resourceful blog posts on authoritative sites for building links, exposure, branding, increased reach, and building a community is still very relevant in 2019.
2. Links not relevant to your niche are low-quality links
This is a prevalent myth that contradicts the fundamental idea of link building in 2019. To rank high, you need to get top authority sites to link back to your site. To get these sites to feature your link, you need to provide relevant content for them. Moreover, whether or not that content is related to your niche, it can still improve your ranking.
So, when your site receives a backlink from a site outside your niche, Google will not automatically frown upon it.
3. Building tons of links to a single piece of content is spammy
Many people still think that building tons of links to a single piece of content could negatively impact their keyword rank. Again, this link building myth contradicts itself because it goes against the idea of organic link building.
If search engines do not penalize a highly original, valuable webpage that many other sites link to because its content is helpful and informative, why would they consider a piece of content with tons of backlinks spammy?
However, if your links are low quality (from spammy content networks and directories), you could be slapped with a manual penalty or significant link profile devaluation.
4. Link building is irrelevant if you already rank high in search queries
It’s sad, but many marketers still believe this. Link building, like other digital marketing strategies such as social media marketing and blogging, should be consistent. Not only because it helps you maintain your position above your competition in search queries, but also because it helps you with the following:
Increase your brand’s visibility across the web
Increase traffic to your domain
Showcase your brand’s authority and value
Link building is not just about increasing the volume of links to your site; it also exposes your business to new customers.
5. Google will always prioritize sites with more backlinks over others in search queries
The truth is there isn’t a “one size fits all” for search engine ranking. There are about 200 ranking factors related to UX, mobile usability, technical performance, query intent, and many more.
Google’s ranking factors are very dynamic. According to Google Webmaster Trends Analyst John Mueller, the search engine weighs its ranking factors differently depending on the intent behind a particular query.
So, while link building is a valuable ranking factor, Google algorithms find a balance between its 200 ranking factors before displaying results to a search query.
6. All pages/posts/links on your site have an equal ranking value
When people talk about this myth, they usually mean either of these two things:
Every post on your site has the same authority or
All links on a page are of equal ranking value
Both statements are wrong. In the first instance, a post that has been linked to by high authority sites will rank higher than others that have not. Tools like WebSite Auditor can be used to check the individual ranking value of your site’s posts.
As for the second statement, Google’s John Mueller confirmed that their search algorithms take into account the position of a link on the page where it appears.
So take advantage of link positioning. SEO experts like Bill Slawski and Rand Fishkin recommend positioning your links higher on the page because the higher a link is placed on the page, the more it weighs, and the more value it passes to the pages it links to.
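Since link position matters, it can be worth auditing where links actually sit in your markup. Here is a minimal sketch using Python’s standard-library HTML parser; the class and function names are my own, and “position” here is simply source order, a rough proxy for placement on the rendered page:

```python
from html.parser import HTMLParser


class LinkPositionAuditor(HTMLParser):
    """Collects anchor hrefs in the order they appear in the HTML source."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)


def link_positions(html):
    """Map each href to its source-order index; lower index = higher on the page."""
    auditor = LinkPositionAuditor()
    auditor.feed(html)
    return {href: i for i, href in enumerate(auditor.links)}
```

Running this over a page quickly shows which important links are buried far down the document.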
7. Internal links don’t help you rank higher
While high-quality external links are one of the most important ranking factors, internal links also play a huge role in helping you rank higher. This is because linking from higher- to lower-ranking pages can give a massive boost to weak pages. Interlinking related content on your website also creates what search engine experts call a “topic cluster”.
In 2019, topic clusters are significant because when a search query is made for a particular topic and search engines find relevant topic clusters on your website, your site will be considered an authority in this field and will automatically rank higher than other sites with relevant single pages.
8. Stuffing your image alt texts with relevant keywords helps you rank higher
Image links are not bad for SEO. However, too much of anything is never a good idea. And this applies to image link building. While there are no penalties for using image links, stuffing your image alt tags with keywords to manipulate rankings is against Google’s guidelines.
Before Google started using AI and machine learning to understand images, people had to stuff their alt tags with text to ensure the pictures appeared in relevant search queries. In 2019, however, Google’s algorithms can interpret the image itself, so alt text should simply describe the image accurately.
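If you want a crude check for stuffed alt text, counting keyword repetitions is enough to flag the worst offenders. This is an illustrative heuristic of my own, not anything Google publishes:

```python
def is_stuffed_alt(alt_text, keyword, max_repeats=1):
    """Flag alt text that repeats the target keyword more than max_repeats times.

    A natural, descriptive alt text rarely needs the same keyword twice.
    """
    return alt_text.lower().count(keyword.lower()) > max_repeats
```

You would run this over every `img` tag’s alt attribute and review anything flagged by hand.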
9. Wikipedia and Wiki-like pages are the Gods of domain authority building
Many people are convinced that getting a link back from pages like Wikipedia will automatically give them a higher ranking authority because of the exceptionally high domain authority Wikipedia has. But sadly, digital marketing has as many facts as it does fallacies.
Information directly from Google’s Gary Illyes tells us that Google ranks Wikipedia just like any other website.
Wikipedia is ranked just like any other website.
— Gary “鯨理” Illyes (@methode) December 16, 2016
In conclusion
Don’t allow the fear of spamming to keep you from harnessing the many fantastic benefits of implementing a link building strategy.
Also, although Google’s dynamic algorithms are usually hidden, SEO and link building agencies like SEO POW study them every day to let you know what’s a fact and what’s a fable.
Segun Onibalusi is the Founder and CEO at SEO POW, an organic link building agency. He can be found on Twitter.
Why Redesigns Sabotage Your SEO Rankings (And How to Avoid It)
It seems like most companies redesign their website every year or so.
New trends gain steam, so they want to be more ‘contemporary’ or ‘flat.’
Or new color schemes are en vogue. So every site you visit looks Asana-bright.
Everyone now wants to update their site just as often.
I love experimenting with new color schemes and trends, too.
Except for one tiny thing.
I hate redesigns.
Design updates are good. They allow you to incrementally make improvements to make sure your site is up-to-date.
But full-scale redesigns?
Where you completely overhaul the site architecture and page content?
You should avoid those like the plague.
I know that sounds surprising. But I’m going to share a few examples of how and where website redesigns go bad.
Especially when it comes to destroying all of the hard-earned SEO rankings you’ve built up over time.
Here’s how to avoid sabotaging your own SEO rankings with your redesign.
Site architecture changes cause you to lose links
There are hundreds of rankings factors for SEO.
But backlinks still reign supreme.
External links have been considered ‘votes’ since the beginning of (internet) time. Their quantity, diversity, and authority pass the most influence to raise your position in the SERPs.
Internal links don’t count for as much value. However, they do have a direct influence over someone’s website experience.
I’ll explain.
In 2011, Google Panda was released. It was one of the first reported cases where Google confirmed the use of qualitative factors.
They used a survey with questions like:
Would you trust information from this website?
Is this website written by experts?
Would you give this site your credit card details?
Do the pages on this site have obvious errors?
Does the website provide original content or info?
Would you recognize this site as an authority?
Does this website contain insightful analysis?
Would you consider bookmarking pages on this site?
Are there excessive adverts on this website?
Could pages from this site appear in print?
And they had people individually rate different websites.
Fast forward a few years, and Google also started taking user behavior into account.
They don’t just want to rank websites based on links or content length. They also want to look at the overall experience of that website.
They want to make sure that people find what they’re looking for.
So the better experience visitors have, the more credit the site will get.
What’s one way to ruin an otherwise nice experience?
Broken links that derail someone’s path through your site.
When most companies redesign websites, they start messing with the site architecture.
They create new pages and ditch old ones. Or they take content from one page and add it to another.
Then, they switch up their menus and navigation schemes.
It seems harmless on the surface. The new experience might even be superior to the old one.
But what they don’t realize is that they’re often creating a TON of problems for SEO.
For starters, site architecture changes can ruin hub pages you’ve worked hard to build.
These are like clusters of related pages on your site. And they can help increase your perceived authority on those topics.
Page-level changes also create broken internal links throughout the site.
You know the drill. You try to click on a new page to find related information, only to be met by a 404 error.
One or two isn’t a big issue. Redesigns, however, often create a ton of them all at one time.
For example, let’s say you’re redesigning a hotel or ecommerce website.
Chances are, you’re using a detailed parent-child structure to organize pages.
That means you might have “Rooms” at the top, followed by the individual types of rooms underneath.
The problem is that these structures often change over time.
Maybe you come out with new products or services. Maybe you migrate old rooms into new ones.
One seemingly small change can often create a ripple effect throughout your site.
It might make perfect sense to move your featured rooms up a level or two.
However, a change to your URL structure doesn’t create just one or two broken links.
It can literally create hundreds or thousands.
Take blogs for example.
Let’s say you’ve worked hard over the years to create hundreds or thousands of blog posts.
But when it comes time to move over to a new CMS during a site redesign, someone wants to remove the date strings from the URLs.
Heck, all it takes is literally a single click inside WordPress to update Permalink Settings.
So yes, it seems harmless.
I’ve actually seen this mistake time and time again.
Poor, unsuspecting business owners who have their entire websites practically break.
Tens of thousands of page URLs break overnight.
And you know what happens to their rankings?
They drop like a rock.
Fortunately, Google Search Console can help you spot broken links under the Crawl Report.
My favorite tool for technical SEO audits is Screaming Frog.
It will crawl every page on your site, uncovering tons of on-site SEO issues.
For example, you can start by looking for the “Client Error (4XX)” report under Response Codes.
Most of these will be 404 errors, when the status is reported as “Not Found.”
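If you export crawl results (from Screaming Frog or your own crawler) as URL-to-status pairs, pulling out the 4XX bucket is trivial. A sketch, with a function name of my own choosing:

```python
def client_errors(status_by_url):
    """Return URLs whose response codes fall in the client-error (4XX) range,
    the same bucket Screaming Frog reports under 'Client Error (4XX)'."""
    return sorted(url for url, status in status_by_url.items()
                  if 400 <= status < 500)
```

Most of what this surfaces after a redesign will be 404s from pages that were moved or deleted without redirects.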
So far, we’ve been focusing almost exclusively on broken internal links.
But that’s not the only way redesign changes can affect your site links.
Think about it this way.
Older, high-authority pages or posts tend to acquire the most backlinks.
The highest value links are also the hardest to get. These include editorial links, for example, that come from journalists or other influencers.
That also means you can’t control them.
So when your page or post URL changes, you will lose all of those external links, too.
This, again, happens all the time.
Permalink updates, moving the blog from a subfolder to subdomain, or even just new product pages replacing old ones can force you to lose all those backlinks.
The best solution? Don’t change old page URLs!
At least, not if you can help it.
Otherwise, another way to side-step this problem is through setting up 301 redirects.
These are ‘permanent’ redirects, telling search engines that the new page has now replaced the old one.
The Quick Page/Post Redirect Plugin for WordPress is one of the most popular options.
It’s also incredibly easy to use. All you have to do is drop in the old “Request” URL and then direct it to the new “Destination” one. The only caveat is that redirects like these should be used sparingly.
What you don’t want to see, is something like this:
Loading up on too many 301 redirects can cause other unintended consequences. And they’re usually a sign that there’s a bigger, underlying issue at play.
It means the site architecture has changed dramatically.
Here’s why too many redirects can also affect your SEO rankings.
Too many 301 redirects can cause slow page speeds
“301” redirects have long been considered the best for SEO.
They indicate a ‘permanent’ change, as opposed to the ‘temporary’ one that a 302 signals.
Either way, SEOs still feared that redirects would somehow limit the amount of PageRank that flowed through to the site.
Even Google’s own Matt Cutts once indicated some loss.
But in 2016, Google webmaster analyst, Gary Illyes, confirmed that all 3XX links pass full value:
Another Googler, John Mueller, confirmed the same findings.
Why does this all matter?
Because redirects are often now used to update websites to HTTPS. So some SEOs think this is Google’s way to help make sure people adopt it.
Last year, Google Chrome users started seeing new security warnings.
Previously, up to 70% of users would ignore website security warnings. So Google rolled out new ‘Not Secure’ messages for sites that don’t set-up SSL certificates.
Moving from HTTP to HTTPS isn’t as simple as you might think, though.
For example, beyond installing the certificate, you need to flip a switch inside Google Search Console to pick the ‘preferred domain’ of your site.
That way, you avoid potential canonicalization issues from your site being treated as two sites: a “www” and a “non-www” version.
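One quick way to audit this kind of canonicalization is to normalize hosts before comparing URLs. A stdlib sketch; the function name and the `prefer_www` flag are illustrative, not any official tool:

```python
from urllib.parse import urlsplit, urlunsplit


def canonical_host(url, prefer_www=False):
    """Normalize a URL's host to one canonical form (www or non-www)."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if prefer_www:
        host = "www." + host
    # Reassemble with the normalized host; path, query, and fragment are untouched.
    return urlunsplit((parts.scheme, host) + tuple(parts[2:]))
```

Two URLs that normalize to the same string should ultimately resolve to a single canonical page.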
As discussed, any URL changes can cause you to lose links.
Architecture changes can break internal links. But you can also lose out on ‘link equity’ if sites link to HTTP and not the new HTTPS-version of your site.
Again, why are we harping on redirects?
Because too many can slow down your site’s performance.
And page speed has been officially confirmed as a ranking factor.
Kinsta ran a test on WordPress to see how redirects affect page speed.
First, they used Pingdom to run a page speed report with no redirect.
The page loaded in around 1.06 seconds. That’s a good score!
Next, they ran the test again. But this time, through a redirected URL.
And check out how it affected page load time:
Crazy, right?!
A redirect increased page load time by 58%.
That’s just a single page redirect, too.
Multiply this across dozens of redirects and you can see the problem.
Even worse is when multiple redirects occur one right after another.
This often happens if you’ve updated a page more than once. As in, multiple redesigns over the years.
One URL redirects to another, which redirects to another. And page speed slows to a crawl.
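Chained redirects are easy to detect once you have your redirect map (exported from your server config or plugin, for example). A sketch that follows the chain and guards against loops; the mapping format here is an assumption:

```python
def redirect_chain(url, redirects, max_hops=10):
    """Follow a mapping of old-URL -> new-URL redirects and return the full chain.

    A chain longer than two entries means more than one hop: exactly the
    pattern worth eliminating by pointing the oldest URL straight at the
    final destination.
    """
    chain = [url]
    while chain[-1] in redirects:
        if len(chain) > max_hops:
            raise RuntimeError("redirect loop or excessively long chain")
        chain.append(redirects[chain[-1]])
    return chain
```

Any chain this returns with three or more entries should be collapsed into a single direct 301.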
My favorite tool for diagnosing redirects is the redirect mapper tool from Patrick Sexton.
All you have to do is drop in your URL:
Hit “Go,” and you’ll instantly get feedback on different 301 redirects set up over the years:
Again, fewer is better. Google itself literally says to eliminate as many as possible.
Which could be a problem if you’ve updated content during redesigns.
Here’s why.
Updated content messes with keyword targeting and page optimization
Why would you ever set up two redirects for a single page?
That doesn’t make sense, right?
Of course not. At least, not intentionally.
Yet, it still happens all the time.
Here’s why.
Five years ago, you sold one product or service. Three years ago, it changed. And this year, it’s changing again.
In other words, the purpose behind the page evolves over time. So all of the content on the page changes, too.
It even happens with Skyscraper content. You take a lot of old posts that are underperforming and redirect them to a new one.
Instead of relying too heavily on redirects, you should simply ‘refresh’ those old posts. Adding new content and images can boost SEO traffic by 111%.
Multiple redirects in a row cause performance issues.
However, continually changing page content also messes with your keyword targeting and on-site optimization.
Here’s how.
Let’s go back to a hotel example.
Initially, maybe they only have two room types. But after a renovation, those are expanded.
The original website architecture might just list those first few rooms on the same page. But now, there’s too many.
So you change the “Rooms” page to a category page, which lists out ones underneath it.
The problem is that now your “Rooms” page has zero content of its own. It just serves as a drop-down:
If that “Rooms” page was ranking previously, it isn’t anymore.
Now, you have thin content issues, for starters. This is when there’s less than ~300 words on individual pages of your site:
Page length matters because Backlinko’s analysis showed that “the average first-page result on Google contains 1,890 words.”
50% of search queries also contain four words. That means someone is typing in a long-tail keyword to find something specific on that page.
It’s hard to give people the information they’re craving if you’ve literally removed all (or most) of the content.
Content changes during site redesigns also wreak havoc on page metadata.
One of three things usually happens in this case:
The page content has changed, so the old metadata is no longer relevant
New metadata is copied and pasted from other sites
Or the designers and developers completely neglected to add any metadata to updated pages
Once again, Screaming Frog can help you diagnose these issues.
Drop in your URL and search for the meta description option. I like to start here, because it usually indicates a bigger problem at play.
For example, check out the following example. I’ve blurred the site’s name to protect the innocent.
Two problems are happening here. First, the same keyword is being repeated on multiple pages. This could lead to duplicate content issues and reduce their ability to get one main page to rank for that term.
Second, there’s a ton of pages missing a meta description entirely.
Meta descriptions technically don’t help you rank. They do, however, help you increase your SERP click-through rates (CTR). And new data suggests that CTR can often affect rankings directly.
If a page doesn’t have a meta description, search engines will often pull content directly from the page.
But in most cases, it’s random text that gets truncated because it exceeds length requirements.
So it’s not ideal. And people won’t click.
Here’s another common problem to look for:
We’re looking at different restaurant pages on one website. However, they all share the same exact meta description.
Once again, this is a red flag.
The duplicate metadata cannibalizes the chances of your primary page ranking well for this term.
And these inconsistencies typically indicate a larger problem at play.
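All three metadata failure modes described above (missing, over-length, and duplicated descriptions) can be flagged in one pass once you’ve scraped each page’s meta description. A sketch; the 160-character limit is a rough SERP truncation threshold, not an official number:

```python
def audit_meta_descriptions(desc_by_url, max_len=160):
    """Report pages with missing, over-length, or duplicated meta descriptions."""
    missing = sorted(u for u, d in desc_by_url.items() if not d)
    too_long = sorted(u for u, d in desc_by_url.items() if d and len(d) > max_len)
    # Group pages by identical description text to find duplicates.
    by_desc = {}
    for url, desc in desc_by_url.items():
        if desc:
            by_desc.setdefault(desc, []).append(url)
    duplicated = sorted(d for d, urls in by_desc.items() if len(urls) > 1)
    return {"missing": missing, "too_long": too_long, "duplicated": duplicated}
```

Anything in the “duplicated” bucket is a candidate for the cannibalization problem described above.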
Most firms that specialize in design will not touch the page’s metadata.
SEO isn’t a high priority for them. They might not have the specialists on staff.
So this is what happens. You get websites shipped that look fantastic, but don’t perform.
Pages have the same copied metadata. Or worse, title tags and descriptions are missing completely.
And at the end of the day, the only thing that matters is how your website performs — not how it looks.
Redesigns screw up ‘user flows’ that are already working
My biggest problem with website redesigns is that they often screw up what’s already working.
If your business is up-and-running, chances are you already have purchases rolling in each day.
Redesigns that change site architecture or page content often screw this up.
You’re completely jeopardizing revenue.
And ultimately, your website’s ability to generate revenue is its most important aspect.
Changing all of that, without knowing if the new design is going to convert as well as the old design, is a huge gamble.
Results might increase. But you don’t know for sure. That’s exactly the problem.
Think about it this way.
A website’s macro-conversion, like a purchase, is made up of micro-conversions.
To get a sale, you first have to get people to your site. Then get them to visit certain pages. Then possibly opt into something before they hand over payment.
These ‘user flows’ are already playing out across your website.
Changing the sequence of these steps can have massive ramifications on the end goal.
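The arithmetic behind this is simple: your macro-conversion rate is the product of the micro-conversion rates at each step, so degrading any single step drags down the whole funnel. A sketch:

```python
def funnel_conversion(step_rates):
    """Macro-conversion rate as the product of each micro-conversion step rate.

    E.g. 50% reach the product page, 40% of those add to cart, 10% of those
    pay -> 2% of all visitors convert overall.
    """
    rate = 1.0
    for r in step_rates:
        rate *= r
    return rate
```

Cut any one step rate in half and the overall rate halves with it, which is why reshuffling a working flow is such a gamble.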
This is the point behind conversion optimization that most people miss.
They think ‘conversion optimization’ means to change a button color or headline.
But in reality, changing how people flow through your site can often have a bigger impact on purchases.
One study, for example, found that optimizing an ecommerce checkout flow could result in an additional $87,175/month. That ~3% conversion increase could add another 23.94% to their top line.
Micro-conversions also extend to the internal links on your pages. These are like the beginning to a new path through your site.
Changing these links doesn’t seem like that big of a deal. However, now you can see, that it could have a big impact on how people purchase your products or services.
How do you avoid this?
Again, updating your site design is a good thing. But do it incrementally so you can test the effects on each page.
For example, here’s how my Quick Sprout blog looked a few days ago:
Now, compare that to how it looked a few years ago. You can do this yourself using the Wayback Machine.
Pretty similar, right?
Sure, it looks more ‘clean’ and ‘polished’ now. The design is still relevant for today.
However, I did not want to change what was already working. That’s how I like to run website redesigns. I’ll tweak element-by-element or page-by-page.
Then, I’ll use something like Crazy Egg to run tests after each change.
If numbers go down, I’ll revert back to the old design. Even if it seems a little ‘outdated.’
But if numbers go up, I’ll start carrying those new design updates over to my other pages.
That way, you should never, ever lose SEO rankings as a result of a website redesign.
Or, more importantly, you won’t lose revenue, either.
Conclusion
Website design updates should happen regularly.
Design trends change pretty frequently. And you want to make sure your website properly reflects your brand.
What you don’t want to do, however, is sabotage everything else that’s working.
Large-scale site redesigns can often create tons of problems.
Site architecture changes can lead to performance issues. Content changes ruin your keyword targeting. And changing micro-conversions can drag down your macro-conversions.
The way your website looks is important. But only to a certain point.
The more important issue at play is whether revenue is increasing or decreasing.
Website redesigns can easily screw up your SEO. That causes rankings to fluctuate and traffic to decrease.
Declining traffic, then, brings down revenue with it.
Avoid this trickle-down issue by not changing what’s already working. If you’re going to update something, do it on small elements, first.
That way, you can test the impact in isolation. You can see if it’s going to increase or decrease results on a small scale.
Then, you can pull back to the old design if it’s not working without losing too much traffic or revenue.
And if it is working, you can start applying those proven updates to the rest of your site.
Websites aren’t just fashion statements. More often than not, functionality and performance should outweigh the appearance.
Have you ever experienced traffic drops right after a new website redesign?
About the Author: Neil Patel is the cofounder of Neil Patel Digital.
http://ift.tt/2E6ZKrO from MarketingRSS http://ift.tt/2FRLX8Z via Youtube
0 notes
Text
Why Redesigns Sabotage Your SEO Rankings (And How to Avoid It)
It seems like most companies redesign their website every year or so.
New trends gain steam, so they want to be more ‘contemporary’ or ‘flat.’
Or new color schemes are en vogue. So every site you visit looks Asana-bright.
Everyone now wants to update their site on the same regular basis.
I love experimenting with new color schemes and trends, too.
Except for one tiny thing.
I hate redesigns.
Design updates are good. They allow you to incrementally make improvements to make sure your site is up-to-date.
But full-scale redesigns?
Where you completely overhaul the site architecture and page content?
You should avoid those like the plague.
I know that sounds surprising. But I’m going to share a few examples of how and where website redesigns go bad.
Especially when it comes to destroying all of the hard-earned SEO rankings you’ve built up over time.
Here’s how to avoid sabotaging your own SEO rankings with your redesign.
Site architecture changes cause you to lose links
There are hundreds of rankings factors for SEO.
But backlinks still reign supreme.
External links have been considered ‘votes’ since the beginning of (internet) time. Their quantity, diversity, and authority pass the most influence to raise your position in the SERPs.
Internal links don’t count for as much value. However, they do have a direct influence over someone’s website experience.
I’ll explain.
In 2011, Google Panda was released. It was one of the first reported cases where Google confirmed the use of qualitative factors.
They used a survey with questions like:
Would you trust information from this website?
Is this website written by experts?
Would you give this site your credit card details?
Do the pages on this site have obvious errors?
Does the website provide original content or info?
Would you recognize this site as an authority?
Does this website contain insightful analysis?
Would you consider bookmarking pages on this site?
Are there excessive adverts on this website?
Could pages from this site appear in print?
And they had people individually rate different websites.
Fast forward a few years, and Google also started taking user behavior into account.
They don’t just want to rank websites based on links or content length. They also want to look at the overall experience of that website.
They want to make sure that people find what they’re looking for.
So the better experience visitors have, the more credit the site will get.
What’s one way to ruin an otherwise nice experience?
Broken links that derail someone’s path through your site.
When most companies redesign websites, they start messing with the site architecture.
They create new pages and ditch old ones. Or they take content from one page and add it to another.
Then, they switch up their menus and navigation schemes.
It seems harmless on the surface. The new experience might even be superior to the old one.
But what they don’t realize is that they’re often creating a TON of problems for SEO.
For starters, site architecture changes can ruin hub pages you’ve worked hard to build.
These are like clusters of related pages on your site. And they can help increase your perceived authority on those topics.
Page-level changes also create broken internal links throughout the site.
You know the drill. You try to click on a new page to find related information, only to be met by a 404 error.
One or two isn’t a big issue. Redesigns, however, often create a ton of them all at one time.
For example, let’s say you’re redesigning a hotel or ecommerce website.
Chances are, you’re using a detailed parent-child structure to organize pages.
That means you might have “Rooms” at the top, followed by the individual types of rooms underneath.
The problem is that these structures often change over time.
Maybe you come out with new products or services. Maybe you migrate old rooms into new ones.
One seemingly small change can often create a ripple effect throughout your site.
It might make perfect sense to move your featured rooms up a level or two.
However, any changes to your URL structures don’’t create one or two broken links.
It can literally create hundreds to thousands.
Take blogs for example.
Let’s say you’ve worked hard over the years to create hundreds or thousands of blog posts.
But when it comes time to move over to a new CMS during a site redesign, someone wants to remove the date strings from the URLs.
Heck, all it takes is literally a single click inside WordPress to update Permalink Settings.
So yes, it seems harmless.
I’ve actually seen this mistake time and time again.
Poor, unsuspecting business owners who have their entire websites practically break.
Tens of thousands of page URLs break overnight.
And you know what happens to their rankings?
They drop like a rock.
Fortunately, Google Search Console can help you spot broken links under the Crawl Report.
My favorite tool for technical SEO audits is Screaming Frog.
It will crawl every page on your site, uncovering tons of on-site SEO issues.
For example, you can start by looking for the “Client Error (4XX)” report under Response Codes.
Most of these will be 404 errors, when the status is reported as “Not Found.”
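The same check is easy to prototype yourself. This sketch parses a page’s anchors with Python’s standard library and flags internal links that no longer resolve; the URLs and the `live_paths` set are hypothetical, and in a real audit you’d build that set from your new sitemap or a crawl:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_internal_links(page_html, live_paths):
    """Return internal hrefs pointing at pages that no longer exist."""
    parser = LinkCollector()
    parser.feed(page_html)
    internal = [href for href in parser.links if href.startswith("/")]
    return [href for href in internal if href not in live_paths]
```

Run it across every page and you get the same "Not Found" list a crawler would report.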
So far, we’ve been focusing almost exclusively on broken internal links.
But that’s not the only way redesign changes can affect your site links.
Think about it this way.
Older, high-authority pages or posts tend to acquire the most backlinks.
The highest value links are also the hardest to get. These include editorial links, for example, that come from journalists or other influencers.
That also means you can’t control them.
So when your page or post URL changes, you will lose all of those external links, too.
This, again, happens all the time.
Permalink updates, moving the blog from a subfolder to subdomain, or even just new product pages replacing old ones can force you to lose all those backlinks.
The best solution? Don’t change old page URLs!
At least, not if you can help it.
Otherwise, another way to side-step this problem is through setting up 301 redirects.
These are ‘permanent’ redirects, telling search engines that the new page has now replaced the old one.
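Under the hood, a 301 is nothing more than a status line plus a `Location` header. Here’s a minimal WSGI-style sketch of serving permanent redirects from a map (the URLs are hypothetical):

```python
# Hypothetical old -> new URL pairs, e.g. built during a permalink change.
REDIRECTS = {
    "/2018/02/old-post/": "/old-post/",
}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in REDIRECTS:
        # A 301 tells search engines the move is permanent, so link
        # equity flows to the new URL.
        start_response("301 Moved Permanently", [("Location", REDIRECTS[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Hello</h1>"]
```

Plugins and server rewrite rules do essentially this same lookup on every request.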
The Quick Page/Post Redirect Plugin for WordPress is one of the most popular options.
It’s also incredibly easy to use. All you have to do is drop in the old “Request” URL and then direct it to the new “Destination” one. The only caveat is that redirects like these should be used sparingly.
What you don’t want to see is something like this:
Loading up on too many 301 redirects can cause other unintended consequences. And they’re usually a sign that there’s a bigger, underlying issue at play.
It means the site architecture has changed dramatically.
Here’s why too many redirects can also affect your SEO rankings.
Too many 301 redirects can cause slow page speeds
“301” redirects have long been considered the best for SEO.
They indicate a ‘permanent’ change, as opposed to the ‘temporary’ one a 302 signals.
Either way, SEOs still feared that redirects would somehow limit the amount of PageRank that flowed through to the site.
Even Google’s own Matt Cutts once indicated some loss.
But in 2016, Google webmaster analyst Gary Illyes confirmed that all 3XX links pass full value: “30x redirects don’t lose PageRank anymore.”
Another Googler, John Mueller, confirmed the same findings.
Why does this all matter?
Because redirects are often now used to update websites to HTTPS. So some SEOs think this is Google’s way to help make sure people adopt it.
Last year, Google Chrome users started seeing new security warnings.
Previously, up to 70% of users would ignore website security warnings. So Google rolled out new ‘Not Secure’ messages for sites that don’t set up SSL certificates.
Moving from HTTP to HTTPS isn’t as simple as you might think, though.
For example, you can just flip a switch inside Google Search Console to pick the ‘preferred domain’ of your site.
That way, you avoid potential canonicalization issues of your site recognized as two: a “www” and “non-www” option.
As discussed, any URL changes can cause you to lose links.
Architecture changes can break internal links. But you can also lose out on ‘link equity’ if sites link to HTTP and not the new HTTPS-version of your site.
Again, why are we harping on redirects?
Because too many can slow down your site’s performance.
And page speed has been officially confirmed as a ranking factor.
Kinsta ran a test on WordPress to see how redirects affect page speed.
First, they used Pingdom to run a page speed report with no redirect.
The page loaded in around 1.06 seconds. That’s a good score!
Next, they ran the test again. But this time, through a redirected URL.
And check out how it affected page load time:
Crazy, right?!
A redirect increased page load time by 58%.
That’s just a single page redirect, too.
Multiply this across dozens of redirects and you can see the problem.
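You can ballpark the cost with the figures above. This is an illustration using Kinsta’s numbers, not a benchmark; real overhead depends on DNS, TLS, and server latency:

```python
# Back-of-envelope estimate: a 1.06 s page that slows by ~58% per
# redirect hop, per the Kinsta test figures cited above.
BASE_LOAD_S = 1.06
PER_HOP_OVERHEAD_S = BASE_LOAD_S * 0.58  # ~0.61 s added by one redirect

def estimated_load_time(redirect_hops):
    """Rough total load time (seconds) after a chain of redirect hops."""
    return round(BASE_LOAD_S + redirect_hops * PER_HOP_OVERHEAD_S, 2)
```

Three chained hops, by this rough model, nearly triples the load time.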
Even worse is when multiple redirects occur one right after another.
This often happens if you’ve updated a page more than once. As in, multiple redesigns over the years.
One URL redirects to another, which redirects to another. And page speed slows to a crawl.
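The chain itself is easy to detect once you have a map of your redirects. A sketch over an in-memory map (in a real audit you’d populate it by issuing requests and reading each `Location` header; the URLs here are hypothetical):

```python
def follow_chain(redirect_map, start, max_hops=10):
    """Follow a chain of redirects in `redirect_map` ({url: target}) and
    return the list of hops, raising on loops or absurdly long chains."""
    hops = []
    url = start
    while url in redirect_map:
        url = redirect_map[url]
        if url in hops or len(hops) >= max_hops:
            raise ValueError("redirect loop or chain too long at " + url)
        hops.append(url)
    return hops  # every hop after the first is wasted latency

# Two redesigns later: the 2016 URL points at the 2018 URL, which points
# at the current one -- two hops where one would do.
chain = follow_chain(
    {"/post-2016/": "/post-2018/", "/post-2018/": "/post/"},
    "/post-2016/",
)
```

Any start URL whose chain is longer than one hop is a candidate for collapsing into a single direct 301.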
My favorite tool for diagnosing redirects is the redirect mapper tool from Patrick Sexton.
All you have to do is drop in your URL:
Hit “Go,” and you’ll instantly get feedback on different 301 redirects set up over the years:
Again, fewer is better. Google itself literally says to eliminate as many as possible.
Which could be a problem if you’ve updated content during redesigns.
Here’s why.
Updated content messes with keyword targeting and page optimization
Why would you ever set up two redirects for a single page?
That doesn’t make sense, right?
Of course not. At least, not intentionally.
Yet, it still happens all the time.
Here’s why.
Five years ago, you sold one product or service. Three years ago, it changed. And this year, it’s changing again.
In other words, the purpose behind the page evolves over time. So all of the content on the page changes, too.
It even happens with Skyscraper content. You take a lot of old posts that are underperforming, and redirect them to a new one.
Instead of relying too much on redirects, they should simply ‘refresh’ those old posts. Adding new content and images can boost SEO traffic by 111%.
Multiple redirects in a row cause performance issues.
However, continually changing page content also messes with your keyword targeting and on-site optimization.
Here’s how.
Let’s go back to a hotel example.
Initially, maybe they only have one or two room types. But after a renovation, those are expanded.
The original website architecture might just list those first few rooms on the same page. But now, there’s too many.
So you change the “Rooms” page to a category page, which lists out the individual room pages underneath it.
The problem is that now your “Rooms” page also has zero content. It just serves as a drop-down now:
If that “Rooms” page was ranking previously, it isn’t anymore.
Now, you have thin content issues, for starters. This is when there are fewer than ~300 words on individual pages of your site:
Page length matters because Backlinko’s analysis showed that “the average first-page result on Google contains 1,890 words.”
50% of search queries also contain four words. That means someone is typing in a long-tail keyword to find something specific on that page.
It’s hard to give people the information they’re craving if you’ve literally removed all (or most) of the content.
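A thin-content pass is simple to script. This rough sketch strips tags with a regex, which is crude but adequate for flagging pages worth a manual look; the 300-word threshold mirrors the figure above:

```python
import re

def word_count(page_html):
    """Rough visible word count: strip tags, then count whitespace-separated tokens."""
    text = re.sub(r"<[^>]+>", " ", page_html)
    return len(text.split())

def is_thin(page_html, threshold=300):
    """Flag pages below the thin-content threshold."""
    return word_count(page_html) < threshold
```

Run it over every template that changed during the redesign and you’ll quickly spot pages that were hollowed out into bare drop-down stubs.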
Content changes during site redesigns also wreak havoc on page metadata.
One of three things usually happens in this case:
The page content has changed, so the old metadata is no longer relevant
New metadata is copied and pasted from other sites
Or the designers and developers completely neglected to add any metadata to updated pages
Once again, Screaming Frog can help you diagnose these issues.
Drop in your URL and search for the meta description option. I like to start here, because it usually indicates a bigger problem at play.
For example, check out the following example. I’ve blurred the site’s name to protect the innocent.
Two problems are happening here. First, the same keyword is being repeated on multiple pages. This could lead to duplicate content issues and reduce their ability to get one main page to rank for that term.
Second, there’s a ton of pages missing a meta description entirely.
Meta descriptions technically don’t help you rank. They do, however, help you increase your SERP click-through rates (CTR). And new data suggests that CTR can often affect rankings directly.
If a page doesn’t have a meta description, search engines will often pull content directly from the page.
But in most cases, it’s random text that gets truncated because it exceeds length requirements.
So it’s not ideal. And people won’t click.
Here’s another common problem to look for:
We’re looking at different restaurant pages on one website. However, they all share the same exact meta description.
Once again, this is a red flag.
The duplicate metadata cannibalizes the chances of your primary page ranking well for this term.
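Both failure modes, missing descriptions and duplicated ones, can be flagged from a simple URL-to-description map like the one a crawler export gives you. The URLs and descriptions below are hypothetical:

```python
from collections import Counter

def audit_meta_descriptions(pages):
    """Flag pages with no meta description, and groups of pages sharing
    the exact same one. `pages` maps each URL to its description (None if absent)."""
    missing = [url for url, desc in pages.items() if not desc]
    counts = Counter(desc for desc in pages.values() if desc)
    duplicated = {
        desc: [url for url, d in pages.items() if d == desc]
        for desc, n in counts.items() if n > 1
    }
    return missing, duplicated

pages = {
    "/napoli/": "Best pizza in town",
    "/roma/": "Best pizza in town",
    "/milano/": None,
}
missing, duplicated = audit_meta_descriptions(pages)
```

Every entry in either bucket is a page that needs a unique, hand-written description before launch.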
And these inconsistencies typically indicate a larger problem at play.
Most firms that specialize in design will not touch the page’s metadata.
SEO isn’t a high priority for them. They might not have the specialists on staff.
So this is what happens. You get websites shipped that look fantastic, but don’t perform.
Pages have the same copied metadata. Or worse, title tags and descriptions are missing completely.
And at the end of the day, the only thing that matters is how your website performs — not how it looks.
Redesigns screw up ‘user flows’ that are already working
My biggest problem with website redesigns is that they often screw up what’s already working.
If your business is up-and-running, chances are you already have purchases rolling in each day.
Redesigns that change site architecture or page content often screw this up.
You’re completely jeopardizing revenue.
And ultimately, your website’s ability to generate revenue is its most important aspect.
Changing all of that, without knowing if the new design is going to convert as well as the old design, is a huge gamble.
Results might increase. But you don’t know for sure. That’s exactly the problem.
Think about it this way.
A website’s macro-conversion, like a purchase, is made up of micro-conversions.
To get a sale, you first have to get people to your site. Then get them to visit certain pages. Then possibly opt into something before they hand over payment.
These ‘user flows’ are already playing out across your website.
Changing the sequence of these steps can have massive ramifications on the end goal.
This is the point behind conversion optimization that most people miss.
They think ‘conversion optimization’ means to change a button color or headline.
But in reality, changing how people flow through your site can often have a bigger impact on purchases.
One study, for example, found that optimizing an ecommerce checkout flow could result in an additional $87,175/month. That ~3% conversion increase could add another 23.94% to their top line.
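The arithmetic behind this is worth seeing: a macro-conversion rate is the product of its micro-conversion rates, so improving one step lifts the whole funnel. The step rates below are hypothetical:

```python
from functools import reduce

def macro_conversion_rate(step_rates):
    """A macro-conversion is the product of its micro-conversion rates."""
    return reduce(lambda a, b: a * b, step_rates, 1.0)

# Hypothetical funnel: product page -> add to cart -> start checkout -> pay
before = macro_conversion_rate([0.40, 0.30, 0.50, 0.60])  # 3.6% overall
after = macro_conversion_rate([0.40, 0.30, 0.60, 0.60])   # one step improved: 4.32%
lift = (after - before) / before                           # 20% more sales
```

Which is exactly why a redesign that quietly degrades one micro-step can drag down total revenue even while every page individually looks fine.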
Micro-conversions also extend to the internal links on your pages. These are like the beginning to a new path through your site.
Changing these links doesn’t seem like that big of a deal. However, now you can see, that it could have a big impact on how people purchase your products or services.
How do you avoid this?
Again, updating your site design is a good thing. But do it incrementally so you can test the effects on each page.
For example, here’s how my Quick Sprout blog looked a few days ago:
Now, compare that to how it looked a few years ago. You can do this yourself using the Wayback Machine.
Pretty similar, right?
Sure, it looks more ‘clean’ and ‘polished’ now. The design is still relevant for today.
However, I did not want to change what was already working. That’s how I like to run website redesigns. I’ll tweak element-by-element or page-by-page.
Then, I’ll use something like Crazy Egg to run tests after each change.
If numbers go down, I’ll revert back to the old design. Even if it seems a little ‘outdated.’
But if numbers go up, I’ll start carrying those new design updates over to my other pages.
That way, you should never, ever lose SEO rankings as a result of a website redesign.
Or, more importantly, you won’t lose revenue, either.
Conclusion
Website design updates should happen regularly.
Design trends change pretty frequently. And you want to make sure your website properly reflects your brand.
What you don’t want to do, however, is sabotage everything else that’s working.
Large-scale site redesigns can often create tons of problems.
Site architecture changes can lead to performance issues. Content changes ruin your keyword targeting. And changing micro-conversions can drag down your macro-conversions.
The way your website looks is important. But only to a certain point.
The more important issue at play is whether revenue is increasing or decreasing.
Website redesigns can easily screw up your SEO. That causes rankings to fluctuate and traffic to decrease.
Declining traffic, then, brings down revenue with it.
Avoid this trickle-down issue by not changing what’s already working. If you’re going to update something, do it on small elements, first.
That way, you can test the impact in isolation. You can see if it’s going to increase or decrease results on a small scale.
Then, you can pull back to the old design if it’s not working without losing too much traffic or revenue.
And if it is working, you can start applying those proven updates to the rest of your site.
Websites aren’t just fashion statements. More often than not, functionality and performance should outweigh the appearance.
Have you ever experienced traffic drops right after a new website redesign?
About the Author: Neil Patel is the cofounder of Neil Patel Digital.
These are like clusters of related pages on your site. And they can help increase your perceived authority on those topics.
Page-level changes also create broken internal links throughout the site.
You know the drill. You try to click on a new page to find related information, only to be met by a 404 error.
One or two isn’t a big issue. Redesigns, however, often create a ton of them all at one time.
For example, let’s say you’re redesigning a hotel or ecommerce website.
Chances are, you’re using a detailed parent-child structure to organize pages.
That means you might have “Rooms” at the top, followed by the individual types of rooms underneath.
The problem is that these structures often change over time.
Maybe you come out with new products or services. Maybe you migrate old rooms into new ones.
One seemingly small change can often create a ripple effect throughout your site.
It might make perfect sense to move your featured rooms up a level or two.
However, a change to your URL structure doesn’t create just one or two broken links.
It can literally create hundreds or even thousands.
Take blogs for example.
Let’s say you’ve worked hard over the years to create hundreds or thousands of blog posts.
But when it comes time to move over to a new CMS during a site redesign, someone wants to remove the date strings from the URLs.
Heck, all it takes is literally a single click inside WordPress to update Permalink Settings.
So yes, it seems harmless.
I’ve actually seen this mistake time and time again.
Poor, unsuspecting business owners who have their entire websites practically break.
Tens of thousands of page URLs break overnight.
And you know what happens to their rankings?
They drop like a rock.
Fortunately, Google Search Console can help you spot broken links under the Crawl Report.
My favorite tool for technical SEO audits is Screaming Frog.
It will crawl every page on your site, uncovering tons of on-site SEO issues.
For example, you can start by looking for the “Client Error (4XX)” report under Response Codes.
Most of these will be 404 errors, when the status is reported as “Not Found.”
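If you’d rather script this check than read a crawler’s export, the same idea fits in a few lines of Python. This is only a sketch: the `statuses` dict below is a stand-in for real crawl output (the URLs are made up), and in practice you’d fetch each page yourself.

```python
# Sketch: classify crawl results the way a "Client Error (4XX)" report does.
# The `statuses` mapping is hypothetical stand-in data for a real crawl.

def client_errors(statuses):
    """Return {url: status} for responses in the 4xx range."""
    return {url: code for url, code in statuses.items() if 400 <= code < 500}

statuses = {
    "/rooms/deluxe": 200,
    "/rooms/standard": 404,   # broken internal link left by the redesign
    "/blog/old-post": 410,
    "/contact": 301,
}

broken = client_errors(statuses)
print(broken)  # -> {'/rooms/standard': 404, '/blog/old-post': 410}
```

Anything that lands in `broken` is a candidate for a fix or a redirect before the new design ships.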
So far, we’ve been focusing almost exclusively on broken internal links.
But that’s not the only way redesign changes can affect your site links.
Think about it this way.
Older, high-authority pages or posts tend to acquire the most backlinks.
The highest value links are also the hardest to get. These include editorial links, for example, that come from journalists or other influencers.
That also means you can’t control them.
So when your page or post URL changes, you will lose all of those external links, too.
This, again, happens all the time.
Permalink updates, moving the blog from a subfolder to a subdomain, or even just new product pages replacing old ones can cost you all of those backlinks.
The best solution? Don’t change old page URLs!
At least, not if you can help it.
Otherwise, another way to side-step this problem is through setting up 301 redirects.
These are ‘permanent’ redirects, telling search engines that the new page has now replaced the old one.
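Outside of WordPress, the same ‘permanent’ redirects can be declared at the server level. A minimal Apache `.htaccess` sketch might look like this (the paths are hypothetical — substitute your own retired and replacement URLs):

```apache
# Hypothetical examples: map each retired URL to its replacement.
Redirect 301 /rooms.html /rooms/
Redirect 301 /blog/2015/03/old-post/ /blog/old-post/
```

Each line tells both browsers and search engines that the old path has permanently moved to the new one.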
The Quick Page/Post Redirect Plugin for WordPress is one of the most popular options.
It’s also incredibly easy to use. All you have to do is drop in the old “Request” URL and then direct it to the new “Destination” one. The only caveat is that redirects like these should be used sparingly.
What you don’t want to see is something like this:
Loading up on too many 301 redirects can cause other unintended consequences. And they’re usually a sign that there’s a bigger, underlying issue at play.
It means the site architecture has changed dramatically.
Here’s why too many redirects can also affect your SEO rankings.
Too many 301 redirects can cause slow page speeds.
“301” redirects have long been considered the best for SEO.
They indicate a ‘permanent’ change, as opposed to a ‘temporary’ one like a 302 gives off.
Either way, SEOs still feared that redirects would somehow limit the amount of PageRank that flowed through to the site.
Even Google’s own Matt Cutts once indicated some loss.
But in 2016, Google webmaster analyst, Gary Illyes, confirmed that all 3XX links pass full value:
30x redirects don't lose PageRank anymore.
— Gary "鯨理" Illyes (@methode) July 26, 2016
Another Googler, John Mueller, confirmed the same findings.
Why does this all matter?
Because redirects are now commonly used to migrate websites to HTTPS, and some SEOs see this change as Google’s way of encouraging that adoption.
Last year, Google Chrome users started seeing new security warnings.
Previously, up to 70% of users would ignore website security warnings. So Google rolled out new ‘Not Secure’ messages for sites that don’t set-up SSL certificates.
Moving from HTTP to HTTPS isn’t as simple as you might think, though.
For example, you also need to pick the ‘preferred domain’ of your site inside Google Search Console.
That way, you avoid potential canonicalization issues of your site recognized as two: a “www” and “non-www” option.
As discussed, any URL changes can cause you to lose links.
Architecture changes can break internal links. But you can also lose out on ‘link equity’ if sites link to HTTP and not the new HTTPS-version of your site.
Again, why are we harping on redirects?
Because too many can slow down your site’s performance.
And page speed has been officially confirmed as a ranking factor.
Kinsta ran a test on WordPress to see how redirects affect page speed.
First, they used Pingdom to run a page speed report with no redirect.
The page loaded in around 1.06 seconds. That’s a good score!
Next, they ran the test again. But this time, through a redirected URL.
And check out how it affected page load time:
Crazy, right?!
A redirect increased page load time by 58%.
That’s just a single page redirect, too.
Multiply this across dozens of redirects and you can see the problem.
Even worse is when multiple redirects fire one right after another.
This often happens if you’ve updated a page more than once. As in, multiple redesigns over the years.
One URL redirects to another, which redirects to another. And page speed slows to a crawl.
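The chain problem is easy to reason about with a toy model. This sketch (hypothetical URLs, not a real crawler) follows a redirect map and counts the hops a visitor would make:

```python
def resolve(url, redirects, max_hops=10):
    """Follow a {old: new} redirect map; return (final_url, hop_count)."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError(f"redirect loop or chain too long at {url!r}")
        seen.add(url)
    return url, hops

# Two redesigns later: the 2015 URL points at the 2017 URL, which
# points at today's URL. Every visitor pays for both hops.
redirects = {
    "/2015/rooms.html": "/2017/rooms/",
    "/2017/rooms/": "/rooms/",
}
print(resolve("/2015/rooms.html", redirects))  # -> ('/rooms/', 2)
```

The fix for a chain like this is to flatten it: point the oldest URL directly at the newest one, so every entry resolves in a single hop.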
My favorite tool for diagnosing redirects is the redirect mapper tool from Patrick Sexton.
All you have to do is drop in your URL:
Hit “Go,” and you’ll instantly get feedback on different 301 redirects set up over the years:
Again, fewer is better. Google itself says to eliminate as many as possible.
Which could be a problem if you’ve updated content during redesigns.
Here’s why.
Updated content messes with keyword targeting and page optimization.
Why would you ever set up two redirects for a single page?
That doesn’t make sense, right?
Of course not. At least, not intentionally.
Yet, it still happens all the time.
Here’s why.
Five years ago, you sold one product or service. Three years ago, it changed. And this year, it’s changing again.
In other words, the purpose behind the page evolves over time. So all of the content on the page changes, too.
It even happens with Skyscraper content. You take a lot of old posts that are underperforming, and redirect them to a new one.
Instead of relying so heavily on redirects, you can often simply ‘refresh’ those old posts. Adding new content and images can boost SEO traffic by 111%.
Multiple redirects in a row cause performance issues.
However, continually changing page content also messes with your keyword targeting and on-site optimization.
Here’s how.
Let’s go back to a hotel example.
Initially, maybe they only have one or two room types. But after a renovation, those are expanded.
The original website architecture might just list those first few rooms on a single page. But now, there are too many.
So you change the “Rooms” page to a category page that lists the individual room pages beneath it.
The problem is that your “Rooms” page now has zero content. It just serves as a drop-down:
If that “Rooms” page was ranking previously, it isn’t anymore.
Now, you have thin content issues, for starters. This is when individual pages of your site have fewer than ~300 words:
Page length matters because Backlinko’s analysis showed that “the average first-page result on Google contains 1,890 words.”
50% of search queries also contain four or more words. That means someone is typing in a long-tail keyword to find something specific on that page.
It’s hard to give people the information they’re craving if you’ve literally removed all (or most) of the content.
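A crude word-count pass catches most thin-content pages. In this sketch the page text is inlined for illustration (the URLs and threshold are assumptions, not official cutoffs); in reality you’d extract the visible text from your crawl:

```python
THIN_THRESHOLD = 300  # rough rule of thumb, not an official Google cutoff

def thin_pages(pages, threshold=THIN_THRESHOLD):
    """Return URLs whose visible text falls under the word threshold."""
    return [url for url, text in pages.items() if len(text.split()) < threshold]

pages = {
    "/rooms/": "Rooms",              # category shell left behind by the redesign
    "/rooms/deluxe": "word " * 450,  # stands in for 450 words of real copy
}
print(thin_pages(pages))  # -> ['/rooms/']
```

Any URL this flags is a page that used to carry content and now probably can’t rank for the long-tail queries it once served.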
Content changes during site redesigns also wreak havoc on page metadata.
One of three things usually happens in this case:
The page content has changed, so the old metadata is no longer relevant
New metadata is copied and pasted from other sites
Or the designers and developers completely neglected to add any metadata to updated pages
Once again, Screaming Frog can help you diagnose these issues.
Drop in your URL and search for the meta description option. I like to start here, because it usually indicates a bigger problem at play.
For example, check out the following example. I’ve blurred the site’s name to protect the innocent.
Two problems are happening here. First, the same keyword is being repeated on multiple pages. This could lead to duplicate content issues and reduce their ability to get one main page to rank for that term.
Second, there’s a ton of pages missing a meta description entirely.
Meta descriptions technically don’t help you rank. They do, however, help you increase your SERP click-through rates (CTR). And new data suggests that CTR can often affect rankings directly.
If a page doesn’t have a meta description, search engines will often pull content directly from the page.
But in most cases, it’s random text that gets truncated because it exceeds length requirements.
So it’s not ideal. And people won’t click.
Here’s another common problem to look for:
We’re looking at different restaurant pages on one website. However, they all share the exact same meta description.
Once again, this is a red flag.
The duplicate metadata cannibalizes the chances of your primary page ranking well for this term.
And these inconsistencies typically indicate a larger problem at play.
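Both failure modes, missing descriptions and duplicated ones, can be flagged from a simple {url: description} dump. The URLs and copy below are invented for illustration:

```python
from collections import defaultdict

def audit_meta(descriptions):
    """Flag pages with no meta description, and groups sharing one verbatim."""
    missing = [url for url, d in descriptions.items() if not (d or "").strip()]
    groups = defaultdict(list)
    for url, d in descriptions.items():
        if d and d.strip():
            groups[d.strip()].append(url)
    duplicates = {d: urls for d, urls in groups.items() if len(urls) > 1}
    return missing, duplicates

descriptions = {
    "/menu": "Fresh pasta and wood-fired pizza in downtown Austin.",
    "/catering": "Fresh pasta and wood-fired pizza in downtown Austin.",
    "/about": "",
}
missing, duplicates = audit_meta(descriptions)
print(missing)     # -> ['/about']
print(duplicates)  # one shared description covering /menu and /catering
```

Each duplicate group is a set of pages cannibalizing each other in the SERPs; each missing entry is a snippet you’re letting Google improvise.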
Most firms that specialize in design will not touch the page’s metadata.
SEO isn’t a high priority for them. They might not have the specialists on staff.
So this is what happens. You get websites shipped that look fantastic, but don’t perform.
Pages have the same copied metadata. Or worse, title tags and descriptions are missing completely.
And at the end of the day, the only thing that matters is how your website performs — not how it looks.
Redesigns screw up ‘user flows’ that are already working.
My biggest problem with website redesigns is that they often screw up what’s already working.
If your business is up-and-running, chances are you already have purchases rolling in each day.
Redesigns that change site architecture or page content often screw this up.
You’re completely jeopardizing revenue.
And ultimately, your website’s ability to generate revenue is its most important aspect.
Changing all of that, without knowing if the new design is going to convert as well as the old design, is a huge gamble.
Results might increase. But you don’t know for sure. That’s exactly the problem.
Think about it this way.
A website’s macro-conversion, like a purchase, is made up of micro-conversions.
To get a sale, you first have to get people to your site. Then get them to visit certain pages. Then possibly opt into something before they hand over payment.
These ‘user flows’ are already playing out across your website.
Changing the sequence of these steps can have massive ramifications on the end goal.
This is the point behind conversion optimization that most people miss.
They think ‘conversion optimization’ means to change a button color or headline.
But in reality, changing how people flow through your site can often have a bigger impact on purchases.
One study, for example, found that optimizing an ecommerce checkout flow could result in an additional $87,175/month. That ~3% conversion increase could add another 23.94% to their top line.
Micro-conversions also extend to the internal links on your pages. These are like the beginning to a new path through your site.
Changing these links doesn’t seem like that big of a deal. However, now you can see, that it could have a big impact on how people purchase your products or services.
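The compounding effect is just multiplication. A toy funnel (the step rates here are hypothetical, loosely in the spirit of the checkout study above) shows why a lift at one micro-conversion moves the macro number:

```python
def funnel_rate(steps):
    """Overall conversion rate = product of each step's rate."""
    rate = 1.0
    for step in steps:
        rate *= step
    return rate

# visit -> product page -> checkout -> purchase (hypothetical rates)
baseline = funnel_rate([0.40, 0.25, 0.50])
improved = funnel_rate([0.40, 0.25, 0.62])  # only the checkout step improves
lift = improved / baseline - 1
print(f"{baseline:.3f} -> {improved:.3f} overall (+{lift:.0%}, all else equal)")
```

A single step moving from 50% to 62% lifts the whole funnel, and therefore revenue, by the same proportion. Break a step during a redesign and the same math runs in reverse.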
How do you avoid this?
Again, updating your site design is a good thing. But do it incrementally so you can test the effects on each page.
For example, here’s how my Quick Sprout blog looked a few days ago:
Now, compare that to how it looked a few years ago. You can do this yourself using the Wayback Machine.
Pretty similar, right?
Sure, it looks more ‘clean’ and ‘polished’ now. The design is still relevant for today.
However, I did not want to change what was already working. That’s how I like to run website redesigns. I’ll tweak element by element or page by page.
Then, I’ll use something like Crazy Egg to run tests after each change.
If numbers go down, I’ll revert back to the old design. Even if it seems a little ‘outdated.’
But if numbers go up, I’ll start carrying those new design updates over to my other pages.
That way, you should never, ever lose SEO rankings as a result of a website redesign.
Or, more importantly, you won’t lose revenue, either.
Conclusion
Website design updates should happen regularly.
Design trends change pretty frequently. And you want to make sure your website properly reflects your brand.
What you don’t want to do, however, is sabotage everything else that’s working.
Large-scale site redesigns can often create tons of problems.
Site architecture changes can lead to performance issues. Content changes ruin your keyword targeting. And changing micro-conversions can drag down your macro-conversions.
The way your website looks is important. But only to a certain point.
The more important issue at play is whether revenue is increasing or decreasing.
Website redesigns can easily screw up your SEO. That causes rankings to fluctuate and traffic to decrease.
Declining traffic, then, brings down revenue with it.
Avoid this trickle-down issue by not changing what’s already working. If you’re going to update something, do it on small elements, first.
That way, you can test the impact in isolation. You can see if it’s going to increase or decrease results on a small scale.
Then, you can pull back to the old design if it’s not working without losing too much traffic or revenue.
And if it is working, you can start applying those proven updates to the rest of your site.
Websites aren’t just fashion statements. More often than not, functionality and performance should outweigh the appearance.
Have you ever experienced traffic drops right after a new website redesign?
About the Author: Neil Patel is the cofounder of Neil Patel Digital.
Why Redesigns Sabotage Your SEO Rankings (And How to Avoid It)
0 notes
Text
Why Redesigns Sabotage Your SEO Rankings (And How to Avoid It)
It seems like most companies redesign their website every year or so.
New trends gain steam, so they want to be more ‘contemporary’ or ‘flat.’
Or new color schemes are en vogue. So every site you visit looks Asana-bright.
Everyone now wants to update their site on the same regular basis.
I love experimenting with new color schemes and trends, too.
Except for one tiny thing.
I hate redesigns.
Design updates are good. They allow you to incrementally make improvements to make sure your site is up-to-date.
But full-scale redesigns?
Where you completely overhaul the site architecture and page content?
You should avoid those like the plague.
I know that sounds surprising. But I’m going to share a few examples of how and where website redesigns go bad.
Especially when it comes to destroying all of the hard-earned SEO rankings you’ve built up over time.
Site architecture changes cause you to lose links.
There are hundreds of rankings factors for SEO.
But backlinks still reign supreme.
External links have been considered ‘votes’ since the beginning of (internet) time. Their quantity, diversity, and authority pass the most influence to raise your position in the SERPs.
Internal links don’t count for as much value. However, they do have a direct influence over someone’s website experience.
I’ll explain.
In 2011, Google Panda was released. It was one of the first reported cases where Google confirmed the use of qualitative factors.
They used a survey with questions like:
Would you trust information from this website?
Is this website written by experts?
Would you give this site your credit card details?
Do the pages on this site have obvious errors?
Does the website provide original content or info?
Would you recognize this site as an authority?
Does this website contain insightful analysis?
Would you consider bookmarking pages on this site?
Are there excessive adverts on this website?
Could pages from this site appear in print?
And they had people individually rate different websites.
Fast forward a few years, and Google also started taking user behavior into account.
They don’t just want to rank websites based on links or content length. They also want to look at the overall experience of that website.
They want to make sure that people find what they’re looking for.
So the better experience visitors have, the more credit the site will get.
What’s one way to ruin an otherwise nice experience?
Broken links that derail someone’s path through your site.
When most companies redesign websites, they start messing with the site architecture.
They create new pages and ditch old ones. Or they take content from one page and add it to another.
Then, they switch up their menus and navigation schemes.
It seems harmless on the surface. The new experience might even be superior to the old one.
But what they don’t realize is that they’re often creating a TON of problems for SEO.
For starters, site architecture changes can ruin hub pages you’ve worked hard to build.
These are like clusters of related pages on your site. And they can help increase your perceived authority on those topics.
Page-level changes also create broken internal links throughout the site.
You know the drill. You try to click on a new page to find related information, only to be met by a 404 error.
One or two isn’t a big issue. Redesigns, however, often create a ton of them all at one time.
For example, let’s say you’re redesigning a hotel or ecommerce website.
Chances are, you’re using a detailed parent-child structure to organize pages.
That means you might have “Rooms” at the top, followed by the individual types of rooms underneath.
The problem is that these structures often changes over time.
Maybe you come out with new products or services. Maybe you migrate old rooms into new ones.
One seemingly small change can often create a ripple effect throughout your site.
It might make perfect sense to move your featured rooms up a level or two.
However, any changes to your URL structures doesn’t create one or two broken links.
It can literally create hundreds to thousands.
Take blogs for example.
Let’s say you’ve worked hard over the years to create hundreds or thousands of blog posts.
But when it comes time to move over to a new CMS during a site redesign, someone wants to remove the date strings from the URLs.
Heck, all it takes is literally a single click inside WordPress to update Permalink Settings.
So yes, it seems harmless.
I’ve actually seen this mistake time and time again.
Poor, unsuspecting business owners who have their entire websites practically break.
Tens of thousands of page URLs break overnight.
And you know what happens to their rankings?
They drop like a rock.
Fortunately, Google Search Console can help you spot broken links under the Crawl Report.
My favorite tool for technical SEO audits is Screaming Frog.
It will crawl every page on your site, uncovering tons of on-site SEO issues.
For example, you can start by looking for the “Client Error (4XX)” report under Response Codes.
Most of these will be 404 errors, when the status is reported as “Not Found.”
So far, we’ve been focusing almost exclusively on broken internal links.
But that’s not the only way redesign changes can affect your site links.
Think about it this way.
Older, high-authority pages or posts tend to acquire the most backlinks.
The highest value links are also the hardest to get. These include editorial links, for example, that come from journalists or other influencers.
That also means you can’t control them.
So when your page or post URL changes, you will lose all of those external links, too.
This, again, happens all the time.
Permalink updates, moving the blog from a subfolder to subdomain, or even just new product pages replacing old ones can force you to lose all those backlinks.
The best solution? Don’t change old page URLs!
At least, not if you can help it.
Otherwise, another way to side-step this problem is through setting up 301 redirects.
These are ‘permanent’ redirects, telling search engines that the new page has now replaced the old one.
The Quick Page/Post Redirect Plugin for WordPress is one of the most popular options.
It’s also incredibly easy to use. All you have to do is drop in the old “Request” URL and then direct it to the new “Destination” one. The only caveat is that redirects like these should be used sparingly.
What you don’t want to see, is something like this:
Loading up on too many 301 redirects can cause other unintended consequences. And they’re usually a sign that there’s a bigger, underlying issue at play.
It means the site architecture has changed dramatically.
Here’s why too many redirects can also affect your SEO rankings.
Too many 301 redirects can cause slow page speeds.
“301” redirects have long been considered the best for SEO.
They indicate a ‘permanent’ change, as opposed to a ‘temporary’ one like a 302 gives off.
Either way, SEOs still feared that redirects would somehow limit the amount of PageRank that flowed through to the site.
Even Google’s own Matt Cutts once indicated some loss.
But in 2016, Google webmaster analyst, Gary Illyes, confirmed that all 3XX links pass full value:
30x redirects don't lose PageRank anymore.
— Gary "鯨理" Illyes (@methode) July 26, 2016
Another Googler, John Mueller, confirmed the same findings.
Why does this all matter?
Because redirects are often now used to update websites to HTTPS. So some SEOs think this is Google’s way to help make sure people adopt it.
Last year, Google Chrome users started seeing new security warnings.
Previously, up to 70% of users would ignore website security warnings. So Google rolled out new ‘Not Secure’ messages for sites that don’t set-up SSL certificates.
Moving from HTTP to HTTPS isn’t as simple as you might think, though.
For example, you can just flip a switch inside Google Search Console to pick the ‘preferred domain’ of your site.
That way, you avoid potential canonicalization issues of your site recognized as two: a “www” and “non-www” option.
As discussed, any URL changes can cause you to lose links.
Architecture changes can break internal links. But you can also lose out on ‘link equity’ if sites link to HTTP and not the new HTTPS-version of your site.
Again, why are we harping on redirects?
Because too many can slow down your site’s performance.
And page speed has been officially confirmed as a ranking factor.
Kinsta ran a test on WordPress to see how redirects affect page speed.
First, they used Pingdom to run a page speed report with no redirect.
The page loaded in around 1.06 seconds. That’s a good score!
Next, they ran the test again. But this time, through a redirected URL.
And check out how it affected page load time:
Crazy, right?!
A redirect increased page load time by 58%.
That’s just a single page redirect, too.
Multiply this across dozens of redirects and you can see the problem.
Even worse, are when multiple redirects occur right after another.
This often happens if you’ve updated a page more than once. As in, multiple redesigns over the years.
One URL redirects to another, which redirects to another. And page speed slows to a crawl.
My favorite tool for diagnosing redirects is the redirect mapper tool from Patrick Sexton.
All you have to do is drop in your URL:
Hit “Go,” and you’ll instantly get feedback on different 301 redirects set up over the years:
Again, fewer is better. Google, themselves, literally says to eliminate as many as possible.
Which could be a problem if you’ve updated content during redesigns.
Here’s why.
Updated content messes with keyword targeting and page optimization.
Why would you ever setup two redirects for a single page?
That doesn’t make sense, right?
Of course not. At least, not intentionally.
Yet, it still happens all the time.
Here’s why.
Five years ago, you sold one product or service. Three years ago, it changed. And this year, it’s changing again.
In other words, the purpose behind the page evolves over time. So all of the content on the page changes, too.
It even happens with Skyscraper content. You take a lot of old posts that are underperforming, and redirect them to a new one.
Instead of relying too much on redirects, they should simply ‘refresh’ those old posts. Adding new content and images can boost SEO traffic by 111%.
Multiple redirects in a row cause performance issues.
However, continually changing page content also messes with your keyword targeting and on-site optimization.
Here’s how.
Let’s go back to a hotel example.
Initially, maybe they only have one two room types. But after a renovation, those are expanded.
The original website architecture might just list those first few rooms on the same page. But now, there’s too many.
So you change the “Rooms” page to a category page, that lists out ones underneath it.
The problem is that now your “Rooms” page also has zero content. It just serves as a drop-down now:
If that “Rooms” page was ranking previously, it isn’t anymore.
Now, you have thin content issues, for starters. This is when there’s less than ~300 words on individual pages of your site:
Page length matters because Backlinko’s analysis showed that “the average first-page result on Google contains 1,890 words.”
50% of search queries also contain four words. That means someone is typing in a long-tail keyword to find something specific on that page.
It’s hard to give people the information they’re craving if you’ve literally removed all (or most) of the content.
Content changes during site redesigns also wreck havoc on page metadata.
One of three things usually happens in this case:
The page content has changed, so the old metadata is no longer relevant
New metadata is copied and pasted from other sites
Or the designers and developers completely neglected to add any metadata to updated pages
Once again, Screaming Frog can help you diagnose these issues.
Drop in your URL and search for the meta description option. I like to start here, because it usually indicates a bigger problem at play.
For example, check out the following example. I’ve blurred the site’s name to protect the innocent.
Two problems are happening here. First, the same keyword is being repeated on multiple pages. This could lead to duplicate content issues and reduce their ability to get one main page to rank for that term.
Second, there’s a ton of pages missing a meta description entirely.
Meta descriptions technically don’t help you rank. They do, however, help you increase your SERP click-through rates (CTR). And new data suggests that CTR can often affect rankings directly.
If a page doesn’t have a meta description, search engines will often pull content directly from the page.
But in most cases, it’s random text that gets truncated because it exceeds length requirements.
So it’s not ideal. And people won’t click.
Here’s another common problem to look for:
We’re looking at different restaurante pages on one website. However, they all share the same exact meta description.
Once again, this is a red flag.
The duplicate metadata cannibalizes the chances of your primary page ranking well for this term.
And these inconsistencies typically indicate a larger problem at play.
Most firms that specialize in design will not touch the page’s metadata.
SEO isn’t a high priority for them. They might not have the specialists on staff.
So this is what happens. You get websites shipped that look fantastic, but don’t perform.
Pages have the same copied metadata. Or worse, title tags and descriptions are missing completely.
And at the end of the day, the only thing that matters is how your website performs — not how it looks.
Redesigns screw up ‘user flow’s that are already working.
My biggest problem with website redesigns is that they often screw up what’s already working.
If your business is up-and-running, chances are you already have purchases rolling in each day.
Redesigns that change site architecture or page content often screw this up.
You’re completely jeopardizing revenue.
And ultimately, your website’s ability to generate revenue is its most important aspect.
Changing all of that, without knowing if the new design is going to convert as well as the old design, is a huge gamble.
Results might increase. But you don’t know for sure. That’s exactly the problem.
Think about it this way.
A website’s macro-conversion, like a purchase, is made up of micro-conversions.
To get a sale, you first have to get people to your site. Then get them to visit certain pages. Then possibly opt-into something before they had over payment.
These ‘user flows’ are already playing out across your website.
Changing the sequence of these steps can have massive ramifications on the end goal.
This is the point behind conversion optimization that most people miss.
They think ‘conversion optimization’ means to change a button color or headline.
But in reality, changing how people flow through your site can often have a bigger impact on purchases.
One study, for example, found that optimizing an ecommerce checkout flow could result in an additional $87,175/month. That ~3% conversion increase could add another 23.94% to their top line.
Micro-conversions also extend to the internal links on your pages. These are like the beginning to a new path through your site.
Changing these links doesn’t seem like a big deal. But as you can now see, it could have a big impact on how people purchase your products or services.
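The compounding effect of micro-conversions is easy to see with a quick worked example. The step names and visitor counts below are made up for illustration, not taken from the study above:

```python
def funnel_report(steps):
    """steps: ordered list of (name, visitor_count).
    Returns per-step conversion rates and the overall macro-conversion rate."""
    per_step = []
    for (prev_name, prev_n), (name, n) in zip(steps, steps[1:]):
        per_step.append((name, n / prev_n))
    overall = steps[-1][1] / steps[0][1]
    return per_step, overall

# Hypothetical user flow: each micro-conversion feeds the next
steps = [
    ("visit site", 10_000),
    ("view product page", 4_000),
    ("opt in / start checkout", 800),
    ("purchase", 200),
]
per_step, overall = funnel_report(steps)
for name, rate in per_step:
    print(f"{name}: {rate:.1%} of previous step")
print(f"macro-conversion: {overall:.2%}")
```

Because the macro-conversion is the product of every step’s rate, a redesign that quietly drags down any one step drags down purchases with it.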
How do you avoid this?
Again, updating your site design is a good thing. But do it incrementally so you can test the effects on each page.
For example, here’s how my Quick Sprout blog looked a few days ago:
Now, compare that to how it looked a few years ago. You can do this yourself using the Wayback Machine.
Pretty similar, right?
Sure, it looks more ‘clean’ and ‘polished’ now. The design is still relevant for today.
However, I did not want to change what was already working. That’s how I like to run website redesigns. I’ll tweak element by element or page by page.
Then, I’ll use something like Crazy Egg to run tests after each change.
If numbers go down, I’ll revert to the old design. Even if it seems a little ‘outdated.’
But if numbers go up, I’ll start carrying those new design updates over to my other pages.
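Before carrying a change over to other pages, it’s worth checking that the lift isn’t just noise. A standard way to do that is a two-proportion z-test on old vs. new conversion rates. This is a generic statistical sketch (not the author’s tooling), and the traffic numbers are made up:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates,
    using the pooled standard error. Positive z means B beats A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value(z):
    """Two-sided p-value from a z-score, via the normal CDF."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical test: old design 200/10,000 vs. redesigned page 260/10,000
z = two_proportion_z(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p_value(z):.3f}")
# If p is above your threshold (say 0.05), treat the lift as noise
# and hold off on rolling the redesign out site-wide.
```

Tools like Crazy Egg handle this math for you, but the underlying question is the same: is the new design’s difference big enough, given the sample size, to act on?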
That way, you should never, ever lose SEO rankings as a result of a website redesign.
Or, more importantly, you won’t lose revenue, either.
Conclusion
Website design updates should happen regularly.
Design trends change pretty frequently. And you want to make sure your website properly reflects your brand.
What you don’t want to do, however, is sabotage everything else that’s working.
Large-scale site redesigns can often create tons of problems.
Site architecture changes can lead to performance issues. Content changes ruin your keyword targeting. And changing micro-conversions can drag down your macro-conversions.
The way your website looks is important. But only to a certain point.
The more important issue at play is whether revenue is increasing or decreasing.
Website redesigns can easily screw up your SEO. That causes rankings to fluctuate and traffic to decrease.
Declining traffic, then, brings down revenue with it.
Avoid this trickle-down issue by not changing what’s already working. If you’re going to update something, do it on small elements, first.
That way, you can test the impact in isolation. You can see if it’s going to increase or decrease results on a small scale.
Then, you can pull back to the old design if it’s not working without losing too much traffic or revenue.
And if it is working, you can start applying those proven updates to the rest of your site.
Websites aren’t just fashion statements. More often than not, functionality and performance should outweigh the appearance.
Have you ever experienced traffic drops right after a new website redesign?
About the Author: Neil Patel is the cofounder of Neil Patel Digital.