# SEO Crawler – Digital Marketing Agency HTML Template
![Tumblr media](https://64.media.tumblr.com/c613b0d8f610b9d5bdf77f6de41402ab/c9cfd121e728a4f2-b3/s540x810/e0957a3cbaf6a114a44a87ee058aba99b50c4b23.jpg)
NOTE: Please read before buying. This is not a WordPress theme; it works only as an HTML template. The item works as intended. No refunds will be given for mistaken purchases. It has not been tested with DW, Vue.js, or React. The item is converted from the WordPress version, so it retains some WordPress classes and inline styles that could not be removed.
SEO Crawler is a beautiful SEO and digital agency template…
## The Definitive Guide to JavaScript SEO (2021 Edition)
Posted by PierceBrelinsky
The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance. As a technical SEO specialist and web dev enthusiast at an award-winning digital marketing agency, I’d like to share my perspective on modern JavaScript SEO based on industry best practices and my own agency experience. In this article, you'll learn how to optimize your JS-powered website for search in 2021.
### What is JavaScript SEO?
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.
Preventing, diagnosing, and troubleshooting ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.
Ensuring that web pages are discoverable by search engines through linking best practices.
Improving page load times for pages that parse and execute JS code, for a streamlined user experience (UX).
### Is JavaScript good or bad for SEO?
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
### How does JavaScript affect SEO?
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
Rendered content
Links
Lazy-loaded images
Page load times
Meta data
### What are JavaScript-powered websites?
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
*Figure: App Shell Model.*
This template is called an app shell and is the foundation for progressive web applications (PWAs). We’ll explore this next.
### How to check if a site is built with JavaScript
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Angular by Google
React by Facebook
Vue by Evan You
### JavaScript SEO for core content
Here’s an example. Modern web apps are built on JavaScript frameworks like Angular, React, and Vue, which allow developers to quickly build and scale interactive web applications. Let’s take a look at the default project template for Angular, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
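The original post shows a screenshot of the page source here. Since that image isn't reproduced, below is a representative sketch of what such an "app shell" index.html looks like; the file names are illustrative, not the exact Angular CLI output:

```html
<!-- A representative sketch of a single-page application's "app shell" -->
<!doctype html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>My App</title>
  </head>
  <body>
    <!-- The visible content is injected into this empty mount point at runtime -->
    <app-root></app-root>

    <script src="runtime.js"></script>
    <script src="polyfills.js"></script>
    <script src="main.js"></script>
  </body>
</html>
```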
Now we can see that this HTML document is almost completely devoid of any content. There are only the app-root and a few script tags in the body of the page. This is because the main content of this single page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
### JavaScript SEO for internal links
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
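Google's original illustration isn't reproduced here; a minimal sketch of the recommended pattern (the URL and anchor text are illustrative):

```html
<!-- Crawlable: a plain anchor element with an href and descriptive anchor text -->
<a href="/collections/summer-dresses">Browse our summer dress collection</a>
```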
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
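And a sketch of the "pseudo" link patterns Google advises against (the handler name is hypothetical):

```html
<!-- Not reliably crawlable: no <a href="...">, navigation happens only in JavaScript -->
<span onclick="window.location = '/collections/summer-dresses'">Browse our summer dress collection</span>
<div class="nav-item" data-url="/collections/summer-dresses">Browse our summer dress collection</div>
<a href="javascript:void(0)" onclick="openDresses()">Browse our summer dress collection</a>
```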
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
### JavaScript SEO for lazy-loading images
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
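The original snippet isn't reproduced in this copy, so here is a hedged sketch of the scroll-listener pattern being described (class names and image paths are illustrative):

```html
<!-- A sketch of scroll-based lazy-loading: the pattern discussed below -->
<img class="lazy" data-src="/images/product-1.jpg" alt="Product photo">

<script>
  // Swap data-src into src only once the image has scrolled near the viewport.
  function loadVisibleImages() {
    document.querySelectorAll('img.lazy[data-src]').forEach(function (img) {
      if (img.getBoundingClientRect().top < window.innerHeight + 200) {
        img.src = img.dataset.src;
        img.removeAttribute('data-src');
      }
    });
  }
  // Images load only when this "scroll" event fires.
  window.addEventListener('scroll', loadVisibleImages);
</script>
```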
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
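Again as a hedged reconstruction (the original snippet isn't included), an IntersectionObserver version using the same illustrative markup:

```html
<!-- A sketch of IntersectionObserver-based lazy-loading -->
<img class="lazy" data-src="/images/product-1.jpg" alt="Product photo">

<script>
  // The callback fires when an observed image enters the (possibly very tall) viewport,
  // so it works even though Googlebot never scrolls.
  var observer = new IntersectionObserver(function (entries, obs) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        var img = entry.target;
        img.src = img.dataset.src;       // load the real image
        img.removeAttribute('data-src');
        obs.unobserve(img);              // stop watching once it has loaded
      }
    });
  });

  document.querySelectorAll('img.lazy[data-src]').forEach(function (img) {
    observer.observe(img);
  });
</script>
```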
This code shows how the IntersectionObserver API triggers a callback when any observed element becomes visible. It’s more flexible and robust than the on-scroll event listener, and it is supported by modern Googlebot: it works because Googlebot resizes its viewport in order to “see” your content, rather than scrolling.
You can also use native lazy-loading in the browser. This is supported by Google Chrome, though browser support is not yet universal. Worst case, the attribute is simply ignored by Googlebot and all images load anyway:
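A minimal example of the native loading attribute; the image path is illustrative:

```html
<!-- Native lazy-loading: no JavaScript required -->
<img src="/images/product-1.jpg" loading="lazy" alt="Product photo">
```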
*Figure: Native lazy-loading in Google Chrome.*
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
### JavaScript SEO for page speed
JavaScript can also affect page load times, which are an official ranking factor in Google’s mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this? A few common tactics, illustrated in the sketch after this list:
Minifying JavaScript
Deferring non-critical JS until after the main content is rendered in the DOM
Inlining critical JS
Serving JS in smaller payloads
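As promised above, a rough sketch of what inlining critical JS and deferring non-critical JS can look like in a document head (the file names are illustrative):

```html
<head>
  <!-- Inline only the small amount of JS needed to render above-the-fold content -->
  <script>
    /* critical bootstrapping code goes here */
  </script>

  <!-- Defer everything else so it does not block HTML parsing -->
  <script src="/js/app.min.js" defer></script>
  <script src="/js/analytics.min.js" defer></script>
</head>
```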
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it’s important to make sure that any JavaScript served to clients is coded and delivered efficiently to help safeguard rankings.
### JavaScript SEO for meta data
It’s also important to note that SPAs that use a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with an npm package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
When a user visits a React website, a GET request is sent to the server for the ./index.html file.
The server then sends the index.html page to the client, containing the scripts to launch React and React Router.
The web application is then loaded on the client-side.
If a user clicks a link to a new page (/example), a request for the new URL would normally be sent to the server.
React Router intercepts that request before it reaches the server and handles the page change itself, locally updating the rendered React components and changing the URL client-side.
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. Rather, the React components (like headers, footers, and body content) hosted on the root ./index.html file are simply reorganized to display different content. This is why they’re called Single Page Applications!
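To make that mechanism concrete, here is a rough, framework-agnostic sketch of client-side routing. It is not how React Router is actually implemented; renderView and the data-spa-link attribute are hypothetical:

```html
<script>
  // Hypothetical placeholder: swap the components shown for the given path.
  function renderView(path) {
    /* re-render the page body for this "view" */
  }

  document.addEventListener('click', function (event) {
    var link = event.target.closest('a[data-spa-link]');
    if (!link) return;

    event.preventDefault();                               // stop the full page load
    history.pushState({}, '', link.getAttribute('href')); // change the URL client-side
    renderView(link.getAttribute('href'));                // render the new "view" locally
  });
</script>
```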
Potential SEO issues: It’s important to use a package like React Helmet to make sure that users are served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may crawl the same metadata for every page, or worse, none at all!
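As a rough illustration of the underlying DOM updates that packages like React Helmet or vue-meta manage for you (setPageMeta is a hypothetical helper, not a library API):

```html
<script>
  // Hypothetical helper: update the title and meta description whenever
  // the router shows a new "view".
  function setPageMeta(title, description) {
    document.title = title;
    var meta = document.querySelector('meta[name="description"]');
    if (meta) {
      meta.setAttribute('content', description);
    }
  }

  // Example: call this from the router when the /summer-dresses view renders.
  setPageMeta('Summer Dresses | Example Store', 'Browse our summer dress collection.');
</script>
```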
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
### How does Google handle JavaScript?
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when Googlebot crawls a web page:
Crawl
Render
Index
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google’s Web Rendering Service (WRS) as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
*Figure: Google’s crawl, render, and index process.*
In other words, Google crawls and indexes content in two waves:
The first wave of indexing, or the instant crawling of the static HTML sent by the webserver
The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript
*Figure: Google’s two waves of indexing. Source: Google I/O ’18.*
The bottom line is that content that depends on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, Google has significantly improved its web crawler in recent years.
Googlebot was upgraded to the latest stable release of the headless Chromium browser in May 2019. This means that Google’s web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google rate-limits how frequently it can crawl a given website because of limited computing resources. We already know that Google defers JavaScript execution until later to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in its Web Rendering Service queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Blocked in robots.txt
Timeouts
Errors
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
### Real-world application: JavaScript SEO for e-commerce
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
### How to test and debug JavaScript SEO issues
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
Visualize the page with Google’s Webmaster Tools. This helps you to view the page from Google’s perspective.
Use the site: search operator to check Google’s index. Make sure that all of your JavaScript content is being indexed properly by manually checking Google.
Debug using Chrome’s built-in dev tools. Compare and contrast what Google “sees” (source code) with what users see (rendered code) and ensure that they align in general.
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
#### Google Webmaster Tools
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test your pages using Google Webmaster tools, such as:
URL Inspection tool in Search Console
Mobile-Friendly Test
*Figure: Google Mobile-Friendly Test.*
The goal is simply to compare the content visible in your browser with what is displayed in these tools, and to look for any discrepancies.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
#### Site: Search Operator
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
Copy and paste any content that you’re not sure Google is indexing after the site: operator and your domain name, then press Enter. For example: site:yourdomain.com "a unique sentence from your JS-rendered content". If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, your JavaScript content might need some help gaining visibility.
*Figure: a site: query in the Google SERP.*
#### Chrome Dev Tools
Another method you can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.
Right-click anywhere on a web page to open the context menu, then click “View Page Source” to see the static HTML document in a new tab.
You can also click “Inspect Element” after right-clicking to view the content that is actually loaded in the DOM, including JavaScript.
*Figure: Inspect Element.*
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
### How to fix JavaScript rendering issues
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal JavaScript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few rendering approaches that are more search-friendly than pure client-side rendering because they avoid offloading all of the JS work onto users and crawlers:
Server-side rendering (SSR). This means that JS is executed on the server for each request. One way to implement SSR is with a Node.js library like Puppeteer. However, this can put a lot of strain on the server.
Hybrid rendering. This is a combination of both server-side and client-side rendering. Core content is rendered server-side before being sent to the client. Any additional resources are offloaded to the client.
Dynamic rendering. In this workaround, the server detects the user agent of the client making the request. It can then send pre-rendered content to search engine bots, while other user agents render the content client-side. Google Webmasters recommends a popular open-source solution called Rendertron for implementing dynamic rendering.
Incremental Static Regeneration, or updating static content after a site has already been deployed. This can be done with frameworks like Next.js for React or Nuxt.js for Vue. These frameworks have a build process that will pre-render every page of your JS application to static assets that you can serve from something like an S3 bucket. This way, your site can get all of the SEO benefits of server-side rendering, without the server management!
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note that for websites built on a content management system (CMS) that already pre-renders most content, like WordPress or Shopify, this typically isn’t an issue.
### Key takeaways
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— John (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that hones in on the nuts and bolts of technical SEO.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
The Definitive Guide to JavaScript SEO (2021 Edition)
Posted by PierceBrelinsky
The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance. As a technical SEO specialist and web dev enthusiast at an award-winning digital marketing agency, I’d like to share my perspective on modern JavaScript SEO based on industry best practices and my own agency experience. In this article, you'll learn how to optimize your JS-powered website for search in 2021.
What is JavaScript SEO?
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.
Preventing, diagnosing, and troubleshooting ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.
Ensuring that web pages are discoverable by search engines through linking best practices.
Improving page load times for pages parsing and executing JS code for a streamlined User Experience (UX).
Is JavaScript good or bad for SEO?
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
How does JavaScript affect SEO?
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
Rendered content
Links
Lazy-loaded images
Page load times
Meta data
What are JavaScript-powered websites?
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
App Shell Model.
This template is called an app shell and is the foundation for progressive web applications (PWAs). We’ll explore this next.
How to check if a site is built with JavaScript
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Angular by Google
React by Facebook
Vue by Evan You
JavaScript SEO for core content
Here’s an example: Modern web apps are being built on JavaScript frameworks, like Angular, React, and Vue. JavaScript frameworks allow developers to quickly build and scale interactive web applications. Let’s take a look at the default project template for Angular.js, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
Now we can see that this HTML document is almost completely devoid of any content. There are only the app-root and a few script tags in the body of the page. This is because the main content of this single page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
JavaScript SEO for internal links
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
JavaScript SEO for lazy-loading images
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
This code shows that the IntersectionObserver API triggers a callback when any observed element becomes visible. It’s more flexible and robust than the on-scroll event listener and is supported by modern Googlebot. This code works because of how Googlebot resizes its viewport in order to “see” your content (see below).
You can also use native lazy-loading in the browser. This is supported by Google Chrome, but note that it is still an experimental feature. Worst case scenario, it will just get ignored by Googlebot, and all images will load anyway:
Native lazy-loading in Google Chrome.
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
Javascript SEO for page speed
Javascript can also affect page load times, which is an official ranking factor in Google’s mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this?
Minifying JavaScript
Deferring non-critical JS until after the main content is rendered in the DOM
Inlining critical JS
Serving JS in smaller payloads
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it’s important to make sure that any served to clients is coded and delivered efficiently to help safeguard rankings.
JavaScript SEO for meta data
Also, it’s important to note that SPAs that utilize a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with a Node.js package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
When a user visits a React website, a GET request is sent to the server for the ./index.html file.
The server then sends the index.html page to the client, containing the scripts to launch React and React Router.
The web application is then loaded on the client-side.
If a user clicks on a link to go on a new page (/example), a request is sent to the server for the new URL.
React Router intercepts the request before it reaches the server and handles the change of page itself. This is done by locally updating the rendered React components and changing the URL client-side.
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. But rather, the React components (like headers, footers, and body content) hosted on root ./index.html file are simply being reorganized to display different content. This is why they’re called Single Page Applications!
Potential SEO issues: So, it’s important to use a package like React Helmet for making sure that users are being served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may be crawling the same metadata for every page, or worse, none at all!
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
How does Google handle JavaScript?
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when GoogleBot crawls a web page:
Crawl
Render
Index
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google Web Rendering Services (WRS), as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
Google crawl, render, and index process.
In other words, Google crawls and indexes content in two waves:
The first wave of indexing, or the instant crawling of the static HTML sent by the webserver
The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript
Google wave indexing. Source: Google I/O'18
The bottom line is that content dependent on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, they’ve significantly improved its web crawlers in recent years.
Googlebot was recently upgraded to the latest stable release of the Chromium headless browser in May 2019. This means that their web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google has a rate limit on how frequently they can crawl a given website because of limited computing resources. We already know that Google defers JavaScript to be executed later to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in line in its Web Rendering Services queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Blocked in robots.txt
Timeouts
Errors
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
Real-world application: JavaScript SEO for e-commerce
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
How to test and debug JavaScript SEO issues
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
Visualize the page with Google’s Webmaster Tools. This helps you to view the page from Google’s perspective.
Use the site search operator to check Google’s index. Make sure that all of your JavaScript content is being indexed properly by manually checking Google.
Debug using Chrome’s built-in dev tools. Compare and contrast what Google “sees” (source code) with what users see (rendered code) and ensure that they align in general.
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
Google Webmaster Tools
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test your pages using Google Webmaster tools, such as:
URL Inspection tool in Search Console
Mobile-Friendly Test
Google Mobile-Friendly Test.
The goal is simply to visually compare and contrast your content visible in your browser and look for any discrepancies in what is being displayed in the tools.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
Site: Search Operator
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
Copy and paste any content that you’re not sure that Google is indexing after the site: operator and your domain name, and then press the return key. If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, it means your JavaScript content might need some help gaining visibility.
Here’s what this looks like in the Google SERP:
Chrome Dev Tools
Another method you can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.
Right-click anywhere on a web page to display the options menu and then click “View Source” to see the static HTML document in a new tab.
You can also click “Inspect Element” after right-clicking to view the content that is actually loaded in the DOM, including JavaScript.
Inspect Element.
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
How to fix JavaScript rendering issues
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal Javascript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few different implementations of JavaScript that are more search-friendly than client-side rendering, to avoid offloading JS to both users and crawlers:
Server-side rendering (SSR). This means that JS is executed on the server for each request. One way to implement SSR is with a Node.js library like Puppeteer. However, this can put a lot of strain on the server.
Hybrid rendering. This is a combination of both server-side and client-side rendering. Core content is rendered server-side before being sent to the client. Any additional resources are offloaded to the client.
Dynamic rendering. In this workaround, the server detects the user agent of the client making the request. It can then send pre-rendered JavaScript content to search engines, for example. Any other user agents will need to render their content client-side. For example, Google Webmasters recommend a popular open-source solution called Renderton for implementing dynamic rendering.
Incremental Static Regeneration, or updating static content after a site has already been deployed. This can be done with frameworks like Next.js for React or Nuxt.js for Vue. These frameworks have a build process that will pre-render every page of your JS application to static assets that you can serve from something like an S3 bucket. This way, your site can get all of the SEO benefits of server-side rendering, without the server management!
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note, for websites built on a content management system (CMS) that already pre-renders most content, like WordPress or Shopify, this isn’t typically an issue.
Key takeaways
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— ???? John ???? (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that hones in on the nuts and bolts of technical SEO.
Sign Me Up!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
#túi_giấy_epacking_việt_nam #túi_giấy_epacking #in_túi_giấy_giá_rẻ #in_túi_giấy #epackingvietnam #tuigiayepacking
0 notes
Text
The Definitive Guide to JavaScript SEO (2021 Edition)
Posted by PierceBrelinsky
The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance. As a technical SEO specialist and web dev enthusiast at an award-winning digital marketing agency, I’d like to share my perspective on modern JavaScript SEO based on industry best practices and my own agency experience. In this article, you'll learn how to optimize your JS-powered website for search in 2021.
What is JavaScript SEO?
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.
Preventing, diagnosing, and troubleshooting ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.
Ensuring that web pages are discoverable by search engines through linking best practices.
Improving page load times for pages parsing and executing JS code for a streamlined User Experience (UX).
Is JavaScript good or bad for SEO?
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
How does JavaScript affect SEO?
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
Rendered content
Links
Lazy-loaded images
Page load times
Meta data
What are JavaScript-powered websites?
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
App Shell Model.
This template is called an app shell and is the foundation for progressive web applications (PWAs). We’ll explore this next.
How to check if a site is built with JavaScript
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Angular by Google
React by Facebook
Vue by Evan You
JavaScript SEO for core content
Here’s an example: Modern web apps are being built on JavaScript frameworks, like Angular, React, and Vue. JavaScript frameworks allow developers to quickly build and scale interactive web applications. Let’s take a look at the default project template for Angular.js, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
Now we can see that this HTML document is almost completely devoid of any content. There are only the app-root and a few script tags in the body of the page. This is because the main content of this single page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
JavaScript SEO for internal links
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
JavaScript SEO for lazy-loading images
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
This code shows that the IntersectionObserver API triggers a callback when any observed element becomes visible. It’s more flexible and robust than the on-scroll event listener and is supported by modern Googlebot. This code works because of how Googlebot resizes its viewport in order to “see” your content (see below).
You can also use native lazy-loading in the browser. This is supported by Google Chrome, but note that it is still an experimental feature. Worst case scenario, it will just get ignored by Googlebot, and all images will load anyway:
Native lazy-loading in Google Chrome.
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
Javascript SEO for page speed
Javascript can also affect page load times, which is an official ranking factor in Google’s mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this?
Minifying JavaScript
Deferring non-critical JS until after the main content is rendered in the DOM
Inlining critical JS
Serving JS in smaller payloads
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it’s important to make sure that any served to clients is coded and delivered efficiently to help safeguard rankings.
JavaScript SEO for meta data
Also, it’s important to note that SPAs that utilize a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with a Node.js package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
When a user visits a React website, a GET request is sent to the server for the ./index.html file.
The server then sends the index.html page to the client, containing the scripts to launch React and React Router.
The web application is then loaded on the client-side.
If a user clicks on a link to go on a new page (/example), a request is sent to the server for the new URL.
React Router intercepts the request before it reaches the server and handles the change of page itself. This is done by locally updating the rendered React components and changing the URL client-side.
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. But rather, the React components (like headers, footers, and body content) hosted on root ./index.html file are simply being reorganized to display different content. This is why they’re called Single Page Applications!
Potential SEO issues: So, it’s important to use a package like React Helmet for making sure that users are being served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may be crawling the same metadata for every page, or worse, none at all!
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
How does Google handle JavaScript?
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when GoogleBot crawls a web page:
Crawl
Render
Index
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google Web Rendering Services (WRS), as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
Google crawl, render, and index process.
In other words, Google crawls and indexes content in two waves:
The first wave of indexing, or the instant crawling of the static HTML sent by the webserver
The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript
Google wave indexing. Source: Google I/O'18
The bottom line is that content dependent on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, they’ve significantly improved its web crawlers in recent years.
Googlebot was recently upgraded to the latest stable release of the Chromium headless browser in May 2019. This means that their web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google has a rate limit on how frequently they can crawl a given website because of limited computing resources. We already know that Google defers JavaScript to be executed later to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in line in its Web Rendering Services queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Blocked in robots.txt
Timeouts
Errors
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
Real-world application: JavaScript SEO for e-commerce
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
How to test and debug JavaScript SEO issues
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
Visualize the page with Google’s Webmaster Tools. This helps you to view the page from Google’s perspective.
Use the site search operator to check Google’s index. Make sure that all of your JavaScript content is being indexed properly by manually checking Google.
Debug using Chrome’s built-in dev tools. Compare and contrast what Google “sees” (source code) with what users see (rendered code) and ensure that they align in general.
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
Google Webmaster Tools
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test your pages using Google Webmaster tools, such as:
URL Inspection tool in Search Console
Mobile-Friendly Test
Google Mobile-Friendly Test.
The goal is simply to visually compare and contrast your content visible in your browser and look for any discrepancies in what is being displayed in the tools.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
Site: Search Operator
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
Copy and paste any content that you’re not sure that Google is indexing after the site: operator and your domain name, and then press the return key. If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, it means your JavaScript content might need some help gaining visibility.
Here’s what this looks like in the Google SERP:
Chrome Dev Tools
Another method you can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.
Right-click anywhere on a web page to display the options menu and then click “View Source” to see the static HTML document in a new tab.
You can also click “Inspect Element” after right-clicking to view the content that is actually loaded in the DOM, including JavaScript.
Inspect Element.
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
How to fix JavaScript rendering issues
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal Javascript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few different implementations of JavaScript that are more search-friendly than client-side rendering, to avoid offloading JS to both users and crawlers:
Server-side rendering (SSR). This means that JS is executed on the server for each request. One way to implement SSR is with a Node.js library like Puppeteer. However, this can put a lot of strain on the server.
Hybrid rendering. This is a combination of both server-side and client-side rendering. Core content is rendered server-side before being sent to the client. Any additional resources are offloaded to the client.
Dynamic rendering. In this workaround, the server detects the user agent of the client making the request. It can then send pre-rendered JavaScript content to search engines, for example. Any other user agents will need to render their content client-side. For example, Google Webmasters recommend a popular open-source solution called Renderton for implementing dynamic rendering.
Incremental Static Regeneration, or updating static content after a site has already been deployed. This can be done with frameworks like Next.js for React or Nuxt.js for Vue. These frameworks have a build process that will pre-render every page of your JS application to static assets that you can serve from something like an S3 bucket. This way, your site can get all of the SEO benefits of server-side rendering, without the server management!
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note, for websites built on a content management system (CMS) that already pre-renders most content, like WordPress or Shopify, this isn’t typically an issue.
Key takeaways
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— ???? John ???? (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that hones in on the nuts and bolts of technical SEO.
Sign Me Up!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
Text
The Definitive Guide to JavaScript SEO (2021 Edition)
Posted by PierceBrelinsky
The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance. As a technical SEO specialist and web dev enthusiast at an award-winning digital marketing agency, I’d like to share my perspective on modern JavaScript SEO based on industry best practices and my own agency experience. In this article, you'll learn how to optimize your JS-powered website for search in 2021.
What is JavaScript SEO?
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.
Preventing, diagnosing, and troubleshooting ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.
Ensuring that web pages are discoverable by search engines through linking best practices.
Improving page load times for pages that parse and execute JS code, for a streamlined user experience (UX).
Is JavaScript good or bad for SEO?
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
How does JavaScript affect SEO?
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
Rendered content
Links
Lazy-loaded images
Page load times
Meta data
What are JavaScript-powered websites?
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
App Shell Model.
This template is called an app shell and is the foundation for progressive web applications (PWAs). We’ll explore this next.
How to check if a site is built with JavaScript
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Angular by Google
React by Facebook
Vue by Evan You
JavaScript SEO for core content
Here’s an example: Modern web apps are being built on JavaScript frameworks, like Angular, React, and Vue. JavaScript frameworks allow developers to quickly build and scale interactive web applications. Let’s take a look at the default project template for Angular.js, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
Now we can see that this HTML document is almost completely devoid of any content. There are only the app-root and a few script tags in the body of the page. This is because the main content of this single page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
JavaScript SEO for internal links
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
JavaScript SEO for lazy-loading images
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
This code shows that the IntersectionObserver API triggers a callback when any observed element becomes visible. It’s more flexible and robust than the on-scroll event listener and is supported by modern Googlebot. This code works because of how Googlebot resizes its viewport in order to “see” your content (see below).
You can also use native lazy-loading in the browser. This is supported by Google Chrome, but note that it is still an experimental feature. Worst case scenario, it will just get ignored by Googlebot, and all images will load anyway:
Native lazy-loading in Google Chrome.
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
Javascript SEO for page speed
Javascript can also affect page load times, which is an official ranking factor in Google’s mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this?
Minifying JavaScript
Deferring non-critical JS until after the main content is rendered in the DOM
Inlining critical JS
Serving JS in smaller payloads
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it’s important to make sure that any served to clients is coded and delivered efficiently to help safeguard rankings.
JavaScript SEO for meta data
Also, it’s important to note that SPAs that utilize a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with a Node.js package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
When a user visits a React website, a GET request is sent to the server for the ./index.html file.
The server then sends the index.html page to the client, containing the scripts to launch React and React Router.
The web application is then loaded on the client-side.
If a user clicks on a link to go on a new page (/example), a request is sent to the server for the new URL.
React Router intercepts the request before it reaches the server and handles the change of page itself. This is done by locally updating the rendered React components and changing the URL client-side.
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. But rather, the React components (like headers, footers, and body content) hosted on root ./index.html file are simply being reorganized to display different content. This is why they’re called Single Page Applications!
Potential SEO issues: So, it’s important to use a package like React Helmet for making sure that users are being served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may be crawling the same metadata for every page, or worse, none at all!
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
How does Google handle JavaScript?
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when GoogleBot crawls a web page:
Crawl
Render
Index
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google Web Rendering Services (WRS), as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
Google crawl, render, and index process.
In other words, Google crawls and indexes content in two waves:
The first wave of indexing, or the instant crawling of the static HTML sent by the webserver
The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript
Google wave indexing. Source: Google I/O'18
The bottom line is that content dependent on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, they’ve significantly improved its web crawlers in recent years.
Googlebot was recently upgraded to the latest stable release of the Chromium headless browser in May 2019. This means that their web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google has a rate limit on how frequently they can crawl a given website because of limited computing resources. We already know that Google defers JavaScript to be executed later to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in line in its Web Rendering Services queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Blocked in robots.txt
Timeouts
Errors
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
Real-world application: JavaScript SEO for e-commerce
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
How to test and debug JavaScript SEO issues
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
Visualize the page with Google’s Webmaster Tools. This helps you to view the page from Google’s perspective.
Use the site search operator to check Google’s index. Make sure that all of your JavaScript content is being indexed properly by manually checking Google.
Debug using Chrome’s built-in dev tools. Compare and contrast what Google “sees” (source code) with what users see (rendered code) and ensure that they align in general.
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
Google Webmaster Tools
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test your pages using Google Webmaster tools, such as:
URL Inspection tool in Search Console
Mobile-Friendly Test
Google Mobile-Friendly Test.
The goal is simply to visually compare and contrast your content visible in your browser and look for any discrepancies in what is being displayed in the tools.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
Site: Search Operator
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
Copy and paste any content that you’re not sure that Google is indexing after the site: operator and your domain name, and then press the return key. If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, it means your JavaScript content might need some help gaining visibility.
Here’s what this looks like in the Google SERP:
Chrome Dev Tools
Another method you can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.
Right-click anywhere on a web page to display the options menu and then click “View Source” to see the static HTML document in a new tab.
You can also click “Inspect Element” after right-clicking to view the content that is actually loaded in the DOM, including JavaScript.
Inspect Element.
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
How to fix JavaScript rendering issues
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal Javascript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few different implementations of JavaScript that are more search-friendly than client-side rendering, to avoid offloading JS to both users and crawlers:
Server-side rendering (SSR). This means that JS is executed on the server for each request. One way to implement SSR is with a Node.js library like Puppeteer. However, this can put a lot of strain on the server.
Hybrid rendering. This is a combination of both server-side and client-side rendering. Core content is rendered server-side before being sent to the client. Any additional resources are offloaded to the client.
Dynamic rendering. In this workaround, the server detects the user agent of the client making the request. It can then send pre-rendered JavaScript content to search engines, for example. Any other user agents will need to render their content client-side. For example, Google Webmasters recommend a popular open-source solution called Renderton for implementing dynamic rendering.
Incremental Static Regeneration, or updating static content after a site has already been deployed. This can be done with frameworks like Next.js for React or Nuxt.js for Vue. These frameworks have a build process that will pre-render every page of your JS application to static assets that you can serve from something like an S3 bucket. This way, your site can get all of the SEO benefits of server-side rendering, without the server management!
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note, for websites built on a content management system (CMS) that already pre-renders most content, like WordPress or Shopify, this isn’t typically an issue.
Key takeaways
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— ???? John ???? (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that hones in on the nuts and bolts of technical SEO.
Sign Me Up!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
The Definitive Guide to JavaScript SEO (2021 Edition)
Posted by PierceBrelinsky
The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance. As a technical SEO specialist and web dev enthusiast at an award-winning digital marketing agency, I’d like to share my perspective on modern JavaScript SEO based on industry best practices and my own agency experience. In this article, you'll learn how to optimize your JS-powered website for search in 2021.
What is JavaScript SEO?
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.
Preventing, diagnosing, and troubleshooting ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.
Ensuring that web pages are discoverable by search engines through linking best practices.
Improving page load times for pages parsing and executing JS code for a streamlined User Experience (UX).
Is JavaScript good or bad for SEO?
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
How does JavaScript affect SEO?
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
Rendered content
Links
Lazy-loaded images
Page load times
Meta data
What are JavaScript-powered websites?
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
App Shell Model.
This template is called an app shell and is the foundation for progressive web applications (PWAs). We’ll explore this next.
How to check if a site is built with JavaScript
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Angular by Google
React by Facebook
Vue by Evan You
JavaScript SEO for core content
Here’s an example: Modern web apps are being built on JavaScript frameworks, like Angular, React, and Vue. JavaScript frameworks allow developers to quickly build and scale interactive web applications. Let’s take a look at the default project template for Angular.js, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
Now we can see that this HTML document is almost completely devoid of any content. There are only the app-root and a few script tags in the body of the page. This is because the main content of this single page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
JavaScript SEO for internal links
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
JavaScript SEO for lazy-loading images
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
This code shows that the IntersectionObserver API triggers a callback when any observed element becomes visible. It’s more flexible and robust than the on-scroll event listener and is supported by modern Googlebot. This code works because of how Googlebot resizes its viewport in order to “see” your content (see below).
You can also use native lazy-loading in the browser. This is supported by Google Chrome, but note that it is still an experimental feature. Worst case scenario, it will just get ignored by Googlebot, and all images will load anyway:
Native lazy-loading in Google Chrome.
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
Javascript SEO for page speed
Javascript can also affect page load times, which is an official ranking factor in Google’s mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this?
Minifying JavaScript
Deferring non-critical JS until after the main content is rendered in the DOM
Inlining critical JS
Serving JS in smaller payloads
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it’s important to make sure that any served to clients is coded and delivered efficiently to help safeguard rankings.
JavaScript SEO for meta data
Also, it’s important to note that SPAs that utilize a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with a Node.js package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
When a user visits a React website, a GET request is sent to the server for the ./index.html file.
The server then sends the index.html page to the client, containing the scripts to launch React and React Router.
The web application is then loaded on the client-side.
If a user clicks on a link to go on a new page (/example), a request is sent to the server for the new URL.
React Router intercepts the request before it reaches the server and handles the change of page itself. This is done by locally updating the rendered React components and changing the URL client-side.
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. But rather, the React components (like headers, footers, and body content) hosted on root ./index.html file are simply being reorganized to display different content. This is why they’re called Single Page Applications!
Potential SEO issues: So, it’s important to use a package like React Helmet for making sure that users are being served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may be crawling the same metadata for every page, or worse, none at all!
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
How does Google handle JavaScript?
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when GoogleBot crawls a web page:
Crawl
Render
Index
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google Web Rendering Services (WRS), as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
Google crawl, render, and index process.
In other words, Google crawls and indexes content in two waves:
The first wave of indexing, or the instant crawling of the static HTML sent by the webserver
The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript
Google wave indexing. Source: Google I/O'18
The bottom line is that content dependent on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, they’ve significantly improved its web crawlers in recent years.
Googlebot was recently upgraded to the latest stable release of the Chromium headless browser in May 2019. This means that their web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google has a rate limit on how frequently they can crawl a given website because of limited computing resources. We already know that Google defers JavaScript to be executed later to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in line in its Web Rendering Services queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Blocked in robots.txt
Timeouts
Errors
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
Real-world application: JavaScript SEO for e-commerce
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
How to test and debug JavaScript SEO issues
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
Visualize the page with Google’s Webmaster Tools. This helps you to view the page from Google’s perspective.
Use the site search operator to check Google’s index. Make sure that all of your JavaScript content is being indexed properly by manually checking Google.
Debug using Chrome’s built-in dev tools. Compare and contrast what Google “sees” (source code) with what users see (rendered code) and ensure that they align in general.
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
Google Webmaster Tools
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test your pages using Google Webmaster tools, such as:
URL Inspection tool in Search Console
Mobile-Friendly Test
Google Mobile-Friendly Test.
The goal is simply to visually compare and contrast your content visible in your browser and look for any discrepancies in what is being displayed in the tools.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
Site: Search Operator
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
Copy and paste any content that you’re not sure that Google is indexing after the site: operator and your domain name, and then press the return key. If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, it means your JavaScript content might need some help gaining visibility.
Here’s what this looks like in the Google SERP:
Chrome Dev Tools
Another method you can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.
Right-click anywhere on a web page to display the options menu and then click “View Source” to see the static HTML document in a new tab.
You can also click “Inspect Element” after right-clicking to view the content that is actually loaded in the DOM, including JavaScript.
Inspect Element.
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
How to fix JavaScript rendering issues
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal Javascript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few different implementations of JavaScript that are more search-friendly than client-side rendering, to avoid offloading JS to both users and crawlers:
Server-side rendering (SSR). This means that JS is executed on the server for each request. One way to implement SSR is with a Node.js library like Puppeteer. However, this can put a lot of strain on the server.
Hybrid rendering. This is a combination of both server-side and client-side rendering. Core content is rendered server-side before being sent to the client. Any additional resources are offloaded to the client.
Dynamic rendering. In this workaround, the server detects the user agent of the client making the request. It can then send pre-rendered JavaScript content to search engines, for example. Any other user agents will need to render their content client-side. For example, Google Webmasters recommend a popular open-source solution called Renderton for implementing dynamic rendering.
Incremental Static Regeneration, or updating static content after a site has already been deployed. This can be done with frameworks like Next.js for React or Nuxt.js for Vue. These frameworks have a build process that will pre-render every page of your JS application to static assets that you can serve from something like an S3 bucket. This way, your site can get all of the SEO benefits of server-side rendering, without the server management!
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note, for websites built on a content management system (CMS) that already pre-renders most content, like WordPress or Shopify, this isn’t typically an issue.
Key takeaways
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— ???? John ???? (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that hones in on the nuts and bolts of technical SEO.
Sign Me Up!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
The Definitive Guide to JavaScript SEO (2021 Edition)
Posted by PierceBrelinsky
The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance. As a technical SEO specialist and web dev enthusiast at an award-winning digital marketing agency, I’d like to share my perspective on modern JavaScript SEO based on industry best practices and my own agency experience. In this article, you'll learn how to optimize your JS-powered website for search in 2021.
What is JavaScript SEO?
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.
Preventing, diagnosing, and troubleshooting ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.
Ensuring that web pages are discoverable by search engines through linking best practices.
Improving page load times for pages parsing and executing JS code for a streamlined User Experience (UX).
Is JavaScript good or bad for SEO?
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
How does JavaScript affect SEO?
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
Rendered content
Links
Lazy-loaded images
Page load times
Meta data
What are JavaScript-powered websites?
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
App Shell Model.
This template is called an app shell and is the foundation for progressive web applications (PWAs). We’ll explore this next.
How to check if a site is built with JavaScript
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Angular by Google
React by Facebook
Vue by Evan You
JavaScript SEO for core content
Here’s an example: Modern web apps are being built on JavaScript frameworks, like Angular, React, and Vue. JavaScript frameworks allow developers to quickly build and scale interactive web applications. Let’s take a look at the default project template for Angular.js, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
Now we can see that this HTML document is almost completely devoid of any content. There are only the app-root and a few script tags in the body of the page. This is because the main content of this single page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
JavaScript SEO for internal links
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
JavaScript SEO for lazy-loading images
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
This code shows that the IntersectionObserver API triggers a callback when any observed element becomes visible. It’s more flexible and robust than the on-scroll event listener and is supported by modern Googlebot. This code works because of how Googlebot resizes its viewport in order to “see” your content (see below).
You can also use native lazy-loading in the browser. This is supported by Google Chrome, but note that it is still an experimental feature. Worst case scenario, it will just get ignored by Googlebot, and all images will load anyway:
Native lazy-loading in Google Chrome.
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
Javascript SEO for page speed
Javascript can also affect page load times, which is an official ranking factor in Google’s mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this?
Minifying JavaScript
Deferring non-critical JS until after the main content is rendered in the DOM
Inlining critical JS
Serving JS in smaller payloads
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it’s important to make sure that any served to clients is coded and delivered efficiently to help safeguard rankings.
JavaScript SEO for meta data
Also, it’s important to note that SPAs that utilize a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with a Node.js package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
When a user visits a React website, a GET request is sent to the server for the ./index.html file.
The server then sends the index.html page to the client, containing the scripts to launch React and React Router.
The web application is then loaded on the client-side.
If a user clicks on a link to go on a new page (/example), a request is sent to the server for the new URL.
React Router intercepts the request before it reaches the server and handles the change of page itself. This is done by locally updating the rendered React components and changing the URL client-side.
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. But rather, the React components (like headers, footers, and body content) hosted on root ./index.html file are simply being reorganized to display different content. This is why they’re called Single Page Applications!
Potential SEO issues: So, it’s important to use a package like React Helmet for making sure that users are being served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may be crawling the same metadata for every page, or worse, none at all!
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
How does Google handle JavaScript?
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when GoogleBot crawls a web page:
Crawl
Render
Index
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google Web Rendering Services (WRS), as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
Google crawl, render, and index process.
In other words, Google crawls and indexes content in two waves:
The first wave of indexing, or the instant crawling of the static HTML sent by the webserver
The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript
Google wave indexing. Source: Google I/O'18
The bottom line is that content dependent on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, they’ve significantly improved its web crawlers in recent years.
Googlebot was recently upgraded to the latest stable release of the Chromium headless browser in May 2019. This means that their web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google has a rate limit on how frequently they can crawl a given website because of limited computing resources. We already know that Google defers JavaScript to be executed later to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in line in its Web Rendering Services queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Blocked in robots.txt
Timeouts
Errors
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
Real-world application: JavaScript SEO for e-commerce
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
How to test and debug JavaScript SEO issues
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
Visualize the page with Google’s Webmaster Tools. This helps you to view the page from Google’s perspective.
Use the site search operator to check Google’s index. Make sure that all of your JavaScript content is being indexed properly by manually checking Google.
Debug using Chrome’s built-in dev tools. Compare and contrast what Google “sees” (source code) with what users see (rendered code) and ensure that they align in general.
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
Google Webmaster Tools
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test your pages using Google Webmaster tools, such as:
URL Inspection tool in Search Console
Mobile-Friendly Test
Google Mobile-Friendly Test.
The goal is simply to visually compare and contrast your content visible in your browser and look for any discrepancies in what is being displayed in the tools.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
Site: Search Operator
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
Copy and paste any content that you’re not sure that Google is indexing after the site: operator and your domain name, and then press the return key. If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, it means your JavaScript content might need some help gaining visibility.
Here’s what this looks like in the Google SERP:
Chrome Dev Tools
Another method you can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.
Right-click anywhere on a web page to display the options menu and then click “View Source” to see the static HTML document in a new tab.
You can also click “Inspect Element” after right-clicking to view the content that is actually loaded in the DOM, including JavaScript.
Inspect Element.
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
How to fix JavaScript rendering issues
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal Javascript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few different implementations of JavaScript that are more search-friendly than client-side rendering, to avoid offloading JS to both users and crawlers:
Server-side rendering (SSR). This means that JS is executed on the server for each request. One way to implement SSR is with a Node.js library like Puppeteer. However, this can put a lot of strain on the server.
Hybrid rendering. This is a combination of both server-side and client-side rendering. Core content is rendered server-side before being sent to the client. Any additional resources are offloaded to the client.
Dynamic rendering. In this workaround, the server detects the user agent of the client making the request. It can then send pre-rendered JavaScript content to search engines, for example. Any other user agents will need to render their content client-side. For example, Google Webmasters recommend a popular open-source solution called Renderton for implementing dynamic rendering.
Incremental Static Regeneration, or updating static content after a site has already been deployed. This can be done with frameworks like Next.js for React or Nuxt.js for Vue. These frameworks have a build process that will pre-render every page of your JS application to static assets that you can serve from something like an S3 bucket. This way, your site can get all of the SEO benefits of server-side rendering, without the server management!
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note, for websites built on a content management system (CMS) that already pre-renders most content, like WordPress or Shopify, this isn’t typically an issue.
Key takeaways
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— ???? John ???? (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that hones in on the nuts and bolts of technical SEO.
Sign Me Up!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Video
youtube
7 skills for web developers
Web development is one of the fastest-growing industries. In fact, employment in the field is projected to grow 13% through 2026.
That figure also means there is, and will continue to be, plenty of work to go around. But do you have the skills to stand out from the competition and land the job of your dreams?
You'll need a range of skills to build successful websites. Here are the seven most important web development skills you need!
1. HTML/CSS
As a web developer, you will need to understand the basics of coding and markup languages.
https://en.wikipedia.org/wiki/Web_developer
Of all markup languages, Hypertext Markup Language (HTML) is the standard.
HTML forms every web page on the internet as we know it. How a website functions depends on how the developer writes the HTML.
But to control how your site actually looks in the browser, you'll rely on CSS.
You cannot write HTML without CSS
Cascading Style Sheets (CSS) describe how documents written in a markup language should be presented. In effect, CSS is the stylistic layer on top of HTML.
https://www.w3schools.com/html
CSS also describes how an HTML document is displayed visually as a web page. It defines the fonts, colors, and overall layout of the website.
Think of it this way: HTML builds the skeleton of a web page. CSS gives the website its style and look.
Most basic web development skills require mastery of HTML and CSS. Don't neglect their importance!
2. JavaScript
Once you master HTML and CSS, you'll eventually want to learn JavaScript.
JavaScript is a higher-level programming language. It makes websites more interactive and functional.
Create a website for the future
The web development industry is taking off, and standards keep getting higher and stricter. With that, the clients you work for will have higher expectations for the websites you create.
JavaScript will allow you to create a better experience for web users. With JavaScript, you can add interactive features directly to your web pages, including (but not limited to) search bars, social media buttons, and video share buttons.
JavaScript complements HTML. Although HTML forms a basic web page, JavaScript gives it more life and functionality.
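As a rough illustration of the kind of interactivity described above, here is a minimal sketch of a live search filter. The element ID and class name are assumptions made up for this example:

```javascript
// Minimal sketch of a client-side live search filter.
// The #search-box ID and .product class are assumptions for this example.
const searchBox = document.querySelector('#search-box');
const items = document.querySelectorAll('.product');

searchBox.addEventListener('input', (event) => {
  const query = event.target.value.toLowerCase();
  items.forEach((item) => {
    // Show items whose text matches the query, hide the rest.
    const matches = item.textContent.toLowerCase().includes(query);
    item.style.display = matches ? '' : 'none';
  });
});
```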
3. Photoshop
As a web developer, you'll want to know your way around Photoshop. It will not only make your life easier, but also help you perform better and faster.
You'll have a lot of fun to edit, design and stylize your website with Photoshop. You can even design some banners and logos for clients throughout your career.
But your Photoshop skills will extend far beyond appearance.
Once you master Photoshop, you won't just learn how to translate designs into code. You will also create website mockups.
So in other words, you'll be mainly using Photoshop for website planning.
4. WordPress
Nearly 75 million websites operate on WordPress alone. That's over 25% of the internet altogether.
WordPress is a free content management system, and it's great for beginners and established web developers alike.
It is relatively easy to use: you can edit and modify web pages, add plugins, and run error testing. The Yoast SEO plugin will also help you with SEO.
You will want to develop your website-building skills on other platforms as well, but WordPress is not just a standard; it's a linchpin of the web development world.
5. Analytical skills
If your web developer skills are strong, you will create successful websites. But there is a marketing aspect to the job that few people really understand.
Of course, the most successful websites are the most functional.
But consumer behavior is always changing, so your design, coding, and development skills will have to keep evolving to delight an ever-changing consumer.
Hence, web developers need a strong understanding of consumers, especially web consumers.
You will meet a wide range of audiences, niche markets, and customers throughout your career. If you can understand your consumers as a whole, it will only help you create websites that sell.
Know your audience
There are several ways to understand web consumers, but the most concrete is to study their online behavior.
And that's where web analytics tools come in.
Fortunately, there are many tools on the market to help you collect web stats, such as Google Analytics, Moz Keyword Explorer, and SEMrush.
With web statistics, you will better understand your specific target audience. They will tell you which keywords users search for and how long they stay on your site.
It's access to the mind and interests of your target audience. And with all this knowledge, you can create more engaging websites.
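Getting those statistics usually starts with a tracking snippet on the page. As a sketch, here is the standard Google Analytics gtag.js tag; the measurement ID (G-XXXXXXXXXX) is a placeholder you would replace with your own property's ID.

```html
<!-- Google Analytics (gtag.js); placeholder measurement ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){ dataLayer.push(arguments); }
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX'); // by default, sends a page_view on every page load
</script>
```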
6. SEO
Search engine optimization (SEO) is the driving force of modern marketing.
https://en.wikipedia.org/wiki/Search_engine_optimization
Nowadays, websites need SEO to attract traffic and secure leads. Most modern consumers find products and services through online searches, and sites that do not implement SEO will not rank high enough on search engine results pages.
Page load speed, domain reliability, and keyword-rich content are just some of the SEO skills web developers can (and should) learn.
Increase traffic to every website you create
Web developers can apply SEO to help their website rank better and attract more traffic.
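A few of those on-page basics can be handled directly in the markup. The sketch below shows an illustrative page with a descriptive title, a meta description, a viewport tag, and a lazy-loaded image with alt text; the URLs and copy are made up for the example.

```html
<head>
  <meta charset="utf-8">
  <!-- A descriptive, keyword-aware title and meta description -->
  <title>Handmade Leather Wallets | Example Store</title>
  <meta name="description"
        content="Shop handmade leather wallets with free shipping. Browse slim, bifold and travel styles.">
  <!-- Helps mobile rendering and page-speed scores -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
<body>
  <!-- Descriptive alt text helps image search; lazy loading helps load speed -->
  <img src="/images/slim-wallet.jpg" alt="Slim brown leather wallet"
       loading="lazy" width="600" height="400">
</body>
```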
Text
Top 7 Trends In SEO 2019 To Watch
Businesses need to break the bad habit of treating their SEO strategy as a one-time exercise. We wanted to look at a large group of businesses of all sizes and in all sectors so we could see how organic search traffic increased when the only common denominator was adding customer reviews to their sites; in fact, reviews are part of the SEO technique we have used to continuously grow organic traffic over the last twelve months here at SnapApp. When it comes to reviews, customers act as an army of link builders and keyword writers, so your SEO footprint is shaped without you having to lift a hand.
Mobile will account for roughly 72% of US digital ad spend by 2019, and the best SEO experts will tell you that mobile search results keep changing format to deliver a better experience. Ray Cheselka, SEO & PPC Manager at the SEO and design agency webFEAT Complete, predicts that sites with load times over two seconds will be penalized and that search intent will continue to grow in importance.
Organic SEO describes the methods used to earn a high placement (or ranking) on a search engine results page in the unpaid, algorithm-driven results of a given search engine. There is more to on-page SEO than optimizing for keywords: SEO also means ordering a site's architecture and links so that pages within the website are easier to find and navigate, and there is probably no more fundamental tactic than integrating internal links into your site, an easy way to increase traffic to individual pages, according to SEO Consult. Video has a lot of untapped potential, good for both SEO and user engagement, and social SEO is especially helpful for online reputation management, although content that no one is buzzing about has minimal social SEO effect. It isn't just how Google ranks optimized content that matters, but how it demotes poorly constructed or spammy content; that is what will push your rankings to where they ought to be in 2019.
SEO agencies are generally knowledgeable in both SEO and content marketing. They know how and where to get the data they need, whether through web analytics for research or conversion tools that monitor and report the usage of essential keywords, which saves time and lowers your in-house costs. A reputable SEO company will also work on your company profile, profile building, and local business listing optimization. There are many SEO services that can help improve a website's organic rankings, and workshops designed to help business owners implement digital marketing techniques across social media, organic SEO, and paid channels. The Beginner's Guide to SEO has been read over three million times and offers the comprehensive information you need to get on the road to professional-quality search engine optimization.
Our first five steps were dedicated to on-page SEO strategies. Link building is the primary goal of off-page SEO and is an enormous factor in how search engines rank your web pages. As far as I know, this only works for HTML and CSS pages; I don't go much for Flash websites, and I am unsure how those pan out for search engines and SEO. A huge part of SEO in 2018 is writing for humans AND search engines, also known as "SEO copywriting." SEO is a significant part of any web marketing strategy, and if you want a strong social media strategy, you simply can't ignore it.
Canonical tags handle content syndication: they allow other blogs to republish your work (similar to franchising) without hurting your website's SEO ranking. By adding a rel=canonical reference, you can get your brand and content out to several outlets, ensuring greater reach and a bigger audience without harming your own search results. YouTube, by contrast, adds the nofollow tag to its links, so those links don't really help with SEO.
There are many tools on the web to assist with basic keyword research, including the Google Keyword Planner and even more useful third-party SEO tools. Once you've found your keywords, use another SEMrush tool, the SEO Content Template (part of its Content Marketing Toolkit), to work out the best way to optimize your content. The biggest change we'll see in 2019, and it's already happening, is that keywords are becoming less important; the trending strategy now and into 2019 is Search Enterprise Optimization, a term I have been using since 2015. On the WordPress side, the W3 Total Cache plugin boosts performance by integrating features such as content delivery networks to reduce loading times for your pages.
The sensible technical strategy is still to reduce Googlebot crawl expectations and consolidate ranking equity and potential in high-quality canonical pages, which you do by reducing duplicate or near-duplicate content. SEO isn't just about building search-engine-friendly websites, and it can take a long time for a site to recover from black-hat tactics; fixing the problems will not necessarily bring organic traffic back to what it was before a penalty. The sooner you understand why Google is sending you less traffic than it did last year, the sooner you can clean things up and focus on proactive SEO that affects your rankings in a positive way. Local SEO lets you position your business on search engines and other digital marketing platforms so you're seen by potential clients on their terms, and if one company provides all of these services under one roof, climbing the rankings will not be an uphill task for your business. We offer affordable search engine optimization services to clients across the globe.
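To illustrate the canonical and nofollow points above: a syndicated copy of an article can point back at the original with rel=canonical, and a link you don't want to pass ranking signals through can be marked rel=nofollow. The URLs below are placeholders, not from the article.

```html
<!-- On the syndicated copy of an article: point search engines at the original -->
<link rel="canonical" href="https://www.example.com/original-article/">

<!-- A link that should not pass ranking signals -->
<a href="https://www.example.org/some-page/" rel="nofollow">untrusted or paid link</a>
```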
The best SEO guides are here to dispel the myths and give you everything you need to know to show up on Google and other search engines, and ultimately to use SEO to grow your company. On-page optimization is the act of optimizing individual pages with the specific goal of ranking higher and attracting more relevant traffic from search engines. There are many websites providing pertinent information about SEO and online marketing, and you can learn from them; even so, it is puzzling why some businesses don't try harder with analysis, revisions, and new content in their SEO strategy. An effective SEO strategy is made up of a variety of elements that ensure your site is trusted by both users and the search engines. By taking their marketing needs online and engaging an experienced SEO agency, a business is able to reach thousands, or even millions, of people it otherwise could not.
An SEO marketing strategy is a comprehensive plan to get more people to your website through search engines. Some search optimizers try to cheat Google by using aggressive tactics that go beyond legitimate SEO techniques. Subscribe to the Single Grain blog for the latest content on SEO, PPC, paid social, and the future of internet marketing. Note that SEO can also stand for "search engine optimizer." Like the rest of the digital landscape, SEO marketing is continuously evolving; reading a resource such as Search Engine Book after Moz's guide will solidify your understanding of the basic elements of SEO. If you do not have the time, or lack training in web design or SEO, look for web design experts and hire a professional SEO services agency to maintain your website and your reputation, because your business may depend on it.
Descriptive internal links generate SEO anchor text, which helps improve your search engine rankings, and SEO marketing is focused on choosing the keywords that will attract a great deal of unique visitors to your website. Have you ever wondered what new changes and updates we can expect in the SEO and backlink sphere in 2019? A Cisco study found that by 2019, 80 percent of all consumer Internet traffic will be video traffic. If you've spent time online recently, you've probably seen the term "SEO," or "search engine optimization." If you know you have very low-quality doorway pages on your site, you should remove them or rethink your SEO strategy if you want to rank high in Google for the long term. One of the problems search engines like Google and Bing have long tried to overcome is knowing which external links exist exclusively for SEO purposes and which represent a true signal that the linked content is of value to visitors. 2018 has already seen some particularly significant SEO paradigm shifts, from "mobile first" to the ever-advancing RankBrain machine-learning algorithm.
Content marketing is a bigger approach which, along with SEO, forms part of your digital marketing strategy, and if you look carefully you will find that content marketing and SEO go hand in hand. What you should know: the future of search engine marketing is semantic SEO. Links and technical SEO are the biggest pieces of the pie, but multimedia efforts such as video, photos, and podcasts will be the game changer and differentiator in many competitive markets. Occasional (and I do mean occasional, not frequent) use of keywords and keyword phrases in these links may also help very slightly. Excelling at SEO means serving your visitors, not just search engines: SEO is about targeting real people. If you handle the on-page and off-page elements of SEO at least as well as your rivals, you can achieve higher positions in the organic section of the search results and run a quality website capable of sustaining your revenue goals.
Some SEO providers assure clients that even if the site does not end up ranking among the top results, the money will not be wasted because of a refund. On the subject of speed, at the beginning of 2017 there was still much resistance to AMP in the SEO community overall, but heading into 2018 that resistance seems to be dissipating, with a reluctant acceptance that AMP does not look like it is going away soon. The biggest way people misuse SEO is by assuming it is a game, or that it is about outsmarting or deceiving the search engines. On-page and off-page work are both crucial to the success of an SEO strategy, but they sit on completely different sides of the fence when it comes to improving your rankings.
More than 50% of mobile phone users had started using voice search by 2015, so we can expect that in 2019 no less than half of searches will be voice searches; I believe 2018 is the year voice search changes how users search, and SEOs need to optimize for it. In the past, being good at SEO was only about using keywords, and some SEO specialists and bloggers say that short URLs rank better in Google. Writing for both people and search engines is sometimes called "SEO content" or "SEO copywriting." Whatever you call it, do not ignore the most significant part of the website development process: search engine optimization itself. While businesses can choose to do their own SEO, hiring an SEO agency with experienced search engine optimizers will no doubt help them reap a return on investment in the long run. Investing in high-quality tactics that take more time but endure, and that generate meaningful traffic, is the best way to spend your SEO effort.
Black-hat SEO, on the other hand, includes tactics like redirecting search engine "spiders" to different pages than human visitors see, mass-posting spam comments on blogs, forums, and articles, or hiding lists of keywords at the bottom of each page in very small fonts. Some of these strategies try to trick users into visiting sites about topics they have no interest in, which puts them at odds with the purpose of search engines. As we've seen, one of the major advantages of legitimate SEO is more traffic from better rankings, and if you are serious about improving search traffic but unfamiliar with SEO, we recommend reading this guide front to back. Companies thinking about buying SEO services should also read industry publications to familiarize themselves with the latest trends in SEO and web marketing so they can evaluate the help offered by the agencies they shortlist.
Mobile SEO in 2018 will likely be all about Progressive Web Apps (PWAs). The term SEO also covers making web pages easier for search engine indexing software, known as "crawlers," to find, scan, and index. I expect that technical SEO mistakes which affect crawl budget, and which pollute Google with non-SEO-friendly content such as social landing pages, WordPress media archives, offer pages, and cloned e-commerce product pages, will have an even more detrimental effect on sites going forward. Effective SEO lets you tune your website so it shows up in search engines; we will get into how to pick the best keywords for your business later in this guide, but it helps to know up front how to use them, as they are referenced throughout this section. Although meta descriptions are not a ranking factor, they do hold value for your website and are part of your SEO presence.
SEO stands for search engine optimization, and the better you get at it, the more traffic, and the more leads, you are likely to attract over time; to find out more, read our 2019 trends in SEO marketing report. Voice searches are growing too: it's expected that by 2019, 67 million voice-assisted devices will be in use in the U.S. Changes like these shouldn't be a logistical problem for your website; the overall practice among the biggest web companies today is to make sure sites are flexible enough, especially for SEO purposes. Whether you are a marketer, webmaster, or business owner, it is essential to invest in voice SEO now to reap the benefits in 2019. We said earlier that social media isn't a direct SEO ranking factor, so you're probably wondering why we even bring it up.
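One common way to keep thin or duplicate pages (media archives, offer pages, cloned product pages, and so on) from wasting crawl budget is a robots meta tag. This is a generic sketch, not something prescribed by the article.

```html
<!-- Placed in the <head> of a thin or duplicate page:
     keep it out of the index, but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```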
The effects of black-hat SEO are temporary: it does not take the search engines long to spot these illegitimate strategies and penalize you. They may flag your links as spam, and if you keep up the malpractice they will block your site and links altogether. Algorithm chasers, technical SEOs, and Google Doodle followers should develop their technical skills with a focus on emerging voice search systems and AI applications. Single Grain is a digital marketing agency that helps companies like Uber, Amazon, and Salesforce grow their revenues online using SEO and paid advertising. SEO as a free marketing channel is a long-term strategy and can be time intensive, but it is worth it, and a good user experience is the key both to visitor satisfaction and to powerful SEO.
SEO can target different kinds of search, including image search, video search, academic search, news search, and industry-specific vertical search engines. We are in the fourth quarter of 2018, so it is the right time to start thinking about the year ahead and the changes the SEO world can count on. We have long known that customer opinions, input, and sentiment about a brand deeply impact SEO rankings, but we wanted the data to prove it. Off-page SEO means taking action to build trust, authority, social signals, and inbound links. How video affects SEO really depends on your goals for video in your campaign and on which platform you use. LSI, for its part, is not what most SEO specialists claim it to be: it is not a concept the average web designer or webmaster can use to improve their search listings, and it is definitely not what many people, myself included, have written it to be; but first, some history. This video walks you through the specific steps you need to get higher rankings in 2018, including loading speed, technical SEO, content, links, and more.
All through this year, agencies specializing in SEO have been busy recovering and finding new ways to optimize for the search engines even better than their competitors. With mobile search on the rise, as an SEO consultant it makes sense to look at the effect this has on search engine optimization; the sole purpose of SEO services is to improve your search engine ranking. Make sure redirected domains go through a single canonical redirect with any chains minimised, and audit the backlink profile of anything you redirect at a page, because just as links bring reward they bring punishment if those backlinks are toxic. To smooth out interface problems, the web development team and the SEO specialist work together to build search-engine-friendly code that can be integrated into the client's website, and business owners need to find SEO experts who will help their site win customers through internet marketing.
That is because such pages are not SEO friendly and can affect your positioning significantly. SEO crawler programs work much like Google's own crawlers and will give you an overview of how your pages are likely to perform in the rankings. Google is making sure it takes longer to see results from both black-hat and white-hat SEO, and it is intent on keeping flux in its SERPs based largely on where the searcher is in the world at the time of the search and where the business is located relative to that searcher. In 2018, SEO is content and content is SEO; content is digital and digital is content. Before starting an SEO project, site owners should carefully read the webmaster guidelines each search engine publishes and follow the recommended best practices; it is clear that when site owners hire a search engine optimization expert, they stand a better chance of getting the most from their SEO work.
Search engines cannot watch video content, so it is vital to design video pages in an SEO-oriented manner. Subscribe to our weekly SEO and daily SearchCap newsletters for a roundup of the latest SEO news, tips, and strategies from Search Engine Land and other sources across the web. When you are logged in, go to SEO Toolkit » Keyword Research » Keyword Overview. For any SEO technique to work you need content, whether that means keywords, articles, or blog posts, and from 2019 businesses will increasingly use AI to chase higher rankings. For that reason, it is a good idea to build professional SEO content into your quarterly marketing budget. Google will see right through sneaky black-hat techniques like creating duplicate pages, producing thin content just for the sake of having more pages, or buying inbound links.
One thing is crystal clear: if you want people to discover your work, you need search engine optimization. That also means that when you're thinking about your SEO strategy, you need to think about how your social media strategy fits into the puzzle. As your traffic grows, add further SEO tactics (such as external and internal links and guest posting) to engage more visitors and keep your metrics high. Google has left a very narrow band of opportunity when it comes to SEO, and its penalties are designed to take you out of the game for a while as you clean up the infractions. Head of Marketing Shiny Edstrom at BioClarity, a San Diego based health-science company, thinks SEO will see decreased importance in 2019 and that SEOs should start making sure they're competent in SEM. The best placement you can get, a profile with contact information and a map on the right-hand side of the results page, is achieved only after a long run of sustained SEO work. Search engine optimization (SEO) is the process of optimizing a website for search engine discovery and indexing; it is the technique search engines effectively require sites to follow if they want to show up high in the results.
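Since crawlers cannot watch a video, structured data is the usual way to describe it to them. Below is a hedged schema.org VideoObject sketch; the title is taken from the post above, but the URLs and dates are entirely invented placeholders.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "7 skills for web developers",
  "description": "A short overview of the skills covered in this post.",
  "thumbnailUrl": "https://www.example.com/thumbs/web-dev-skills.jpg",
  "uploadDate": "2019-01-15",
  "contentUrl": "https://www.example.com/videos/web-dev-skills.mp4"
}
</script>
```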
The European Search Awards is an international competition that shines a spotlight on the best SEO and content marketing organizations in Europe.