#SearchAtlas
seowidgets · 2 days ago
Curious about SearchAtlas? Discover the essential facts in our straightforward guide to who they are and what makes them unique.
thelmasirby32 · 4 years ago
What the commoditization of search engine technology with GPT-3 means for Google and SEO
30-second summary:
Google’s technological edge has always come from its computational power.
That edge is no longer unique. AWS, Microsoft Azure, and other cloud services now give us access to essentially unlimited computing power on demand.
Generative Pre-trained Transformer 3 (GPT-3) is the largest and most advanced text predictor ever built. It will eventually be available as a commercial product.
We may see big players like Apple enter the search engine market.
LinkGraph's founder and CTO gives you foresight into the sea of opportunities ahead.
The tech world recently geeked out after its first glimpse of OpenAI's GPT-3 technology. Despite some kinks, the text predictor is already remarkably capable. From generating code to writing Google Ads copy to UI/UX design, the applications of GPT-3 have sparked the imaginations of web developers everywhere.
This changes everything.
With GPT-3, I built a Figma plugin to design for you.
I call it "Designer" pic.twitter.com/OzW1sKNLEC
— Jordan Singer (@jsngr) July 18, 2020
But those applications should also spark the imaginations of search engine optimizers. In GPT-3's debut week, we already witnessed a developer build a search engine on top of it. It can hardly rival Google's product, but the potential is clearly there. OpenAI plans to turn GPT-3 into a commercial product next year, meaning any brand could use the technology to build its own search platform.
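To make that concrete, here is a minimal sketch of what a GPT-3-backed answer engine might look like. It assumes the beta-era `openai` Python client and the `davinci` engine; the prompt format and parameters are illustrative choices, not OpenAI's prescribed recipe.

```python
# Minimal sketch: a question-answering "search" endpoint on a GPT-3-style
# completion API. Assumes the 2020-era `openai` Python client; the engine
# name and prompt format are assumptions for illustration.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def answer_query(query: str) -> str:
    # Frame the search query as a few-shot completion prompt.
    prompt = (
        "Q: Who founded Microsoft?\n"
        "A: Bill Gates and Paul Allen.\n"
        f"Q: {query}\n"
        "A:"
    )
    response = openai.Completion.create(
        engine="davinci",   # largest GPT-3 model exposed by the beta API
        prompt=prompt,
        max_tokens=64,
        temperature=0.0,    # deterministic, factual-style answers
        stop=["\n"],        # stop at the end of the answer line
    )
    return response.choices[0].text.strip()

print(answer_query("What year did Google launch?"))
```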
The implications would be significant for the SEO landscape. More brands building their own search engines would create new opportunities for digital marketers and the brands we help build. For Google, though, the prospect is far more nerve-wracking. GPT-3 has shown that Google's core technological advantages, natural language processing (NLP) and massive computing, are no longer unique; they are essentially being commoditized.
GPT-3 challenges Google’s technological edge from all directions
Few companies have monetized NLP and machine learning as well as Google, but its technological edge has always been computational power. Google's ability to crawl billions of pages a day, its massive data centers, and its extensive computing across that data have cemented its status as the dominant search engine and digital advertising market leader.
But AWS, Microsoft Azure, and other cloud services now give us access to essentially unlimited computing power on demand. A decade of Moore's Law has also cut the cost of this computing power by one to three orders of magnitude.
Additionally, open-source software and advances in research have made it easier for developers to access the latest breakthroughs in NLP and machine learning. Python, the Natural Language Toolkit (NLTK), PyTorch, and TensorFlow are just a few of the tools that have put these innovations in developers' hands.
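For instance, a few lines of NLTK, one of the open-source toolkits named above, already give a developer tokenization and part-of-speech tagging, building blocks that once required in-house engineering (resource names follow NLTK 3.x):

```python
# Tokenize and part-of-speech-tag a search query with the open-source NLTK.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

query = "best running shoes for flat feet"
tokens = nltk.word_tokenize(query)
print(nltk.pos_tag(tokens))
# e.g. [('best', 'JJS'), ('running', 'VBG'), ('shoes', 'NNS'), ...]
```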
Yes, Google still has BERT, which shares a similar transformer architecture with GPT-3. But GPT-3 is far larger: 175 billion parameters, versus roughly 340 million for the largest BERT model. GPT-3 also needs far less task-specific training data than BERT; it can pick up a new task from just a handful of examples instead of a fine-tuning dataset.
Not only will GPT-3 add value, through a newer, larger, and significantly improved NLP model, to businesses and applications already using AI and machine learning; it will also equip Google's biggest cloud competitors to pair that technology with their own computing power.
Other players will soon be able to build massive search engines like Google did
To build a search engine, you need to be able to retrieve many different types of information: web results, but also maps, data, images, and videos. Google's indexing power is what catapulted it to become the primary retriever of all web knowledge.
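At the core of the web-results piece sits an inverted index, a map from terms to the documents that contain them. A toy Python version is below; real engines add ranking, sharding, and freshness, but the data structure is the same idea.

```python
# A toy inverted index: the core data structure behind web-results retrieval.
from collections import defaultdict

documents = {
    1: "apple launches redesigned maps",
    2: "google updates its core ranking algorithm",
    3: "apple indexes the web with applebot",
}

index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query: str) -> set:
    # Intersect the posting lists of every query term (boolean AND retrieval).
    postings = [index[term] for term in query.lower().split()]
    return set.intersection(*postings) if postings else set()

print(search("apple maps"))  # -> {1}
```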
In addition to building that massive information retrieval system, Google monetized commercially valuable searches through advertising. But Google doesn't actually earn any money on the majority of searches.
OpenAI built GPT-3 in pursuit of its goal of creating superintelligence, not search. If OpenAI wanted to, it could build a competitor to Google, but the hardest part would be reaching a massive audience: Bing's market share is only 6.5%, while Yahoo's is 3.5%.
It's been a long time since the search engine market was a realistic place to compete. But what if a commercial GPT-3 product equipped a new competitor, one with an equally matched technological edge, market share, cloud service, and devoted customer base, to enter the search market?
Take a competitor like Apple. It launched a newly redesigned Apple Maps earlier this year, and it has already announced that it is indexing the web through Applebot. When it comes to launching the next great search engine, Apple is well positioned.
How could Apple change the SEO landscape with its own search engine?
Most likely, an Apple search engine would use ranking factors similar to Google's. The App Store ecosystem would give Apple richer in-app engagement data to draw on, and we could also see a greater reliance on social signals from Facebook and Twitter.
All Android devices currently ship with Chrome and Google Search as the defaults. Apple's devices ship with Safari, which lets users select their preferred search engine, and Apple could easily do what Google has done and default to its own. With a single iPhone model launch, Apple could use its dominance in devices to transition its massive customer base away from Google.
But what would be most troublesome for Google is how Apple could disrupt its massive ads ecosystem. 71% of Google's revenue comes from advertising, and with Google Ads now the most expensive (and most competitive) place to advertise on the internet, many advertisers would welcome a disruption. We could see billions of dollars of advertising revenue shift to Apple.
For SEOs and digital marketers, it's fun to imagine. Entirely new search markets would mean more demand for our expertise and more platforms our customers can use to grow. We're not quite there yet, but SEOs and digital marketers should be prepared for what advancements like GPT-3 could mean for our industry.
Manick Bhan is the founder and CTO of LinkGraph, an award-winning digital marketing and SEO agency that provides SEO, paid media, and content marketing services. He is also the founder and CEO of SearchAtlas, a software suite of free SEO tools. He is the former CEO of the ticket reselling app Rukkus.
Originally published on Search Engine Watch: https://www.searchenginewatch.com/2020/08/21/what-the-commoditization-of-search-engine-technology-with-gpt-3-means-for-google-and-seo/
thelmasirby32 · 4 years ago
1000 Ranking factors: How Google finds signals through the noise
30-second summary:
Google can measure content quality signals like never before, and with each new core update, it understands content quality more the way humans do.
Google uses different search models depending on the search query, meaning ranking signals can vary depending on search intent.
In a recent correlation analysis we did in the sports ticketing industry, the top factors that correlated with good rankings were long-form landing page content and high domain authority.
Given Google's guidance about its forthcoming 2021 web vitals update, page experience and performance are likely to become increasingly important for ranking on page one.
LinkGraph’s CTO explains how machine learning is helping us better understand how Google finds and evaluates content quality signals when ranking web pages.
What makes a web page rank on the first page of Google? Historically, the best correlation studies regressed thousands of search factors against page one rankings in an attempt to understand the primary drivers of SERP performance. 
But these ranking factors are weighted differently depending on the type of search. Google uses varying search models depending on the intent behind the query. Local searches with the map pack, commercially valuable searches with high CPC, informational queries with high search volume, searches where the most relevant results are rich media like videos or images, and even searches within highly regulated industries like health and money can all weight ranking factors differently.
To make correlation studies even more challenging, Google updates its core algorithm several times a year, meaning those signals continue to evolve. As Google continues to refine its ability to measure and analyze those signals, their bots are getting better at understanding website quality the way that humans do.
Opening Google's black box and the North Star
In the last decade, the academic fields of machine learning and natural language processing have made great strides. Starting around 2012, Google's search algorithms evolved beyond regression-based models towards deep learning. Google's ranking algorithm has now become a black box, and even its engineers find it challenging at times to understand why their models produce the results they do.
Ultimately, we know their goal has always been to bring the highest quality search results to users, and they’ve always done the best they could with the technologies and datasets available to them at the time.
Every core update is consistent with what they have been telling us for years: they reward high-quality pages and penalize low-quality sites and spam. Google's updates have always marched in the same direction, and it's well understood among SEOs that the North Star points towards:
Higher-quality content
Fast, snappy user experiences
Increasing site authority and reputation
What can be frustrating for webmasters is Google's lack of guidance about which ranking factors matter most in their particular industries. When studying how properties like page speed, text content, and backlinks relate to rankings, advanced SEOs are turning to correlation analysis and machine learning tools to better understand how Google finds these signals through the noise.
What we learned about the primary drivers of SERP rankings
To better understand these signals, we ran an industry-specific correlation analysis: 18 important ranking factors across 200 searches in the sports ticketing vertical, where every keyword had a CPC of over $8.
What we discovered was that the factors most strongly correlated with top rankings were high Domain Authority and long-form landing page content. Domain organic traffic value, URL organic traffic, and page load speed were among the weaker correlations.
Raw backlink and referring domain counts were not as strongly correlated as metrics like Moz's Domain Authority, which differentiates between low-quality and high-value links. Additionally, Majestic's Trust Flow and Citation Flow metrics proved less correlated with ranking on the first page than Moz's Domain Authority.
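A study like this boils down to computing rank correlations between each candidate factor and SERP position. A simplified sketch follows; the dataset file and column names are hypothetical stand-ins for our actual data.

```python
# Sketch of a ranking-factor correlation analysis using Spearman's rho.
# The CSV and column names are hypothetical stand-ins for the real dataset.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("serp_observations.csv")  # one row per (keyword, result) pair

factors = ["domain_authority", "word_count", "page_load_ms", "referring_domains"]
for factor in factors:
    rho, p = spearmanr(df[factor], df["serp_position"])
    # Negative rho: higher factor values go with better (lower) SERP positions.
    print(f"{factor}: rho={rho:.2f}, p={p:.4f}")
```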
For the startup client we were working with, this analysis taught us that maximizing the PageRank flowing to their most competitive landing pages was critical to contending with the heavyweights in their space, more than anything else they could do for their SEO.
From a content perspective, we found that the pages ranking on the first SERP align with what Google claims to reward: long-form, topically rich pages with interactive elements that provide a great page experience, such as jump links, expandable content modules, and interactive JavaScript or videos.

We were surprised to see that page load speed wasn't as strong a factor as we expected. Nevertheless, given Google's guidance about the forthcoming 2021 web vitals update, performance signals are likely to become more important. Even if these signals aren't as influential as site authority in today's algorithm, I would strongly advise brands to prepare for the update now: make on-page content and page speed improvements, and understand how your site stacks up in the Chrome User Experience Report.
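Checking how a site stacks up in the Chrome User Experience Report can be scripted against Google's public CrUX API. A minimal sketch, assuming you have an API key (the origin below is a placeholder):

```python
# Query the Chrome User Experience Report (CrUX) API for field performance data.
# The endpoint is Google's public one; the API key and origin are placeholders.
import requests

API_KEY = "YOUR_API_KEY"
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

resp = requests.post(ENDPOINT, json={"origin": "https://www.example.com"})
resp.raise_for_status()
metrics = resp.json()["record"]["metrics"]

for name, data in metrics.items():
    # p75 is the value Google uses to assess Core Web Vitals thresholds.
    print(name, data["percentiles"]["p75"])
```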
What this means for ranking on page one in your industry
If you're an upstart looking to break onto page one of Google, most of these algorithm updates should come as very good news, because they create a more democratic SEO landscape. In every correlation analysis we've performed, site authority has been the most influential factor. It's also the hardest ranking factor to improve, because link building takes concerted effort and can take years in big industries.
Seeing other ranking factors, such as content length and quality, page speed and web vitals, and high-quality UX, demonstrate strong influence over rankings means new entrants gain more opportunities to succeed in SEO on the merits of their web pages, not just because they are incumbents with massive backlink profiles. Even if you don't have the tools to run your own comprehensive correlation analysis, you can manually reverse engineer the SERPs in your industry. Benchmarking your competitors on these important search factors can reveal tactical insights that help you direct your team's SEO efforts towards the most impactful work and deliver ranking improvements faster; a starting point for such a benchmark is sketched below.
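Here is a hedged example of that benchmarking step: fetch a handful of top-ranking competitor pages and compare on-page content length. The URLs are hypothetical, and a real audit would also pull speed and link metrics.

```python
# Benchmark word counts of top-ranking competitor pages for a target keyword.
# The URL list is hypothetical; extend with speed and link data for a real audit.
import requests
from bs4 import BeautifulSoup

competitor_urls = [
    "https://www.example.com/concert-tickets",
    "https://www.example.org/sports-tickets",
]

for url in competitor_urls:
    html = requests.get(url, timeout=10).text
    # Strip markup and count visible words as a rough content-length proxy.
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
    print(url, len(text.split()), "words")
```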
Manick Bhan is the founder and CTO of LinkGraph, an award-winning digital marketing and SEO agency that provides SEO, paid media, and content marketing services. He is also the founder and CEO of SearchAtlas, a software suite of free SEO tools. He is the former CEO of the ticket reselling app Rukkus.
Originally published on Search Engine Watch: https://www.searchenginewatch.com/2020/07/22/1000-ranking-factors-how-google-finds-signals/