research tips for electric train whump:
-INDUSTRIAL MAINTENANCE SOURCES ARE YOUR FRIEND! There’s not much stuff specifically about trains, but basic concepts (maintenance and repair of AC or DC motors, transformers, rectifiers, other heavy electrical equipment) largely carry over from sources about elevators or substations. There are a lot of good videos on YouTube about these topics; look into stuff aimed at apprentice electricians or industrial maintenance, it will generally be more visual/metaphor-based and less math-heavy if you struggle with that.
-many “breakdowns” are actually due to power infrastructure issues: third rail and catenary wires have different problems, and these further vary based on how old they are. Fixed-tension catenaries on the former Pennsylvania Railroad are a notorious issue. The more modern the line, the less weird stuff you’ll have, but much of the US, parts of the UK, and a number of spots in mainland Europe have more unusual and antiquated electrification systems. And this is also an issue with model trains between brands and eras!
-On a related note, pantograph designs have varied by time and place and have different advantages/disadvantages/issues. You even have times when they have a uniquely bad time with old catenaries, like old Comet EMUs specifically getting snarled in those fixed-tension PRR catenaries.
-Never underestimate how “dumb” and un-computery a lot of electric trains actually are, especially circa the 80s. “Toaster” isn’t even an inaccurate insult for older DC-motor trains; they literally brake using huge resistors that put out a ton of heat… like a giant toaster (see the sketch after this list). And the really old, simple stuff tended to last a stupidly long time in service, so you can 1000% wave off a 1920s-era engine or EMU in the 80s.
-If you want some really easy ones that have actually happened before: connecting to too strong a power source and getting FRIED, or having wires ripped out willy-nilly by techs who aren’t familiar with electric trains. Weaponized incompetence with anything electrical checks out, given how even a lot of train people know very little about it and don’t care.
-model trains also have a lot of beginner-friendly electrical info you can work from
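For the resistor-braking point above, here is a quick back-of-the-envelope sketch of why those braking grids run toaster-hot; the voltage and resistance numbers are made up for illustration, not taken from any real train.

```python
def brake_heat_kw(voltage: float, resistance_ohms: float) -> float:
    # Dynamic braking dumps the motors' output into a resistor grid: P = V^2 / R
    return voltage ** 2 / resistance_ohms / 1000.0

# Illustrative numbers only: a 600 V traction circuit into a 1.5-ohm braking grid.
print(f"{brake_heat_kw(600.0, 1.5):.0f} kW of heat")  # -> 240 kW
```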
#stex#starlight express#a lot of this stuff also applies to diesel-electric engines (which is almost all diesels)#they just have an engine as a generator vs connecting to an external power source#older electric locomotives will have a lot more in common with appliances or factory machinery than computers#rail is generally technologically conservative since it’s VERY heavy duty and high reliability and likes to reuse vs replace#it’s actually really handy to know the basic electrical stuff that goes into them because it’s applicable to tons of everyday stuff
Understanding ChatGPT and Its Differences From Google Bard!
AI is taking the world to new heights each day. From automating simple everyday processes to operating heavy machinery in industry, AI, coupled with various other technologies, has changed the technology landscape. ChatGPT is one of the prime examples of AI revolutionizing the world.
If you are from the IT sector, there is no chance you haven’t heard of ChatGPT. Since its release in November 2022, it has taken the world by storm. And soon after, Google brought out one of the best pieces in its arsenal: Bard.
So, whether you are aware of these two AI rivals or not, the coming sections will surely be fruitful for you!
What is ChatGPT?
If you don’t know it yet, let’s shed some light on ChatGPT.
ChatGPT is an AI-backed chatbot that provides human-like responses to the text inputs provided by users. Now, some of you may ask how it differs from Jasper, Quillbot, and other similar tools. Well, there are many differences, but one key difference is that it is not connected to the internet. Moreover, it is not linked to any type of external information.
ChatGPT generates responses that are conversational and drawn from the data it has been trained on. This makes it a language-processing system, not a search engine.
How does ChatGPT generate responses?
To elaborate further, ChatGPT generates human-like text without accessing the internet. How? The process is called pre-training, in which a system is provided with heaps of data and tuned to establish relationships between words and concepts. The resulting model can also be used for tasks such as translation and summarization.
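To make the idea concrete, here is a minimal sketch of text generation from a pre-trained model. This is not OpenAI's actual stack; it assumes the Hugging Face transformers library, with the small public GPT-2 checkpoint standing in for a pre-trained model.

```python
from transformers import pipeline

# Load a small pre-trained language model (GPT-2 as a stand-in).
generator = pipeline("text-generation", model="gpt2")

# The model continues the prompt using relationships learned during
# pre-training; it never queries the internet at inference time.
result = generator(
    "The key difference between a chatbot and a search engine is",
    max_new_tokens=30,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```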
One question some of you might have in mind: what is the source of information for ChatGPT? The data used to train ChatGPT includes sources like books, websites, and a large number of articles.
With the help of these resources, ChatGPT is capable of generating a plethora of content types, such as
Program and software code
Social media posts
Cooking recipes
Blogs and articles
Email drafts
Summaries
Jokes
Law briefs
And much more!
GPT-3 — The Model Behind ChatGPT!
We have been talking about ChatGPT for some time now. Let’s get a bit technical!
GPT in ChatGPT stands for Generative Pre-trained Transformer. ChatGPT is an implementation of GPT-3, an exceptional neural network machine learning model and one of the most powerful language models created to date. Why?
Unlike Microsoft’s Turing Natural Language Generation model, which featured 17 billion parameters, GPT-3 was created with 175 billion parameters, letting it model relationships between words at a massive scale. Further, ChatGPT excels at understanding the context of a conversation with the help of self-attention mechanisms.
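To illustrate the self-attention mechanism mentioned above, here is a minimal numpy sketch of scaled dot-product self-attention. The dimensions and random weights are purely illustrative, nothing like GPT-3's actual scale.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project token embeddings into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Score every token against every other token, scaled by key dimension.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax turns scores into attention weights that sum to 1 per token.
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    # Each output vector is a context-weighted mix of the value vectors.
    return w @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                  # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)   # -> (4, 8)
```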
Bard – The Rival to ChatGPT Initiated By Google!
In the three and a half months since launch, ChatGPT has acquired millions of daily active users. If we go by the numbers, by the end of December 2022 it had garnered over 57 million users. The number further climbed to 100 million by the end of January 2023.
As much of the online community started calling it a replacement for Google, Google stepped in to defend itself and launched Bard.
Google Bard is an AI chat service backed by Google’s revolutionary LaMDA. Short for Language Model for Dialogue Applications, LaMDA was unveiled by the tech giant two years ago. Though Google was ahead of the game with its invention of the Transformer architecture, it is not the frontrunner in the AI revolution.
How is Bard Different From ChatGPT?
We know that ChatGPT has set the bar high, and it will be difficult for any tech giant to match it. Google Bard has the capability to match ChatGPT. However, it will take time, as Google has launched Bard with only limited capabilities for now.
Bard vs. ChatGPT — Key Differences
Other than being developed by different tech giants, these AI chatbots can be differentiated on several points.
Source of information
One of the chief differences between Google Bard and ChatGPT is their source of information. As elaborated in the previous sections, ChatGPT is not connected to the internet but is fed heaps of data from various sources. It uses AI, fine-tuned on that data, to provide relevant, human-like responses to users.
On the other hand, Google Bard leverages the power of the web to provide all the information.
Quality of information
As the source of information for Google Bard is the web, its information can be more current. It can generate responses with the latest information, more refined and detailed than standard Google search results.
In contrast, ChatGPT is trained on a fixed set of data, which means the responses to user queries are limited to those data sources. The user may not get the latest information on some queries.
Technology
The third distinguishing feature between the two is the technology used. Google Bard uses LaMDA, which is built on Google’s open-source Transformer architecture to comprehend natural language. Moreover, LaMDA is trained to look for patterns between different words and sentences to generate an output.
On the other hand, ChatGPT uses GPT-3, a powerful Generative Pre-trained Transformer. It generates human-like responses by deeply analyzing the importance of words and phrases in the input queries. ChatGPT does not need a grammatically correct sentence to provide a response; it can generate an output based on just a few words, too.
The Future of Chatbots!
As of now, Google is considered late, as ChatGPT has covered most of the market. Moreover, Microsoft plans to combine ChatGPT with the Bing search engine, which will be another impediment for Google to overcome. ChatGPT has also launched its premium service at an affordable price. However, Google has announced that AI features will appear in the Google search engine in the coming time, so users might get to see more refined results.
Conclusion
ChatGPT is one of the best inventions of the 21st century. As people have started calling ChatGPT an alternative to Google search, Google will not stay silent. As of now, it has rolled out the most basic, low-power version of Google Bard. However, it is not yet certain what the search giant is up to.
If you want to understand these chatbots or want to build one of your own, you can get a quote from Reinforce Global.
#AI#Artificial Intelligence#Artificial Intelligence Technology#Bard#Google Bard#ChatGPT#Chatbot#Website Development Company#Reinforce Global
Professional Sound Card of an Unusual Design, with Professional and Consumer Jacks
ESI presented its new sound card called Juli@ at the world-famous exhibition Musikmesse (March 31 – April 3, 2004) in Frankfurt, Germany.
The card is shipped in a nice box designed like a book, with a transparent inner side.
Bundle
The bundle of our sample card contained the following items:
PCI-card ESI Juli@
breakout cable for the S/PDIF and MIDI interfaces
really useful user's manual in English and in German printed in A5 format
CD with drivers
CD with special version of Ableton Live (list of limitations)
50% discount coupon for a full version of Ableton Live
Colorful advertisement of all ESI products
Card-transformer
The unique feature of this card is the ability to transform it to the user's needs: you can choose which jacks to use, professional 1/4" balanced jacks (1/4" TRS) or consumer RCA jacks.
ESI Juli@ sound card: professional 1/4" balanced TRS jacks are active
Disassembled ESI Juli@ sound card: the upper part turns 180 degrees
ESI Juli@ sound card: consumer RCA jacks are active
Innards
The main DSP chip labeled with the ESI logo is VIA Envy24HT-S (24-bit, 192 kHz; interfaces: three output I2S/AC-links, two input I2S/AC-links). Digital transceiver – AKM AK4114 (8 inputs, 2 outputs).
The main DSP chip: 24-bit, 192 kHz VIA Envy24HT-S
The card uses an 8-channel multibit 24-bit 192 kHz DAC, the AKM AK4358, with reduced sensitivity to jitter, positioned by AKM for professional equipment as well as for consumer DVD-Audio and SACD (by the way, I wrote about the release of this DAC in our news a year ago). Dynamic range: 112 dB. THD+noise: -94 dB. It has an 8x oversampling 24-bit digital filter with a slow roll-off option (here this option is disabled to provide an ideally even frequency response). The DAC is not top class, but good enough for a sound card in this price range.
You can see on the ESI Juli@ block diagram that the 8-channel AKM 4358 DAC (marked as 4x DAC) is used in this sound card on purpose: it allows monitoring the digital and analog I/O with absolutely zero delay.
The ADC, a dual-bit delta-sigma stereo 24-bit 192 kHz AKM AK5385A, has slightly better characteristics; it is positioned by AKM for professional equipment recording audio in high definition formats, including DVD-Audio. Dynamic range: 114 dB. Signal/(noise+distortion): 103 dB. It has a high-quality digital antialiasing filter with a linear phase; passband (Fs=48 kHz): 0~21.768 kHz, ripple: 0.005 dB, stopband: 100 dB.
ESI Juli@ converters: AKM 4358 DAC and AKM 5385A ADC
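As a rough sanity check on those dynamic range figures, here is a back-of-the-envelope sketch (not from the review) using the standard ideal quantization formula; real converters sit well below the ideal because analog noise dominates.

```python
def ideal_dynamic_range_db(bits: int) -> float:
    # Ideal SNR of an N-bit quantizer for a full-scale sine: 6.02*N + 1.76 dB
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit ideal: {ideal_dynamic_range_db(bits):.1f} dB")
# 16-bit ideal: 98.1 dB; 24-bit ideal: 146.2 dB. The AK4358 (112 dB) and
# AK5385A (114 dB) fall well short of the 24-bit ideal, as all real parts do.
```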
Some comments. Though the DAC and ADC are of the same class in this sound card, the ADC is still a tad better than the DAC. Perhaps, as with the Waveterminal 192X, the engineers considered digitization quality (ADC) more important in a home/project studio than the slightly reduced quality of signal monitoring, other things being equal. But hi-end audio PC users (the RCA jacks are a nod to them), who use their computers solely for high quality playback and do not care about the ADC, have a reason to pause: is the DAC comparable to the level of the rest of the sound playback section? If the rest of the sound section is much more expensive than the card (approximately over $600), so that the bottleneck is in the DAC quality as well, then they should pay attention to mastering-level sound cards.
In general, it should be noted that in the price range up to $200, Juli@ demonstrates better DAC/ADC quality than similar older cards based on the ENVY24 (the various M-Audio Audiophile 2496, Echo MIA, etc.). According to our measurements (using the same reference sound card), Juli@ is even better than the more expensive ESI Waveterminal 192X, which has better ADCs according to the specification. So in our opinion, only the EMU 1212M (with converters of a higher level) can be a serious competitor to Juli@, which is undoubtedly successful in price/quality terms.
Despite this fact, many users of mid-range sound systems will quite possibly prefer Juli@, because this ESI card is easier to control, it does not have problems with MME/WDM interface support in high formats, it has a smoother frequency response on the line outs, and it offers several useful proprietary features, which will be described later. Besides, EMU cards are very difficult to find on our shelves, and this situation has not changed for half a year already.
E-WDM technology
E-WDM technology (Enhanced Audio MIDI Driver) is a proprietary project of ESI that enhances the original architectural concept of Microsoft's WDM drivers.
Aside from standard features, E-WDM offers functions required for professional sound processing:
Aside from Win XP/2000, it also supports Win Me/98 SE: Based on the WDM architecture, E-WDM drivers work fine under all WDM-compatible Windows operating systems, while WDM drivers of other sound cards often work only in W2K and XP, offering VxD drivers for Win98SE and Me.
GIGAStudio 2000 support: From the very beginning, E-WDM drivers were developed for professional applications. The popular software sampler GIGAStudio from Tascam is fully supported with an extremely low latency of 1.5 ms.
ASIO 2.0 support: ASIO 2.0 is the de facto standard, and E-WDM supports it with just a 3.0 ms latency.
Independent support for MME applications: Several MME applications can be used simultaneously, and the drivers will not complain that the device is busy. This is also useful for old applications that are not supported by WDM drivers.
Multi-client support: An unlimited number of audio applications can access the card simultaneously with multi-client support. You can use SONAR and WinAmp via a single sound device.
Multi-channel support: It supports the 5.1 output format for software DVD-Video players such as WinDVD and PowerDVD. This function is also useful for WDM applications such as SONAR, especially in terms of input/output channel synchronization.
DirectMUSIC MIDI ports: The WDM-based multi-channel MIDI driver from ESI offers more stable timing than the built-in MIDI timing in NT4 or Windows 9x.
DirectSound support: E-WDM offers support for several DirectSound channels for multi-channel output in applications such as PCDJ (DJ software).
No -6 dB signal attenuation: Unlike the situation with several other drivers, E-WDM guarantees the signal level exactly according to the existing standard.
Ultra-low latency, less than 1.5 ms: E-WDM drivers provide comfortable work thanks to minimal latency. Buffer size can be set in the drivers.
DirectWIRE: audio stream visual routing technology
DirectWIRE is a technology for visual software routing of signals, currently available in most ESI products. Signals are routed at the driver level, bit-for-bit, without quality loss.
DirectWIRE 1.0 technology appeared in the ESI Waveterminal 2496/192 when the Gigastudio sampler was still widely popular; initially it served to convert MIDI tracks to WAV from Gigastudio (GSIF interface) into Cubase (ASIO) or into Cakewalk/SONAR 1 (MME).
DirectWIRE 1.0 panel in ESI Waveterminal 192X
DirectWIRE 2.0 supports 32 channels, has a new control panel, and adds an option to mute monitoring of selected output channels. You can switch inputs and outputs of the MME, WDM, ASIO, and GSIF program interfaces, even if they work simultaneously. You can connect inputs and outputs of different applications with virtual cables and record signals without quality loss. Using DirectWIRE you can also record multi-channel audio from a DVD or another source (even one protected from copying) in formats up to 24-bit 192 kHz, in real time, in completely digital form.
DirectWIRE 3.0 was enhanced with hardware inputs. In Juli@, Virtual Inputs 1 and 2 are the left and right hardware analog inputs, and 3 and 4 are the left and right channels of the hardware digital S/PDIF input.
Interface names mean the following applications:
MME: WinAmp, CoolEdit, Cakewalk, Vegas, etc.
WDM: SONAR (WDM/KS), PowerDVD, WinDVD, etc.
ASIO: Cubase, Nuendo, Logic, Reason, SONAR (ASIO), etc.
GSIF: GigaStudio 2.42 or higher.
You can read this Audiotrak tutorial to learn how to use DirectWIRE 3.0 with detailed examples.
Digital and MIDI Interfaces
Juli@ stands out against other ESI products with its MIDI I/O, digital RCA I/O, and a digital optical output. You had to buy a separate MI/ODI/O module to get these features with the Audiotrak MAYA44MKII and ESI Waveterminal 192X/L. Though there are now modifications with the MI/ODI/O module in the box, it occupies an additional bracket among the PCI slots.
Sound Quality
Juli@ vs. Audigy2 ZS Platinum Pro
These two cards can be compared due to similar prices ($185 for the ESI product and $240 for the Creative product) and the professional attributes (an external block with a full set of connectors) of the Audigy, which is essentially a gaming card. The functionality of Juli@ and the Platinum Pro in terms of a microphone preamplifier and phone input can be leveled by adding an inexpensive external analog mixer (about $50) to Juli@, and it's quite possible that an external mixer is already available in a home/project studio.
Before the comparative audition I had no subjective preferences: the cards have converters of the same class, and thus I didn't expect considerable differences in sound quality. Nevertheless, the difference between the cards can be clearly heard even using the active JetBalance JB-381 speakers (by the way, at last very good speakers at a moderate price, less than $200).
In the 16-bit 44 kHz mode the Audigy2 is traditionally weak at playing via the MME interface, and it's much better via the professional packaged 'SB Audigy2 ZS ASIO' driver with high-quality SSRC resampling to 48 kHz enabled. But Juli@ produces crisper and more detailed sound, and thus is more suitable for a professional. Though this can be heard only when you instantly switch between the cards playing the same music fragment; if you increase the time between the auditions to a couple of minutes, you won't be able to distinguish the cards even with the same music fragment. One way or another, I clearly hear the difference, for example on Alex Reece's 1996 album 'So Far' (hits: Feel The Sunshine, Pulp Friction) and some other test compositions with clearly compressed mastering and a timbre-catchy high frequency range.
Juli@ vs. EMU 1820
Despite the high price ($400) and the increased number of channels of the EMU 1820, Juli@ can compete with it head-to-head, because the Creative/EMU product has converters of the same level (CS4392).
Both cards, Juli@ and the 1820, were tested in balanced mode at a high signal level (+4 dBu) to increase the dynamic range and to demonstrate the sound cards' capabilities at their maximum. The auditions were also carried out on EVENT 20/20bas monitors, which allow balanced connection.
A thorough audition in the same conditions, using the same cable and instantly switching between the cards, did not reveal any difference in their sound. However, considering the high price of the EMU 1820, it cannot be called a successful sound card: it's an economy modification of the higher-end 1820M, and the 1820 is of lower quality than even the cheaper 1212M model. Forestalling your questions: we don't have an opportunity to compare it with the EMU 1212M or 1820M, because these cards cannot be found on sale in Russia. The US EMU office promised to send us a press sample, but we lost contact with them. If we get the opportunity, we'll update this article.
Juli@ vs. LynxTwo
Comparison with LynxTwo (price >$1000; the stereo modification is the Lynx L22, >$800) is traditional in our tests; it serves to reveal the degree and character of sound differences between a sound card under review and a reference sound source.
We hasten to say that LynxTwo sounds better. The difference is not large, but it still makes itself felt. Nevertheless, Juli@ demonstrates professional sound quality in comparison with the reference. Multimedia cards (for example, the Audigy2/ZS) are usually infamous for timbre distortions and various tones and harmonics at high frequencies, so despite their rather high quality converters, such cards cannot be used for professional work. Juli@ is free from these drawbacks; it sounds true and clear. The only difference from the Lynx is a tad less detail, which will hardly affect its professional capability, except for mastering.
Headphones
Despite the fact that this card does not have a separate headphone output, headphones can be connected using a 2RCA-to-minijack adapter. Using Sennheiser HD600 headphones (300 ohm impedance) via the adapter caused no problems with sound. Besides, we carried out the following interesting experiment: we connected a variable resistor to the output and applied a maximum-amplitude sinusoid to the output of the card. On an EZ Digital OS-310M digital oscilloscope we monitored how the amplitude changed and at what resistance clipping would appear. In comparison with the pure line output of the MAYA44MKII, which clips at ~100 ohm and lower, Juli@ was OK. The JRC 4580 operational amplifiers hold a low-impedance load well, though the Juli@ specification requires a load impedance of not less than 100 ohm. By the way, the MAYA44MKII has a headphone output, which is also OK.
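To see why a low-impedance load can sap the output level in an experiment like this, here is a minimal voltage-divider sketch; the open-circuit level and source impedance are assumed illustrative values, not measured Juli@ figures.

```python
def loaded_amplitude(v_open: float, z_out: float, z_load: float) -> float:
    # The load sees v_open * Zload / (Zout + Zload): a simple voltage divider.
    return v_open * z_load / (z_out + z_load)

V_OPEN = 2.0   # assumed open-circuit output amplitude, volts
Z_OUT = 10.0   # assumed line-out source impedance, ohms
for z_load in (300, 100, 32):
    v = loaded_amplitude(V_OPEN, Z_OUT, z_load)
    print(f"{z_load:>3} ohm load: {v:.2f} V ({v / V_OPEN:.0%} of open-circuit)")
```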
Working in Professional Applications
No surprises here. Like any sound card based on an ENVY24-series chip, Juli@ demonstrated excellent results in the ASIO applications Cubase SX 2.0.1 and WaveLab 5.0a.
Besides, InterVideo WinDVD 6.0 works correctly with the card as well. When playing DVD-Audio discs, the audio is downsampled to 16-bit/48 kHz.
RMAA 5.4 Tests
Many users ask us to test not only the quality of the line output, but that of the line input as well. This is especially important for professional equipment, so we tested the card in three modes: with the card's output connected to its own input (loopback), and with the input and output each tested separately against LynxTwo, a reference card of higher quality.
16 bit 44 kHz, +4 dBu, balanced 1 m cables

Parameter | ESI Juli@ loopback | LynxTwo -> ESI Juli@ | ESI Juli@ -> LynxTwo
Frequency response (from 40 Hz to 15 kHz), dB | +0.05, -0.03 | +0.02, -0.16 | +0.04, -0.03
Noise level, dB (A) | -96.1 | -96.1 | -96.4
Dynamic range, dB (A) | 95.7 | 95.9 | 95.8
THD, % | 0.0006 | 0.0023 | 0.0006
IMD, % | 0.0046 | 0.015 | 0.0045
Stereo crosstalk, dB | -96.7 | -97.9 | -98.1
When the line input is tested in balanced mode using LynxTwo, the frequency response gets worse in this test as well as in all the other ones. In loopback mode (when the output and the input of the card are connected) there is no such drawback, so there is some misalignment in the operation of the balanced interfaces in this specific device combination. This is another reason not to use balanced connections needlessly.
24 bit 44 kHz, +4 dBu, balanced 1 m cables

Parameter | ESI Juli@ loopback | LynxTwo -> ESI Juli@ | ESI Juli@ -> LynxTwo
Frequency response (from 40 Hz to 15 kHz), dB | +0.05, -0.03 | +0.02, -0.16 | +0.04, -0.03
Noise level, dB (A) | -103.6 | -109.3 | -108.0
Dynamic range, dB (A) | 103.4 | 109.2 | 107.9
THD, % | 0.0005 | 0.0023 | 0.0005
IMD, % | 0.0020 | 0.014 | 0.0019
Stereo crosstalk, dB | -103.9 | -109.3 | -107.2
Note the ideally flat frequency response graph and the ideally clean noise spectrum in the 24-bit mode. The measured noise level is close to the figures claimed in the converter specifications. The engineers made the most of the existing converters.
24 bit 96 kHz, +4 dBu, balanced 1 m cables

Parameter | ESI Juli@ loopback | LynxTwo -> ESI Juli@ | ESI Juli@ -> LynxTwo
Frequency response (from 40 Hz to 15 kHz), dB | +0.03, -0.02 | +0.01, -0.17 | +0.02, -0.04
Noise level, dB (A) | -103.9 | -109.8 | -107.9
Dynamic range, dB (A) | 103.9 | 109.4 | 107.7
THD, % | 0.0005 | 0.0023 | 0.0005
IMD, % | 0.0019 | 0.014 | 0.0019
Stereo crosstalk, dB | -101.6 | -106.8 | -106.7
In the 96 kHz mode the spectrum is reproduced completely, as it should be. I'll remind you that EMU cards have problems with the DAC frequency response, which is down 0.7 dB at 20 kHz.
24 bit 44 kHz, S/PDIF all tests comparison

Parameter | Test file 24/44 | LynxTwo S/PDIF loopback | ESI Juli@ S/PDIF -> LynxTwo | ESI Juli@ S/PDIF loopback
Frequency response (from 40 Hz to 15 kHz), dB | +0.00, -0.00 | +0.00, -0.00 | +0.00, -0.00 | +0.00, -0.00
Noise level, dB (A) | -147.7 | -144.4 | -144.5 | -144.5
Dynamic range, dB (A) | 133.4 | 133.2 | 133.2 | 133.2
THD, % | 0.0000 | 0.0000 | 0.0000 | 0.0000
IMD, % | 0.0002 | 0.0002 | 0.0002 | 0.0002
Stereo crosstalk, dB | -149.7 | -146.3 | -145.1 | -145.0
In digital output tests Juli@ is equal to LynxTwo.
24 bit 96 kHz, S/PDIF all tests comparison
Parameter | Test file 24/96 | LynxTwo S/PDIF loopback | ESI Juli@ S/PDIF -> LynxTwo | ESI Juli@ S/PDIF loopback
Frequency response (from 40 Hz to 15 kHz), dB | +0.00, -0.00 | +0.00, -0.00 | +0.00, -0.00 | +0.00, -0.00
Noise level, dB (A) | -151.1 | -147.8 | -147.9 | -147.8
Dynamic range, dB (A) | 133.3 | 133.2 | 133.2 | 133.2
THD, % | 0.0000 | 0.0000 | 0.0000 | 0.0000
IMD, % | 0.0002 | 0.0002 | 0.0002 | 0.0002
Stereo crosstalk, dB | -151.3 | -146.2 | -146.4 | -145.9
In the 24/96 mode the contenders are even farther from the ideal. The reason is most likely not a lack of bit-to-bit transfer precision, but the influence of the PLL in the transceivers or of dither processing. One way or another, it's impossible to hear the difference between the original and the received sound: the differences are below the thermal noise threshold of semiconductors, and any digital-to-analog conversion will introduce much more audible inaccuracies, since the intrinsic noise level of a modern DAC is hardly beyond 120 dB(A).
RightMark 3DSound 1.20 Tests
We test the DirectSound functions of the card because we have the opportunity. The RM 3DS tests demonstrate DirectSound compatibility, but also the lack of any software 3D sound algorithms. Just for information.
Device: Juli@ Ch12 (JulaWdm.sys)
Features:
 Device has not enough hardware 3D buffers
 Device has not enough hardware 2D buffers
 EAX1: N/A
 EAX2: N/A
 EAX3: N/A
 EAX4 Advanced HD: N/A

Rates:
 dwMinSecondarySampleRate 22050
 dwMaxSecondarySampleRate 192000

Free buffers stats:
 dwFreeHw3DAllBuffers 0
 dwFreeHw3DStaticBuffers 0
 dwFreeHw3DStreamingBuffers 0
 dwFreeHwMixingAllBuffers 0
 dwFreeHwMixingStaticBuffers 0
 dwFreeHwMixingStreamingBuffers 0

Max buffers stats:
 dwMaxHwMixingAllBuffers 1
 dwMaxHwMixingStaticBuffers 1
 dwMaxHwMixingStreamingBuffers 1
 dwMaxHw3DAllBuffers 0
 dwMaxHw3DStaticBuffers 0
 dwMaxHw3DStreamingBuffers 0

Misc stats:
 dwFreeHwMemBytes 0
 dwTotalHwMemBytes 0
 dwMaxContigFreeHwMemBytes 0
 dwUnlockTransferRateHwBuffers 0
 dwPlayCpuOverheadSwBuffers 0
Audio transfer speed (software): 3.289 Mb/sec.
Exclusive iXBT.com interview with the ESI R&D team!
iXBT.com: How did you come up with the idea to produce such an unusual sound card with a unique design as Juli@? Is it an attempt to stand out against similar sound cards and attract attention? What was the reason for breaking away from the multichannel solution we saw in the Waveterminal 192L/X (a Waveterminal 2496 descendant)?
ESI: Our main goal in developing Juli@ was to make a reference model card in terms of audio quality. We did research the market and the already available products. We found that users would need either balanced or unbalanced I/O connections, preferably with the same number of input and output channels, but all products support only one of each connector type. We came up with Juli@'s unique design with the user's needs in mind, so that the I/O connections are swappable between balanced and unbalanced. ESI plans to release new multi-channel solutions in the near future.

iXBT.com: What was the criterion for selecting these particular ADC and DAC models (AKM 5385A and 4358)? Why did you choose the 8-channel DAC (AKM4358)?
ESI: We have been using the AKM4358 in the development of other products such as the upcoming Audiotrak MAYA1010. We really liked the quality of the AKM4358, so we concluded to use it for Juli@ as well, despite the number of channels. This made it possible to use the additional channels to provide real-time monitoring for all S/PDIF and analog I/O signals, providing more functionality. As for the ADC, we always try to use the best available components, as recording is more important than playback for most professional users.

iXBT.com: Did you choose a certain price/quality threshold during development in order to reach a predetermined price (MSRP), or do your engineers follow some principle of reasonable sufficiency in the area of quality?
ESI: Of course, reaching a specific price is always a concern during development. In the case of Juli@, functionality and audio quality were the highest priorities. We are very glad that the final product is now available at a very competitive MSRP.

iXBT.com: How do you feel about the rate at which high sampling rate audio formats are evolving in music production, and about their demand among listeners/customers?

ESI: When we started to sell 24bit audio cards several years ago, most of our professional users immediately started to work at 24bit/44.1kHz. The higher resolution compared to previous 16bit products was a huge increase in audio quality useful for music production, even if the final product was just a 16bit/44.1kHz audio CD. Now, professionals are slowly starting to work with higher sample rates in the production process, not only with the higher bit depth. New formats such as DVD (audio or video) or SACD even require higher sample rates in the production process, so many professionals are working with 96kHz or even 192kHz already. ESI supports these professionals not only with soundcards but also with exciting new concepts, such as our M-Fire M9600 24bit/96kHz DVD Master Recorder. These days, when we talk to the members of our professional user community, we can see that the demand for a higher recording quality standard is getting more and more important, although we have to admit that this is a considerably slow process. Many professionals are also buying higher-spec hardware because they understand that it works better even when working with lower sample rates. For example, our 24bit/192kHz Juli@ will produce better results when used at 24bit/44.1kHz compared to other 24bit/96kHz devices, simply because of the much better ADC and DAC. For consumer products, the need to support 96kHz, or even 192kHz, is growing at the same time, mostly because of the necessity to play back already available consumer media. Windows Media Audio Professional supporting 24bit/96kHz is already a standard for PC audio. As more and more professionals produce music and media content at higher bit and sample rates, the actual demand for consumer products that support these features is also growing. Still, it is not growing that fast, probably because of the reluctance of large media corporations to invest in better audio quality without market-ready Digital Rights Management solutions.
iXBT.com: Can you tell us anything about the upcoming new generation of built-in sound named High Definition Audio, replacing AC'97? Does it threaten high-quality soundcards supporting high sampling formats, or will it, on the contrary, draw attention to high quality sound and promote high-quality sound devices?
ESI: We are aware of HD Audio, which sounds promising. But from our experience in the past, we suspect that there will always be some limitations to onboard sound. Most mainboard manufacturers will have to make compromises which will possibly lower the audio quality. We evaluated the Azalia (the former codename for HD Audio) demonstrated at the recent COMPUTEX Taipei 2004. There was certainly improvement in the specification over the outdated AC97. However, we thought the quality was not that impressive and not superior to AC97. We are confident that optional soundcards will be available in the future, probably mostly for audio enthusiasts and of course for professionals. Onboard audio still faces huge competition from leading companies and still doesn't provide a full substitute for high performance 3D audio and popular gaming surround sound standards such as EAX and Dolby Digital surround, or features like Advanced NSP, ESI's solution for native CPU processing of PC audio.

iXBT.com: What is your opinion of the E-MU revival in the sector of sound cards (especially the 1212M and 0404 models)? How do you evaluate the DSP features and the quality of converters in these sound cards, taking into account their prices? Will ESI respond with a solution featuring top-end converters at an affordable price?
ESI: We generally believe that onboard DSP solutions on audio cards will soon be history. Solutions like VST plugins or DirectX effects are providing excellent results for professional and even ultra high end professional effects, based on native CPU processing. Modern PCs are easily fast enough to provide better sounding effects and software synthesizer sounds compared to current DSP solutions. Right now, we are just at the beginning of this development; in the future, there will be fewer and fewer DSP-based products in the audio market, and native processing solutions will become even more important. In today's native processing environment, it is obvious that a powerful driver providing low CPU load and low latency is considered much more valuable by professionals than onboard DSP hardware. We also believe that we have developed great value for audio professionals and audio enthusiasts with Juli@, currently one of the best solutions on the market for digital recording because of its exceptional circuitry design. The excellent frequency response is unmatched by any other products in this and even in higher price ranges. Yet, Singapore-based Creative Labs has, with some of their E-MU brand products, achieved high dynamic range values on the input section; it is up to the customer whether he prefers a good frequency response or a slightly higher dynamic range, depending on his specific needs. Of course, ESI will provide even better AD and DA conversion quality for future professional audio products, e.g. with our upcoming multi-channel solutions. ESI was one of the first companies to introduce 192kHz and 7.1 channel sound cards, and we will continue to introduce new standards and features first. Other examples are unmatched features like our universal E-WDM drivers and of course DirectWIRE, a virtual digital routing solution with zero latency, exclusively available to users of ESI and Audiotrak products.
iXBT.com: How is ESI getting on? Should we expect new interesting products, such as Juli@, from ESI engineers in the future? Can you share some of your plans with us?
ESI: Actually, ESI is currently mostly focusing on developing products that are more market-friendly. For consumer products, the Audiotrak brand will release LP (Low Profile) type products which will inherit all the available features in a half-sized PCI board that will fit nicely in small form factor PC cases. We have the new MAYA 1010 waiting around the corner to be released, but these might not be as special as Juli@. We do plan to design a variety of rack casings to give our users more connection possibilities, both for the consumer and professional music markets. Our drivers and control panels will always have ESI exclusive original features like E-WDM and DirectWIRE, which have been as interesting to many people as Juli@'s brilliant swappable hardware design.
iXBT.com: We thank the entire ESI R&D team and personally Nikki Kichan Kang for the interview!
Conclusions
The professional sound card ESI Juli@ has an unusual design: it features both professional and consumer connectors, full MIDI and digital interfaces, high quality playback, excellent drivers, and quite a reasonable price, and it is available on sale. Taking into account its price and professional orientation, we found no drawbacks in this card. ESI engineers did a good job, but one can never stop at what has been accomplished; we look forward to new sound cards from ESI/Audiotrak with higher quality DACs, which will compete with EMU products in the same price range.
ESI Juli@ gets the 'Original Design' award according to the test results.
We thank ESI for the kindly provided sound card ESI Juli@.
Deciding Which Motor is Right For You
Deciding which type of motor you'll need may not be a simple task. There are lots of different kinds available today. Before you buy, there are a number of parameters that have to be addressed. So how can you properly go about it? This short article is written to help you determine which motor is best for your application.

First and foremost, you will need to know what voltage source is available in your application. Electric motors can be classified as either AC (Alternating Current) or DC (Direct Current). Alternating current types operate only on AC voltage, and direct current types operate only on DC voltage. There is also a universal motor that can operate on both AC and DC voltages.

Once you've identified which power source you have, you will need to decide which design works for the application. AC motors can be sub-divided into the following: single phase induction, three phase induction, two phase servo, and hysteresis synchronous. DC motors can be sub-divided into: brushless DC, brushed DC, and stepper types.

Next we need to understand the different characteristics of each type in order to properly match a motor to its application.
A single phase induction motor is connected to a single voltage line. An external capacitor is required to make this motor operate. The different types of single phase induction motors are distinguished by the method used to start them. The four basic types are: split phase, capacitor start, permanent split capacitor, and capacitor start/capacitor run.

A split phase motor uses a switching device to disconnect the start winding once the motor reaches 75% of its rated speed. Although this type has a simple design that makes it inexpensive for commercial use, it also has low starting torque and high starting current.

The capacitor start motor is essentially a split phase motor with a capacitor in series with the start winding to create more starting torque. This motor is more expensive on account of the switch and capacitor requirement.

A permanent split capacitor motor does not have a starting switch. For this type, a capacitor is permanently connected to the start winding. Because this capacitor is in continuous use, it does not provide starting power, so starting torques are typically low. These motors are not recommended for heavy starting load applications. However, they do have low starting currents, quieter operation, and longer life/reliability, making them a good choice for high cycle rates. They are also the most reliable capacitor motor on account of having no starting switch. They can also be designed for higher efficiencies and power factor at rated loads.

The capacitor start/capacitor run motor has both a start and a run capacitor in the circuit. The start capacitor is switched out once the motor completes start-up. This type of motor has higher starting torque, lower loaded currents, and higher efficiency. The drawback is the expense required for two capacitors and a switching device. Reliability also plays a factor on account of the switching mechanism.
The three phase induction motor is wound for three phase alternating voltage. These are the simplest and most rugged electric motors available. The motor may be designed for either DELTA or WYE hook-up. This type is made for continuous use and high starting torque. Motor speed is relatively constant. If three phase voltage is available, this is the motor to choose.

Two phase servo motors are used in servo systems, hence the name. They are very sensitive to voltage variations on the control phase. This design requires two voltages 90 degrees out of phase with each other in order to produce a rotating magnetic field. Servo motors have a high torque-to-inertia ratio and high speed, and they work well for velocity control applications. Tachometer feedback devices can be supplied with these motors.

Hysteresis synchronous motors are basically induction motors that run at synchronous speed. Whenever your application requires synchronous speed, this is the best choice. These motors can be designed for either single phase or three phase operation. For single phase voltage, a capacitor will be required. Hysteresis synchronous motors develop what are called pull-out and pull-in torques. Pull-out torque is the amount of torque/load the motor can handle just as it pulls out of synchronous speed. Pull-in torque is the amount of torque on the output shaft that allows the motor to pull into synchronism and stay there. Pull-in and pull-out torques are very similar. These motors have low starting currents and low vibration. Since the rotor assembly is made from a cobalt material, which is hard to come by, this type of motor is expensive.
The direct current (DC) motors that are available are brushless DC (BLDC), brushed, and stepper motors. When you only have DC voltage available, one of these motors should be used. Brushless DC motors have no brushes, so there are no issues of brush wear or sparking. Solid state controls and feedback devices are required for operation. These motors have predictable performance, high starting torque, and are capable of high speeds. Although more power output can be achieved in a smaller package, the electronic controls make this motor design expensive.

Unlike brushless motors, brushed DC motors do not require any control electronics. Brushed motors use a commutator and brushes to produce a rotating magnetic field. Although these motors are often low priced, brush and commutator wear limits their reliability and longevity.

Stepper motors are DC motors that move in discrete steps. If you require shaft positioning to be predictable, then stepper motors may be an option. These motors are reliable and low in cost. They are, however, limited in their ability to handle large inertia loads.
Once you've established the voltage and frequency source your system has available, you can determine the number of phases and the type of motor to consider. Next, you will need to give your motor design engineer the following to help select the best motor (a toy selection sketch follows the list below):
(1) Power Output/Horsepower: The designer will need to know the rated speed and torque parameters that your system requires.

(2) Frame Size: It is helpful for the designer to know the physical limitations in order to properly size the motor.

(3) Duty Cycle/Time rating: The amount of time the motor is running vs. the time it is not is an important criterion when designing the insulation systems of the motor.

(4) Environmental Conditions: It is always vital to inform the motor designer what environments the motor will see, so that the correct housing can be determined.
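The selection flow above can be summarized in a toy sketch. This is purely illustrative: the rules and type names are a simplification of the paragraphs above, not a manufacturer's sizing tool or engineering guidance.

```python
# Hypothetical helper mirroring the selection steps above; the rules are
# illustrative simplifications of this article, not engineering advice.
def suggest_motor(supply: str, phases: int = 1, needs_sync_speed: bool = False,
                  needs_positioning: bool = False) -> str:
    if supply == "DC":
        if needs_positioning:
            return "stepper"
        return "brushless DC (solid state controller required)"
    if supply == "AC":
        if needs_sync_speed:
            return "hysteresis synchronous"
        return "three phase induction" if phases == 3 else "single phase induction"
    raise ValueError("supply must be 'AC' or 'DC'")

print(suggest_motor("AC", phases=3))                 # -> three phase induction
print(suggest_motor("DC", needs_positioning=True))   # -> stepper
```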
As you can see, there are many different types of motors to choose from, and several factors go into the selection. By working with a design engineer you can make sure you get the best motor for the application. This is why it is important to find a manufacturer before finalizing any system design.
SEO Link Building & Establishing Authority
Crank up the SEO juice
You've created content that people are searching for, that answers their questions, and that search engines can understand, but those qualities alone don't mean it'll rank. To outrank the rest of the sites with those qualities, you have to establish authority. That can be accomplished by earning links from authoritative websites, building your brand, and nurturing an audience who will help amplify your content. Google has confirmed that links and quality content are two of the three most important ranking factors for SEO. Trustworthy sites tend to link to other trustworthy sites, and spammy sites tend to link to other spammy sites. But what is a link, exactly? How do you go about earning them from other websites? Let's start with the basics.

What are links?

Inbound links, also known as backlinks or external links, are HTML hyperlinks that point from one website to another. They're the currency of the Internet, as they act a lot like real-life reputation. If you went on vacation and asked three people (all completely unrelated to one another) what the best coffee shop in town was, and they all said, "Cuppa Joe on Main Street," you would feel confident that Cuppa Joe is indeed the best coffee place in town. Links do that for search engines. Since the late 1990s, search engines have treated links as votes for popularity and importance on the web. Internal links, or links that connect internal pages of the same domain, work very similarly for your website. A high number of internal links pointing to a particular page on your site will signal to Google that the page is important, so long as it's done naturally and not in a spammy way. The engines themselves have refined the way they view links, now using algorithms to evaluate sites and pages based on the links they find. But what's in those algorithms? How do the engines evaluate all those links? It all starts with the concept of E-A-T.

You are what you E-A-T

Google's Search Quality Rater Guidelines put a great deal of importance on the concept of E-A-T, an acronym for expert, authoritative, and trustworthy. Sites that don't display these characteristics tend to be seen as lower-quality in the eyes of the engines, while those that do are subsequently rewarded. E-A-T is becoming more and more important as search evolves and increases the importance of solving for user intent. Creating a site that's considered expert, authoritative, and trustworthy should be your guiding light as you practice SEO. Not only will it simply result in a better site, but it's future-proof. After all, providing great value to searchers is what Google itself is trying to do.

E-A-T and links to your site

The more popular and important a site is, the more weight the links from that site carry. A site like Wikipedia, for example, has thousands of diverse sites linking to it. This indicates it provides lots of expertise, has cultivated authority, and is trusted among those other sites. To earn trust and authority with search engines, you'll need links from websites that display the qualities of E-A-T. These don't have to be Wikipedia-level sites, but they should provide searchers with credible, trustworthy content. Metrics like Domain Authority, Page Authority, and Spam Score can help you evaluate this. In general, you'll want links from sites with a higher Domain Authority than your site.

Followed vs. nofollowed links

Remember how links act as votes? The rel=nofollow attribute (pronounced as two words, "no follow") allows you to link to a resource while removing your "vote" for search engine purposes.
Just like it sounds, "nofollow" tells search engines not to follow the link. Some engines still follow them simply to discover new pages, but these links don't pass link equity (the "votes of popularity" we talked about above), so they can be useful in situations where a page is either linking to an untrustworthy source or was paid for or created by the owner of the destination page. Say, for example, you write a post about link building practices, and want to call out an example of poor, spammy link building. You could link to the offending site without signaling to Google that you trust it.

Standard links (ones that haven't had nofollow added) look like this:

<a href="https://www.semrush.com/">I love SEMRush</a>

Nofollow link markup looks like this:

<a href="https://www.semrush.com/" rel="nofollow">I love SEMRush</a>

If followed links pass all the link equity, shouldn't that mean you want only followed links? Not necessarily. Think about all the legitimate places you can create links to your own website: a Facebook profile, a Yelp page, a Twitter account, etc. These are all natural places to add links to your website, but they shouldn't count as votes for your website. (Setting up a Twitter profile with a link to your site isn't a vote from Twitter that they like your site.) It's natural for your site to have a balance between nofollowed and followed backlinks in its link profile (more on link profiles below). A nofollow link might not pass authority, but it could send valuable traffic to your site and even lead to future followed links.

Tip: Use the SEMRush extension for Google Chrome to highlight links on any page to find out whether they're nofollow or follow without ever having to view the source code!
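If you'd rather script this check than use a browser extension, here is a minimal sketch assuming the third-party requests and beautifulsoup4 packages; the URL is just a placeholder.

```python
import requests
from bs4 import BeautifulSoup

def classify_links(url: str) -> None:
    # Fetch the page and parse its HTML.
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        # BeautifulSoup exposes rel as a list of tokens (it may be absent).
        rel = a.get("rel") or []
        kind = "nofollow" if "nofollow" in rel else "followed"
        print(f"{kind:>8}: {a['href']}")

classify_links("https://example.com/")
```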
Your link profile
Your link profile is an overall assessment of all the inbound links your site has earned: the total number of links, their quality (or spamminess), their diversity (is one site linking to you hundreds of times, or are hundreds of sites linking to you once?), and more. The state of your link profile helps search engines understand how your site relates to other sites on the Internet. There are various SEO tools that allow you to analyze your link profile and begin to understand its overall makeup.

How can I see which inbound links point to my website?

Use SEMRush and set up your site's URL. You'll be able to see how many and which websites are linking back to you.

What are the qualities of a healthy link profile?

When people began to learn about the power of links, they began manipulating them for their benefit. They'd find ways to gain artificial links just to increase their search engine rankings. While these dangerous tactics can sometimes work, they are against Google's terms of service and can get a website deindexed (removal of web pages or entire domains from search results). You should always try to maintain a healthy link profile. A healthy link profile is one that indicates to search engines that you're earning your links and authority fairly. Just like you shouldn't lie, cheat, or steal, you should strive to ensure your link profile is honest and earned via your hard work.

Links are earned or editorially placed

Editorial links are links added naturally by sites and pages that want to link to your website. The foundation of acquiring earned links is almost always through creating high-quality content that people genuinely wish to reference. This is where creating extremely high-quality content is essential! If you can provide the best and most interesting resource on the web, people will naturally link to it. Naturally earned links require no specific action from you, other than the creation of worthy content and the ability to create awareness about it.

Tip: Earned mentions are often unlinked! When websites refer to your brand or a specific piece of content you've published, they will often mention it without linking to it. To find these earned mentions, use SEMRush. You can then reach out to those publishers to see if they'll update those mentions with links.

Links are relevant and from topically similar websites

Links from websites within a topic-specific community are generally better than links from websites that aren't relevant to your site. If your website sells dog houses, a link from the Society of Dog Breeders matters much more than one from the Roller Skating Association. Additionally, links from topically irrelevant sources can send confusing signals to search engines regarding what your page is about.

Tip: Linking domains don't have to match the topic of your page exactly, but they should be related. Avoid pursuing backlinks from sources that are completely off-topic; there are far better uses of your time.

Anchor text is descriptive and relevant, without being spammy

Anchor text helps tell Google what the topic of your page is about. If dozens of links point to a page with a variation of a word or phrase, the page has a higher likelihood of ranking well for those types of phrases. However, proceed with caution! Too many backlinks with the same anchor text could indicate to the search engines that you're trying to manipulate your site's ranking in search results.
Tip: Use the "Anchor Text" report in SEMRush to see what anchor text other websites are using to link to your content.

Links send qualified traffic to your site

Link building should never be solely about search engine rankings. Esteemed SEO and link building thought leader Eric Ward used to say that you should build your links as though Google might disappear tomorrow. In essence, you should focus on acquiring links that will bring qualified traffic to your website, which is another reason why it's important to acquire links from relevant websites whose audience would find value in your site as well.

Tip: Use the "Referral Traffic" report in Google Analytics to evaluate websites that are currently sending you traffic. How can you continue to build relationships with similar types of websites?

Link building don'ts & things to avoid

Spammy link profiles are just that: full of links built in unnatural, sneaky, or otherwise low-quality ways. Practices like buying links or engaging in a link exchange might seem like the easy way out, but doing so is dangerous and could put all of your hard work at risk. A guiding principle for your link building efforts is to never try to manipulate a site's ranking in search results. But isn't that the entire goal of SEO? To increase a site's ranking in search results? And herein lies the confusion. Google wants you to earn links, not build them, but the line between the two is often blurry. To avoid penalties for unnatural links (known as "link spam"), Google has made clear what should be avoided.

Purchased links

Google and Bing both seek to discount the influence of paid links in their organic search results. While a search engine can't know which links were earned vs. paid for from viewing the link itself, there are clues it uses to detect patterns that indicate foul play. Websites caught buying or selling followed links risk severe penalties that will severely drop their rankings. (By the way, exchanging goods or services for a link is also a form of payment and qualifies as buying links.)

Link exchanges / reciprocal linking

If you've ever received a "you link to me and I'll link to you" email from someone you have no affiliation with, you've been targeted for a link exchange. Google's quality guidelines caution against "excessive" link exchange and similar partner programs conducted exclusively for the sake of cross-linking, so there is some indication that this type of exchange on a smaller scale might not trigger any link spam alarms. It is acceptable, and even valuable, to link to people you work with, partner with, or have some other affiliation with and have them link back to you. It's the exchange of links at mass scale with unaffiliated sites that can warrant penalties.

Low-quality directory links

These used to be a popular source of manipulation. A large number of pay-for-placement web directories exist to serve this market and pass themselves off as legitimate, with varying degrees of success. These types of sites tend to look very similar, with large lists of websites and their descriptions (typically, the site's critical keyword is used as the anchor text to link back to the submitter's site). There are many more manipulative link building tactics that search engines have identified. In most cases, they have found algorithmic methods for reducing their impact. As new spam systems emerge, engineers will continue to fight them with targeted algorithms, human reviews, and the collection of spam reports from webmasters and SEOs.
By and large, it isn't worth finding ways around them.
How to build high-quality backlinks
Link building comes in many shapes and sizes, but one thing is always true: link campaigns should always match your unique goals. With that said, there are some popular methods that tend to work well for most campaigns. This is not an exhaustive list, so visit Moz's blog posts on link building for more detail on this topic.

Find customer and partner links
If you have partners you work with regularly, or loyal customers that love your brand, there are ways to earn links from them with relative ease. You might send out partnership badges (graphic icons that signify mutual respect), or offer to write up testimonials of their products. Both of those offer things they can display on their website along with links back to you.

Publish a blog
This content and link building strategy is so popular and valuable that it's one of the few recommended personally by the engineers at Google. Blogs have the unique ability to contribute fresh material on a consistent basis, generate conversations across the web, and earn listings and links from other blogs. Careful, though — you should avoid low-quality guest posting just for the sake of link building. Google has advised against this and your energy is better spent elsewhere.

Create unique resources
Creating unique, high-quality resources is no easy task, but it's well worth the effort. High-quality content that is promoted in the right ways can be widely shared. It can help to create pieces that have the following traits:

- Elicits strong emotions (joy, sadness, etc.)
- Something new, or at least communicated in a new way
- Visually appealing
- Addresses a timely need or interest
- Location-specific (example: the most searched-for Halloween costumes by state)

Creating a resource like this is a great way to attract a lot of links with one page. You could also create a highly specific resource — without as broad of an appeal — that targets a handful of websites. You might see a higher rate of success, but that approach isn't as scalable. Users who see this kind of unique content often want to share it with friends, and bloggers/tech-savvy webmasters who see it will often do so through links. These high-quality, editorially earned votes are invaluable to building trust, authority, and rankings potential.

Build resource pages
Resource pages are a great way to build links. However, to find them you'll want to know some advanced Google search operators to make discovering them a bit easier. For example, if you were doing link building for a company that made pots and pans, you could search for: cooking intitle:"resources" and see which pages might be good link targets (a small prospecting sketch follows this section). This can also give you great ideas for content creation — just think about which types of resources you could create that these pages would all like to reference/link to.

Get involved in your local community
For a local business (one that meets its customers in person), community outreach can result in some of the most valuable and influential links. Engage in sponsorships and scholarships. Host or participate in community events, seminars, workshops, and organizations. Donate to worthy local causes and join local business associations. Post jobs and offer internships. Promote loyalty programs. Run a local competition. Develop real-world relationships with related local businesses to discover how you can team up to improve the health of your local economy. All of these smart and authentic strategies provide good local link opportunities.
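Returning to the resource-page tactic: if you're prospecting at any scale, it can help to generate the operator queries programmatically rather than typing them one by one. A small sketch — the keyword list is a made-up example for the pots-and-pans niche, and the operators are standard Google search syntax:

    # Generate Google search-operator queries for finding resource pages.
    keywords = ["cooking", "cookware", "kitchen equipment"]  # hypothetical niche terms
    patterns = ['{kw} intitle:"resources"', '{kw} inurl:links', '{kw} "useful links"']

    queries = [p.format(kw=kw) for kw in keywords for p in patterns]
    for q in queries:
        print(q)  # paste each into Google and vet the results by hand

The vetting still has to be manual — the point is only to make sure you don't miss obvious query variations.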
Refurbish top content
You likely already know which of your site's content earns the most traffic, converts the most customers, or retains visitors for the longest amount of time. Take that content and refurbish it for other platforms (Slideshare, YouTube, Instagram, Quora, etc.) to expand your acquisition funnel beyond Google. You can also dust off, update, and simply republish older content on the same platform. If you discover that a few trusted industry websites all linked to a popular resource that's gone stale, update it and let those industry websites know — you may just earn a good link. You can also do this with images. Reach out to websites that are using your images and not citing/linking back to you and ask if they'd mind including a link.

Be newsworthy
Earning the attention of the press, bloggers, and news media is an effective, time-honored way to earn links. Sometimes this is as simple as giving something away for free, releasing a great new product, or stating something controversial. Since so much of SEO is about creating a digital representation of your brand in the real world, to succeed in SEO, you have to be a great brand.

Be personal and genuine
The most common mistake new SEOs make when trying to build links is not taking the time to craft a custom, personal, and valuable initial outreach email. You know as well as anyone how annoying spammy emails can be, so make sure yours doesn't make people roll their eyes. Your goal for an initial outreach email is simply to get a response. These tips can help:

- Make it personal by mentioning something the person is working on, where they went to school, their dog, etc.
- Provide value. Let them know about a broken link on their website or a page that isn't working on mobile.
- Keep it short.
- Ask one simple question (typically not for a link; you'll likely want to build a rapport first).

Earning Links
Earning links can be very resource-intensive, so you'll likely want to measure your success to prove the value of those efforts. Metrics for link building should match up with the site's overall KPIs. These might be sales, email subscriptions, page views, etc. You should also evaluate Domain and/or Page Authority scores, the ranking of desired keywords, and the amount of traffic to your content.
Beyond links: How awareness, amplification, and sentiment impact authority
A lot of the methods you'd use to build links will also indirectly build your brand. In fact, you can view link building as a great way to increase awareness of your brand, the topics on which you're an authority, and the products or services you offer. Once your target audience knows about you and you have valuable content to share, let your audience know about it! Sharing your content on social platforms will not only make your audience aware of your content, but it can also encourage them to amplify that awareness to their own networks, thereby extending your own reach.

Are social shares the same as links? No. But shares to the right people can result in links. Social shares can also promote an increase in traffic and new visitors to your website, which can grow brand awareness, and with a growth in brand awareness can come a growth in trust and links. The connection between social signals and rankings seems indirect, but even indirect correlations can be helpful for informing strategy.

Trustworthiness goes a long way
For search engines, trust is largely determined by the quality and quantity of the links your domain has earned, but that's not to say that there aren't other factors at play that can influence your site's authority. Think about all the different ways you come to trust a brand:

- Awareness (you know they exist)
- Helpfulness (they provide answers to your questions)
- Integrity (they do what they say they will)
- Quality (their product or service provides value; possibly more than others you've tried)
- Continued value (they continue to provide value even after you've gotten what you needed)
- Voice (they communicate in unique, memorable ways)
- Sentiment (others have good things to say about their experience with the brand)

That last point is what we're going to focus on here. Reviews of your brand, its products, or its services can make or break a business. In your effort to establish authority from reviews, follow these review rules of thumb:

- Never pay any individual or agency to create a fake positive review for your business or a fake negative review of a competitor.
- Don't review your own business or the businesses of your competitors. Don't have your staff do so, either.
- Never offer incentives of any kind in exchange for reviews.
- All reviews must be left directly by customers in their own accounts; never post reviews on behalf of a customer or employ an agency to do so.
- Don't set up a review station/kiosk in your place of business; many reviews stemming from the same IP can be viewed as spam.
- Read the guidelines of each review platform where you're hoping to earn reviews.

Be aware that review spam is a problem that's taken on global proportions, and that violation of governmental truth-in-advertising guidelines has led to legal prosecution and heavy fines. It's just too dangerous to be worth it. Playing by the rules and offering exceptional customer experiences is the winning combination for building both trust and authority over time.
Conclusion
Authority is built when brands are doing great things in the real world, making customers happy, creating and sharing great content, and earning links from reputable sources. Read the full article
0 notes
Text
[Jungian Cognitive Functions] INTP and INFP similarities
INTP vs. INFP: Similarities, Differences, & Paths to Growth

Quote: --- As touched on in my post, Introverted Feeling (Fi) vs. Introverted Thinking (Ti), both Fi and Ti use a subjective approach to judging; that is, their preferred evaluative criteria derive from the self rather than from external sources. Both Fi and Ti make independent subjective judgments, the validity of which is experienced as self-evident. They are thus relatively unconcerned with the degree to which others agree with their methods or conclusions. The upshot of this is that both INTPs and INFPs are apt to be idiosyncratic and unconventional. Both types exhibit a deep concern for staying true to themselves—their own ideas, interests, values, and methods—rather than blindly conforming to the world around them. The appearance of adaptability conferred by their P preference belies their inner craving for authenticity and self-direction.

The idea that INPs are spontaneous with respect to the outside world is in many respects fallacious. They may be spontaneous in responding to their own inner impulses, but they are inclined to avoid or flee from externalities that have the potential to disrupt the peace and comfort derived from following their own inner compass (which is why many INPs score high as Enneagram Nines). Part of being a self-guided individual is clarifying one’s beliefs, values, and identity. This is why INPs invest so much time working to understand themselves and figure out what they believe; in order to effectively be themselves, they must first know themselves. This of course is where typology often proves useful, furnishing insight into the essential nature of their personality. ---

Quote: --- While Fi / Ti compels INPs to clarify and hold true to their own path, their auxiliary function, Extraverted Intuition (Ne), prompts them to explore external ideas and possibilities that inform, enrich, or otherwise interface with that path. Indeed, even the most introverted INPs eventually tire of running around in their own minds and feel compelled to redirect their gaze outward. Not only is Ne keen to entertain and absorb circulating ideas, but it excels at seeing patterns and connections among those ideas. Exploring new ideas and possibilities can be deeply refreshing and invigorating for INPs, particularly after extensive periods of solitude or self-absorption. In many respects, Ne functions like a “reset button” for INPs, removing their introverted blinders and offering them a fresh set of ideas to explore.

Ne can also have the effect of opening up or casting doubt on previously established judgments, which can be both a blessing and a curse for INPs. On the one hand, INPs appreciate the novelty and refreshment that Ne can bring. On the other hand, Ne is a potent destabilizer, injecting doubt, even chaos, into INPs’ self-understanding and worldview. As discussed in my post, INTPs’ & INFPs’ Quest for Convergence & Certainty, INPs who regularly employ Ne discover that, however earnest their attempts, ideational certainty perpetually eludes them. When it comes to Ne, there is only one thing one can be certain of: uncertainty. Because of the divergent and unpredictable nature of Ne, INPs are best understood as seekers and creatives rather than as knowers or doctrinaires. In my view, this is one of the most important things INPs can understand and embrace about themselves.
Among other things, it can help them let go of the idea that, in order to be successful or move forward with their lives, they must first arrive at firm answers to all their questions. Put differently, self-identifying as seekers / creators helps INPs avoid the pitfalls of what we might call “NJ envy,” that is, of trying to operate as convergent knowers. ---

Quote: --- As discussed in my book, The 16 Personality Types, all types struggle to effectively navigate the tensions and power struggles transpiring among their four functions. Of particular salience is the struggle between the dominant and inferior function, which represents the greatest power imbalance within the functional stack. Because INTPs’ dominant function is Ti, they often feel disconnected from the world of feeling, as well as the sense of meaning that feeling confers. Hence, one of their deepest fears is that life will prove to be utterly meaningless and that they will thus be condemned to a nihilistic existence. To assuage this fear, the psyche prompts INTPs to engage with the F world, be it through interpersonal relationships or in less direct ways, such as exploring subjects like philosophy, psychology, literature, religion, etc.

The INFP’s deepest fear is in many respects the opposite of the INTP’s. Rather than being disconnected from F matters, the INFP feels estranged from, or insecure about, the world of T. INFPs tend not to fret about life lacking meaning or value, but about things like structure and organization, time and financial management, and other logistical matters. In an attempt to overcome their T shortcomings, they commonly take an interest in subjects like math, science, computers, engineering, law, finance, accounting, etc. Doing so helps temper their T concerns, reassuring them that psychological wholeness is within reach and that they will never be cut off from their Te function.

That being said, we know that not all INPs pattern their careers around the needs and desires of their inferior function. In reality, we may find similar numbers of INTPs and INFPs in both the sciences and humanities. However, I suspect that INTPs studying the humanities are more likely to mistype as F types, and vice-versa for INFPs in the sciences. Both types may see themselves as more capable with respect to their inferior function than they actually are. This is just one of many ways in which the inferior function can generate type confusion among INPs, causing them to mistake a dream (i.e., having a developed and reconciled inferior function) for reality. Of course, this is not to say that these types cannot grow and develop their inferior functions, but only that their self-assessments are often skewed by inferior function ideals.

There are at least a couple of ways INPs may approach the challenge of integrating their dominant and inferior functions. The lowest-hanging fruit is to use a piecemeal approach, attempting to satisfy both functions separately, as commonly seen in Phase II of type development. INTPs taking this approach will often try to satisfy their Fe through a relationship and their Ti through some form of self-directed work. Similarly, INFPs may satisfy their Fi by caring for children or pets, while simultaneously working to advance their career (Te).
An alternative route to reconciling the dominant and inferior functions is by employing and developing what we might call the “bridge functions” (i.e., the auxiliary and tertiary functions), which are sandwiched between the dominant and inferior functions in the functional stack. As we’ve seen, these functions are identical for these two types. Thus, INPs hoping to build a bridge between their dominant and inferior functions—which in my view is a more effective and sustainable route to individuation—will share a lot in common.

Perhaps the most important function with respect to INPs’ growth and development is Ne, the function of exploration and creativity. Rarely is it clear to INPs exactly how Ne can move them toward psychological wholeness. This lack of clarity, in combination with the destabilizing effects of Ne, helps us understand why letting go of the piecemeal approach can prove difficult for INPs. So despite the fact that building an Ne-Si bridge is a more reliable route to integration and wholeness for INPs, it requires a greater measure of faith, patience, and courage, since the inferior function is being approached in a more subtle and less direct way.

This harkens back to my earlier point about INPs self-identifying as seekers / creators (Ne) vs. convergent knowers. Seeing themselves as knowers would be suggestive of identification with their inferior extraverted judging function (Fe or Te). But the truth is that INPs can’t authentically reach a point of knowing without first employing their top three functions. Before they can know anything with confidence, they must introspect (Fi / Ti), explore related ideas (Ne), and consider past information (Si). Not only that, but knowing often proves to be less rewarding for INPs than the process of seeking and creating—something they may not realize until the dog actually catches its own tail. Little is more deflating for these types than the sense that there is nothing more for them to explore or create. This sentiment is nicely captured in Dostoevsky’s Notes from Underground: “Man loves creating…But why does he so passionately love destructions and chaos as well?…Can it be that he has such a love…because he is instinctively afraid of achieving the goal and completing the edifice he is creating? How do you know, maybe he likes the edifice only from far off, and by no means up close; maybe he only likes creating it, and not living in it…” ---

Introverted Feeling (Fi) vs Introverted Thinking (Ti)
http://www.typologycentral.com/forums/myers-briggs-and-jungian-cognitive-functions/90112-intp-infp-similarities-new-post.html?utm_source=dlvr.it&utm_medium=tumblr
2 notes
·
View notes
Text
Creative Super X-Fi AIR Review: Spatial Audio Done Right
Our verdict of the Creative Super X-Fi AIR: Get the Creative Super X-Fi AIR if you need a new pair of headphones to use at home or in the office and crave the spatial audio experience. Reconsider if you want to heavily use them on-the-go, need noise-cancellation, or value a long battery life. Rating: 8/10
Immersive surround sound, packed into a pair of comfortable headphones; it’s the quiet audiophile’s dream. Creative’s Super X-Fi Headphone Holography technology promises to deliver just that. We received an impressive demo of the technology at CES 2020 and took home a pair of Creative SXFI AIR headphones, valued at $159.99, to put them to the test. Here’s what we found.
Creative SXFI AIR Specifications
Design: closed over-ear
Drivers: 50mm Neodymium magnet
Frequency Response: 20Hz-20,000Hz
Impedance: 32 ohms
Connectivity: Bluetooth 4.2, USB-C, Line-in
MicroSD Card Slot: supports MP3, WMA, WAV, and FLAC formats
Microphone: detachable NanoBoom microphone
Colors: white and black, plus 16.7 million colors for the RGB ear-cup rings
Weight: 338g (11.9oz)
Battery Life: up to 10 hours
Charging time: 2.5 hours
Price: RRP $159.99, on sale for $139.99 at the time of writing
First Impressions
The SXFI AIR wireless over-ear headphones are fairly lightweight for their size (11.9oz or 338g). Each side of the headband expands up to 1.5 inches (4 cm) and the ear cups swivel slightly to provide a perfect fit for any head size or shape. The foam-cushioned headband and ear cups complete a comfortable experience.
Creative placed all control options on the left ear cup and an RGB light strip graces the outer edge of both cups. The light comes on when you’re using the headphones and you can customize the colors via the SXFI AIR Control app. Unfortunately, this is a separate app from the one you’ll need to set up your Super X-Fi profile; more on that below.
Overall, the Creative SXFI AIR headphones appear bulky and the absence of a folding mechanism means they’ll take up lots of space in your bag. The lack of active noise-cancellation makes them work best in a quiet environment. That said, the build quality appears solid enough to tolerate an active lifestyle.
Size comparison: Creative Super X-Fi Air vs. Sony WH-1000XM2 vs. TaoTronics BH22
Headphone Ports and Controls
Along the rim of the left ear cup you’ll find (from front to back):
on/off button
detachable NanoBoom microphone
USB-C port
3.5mm AUX jack
source selection (Bluetooth, USB, or SD card)
microSD card slot
Super X-Fi toggle
The touchpad on the side of the left ear cup lets you control playback and volume or accept calls.
The NanoBoom microphone plugs into its own 3.5mm AUX jack and you can replace it with a better quality gaming microphone if you like. Alternatively, you can take it out and plug up the port.
Two things make these headphones special: the built-in Super X-Fi technology (see below) and the audio source options. The USB-C charging port doubles as audio input for your PC, Mac, PS4, or Nintendo Switch. In addition to Bluetooth and the standard 3.5mm AUX jack, the SXFI AIR also functions as its own audio player when you insert a microSD card; a rare feature in headphones.
How Does Super X-Fi Work?
Super X-Fi is an AI-based technology that uses acoustic modeling to create the impression of spatial audio inside a pair of headphones. The technology is backed up by decades of research.
The Theory
The perception of sound is incredibly subjective and depends, in no small part, on the shape of your outer ears. Scientists describe this with the head-related transfer function (HRTF).
“Humans have just two ears, but can locate sounds in three dimensions – in range (distance), in direction above and below (elevation), in front and to the rear, as well as to either side (azimuth). This is possible because the brain, inner ear and the external ears (pinna) work together to make inferences about location.” Source: Wikipedia
Creative’s Take
Software can simulate those spatial cues if it knows enough about the listener’s personal HRTF. Creative uses either an in-ear acoustic measurement or a picture-based head map to feed its AI with that data, which then calculates a custom-fit audio profile for every user. Once you’ve connected your SXFI profile, the SXFI software can process incoming audio to sound as if you were using a surround sound speaker setup.
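Conceptually, the personalization boils down to filtering each audio channel with head-related impulse responses (HRIRs) measured or predicted for your ears. The toy Python sketch below shows that core operation — this illustrates the general technique, not Creative’s actual engine, and the impulse responses here are random stand-ins you’d replace with data from a real HRTF dataset:

    import numpy as np
    from scipy.signal import fftconvolve

    def binauralize(mono, hrir_left, hrir_right):
        # Convolving a sound source with left/right impulse responses makes it
        # appear to come from the direction the HRIRs were measured at.
        left = fftconvolve(mono, hrir_left)
        right = fftconvolve(mono, hrir_right)
        return np.stack([left, right], axis=1)  # stereo output for headphones

    rng = np.random.default_rng(0)
    source = rng.standard_normal(48_000)  # 1 second of noise at 48 kHz
    out = binauralize(source, rng.standard_normal(256), rng.standard_normal(256))
    print(out.shape)  # (48255, 2)

A personalized system like Super X-Fi effectively chooses (or synthesizes) those impulse responses to match your individual ears, which is what the head-mapping step feeds into.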
SXFI Gen2
In early 2020, Creative launched Gen2 of its AI, which brought a range of improvements to Super X-Fi, including the preservation of more sound details, better audio fidelity, clearer sound signature, and higher positioning accuracy, which is key for movies and games.
“At the beginning, our user profile base was in the range of tens of thousands. Over the past year, riding on the successful launch of Super X-Fi, we have accumulated hundreds of thousands of user profiles. This significant jump in real-world data has provided us with the capacity for more research, and importantly, it has enabled us to train the Super X-Fi AI engine to be even more accurate in personalizing the audio experience for our users. This was key in our development of the Super X-Fi Gen2 profile.” —Lee Teck Chee, Vice-President of Technology at Creative and inventor of Super X-Fi
The Gen2 engine also features more efficient power consumption. As a result, the SXFI AIR headphones gained more than 10% battery life. This explains why our headphones, using Bluetooth, lasted for a little more than the advertised 10 hours; a pleasant surprise. I also should note that the headphones turn off rather quickly when not in use. And every single time that happens, I’m startled by the voice that seemingly pops up right next to me to announce “powering off”.
While the SXFI AIR are excellent all-around headphones, you shouldn’t use them for first-person-shooter (FPS) games. Instead, look into the SXFI GAMER headset, which runs a Super X-Fi engine designed specifically for FPS sound environments.
How to Create Your Super X-Fi Profile
The spatial audio experience generated by the Super X-Fi AI incorporates your head shape into its calculations. To create your custom profile, you can use the SXFI App to take pictures of your face and ears. Download the SXFI App (Android, iOS), create an account, and log in. Under Personalize, click Start Head Mapping, and follow the on-screen instructions.
We found that the quality of the pictures had an impact on the resulting audio. So make sure you take the pictures with sufficiently good lighting. It’s also easier when you can get someone else to take your pictures for you.
Once you have created your profile, connect your SXFI AIR headphones to your phone via Bluetooth. Then return to Personalize and select your profile. The app will automatically sync your data to the connected headphones. Now you’re ready to enjoy 3D spatial audio.
There’s a second way to create your personal profile. When you’re lucky enough to receive a SXFI demo, as we did at CES 2020, they will measure your personal audio perception using an in-ear microphone. Creative said they might offer this acoustic measurement as a premium service in the future. Until then, you’ll have to use the app.
The SXFI Sound Experience
Creative’s Super X-Fi demo at CES 2020 blew us away. For the demo, we were using Creative’s SXFI AMP with a pair of generic headphones. At one point during the setup, I briefly thought I heard an audio cue from an external speaker. When I remembered I was already wearing headphones, I was sold on the technology and the remaining demonstration convinced me further.
Bluetooth vs. Wired Connection
The experience we had during the demo was outstanding. And at first, I couldn’t quite replicate it with the SXFI AIR headphones, although I used the higher quality audio profiles created during the demo. However, when I switched from Bluetooth to wired audio transmission via USB and from Spotify to a high quality audio source, the sound improved dramatically.
Note: If you have a Spotify subscription, you’ll be streaming at up to 320 kbps (vs. 160 kbps for free users). Make sure you set the streaming quality to “very high” in your Spotify settings.
Unfortunately, Creative opted for the low-quality SBC codec for wireless streaming. Its maximum transfer rate is 320 kbps, making it just good enough for Spotify, though you may experience data loss. SBC simply won’t let you enjoy the full audio quality provided by a lossless format like FLAC.
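For context on why that cap hurts: uncompressed CD-quality audio runs at about 1,411 kbps, and FLAC typically shrinks that losslessly to roughly half — still far above SBC’s 320 kbps ceiling. A quick back-of-the-envelope check in Python (the 50-60% FLAC compression figure is a common approximation, not a fixed spec):

    # Raw bitrate of CD-quality PCM audio: 44,100 samples/s * 16 bits * 2 channels
    raw_kbps = 44_100 * 16 * 2 / 1000
    print(raw_kbps)         # 1411.2 kbps
    print(0.55 * raw_kbps)  # ~776 kbps -- a typical FLAC bitrate, over twice SBC's cap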
In-ear Measurement vs. Head Map Audio Profile
Personally, I could tell a difference in quality between the audio profile generated during CES and the app-based audio profile. It was more subtle when I took the head map pictures in great lighting. Creative says that the picture-based audio profile reaches about 90% of the accuracy of an in-ear measurement.
Music vs. Entertainment vs. Calls
I most enjoyed the SXFI AIR headphones when listening to music. With default settings, it gives rock and pop songs a club-like sound with good bass and a satisfying 3D effect. For classical music, the overall quality and spatial audio effect were even more convincing.
Audio calls were the least pleasant as the speaker on the other end sounded tinny. Movies and TV shows were somewhere in the middle. I couldn’t shake off the “sound in a can” impression, but the spatial effects were clearly audible.
If you’re not satisfied with the sound, be sure to download the SXFI AIR Control app (Android, iOS) and use the Equalizer to customize your audio experience. It comes with presets for movies, games, as well as classical and pop music. Within this app, you can also disable or customize your AIR’s RGB ring lights.
It’s regrettable that you need a separate app to create your audio profile, but once that’s done, you’ll only need the SXFI AIR Control app to switch between existing profiles and customize your headphone settings.
How Does the SXFI AIR Compare?
Creative isn’t the only company that has come out with a spatial audio product.
Sony 360 Reality Audio
At IFA 2019 in Berlin, we received a quick demo of Sony’s 360 Reality Audio, which also uses a head map to generate a custom audio profile. While Sony’s object-based spatial audio technology works with any headphones and requires no additional hardware, it doesn’t work with just any audio source. 360 Reality Audio is its own music format.
Presently, you’ll need a subscription to a premium music streaming service like Deezer or Tidal, before you can access Sony’s 3D audio format. You’ll also need to use its mobile app to create your custom audio profile. In other words, you’re limited to music available on a streaming service via your smartphone. On the bright side, you can opt for an affordable monthly subscription and see how you like the effect.
During our demo in Berlin, we were impressed with the clarity of the 3D effect, but we noticed a drop in bass compared to the unaltered track.
Dolby Headphone
This mobile surround solution is part of the audio decoders found in many surround sound gaming headsets, including the HyperX Cloud series. You can get a HyperX Cloud II surround sound headset for as little as $99.
The Dolby Headphone technology has been around for over twenty years. Like Creative’s Super X-Fi, it applies HRTFs, but it went with a one-size-fits-all model that lacks user-specific customization.
Our Super X-Fi AIR Verdict
The SXFI AIR are a great pair of headphones for audiophiles who desire an immersive listening experience for music, entertainment, and non-FPS gaming. These headphones will let you privately enjoy surround sound, without bothering your neighbors.
However, the audio quality suffers when using Bluetooth or low-quality audio files, which is why we can only recommend the SXFI AIR if you’re prepared to use a wired connection or can transfer your media to a microSD card. Moreover, given that the headphones don’t offer active noise-cancellation (ANC), the battery life is slightly disappointing. The lack of ANC also limits their use in public spaces.
Get the Creative Super X-Fi AIR if you crave the spatial audio experience in a quiet environment and don’t mind its limitations.
If you already have a great pair of headphones, but want to experience the Super X-Fi sound, try the SXFI AMP, priced at $149.99. It contains the same technology, but it’s completely wired, using USB input from your phone, computer, or gaming console, and a 3.5mm audio jack output to your headphones.
Read the full article: Creative Super X-Fi AIR Review: Spatial Audio Done Right
Creative Super X-Fi AIR Review: Spatial Audio Done Right posted first on grassroutespage.blogspot.com
0 notes
Text
New 4K UHD Home Cinema projectors 2019
Of course, competitiveness requires companies to constantly expand their product range. This axiom holds for all segments of consumer electronics, and Home Cinema projectors are no exception. This year, the range of projectors has expanded significantly. The format of this review does not allow us to consider the complete list, but it includes the most promising models from major market leaders:
- Epson 4K PRO-UHD Home Cinema 5050UB/5050UBe and 6050UB;
- BenQ HT3550 and Premium CinePro Series (HT8060/HT9060);
- Optoma EH412/EH412ST;
- ViewSonic LS900WU;
- NEC P605UL Projector.
Epson Home Cinema 5050UB/5050UBe and 6050UB
Actually, the Epson Pro Cinema 6050UB in black is an improved version of the 5050UB. This model costs nearly $4,000, but provides an unprecedented rated contrast ratio of 1,200,000:1, supports professional calibration for ISF Day and Night modes, and adds an aspect-ratio setting for a standalone anamorphic lens. Moreover, the 6050UB comes with an extra lamp, a ceiling mount, and a cable cover. But it is available through CEDIA and other specialty dealers only.

In fact, the 5050UB replaces the very popular 5040UB projector of 2016. Its design has not changed. The Epson 5050UB was the next step in improving the company’s projectors. The 5050UB costs nearly $2,700 and provides brightness of 2,600 lm for color and white with 1920 x 1080 resolution. But, of course, the improved 4K-Enhancement pixel-shift technology is its main advantage. The model uses an individual pixel plate with tighter tolerances. It increases the angle of the leading and trailing edges of the electronic shift pulse, reducing the delay between the trough and crest of the pixel-shift changes. As a result, the new projector has less downtime, providing more light output and a faster signal. In fact, the 5050UB provides better accuracy, more output, and higher speed.

According to the company, the maximum supported resolution of the 5050UB reaches DCI 4K (4096 x 2160) vs 3840 x 2160 for the 5040UB. Moreover, the 5050UB provides improved HDR support by automatically detecting an HDR standard and supporting customization from the menu. Of course, the projector supports many other innovative technologies from Epson. With high probability, the excellent color accuracy and contrast, wide lens shift, and integrated wireless HDMI will make this model a bestseller in the segment of inexpensive 4K PRO-UHD projectors.
BenQ HT3550
Another market leader introduced a great 4K HDR projector, which claims to lead the list of the best budget Home Theater models. The BenQ HT3550 continues the very popular series that includes the BenQ HT3050, HT2050, and HT2050a. It appeared in April 2019 at a price of only $1,500. At the same time, this projector provides brightness of 2,000 lumens and 30,000:1 contrast (full on/off with dynamic iris on), uses a Dynamic Iris, and supports Ultra HD 4K (3840 x 2160) resolution! Today, this model successfully competes with the popular Epson 5050UB, which is more than $1,000 more expensive.

Main pros:
- a TI 0.47-inch chip with four-phase pixel shifting provides 4K UHD (3840 x 2160) resolution;
- the projector supports the HDR10 and HLG HDR standards;
- the new 0.47-inch next-generation DLP XPR chip minimizes the dark frame around the image. Today, the company also uses it in the HT5550;
- out-of-the-box color accuracy reaches less than 3 Delta E for Rec.709, with 100% Rec.709 coverage and 95% DCI-P3 coverage.

In addition, a 10-element, 8-group, all-glass 1.3x zoom lens projects a 100-inch diagonal image from a throw distance of nearly 8.25 to 10.75 ft (Throw Ratio of 1.13:1 - 1.47:1 (D:W); see the quick check below). Unfortunately, BenQ’s default settings provide a smaller gamut than the maximum possible (95% DCI-P3 vs 105% DCI-P3), and input lag is too high for some modern games.
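As a sanity check on those throw numbers: throw distance is simply throw ratio × image width, and for a 16:9 image the width is the diagonal × 16 / √(16² + 9²) ≈ 0.872. A quick Python sketch (the figures are the HT3550’s published specs; the helper function is my own naming):

    import math

    def throw_distance_ft(throw_ratio, diagonal_in):
        # For a 16:9 image: width = diagonal * 16 / sqrt(16^2 + 9^2).
        width_in = diagonal_in * 16 / math.hypot(16, 9)
        return throw_ratio * width_in / 12  # inches -> feet

    # HT3550 zoom range of 1.13:1 to 1.47:1 at a 100-inch diagonal:
    print(round(throw_distance_ft(1.13, 100), 2))  # ~8.21 ft
    print(round(throw_distance_ft(1.47, 100), 2))  # ~10.68 ft

Those results line up with the quoted 8.25-10.75 ft range, and the same formula works for planning around any of the projectors in this roundup.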
Other BenQ HT3550 key features:
- six-segment RGBRGB color wheel;
- 100% Rec.709 color gamut in D. Cinema mode, 97% Rec.709 in Cinema mode at a higher brightness, and 95% DCI-P3 in the preset version;
- dynamic iris modes: Low, Middle, High, or Off;
- +10% vertical lens shift and ±30° Vertical Keystone Correction;
- two 18 Gbps HDMI 2.0b ports with HDCP 2.2;
- 4 color preset modes, one user mode for SDR, and modes for 3D, HDR10, and HLG;
- lockable ISF Night and Day mode support;
- silence mode blocks the pixel-shift option, reducing the resolution to 1080p;
- the color control system provides settings for RGBCMY hue, saturation, and gain, and adjusts white balance for RGB gain and offset;
- 5-position HDR Brightness control;
- CineMaster video processing includes color enhancement, flesh tones, detail enhancement, and frame interpolation;
- two built-in 5-watt speakers;
- Full HD 3D playback support;
- backlit remote;
- Lamp Life of 4,000/10,000/15,000 hours (Normal/Eco/SmartEco modes); a replacement lamp costs $150; 3-year warranty, with 1 year on the lamp.

The video at the end demonstrates the unboxing and main specs of the BenQ HT3550 4K HDR Home Theater projector.
BenQ Premium CinePro Series 4K UHD HDR Home Cinema Projectors
The company also introduced the BenQ 4K UHD HDR HT8060 and HT9060 in 2019 for $8,000 and $9,000. They have almost the same design, but, of course, different specs and functionality. The CinePro Series uses the latest DLP chipset that’s used in IMAX theaters. It virtually eliminates pixel shifting and blending, providing 8.3 million distinct pixels for 4K UHD performance.

The HT8060 with BenQ’s CinematicColor provides 100% coverage of the Rec.709 color gamut. The HT9060 combines CinematicColor with the Philips ColorSpark HLD LED system, expanding the color space to super-wide DCI-P3. In addition, this technology eliminates the decrease in lamp brightness over time. Both projectors are compatible with the optional Panamorph Paladin anamorphic lens with its 2.4:1 aspect ratio, delivering 2 million more pixels for increased brightness and detail. In addition, the projectors in this series use a patented 14-element, six-group, 4K-optimized lens that keeps the original brightness while minimizing chromatic aberration.

Both models provide a brightness of 2,200 lm at a 50,000:1 (full on/off) contrast ratio with 3840 x 2160 Ultra HD native resolution. In addition, the projectors support Full HD 3D and Horizontal ±27.0% & Vertical ±65.0% Lens Shift. Optics with a Throw Ratio of 1.36:1 - 2.03:1 (D:W) project a 150-inch image diagonally from a Throw Distance of 18 ft at a 1.25x Zoom Range. The Lamp Life of the Metal Halide lamp in the HT8060 is traditionally 3,000/6,000 hours in normal/eco modes, but reaches 20,000 hours for the LEDs in the HT9060. But the price will probably limit the popularity of this series significantly, despite its high class.
Optoma EH412/EH412ST
Of course, Optoma also did not stand aside this year. It presented the very interesting Optoma EH412 and EH412ST with great specs, priced at only $750 and $900, respectively. The company positions these models as professional projectors for business presentations, classrooms, and meeting rooms. But, of course, they are also great as movie projectors. As the name suggests, the EH412ST is a short-throw projector. Both models have an almost identical design.

Both models have 1080p (1920 x 1080) resolution and a 50,000:1 (full on/off) contrast ratio, and support Full HD 3D mode and ±40° Vertical Keystone Correction. The Lamp Life of the Metal Halide lamp is traditionally 4,000 hours, but reaches an impressive 15,000 hours in eco mode. The brightness of the EH412 reaches 4,500 lm vs 4,000 lm for the EH412ST. Of course, the models use different optics. A Throw Ratio of 1.12:1 - 1.47:1 (D:W) in the EH412 provides such a projection from a Throw Distance of 18 ft at 1.19x (the Optical Zoom of the model is 1.3x). A powerful 10-watt built-in speaker provides fairly loud sound without the need for external speakers.

The projectors support an array of connectivity options, including 2 x HDMI, VGA in and out, audio in and out, RS232, and USB-A. An Input Lag of 32 ms (min) is quite acceptable for games. Together with 4K HDR input, it allows you to use the projector for modern 1080p HDR gaming content. Of course, such specs and functionality at a price of up to $1,000 offer excellent prospects for the models in this series.
ViewSonic LS900WU HDMI Networkable Laser Projector
The company introduced this model in the spring. Today its street price is nearly $3,500. Its Laser Phosphor light source provides brightness of 6,000 lm, which is enough to form a WUXGA laser projection (1920 x 1200 resolution) of up to 300 inches with a 100,000:1 contrast ratio. But the optimal projection distance for Home Theater varies from 6.5 to 10 ft.

In addition, 1.6x optical zoom, vertical/horizontal keystone correction, 4-corner adjustment, and a lens shift (+9.2% / -2.4% vertically & 2.5% horizontally) radically simplify the installation and setup of the projector. Moreover, the 360-degree orientation function allows installing the projector in just about any position or at any angle from floor to ceiling, including upside down at a 45-degree angle. The projector supports Full HD 3D mode. Light source life varies from 20,000 to 30,000 hours, depending on additional factors. The built-in HDBT receiver accepts uncompressed HD video and audio from over 200 ft away via network cable.

In fact, the powerful laser LS900WU, with great specs, Geometric Correction, and a built-in HDBT receiver, is a professional projector that is great for a top-level Home Theater. But a rather high price will probably limit its popularity.
NEC P605UL WUXGA Laser Projector
The NEC P605UL is a continuation of the company’s P series. Today this 6,000 lm WUXGA laser projector is the brightest in the series; the other seven NEC P models offer a brightness range from 4,700 to 5,500 lm at WXGA or WUXGA resolutions with lamp or laser light sources. A 20,000-hour laser engine and virtually silent operation are the main pros of this model. For the latter, the company uses Whisper Quiet technology with a patented sealed cooling engine. Indeed, the audible noise of modern projectors with such brightness averages 37 dB. For comparison, the noise of the P605UL in Eco mode does not exceed 19 dB. Of course, the model has all the traditional pros of 3LCD technology, including equal white and color brightness, a lack of rainbow artifacts, etc. In addition to its high brightness of 6,000/3,000 lm (bright/eco modes), the projector supports a 600,000:1 dynamic contrast ratio. The projector’s optics, with a Throw Ratio of 1.23:1 - 2.00:1 (D:W), provide almost a 200-inch diagonal projection from a Throw Distance of 21 ft at 1.3x optical zoom.

Other key features:
- scaling up to 4K (3840 x 2160)/30Hz;
- a 1.6x manual zoom lens with manual focus;
- lens shift from "0" to +60% vertically & ±29% horizontally;
- horizontal/vertical Keystone Correction ±30 degrees;
- support for 360-degree orientation;
- portrait mode;
- 2 x HDMI v1.4 with HDCP 1.4;
- an integrated HDBaseT receiver;
- a USB port for charging.

The street price of the projector is slightly higher than that of the ViewSonic LS900WU, reaching $3,600.
Conclusion
In fact, all these models are great for Home Theater. BenQ offers the 4K UHD HDR HT3550 with great specs for only $1,500, as well as projectors with true cinematic image quality costing up to $10,000. The laser networkable ViewSonic LS900WU and NEC P605UL combine the capabilities of a professional projector and a top-level Home Theater for nearly $3,500. The Epson 5050UB 4K PRO-UHD uses Epson’s innovative technologies, including improved 4K-Enhancement pixel shift, delivering superb quality at a price of only $2,700. Optoma offers the EH412/EH412ST with excellent features for less than $1,000.

These projectors are unlikely to make the experts’ TOP lists this year due to high competition from successful models of past years, for which companies have already reduced prices. Most likely, many consumers will also prefer bestsellers with good reviews and compelling advertising. But the projectors in this review may be of interest to fans of innovative solutions who don’t like to wait for a price drop when choosing the optimal model. We sincerely wish you the Right Solution! Read the full article
#4KUHDHomeCinemaprojectors#4K-Enhancementpixelshifttechnology#BenQHT3550#BenQPremiumCineProSeriesHT8060#BenQPremiumCineProSeriesHT9060#Epson4KPRO-UHDHomeCinema5050UB#Epson4KPRO-UHDHomeCinema5050UBe#Epson4KPRO-UHDHomeCinema6050UB#Four-PhasePixelShifting#HomeCinemaprojectors2019#ImprovedHDRSupport#ISFNightandDaymode#NECP605ULProjector#New4KHDRHomeCinemaprojectors2019#OptomaEH412#OptomaEH412ST#PanamorphPaladinanamorphiclens#TI47-inchchip#ViewSonicLS900WU
0 notes
Text
All Links are Not Created Equal: 20 New Graphics on Google’s Valuation of Links
Posted by Cyrus-Shepard
Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:
Links on the web count as votes. Initially, all votes are equal.
Pages which receive more votes become more important (and rank higher.)
More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
In this post, we'll walk you through 20 principles of link valuation that have been observed and tested by SEOs. Some have been confirmed by Google, while others appear in its patents. Please note that these are not hard and fast rules, but principles that interplay with one another: a burst of fresh links can often outweigh older, more powerful links; spam links can blunt the effect of fresh links; and so on.
We strongly encourage you to test these yourselves. To quote Rand, "Nothing is better for learning SEO than going out and experimenting in the wild."
1. Links From Popular Pages Cast More Powerful Votes
Let’s begin with a foundational principle. This concept formed the basis of Google’s original PageRank patent and quickly helped vault it to the most popular search engine in the world. PageRank can become incredibly complex very quickly, but to oversimplify: the more votes (links) a page has pointing to it, the more PageRank (and other possible link-based signals) it accumulates. The more votes it accumulates, the more it can pass on to other pages through outbound links. In basic terms, popular pages are ones that have accumulated a lot of votes themselves. Scoring a link from a popular page can typically be more powerful than earning a link from a page with fewer link votes.
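To make the voting mechanics concrete, here is a minimal PageRank iteration sketch in Python. The toy graph, damping factor, and iteration count are illustrative assumptions, not Google's production values:

```python
# Minimal PageRank sketch: links count as votes, and more popular
# pages pass along more voting power. Toy graph and damping factor
# are illustrative assumptions only.

damping = 0.85  # probability the "random surfer" follows a link

# page -> list of pages it links out to
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about", "popular-post"],
    "popular-post": ["home"],
}

pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}  # all votes equal at first

for _ in range(20):  # iterate until scores settle
    new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)  # split the vote
        for target in outlinks:
            new_rank[target] += share  # popular pages accumulate more
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))
```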
2. Links "Inside" Unique Main Content Pass More Value than Boilerplate Links
Google’s Reasonable Surfer, Semantic Distance, and Boilerplate patents all suggest valuing content and links more highly if they are positioned in the unique, main text area of the page, versus sidebars, headers, and footers, aka the “boilerplate.”
It certainly makes sense, as boilerplate links are not truly editorial; they are typically inserted automatically by a CMS (even if a human decided to put them there). Google’s Quality Rater Guidelines encourage evaluators to focus on the “Main Content” of a page.
Similarly, SEO experiments have found that links hidden within expandable tabs or accordions (by either CSS or JavaScript) may carry less weight than fully visible links, though Google says they fully index and weight these links.
3. Links Higher Up in the Main Content Cast More Powerful Votes
If you had a choice between 2 links, which would you choose?
One placed prominently in the first paragraph of a page, or
One placed lower beneath several paragraphs
Of course, you’d pick the link visitors would likely click on, and Google would want to do the same. Google’s Reasonable Surfer Patent describes methods for giving more weight to links it believes people will actually click, including links placed in more prominent positions on the page.
Matt Cutts, former head of Google’s Webspam team, once famously encouraged SEOs to pay attention to the first link on the page, and not bury important links. (source)
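As a rough sketch of the Reasonable Surfer idea, the snippet below decays a link's weight by its position in the content. The decay curve is an invented stand-in for whatever click-probability model Google actually uses:

```python
# Illustrative only: weight links by how high they appear in the
# main content, echoing the Reasonable Surfer idea that prominent
# links are more likely to be clicked. The decay rate is made up.

def position_weight(index: int, decay: float = 0.85) -> float:
    """First link gets full weight; each later link gets less."""
    return decay ** index

links_in_order = ["intro-link", "mid-article-link", "footer-adjacent-link"]
for i, link in enumerate(links_in_order):
    print(f"{link}: weight {position_weight(i):.2f}")
```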
4. Links With Relevant Anchor Text May Pass More Value
Also included in Google’s Reasonable Surfer patent is the concept of giving more weight to links with relevant anchor text. This is only one of several Google patents in which anchor text plays an important role. Multiple experiments over the years have repeatedly confirmed that relevant anchor text boosts a page’s ranking better than generic or non-relevant anchor text. It’s important to note that the same Google patents that propose boosting the value of highly relevant anchors also discuss devaluing, or even ignoring, off-topic or irrelevant anchors altogether. That’s not to say you should spam your pages with an abundance of exact-match anchors: data shows that high-ranking pages typically have a healthy, natural mix of relevant anchors pointing to them.
Similarly, links may carry the context of the words+phrases around/near the link. Though hard evidence is scant, this is mentioned in Google’s patents, and it makes sense that a link surrounded by topically relevant content would be more contextually relevant than the alternative.
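For intuition, here is a toy relevance score based on word overlap between anchor text and a page's topic terms. Real systems use far richer language models; everything here is a simplified assumption:

```python
# Toy relevance score: overlap between anchor-text words and the
# target page's topic terms. For intuition only.

def anchor_relevance(anchor: str, page_terms: set[str]) -> float:
    words = set(anchor.lower().split())
    if not words:
        return 0.0
    return len(words & page_terms) / len(words)

page_terms = {"dairy", "farm", "milk", "cheese"}
print(anchor_relevance("best dairy farm", page_terms))  # ~0.67: relevant
print(anchor_relevance("click here", page_terms))       # 0.0: generic
```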
5. Links from Unique Domains Matter More than Links from Previously Linking Sites
Experience shows that it’s far better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you.
This makes sense, as Google’s algorithms are designed to measure popularity across the entire web and not simply popularity from a single site.
In fact, this idea has been supported by nearly every SEO ranking factor correlation study ever performed. The number of unique linking root domains is almost always a better predictor of Google rankings than a site’s raw number of total links.
Rand points out that this principle is not universally true: "When given the option between a 2nd or 3rd link from the NYTimes vs. randomsitexyz, it's almost always more rank-boosting and marketing helpful to go with another NYT link."
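Here is a quick sketch of why these studies count unique linking root domains rather than raw links; the backlink list is hypothetical, and the domain parsing is deliberately naive:

```python
# Count unique linking root domains vs. raw link count.
# Hypothetical backlink list; real data would come from a link index.
from urllib.parse import urlparse

backlinks = [
    "https://news.example.com/story-1",
    "https://news.example.com/story-2",
    "https://blog.another-site.org/post",
    "https://another-site.org/resources",
]

def root_domain(url: str) -> str:
    # Naive: keep the last two host labels. A production system would
    # use the Public Suffix List to handle cases like .co.uk.
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

print("total links:", len(backlinks))                               # 4
print("unique domains:", len({root_domain(u) for u in backlinks}))  # 2
```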
6. External Links are More Influential than Internal Links
If we extend the concept from #5 above, then it follows that links from external sites should count more than internal links from your own site. The same correlation studies almost always show that high ranking sites are associated with more external links than lower ranking sites.
Search engines seem to follow the concept that what others say about you is more important than what you say about yourself.
That’s not to say that internal links don’t count. On the contrary, internal linking and good site architecture can be hugely impactful on Google rankings. That said, building external links is often the fastest way to higher rankings and more traffic.
7. Links from Sites Closer to a Trusted Seed Set May Pass More Value
The idea of TrustRank has been around for many years. Bill Slawski covers it here. More recently, Google updated its original PageRank patent with a section that incorporates the concept of “trust” using seed sites. The closer a site is linked to a trusted seed site, the more of a boost it receives.
In theory, this means that black hat Private Blog Networks (PBNs) would be less effective if they were a large link distance away from more trusted sites.
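A simple way to picture the seed-distance idea is a breadth-first search from the trusted seeds: sites more hops away inherit less trust. The graph and site names below are entirely hypothetical:

```python
# Sketch of the seed-distance idea behind TrustRank-style scoring.
# Graph and site names are hypothetical.
from collections import deque

link_graph = {
    "trusted-seed.gov": ["university.edu", "quality-blog.com"],
    "university.edu": ["quality-blog.com", "niche-site.net"],
    "quality-blog.com": ["niche-site.net"],
    "niche-site.net": ["pbn-site.biz"],
    "pbn-site.biz": [],
}

def distance_from_seeds(graph, seeds):
    """Breadth-first search from all seeds at once."""
    dist = {seed: 0 for seed in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for neighbor in graph.get(site, []):
            if neighbor not in dist:
                dist[neighbor] = dist[site] + 1
                queue.append(neighbor)
    return dist

print(distance_from_seeds(link_graph, ["trusted-seed.gov"]))
# pbn-site.biz ends up 3 hops out, so it would inherit the least trust.
```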
Beyond links, other ways that Google may evaluate trust is through online reputation—e.g. through online reviews or sentiment analysis—and use of accurate information (facts). This is of particular concern with YMYL (Your Money or Your Life) pages that "impact the future happiness, health, financial stability, or safety of users."
This means links from sites that Google considers misleading and/or dangerous may be valued less than links from sites that present more reputable information.
8. Links From Topically Relevant Pages May Cast More Powerful Votes
You run a dairy farm. All things being equal, would you rather have a link from:
The National Dairy Association
The Association of Automobile Mechanics
Hopefully, you choose “b” because you recognize it’s more relevant. Though several mechanisms, Google may act in the same way to toward topically relevant links, including Topic-Sensitive PageRank, phrase-based indexing, and local inter-connectivity. These concepts also help discount spam links from non-relevant pages.
The concepts around Google's use of topical relevance are incredibly complex. For a primer on SEO relevance signals, I recommend reading:
Topical SEO: 7 Concepts of Link Relevance & Google Rankings
More than Keywords: 7 Concepts of Advanced On-Page SEO
9. Links From Fresh Pages Can Pass More Value Than Links From Stale Pages
Freshness counts.
Google uses several ways of evaluating content based on freshness. One way to determine the relevancy of a page is to look at the freshness of the links pointing at it.
The basic concept is that pages with links from fresher pages—e.g. newer pages and those more regularly updated—are likely more relevant than pages with links from mostly stale pages, or pages that haven’t been updated in a while.
For a good read on the subject, Justin Briggs has described and named this concept FreshRank.
A page with a burst of links from fresher pages may indicate immediate relevance, compared to a page that has had the same old links for the past 10 years. In these cases, the rate of link growth and the freshness of the linking pages can have a significant influence on rankings.
It's important to note that "old" is not the same thing as stale. A stale page is one that:
Isn't updated, often with outdated content
Earns fewer new links over time
Exhibits declining user engagement
If a page doesn't match these criteria, it can be considered fresh, no matter its actual age. As Rand notes, "Old crusty links can also be really valuable, especially if the page is kept up to date."
10. The Rate of Link Growth Can Signal Freshness
If Google sees a burst of new links to a page, this could indicate a signal of relevance.
By the same measure, a decrease in the overall rate of link growth would indicate that the page has become stale, and likely to be devalued in search results.
All of these freshness concepts, and more, are covered by Google’s Information Retrieval Based on Historical Data patent.
If a webpage sees an increase in its link growth rate, this could indicate a signal of relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes.)
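A crude way to picture this signal is to compare recent link velocity against the historical baseline, as in the sketch below (dates and windows are invented):

```python
# Illustrative freshness signal: compare recent link velocity to the
# historical baseline. Timestamps are hypothetical.
from datetime import date

# Acquisition dates of inbound links to one page (hypothetical).
link_dates = [
    date(2015, 3, 1), date(2016, 7, 9), date(2018, 1, 15),
    date(2019, 5, 2), date(2019, 5, 10), date(2019, 5, 20),
]

def links_in_window(dates, start, end):
    return sum(start <= d <= end for d in dates)

recent = links_in_window(link_dates, date(2019, 5, 1), date(2019, 5, 31))
older = len(link_dates) - recent

print(f"recent burst: {recent} links, history: {older} links")
# A burst like this (3 links in one month vs. 3 over four years)
# could read as a freshness/relevance signal.
```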
11. Google Devalues Spam and Low-Quality Links
While there are trillions of links on the web, the truth is that Google likely ignores a large swath of them. Google’s goal is to focus on editorial links, e.g. “links that you didn't even have to ask for because they are editorially given by other website owners.” Since Penguin 4.0, Google has implied that their algorithms simply ignore links that they don’t feel meet these standards. These include links generated by negative SEO and link schemes.
That said, there’s plenty of debate about whether Google truly ignores all low-quality links, as there’s evidence that low-quality links, especially those Google might see as manipulative, may actually hurt you.
12. Link Echos: The Influence Of A Link May Persist Even After It Disappears
Link Echos (a.k.a. Link Ghosts) describe the phenomenon where the ranking impact of a link often appears to persist, even long after the link is gone.
Rand has performed several experiments on this and the reverberation effect of links is incredibly persistent, even months after the links have dropped from the web, and Google has recrawled and indexed these pages several times.
Speculation as to why this happens includes: Google looking at other ranking factors once the page has climbed in rankings (e.g. user engagement), Google assigning persistence or degradation to link value that isn’t wholly dependent on its existence on the page, or factors we can’t quite recognize.
Whatever the root cause, the value of a link can have a reverberating, ethereal quality that exists separately from its HTML roots.
As a counterpoint, Neil Patel recently ran an experiment in which rankings dropped after low-authority sites lost a large number of links all at once, so it appears possible to overcome this phenomenon under the right circumstances.
13. Sites Linking Out to Authoritative Content May Count More Than Those That Do Not
While Google claims that linking out to quality sites isn’t an explicit ranking factor, they’ve also made statements in the past that it can impact your search performance.
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.” – Matt Cutts
Furthermore, multiple SEO experiments and anecdotal evidence over the years suggest that linking out to relevant, authoritative sites can result in a net positive effect on rankings and visibility.
14. Pages That Link To Spam May Devalue The Other Links They Host
If we take the quote above and focus specifically on the first part, we understand that Google trusts sites less when they link to spam.
This concept can be extended further, as there’s ample evidence of Google demoting sites it believes to be hosting paid links, or part of a private blog network.
Basic advice: when relevant and helpful, link to authoritative sites (and avoid linking to bad sites) when it will benefit your audience.
15. Nofollowed Links Aren't Followed, But May Have Value In Some Cases
Google invented the nofollow link specifically because many webmasters found it hard to prevent spammy, outbound links on their sites - especially those generated by comment spam and UGC.
A common belief is that nofollow links don’t count at all, but Google’s own language leaves some wriggle room. They don’t follow them absolutely, but “in general” and only “essentially” drop the links from their web graph.
That said, numerous SEO experiments and correlation data all suggest that nofollow links can have some value, and webmasters would be wise to maximize their value.
16. Many JavaScript Links Pass Value, But Only If Google Renders Them
In the old days of SEO, it was common practice to “hide” links using JavaScript, knowing Google couldn’t crawl them.
Today, Google has gotten significantly better at crawling and rendering JavaScript, so that most JavaScript links today will count.
That said, Google still may not crawl or index every JavaScript link. For one, they need extra time and effort to render the JavaScript, and not every site delivers compatible code. Furthermore, Google only considers full links with an anchor tag and href attribute.
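The "anchor tag with href attribute" requirement is easy to illustrate: the stdlib sketch below extracts only the link forms a crawler can actually discover. The markup samples are invented:

```python
# Google has said it only uses full <a> elements with an href
# attribute, so JavaScript-only "links" stay invisible.
# Minimal stdlib sketch; markup samples are hypothetical.
from html.parser import HTMLParser

markup = """
<a href="/crawlable-page">counts: anchor tag with href</a>
<span onclick="goTo('/js-only-page')">ignored: no anchor, no href</span>
<a onclick="goTo('/also-ignored')">ignored: anchor but no href</a>
"""

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

parser = LinkExtractor()
parser.feed(markup)
print(parser.links)  # ['/crawlable-page']
```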
17. If A Page Links To The Same URL More Than Once, The First Link Has Priority
... Or more specifically, only the first anchor text counts. If Google crawls a page with two or more links pointing to the same URL, they have explained that while PageRank flows normally through both, they will only use the first anchor text for ranking purposes.
This scenario often comes into play when your sitewide navigation links to an important page, and you also link to it within an article below.
Through testing, folks have discovered a number of clever ways to bypass the First Link Priority rule, but newer studies haven’t been published for several years.
18. Robots.txt and Meta Robots May Impact How and Whether Links Are Seen
Seems obvious, but in order for Google to weigh a link in its ranking algorithm, it has to be able to crawl and follow it. Unsurprisingly, there are a number of site- and page-level directives that can get in Google’s way. These include:
The URL is blocked from crawling by robots.txt
Robots meta tag or X-Robots-Tag HTTP header use the “nofollow” directive
The page is set to “noindex, follow” but Google eventually stops crawling
Often Google will include a URL in its search results if other pages link to it, even if that page is blocked by robots.txt. But because Google can’t actually crawl the page, any links on the page are virtually invisible.
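Python's standard library can evaluate robots.txt rules directly, which makes the crawl-blocking case easy to demonstrate; the file contents below are a hypothetical example:

```python
# Whether a crawler may fetch a URL at all is governed by robots.txt.
# The stdlib can evaluate the rules; this robots.txt is hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/public/page"))   # True
# A blocked page may still appear in results if others link to it,
# but the links *on* that page stay invisible to the crawler.
```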
19. Disavowed Links Don’t Pass Value (Typically)
If you’ve built some shady links, or been hit by a penalty, you can use Google’s disavow tool to help wipe away your sins.
By disavowing, you ask Google to remove these backlinks from consideration when it crawls the web.
On the other hand, if Google thinks you’ve made a mistake with your disavow file, they may choose to ignore it entirely - probably to prevent you from self-inflicted harm.
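For reference, a disavow file is plain text with one entry per line, and lines beginning with "#" are comments. The entries below are hypothetical:

```
# Hypothetical disavow.txt: lines starting with "#" are comments.
# Disavow a single spammy URL:
https://spammy-site.example/bad-link-page.html
# Disavow every link from an entire domain:
domain:link-farm.example
```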
20. Unlinked Mentions May Associate Data or Authority With A Website
Google may connect data about entities (concepts like a business, a person, or a work of art) without the presence of HTML links, much as it does with local business citations, or when determining which data refers to a brand, a movie, a notable person, and so on.
In this fashion, unlinked mentions may still associate data or authority with a website or a set of information—even when no link is present.
Bill Slawski has written extensively about entities in search (a few examples here, here, and here). It’s a heady subject, but suffice to say Google doesn’t always need links to associate data and websites together, and strong entity associations may help a site to rank.
Below, you'll find all twenty principles combined into a single graphic. If you'd like to print or embed the image, click here for a higher-res version.
Please credit Moz when using any of these images.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
All Links are Not Created Equal: 20 New Graphics on Google’s Valuation of Links
Posted by Cyrus-Shepard
Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:
Links on the web count as votes. Initially, all votes are equal.
Pages which receive more votes become more important (and rank higher.)
More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
In this post, we'll walk you through 20 principles of link valuation that have been observed and tested by SEOs. In some cases, they have been confirmed by Google, while others have been patented. Please note that these are not hard and fast rules, but principles that interplay with one another. A burst of fresh link can often outweigh powerful links, spam links can blunt the effect of fresh links, etc.
We strongly encourage you to test these yourselves. To quote Rand, "Nothing is better for learning SEO than going out and experimenting in the wild."
1. Links From Popular Pages Cast More Powerful Votes
Let’s begin with a foundational principle. This concept formed the basis of Google’s original PageRank patent, and quickly help vault it to the most popular search engine in the world. PageRank can become incredibly complex very quickly—but to oversimplify—the more votes (links) a page has pointed to it, the more PageRank (and other possible link-based signals) it accumulates. The more votes it accumulates, the more it can pass on to other pages through outbound links. In basic terms, popular pages are ones that have accumulated a lot of votes themselves. Scoring a link from a popular page can typically be more powerful than earning a link from a page with fewer link votes.
2. Links "Inside" Unique Main Content Pass More Value than Boilerplate Links
Google’s Reasonable Surfer, Semantic Distance, and Boilerplate patents all suggest valuing content and links more highly if they are positioned in the unique, main text area of the page, versus sidebars, headers, and footers, aka the “boilerplate.”
It certainly makes sense, as boilerplate links are not truly editorial, but typically automatically inserted by a CMS (even if a human decided to put them there.) Google’s Quality Rater Guidelines encourage evaluators to focus on the “Main Content” of a page.
Similarly, SEO experiments have found that links hidden within expandable tabs or accordions (by either CSS or JavaScript) may carry less weight than fully visible links, though Google says they fully index and weight these links.
3. Links Higher Up in the Main Content Cast More Powerful Votes
If you had a choice between 2 links, which would you choose?
One placed prominently in the first paragraph of a page, or
One placed lower beneath several paragraphs
Of course, you’d pick the link visitors would likely click on, and Google would want to do the same. Google’s Reasonable Surfer Patent describes methods for giving more weight to links it believes people will actually click, including links placed in more prominent positions on the page.
Matt Cutts, former head of Google’s Webspam team, once famously encouraged SEOs to pay attention to the first link on the page, and not bury important links. (source)
4. Links With Relevant Anchor Text May Pass More Value
Also included in Google’s Reasonable Surfer patent is the concept of giving more weight to links with relevant anchor text. This is only one of several Google patents where anchor text plays an important role. Multiple experiments over the years repeatedly confirm the power of relevant anchor text to boost a page’s ranking better than generic or non-relevant anchor text. It’s important to note that the same Google patents that propose boosting the value of highly-relevant anchors, also discuss devaluing or even ignoring off-topic or irrelevant anchors altogether. Not that you should spam your pages with an abundance of exact match anchors. Data typically shows that high ranking pages typically have a healthy, natural mix of relevant anchors pointing to them.
Similarly, links may carry the context of the words+phrases around/near the link. Though hard evidence is scant, this is mentioned in Google’s patents, and it makes sense that a link surrounded by topically relevant content would be more contextually relevant than the alternative.
5. Links from Unique Domains Matter More than Links from Previously Linking Sites
Experience shows that it’s far better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you.
This makes sense, as Google’s algorithms are designed to measure popularity across the entire web and not simply popularity from a single site.
In fact, this idea has been supported by nearly every SEO ranking factor correlation study ever performed. The number of unique linking root domains is almost always a better predictor of Google rankings than a site’s raw number of total links.
Rand points out that this principle is not always universally true. "When given the option between a 2nd or 3rd link from the NYTimes vs. randomsitexyz, it's almost always more rank-boosting and marketing helpful to go with another NYT link."
6. External Links are More Influential than Internal Links
If we extend the concept from #3 above, then it follows that links from external sites should count more than internal links from your own site. The same correlation studies almost always show that high ranking sites are associated with more external links than lower ranking sites.
Search engines seem to follow the concept that what others say about you is more important than what you say about yourself.
That’s not to say that internal links don’t count. On the contrary, internal linking and good site architecture can be hugely impactful on Google rankings. That said, building external links is often the fastest way to higher rankings and more traffic.
7. Links from Sites Closer to a Trusted Seed Set May Pass More Value
The idea of TrustRank has been around for many years. Bill Slawski covers it here. More recently, Google updated its original PageRank patent with a section that incorporates the concept of “trust” using seed sites. The closer a site is linked to a trusted seed site, the more of a boost it receives.
In theory, this means that black hat Private Blog Networks (PBNs) would be less effective if they were a large link distance away from more trusted sites.
Beyond links, other ways that Google may evaluate trust is through online reputation—e.g. through online reviews or sentiment analysis—and use of accurate information (facts). This is of particular concern with YMYL (Your Money or Your Life) pages that "impact the future happiness, health, financial stability, or safety of users."
This means links from sites that Google considers misleading and/or dangerous may be valued less than links from sites that present more reputable information.
8. Links From Topically Relevant Pages May Cast More Powerful Votes
You run a dairy farm. All things being equal, would you rather have a link from:
The National Dairy Association
The Association of Automobile Mechanics
Hopefully, you choose “b” because you recognize it’s more relevant. Though several mechanisms, Google may act in the same way to toward topically relevant links, including Topic-Sensitive PageRank, phrase-based indexing, and local inter-connectivity. These concepts also help discount spam links from non-relevant pages.
While I've included the image above, the concepts around Google's use of topical relevance is incredibly complex. For a primer on SEO relevance signals, I recommend reading:
Topical SEO: 7 Concepts of Link Relevance & Google Rankings
More than Keywords: 7 Concepts of Advanced On-Page SEO
9. Links From Fresh Pages Can Pass More Value Than Links From Stale Pages
Freshness counts.
Google uses several ways of evaluating content based on freshness. One way to determine the relevancy of a page is to look at the freshness of the links pointing at it.
The basic concept is that pages with links from fresher pages—e.g. newer pages and those more regularly updated—are likely more relevant than pages with links from mostly stale pages, or pages that haven’t been updated in a while.
For a good read on the subject, Justing Briggs has described and named this concept FreshRank.
A page with a burst of links from fresher pages may indicate immediate relevance, compared to a page that has had the same old links for the past 10 years. In these cases, the rate of link growth and the freshness of the linking pages can have a significant influence on rankings.
It's important to note that "old" is not the same thing as stale. A stale page is one that:
Isn't updated, often with outdated content
Earns fewer new links over time
Exhibits declining user engagement
If a page doesn't meet these requirements, it can be considered fresh - no matter its actual age. As Rand notes, "Old crusty links can also be really valuable, especially if the page is kept up to date."
10. The Rate of Link Growth Can Signal Freshness
If Google sees a burst of new links to a page, this could indicate a signal of relevance.
By the same measure, a decrease in the overall rate of link growth would indicate that the page has become stale, and likely to be devalued in search results.
All of these freshness concepts, and more, are covered by Google’s Information Retrieval Based on Historical Data patent.
If a webpage sees an increase in its link growth rate, this could indicate a signal of relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes.)
11. Google Devalues Spam and Low-Quality Links
While there are trillions of links on the web, the truth is that Google likely ignores a large swath of them. Google’s goal is to focus on editorial links, e.g. “links that you didn't even have to ask for because they are editorially given by other website owners.” Since Penguin 4.0, Google has implied that their algorithms simply ignore links that they don’t feel meet these standards. These include links generated by negative SEO and link schemes.
That said, there’s lots of debate if Google truly ignores all low-quality links, as there’s evidence that low-quality links—especially those Google might see as manipulative—may actually hurt you.
12. Link Echos: The Influence Of A Link May Persist Even After It Disappears
Link Echos (a.k.a. Link Ghosts) describe the phenomenon where the ranking impact of a link often appears to persist, even long after the link is gone.
Rand has performed several experiments on this and the reverberation effect of links is incredibly persistent, even months after the links have dropped from the web, and Google has recrawled and indexed these pages several times.
Speculation as to why this happens includes: Google looking at other ranking factors once the page has climbed in rankings (e.g. user engagement), Google assigning persistence or degradation to link value that isn’t wholly dependent on its existence on the page, or factors we can’t quite recognize.
Whatever the root cause, the value of a link can have a reverberating, ethereal quality that exists separately from its HTML roots.
As a counterpoint, Niel Patel recently ran an experiment where rankings dropped after low-authority sites lost a large number of links all at once, so it appears possible to overcome this phenomenon under the right circumstances.
13. Sites Linking Out to Authoritative Content May Count More Than Those That Do Not
While Google claims that linking out to quality sites isn’t an explicit ranking factor, they’ve also made statements in the past that it can impact your search performance.
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.” – Matt Cutts
Furthermore, multiple SEO experiments and anecdotal evidence over the years suggest that linking out to relevant, authoritative sites can result in a net positive effect on rankings and visibility.
14. Pages That Link To Spam May Devalue The Other Links They Host
If we take the quote above and focus specifically on the first part, we understand that Google trusts sites less when they link to spam.
This concept can be extended further, as there’s ample evidence of Google demoting sites it believes to be hosting paid links, or part of a private blog network.
Basic advice: when relevant and helpful, link to authoritative sites (and avoid linking to bad sites) when it will benefit your audience.
15. Nofollowed Links Aren't Followed, But May Have Value In Some Cases
Google invented the nofollow link specifically because many webmasters found it hard to prevent spammy, outbound links on their sites - especially those generated by comment spam and UGC.
A common belief is that nofollow links don’t count at all, but Google’s own language leaves some wriggle room. They don’t follow them absolutely, but “in general” and only “essentially” drop the links from their web graph.
That said, numerous SEO experiments and correlation data all suggest that nofollow links can have some value, and webmasters would be wise to maximize their value.
16. ManyJavaScript Links Pass Value, But Only If Google Renders Them
In the old days of SEO, it was common practice to “hide” links using JavaScript, knowing Google couldn’t crawl them.
Today, Google has gotten significantly better at crawling and rendering JavaScript, so that most JavaScript links today will count.
That said, Google still may not crawl or index every JavaScript link. For one, they need extra time and effort to render the JavaScript, and not every site delivers compatible code. Furthermore, Google only considers full links with an anchor tag and href attribute.
17. If A Page Links To The Same URL More Than Once, The First Link Has Priority
... Or more specifically, only the first anchor text counts. If Google crawls a page with two or more links pointing to the same URL, they have explained that while PageRank flows normally through both, they will only use the first anchor text for ranking purposes.
This scenario often comes into play when your sitewide navigation links to an important page, and you also link to it within an article below.
Through testing, folks have discovered a number of clever ways to bypass the First Link Priority rule, but newer studies haven’t been published for several years.
18. Robots.txt and Meta Robots May Impact How and Whether Links Are Seen
Seems obvious, but in order for Google to weigh a link in it’s ranking algorithm, it has to be able to crawl and follow it. Unsurprisingly, there are a number of site and page-level directives which can get in Google’s way. These include:
The URL is blocked from crawling by robots.txt
Robots meta tag or X-Robots-Tag HTTP header use the “nofollow” directive
The page is set to “noindex, follow” but Google eventually stops crawling
Often Google will include a URL in its search results if other pages link to it, even if that page is blocked by robots.txt. But because Google can’t actually crawl the page, any links on the page are virtually invisible.
19. Disavowed Links Don’t Pass Value (Typically)
If you’ve built some shady links, or been hit by a penalty, you can use Google’s disavow tool to help wipe away your sins.
By disavowing, Google effectively removes these backlinks for consideration when they crawl the web.
On the other hand, if Google thinks you’ve made a mistake with your disavow file, they may choose to ignore it entirely - probably to prevent you from self-inflicted harm.
20. Unlinked Mentions May Associate Data or Authority With A Website
Google may connect data about entities (concepts like a business, a person, a work of art, etc) without the presence of HTML links, like the way it does with local business citations or with which data refers to a brand, a movie, a notable person, etc.
In this fashion, unlinked mentions may still associate data or authority with a website or a set of information—even when no link is present.
Bill Slawski has written extensively about entities in search (a few examples here, here, and here). It’s a heady subject, but suffice to say Google doesn’t always need links to associate data and websites together, and strong entity associations may help a site to rank.
Below, you'll find all twenty principals combined into a single graphic. If you'd like to print or embed the image, click here for a higher-res version.
Please credit Moz when using any of these images.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
All Links are Not Created Equal: 20 New Graphics on Google’s Valuation of Links
Posted by Cyrus-Shepard
Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:
Links on the web count as votes. Initially, all votes are equal.
Pages which receive more votes become more important (and rank higher.)
More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
In this post, we'll walk you through 20 principles of link valuation that have been observed and tested by SEOs. In some cases, they have been confirmed by Google, while others have been patented. Please note that these are not hard and fast rules, but principles that interplay with one another. A burst of fresh link can often outweigh powerful links, spam links can blunt the effect of fresh links, etc.
We strongly encourage you to test these yourselves. To quote Rand, "Nothing is better for learning SEO than going out and experimenting in the wild."
1. Links From Popular Pages Cast More Powerful Votes
Let’s begin with a foundational principle. This concept formed the basis of Google’s original PageRank patent, and quickly help vault it to the most popular search engine in the world. PageRank can become incredibly complex very quickly—but to oversimplify—the more votes (links) a page has pointed to it, the more PageRank (and other possible link-based signals) it accumulates. The more votes it accumulates, the more it can pass on to other pages through outbound links. In basic terms, popular pages are ones that have accumulated a lot of votes themselves. Scoring a link from a popular page can typically be more powerful than earning a link from a page with fewer link votes.
2. Links "Inside" Unique Main Content Pass More Value than Boilerplate Links
Google’s Reasonable Surfer, Semantic Distance, and Boilerplate patents all suggest valuing content and links more highly if they are positioned in the unique, main text area of the page, versus sidebars, headers, and footers, aka the “boilerplate.”
It certainly makes sense, as boilerplate links are not truly editorial, but typically automatically inserted by a CMS (even if a human decided to put them there.) Google’s Quality Rater Guidelines encourage evaluators to focus on the “Main Content” of a page.
Similarly, SEO experiments have found that links hidden within expandable tabs or accordions (by either CSS or JavaScript) may carry less weight than fully visible links, though Google says they fully index and weight these links.
3. Links Higher Up in the Main Content Cast More Powerful Votes
If you had a choice between 2 links, which would you choose?
One placed prominently in the first paragraph of a page, or
One placed lower beneath several paragraphs
Of course, you’d pick the link visitors would likely click on, and Google would want to do the same. Google’s Reasonable Surfer Patent describes methods for giving more weight to links it believes people will actually click, including links placed in more prominent positions on the page.
Matt Cutts, former head of Google’s Webspam team, once famously encouraged SEOs to pay attention to the first link on the page, and not bury important links. (source)
4. Links With Relevant Anchor Text May Pass More Value
Also included in Google’s Reasonable Surfer patent is the concept of giving more weight to links with relevant anchor text. This is only one of several Google patents where anchor text plays an important role. Multiple experiments over the years repeatedly confirm the power of relevant anchor text to boost a page’s ranking better than generic or non-relevant anchor text. It’s important to note that the same Google patents that propose boosting the value of highly-relevant anchors, also discuss devaluing or even ignoring off-topic or irrelevant anchors altogether. Not that you should spam your pages with an abundance of exact match anchors. Data typically shows that high ranking pages typically have a healthy, natural mix of relevant anchors pointing to them.
Similarly, links may carry the context of the words+phrases around/near the link. Though hard evidence is scant, this is mentioned in Google’s patents, and it makes sense that a link surrounded by topically relevant content would be more contextually relevant than the alternative.
5. Links from Unique Domains Matter More than Links from Previously Linking Sites
Experience shows that it’s far better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you.
This makes sense, as Google’s algorithms are designed to measure popularity across the entire web and not simply popularity from a single site.
In fact, this idea has been supported by nearly every SEO ranking factor correlation study ever performed. The number of unique linking root domains is almost always a better predictor of Google rankings than a site’s raw number of total links.
Rand points out that this principle is not always universally true. "When given the option between a 2nd or 3rd link from the NYTimes vs. randomsitexyz, it's almost always more rank-boosting and marketing helpful to go with another NYT link."
6. External Links are More Influential than Internal Links
If we extend the concept from #3 above, then it follows that links from external sites should count more than internal links from your own site. The same correlation studies almost always show that high ranking sites are associated with more external links than lower ranking sites.
Search engines seem to follow the concept that what others say about you is more important than what you say about yourself.
That’s not to say that internal links don’t count. On the contrary, internal linking and good site architecture can be hugely impactful on Google rankings. That said, building external links is often the fastest way to higher rankings and more traffic.
7. Links from Sites Closer to a Trusted Seed Set May Pass More Value
The idea of TrustRank has been around for many years. Bill Slawski covers it here. More recently, Google updated its original PageRank patent with a section that incorporates the concept of “trust” using seed sites. The closer a site is linked to a trusted seed site, the more of a boost it receives.
In theory, this means that black hat Private Blog Networks (PBNs) would be less effective if they were a large link distance away from more trusted sites.
Beyond links, other ways that Google may evaluate trust is through online reputation—e.g. through online reviews or sentiment analysis—and use of accurate information (facts). This is of particular concern with YMYL (Your Money or Your Life) pages that "impact the future happiness, health, financial stability, or safety of users."
This means links from sites that Google considers misleading and/or dangerous may be valued less than links from sites that present more reputable information.
8. Links From Topically Relevant Pages May Cast More Powerful Votes
You run a dairy farm. All things being equal, would you rather have a link from:
The National Dairy Association
The Association of Automobile Mechanics
Hopefully, you choose “b” because you recognize it’s more relevant. Though several mechanisms, Google may act in the same way to toward topically relevant links, including Topic-Sensitive PageRank, phrase-based indexing, and local inter-connectivity. These concepts also help discount spam links from non-relevant pages.
While I've included the image above, the concepts around Google's use of topical relevance is incredibly complex. For a primer on SEO relevance signals, I recommend reading:
Topical SEO: 7 Concepts of Link Relevance & Google Rankings
More than Keywords: 7 Concepts of Advanced On-Page SEO
9. Links From Fresh Pages Can Pass More Value Than Links From Stale Pages
Freshness counts.
Google uses several ways of evaluating content based on freshness. One way to determine the relevancy of a page is to look at the freshness of the links pointing at it.
The basic concept is that pages with links from fresher pages—e.g. newer pages and those more regularly updated—are likely more relevant than pages with links from mostly stale pages, or pages that haven’t been updated in a while.
For a good read on the subject, Justing Briggs has described and named this concept FreshRank.
A page with a burst of links from fresher pages may indicate immediate relevance, compared to a page that has had the same old links for the past 10 years. In these cases, the rate of link growth and the freshness of the linking pages can have a significant influence on rankings.
It's important to note that "old" is not the same thing as stale. A stale page is one that:
Isn't updated, often with outdated content
Earns fewer new links over time
Exhibits declining user engagement
If a page doesn't meet these requirements, it can be considered fresh - no matter its actual age. As Rand notes, "Old crusty links can also be really valuable, especially if the page is kept up to date."
10. The Rate of Link Growth Can Signal Freshness
If Google sees a burst of new links to a page, this could indicate a signal of relevance.
By the same measure, a decrease in the overall rate of link growth would indicate that the page has become stale, and likely to be devalued in search results.
All of these freshness concepts, and more, are covered by Google’s Information Retrieval Based on Historical Data patent.
If a webpage sees an increase in its link growth rate, this could indicate a signal of relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes.)
11. Google Devalues Spam and Low-Quality Links
While there are trillions of links on the web, the truth is that Google likely ignores a large swath of them. Google’s goal is to focus on editorial links, e.g. “links that you didn't even have to ask for because they are editorially given by other website owners.” Since Penguin 4.0, Google has implied that their algorithms simply ignore links that they don’t feel meet these standards. These include links generated by negative SEO and link schemes.
That said, there’s lots of debate if Google truly ignores all low-quality links, as there’s evidence that low-quality links—especially those Google might see as manipulative—may actually hurt you.
12. Link Echos: The Influence Of A Link May Persist Even After It Disappears
Link Echos (a.k.a. Link Ghosts) describe the phenomenon where the ranking impact of a link often appears to persist, even long after the link is gone.
Rand has performed several experiments on this and the reverberation effect of links is incredibly persistent, even months after the links have dropped from the web, and Google has recrawled and indexed these pages several times.
Speculation as to why this happens includes: Google looking at other ranking factors once the page has climbed in rankings (e.g. user engagement), Google assigning persistence or degradation to link value that isn’t wholly dependent on its existence on the page, or factors we can’t quite recognize.
Whatever the root cause, the value of a link can have a reverberating, ethereal quality that exists separately from its HTML roots.
As a counterpoint, Niel Patel recently ran an experiment where rankings dropped after low-authority sites lost a large number of links all at once, so it appears possible to overcome this phenomenon under the right circumstances.
13. Sites Linking Out to Authoritative Content May Count More Than Those That Do Not
While Google claims that linking out to quality sites isn’t an explicit ranking factor, they’ve also made statements in the past that it can impact your search performance.
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.” – Matt Cutts
Furthermore, multiple SEO experiments and anecdotal evidence over the years suggest that linking out to relevant, authoritative sites can result in a net positive effect on rankings and visibility.
14. Pages That Link To Spam May Devalue The Other Links They Host
If we take the quote above and focus specifically on the first part, we understand that Google trusts sites less when they link to spam.
This concept can be extended further, as there’s ample evidence of Google demoting sites it believes to be hosting paid links, or part of a private blog network.
Basic advice: when relevant and helpful, link to authoritative sites (and avoid linking to bad sites) when it will benefit your audience.
15. Nofollowed Links Aren't Followed, But May Have Value In Some Cases
Google invented the nofollow link specifically because many webmasters found it hard to prevent spammy, outbound links on their sites - especially those generated by comment spam and UGC.
A common belief is that nofollow links don’t count at all, but Google’s own language leaves some wriggle room. They don’t follow them absolutely, but “in general” and only “essentially” drop the links from their web graph.
That said, numerous SEO experiments and correlation data all suggest that nofollow links can have some value, and webmasters would be wise to maximize their value.
16. ManyJavaScript Links Pass Value, But Only If Google Renders Them
In the old days of SEO, it was common practice to “hide” links using JavaScript, knowing Google couldn’t crawl them.
Today, Google has gotten significantly better at crawling and rendering JavaScript, so that most JavaScript links today will count.
That said, Google still may not crawl or index every JavaScript link. For one, they need extra time and effort to render the JavaScript, and not every site delivers compatible code. Furthermore, Google only considers full links with an anchor tag and href attribute.
17. If A Page Links To The Same URL More Than Once, The First Link Has Priority
... Or more specifically, only the first anchor text counts. If Google crawls a page with two or more links pointing to the same URL, they have explained that while PageRank flows normally through both, they will only use the first anchor text for ranking purposes.
This scenario often comes into play when your sitewide navigation links to an important page, and you also link to it within an article below.
Through testing, folks have discovered a number of clever ways to bypass the First Link Priority rule, but newer studies haven’t been published for several years.
18. Robots.txt and Meta Robots May Impact How and Whether Links Are Seen
Seems obvious, but in order for Google to weigh a link in it’s ranking algorithm, it has to be able to crawl and follow it. Unsurprisingly, there are a number of site and page-level directives which can get in Google’s way. These include:
The URL is blocked from crawling by robots.txt
Robots meta tag or X-Robots-Tag HTTP header use the “nofollow” directive
The page is set to “noindex, follow” but Google eventually stops crawling
Often Google will include a URL in its search results if other pages link to it, even if that page is blocked by robots.txt. But because Google can’t actually crawl the page, any links on the page are virtually invisible.
19. Disavowed Links Don’t Pass Value (Typically)
If you’ve built some shady links, or been hit by a penalty, you can use Google’s disavow tool to help wipe away your sins.
By disavowing, Google effectively removes these backlinks for consideration when they crawl the web.
On the other hand, if Google thinks you’ve made a mistake with your disavow file, they may choose to ignore it entirely - probably to prevent you from self-inflicted harm.
20. Unlinked Mentions May Associate Data or Authority With A Website
Google may connect data about entities (concepts like a business, a person, a work of art, etc) without the presence of HTML links, like the way it does with local business citations or with which data refers to a brand, a movie, a notable person, etc.
In this fashion, unlinked mentions may still associate data or authority with a website or a set of information—even when no link is present.
Bill Slawski has written extensively about entities in search (a few examples here, here, and here). It’s a heady subject, but suffice to say Google doesn’t always need links to associate data and websites together, and strong entity associations may help a site to rank.
Below, you'll find all twenty principals combined into a single graphic. If you'd like to print or embed the image, click here for a higher-res version.
Please credit Moz when using any of these images.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
All Links are Not Created Equal: 20 New Graphics on Google’s Valuation of Links
Posted by Cyrus-Shepard
Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:
Links on the web count as votes. Initially, all votes are equal.
Pages which receive more votes become more important (and rank higher.)
More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
In this post, we'll walk you through 20 principles of link valuation that have been observed and tested by SEOs. In some cases, they have been confirmed by Google, while others have been patented. Please note that these are not hard and fast rules, but principles that interplay with one another. A burst of fresh link can often outweigh powerful links, spam links can blunt the effect of fresh links, etc.
We strongly encourage you to test these yourselves. To quote Rand, "Nothing is better for learning SEO than going out and experimenting in the wild."
1. Links From Popular Pages Cast More Powerful Votes
Let’s begin with a foundational principle. This concept formed the basis of Google’s original PageRank patent, and quickly help vault it to the most popular search engine in the world. PageRank can become incredibly complex very quickly—but to oversimplify—the more votes (links) a page has pointed to it, the more PageRank (and other possible link-based signals) it accumulates. The more votes it accumulates, the more it can pass on to other pages through outbound links. In basic terms, popular pages are ones that have accumulated a lot of votes themselves. Scoring a link from a popular page can typically be more powerful than earning a link from a page with fewer link votes.
2. Links "Inside" Unique Main Content Pass More Value than Boilerplate Links
Google’s Reasonable Surfer, Semantic Distance, and Boilerplate patents all suggest valuing content and links more highly if they are positioned in the unique, main text area of the page, versus sidebars, headers, and footers, aka the “boilerplate.”
It certainly makes sense, as boilerplate links are not truly editorial, but typically automatically inserted by a CMS (even if a human decided to put them there.) Google’s Quality Rater Guidelines encourage evaluators to focus on the “Main Content” of a page.
Similarly, SEO experiments have found that links hidden within expandable tabs or accordions (by either CSS or JavaScript) may carry less weight than fully visible links, though Google says they fully index and weight these links.
3. Links Higher Up in the Main Content Cast More Powerful Votes
If you had a choice between 2 links, which would you choose?
One placed prominently in the first paragraph of a page, or
One placed lower beneath several paragraphs
Of course, you’d pick the link visitors would likely click on, and Google would want to do the same. Google’s Reasonable Surfer Patent describes methods for giving more weight to links it believes people will actually click, including links placed in more prominent positions on the page.
Matt Cutts, former head of Google’s Webspam team, once famously encouraged SEOs to pay attention to the first link on the page, and not bury important links. (source)
4. Links With Relevant Anchor Text May Pass More Value
Also included in Google’s Reasonable Surfer patent is the concept of giving more weight to links with relevant anchor text. This is only one of several Google patents where anchor text plays an important role. Multiple experiments over the years repeatedly confirm the power of relevant anchor text to boost a page’s ranking better than generic or non-relevant anchor text. It’s important to note that the same Google patents that propose boosting the value of highly-relevant anchors, also discuss devaluing or even ignoring off-topic or irrelevant anchors altogether. Not that you should spam your pages with an abundance of exact match anchors. Data typically shows that high ranking pages typically have a healthy, natural mix of relevant anchors pointing to them.
Similarly, links may carry the context of the words+phrases around/near the link. Though hard evidence is scant, this is mentioned in Google’s patents, and it makes sense that a link surrounded by topically relevant content would be more contextually relevant than the alternative.
5. Links from Unique Domains Matter More than Links from Previously Linking Sites
Experience shows that it’s far better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you.
This makes sense, as Google’s algorithms are designed to measure popularity across the entire web and not simply popularity from a single site.
In fact, this idea has been supported by nearly every SEO ranking factor correlation study ever performed. The number of unique linking root domains is almost always a better predictor of Google rankings than a site’s raw number of total links.
Rand points out that this principle is not universally true: "When given the option between a 2nd or 3rd link from the NYTimes vs. randomsitexyz, it's almost always more rank-boosting and marketing helpful to go with another NYT link."
6. External Links are More Influential than Internal Links
If we extend the concept from #5 above, then it follows that links from external sites should count more than internal links from your own site. The same correlation studies almost always show that high ranking sites are associated with more external links than lower ranking sites.
Search engines seem to follow the concept that what others say about you is more important than what you say about yourself.
That’s not to say that internal links don’t count. On the contrary, internal linking and good site architecture can be hugely impactful on Google rankings. That said, building external links is often the fastest way to higher rankings and more traffic.
7. Links from Sites Closer to a Trusted Seed Set May Pass More Value
The idea of TrustRank has been around for many years. Bill Slawski covers it here. More recently, Google updated its original PageRank patent with a section that incorporates the concept of “trust” using seed sites. The closer a site is linked to a trusted seed site, the more of a boost it receives.
In theory, this means that black hat Private Blog Networks (PBNs) would be less effective if they were a large link distance away from more trusted sites.
Beyond links, Google may also evaluate trust through online reputation (e.g. online reviews or sentiment analysis) and the use of accurate information (facts). This is of particular concern with YMYL (Your Money or Your Life) pages that "impact the future happiness, health, financial stability, or safety of users."
This means links from sites that Google considers misleading and/or dangerous may be valued less than links from sites that present more reputable information.
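To see why link distance matters, here's a toy TrustRank-style sketch: it works like PageRank, except the random jump returns only to a hand-picked seed set, so trust decays with every hop away from the seeds. The graph, seed, and parameters are invented for illustration.

```python
# TrustRank-style sketch: identical to PageRank except the "random jump"
# returns only to a trusted seed set, so trust decays with every hop away
# from the seeds. Graph, seed, and parameters are invented for illustration.

links = {
    "trusted.org": ["site-a.example"],
    "site-a.example": ["site-b.example"],
    "site-b.example": ["pbn.example"],
    "pbn.example": [],  # dangling; its lost mass is ignored in this sketch
}
seeds = {"trusted.org"}

def trustrank(links, seeds, damping=0.85, iterations=50):
    trust = {page: (1.0 / len(seeds) if page in seeds else 0.0) for page in links}
    for _ in range(iterations):
        new = {page: ((1 - damping) / len(seeds) if page in seeds else 0.0) for page in links}
        for page, outlinks in links.items():
            for target in outlinks:
                new[target] += damping * trust[page] / len(outlinks)
        trust = new
    return trust

for page, score in sorted(trustrank(links, seeds).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")  # each hop from the seed multiplies trust by ~0.85
```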
8. Links From Topically Relevant Pages May Cast More Powerful Votes
You run a dairy farm. All things being equal, would you rather have a link from:
The National Dairy Association
The Association of Automobile Mechanics
Hopefully, you chose the National Dairy Association, because you recognize it’s more relevant. Through several mechanisms, Google may act in a similar way toward topically relevant links, including Topic-Sensitive PageRank, phrase-based indexing, and local interconnectivity. These concepts also help discount spam links from non-relevant pages.
While I've included the image above, the concepts around Google's use of topical relevance are incredibly complex. For a primer on SEO relevance signals, I recommend reading:
Topical SEO: 7 Concepts of Link Relevance & Google Rankings
More than Keywords: 7 Concepts of Advanced On-Page SEO
9. Links From Fresh Pages Can Pass More Value Than Links From Stale Pages
Freshness counts.
Google uses several ways of evaluating content based on freshness. One way to determine the relevancy of a page is to look at the freshness of the links pointing at it.
The basic concept is that pages with links from fresher pages—e.g. newer pages and those more regularly updated—are likely more relevant than pages with links from mostly stale pages, or pages that haven’t been updated in a while.
For a good read on the subject, Justin Briggs has described and named this concept FreshRank.
A page with a burst of links from fresher pages may indicate immediate relevance, compared to a page that has had the same old links for the past 10 years. In these cases, the rate of link growth and the freshness of the linking pages can have a significant influence on rankings.
It's important to note that "old" is not the same thing as stale. A stale page is one that:
Isn't updated, often with outdated content
Earns fewer new links over time
Exhibits declining user engagement
If a page doesn't exhibit these characteristics, it can be considered fresh, no matter its actual age. As Rand notes, "Old crusty links can also be really valuable, especially if the page is kept up to date."
10. The Rate of Link Growth Can Signal Freshness
If Google sees a burst of new links to a page, this could indicate a signal of relevance.
By the same measure, a decrease in the overall rate of link growth would indicate that the page has become stale, and it is likely to be devalued in search results.
All of these freshness concepts, and more, are covered by Google’s Information Retrieval Based on Historical Data patent.
If a webpage sees an increase in its link growth rate, this could indicate a signal of relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).
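As a back-of-the-envelope illustration (not anything Google has published), you could flag a freshness burst by comparing a page's recent link velocity against its historical baseline. The counts and thresholds below are made up.

```python
# Back-of-the-envelope freshness-burst check. Counts and thresholds are
# made up for illustration; this is not a published Google formula.

monthly_new_links = [4, 5, 3, 4, 6, 5, 4, 5, 30, 42]  # hypothetical counts

def link_burst(counts, window=2, factor=3.0):
    history, recent = counts[:-window], counts[-window:]
    baseline = sum(history) / len(history)  # long-run links per month
    rate = sum(recent) / window             # recent links per month
    return rate > factor * baseline, baseline, rate

burst, baseline, rate = link_burst(monthly_new_links)
print(f"baseline {baseline:.1f}/mo, recent {rate:.1f}/mo, burst={burst}")
```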
11. Google Devalues Spam and Low-Quality Links
While there are trillions of links on the web, the truth is that Google likely ignores a large swath of them. Google’s goal is to focus on editorial links, e.g. “links that you didn't even have to ask for because they are editorially given by other website owners.” Since Penguin 4.0, Google has implied that their algorithms simply ignore links that they don’t feel meet these standards. These include links generated by negative SEO and link schemes.
That said, there’s lots of debate about whether Google truly ignores all low-quality links, as there’s evidence that low-quality links, especially those Google might see as manipulative, may actually hurt you.
12. Link Echoes: The Influence Of A Link May Persist Even After It Disappears
Link Echoes (a.k.a. Link Ghosts) describe the phenomenon where the ranking impact of a link often appears to persist, even long after the link is gone.
Rand has performed several experiments on this, and the reverberation effect of links is incredibly persistent, lasting even months after the links have been dropped from the web and Google has recrawled and indexed the pages several times.
Speculation as to why this happens includes: Google looking at other ranking factors once the page has climbed in rankings (e.g. user engagement), Google assigning persistence or degradation to link value that isn’t wholly dependent on its existence on the page, or factors we can’t quite recognize.
Whatever the root cause, the value of a link can have a reverberating, ethereal quality that exists separately from its HTML roots.
As a counterpoint, Neil Patel recently ran an experiment in which rankings dropped after low-authority sites lost a large number of links all at once, so it appears possible to overcome this phenomenon under the right circumstances.
13. Sites Linking Out to Authoritative Content May Count More Than Those That Do Not
While Google claims that linking out to quality sites isn’t an explicit ranking factor, they’ve also made statements in the past that it can impact your search performance.
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.” – Matt Cutts
Furthermore, multiple SEO experiments and anecdotal evidence over the years suggest that linking out to relevant, authoritative sites can result in a net positive effect on rankings and visibility.
14. Pages That Link To Spam May Devalue The Other Links They Host
If we take the quote above and focus specifically on the first part, we understand that Google trusts sites less when they link to spam.
This concept can be extended further, as there’s ample evidence of Google demoting sites it believes to be hosting paid links, or part of a private blog network.
Basic advice: link to authoritative sites (and avoid linking to bad sites) when it is relevant, helpful, and will benefit your audience.
15. Nofollowed Links Aren't Followed, But May Have Value In Some Cases
Google invented the nofollow link specifically because many webmasters found it hard to prevent spammy, outbound links on their sites - especially those generated by comment spam and UGC.
A common belief is that nofollow links don’t count at all, but Google’s own language leaves some wriggle room. They don’t follow them absolutely, but “in general” and only “essentially” drop the links from their web graph.
That said, numerous SEO experiments and correlation data all suggest that nofollow links can have some value, and webmasters would be wise to maximize their value.
16. Many JavaScript Links Pass Value, But Only If Google Renders Them
In the old days of SEO, it was common practice to “hide” links using JavaScript, knowing Google couldn’t crawl them.
Today, Google has gotten significantly better at crawling and rendering JavaScript, so most JavaScript links will count.
That said, Google still may not crawl or index every JavaScript link. For one, rendering JavaScript takes extra time and effort, and not every site delivers compatible code. Furthermore, Google only considers full links with an anchor tag and href attribute.
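Here's a quick way to see the difference from a non-rendering crawler's point of view, using the third-party BeautifulSoup library. Without executing JavaScript, only the real anchor with an href is discoverable; the onclick "link" is invisible.

```python
# What a non-rendering crawler sees: only real anchors with an href attribute.
# The onclick "link" below is invisible until JavaScript executes.
# Requires: pip install beautifulsoup4
from bs4 import BeautifulSoup

html = """
<a href="https://example.com/guide">Crawlable link</a>
<span onclick="window.location='https://example.com/hidden'">JS-only link</span>
"""

for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
    print(a["href"])  # only https://example.com/guide is discovered
```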
17. If A Page Links To The Same URL More Than Once, The First Link Has Priority
... Or more specifically, only the first anchor text counts. If Google crawls a page with two or more links pointing to the same URL, they have explained that while PageRank flows normally through both, they will only use the first anchor text for ranking purposes.
This scenario often comes into play when your sitewide navigation links to an important page, and you also link to it within an article below.
Through testing, folks have discovered a number of clever ways to bypass the First Link Priority rule, but newer studies haven’t been published for several years.
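As a sketch of the behavior described above (not Google's actual code), deduplicating by URL while keeping the first anchor text looks like this. The HTML mirrors the sitewide-navigation scenario, with hypothetical paths and anchors.

```python
# Sketch of first-link priority as described above (not Google's actual code):
# when a page links to the same URL twice, keep only the first anchor text.
# Requires: pip install beautifulsoup4
from bs4 import BeautifulSoup

html = """
<nav><a href="/pricing">Pricing</a></nav>
<p>Compare our <a href="/pricing">plans for small teams</a>.</p>
"""

anchors = {}
for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
    anchors.setdefault(a["href"], a.get_text())  # first anchor text wins

print(anchors)  # {'/pricing': 'Pricing'} - the nav link's generic text is what counts
```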
18. Robots.txt and Meta Robots May Impact How and Whether Links Are Seen
Seems obvious, but in order for Google to weigh a link in its ranking algorithm, it has to be able to crawl and follow it. Unsurprisingly, there are a number of site- and page-level directives which can get in Google’s way. These include:
The URL is blocked from crawling by robots.txt
Robots meta tag or X-Robots-Tag HTTP header use the “nofollow” directive
The page is set to “noindex, follow” but Google eventually stops crawling
Often Google will include a URL in its search results if other pages link to it, even if that page is blocked by robots.txt. But because Google can’t actually crawl the page, any links on the page are virtually invisible.
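You can check what a well-behaved crawler is allowed to fetch using Python's standard-library robots.txt parser. The domain and path below are hypothetical.

```python
# Checking crawlability the way a well-behaved bot does, using Python's
# standard library. The domain and path here are hypothetical.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

url = "https://example.com/private/report.html"
if rp.can_fetch("Googlebot", url):
    print("crawlable: links on this page can be seen and followed")
else:
    print("blocked: any links on this page are effectively invisible")
```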
19. Disavowed Links Don’t Pass Value (Typically)
If you’ve built some shady links, or been hit by a penalty, you can use Google’s disavow tool to help wipe away your sins.
By disavowing, Google effectively removes these backlinks from consideration when they crawl the web.
On the other hand, if Google thinks you’ve made a mistake with your disavow file, they may choose to ignore it entirely - probably to prevent you from self-inflicted harm.
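For reference, a disavow file is plain text with one entry per line, uploaded through Search Console. The domains in this example are hypothetical.

```
# Hypothetical disavow.txt, uploaded via Google Search Console.
# Lines starting with "#" are comments.

# Disavow a single spammy page:
https://spam-directory.example/paid-links.html

# Disavow every link from an entire domain:
domain:link-network.example
```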
20. Unlinked Mentions May Associate Data or Authority With A Website
Google may connect data about entities (concepts like a business, a person, a work of art, etc.) without the presence of HTML links, much like the way it does with local business citations, or the way it associates data with a brand, a movie, or a notable person.
In this fashion, unlinked mentions may still associate data or authority with a website or a set of information—even when no link is present.
Bill Slawski has written extensively about entities in search (a few examples here, here, and here). It’s a heady subject, but suffice to say Google doesn’t always need links to associate data and websites together, and strong entity associations may help a site to rank.
Below, you'll find all twenty principles combined into a single graphic. If you'd like to print or embed the image, click here for a higher-res version.
Please credit Moz when using any of these images.
0 notes
Text
All Links are Not Created Equal: 20 New Graphics on Google’s Valuation of Links
Posted by Cyrus-Shepard
Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:
Links on the web count as votes. Initially, all votes are equal.
Pages which receive more votes become more important (and rank higher.)
More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
In this post, we'll walk you through 20 principles of link valuation that have been observed and tested by SEOs. In some cases, they have been confirmed by Google, while others have been patented. Please note that these are not hard and fast rules, but principles that interplay with one another. A burst of fresh link can often outweigh powerful links, spam links can blunt the effect of fresh links, etc.
We strongly encourage you to test these yourselves. To quote Rand, "Nothing is better for learning SEO than going out and experimenting in the wild."
1. Links From Popular Pages Cast More Powerful Votes
Let’s begin with a foundational principle. This concept formed the basis of Google’s original PageRank patent, and quickly help vault it to the most popular search engine in the world. PageRank can become incredibly complex very quickly—but to oversimplify—the more votes (links) a page has pointed to it, the more PageRank (and other possible link-based signals) it accumulates. The more votes it accumulates, the more it can pass on to other pages through outbound links. In basic terms, popular pages are ones that have accumulated a lot of votes themselves. Scoring a link from a popular page can typically be more powerful than earning a link from a page with fewer link votes.
2. Links "Inside" Unique Main Content Pass More Value than Boilerplate Links
Google’s Reasonable Surfer, Semantic Distance, and Boilerplate patents all suggest valuing content and links more highly if they are positioned in the unique, main text area of the page, versus sidebars, headers, and footers, aka the “boilerplate.”
It certainly makes sense, as boilerplate links are not truly editorial, but typically automatically inserted by a CMS (even if a human decided to put them there.) Google’s Quality Rater Guidelines encourage evaluators to focus on the “Main Content” of a page.
Similarly, SEO experiments have found that links hidden within expandable tabs or accordions (by either CSS or JavaScript) may carry less weight than fully visible links, though Google says they fully index and weight these links.
3. Links Higher Up in the Main Content Cast More Powerful Votes
If you had a choice between 2 links, which would you choose?
One placed prominently in the first paragraph of a page, or
One placed lower beneath several paragraphs
Of course, you’d pick the link visitors would likely click on, and Google would want to do the same. Google’s Reasonable Surfer Patent describes methods for giving more weight to links it believes people will actually click, including links placed in more prominent positions on the page.
Matt Cutts, former head of Google’s Webspam team, once famously encouraged SEOs to pay attention to the first link on the page, and not bury important links. (source)
4. Links With Relevant Anchor Text May Pass More Value
Also included in Google’s Reasonable Surfer patent is the concept of giving more weight to links with relevant anchor text. This is only one of several Google patents where anchor text plays an important role. Multiple experiments over the years repeatedly confirm the power of relevant anchor text to boost a page’s ranking better than generic or non-relevant anchor text. It’s important to note that the same Google patents that propose boosting the value of highly-relevant anchors, also discuss devaluing or even ignoring off-topic or irrelevant anchors altogether. Not that you should spam your pages with an abundance of exact match anchors. Data typically shows that high ranking pages typically have a healthy, natural mix of relevant anchors pointing to them.
Similarly, links may carry the context of the words+phrases around/near the link. Though hard evidence is scant, this is mentioned in Google’s patents, and it makes sense that a link surrounded by topically relevant content would be more contextually relevant than the alternative.
5. Links from Unique Domains Matter More than Links from Previously Linking Sites
Experience shows that it’s far better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you.
This makes sense, as Google’s algorithms are designed to measure popularity across the entire web and not simply popularity from a single site.
In fact, this idea has been supported by nearly every SEO ranking factor correlation study ever performed. The number of unique linking root domains is almost always a better predictor of Google rankings than a site’s raw number of total links.
Rand points out that this principle is not always universally true. "When given the option between a 2nd or 3rd link from the NYTimes vs. randomsitexyz, it's almost always more rank-boosting and marketing helpful to go with another NYT link."
6. External Links are More Influential than Internal Links
If we extend the concept from #3 above, then it follows that links from external sites should count more than internal links from your own site. The same correlation studies almost always show that high ranking sites are associated with more external links than lower ranking sites.
Search engines seem to follow the concept that what others say about you is more important than what you say about yourself.
That’s not to say that internal links don’t count. On the contrary, internal linking and good site architecture can be hugely impactful on Google rankings. That said, building external links is often the fastest way to higher rankings and more traffic.
7. Links from Sites Closer to a Trusted Seed Set May Pass More Value
The idea of TrustRank has been around for many years. Bill Slawski covers it here. More recently, Google updated its original PageRank patent with a section that incorporates the concept of “trust” using seed sites. The closer a site is linked to a trusted seed site, the more of a boost it receives.
In theory, this means that black hat Private Blog Networks (PBNs) would be less effective if they were a large link distance away from more trusted sites.
Beyond links, other ways that Google may evaluate trust is through online reputation—e.g. through online reviews or sentiment analysis—and use of accurate information (facts). This is of particular concern with YMYL (Your Money or Your Life) pages that "impact the future happiness, health, financial stability, or safety of users."
This means links from sites that Google considers misleading and/or dangerous may be valued less than links from sites that present more reputable information.
8. Links From Topically Relevant Pages May Cast More Powerful Votes
You run a dairy farm. All things being equal, would you rather have a link from:
The National Dairy Association
The Association of Automobile Mechanics
Hopefully, you choose “b” because you recognize it’s more relevant. Though several mechanisms, Google may act in the same way to toward topically relevant links, including Topic-Sensitive PageRank, phrase-based indexing, and local inter-connectivity. These concepts also help discount spam links from non-relevant pages.
While I've included the image above, the concepts around Google's use of topical relevance is incredibly complex. For a primer on SEO relevance signals, I recommend reading:
Topical SEO: 7 Concepts of Link Relevance & Google Rankings
More than Keywords: 7 Concepts of Advanced On-Page SEO
9. Links From Fresh Pages Can Pass More Value Than Links From Stale Pages
Freshness counts.
Google uses several ways of evaluating content based on freshness. One way to determine the relevancy of a page is to look at the freshness of the links pointing at it.
The basic concept is that pages with links from fresher pages—e.g. newer pages and those more regularly updated—are likely more relevant than pages with links from mostly stale pages, or pages that haven’t been updated in a while.
For a good read on the subject, Justing Briggs has described and named this concept FreshRank.
A page with a burst of links from fresher pages may indicate immediate relevance, compared to a page that has had the same old links for the past 10 years. In these cases, the rate of link growth and the freshness of the linking pages can have a significant influence on rankings.
It's important to note that "old" is not the same thing as stale. A stale page is one that:
Isn't updated, often with outdated content
Earns fewer new links over time
Exhibits declining user engagement
If a page doesn't meet these requirements, it can be considered fresh - no matter its actual age. As Rand notes, "Old crusty links can also be really valuable, especially if the page is kept up to date."
10. The Rate of Link Growth Can Signal Freshness
If Google sees a burst of new links to a page, this could indicate a signal of relevance.
By the same measure, a decrease in the overall rate of link growth would indicate that the page has become stale, and likely to be devalued in search results.
All of these freshness concepts, and more, are covered by Google’s Information Retrieval Based on Historical Data patent.
If a webpage sees an increase in its link growth rate, this could indicate a signal of relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes.)
11. Google Devalues Spam and Low-Quality Links
While there are trillions of links on the web, the truth is that Google likely ignores a large swath of them. Google’s goal is to focus on editorial links, e.g. “links that you didn't even have to ask for because they are editorially given by other website owners.” Since Penguin 4.0, Google has implied that their algorithms simply ignore links that they don’t feel meet these standards. These include links generated by negative SEO and link schemes.
That said, there’s lots of debate if Google truly ignores all low-quality links, as there’s evidence that low-quality links—especially those Google might see as manipulative—may actually hurt you.
12. Link Echos: The Influence Of A Link May Persist Even After It Disappears
Link Echos (a.k.a. Link Ghosts) describe the phenomenon where the ranking impact of a link often appears to persist, even long after the link is gone.
Rand has performed several experiments on this and the reverberation effect of links is incredibly persistent, even months after the links have dropped from the web, and Google has recrawled and indexed these pages several times.
Speculation as to why this happens includes: Google looking at other ranking factors once the page has climbed in rankings (e.g. user engagement), Google assigning persistence or degradation to link value that isn’t wholly dependent on its existence on the page, or factors we can’t quite recognize.
Whatever the root cause, the value of a link can have a reverberating, ethereal quality that exists separately from its HTML roots.
As a counterpoint, Niel Patel recently ran an experiment where rankings dropped after low-authority sites lost a large number of links all at once, so it appears possible to overcome this phenomenon under the right circumstances.
13. Sites Linking Out to Authoritative Content May Count More Than Those That Do Not
While Google claims that linking out to quality sites isn’t an explicit ranking factor, they’ve also made statements in the past that it can impact your search performance.
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.” – Matt Cutts
Furthermore, multiple SEO experiments and anecdotal evidence over the years suggest that linking out to relevant, authoritative sites can result in a net positive effect on rankings and visibility.
14. Pages That Link To Spam May Devalue The Other Links They Host
If we take the quote above and focus specifically on the first part, we understand that Google trusts sites less when they link to spam.
This concept can be extended further, as there’s ample evidence of Google demoting sites it believes to be hosting paid links, or part of a private blog network.
Basic advice: when relevant and helpful, link to authoritative sites (and avoid linking to bad sites) when it will benefit your audience.
15. Nofollowed Links Aren't Followed, But May Have Value In Some Cases
Google invented the nofollow link specifically because many webmasters found it hard to prevent spammy, outbound links on their sites - especially those generated by comment spam and UGC.
A common belief is that nofollow links don’t count at all, but Google’s own language leaves some wriggle room. They don’t follow them absolutely, but “in general” and only “essentially” drop the links from their web graph.
That said, numerous SEO experiments and correlation data all suggest that nofollow links can have some value, and webmasters would be wise to maximize their value.
16. ManyJavaScript Links Pass Value, But Only If Google Renders Them
In the old days of SEO, it was common practice to “hide” links using JavaScript, knowing Google couldn’t crawl them.
Today, Google has gotten significantly better at crawling and rendering JavaScript, so that most JavaScript links today will count.
That said, Google still may not crawl or index every JavaScript link. For one, they need extra time and effort to render the JavaScript, and not every site delivers compatible code. Furthermore, Google only considers full links with an anchor tag and href attribute.
17. If A Page Links To The Same URL More Than Once, The First Link Has Priority
... Or more specifically, only the first anchor text counts. If Google crawls a page with two or more links pointing to the same URL, they have explained that while PageRank flows normally through both, they will only use the first anchor text for ranking purposes.
This scenario often comes into play when your sitewide navigation links to an important page, and you also link to it within an article below.
Through testing, folks have discovered a number of clever ways to bypass the First Link Priority rule, but newer studies haven’t been published for several years.
18. Robots.txt and Meta Robots May Impact How and Whether Links Are Seen
Seems obvious, but in order for Google to weigh a link in it’s ranking algorithm, it has to be able to crawl and follow it. Unsurprisingly, there are a number of site and page-level directives which can get in Google’s way. These include:
The URL is blocked from crawling by robots.txt
Robots meta tag or X-Robots-Tag HTTP header use the “nofollow” directive
The page is set to “noindex, follow” but Google eventually stops crawling
Often Google will include a URL in its search results if other pages link to it, even if that page is blocked by robots.txt. But because Google can’t actually crawl the page, any links on the page are virtually invisible.
19. Disavowed Links Don’t Pass Value (Typically)
If you’ve built some shady links, or been hit by a penalty, you can use Google’s disavow tool to help wipe away your sins.
By disavowing, Google effectively removes these backlinks for consideration when they crawl the web.
On the other hand, if Google thinks you’ve made a mistake with your disavow file, they may choose to ignore it entirely - probably to prevent you from self-inflicted harm.
20. Unlinked Mentions May Associate Data or Authority With A Website
Google may connect data about entities (concepts like a business, a person, a work of art, etc) without the presence of HTML links, like the way it does with local business citations or with which data refers to a brand, a movie, a notable person, etc.
In this fashion, unlinked mentions may still associate data or authority with a website or a set of information—even when no link is present.
Bill Slawski has written extensively about entities in search (a few examples here, here, and here). It’s a heady subject, but suffice to say Google doesn’t always need links to associate data and websites together, and strong entity associations may help a site to rank.
Below, you'll find all twenty principals combined into a single graphic. If you'd like to print or embed the image, click here for a higher-res version.
Please credit Moz when using any of these images.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
All Links are Not Created Equal: 20 New Graphics on Google’s Valuation of Links
Posted by Cyrus-Shepard
Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:
Links on the web count as votes. Initially, all votes are equal.
Pages which receive more votes become more important (and rank higher.)
More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
In this post, we'll walk you through 20 principles of link valuation that have been observed and tested by SEOs. In some cases, they have been confirmed by Google, while others have been patented. Please note that these are not hard and fast rules, but principles that interplay with one another. A burst of fresh link can often outweigh powerful links, spam links can blunt the effect of fresh links, etc.
We strongly encourage you to test these yourselves. To quote Rand, "Nothing is better for learning SEO than going out and experimenting in the wild."
1. Links From Popular Pages Cast More Powerful Votes
Let’s begin with a foundational principle. This concept formed the basis of Google’s original PageRank patent, and quickly help vault it to the most popular search engine in the world. PageRank can become incredibly complex very quickly—but to oversimplify—the more votes (links) a page has pointed to it, the more PageRank (and other possible link-based signals) it accumulates. The more votes it accumulates, the more it can pass on to other pages through outbound links. In basic terms, popular pages are ones that have accumulated a lot of votes themselves. Scoring a link from a popular page can typically be more powerful than earning a link from a page with fewer link votes.
2. Links "Inside" Unique Main Content Pass More Value than Boilerplate Links
Google’s Reasonable Surfer, Semantic Distance, and Boilerplate patents all suggest valuing content and links more highly if they are positioned in the unique, main text area of the page, versus sidebars, headers, and footers, aka the “boilerplate.”
It certainly makes sense, as boilerplate links are not truly editorial, but typically automatically inserted by a CMS (even if a human decided to put them there.) Google’s Quality Rater Guidelines encourage evaluators to focus on the “Main Content” of a page.
Similarly, SEO experiments have found that links hidden within expandable tabs or accordions (by either CSS or JavaScript) may carry less weight than fully visible links, though Google says they fully index and weight these links.
3. Links Higher Up in the Main Content Cast More Powerful Votes
If you had a choice between 2 links, which would you choose?
One placed prominently in the first paragraph of a page, or
One placed lower beneath several paragraphs
Of course, you’d pick the link visitors would likely click on, and Google would want to do the same. Google’s Reasonable Surfer Patent describes methods for giving more weight to links it believes people will actually click, including links placed in more prominent positions on the page.
Matt Cutts, former head of Google’s Webspam team, once famously encouraged SEOs to pay attention to the first link on the page, and not bury important links. (source)
4. Links With Relevant Anchor Text May Pass More Value
Also included in Google’s Reasonable Surfer patent is the concept of giving more weight to links with relevant anchor text. This is only one of several Google patents where anchor text plays an important role. Multiple experiments over the years repeatedly confirm the power of relevant anchor text to boost a page’s ranking better than generic or non-relevant anchor text. It’s important to note that the same Google patents that propose boosting the value of highly-relevant anchors, also discuss devaluing or even ignoring off-topic or irrelevant anchors altogether. Not that you should spam your pages with an abundance of exact match anchors. Data typically shows that high ranking pages typically have a healthy, natural mix of relevant anchors pointing to them.
Similarly, links may carry the context of the words+phrases around/near the link. Though hard evidence is scant, this is mentioned in Google’s patents, and it makes sense that a link surrounded by topically relevant content would be more contextually relevant than the alternative.
5. Links from Unique Domains Matter More than Links from Previously Linking Sites
Experience shows that it’s far better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you.
This makes sense, as Google’s algorithms are designed to measure popularity across the entire web and not simply popularity from a single site.
In fact, this idea has been supported by nearly every SEO ranking factor correlation study ever performed. The number of unique linking root domains is almost always a better predictor of Google rankings than a site’s raw number of total links.
Rand points out that this principle is not always universally true. "When given the option between a 2nd or 3rd link from the NYTimes vs. randomsitexyz, it's almost always more rank-boosting and marketing helpful to go with another NYT link."
6. External Links are More Influential than Internal Links
If we extend the concept from #3 above, then it follows that links from external sites should count more than internal links from your own site. The same correlation studies almost always show that high ranking sites are associated with more external links than lower ranking sites.
Search engines seem to follow the concept that what others say about you is more important than what you say about yourself.
That’s not to say that internal links don’t count. On the contrary, internal linking and good site architecture can be hugely impactful on Google rankings. That said, building external links is often the fastest way to higher rankings and more traffic.
7. Links from Sites Closer to a Trusted Seed Set May Pass More Value
The idea of TrustRank has been around for many years. Bill Slawski covers it here. More recently, Google updated its original PageRank patent with a section that incorporates the concept of “trust” using seed sites. The closer a site is linked to a trusted seed site, the more of a boost it receives.
In theory, this means that black hat Private Blog Networks (PBNs) would be less effective if they were a large link distance away from more trusted sites.
Beyond links, other ways that Google may evaluate trust is through online reputation—e.g. through online reviews or sentiment analysis—and use of accurate information (facts). This is of particular concern with YMYL (Your Money or Your Life) pages that "impact the future happiness, health, financial stability, or safety of users."
This means links from sites that Google considers misleading and/or dangerous may be valued less than links from sites that present more reputable information.
8. Links From Topically Relevant Pages May Cast More Powerful Votes
You run a dairy farm. All things being equal, would you rather have a link from:
The National Dairy Association
The Association of Automobile Mechanics
Hopefully, you choose “b” because you recognize it’s more relevant. Though several mechanisms, Google may act in the same way to toward topically relevant links, including Topic-Sensitive PageRank, phrase-based indexing, and local inter-connectivity. These concepts also help discount spam links from non-relevant pages.
While I've included the image above, the concepts around Google's use of topical relevance is incredibly complex. For a primer on SEO relevance signals, I recommend reading:
Topical SEO: 7 Concepts of Link Relevance & Google Rankings
More than Keywords: 7 Concepts of Advanced On-Page SEO
9. Links From Fresh Pages Can Pass More Value Than Links From Stale Pages
Freshness counts.
Google uses several ways of evaluating content based on freshness. One way to determine the relevancy of a page is to look at the freshness of the links pointing at it.
The basic concept is that pages with links from fresher pages—e.g. newer pages and those more regularly updated—are likely more relevant than pages with links from mostly stale pages, or pages that haven’t been updated in a while.
For a good read on the subject, Justing Briggs has described and named this concept FreshRank.
A page with a burst of links from fresher pages may indicate immediate relevance, compared to a page that has had the same old links for the past 10 years. In these cases, the rate of link growth and the freshness of the linking pages can have a significant influence on rankings.
It's important to note that "old" is not the same thing as stale. A stale page is one that:
Isn't updated, often with outdated content
Earns fewer new links over time
Exhibits declining user engagement
If a page doesn't meet these requirements, it can be considered fresh - no matter its actual age. As Rand notes, "Old crusty links can also be really valuable, especially if the page is kept up to date."
10. The Rate of Link Growth Can Signal Freshness
If Google sees a burst of new links to a page, this could indicate a signal of relevance.
By the same measure, a decrease in the overall rate of link growth would indicate that the page has become stale, and likely to be devalued in search results.
All of these freshness concepts, and more, are covered by Google’s Information Retrieval Based on Historical Data patent.
If a webpage sees an increase in its link growth rate, this could indicate a signal of relevance to search engines. For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes.)
11. Google Devalues Spam and Low-Quality Links
While there are trillions of links on the web, the truth is that Google likely ignores a large swath of them. Google’s goal is to focus on editorial links, e.g. “links that you didn't even have to ask for because they are editorially given by other website owners.” Since Penguin 4.0, Google has implied that their algorithms simply ignore links that they don’t feel meet these standards. These include links generated by negative SEO and link schemes.
That said, there’s lots of debate if Google truly ignores all low-quality links, as there’s evidence that low-quality links—especially those Google might see as manipulative—may actually hurt you.
12. Link Echos: The Influence Of A Link May Persist Even After It Disappears
Link Echos (a.k.a. Link Ghosts) describe the phenomenon where the ranking impact of a link often appears to persist, even long after the link is gone.
Rand has performed several experiments on this and the reverberation effect of links is incredibly persistent, even months after the links have dropped from the web, and Google has recrawled and indexed these pages several times.
Speculation as to why this happens includes: Google looking at other ranking factors once the page has climbed in rankings (e.g. user engagement), Google assigning persistence or degradation to link value that isn’t wholly dependent on its existence on the page, or factors we can’t quite recognize.
Whatever the root cause, the value of a link can have a reverberating, ethereal quality that exists separately from its HTML roots.
As a counterpoint, Niel Patel recently ran an experiment where rankings dropped after low-authority sites lost a large number of links all at once, so it appears possible to overcome this phenomenon under the right circumstances.
13. Sites Linking Out to Authoritative Content May Count More Than Those That Do Not
While Google claims that linking out to quality sites isn’t an explicit ranking factor, they’ve also made statements in the past that it can impact your search performance.
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.” – Matt Cutts
Furthermore, multiple SEO experiments and anecdotal evidence over the years suggest that linking out to relevant, authoritative sites can result in a net positive effect on rankings and visibility.
14. Pages That Link To Spam May Devalue The Other Links They Host
If we take the quote above and focus specifically on the first part, we understand that Google trusts sites less when they link to spam.
This concept can be extended further, as there’s ample evidence of Google demoting sites it believes to be hosting paid links, or part of a private blog network.
Basic advice: when relevant and helpful, link to authoritative sites (and avoid linking to bad sites) when it will benefit your audience.
15. Nofollowed Links Aren't Followed, But May Have Value In Some Cases
Google invented the nofollow link specifically because many webmasters found it hard to prevent spammy, outbound links on their sites - especially those generated by comment spam and UGC.
A common belief is that nofollow links don’t count at all, but Google’s own language leaves some wriggle room. They don’t follow them absolutely, but “in general” and only “essentially” drop the links from their web graph.
That said, numerous SEO experiments and correlation data all suggest that nofollow links can have some value, and webmasters would be wise to maximize their value.
16. ManyJavaScript Links Pass Value, But Only If Google Renders Them
In the old days of SEO, it was common practice to “hide” links using JavaScript, knowing Google couldn’t crawl them.
Today, Google has gotten significantly better at crawling and rendering JavaScript, so that most JavaScript links today will count.
That said, Google still may not crawl or index every JavaScript link. For one, they need extra time and effort to render the JavaScript, and not every site delivers compatible code. Furthermore, Google only considers full links with an anchor tag and href attribute.
17. If A Page Links To The Same URL More Than Once, The First Link Has Priority
... Or more specifically, only the first anchor text counts. If Google crawls a page with two or more links pointing to the same URL, they have explained that while PageRank flows normally through both, they will only use the first anchor text for ranking purposes.
This scenario often comes into play when your sitewide navigation links to an important page, and you also link to it within an article below.
Through testing, folks have discovered a number of clever ways to bypass the First Link Priority rule, but newer studies haven’t been published for several years.
18. Robots.txt and Meta Robots May Impact How and Whether Links Are Seen
Seems obvious, but in order for Google to weigh a link in it’s ranking algorithm, it has to be able to crawl and follow it. Unsurprisingly, there are a number of site and page-level directives which can get in Google’s way. These include:
The URL is blocked from crawling by robots.txt
Robots meta tag or X-Robots-Tag HTTP header use the “nofollow” directive
The page is set to “noindex, follow” but Google eventually stops crawling
Often Google will include a URL in its search results if other pages link to it, even if that page is blocked by robots.txt. But because Google can’t actually crawl the page, any links on the page are virtually invisible.
19. Disavowed Links Don’t Pass Value (Typically)
If you’ve built some shady links, or been hit by a penalty, you can use Google’s disavow tool to help wipe away your sins.
By disavowing, Google effectively removes these backlinks for consideration when they crawl the web.
On the other hand, if Google thinks you’ve made a mistake with your disavow file, they may choose to ignore it entirely - probably to prevent you from self-inflicted harm.
20. Unlinked Mentions May Associate Data or Authority With A Website
Google may connect data about entities (concepts like a business, a person, a work of art, etc) without the presence of HTML links, like the way it does with local business citations or with which data refers to a brand, a movie, a notable person, etc.
In this fashion, unlinked mentions may still associate data or authority with a website or a set of information—even when no link is present.
Bill Slawski has written extensively about entities in search (a few examples here, here, and here). It’s a heady subject, but suffice to say Google doesn’t always need links to associate data and websites together, and strong entity associations may help a site to rank.
Below, you'll find all twenty principals combined into a single graphic. If you'd like to print or embed the image, click here for a higher-res version.
Please credit Moz when using any of these images.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes
Text
All Links are Not Created Equal: 20 New Graphics on Google’s Valuation of Links
Posted by Cyrus-Shepard
Twenty-two years ago, the founders of Google invented PageRank, and forever changed the web. A few things that made PageRank dramatically different from existing ranking algorithms:
Links on the web count as votes. Initially, all votes are equal.
Pages which receive more votes become more important (and rank higher.)
More important pages cast more important votes.
But Google didn't stop there: they innovated with anchor text, topic-modeling, content analysis, trust signals, user engagement, and more to deliver better and better results.
Links are no longer equal. Not by a long shot.
Rand Fishkin published the original version of this post in 2010—and to be honest, it rocked our world. Parts of his original have been heavily borrowed here, and Rand graciously consulted on this update.
In this post, we'll walk you through 20 principles of link valuation that have been observed and tested by SEOs. In some cases, they have been confirmed by Google, while others have been patented. Please note that these are not hard and fast rules, but principles that interplay with one another. A burst of fresh link can often outweigh powerful links, spam links can blunt the effect of fresh links, etc.
We strongly encourage you to test these yourselves. To quote Rand, "Nothing is better for learning SEO than going out and experimenting in the wild."
1. Links From Popular Pages Cast More Powerful Votes
Let’s begin with a foundational principle. This concept formed the basis of Google’s original PageRank patent, and quickly help vault it to the most popular search engine in the world. PageRank can become incredibly complex very quickly—but to oversimplify—the more votes (links) a page has pointed to it, the more PageRank (and other possible link-based signals) it accumulates. The more votes it accumulates, the more it can pass on to other pages through outbound links. In basic terms, popular pages are ones that have accumulated a lot of votes themselves. Scoring a link from a popular page can typically be more powerful than earning a link from a page with fewer link votes.
2. Links "Inside" Unique Main Content Pass More Value than Boilerplate Links
Google’s Reasonable Surfer, Semantic Distance, and Boilerplate patents all suggest valuing content and links more highly if they are positioned in the unique, main text area of the page, versus sidebars, headers, and footers, aka the “boilerplate.”
It certainly makes sense, as boilerplate links are not truly editorial, but typically automatically inserted by a CMS (even if a human decided to put them there.) Google’s Quality Rater Guidelines encourage evaluators to focus on the “Main Content” of a page.
Similarly, SEO experiments have found that links hidden within expandable tabs or accordions (by either CSS or JavaScript) may carry less weight than fully visible links, though Google says they fully index and weight these links.
3. Links Higher Up in the Main Content Cast More Powerful Votes
If you had a choice between 2 links, which would you choose?
One placed prominently in the first paragraph of a page, or
One placed lower beneath several paragraphs
Of course, you’d pick the link visitors would likely click on, and Google would want to do the same. Google’s Reasonable Surfer Patent describes methods for giving more weight to links it believes people will actually click, including links placed in more prominent positions on the page.
Matt Cutts, former head of Google’s Webspam team, once famously encouraged SEOs to pay attention to the first link on the page, and not bury important links. (source)
4. Links With Relevant Anchor Text May Pass More Value
Also included in Google’s Reasonable Surfer patent is the concept of giving more weight to links with relevant anchor text. This is only one of several Google patents where anchor text plays an important role. Multiple experiments over the years repeatedly confirm the power of relevant anchor text to boost a page’s ranking better than generic or non-relevant anchor text. It’s important to note that the same Google patents that propose boosting the value of highly-relevant anchors, also discuss devaluing or even ignoring off-topic or irrelevant anchors altogether. Not that you should spam your pages with an abundance of exact match anchors. Data typically shows that high ranking pages typically have a healthy, natural mix of relevant anchors pointing to them.
Similarly, links may carry the context of the words+phrases around/near the link. Though hard evidence is scant, this is mentioned in Google’s patents, and it makes sense that a link surrounded by topically relevant content would be more contextually relevant than the alternative.
5. Links from Unique Domains Matter More than Links from Previously Linking Sites
Experience shows that it’s far better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you.
This makes sense, as Google’s algorithms are designed to measure popularity across the entire web and not simply popularity from a single site.
In fact, this idea has been supported by nearly every SEO ranking factor correlation study ever performed. The number of unique linking root domains is almost always a better predictor of Google rankings than a site’s raw number of total links.
Rand points out that this principle is not always universally true. "When given the option between a 2nd or 3rd link from the NYTimes vs. randomsitexyz, it's almost always more rank-boosting and marketing helpful to go with another NYT link."
6. External Links are More Influential than Internal Links
If we extend the concept from #3 above, then it follows that links from external sites should count more than internal links from your own site. The same correlation studies almost always show that high ranking sites are associated with more external links than lower ranking sites.
Search engines seem to follow the concept that what others say about you is more important than what you say about yourself.
That’s not to say that internal links don’t count. On the contrary, internal linking and good site architecture can be hugely impactful on Google rankings. That said, building external links is often the fastest way to higher rankings and more traffic.
7. Links from Sites Closer to a Trusted Seed Set May Pass More Value
The idea of TrustRank has been around for many years. Bill Slawski covers it here. More recently, Google updated its original PageRank patent with a section that incorporates the concept of “trust” using seed sites. The closer a site is linked to a trusted seed site, the more of a boost it receives.
In theory, this means that black hat Private Blog Networks (PBNs) would be less effective if they were a large link distance away from more trusted sites.
Beyond links, other ways that Google may evaluate trust is through online reputation—e.g. through online reviews or sentiment analysis—and use of accurate information (facts). This is of particular concern with YMYL (Your Money or Your Life) pages that "impact the future happiness, health, financial stability, or safety of users."
This means links from sites that Google considers misleading and/or dangerous may be valued less than links from sites that present more reputable information.
8. Links From Topically Relevant Pages May Cast More Powerful Votes
You run a dairy farm. All things being equal, would you rather have a link from:
The National Dairy Association
The Association of Automobile Mechanics
Hopefully, you choose “b” because you recognize it’s more relevant. Though several mechanisms, Google may act in the same way to toward topically relevant links, including Topic-Sensitive PageRank, phrase-based indexing, and local inter-connectivity. These concepts also help discount spam links from non-relevant pages.
While I've included the image above, the concepts around Google's use of topical relevance is incredibly complex. For a primer on SEO relevance signals, I recommend reading:
Topical SEO: 7 Concepts of Link Relevance & Google Rankings
More than Keywords: 7 Concepts of Advanced On-Page SEO
9. Links From Fresh Pages Can Pass More Value Than Links From Stale Pages
Freshness counts.
Google uses several ways of evaluating content based on freshness. One way to determine the relevancy of a page is to look at the freshness of the links pointing at it.
The basic concept is that pages with links from fresher pages—e.g. newer pages and those more regularly updated—are likely more relevant than pages with links from mostly stale pages, or pages that haven’t been updated in a while.
For a good read on the subject, Justing Briggs has described and named this concept FreshRank.
A page with a burst of links from fresher pages may indicate immediate relevance, compared to a page that has had the same old links for the past 10 years. In these cases, the rate of link growth and the freshness of the linking pages can have a significant influence on rankings.
It's important to note that "old" is not the same thing as stale. A stale page is one that:
Isn't updated, often with outdated content
Earns fewer new links over time
Exhibits declining user engagement
If a page doesn't meet these requirements, it can be considered fresh - no matter its actual age. As Rand notes, "Old crusty links can also be really valuable, especially if the page is kept up to date."
10. The Rate of Link Growth Can Signal Freshness
If Google sees a burst of new links to a page, this could indicate a signal of relevance.
By the same measure, a decrease in the overall rate of link growth would indicate that the page has become stale, making it likely to be devalued in search results.
All of these freshness concepts, and more, are covered by Google’s Information Retrieval Based on Historical Data patent.
For example, if folks start linking to your personal website because you're about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).
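To make the signal concrete, here's a small toy heuristic (my own, not Google's) that compares a page's recent link velocity to its historical baseline and flags a burst or a slowdown:

```python
# Hypothetical counts of new links earned per month, oldest first
monthly_new_links = [3, 4, 2, 3, 5, 4, 28]  # the last month spikes

baseline = sum(monthly_new_links[:-1]) / (len(monthly_new_links) - 1)
latest = monthly_new_links[-1]

if latest > 3 * baseline:        # arbitrary burst threshold
    print(f"Burst detected: {latest} new links vs baseline {baseline:.1f}/month")
elif latest < 0.5 * baseline:    # arbitrary staleness threshold
    print("Link growth is slowing; the page may be going stale")
else:
    print("Link growth looks steady")
```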
11. Google Devalues Spam and Low-Quality Links
While there are trillions of links on the web, the truth is that Google likely ignores a large swath of them. Google’s goal is to focus on editorial links, e.g. “links that you didn't even have to ask for because they are editorially given by other website owners.” Since Penguin 4.0, Google has implied that their algorithms simply ignore links that they don’t feel meet these standards. These include links generated by negative SEO and link schemes.
That said, there's lots of debate about whether Google truly ignores all low-quality links, as there's evidence that low-quality links—especially those Google might see as manipulative—may actually hurt you.
12. Link Echos: The Influence Of A Link May Persist Even After It Disappears
Link Echos (a.k.a. Link Ghosts) describe the phenomenon where the ranking impact of a link often appears to persist, even long after the link is gone.
Rand has performed several experiments on this, and the reverberation effect of links has proven incredibly persistent, even months after the links were dropped from the web and Google had recrawled and indexed these pages several times.
Speculation as to why this happens includes: Google looking at other ranking factors once the page has climbed in rankings (e.g. user engagement), Google assigning persistence or degradation to link value that isn’t wholly dependent on its existence on the page, or factors we can’t quite recognize.
Whatever the root cause, the value of a link can have a reverberating, ethereal quality that exists separately from its HTML roots.
As a counterpoint, Neil Patel recently ran an experiment where rankings dropped after low-authority sites lost a large number of links all at once, so it appears possible to overcome this phenomenon under the right circumstances.
13. Sites Linking Out to Authoritative Content May Count More Than Those That Do Not
While Google claims that linking out to quality sites isn’t an explicit ranking factor, they’ve also made statements in the past that it can impact your search performance.
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.” – Matt Cutts
Furthermore, multiple SEO experiments and anecdotal evidence over the years suggest that linking out to relevant, authoritative sites can result in a net positive effect on rankings and visibility.
14. Pages That Link To Spam May Devalue The Other Links They Host
If we take the quote above and focus specifically on the first part, we understand that Google trusts sites less when they link to spam.
This concept can be extended further, as there’s ample evidence of Google demoting sites it believes to be hosting paid links, or part of a private blog network.
Basic advice: link out to relevant, authoritative sites when it will benefit your audience, and avoid linking to bad ones.
15. Nofollowed Links Aren't Followed, But May Have Value In Some Cases
Google invented the nofollow link specifically because many webmasters found it hard to prevent spammy, outbound links on their sites - especially those generated by comment spam and UGC.
A common belief is that nofollow links don’t count at all, but Google’s own language leaves some wriggle room. They don’t follow them absolutely, but “in general” and only “essentially” drop the links from their web graph.
That said, numerous SEO experiments and correlation data all suggest that nofollow links can have some value, and webmasters would be wise to maximize their value.
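For reference, the only markup difference is the rel attribute. A quick sketch (hypothetical URLs) for splitting a page's outbound links into followed and nofollowed buckets:

```python
from html.parser import HTMLParser

class RelClassifier(HTMLParser):
    """Separates followed links from nofollowed ones via the rel attribute."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        bucket = self.nofollowed if "nofollow" in rel else self.followed
        bucket.append(href)

page = ('<a href="https://example.com/a">editorial link</a>'
        '<a href="https://example.com/b" rel="nofollow">UGC link</a>')
classifier = RelClassifier()
classifier.feed(page)
print("followed:", classifier.followed, "| nofollowed:", classifier.nofollowed)
```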
16. Many JavaScript Links Pass Value, But Only If Google Renders Them
In the old days of SEO, it was common practice to “hide” links using JavaScript, knowing Google couldn’t crawl them.
Today, Google has gotten significantly better at crawling and rendering JavaScript, and most JavaScript links will now count.
That said, Google still may not crawl or index every JavaScript link. For one, they need extra time and effort to render the JavaScript, and not every site delivers compatible code. Furthermore, Google only considers full links with an anchor tag and href attribute.
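The sketch below (hypothetical markup) mimics that rule with Python's built-in HTML parser: only anchor tags with an href yield a link, while onclick-style pseudo links yield nothing, even after rendering:

```python
from html.parser import HTMLParser

# Hypothetical link markup; only the first is a "full link" (an anchor
# tag with an href attribute) that reliably counts once rendered.
snippets = {
    "full_link":      '<a href="https://example.com/page">Anchor text</a>',
    "onclick_span":   '<span onclick="goTo(\'/page\')">Pseudo link</span>',
    "anchor_no_href": '<a onclick="goTo(\'/page\')">No href</a>',
}

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags, roughly mimicking what a
    crawler treats as a followable link in the rendered HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

for name, html in snippets.items():
    extractor = LinkExtractor()
    extractor.feed(html)
    print(name, "->", extractor.links)
# Only full_link yields a URL; the other two produce empty lists
```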
17. If A Page Links To The Same URL More Than Once, The First Link Has Priority
... Or more specifically, only the first anchor text counts. If Google crawls a page with two or more links pointing to the same URL, they have explained that while PageRank flows normally through both, they will only use the first anchor text for ranking purposes.
This scenario often comes into play when your sitewide navigation links to an important page, and you also link to it within an article below.
Through testing, folks have discovered a number of clever ways to bypass the First Link Priority rule, but newer studies haven’t been published for several years.
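A minimal sketch of First Link Priority (markup and URLs hypothetical): when a page links to the same URL more than once, keep only the first anchor text encountered in document order:

```python
from html.parser import HTMLParser

class FirstAnchorParser(HTMLParser):
    """Records the first anchor text seen for each distinct href."""
    def __init__(self):
        super().__init__()
        self.first_anchor = {}    # href -> first anchor text seen
        self._current_href = None
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href not in self.first_anchor:
                self._current_href = href
                self._buffer = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.first_anchor[self._current_href] = "".join(self._buffer).strip()
            self._current_href = None

page = ('<nav><a href="/pricing">Pricing</a></nav>'
        '<p>See our <a href="/pricing">flexible plans for teams</a>.</p>')
parser = FirstAnchorParser()
parser.feed(page)
print(parser.first_anchor)  # {'/pricing': 'Pricing'} - the richer second anchor is ignored
```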
18. Robots.txt and Meta Robots May Impact How and Whether Links Are Seen
Seems obvious, but in order for Google to weigh a link in its ranking algorithm, it has to be able to crawl and follow it. Unsurprisingly, there are a number of site- and page-level directives which can get in Google's way. These include:
The URL is blocked from crawling by robots.txt
Robots meta tag or X-Robots-Tag HTTP header use the “nofollow” directive
The page is set to “noindex, follow” but Google eventually stops crawling
Often Google will include a URL in its search results if other pages link to it, even if that page is blocked by robots.txt. But because Google can’t actually crawl the page, any links on the page are virtually invisible.
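If you want to check this yourself, Python ships a robots.txt parser in the standard library. A minimal sketch (rules and URLs hypothetical):

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /private/",
])

url = "https://example.com/private/page.html"
if parser.can_fetch("Googlebot", url):
    print("Crawlable: links on this page can be discovered and weighed.")
else:
    print("Blocked: the page might still appear in results via external")
    print("links, but the links *on* it are invisible to the crawler.")
```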
19. Disavowed Links Don’t Pass Value (Typically)
If you’ve built some shady links, or been hit by a penalty, you can use Google’s disavow tool to help wipe away your sins.
When you disavow links, Google effectively removes those backlinks from consideration when it crawls the web.
On the other hand, if Google thinks you’ve made a mistake with your disavow file, they may choose to ignore it entirely - probably to prevent you from self-inflicted harm.
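For illustration, here's a sketch that assembles a disavow file in the documented format (one full URL or domain: rule per line; lines starting with # are comments). The domains and URLs are hypothetical:

```python
# Hypothetical links you've decided to disavow
bad_domains = ["link-farm.example"]
bad_urls = ["http://spam.example/paid-link.html"]

lines = ["# Disavow file - example only"]
lines += [f"domain:{d}" for d in bad_domains]   # disavow an entire domain
lines += bad_urls                               # disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

You'd then upload disavow.txt through Search Console's disavow tool.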
20. Unlinked Mentions May Associate Data or Authority With A Website
Google may connect data about entities (concepts like a business, a person, or a work of art) without the presence of HTML links, much as it does with local business citations, or when determining which data refers to a brand, a movie, or a notable person.
In this fashion, unlinked mentions may still associate data or authority with a website or a set of information—even when no link is present.
Bill Slawski has written extensively about entities in search (a few examples here, here, and here). It’s a heady subject, but suffice to say Google doesn’t always need links to associate data and websites together, and strong entity associations may help a site to rank.
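From the SEO-tooling side (not how Google works internally), here's a rough sketch of how you might scan crawled HTML for brand mentions that aren't wrapped in a link, i.e. unlinked mention candidates:

```python
import re

BRAND = "Acme Dairy"  # hypothetical brand/entity name

def find_unlinked_mentions(html):
    """Return brand mentions that do not occur inside an <a>...</a> element.
    Crude heuristic: strip anchor elements first, then search what's left."""
    without_anchors = re.sub(r"<a\b[^>]*>.*?</a>", " ", html, flags=re.S | re.I)
    return re.findall(re.escape(BRAND), without_anchors)

page = ('<p>We toured <a href="https://acme.example">Acme Dairy</a> last week.</p>'
        "<p>Acme Dairy also won a local award.</p>")
print(find_unlinked_mentions(page))  # ['Acme Dairy'] - the linked mention is excluded
```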
Below, you'll find all twenty principles combined into a single graphic. If you'd like to print or embed the image, click here for a higher-res version.
Please credit Moz when using any of these images.