There was recently a huge leak from Google that detailed much of how their search algorithm works, and it was quite eye-opening. One of the best overviews of the leak comes from Rand Fishkin, and you can read it here.
Digging in, he had a few very insightful thoughts that I wanted to share here.
Big brands matter
He had a few thoughts about building a big brand in order to rank well in Google. Here is one:
“If there was one universal piece of advice I had for marketers seeking to broadly improve their organic search rankings and traffic, it would be: ‘Build a notable, popular, well-recognized brand in your space, outside of Google search.’”
A related thought is a bit more glum, and essentially says to put your efforts elsewhere if you can’t be seen as a major brand.
“The content you create is unlikely to perform well in Google if competition from big, popular websites with well-known brands exists. Google no longer rewards scrappy, clever, SEO-savvy operators who know all the right tricks. They reward established brands, search-measurable forms of popularity, and established domains that searchers already know and click.”
Clicks matter, and Google lied
A theme that runs through Rand’s piece, and through others I’ve read, is that Google has been dishonest with us for years. That may be somewhat defensible (trying to keep spammers from knowing everything), but it’s undeniable that Google flat-out lied about various parts of how their algorithm works.
A major one is clicks in the search results. Google has long downplayed their importance (denying that clicks were used at all until 2019, I believe), but it seems that clicks can matter a lot. Here is a scenario about that from Rand:
Let’s say, for example, that many people in the Seattle area search for “Lehman Brothers” and scroll to page 2, 3, or 4 of the search results until they find the theatre listing for the Lehman Brothers stage production, then click that result. Fairly quickly, Google will learn that’s what searchers for those words in that area want.
Even if the Wikipedia article about Lehman Brothers’ role in the financial crisis of 2008 were to invest heavily in link building and content optimization, it’s unlikely they could outrank the user-intent signals (calculated from queries and clicks) of Seattle’s theatre-goers.
Extending this example to the broader web and search as a whole, if you can create demand for your website among enough likely searchers in the regions you’re targeting, you may be able to end-around the need for classic on-and-off-page SEO signals like links, anchor text, optimized content, and the like.
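To make the idea concrete, here is a minimal, purely hypothetical sketch of how a regional click signal could outweigh classic link authority in a re-ranking step. This is not Google’s actual system; the URLs, weights, and click-through rates below are invented for illustration, and the blending formula is just the simplest thing that shows the effect Rand describes.

```python
# Toy illustration only -- NOT Google's real ranking system.
# It sketches Rand's scenario: strong regional click behavior can push a
# low-authority page above a high-authority one for searchers in that region.

from dataclasses import dataclass


@dataclass
class Result:
    url: str
    link_score: float  # stand-in for classic link/authority signals, 0-1


# Hypothetical click-through rates observed for the query "lehman brothers"
# from searchers in the Seattle area.
regional_ctr = {
    "wikipedia.org/wiki/Lehman_Brothers": 0.05,
    "seattle-theatre.example/lehman-trilogy": 0.62,
}


def rerank(results, ctr_by_url, click_weight=0.7):
    """Blend a static authority score with a regional click signal.

    The click_weight of 0.7 is arbitrary, chosen only to make the
    behavioral signal dominate in this toy example.
    """
    def blended(r):
        ctr = ctr_by_url.get(r.url, 0.0)
        return (1 - click_weight) * r.link_score + click_weight * ctr

    return sorted(results, key=blended, reverse=True)


if __name__ == "__main__":
    results = [
        Result("wikipedia.org/wiki/Lehman_Brothers", link_score=0.95),
        Result("seattle-theatre.example/lehman-trilogy", link_score=0.20),
    ]
    for r in rerank(results, regional_ctr):
        print(r.url)
    # The theatre listing comes out on top for Seattle searchers,
    # despite Wikipedia's far stronger link profile.
```

The point of the sketch is only the shape of the argument: once observed searcher behavior is weighted heavily enough, no amount of link building on the other result changes the regional outcome.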
All in all, it’s a massive leak with huge implications, and I encourage you to read Rand’s piece and follow his links to other sources with more info. The day-to-day practice of search engine optimization won’t change drastically, but these new insights certainly shape how all of us think about Google’s algorithm and behavior.