Why Does It Take So Long To Rank In Google?
When you write some content and publish it to your blog, it would be pretty cool if the rankings and traffic started rolling in immediately.
Right after hitting publish is when I am usually most interested in a blog post, and unfortunately, that is when it has the least impact and shows the fewest results.
Good things take time, and that certainly applies to Search Engine Optimization (SEO) and blogging.
If you are running a big blog with a high Domain Authority, sometimes your posts can get indexed and ranked by Google within a day. Unless you are running one of those big blogs, it is going to take you a bit longer than that.
So why does it take so long to rank your blog posts in Google?
It has to do with Google's algorithm, and nobody but Google knows exactly how that works or what it takes into account.
All I am about to say is speculation, but then, almost everything written about SEO, other than what is written by Google, is speculation to some degree.
I believe they require all content to age for a while before they take it seriously and give it a chance at ranking in the SERPs.
Googlebombing / Spam
I think they do that as a measure against web spam and the actions of spammers. There is a concept called Googlebombing or linkbombing in which lots of people who run websites (webmasters) and have the ability to link out to other websites work together to manipulate the Google search results for certain keywords. They do this by linking to a certain page with the same or similar anchor text from lots of different websites. Google's algorithm then determines that the page with lots of new links with similar keyword-rich anchor text must be a good fit and search result for that anchor text.
This is generally how Google works at a very high level, and it usually works great when there is not a large conspiracy of influencers trying to play with the results.
In a couple of famous googlebombing cases, the targets were United States presidents. Webmasters working together to pull a large scale prank were able to get a biography of President George W. Bush to show up as the top result for the search query "Miserable Failure". More recently, similar tactics were used to get photos of President Donald Trump to show up for the search term "Idiot".
Often, in addition to the links and anchor text, there is some social media aspect, such as a related Reddit post receiving lots of upvotes. This can help the spammers and pranksters achieve their goal because social signals are widely believed to be among the inputs to Google's ranking systems (PageRank / RankBrain).
Even in high profile cases such as these, Google's approach is not to manually alter the search results, but rather to try to improve the algorithm so that the manipulation would not work going forward.
I think that one of the main reasons why it takes so long to get your blog post ranked in Google is because they have built in an aging filter to prevent this sort of manipulation.
Aged Content Bias or Delayed Ranking as a Manipulation Deterrent
Whether the aim is to rank an irrelevant site for a specific keyword as a joke, or to rank poor-quality content whose only goal is to make money selling stuff (known as spam), the people behind these attempts are often only willing to devote a relatively small amount of time to the project.
Especially people pushing spam. There are lots of ways to make legitimate money by creating value for users, but they often require a significant time investment. Spammers tend to be people looking for shortcuts. They want to make more money than they have earned. They want to provide as little value as possible while maximizing their returns.
Because their bias is towards short-term thinking, anything which would make their plan take too long would make the whole thing infeasible for them. In the earlier days of search there was a concept known as Rank and Bank or Churn and Burn.
Both describe the same thing.
The idea was to use Blackhat SEO techniques (those which Google advised against and would penalize you for if they caught you) to quickly rank a site, then make as much money as quickly as possible with that traffic. You had to be fast because Google would notice the footprint of the blackhat techniques and either lower the rankings or totally remove the site from the index, also known as de-indexing the site.
Often the blackhat technique would involve using software that automatically created thousands of backlinks that were not genuinely earned editorially. This would provide enough of a quality signal to the PageRank algorithm that it would deem the site with all these new backlinks a good result for users, and its pages would start to rank highly.
So speed of results was key to incentivizing spammers. They were short-term thinkers, and they were able to get quick results.
I believe Google has been delaying rankings to make any attempt at improving rankings for a site take so long to see results that the only people who would put in the work required would be those confident in the longevity of the site. People building quality sites on good content are willing to put in lots of time up front without seeing any results because they are confident that once they begin to see results, they will keep those results. They are not doing anything that would provoke Google to punish their sites.
Spammers, on the other hand, would be unwilling to put in the work because, even if the rankings do eventually come, the site will be penalized and they will lose their now massive time investment. Making the investment take too long makes the grift seem less appealing and would therefore prevent lots of these spammers from even starting.
Google Sandbox
This time delay of rankings has sometimes been called the Google Sandbox. The Sandbox is a period of time, while a site is brand new, during which it does not rank well for even the lowest-competition keywords.
It seems like Google tries to keep the site in isolation while it studies it and its changes over a period of time. It wants to see how the site behaves and grows. It wants to make sure that it is not serving spam, malware, or low-quality content.
When Google engineers were asked directly about the existence of the sandbox, they said they don't have such a thing built into the algorithm, but they also said that it takes time and a sufficient volume of context (amount of content and hyperlinks to the content) to determine the meaning, topical relevance, and value of a site and its content. So, while they officially said no such sandbox exists, they have said they don't rank sites until they understand their content and importance, and that does take at least some time. Sounds a little like an admission that a sandbox exists.
How to Get Out of the Google Sandbox and Start Ranking
There seem to be a few ways out of the sandbox, or at least to increase your chances of good rankings.
The factors seem to be time, backlinks, and volume of content.
Time
The first way to escape the sandbox and start accumulating some decent rankings is to just wait until Google lets your site out. It does seem that, even if you take no specific action to improve the performance of a blog post, you eventually start to see rankings, however low.
For me, on this site, it seems like a year is the magic amount of time. I wrote a couple posts a little over a year ago that I felt were very high quality and useful, but they got no love from the search engines. They received next to no traffic.
Around 9 months after publishing, those articles started getting some minor rankings and traffic. After a full 12 months, there seems to be a lot more happening with them.
They still are not getting all that much traffic, but I can see the keywords they are showing up for in my Google Search Console reports, and the number of keywords that are generating impressions for those posts is growing quickly. I think one of them is ranking for over 600 keywords.
Of course, some of those keywords only have a single impression, but it is still exciting to see such growth for something that I wrote over a year ago. And it really seems like time was the only thing that changed. Waiting for Google to accept my post was a strategy that worked for me in that case.
Advice from those working in SEO professionally often mentions that it takes around 60-90 days for content to start getting traction in the search engines.
Depending on the competitiveness of some keywords, it can take even longer. Maybe sometimes 12-24 months.
Waiting it out might be the least effective approach to seeing results, but it does seem that there is something to it.
I think that Google might either be purposely delaying rankings for sites that have not earned their trust yet to prevent spam, or they are collecting data on the site to determine what it should rank for.
Google looks at things like click-through rate (CTR), dwell time, time on site, bounce rate, and pogosticking. These are things that Google measures by testing an article in the SERPs for a little bit. It seems like Google will randomly choose to turn up your rankings on a blog post to see how it performs with users.
The tests usually don't last long. Depending on how much traffic the keywords attract, it could be just a few minutes, or sometimes a few hours. You will see spikes in your traffic in your analytics, and these tests can be why. If Google tries your article in the results and people click on it like crazy, it might consider the page an appropriate and relevant result for that search term.
If people stay on the site for several minutes and don't click back to the results page to find another blog post, then Google could reasonably take that to mean the information on the page was of high quality and sufficiently answered the searcher's query.
This sort of testing happens automatically and constantly, and the results are analyzed and incorporated by the search algorithm. Because of the number of posts that must be tested this way, it can take months or years for Google to learn enough about a page to be confident about what it should rank for and how highly.
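To make that idea a little more concrete, here is a toy simulation of what one of those SERP tests might look like. Everything in it is invented: the engagement model, the thresholds, and the promotion rule are my guesses at the general shape of such a test, not anything Google has published.

```python
import random

# Toy simulation of a short SERP test. The thresholds, promotion rule,
# and engagement model are all invented guesses, not Google's actual logic.

def run_serp_test(impressions, true_relevance):
    """Show a page for `impressions` searches and collect signals.

    `true_relevance` (0 to 1) drives how often simulated users click
    and how long they stay. Returns (click-through rate, avg dwell time).
    """
    clicks, total_dwell = 0, 0.0
    for _ in range(impressions):
        if random.random() < true_relevance:  # the user clicks the result
            clicks += 1
            # relevant pages hold attention longer (seconds on page)
            total_dwell += random.uniform(30, 300) * true_relevance
    ctr = clicks / impressions
    avg_dwell = total_dwell / clicks if clicks else 0.0
    return ctr, avg_dwell

ctr, dwell = run_serp_test(impressions=200, true_relevance=0.6)
# A made-up promotion rule: strong clicks and dwell earn a better rank.
if ctr > 0.3 and dwell > 60:
    print(f"Promote the page: CTR={ctr:.0%}, avg dwell={dwell:.0f}s")
else:
    print(f"Keep testing: CTR={ctr:.0%}, avg dwell={dwell:.0f}s")
```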
While I don't know for sure, I think this is part of why it can take a while to start to rank a new site and why at some point you do start to rank while having seemingly done nothing to change things.
Backlinks
Another way to rank faster is by attracting or building lots of backlinks. Backlinks to a blog post or site can indicate that the destination of the link is a good source vouched for by the person posting the link.
If you get one or two links, it might not tell Google all that much about the site or its quality.
But if you get dozens, hundreds, or even thousands of links from many different websites or referring domains, then it can say quite a bit about the perceived quality of that post or site.
A site with lots of links pointing at it can be considered to have accrued and earned trust, and Google might choose to rank it higher in the SERPs.
Each link will also have anchor text, which is the text that, when clicked, takes someone to the destination of the link. It is often blue and underlined. What people choose as the anchor text for a link to a resource is very telling of what they think of that site's content.
People will generally try to make the anchor text descriptive of what their readers will find if they click through that link.
If lots of different websites are linking to a blog post with the anchor text "Copywriting Tips", then Google would have good reason to assume that the blog post is about copywriting tips and that it should rank for keywords related to copywriting tips.
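Here is a minimal sketch of that intuition in Python. The backlinks and anchor texts are made up for the example; the point is just that when many independent links agree on a phrase, that phrase becomes a strong hint about the page's topic.

```python
from collections import Counter

# A minimal sketch of the anchor text intuition. The backlinks below
# are invented; when many independent sites agree on a phrase, that
# phrase is a strong hint about the linked page's topic.
backlinks = [
    {"source": "siteA.com", "anchor": "Copywriting Tips"},
    {"source": "siteB.com", "anchor": "copywriting tips"},
    {"source": "siteC.com", "anchor": "tips for copywriters"},
    {"source": "siteD.com", "anchor": "copywriting tips"},
]

anchor_counts = Counter(link["anchor"].lower() for link in backlinks)
topic, votes = anchor_counts.most_common(1)[0]
print(f"Inferred topic: '{topic}' ({votes} of {len(backlinks)} links agree)")
```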
Another way backlinks can help Google understand an article is by comparing the content of the page that posted the link to the content on the destination page.
You can picture it like a Venn diagram (the figure with two partially overlapping circles that shows what the two sets have in common). Whatever is written about in both articles is likely to be the topic of the linked article. Sometimes not. Sometimes a person will link to something to make a secondary or ancillary point.
It would be like if you were trying to explain how buildings are engineered to withstand hurricanes, described the resulting buildings as being as "sturdy as a rhino", and then linked to a page about how tough rhinos are. Just because the source article mentioned the word "rhino" and the linked-to page is about rhinos does not mean that the source article is about rhinos.
But pretty often pages link to other pages that actually are about the same thing being written about in the source article.
So in the same example, that page might link to another page that talks about reinforced buildings meant for hurricane-prone regions. In that case, the overlapping topic in the Venn diagram would be spot on in telling Google what the page is about.
Even more often, if a page links out to lots of pages, whatever is the related topic of all of the pages or most of the pages is what the source page is about.
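If you wanted to put a rough number on that overlap, one crude approach is to compare the word sets of the two pages, as in this toy sketch. Real search engines use far more sophisticated language analysis; this only illustrates the Venn diagram idea.

```python
# A crude way to score the overlap: compare the word sets of the two
# pages. No stemming here, so "hurricanes" and "hurricane" count as
# different words; this only illustrates the Venn diagram idea.

def word_set(text):
    return set(text.lower().split())

linking_page = "how engineers design buildings to withstand hurricanes using reinforced concrete"
linked_page = "reinforced concrete buildings for hurricane prone coastal regions"

a, b = word_set(linking_page), word_set(linked_page)
overlap = a & b                      # the middle of the Venn diagram
jaccard = len(overlap) / len(a | b)  # similarity score between 0 and 1
print(f"Shared words: {sorted(overlap)}")
print(f"Jaccard similarity: {jaccard:.2f}")
```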
All of these backlink factors (number of links, anchor text, and context of linking sites) can plausibly speed up the time required to rank an article because they give Google the information it needs to understand and trust the content. This is the same type of information Google would otherwise have to gather through its own testing.
By generating lots of highly relevant backlinks, either through outreach or other methods, you can significantly cut down the length of time it takes for your content to rank. And the effect of backlinks is cumulative for a site, so if you get a lot of links to your site, your future content will find it easier to rank more quickly. This is due to the site's improved Domain Authority.
Volume of Content
The content posted on a site is also very important to the time it takes Google to let you out of the sandbox.
If you only have a few articles posted on your site, then Google will have a difficult time knowing what the site is about.
The more you write on your site, the better Google can tell what your site is about.
If you write just one article about gardening, it might not be so sure that you are an expert resource on gardening. If, however, you have written over a dozen articles about gardening, it might begin to think of you as someone to be trusted in the area of gardening.
It can also be one of those things that sorts the real content creators from the spammers. Content takes a long time to create, and it can be very expensive to commission. Spammers will be unlikely to invest so much in content just to overcome the sandbox. A legitimate content creator will continue creating content until Google starts to trust them.
Outbound links can also tell Google about your content. Again, as was discussed in the previous section on backlinks, what a page links to often tells readers and search engines what the page is about.
Google has also described the entire internet environment as consisting of "good neighborhoods" and "bad neighborhoods". The former are sites that are high quality and authoritative, and the latter are sites of poor quality that lead to bad user outcomes and are often spam.
If your content tends to link out to good neighborhoods, then Google might infer that your site is part of the good neighborhood and should be trusted. If your content sends a lot of links into the bad neighborhood, then it might not be so quick to trust you.
Some SEO experts say that the sweet spot for volume of content is around 55-75 published posts. They have found through their experience working on different client sites that once you accumulate around 50+ posts, Google will see your site as having crossed a trust threshold and it will start to rank your articles better. You have proved your worth by sticking it out to 50 posts and so they start to show your content to more searchers.
That is clearly not a hard number. It is an approximate range that seems to apply to most sites. There are almost certainly niches and verticals where the number would be much higher, and some where it would be lower. But this general number and the associated advice would encourage you to push as quickly as you can to write and publish around 50 blog posts, because you will see some better results after that point.
I have wondered whether it is all about the number of posts or more about total word count on an entire site. If you have 50 blog posts that were all really short, I don't think that would do as much for your site's rankings and traffic as 50 really meaty posts. And I think that if you wrote really long and detailed posts, that you might get results with less than 50 posts.
Lots of articles that try to answer the question "How many blog posts do you need to rank in Google?" seem to support the last point I made about total word count, but without actually mentioning word count. One way they make this point is by pointing to an outlier case of a site that has a relatively small number of articles (early on it was only about 40 posts, but now the site has over 100) but with a huge amount of traffic. The site I've most often seen referenced in this way is Brian Dean's Backlinko, known particularly for the Skyscraper Technique.
That site posted only 30-40 articles, but each was a massive resource guide. Each article was thousands of words long, some even over 10,000 words. Because so many words on so few pages were able to break the site out of the sandbox, it makes me believe that Google was more concerned with the total length of content on the website and not so much with how many pages it was spread out over. So, writing more, wherever it is written, will help you rank better. And trying to take the shortcut of writing 50 miniature articles won't work.
Without any evidence or good reason to believe so, I wonder if the word count at which Google takes your site seriously is 50,000 or 100,000 words. That would certainly be enough to display context and expertise.
The WordStream blog adds another layer to this issue. They talk about increasing the total word count of a website in order to increase the number of keywords you are eligible to rank for. You don't always need a keyword included in your content verbatim in order to rank for it, but lots of times you do. If you never use a keyword phrase in your content, you are very unlikely to rank for it. So writing more words, more paragraphs, allows you to naturally use lots of different phrasings and related keywords, each of which, once used, you become eligible to rank for.
By having more keywords used, even those you were not targeting and did not intentionally use, there are more chances for your posts to generate rankings and traffic. And even if each keyword only brings you a few clicks a month, ranking for 1,000 keywords at two or three clicks each adds up to a few thousand visits a month.
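You can see the effect with a little counting. The sketch below pulls every unique one- to three-word phrase out of a piece of text, and the longer the text, the more candidate phrases it naturally contains. The sample posts are made up for the example.

```python
# A counting exercise for the WordStream point: longer content naturally
# contains more distinct phrases, and each phrase is a potential keyword
# the page could rank for. The sample posts are made up.

def candidate_phrases(text, max_len=3):
    """Collect every unique 1- to `max_len`-word phrase in the text."""
    words = text.lower().split()
    phrases = set()
    for n in range(1, max_len + 1):
        for i in range(len(words) - n + 1):
            phrases.add(" ".join(words[i:i + n]))
    return phrases

short_post = "basic gardening tips for beginners"
long_post = (short_post + " including soil preparation watering schedules"
             " and pest control for small vegetable gardens")

print(len(candidate_phrases(short_post)))  # a handful of distinct phrases
print(len(candidate_phrases(long_post)))   # several times as many candidates
```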
So, the final way I believe you can get quicker results in ranking your blog post involves writing more content. The more content you have published, the better your chances of ranking highly and bringing in traffic.
Critical Authority Threshold (CAT)
I wanted to mention one more concept here that I think is very relevant. This is a concept put forth by Nat Eliason on his Growth Machine content marketing agency's blog.
The way he explains it is that as you write more content targeting more keywords, you are increasing your Potential Traffic. Add up the monthly search volume of every keyword you target, and you get a ceiling on how much traffic you could be generating with your site. But even though you are writing more content and hitting more keywords, you just don't get any improved results. Or what results you do get are much smaller than you would expect.
This will happen until you cross a Critical Authority Threshold, or CAT. This is a term of his own coining, not something used by Google, but it seems very fitting. The idea is that Google is not sure whether to trust your site as a relevant authority in the beginning. You write articles, and Google just sits on them. They are crawled, indexed, and then little to nothing else happens. At some point, Google decides it can trust your site and it starts to increase your rankings dramatically. It takes you from nearly nothing to much closer to your Potential Traffic almost all at once.
The increase in rankings and traffic is not a steady linear change as you publish more content. It is slow and low, then exponential.
The exact point at which Google decides it can trust you is the CAT for your site.
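Here is a made-up worked example of the shape Nat describes: Potential Traffic as a ceiling, and a trust threshold below which you capture almost none of it. The keywords, search volumes, and threshold are all invented for illustration.

```python
# A made-up worked example of Potential Traffic and the CAT. Every
# keyword, search volume, and threshold below is invented purely to
# illustrate the step-change shape Nat describes.

keywords = {
    "raised bed gardening": 4400,    # hypothetical monthly searches
    "compost tea recipe": 1900,
    "tomato companion plants": 2900,
}

# Potential Traffic: the ceiling you could reach if you ranked well
# for every keyword you target.
potential_traffic = sum(keywords.values())
print(f"Potential monthly traffic: {potential_traffic}")

CAT = 0.6  # hypothetical trust threshold on a 0-to-1 "authority" scale

for authority in (0.2, 0.4, 0.6, 0.8):
    # Below the threshold you capture almost nothing; at or above it,
    # traffic jumps toward the ceiling instead of growing linearly.
    share = 0.7 if authority >= CAT else 0.02
    print(f"authority={authority:.1f} -> ~{int(potential_traffic * share)} visits/month")
```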
Nat says that two things can help you to cross the CAT and get the traffic you deserve.
One is patience. You simply need to wait out Google's waiting period. It might have to do with that testing they perform on your articles when shown in the SERPs. It might be something else. But whatever it is, if you just persevere, you will eventually cross the CAT.
The other is backlinks. Backlinks can speed up your site reaching its CAT. Generate enough relevant backlinks and you could cross your CAT in a week or two.
Nat does not mention the volume of published content as helpful in crossing the CAT. I believe it might help speed things up. The more content you publish, the more material Google has to learn from, and the more of an expert it might consider you to be.
I mean, even if you are just a rambling idiot who doesn't have anything important to say on a topic, if you write enough about it, those writings must be worth showing to interested searchers at some point. Just by sheer volume, devoid of quality, you would have written on a topic more than 90% of people have. Most people don't write anything publicly, so having some volume of content in a specific subject area must make it worth something.
Whether the increased volume of content is a direct cause for Google's improved impression of your site's credibility is unknown. It could easily be that by having more content written you just increase the chances that other sites link to your content, and that the links are what really make a difference. If you keep writing content, at some point, it seems almost statistically impossible for nobody to have linked to you. Keep writing and it seems inevitable that you will attract links and rankings.
Conclusion
I think that Google intentionally delays a site's keyword rankings and traffic.
It is likely to be a method to deter spammers and people looking to manipulate the search engine's results.
It could also be that Google is trying to figure out what your site is all about and how good the information is, and that process takes time. This is almost certainly part of the reason it takes so long to achieve worthwhile ranking results; it is what Google itself says is happening during the period that people refer to as the Sandbox.
Google's ultimate goal is to provide the best quality results for users. User experience is important to Google. By taking things slowly, Google is probably avoiding making lots of silly mistakes. Mistakes like providing garbage content and spam in the SERPs.
As frustrating as it can be to dutifully create great content to attract a readership to your brand or business and see no results for so long, it is ultimately a good thing to ensure quality results for all.
Even search engine marketers use Google to find answers to questions, and they would not like it if crappy marketing copy that didn't help users at all made it to the top of the SERPs.
Stay determined and drive through to results. Keep writing. Keep publishing. Eventually you will get ranked and bring in traffic.
Do you believe that these are the most important things Google looks at? Do you know of a way to get out of the Google Sandbox any quicker? Do you think Google is approaching this issue the right way (low and slow)? Do you think the SERPs would be improved if new sites and new articles could rank more quickly? Would lowering the resistance to new publishers (allowing them to achieve ranking and traffic results more quickly) encourage even more people to create content for the internet and improve everyone's access to great and diverse information? Would the world be better off if more or fewer people blogged and created content? Let me know in the comments!