The Most Active and Friendliest
Affiliate Marketing Community Online!


Sandbox theory: basic principles of SEO, or just slinging sand?


Senior Member - SEO Pro / Global Moderator
There has been much speculation about the Google sandbox theory. The buzz on this topic is so great that most SEO, real estate, mortgage, and internet marketing forums have threads devoted to it. Some SEO "experts" believe the sandbox effect exists because the age of the site and its links have to be grandfathered in. If that is the case, how long and how old do the sites have to be? Is there a standard system for this aging process, or some other method that Google uses?

Others believe that Google has a filter with a pre-determined age at which a site is released from the grip of the poor-ranking slums of search. There is still much to work out with this theory. If these filters exist, how do they decide which sites get caught? Some sites do not appear to be caught by these "filters" at all, while others seem to stay in them for a long time (years, in some cases). Is it an unfair filtering system produced by Google?
There are also some who believe it is a combination of all of these factors, and even a crowd of people who believe you have to pay Google, or "know someone," to get premium spots in natural search positions.

And alongside all of these theories there are people who claim they built a site, received top rankings in three months or less, avoided the "sandbox," and flew right to the top. Are these just isolated cases, or a few sites that fell through the cracks?

I believe the answer is simple, and I am sure I will catch some flak for this post.
Google does updates about every 90 days (though it now looks more like a continuous update). The people with proper on-page work are running a close race; building links is where the appearance of the "sandbox" comes into play.

Let's say you need approximately 400 links from relevant sites to get a top-10 placement, and you are starting from zero. The sites already in the top 10 vary in inbound links (IBLs) from 300 to 500. You start building links, get 500 of them in 30 days, and think you have a top-10 position coming at the next update. But you started 45 days after the previous update and built your links over 30 days, so Google now has only 15 days to find all 500 links you added across the net and get them into its data centers before the update starts. Well, it only finds 200 of them, and you do not make the top 10 of the SERPs.

You now have another 90 days (on average) before the next update. That puts you at 4 1/2 months before you get out of the sandbox, right? WRONG! You still have to consider that the other sites competing for the top positions are steadily building links the whole time.
Now you miss another update because you thought you had enough links for top SERPs, and you are at 7 1/2 months. This is what gives the "effect" to the "theory" of the sandbox.
This process can continue for years. There are also other factors that affect the number of links needed for top SERPs; anchor text is a major one.
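The timeline above can be sketched as a small simulation. To be clear, this is a toy model of the argument in this post, not documented Google behavior: the 90-day cycle, the 400-link threshold, and the crawl-discovery rate are all illustrative assumptions taken from the scenario.

```python
# Toy model of the "sandbox effect" as a crawl-timing problem.
# All numbers are illustrative assumptions from the scenario above,
# not documented Google behavior.

UPDATE_CYCLE = 90          # assumed days between index updates
LINKS_NEEDED = 400         # assumed links required for a top-10 spot

start_offset = 45          # link building starts 45 days into the cycle
links_built = 500          # links created during the campaign
build_days = 30            # days spent building them
crawl_rate = 200 / 15      # assumed links discovered per day (200 in 15 days)

day = start_offset + build_days   # day of the cycle when building finishes
discovered = 0.0
missed_updates = 0

# Wait through update cycles until enough links have been discovered.
while discovered < LINKS_NEEDED:
    next_update = (day // UPDATE_CYCLE + 1) * UPDATE_CYCLE
    discovered += min(links_built - discovered,
                      (next_update - day) * crawl_rate)
    day = next_update
    if discovered < LINKS_NEEDED:
        missed_updates += 1

elapsed = day - start_offset
print(f"~{elapsed} days (~{elapsed / 30:.1f} months) before ranking, "
      f"after missing {missed_updates} update(s)")
```

With these numbers the site ranks about 4 1/2 months in, matching the post's first milestone; if competitors keep raising the link threshold between updates, the wait stretches toward the 7 1/2-month figure described above.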

IMO there is no sandbox, just improper planning on the SEO's or webmaster's part.
Google is no respecter of sites; whoever plays by the rules the strictest wins. It is a program, it does what it is programmed to do, and it ranks sites according to its algorithm. Any time I have been caught in the "sandbox," it was because of the quantity (or lack) of proper links I had indexed by the next update.

Yahoo does updates on an average of every 90 days (Yahoo just announced a slower crawl, so it may not find links as fast now); MSN does them about every two weeks. That would explain why sites climb the SERPs quickly in those engines: they find links and credit them sooner.


Senior Member - SEO Pro / Global Moderator
Really, the basics of SEO are simple. Anyone can do it, and the rewards are worth the effort.


New Member
1) The Sandbox Effect is a theory used to explain certain behaviors observed in some Internet search engines.
2) The Sandbox Effect is the theory that websites with newly registered domains, or domains with frequent ownership or nameserver changes, are placed in a sandbox (a holding area) in Google's indexes until such time as is deemed appropriate, before ranking can commence.
3) Webmasters may notice that their site will only show for keywords that are not competitive. This effect does not apply to new pages unless the domain is in the sandbox.


Senior Member - SEO Pro / Global Moderator
1. "The Sandbox Effect is a theory." The word "theory" means they have no substantial proof, and I have proof and records that show otherwise. Also, a search engine does not "behave"; it runs on a program. It has no feelings and no preferences, it does not get up in a bad mood, and its wife did not make it mad.

What you are suggesting is at the level of artificial intelligence. Google is not there yet.

2. I have taken over 100 "new" domains (less than 30 days old at the start, and only 4 months old when ranking) to the top of Google for competitive words or phrases in one update cycle. It was the type and amount of links used and credited before an update that did it. And at what "time" does Google "deem" it appropriate? How come it only affects a portion of sites? (I might add that, from my records, the ones not "deemed appropriate" do not have enough quality or quantity of links.)

3. A new website will rank for a few random phrases from the date of indexing unless the entire target is very competitive, so that is not valid either. I can build a blog today and rank in Google tomorrow for low-competition or "default" phrases.

I wish I could get this on Mythbusters.


As someone new to the SEO world, I had a hard time learning about the sandbox and how it works. I finally found this great article, and I am learning so fast from this forum already.

Many thanks for that awesome article.



The Sandbox is real, and how long a site stays in it is a function of the target keywords. I made a few pages in a sub-domain of one of my websites about hotels. The pages were indexed but did not turn up in any but the most obscure searches. One year after launch, almost to the day, without my doing anything, the sub-domain was out of the sandbox and I saw a very substantial increase in traffic.


Senior Member - SEO Pro / Global Moderator
That is because of a lack of proper inbound links.

Craigslist can add a new city (sub-domain) and get high PR at the next PR update, and its pages will rank for keywords instantly and also get a heavy crawl.

That is because all the other CL sub-domains link to it instantly, and they are all PR4-or-higher index pages linking from their respective homepages.

There was a recent article about the big companies dominating with this exact same method.

You are right that your sub-domain will not rank until you beat your competition with proper links and proper on-page work.

Subdomains are treated as new domains. But I can show you over 50 sites right now, with competitive searches at state, national, and international levels, that are under 6 months old and on the first page of Google and Yahoo.

If you look at the structure of CL, you will see over 200 inbound links from the other CL sub-domains, so a new city (sub-domain) instantly has clout because it instantly has over 200 PR5 related links, not counting other links.

Go to any state in CL and look for the cities listed as "New," and you will see that they rank just as well as any other CL sub-domain if you post an ad with a specific keyword.

How come they are not sandboxed?

We have already established, from your own post, that a sub-domain is a new URL.

The reason your new sub-domain did nothing is purely a lack of good links. I would bet money that if you bookmarked a new sub-domain on about 100 social bookmarking sites right away, you would start gaining traffic from Google within a few days, as those are one-way links from authority sites.

Anchor text also affects the rankings.

The sandbox is nothing more than a "theory," just like duplicate content.