Forum Moderators: open
"Google doesn't like this, Google doesn't like that..." and there you go. A vast percentage of threads are started by somebody who dropped three positions in the Google SERPs and then builds a theory out of thin air.
There is no proof (read: official word from Google) about sandboxing, AWS or affiliate sites, AdWords publishers not ranking high, sites being wiped out for over-optimization, large sites not being "liked" by Google, and the list goes on.
Why are so many webmasters so fond of these theories? Isn't that some kind of "over analysis"?
Of course I do some analysis myself, and I understand that Google changes its algorithm and makes it confusing to follow, but the basic rules haven't changed: build a good site, get some relevant links, and don't behave badly with any search engine. Spend more time looking for all the traffic opportunities you have on the web. A good link from a good website is a gold mine. And it is not that difficult.
I'm not telling you to forget about PageRank, link exchanges, or deep analysis. That's good. But if things change, don't start blaming Google; do some research first. Your competition is working as hard as you do. You are not alone.
How can you start a project believing that Google doesn't like big sites? Tell that to the Encyclopaedia Britannica... Or Microsoft, or Slashdot, or Sitepoint, or Adobe, or About.com, or...
It is not my intention to be flamed, but stop and think about this for a minute.
You are on the right track. Focus on the content and links, and everything else will take care of itself.
But seriously: basically you are right! But where does this need for over-analyzing SERPs come from? Given the current search engine market, a lot of people's businesses depend on Google. They see their traffic go down and don't know why. They are desperately looking for an explanation and Google is not giving it to them... This is the best fertilizer for rumors and theories...
2. Google came about with the intention of being the most accurate search facility for the content of documents. The two founders can take pride that this was achieved.
3. A guy who makes and sells widgets was accustomed to advertising in the local press, theme magazines, local radio and TV, and, in countries as opposed to continents, even national TV and radio. Nowhere in the Google concept did they set out to enable such a business to avoid those marketing costs by employing a computer nerd, like you or me, to get a website on the first page of G.
4. In my opinion, based on a brief analysis of the SERPs for documents based on the original concept, nothing has changed, despite an average of 5 changes in the G algo per month over the last 15 months.
5. What has changed is that the algo has gradually removed from the SERPs those pages created by a webmaster rather than a university professor, pages massaged with every trick known to man, including those found by posting on a forum and asking, "Do you think I can get away with this?"
6. One would hope that G keeps pursuing this aim until the SERPs contain only those pages that are based on content, rather like a photocopy of a document, and none of those for which the code behind the page might be the basis for selection. In other words, that G excludes pages created by a brilliant webmaster rather than by an authority who merely asked someone to "put his page on the net".
In my opinion, naturally
I think the main problem is that you get individuals who like to set the cat among the pigeons. They like to confuse new webmasters and fellow SEOs. After all, when you can feed your competitors a bunch of BS through a forum, it can be quite a good tool for getting the upper hand when they start to listen.
I have always felt people over-analyze out of extreme frustration, as dealing with the engines isn't always easy or pleasant. A lot of the time you're throwing stuff up against the wall and seeing what sticks. I've always been able to stay constant despite many extreme highs and some downs.
Because starting with theories is the first step toward arriving at the truth. The next step is to check whether the theory is consistent with the reality of Google. For example, if the theory is that Google doesn't like big sites, the fact that Microsoft, Slashdot, Sitepoint, Adobe, and About.com are doing well in Google is good evidence that the theory is false.
Exactly. Plus, in competitive categories it's often a game of inches. Understanding the nuances, ebbs, and flows of the SEs over time (as much as it's possible to do) can make a tremendous difference. To the point that I have often been surprised how simple, well-informed little tweaks can make meaningful differences in rankings, CTR, and ROI.
Whether it be speculation or not, something has happened which is beyond my control.
The Sandbox may be speculation, but whatever it is or is called, it undeniably exists.
Granted there can be too much theory but many of these things do exist.
Why are so many webmasters so fond of these theories?
Because webmasters running multiple sites, or under pressure from higher-ups to deliver fast results, can't take their own time and ignore everything else. They need some theory or some definitive trend that can deliver results in the shortest time. That's the reality.
After leaving G and going out on my own… I have always taken that “other side of the fence” mentality with me whenever reading theories & posts from others, or when devising my own. Sometimes the simple & obvious are the best kept secrets.
Sometimes I think they change the algo just for the sake of change, just to keep everyone guessing and maybe to make some lower ranked sites rise to the top.
If I take off my SEO hat, I have to admit that there are lots of good, relevant sites on pages 2, 3, 4, and beyond, but since their owners don't know SEO, they are not seen.
Maybe Google is trying to keep to their original mandate of "Don't be evil" and give the little guy a chance instead of some big corporate site that can hire an SEO army.
Sounds more like socialism than capitalism to me ;-)
Sometimes the simple & obvious are the best kept secrets.
Hear, hear. The simplest explanation is my favorite; that's a classic there's no need to update. Especially when the simplest explanation explains more than a collection of complicated ones. After that you can sort of start seeing what's happening from a bigger picture. In this case it's probably two simple explanations working hand in hand: one on the business side, one on the engineering side.
A while ago, when we all knew the Earth was the center of the universe, some guys came along and said, "Hey, the sun is the center of the solar system," and everyone said, "You're crazy, you don't know what you're talking about." They did this for different reasons: some had a direct interest in keeping the other theory hanging in there, some were stubborn, some really believed in the more complicated theories. But the simpler explanation won out; it explained things better and more completely, and the older one was just getting too complicated, with all those orbits twisting around each other. It just got confusing...
Could it possibly be that in your sector Google is keeping a closer eye on things, as more and more individual lawyers and full-fledged legal firms look to exploit the search patterns of those in need of legal help, and people flood this sector and either bid high PPCs or assault the SERPs?
This would be a natural trend, and you can bet that Google looks at volume trends per sector to see how they can either better serve that sector or curb abuse and spam.
If you are building out legal sites that target niche case sectors, then you should be able to compete at a niche level... but sites like yourlawyer.com have been doing this for some time now and have a very deep strategy in place...
Google may have hit some sort of storage ceiling...?
and they have simply set up some sort of rotation algo to let new sites in and rotate out old stuff, or sites that aren't serving good click-throughs...?
Just some ideas...not theory or answers...
It is easy for me to recognise changes in my area, I think, because so few sites are altered in any way; therefore shifts in Google ranking are usually something to do with Google.
I fully agree. Although I started this thread, I must admit that when I find no logical explanation, I'm tempted to come up with strange theories. But then I stop and think for a while and try to relax and continue working.
I have a page about "widget effects tutorials". It ranks #1, but for "widget effect tutorials" (which is the "correct" phrase in my area) it used to rank around #20. That was strange, because not so long ago plurals and singulars were treated the same by Google. Then I changed the title and the heading to reflect the correct phrase, and in two days I ranked first.
So, by changing two letters I moved up 20 positions.
"...Sometimes the simple & obvious are the best kept secrets..."
I don't build sites for Google; I build sites that Google can read. And yes, of course, I keep an eye on them.
I had a site that was on the first page of the SERPs for many keywords in a non-competitive area (the science of widgetography), and I moved to a new site that is very similar but with a better domain name and a bit of SEO (h1 for chapter headings, h2 for sections, etc.). I was in the index in a week and had my 1000+ backlinks in about 8 weeks; now, going on three months, I have no PR and am still not in the SERPs for most of the terms I was on the first page for. My site is a series of 11 books on the subject by a world-recognized expert. It is linked to by many university libraries, and many librarians have said what a great resource it is. I am number 4 on Yahoo for the single word widgetography (out of 1,290,000). I am not in the SERPs at all on Google (out of 765,000), and looking at the results, Yahoo's are much better, although Google and Yahoo agree on about 10 of the top 20.
I think the short and simple explanation is that Google is just plain broken, in the sense that it is no longer a consistent quality index, with "consistent" being the key word.