|I'm getting tired of speculations|
Are you suffering from "no-update" syndrome?
| 11:38 pm on Sep 5, 2004 (gmt 0)|
For over a year, I've been reading all kinds of speculations and strange theories.
"Google doesn't like this, Google doesn't like that..." and there you go. A vast percentage of threads are started by somebody who has dropped three positions in Google's SERPs and spins a theory out of thin air.
There is no proof (read: nothing official from Google) about sandboxing, AWS or affiliate sites, AdWords publishers not ranking high, sites wiped out for over-optimization, large sites not being "liked" by Google, and the list goes on.
Why are so many webmasters so fond of these theories? Isn't that some kind of "over analysis"?
Of course I do some analysis myself, and I also understand that Google changes its algorithm and makes it confusing to follow, but the basic rules haven't changed: build a good site, get some relevant links, and don't behave badly with any search engine. Spend more time looking for all the traffic opportunities that you have on the web. A good link from a good web site is a gold mine. And it is not that difficult.
I'm not telling you to forget about PageRank, or exchanging links, or deep analysis. That's good. But if things change, don't start blaming Google; do some research first. Your competition is working as hard as you do. You are not alone.
How can you start a project believing that Google doesn't like big sites? Tell that to the Encyclopaedia Britannica... Or Microsoft, or Slashdot, or Sitepoint, or Adobe, or About.com, or...
It is not my intention to be flamed, but stop and think about this for a minute.
| 11:46 am on Sep 9, 2004 (gmt 0)|
I completely agree that some of the analysis of Florida went too far. The one that makes me laugh the most is the "over-optimized" theory (I laugh out loud every time I hear it).
You are on the right track. Focus on the content and links, and everything else will take care of itself.
| 11:58 am on Sep 9, 2004 (gmt 0)|
Well thanks a lot!... I was just about to post my theory that Google doesn't like domains starting with "A" :-))
But seriously: basically you are right! But where does this need for over-analysing SERPs come from? Given the current search engine market, a lot of people's businesses depend on Google. They see their traffic go down and don't know why. They are desperately looking for an explanation and Google is not giving it to them... That is the best fertilizer for rumors and theories...
| 12:21 pm on Sep 9, 2004 (gmt 0)|
1. As someone who has a mass of test sites on which one can test all these "wet dreams", one cannot help but laugh occasionally.
2. Google came about with the intention of trying to be the most accurate search facility for the content of documents. The two guys can take pride in that it was achieved.
3. A guy who makes and sells widgets was accustomed to advertising in the local press, theme magazines, local radio and TV, and even, in countries as opposed to continents, national TV and radio. Nowhere in the Google concept did they set out to enable such a business to avoid those marketing costs by employing a computer nerd, like you or me, to get a website on the first page of G.
4. In my opinion, based on a brief analysis of SERPS of documents based on the original concept, nothing has changed despite the average of 5 changes in the G algo per month over the last 15 months.
5. What has changed is that the algo has gradually removed from the SERPS those pages created by a webmaster rather than a university professor, and which were massaged with every trick known to man, including those found by posting on a forum asking, "Do you think I can get away with this?"
6. One would hope that G keeps going with this aim until the SERPS contain only those pages that are based on content, rather like a photocopy of a document, and none of those on which the code behind the page might be the basis for selection. In other words, that G excludes pages created by a brilliant webmaster rather than an authority who merely asked someone to "put his page on the net".
In my opinion, naturally
| 2:11 pm on Sep 9, 2004 (gmt 0)|
I was thinking about this earlier.
I think the main problem is that you get individuals who like to set the cat amongst the pigeons. They like to confuse new webmasters and fellow SEOs. After all, when you can feed your competitors a bunch of BS through a forum, it can be quite a good tool for getting the upper hand when they start to listen.
| 2:39 pm on Sep 9, 2004 (gmt 0)|
I just don't buy it.
I've got a few totally clean, very informative, well linked sites that were close to the top for lots of search terms in the US and UK.
I've had them checked over by quite a few of the members here for anything nefarious and honestly there is just nothing.
The sites were all created by me ie same method, same language, same everything and optimised exactly the same in the US and UK.
As of a few weeks ago my US sites sank like a stone; all the top positions, and there were a lot, turned into page 4 positions. My UK sites, which cover UK law as opposed to US law (i.e. totally different pages), are untouched.
Considering that, except for different text, they were very similar, with very similar linking patterns, how do you explain the exclusive US plummet?
Could it have anything to do with the amounts that US lawyers will bid for terms, which are considerably higher than UK lawyers can justify, since UK lawyers are paid by time spent as opposed to a percentage of the compensation claim, which in the US can be very substantial? Does anyone else here believe that Google effectively manipulates the SERPs to produce these effects, or is it just me? The algo can be made to produce the required results without manual interference, and they can sit in their ivory tower and truthfully say that they don't manipulate manually.
| 2:45 pm on Sep 9, 2004 (gmt 0)|
I agree with you Enrique.
I have always felt people over-analyze out of extreme frustration, as dealing with the engines isn't always easy or pleasant. A lot of the time you're throwing stuff against the wall and seeing what sticks. I've always been able to stay constant despite many extreme highs and some downs.
| 3:14 pm on Sep 9, 2004 (gmt 0)|
>Why are so many webmasters so fond of these theories? Isn't that some kind of "over analysis"?
Because starting with theories is the first step toward arriving at the truth. The next step is to check whether the theory is consistent with the reality of Google. For example, if the theory is that Google doesn't like big sites, the fact that Microsoft, Slashdot, Sitepoint, Adobe, and About.com are doing well in Google is good evidence that the theory is false.
| 4:08 pm on Sep 9, 2004 (gmt 0)|
>first step to arrive at the truth
Exactly. Plus, in competitive categories it's often a game of inches. Understanding the nuances, ebbs, and flows of the SEs over time (as much as it's possible to do) can make a tremendous difference. To the point that I have often been surprised how simple, well-informed little tweaks can make meaningful differences in rankings, CT, and ROI.
| 5:23 pm on Sep 9, 2004 (gmt 0)|
You have to pay attention to what Google is doing. If you can capture a short term trend that produces a lot of traffic, it's worth a lot of money.
| 5:59 pm on Sep 9, 2004 (gmt 0)|
I've just had a site that has sat solidly on the first page for its main search term for ages drop to the bottom of page 3.
Absolutely nothing has changed on-page or link-wise: no ads on the site, no link selling or trading, etc.
Whether it be speculation or not, something has happened which is beyond my control.
The Sandbox may be speculation but whatever it is or is called it undeniably exists.
Granted there can be too much theory but many of these things do exist.
| 6:06 pm on Sep 9, 2004 (gmt 0)|
Same as me. It wouldn't just happen to be a big-money term, would it?
| 6:21 pm on Sep 9, 2004 (gmt 0)|
I agree with JdgeJeffries. I see Google attempting to force us to use Froogle and AdWords for the big-money, high-search-volume terms.
| 6:31 pm on Sep 9, 2004 (gmt 0)|
|Why are so many webmasters so fond of these theories? |
Because webmasters running multiple sites, or those under pressure from higher-ups to deliver fast results, can't take their time and ignore everything else. They need some theory, or some definitive trend, that can deliver results as soon as possible. That's the reality.
| 7:04 pm on Sep 9, 2004 (gmt 0)|
Having worked for G quite a while ago, I remember being quite amused reading the sometimes clever, but often outlandish, theories people came up with to describe the results they were seeing. Usually the theories never lined up with what was actually going on, and what was actually going on was very simple and right in front of everybody's noses.
After leaving G and going out on my own… I have always taken that “other side of the fence” mentality with me whenever reading theories & posts from others, or when devising my own. Sometimes the simple & obvious are the best kept secrets.
| 7:34 pm on Sep 9, 2004 (gmt 0)|
Sometimes I think they change the algo just for the sake of change, just to keep everyone guessing and maybe to make some lower ranked sites rise to the top.
If I take off my SEO hat, I have to admit that there are lots of good, relevant sites that are on page 2,3,4, and beyond, but since they don't know SEO they are not seen.
Maybe Google is trying to keep to their original mandate of "Don't be evil" and give the little guy a chance instead of some big corporate site that can hire an SEO army.
Sounds more like socialism than capitalism to me ;-)
| 9:41 pm on Sep 9, 2004 (gmt 0)|
|Sometimes the simple & obvious are the best kept secrets. |
Hear, hear. The simplest explanation is my favorite; that's a classic that needs no updating, especially when the simplest explanation explains more than a collection of complicated ones. After that you can start to see what's happening from a bigger picture. In this case it's probably two simple explanations working hand in hand: one business-side, one engineering-side.
A while ago, when we all knew the Earth was the center of the universe, some guys came along and said, "Hey, the sun is the center of the solar system," and everyone said, "You're crazy, you don't know what you're talking about." They did this for different reasons: some had a direct interest in keeping the old theory hanging in there, some were stubborn, some really believed in the more complicated theories. But the simpler explanation won out. It explained things better and more completely, while the older one was just getting too complicated, all those orbits twisting around each other; it just got confusing...
| 9:56 pm on Sep 9, 2004 (gmt 0)|
Could it possibly be that in your sector Google is keeping a closer eye on things, as more and more individual lawyers and full-fledged legal firms look to exploit the search patterns of those in need of legal help, and people flood the sector and either bid high PPCs or assault the SERPs?
This would be a natural trend, and you can bet that Google looks at volume trends per sector to see how it can either better serve that sector or curb abuse and spam.
If you are building out legal sites that target niche case sectors, then you should be able to compete at a niche level... but sites like yourlawyer.com have been doing this for some time now and have a very deep strategy in place...
Google may have hit some sort of storage ceiling...?
and they have simply set up some sort of rotation algo to let new sites in and rotate out old stuff, or sites that aren't serving good click-throughs...?
Just some ideas...not theory or answers...
| 1:58 am on Sep 10, 2004 (gmt 0)|
|Google may have hit some sort of storage ceiling...? |
That would be an example of a simple explanation that covers most of the things people are seeing better than most other explanations that are being offered.
| 3:19 am on Sep 10, 2004 (gmt 0)|
Not likely a storage ceiling, but maybe processing the data and returning results as quickly as Google has been known for is getting difficult?
| 7:40 am on Sep 10, 2004 (gmt 0)|
No, not a big-money term. I'm in an area with little to no SEOing, up against many sites that haven't even been touched for five years. I don't have any ads, buy any links, etc.
It is easy for me to recognise changes in my area, I think, because so few sites are altered in any way; therefore, shifts in Google ranking are usually something to do with Google.
| 12:48 am on Sep 20, 2004 (gmt 0)|
"...Sometimes the simple & obvious are the best kept secrets..."
I fully agree. Although I started this thread, I must admit that when I find no logical explanation, I'm tempted to come up with strange theories. But then I stop and think for a while and try to relax and continue working.
I have a page about "widget effects tutorials". It ranks #1, but if you searched for "widget effect tutorials" (which is the "correct" phrase in my area), it used to rank #20 or so. That was strange, because not so long ago plurals and singulars were the same to Google. Then I changed the title and the heading to reflect the correct phrase, and in two days I ranked first.
So, by changing two letters I ranked 20 results up.
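The anecdote above only makes sense if singular and plural were being treated as distinct tokens at the time. Here is a minimal sketch of that idea using a naive exact-token match; the function and scoring are purely illustrative assumptions of mine, not Google's actual algorithm:

```python
# Illustrative only: a naive exact-token matcher, NOT Google's algorithm.
# It shows why "Widget Effects Tutorials" and "Widget Effect Tutorials"
# can score differently for the same query when singular and plural
# forms are treated as different tokens (no stemming).

def title_match_score(title: str, query: str) -> float:
    """Fraction of query tokens that appear verbatim in the title."""
    title_tokens = set(title.lower().split())
    query_tokens = query.lower().split()
    hits = sum(1 for token in query_tokens if token in title_tokens)
    return hits / len(query_tokens)

query = "widget effect tutorials"

# Old title: "effects" != "effect", so only 2 of 3 query tokens match.
print(title_match_score("Widget Effects Tutorials", query))  # ~0.67

# New title: all 3 query tokens match exactly.
print(title_match_score("Widget Effect Tutorials", query))   # 1.0
```

Under that assumption, a two-letter edit flips one query token from a miss to an exact hit, which is enough to move a page on a heavily title-weighted ranking.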
"...Sometimes the simple & obvious are the best kept secrets..."
I don't build sites for Google; I build sites that Google can read. And yes, of course, I keep an eye on them.
| 7:18 am on Sep 20, 2004 (gmt 0)|
One item that is not often emphasized is that with a 'rolling update', not all sites will see the results of a change at the same time. Perhaps some sites are not updated at all; if Google doesn't see a change on a site, will it bother to update all the word weightings it uses in its algorithms?
I had a site that was on the first page of SERPs for many keywords in a non-competitive area (the science of widgetography), and I moved to a new site that is very similar but with a better domain name and a bit of SEO (h1 for chapter headings, h2 for sections, etc.). I was in the index in a week and had my 1000+ backlinks in about 8 weeks, but now, going on three months, I have no PR and am still not in the SERPs for most of the terms I was on the first page for. My site is a series of 11 books on the subject by a world-recognized expert. It is linked to by many university libraries, and many librarians have said what a great resource it is. I am number 4 on Yahoo for the single word widgetography (out of 1,290,000). I am not in the SERPs at all on Google (out of 765,000), and looking at the results, Yahoo's are much better, although Google and Yahoo agree on about 10 of the top 20.
I think the short and simple explanation is that Google is just plain broken, in the sense that it is no longer a consistent, quality index, with consistent being the key word.