Forum Moderators: open
What causes 90% of sites to disappear?
Why not ask Google? Time after time I hear "My site is completely gone from Google", then when checking the site it is NOT gone at all; they are simply not ranking where, and for what, they want to.
Where are all these unproven silly theories coming from? If someone has proven something, present it; otherwise make it clear it's only a HUNCH. Remember that there are 3.6 billion pages, so any proof must take that into account.
Filter and algorithm are two different things. While a page may be subjected to an algorithm to determine whether it meets the criteria of a filter, the algorithm is not the filter itself.
When you filter something you do so in the *hope* of removing it. Just like any of the many filters on most cars, e.g. air filter, fuel filter, oil filter etc.
When Google filters pages (e.g. SafeSearch), it too does so in the *hope* of omitting the page from the SERPs. If it does so successfully, the page WILL NOT be found on any page of the results. If it is not successful (not sure of the %), then the page will be on one of the result pages.
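The distinction can be put in code terms with a toy sketch of my own (nothing to do with how Google actually works): the algorithm is the test that decides whether a page meets the criteria; the filter is the step that omits the pages the test flags.

```python
# Toy illustration of the algorithm-vs-filter distinction.
# `looks_unsafe` stands in for whatever test the engine runs (the algorithm);
# the list comprehension at the bottom is the filter acting on its verdict.

def looks_unsafe(page_text):
    """Algorithm: decide whether a page meets the filter's criteria."""
    blocked_words = {"badword1", "badword2"}  # made-up criteria
    return any(word in page_text.lower() for word in blocked_words)

pages = ["a page about widgets", "a page containing badword1"]

# Filter: omit from the results every page the algorithm flags.
safe_results = [p for p in pages if not looks_unsafe(p)]
```

If the test misfires (returns False for a page it should have flagged), the page stays in the results, which matches the "not successful" case described above.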
[edited by: I_am_back at 3:02 am (utc) on Dec. 13, 2003]
I'll say it again: Google is currently supplying the only search results that are truest to the original intent of the internet: information sharing without a financial motive. None of the information sites that I look at have disappeared. If Google has suddenly raised the ante for commercial sites... so it goes. That would be your basic capitalist law of the jungle in action. May the best widget salesman win. You're probably all being herded into AdWords, my sympathies...
Now breaking that down ;-)
I'll say it again: Google is currently supplying the only search results that are truest to the original intent of the internet: information sharing without a financial motive.
Without a financial motive for whom?
You're probably all being herded into AdWords, my sympathies...
Once again, :-)
At this point I don't buy into the conspiracy, but I do have a lot of questions about certain commercial searches.
[edited by: More_Traffic_Please at 3:09 am (utc) on Dec. 13, 2003]
No, this has nothing to do with semantics. It has everything to do with inexperienced webmasters blaming the wrong cause for an effect.
Blaming your tires for losing a race when you have no gasoline in the car is foolish and not a matter of semantics, and that is precisely what some folks are trying to do.
What causes 90% of sites to disappear?
Is anyone seriously suggesting that Google's index has shrunken to 10% of its pre-Florida size?
Is anyone seriously suggesting that Google's index has shrunken to 10% of its pre-Florida size?
My hunch is they were referring to the fact that for many searches upwards of 90 pages are missing from the top 100 results that were there previous to the 'switch to Noogle'.
Can we just say 'the switch to Noogle' instead of referring to the algo/filter/feature? E.g. "the switch to Noogle seems to have been more prevalent for commercial queries."
Today, at work, I was searching for something about eukaryotic promoter databases. Out of old habit, I used Google first. I did not find what I was looking for, but I noticed that there were (obviously irrelevant) ads coming up at the right for some of my searches. I couldn't prevent myself from thinking that if Google was, for some stupid reason, treating this search as a commercial one, the absolute best results could be "filtered out". In the past, I would have simply decided that what I was looking for did not exist yet, but this time I had to use other search engines to be sure.
What I am saying is about trust as a user. I don't trust people who have lied to me. Lying by omission is lying. I can't trust Google anymore. Never will again. I know that there are flower shops in my city, and I expect a good search engine to find them for me. I don't care if their web sites say "flower shop mycity" too many times. I don't even care if they do massive link exchange (even though I think that it's a despicable practice). Let me, the user, decide whether I want to buy something from them. If I can't find these sites, it is censorship. And censorship is wrong. Very wrong.
[edited by: Spica at 9:04 am (utc) on Dec. 13, 2003]
It is very strange to see some people here loving Google so unconditionally that they are ready to deny that there has been a drastic change in the way commercial searches are handled.
It's strange that people like to make up their own reality. Who here has said there haven't been drastic changes? Why would you make such a post?
It is very strange to see some people here loving Google so unconditionally that they are ready to deny that there has been a drastic change in the way commercial searches are handled.
I for one still find Google the best SE for getting what I want. I will also NOT subscribe to unproven conspiracy theories that state Google has drastically changed the way it handles commercial searches.
I'm in the Software business and sell it online, if there was a "drastic change in the way commercial searches are handled" I believe I would notice it.
This whole forum seems to be rapidly filling with "You must hate Google or we'll hate you" types. I think the word "Professional" should be omitted from the forum's description.
If Google believes this change will benefit them by selling more AdWords, it would be a very short-term gain. When the quality of the results suffers, Google will suffer.
Content is no longer king in the new Google order.
I believe it still is. The more content you have, the better your chances of having a page, or more, in the SERPs, and the more pages you have in the SERPs, the better your chance of ranking well. It's just like going fishing with one rod or two: while your chances are not doubled with two, they are increased.
Instead those who link to content are kings.
Quite possibly correct. But this is all the more reason why content is still king: you link all your content pages to other content pages of your own. Links like "See Also Red Widgets" work very well.
My industry cannot afford the costs of AdWords. It takes 100 visitors to get an inquiry and 100 inquiries to get one sale. The profit on a sale is about $1000-2000, and AdWords clicks sell for $5-10 per visitor.
WOW! I only make $13.00 profit on some of my AdWords sales. I would of course NEVER bid for the no. 1 position though! I bid the minimum 5c per click and bottom-feed on a *LOT* of keyword phrase combinations. This helps keep the CTR high enough to NOT get cut off. I also only target the 'richer' countries; $20.00 is a month's, or even a year's, wages in some countries. I also skip the option of having ads show on AdSense. All my ads omit words like Free, Shareware, Demo, cheap, etc.
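For what it's worth, the economics the earlier poster describes can be run as simple arithmetic (the figures below are the ones quoted in this thread; the $1500 midpoint for profit is my own assumption):

```python
# Rough AdWords break-even maths using the figures quoted in the thread.
visitors_per_inquiry = 100
inquiries_per_sale = 100
profit_per_sale = 1500.0        # assumed midpoint of the quoted $1000-2000

# Visitors needed for one sale: 100 * 100 = 10,000.
visitors_per_sale = visitors_per_inquiry * inquiries_per_sale

# Maximum cost per click before a sale loses money: $1500 / 10,000 = $0.15.
break_even_cpc = profit_per_sale / visitors_per_sale

# Cost of one sale at the quoted low end of $5 per visitor: $50,000.
cost_per_sale_low = 5.0 * visitors_per_sale
```

At those conversion rates, even the minimum 5c bid eats a third of the break-even click price, which is why bottom-feeding on cheap phrases is the only strategy that can work.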
Fortunately, I prided myself on content.
I would say keep that pride, and do not let Google's latest changes take it away from you.
While I of course need to make sales to make a living, I by no means write content only for paid products and services. In fact, 80% of my content pages are simply free, relevant (to my industry) content pages. This, I believe, is what encourages bookmarks and repeat visitors. Forums are also another great way to get thousands of free content pages into Google; best of all, your 'potential' customers write it for you. We should also not forget the opt-in newsletters.
[edited by: I_am_back at 10:14 am (utc) on Dec. 13, 2003]
I don't care if their web sites say "flower shop mycity" too many times. I don't even care if they do massive link exchange... Let me, the user, decide whether I want to buy something from them.
Well put Spica.
Who knows what Google's long-term strategy is: to move people over to AdWords, prepare for an IPO, or deliver better quality results to its users?
My feeling is they need to take the short-term effects into account too, and that they have underestimated the impact it would have on webmasters and small fries (I say this because I am a small fry just trying to make ends meet) who count on part of their livelihood coming from sales on the net.
I'm not suggesting they have any moral responsibility, just that they didn't do their homework and may have started a chain reaction which will have negative effects on their business in the long run.
When the quality of the results suffer, then Google will suffer
I agree with allanp73
Perhaps we should add "along with lots of family run online businesses and users".
Getting back on topic here...
I don't see them relaxing their filters at all, although some dc results are different from others, but this has always been the case. On the contrary, everything they've done in this last month is reinforcing the conspiracy notions (Shopping? Try Froogle). The bottom line is that Google is in the search engine game for Google and will do what is in their interest.
Lots of sites have lost rank due to different algorithmic ranking. At the same time, an exactly equal number of sites have gained ranking. Then, also at the same time, lots of sites have been affected by Google's upping of its attack on duplicate content. They have always tried to deal with duplicate content, looking for the canonical/real/main page. Clearly they have caught a lot of sites in that net this time that they shouldn't have.
Hi Steve,
I agree with your analysis here, plus you quite often bang on about "no one's arguing that the previous allinanchor was good."
Perhaps the change in the algorithm is nothing more than a marked downgrading of the allinanchor component of the calculation, plus the other things that you mention.
I've gone back and compared allinanchor:widget insurance with the post-Florida plain search for widget insurance. One of the sites that dropped out of the top of the SERPs was #1 for allinanchor and is now #331 in the plain search, and it is a www.generic-hyphenated-url.com site. Some of the others around the top 20 for allinanchor have probably 50 (just guessing) other similar sites which are just hastily prepared template copies with a few links and bits of cobbled copy, every one using a hyphenated generic URL.
Others suffered similarly, and they are also hyphenated generic domain names. Except for the index.html page, every page in the site that was #1 has exactly the same text in the <title> tags, and that was copied from another www.generic-hyphenated-url.com site where every page has exactly the same text enclosed by the title tags as the previous page, on a virtual host in the same IP range as the owner's proD
I had previously been in an ongoing battle with these guys and had changed my internal site structure to incorporate widget-insurance into many of the internal links, both in the text and in the URLs, and I'm still at #3 for allinanchor even though I have a www.brandname.com domain. Couple this with the duplicate issues that I've found, and I can accept that I've been sucked into the trap of the allinanchor bait provided by Google previously.
If Google had just removed any weight from allinanchor in their algorithm, then we would have seen something close to what we saw post-Florida, even if they had left everything else in the algo constant. Removing this component of the calculation would make PageRank and other factors more important. This might look like a filter, because it would have the effect of dropping pages that relied heavily on allinanchor for their SERPs position.
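As a toy illustration of that point (the score components and weights below are entirely made up, not Google's): zeroing one component of a weighted score re-orders the results, and pages that leaned on that component drop as if a filter had removed them.

```python
# Hypothetical two-component ranking score: anchor-text relevance plus a
# PageRank-style authority value. Zeroing the anchor weight re-orders results.

def score(page, anchor_weight, pagerank_weight):
    return anchor_weight * page["anchor"] + pagerank_weight * page["pagerank"]

pages = [
    {"name": "anchor-heavy.example.com", "anchor": 0.9, "pagerank": 0.1},
    {"name": "authority.example.com",    "anchor": 0.2, "pagerank": 0.8},
]

# Before: anchor text carries most of the weight.
before = sorted(pages, key=lambda p: score(p, 0.7, 0.3), reverse=True)

# After: the anchor component is removed entirely.
after = sorted(pages, key=lambda p: score(p, 0.0, 1.0), reverse=True)
```

In this sketch the anchor-heavy page ranks first before the change and falls behind the authority page after it, with no explicit "filter" anywhere in the code.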
Perhaps sites that have dropped out just had too much duplication; from the evidence of a couple of cases I have seen, this would be a very strong candidate reason for them not appearing in the top 1000.
I thought that we were supposed to be having a discussion here to help everyone concerned. I think that folks who have not been affected, or have been positively affected, by Florida have a different insight from those of us who have been badly affected. They notice things and see things differently from those of us who are trying to find the answer to what to do next. Some of us have got past looking for someone or something to blame and have started to look for low-risk things that we can do to try and get back in the next update.
Looking for duplicate content, examining where your sites are hosted and how that might look in terms of close links, and looking for links out to areas of the web that are bad neighbourhoods will not do you any harm, whatever the cause of the problem that we are facing.
FWIW I think that folks like Steveb have benefited from following the recipe for success to the letter. They have not overdone one ingredient in order to fool Google; long term, that has to be a very good strategy. In the short term I'm going to wait to see what the next update brings, and if I'm not back in the top 10 then I'm going to have to do something.
Best wishes
Sid
What causes 90% of sites to disappear?
The suggestion that Google may be broken isn't fashionable at the moment - but that doesn't mean it isn't a possible explanation - it is certainly one of the simplest contenders ;)
One big question is "Why are certain keywords and keyword combinations apparently 'penalised'?"
It's a tough one, because whenever a pattern is spotted, contradictions can always be found. They certainly look, in the main, like commercial terms, but are not restricted exclusively to these.
Technical difficulties with indexing at Google could explain this unfathomable mix of 'penalised' terms; perhaps they are simply the most popular ones in terms of the number of sites and pages to crawl. This would explain the preponderance of 'commercial' terms that appear to have been 'penalised'.
There was some speculation after all, a while back now, that Google was approaching the mathematical limit of pages it could index without a major overhaul.
Yours unfashionably :)
You'll see 380,000 results spread across almost as many domains, of which only 21 are displayed. That's a pretty efficient filter.
I wrote this in another thread a few days back, but that thread was cut a number of posts back and closed (long overdue, so no problem), so that post disappeared. At that time, the same search showed 100,000 fewer results.
/claus
[edited by: claus at 11:57 am (utc) on Dec. 13, 2003]
It's just like going fishing with one rod or two. While your chances are not doubled with two, they are increased.
Reminds me of a post when the original OOP or algorithm was tested for a couple of weeks six months back.
To paraphrase the original poster:
If Google reduces my rankings for one site - I'll build 10 more. If they reduce my rankings for 10 sites - I'll build 100 more.
Some of the best advice I've seen here yet.
You'll see 380,000 results spread across almost as many domains, of which only 21 are displayed. That's a pretty efficient filter.
If you look at the source of the ones returned when you "repeat the search with the omitted results included", you will see that they have identical <title> text and body text.
This example, and others I've happened upon myself, leads me to believe that text in the <title> tag is the most important from the point of view of duplicate filtering. Duplicate text inside the body tags (but not in other tags within the body, possibly) can be ruled out as a strong factor in duplicate filtering, because Google is clearly not treating syndicated content as duplication.
If you remember, the other example that led to the thread on this topic being pulled demonstrated the converse: almost identical text in the body, but the title tags were not formed correctly, so they would not match what the filter was looking for and would therefore be overlooked, i.e. not filtered.
Best wishes
Sid
PS When that thread was pulled I had the page open on another browser window and was able to save it to study further.
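The title-keyed duplicate filter being hypothesised here could be sketched roughly like this (my own guess at the mechanism, with made-up example URLs, and certainly not Google's actual code):

```python
# Naive duplicate filter keyed on <title> text, as hypothesised above:
# pages sharing an identical (normalised) title collapse to a single result.

def filter_duplicates(results):
    """Keep only the first result for each normalised title."""
    seen_titles = set()
    kept = []
    for url, title in results:
        key = " ".join(title.lower().split())  # normalise case and whitespace
        if key not in seen_titles:
            seen_titles.add(key)
            kept.append(url)
    return kept

results = [
    ("a.example.com", "Widget Insurance Quotes"),
    ("b.example.com", "widget  insurance quotes"),  # duplicate title, dropped
    ("c.example.com", "Brandname Insurance"),
]
```

Under this model a malformed or missing title would produce a key that matches nothing, which is exactly why the badly formed titles in the other example would slip through unfiltered.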
For me, your example does not look like the result of a better filter. I strongly believe that pre-Florida the search would have returned almost the same results, likewise displaying only a few pages (before clicking on "display omitted results"). This is the 'duplicate content' filter, known for a long time.
I still believe that G is broken. Yesterday I had to try to find some relevant information about an antibiotic for my youngest daughter. I typed the antibiotic's name, 'augmentine', into the G toolbar. From #1 to #250, all spam, all redirecting to the same vendor. Could you explain to me what the benefit of G's new filter should be when the results are so f'''g bad? That's not a filter.
Either they are trying to make money through AW or it's simply a broken index. I don't believe the IPO or conspiracy theory. Think about the homepages having disappeared. Remember the jewelry / real estate searches. Why else should they refresh the index hourly since Nov 15?
One more argument against the conspiracy theory: they know perfectly well that most Mum & Pop businesses can't afford to pay for AW. They might try, but most will stop their campaigns after a month or so.
G's Nov 15 changes are a good example of a tech company investing more and more in engineering and programming instead of listening to users and webmasters (and other humans).
As I see it, if an over-SEO filter catches one site in its net, then shouldn't all its duplicates go away as well?
From
'Furry Blue Widgets'
to
'Welcome to our site', or words to that effect.
Result:
It has messed up my Alltheweb SERPs!
So I chickened out and changed it back, so there was not enough time to see what effect it had in Google. And besides, my site is about 'Furry Blue Widgets', so it seems a perfectly reasonable title to me. My experiments with bending to the new 'algo' are over for at least a couple of months.
and I'm cross with myself, because I promised not to twiddle!
Someone who has a site which is spidered by google everyday, or at least very frequently could "experiment" without risking his/her neck in other search engines.
I'm considering a few very subtle changes in title tags to just see what happens and then play it by ear.
To me, the filter Google is using is focusing on repeated phrases throughout a web site, most often found in title tags.
Anecdotally, contiguous words in the <title> seem to harm my SERPs; whilst search terms well separated in the title return better results. But once again, there are frequent exceptions to this. My sites are spidered daily - but of course even a refreshed page might not provide any information until its position is re-calculated properly, and that could take weeks. So on balance I've decided not to experiment - I don't think the answer is that simple anyway. I'm increasingly inclined to believe it is a technical problem, or an algo that has behaved in a way not predicted by Google.