new with google?
| 11:38 am on May 17, 2003 (gmt 0)|
From SJ to FI, then CW and now DC, I have confirmed that something new is happening with the new algo.
In FI, CW and DC, my site is in the index. Most search terms are performing the same as usual, EXCEPT the top keywords.
Top keywords are keywords used a lot in anchor text and contained in the title of my page.
The reason I am posting this is not to blame Google. I need to at least find out the reason why this is happening.
Anyone here having the same experience? Maybe we can discuss it here and find out the reason. Maybe there is a new filter working against us.
| 10:57 pm on May 19, 2003 (gmt 0)|
Inbound anchor text (from external sites) has something to do with it, perhaps related to 'perfect text'.
I can't believe that Google would penalise external anchor text which is relevant to your site. As I mentioned a few days ago, if a site links to your site about "blue widgets", then the most logical thing to have in the anchor text on the external site is the words "blue widgets"... this way the surfer on the external site knows exactly what to expect at the other end of the link.
Possibly another consideration for Google would be that a workaround to this would be very easy for spammers.
| 11:08 pm on May 19, 2003 (gmt 0)|
hamster77, I may have missed this, but when did you make the change to add the H1 tag? If it was recent, I doubt seriously that it is being factored in at all yet.
needinfo, I sort of think and hope you are right. I can see why Google might surmise that this is a spam issue, since spammers can easily set up lots of dummy sites and point them to their key properties with total control over anchor text...but this would be a bass-ackwards way of dealing with it, as it would penalize lots of good links. It's probably not that simple either, if it exists at all; it may be codependent on other signs of spam that are present.
It could also be an algo element that is getting higher importance now than it ultimately will, because they are just testing it.
The only reason I've even raised it before is that it's the only thing we can find that would explain some of the activity we're seeing on a few sites we own. And in mentioning it here, a few others have responded that it's a possibility with their sites too.
| 11:28 pm on May 19, 2003 (gmt 0)|
>>I am glad you are reassuring me about my possible faults... The point is... now I have no clue!
Hey, me neither :)
I'm not one to get stressed out by it though. A few months down the line I'll have figured it out ;)
>>I just changed the meta-refresh to permanent 301 redirect
A wise decision.
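For anyone else making the same switch: a meta refresh is only an HTML hint, while a 301 is a real HTTP status code that tells spiders the move is permanent. A minimal sketch for an Apache .htaccess file (the paths and domain here are placeholders, and it assumes mod_alias is available on your host):

```apache
# Send a true HTTP 301 instead of a meta refresh
Redirect 301 /old-page.html http://www.example.com/new-page.html
```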
| 11:45 pm on May 19, 2003 (gmt 0)|
I changed to the H1 tag in time for the April deep crawl, as far as I remember, but maybe that hasn't been included in this update? I don't see how it can be the H1 tag that's caused the site to sink like a rock, but a few other people had mentioned it, which made me wonder. Nothing else on the site has changed at all, other than getting more backlinks over the last 2/3 months that have never been factored in by Google. It's a site belonging to a shop that sells local widgets made by local widget-makers. It's always been at around #10, behind sites mainly belonging to American importers of local widgets, and that seemed fairly reasonable. Now our local widget site is around #300, and about 200 of the sites above it have nothing to do with local widgets and make no mention of them anywhere. Weird...
| 1:26 am on May 20, 2003 (gmt 0)|
hamster77, as you will have read already, so many have the same experience. So unless everyone in here who is going through this pain is a spammer, it would make sense that this issue will right itself.
Meanwhile, at our place at least, we're combining GG's (at times cryptic) feedback, plus our own observations, to conclude that recent developments at our sites are not really factored in yet for the most part, despite the (apparent) addition of some freshbot data to the (apparently) older indexes being used...plus, no one knows what filters are turned on, how the current ones are interacting, or how those yet to be implemented will affect the SERPs.
Then you have GoogleGuy recently stating that the new algo is "waiting in the wings"...with backlinks to be added, etc...
If nothing else - short term financial losses aside - it doesn't appear to be time to panic yet...only time to start preparing to panic.
| 3:20 am on May 20, 2003 (gmt 0)|
It appears that my site was also hit by this penalty. My site has a three-word name, domain & title. The title is keyword1 keyword2 keyword3, the domain is keyword1keyword2keyword3. It also describes the function of the site.
Keyword1 is what seems to have taken the hit, and it is also one that I would like to target. It was at #45 in the old index, and was at #23 when -sj and -fi were first updated. Now it is around #250 in the new index. It is at #22 with allinanchor, and it is #4 in the directory, which tells me that the penalty has not (or not yet) transferred to a directory search.
Keyword1 was the first word in the title and appeared twice in the meta description and meta keywords. It appeared once in the body text and twice as alt text on image links. I didn't use h1 or h2 on the page at all.
It comes up at #9 in the SERPs for key-word1, even though I don't have the word like that on the page. I don't have anchor text like that either. Most of my links are either text links with the title as the anchor text, or an image link without anchor text.
Multiple-keyword searches haven't been affected. Most of those keywords are used on the subpages. The main page that I targeted does not have a lot of content on it.
Anybody seeing a pattern yet that perhaps we can fix?
| 3:58 am on May 20, 2003 (gmt 0)|
>>If nothing else - short term financial losses aside - it doesn't appear to be time to panic yet...only time to start preparing to panic.
OK. I checked my panic button. It's working fine, so I guess I am ready.
Talking about buttons...whatever happened to the big red button that GG used to push every month? You know, the button that lets them switch to using the results from the latest crawl...
I see a lot of discussion here about minute details of Google's latest algo, but I haven't yet seen anyone come up with a plausible explanation as to WHY ON EARTH THEY ARE USING SUCH OLD DATA to test this new algo.
| 4:31 am on May 20, 2003 (gmt 0)|
Very interesting and weird observation, Waynet.
I see the same thing: for "keyword1 keyword2", my site has sunk 2 pages on all datacenters now except -in, but for "keyword1 key-word2", I get results that look like the old ones on all the datacenters. Does anyone have an explanation for that?
| 8:29 am on May 20, 2003 (gmt 0)|
Just a short note about the <h1> tags. I use them (although I don't like them) because they were strongly suggested by a well-known PR10 validation site which everyone here knows about. Now, if we can't trust a site which Google itself values so highly as a good reference, then it seems we are left in the position of making each site unique, although relevance and content still remain king. As I read in another forum - diversify - which I'll take to mean make your site the way you want it, not the way another site suggests. I'll gladly put those H1 tags to bed - they take up valuable real estate and add nothing to my page.
Keyword density, I believe, is also falling by the wayside, but only when keywords are used in over-abundance and without thought for the reader. As I think back to my college days: when writing a thesis paper, we tell the reader what they "are" going to read, and then we sum it up at the end by telling them "what" they just read. I'd therefore think keywords need only appear perhaps 2-3 times per page, and only when relevant to the sentence structure. Perhaps some of those PhDs at Google have put an "English major" into their algo? I'd like to hear some thoughts on this, as I'm going to have a serious look at my site and its keyword density. It does make sense for the reader - if they found your page by searching for "widgets", then we don't need to mention "widgets" in every other sentence.
Perhaps out with the H1 tags, reduce keyword density, and use your <title> to bring the reader to your page. Google may actually be trying to make it easier for us - to use our common sense and write for the reader, not for the search engines. I got buried in the SERPs this update and may shake out better when it's finally done, but I see changes ahead for my web site. Would love to hear GGuy's opinion about this. Sorry, didn't mean to ramble on like I did.
I was mad about this update, but the more I think about it perhaps Google is very subtly saying "it's time to change" and "rearrange".
| 8:48 am on May 20, 2003 (gmt 0)|
>I'd therefore think keywords need only appear perhaps 2-3 times per page, and only when relevant to the sentence structure. Perhaps some of those PhDs at Google have put an "English major" into their algo? I'd like to hear some thoughts on this, as I'm going to have a serious look at my site and its keyword density. It does make sense for the reader - if they found your page by searching for "widgets", then we don't need to mention "widgets" in every other sentence.
One problem with this idea: for my single most important keyword, the #1 site has 20 instances of the keyword out of about 200 or so words of visible text. Arguably this site deserves to be #1 (although I wouldn't mind that spot ;) ). I doubt the webmaster even thought of SEO. Basically, this page is a links page, with both internal links about the topic and links to external sites. The way this is done makes sense for the reader. If sites with too high a keyword density were knocked down in the SERPs, this is a case where less relevant sites would rise to the top.
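Since "keyword density" keeps coming up without a number attached, here is one way to measure what we're arguing about. A minimal sketch only: the word tokenizer is my own rough assumption, not whatever Google actually counts, and the sample page just mimics the #1 site mentioned above (20 hits in about 200 words).

```python
import re

def keyword_density(text, keyword):
    """Share of visible-text words that exactly match the keyword (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return hits / len(words)

# Roughly the #1 page described above: 20 keyword hits in 200 words total.
page = "widgets " * 20 + "filler " * 180
print("%.1f%%" % (keyword_density(page, "widgets") * 100))  # → 10.0%
```

By this crude measure the page everyone agrees deserves its spot sits at 10%, well above the 2-3 mentions per page suggested earlier, which is exactly the poster's objection.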
| 8:53 am on May 20, 2003 (gmt 0)|
Is it possible you just said that outbound link text is now the flavor of the month? Just like it was 18 months ago? This would explain why my krappy old site, which I can't change (I can no longer FTP in), is doing well. Thank goodness.
| 9:06 am on May 20, 2003 (gmt 0)|
I don't know about you, but things have changed for me in the last couple of hours:
3rd on sj, va, dc
4th on ab, zu, cw
5th on in
7th on ex
nowhere on fi (the only exception)
| 9:12 am on May 20, 2003 (gmt 0)|
The SERPs seem to be settling.
[edited by: JasonIR at 9:42 am (utc) on May 20, 2003]
| 9:15 am on May 20, 2003 (gmt 0)|
>Is it possible you just said that outbound link text is now the flavor of the month? Just like it was 18 months ago?
No. This page has been #1 for years on this SERP, and it has decent inbound links. The only solace, now that Google has driven a stake through the heart of my site, is that they happen to link to me twice on that page. Better than nothing.
| 10:28 am on May 20, 2003 (gmt 0)|
Things have changed today: my site has disappeared from all datacenters, including good old SJ, which has finally given up on it as well.
| 10:51 am on May 20, 2003 (gmt 0)|
Isn't it too soon to be trying to analyse any new algorithm?
Add in new data, new backlinks, spam filters, new algorithm tweaks - ALL things that GoogleGuy says are on the way - and you may well have completely different SERPs.
And, subsequently, completely different views.
| 11:42 am on May 20, 2003 (gmt 0)|
Everyone who says it is too early to start analysing the algo has, IMO, either not followed the past updates over the last 15 months or so, or is simply clinging to hope.
My experience in past updates has always been that once the new index hits www2 and www3, there are very few movements after that. Admittedly, this update is different in many ways, and like others I am still hoping that this is just a glitch, but I am not holding my breath.
The biggest problem Google faces today is that their algo is so simple that any five-year-old can get to the top for almost any keyword, simply by doing some basic on-page optimisation and getting lots of non-reciprocal links with keyword anchor text from high-PR sites. This wasn't a big problem a year ago, but with the explosive growth of this forum the spam is increasing every day, as more and more people understand how simple it really is. I think what we are seeing this update is an attempt to curb the buying of high-PR links, and the introduction of a penalty for some sites that are overdoing it (like one of my sites :( ). However, if I am right, there certainly is a threshold, since my other sites in slightly less competitive areas have escaped the penalty.
What I can't understand is why they didn't just remove the little green bar from the toolbar. If they replaced it with some other cool feature instead, and justified the move by saying that they had improved their algo so much that PR is no longer the best way of displaying the relevance of a site, no one would really question it, IMO.
I hope that I will be proven wrong and that everything will "go back to normal" in a few days, but that does not solve Google's long-term problem...
| 12:04 pm on May 20, 2003 (gmt 0)|
GoogleGuy has stated many times over that there are many more factors that are going to be implemented and the SERPS will change.
Maybe they will get worse for us, maybe better.
| 12:44 pm on May 20, 2003 (gmt 0)|
> Isn't it too soon to be trying to analyse any new algorithm?
Yes, it is too early to analyze the new algorithm completely. However, since in my case the problem appears on all data centers which have the new index (7/8), I don't expect it to disappear.
> Add in new data, new back links, spam filters, new algorithm tweaks - ALL things that GoogleGuy says are on the way - and you may well have completely different SERPS
At least in my case, new data and new backlinks won't change the situation, since it seems to be an on-page problem (backlinks, anchor text and PR don't seem to be part of it). Also, adding filters will remove some entries (which is good, of course), but won't affect my general situation.
| 1:02 pm on May 20, 2003 (gmt 0)|
What doc_z just posted applies directly to me too, and it is also my motivation for trying to pin down this problem as early as possible. Whether or not there are changes to come doesn't affect the fact that we are seeing the main elements of the new SERPs across nearly all the data centres.
I mentioned somewhere before that the www.google.com results are pulled directly from the 8 or 9 data centres. What you see there is what is on Google.com NOW, so whether or not the 'update' is 'finished', I am dealing with the current Google SERPs and their effects.
| 1:06 pm on May 20, 2003 (gmt 0)|
How do you know it is an on-page factor?
| 1:20 pm on May 20, 2003 (gmt 0)|
> How do you know it is an on-page factor?
Just a guess: I'm still #3 for allinanchor and PR didn't change very much, while allintitle and allintext dropped from #2 to approximately #490.
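For anyone wanting to run the same comparison on their own phrases, the three diagnostic operators used above can be checked by hand. A throwaway sketch that just builds the search URLs (the operators are Google's; the helper function and the example phrase are my own):

```python
from urllib.parse import quote_plus

# allinanchor isolates inbound anchor text; allintitle and allintext
# isolate on-page factors, which is how doc_z narrowed it down above.
OPERATORS = ("allinanchor", "allintitle", "allintext")

def diagnostic_queries(phrase):
    """Build one Google search URL per diagnostic operator for a keyword phrase."""
    base = "http://www.google.com/search?q="
    return {op: base + quote_plus("%s:%s" % (op, phrase)) for op in OPERATORS}

for op, url in diagnostic_queries("keyword1 keyword2").items():
    print(op, "->", url)
```

If your page ranks well for allinanchor but has vanished for allintitle/allintext, that points at an on-page factor, which is exactly the pattern being reported in this thread.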
| 1:31 pm on May 20, 2003 (gmt 0)|
What do you think is *causing* the penalty (if there is one)? Backlinks (external sources)? Internal links? Too much keyword density? (In our case, the only distinction we can find between our sites that went up and our sites that went down is that, counterintuitively, the sites that went down have more/better external backlinks.) Or is it too many links with *perfect text*?
Geez, do I have to write all the sites that link to us with good descriptions, and ask them to make their links to us less clear?
| 1:36 pm on May 20, 2003 (gmt 0)|
>>Geez, do I have to write all the sites that link to us with good descriptions, and ask them to make their links to us less clear?
Crazy, isn't it... that's exactly why I think the semi-penalty has nothing at all to do with so-called "perfect text" in anchor links.
| 1:39 pm on May 20, 2003 (gmt 0)|
>Isn't it too soon to be trying to analyse any new algorithm?
UK_Web_Guy, fair question. But as I've answered about 10 times in other places, Google has chosen to open up this process to us, and since they are the 800-pound gorilla in the space, it's worth trying to understand what they think is fair and acceptable and what they don't. If you don't watch the road during the trip, it's a lot harder to say how you got to your destination.
WARNING - RANT TO FOLLOW: Believe it or not, I do not enjoy practicing SEO, and FWIW, we don't go overboard on it. But the constant refrain from Google that webmasters should just "build good sites that appeal to users" is absolute nonsense - unfortunately.
We had good sites that were built for users. They were built in '99. I didn't even know about SEO then; nor did the agency that designed them; nor the builder that built them.
The sites did well...for a while.
Then they disappeared altogether from some SE's and dropped like a stone in others. Why? We designed for the user, not the SE, and it turned out that we ran afoul of some basic "rules" that we had no idea even existed...not with hidden text or cr*p like that (those things don't help users).
We suffered because of:
--too much cross-linking (done for ease of navigation, and for cross-promotion reasons...you know...like all the other major media types do!);
--too much keyword density (we had some really cool-looking pages where, because of the graphics, the keyword density of the limited text on the page was too high);
--spammers, who practiced extreme SEO and skyrocketed above us (in some cases).
Lesson: SEO is a necessity if you want your site to make money (at least if you don't have millions for marketing, or an off-line infrastructure to defray some of your costs, like CNN).
So, now we keep up with the Joneses in SEOville. We still design primarily for the user...but now we always do it with SEO in mind. Not doing that in this environment would be, in a word, stupid. But what a waste of our time.
Google has gotten so big and powerful that they are now setting the course of the Web, not just showing it to us. They are a business, and so they can do what they want as long as it's legal. But when you reach a point where you have so much power, you do have at least some responsibility to your community (in this case, the Web), to get it right. Google, the way you have handled Dominic...well, let's just say that it has not been your finest hour.
| 2:38 pm on May 20, 2003 (gmt 0)|
I think it's not "perfect links"; it's too many new links generated within a month's time (the speed of new links), and the fact that they are reciprocated -> that activates the penalty.
| 2:52 pm on May 20, 2003 (gmt 0)|
None of my links are reciprocal (except of course for my internal links).
I posted my latest theory as a new thread (http://www.webmasterworld.com/forum3/13322.htm). Of course, many people will disagree with the idea, but I am convinced enough that I am changing my site accordingly. Only time will tell.
The obvious complication is that some of the people posting here may have dropped down the SERPs for miscellaneous other reasons; that's why it all seems so confusing.
[edited by: Spica at 3:09 pm (utc) on May 20, 2003]
| 2:55 pm on May 20, 2003 (gmt 0)|
Do you have links from an entire site with lots of pages? That might be the case too.
I have a page on my site which has lots of internal links. Maybe that is too much, until it drops below many pages that have only one backlink!
| 2:55 pm on May 20, 2003 (gmt 0)|
The majority of my incoming links are not reciprocal either.
| 2:59 pm on May 20, 2003 (gmt 0)|
Well said, wackmaster - I pretty much agree with you 100%.
If search engines were not a consideration, I would suggest that most webmasters would have moved by now to the far more aesthetically powerful Flash or other graphics-based media for their sites.
However, within those considerations, you design for users.
| 3:19 pm on May 20, 2003 (gmt 0)|
wackmaster and Jedi,
I would go for a Java applet menu if Google could read it! With colourful icons that animate! Wow. But Googlebot...
OK, let me try to sum things up a bit.
Some people have said that their links are not reciprocal, so we can take that possibility out. Some of us argue about H1; I don't think that is the reason either.
But one thing we all have in common (except rfgdxm1, whose case I don't think is the same at all): our sites display at the TOP in a search using allinanchor, yet the page is missing when the keyword phrase is searched directly. The point is that we do have strong anchor text, and anchor text is the backbone of Google's algo, so it should be impossible for a site to go missing UNLESS they did something (the penalty). Remember, a site can rank very high even if the keywords don't appear anywhere on the page, as long as there is anchor text pointing to it.
In my area, I am not alone; some of my competitors with strong links have met the same fate as me.
Shall we now agree that it is the anchor text/inbound links (which used to help us) that dropped us out of the rankings? We should agree that it is not due to any on-page factors such as H1 text.
The truth is, I am actively seeking link exchanges, and every month I get lots of new links. Maybe this filter thinks that if my site gains so many links in such a short time, it must be through link exchange (and surely not because my site is good). Maybe the filter doesn't like lots of identical anchor text of 4-5 words? (How long is your anchor text?)
Hmmm, after losing my last appearance in SJ today, I see no hope that my site will appear in this update.
For those who keep saying that it is too early to examine the algo, please take your time and post elsewhere. Is it still too early when your hair is already burning? You can say that because your sites are still appearing steadily in the index, but ours aren't. Please be understanding.