What annoys me is that I can get penalised for too many occurrences of my phrase in my text, yet sites above me have the key phrases repeated in comment tags and still rank high. I would file a spam report, but I think those are pretty pointless.
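Nobody outside Google knows what counts as "too many", but if you want a rough yardstick for your own pages, here is a minimal sketch of a phrase counter (my own illustration, not Google's filter; the URL and the phrase are placeholders). It only counts visible text, and comment tags are skipped automatically, which is exactly what the sites above me are stuffing:

```python
import re
import urllib.request
from html.parser import HTMLParser

class VisibleTextParser(HTMLParser):
    """Collects text outside <script>/<style>; comment tags never reach
    handle_data, so stuffed comments are excluded automatically."""
    def __init__(self):
        super().__init__()
        self.skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def phrase_density(url, phrase):
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    parser = VisibleTextParser()
    parser.feed(html)
    words = re.findall(r"[a-z0-9']+", " ".join(parser.chunks).lower())
    text = " ".join(words)
    hits = len(re.findall(r"\b" + re.escape(phrase.lower()) + r"\b", text))
    # rough share of visible words taken up by the phrase
    density = 100.0 * hits * len(phrase.split()) / max(len(words), 1)
    return hits, density

# example.com and "blue widgets" are placeholders; use your own page
hits, density = phrase_density("http://example.com/", "blue widgets")
print(f"{hits} occurrences, roughly {density:.1f}% of visible words")
```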
This should be stapled onto some folks' eyelids around here. It is incomprehensible how anyone can look at the datacenters from day to day and conclude from what they see that they should change something on their site, like anchor text or headers or whatever. I have pages that are exactly #24 in a search four days a week and exactly #2 three days a week. Never third or eighteenth or fortieth. This is Google sorting itself out, not my HTML code in my third paragraph.
One day I'm 7th for allinanchor and the next day I'm 33rd. Ya think that overnight twenty-six sites added a bunch of incoming links... and then removed them the next day so I could move up to 7th again?
My conclusion is that there is a problem with Google and you are wasting your time trying to analyse it. It's not operating to the published specification, so there is no point trying to understand why there are 'anomalies' in some of your searches.
and More_traffic_Please said (and Steve_B agreed):
For the time being, Google is broke. I'm not touching my site until the data centers remain calm for at least a couple of weeks and some patterns start to emerge.
Guys, I can't stress this enough: there is obviously a problem with the implementation of new spam filters (amongst all the other problems). The really spammy bad stuff is "floating to the top". It's not everflux anymore, it's 'everflush', and they just can't get rid of those big 'floaters'.....
I can show you hidden links (on blue chip corporate pages). I can show you javascript redirections occupying top 30 placings on extremely competitive phrases, with porn interspersed on non-porn searches! Cloaking with index.exe files, frameset redirects: it's all there, and all in the top spots. All on SERPs of over a million pages. And the backlink estimates are back to the old rule of 2.
Just let Google get on with fixing it. At the moment your mileage will vary significantly depending on the datacentre and the spamminess of your industry. At this point you are wasting your time trying to understand it, and doing yourself a disservice by changing anything.
Everflush.
Agreed. I have stopped looking for my sites on Google right now; it does not help me one bit. I have pretty much concluded they have problems, especially relating to data from the period February to May (?) of this year. I also believe they have not implemented their spam filters (or at least some or many of them), as GG had said on a couple of spam reports I sent him that the sites (or rather their techniques) I detailed would need to be addressed, but by writing an algo, not by a manual ban. I do not doubt GoogleGuy's sincerity. So, as these sites are still riding high, I can only presume that the filters have not been added yet. If they have not been added, why? Because Google needs to sort out a real problem first before confusing things with the spam filters.
I now just monitor webmasterworld, letting other people's nerves get frayed, and watch for the "Phew, it seems to be sorted" thread with many replies agreeing. Then I will go look for my sites on Google again, not before.
Agreed.
I have pages that are #1 or completely missing depending on the time of day. When it's #1, I'm a genius; when it drops out past #500, I am penalized.
I guess 500+ pages added better content and acquired more new links than me. Then, the next day, thanks to the precise, super calculations of Google, my page is again more relevant and important than every other page on the web for this term!
Come on guys, this is a joke. It's not everflux, superflux, freshdeep, or rolling Google. Google simply cannot get an index together right now.
If anyone here can think of one solid reason why Google would possibly want to rank pages #1, then drop them day after day, only to bring them back, I would love to hear it.
Maybe there is a good reason why a page would be relevant at some times during the day but not others? Maybe Google wants users to get confused by SERPS and click adwords? Maybe Google wants different SERPS on different datacenters to confuse webmasters? :)
Hate to say it, as it has become cliché, but Google is broken.
A rolling update would not mean pages being completely lost and reappearing at the top time and time again; it would mean minor iterations, as it is impossible for THAT much to change every day. The SERPS have no authority, because GG seems to think that showing any page is sufficient.
New sites seem to be welcomed by Google...
Here is what I notice:
Before, A was #1, B was #3, C was #1.
Now A is #1 on the second page.
B is #1 on the second page (for a search on one keyword).
C is sometimes #1, sometimes #10, and sometimes you can't see it at all.
Sometimes on 3 or 4 datacenters I can see old results where A is #3 for a search on one keyword but #11 for a search on 2 keywords...
All these websites are affected by Google's anchor text bug/filter.
Do I need to modify the anchor text and ask the webmasters who link to my websites to modify theirs, or do I wait for Google's update?
I'm going SPAM, 100%!
I give up. I have been sitting and waiting (per GG) while my business has gone to hell over the past few months.
So, hello GoDaddy, I am about to get 200-plus URLs and have some fun.
Thanks Google, I waited it out, but no longer.
Best regards,
Spam-Master-Anon27
Anon27:
I still believe that things will calm down at Google.
I can't believe that they would show users randomly generated SERPS for ever ;-)
All my sites are built in the same style, with the same style of getting backlinks and even about the same number of backlinks, yet they all have wildly different positions. So either Google is still working on the algo, or there are differences between my sites I don't know about.
Site ranks #32 for "widgets" and #243 for "blue widgets".
That leaves me with the conclusion that I am the problem, as I had set 2-3 text links on the phrase "blue widgets" pointing to different pages within the site.
Removed them, and I'm back to #2 for "blue widgets".
None of my competitors have been penalised, spammers or not. I am convinced Google is handing out penalties for over-optimisation of keyword text links.
Anon, check your page: how many keyword outgoing links have you got on it?
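If you would rather count them mechanically than by eye, here is a minimal sketch that tallies the anchor text of every link on a page. This is just my illustration of the over-optimisation theory above, not anything Google has confirmed, and "page.html" is a placeholder for a saved copy of your page:

```python
from collections import Counter
from html.parser import HTMLParser

class AnchorTextCounter(HTMLParser):
    """Tallies the visible text of every <a href=...> on the page."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.current = []
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.in_link = True
            self.current = []

    def handle_data(self, data):
        if self.in_link:
            self.current.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link = False
            # normalise whitespace and case before counting
            text = " ".join("".join(self.current).split()).lower()
            if text:
                self.counts[text] += 1

parser = AnchorTextCounter()
with open("page.html", encoding="utf-8", errors="replace") as f:  # placeholder file
    parser.feed(f.read())
for text, n in parser.counts.most_common():
    print(f"{n:3d}  {text}")
```

Anything that shows up over and over with the same keyword text is what this theory says to worry about.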
Got up this morning... foolish enough to take a peek at the index... and all the recovery from the last week has gone. The barometer sites just disappeared from #1 to totally off the radar. My reward, no doubt, for producing content sites (like the others I watch).
I think the point is that it's either broken.... or intentional. I believe that people will still have positive feelings to Google if it's the former... but if it's the latter? Hmmm.
Surely, if it were a new filter, that would be that!
Also, GG has not made any comment on this topic for a long while.
In his Q&A a few weeks back he said he didn't feel any need to worry about index pages missing.
1. If there were such a blatant filter as this in the pipeline, would GG not know about it?
2. Why is GG not commenting in any of these posts now?
Either he doesn't want to give anything away, or he doesn't want to post that Google is in fact not functioning correctly, or maybe he doesn't know?
One "blue widgets" link pointed to the domain; the other 3 linked to different pages in the site.
So they linked to the domain and 3 inner pages. One was also an <h1>.
Nothing else changed, and the problem seems to be solved.
I don't even think this is new to Google. I think if you have the same text linking to different pages, you get the penalty. I once attached a sitemap generated by software that named all the links identically and got a penalty, which disappeared when I removed it.
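To audit a page or a generated sitemap for that exact pattern, a sketch along these lines flags any link text that points at more than one distinct URL. The penalty itself is only this thread's theory, and "sitemap.html" is a placeholder for the file you want to check:

```python
from collections import defaultdict
from html.parser import HTMLParser

class DuplicateAnchorFinder(HTMLParser):
    """Maps each piece of link text to the set of URLs it points to."""
    def __init__(self):
        super().__init__()
        self.href = None
        self.text = []
        self.targets = defaultdict(set)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.href = dict(attrs).get("href")
            self.text = []

    def handle_data(self, data):
        if self.href is not None:
            self.text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.href is not None:
            text = " ".join("".join(self.text).split()).lower()
            if text:
                self.targets[text].add(self.href)
            self.href = None

finder = DuplicateAnchorFinder()
with open("sitemap.html", encoding="utf-8", errors="replace") as f:  # placeholder file
    finder.feed(f.read())
for text, urls in finder.targets.items():
    if len(urls) > 1:  # same link text, different destinations
        print(f"'{text}' -> {len(urls)} URLs: {sorted(urls)}")
```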
Yah, it's getting a bit ridiculous. My pages have been gone and come back a few times recently, but each time one has to wonder if they will make it back in.
<<I still believe that things will calm down at Google.
I can't believe that they would show users randomly generated SERPS for ever ;-) >>
I am starting to believe they will. Since the scoring of SERPS appears to be totally random (based on time of day / day of week), having more sites seems to be the answer for webmasters.
I have been preaching the "multiple sites approach" for some time now. Every time Google makes a change, no matter how much content/links one has, they are in danger of being lost in the SERPS. Now, more than ever, with Google dropping pages daily, multiple sites has become essential. Can it be spammy at times? Yes, absolutely. Do I care? No.
When pages are #1 in allinanchor: and "clean", yet are nowhere to be found in the SERPS (on most days), one has to conclude that either Google went WAY too far in filtering pages or it is seriously screwed up. If the web overwhelmingly votes for a page, why would Google ignore it?
And what do we learn from this?
Always have another domain up your sleeve that comes up when others fall down the SERPS.
I have seen this actually work for this update.
Back to the Topic.
Definitely need to look at the whole anchor text thing. One of my clients' sites has more than 100 one-way incoming links with the keyword as anchor text, and the site is #1 in allinanchor: results but not in the top 90 of the Google results. So if the site doesn't rank well, the lesson I learn is: keep mixing the anchor text a bit, and don't use the same anchor text for all incoming links.
GG said a site cannot be penalised for external factors. But what if all incoming anchor links to a site are the same? They can then ignore some of those links, and in ignoring them you will naturally fall in the rankings.
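One quick way to see how concentrated your incoming anchor text is, assuming you keep your own records of who links to you (Google publishes no such data): a few lines like these print the distribution. All the entries below are invented sample data.

```python
from collections import Counter

# (linking page, anchor text) pairs from your own link-building records.
# Everything here is made-up sample data for illustration.
backlinks = [
    ("http://example.org/a", "blue widgets"),
    ("http://example.org/b", "blue widgets"),
    ("http://example.org/c", "widgets by ExampleCo"),
    ("http://example.org/d", "blue widgets"),
]

counts = Counter(anchor.lower() for _, anchor in backlinks)
total = sum(counts.values())
for anchor, n in counts.most_common():
    print(f"{n:3d} ({100 * n / total:4.1f}%)  {anchor}")
# If one phrase dominates (here 75%), the advice in this thread is to vary it.
```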
And I still feel this is not their best results. They can do much better IMHO.
Nah.... you can't run a business on total instability.
The social contract is dead. Meaning the idea that if you play ball with Google and focus on content and good honest links, they will play ball with you too and offer at least some stability and prospect of placement.
Part of that contract was Google's PR and approachability, for example GoogleGuy on here. It doesn't take a lot for him to say that the system is working itself out and index pages will be fine again... IF that were the case.
He hasn't offered a bean of guidance on this for weeks, and remember that it's by far the biggest of current issues. No doubt not his fault, possibly under orders. But why?
Everyone really has to draw their own conclusions on the intent here.
Yes it could easily be broken and there is plenty of evidence. But if it isn't, you have to start to respond and build again sometime.
Strangely, I searched Yahoo for the same keyword phrase and my index page still shows on the first page of results. I checked the Yahoo cached copy of the page, and it's the one with the latest refresh date (3 Jul 2003), the same page that doesn't show in the top 100 SERPs on Google's data centers.
Hmmmm...
I don't believe this. I see many guestbook-spamming sites in the top positions for my keywords, and they use the same anchor text in every guestbook.
Why should your client's site be penalised for 100 links with the same anchor text while a guestbook spammer with 350+ backlinks with the same anchor text isn't?
And I am not talking about one site; there are plenty of sites that fit that profile!
We have played everything by the book with Google for years, no spam tactics or anything, and lo and behold, this morning our index page has been completely dropped from #3 to nowhere in sight, and our other site has had the same treatment.
Our websites have to pull in 10,000 per month at least to stay afloat.
This is one algorithm I do not like.
Bek.
I don't believe this theory, unless it is a hushed, sneaking penalty catching only some sites a day and spreading across the network over months.