Forum Moderators: open
Recently my (pr4) site dropped from position 10 to 40 on one of my keywords
I noticed that somebody out there (easy to see who) has been filling up the results using a trick, I guess a new trick.
They use a number of unindexed URLs, i.e. listings with no cached link and no size shown. Apparently Google has some kind of problem here that can be exploited. I would guess unindexed URLs should get lower rankings, but apparently this is not the case.
Do we have more information on that? Has anyone researched this?
Thanks
MK
I imagine what you are seeing is cloaking: the page you see is different from the one presented to googlebot. Noarchive is typically used to prevent humans from seeing the trickery.
I've noticed quite a bit of this lately ... seems awfully effective.
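For anyone unfamiliar with the mechanics: user-agent cloaking can be as simple as a server-side check on the request headers. Here is a minimal Python sketch of the idea; the function name, crawler list, and page strings are all made up for illustration, not taken from any actual spammer's setup:

```python
# Minimal sketch of user-agent cloaking: the server inspects the
# User-Agent header and returns different content to crawlers than
# to human visitors. All names and page text here are hypothetical.

def serve_page(user_agent: str) -> str:
    """Return keyword-stuffed text to crawlers, a normal page to humans."""
    crawler_signatures = ("googlebot", "slurp", "msnbot")
    if any(sig in user_agent.lower() for sig in crawler_signatures):
        # Version shown only to spiders; 'noarchive' keeps humans from
        # seeing this trickery via the cached-page link.
        return ('<meta name="robots" content="noarchive">'
                "keyword keyword keyword ...")
    return "<html><body>The normal page visitors see.</body></html>"
```

This is exactly why the noarchive tag is the tell-tale sign: without a cache link, a human can never see what googlebot was served.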
Welcome to WebmasterWorld [webmasterworld.com]
I don't understand. Is what the other sites are doing considered spam by Google and do they have some method of penalizing any of these?
Fresh information is given a temporary "boost" in search results. This lasts a couple of days, depending on the PageRank of the site. So for popular sites that change every few days, the new information is available (I think they called the project "minty") sooner than once a month. This has worked very well.
So Google retired freshbot. They changed its behaviour so that it will spider deeper into your site the higher your PageRank is, and will try to "deduce" how often a page changes, then come back for it at regular intervals.
So you want high rankings? Update your pages. And if you do it regularly, you'll get regular visits from google.
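Nobody outside Google knows how the "deduce how often a page changes" part actually works, but the general idea can be sketched as a simple adaptive revisit interval. This is pure guesswork on my part, not Google's algorithm; the function name and the halving/backoff factors are invented:

```python
# Toy sketch of an adaptive recrawl schedule: shorten the revisit
# interval when a page has changed since the last fetch, lengthen it
# when it hasn't. Guesswork for illustration, not Google's algorithm.

def next_interval(current_days: float, page_changed: bool,
                  min_days: float = 1.0, max_days: float = 30.0) -> float:
    """Return the number of days to wait before the next crawl."""
    if page_changed:
        current_days /= 2.0   # page changed: come back sooner
    else:
        current_days *= 1.5   # page unchanged: back off
    return max(min_days, min(max_days, current_days))
```

Under a scheme like this, a page you update daily converges to near-daily visits, while a static page drifts out toward the monthly cap - which matches the "update regularly, get visited regularly" behaviour people are reporting.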
So here's the nasty trick - and I learned it kinda by accident, kinda by running across these pages. These are pages that are actually programs that change the text around a lot. Usually they'll target a specific topic. I'm noticing this with one of my pages....
I have a "waste of time" page that shows off how Markov chains can be used to generate "human readable" random text. It comes up with some funny stuff sometimes. But every time that deepfreshbot comes around, that page gets ranked higher than my home page for a search like "bitesizeinc" (the domain name). My home page has a PageRank of 5 (hopefully on the rise, knock on wood), and the waste-of-time page is a 1. And the home page has MORE specific meta tags and title and all that stuff. But after a day, it returns to the normal order....
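For the curious, a word-level Markov chain text generator of the kind described is only a few lines. This is a generic sketch (not the actual page's code): build a table mapping each word to the words that follow it in some source text, then walk the table picking random successors:

```python
import random
from collections import defaultdict

# Word-level Markov chain: map each word to the list of words that
# follow it in the source text, then walk the chain to emit
# "human readable" random text. Generic sketch for illustration.

def build_chain(text: str) -> dict:
    """Build a successor table from a corpus of plain text."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain: dict, start: str, length: int = 20) -> str:
    """Walk the chain from a start word, stopping at a dead end."""
    word = start
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break  # reached a word with no recorded successor
        word = random.choice(followers)
        out.append(word)
    return " ".join(out)
```

Because every run picks different successors, the page body differs on every load - which is presumably what trips the freshness boost.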
Actually, I was afraid that Google might consider this example "SPAM" and would penalize me, so I slimmed it down considerably. I suppose if I added the noarchive command along with this, and only changed the page daily instead of every time you load it, no one would really be able to tell this was happening.
Chris
[edited by: Marcia at 5:42 am (utc) on Oct. 11, 2003]
[edit reason] No URLs or sigs, please. [/edit]
>> I don't understand. Is what the other sites are doing considered spam by Google and do they have some method of penalizing any of these?
I am lost on this one as well! We read references to site banning etc., and yet I have actively attempted to draw Google's attention to one site owner who has over 90% coverage of the SERPs for every search string relevant to his tourism content. He is so good he is now running over into neighbouring geographic areas - unbelievable!
All attempts at dialogue have been ignored by Google, and every aspect of this site is SPAM: hidden text links, gateway pages, duplicated content, domain cloaking. For six weeks we have had no joy at all.
So how does Google detect spam, and how do they deal with it? So far I am a little unconvinced, as the site I am referring to breaches every possible spam rule - which is why it is blanketing the first 50 results.
How do you report spam? I tried the online form, but so far nothing has happened.
This is so blatant that it destroys the use of Google for a search on anything related to this one operator's topic(s).
/Wayne
I've been looking for evidence of any type of spam filtering or the like at Google for weeks now (more than a month?), and I just can't find anything significant. I remember an old thread/post by GG suggesting that they would not do that much about spam for a while, as they needed to work with the new stuff... that seems to have been quite an understatement.
Personally I don't really have any customers that would benefit from a heavy spamming campaign, although I can't help considering it, as (a) it seems rather easy at the moment, and (b) I'm looking into a few pretty spammy categories preparing for future work (in which, say, the top 50 or so are more or less some shade of grey). Then again, these types of customers would definitely not be interested, regardless of results.
What I would like to see is action from Google on this issue. I definitely expect it to come at some point, but the situation right now is not encouraging for "whitehat" SEO at all.
/claus
For example, I have several dozen pages on my site where I added the "noindex,nofollow" tag four months ago. They are still in the index.
(I'm considering building a "please-remove-these" page that links to those pages, hopefully getting Google to remove them. Maybe add a 301 redirect in addition to noindex,nofollow.)
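For reference, the two mechanisms mentioned look roughly like this; a minimal Python sketch where the helper name and example URL are made up. The meta tag goes in the page's head; the 301 status and Location header are sent by the server before any page body:

```python
# Two ways to tell a crawler a page should go away, sketched in Python.
# The robots meta tag belongs in the page's <head>; the 301 redirect
# is a status code plus a Location header sent by the server.
# Function name and URL below are hypothetical.

ROBOTS_META = '<meta name="robots" content="noindex,nofollow">'

def redirect_301(new_url: str) -> tuple[int, dict]:
    """Build a permanent-redirect response: status code plus headers."""
    return 301, {"Location": new_url}
```

Belt and braces: the 301 tells the crawler the old URL has permanently moved, while noindex covers the case where the page still gets fetched directly.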
***
Anyway. My theory about spammy sites (like Global Wayne mentioned) and Google:
Google is probably hesitant to apply new rules to older pages. Imagine what would happen if they removed older, apparently "spammy" sites in one fell swoop: countless phone calls and e-mails from irate webmasters. Bad press from a lot of legitimate sites being removed. Etc., etc. They probably don't need that right now.
Or, maybe these pages haven't been analyzed in months. Just my guess...
Johanna
Still, it's a bit strange that Google doesn't re-read indexed pages for months to check their content.
For the heck of it, I searched for those pages in SafeSearch and they weren't displayed. So, at least Google knows it doesn't know what's on those pages :-)
>> What I would like to see is action from Google on this issue. I definitely expect it to come at some point, but the situation right now is not encouraging for "whitehat" SEO at all.
That's also what I have been pointing out for so long.
Meanwhile, there are some of us here who either are not aware of what's going on at Google, or are so infatuated with Google that they keep believing that only 'what's in the guidelines' works for Google.
It's easy to believe in idealism, but it's out of touch with reality. Actions speak louder than words. So, no matter what Google says, when their own actions contradict it, which one would you follow: guidelines or actions?
I'm not an advocate of SEO tricks but a believer in quality content, but it's tough to compete against someone who uses 'blackhat' techniques when Google itself tolerates these kinds of practices... almost to the point of their being acceptable.
Cheers
It's just good behavior - you told it not to follow your links after all ;)
SafeSearch... *lol* ... not one I use frequently, if ever, but nice to know it's useful ;)
>> believing that only 'what's in the guideline' works for Google
You can still get good results and top-ranking pages by following the guidelines. It's just that more "spammy" things seem to work nicely as well, and in some cases they might just be much quicker/easier to implement than the patient good-guy approach.
It's not exactly encouraging, and as good ideas tend to be adopted quickly, I hope that they'll do something soon. However, "hope" is what I do now; a few weeks ago I was "certain" that something would happen soon. Then again, "soon" is a relative measure.
/claus
>> I'm not an advocate of SEO tricks but a believer in quality content, but it's tough to compete against someone who uses 'blackhat' techniques when Google itself tolerates these kinds of practices... almost to the point of their being acceptable.
I totally agree. Let's face it: following the guidelines and doing the right thing is quite time consuming.
Watching these other dudes just bust the format with a devil-may-care attitude hardly rewards your effort.
Meanwhile the spammers reign over the given search strings; it makes Google look rather silly to me.
The spammers we are dealing with are in travel-related sites, and what we are looking at is almost white-collar crime, as one particular operator actually poses as all these major sites (my clients among them). I know he is turning over an extra 50K a month just from this one client. The client won't drop him as an agent... so this is an ugly cycle to be caught in. BTW, AltaVista tipped him from 2K URLs down to 2 in 48 hours.
I live in hope that Google suddenly drops him as well - he has invested in almost 80 domains with stolen content for 3 geographic regions! That is a BIG spam effort...
We keep going the clean route of professional publishing, waiting for the morning someone at Google checks to see where the backlinks are all coming from, and then he will be gone!
We do live in hope:-)
/Wayne
The page that is being fed to Googlebot is so ugly that there's no way you can present it to a user and have it make sense. But obviously Google likes it :( As a user, if you happened to land on a page like this, you would be hitting your back button in record time.
What makes this page hard to present to normal users is the 'fact' that you can't use CSS (without elaborating why)... all the styling 'should be' embedded in the page's HTML tags.
No matter what... the content makes no sense at all. Think of it this way: there's no content, only tags and keywords.
Now create a page like that for your users :D Either way you are caught between Google and users; no wonder the owner has to resort to cloaking and redirects.
Cheers