Welcome to WebmasterWorld
Is this a side-effect of improved dynamic spidering, or have the web developers for the sites wised-up (en masse)?
Random forum posts and PDFs sometimes show up in the top 10 for words that are pricey at Overture ($5+).
My hope is that this is just a temporary situation before the full update occurs.
>>I haven't seen an increase in pdf/doc files in the index, and I've got a pretty good spectrum of stats to draw on. <<
Well sorry, but it isn't just me seeing this... certainly not... far too widespread a problem for that. Take a look around the threads. There is a significant increase in these file types in high unwarranted positions, and in some areas it is totally strangling web content.
>> But Napoleon, the last time I asked for an example involving your site, you weren't willing to tell me any specifics. <<
I have nothing to hide... BUT.... I prefer caution.
So I identify my sites. Then I come here and point out problem areas like this. Doesn't that make me a hostage to fortune? A hostage to Google's ethics.... my livelihood on the trust that they won't take punitive action?
No, I'm not crazy or paranoid, I just prefer not to open any risks, however small, where it is not necessary. I'm not suggesting at all that anything like this would happen - but I obviously don't know 100% that it wouldn't.
And SteveB... I'm not the only one who feels like this. That's one of the positives of this board. We can discuss matters without undue risk. The complaining isn't unfounded; it is based on fact... fact which Google ought to explore and deal with. You can't dismiss our observations and pretend they don't exist just because we prefer not to file a spam report.
>> When Amazon come out 2 times in the first 10 for terms like "men's underwear", honestly GoogleGuy this should ring you a bell. <<
Absolutely, and that's a very mild example.
If your guy is telling you nothing has changed this week GG, he's wrong.... and I don't care who he is. I believe my own eyes. The change is stark and obvious.... and it looks very very bad in some areas. This is exactly what others are saying.
The answer is simple: Ask him what he has really changed, and tell him to change it back. Then ask him to explore the intermittent index problem!
In case I do come over as Mr Grumpy though, perhaps that is because I care. Google has long been the bastion of free search and free indexing, whilst most of the rest caved in to commercial manipulation of their returns. It has been a champion of quality over short term greed.
That's what I care about... I wouldn't like to see a shift in philosophy, but fear it is around the corner and may even be in play. There are too many problems in place which don't appear to be being taken too seriously. The two main ones of course being the intermittent index (which recovers on refresh) and the amazon/pdf/etc strangulation.
These ARE big deals, and Google's reaction to them will ultimately tell us a lot about their current attitude to quality.
To be fair again (as much as it pains me - joke) they did address the first missing index problem a few months ago, although obviously not totally.
The first step to tackling a problem, though, is to accept that it exists. I would urge you, GG, to press on with these two issues and get to the bottom of them. I for one believe you will be doing Google (and the searching public) a great service if you do.
It's just arm-waving. You have an invitation from someone who can do something about it and you want to just assert your observations.
And this has zero to do with anonymity quite obviously. If you are talking about a widespread phenomenon then it will exist in areas totally unrelated to your own interest area and you can submit reports on those.
I actually used the "Report a spam result" feature of Google about 1 week ago for the first time. I reported 2 sites that were using very blatant keyword stuffing. So far no reply and no change in the results. I lose a lot of faith in any company when I email them and they either A) do not reply, and/or B) take no action on my email.
Now contrast the above with emailing AdWords or AdSense, where I get replies within 48 hours.
All-in-all I see no point in reporting these types of things, or should I be more patient?
Be patient :)
As has been mentioned many a time, rather than try to take a swipe at one site flouting the rules, Google will always try to discover the cause of the problem over a period of time, and then correct the error across the whole algorithm.
Not sure how many reports are going in; however, to expect a reply to each and every one would be impossible, so just be patient and move on to other things in your life.
Hey, and let's cut some slack to GoogleGuy; the guy really does his best, and having negative comments thrown about is no good for any of us.
Ok, it's been 1 week and 5 minutes now, just how patient am I supposed to be :o)
As for the results (try "I want to lose weight"), there are definitely some changes, and some PDF and Amazon results, but looking at them it's hard to argue that my site deserves to be there more than they do.
That's your decision Steve.
>> If you are talking about a widespread phenomenon then it will exist in areas totally unrelated to your own interest area and you can submit reports on those. <<
I tend to spot them in areas related to mine, because that's what I am looking at. However, there are countless others which are not hard to find... if I can find them, so can GG.
Some people are already starting to cite examples anyway. It's widespread and it's a real problem. It seemed to appear over the weekend... so something DID change.
I guess some of the problem could be the before/after syndrome. Unless you knew the results for a search were clean before, you won't understand that they have badly deteriorated.
However, (Amazon/PDF/Amazon/EBay/PDF) for example at (1,2,3,4,5) is something I had NEVER seen before.... so I think you can safely bet that anything remotely resembling this is a result of the change.
Believe me... it ain't a figment of my imagination, or of the many other people who have also commented in this and other threads. It's stark... and if I'm "arm waving", fine. I prefer to think that exposing a significant weakness like this is a positive.
Imo, wouldn't it be quite easy to spot duplicates - which no one really wants - in a SERP? E.g., seeing both Amazon.com and Amazon.co.uk on the first page, then 'punishing' the least appropriate mirror so it shows up at least some pages later.
Maybe it's already been fixed and it's all just a matter of time before the SERPs are nice and clean again.
Having a quick look at the SERP for this result, you will see the Amazon.co.uk, .com and .de pages being displayed.
This is a good example of why there should be a Google.us and a Google.co.uk that automatically search for sites hosted in those countries as the default option. Alternatively, these pages should be picked up somehow as being from the same site (or group of sites), with two results shown followed by the "more results from this domain" option.
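To make the idea concrete, here is a minimal sketch (Python, purely illustrative; `site_group` is my own crude heuristic, not anything Google has described) of collapsing regional mirrors so each site group gets at most two slots on the page:

```python
from urllib.parse import urlparse

def site_group(url):
    """Crude heuristic: collapse regional mirrors (amazon.com,
    amazon.co.uk, amazon.de) to one key by taking the first
    hostname label after stripping any leading 'www'."""
    labels = urlparse(url).netloc.lower().split(".")
    if labels and labels[0] == "www":
        labels = labels[1:]
    return labels[0]

def collapse_serp(results, per_site=2):
    """Keep at most per_site results per site group, preserving rank order."""
    counts = {}
    kept = []
    for url in results:
        key = site_group(url)
        counts[key] = counts.get(key, 0) + 1
        if counts[key] <= per_site:
            kept.append(url)
    return kept

serp = [
    "http://www.amazon.com/book1",
    "http://www.amazon.co.uk/book1",
    "http://www.amazon.de/book1",
    "http://www.example.org/widgets",
    "http://www.amazon.com/book2",
]
print(collapse_serp(serp))
# ['http://www.amazon.com/book1', 'http://www.amazon.co.uk/book1',
#  'http://www.example.org/widgets']
```

A real engine would need a proper registrable-domain list (so unrelated subdomains of one host aren't lumped together), but even this toy filter drops the third and fourth Amazon result from the example.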
I think it's also important that once this amazon/pdf/ebay/whatever problem is accepted (it is frankly far too stark to be denied for ever), the fluctuating index issue is also addressed. That is also a big one, but slightly less visible.
Not mentioning these for fear of being flamed is actually doing Google a dis-service... surely it's far better to be highlighted here by us, than by Joe Public in the wider world?
As well, I am seeing a LARGE number of those SomeonesSite.com/chnl10.asp?keywords=blue%20widgets results coming up in the top 10 and 20 results for several searches I am doing.
Several (not all) of my sites have taken BIG hits in the past couple weeks and most of the new sites that are now ranking higher are pure crap results.
I think the major sites such as Amazon who everybody on the net knows about and who have a major affiliate program should be manually regulated by Google.
Ok. Why don't we just manually regulate all sites that do well on Google? Can you see where this is going?
Also: it got a little bit heated with GoogleGuy earlier on. He doesn't have to give us information. He is here out of Google goodwill; you don't see them lighting up messageboards all over the web, do you? He said that the first thing he did when he read these threads was go over to a search engineer and run off some stats regarding the prevalence of pdfs and docs. No change. But that isn't the point: he tells us this and we give him cr*p! Now that is what I call ungrateful... Where else could you get that sort of information?
I don't think it was particularly heated... he just seemed a little reluctant to see what is totally evident to almost everyone else. That's not a criticism by the way - when you are very close to something it IS sometimes hard to see the obvious. It's perfectly natural and normal.
I can get information like that just about anywhere else, because it isn't useful information. We can all see with our own two eyes that the results are now full of Amazon.co.uk, eBay, msnshop, epinions, etc. When someone denies it, we should be grateful?
I wonder how many of you, having domain.com and domain.co.uk, would complain if the top 10 were flooded with your domain listings.
To answer your question... I would be exxxxtremely happy, but by the same token: 'how many webmasters are going to hate me?', 'how many webmasters are going to report my site for spamming?'.
But that's not the point. As a searcher, would you be happy to see 10 results belonging to 1 site? You would be looking at other search engines in no time.
But to go along with your concept: let's say I dominate the top 10 for 'blue widget'... what is to prevent other webmasters from dominating, say, 'red widget', 'green widget', 'purple widget', etc.?
See what I mean? If this is the current algorithm, then pretty soon the top 10 for every query would be dominated by 2 or 3 sites; that's 7-8 redundant listings that don't give searchers much choice. Do you think this kind of search engine would survive in the long run? Alltheweb here I come, and along with me are the people who value my opinion.
It's about search quality.
p.s. You know what I have observed also? With that query 'web database applications', the AdWords are extremely attractive to click; most are right on the money. It's like we have a reversal here: the regular SERPs are now the advertisements, and the AdWords are the ones that are more relevant to the query.
Funny you should mention this. I've noticed a dramatic increase lately in the CTR on our AdWords campaign. Is this because the top serps are not providing what the searcher is looking for?
Looking at most of the results I don't really believe that; however, after making no changes to our ads, the clickthroughs are up.
But the results you are looking at are based on 'the keywords that you advertised', am I correct on that one?
If my assumption is correct, do you have access to the actual Google referrals? That is, what are the actual queries used when users click on your ads?
Let's not forget, Google uses 'related search' type of technology. Meaning if the query is...
'How do I use blue widget?', obviously nobody would have paid for that exact term. However, Google is able to pull AdWords based on 'blue widget'.
Maybe if you could find out the exact query then that could give you an overall picture of what's happening to the regular serp.
Take for example the query 'web database applications'. How many AdWords advertisers do you think have targeted that phrase? Most are targeting 'web applications' or 'database applications' or 'web site database'.
I'm pretty sure AdWords advertisers are experiencing an increase in CTR because of queries like 'web database applications'.
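The effect described above can be shown with a toy broad-match check (Python; my own simplification of how broad matching might behave, not AdWords' actual algorithm), where an ad is eligible whenever every term it bid on appears somewhere in the query:

```python
def broad_match(bid_phrase, query):
    """Toy broad match: the ad is eligible if every bid term
    appears among the query's terms, in any order."""
    bid_terms = set(bid_phrase.lower().split())
    query_terms = set(query.lower().split())
    return bid_terms <= query_terms

query = "web database applications"
for phrase in ("web applications", "database applications",
               "web database applications", "web hosting"):
    print(phrase, "->", broad_match(phrase, query))
# web applications -> True
# database applications -> True
# web database applications -> True
# web hosting -> False
```

So a three-word query can light up every advertiser bidding on any two-word subset of it, which would explain a CTR bump without any change to the ads themselves.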
Here's how things stand from where I sit. After talking to a crawl engineer and another engineer, I'm relatively convinced that we're not crawling/returning pdf/doc files more than before. That's from looking at actual stats, histograms, etc. that we keep internally as we do our crawl and monitor the quality of our results. If I wanted to go back and raise this issue with them again, I would need some data. That's why I asked for people to send me actual reports that demonstrate the issue.
So far, I've gotten one spam report in the last day with the keywords pdf or doc. For the search in that complaint, there are no pdf's in the search results. Instead, the complaint was that the pdfs on that site don't return the content that the site promises. I'm pretty sure that it was just chance that that person happened to mention "pdf" in their complaint. :)
Where do we go from here? I am happy to dig into this some more--but I need data. Anecdotal reports without specifics won't do me any good. If folks are willing to help, just drop me a spam report with a query and the word pdf in the text of the description. If I get any reports, I'll check it out.
He has solicited feedback that people have so far been unable to provide.
Napoleon, you need to look up irony or something. You yourself posted that if a person was unfamiliar with a search previously, it would be hard to know which pdfs or docs or whatever are new. So if YOU can't go to an anonymous search and say clearly "this is bad" and why, then how the heck is anyone else supposed to? How?
If people see a problem in their own interest area and are too mortified to mention it, and they can't find a similar unrelated problem to bring to Google's attention, then what is that? It's close to the very definition of a "non-problem", actually.
Waving your arms and making a ruckus is a waste of good electrons if you are too lazy to do anything about it.
And I'm not saying this problem doesn't exist, but it does NOT exist in the areas I focus on. However, a possibly related or similar problem does: thousands of worthless PR0 pages that no human ever lays eyes on are raising the rankings of the sites they link to via anchor text.
In other words, there may be the exact same number of pdfs and docs in the searches, but they RANK HIGHER and thus are more noticeable, just because Google is valuing the anchor text on worthless dynamic-type pages pointing to these Amazon-type pages. It is a serious problem if for every query Amazon shows 50,000 anchor text links to those words, and Google counts them all. And then there is the Amazon mirror question: considering the one example posted, it would seem possible that the top ten search results could consist of two results each from all the Amazons, each showing the same two books. Obviously that is terrible, and it would be productive to send such results to Google.
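To illustrate why raw anchor counting is dangerous, here is a toy scoring comparison (Python, entirely my own illustration, not Google's actual ranking): counting every link lets 50,000 boilerplate anchors from one network swamp five independent editorial links, while counting one damped vote per linking domain reverses the outcome.

```python
import math

def anchor_score_raw(links):
    """links: list of (linking_domain, anchor_text) pairs.
    Raw counting: every link is a full vote."""
    return len(links)

def anchor_score_damped(links):
    """One vote per linking domain, log-damped so a handful of
    independent domains can outweigh a single link network."""
    domains = {domain for domain, _ in links}
    return math.log1p(len(domains))

# 50,000 identical boilerplate anchors from one affiliate network...
boilerplate = [("affnet.example", "blue widgets")] * 50000
# ...versus five independent editorial links.
editorial = [(f"site{i}.example", "blue widgets") for i in range(5)]

print(anchor_score_raw(boilerplate), anchor_score_raw(editorial))  # 50000 5
print(anchor_score_damped(boilerplate) < anchor_score_damped(editorial))  # True
```

Under raw counting the boilerplate network wins by four orders of magnitude; per-domain damped counting makes the five independent links worth more.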