b2net, I don't consider those rankings indicative of anything coming in the future. Some data went into the index without all of our quality signals incorporated, and it should be mostly back to normal and continuing to get back to normal over the course of the day.
You know Matt, one little comment like that can save thousands of webbies from going bonkers? You should hang out here more often. There are lots of simple webbies like me who try very hard to keep up with Google's algo but just can't get a handle on it so a "little bug" like what just happened can be devastating!
Only top (up to) 1000 will be displayed.
Some (after position 300-400) have Arabic characters and are mostly in Arabic (searching from the USA!). Some (after 500-600) are really related to the subject, in English.
Are you still trying to invent an algo to filter the top 1000 from 250,000 documents? Using documents written in Arabic in the top 300? What about the remaining 249,700 documents? I am absolutely sure those 249,700 do not contain even two of the three keywords used in the query.
Many people suddenly attacked blogs/social sites/directories and even changed page titles. Only a few read this...
That is no doubt easy for you to say if you are not subject to the vagaries of the 950 penalty. However, I am, and all I have ever done is focused on the needs of my audience and providing great information for them. In this context I judge your advice to be unhelpful.
<edited for language>
My site was hit in September, has 50% less traffic, and shows many of the strange symptoms already described here.
It could be my own fault; I mean, there are thousands of factors at play on a web page.
But if there is a penalty and there is no message in Webmaster Tools, how can you know what happened?
Sometimes you request reconsideration for your site that is already indexed, and it gets better, and then out of nowhere some keywords disappear again.
Then you change lots of things, and if time passes and nothing changes, you submit a reconsideration request again.
Wouldn't it be easier if Google could offer a tool to diagnose your pages and show the rank (something Webmaster Tools doesn't have: a way to see the ranking of a specific page)?
Such a tool could tell you what issues you have, and it would be easy to fix them and avoid others.
Then everybody would be happy, at least knowing that their sites are performing badly due to poor content (according to Google's algo) and not a penalty.
I know it's not possible to reply to each request, blah blah blah, but it is not fair to be punished without knowing why.
[edited by: tedster at 4:21 pm (utc) on Nov. 2, 2008]
This morning the live SERPs seem to have calmed down in the areas I look at frequently. Even some wacky site: operator counts are now back in line.
I know everyone gets happy when MC comes on and makes a PROCLAMATION, but give it til Nov 5th folks.
As always, I must be the "bad guy"... who contradicts the Goog employees...
This isn't a "bug", it's a test.
If you think I'm full of hot air, please follow this thread all the way back to the October SERPs.
This "bug" has rolled out EXACTLY as I said it would from the get-go.
It's not a "bug"... it's a test, which I expect to be firmly done by Nov 5th.
Until then, STUDY THE VARIOUS SERPS.
Whether you're ranking now or not, there's a lot to be learned from them right now.
You have to really piss Google off to not even rank for your own name.
This one really tickles me because I've got a site that has been experiencing this problem for fourteen months and has just now corrected for this. I don't look for this to last. These Google fellows have been manipulating these results so much since the beginning of Universal Search that they've long forgotten what the dials do and the repercussions one has on the other. Correcting something, to Google, is getting 10% of the results back to what they previously were. MSN, please buy Yahoo so small business can get back to making money.
"I think this was a short-term issue and things should be back to normal pretty soon (if not already)."
Translation - "Unlike the Position #6 test, which turned into a 'bug' because we forgot to re-roll it back, we at Goog won't forget this time, so don't worry guys"
Some data went into the index without all of our quality signals incorporated
Translation - "We hadn't added the Ghost Dataset(TM) yet to those SERPs which is why they looked horrible"
"it should be mostly back to normal and continuing to get back to normal over the course of the day."
Translation - "This is a test that we wanted to run over the least busy time before the holidays. It's mostly done now. And again, we WON'T forget to roll it back like last year"
Of particular note, Google Books has taken the lead, with far too many listings in the top ten for searches it is not well-suited to. A strong second would be a flood of inner pages from related shopping and informational websites, which have taken over (and displaced) what I would deem very authoritative websites (which were competitors to websites I own).
Now I am all for having less competition, but I don't feel that the current results are doing anyone justice :)
I could send you examples, but I am sure you can find them quite easily.
Looks to me like Google accidentally revealed that their system no longer chooses the best results but instead filters out the bad ones, and that someone applied the wrong filters coupled with a dataset error. So it seems restoring the dataset and filters after a lapse in time is not the same as rolling back to the "normal" situation. Perhaps Google search should be renamed Google Book search at the moment.
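To illustrate that poster's hypothesis (and this is purely a toy sketch of the idea, not anything resembling Google's actual code): there is a real behavioral difference between "rank everything and pick the best" and "filter out bad documents, then rank the survivors". In the second design, applying the wrong filter can vaporize the most relevant pages even though their relevance scores never changed. All names, scores, and thresholds below are invented for illustration.

```python
def rank_then_pick(docs, score, k):
    """Choose the top-k documents purely by relevance score."""
    return sorted(docs, key=score, reverse=True)[:k]

def filter_then_rank(docs, score, filters, k):
    """Drop documents failing any quality filter, then rank survivors.
    If the wrong filter is applied, good documents vanish even though
    their relevance scores are unchanged -- the effect described above."""
    survivors = [d for d in docs if all(f(d) for f in filters)]
    return sorted(survivors, key=score, reverse=True)[:k]

# Toy corpus: (name, relevance, quality-signal) -- all values made up.
docs = [("A", 0.9, 0.8), ("B", 0.7, 0.2), ("C", 0.5, 0.9), ("D", 0.3, 0.1)]
score = lambda d: d[1]

good_quality = lambda d: d[2] >= 0.5   # the intended quality filter
wrong_filter = lambda d: d[2] < 0.5    # a mis-applied (inverted) filter

print(rank_then_pick(docs, score, 2))                    # A and B lead
print(filter_then_rank(docs, score, [good_quality], 2))  # A and C survive
print(filter_then_rank(docs, score, [wrong_filter], 2))  # only B and D remain
```

With the inverted filter, the two most relevant documents disappear entirely, which is roughly the "good sites gone, garbage ranking" symptom the thread keeps describing.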
This recent "bug" at least removed this irrelevant garbage and left the first page looking relevant for the first time this year.
Too bad it's only a "bug" and not a foretaste of -improved- things to come.