Thanks for putting out the fire in my head, Matt. Good to have you chiming in on this one ;)
"quality signals", LOL
Back to 'normal'? I haven't laughed so much for years.
Anyone fancy defining 'normal' for me?
|b2net, I don't consider those rankings indicative of anything coming in the future. Some data went into the index without all of our quality signals incorporated, and it should be mostly back to normal and continuing to get back to normal over the course of the day. |
You know Matt, one little comment like that can save thousands of webbies from going bonkers? You should hang out here more often. There are lots of simple webbies like me who try very hard to keep up with Google's algo but just can't get a handle on it so a "little bug" like what just happened can be devastating!
Yeah Matt, Please don't leave us webbies all alone in webby world.. uuhh webbies.. #*$!?! puke.
ps. Is there any way you can talk to Sergey and Larry and ask them to create some online diagnostics for Google search, letting webmasters know when there's a fault or update going on? Thanks ;)
This sounds very good: 250,000 documents found, displaying 1-10.
Only top (up to) 1000 will be displayed.
Some (after 300-400) are in Arabic characters, mostly in Arabic (using search in the USA!). Some (after 500-600) are really related to the subject, in English.
Are you still trying to invent an algo to filter the top 1000 from 250,000 documents? Using documents written in Arabic in the top 300? What about the remaining 249,700 documents? I am absolutely sure those 249,700 do not contain even 2 of the three keywords used for the query.
Many people suddenly attacked blogs/social sites/directories and even changed their page titles. Only a few read this...
Not seeing anything close to 'normal' SERPs yet today.
Bizarre...my site:example.com search added about 3000 pages today.
[edited by: tedster at 4:44 am (utc) on Nov. 2, 2008]
[edit reason] switch to example.com - it can never be owned [/edit]
If everything is back to "normal" then I'm screwed... LOL
@ianevans: I see 200 more backlinks and 300 more pages indexed in GWT.
Too bad I'm buried under 10 pages of nonsense.
"My advice to many of you... stop worrying so much about your rank for keyword x, or keyword y, or rank of inner-page z... follow... no follow... blog links... directory links... inner-link... outer links... "
That is no doubt easy for you to say if you are not subject to the vagaries of the 950 penalty. However, I am, and all I have ever done is focused on the needs of my audience and providing great information for them. In this context I judge your advice to be unhelpful.
<edited for language>
I noticed normal results and my index back a few hours ago, but they have vanished again!
IS EVERYTHING REALLY BACK TO NORMAL?
<Edited for language>
I too am still seeing some very bizarre results for keywords monitored and site command.
Back to normal? Not in my niche. I am seeing domain after domain getting two and three listings per page of the search results, and it's all duplicate content produced by dynamic URLs, query strings, and URLs with partner codes appended. It's pushing me lower and lower in the search results.
i personally think that there are a million possible causes for issues like this
my site was hit in September, has 50% less traffic, and shows many of the strange symptoms already described here
it could be my own fault, i mean there are thousands of factors at play on a web page
but if there is a penalty and there is no message in Webmaster Tools, how can you know what happened?
sometimes you request reconsideration for a site that is already indexed, and it can get better, and then out of nowhere some keywords disappear again
then you change lots of things, and if time passes and nothing changes, you submit a reconsideration request again
wouldn't it be easier if google could offer a tool to diagnose your pages and show the rank (Webmaster Tools has no way to see the rank of a specific page)?
such a tool could tell you what issues you have, which would make them easy to fix and others easy to avoid
then everybody would be happy, at least knowing that their sites are performing badly due to poor content (according to google's algo) and not a penalty
i know it's not possible to reply to each request, bla bla bla, but it is not fair to be punished without knowing why
[edited by: tedster at 4:21 pm (utc) on Nov. 2, 2008]
szykman, one issue is that many ranking drops are not related to a true "penalty" - that is, something Google thinks of as a penalty because it records some flag against your site. A classic example is ranking problems that stem from duplicate content. Webmasters often use the phrase "duplicate content penalty" but it's not a penalty.
This morning the live SERPs seem to have calmed down in the areas I look at frequently. Even some wacky site: operator counts are now back in line.
thanks for the tips
glad things are better for your area, hope more areas will get better soon ; )
No improvement for me; is everything fixed for everyone else? This is looking bad for me, ranking 50th for my own unique name. Pretty extreme penalty if you ask me, you have to really piss Google off to not even rank for your own name... I hope this isn't fixed yet.
In the areas I watch the movement has stilled, but the search is far from complete. I would suggest that in its current state, those topics are about 65% relevant (as opposed to about 85% relevant before the 'error').
Right now I can see results back to normal, with index pages in the SERPs, at least in my niche.
Not back to normal for my subdomain yet. There is a lot of flux every time I check the site:sub.domain.tld query. Sometimes 500K pages, sometimes 200K pages, or 2M pages. This happens only for one subdomain, which lost 75% of its referrers from G. All the other domains are OK, never touched by 'the issue'.
I know everyone gets happy when MC comes on and makes a PROCLAMATION, but give it til Nov 5th folks.
As always, I must be the "bad guy".. who contradicts the Goog employees...
This isn't a "bug", it's a test.
If you think i'm full of hot air, please follow this thread all the way back to Oct SERPs.
This "bug" has rolled out EXACTLY as I said it would since the get go.
It's not a "bug"... it's a test. Which i expect to be firmly done by Nov 5th.
Until then, STUDY THE VARIOUS SERPS.
Whether you're ranking now or not, there's a lot to be learned from them right now.
|you have to really piss google off to not even rank for your own name |
This one really tickles me, because I've got a site that has been experiencing this problem for fourteen months and has just now corrected. I don't look for this to last. These Google fellows have been manipulating these results so much since the beginning of Universal Search that they've long forgotten what the dials do and the repercussions one has on the other. Correcting something, to Google, is getting 10% of the results back to what they previously were. MSN, please buy Yahoo so small business can get back to making money.
Just a quick "MC to Whitenight to WebmasterWorld" translator
|"I think this was a short-term issue and things should be back to normal pretty soon (if not already)." |
Translation - "Unlike the Position #6 test, which turned into a 'bug' because we forgot to re-roll it back, we at Goog won't forget this time, so don't worry guys"
|Some data went into the index without all of our quality signals incorporated |
Translation - "We hadn't added the Ghost Dataset(TM) yet to those SERPs which is why they looked horrible"
|"it should be mostly back to normal and continuing to get back to normal over the course of the day." |
Translation - "This is a test that we wanted to run over the least busy time before the holidays. It's mostly done now. And again, we WON'T forget to roll it back like last year"
YAY! I got a pretty big boost for my main keyword. All auxiliary keywords seem to be unaffected this time :D
"This isn't a "bug", it's a test."
Nope, it wasn't a test, whitenight.
Wow, my category jumped again from 68 million results to 142 million results and Google has a new google map + location links sitting right at the top for my main keyword.
Ouch, traffic looks like it's already tanking 35% on my index page....
Hi Matt. We are still noticing considerably poorer results in many genres that I look at, compared to pre-Oct 31.
Of particular note, Google Books has taken the lead, with far too many listings in the top ten for searches it is not well-suited to. A strong second would be a flood of inner pages from related shopping and informational websites, which have taken (and displaced) what I would deem very authoritative websites (which were competitors to websites I own).
Now I am all for having less competition, but I don't feel that the current results are doing anyone justice :)
I could send you examples, but I am sure you can find them quite easily.
Google makes a settlement with book publishers and authors, and all of a sudden there are 40 book results in the top 100 positions.
If it was not a 'test' then what was it?
Looks to me like Google accidentally revealed that their system no longer chooses the best results but instead filters out the bad ones, and someone applied the wrong filters coupled with a dataset error. So restoring the dataset and filters after a lapse in time is not the same as rolling back to the 'normal' situation. Perhaps Google Search should be renamed Google Book Search at the moment.
In my industry, the first page of SERPs is stuffed with sites that sell banner and text link advertising and offer reciprocal link exchanges - all against Webmaster Guidelines. Google doesn't seem to care.
This recent "bug" at least removed this irrelevant garbage and left the first page looking relevant for the first time this year.
Too bad it's only a "bug" and not a foretaste of -improved- things to come.
man.. google's results pages are getting so messy now. local results, sitelinks, shopping results, postcode filters etc. sure, some of these things can be useful at times, but the results page is beginning to look like yahoo's homepage :) the relevant natural results are getting buried amongst all this crap - why does every service eventually seem to kill itself by adding features and complicating things?