Forum Moderators: Robert Charlton & goodroi
Continued from:
[webmasterworld.com...]
The same on-and-off problem here in Turkey.
But J3 goes on and off, with no steady results. At least three different result sets:
216.239.63.104 (I think with additional tweaks)
64.233.161.104 (still J2)
64.233.179.104 (J3)
If Google thinks they can penalize or downgrade a site just by its backlinks, then they are on the wrong path. It would make it easy in the future to sabotage competitor sites by submitting their links to thousands of blogs and message boards and posting them on hundreds of free sites, none of which the competitor has any control over. I know people who are capable of doing it for just a few dollars.
If Google thinks that someone is spamming with their backlinks, they have to verify the on-page factors too. That way they can make fair decisions algorithmically.
It is good to see Google fighting spam aggressively, but when applying automated backlink spam filters they should be very careful that they don't lose high-quality sites that add a lot of value for their search users.
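The idea above — a backlink-based verdict that must be confirmed by on-page factors before penalizing — can be sketched in a few lines. This is a toy illustration, not Google's algorithm; every function name, score, and threshold here is a made-up assumption:

```python
# Toy sketch (NOT Google's actual algorithm): a backlink-spam verdict that
# must be confirmed by on-page signals before a site is penalized. All
# names, scores, and thresholds are hypothetical.

def backlink_suspicion(links):
    """Fraction of inbound links coming from low-quality sources."""
    if not links:
        return 0.0
    return sum(1 for l in links if l["source_quality"] < 0.2) / len(links)

def should_penalize(links, onpage_spam_score,
                    link_threshold=0.8, onpage_threshold=0.5):
    """Penalize only when BOTH the link profile and the page look spammy.
    A site whose links were sabotaged by a competitor fails the link test
    but passes the on-page test, so it is not penalized."""
    return (backlink_suspicion(links) > link_threshold
            and onpage_spam_score > onpage_threshold)

# A sabotage victim: terrible inbound links, but clean on-page content.
sabotaged = [{"source_quality": 0.05} for _ in range(100)]
print(should_penalize(sabotaged, onpage_spam_score=0.1))  # False
# A real spammer: the same bad links, plus spammy pages.
print(should_penalize(sabotaged, onpage_spam_score=0.9))  # True
```

The point of the two-signal check is exactly the sabotage scenario described above: bad backlinks alone can be bought by a competitor, but on-page content is under the site owner's control.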
The test DCs that seem to carry the future google.com results (64.233.179.104, 64.233.179.99) appear to bring back a lot of good-quality sites that were dropped accidentally by the Jagger update. That is a good move by Google.
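One rough way to quantify how much a test DC differs from the live index is the overlap between the two top-10 result lists for the same query. The URL lists below are placeholders standing in for results pulled from google.com and from a test DC such as 64.233.179.104:

```python
# Rough sketch: measure how different two datacenters' results are for one
# query by the overlap of their top-10 URL lists. The URLs below are
# placeholders, not real results.

def top10_overlap(a, b):
    """Fraction of URLs shared between two top-10 result lists."""
    return len(set(a[:10]) & set(b[:10])) / 10

current = ["site%d.com" % i for i in range(10)]      # stand-in for google.com
test_dc = ["site%d.com" % i for i in range(5, 15)]   # stand-in for a test DC

print(top10_overlap(current, test_dc))  # 0.5 -> half the top 10 changed
```

An overlap well below 1.0 across many queries is the kind of "fairly big shift" posters in this thread are describing.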
One day older than yesterday, and another great day ahead to live and enjoy :-)
And I thought I was clear in my post yesterday when talking about relevancy in search results, both on google.com and on the test DC.
In fact, when I gave that query example, I wasn't talking about spam. I was talking about the quality of the search results.
>>powerofeyes
>>>>"american heritage cabinets"<<<<
It shows results 1 to 10 of about 101 results, and you want to take that as an example of spam :-) Google doesn't have to concentrate on any of those areas.
Give it another test drive, and tell me what you think about the relevancy and quality of the SERPs you get :-)
They do not want to admit there is an issue, any more than Walmart would admit there was an issue with the Santa photo booth when the train started smoking.
I don't agree with the way G has dealt with this. However, I do feel they will come to a resolution in time.
Noticing a fairly big shift today in some of the sectors I am in. Will take a look and post if I see anything significant that could have caused this shift.
One sandboxed site for us went from 310 to 110 today.
Spam is everywhere. You cannot fight all spammers, because they come up with new ways to game Google's results all the time. For the keyword you mentioned, I can see the .cc domain spamming, but the other sites look relevant. So 1 out of 10 sites spamming is acceptable compared to the huge number of spam sites out there.
When Google tries to push its automated spam filters above the threshold level, it tends to lose very good-quality authority sites. That is what happened in the Jagger update: I saw lots of quality sites disappear just because Google's algorithm found something fishy in their backlinks. IMHO, backlink filters should be handled carefully, or the results will lose relevancy and quality.
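The tradeoff described above can be shown on a toy dataset: as a filter's threshold is loosened to catch more spam, legitimate authority sites with fishy-looking link profiles start getting caught as collateral damage. The scores and labels below are invented for illustration:

```python
# Toy illustration of filter-threshold collateral damage. Each tuple is
# (spam_signal_score, actually_spam). All numbers are made up.
sites = [
    (0.95, True), (0.90, True), (0.85, True),
    (0.80, False),  # authority site with an odd but innocent link profile
    (0.60, False), (0.30, False), (0.10, False),
]

def collateral(threshold):
    """Count legitimate sites penalized at a given filter threshold."""
    return sum(1 for score, is_spam in sites
               if score >= threshold and not is_spam)

print(collateral(0.85))  # 0 -> no innocent sites penalized
print(collateral(0.75))  # 1 -> the authority site disappears from the SERPs
```

Lowering the threshold from 0.85 to 0.75 catches no additional spam in this sample, but it knocks out one legitimate site — the pattern several posters say they saw during Jagger.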
[edited by: powerofeyes at 7:35 am (utc) on Dec. 7, 2005]
>>Googles filters are soooo lame. It can't be that hard to filter out this stuff.<<
I think that the folks at the 'plex are running tests at the moment WITHOUT filters.
For example, in a search related to the advertising sector on the test DC, I get Yahoo and MSN within the top 10, and they are not relevant at all to my search keyword phrases.
I.e., Google is treating its competitors well :-)
Long live democracy of search!
Why test a phrase that 1 out of 10 million people search for? I don't see people use that term anywhere, and #*$! shows no results for that phrase. Test a phrase that people actively use.
>>Google is treating its competitors well<<
You mean Google should drop their competitors' sites completely?
>>so 1 out of 10 sites spamming is acceptable when compared to the huge amount of spam sites out there<<
Google has been better than many others at fighting spam. But if Google has achieved that by causing a significant number of sites to suffer collateral damage, then that is something to worry about.
Sure, having your site still totally screwed for no reason other than a software bug would have driven me as nuts as it did you. It's not very professional, to say the least.
site:www.domain.com www.domain.com still shows pages of supplementals before getting to the more recently crawled pages.
"request for feedback"
Feedback: when I 301'd a page months ago and Google picked it up, I expect them not to list the long-gone URL on a "test" datacenter (and not as a supplemental, either). I can't believe it is actually a test, or that feedback will be solicited, until the data is at least close to current.
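The behavior being asked for above is simple to state: once a 301 redirect has been crawled, the index should carry the new URL and drop the old one. Here is a minimal sketch of that expectation using a hypothetical mini-index, not how Google actually stores redirects:

```python
# Sketch of the expected 301 handling: once a permanent redirect from
# old_url to new_url has been crawled, the index should serve new_url and
# drop old_url. Hypothetical mini-index, not Google's real data structures.

redirects = {"http://example.com/old": "http://example.com/new"}  # crawled 301s

def resolve(url, redirects, max_hops=5):
    """Follow recorded 301s to the final URL (bounded to avoid loops)."""
    for _ in range(max_hops):
        if url not in redirects:
            return url
        url = redirects[url]
    return url

def reindex(index, redirects):
    """Replace every 301'd URL with its target, de-duplicating the result."""
    return sorted({resolve(u, redirects) for u in index})

index = ["http://example.com/old", "http://example.com/other"]
print(reindex(index, redirects))
# ['http://example.com/new', 'http://example.com/other'] -> old URL is gone
```

The poster's complaint is that months after the 301 was picked up, the old URL still behaves as if this replacement step never ran on the test datacenter.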