| 8:25 am on Jan 14, 2007 (gmt 0)|
jetteroheller, did you change anything on those subdomains? When were they filtered?
| 11:26 am on Jan 14, 2007 (gmt 0)|
|jetteroheller, did you change anything on those subdomains? When were they filtered? |
There is a long list of actions since June 27th.
I recovered briefly between September 30th and November 2nd.
Solved the www vs. non-www duplication.
Removed the link to my internet promotion site from each page; this could have been seen as selling links to myself.
The internet promotion site no longer has pages with keyword links.
Greatly reworked the navigation: far fewer links, with a sitemap and a news page instead.
Each page has a link to a contact form.
The script that creates the contact form now checks whether the calling page exists, and returns an error 404 when the calling page does not exist.
The contact form no longer shares the title of the calling page; its title is just "contact form" instead.
Amazon pages had been indexed as part of my site, because I used them as a PSA replacement in AdSense. Problem solved.
The contact form had the same navigation link structure as the calling page. This has been removed.
The contact form had Amazon ads (no AdSense is allowed on contact forms), and this could be seen as a low-value affiliate page. Amazon ads removed.
There were also chapter print-out scripts repeating the same title line. They now have "Chapter print out" as the title.
Each page had 3 random links to new pages in the same subdomain. These have been replaced by links to high-resolution versions of the pictures, when the page has pictures.
The error 404 pages had Babylon ads; they were removed so the pages would not be seen as low-value affiliate pages.
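The 404 check on the contact-form script mentioned above could be sketched roughly like this. This is a minimal illustration, not the poster's actual script; the document root path, the idea that the calling page arrives as a URL parameter, and the function name are all assumptions for the example.

```python
# Hypothetical sketch: return 404 from a contact-form script when the
# "calling page" it was invoked from does not actually exist on the site.
import os
from urllib.parse import urlparse

DOCROOT = "/var/www/site"  # assumed document root, purely illustrative

def contact_form_status(calling_page_url: str) -> int:
    """Return 200 if the calling page maps to a real file, else 404."""
    # Map the URL path onto the document root
    rel_path = urlparse(calling_page_url).path.lstrip("/")
    full_path = os.path.normpath(os.path.join(DOCROOT, rel_path))
    # Reject paths that escape the docroot, then check the file exists
    if not full_path.startswith(DOCROOT) or not os.path.isfile(full_path):
        return 404
    return 200
```

A real CGI or PHP version from 2007 would additionally emit the `Status: 404` header and an error page body; the point is simply that form pages reachable from nonexistent URLs should not return 200.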
| 4:11 pm on Jan 14, 2007 (gmt 0)|
|I have a few supplementals, pretty much all the duplicate URLS created by my WordPress Blog, and I know with some certainty that my site has a high level of trust so I don't think that is at play here unless it's a weaker rather than stronger relationship. |
Sure, the site may have a high level of trust, but those duplicate pages may not, or may be given less importance. Because they are "duplicates," as you say, Google has placed its priority or trust in the other version of each page rather than in these, and thus they are supplemental.
| 6:16 pm on Jan 14, 2007 (gmt 0)|
Today my last domain recovered.
Rankings are the same or even better than before December 17.
But one subdomain of this domain remains unrecovered.
I'm still waiting and not changing anything.
| 6:47 pm on Jan 14, 2007 (gmt 0)|
2 domains recovered just a few hours ago.
| 6:50 pm on Jan 14, 2007 (gmt 0)|
After I recovered on Saturday, 6 of 10 subdomains were out of the filter;
now only 1 of 10 subdomains is out of the filter.
By Sunday evening, 9 subdomains were back in the filter.
| 7:16 pm on Jan 14, 2007 (gmt 0)|
The June 27 2006 date seems to be the "adjustment" that no one understands. I've seen references to that date on almost every message board, Google employees have been asked about it at every SEO conference, and there still has been no definitive answer.
It is hard to believe that the update happened almost 8 months ago, and we are still talking about it and trying to figure it out.
Matt Cutts referenced it once or twice very cryptically on his blog, and has been ignoring questions about it for 8 months now.
[edited by: Rx_Recruiters at 7:17 pm (utc) on Jan. 14, 2007]
| 7:47 pm on Jan 14, 2007 (gmt 0)|
I operate a large, popular website which was penalized 4 months ago. Last Saturday January 6th we started receiving traffic again, but it has been on and off all week.
Because of the size and diversity of the site, we can typically predict how many visitors we'll get from Google each hour within a few percentage points. Something funny we've noticed this week is that each update has occurred between 9 and 10 am PST (Google time). This includes today (Sunday). When traffic is on, we get about 6x the Google traffic we get when it's off, and it switches on or off every day or two, between 9 and 10 am.
Just wondering, has anyone else noticed this particular time frame with their websites?
| 7:51 pm on Jan 14, 2007 (gmt 0)|
First, hi to all of you. I am new here as a member, but I have been reading the forum since December 27 (guess why :) ), and I have loved these Google problem discussions. They kept me calm after all my sites were kicked out at Christmas and lost all their Google traffic.
It seems it's back today. I am not sure about all sites, but a few major ones are back where they were before 27.12.
I just hope it will stay there now.
| 7:58 pm on Jan 14, 2007 (gmt 0)|
Among others, I've been casually tracking 29 pages this month using a type of two word search term that is common for the subject of those pages. When I started all 29 pages ranked in the top 10 for the search term used, most in the top 3.
The results have shown these pages randomly staying put, or rising and falling by several spots on a day to day basis. Sometimes a page will even fall off the first page of the serps for a day or two. But for every fall there seems to be a rise for another page.
The net result for me has been more or less neutral as far as overall ranking is concerned.
But there is another aspect of this that seems to have quite a bit larger impact.
That is the issue of which page gets returned ranking well for the search term.
In my case I have an applicable sub-index page and individual topic specific pages (which are listed on the sub-index page).
It's pretty common for these pages to both appear in the serps, one or the other indented as a second listing. It's when only one appears that I see a real impact.
It appears that when the sub-index page gets the spot, based on my observation of the serps when I check, my traffic drops off dramatically, in the 10-15% range. When a topic-specific page gets the single listing, traffic rises again.
That's interesting, because when the sub-index page gets the only spot, the snippet commonly lists several choices that might match the search query. So if the search was "blue widget" the snippet might show choices for round blue widget, square blue widget, or triangular blue widget, while the topic-specific page would clearly be for only one variation of blue widget. Since "blue widget" is not a very specific search (in this case), I'd have thought the sub-index pages that show more choices would have performed better.
I haven't really sorted all this out yet, but I'm leaning towards thinking that between very frequent data pushes and even the short delays in getting all the DCs synchronized (does that ever happen?), the everflux people have mentioned rules the day, making checking this stuff almost futile, at least in my case.
| 7:59 pm on Jan 14, 2007 (gmt 0)|
Appear to have recovered today - even gaining positions in the SERPS top 10 where we had none at all before for certain keyword combos.
How long it will stay that way is another matter...
| 8:02 pm on Jan 14, 2007 (gmt 0)|
An oldish site of mine, which had a robots exclude on it until Thursday morning, went to page 1 position 7 for its main keyword on Friday... it stayed there until this morning and is now nowhere to be found again. Something is going on!
| 8:25 pm on Jan 14, 2007 (gmt 0)|
Yes, we are now back! But will it last?
We didn't do anything to our site, just bought a few AdWords to keep the phones ringing. So they made about £10k in the process.
We had done some site-wide 301 redirects in late November, and we think that's why we got kicked. But who really knows what happened?
| 8:32 pm on Jan 14, 2007 (gmt 0)|
I'm back too!
Not for every keyword yet on google.fr, but back on all datacenters for all my keywords now!
It's so good!
| 8:38 pm on Jan 14, 2007 (gmt 0)|
All sites recovered.
Wish you all the same!
| 10:29 pm on Jan 14, 2007 (gmt 0)|
A staggering amount of garbage has been added into and back into the index today, while very little has recovered.
One of the worst data refreshes yet.
Gee, do we really need even MORE of that hijack/redirect blog spam crap?
It's scary how bad Google is at this.
| 11:08 pm on Jan 14, 2007 (gmt 0)|
I recovered partly too. If we're lucky this will last over 24 hours.
| 11:52 pm on Jan 14, 2007 (gmt 0)|
SERPs look a bit better.
However, in many cases it's as if they rank the weakest pages of a website for the search string.
I.e., if you have a page dedicated to "blue widgets" with some pages off it, "yellow green blue widgets" and "pink brown yellow widgets", they will rank one of those for the keyword search string "blue widgets" rather than the dedicated page.
Most odd; it's as if the algo is designed to deliver SERPs that are in the ballpark but just not quite on target.
Is it just me, or have Google's SERPs gotten worse with each update since the new infrastructure was rolled out last year?
| 12:11 am on Jan 15, 2007 (gmt 0)|
Well, it lasted for about 5 hours then gone again.... sigh
| 12:13 am on Jan 15, 2007 (gmt 0)|
It does kinda look like there was a second push of the data refresh, that added and subtracted things again... looks like most of the redirect spam stayed though.
| 12:16 am on Jan 15, 2007 (gmt 0)|
A lot of affiliate links and blogs are still in,
but I too recovered partially.
| 12:27 am on Jan 15, 2007 (gmt 0)|
Seems we have had a bit of a bounce back for the positive today as well.
However, when I do a site:www.domain.com search it shows about 90k results found, then at the bottom of the page it says
"repeat the search with the omitted results included."
When I click that, it only comes up with about half as many... strange, it should show the same number or more.
I did see something really odd today, checking PR on my site:
mine came up the same number as usual;
the site just above me came up one point higher than mine;
however, the top-ranked site for the same search term came up like this:
Google Pagerank for 'http://www.mycompetitor/subdirectory/ ' is currently -1 out of 10.
-1? What the heck does that mean? I've never seen that before, ever.
[edited by: Bewenched at 12:41 am (utc) on Jan. 15, 2007]
| 12:47 am on Jan 15, 2007 (gmt 0)|
5 hours of happiness!
I was expecting at least 1 day :)
| 12:58 am on Jan 15, 2007 (gmt 0)|
I'm seeing no indented listings.... strange.
[edited by: MHes at 12:58 am (utc) on Jan. 15, 2007]
| 12:58 am on Jan 15, 2007 (gmt 0)|
I've been following Google today, and during the last hour (4 PM Sunday the 14th) I have seen a huge drop in visitors and tons of new spam sites showing up in the search.
I have checked their datacenters, and a lot of EDU spam and blog spam sites are starting to show up again, and my site and other sites I track are nowhere to be seen.
I see a lot of sites in the SERPs with a cache date of Jan 13, so I guess they decided to let loose the cached/spidered pages from yesterday without screening them at all.
Just when I thought this was over, since I saw better hits during the day, it starts all over again....
| 1:54 am on Jan 15, 2007 (gmt 0)|
it's getting ridiculous.... really...
| 2:09 am on Jan 15, 2007 (gmt 0)|
Google is pushing new data as we speak. I am seeing enormous turbulence and with each refresh of the browser there is a new set of results. Hopefully it will be much better by Tuesday considering Monday is a holiday in the US.
| 2:23 am on Jan 15, 2007 (gmt 0)|
Yes, it is going on as we speak, and like someone said, it's getting ridiculous.
I wish they could find someone else to bully. Getting so @@%$#@@$@$#2 tired of their updates or downgrades or whatever it is they are (not) calling it. MC says "No, we didn't do anything," and yet they get 10 porn sites back into the index but don't care about thousands of hard-working people trying to make a living.
Thinking about getting totally out of the internet business right now, since this is not worth it. It just gets your blood pressure up to the boiling point every @$@#!$#@# time they think they need to change something. I have said it before and I'll say it again: I am TIRED of being a BETA tester for Google. Why don't they have some real people checking their crappy results before they release them!
| 2:33 am on Jan 15, 2007 (gmt 0)|
Why are data refreshes performed every week? I thought they should happen every 2 days?
| 2:41 am on Jan 15, 2007 (gmt 0)|
They are every day or two.
| 2:45 am on Jan 15, 2007 (gmt 0)|
During the last few hours my site received the level of traffic it had around one month ago. It seems that it's all gone again; some result checks show the SERPs as before.
I have never seen such big movement in such a short amount of time in all the time I have been working on my site. That was something big, and I would rate it as a step in the right direction.
It also showed me one very important thing: the disappearance of my pages from Google's search results (supplemental, old caches etc.) was not my fault. It wasn't duplicate content, unreliable server responses, a 301 gone wrong, or anything else. It wasn't my sitemap file, nor the technology I use. The pages ARE there.
It's simply that the plex is able to send 15,000 users one day - and the next day they send 200. Or nothing.
I do respect the basic ideals and the idea behind Google - to make knowledge available in an accessible way. I like Google.
On the other hand, Google has to understand that professional webmasters who run large websites can't run them without a certain stable level of traffic. One day nothing, the next thousands... that's the nightmare of every server admin.
I am not raising my hands here saying that Google owes me something - they don't. If they send visitors to pages in my industry that are (IMHO) of lower quality than mine, that's OK with me.
I would be very grateful for a stabilization of the current situation. Send me nothing or send me thousands - but do it in a linear way. Thanks.
| This 247 message thread spans 9 pages |