Forum Moderators: Robert Charlton & goodroi
A very interesting article (including comments by Matt Cutts) has been posted over at PBS. It may be related to the December/January changes.
[pbs.org...]
[edited by: tedster at 8:00 pm (utc) on Jan. 11, 2007]
jetteroheller, did you change anything on those subdomains when they were filtered?
There is a long list of actions since June 27th.
I recovered briefly between September 30th and November 2nd.
The www vs. non-www duplication is solved.
There are no longer links to my internet promotion site on each page; that could have looked like I was selling links to myself.
The internet promotion site no longer has pages with keyword links.
The navigation was greatly reworked: far fewer links, with a sitemap and a news page instead.
Each page has a link to a contact form.
The script that creates the contact form now checks whether the calling page exists, and returns a 404 error when the calling page does not exist.
The contact form no longer uses the same title as the calling page;
it is simply titled "contact form" instead.
Amazon pages had been indexed as part of my site, because I used them as a PSA replacement for AdSense. Problem solved.
The contact form had the same navigation link structure as the calling page. This has been removed.
The contact form carried Amazon ads (no AdSense is allowed on contact forms), and this could be seen as a low-value affiliate page. The Amazon ads have been removed.
There were also print-chapter scripts that repeated the same title line. They now use "Chapter print out" as the title.
Each page had 3 random links to new pages in the same subdomain. These have been replaced by links to the high-resolution versions of the pictures, when a page has pictures.
The error 404 pages had Babylon ads. These were removed so the pages would not be seen as low-value affiliate pages.
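The 404 check described in the list above can be sketched roughly like this. This is a hypothetical illustration, not the poster's actual script; the function name, parameters, and file layout are assumptions. The idea is simply: before rendering the contact form, verify that the calling page really exists on the site, and signal 404 otherwise.

```python
import os

def contact_form_status(calling_page, site_root):
    """Return the HTTP status the contact-form script should emit
    for a given calling page (a path like "/chapter1.html")."""
    # Resolve the calling page relative to the site root.
    full_path = os.path.normpath(
        os.path.join(site_root, calling_page.lstrip("/")))
    # Refuse paths that escape the site root (e.g. "../etc/passwd").
    if not full_path.startswith(os.path.normpath(site_root)):
        return 404
    if os.path.isfile(full_path):
        return 200  # calling page exists: render the contact form
    return 404      # calling page missing: report "not found"
```

Returning a real 404 here keeps the crawler from indexing contact-form URLs generated for pages that no longer exist.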
I have a few supplementals, pretty much all the duplicate URLs created by my WordPress blog, and I know with some certainty that my site has a high level of trust, so I don't think that is at play here unless it's a weaker rather than stronger relationship.
Sure, the site may have a high level of trust, but possibly those duplicate pages do not, or they carry less weight because they are "duplicates," as you say. Google has placed priority or trust in the other version of each page and not these, so they go supplemental.
It is hard to believe that the update happened almost 8 months ago, and we are still talking about it and trying to figure it out.
Matt Cutts referenced it once or twice very cryptically on his blog, and has been ignoring questions about it for 8 months now.
[edited by: Rx_Recruiters at 7:17 pm (utc) on Jan. 14, 2007]
Because of the size and diversity of the site, we can typically predict how many visitors we'll get from Google each hour to within a few percentage points. Something funny we've noticed this week is that each update has occurred between the hours of 9 and 10 am PST (Google time). This includes today (Sunday). When traffic is "on," we get about 6x the Google traffic we get when it's "off," and it turns on or off every day or two, between 9 and 10 am.
Just wondering, has anyone else noticed this particular time frame with their websites?
It seems it's back today. I'm not sure about all sites, but a few major ones are back where they were before 27.12.
Just hope it will stay there now.
The results have shown these pages randomly staying put, or rising and falling by several spots on a day to day basis. Sometimes a page will even fall off the first page of the serps for a day or two. But for every fall there seems to be a rise for another page.
The net result for me has been more or less neutral as far as overall ranking is concerned.
But there is another aspect of this that seems to have quite a bit larger impact.
That is the issue of which page gets returned ranking well for the search term.
In my case I have an applicable sub-index page and individual topic specific pages (which are listed on the sub-index page).
It's pretty common for these pages to both appear in the serps, one or the other indented as a second listing. It's when only one appears that I see a real impact.
It appears that if the sub-index page gets the spot, based on my observation of the serps when I check, my traffic drops off dramatically, in the 10 - 15% range. When a topic-specific page gets the single listing, traffic rises again.
That's interesting, because when the sub-index page gets the only spot, the snippet commonly lists several choices that might match the search query. So if the search was "blue widget," the snippet might show choices for round blue widget, square blue widget, or triangular blue widget, while the topic-specific page would clearly be for only one variation of blue widget. Since "blue widget" is not a very specific search (in this case), I'd have thought the sub-index pages that show more choices would have performed better.
I haven't really sorted all this out yet, but I'm leaning towards thinking that, between very frequent data pushes and even the short delays in getting all the DCs synchronized (does that ever happen?), the "everflux" people have mentioned rules the day, making checking this stuff almost futile, at least in my case.
However, in many cases it's like they will rank the weakest pages of a website for the search string.
I.e., you have a page dedicated to "blue widgets" with some pages off it, "yellow green blue widgets" and "pink brown yellow widgets," and they will rank one of those for the keyword search string "blue widgets" rather than the dedicated page.
Most odd. It's like the algorithm is designed to deliver serps that are in the ballpark but just not quite on target.
Is it just me, or have Google's serps gotten worse with each update since the new infrastructure was rolled out last year?
However, when I do a site:www.domain.com search, it shows about 90k results found; then at the bottom of the page it says
"repeat the search with the omitted results included."
but when I click that, it comes up with only about half as many. Strange; it should show the same or more.
----
I did see something really odd today while checking PR on my site.
Mine came up the same number as usual.
The site just above me came up one point higher than mine.
However, the top-ranked site for the same search term came up like this:
Google Pagerank for 'http://www.mycompetitor/subdirectory/ ' is currently -1 out of 10.
-1? What the heck does that mean? I've never seen that before, ever.
[edited by: Bewenched at 12:41 am (utc) on Jan. 15, 2007]
Just when I thought this was over, since I saw better hits during the day, it starts all over again....
GEEEEESSSSHHHHhhh...........