The Toolbar PR update is also reported in a separate thread.
I would be grateful if the more experienced members could offer their opinions on what this refresh is all about. I for one am happy, but how is it affecting everyone overall? My own couple of sites have seen traffic quadruple.
Also, one more question: a site really can't be worth the 100th page of the SERPs on Sept 29th and worth the top position on Sept 30th. That just is not right, especially when I am talking about a site which has been completely whitehat, adds 10-plus pages a day, and is widely and naturally linked to. If natural content sites can bounce around so much in the SERPs (meaning they could be on some fine line they accidentally reached, and a little tweak somewhere throws it all off), I shudder to think what the data refreshes must be doing to your average online merchant or corporate website.
Now, I was expecting the site to return to the SERPs with this refresh anyway, with or without the fixing I did; that's what's been happening for one year.
There are more pages to fix, and I hope to do that soon. What will be interesting to see is whether the site goes back to the back of beyond, even after the meta descriptions are fixed, during the crest period before the next refresh.
(I want to kick all those who told me 2 years back that SEs don't look at meta descriptions, so don't bother to have them!)
I "recovered" from the June 27 refresh on Sept 16, so this was a very, very short return to where I was. Less than two weeks of good ranking and now I'm back at the bottom of the barrel again.
The June 27th horror is over!
Seems all my subdomains of my major income domain are now out of the filter.
In recent weeks, after all the actions I took, I sometimes felt like a witch doctor performing a rain dance in the desert.
No idea what action or actions finally brought the success.
Maybe all of them, maybe none of them.
Here is my long list:
1.) www vs. non-www: installed 301 redirects to the right version.
2.) The CGI script used to return formmail pages even for calling pages that no longer exist. Now the script checks whether the calling page exists, and returns a 404 status when it doesn't.
3.) Created a sitemap page for each subdomain instead of many links to subdirectories. This reduced the number of links to below 100 on nearly all pages.
4.) Set the preferred version in Google Sitemaps to www, not non-www.
5.) The CGI formmail script used to return the same title line as the calling page. Now it returns only "Contact form", to avoid duplicate title problems.
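For anyone wanting to do the same, here is a minimal sketch of the check in item 2 (and the fixed title from item 5), written as a Python CGI handler. The DOCROOT path and page names are hypothetical; my actual script differs, so adjust for your own server layout:

```python
#!/usr/bin/env python3
# Sketch only: check whether the calling (referring) page still exists on
# disk, and return a 404 status if it does not. DOCROOT is a hypothetical
# document root.
import os
import sys
from urllib.parse import urlparse

DOCROOT = "/var/www/htdocs"  # hypothetical


def form_status(referer, docroot=DOCROOT):
    """Return the CGI Status header for a formmail request."""
    path = urlparse(referer).path
    calling_page = os.path.join(docroot, path.lstrip("/"))
    if not os.path.isfile(calling_page):
        return "Status: 404 Not Found"
    return "Status: 200 OK"


if __name__ == "__main__":
    sys.stdout.write(form_status(os.environ.get("HTTP_REFERER", "")) + "\r\n")
    sys.stdout.write("Content-Type: text/html\r\n\r\n")
    # Fixed title (item 5), instead of echoing the calling page's title:
    sys.stdout.write("<html><head><title>Contact form</title></head></html>\n")
```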
[edited by: tedster at 4:02 pm (utc) on Sep. 30, 2006]
If I search for my keyword phrase about every 30-60 minutes, I get different results.
Sometimes I show up in the top 10, and sometimes not in the top 100.
I am consistently watching the number of results for my niche.
Sometimes the figure is 3,100,000, sometimes 2,900,000, and sometimes 3,210,000 or 2,85,000.
My site shows up in 2 of these 4 cases.
My traffic from Google has been increasing from zero since Sept. 30.
Earlier I was getting all my traffic from MSN; now the Big G delivers me some more traffic.
Is anyone else experiencing similar phenomena, and getting any results from their fixes? I know even CNN is wavering, with their cache result going from the 30th back to the 27th just minutes ago, but there is something that can be done about that.
Previous data refreshes that impacted me were on a site-wide level. Virtually every SERP I track was impacted, and impacted severely. Keywords I ranked in the top ten for were relegated to positions around 150-350.
After a short recovery from the Sept 15/16 refresh, this data refresh has also impacted my site differently. Only about half of the SERPs I track were impacted. The other half has seen no change whatsoever. The pages that were impacted were not pushed back quite as far. Instead of positions 150-350 they have been pushed back about 75-125 positions from the top 10.
Of course this leads me to wonder what might be different about the pages that were pushed back versus the pages that survived this data refresh. I've started a big spreadsheet to look at factors such as page length, keyword density, number of incoming links (both internal and external), anchor text pointing at those pages, age of the pages, etc. The site is driven by a custom CMS, so page structure (URL writing, title tags, H1 tags, etc.) is built the same way on each page. Therefore I'm just looking at the other factors I mentioned.
Hopefully I might be able to figure out what happened with those pages, but I seriously doubt I will see any significant trends or figure out what happened.
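For what it's worth, that kind of comparison takes only a few lines once the per-page factors are collected. The factor names and numbers below are hypothetical placeholders, not my real data:

```python
# Sketch: average each tracked factor separately for the pages that were
# pushed back and the pages that survived. All values are made up.

def average_factors(pages):
    """Average each numeric factor across a list of per-page dicts."""
    totals = {}
    for page in pages:
        for factor, value in page.items():
            totals[factor] = totals.get(factor, 0.0) + value
    return {factor: total / len(pages) for factor, total in totals.items()}


impacted = [  # hypothetical pages pushed back 75-125 positions
    {"word_count": 320, "inbound_links": 4, "page_age_days": 90},
    {"word_count": 280, "inbound_links": 2, "page_age_days": 60},
]
survived = [  # hypothetical pages that saw no change
    {"word_count": 900, "inbound_links": 15, "page_age_days": 400},
    {"word_count": 750, "inbound_links": 11, "page_age_days": 380},
]

print("impacted:", average_factors(impacted))
print("survived:", average_factors(survived))
```

If any factor's averages differ sharply between the two groups, that is the place to start digging.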
And the rest of the site is back in the SERPs with rankings as before. I will have to investigate why this happened...
It's going to take a little while for the dust to settle yet again before we know where we stand. Some of our pages have moved up and down as much as 4 PR points just in the last 8 hours, indicating fresh data is still very much in the pipe.
I did see a big boost in PR on some of our internal pages but not on the index page which I thought was unusual. Anyone else noticed this?
Despite still appearing for some keywords, overall traffic is down by about 75%, which is particularly worrying on the run up to Christmas.
For anyone else adversely affected on September 30th, can you see a similar pattern?
Impossible to ever know. You just hang on and wait for the dust to settle, stand up, pick up your things, and try to move on.
But for sure, so far these past 24 hours, we are hurting bad. Traffic is off 50%. Damn. We had been so relieved since August 2006 and hopeful it would hold up for a bit.
Our site has tens of thousands of pages, it is considered an authority site by Google in many areas (double-indented listings on many keywords), and we've been around for years (since 1996). And on the keywords where we've been pushed back from #1 to somewhere on the 4th page, the truth is, the majority of the listings above us suck.
So, this adjustment is NOT an improvement in SERP quality, despite our bias - it really is not better.
Thank you - writing this down and rereading it is healthy for my grieving process. On to the next step...
P.S. We are located in Toronto, Canada. The adjustments occurred Saturday afternoon, and by Saturday evening it was over. It all went down in about 5 hours. Since Saturday night (it is now Sunday afternoon), there have been no more changes in any keywords we watch. We have a bunch where we remain at #1, and we are watching them, waiting for them to disappear, but thankfully, they remain. Murphy's Law probably dictates that now that I've written this down, they will disappear. Yes, this is depressing.
One year ago, various pages from each of the four domains were listed in the SERPs. Most were Supplemental, many were URL-only. Many also had duplicate or missing meta description tags. The redirects were put in at that time, and many (but not all) of the meta descriptions were fixed. The last dozen or so were fixed only a few months ago.
A site:domain.com search shows 184 entries:
- 168 www entries in the normal index, all of which are "200 OK". There are 168 pages on the site. Correct!
- 14 www entries showing as Supplemental - these are all pages that have been "404" for a couple of months, as they were moved to a different folder. Their new version, at the new URLs, all show in the normal index, part of the 168 above.
- 2 non-www URLs showing as Supplemental. These reappeared in the listings last week - right out of left field; having originally dropped out of the index more than 9 months ago. As they are Supplemental, and have been 301 for a year, they can safely be ignored.
The first Supplemental Result is 164th in the listings, just ahead of the last couple of normal listings.
A site:www.domain.com -inurl:www search shows 17 Supplemental entries:
- 14 www entries showing as Supplemental - as above. They are all "404" pages.
- 2 non-www URLs showing as Supplemental - as above. They are both "301" pages.
- 1 www URL that showed up as a Normal Result in the first search now appears as a Supplemental Result in this search. This is a "historical" supplemental. The snippet represents older content from many months ago, even though the cache date is only two weeks ago.
This is the only one that needs checking back on. One link to it from another site might fix the problem, so we already did that.
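If it helps anyone running the same kind of audit, the bookkeeping above can be sketched in a few lines. The hostnames and URLs below are example.com placeholders, not the real site:

```python
# Sketch: label each indexed entry by its host and current status code,
# matching the groups in the site: search breakdown above.
from urllib.parse import urlparse


def classify(url, status, canonical_host="www.example.com"):
    """Label an indexed URL the way the entries above were grouped."""
    if urlparse(url).netloc != canonical_host:
        return "non-www (should 301 to www)"
    if status == 404:
        return "moved (404, lingering as Supplemental)"
    if status == 200:
        return "normal"
    return "other (%d)" % status


entries = [  # hypothetical (url, current status) pairs from a crawl
    ("http://www.example.com/page.html", 200),
    ("http://www.example.com/old/form.html", 404),
    ("http://example.com/page.html", 301),
]
for url, status in entries:
    print(url, "->", classify(url, status))
```

Tally the labels and you get the same three buckets as the site: search counts.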
This "update" / "refresh" - no major changes to report; just those two Supplemental non-www URLs reappearing. This site, and several others like it, proved to me beyond all reasonable doubt that:
- the 301 redirects for duplicates of all types are vital.
- unique title tag and meta description data is vital.
- Supplemental Results for 301 and 404 URLs are dropped after a year, not sooner.
[edited by: g1smd at 8:51 pm (utc) on Oct. 1, 2006]
However, there have not been very many posts, comparatively, about September 30th, which makes me concerned. If fewer sites are affected, I imagine there is less impetus for Google to fix a "problem", although maybe I am being naive with that assertion.
Past experience has shown me that provided you have a good solid site, rich with unique content, at some stage it will recover its traffic.
For a 500M-results page, on google.com:
1: Wikipedia from nowhere to #5
2: I'm at #7 for the plural, UP from #10
3: #21 for the singular, DOWN from #12
For the same terms on google.co.uk:
4: dumped from #1 (held for 2 years) to #10 for the singular
5: no change for the plural
Other major terms just vanished from the SERPs.
The site was unaffected by any previous Google updates.
Google, please get your act together. You do realize from reading this forum, how many people are affected by your 'data pushes' don't you? You have set the bar by being the best search engine, and many webmasters and business owners have come to rely on your expertise in the search engine business, in order to maintain traffic to their site. By default, you have inherited this position. It is now your responsibility to maintain it, especially now that you are being well paid for it, after going public.
OK, if someone typed in "what is a " then give them Wikipedia, because that is what they want.
Otherwise, can it. Even as a user searching for other stuff, if I type in "personal loan" or virtually anything else, I don't want wiki.
Please, no wiki.
I assume that Google does not want to index a hundred million auction pages, 95 million of which have already ended.