OK, so after feeling really good about getting our site back in order after the 27th of June "data refresh", I come back from holiday and find it's all gone Pete Tong again.
The site: tool is NOT SHOWING OUR HOME PAGE FIRST. AGAIN!
Consequently, the home page is not showing in the serps. Again. Grrrrr....
Another data refresh required please Google. You're killing me.
There should be some real experts analyzing the reasons for that crazy string of update dates which is doing so much damage to solid publishers.
Why can't publishers organize themselves in order to gain a little bit of strength and independence?
This Google monopoly is much too risky with such fragile technology.
I didn't change anything in the last few months; the site was growing very well, with new useful content every day.
And above all: NO TRICKS since it went online in 2001. It has always been clean: no cloaking, keyword stuffing, invisible text, buying links or WHATEVER.
I simply can't see what I've done wrong!
LOW CONTENT:LINKS RATIO
We lost a number of pages from the index altogether. These pages had relatively little paragraph-based text (the unique content is in tables) and a high ratio of internal and external outbound links.
LIGHT ON CONTENT
We have pages that just list businesses in a city. The ones with only 1 or 2 listings (which previously had ranked well for [city] businesses) fell dramatically. The ones with >10 listings are still doing fine.
DUPLICATE CONTENT (MAYBE)
Our "city pages" operated like search results, so users could sort the businesses by various criteria. This meant it was possible to have the official URL like "/business/city-state.html" and then also have "page=city&state=ST&city=CITY&sort=name&order=1".
Here's what I'm doing about it:
- adding more paragraphs to our city pages
- eliminating some of the outbound links
- replacing the query-parameter-based sorting with AJAX-based sorts that put sort variables in the server-side session, then 301 redirecting anyone who tries to access our city page via query params to the "official" page (quick check sketched below)
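A quick way to sanity-check that the old parameter URLs really do answer with a 301 pointing at the canonical page - example.com and the query string below are just placeholders for your own - is to look at the headers:

curl -sI 'http://www.example.com/?page=city&state=ST&city=CITY&sort=name&order=1' | egrep -i '^(HTTP|Location)'

If the first line isn't a 301, or the Location header doesn't point at the /business/city-state.html style URL, then Googlebot is still seeing two addresses for the same content.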
One thing I'm also doing is spending time parsing our Apache logs:
cat apache_log | cut -d' ' -f7,9 | egrep -v '200|301'
This produces a list of URLs and response codes. Look for any response codes that are suspicious, especially 302 or 500+. Then figure out what's up with those pages.
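If the resulting list is too long to eyeball, a rough first pass (assuming the standard common/combined log format, where the status code is field 9) is to count how often each response code shows up before drilling into individual URLs:

cat apache_log | awk '{print $9}' | sort | uniq -c | sort -rn

A sudden pile of 302s or 500s near the top of that count usually tells you where to start looking.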
I also have a phpBB forum with about 120,000 posts. Since August 17th traffic has dropped to about 10% of what it was, so traffic from Google has almost DIED.
This thread might be of some use to you, especially when using PHP-style forums and CMS software:
Remember that last year we went from page 1 to 700+. I believed at the time it was a duplicate content filter and after 301'ing a second domain to the first, a month later everything was fixed. No idea if that was the issue really.
So when on August 17th our SERPs fell again, I ruled out duplicate content as that's all been handled. Then the other day I stumbled onto something. Our second domain was back, but as supplemental results, even though it's still 301'd. Hmmm, our second domain is back and we're penalized exactly the same as last year? Can it be?
So I mentioned it to Matt Cutts last night. Note, I didn't tell him my domains.
Tonight I was shocked to find some of our listings back on 220.127.116.11 which happens to be my current www.google.com ip. I then checked to see if our second domain was showing up in the supplemental and it wasn't.
Coincidence? You be the judge.
I can only hope our SERPs are back, as this has been a trial, as I'm sure all of you can attest. And while the Bible tells us to "count it all joy when we fall into divers temptations..." that's a tough request ;-)
The Supplemental Results are cleaned away one full year after the redirect is first actioned: no sooner and no later.
Your result is exactly as I would have expected. The previous Supplemental Updates occurred in August 2005 and February/March 2006, and the current one is in progress right now: some datacentres have been cleaned up more than others - and the work is not yet completed on any of them, as far as I can see.
I tried to explain all this in: [webmasterworld.com...] and several other recent threads.
Frankly this seems kind of buggy to me: for the past 1-2 years Google has been pretty good at just "figuring it out." Then suddenly it's like they said, "we'd rather have 0 pages for the content than choose from among 4 different URLs."
It's also odd that they don't look to our very comprehensive and painstakingly created sitemap to guide them to the canonical pages.
The directive "design your site for users, not search engines" applies to static sites and on-page optimization. For complex 100k page dynamic sites, it's utterly bad advice.
My key money phrase on the site that has been experiencing issues during some of the updates went from #1 to #9 (nothing too drastic). The thing that worried me the most is that on the allinanchor: it went from #1 to AWOL -- how confusing is that?!
It is a day ending in 7 and a full moon, so I wouldn't be terribly surprised if some sort of funky data refresh is going on.
On an allinanchor: search, I come back to the top of the second page when appending &filter=0.
Has anyone else ever seen anything quite like that? Back in June we were doing the "" and &filter=0 tests for the main search results, but this is the first that I've seen anything happen on the allinanchor: query as well.
The &filter=0 parameter shows all of the results, and it is not uncommon to see stuff appear very close to the top when you do that.
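For anyone who hasn't tried it: you just tack the parameter onto the end of the normal results URL, and the same trick works on allinanchor: queries - the search term here is only an example:

http://www.google.com/search?q=widgetville+widgets&filter=0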
It's your wake up call to fix the problems with the site. Those pages when unfiltered will do really well. Get working.
I get what the &filter=0 is supposed to do in terms of showing filtered results on normal search, but have never seen it occur on an allinanchor: search before.
In that event, what can one get working on fixing?
Edit: I found it. A new scraper popped in that duplicated me on a cloaked redirect and Google nailed us for it...I hate that so much. Thanks g1smd.
<No discussion of specific tools, sorry. See Forum Charter [webmasterworld.com]>
On the scraping, I'd appreciate it if someone could point me to (or explain) how to find and get these removed.
1. Through G sitemaps, submit a spam report on the scraper.
2. Get more links, build more content.
3. Contact the scraper's host and registrar (a whois lookup, sketched after this list, will tell you who they are).
4. Get more links, build more content.
5. Send e-mail followed up by certified letter to scraper.
6. Get more links, build more content.
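For step 3, a plain whois lookup is usually all you need to work out who to contact; scraper-example.com below is just a placeholder for the offending domain:

whois scraper-example.com
whois $(dig +short scraper-example.com | head -1)

The first query gives you the registrar, and the second, run against the site's IP address, generally names the hosting company whose abuse desk you want.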
We're up to #5 right now, though all the gfe-* DCs show me in the correct position again; still, we're following through, because it is a frightening proposition that an external entity can wreak so much havoc on one's rankings. Hopefully Google can lessen the blow of these crazy dupe filters as the algo progressively evolves.
Should I keep on waiting for things to iron themselves out and go back to normal, or should I just accept the fact that Google has gone to the dogs and start finding a new way to market my site?
Added: the vanished pages are really vanished from Google, as in unable to be found even with a site:example.com "unique phrase" type search.
Also seeing a lot of "hotel spam" results for searches like "widgetville thingyburb".