Now this is interesting.
A site with 40,000 "real" pages, some 80,000 duplicate-content pages excluded using robots.txt (it's a forum - see my prior posts about vBulletin), and still some 80,000 duplicate pages that are not yet excluded.
Additionally, some 500,000 non-thread pages are also excluded in robots.txt, and most of those are already delisted. The whole site is listed as www; nothing is listed as non-www at all.
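For anyone setting up similar exclusions, it's worth verifying that your robots.txt rules actually block the duplicate URL patterns before trusting them. Here's a minimal sketch using Python's standard urllib.robotparser - the Disallow rules and vBulletin-style URLs below are illustrative assumptions, not the poster's actual robots.txt:

```python
# Sketch: check which vBulletin-style URLs a robots.txt blocks,
# using the standard-library robots.txt parser.
# These rules and URLs are hypothetical examples.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /printthread.php
Disallow: /showthread.php?p=
Disallow: /memberlist.php
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

urls = [
    "http://www.example.com/showthread.php?t=123",   # canonical thread
    "http://www.example.com/showthread.php?p=456",   # per-post duplicate
    "http://www.example.com/printthread.php?t=123",  # printable duplicate
    "http://www.example.com/memberlist.php",         # non-thread page
]

for url in urls:
    verdict = "ALLOW" if parser.can_fetch("Googlebot", url) else "BLOCK"
    print(f"{verdict}  {url}")
```

Note that rules like `Disallow: /showthread.php?p=` match by simple prefix, so the canonical `?t=` thread URLs stay crawlable while the per-post duplicates are blocked.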
Looking purely at indexed threads:
site:domain.com shows 90,000 www pages, all as normal results, including some duplicate content that will eventually be excluded.
site:domain.com -inurl:www shows 24,000 www pages, all of which are marked as Supplemental Results and all of which have an old cache date. This search should show zero results: it certainly should not be returning www pages when the query is "-inurl:www". What is going on?
[edited by: tedster at 8:44 pm (utc) on June 13, 2006]
[edit reason] split into new thread [/edit]
......DC watching is so stupid and futile.
Then why are you watching them?! The funny thing is that there was a thread earlier about how stupid it was, and yet the results we DC watchers see have been showing up, and continue to show up, on Google and its sister portals. If it's really that stupid, then how about not contributing your two cents here?!
Sorry if I'm being a little direct, but when people state that DC watching is futile and stupid it offends me, because I watch it religiously and it has never proved worthless for any of the sites I've watched over the last few years, through the course of the many updates.
Futile? In one way, maybe. We cannot directly influence Google's algorithms and their propagation.
We CAN look for subtleties in the 'new' SERPs, and possibly make adjustments for those.
If that does any good, I can't call it futile. -Larry
In another thread someone reported that 18.104.22.168 is showing the old results, but in fact they look more like some results I saw a few weeks ago.
On that DC, for my main two-word target term, there is a site that ranks by virtue of having bought many backlinks from unrelated-topic pages that use the term in their link text. There are two listings with inset secondary pages, and a few review- and directory-type sites in the top 20. Many of my serious competitors who should be there are not; there's a smattering of authority, but the omission of some serious competitor sites is unfair to both the site owners and Google's users. We remain at #4, but what surrounds us is less relevant than in the old results (where we are also at #4), and nowhere near the quality of the new results (where we are at #1).
In the new results we move to the top, but other good sites that should be there also rise into the top 10.
Are we seeing a third set of results or has one of the two changed significantly?
Google is testing new results, but... what impact do they have on advertising? What CTR do the ads get with the new results set?
Do you think that better results will lower the CTR on ads (and Google's revenue)?
And more important: what will Google do?
"ive had a significant rise in the number of known brands asking me to place a paid link to them."
You just go ahead and make some $$$$ for the kids' university costs. Just don't tell anybody I suggested that :-)
Long Live BigDaddy Linking Spirit!
Long live the free paid-links market!