I found out that they devalued two of my keywords, ABC and XYZ. Both are my most important keywords, appearing in over 100 anchor texts from different sites.
So if I search for ABC, XYZ, or ABC XYZ, I am out past 200, because Google devalued both for my site.
Say I have another keyword, PPP, that ranks well (in the top 10). If I now search for PPP ABC, PPP XYZ, or PPP ABC XYZ, what I get is a not-good-but-not-bad ranking (20-30).
So I conclude that no ranking weight is given to my penalized keywords, and I have to depend on the non-penalized keywords to pull up my listing when the search includes either of the penalized ones.
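The inference above can be sketched as a toy model. This is purely an illustration of the poster's reasoning, not Google's actual algorithm; the weights and the zero-contribution rule for penalized keywords are assumptions made up for the example.

```python
def query_score(query_terms, keyword_weights, penalized):
    """Sum per-keyword weights, treating penalized keywords as worth 0.

    Toy model only: if ABC and XYZ contribute nothing, a mixed query like
    "PPP ABC" is carried entirely by PPP's weight.
    """
    return sum(
        0 if term in penalized else keyword_weights.get(term, 0)
        for term in query_terms
    )

# Hypothetical weights for the keywords mentioned in the post.
weights = {"ABC": 10, "XYZ": 9, "PPP": 6}
penalized = {"ABC", "XYZ"}

print(query_score(["ABC"], weights, penalized))         # 0 -> out past 200
print(query_score(["PPP", "ABC"], weights, penalized))  # 6 -> only PPP counts
print(query_score(["PPP"], weights, penalized))         # 6 -> same weight, but
# a PPP-only query faces a different set of competing pages, so the same
# score can still land top 10 there and 20-30 on the mixed query.
```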
Get it?
Now I am trying to figure out why they chose those two keywords. They did a great job of picking them correctly, but how did they do it? A specially compiled dictionary? Excessive anchor text (my competitors have more but are not penalized)?
All datacenters are showing the same results with only some small changes, but www-in shows completely different results! It seems that -in contains only completely new websites, or websites that dropped out of the index after the last update!
I see the same results on several of the other datacenters as well. For us, -in is showing much better results than some of the others. There seems to be an even split between datacenters showing -in type data and those showing the Florida garbage.
MikeD, does Alexa do filtering and have their own particular database, or is it just taking a straight feed from the Google DCs? If it's coming straight from Google, then they should be moving from DC to DC and hitting -in one tenth of the time.
I've decided to stop worrying about this stuff for now. I'm just trying to make my website what my customers want and to have the right links out there to bring them in. Google is just an added perk.....
p.s. - yes I was wiped out by Florida, but oh well...
For instance, using one site as a gauge, the following datacenters were showing these results as of this morning:
fi- #8
ab- #38
cw- #38
ex- #38
dc- #8
va- #38
lm- #38
mc- down
sj- down
kr- #8
www2 and 3 both list it as #8.
I know one site and one search phrase don't prove much; I am just using it to demonstrate how they are differing. The other sites in the top 50 are pretty much the same, just in different positions.
I have been checking them once in the morning and once at night, and they seem to keep flip-flopping. The only datacenter that I have seen stay constant is -in.
One more thing: the site I am watching is moving between #8 and #38 in the main index, so -in is not the only one being used.
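The kind of cross-datacenter check described above can be sketched in a few lines. The datacenter names and result lists below are made up for illustration; a real check would fetch live SERPs from each datacenter's hostname instead.

```python
def rank_of(domain, results):
    """1-based rank of a domain in a result list, or None if absent."""
    try:
        return results.index(domain) + 1
    except ValueError:
        return None

def rank_report(domain, serps_by_dc):
    """Map each datacenter to the domain's rank, to spot flip-flopping."""
    return {dc: rank_of(domain, results) for dc, results in serps_by_dc.items()}

# Hypothetical SERP snapshots (top results only) from three datacenters.
serps = {
    "fi": ["other1.com", "example.com", "other2.com"],
    "ab": ["other1.com", "other2.com", "other3.com"],
    "in": ["example.com", "other1.com", "other2.com"],
}
print(rank_report("example.com", serps))
# -> {'fi': 2, 'ab': None, 'in': 1}: differing ranks show the DCs are out of sync
```

Logging a report like this morning and night, as described above, is enough to see which datacenters flip between the two result sets and which stay constant.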
[edited by: vbjaeger at 8:22 pm (utc) on Dec. 12, 2003]
Bad results are bad results in Google's eyes! They have already filtered out websites with guestbook backlinks, and they have already filtered out OOP websites!
But www-in still shows websites with guestbook backlinks and OOP!
That's the reason we say these are bad results!
I'd suggest that people who are not doing well in -in not see it as the next update, but rather as a possible future direction. Analysis of it might give clues on how to make Google happy down the road, but I don't think anyone should jump out the window because they're not doing well in it. The thing might never spread to the other dc's.
Wrt Alexa. Try closing the browser, clearing the cache and all that stuff in case you just keep hitting the same dc via Alexa over and over again. Otherwise, I don't know. For me, it isn't showing -in.
vbjaeger, on really competitive serps, the dc's are often not exactly in sync.
You are probably right. The search has over 5.3 million results. As I mentioned earlier, it is only for one keyword phrase.
What I am seeing is the other DCs shuffling the results almost hourly if not daily, and the results are switching between 2 sets of results. The site I am watching has been bouncing back and forth from #8 to #38 and then back again. One set seems consistent with what I am seeing on -in and each of the other DCs has shown the same results. -in is the only one that has not moved.
Of course I can't measure the status of the serps from one site, I am just using it to see what changes are taking place for this particular site.
What I am seeing is the other DCs shuffling the results almost hourly if not daily
Google tries to be very fresh these days, (since Dom/Esm last May/June), so you have a lot of freshdeepbot changes getting shunted through the dc's, not all at the same time. There's still very much an everflux effect at work. -in is a different piece of work.
Pages hosted on free servers with very little relevance to the subject are still making the top. Some major sites are still able to beat Google on -in with doorway pages and sub-domains.
Overall it looks a little better.
It's not propagated to www yet.