Does anyone else have a premonition that they are going to throw another big change at us or are they just holding everything static for the holidays?
Don't know about that, but my index page just took a dive (about 120 places) for the single keyword it was #5 on pre-Florida. It had worked its way back up to about #20.
This clearly means that either something is going to happen, or alternatively, nothing at all is going to happen - now you can't argue with that ;)
Congratulations! You are now one of the few who have ever posted a fact on WW :o)
"Results 1 - 10 of about *snip*. Search took 9.91 seconds."
I don't think I've ever seen Google take more than one second to retrieve results. I've seen my connection to the web server act slowly sometimes, but that has never affected the "search took x seconds" figure before. It only seems to have happened once, so it's probably a fluke... but interesting.
What we are seeing is similar to what happened briefly earlier in the month: sites that were lost are now coming back in as "fresh" data. They will normally rank weaker than they did before, but they should improve back to where they were in the next update cycle.
Whatever Google tried to do with the -in experiment earlier this month obviously didn't work very well, so they had to fiddle around for a couple of weeks, holding a lot of pages out of the other datacenters in the process. My thinking is that the stuff on -in will move over to the other datacenters shortly, basically like a normal update where one datacenter leads the process. I don't think of this as a normal update, though. I would guess we'd see one shortly after this data gets assimilated.
Or not.... :)
catch ya tomorrow,
A couple of weeks ago most here were saying the update had finished. Since then we have had three major movements in results and one change of backlinks.
www2, www3, and www-in are nearly identical to each other but very different from the other datacenters, and they appear to have been changing for the last 2 days. Backlinks are still the same and results still look very messy.
When will Google sort out these results? Nearly every search gives me inner pages of a site instead of the index page, and big sites with only 1 page relating to my search outrank sites that have 30 to 40 pages of content relating to it.
It also appears that the results are still coming in sets of two, one link followed by another underneath it, on almost every result, so now we only have 4 websites on the first page, each with two listings?
One of my competitors has not optimized his pages at all, and I always believed his pages were far better than mine, but he has been wiped out for everything.
Still a big mess :(
One of my very relevant sites is back on page 2, others are still missing.
See lots of massive sites with subdomains still around and the local guy still being kicked in the teeth.
update has finished
Google's updates are perpetual.
Is that you again, Dave?
Who is this Dave guy? He sounds like a really nice guy!
If -in is going to go live this will be an unmitigated disaster for my main search term for my main site/page.
In general terms it looks like one problem that was there before Florida has been replaced by another. Previously, a proportion of the sites in the top 20 were generic-hyphenated-domainname.com pages that had got there by virtue of sharing the search term with their domain name, and so did well from the in-anchor element of the algo. Now we have directories that appear to be generated static HTML including Espotting or Overture ads, and the only reason they have any content on the subject being searched for is the presence of our Espotting ad repeated twice. And some big-brand general sites.
Say the term is widget financial. There are two companies called something like widgetology and widgetwise. These are acknowledged as leading specialist companies in the niche market in which they operate, in one little country outside the US, across the Atlantic, which uses the original version of the English language. The websites of those two companies do not appear in the first 500 SERPs for the most widely used term on -in. Our site, which previously hovered between #3 and #1, is now at #550 on -in and #45-#43 on the other data centres. If -in is using technology to understand what a domain name means by splitting it into tokens and "sensing" it, then it is getting it all wrong in my niche and penalising good specialist sites from good specialist companies.
#45 I could get to grips with; we are still, even on -in, at #3 - #1 for all of our secondary terms. But we are being penalised for our niche-market specialism, because our registered company name could be misinterpreted, because your ontology does not "understand" the nuances of the English language.
There is no way that we are going to change our company name. So if you are sensing the meaning of a domain name and in effect penalising those whose name means something similar to the search term (a counter-logical step in terms of delivering good results), you are also going to force us to simply buy a domain name which is totally unrelated to the search term.
If that is the case could you please tip us the wink.
I understand why you might use CIRCA to sense the meanings of domain names in order to target ads to parked domains, but I think you are walking a tightrope if you turn that around and penalise domains in searches where their company name and domain name sound like they deliver what is being searched for.
Please don't propagate -in
what could be my last