
Google lag and local elections

Where are the candidates?


Powdork

5:01 pm on Oct 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Several of the candidates for city council, as well as groups behind certain upcoming ballot measures, have created websites, but, you guessed it, they can't be found even when searching by name. Some of the sites are months old, and in one case the site shows up as a backlink to one of my sites but can't be found with any search related to the election. In the good old days these folks would have easily been found on G, but not on Y or MSN. Now they are nowhere. G has lost one of its major advantages over the other two SEs. Are they going to wait until Vegas to fix it?

sonic10

10:19 pm on Oct 29, 2004 (gmt 0)

10+ Year Member



The sandbox is a deliberate, conscious decision by Google. A somewhat desperate one if you ask me.

No doubt. I directly asked Google about the sandbox/dampening filter being applied to new sites.

Their Reply:
"we don't personally review individual sites, nor do we comment on webmaster techniques or the details of our search technology beyond what appears on our site."

A simple yes/no would have clarified it for me. An ambiguous answer like this just tells me that there is indeed a particular filter applied to newer sites for a period of time. GoogleGuy has already stated that additional links are gradually credited to your site over time. So obviously they can suspend/delay your true rankings as well.

Powdork:
I guess the good ol' days are gone. One of my sites is 4 months old, has a PR6 homepage and fairly competitive keywords, and is nowhere to be found on Google, but, as in your case, easily found on Yahoo, MSN and others.

I guess for campaigns/elections you gotta spend money. Sooo, AdWords, which is exactly what Google expects most people will do for a few months while their site is in this probation period. It's all business for Google now. Current and relevant results have taken a back seat....

Powdork

10:48 pm on Oct 29, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I guess for campaigns/elections you gotta spend money.
Personally I don't think this will last. I could believe the sandbox was intentional if it really were only 60-90 days, but given that many have been in it for 6 months plus, I think there is something awry with the size of the index. The current spidering seems to suggest that the pages in the supplemental index and those in the 'lag' index will be added shortly. Google's complete silence also speaks to it ending soon. I'm half expecting an update of epic proportions. Perhaps the first one that will introduce a new technology worthy of a pre-update announcement. Of course, if it doesn't happen soon, we have to figure their problems may be more permanent in nature.

Hanu

11:17 pm on Oct 29, 2004 (gmt 0)

10+ Year Member



Charlier,
if a word appeared in more than x number of documents in the forward index then no records for that word would be added for new documents

I don’t know. In my experience the index of lagged sites seems to be complete. I can do phrase searches for the context of every keyword occurrence on a sandboxed site and Google shows me every occurrence. For example, say the competitive keyword a sandboxed site doesn’t rank for is widget. If I search for “you can buy widget” site:xyz.com or “widget costs 20 bucks” site:xyz.com or “you can buy widget here for” site:xyz.com and so on I can pinpoint every occurrence of widget on every page of that site. Assuming that phrase searches also use the forward index, that would pretty much prove that Google is not dropping keyword occurrences on newer sites in favour of older sites.

In my opinion, Florida and updates that followed consisted of these measures:

1) turn up anchor text weight
2) turn down title text weight
3) multiply anchor text weight by link age
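The three hypothesized measures above can be written as a toy scoring function. The `page_score` helper and all of its weights are invented for illustration; nothing here is Google's actual formula:

```python
# Toy ranking score combining the three measures listed above.
# All weights and horizons are made-up illustrative numbers.

def page_score(anchor_hits, title_hits, link_age_days,
               anchor_weight=3.0, title_weight=0.5, age_horizon=365):
    # 3) anchor text credit scales with link age, reaching full
    #    strength only after `age_horizon` days
    age_factor = min(link_age_days / age_horizon, 1.0)
    # 1) anchor text weight turned up; 2) title text weight turned down
    return anchor_weight * anchor_hits * age_factor + title_weight * title_hits

# Same on-page relevance, but the site with aged links scores far higher:
print(page_score(anchor_hits=10, title_hits=5, link_age_days=30))
print(page_score(anchor_hits=10, title_hits=5, link_age_days=400))
```

Under a scheme like this a new site is not penalized outright; its strongest signal simply has not matured yet, which from the outside looks exactly like a sandbox.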

isitreal

11:48 pm on Oct 29, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I like to think of it in networking terms. You have your IP range 192.168.0.*, so 256 addresses can exist. Your company grows, widget sales really took off, and you have to add a new subnet. You connect to it through a bridge; you still have access to all the machines on the new range, but those accesses all go through the bridge, which is not as efficient, though it still works. You just want your main work to happen on the main, original subnet, and then you can get whatever you need from the new one whenever you need it.

The internet has the same problem: 2^32 possible IP addresses under the current IPv4. The next version, IPv6 (formerly IPng), will support 2^128.

Same types of issues. Currently we get around the limitations by using a lot of local subnets etc. Inefficient, but it sort of works.
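For the curious, the address-space numbers in the analogy work out as follows (a quick sketch; the successor protocol ultimately shipped as IPv6):

```python
# Address-space sizes from the analogy above: a /24 subnet such as
# 192.168.0.0/24, all of IPv4, and the 2^128 space of IPv6 (IPng).

subnet_24 = 2 ** 8      # 256 addresses in 192.168.0.*
ipv4_total = 2 ** 32    # current IPv4 address space
ipv6_total = 2 ** 128   # next-generation address space

print(subnet_24)                              # 256
print(ipv4_total)                             # 4294967296
print(ipv6_total // ipv4_total == 2 ** 96)    # True: 2^96 times larger
```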

Imaster

7:05 am on Oct 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Current and relevant results have taken a back seat....

MSN has got something to capitalize on, if this Sandbox (deliberate or accidental) thing continues. I think Google will have to end it soon when MSN and Yahoo start promoting themselves as a much fresher/relevant service than Google.

BeeDeeDubbleU

8:57 am on Oct 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think Google will have to end it soon when MSN and Yahoo start promoting themselves as a much fresher/relevant service than Google.

I never really thought about that but it only makes me more concerned. Clearly the others know about the lag and they don't appear to have done anything about it, i.e. capitalising on it?

Why?

Imaster

11:40 am on Oct 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I never really thought about that but it only makes me more concerned. Clearly the others know about the lag and they don't appear to have done anything about it, i.e. capitalising on it?

Why?

MSN might not yet be able to capitalise since they still do not have a better engine. Once they launch one, they might go for this promotion. If they indeed launch this year, the MSN Search PR team might even be actually working on it right now! ;)

Yahoo, I don't know. Maybe they are slow as usual, or maybe they are even planning to create such a sandbox environment to boost Overture sign-ups. :(

Whatever the case, in the long run, whichever engine provides the most relevant/fresh results is gonna win. It is not gonna happen overnight, but it will surely happen. Let's see what Google does about this issue in their next update (possibly November). It has already started crawling all sites like crazy, and people have already started seeing some major changes in results.

Let's all keep our fingers crossed! :)

yintercept

12:10 am on Oct 31, 2004 (gmt 0)

10+ Year Member



There are actually a large number of local directories, news agencies and official election sites that work to provide current lists of campaign-related web sites.

Google, or any large search engine for that matter, could very easily inject valid election campaign sites into their results simply by contacting the 50 state election offices.

The pathetic thing about the Google lag is that all of the VoteMe2004.com sites will pull in a large number of links during the election cycle. Google will crawl them and sandbox the sites during the election. When Google finally lists the sites, they will be bought up in the aftermarket and redirected to porn, or to 404 errors.

Personally, I think human edited local directories are the way to go.

Powdork

2:37 am on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Now to get off the election thing. I was asked last week to shop a restaurant on the north shore of my Lake in the Sky. Trying to find the address and phone number I searched for the restaurant on Google with its name and the city. I was able to find directories with the phone number so Google helped me. Today when preparing the invoice I searched again to get the address to send the bill to. Using the same search the restaurant's site was number one. Only then did I notice I was searching on Yahoo. The first entry for the site on the wayback machine was in March. I went and tried the search again on Google. It was nowhere to be found. It is indexed, however.
Strike Two.

dazzlindonna

4:19 am on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Trying to find the address and phone number I searched for the restaurant on Google with its name and the city. I was able to find directories with the phone number so Google helped me. Today when preparing the invoice I searched again to get the address to send the bill to. Using the same search the restaurant's site was number one. Only then did I notice I was searching on Yahoo. The first entry for the site on the wayback machine was in March. I went and tried the search again on Google. It was nowhere to be found. It is indexed, however.
Strike Two.

Perfect example, powdork. This is exactly the problem and it is what Google needs to be worrying about.

Powdork

4:45 am on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Indeed. In fact, I just switched the Firefox search box from the default Google to Yahoo. They are simply providing answers to my questions while Google is not.

Buddha

7:36 am on Nov 9, 2004 (gmt 0)

10+ Year Member



Today, I heard that Tara Reid was photographed as her strap slipped off, exposing her left breast.

So first I went to Google to find the photo. I did a search and nothing came up. Then I looked at the news results, and a few articles came up, but no picture.

So then I went to Yahoo and did a regular search and the first 3 listings all had very fresh content, with relevant photos to the search. :)

Kudos to Y! for satisfying my search needs. G, I think you should re-think the sandbox effect.

Powdork

7:40 am on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



gotta love nipplegate:)

whoisgregg

8:04 am on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I tried to use Google to find a manufacturer's web site. After many unsuccessful searches I finally got it, and discovered they had a Flash-only site with no content except a copyright notice at the bottom. So searches for "widget brand" wouldn't find their site; only "2004 manufacturer name, inc." had any success.

My "widget brand" search did find the sites of many distributors.

So Google is broken right? Because it didn't give me what I wanted?

If I was a consumer looking for the product, Google gave me just what I wanted, ten different places to buy "widget brand" products.

But I was not a consumer, and Google did not give me what I wanted. To return what I wanted, a search engine would need to glean, from just the two words "widget brand", this statement of my actual desire:
I want to know who manufactures "widget brand" products.

Even though most of the time people who type in "widget brand" would describe their intent as:
I am interested in buying "widget brand" something, either now or in the future

Or the rare instances when a searcher for "widget brand" describes themselves as:
I am researching an article about current hype surrounding "widget brand" products

All three searchers want very different results but they all use the exact same two words to describe what they want.

Everyone will always have stories of "I typed in 'blue widgets' and it didn't give me what I wanted," because searcher intent is an unsolvable problem when the input is a few words.

It's a testament to how far computer programs have come that anyone can even expect a program to understand a person's mental state from two words! But the truth is that sometimes two to four words, out of context, just isn't enough information.

If you want to learn for yourself how hard this task is, try to go a week communicating with other people only through notes on paper that are limited to four words per idea.

Powdork

8:42 am on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



whoisgregg, you can continue to say that the sandbox doesn't exist, but the fact of the matter is that at this time last year Google would have had the restaurant's page listed first within 24 hours of finding the first link to it. I didn't search for "widget brand"; I searched for the restaurant's unique name and the city in which it resides. And my main point, the one and only point that matters, is that Yahoo was able to regurgitate this information. For now, they are the better search engine. Did you try to find the manufacturer on Yahoo? Did it have a <Title>?

BeeDeeDubbleU

9:03 am on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I didn't search for "widget brand"; I searched for the restaurant's unique name and the city in which it resides.

Exactly!

whoisgregg

5:22 pm on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've never said the sandbox doesn't exist -- I only disagree that the effective delay in ranking of sites or links (no one except G really knows which or if both are true) is the enormous problem it's made out to be.

The folks at every search engine are doing whatever they think is the best for their search engine. I like the concept that different engines will try different things and I can recognize that any approach will have a mix of effects, positive, neutral, and negative.

Webmasters prove to search engines on a daily basis that we will seek every approach possible to "own" their results pages. A search engine that used the same methods of determining ranking forever might as well turn its results pages over to a handful of the most experienced SE optimizers.

If any search engine told you precisely what to do to rank first for a particular term, thousands would do it for every term making all search engine results useless. The determination of ranking has to be complex, it has to be secret, and it has to change sporadically.

BeeDeeDubbleU

5:36 pm on Nov 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The determination of ranking has to be complex, it has to be secret, and it has to change sporadically.

I wouldn't call hiding all new sites that have been indexed since last February "sporadic".

whoisgregg

2:04 am on Nov 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



hiding all new sites

Sandbox or not, not all new sites are missing from the SERPs or ranking very poorly. I'll go so far as to say many new sites, but it's been discussed here a number of times that webmasters of new sites have been ranking. If it were so broad-sweeping that every single new site was wholly missing, you and I would be commiserating instead of disagreeing. :)

Powdork

2:17 am on Nov 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I only disagree that the effective delay in ranking of sites or links (no one except G really knows which or if both are true) is the enormous problem it's made out to be.
It's not a big problem if it is intentional. Then it is only really a problem for me and others like me. If it is unintentional and due to Google reaching limits in hardware and/or architecture, then yes, it represents a huge problem that will only become more obvious with each passing day.

renee

4:38 am on Nov 11, 2004 (gmt 0)

10+ Year Member



"...a huge problem that will only become more obvious with each passing day."

Google has solved its capacity problem. Just look at the front page of Google: it says it's searching more than 8 billion pages! No more sandbox!

Powdork

5:05 am on Nov 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



no more sandbox!
The sandbox was never about being indexed, it was about ranking for competitive phrases.

BeeDeeDubbleU

10:07 am on Nov 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's not a big problem if it is intentional. Then it is only really a problem for me and others like me.

I think that this is questionable. If it is intentional it is still a huge problem because, contrary to what some people are claiming here, only a very small fraction of sites escape this, and when they do it is probably by accident. So Google, effectively, has stopped featuring new sites in the SERPS - huge problem.

If it is unintentional? Huge problem!

whoisgregg

7:41 pm on Nov 12, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So Google, effectively, has stopped featuring new sites in the SERPS

Freshbot picks up new sites, and they will rank effectively for a limited period of time, like a few days. It's only in the interim period, after a site is no longer fresh and its inbound links have not yet "aged", that the new site ranks poorly.
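The timeline described here, a brief freshness boost followed by a trough until links age, can be sketched as a toy function. The window and horizon numbers below are guesses for illustration, not known Google values:

```python
# Toy model of the described timeline: full ranking strength while a
# site is "fresh", then a slow ramp back up as inbound links age.
# fresh_window and link_age_full are illustrative guesses.

def effective_rank_boost(days_since_launch, fresh_window=7, link_age_full=180):
    if days_since_launch <= fresh_window:
        return 1.0  # freshness period: the new site ranks well
    # after freshness expires, credit ramps up with link age
    return min(days_since_launch / link_age_full, 1.0)

for day in (3, 30, 90, 180):
    print(day, effective_rank_boost(day))
```

The sharp drop right after the freshness window in this sketch is the "interim period": the site is no longer fresh, but its links have barely begun to count.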

renee

8:33 pm on Nov 12, 2004 (gmt 0)

10+ Year Member



Look at all the "new" pages added by Google to increase their count to 9 billion. Most of them are from old domains, and most do not rank in the SERPs. It looks like Google has now extended the "sandbox", or lag, to all new (meaning newly indexed) pages, whether from new or old domains!
This 85 message thread spans 3 pages.