Come on now ... this is just daft! What is the "rest of the Internet"? Google has to wait for no one. Isn't it supposed to be the leader?
It may be daft, I've never claimed wisdom or intelligence. :)
But to think that Google ignores the 'rest of the internet' is to ignore the very ranking metric that initially made Google's results better than the other engines' and continues to be a foundation of its ranking algorithm: inbound links.
The collective linking structure of sites is a quality filter for Google -- the only reason it's not a perfect system is that people are aware of it and seek to manipulate it. For new sites, an excellent reason to "age" their inbound links is to ensure that the inbound links are "real" links. (In the sense that Google wants a link to be a "vote" for quality, interest, or whatever.)
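To make that "aging" idea concrete, here's a toy sketch in Python. The 90-day threshold and the link data are pure invention on my part, not anything Google has published:

```python
# Hypothetical sketch of "aging" inbound links: a link only counts
# as a vote once it has existed for some minimum period.
from datetime import date, timedelta

MIN_AGE = timedelta(days=90)  # made-up threshold

def trusted_votes(inbound_links, today):
    """inbound_links: list of (source_url, first_seen_date) tuples."""
    return [url for url, first_seen in inbound_links
            if today - first_seen >= MIN_AGE]

links = [("http://a.example/", date(2004, 3, 1)),
         ("http://b.example/", date(2004, 9, 20))]
print(trusted_votes(links, date(2004, 10, 1)))  # only the older link counts
```

A link-farm blast that appears overnight fails a filter like this, while links that stick around start counting.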
Keeping with the theme of applying our concepts to political sites: if link aging and the other 'sandbox' effects we see weren't applied, then there could be dozens of sites all claiming to be the official site for each presidential candidate. They could be put up, interlinked with link farms, and ranked in the SERPs within days.
What would search engine users do when there are a dozen "Vote for widgets" sites with conflicting information and no clear way to determine the true site?
1. How is Google supposed to know which political candidates' sites are the legitimate ones?
...when someone puts up a site (www.candidatesname) with the title Candidates Name, and the name is unique and other sites link to it using Candidates Name, then Google should return the site when someone searches for 'Candidates Name'. ...
The only difficult part of that technique is buying the domain name. But what if the candidate buys votename.com? Or if candidatename.com is taken? (You point out the name would need to be unique; what about when it's not?)
The rest of it -- name in title, name in links -- can all be gamed too easily for Google to trust it. I wish all of us webmasters could be trusted to provide accurate data about our sites. Search engines know we can't be trusted, so the engines will always be written with complex data analysis methods, and many of those methods will hinge on watching a site interact with users before it gets ranked.
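Just to show how easily that rule is gamed, here it is as a naive sketch -- the function and signals are my own illustration, not Google's actual ranking code:

```python
# Naive heuristic from the quote above: rank a page for a query if
# the query matches its title and at least one inbound anchor text.
def naive_match(query, page_title, inbound_anchors):
    q = query.lower()
    title_hit = q in page_title.lower()
    anchor_hits = sum(1 for a in inbound_anchors if q in a.lower())
    return title_hit and anchor_hits > 0

# Trivially gamed: a spammer controls both the title and the anchors.
print(naive_match("Candidates Name", "Candidates Name - Official Site",
                  ["Candidates Name", "vote Candidates Name"]))  # True
```

Every input to that function is under the webmaster's control, which is exactly why the engines can't trust it.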
By bringing up a small sample of sites, the thread is either:
a) about that small sample of sites and, presumably, how those should be treated differently.
b) about the larger sandbox issue.
Or maybe, just maybe, it's about how the larger sandbox issue relates to this small sample of sites, which are related by the fact their information will be worth much less after Nov. 2. (my bold)
The information being distributed and available after November 2 will be worth much less to the candidate. That's why political sites will sometimes get pulled down after the election.
No search engine should rank any site for time-sensitive information until that site can show a record of consistently carrying time-sensitive information. The perfect example is news sites. "Time-sensitive information distribution" should be reserved for sites that have demonstrated their ability to distribute that information regularly. Those sites can link to or post their source material to make their sites more valuable to their users.
Enough of those links and the source material becomes evidently relevant and will also appear.
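A crude sketch of what I mean by "showing a record" -- the six-month rule and the data are arbitrary numbers I picked for illustration:

```python
# Hypothetical "track record" check: only treat a site as a news
# source once it has published new items in enough distinct months.
from datetime import date

def is_established_news_source(publish_dates, min_months=6):
    """publish_dates: dates on which the site published new items."""
    months = {(d.year, d.month) for d in publish_dates}
    return len(months) >= min_months

history = [date(2004, m, 15) for m in range(3, 10)]  # March-September
print(is_established_news_source(history))  # True: 7 straight months
```

A brand-new site has no such history, so it waits; an established news site sails through.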
From the search engine's perspective, every new site is an unknown. It could be the mindless ranting of a lunatic who has discovered HTML :D or it could be the wisdom of a future world leader. (Heck, it could be the mindless ranting of a lunatic future world leader.)
Nothing on-page can be wholly trusted to make a new site relevant to any particular term. So where does the search engine get its information? From the rest of the internet, and it can take time for a definitive answer to arise. It's the reality of webmasters' behavior that new sites must be treated this way by search engines.
Same way they algorithmically judge every site
That's what they are doing right now. Part of their algorithm has the effect of delaying inclusion for many sites.
Let's pretend that this is not about political websites, just new information that has little relevance after a given period of time
That's called news. There is no reason why a search engine should consider a new website to be just as relevant as an established news site. The real-world practice of giving greater weight to established sources is being carried over to the internet. I know some folks disagree with this concept (that the source of an idea is relevant to the quality of the idea), but the search engines must have thought it would make their SERPs better overall.
Of course, it is no surprise that you are suffering from poor rankings given your general grasp of logic.
Thanks! :)
<edited>I'll leave my responses to mfishy's post. Quotes above are from his now deleted post.</edited>
I will let you continue to debate yourself for the next 3 posts! LOL
:D lol I'm hunting bad links in a major site revision today -- my choice of activity between site-wide find-and-replaces is either watching homestarrunner or participating in this discussion. Probably leaves me with a little too much time to type. :)
(If only I'd learned more about regular expressions before experimenting, it wouldn't take me a day to fix my mistakes.)
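For anyone in the same boat, this is roughly the kind of script I mean -- the directory and the link pattern are made up for this example:

```python
# Site-wide find-and-replace for bad links using a regular expression.
import re
from pathlib import Path

# Made-up pattern: links still pointing at an old hostname.
OLD_LINK = re.compile(r'href="http://old\.example\.com(/[^"]*)"')

for page in Path("site").rglob("*.html"):
    html = page.read_text(encoding="utf-8")
    fixed = OLD_LINK.sub(r'href="http://www.example.com\1"', html)
    if fixed != html:
        page.write_text(fixed, encoding="utf-8")
        print("fixed", page)
```

Test the pattern on a copy first; a bad regex is how I got into this mess.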
Crazy? Who knows?
I have now stopped taking any new clients whose sole priority is Google (I am not the type to look a gift horse in the mouth), but I'm turning more and more people down; I gotta sleep at night.
My last refusal was a major worldwide charity site that is moving its domain name. They even offered a salary and a two-year contract!
When I tried to explain the current difficulties, they thanked me for my honesty... but could I recommend someone else who could do the job? Jeez, some guys don't get it.
Seriously though, if a lot of people think Google sucks -- a lot of people will stop using it. The only reason people keep using something that sucks is because they think the alternative is worse or the decision has been taken away from them.
It's monopoly and misinformation that I think should be combatted, not the idea that there can be only one right way to organize data. I don't think a 'sandbox' is a good thing -- but it's their choice to try it and see how it works. <shrug>
Exactly, and I can say it doesn't work for the right reasons.
So is the g-lag to do with politics?
Maybe G thought candidate (a) could out-spam candidate (b) ;-) Thus making conspiracy theories... i.e., G supports candidate (a) or (b), whoever wins.
I'd want to hear that interview, "How do you respond to the claim that Google only lists red party candidate sites?"
Google rep responds, "That's because blue party candidate sites all use Flash, JavaScript redirects, and PHP session variables."
Reporter stares blankly into space, "So... how do you respond?" :)
err me sir, them guy's signed toooo many guest books, it ain't fair. And they is nothing but underhand cheating spammers who should face fraud charges ;-)
Reporter says "but you won sir" ;-)
The only reason people keep using something that sucks is because they think the alternative is worse or the decision has been taken away from them.
Wrong! People will continue to use something that sucks if they don't know that it sucks. That is a fact. I know it, Google knows it and I think you also know it.
Probably more than 99% of users have no idea that Google is currently defective.
People will continue to use something that sucks if they don't know that it sucks.
They'll use it until you push their nose into a better alternative. Remember the time when AltaVista was the biggest search engine? Back then someone showed me the Google search form, and I knew this must be a better search engine. Just by looking at the form. One logo, no ads, no banners, no directory. Plain and simple. Search for something, find it. The G SERPs have improved in the past 12 months as far as spam is concerned. But at what price!
For the time being, I don't see an alternative. I would be the first one to switch and I would spread the word.
So if Google can't fix this lag problem -- which I absolutely do not believe has the deliberate attempt to filter out spam as its primary cause [although this could easily have been a secondary motivation] -- somebody will eventually come along and offer a product without these restrictions. The web is fluid; it changes all the time, it's not static. If a search engine can't handle this fluid nature, then it can't handle the web, and that means a company that can handle it will come up. Likewise, if a search engine can't handle spammers without totally throttling the incoming material that it's based on, a search engine will come up that can. Even accepting that this was done to stop spammers, that's saying that the spammers won and forced Google to give up fresh, timely scans of the web to control the spam -- but I don't believe this is the case.
The political website thing was the example that struck me as one of the clearest demonstrations of the absurdity of the so-called 'lag'. Trying to create some rationalization for a system failing to handle new sites in a timely manner... I don't know; I think Google is still benefiting from the good-guy image they started out with. Microsoft also used to be a much better company -- they even had real live people you could call for free tech support -- then they got too big.
It's my guess that we're all going to wake up one morning, or month, and suddenly the 'lag' will be gone, slow indexing of new content will be gone, and results will be significantly different. Whatever problem is causing this will have been fixed, and Google will more or less successfully have prevented the problem from ever affecting its stock price [nice going guys, very slick].
My current suspicion is that whatever made MS put off their search engine release by about 8 months is very similar to what first made the 'lag' appear and then grow longer and longer; it used to be about 2 months, now I think it's 6. I'm just guessing here, but I think it's related to hardware prices and problems with the operating systems that run that hardware; that end of things took a bit longer to work out than had been anticipated.
I'm just guessing here, but I think it's related to hardware prices and problems with the operating systems that run that hardware; that end of things took a bit longer to work out than had been anticipated.
The lagging/lagged/sandboxed sites are in the index. They are just not ranking well. Depending on PR, changes made to these sites take effect more or less immediately in the index and cache. We were told that new pages on established, non-lagging sites rank well immediately, though I have not really witnessed this myself.
Anyway, it's not the hardware. It's not a glitch or problem with the software. The sandbox is a deliberate, conscious decision by Google. A somewhat desperate one if you ask me.
That's exactly my point. It's so desperate that there is simply no way a group of guys that smart would be doing it if they weren't being forced to by something they hadn't considered when they built the system.
These are young guys doing something new; there wasn't a lot of prior data to work from when they built Google, including the speed at which the web would grow. It simply was not predictable, especially how spammers would be able to generate huge sites overnight.
People say this was predictable and, once it happened, easily fixable. I don't think that's true. I think it will be fixed, without any question, but it hasn't been fixed yet. Six months for a site to leave the sandbox is a full-on, 100% system failure, as this election discussion clearly shows as far as I'm concerned. If Google can't handle the web it will fail; it's as simple as that. But I think they will work it out; it's just going to take some time.
Anyway, it's not the hardware. It's not a glitch or problem with the software. The sandbox is a deliberate, conscious decision by Google. A somewhat desperate one if you ask me.
I don't think so. If it is deliberate, the person who came up with it should be sacked. Do you realise the damage that this could do to Google if the press gave it the attention it deserved?
In actual fact, if the politicians whose websites are not showing up knew about this, they could probably rattle a few cages.
BeeDeeDubbleU,
> If it is deliberate, the person who came up with it should be sacked.
Agreed.
>Do you realise the damage that this could do to Google if the press gave it the attention it deserved?
No, I don't. But I think the damage caused by spam was bigger. From a user's perspective, it is easy to identify spam but much harder to realize that the SERPs are stale. The sandbox fights spam but fools users.
I can't help it. I need to make a sweeping generalization at the end of my posts. Maybe I should boldface it.
I have switched, and I have switched all my clients (100+), to [techpreview.search.msn.com...] May not be the best, but at least they are attempting! (Just hope they implement a 301 when launched.)
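For what it's worth, a 301 is just a permanent-redirect response that tells crawlers the move is for keeps. A bare-bones sketch -- the domain is a placeholder, and a real deployment would do this in the web server config:

```python
# Minimal sketch of a 301 (permanent) redirect from an old domain.
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_DOMAIN = "https://search.example.com"  # placeholder, not the real URL

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 signals a permanent move, so engines should transfer
        # the old URL's standing to the new one.
        self.send_response(301)
        self.send_header("Location", NEW_DOMAIN + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```

A 302 (temporary) in the same spot would leave the old URLs in the index, which is exactly what you don't want after a launch.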
They will get the picture soon enough after watching the radar blips disappear from their screen!
This is nothing more than a way to extract more money from surfers. Google has proven they don't care about their results and will risk making them poor in order to boost AdWords revenue. They are one of the greediest companies on the net, and nothing would surprise me.
There is a reason scraper sites running AdSense rank higher than real sites. I don't use Google anymore unless I need a spelling, a conversion, or an image. I don't click on their ads either. Yahoo! whores out their front page with ads, but at least Inktomi is trying to make better results for its users.
If I launch a new 'joecandidate dot org' website that is extremely relevant, it can take months. This is absurd. This "lag" is a complete disservice to the public and an embarrassment to Google.
This is absurd. This "lag" is a complete disservice to the public and an embarrassment to Google.
Of course it is, Kirby. The argument that older sites have gained trust is utter rubbish.
Google is supposed to be able to evaluate a page by what type of incoming links point to it. If site A is deemed important by Google and it links to site B, that should tell them something. If it doesn't, then they are admitting that they may have been wrong about site A in the first place!
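That link-voting idea is basically PageRank. A toy version -- simplified to the point of ignoring dangling pages, and obviously not Google's production code -- looks like this:

```python
# Toy PageRank: importance flows from a page to the pages it links to.
def pagerank(graph, damping=0.85, iterations=20):
    """graph: {page: [pages it links to]}"""
    n = len(graph)
    rank = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            if not outlinks:
                continue  # simplification: dangling pages leak rank
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Site A is heavily linked and links to the new site B.
graph = {"A": ["B"], "B": [], "C": ["A"], "D": ["A"]}
print(pagerank(graph))  # B inherits much of A's importance
```

If an important site A links to B and B still ranks nowhere, then either something like the model above isn't being applied to new sites, or Google doesn't fully trust its own judgment of A.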