Forum Moderators: Robert Charlton & goodroi
Need in-depth research information about a product? Google won't provide it so searchers have to go elsewhere.
> OK, let's look at some practical issues.

Not even close to being the right questions.

> 1) Where does the underlying data come from?

Ever hear of the World Wide Web?

> 2) How do you correct for "garbage in, garbage out"?

By being smart rather than stupid about what the search engine spiders and includes.

> 3) You talk about "demoting Amazon" and, presumably, other brands that compete with your search engine's sponsors.

It's got sponsors now?

> 4) Where is the demand for this "alternative Google" coming from?

This might be hard to hear, but not everyone thinks that Google is great. A new search engine has to be better than Google (with the banjaxed results, that might be getting easier), give the searchers what they want, and provide opportunity, if necessary, for monetisation.
1) Which set of Google SERPs do you scrape?
2) How do you correct for "garbage in, garbage out"? Let's say that Google is ranking scrapers higher than the original sources, or that John Doe's site has disappeared from the SERPs because of a wrongly-applied manual penalty.
Are you going to write a complicated algorithm to run on top of Google's already complicated algorithm? Are you going to hack into Google's data centers to extract pages that aren't being shown? And who decides whether the penalty against John Doe was legitimate or unfair from a searcher's point of view?
3) You talk about "demoting Amazon" and, presumably, other brands that compete with your search engine's sponsors. Is this idea driven by what searchers want or what you want? And who decides which sites should be "demoted"? (Not the sponsors, presumably, or Amazon could just kick in a few million bucks and demand a spot at the top of the heap.)
4) Where is the demand for this "alternative Google" coming from? DuckDuckGo certainly hasn't taken the world by storm, and Bing hasn't been able to take market share from Google despite huge expenditures on search technology, promotion, and advertising. Is there an audience for an "alternative Google" beyond disgruntled SEOs and site owners?
Are you suggesting that the search engineers at Google, Bing, Yandex, etc. are marketers?
Let's create a new engine that doesn't have its own tech but, rather, scrapes Google, moves the ads back to the right sidebar, demotes Amazon, Demand Media, etc., and sends the results back to the user.
I think it's a model that works, but scale and Google's reaction to widespread use would be potential problems, in addition to letting enough people know about the niche engine to actually "make a dent".
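The re-ranking half of that idea is easy to sketch; the hard, fragile part is fetching the SERP in the first place. Here is a minimal Python illustration of the "demote certain domains" step, where the result list, the demoted-domain set, and the penalty factor are all invented for the example:

```python
# Hypothetical sketch of a meta-engine's re-ranking step: demote results
# from a configured list of domains before showing them to the user.
from urllib.parse import urlparse

DEMOTED_DOMAINS = {"amazon.com", "ehow.com"}  # example targets only
PENALTY = 0.1  # multiply a demoted result's score by this factor

def rerank(results):
    """results: list of dicts with 'url' and 'score' keys."""
    def adjusted(result):
        host = urlparse(result["url"]).netloc.lower()
        # Demote if the host is, or is a subdomain of, a listed domain.
        if any(host == d or host.endswith("." + d) for d in DEMOTED_DOMAINS):
            return result["score"] * PENALTY
        return result["score"]
    return sorted(results, key=adjusted, reverse=True)

serp = [
    {"url": "https://www.amazon.com/dp/B000", "score": 0.9},
    {"url": "https://example-review-blog.com/widget", "score": 0.6},
]
print(rerank(serp)[0]["url"])  # the review blog now outranks the Amazon listing
```

The filtering itself is trivial; the scale and ToS problems mentioned above are where such a scheme would actually live or die.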
> We may not all have had the appropriate training but nevertheless there are those of us who are relatively wise and competent to take on the task of ensuring the right direction for the future of search and web development. This is really important.

Building search engines is a bit different to building websites and requires people to be able to think in a different manner. The last thing that such a venture would want is to turn into another Wikia Search, with a bunch of enthusiasts and no professionals with real-world experience. Wizards? Pah! I'm a NetGod. :)
And who'd dominate the open-source contributions?
@EditorialGuy
You can't scrape Google or Bing, or anyone else for that matter. If you did, you would spend the rest of your life sifting through the garbage that these indexes think is relevant.
You want new, fresh, and relevant results? Then your new search engine has to start at the beginning and move along from there.
> The webmaster would remove any spam sites in their mini index and contact other participants in their niche asking them to do the same.

Self-regulation is no regulation. It doesn't work when financial advantages are involved.
The problem is that one just cannot trust Google not to change the terms of access if it becomes successful.
Choosing the content your users search: your site, a collection of sites that you choose, or the entire web. You can also prioritise and restrict search to specific sections of sites.
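That kind of scoping can be modelled as nothing more than a URL-prefix filter over whatever index you have. A hedged sketch, where the prefix list and index entries are made up for illustration:

```python
# Toy example: restrict search scope to chosen sites, or to chosen
# sections of sites, via URL-prefix matching. All URLs are invented.
ALLOWED_PREFIXES = [
    "https://example.com/",          # a whole site
    "https://partner.org/reviews/",  # just one section of another site
]

def in_scope(url):
    """True if the URL falls inside the configured search scope."""
    return any(url.startswith(prefix) for prefix in ALLOWED_PREFIXES)

index = [
    "https://example.com/page1",
    "https://partner.org/reviews/widget",
    "https://partner.org/forum/thread",
    "https://unrelated.net/page",
]
print([u for u in index if in_scope(u)])
# keeps the example.com page and the partner.org review, drops the rest
```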
> Self-regulation is no regulation. It doesn't work when financial advantages are involved.
> For sure there is a problem when you are dealing with commercial queries and that applies to whichever method you use.

It is a problem in all areas due to naturally occurring competition.
> The thing is that I am sure there is a solution. You could have people giving favour to sites that send them traffic and this would mean that those sites rank higher than sites that do not refer much traffic. This would be a good thing coz those sites would receive traffic but then send it on.

Perhaps. But it would require that human traffic be distinguished from bot traffic.
> You could give a boost to the sites who have updated their mini indexes most recently.

It would essentially reward content churning.
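To see why a "recently updated" boost rewards churning, consider a toy scoring rule. The half-life, base scores, and update ages below are all invented for illustration:

```python
# Toy freshness boost: the multiplier decays from 2.0 toward 1.0 as the
# mini index ages, halving every HALF_LIFE_DAYS since the last update.
HALF_LIFE_DAYS = 7.0  # hypothetical parameter

def boosted_score(base_score, days_since_update):
    return base_score * (1.0 + 2 ** (-days_since_update / HALF_LIFE_DAYS))

careful = boosted_score(0.8, 30)  # high-quality index, updated monthly
churner = boosted_score(0.5, 0)   # weaker index, "touched" every day
print(churner > careful)          # the churned index wins
```

A daily no-op "touch" of a mediocre index beats a careful monthly update of a better one, which is exactly the churn incentive being objected to.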
> The point being that the problem can be solved I believe if we put our minds to it.

There is a solution: don't use Google's GIGO approach of trying to sort out the spam after the spam has destroyed the index. This is where the simpleton approach of Garbage In, Garbage/Google Out just causes problems.
> Trying to game the system just doesn't work.

There is a fast and elegant method of stopping many of the issues that cause problems for search engine submissions: deepsix any meatbot/spammer-submitted site.