

Is 2018 The Time For a New Search Engine for Non-Brands?

     
10:45 am on Dec 14, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:25721
votes: 820


Brands are everywhere: In the high street, in the mall, around the globe, and, for you and me, online.

Localisation might have been the answer, but it's also dominated by brands.

The deep pockets of brands mean a smaller business cannot compete online.

What if there was a new search engine that only displayed non-brands?
The idea would be that the independent business would have as good a chance as any other in the mall of the Internet.

It would need to define a brand, of course, but, if you think about it, it's relatively easy. Brands just wouldn't be listed in any organic results.

Google and Bing could do it, but that would go against their objective, and certainly wouldn't be good for the bottom line.

How would the new search engine make money to survive?

Pay-for-play really won't work as we're talking about small businesses that might spend less than $1000 per year to promote themselves.

There are options to be ad-funded, and you could use the funds of brands to pay for ads.

Perhaps I'm just dreaming and you'll wake me up in a moment.
11:27 am on Dec 14, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 16, 2005
posts:2830
votes: 143


The problem is spam. One reason Google and Bing favour brands is that favouring big and well-known sites removes the worst of the spam.

The other problem is that there are huge barriers to entry:

1. Cost (hardware, bandwidth and development)
2. Patents. Google and MS have a lot of patents on search, enough to constitute a patent thicket you are not going to be able to engineer around. A start-up would not be in a position to negotiate a cross-licensing agreement, so you are finished.
3. Branding and marketing. Really, very difficult to get users to change habits. Even Bing with MS behind it has made limited progress against Google.

If you can get the users, you will get ads. That bit will work, but getting there is impossible.
11:46 am on Dec 14, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:25721
votes: 820


graeme_p, you're correct, there are huge challenges, and that's a great list of reasons not to do it.

Yes, spam is an issue, everywhere, and new AI ought to help in that respect.
Google got everyone to help it identify spam by asking for a disavow file. There's a wealth of data in those files, which already exist (a rough sketch of what mining that sort of data might look like follows this list).
Cost - start small, win investment.
Patents - yes, and there are other ways of making a database.
Branding - every small business would have a stake in it succeeding.
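
As a rough sketch of what mining that kind of data might look like (assuming you somehow had a corpus of disavow files, which in practice only Google does), the format itself is trivial to parse: plain text, one URL or domain: entry per line, # for comments.

```python
# Rough sketch (not any real pipeline): count how often a host shows up
# across a pile of disavow files. Per the documented format, each line is
# either a full URL, a "domain:example.com" entry, or a "#" comment.
from collections import Counter
from pathlib import Path
from urllib.parse import urlparse

def hosts_in_disavow(text):
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.lower().startswith("domain:"):
            yield line[len("domain:"):].strip().lower()
        else:
            host = urlparse(line).hostname
            if host:
                yield host.lower()

def widely_disavowed(folder, min_files=5):
    """Hosts disavowed in at least `min_files` separate files."""
    counts = Counter()
    for path in Path(folder).glob("*.txt"):
        counts.update(set(hosts_in_disavow(path.read_text(errors="ignore"))))
    return [host for host, n in counts.most_common() if n >= min_files]
```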
1:05 pm on Dec 14, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 16, 2005
posts:2830
votes: 143


Starting small means you need to target a niche, but that gives people a choice of using two search engines (yours for the niche and another for general searches) or one for everything, and most people will opt for the latter.

Patents: not sure what you mean.

Branding: convincing non-web small businesses of that will be difficult. At best they will wait and see if it succeeds.
2:03 pm on Dec 14, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 16, 2003
posts:1954
votes: 9


Etsy, Amazon, eBay, etc all do pretty well at delivering non-brand / indie products. ;)
3:05 pm on Dec 14, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2706
votes: 116


I think we just need Duck Duck Go to take off. That will be a lot quicker than building up another search engine.
4:24 pm on Dec 14, 2017 (gmt 0)

Junior Member

joined:Apr 21, 2016
posts: 64
votes: 11


So why not help Duck Duck Go take off? Would promoting it via your pages help get more people using it?
4:30 pm on Dec 14, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14843
votes: 473


Amazon is a search engine.
YouTube is a search engine.

There are many other search engines that aren't commonly thought of as search engines that serve non-brands.
4:58 pm on Dec 14, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member ken_b is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 5, 2001
posts:5855
votes: 103


>Duck Duck Go


Maybe they could start by ditching the juvenile playground name for a serious business name.

"Google" was at least easy to remember and didn't sound like a playground game.
5:02 pm on Dec 14, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:25721
votes: 820


>"Google" was at least easy to remember and didn't sound like a playground game.

Better than BackRub, eh! hehe
7:30 am on Dec 15, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2648
votes: 97


>Yes, spam is an issue, everywhere, and new AI ought to help in that respect.
This is a lot less of a problem on a local level. As for AI solving it, that's just clueless Google propaganda. The search spam problem is a lot easier to solve than most people think. The reason that webmasters think it is some kind of insurmountable problem is simply because they have never seen the problem from the other side.

From running web usage surveys (gTLDs/ccTLDs/new gTLDs), there's a major section of the web that does not change over the course of a year. It would be necessary to use a smart spidering algorithm to only spider recently changed content.

Duplicate content is a far more important problem for a local search engine. While a locale's footprint might have a set of applicable TLDs, people still don't know how to use 301s or 302s properly.
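
A minimal sketch of the "only spider what changed" idea, using nothing fancier than HTTP conditional requests; the Python and the `requests` library here are my own assumptions for illustration, not a description of any real crawler:

```python
# Rough sketch: re-fetch a page only when the server says it has changed,
# via If-Modified-Since / If-None-Match. Assumes the third-party `requests`
# library; `cache` is any dict-like store keyed by URL.
import requests

def fetch_if_changed(url, cache):
    entry = cache.get(url, {})
    headers = {}
    if entry.get("etag"):
        headers["If-None-Match"] = entry["etag"]
    if entry.get("last_modified"):
        headers["If-Modified-Since"] = entry["last_modified"]

    resp = requests.get(url, headers=headers, timeout=10, allow_redirects=True)
    if resp.status_code == 304:              # unchanged: reuse what we already have
        return entry["body"], False

    cache[url] = {
        "etag": resp.headers.get("ETag"),
        "last_modified": resp.headers.get("Last-Modified"),
        "body": resp.text,
        "final_url": resp.url,                # post-301/302 URL, handy for collapsing duplicates
    }
    return resp.text, True
```

The `final_url` bit is the cheap way to fold 301/302 duplicates back onto one canonical address, at least for the sites that do redirect properly.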

For some, this is a discussion about the future. For a few of us here on WW, it is one about our past. However, the single most important question for a search engine is not the technology or the algorithms but rather the funding.

Regards...jmcc
9:07 am on Dec 15, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:25721
votes: 820


>just clueless Google propaganda
Oh, come on, why just Google? In that case it's every other company involved in AI.

I was lucky to see and try a demo of an AI and machine learning chatbot a short while back, and it really was pretty good. It was able to differentiate between minor nuances in the question and find differing answers from its database. To prove it worked, it had a turn-off-what-I've-learnt mode and was able to surface content which was, one might call, spammy or inappropriate. Switch the modes back and you could not get it to deliver the inappropriate answers, as its memory was restored.
It wasn't perfect, but neither is a search engine algo.

You are correct about funding, but, once again, there is VC out there.

As for DDG, it really does require funding, and the elimination of the brands if it were to work as I've suggested, otherwise it won't help the smaller businesses.
All the small businesses out there could really use this.
10:33 am on Dec 15, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2648
votes: 97


>Oh, come on, why just Google? In that case it's every other company involved in AI.
Because spam isn't that kind of problem. It is easy to fool "technology" journalists with the technological equivalent of smoke and mirrors. Most webspam has characteristics that allow huge amounts of it to be eliminated from an index with a few keystrokes. As I said, most webmasters don't see this problem from the perspective of those who build or have built search engines.
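
As a toy illustration of the "few keystrokes" point (the patterns below are invented for the example, not anyone's actual blocklist), crude host- and URL-level rules knock out a lot before any ranking or AI is involved:

```python
# Toy illustration only: the patterns are made up for the example.
# The point is that cheap host-level rules remove a lot of webspam
# before any ranking or AI gets involved.
import re

SPAMMY_HOST = re.compile(
    r"("
    r"\d{4,}"                      # long digit runs in the hostname
    r"|(-[a-z0-9]+){5,}"           # excessive hyphenated keyword stuffing
    r"|(cheap|free|casino|viagra).*(cheap|free|casino|viagra)"
    r")",
    re.IGNORECASE,
)

def looks_spammy(host, path=""):
    if SPAMMY_HOST.search(host):
        return True
    # doorway-style URLs: very deep paths with heavily repeated segments
    segments = [s for s in path.split("/") if s]
    return len(segments) > 8 and len(set(segments)) < len(segments) / 2

urls = [
    ("best-cheap-viagra-casino-free-2017.example", "/"),
    ("example.co.uk", "/widgets/blue"),
]
for host, path in urls:
    print(host, "spam" if looks_spammy(host, path) else "ok")
```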

Regards...jmcc
10:49 am on Dec 15, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:25721
votes: 820


I get that spam will fool it, and it's a constant battle.

But, back on topic, it should be easier to produce something that will serve the smaller, non-brand businesses.
11:03 am on Dec 15, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2648
votes: 97


No. Trying to use AI to attack spam is a misuse of the software. One needs people with a hunter-killer mindset (rather than a Happy-Clappy mindset that believes that software will solve all problems) to deal with spam.

Most of the web is small businesses. Every successful TLD that I've surveyed is full of them. A network of localised SEs might be a better way to go as Unified Search is a bit of a mess and insanely frustrating at times.

Regards...jmcc
11:19 am on Dec 15, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:25721
votes: 820


>A network of localised SEs might be a better way to go

Now we're getting somewhere. :)

Are you talking regional by country, or even more localised, such as by county, or even by town?
11:37 am on Dec 15, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2648
votes: 97


Country is easiest. Going to county or town level is a bit more complex, but it could be done. Google can be defeated at the local level because it doesn't have good local knowledge. The main problem affecting the larger search engines at the moment is that the renewal rates on domain names in some of the legacy TLDs are not great (<60% for one-year regs in .COM) and there is an ongoing shift to ccTLDs for local sites and businesses. Many ccTLD registries don't provide access to their zonefiles, so Google has to depend on crawling to detect new sites. That's a big problem for Google because its FUDbuddies in the SEO business and media have convinced people that it is bad to link. No outbound links to new sites means that these new sites are quite invisible to conventional search engines.

Regards...jmcc
10:34 am on Dec 19, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 16, 2005
posts:2830
votes: 143


The problem with competing at a country level is the same as for anyone targeting a niche: you have to persuade users to use two search engines, one for national and one for global searches. You immediately lose all the users who use the default search engine (or "just type it in at the top"), and it's too much thinking for most of the rest.

Most people have no idea how search engines work or how to frame queries so that they are not ambiguous.
11:05 am on Dec 19, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:25721
votes: 820


Google, especially, has done a particularly good job of getting search on every Android device, and pretty much every desktop.

How would a new search service start to break through and gain eyeballs?

Is the conclusion it's simply too difficult to compete, and the concept should be forgotten about?
11:22 am on Dec 19, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2706
votes: 116


Maybe it's time for someone to resurrect the old directory idea. Crawling the entire web was all right when it was small, but it's so big now and there are so many rubbish sites that it's almost pointless crawling the entire thing.

Just have a curated directory which only lets in the decent sites. No need for algos or the expense of crawling. Just got to pay for the human editors.
3:56 pm on Dec 19, 2017 (gmt 0)

Junior Member

joined:Apr 21, 2016
posts: 64
votes: 11


What are the definitions of "rubbish sites" and "decent sites"?
4:05 pm on Dec 19, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2259
votes: 599


>What are the definitions of "rubbish sites" and "decent sites"?

Easy --> a rubbish site is any site that outranks yours! A decent site is your own!
4:16 pm on Dec 19, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2648
votes: 97


>Just have a curated directory which only lets in the decent sites. No need for algos or the expense of crawling. Just got to pay for the human editors.
And this is the rock upon which many web directories were wrecked. Humans are not horizontally scalable.

Domain names expire and web directories disappear. Approximately 56% of .COM domain names registered this time last year will renew. Some of the ccTLDs have higher renewal rates, but keeping track of these deleted sites and maintaining a current index for the directory does require some effort.

Then there's the problem of finding new websites. How do you do it? Relying on user submissions is going to get piles of low quality websites from meatbots and automated submission operations.

Setting up a web directory seems to have been a rite of passage for web devs. The problem is that if it becomes successful, then it will turn into a full time job and Google will try to kill it, as it did for all web directories once it had plundered their links.
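
The mechanical side of that effort is at least scriptable; a rough sketch (assuming nothing about any particular directory's stack) that flags listings whose hostnames no longer resolve:

```python
# Rough sketch: flag directory entries whose hostnames no longer resolve.
# Deciding *why* they went away (expired, parked, moved) still needs a human.
import socket
from urllib.parse import urlparse

def dead_entries(urls):
    dead = []
    for url in urls:
        host = urlparse(url).hostname
        if not host:
            dead.append(url)
            continue
        try:
            socket.getaddrinfo(host, 80)
        except socket.gaierror:
            dead.append(url)          # NXDOMAIN or similar: candidate for removal
    return dead

listing = ["https://example.com/shop", "http://long-gone-widgets-site.example/"]
print(dead_entries(listing))
```

A parked or dropcaught domain will still resolve, so DNS alone only catches the hard failures; comparing a fresh HTTP fetch against the stored title catches rather more, but that is where the human editor comes back in.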

Regards...jmcc
1:02 am on Dec 20, 2017 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1193
votes: 331


Rather than wishing upon a [ insert dream ] search engine, why not do what some have done since day one, and even more have done following 07-October-2008: develop your business/site as a branded business/site? Yes, at one time having a keyword domain aka exact match domain leveraged SE stupidity; however, that has been shifting steadily from advantage to constraint for a decade.

Google and other general SEs have always had a critical problem, and it's not spam; rather it's limited answer space. As we know, western searchers drop off significantly between the first and second pages. So the 'answer' needs to be shown on the first. And just as no one ever got fired (back when) for buying IBM, so too the easy out for an SE is to prefer 'brands' over generic. Unfortunately for SEs, some of those 'brands' are actually niche search destinations as well: Wikipedia, Amazon, Etsy, eBay, etc. They have built sufficient 'brand' presence that if they weren't shown prominently in results, a large number of searchers would go direct rather than via G.

Add in the shift from search to answer first then search and Google really has a lack of available real estate for a good many queries in a good many niches. For most sites there simply is no room at the inn. And the majority of those sites stand about outside and complain rather than stay elsewhere. Or also become brands.

As has been mentioned Local still gives the advantage to local sites, however even here the advantage favours brands. Note: I say favours, there are always exceptions. Unfortunately, brand is rarely 'made' via Google, rather it is 'made' via visitors being sufficiently impressed that they return and recommend, by appropriate marketing to acquire and superior service to satisfy.

Far too many sites are so Google optimised that most/all their traffic is via Google and the one time visitor. They have made themselves into a Google commodity, the polar opposite of a 'brand'. Further, they use the same platforms and offer pretty much the same [ insert product/service ] as a zillion others, they are figuratively and literally interchangeable with many/most competitors, the antithesis of being a 'brand'.

My last point is that 'branding' is done by others, not by the business/brand/site. You put the brand out and others define it for good or bad or total indifference.

The 800lb gorilla is Google's switch to increasingly providing answers and holding visitors rather than referring them. That it has driven even the safe brands down in the results, which in turn has shifted many sites out of sight and mind, is the big recent ongoing change. Brand preference has been in existence for a decade now; it is far from new. And it should therefore have already been developed around; the simplest, if not exactly easiest, method being to transform one's site into a brand, into a destination. A funny thing about SEs, and people generally, is that the more you need them the more they abuse the relationship and take advantage; however, the less you need them the more they want to associate with you.

The critical lesson for today: become a destination, a brand, however niche OR become/remain disposable SE commodity, extraneous result page filler.
8:46 am on Dec 21, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 16, 2005
posts:2830
votes: 143


Directories go one of two ways:

1) Either a small number of high quality sites, which favours large sites
2) A large number of (almost certainly volunteer) editors, so quality drops

It does not help that most people volunteer to get their own sites listed. My own experience of it was enough to make me very sceptical that you can get a large number of contributors without most being clueless with varying and superficial definitions of quality.

Google has this problem as well. Despite the huge amounts of money they have, they seem to base their ML training on what Americans in their twenties think of a site after a quick glance. If you are going to cut out the ML and do it all manually and properly you will need a lot more money than Google.
9:21 am on Dec 21, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2706
votes: 116


Do you remember how much Yahoo used to charge for looking at your site? 500 quid or something like that. And I seem to remember that didn't even guarantee a listing. It must have been a gold mine in its heyday. And that was before everything was plastered with ads as well.

I think it's time for paid directories to make a comeback. Make them pay a few hundred quid and give them an actual site review as well. Could be a good review or it could be a terrible one. Sort of like Kirkus does, but for websites instead of books.
That is something Google doesn't offer, and it would distinguish it from all the other search engines.
11:28 am on Dec 22, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 16, 2005
posts:2830
votes: 143


The problem is:
1. No one is going to pay you for a site review unless you have a lot of users
2. You are not going to have a lot of users unless you have a lot of reviewed sites (enough that people can find what they are looking for reliably) and have a UI at least as comfortable as Google, and have done the marketing to convince a lot of people to use it.

I know of one paid directory trying this, and the only reason people pay for listings is the hope it will help them rank better in Google. They have no traction with real users.
5:04 pm on Dec 22, 2017 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1193
votes: 331


To take a tangential view of the thread title:
Is 2018 The Time For Search Engines to Expand Their Query Results Index?

However an SE might separate/congregate query return data, there are fundamentally two sets of interest to searchers:
1. every page that might satisfy the query.
This is the silly [ About 85,400,000 results ] or similar one sees at the top of a Google results page.

2. every page that is actually returned for the query.
This is a maximum of 1000, typically far fewer. And, as we know, visitors rarely go past 10 and almost never past 30.

What I'd like to see is the ability that if one looks at a page of results and simply refreshes it the SE returns something other than that initial 1000. I want out of the personalised filter bubble that generated the initial bumph. I want access to more than some percent of a percent of a percent of the indexed pages that you say apply.

Perhaps those 1000 are the very best of the zillions available, possibly the first 10 are the very best of the 1000 but, in my experience, probably not.

Give me the choice of paging through default results as now.
BUT
Also give me the choice of asking for more. Not another query, simply more of what you say you have behind the curtain for this one.
5:13 pm on Dec 22, 2017 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1193
votes: 331


Because broad-topic aka general search engines have pretty much replaced general directories, and because of the dominance of a relative few SEs, it is easy to forget that niche SEs and directories still exist.

While for most of us niche SEs are not likely to be traffic drivers, they can be helpful when researching content. Google et al may have indexed everything including the kitchen sink, but they are extremely constrained in what they show in query results; indeed, with personalisation one's filter bubble is ever contracting, which almost forces webdevs to produce similar content. It's as if the Library of Congress only allowed the public access to a fraction, say equivalent to a small town library. Yes, with effort one can do the equivalent of an interlibrary loan, but it is tedious and still fractional.
Note: If a genie ever offered me three wishes I'd be very tempted to wish that Google's deceptive bucket sort be well shoved...

Niche directories can be traffic referrers, and while the traffic is likely to be low volume, it is also likely to be high converting, being self-selected. In my niches most such directories of value are part of niche discussion groups/fora, often as a main value-added enticement for registering/paying. And, as such, they are often not found via a general search query. Sometimes one has to dig a bit.
7:08 pm on Dec 22, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2706
votes: 116


Maybe someone could just change the way the results are displayed. At the moment all of the search engines have one thing in common: they are all text-based. How about doing a visual one?
Do you remember that million dollar webpage thing? You could have a grid layout with maybe four boxes in each row. Each one contains an image of the webpage. The most relevant pages will be in the top rows, and then the boxes get smaller and smaller as you scroll down to the less relevant pages.
But the idea is that you'll be putting loads and loads of sites in front of the user's eyes instead of just 8-10. They can quickly scan through the lot and pick which one looks the best.
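
For what it's worth, the layout itself is the easy bit; a rough sketch (assuming you already have a ranked result list and a screenshot per page, which is the genuinely expensive part) that writes a shrinking four-column grid as static HTML:

```python
# Rough sketch: render ranked results as a 4-wide grid of screenshot tiles
# that shrink with rank. Assumes screenshots already exist; capturing them
# at crawl time is the expensive part.
from html import escape

def results_grid(results, cols=4, start_px=240, min_px=100, shrink=12):
    """results: iterable of (url, title, screenshot_src), already ranked."""
    tiles = []
    for rank, (url, title, shot) in enumerate(results):
        height = max(min_px, start_px - (rank // cols) * shrink)  # shrink per row
        tiles.append(
            f'<a href="{escape(url)}" title="{escape(title)}" style="display:block">'
            f'<img src="{escape(shot)}" alt="{escape(title)}" '
            f'style="width:100%;height:{height}px;object-fit:cover"></a>'
        )
    return (
        f'<div style="display:grid;grid-template-columns:repeat({cols},1fr);gap:8px">'
        + "".join(tiles)
        + "</div>"
    )

demo = [(f"https://example{i}.test/", f"Result {i}", f"shots/{i}.png") for i in range(12)]
print(results_grid(demo))
```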