That looks interesting, thanks.
I'm testing this on our NFP site.
Google were kind enough to give us free site search, the one WITHOUT ads, but when we tested it, it returned more dead supplemental results and long-gone 404s than live pages, so we had to delete it and use FDSE instead.
If this Y! is ad-free and returns good results we will keep it.
I've just installed it, and it returns no results for any searches of our site. We are in Y!, so perhaps they need a few hours to build a database.
[edited by: Angonasec at 12:06 pm (utc) on Aug. 8, 2006]
>>I've just installed it, and it returns no results for any searches of our site. We are in Y!, so perhaps they need a few hours to build a database.
I don't think they need time; this is just a simple front end to Y! that is set up to do an advanced search for your website only. I just tried the demo builder and it works for me.
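For anyone curious what "a simple front end to an advanced search" amounts to: a minimal sketch in Python, assuming the public Yahoo! results page with its `p` query parameter and the standard `site:` operator (the endpoint and parameter name are assumptions, not anything from the Search Builder code itself):

```python
from urllib.parse import urlencode

def site_search_url(domain, query):
    """Build a Yahoo! web-search URL restricted to a single site.

    Assumes the public search endpoint and its 'p' parameter; the
    domain restriction uses the standard 'site:' operator.
    """
    params = {"p": f"site:{domain} {query}"}
    return "https://search.yahoo.com/search?" + urlencode(params)

print(site_search_url("example.com", "annual report"))
# → https://search.yahoo.com/search?p=site%3Aexample.com+annual+report
```

A hosted search box just submits a form that produces a URL like this, which is why it needs no index-building time of its own.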
Looks like there is a limit of 25 sites that you can add to the trusted sites list.
Which is too bad. I have 100+ trusted sites for a particular niche that I would love to be able to search quickly and easily and I know my visitors for that niche would like to be able to do that too.
*Sigh* I look forward to the day of the Web 2.0 of search engines. Wouldn't it be great if you could just add (and subtract) trusted sites to your personal search engine the way you can to a blogroll or del.icio.us? Spam would be a non-factor: a site gets spammy, you delete it from your search engine. You could use your friends' search engines to see if they have sites you might like, or the search engine of a blogger that you admire.
I thought that at one point in time the search engines were kind of going that way, but I haven't seen anybody doing it well. They are all still trying to please the masses rather than just me (in a global, each person is a "me" sort of sense).
The database loads on a project like that would be overwhelming. Not to mention, it would only be useful to be able to subtract sites. In order to add sites, you have to have a means to find them, which is the whole point of a search engine in the first place.
That being said, I could also see a secondary market come about wherein people sell their "subtracted" sites, the ones that were spammy in their opinion. People with a reputation would be in high demand to share their subtraction lists. It would be kind of cool, unless you're one of the unfortunate publishers considered spammy.
Is the search box branded "Yahoo"? I would love to see an example.
|The database loads on a project like that would be overwhelming. |
I am not quite so sure. You wouldn't have to crawl "all" the web to figure out what was good, only the parts that people had requested be added to their personal search engines, and you could stop crawling a site if no one had it selected. That in itself would probably cut down tremendously on the load.
Then factor in that your algo could be less aggressive (and require less work) because people would most likely choose sites they liked, which probably would not be junk. (Yeah, some would be, but if that's what the people want...)
If 8 years ago, the web had looked like it does today, most people probably would have said the same about ranking sites based on links.
|Not to mention, it would only be useful to be able to subtract sites. In order to add sites, you have to have a means to find them and, hence, the whole point of a search engine. |
All 100+ of my sites did not come from finding them on a search engine. They came from links gathered from blogs and from me wandering around other people's sites. Plus, that's the beauty of having access to other people's search engines: if I can't find it in my search engine, I could go to a friend's or a blogger's search engine and find new sites that way.
It would be more like the "good old days" of the internet, when linking worked because you linked to a site because it was cool, not because it gave them PR.
I have to tell you, as of late I have been disgusted with all 3 major search engines. I can't find anything. I know the info is out there and most of the time I am looking for documentation for facts I already know.
I know some people love Wikipedia, but I don't, and I hate that it comes up for every search I do. But someone else may feel differently.
They are trying to please everybody with a one-size-fits-all algo, and it is not working.
Right now, I have these 100 sites that I know are packed with info but have very little pull with the search engines. They are small sites, hobby sites where information is posted out of love. Their owners don't know anything about SEO and probably wouldn't care anyway. I can only search them with the power of my own memory, and my memory isn't that good.
The search engine that gives me the power to choose my own trusted sites, as many as I want, will have my love and adoration, and I will sing its praises to every person I know.
|I have 100+ trusted sites for a particular niche that I would love to be able to search quickly and easily and I know my visitors for that niche would like to be able to do that too. |
Not to hijack the news about Yahoo!, but if anyone is looking, WebmasterWorld talked about the gigablast offering back in 2004 [webmasterworld.com].
It now lives at [gigablast.com...] and allows up to 500 web sites. I've used it a number of times (the XML version) to build site searches and niche searches, although I remember it only allowing 100 sites, so it looks like they've beefed it up, perhaps in response to Yahoo?
swicki.eurekster.com is another build-your-own topic-specific search.
I noticed all the services mentioned (Yahoo and Gigablast) are not actually hosted on your site. I just can't see trusting an off-site service for this. Sort of like when AdSense had issues a few weeks back, as a parallel.
Hannay -- I was thinking of the database load not on the crawling aspect, as that's been well-established, but rather on storing and keeping these separate indexes for each individual. To some extent it might be like Gmail keeping a mailbox of emails, but I could see this quickly going out of control. Even if you surf 15 or 20 sites a day and add them to your list, that can add up to several thousand in a year. How scalable is it to keep track of the "good sites" people want to search through when each person adds a couple thousand links? Being able to build mini search engines will take a phenomenal amount of computing for the storage and the search itself. The crawling isn't so bad, as that might be done once every 3 or 4 days, tops.
There will be one index in total, not one per user. Only the user preferences will be stored, like the URLs of the trusted sites and so on. Those preferences will be used to tweak the algo that weighs the results when a user performs a search.
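To illustrate that one-shared-index design with a toy example: the sketch below is purely hypothetical (the result list, the preference store, and the function names are all invented for illustration, not anything Yahoo! has described):

```python
# One shared index of (url, snippet) results; per-user trusted-site
# lists are stored separately and applied only at query time.
shared_results = [
    ("http://hobby-site.example/trains", "Model train wiring guide"),
    ("http://spammy.example/trains", "Cheap trains buy now"),
    ("http://museum.example/trains", "Railway museum archives"),
]

user_prefs = {"alice": {"hobby-site.example", "museum.example"}}

def personal_search(user, results):
    """Keep only results whose host is on the user's trusted list."""
    trusted = user_prefs.get(user, set())
    return [(url, text) for url, text in results
            if url.split("/")[2] in trusted]

print(personal_search("alice", shared_results))
```

The point is that only the small preference set is per-user; the heavy index is built and stored once, which is what keeps the storage cost from scaling with the number of users.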
Anyway, this is not going to be a killer. What we need is an engine that is able to guess the user's intent and gives them tools to specify that intent in an intuitive manner.
This Yahoo idea is taken straight from rollyo.com. On the other hand, Rollyo.com uses the Yahoo XML Web Search API to serve its results. It is surprising that that site has an Alexa rank of 11,000. I think Yahoo found a new business opportunity in this idea!
It's installed on our site using the correct tailored code but all searches produce a "No results found page".
I've emailed them to see what the problem is.
I'd like this to work, but if it doesn't it will have to come off our site.
Yes, same here: when you put in your site for "Search my site / this site", it comes up with no results. However, it does come up when you click "web" search. You can put in up to 25 sites, I think, that will come up when "web" search is chosen. However, you have all the Y! ads appearing at the top, side, and bottom.
I am going to have a go with Gigablast and the other one mentioned and see how they do.
off topic but - swicki.eurekster.com is a big pile of tagged up web2.0 SE spam. or am I missing something?
A few folks on this thread have indicated that when they restricted their search to a few sites, it did not yield any results for those sites. There was an issue with the URL formatting when restricting the search to specific domains, which we identified and have fixed.
Basically, there are a few things you should be aware of:
1.) Make sure there are no quotes around the URLs you enter when restricting your search to specific domains.
2.) Make sure there are no commas separating the URLs you enter.
3.) You don't need to add the www prefix when entering a URL; i.e., you are better off simply entering yahoo.com.
If #1 or #2 was an issue for your Search Builder box, it should be fixed and no further action is required, but let us know if this did not do the trick.
Please keep the feedback and questions coming,
-Ariel Seidman (Yahoo! Search Product Management)
Thanks Ariel, ours is working after we removed the http:// from the custom code.
Will Y! offer an option to have the SERP adverts removed for NFP sites?
If these custom search products are going to show advertising, how about sharing the wealth? This would be great if it were integrated with some type of adwords product.
I have decided to use this on one of my newly re-designed content sites.
I have completely customized it. Good one. However, the results page has Yahoo ads, and it would be great if they could share the revenue generated from this with the publisher.
This could be linked with YPN if the publisher already has it, and the shared revenue could be reflected in the YPN account.
Not sure if Yahoo is considering this?
Another suggestion... I guess there will be a search keyword/phrase tag cloud below the search box. Is that so? If not, having a customisable tag cloud displaying the searched terms would be great, with those tags linked to the searches performed.
Yet another suggestion... the ability to customise the entire look and feel of the search results page, at least to match my site's look and feel.
Thanks & Cheers
Does anyone know if you can have the Yahoo results from regional Yahoo such as Yahoo Asia or Australia?
How about a version for Yahoo News...? Can it be done?