
Forum Moderators: martinibuster

Yahoo! has launched a new Slurp

   
3:47 pm on Jul 29, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Yahoo! announced on their blog [ysearchblog.com] that they've launched a new Slurp! that spiders less but is more efficient.

Quote : "you may see some shuffling of the pages that are included in the index and some changes in ranking as well."

Translation : Hang on... another Yahoo!-YoYo ride coming...

[edited by: martinibuster at 8:09 pm (utc) on July 29, 2006]
[edit reason] Added link to YBlog. [/edit]

8:02 pm on Jul 29, 2006 (gmt 0)

WebmasterWorld Senior Member crobb305 is a WebmasterWorld Top Contributor of All Time 10+ Year Member




System: The following message was spliced on to this thread from: http://www.webmasterworld.com/yahoo_search/3027430.htm [webmasterworld.com] by martinibuster - 12:05 pm on July 29, 2006 (utc -8)


A weather report was posted on the ysearchblog about the new Yahoo crawler and a SERP update.

The main thing I notice is that "banned" sites are still banned, despite improvements in quality of content, and that many sites whose internal pages were indexed as URL-only have been reduced to homepage-only (site:example.com).

8:13 pm on Jul 29, 2006 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Good riddance to old Slurp. A lot of robots.txt files may get lonely now without being visited three times for every real page indexed.

Hopefully the new bot will make Yahoo at least mildly competitive.

11:17 pm on Jul 29, 2006 (gmt 0)

10+ Year Member



A real improvement, or a PR (not PageRank) move?

I guess time will tell. But I doubt most Y! shareholders would even understand that "weather report", much less put any stock in it (pun intended).

I haven't seen any movement in my market yet, anyone seeing anything?

12:11 pm on Jul 30, 2006 (gmt 0)

10+ Year Member




From Y! blog: a 25% decrease in the number of requests....

I guess I was not the only one to complain that Yahoo kept crawling sites like crazy, totally blind to any relevance!

Good that shareholders (maybe) are starting to put pressure on all these flakes who can't build a relevant algorithm and who waste their time, ours, their money and our bandwidth.

Now if they could explain to me why they need dozens of different referrers to crawl a website, I would love to know the answer - because it obviously doesn't make their algo better, but it does scr@w up my webstats! 50% of visits are actually their bots in some disguise.

...
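For webmasters who wanted to slow Slurp down rather than wait for Yahoo to fix it, Slurp did honor the non-standard Crawl-delay directive in robots.txt. A minimal sketch (the 5-second value is just an example, not a recommendation):

```
User-agent: Slurp
Crawl-delay: 5
```

This only throttles request frequency; it does nothing about the number of distinct user-agents showing up in the stats.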

7:31 pm on Jul 30, 2006 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I hope they use 10 times as much bandwidth as they have. I hardly think "crawling too much" has been a problem for .0001% of the sites out there. Yahoo needs to crawl much harder and more efficiently than it has. Years after launching their search engine, I don't know of a single 1000-page site that is fully and correctly indexed.
3:37 pm on Jul 31, 2006 (gmt 0)

10+ Year Member



So basically this is exactly what Google did, only three months later?
8:29 pm on Jul 31, 2006 (gmt 0)

5+ Year Member



These idiots (sorry, a little frustrated...) finally did something about their crawler. Yahoo! It's been a few months now that I've been complaining to them that their crawler was crashing one of my sites.

Every night - like clockwork at 2:15am PST - they would unleash hundreds of crawlers, all downloading at the same time. So instead of the normal ~100 Apache child processes, I would see ~300, 400, 500, 1000 within minutes, and then the server would crash. I sent them my logs and politely explained that it's not very convenient for me to go to the server room in the middle of the night - every night...

Anyway, I ended up applying mod_evasive to them, so they are still doing this but are limited to ~200 connections, which the server can bear.
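The poster doesn't show their configuration; as a rough sketch of the kind of mod_evasive setup that throttles an aggressive crawler (all threshold values here are illustrative, not the poster's):

```apache
<IfModule mod_evasive20.c>
    # Size of the per-child hash table tracking client IPs
    DOSHashTableSize    3097
    # Block an IP requesting the same page more than 5 times per 1-second interval
    DOSPageCount        5
    DOSPageInterval     1
    # Block an IP making more than 100 requests of any kind per 1-second interval
    DOSSiteCount        100
    DOSSiteInterval     1
    # How long (seconds) a blocked IP keeps receiving 403 responses
    DOSBlockingPeriod   10
</IfModule>
```

Note that mod_evasive rate-limits per client IP rather than capping total connections; a hard ceiling like the ~200 concurrent connections mentioned above would come from Apache's own MaxClients setting.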
10:35 pm on Jul 31, 2006 (gmt 0)

10+ Year Member



Is there a useragent change on this as well?
1:25 am on Aug 1, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



One hour into 01 August 2006 (GMT) Slurp has only hit me 13 times compared to GoogleBot's 150. Before the new Slurp, Yahoo would be at the top of the list with a massive hit-count. Accuracy is more important than quantity for me.
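Hit counts like these can be pulled straight from an access log by counting user-agent matches. A minimal sketch against a made-up combined-format log (the log lines, filenames, and counts below are illustrative, not real data):

```shell
# Create a tiny sample access log in Apache "combined" format,
# where the user-agent is the final quoted field.
cat > access.log <<'EOF'
1.2.3.4 - - [01/Aug/2006:00:05:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)"
5.6.7.8 - - [01/Aug/2006:00:06:00 +0000] "GET /page HTTP/1.1" 200 1024 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
5.6.7.8 - - [01/Aug/2006:00:07:00 +0000] "GET /other HTTP/1.1" 200 2048 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
EOF

# Count lines matching each crawler's user-agent string.
grep -c "Yahoo! Slurp" access.log   # -> 1
grep -c "Googlebot" access.log      # -> 2
```

The same two greps against a real day's log give the kind of Slurp-vs-GoogleBot comparison quoted above.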
5:52 am on Aug 1, 2006 (gmt 0)

5+ Year Member



I don't think changing the crawler will help webmasters. Before this update I noticed relevant and spam-free results on Yahoo, but after this update spam sites are coming out on top. In a few industries, free sites or blogs are coming out on top.

If they really want to make a change, then they should remove spam sites and show good, spam-free results.

8:50 pm on Aug 1, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



followgreg, indeed I have seen many complaints about Yahoo bots crawling like crazy lately, and it's unlikely this new "bot" will solve that issue.
10:59 pm on Aug 1, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



One of my sites has had 422 different Ink bots in today so far. I don't mind, Y! referrals are up today as well.
3:22 pm on Aug 4, 2006 (gmt 0)

5+ Year Member



well it's about time. now hopefully the serps will improve. (not holding my breath.)
1:25 am on Aug 5, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Update : well, Y!'s Slurp has started hitting like crazy again. Worse still, they've replaced my homepage in the serps with an internal page. I don't know why... I have not done ANYTHING to the homepage or this internal page, so it's not me!

Yahoo! has gone !oohaY

5:00 pm on Aug 5, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It is beginning to look like the summer of search engine nightmares. Not only are we having to deal with the Google dance from hades, but now Yahoo has decided to tag team us. Calgon take me away!
6:35 am on Aug 11, 2006 (gmt 0)

10+ Year Member



Ever since the bot change we have seen Slurp basically hit the robots.txt file and a select few pages over and over again each day. We have changed NOTHING recently. Every once in a while it will crawl an extra page or two, but it sticks with the same basic pages for who knows what reason.

Anyone else see this happen and know what is the cause or what can be done?

 
