Most of those links are in very targeted areas - blogs, other forums, etc. - not in useless links pages.
I am also reminded of a rather niche Yahoo group I belong to that has been plugging along for a full decade with a membership of a couple of thousand. It requires registration, and nothing in it appears in any SE. It exists solely on word of mouth.
We'll see how things shake out. Good luck Brett...
Yes, but, speaking for myself, I came here initially because I started noticing that I was ending up here on various searches routinely enough that it brought WebmasterWorld up on my radar.
Good luck with this change. Almost every time I've done a search on Google the past year for anything webmaster related, some WW thread comes up on the first page. I know I would never have found this place without that first Google search. Probably the best way to judge whether this new robots.txt deal is hurting or not would be to keep tabs on the average number of new forum members that join in the next few months and compare it to the same months last year. If you see a drop, you might have made a mistake, assuming you want to continue to draw new folks to the forum.
In reading this thread I can't help but wonder how eBay and Amazon avoid the bot problem, assuming it's as much a threat to them as it is to WebmasterWorld?
Also, as far as bandwidth use goes, Yahoo! has something in place that prevents downloading/archiving too much from any IP (once exceeded, Yahoo! returns a "Yahoo Error Unable to process request at this time -- error 999" page which can last several hours to a day). Maybe something like that could be installed.
Perhaps consulting with the IT staff of some of the major portals is something you've already done; if not, maybe there are some good ideas to be found there? Just a thought...
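A per-IP throttle like the Yahoo "error 999" behavior described above can be sketched in a few lines. This is a minimal in-memory sliding-window limiter, assuming a single server process; the window, limit, ban length, and the 999 status itself are illustrative, not Yahoo's actual values:

```python
import time
from collections import defaultdict, deque

WINDOW = 60          # seconds per counting window (illustrative)
LIMIT = 100          # max requests per IP per window (illustrative)
BAN_SECONDS = 3600   # how long to keep serving the "999"-style error

hits = defaultdict(deque)   # ip -> timestamps of recent requests
banned_until = {}           # ip -> unix time the ban expires

def check_request(ip, now=None):
    """Return an HTTP-ish status: 200 if allowed, 999 if throttled."""
    now = time.time() if now is None else now
    if banned_until.get(ip, 0) > now:
        return 999
    q = hits[ip]
    q.append(now)
    # drop hits that fell out of the counting window
    while q and q[0] <= now - WINDOW:
        q.popleft()
    if len(q) > LIMIT:
        banned_until[ip] = now + BAN_SECONDS
        return 999
    return 200
```

In practice you'd want this state shared across web servers (e.g. in memcached or the database) rather than per-process, but the decision logic is the same.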
It must be feeling great not to mention very inspiring.
G in particular: if they can't properly index and rank, why allow them to waste so much bandwidth and then penalise left and right for absolutely no reason other than the mood and lack of proper programming knowledge of their clueless PhDs?
I love it...and will probably follow suit with some of my web sites.
Going to allow only Yahoo and MSN from this point on. They are the only ones that send some traffic anyway... why allow the big new crappy G to slow down my sites and inflate their misleading page count for no reason?
No, that'd be skipfactorbot. ;)
Will the buzz create more traffic/membership than the SEs? Will it last? Will Brett do a MSM TV interview? This is going to be better than an old school update thread. Anyway, it's Thanksgiving in the US, and I'm thankful for WW in any color or flavor. If Google can sandbox 'newbies' why can't Webmasters? ...flipping through the iPod to find that AC/DC "Big Balls..." song
[edited by: eelixduppy at 7:53 pm (utc) on Feb. 18, 2009]
As you know, Brett's been hinting at this. Funny, the wild stuff I've read here that I thought insane at the time has had a way of becoming quite clear after a few years. Site down for a week, big move, holidays, what the heck? However, I'm predicting a new robots.txt on 01-Jan. :)
Before, there was a site search, which was okay, but you replaced that with a Google site search; now that's removed too, and we can't search for anything! Please bring it back.
In terms of your search numbers, I don't use search here because I can't be bothered to figure out how to run a search. How bizarre is that?
The fact of the matter is that the community doesn't care about 'why'. They just want you to make it work.
I'm sounding like I'm attacking, let me apologize in advance for that - I don't want to do that. I do want to suggest succinctly and pointedly that there's a number of community members unhappy about what's going on, and from here it tastes and feels like they're being brushed off. Switch it around and look at it from the customer's viewpoint not the internal workings of your org.
That said, you run a great community forum here and I wish you continued success.
Just did a quick experiment and was able to cut a WW page down to 49k from 82k. That's a ~40% savings right there. Even more savings are possible, IMHO.
I know this issue has been discussed before but the explanation has never been clear to me.
At any rate I wish nothing but the best for the outcome of this 'grand experiment'.
nah - I know several people that did that - most of them before I did, actually. Did you really think no one would? This simply provided a wonderful experiment to test that removal tool. I'd wager that the engines hit that robots file daily at least - probably more.
If that's the case then should submission to the removal tool be any faster than just regular crawling of the robots.txt?
WebmasterWorld and the other BS aside, this has actually been quite an interesting couple of days, seeing in real time how a major change to a major site really pans out.
I guarantee this has gotten WebmasterWorld more free press than anything to date. Well done BT :)
Have you considered "cloaking" your robots.txt?
You can force robots.txt to be a PHP file and run all sorts of interesting queries on a MySQL DB before you spit the output out.
Accepts cookies: send it a disallow all.
Comes from bad ip range: send it a disallow all.
Comes from a legit G/Y/M DC: send it a proper robots.txt
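The decision logic above can be sketched like this (in Python rather than PHP, purely for illustration). The IP ranges are placeholders, not real crawler ranges, and a real deployment would verify crawler IPs with reverse/forward DNS rather than a static list:

```python
import ipaddress

# Placeholder ranges for illustration only; real G/Y/M crawler ranges
# differ and should be confirmed with reverse/forward DNS lookups.
KNOWN_CRAWLER_NETS = [ipaddress.ip_network("66.249.64.0/19")]
BAD_NETS = [ipaddress.ip_network("203.0.113.0/24")]

DISALLOW_ALL = "User-agent: *\nDisallow: /\n"
NORMAL = "User-agent: *\nDisallow: /cgi-bin/\n"

def robots_txt(ip, has_cookies):
    """Pick which robots.txt body to serve for this visitor."""
    addr = ipaddress.ip_address(ip)
    if has_cookies:                            # real crawlers don't send cookies
        return DISALLOW_ALL
    if any(addr in net for net in BAD_NETS):   # known-bad IP range
        return DISALLOW_ALL
    if any(addr in net for net in KNOWN_CRAWLER_NETS):
        return NORMAL                          # legit search-engine DC
    return DISALLOW_ALL                        # default-deny everything else
```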
Another option might be to log to mysql and have a cron job that queries the IP addresses every x minutes and reconfigs your iptable / firewall / deny lists.
(yeah yeah I know you also need to run a business... but I'm sure the geek inside you is yelling to find a more elegant solution!)
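The cron-job idea could look something like this: tally requests per IP (here a plain list stands in for the MySQL query result) and emit the iptables commands a wrapper script would run. The threshold and table shape are assumptions for the sketch:

```python
THRESHOLD = 1000  # requests per interval before an IP gets blocked (illustrative)

def block_commands(ip_counts):
    """Given (ip, request_count) rows, return iptables commands
    for every IP that exceeded the threshold."""
    return [f"iptables -A INPUT -s {ip} -j DROP"
            for ip, count in ip_counts
            if count > THRESHOLD]
```

A cron entry would run this every x minutes against the logged hits and pipe the output to a shell, ideally checking first (e.g. with `iptables -C`) that the rule isn't already present.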
I wonder if some people are just ticked that their profile will no longer pass PR? So it comes back to their inability to game Google through WebmasterWorld. Interesting... this must be what G feels like during an update. lol
That doesn't make any sense whatsoever.
If you really felt that way then you would have removed the profile URLs. God knows URLs aren't allowed anywhere else, so why make an exception there?
Smoke and Mirrors.
OTOH, given a large database-driven site, and given that the DB may be backed up daily, couldn't an incremental-backup-type thingy be fed to the spiders at a regular interval? Obviously not as things stand now, but this problem will get bigger rather than smaller. What are the SEs doing to minimize this problem?
Maybe Brett's moving into PHASE II of the Webmaster World monetization plan.
Dump all the cached pages from all search engines, actually write an in-house search that works, make the in-house search available only to the Supporters Forum and then resubmit only a few select top level pages to the search engines.
I was thinking it might involve technology like Yahoo Subscriptions, or whatever similar technology Google has patented. Maybe it's the next module in BestBBS... ie, the next after site search.
I use search on this site so much that I've got a WebmasterWorld Google search box built into my custom code at the top of this page and also on my start page on my browser.
It's going to be an interesting month or two or three. I may have to start reading books again. ;)
why not implement an external CSS file instead of all the inline font and other styling tags this board has?
Remember, this is just Webmaster World, not Good Webmaster World!
That would certainly be a good way to trim a little fat (and save a bucket load of bandwidth). 1kb per page isn't much, but it's loads when you multiply it up!
On the subject of banning the bots, I think that word of mouth is the best form of marketing, so I'm sure that there will be a steady influx of 'better quality' posters. However, I disagree that site search should be a fairly low priority.
There is too much good information tucked away too deep in the site not to provide a top notch search solution.
Maybe Brett has bought UKWizz and is going to make a webmaster specific search engine...
I don't know at what point a forum will reach critical mass and continue to grow and prosper without organic search traffic or paid referrals.... A forum without new members will wither and die.
This assumes that SEs are the only suppliers of new members. That could be far from the truth.
The issue of CSS has come up before. And has been discussed extensively together with samples, size comparisons etc. If you haven't read the old threads on this you just need to bide your time till Brett's search is functional. Besides, going CSS is not attacking the main cause of the problem.
The issue of WW search is also something that has come up several times and been discussed to death.
Maybe Brett can now do a 26 steps to ridding yourself of the SE addiction. I'd really like to see something like that :)
PR move: Let's see, if I valued search traffic would it be worth giving up search traffic and banning the bots completely for two days worth of links from blogs? Because, two days is all it'll be. Tomorrow, someone will blow his nose and the blogs will move to cover that.
[edited by: oddsod at 10:32 am (utc) on Nov. 24, 2005]
It's not - system load is.
> why not implement an external css file
We will at some point in the next year. However - no one is looking forward to the inevitable compatibility problems. We have about 100 users that regularly surf WebmasterWorld with various cell phones.
> cut a WW page down to 49k from 82k.
Our last test cut down about 10k at best. You can realistically eliminate the font tags - and that's about it. Then you have to load the page up on divs/classes and you are slowly creeping back into the same size area as before. It would certainly help on repeat page views.
> My view is, if you have five children and four of them are costing you
Nice analogy. Here is another one: you have four kids, and the uninvited neighbor kids come over and stop your kids from going to school. Sooner or later, you are going to kick the neighbors out of the house and build a fence.
> Brett has a brand
The site is here for the members - and will long outlive me.