Forum Moderators: bakedjake
It has been reported that Grub does not sufficiently secure sensitive information. Because of this, an attacker may be able to gain unauthorized access to Grub user credentials.
Eww. Glad I didn't download that.
Cheers, baby!
;)
Funny how you say all that without even doing proper research.
Details both the problem and the solution. [securityfocus.com]
It has been reported that this issue is resolved in version 1.4.3 of the Grub client.
From here [securityfocus.com].
Cheers.
Did legal remove the restriction on your posting here as you mentioned before...?
And, fyi, a lot of us got a note from another bloke over at Look regarding your robots.txt handling...recently ;)
So, I'll believe it's fixed when we go, say, 3 months here @ WebmasterWorld without somebody saying your bot triggered their spider trap.
Restrictions are the same -- I will always keep posts to items that are non-material/public info.
Re: the robots.txt challenge, I'm excited for us to live up to it. I'll take a peek around to see what's going on with the other robots.txt handling issue. Quite a few folks have reported problems on our own boards, but a fair number of them (about 75%) turn out to be issues like "I banned your robot from my site... oops, I banned you in robots.txt too", or "Here's my site... oops, my hosting service forwards robots.txt requests to their own robots.txt", or really bad syntax problems (as your own analysis has shown).
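For illustration, the misconfigurations described above might look something like the sketch below. This is a hypothetical robots.txt, and the user-agent string is an assumption — check the crawler's own documentation for the exact token it matches:

```
# Hypothetical robots.txt showing the failure modes described above.

# Case 1: the site owner blocks the crawler here, then forgets
# they did so and reports the bot as "banned" elsewhere.
User-agent: grub-client
Disallow: /

# Case 2 isn't visible in the file itself: the hosting service
# answers requests for /robots.txt with its own file, so the
# rules the site owner wrote are never served to crawlers at all.

# Case 3: broken syntax that crawlers will ignore or misread.
User-agent grub-client    # missing colon, so this record is invalid
Disallow *.gif            # missing colon, and wildcards were not in the original standard
```

Crawlers differ in how leniently they parse malformed records, which is why bad syntax so often shows up as "the bot ignored my robots.txt."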
That's why we think the best solution is absolute transparency in the operation of Grub. If you've got a question about what's going on with robots.txt, we should just show you via the system. It's taking a little time to get the "Cadillac" version in place, but I hope you'll all be pleased.
Cheers,
Andre