If something is "scalable", it means that no matter how intense the requirements get, the product/service/technology/whatever will be able to satisfy them.
If something is "robust", it means that it works reliably across a variety of different working environments.
What's the word that means "helps to ensure that humans don't accidentally break something"?
Example: You didn't give an employee root FTP access to your web server, not because you don't trust them, but because withholding it increases the [word goes here!] of the system.
It's like security...but security implies that you're trying to keep bad guys out. What's the one that means that you're trying to stop good guys from breaking something?
(This is something that keeps coming up at meetings, and I have no word to explain what I'm talking about.)
"Allowing all employees to have root access increases our accidental-misconfiguration exposure."
"This access restriction is necessary to reduce our employee-error susceptibility."
"We don't allow shell access, in order to limit user-error risk."
Or in more informal terms, "access controls help us idiot-proof the system." :)
But we have a problem fitting it into the right part of speech.
If it matters, a search turns up plenty of uses of the term "idiot-proofness", though I doubt you will find it appearing in a dictionary any time soon.
The field of study that deals with this would be human factors engineering.
(This example isn't meant to be judgmental or ageist in any way.)
A fool is like a baby who could push or sit on any button without even realizing it's a button.
An idiot is like a 100-year-old who has never seen the button before, or might be afraid to push it, or might not realize what happens when it is pushed.
Either way, fool-proofing and idiot-proofing are similar exercises: securing the button and, in some cases, ignoring the button push.
The closest I could ever come up with is "invulnerable to user error".
You didn't give an employee root FTP access to your web server, not because you don't trust them, but because withholding it decreases vulnerability to user error.
Here's a carpentry example: You use pushsticks instead of your hands because there is a (small) chance that your hand could slip.
Another tech example: We changed the way we update our web site. It's pretty high traffic, so it can't be down for even one second. So, when we want to update it, we first copy all the files to a "development" web site on our web server. Once we're confident that the development site is error-free, we flip a switch, which makes the development site the live site.
If we just uploaded files to the live site via FTP, our connection could blow up halfway through the transfer, and customers would see a half-updated (and probably broken) web site.
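The "flip a switch" step above can be sketched in code. One common way to do it (an assumption on my part; the poster doesn't say how their switch works) is to keep each release in its own directory and have the web server serve through a symlink, which is re-pointed atomically so visitors never see a half-updated site:

```python
# Hypothetical sketch of an atomic dev-to-live switch: the web server's
# document root is a symlink named "live"; publishing means re-pointing
# that symlink at the fully-uploaded release directory in one step.
import os

def activate_release(release_dir: str, live_link: str = "live") -> None:
    """Atomically point `live_link` at `release_dir`."""
    tmp = live_link + ".new"
    if os.path.lexists(tmp):
        os.remove(tmp)  # clean up a leftover from an aborted switch
    os.symlink(release_dir, tmp)
    # os.replace renames over the old link atomically on POSIX, so the
    # site is never "down" or half-updated during the switch.
    os.replace(tmp, live_link)
```

With this layout, uploading to `release_v2/` can fail halfway with no harm done; only the final `activate_release("release_v2")` changes what customers see.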
So that is the same concept, but doesn't involve idiots or fools. The terms that keep going through my head are:
But...those only describe 80% of what I mean...UH!
What's the word?!?
This is driving me nuts.
I need one single word that describes this concept. Anyone who has read "Made to Stick" will understand why. The person who comes up with it can be awarded the status of coining a new industry buzzword :)
A word contrived after a gorilla named Bokito escaped from his enclosure in Rotterdam zoo last year and went on the rampage. From Wikipedia:
The word "Bokitoproof", meaning "durable enough to resist the actions of an enraged gorilla", and by extension "durable enough to resist the actions of a non-specific extreme situation", was voted the Dutch language "Woord van het jaar" (Word of the Year) for 2007.
a simple web-related example regarding form input:
- fault tolerance could be returning a useful error message instead of 500 internal server error if someone submits a form without any input.
- what you are looking for is a term describing what prevents malicious or accidental script injection using that form.
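The two bullets above can be shown side by side in a minimal sketch (my own example, not from the thread): a hypothetical comment-form handler that is fault-tolerant about empty input and also escapes whatever the user typed so that markup, accidental or malicious, is rendered as inert text:

```python
# Contrast of the two ideas on a hypothetical comment form.
import html

def handle_comment(raw: str) -> str:
    if not raw.strip():
        # Fault tolerance: a useful message instead of a 500 error.
        return "Error: please enter a comment before submitting."
    # Sanitization: html.escape turns <script> into &lt;script&gt;,
    # preventing accidental or malicious script injection.
    return "<p>" + html.escape(raw) + "</p>"
```

The first branch is graceful failure; the second is the preventive measure the poster is asking a name for.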
Back in the early days of the 8086, every MS-DOS application could write to every memory location. To prevent this, the 80286 processor was given logic that grants some applications access to certain memory locations but not others. Applications using this logic were said to run in "protected mode". It gives applications access to their own resources, but nothing beyond their assigned rights. That is exactly the same as your situation.
In a protected environment, even if the software application/user goes mad, it won't compromise the integrity of the system, because integrity is controlled at a higher level than the one where the application/user has control rights.
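The protected-mode analogy can be made concrete with a toy sketch (my own illustration, nothing like real 80286 segmentation logic): each "application" is granted a memory range, and the enforcement lives above the application's level of control, so even a misbehaving app cannot touch anything outside its assigned rights:

```python
# Toy model of protection enforced at a higher level than the app.
class ProtectedMemory:
    def __init__(self, size: int):
        self._cells = [0] * size
        self._grants = {}  # app name -> (start, end) half-open range

    def grant(self, app: str, start: int, end: int) -> None:
        """Assign an app rights to addresses start..end-1."""
        self._grants[app] = (start, end)

    def write(self, app: str, addr: int, value: int) -> None:
        start, end = self._grants.get(app, (0, 0))
        if not (start <= addr < end):
            # The app "goes mad": the supervisor refuses the access,
            # so system integrity is preserved regardless.
            raise PermissionError(f"{app} may not write address {addr}")
        self._cells[addr] = value
```

The app never sees or modifies `_grants`; that table is the "higher level" the post describes, just as an employee without root access can't break the server no matter what they do in their own directory.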