A penny for your thoughts,
a latte for your password
"Pushing the Envelope" from ACM <interactions>, January/February 2006
We probably all know people who juggle 30 or more user ID/password combinations, one set for each application or server they need access to. Some they use every day; some they use only occasionally. Some get changed more often than they get used. Some people keep their passwords in their notebooks, some in desk drawers, some in password-protected spreadsheets. No one tries to remember them all. Each method of tracking IDs and passwords represents some sort of security compromise.
Security is full of compromises. There would be no security problems if our computers held no sensitive information, or if no one else had access to those computers, or if our computers could not do things that we don't want them to do, or if they weren't attached to the Internet. But we have security problems, many of them.
If you search the literature on HCI and security, you might think that the biggest problem is making passwords usable, yet unique and indecipherable. That's hardly the case: security is much more than identification, authentication, and authorization. HCI must surely have a hand in system security, or security becomes an exercise in spending money without results. A recent volume, Security and Usability: Designing Secure Systems That People Can Use (2), culls contributions from a range of disciplines, but still asks more questions than it answers.
Just scanning the table of contents of Bruce Schneier's Beyond Fear: Thinking Sensibly About Security in an Uncertain World (3) provides a sense of the scope of security issues:
- All Security Involves Trade-offs
- Security Trade-offs Are Subjective
- Attackers Never Change Their Tunes, Just Their Instruments
- Technology Creates Security Imbalances
- Security Is a Weakest-Link Problem
- Security Revolves Around People
These chapter titles represent security principles, basic facts that we can put to use when designing security for computer systems. They should remind us that security involves compromise; that attackers are at least as smart as protectors; and that, ultimately, security is a people problem, not a technology problem.
All security systems depend on humans, and humans are fallible, gullible, bribable, emotional, and greedy. The human factor can be more crucial than the computer. How do usability and other HCI factors mitigate temptation? One of the most infamous computer hackers, Kevin Mitnick, relied more on human factors than on technological skill to gain unauthorized access to systems. Mitnick used what he terms "social engineering" to convince authorized users to let him past security. In an interview published on CNN.com (1), Mitnick says:
A company can spend hundreds of thousands of dollars on firewalls, detection systems and encryption and other security technologies, but if an attacker can call one trusted person within the company, and that person complies, and the attacker gets in, then all that money spent on technology is essentially wasted. It's essentially meaningless.
Does it really matter how usable yet uncrackable we make our passwords if we're willing to just give them away? Television news crews have convinced strangers to reveal their user names and passwords in exchange for a latte. That's a pretty cheap bribe.
We can and do repeatedly say that education is the key, that training users about security issues and password requirements and access protocols will make systems more secure. But many users lack the time, motivation, or background to make training effective. What about better embedded assistance, such as putting password requirements (length, alphanumeric combinations) right next to the input text field? That might help, but it won't make firewall configuration easier for someone who doesn't know what a firewall is or why they should have one. Did your last computer come with a software firewall installed? And was it turned off by default? How does HCI fix that?
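As a rough illustration of that kind of embedded assistance, the sketch below checks a password against an explicit list of requirements so that each one can be displayed, and marked as satisfied, right beside the input field. It is written in TypeScript, and the particular rules (a minimum length and a letter-plus-digit mix) are hypothetical placeholders, not recommendations from this column.

```typescript
// A minimal sketch of embedded assistance: evaluate a password against
// explicit rules so a UI can show each requirement next to the input field
// and mark it as met while the user types. The rule set is hypothetical.

interface PasswordRule {
  description: string;                  // text shown beside the input field
  test: (password: string) => boolean;  // does the password satisfy the rule?
}

const rules: PasswordRule[] = [
  { description: "At least 8 characters", test: (p) => p.length >= 8 },
  {
    description: "Contains both a letter and a digit",
    test: (p) => /[A-Za-z]/.test(p) && /[0-9]/.test(p),
  },
];

// Returns every rule with a flag saying whether the current input meets it.
function checkPassword(password: string): { description: string; satisfied: boolean }[] {
  return rules.map((rule) => ({
    description: rule.description,
    satisfied: rule.test(password),
  }));
}

// Example: which requirements does "latte" meet? (Neither, as it happens.)
console.log(checkPassword("latte"));
```

The point is not the rules themselves but where they appear: next to the field, at the moment the user needs them, rather than buried in a policy document.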
The real challenge to HCI is to assist in identifying and prioritizing the true threats and risks, and to provide the means to mitigate those risks. If everything is so important that it requires the same level of protection, then nothing has higher or lower priority, and users become numb and careless. The costs to the human element must be addressed, or we risk losing users' attention. Not all risks are the same: the likelihood of death by asteroid is far less than that of death by auto accident, and the risk of loss of life is not equivalent to the potential loss of business advantage.
Where are we with HCI and security? Where is the work to be done: in research, or in practice? Do the answers to security questions lie in user education, better technological applications such as biometrics, or smarter encryption? Add your voice to the discussion in ACM <interactions>. The May/June 2006 issue will look at HCI and security; if you would like to contribute, please contact the editors-in-chief at eic@interactions.acm.org with your proposal.
References:
1. CNN.com, A convicted hacker debunks some myths. www.cnn.com/2005/TECH/internet/10/07/kevin.mitnick.cnna/index.html, accessed October 11, 2005.
2. Cranor, L.F. & Garfinkel, S., Security and Usability: Designing Secure Systems That People Can Use. O'Reilly Media Inc., Sebastopol, 2005.
3. Schneier, B., Beyond Fear: Thinking Sensibly About Security in an Uncertain World. Copernicus Books, New York, 2003.
About the Author
Fred Sampson is a co-chair of BayDUX, a member of SIGCHI, and a senior member of STC. In his spare time, Fred works as an information developer at IBM's Silicon Valley Lab in San Jose, California. Contact him at wfreds@acm.org.
Copyright Notice
© ACM 2006. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM <interactions>, Volume XIII.1, ISSN 1072-5520, (January/February 2006), http://doi.acm.org/10.1145/1109069.1109077.