Media Contacts

Steve Chaplin
IU Communications
stjchap@iu.edu
812-856-1896


IU informatics professor receives $2.4 million from Homeland Security to keep computer users safe

IU will craft prototype to support security decision-making by general public

FOR IMMEDIATE RELEASE
Dec. 10, 2012

BLOOMINGTON, Ind. -- What to do with that dreaded pop-up warning, "Secure Connection Failed. The certificate is not trusted ..."? Continue anyway, view the security certificate or, tempting fate, add an exception and press forward?

Now an Indiana University Bloomington professor in the School of Informatics and Computing whose research focuses on technology, security and society is helping make such decisions easier for us. L. Jean Camp, whose work has centered on privacy and trust issues in technology, has been awarded more than $2.4 million by the U.S. Department of Homeland Security's Cyber Security Division to give people the information they need to stop a range of attacks.

The IU team led by Camp will focus on developing user-centered security software that reduces cyber-attacks by making sure people have the information they need to support a security decision at the moment they need it. Instead of annual training on how to spot a phishing email, the computer will ask, as you open an attachment, whether you realize it came from outside the company. If you still want to open it, Camp said, "We will then limit the document's ability to change the machine. In contrast, people today are asked about every document, or asked something inscrutable like, 'enable macros?'"
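
To make the idea concrete, such a check might look something like the following sketch (a speculative Python illustration; the company domain, function names and sandboxing hook are invented here and are not the project's actual software):

# Speculative illustration of a contextual attachment check; all names
# here are hypothetical and not part of the CUTS software.

ORG_DOMAIN = "example.com"  # assumed company mail domain

def open_sandboxed(path):
    # Placeholder for opening a file with reduced privileges so the
    # document cannot change the machine.
    print("Opening " + path + " in a restricted sandbox.")

def open_normally(path):
    print("Opening " + path + ".")

def open_attachment(sender, path):
    """Ask only when context warrants it: the sender is outside the company."""
    if not sender.lower().endswith("@" + ORG_DOMAIN):
        answer = input("This attachment is from outside " + ORG_DOMAIN + ". Open anyway? [y/N] ")
        if answer.strip().lower() != "y":
            print("Attachment left unopened.")
            return
        open_sandboxed(path)  # the user chose to proceed, so limit what the file can do
    else:
        open_normally(path)

open_attachment("stranger@elsewhere.org", "invoice.pdf")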

People often fail to see the dangers in simple actions like downloading files, or they disable security functions in the belief that they slow the computer down. Computers also fail to flag pertinent facts at decision time: if you are entering a bank password at a site you have never visited before, it would be nice if the computer made sure you knew you had never been there before.

The computer knows it is not a site with which you have interacted, and with Camp's technology it will finally tell you what you need to know to spot a fraud, at the time you need to know it. The Department of Homeland Security project, called CUTS: Coordinating User and Technical Security, aims to account for these often-overlooked human factors in the contexts where security systems fail, such as banking, Web browsing or working from home. Joining Camp as co-principal investigator on the research is Jim Blythe of the Information Sciences Institute at the University of Southern California.
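
A first-visit check of that kind could be pictured as follows (again a speculative Python sketch; the in-memory history store stands in for whatever record a real browser keeps, and none of these names come from the project):

from urllib.parse import urlparse

# Speculative sketch of a first-visit check before password entry;
# the history store and function names are invented, not CUTS code.

visited_hosts = set()  # stands in for the browser's history record

def note_visit(url):
    visited_hosts.add(urlparse(url).hostname)

def ok_to_enter_password(url):
    """Warn on a first visit; proceed quietly on a known site."""
    host = urlparse(url).hostname
    if host not in visited_hosts:
        print("Warning: you have never entered a password at " + host + " before.")
        return False
    return True

note_visit("https://www.mybank.example/login")
ok_to_enter_password("https://login.mybank-example.test/")  # warns: first visit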

"People reason analogically about security; they infer what they know about security in general and place that into the context of technology," Camp said. "We're looking, at least in part, to design warnings and remedies that use common mental models in developing intelligent and automated resolutions to security issues that today would require excessive user interaction."

By understanding the models people bring to computing, the researchers can address how problems are identified and how information about those problems is communicated. They can then put forth a prototype that knows when and how much to communicate to the user about the problem, automating responses that are intuitive and timely.

"Unfortunately today, user involvement appears to be required too often and usually in terms that non-technical users have difficulty understanding," Camp said. "Security decision-making lacks effective decision support."

People don't want to be warned too often, and the factors at play, such as a user's level of security knowledge and the cost of distraction in making a security decision weighed against the time-dependent value of the advice, have to be identified and built into the models.
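
That trade-off can be pictured as a toy cost-benefit rule (the quantities below are invented for illustration and are not drawn from the project):

# Toy cost-benefit rule for deciding whether to interrupt the user;
# all quantities are invented for illustration.

def should_warn(advice_value, distraction_cost, user_expertise):
    """Warn only when the expected benefit of the advice, discounted by
    how much the user already knows, outweighs the cost of interrupting."""
    expected_benefit = advice_value * (1.0 - user_expertise)
    return expected_benefit > distraction_cost

# A novice at a high-stakes moment gets the warning; an expert does not.
print(should_warn(0.9, 0.2, 0.1))  # True
print(should_warn(0.9, 0.2, 0.9))  # False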

"We'll be designing with simulations of human error in play," Camp said. "When a system fails systematically because of human behavior, then it is the fault of the system, not the human."

For more information or to speak with Camp, please contact Steve Chaplin, IU Communications, at 812-856-1896 or stjchap@iu.edu.