

Security and usability go hand in hand: 5 tips to get the development mix right

Rob Lemos, writer and analyst

The conflict between usability and security can often be seen throughout the workplace: normally locked doors cracked open, passwords written down, and files shared through unsanctioned online services.

Such workplace vignettes highlight that any security that hinders usability or slows performance often makes people less secure. Human nature means that people usually find a way around hurdles that make their work or life difficult, says Don Norman, a professor at the University of California, San Diego, and the director of UCSD's DesignLab.

That is a challenge for developers, because mandates for more security are at war with programmers' desire to make usable applications. Yet, while developers are normally focused on functionality and performance, security, privacy, and usability can be built in at the start and are all required for great software, says Norman, a former vice president at Apple and the author of the book The Design of Everyday Things.

"They are closely related, and they tend to interact in often bad ways, but they are not always at odds with each other," he says. "If you work at thinking about usability and security together, I think you will improve both."

Designing anything, from software to web services to appliances, requires balancing usability, functionality, and security. Many of the problems that we have today are caused because development shops focused on programming to the exclusion of usability and security. Security often is added at the end of development, limiting access to applications and reducing their usability.

"It's hard to get usability and security to work well," says a representative of Silent Circle, maker of the privacy-focused Blackphone 2. "Some of that is of course by necessity. It's easier to not lock your door than to lock your door."

Yet making security, usability, and performance work together requires the right team. Development teams need more than just developers, because it takes more than getting the code right to create a usable, secure application. Therefore, any development team should include a usability specialist and a security specialist, Norman says.

"Programmers are experts at programming; they are not experts of human behavior," he says. "That is what the UX [user experience] community is about. Quite often, what they think is secure is not necessarily so."

To build applications with all three facets—functionality, usability, and security—development teams need to focus on five steps.

1. Start with a security focus

Many developers start with the assumption that users see security as a bother. While security can make applications less usable and slower, designing in protections from the beginning of development allows the software's architect to make the system as user-friendly as possible.

"When designing products, it is recommended that developers consider both aspects [security and usability] without forgoing one over the other, and it all starts with the code," Sabrina and Sonia Burney, two solutions architects for Internet-performance firm Akamai, said in an email interview. "A security issue can in fact result in a performance issue and vice versa, so it is vital that we take both aspects into account, starting with the code and the business needs."

For example, Silent Circle made security a fundamental aspect of the Blackphone from the start. While many smartphones do not require a passcode by default, the Blackphone walks the user through creating a lock code of at least five characters during setup.

"They can change it later, but we start off making the phone secure," the Silent Circle representative says. "Many phones have defaults of no passcode and present people with too many choices. They start with the assumption that people want no passcode and walk them through that, while supporting the few who do."
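As a rough sketch of the secure-by-default setup described above (the function name and constant are hypothetical illustrations, not Silent Circle's code), the idea is that first-run setup refuses to complete until the user supplies a sufficiently long lock code:

```python
MIN_LOCK_CODE_LENGTH = 5  # setup requires at least five characters


def choose_lock_code(candidate: str) -> str:
    """Validate a lock code during first-run setup.

    Unlike a 'no passcode' default, setup cannot finish until the
    user supplies a code that meets the minimum length.
    """
    if len(candidate) < MIN_LOCK_CODE_LENGTH:
        raise ValueError(
            f"Lock code must be at least {MIN_LOCK_CODE_LENGTH} characters"
        )
    return candidate
```

The secure choice is the path of least resistance: the user can still change the code later, but there is no way to opt out during setup.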

2. The simpler, the better

Reducing complexity can also help bolster the triumvirate of performance, usability, and security. Simpler code has fewer vulnerabilities, making it easier to secure. Simpler products are easier for users to understand, making it less likely that a user will do something that causes a security issue.

Amsterdam-based Usabilla, for example, focused on simplicity in its consumer-feedback service—putting in only as much security as necessary and nothing more. By simplifying services and features, the company encourages users to follow the right path rather than take an action that could have security implications.

"Security features are, in a sense, a 'barrier' to something else so it is vital that you remove any additional, unnecessary steps so you don't discourage the user," Usabilla stated in a recent article on user experience and security. "When it comes to security, users don't want to feel confused or lost in the process."

3. Treat different users differently

Not all users are the same. In one version of the Pareto principle, 80 percent of users likely need only 20 percent of the features or data access in a service or application. By giving most users limited access to software or site features, developers can keep their software and data more secure.

In redesigning the user interface for its GoToMyPC service, for example, Citrix used a conversational model to walk users through the creation, and subsequent storage, of an important password. The company's staff UX designer worked alongside its security team to design and implement changes that would help the lion's share of users formulate workable passwords.

In another case, when Citrix noticed that its support costs were rising, the company investigated and found that it could make the account-recovery process simpler and just as secure by automating it and customizing the process depending on the type of account the user was attempting to access.

Focusing on the user can also improve security by separating the UX from the underlying data structure and system architecture. By only giving users the level of access they need, Akta, a software design firm focused on UX and engagement, has made its clients' systems more secure.

"If security measures fail and someone gains access to your servers, you never want your user interface to provide a road map to the data they're seeking," the company stated in a blog post on the security benefit of good UX design.
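The least-privilege idea running through these examples can be sketched as a deny-by-default role map (the roles and actions below are hypothetical illustrations, not Citrix's or Akta's code): most users get only the small slice of capabilities they actually need, and anything not explicitly permitted is refused.

```python
from enum import Enum, auto


class Role(Enum):
    VIEWER = auto()  # the ~80% of users who only read data
    EDITOR = auto()
    ADMIN = auto()


# Each role maps to only the actions it needs, nothing more.
PERMITTED = {
    Role.VIEWER: {"read"},
    Role.EDITOR: {"read", "write"},
    Role.ADMIN: {"read", "write", "configure"},
}


def authorize(role: Role, action: str) -> bool:
    """Deny by default: unknown roles or actions are refused."""
    return action in PERMITTED.get(role, set())
```

Because the check falls through to an empty set, adding a new role or mistyping an action fails closed rather than open, which is the safer default when the UI and the data layer are kept separate.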

4. Test to determine impact

Developers also need to test their features and observe users to spot potential security issues that might not otherwise be caught.

Permissions are one area where testing and experience have made a difference, says Matt Bridges, chief technology officer of design firm Intrepid Pursuits. Android developers originally structured applications to request all permissions at installation. When users began questioning the necessity of those permissions, developers needed to minimize the requests, but they also found that users could break functionality by denying necessary permissions.

"The best practice now is to wait to ask for a permission until it is obvious why the app needs it," Bridges says. "Until they do something that makes it obvious."
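Android's runtime-permission APIs live in Java or Kotlin; as a language-neutral sketch of the ask-at-point-of-use pattern Bridges describes (all names here are hypothetical, and the prompt is a stand-in that simply grants), the request is deferred until the user takes the action that makes the reason obvious:

```python
granted: set[str] = set()  # permissions the user has already approved


def request_permission(name: str) -> bool:
    """Stand-in for a platform permission prompt; assume the user grants."""
    granted.add(name)
    return True


def share_photo(path: str) -> str:
    # Ask for photo access only at the moment the user shares a photo,
    # when the reason for the request is obvious to them, instead of
    # front-loading every permission at install time.
    if "read_photos" not in granted and not request_permission("read_photos"):
        return "sharing unavailable without photo access"
    return f"shared {path}"
```

Deferring the prompt also means a denial degrades one feature gracefully instead of breaking the whole app at first launch.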

Such observation has resulted in changes to how passwords are handled on different types of systems, according to UCSD's Norman. While the requirement of uppercase, lowercase, numbers, and symbols may be fine for passwords created on a laptop or desktop system, the reality of a small, virtual keyboard makes such requirements needlessly complex.

"If you are typing on a screen keyboard, some of these systems make it difficult to go back and forth," he says. "I go out of my way to create passwords that do not require me to do any shifting."

5. Often, there's no right answer

For developers focused on performance and functionality, balancing usability and security is not easy, especially because there often are no set answers.

Intrepid Pursuits, for example, had an application rejected from the Apple App Store because the company had focused on security to an extent that Apple did not agree with. The application, which uses the software development kit for the music service Spotify, had originally been programmed to send users to the Safari browser to log in to Spotify's service. By passing control to a known application for login, the developers put security front and center. Apple, however, requires that companies use a native web view, which keeps users in the context of the original app but could allow a malicious app to sniff the user's login credentials.

In the end, the incident underscores that security counts for little if it results in software no one uses, Bridges says.

"I don't know that there is a silver bullet," he says. "Every six months something comes up causing us to think, 'Hey, we probably should not be doing it that way.'"
