By Mary Beth Rettger, MathWorks
"Usability" defines a quality of the product: a usable product allows users to focus on their work, instead of the tool that they are using. User-centered design techniques attempt to bring the user's perspective to the development process from the earliest stages of design. At MathWorks, we applied several user-centered design techniques throughout the MATLAB® 6 development cycle to help us understand what our users needed and ensure that what we were building actually met those needs.
A commitment to listening to customers is a great first step for any company that wants to understand its users better. This commitment needs to be matched with more formal data collection techniques that capture what customers say and turn it into something actionable.
One failure of many development efforts is to rely solely on what users say they want. Asking users for lists of features or fixes they want added to the next product is a reasonable start, but it is incomplete. Users tend to be experts at their jobs (designing radar systems or control systems), but less skilled at our job (designing software), so relying on their feature descriptions or high-level specifications can be problematic. These descriptions give us an idea of where to start, but they don't help us create the optimal solution.
What's much more effective is to learn about users' work—preferably by watching them work—or at least by interviewing them in their environments, where they can easily show us examples of what they are trying to accomplish. This process is called contextual interviewing. Rather than just collecting a laundry list of new features users want, we can have users show us how they hope to use a new feature, or explain what problems they are trying to solve with the current product. This way our software designers can gain a thorough understanding of the problem and come up with targeted solutions that address the task.
How did we use this process during MATLAB 6 development? We used two variations: assistance from outside observers, followed by additional interviews conducted by our own developers. We were fortunate to enlist the assistance of students in the Tufts University Engineering Psychology program. The Usability team at MathWorks gave these students training in contextual interviewing techniques. During the first of two projects, the students interviewed about 20 MATLAB users. The data from this project and our other MATLAB 6 planning efforts raised more questions for us about how new users worked with MATLAB, so we launched a second project in which the students focused on interviewing users with less than three months' experience using MATLAB.
There were several deliverables from these studies. We received formal reports summarizing the top 10 issues observed during the interviews. We received detailed interview results, which also included "artifacts": samples of users' work, sample output, and photos of the users' work environments. Finally, the students created several "Meet Our Users" posters that graphically illustrated typical users and their work.
The results from these studies were dramatic. Developers who were eager to get a fresh perspective on the problems our users faced appreciated the top 10 lists and the user posters. While these summaries helped put the whole set of findings into perspective, the individual interview transcripts were even more important. Developers pored over this information to get specific data with which to prioritize their efforts.
Several key themes came out during these interviews, including requests that we improve printing, plot editing, and the Help system in MATLAB. In the first two cases, the work artifacts that students collected from users were especially helpful: Seeing real examples of work that users were trying to do (or had difficulty doing) helped the developers gain a much better understanding of where to focus their efforts.
The student projects were especially useful because they allowed us to collect a lot of data with relatively little effort from within the company. However, because the students were not MATLAB experts, we were still left with the task of determining which changes to make to MATLAB. Many of our developers were involved in subsequent interviews with users, asking additional questions that were not covered in these first studies. This combination of efforts helped us come up with an excellent list of priorities for MATLAB 6.
Once they understood what users wanted, the developers were eager to begin coding new solutions. At this stage, we introduced paper prototyping into our development process.
The concept of prototyping a solution is well understood. The problem is that code is expensive to write, and all too often a "prototype" takes so long to develop that there is no time to get feedback on the idea. As a result, the prototype simply becomes the product. Paper prototyping provides an alternative that is cheaper, faster, and more effective for the purposes of ensuring the final product meets user needs.
With paper prototyping, the user interface was mocked up using paper (Figure 1). Most often, the basic design was hand-sketched, and other user interface elements such as drop-down menus, dialog boxes, and error messages were created using sticky notes, tape, and glue. Even the most complex user interface elements could be effectively "faked" with a little imagination.
Typically the process started by having team members construct a task the user would try to complete using the prototype. This approach got the developers to continually focus on the user's work and to think hard about the purpose and value of a feature. It also ensured that only the essential parts of the interface were created: no need to add extra features if they weren't going to be used for the task.
Once the interface was created, users were brought in for usability testing and asked to use the prototype to complete the task. During the test session, one or more developers acted as the "computer," responding non-verbally to the user's interactions with the interface. As the user pointed to menu items or buttons on dialog boxes, the developers displayed the correct user interface elements to show the result of the user's choices. The user was instructed to "think out loud" to help us understand what they were expecting from the product. While users often gave us puzzled looks to start with, they quickly got involved and had fun with the game-like aspects of working with the prototype. More importantly, most users had no problem making the connection between the paper version and the end goal of a more usable product designed for them.
Several important things happened during this process. We found that users tend to be more willing to be critical of a paper prototype that is clearly a work in progress than they are of more polished online prototypes. It was also much easier to quickly make changes to a prototype in the middle of a test session—if the user wanted a "quit" button on a dialog, we could quickly sketch one and see the result. It was possible to completely redraw a prototype between test sessions, making major iterations easy. Since this was paper and not code, several team members could pitch in to redraw a prototype, expediting the process, getting team members involved, and capturing ideas from several people at once. Finally, since it wasn't code, developers didn't feel the same ownership of the design, and were also more willing to scrap previous work.
It was important for us to retest designs after they were implemented in code. In some cases, there were features that truly didn't test as well on paper as online. Also, once implementation started, there were inevitable changes and compromises necessary to express the design in code. At this point, it was necessary to validate the more finished designs.
Much of this testing was done in our usability lab: a two-room lab, with rooms separated by one-way glass. We generally had one developer and a usability specialist sit with the user in the testing room, and additional team members observe from the other side of the one-way glass. This made the process less intimidating for the user (they weren't trying new things in front of a roomful of people).
The testing process was straightforward. Again, we selected users who would typically use the feature we were testing. They were given realistic tasks to complete and asked to "think out loud" while they worked and to let us know if they experienced frustration or satisfaction with the feature being tested. The usability specialist or developer sometimes prompted the user for more information along the way, and answered questions to keep the user on task. All the team members were involved in observing and noting issues during the test sessions.
At the end of the sessions, all team members participated in a debriefing on the results. We frequently asked team members to take notes on sticky pads during test sessions, noting one issue per sticky note. After the test sessions were completed, the team participated in an affinity diagramming exercise, grouping the issues from these sticky notes into like categories. Once the categories were established, it was easier for the team to review the issues and decide what actions to take. The benefit of this process was that the entire team could be involved in learning what was or wasn't working on the interface, and could contribute to the solution. The process not only improved the interface, but served as a useful team-building exercise.
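For teams that tally their observations electronically rather than on a wall of stickies, the grouping and review steps of affinity diagramming can be mimicked in a few lines of code. The sketch below (in Python, with invented issue text; the actual exercise used physical sticky notes and team judgment to form the categories, not software) groups issues by an assigned category and sorts the clusters by size so the most frequently observed problem areas surface first:

```python
from collections import defaultdict

# Hypothetical issues noted during test sessions (one per sticky note),
# each tagged with the category the team assigned while clustering.
notes = [
    ("Couldn't find the print preview option", "Printing"),
    ("Plot legend hid part of the data", "Plot editing"),
    ("Help search results seemed unrelated to the query", "Help system"),
    ("Printed figure was clipped at the margins", "Printing"),
    ("Wanted to drag axis labels directly on the plot", "Plot editing"),
]

# Group the notes by category, mirroring the clustered stickies on the wall.
groups = defaultdict(list)
for issue, category in notes:
    groups[category].append(issue)

# Summarize each cluster, largest first, so the team can prioritize.
for category, issues in sorted(groups.items(), key=lambda kv: -len(kv[1])):
    print(f"{category} ({len(issues)} issues)")
    for issue in issues:
        print(f"  - {issue}")
```

The sorted summary plays the role of the post-clustering review: once like issues sit together, deciding which actions to take becomes a matter of scanning the largest groups.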
Usability and user-centered design techniques provided us with practical methods for improving MATLAB 6. These methods helped us understand our users' work in detail, which enabled our development team to work together more effectively. Our developers have learned how to create better products that are optimized for our users' tasks, and to produce those products sooner. Because of these efforts, we are confident that MATLAB 6 provides significant improvements that will help our users do their work better, too.
But our efforts are not stopping with MATLAB 6. Once we completed that version, we immediately began testing it to understand what additional improvements could be made. In addition, we have restarted our efforts to visit our users in the field to help our developers better understand what our users are doing with MATLAB 6.
We are constantly looking for users to participate in usability activities. Many of these activities happen in and around our Natick offices, but we also do testing and user visits on the road. Here are some ways that you can get involved with improving the usability of MATLAB and Simulink products: