
Usability Test Report


Corwin Jones

Executive Summary and Goal:

The system evaluated was the user interface of a program I wrote called Athens Course Scheduler. The goal of this program was to offer a user interface that is simple yet understandable. With it, users can select a record and calculate a response. In addition to making the interface understandable, I also tried to make it simple, and I tried to draw attention to it by giving it a bright color. The goal of this usability test was to evaluate how potential users would respond to using the interface. This was the first test of this application.

Method:

The two participants had an average age in the 46–50 range and willingly volunteered to help out. Participant #1 is a 46-year-old instructor at Athens State University who teaches courses in computer networking. Participant #2 is a retiree over 50 years old and an assistant professor at Athens State University who teaches courses in computer science and data structures. Their computer skills varied.

Environment:

I performed the test in the computer lab at Athens State. Each participant ran the application seated at a Pentium-class machine running Windows 98. Because I did not own a video camera or audio equipment, I relied on photographs and notes to record the results of the exercises.

Task 1: Locate the application and open it.

I expected this not to be a hard task: since the application was on a CD-ROM, all the participants had to do was click on the computer icon and launch the interface application. Participant #2 had a tough time finding where the application was located and asked for help.

Task 2: View the interface.

I expected this to be a very simple task requiring almost no effort; it is one of the most basic tasks a user of the program would perform. The interface program was chosen because it covers a familiar subject and category that would be fun.
If the participants could not easily figure out how to complete this task, I knew the interface had major problems. I expected no questions from the participants about how to complete it; the users were required to study the screen for three minutes without using the mouse.

Task 3: Click the drop-down box.

This was also expected to be a very simple task. Because the drop-down box was located on the left side of the screen, it was simple for both participants.

Task 4: View the data.

Another task that did not produce any problems for the users.

Task 5: View records.

Each participant was asked to view the records in the interface. The participants clicked on Instructors, Classes, Schedules, etc., and started viewing the records.

Task 6: Enter the age.

While observing the participants entering the age, I noticed that each participant entered it in a different box. Participant #2 entered the age right after his name, while participant #1 entered the age in the box adjacent to the name box. Participant #2, entering the age right after the name, did not receive an error message, while participant #1 received an error.

Task 7: New search.

The users were told to click the New Search button to start a new search for records.

Task 8: Update a record.

Each participant was required to update a record in the system. Participant #1 had no trouble updating the record, but participant #2 had to ask for help.

Task 9: Delete a record.

Each participant was required to delete a record. No problems were detected during this task.

Task 10: Add a record.

Each participant was required to add a record to the system. No problems were detected for either participant.

Task 11: Search for records.

The participants used the sort functions of the system to search for records. Neither participant had any problems with this.

Test Measures / Observations:

I timed how long it took each participant to complete each task.
I thought it was important to measure how long it would take a tester to complete a task because it would give me a general idea of how usable the interface is overall. However, since the participants were asked to speak their thoughts out loud while performing the tasks, the times I recorded are longer than the tasks would actually have taken. As expected, the first task was the easiest for the participants; the moderate task proved more difficult. The testers started as I expected, going to the screen to find out what the program was about. Unfortunately, participant #2 was unable to launch the CD because the window opened partially off the screen and the interface never scrolled into view. Participant #1 went straight to the application and seemed more eager to explore it. Surprisingly, the two participants finished the product task faster than I expected. The participants began at the opening screen, and from there they took a variety of paths to launch the application; all were successful. Participant #1 tried to find a way to update a record in the text box and was frustrated when he could not. He then asked me how it was done, and I obliged.

Results and Recommendations:

The participants seemed to like the overall application. Unfortunately, participant #2 had a tough time trying to update a record. They both agreed that the interface was easy to understand, but they also gave me suggestions for improvements. From reading my participants' responses, I found that the interface works well. After launching the application, the participants were asked to describe what they saw on the desktop without moving the mouse. After about three minutes, they were told they could start using the application. Because they were told to think out loud, I heard general comments about the program, including the following: "How do you resize it?"
"Is there a Help or a File > New located here?" one user asked. "Interesting choice of colors," another commented. I would improve the system by making the window automatically resize to a specific width, because resizing was a problem for both participants. I would also make help available and make the program one basic color. I would add menus to the application and give it a more colorful look. The participants both agreed that, overall, the application did what they expected.
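The Task 6 confusion, where the age was accepted when typed after the name in one box but rejected in another, points to per-field input validation as one more area to tighten. This report does not include the application's actual code, so the following is only a hypothetical Python sketch of the kind of check I have in mind; the function names, field rules, and limits are my assumptions, not taken from Athens Course Scheduler:

```python
# Hypothetical sketch of per-field validation for a scheduler-style form.
# Field rules and limits below are assumptions for illustration only.

def validate_age(text):
    """Return (ok, message) for the value typed into the age box."""
    text = text.strip()
    if not text.isdigit():
        return False, "Age must be a whole number (for example, 47)."
    age = int(text)
    if not 16 <= age <= 120:
        return False, "Age must be between 16 and 120."
    return True, ""

def validate_name(text):
    """Reject names containing digits, e.g. an age typed after the name."""
    text = text.strip()
    if not text:
        return False, "Name is required."
    if any(ch.isdigit() for ch in text):
        return False, "Name should not contain numbers; enter the age in the Age box."
    return True, ""
```

With checks like these, each box would either accept its input or explain specifically what is wrong, rather than the inconsistent behavior the two participants saw.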