BA/BSc Design for Interactive Media (Top-up) 2006/07
ADM302 – User Centred Design
By Nigel Whitbread
Stage Two – Testing
In this part of the assignment we are required to evaluate our re-designed prototypes using Nielsen's Discount Usability Engineering method (DUE). To aid me in this I tested my redesign on a selected user group of family members spanning a wide age range.
Testing framework following Nielsen's Discount Usability Engineering method (DUE).
The DUE approach is intended to keep costs as low as possible for evaluators with limited resources, on the principle that a cheap, simplified evaluation still serves the user community better than no evaluation at all.
The process I followed for the user testing of my prototype was as follows:
• I typed up a scenario describing the path I wanted users to follow through the interface I had designed in Flash. Because only limited functions were in place on the prototype, this was a fairly rigid way of testing.
• Before moving on to full testing I tried out different ways of delivering the scenario. First, I talked a user through the stages of the scenario while demonstrating the prototype myself. This approach was quicker than the others, but I received less direct feedback because I had explained everything as I went along.
• I then gave a user the scenario to read and follow, but soon found that it was more or less ignored as they tried to work the interface by random button-pressing and self-discovery. Because not all the functions were working, this led to a lot of errors.
• Taking these two trial runs on board, I asked five users from my family to test the prototype. (Tognazzini (1993) and Redmond-Pyle & Moore (1995) suggest that three users can give the evaluator a good idea of what users in general might think, and that beyond three there may be diminishing returns from involving additional users; Monk (1993) suggests five.)
• The methodology I chose for my main user testing is based on Cooperative Evaluation, developed by Monk et al. (1993). Its aim is quite simply to identify problems with the system. Instead of having the users read through the scenario as they worked their way through the prototype, I read it out to them and had them complete the scenario one step at a time. I also asked the users to think aloud about their actions, and encouraged them to ask questions, list the problems they encountered and comment on the user experience.
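The cooperative-evaluation procedure above amounts to recording, for each user and each scenario step, the think-aloud comment made and whether it flagged a problem. A minimal sketch of such a session log in Python (all names and the example data are my own illustration, not part of the original study):

```python
# Illustrative sketch only: the field names and example remarks below are
# assumptions, not records from the actual test sessions.
from dataclasses import dataclass, field

@dataclass
class Observation:
    step: int            # scenario step being attempted
    comment: str         # the user's think-aloud remark or question
    is_problem: bool     # whether the remark flagged a usability problem

@dataclass
class Session:
    user: str
    observations: list = field(default_factory=list)

    def record(self, step, comment, is_problem=False):
        self.observations.append(Observation(step, comment, is_problem))

    def problems(self):
        """Return only the observations flagged as usability problems."""
        return [o for o in self.observations if o.is_problem]

# Example: one user working through the scenario one step at a time.
session = Session(user="User 1")
session.record(1, "Not sure which icon opens the menu", is_problem=True)
session.record(2, "Found the record button straight away")
print(len(session.problems()))  # → 1
```

Keeping problems separate from general comments in this way makes it easy to collate the problem lists from all five users at the end of testing.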
Users' interaction with the prototype

Heuristic Evaluation
Using Jacob Nielsen's heuristics, I evaluated both the original device and the redesign. The main problems identified were:
- Icons used in the menu structure are sometimes vague and don't relate well to what they are supposed to symbolise (2. Match between system and the real world).
- There are no accelerators to let expert users short-cut or tailor frequent actions (7. Flexibility and efficiency of use).
- The camcorder contains no integral help function, and the user's manual is not easy to search through (10. Help and documentation).
http://uwicworkinprogress.moonfruit.com/