Friday, November 10, 2006

ADM302 – User Centred Design Assignment

BA/BSc Design for Interactive Media (Top-up) 2006/07
ADM302 – User Centred Design
By Nigel Whitbread
Stage Two - Testing
In this part of the assignment we are required to evaluate our re-designed prototypes using Nielsen's Discount Usability Engineering (DUE) method. To do this, I tested my redesign with a selected user group of family members covering a wide age range.
Testing framework following Nielsen's Discount Usability Engineering (DUE) method.
The DUE approach is intended to keep costs as low as possible for testers with limited resources, and because of this it should be viewed as a practical way of serving the user community.

The process I followed for the user testing of my prototype was as follows:
• I typed up a scenario describing the path I wanted my users to follow through the interface I’d designed in Flash. Because of the limited functions in place on the prototype, this was a fairly rigid way of testing.
• Before I went on to full testing I tried out different ways of delivering the scenario. First of all, I talked a user through the stages of the scenario whilst showing them myself how the prototype worked. This approach was quicker than the others, but I received less direct feedback because I’d explained everything as I went along.
• I then gave a user the scenario to read and follow, but soon found that the scenario was more or less ignored as they tried to work the interface by random button-hitting and self-discovery. Because not all the functions were working, this led to a lot of errors.
• Taking these two trial runs on board, I asked five users from my family to test the prototype. (Tognazzini (1993) and Redmond-Pyle & Moore (1995) suggest that three users can give the evaluator a good idea of what users in general might think, and that after three there may be diminishing returns from involving additional users; Monk (1993) suggests five users.)
• The methodology I’ve chosen for my main user testing is based on Cooperative Evaluation, developed by Monk et al (1993). Its aim is quite simply to identify problems with the system. Instead of having the users read through the scenario as they worked their way through the prototype, I decided to read it out to them and have them work through every stage one step at a time. I also had the users think aloud about their actions, and encouraged them to ask questions, list the problems they encountered and comment on the user experience.
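The diminishing returns mentioned above can be illustrated with Nielsen and Landauer's problem-discovery model, which underpins the "three to five users" advice cited in the list. This is a general sketch, not a calculation from my own test data: the model says the proportion of usability problems found by n users is 1 − (1 − L)^n, where L is the probability that a single user uncovers any given problem (Nielsen reports an average L of roughly 0.31 across the projects he studied).

```python
# Nielsen & Landauer's problem-discovery model:
# proportion of problems found by n users = 1 - (1 - L)^n,
# where L is the per-user discovery rate (~0.31 on average
# in Nielsen's published data; your own rate may differ).

def proportion_found(n_users, problem_rate=0.31):
    """Expected share of usability problems uncovered by n_users."""
    return 1 - (1 - problem_rate) ** n_users

if __name__ == "__main__":
    for n in (1, 3, 5, 10):
        print(f"{n} users: {proportion_found(n):.0%} of problems found")
```

With the 0.31 rate, three users find roughly two-thirds of the problems and five users around 84%, which is why adding testers beyond five pays off less and less.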
Users’ interaction with the prototype
Heuristic Evaluation
Applying Jacob Nielsen’s heuristic evaluation to the original design and the redesign highlighted the following issues:

  • Icons used in the menu structure are sometimes vague and don’t relate well to what they’re supposed to be symbolising (2. Match between system and the real world);
  • No accelerator functions to allow expert users to short-cut or tailor frequent actions (7. Flexibility and efficiency of use);
  • The camcorder doesn’t contain any integral help functions, and the user’s manual isn’t easy to search through (10. Help and documentation).
Using the feedback gained from user testing, I would try to incorporate more features into a redesign of the redesign. Users found that the menu’s similarity to mobile phone menus made it a concept already familiar to them, so functions already in use on a limited scale on phones should transfer well to a larger scale given the much greater memory capacity of the camcorder. One such function could be DEMO videos incorporated into a HELP menu, showing you how to use the different functions on the camera, in the same way that mobile phones have demo videos introducing some of their applications, or set-up wizards taking you through the basics.

In terms of the interface design, each icon needs to visually describe the action it represents and must not be vague or totally unrelated. I would also make the icons more photo-realistic, or even three-dimensional, to help bring them out of the screen.
Link to Prototype Online
http://uwicworkinprogress.moonfruit.com/