HUMAN-COMPUTER INTERACTION SECOND EDITION
Dix, Finlay, Abowd and Beale


Search results for cognitive
Showing 70 to 79 of 79


Chapter 11 Evaluation techniques Automatic protocol analysis tools Page 430

A third example is DRUM [148], which also provides video annotation and tagging facilities. DRUM is part of the MUSiC (Metrics for Usability Standards in Computing, also expanded as Measuring the Usability of Systems in Context) toolkit, which supports a complete methodology for evaluation based upon the application of usability metrics: analytic measures, cognitive workload, performance and user satisfaction. DRUM is concerned particularly with measuring performance. The methodology provides a range of tools in addition to DRUM, including manuals, questionnaires, analysis software and databases.
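
The MUSiC performance measures themselves are not spelt out here. As a rough illustration, the sketch below follows the commonly cited formulation, in which task effectiveness combines the quantity of a task completed with the quality of the result, and user efficiency is effectiveness per unit task time. This is a minimal Python sketch, not part of DRUM or the MUSiC toolkit, and all names and sample figures are invented for illustration.

    # Illustrative sketch of MUSiC-style performance measures; not part of DRUM.
    # Assumes the commonly cited formulation: task effectiveness = quantity of
    # the task completed weighted by the quality of the result, and user
    # efficiency = effectiveness per unit task time.

    from dataclasses import dataclass

    @dataclass
    class TaskSession:
        quantity: float      # proportion of the task completed, 0..1
        quality: float       # quality of the completed portion, 0..1
        time_on_task: float  # task time in seconds

    def task_effectiveness(s: TaskSession) -> float:
        # Effectiveness as a percentage: quantity weighted by quality.
        return 100.0 * s.quantity * s.quality

    def user_efficiency(s: TaskSession) -> float:
        # Effectiveness achieved per second of task time.
        return task_effectiveness(s) / s.time_on_task

    def relative_user_efficiency(user: TaskSession, expert: TaskSession) -> float:
        # The user's efficiency as a percentage of an expert's on the same task.
        return 100.0 * user_efficiency(user) / user_efficiency(expert)

    user = TaskSession(quantity=0.9, quality=0.8, time_on_task=240.0)
    expert = TaskSession(quantity=1.0, quality=1.0, time_on_task=150.0)
    print(task_effectiveness(user))                # ~72.0
    print(relative_user_efficiency(user, expert))  # ~45.0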


Chapter 11 Evaluation techniques Design vs. implementation Page 436

Roughly speaking, evaluation at the design stage tends to involve design experts only and be analytic, whereas evaluation of the implementation brings in users as subjects and is experimental. There are of course exceptions to this: participatory design (see Chapter 6) involves users throughout the design process, and techniques such as cognitive walkthrough are expert based and analytic but can be used to evaluate implementations as well as designs.


Chapter 11 Evaluation techniques Subjective vs. objective Page 437

Evaluation techniques also vary in their objectivity: some rely heavily on the interpretation of the evaluator, while others provide much the same information regardless of who performs the evaluation. The more subjective techniques, such as cognitive walkthrough or think aloud, depend to a large extent on the knowledge and expertise of the evaluator, who must recognize problems and understand what the user is doing. They can be powerful if used correctly, and can provide information that may not be available from more objective methods. However, the danger of evaluator bias should be recognized and countered; one way to reduce it is to use more than one evaluator. Objective techniques, on the other hand, should produce repeatable results that do not depend on the inclinations of the particular evaluator. Controlled experiments are an example of an objective measure: they avoid bias and provide comparable results, but may not reveal unexpected problems or give detailed feedback on the user's experience. Ideally, both objective and subjective measures should be used.
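
The advice to use more than one evaluator can be made concrete by measuring how far two evaluators agree on what counts as a problem. The sketch below is a minimal illustration, not a technique described in the book: it assumes each evaluator has labelled the same sequence of observed events, and computes Cohen's kappa, which corrects the raw agreement rate for the agreement expected by chance.

    # Minimal sketch: quantifying agreement between two evaluators who have
    # each labelled the same observed events. Cohen's kappa corrects the raw
    # agreement rate for the agreement expected by chance alone.

    from collections import Counter

    def cohens_kappa(ratings_a, ratings_b):
        assert len(ratings_a) == len(ratings_b) and ratings_a
        n = len(ratings_a)
        observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
        freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
        labels = set(freq_a) | set(freq_b)
        expected = sum(freq_a[lab] * freq_b[lab] for lab in labels) / (n * n)
        return (observed - expected) / (1 - expected)

    # Two evaluators classify the same six observed events.
    a = ['problem', 'ok', 'problem', 'problem', 'ok', 'ok']
    b = ['problem', 'ok', 'ok',      'problem', 'ok', 'ok']
    print(cohens_kappa(a, b))  # ~0.67: substantial, but not perfect, agreement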


Chapter 11 Evaluation techniques 11.7 Summary Page 440

A design can be evaluated before any implementation work has started, to minimize the cost of early design errors. Most techniques for evaluation at this stage are analytic and involve using an expert to assess the design against cognitive and usability principles. Previous experimental results and modelling approaches can also provide insight at this stage. Once an artefact has been developed (whether a prototype or full system), experimental and observational techniques can be used to get both quantitative and qualitative results. Query techniques provide subjective information from the user.


Chapter 11 Evaluation techniques Exercises Page 441

11.1 In groups or pairs, use the cognitive walkthrough example, and what you know about user psychology (see Chapter 1), to discuss the design of a computer application of your choice (for example, a word processor or a drawing package). (Hint: Focus your discussion on one or two specific tasks within the application.)


Chapter 11 Evaluation techniques Exercises Page 441

11.5 Complete the cognitive walkthrough example for the video remote control design.
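
As a starting point, the sketch below shows one way to structure a walkthrough record, with one record per action in the task sequence, using the four questions of the revised cognitive walkthrough (Wharton et al.; see the recommended reading below). It is an illustrative Python sketch only; the sample action is invented and is not the book's worked example.

    # Illustrative record sheet for a cognitive walkthrough: one record per
    # action in the task sequence. The four questions are those of the revised
    # walkthrough (Wharton et al., 1994); the sample action is invented.

    from dataclasses import dataclass, field

    QUESTIONS = (
        "Will the user try to achieve the right effect?",
        "Will the user notice that the correct action is available?",
        "Will the user associate the correct action with the desired effect?",
        "If the correct action is performed, will the user see progress?",
    )

    @dataclass
    class WalkthroughStep:
        action: str
        # Maps each question to (answer, failure/success story).
        answers: dict = field(default_factory=dict)

        def record(self, question: str, ok: bool, story: str) -> None:
            self.answers[question] = (ok, story)

        def problems(self):
            return [(q, s) for q, (ok, s) in self.answers.items() if not ok]

    step = WalkthroughStep(action="Press the 'timed record' button")
    step.record(QUESTIONS[1], False,
                "The button is unlabelled, so users may not notice it.")
    for question, story in step.problems():
        print(question, "-", story)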


Chapter 11 Evaluation techniques Recommended reading Page 442

C. Wharton, J. Rieman, C. Lewis and P. Polson, The Cognitive Walkthrough: A practitioner's guide. In J. Nielsen and R. L. Mack, editors, Usability Inspection Methods, John Wiley, 1994.


Describes the revised version of the cognitive walkthrough method of evaluation.

J. Nielsen and R. L. Mack, editors, Usability Inspection Methods, John Wiley, 1994.

Includes material on both cognitive walkthrough and heuristic evaluation.


Chapter 14 CSCW and social issues 14.5.3 Distributed cognition Page 540

The emphasis on external forms is encouraging for a designer. It is not necessary to understand completely the individual's cognitive processing in order to design effective groupware. That is an impossible task. Instead, we can focus our analysis of existing group situations and design of groupware on the external representations used by the participants.

