HUMAN-COMPUTER INTERACTION SECOND EDITION
Dix, Finlay, Abowd and Beale




Search results for evaluation
Showing 101 to 110 of 110


Chapter 11 Evaluation techniques Exercises Page 441

  • An outline plan for carrying out the evaluation.

Chapter 11 Evaluation techniques Recommended reading Page 442

  • An evaluator's guide to using the cooperative evaluation approach successfully.

  • M. Helander, editor, Handbook of Human-Computer Interaction, Part V: Tools for Design and Evaluation, North-Holland, 1988. Reviews the major evaluation techniques.

  • Describes the revised version of the cognitive walkthrough method of evaluation.

  • J. Nielsen, Heuristic Evaluation. In J. Nielsen and R. L. Mack, editors, Usability Inspection Methods, John Wiley, 1994. Covers the heuristic evaluation method in detail.

  • Includes material on both cognitive walkthrough and heuristic evaluation.


Chapter 12 Help and documentation 12.4.1 Knowledge representation: user modelling Page 451

The third approach to providing the system with a model of the user, and the one used in adaptive help systems, is to have the system construct and maintain a model of the user based on data gleaned from monitoring the user's interaction. This model can then be consulted when required. This automatic approach to user modelling also has the problem of setup time, during which the user is working with a default system, but it takes the onus of building the model away from the user.

Various suggestions have been made as to how to deal with the setup time, including getting the user to choose an initial default model, and building a model based on pre-use activity, such as game playing. The former is problematic in that, again, it asks the user to decide on a model at a time when he may not have sufficient experience to do so effectively. The latter may not produce a model that is transferable to the actual domain. The most common approach is still to provide a basic start-up default model and concentrate on rapidly updating it for the actual user. The default model may be based on experimental or observational results gleaned in evaluation.
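
As a rough illustration of the monitor-and-update cycle described above, here is a minimal Python sketch. It is not code from the book: the class, its method names, the expertise thresholds and the help styles are all invented for the example.

    from collections import Counter

    class UserModel:
        """A start-up default model, rapidly updated from monitored interaction."""

        def __init__(self, default_expertise="novice"):
            # Basic default model, e.g. chosen from experimental or
            # observational results gathered during evaluation.
            self.expertise = default_expertise
            self.command_counts = Counter()

        def observe(self, command):
            """Update the model from each monitored user action."""
            self.command_counts[command] += 1
            # Crude, invented heuristic: a wider command vocabulary
            # suggests greater expertise.
            vocabulary = len(self.command_counts)
            if vocabulary > 20:
                self.expertise = "expert"
            elif vocabulary > 8:
                self.expertise = "intermediate"

        def help_style(self):
            """Consulted by the adaptive help system when help is requested."""
            return {"novice": "step-by-step guidance",
                    "intermediate": "brief reminders",
                    "expert": "terse reference"}[self.expertise]

Until enough interaction has been observed, help_style() simply returns the default ("novice") behaviour, which is exactly the setup-time problem the passage describes.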


Chapter 14 CSCW and social issues Recommended reading Page 552

  • J. Grudin, Why CSCW applications fail: Problems in the design and evaluation of organizational interfaces. In CSCW'88, Proceedings of the Conference on Computer Supported Cooperative Work, pp. 85-94, ACM, 1988.



