HUMAN-COMPUTER INTERACTION
SECOND EDITION
Consider first the successful case in Figure 9.6, the hit. The first significant event is Alison pressing the mouse button over the on-screen 'delete' button. This event goes directly to the toolkit dialog, which responds by highlighting the 'delete' button. The next event occurs as Alison lifts her finger from the mouse button. Again this is received by the dialog, which this time does two things: it removes the highlight from the 'delete' button, and it generates a 'delete' event for the application. The application then performs the action, deleting the paragraph. The effects of this change in the text are reflected in the screen content.
The unsuccessful case (Figure 9.7, the miss) starts similarly. Alison depresses the mouse button and receives feedback. However, this time, before releasing the mouse button, she accidentally moves the mouse off the button. The toolkit dialog responds to this by removing the highlight from 'delete' -- the same feedback as in the first scenario. Alison's release of the mouse button has no further effect.
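To make the two traces concrete, the sketch below shows how a button widget might route these events. The names are hypothetical and not drawn from any real toolkit; the point is that the only visible difference between hit and miss is when the highlight is removed, while the crucial notification to the application produces no feedback of its own.

/* Schematic button widget (hypothetical names, not a real toolkit API). */
typedef struct Button {
    int highlighted;                 /* is the on-screen button highlighted?    */
    int pressed;                     /* did the press start over this button?   */
    void (*notify)(void *app_data);  /* application-level callback, e.g. delete */
    void *app_data;
} Button;

void button_mouse_down(Button *b)    /* Alison presses over 'delete'            */
{
    b->pressed = 1;
    b->highlighted = 1;              /* dialog-level feedback: highlight        */
}

void button_mouse_leave(Button *b)   /* the pointer slips off before release    */
{
    b->highlighted = 0;              /* visually the same change as in a hit    */
    b->pressed = 0;                  /* ...but the press is silently cancelled  */
}

void button_mouse_up(Button *b)      /* Alison releases the mouse button        */
{
    if (b->pressed) {                /* hit: the press was not cancelled        */
        b->highlighted = 0;          /* remove the highlight, as in the miss    */
        b->notify(b->app_data);      /* the invisible event: tell the application */
    }                                /* miss: pressed was already cleared, so   */
    b->pressed = 0;                  /* the release has no further effect       */
}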
The two scenarios are very different in their effect: in one the application deletes some text, in the other it does not. However, Alison does not notice the difference; her feedback from the toolkit dialog is identical. In theory, she could have seen that the paragraph was still there on the screen, but her attention was on the 'delete' button, not on the text itself.
It is quite difficult to see how to avoid the problem occurring. It is not that the current feedback is not salient; it is at the focus of the pointing task. However, all the feedback concerns events at the dialog level. The most important event, the 'delete' to the application, has no corresponding perceived event. The toolkit assumes that the user will see some feedback from the application and therefore does not supply any feedback of its own. But, as we saw, the application's feedback is very likely to be missed.
In the remainder of this chapter, we will address the various layers which constitute the move from the low-level hardware up to the more abstract programming concepts for interaction. We begin in Section 10.2 with the elements of a windowing system, which provide for device independence and resource sharing at the programming level. Programming in a window system frees the programmer from some of the worry about the input and output primitives of the machines the application will run on, and allows her to program the application under the assumption that it will receive a stream of event requests from the window manager. In Section 10.3 we describe the two fundamental ways this stream of events can be processed to link the interface with the application functionality: by means of a read--evaluation control loop internal to the application program or by a centralized notification-based technique external to it. In Section 10.4, we describe the use of toolkits as mechanisms to link input and output at the programming level. In Section 10.5, we discuss the large class of development tools lumped under the categories of user interface management systems, or UIMS, and user interface development systems, UIDS.
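The contrast between these two styles of control is easier to see in code. The following is a schematic read-evaluation loop; the event types, the next_event function and the handling code are invented for this sketch rather than taken from any real window system.

/* Schematic read-evaluation control loop: the application itself repeatedly
   reads the next event from the window system and decides what it means.
   All names here are invented for illustration. */
#include <stdio.h>

typedef enum { EV_KEY, EV_MOUSE_DOWN, EV_MOUSE_UP, EV_QUIT } EventType;

typedef struct {
    EventType type;
    int x, y;      /* pointer position, where relevant */
    char key;      /* key pressed, where relevant      */
} Event;

/* Stand-in for the window system call that blocks until the next event;
   here it simply reports a quit so that the sketch terminates. */
static Event next_event(void)
{
    Event e = { EV_QUIT, 0, 0, 0 };
    return e;
}

int main(void)
{
    int finished = 0;
    while (!finished) {                      /* control stays with the application */
        Event e = next_event();              /* read the next input event          */
        switch (e.type) {                    /* evaluate: decide what it means     */
        case EV_KEY:        printf("key %c\n", e.key);               break;
        case EV_MOUSE_DOWN: printf("press at %d,%d\n", e.x, e.y);    break;
        case EV_MOUSE_UP:   printf("release at %d,%d\n", e.x, e.y);  break;
        case EV_QUIT:       finished = 1;                            break;
        }
    }
    return 0;
}

In the notification-based alternative the loop belongs to the toolkit or window system, and the application merely registers callbacks to be notified when events of interest occur.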
Figure 10.6 provides an example of notification-based programming in C using the XView toolkit (toolkits are described in the next section).
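The figure itself is not reproduced on this page. A program in that style typically looks like the classic XView 'quit button' example sketched below; this is in the spirit of the figure rather than a copy of it.

#include <xview/xview.h>
#include <xview/frame.h>
#include <xview/panel.h>

Frame frame;

/* Notify procedure: called back by the XView notifier when the button is
   activated -- the application never reads the event stream itself. */
void quit_proc(Panel_item item, Event *event)
{
    xv_destroy_safe(frame);        /* destroy the frame, ending the program */
}

int main(int argc, char **argv)
{
    Panel panel;

    xv_init(XV_INIT_ARGC_PTR_ARGV, &argc, argv, NULL);
    frame = (Frame) xv_create(NULL, FRAME,
                              FRAME_LABEL, "example",
                              NULL);
    panel = (Panel) xv_create(frame, PANEL, NULL);
    (void) xv_create(panel, PANEL_BUTTON,
                     PANEL_LABEL_STRING, "Quit",
                     PANEL_NOTIFY_PROC, quit_proc,
                     NULL);
    xv_main_loop(frame);           /* hand control to the XView notifier */
    return 0;
}

Control is inverted compared with the read-evaluation loop: the application registers quit_proc with the toolkit and then surrenders control to xv_main_loop, which calls back into the application when the user activates the button.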
From the programmer's perspective, even at the level of a windowing system, input and output are still quite separate for everything except the mouse, and it takes quite a bit of effort in the application program to create the illusion of an interaction object such as the button we have just described. To aid the programmer in fusing input and output behaviours, another level of abstraction is placed on top of the window system -- the toolkit. A toolkit provides the programmer with a set of ready-made interaction objects -- alternatively called interaction techniques, gadgets or widgets -- which she can use to create her application programs. The interaction objects have a predefined behaviour, so the programmer does not have to recreate, for example, the press-and-highlight handling of the button described earlier.