HUMAN-COMPUTER INTERACTION SECOND EDITION
Dix, Finlay, Abowd and Beale


Search Results


Search results for mouse
Showing 141 to 150 of 176 [<< prev] [next >>] [new search]


Chapter 9 Models of the system 9.4.5 Example -- screen button feedback Page 371

Consider first the successful case in Figure 9.6, the hit. The first significant event is Alison's depression of the mouse button over the on-screen 'delete' button. This event goes directly to the toolkit dialog, which responds by highlighting the 'delete' button. The next event is as Alison lifts her finger from the button. Again this is received by the dialog which this time does two things: it removes the highlight from the 'delete' button, and also causes an event 'delete' for the application. The application then performs the action, deleting the paragraph. The effects of this change in the text are reflected in the screen content.


Chapter 9 Models of the system 9.4.5 Example -- screen button feedback Page 372

The unsuccessful case (Figure 9.7, the miss) starts similarly. Alison depresses the mouse button and receives feedback. However, this time, before releasing the mouse button, she accidentally moves the mouse off the button. The toolkit dialog responds to this by removing the highlight from 'delete' -- the same feedback as in the first scenario. Alison's release of the mouse button has no further effect.
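The hit and the miss can be sketched as a small event-driven dialog in code. This is our own illustration, not the book's notation: the class and event names are invented, but the logic follows the two scenarios exactly — the dialog highlights on press, removes the highlight on either release or slipping off, and only fires the application-level 'delete' event when the release happens while still over the button.

```python
class ButtonDialog:
    """Illustrative sketch of the toolkit dialog for an on-screen button."""

    def __init__(self, action):
        self.action = action       # application-level event handler
        self.highlighted = False
        self.armed = False         # True between press and release/slip

    def press(self):
        # Mouse button goes down over the screen button: highlight it.
        self.highlighted = True
        self.armed = True

    def move_off(self):
        # Cursor slips off before release: remove the highlight --
        # the same visible feedback as a successful release.
        self.highlighted = False
        self.armed = False

    def release(self):
        # Release triggers the application event only if still armed.
        self.highlighted = False
        if self.armed:
            self.armed = False
            self.action()

deleted = []
dialog = ButtonDialog(lambda: deleted.append('paragraph'))

# The hit (Figure 9.6): press then release over the button.
dialog.press(); dialog.release()

# The miss (Figure 9.7): press, slip off; the release has no further effect.
dialog.press(); dialog.move_off(); dialog.release()
```

After both sequences, `deleted` holds a single entry: only the hit reached the application.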


Chapter 9 Models of the system 9.4.5 Example -- screen button feedback Page 373

Furthermore, this closure makes the mistake not just a possibility, but highly likely. Consider the moment when Alison has just pressed down the mouse button and the on-screen 'delete' button has been highlighted. She has done what she wanted and attains closure; the remaining release of the mouse button is then initiated. She now starts to look for the next action and begins to move the mouse to the location of the next interaction. However, the two actions, releasing the mouse and moving it, are not synchronized with one another. There is no particular reason why one should happen before the other. It is, of course, a particularly dangerous point in a dialog where the order of two unsynchronized user actions makes a crucial difference to behaviour.


Chapter 9 Models of the system 9.4.5 Example -- screen button feedback Page 373

The solution is fairly obvious: the dialog should itself supply an event, which will be perceived by the user, corresponding to the application-level event. This could be visual, but would have to be very salient as the user's eyes are beginning to move towards the next task. Alternatively, it could be aural, either with a keyboard-like 'click' as the button is successfully pressed, or with a beep if the mouse slips off. This improved feedback could be combined with some dynamic mechanism, such as making the screen button 'magnetic' and difficult to move out of.
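The proposed solution can be sketched by extending the dialog so that it supplies its own user-perceivable event alongside the application-level one. The function and event names below are our own illustration of the idea, assuming an audible 'click' on success and a 'beep' on a slip:

```python
def make_dialog(action, notify):
    """Dialog that supplies a perceivable event for each outcome:
    a 'click' on a successful press-release, a 'beep' on a slip."""
    state = {'armed': False}

    def press():
        state['armed'] = True

    def move_off():
        if state['armed']:
            state['armed'] = False
            notify('beep')     # warn: the action will NOT happen

    def release():
        if state['armed']:
            state['armed'] = False
            notify('click')    # confirm: the action DID happen
            action()

    return press, move_off, release

sounds = []
press, move_off, release = make_dialog(lambda: None, sounds.append)

press(); release()               # the hit: user hears a click
press(); move_off(); release()   # the miss: user hears a beep instead
```

Because the feedback arrives on the auditory channel, it remains salient even while the user's eyes have already moved on to the next task.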


Chapter 9 Models of the system Exercises Page 374

9.2 Write a similar schema for RESIZE. It should have the following informal semantics. The width and height attributes are those of the shape's bounding box. The resize operation should make one corner of the bounding box be at the current mouse position (supplied as the argument current_pos?).
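The informal semantics can be checked with a small executable sketch (this is an illustration of the intended behaviour, not a model answer to the Z exercise). We assume the opposite, top-left corner stays fixed and the bottom-right corner moves to the mouse position:

```python
def resize(bbox, current_pos):
    """Move the bottom-right corner of a bounding box to current_pos,
    keeping the top-left corner fixed. bbox is (x, y, width, height);
    the representation is our own illustration, not the book's schema."""
    x, y, _, _ = bbox
    cx, cy = current_pos
    # New width and height follow from the fixed corner; we assume the
    # mouse stays below and to the right of that corner.
    return (x, y, cx - x, cy - y)

# A box at (10, 10) resized so its far corner lands at (100, 60):
assert resize((10, 10, 50, 30), (100, 60)) == (10, 10, 90, 50)
```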


Chapter 10 Implementation support 10.2 Elements of windowing systems Page 379

The first important feature of a windowing system is its ability to provide programmer independence from the specifics of the hardware devices. A typical workstation will involve some visual display screen, a keyboard and, usually, some pointing device, such as a mouse. Any variety of these hardware devices can be used in any interactive system, and they all differ in the data they communicate and the commands used to instruct them. It is imperative to be able to program an application which will run on a wide range of these devices. To do this, the programmer directs commands to an abstract terminal, which understands a more generic language that can be translated into the language of each specific device. Besides making the programming task easier, the abstract terminal makes portability of application programs possible. Only one translation program -- or device driver -- needs to be written for a particular hardware device and then any application program can access it.
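The abstract terminal and device driver split can be sketched as an abstract interface with one concrete driver per device. Everything here — the class names, the `draw_line` command, and the driver's output format — is invented for illustration:

```python
from abc import ABC, abstractmethod

class AbstractTerminal(ABC):
    """The generic language the application programs against."""

    @abstractmethod
    def draw_line(self, x1, y1, x2, y2): ...

class LoggingDriver(AbstractTerminal):
    """Hypothetical device driver: translates generic commands into
    device-specific ones (here just recorded as strings)."""

    def __init__(self):
        self.commands = []

    def draw_line(self, x1, y1, x2, y2):
        self.commands.append(f"LINE {x1},{y1} -> {x2},{y2}")

def draw_box(term, x, y, w, h):
    # Application code: written once against the abstract terminal,
    # it runs unchanged on any device that has a driver.
    term.draw_line(x, y, x + w, y)
    term.draw_line(x + w, y, x + w, y + h)
    term.draw_line(x + w, y + h, x, y + h)
    term.draw_line(x, y + h, x, y)

driver = LoggingDriver()
draw_box(driver, 0, 0, 10, 5)
```

Porting the application to a new device means writing one new driver subclass; `draw_box` itself never changes.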


Chapter 10 Implementation support 10.2 Elements of windowing systems Page 379

Though these imaging models were initially defined to provide abstract languages for output only, they can serve at least a limited role for input as well. So, for example, the pixel model can be used to interpret input from a mouse in terms of the pixel coordinate system. It would then be the job of the application to process the input event further once it knows where in the image it occurred. The other models above can provide even more expressiveness for the input language, because they can relate the input events to structures which are identifiable by the application program. Both PHIGS and PostScript have been augmented to include a more explicit model of input.
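Interpreting pixel-model input amounts to hit-testing: given a mouse event in pixel coordinates, the application works out which of its objects the event landed on. A minimal sketch, with the object names and rectangle representation invented for illustration:

```python
def hit_test(objects, click):
    """Find which application object's rectangle contains a mouse
    event given in pixel coordinates; None if the click misses all."""
    x, y = click
    for name, (ox, oy, w, h) in objects.items():
        if ox <= x < ox + w and oy <= y < oy + h:
            return name
    return None

# Two hypothetical on-screen buttons, as (x, y, width, height):
screen = {'delete': (100, 200, 60, 20), 'save': (100, 230, 60, 20)}

assert hit_test(screen, (110, 210)) == 'delete'
assert hit_test(screen, (10, 10)) is None
```

A richer imaging model would hand the application the identified structure directly, rather than leaving this lookup to application code.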


Chapter 10 Implementation support 10.4 Using toolkits Page 390

As we discussed in Chapter 4, a key feature of WIMP interfaces from the user's perspective is that input and output behaviours are intrinsically linked to independent entities on the display screen. This creates the illusion that the entities on the screen are the objects of interest -- interaction objects, as we have called them -- and that illusion is necessary for the action world of a direct manipulation interface. A classic example is the mouse as a pointing device. The input coming from the hardware device is separate from the output of the mouse cursor on the display screen. However, since the visual movement of the screen cursor is linked with the physical movement of the mouse device, the user feels as if he is actually moving the visual cursor. Even though input and output are actually separate, the illusion causes the user to treat them as one; indeed, both the visual cursor and the physical device are referred to simply as 'the mouse'. In situations where this link is broken, it is easy to see the user's frustration.


Chapter 10 Implementation support 10.4 Using toolkits Page 390

In Figure 10.8, we show an example of how input and output are combined for interaction with a button object. As the user moves the mouse cursor over the button, it changes to a finger to suggest that the user can push it. Pressing the mouse button down causes the button to be highlighted and might even make an audible click like the keys on some keyboards, providing immediate feedback that the button has been pushed. Releasing the mouse button unhighlights the button and moving the mouse off the button changes the cursor to its initial shape, indicating that the user is no longer over the active area of the button.


Chapter 10 Implementation support 10.4 Using toolkits Page 390

From the programmer's perspective, even at the level of a windowing system, input and output are still quite separate for everything except the mouse, and it takes quite a bit of effort in the application program to create the illusion of an interaction object such as the button we have just described. To aid the programmer in fusing input and output behaviours, another level of abstraction is placed on top of the window system -- the toolkit. A toolkit provides the programmer with a set of ready-made interaction objects -- alternatively called interaction techniques, gadgets or widgets -- which she can use to create her application programs. The interaction objects have a predefined behaviour, such as that described for the button, that comes for free without any further programming effort. Toolkits exist for all windowing environments (for example, OSF/Motif and XView for the X Window system, the Macintosh Toolbox and the Software Development Toolkit for Microsoft Windows).
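A ready-made interaction object can be sketched as a class that bundles the button behaviour of Figure 10.8 -- cursor change on entry, highlight on press, action on release -- so that the application supplies only a label and a callback. The class and method names are illustrative, not any real toolkit's API:

```python
class Button:
    """Toolkit-style widget sketch: the press/release/enter/leave
    behaviour is predefined; the application supplies only a label
    and a callback. (Illustrative -- not a real toolkit API.)"""

    def __init__(self, label, on_push):
        self.label = label
        self.on_push = on_push
        self.cursor = 'arrow'
        self.highlighted = False

    def enter(self):
        self.cursor = 'finger'    # suggest the button can be pushed

    def leave(self):
        self.cursor = 'arrow'     # no longer over the active area
        self.highlighted = False

    def press(self):
        self.highlighted = True   # immediate feedback

    def release(self):
        if self.highlighted:      # only fires if still over the button
            self.highlighted = False
            self.on_push()

pushed = []
ok = Button('OK', lambda: pushed.append('OK'))
ok.enter(); ok.press(); ok.release()
```

The application programmer writes the callback; everything else -- the fused input and output behaviour -- comes for free from the widget.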




