HUMAN-COMPUTER INTERACTION SECOND EDITION
Dix, Finlay, Abowd and Beale


Search Results


Search results for toolkit
Showing 1 to 10 of 44


Chapter 3 The interaction 3.6 Elements of the WIMP interface Page 123

Together, these elements of the WIMP interface are called widgets, and they comprise the toolkit for interaction between user and system. In Chapter 10 we will describe windowing systems and interaction widgets more from the programmer's perspective. There we will discover that though most modern windowing systems provide the same set of basic widgets, the 'look and feel' -- how widgets are physically displayed and how users can interact with them to access their functionality -- of different windowing systems and toolkits can differ drastically.


Chapter 3 The interaction 3.7.5 Localization/internationalization Page 135

It is clear that words have to change and many interface construction toolkits make this easy by using resources. When the program uses names of menu items, error messages and other text, it does not use the text directly, but instead uses a resource identifier, usually simply a number. A simple database is constructed separately that binds these identifiers to particular words and phrases. A different resource database is constructed for each language, and so the program can be customized for use in a particular country by simply choosing the appropriate resource database.
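The resource-identifier scheme can be sketched in a few lines of Python. The identifiers, strings, and language codes here are invented for illustration; real toolkits store the tables in external resource files rather than in the program itself.

```python
# Per-language resource "databases": each binds numeric resource
# identifiers to the words and phrases for one language.
RESOURCES = {
    "en": {100: "File", 101: "Save", 200: "File not found"},
    "fr": {100: "Fichier", 101: "Enregistrer", 200: "Fichier introuvable"},
}

class ResourceTable:
    """Looks up interface text by identifier rather than embedding it."""

    def __init__(self, language):
        # Choosing a language simply selects a different resource
        # database; the program code itself is unchanged.
        self.table = RESOURCES[language]

    def text(self, resource_id):
        return self.table[resource_id]

ui = ResourceTable("fr")
print(ui.text(101))  # the program refers to the menu item only by number
```

The program never mentions the word 'Save' directly, so localizing it is purely a matter of supplying another table.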


Chapter 4 Usability paradigms and principles 4.2.3 Programming toolkits Page 146

4.2.3 Programming toolkits


Chapter 4 Usability paradigms and principles 4.2.3 Programming toolkits Page 147

Engelbart wrote of how humans attack complex intellectual problems like a carpenter who produces beautifully complicated pieces of woodwork with a good set of tools. The secret to producing computing equipment which aided human problem-solving ability was in providing the right toolkit. Taking this message to heart, his team of programmers concentrated on developing the set of programming tools they would require in order to build more complex interactive systems. The idea of building components of a computer system which will allow you to rebuild a more complex system is called bootstrapping and has been used to a great extent in all of computing. The power of programming toolkits is that small, well-understood components can be composed in fixed ways in order to create larger tools. Once these larger tools become understood, they can continue to be composed with other tools, and the process continues.
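The composition principle can be shown with a toy sketch (the functions here are invented for illustration, not Engelbart's tools): small, well-understood components are combined in a fixed way, and the result is itself a tool that composes further.

```python
def compose(f, g):
    """Combine two small tools into a larger one in a fixed way."""
    return lambda x: g(f(x))

# Two small, well-understood components ...
strip = str.strip
lower = str.lower

# ... composed into a larger tool ...
normalize = compose(strip, lower)

# ... which, once understood, composes again with other tools.
tokenize = compose(normalize, str.split)

print(tokenize("  Programming Toolkits  "))  # ['programming', 'toolkits']
```

Each level of composition can be treated as a black box at the next level up, which is exactly the bootstrapping effect described above.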


Chapter 4 Usability paradigms and principles 4.2.4 Personal computing Page 147

Programming toolkits provide a means for those with substantial computing skills to increase their productivity greatly. But Engelbart's vision was not exclusive to the computer literate. The decade of the 1970s saw the emergence of computing power aimed at the masses, computer literate or not. One of the first demonstrations that the powerful tools of the hacker could be made accessible to the computer novice was a graphics programming language for children called LOGO. The inventor, Seymour Papert, wanted to develop a language that was easy for children to use. He and his colleagues from MIT and elsewhere designed a computer-controlled mechanical turtle that dragged a pen along a surface to trace its path. A child could quite easily pretend they were 'inside' the turtle and direct it to trace out simple geometric shapes, such as a square or a circle. By typing in English phrases, such as Go forward or Turn left, the child/programmer could teach the turtle to draw more and more complicated figures. By adapting the graphical programming language to a model which children could understand and use, Papert demonstrated a valuable maxim for interactive system development -- no matter how powerful a system may be, it will always be more powerful the easier it is to use.
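The essence of such a turtle can be sketched without any graphics (a hypothetical interpreter in the spirit of LOGO, not Papert's actual language): the turtle keeps a position and a heading, and a few simple commands teach it to trace a figure.

```python
import math

class Turtle:
    """Tracks position and heading; records the path it would draw."""

    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0            # degrees; 0 points along the x-axis
        self.path = [(0.0, 0.0)]

    def forward(self, distance):
        self.x += distance * math.cos(math.radians(self.heading))
        self.y += distance * math.sin(math.radians(self.heading))
        self.path.append((round(self.x, 6), round(self.y, 6)))

    def left(self, angle):
        self.heading = (self.heading + angle) % 360

# "Go forward, turn left" four times traces a square
# and leaves the turtle facing the way it started.
t = Turtle()
for _ in range(4):
    t.forward(100)
    t.left(90)
print(t.heading)  # 0.0
```

The child's mental model -- being 'inside' the turtle -- maps directly onto this egocentric state: every command is relative to where the turtle is and which way it faces.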


Chapter 4 Usability paradigms and principles 4.2.7 Direct manipulation Page 151

A consequence of the direct manipulation paradigm is that there is no longer a clear distinction between input and output. In the interaction framework in Chapter 3 we talked about a user articulating input expressions in some input language and observing the system-generated output expressions in some output language. In a direct manipulation system, the output expressions are used to formulate subsequent input expressions. The document icon is an output expression in the desktop metaphor, but that icon is used by the user to articulate the move operation. This aggregation of input and output is reflected in the programming toolkits, as widgets are not considered as input or output objects exclusively. Rather, widgets embody both input and output languages, so we consider them as interaction objects.
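The idea that a widget is neither purely input nor purely output can be sketched with a hypothetical widget class (invented for illustration, not from any particular toolkit): the same object both renders an output expression and interprets input directed at that rendering.

```python
class CheckboxWidget:
    """An interaction object: its displayed state is also the target of input."""

    def __init__(self, label):
        self.label = label
        self.checked = False

    def render(self):
        # Output language: the widget's on-screen expression.
        mark = "x" if self.checked else " "
        return f"[{mark}] {self.label}"

    def click(self):
        # Input language: the user acts on the rendered expression itself,
        # and the next output expression reflects that input.
        self.checked = not self.checked

box = CheckboxWidget("Remember me")
print(box.render())   # [ ] Remember me
box.click()
print(box.render())   # [x] Remember me
```

Neither method makes sense without the other, which is why toolkits treat the widget as a single interaction object rather than separate input and output objects.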


Chapter 5 The design process 5.3.2 Guidelines Page 196

Other popular graphical user interface (GUI) systems have published guidelines which describe how to adhere to abstract principles for usability in the narrower context of a specific programming environment. These guidelines are often referred to as style guides to reflect that they are not hard and fast rules, but suggested conventions for programming in that environment. Some examples are the OPEN LOOK and the Open Software Foundation (OSF) Motif graphical user interfaces, both of which have published style guides [233, 186]. Programming in the style of these GUIs involves the use of toolkits which provide high-level widgets, as we have mentioned earlier in this book and will discuss in more detail in Chapter 10. More importantly, each of these GUIs has its own look and feel which describes its expected behaviour. The style guides are intended to help a programmer capture the elements of the look and feel of a GUI in her own programming. Therefore, style guides for the look and feel of a GUI promote consistency within and between applications on the same computer platform.


Chapter 9 Models of the system 9.2.3 Model-oriented notations Page 343

The two major model-oriented specification notations in use today are Z and VDM. Both have been used for interface specifications. For example, Z has been used to specify editors [232], a window manager and a graphics toolkit called Presenter [240]. In the following description, we will follow the conventions defined in the Z notation. We do not assume any prior knowledge of Z; however, this chapter does not serve as a tutorial for the notation (interested readers should consult the Z reference manual for more details [226]).
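To give a flavour of the model-oriented style for readers who have never seen Z, here is an illustrative fragment (written in the notation of the zed LaTeX package; it is an invented example, not taken from the specifications cited above). A state schema declares the components of the state with an invariant; an operation schema relates the state before and after an operation, with primed names denoting the after-state.

```latex
% [CHAR] is a given set of characters.
% State of a one-line text editor: the text and a cursor within it.
\begin{schema}{Editor}
  text : \seq CHAR \\
  cursor : \nat
\where
  cursor \leq \# text
\end{schema}

% An operation: moving the cursor right changes only the cursor.
\begin{schema}{CursorRight}
  \Delta Editor
\where
  cursor < \# text \\
  cursor' = cursor + 1 \\
  text' = text
\end{schema}
```

The invariant (the cursor never passes the end of the text) must be preserved by every operation, which is the sense in which Z specifications are model oriented.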


Chapter 9 Models of the system 9.4.5 Example -- screen button feedback Page 371

Screen buttons activated by clicking the mouse over them are a standard widget in any interface toolkit and are found in most modern application interfaces. The application developer has little control over the detailed user interaction as this is fixed by the toolkit. So, the specific results of this example are most relevant to the toolkit designer, but the general techniques are more widely applicable.


Chapter 9 Models of the system 9.4.5 Example -- screen button feedback Page 371

We have two questions: why is this mistake so frequent, and why didn't she notice? To answer these we use status/event analysis to look at two scenarios, the first where she successfully selects 'delete', and the second where she does not. There are four elements in the analysis: the application (word processor), the button's dialog (in the toolkit), the screen image and the user (Alison). Figures 9.6 and 9.7 depict the two scenarios, the first when successful -- a hit -- and the second when not -- a miss.
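The hit/miss distinction can be sketched as a hypothetical toolkit dialog for the button (the class, events and their names are invented for illustration): the button fires only if the mouse is released while still over it, so a small slip between press and release silently turns a hit into a miss.

```python
class ScreenButton:
    """Dialog for a screen button: press arms it, release over it fires it."""

    def __init__(self, action):
        self.action = action
        self.armed = False        # mouse went down over the button
        self.highlighted = False  # screen feedback while armed

    def mouse_down(self, over_button):
        if over_button:
            self.armed = True
            self.highlighted = True

    def mouse_up(self, over_button):
        fired = self.armed and over_button
        self.armed = False
        self.highlighted = False
        if fired:
            self.action()         # hit: the application is notified
        return fired              # miss: nothing happens

events = []
button = ScreenButton(lambda: events.append("delete"))

# Hit: press and release over the button.
button.mouse_down(over_button=True)
button.mouse_up(over_button=True)

# Miss: the mouse slips off the button before release.
button.mouse_down(over_button=True)
button.mouse_up(over_button=False)

print(events)  # ['delete'] -- only the first click selected 'delete'
```

Note that the highlight is cleared on release in both scenarios, so the final screen image looks the same whether the click was a hit or a miss -- one reason Alison fails to notice the error.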

