Human-Computer Interaction 3e - Dix, Finlay, Abowd, Beale

online!    Moore's law

see page 115. Box: Moore's law

Moore's law is one of those things that gets quoted and misquoted ... and to add to this we misquoted it in the book :-(

We quoted the 'law' as saying that the speed of processors was doubling approximately every 18 months. To be precise, Moore, in his 1965 paper, said that the density of components appeared to be doubling every year. To be fair, Intel's own pages on Moore's law quote it as a 'doubling of transistors every couple of years' - as manufacturers of chips they perhaps do not want to be hostages to fortune! In fact, their own graphs show that the Intel processor families have indeed managed a doubling rate only a little slower than every year.
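
Because the quoted doubling period varies - every year, every 18 months, every couple of years - the long-term projections diverge enormously. Here is a minimal sketch in Python comparing the three; the starting value is arbitrary, only the growth factors matter.

    # Why the quoted doubling period matters: project growth over 10 years
    # for 12-, 18- and 24-month doubling times.

    def growth_factor(years, doubling_months):
        """Factor by which a quantity grows if it doubles every doubling_months."""
        return 2 ** (years * 12 / doubling_months)

    for months in (12, 18, 24):
        print(f"doubling every {months} months -> "
              f"x{growth_factor(10, months):,.0f} after 10 years")

    # doubling every 12 months -> x1,024 after 10 years
    # doubling every 18 months -> x102 after 10 years
    # doubling every 24 months -> x32 after 10 years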

In fact Moore's argument was a little more subtle than a simple graph of the number of transistors over time. He looked at the component density that was most economical to manufacture at various points in time, since it is this, rather than the theoretical maximum, that drives actual devices.

Processor speed is closely related to density: smaller components have fewer electrons to move and are closer to one another, meaning shorter transmission delays. However, faster speeds also mean more heat is produced. Moore thought that the smaller size of components would mitigate this, but in recent processor chips heat dissipation has become an increasing problem.

Memory storage also shows Moore's-law-like growth. For silicon-based RAM this is because of the tighter packing of components, just as for processors. For magnetic media, independent technical advances mean that we see similar doubling cycles.

the data

The tables of typical memory capacities given in the different editions of the HCI book demonstrate Moore's law very well. They were each based on a 'typical' PC at the time of writing. Remember that for each there is a lag between writing and the publication date, so the capacities quoted will have been out of date by the time the books hit the shelves!

Here they are:

                1993                   1998                   2004
                1st edition p. 81      2nd edition p. 91      3rd edition p. 109
                RAM       hard disk    RAM       hard disk    RAM        hard disk
capacity:       4 Mb      100 Mb       16 Mb     1 Gb         256 Mb     100 Gb
access time:    200 ns    10 ms        200 ns    10 ms        10 ns      7 ms
transfer rate:  10 Mb/s   100 Kb/s     10 Mb/s   100 Kb/s     100 Mb/s   30 Mb/s

The access and transfer times for the 2nd edition look strangely similar to the 1993 figures - I have a horrible feeling we forgot to update these :-( So ignoring these and moving quickly on ...

The RAM capacity increased 64-fold over the 11-year period between these editions - that is, doubling roughly every two years - whilst the disk capacity increased 1000-fold over the same period, a doubling every year!
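
For the record, here is a quick check of those doubling times in Python, using the capacities from the table above (1993 to 2004, an 11-year gap):

    import math

    # Capacities from the 1st (1993) and 3rd (2004) edition tables above.
    years = 11
    growth = {"RAM": 256 / 4,               # 4 Mb  -> 256 Mb : 64-fold
              "hard disk": 100_000 / 100}   # 100 Mb -> 100 Gb : 1000-fold

    for name, factor in growth.items():
        doublings = math.log2(factor)
        print(f"{name}: x{factor:.0f} in {years} years "
              f"-> doubling every {years / doublings:.1f} years")

    # RAM: x64 in 11 years -> doubling every 1.8 years
    # hard disk: x1000 in 11 years -> doubling every 1.1 years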

Ignoring the suspect 2nd edition timing figures, it is evident that access time, that is the delay between requesting some data and getting it to the processor, has only decreased marginally for hard disks. This is limited by the physical speed at which the disk can rotate and the disk head can move around the disk surface. Motor technology does not advance as fast as silicon! However, the disk transfer rates have increased 300-fold, similar to the capacity increase. This is because gains in disk capacity are largely about squashing more bits onto each track. Whilst disk rotation speeds are only slightly faster, once the start of a block of data has been found (the access time), the amount that passes under the read head in a single rotation is much greater, hence the very fast transfer rate.
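
To make the rotation argument concrete, here is a rough sketch in Python. The transfer rates come from the table above; the rotation speeds (5400 rpm for a typical 1993 drive, 7200 rpm for a 2004 one) are assumed, illustrative figures, not from the book.

    # How much data passes under the read head in a single rotation?
    drives = {
        "1993 drive": {"transfer_kb_per_s": 100,    "rpm": 5400},  # 100 Kb/s
        "2004 drive": {"transfer_kb_per_s": 30_000, "rpm": 7200},  # 30 Mb/s
    }

    for name, d in drives.items():
        rotation_s = 60 / d["rpm"]                           # seconds per rotation
        kb_per_rotation = d["transfer_kb_per_s"] * rotation_s
        print(f"{name}: {rotation_s * 1000:.1f} ms per rotation, "
              f"~{kb_per_rotation:.0f} Kb per rotation")

    # 1993 drive: 11.1 ms per rotation, ~1 Kb per rotation
    # 2004 drive: 8.3 ms per rotation, ~250 Kb per rotation

On these assumed figures the disk spins only about a third faster, but a couple of hundred times as much data passes the head on each turn.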

In contrast the access time for RAM has decreased 20-fold and the transfer rate increased 10-fold. These are somewhat slower than the increase in capacity, but of a similar order of magnitude. This is because, as with the arguments for processor speed, the increased density allows faster operation. One of the main things that slows down RAM access is that the data has to be transferred across the circuit board of the PC to the processor. With processor clock speeds approaching 3 GHz, even a beam of light can only travel 10 cm in one clock cycle; at these speeds the motherboard looks like a big place! Electrical signals are slower still, hence memory has to operate an order of magnitude more slowly. On-chip caches work because they are both close to the processor and do not have to generate the larger currents needed to send signals across the circuit board.
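
The 10 cm figure is a one-line calculation:

    # Distance light travels in one clock cycle of a 3 GHz processor.
    speed_of_light = 3.0e8        # metres per second
    clock = 3.0e9                 # cycles per second (3 GHz)

    print(f"{speed_of_light / clock * 100:.0f} cm per clock cycle")   # 10 cm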

what does this mean ...

Partly this is just more of the same - you can store more pictures, more text, more audio, etc. However, quantitative differences in capacities and speeds can eventually lead to qualitative differences in what can be done. The now ubiquitous MP3 music players, USB memory keys and digital camera flash memory are only possible because cheap memory capacity has exceeded certain thresholds. Similarly, increases in hard disk capacity have made TiVo and similar disk-based video recorders possible. In hospitals, new X-ray storage facilities no longer need to store films physically, but can use banks of hard disks to store hundreds of thousands of images. As well as reducing the space required, this allows instant access by doctors at their own terminals, without transporting films around the hospital.
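
As an example of such a threshold, here is a rough calculation of how much music fits on a small flash-based player. The 128 kbit/s MP3 bit rate is a typical one; the 256 Mb capacity is just an illustrative figure for an early player, not from the book.

    bit_rate_kbits = 128                        # typical MP3 encoding
    capacity_mbytes = 256                       # assumed player capacity

    mbytes_per_minute = bit_rate_kbits * 60 / 8 / 1000    # ~0.96 Mb per minute
    hours = capacity_mbytes / mbytes_per_minute / 60
    print(f"~{hours:.1f} hours of music")                  # ~4.4 hours

Below that sort of capacity a solid-state music player is of limited use; once memory crosses the threshold, a whole product category becomes viable.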

Where does this end? A few years ago Alan wrote an article called "the ultimate interface and the sums of life?", in which he looked at the storage needed to record everything you heard or saw, for every moment of every day of your life. Projecting forward using Moore's law, the answer was that if recording began with a new-born baby, by the time that baby was 70 years old those complete memories could be stored on a single grain of dust. More amazingly, calculating 'up' from the number of atoms per bit stored, it looked as though this would be achievable using only fairly standard extensions of current technology!

If you don't believe this, then look at some of the technologies around us, even a couple of years on. A current high-capacity USB memory key can store nearly two weeks of ISDN-quality video and audio (the quality used for video conferencing). A pocket-sized hard disk can store a whole year, and a fairly standard desktop hard disk (500 Gb) 15 years' worth.

In fact, this is an oversimplification. To store things is one thing; to be able to access and index them efficiently is far more difficult. Simply storing things on hard disk would let you ask "I wonder what I was doing at 4pm on 4th April 2004?", but not "when did I have that dinner with Fred and Jenny?". For text documents a free-text index (indexing every word) typically consumes at least 10 times more storage than the original text. For video, audio and image data this is even more difficult: answering the second question would require the system to recognise who was in the images you have seen, not just record and replay them.
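
To give a flavour of what 'indexing every word' involves, here is a toy sketch in Python of a positional inverted index, the kind of structure free-text search is built on. Real indexes add compression, stemming, ranking and so on; this is only meant to show the shape of the data, and why the index can rival the original text in size.

    from collections import defaultdict

    def build_index(documents):
        """Map each word to a list of (document id, word position) pairs."""
        index = defaultdict(list)
        for doc_id, text in documents.items():
            for position, word in enumerate(text.lower().split()):
                index[word].append((doc_id, position))
        return index

    docs = {
        "2004-04-04-diary": "dinner with Fred and Jenny at the new restaurant",
        "2004-04-05-diary": "wrote up notes about the dinner afterwards",
    }

    index = build_index(docs)
    print(index["dinner"])
    # [('2004-04-04-diary', 0), ('2004-04-05-diary', 5)]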

the limits to Moore's 'law'

Moore never described his paper as proclaiming a 'law'; the phrase "Moore's law" was an invention of the media. Instead it was 'just' an observation of past trends and a medium-term projection. But the fact that this projection has continued reasonably consistently for 35 years and more since his original paper does suggest a fairly long-lasting effect!

However, there are limits to the exponential progression in the speed and capacity of memory and processors. As silicon devices get smaller, fewer and fewer electrons are used to store each bit of information. At some point various quantum mechanical effects start to become important: for example, if the electrical tracks on a chip become very small, electrons can effectively jump between tracks through the insulator (called tunnelling). The chip starts to behave more like a one-armed bandit than a well-oiled machine. On the other hand, these same quantum mechanical effects may be used to create new types of processors and memories that use the weirdness of the quantum world to solve problems that appear impossible with conventional technology.

Even if Moore's law continues for some time yet, does it mean we really can store more?

Here is a portion of Alan's hard disk where he stores papers and corresponding presentations, organised year by year. Now this is fairly rough data: some years he published more than others, occasionally he used different computers and only kept copies of final versions of papers, at other times there are lots of old versions, the papers are in different data formats, and so on. However, even taking this into account, and seeing past the very uneven trend, it is absolutely obvious that the trend is upwards and very steep. In fact this looks very like the tables of hard disk capacities. Over 10 years there is more than a 30-fold increase in disk use per year to store these papers ... and no, he isn't writing 30 times more! Happily hard disk capacity has increased 1000-fold over the same period, but it does give pause for thought.

How come there is such a huge increase? Well, it is partly because, as disks are bigger, there is less 'tidying up': old versions of documents in more recent years are left on the disk, whilst at earlier times only the final version was kept. However, this is only a small part of the change. Probably more significant is a greater use of high-quality images. Older papers at most had vector graphics and line drawings; more recent ones have high-quality screen shots and photographs. Given that the image resolution needed for good print reproduction has not changed significantly (300-1200 dpi), this exponential increase in use may level out ... but perhaps he will start using video clips in presentations, etc.

Competing with Moore's law there seems to be some form of Murphy's law whereby the data stored rises in proportion to the disk space available ... think of your own use over time: aren't your own disks always 70-90% full, no matter how big they are?

things to think about

Estimate the memory needed to store the things around you: your books, your music, perhaps your entire college or university library. For the latter you could separately estimate the text alone and the text plus figures. How many hard disks is your library? (One way such an estimate might start is sketched after these questions.)

How many of the devices and services that you use would not have been possible 10 years ago because of limits on processing speed or memory capacity?

Imagine everyone had a full record of every sight or sound in their lives. In what ways could this help them in their lives? What problems might it cause? What dangers might there be for privacy or civil liberties? Would you want it? :)

Look up on the Internet articles about the Watergate scandal in the US that led to the resignation of President Nixon. In particular look at the importance of the 'missing' tapes. Compare this to more recent enquiries in the UK and US over pre-Iraq-war intelligence - the importance of email archives, etc. Would complete records be a good or bad thing for government?
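
For the first question above, here is one way a back-of-the-envelope estimate might start, in Python. Every figure (characters per page, pages per book, number of volumes) is a made-up assumption; substitute your own.

    # A very rough starting point for the library estimate (text only).
    chars_per_page   = 2_500        # one densely printed page, ~1 byte per character
    pages_per_book   = 400
    books_in_library = 1_000_000    # a large university library

    bytes_per_book = chars_per_page * pages_per_book        # ~1 Mb of plain text
    total_bytes    = bytes_per_book * books_in_library

    disk_bytes = 100e9                                       # the 100 Gb disk in the table
    print(f"text only: ~{total_bytes / 1e12:.0f} Tb, "
          f"about {total_bytes / disk_bytes:.0f} disks of 100 Gb")
    # text only: ~1 Tb, about 10 disks of 100 Gb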

 

Alan Dix © 2004