An abridged version of this article appears as: Buxton, W. (2001). Less is More (More or Less). In P. Denning (Ed.), The Invisible Future: The Seamless Integration of Technology in Everyday Life. New York: McGraw Hill, 145-179.
Less is More (More or Less)
William Buxton
Buxton Design
Toronto, Ontario
Die Zukunft war früher auch besser.
(The future was better in the past, as well.)
Karl Valentin

Abstract
In April 1981 Xerox introduced the Star 8010 workstation, the first commercial system with a Graphical User Interface (GUI) and the first to use the “desktop” metaphor to organize a user’s interactions with the computer. Despite the perception of huge progress, from the perspective of design and usage models, there has been precious little progress in the intervening years. In the tradition of Rip van Winkle, a Macintosh user who awoke today, after having fallen asleep in 1984, would have no more trouble operating a “modern” PC than operating a modern car.
The “desktop” is still the dominant user interface paradigm. Equally durable is the “general purpose” nature of PC design, which assumes that we channel all our transactions (a diverse lot) through a single interface on a single computer.
While common discourse about digital media is dominated by the concept of convergence, we argue that from the perspective of the usage model, just the opposite concept, divergence, should be the dominant model. We argue that the diversity of web browsers tomorrow will match the diversity of “ink browsers” (a.k.a. paper) today.
Systems will have to be tailored to dynamically connect the user with artefacts relevant to the user's current actions -- and do so in a way, form, place, and cost appropriate to the user. This change is as inevitable as it is essential. In the end, it will leave us with new concepts of computers, communications, and computing -- and of computer science itself. Few computer science departments or computer firms appreciate the magnitude of the impending change. It is time for them to wake up.
Introduction: Rushing Slowly into the Future
As someone in the profession of designing computer systems, I have to confess to being torn between two conflicting sentiments concerning technology. One is a sense of excitement about its potential benefits and what might be. The other is a sense of disappointment, bordering on embarrassment, at the state of what is.
Despite the hyperbole surrounding new media and technology, I firmly believe that we are far behind where we might have otherwise been, and that our society is all the poorer as a consequence. Furthermore, my view is that our current path has very little hope of getting us to where we should and could be anytime soon, thereby prolonging the missed potential. For this to alter, our approach has to change.
Despite the increasing reliance on technology in our society, in my view, the key to designing a different future is to focus less on technology and engineering, and far more on the humanities and the design arts. This is not a paradox. Technology certainly is a catalyst and will play an important role in what is to come. However, the deep issues holding back progress are more social and behavioural than technological. The skills of the engineer alone are simply not adequate to anticipate, much less address, the relevant issues facing us today. Hence, fields such as sociology, anthropology, psychology and industrial design must be at least equal partners with engineering and technology in framing how we think about, design and manage our future.
While the growth of technology is certain, the inevitability of any particular "future" is not. Like mathematics, perhaps we should start to use the word "future" in the plural, futures, in order to reinforce the fact that there are a number of different futures that might be. The specific future that we build, therefore, will be more easily seen to be a consequence of our own decisions, and will, therefore, demand more concern with its design.
What follows is an attempt to establish a conceptual framework from which we can better understand the past and make more informed decisions about the future.
Dimensions of Change
I have begun by expressing disappointment at the state of the art, and at the slow rate of progress, in human terms, in the emerging information technologies and "new media."
Figure 1: The Xerox Star 8010 Workstation. Introduced in 1981. This is the first commercial system to utilise a "windows, icons, menus, pointer" (WIMP), or "graphical user interface" (GUI).
But given the general perception that technology is changing at such a breakneck speed that even experts have trouble keeping up, how valid is the suggestion that our progress is too slow, and that we could have done better? It all depends on the dimension along which we measure change. What is the relevant metric?
In the areas of microelectronics, telecommunications and materials science, for example, there is no question that there has been staggering change over the past few decades. But if we shift from the dimension of technology to the dimension of the user, we see something very different. Despite all of the technological changes, I would argue that there has been no significant progress in the conceptual design of the personal computer since 1981. To support this claim, look at the computer shown in the photograph in Figure 1, which dates from that year. My experience is that most computer users, including professionals, cannot identify the decade, much less the year, in which the photograph was taken! For how many other "fast changing" products is that true?
The computer shown is a Xerox Star 8010 workstation (Smith, Irby, Kimball, Verplank & Harslem, 1983). It incorporated all of the design and user interface characteristics of a contemporary personal computer: windows, icons, a mouse, and a CRT.1 In fact, there is an argument to be made that this 1981 machine was better designed from an ease of use perspective than most "modern" computers (so rather than progress, we may even have gone backwards in the intervening years!).2
Figure 2: The iMac from Apple Computer. The iMac typifies both the best and the worst in current computer design. On the one hand, it illustrates how departure from the status quo and investment in design can have a strong impact on the success of a product. On the other hand, it illustrates how superficial our investment in design has been. If you look past the candy colours and the sleek curves, what you see on the screen is essentially the same old GUI and conceptual model that was there on the Xerox Star in 1981.
Now I have the greatest respect for the innovators who made this machine possible. But I have to ask, "Did they get it so right that no further design or refinement was required?" I think not. What I feel is missing is the next wave of innovation - innovation that does to the Xerox Star what the Xerox Star did to its predecessors. This is something that I believe we have been capable of, yet have failed to do, for a number of years. This is also something that I feel has to be done before we can achieve the benefits that are so often offered, but so infrequently delivered, by this emerging technology.3

1 The system also incorporated email, a local area network and a networked laser printer.

2 It is significant that the launch of the Xerox Star was in 1981. This preceded by a year the first Conference on Human Factors in Computer Systems, which took place in Gaithersburg, Maryland. This meeting became the start of the ACM Special Interest Group on Human-Computer Interaction (SIGCHI), and the establishment of HCI as a distinct discipline. Observe that the landmark innovations of the Xerox Star were, therefore, made without the benefit of any significant literature in the field of HCI. And yet, despite a huge body of literature being generated in the intervening years, none of it has had any impact on the design of personal computers, relative to the impact of the Star. What does that tell us about the discipline and the industry? For me, it is cause for serious concern.

3 The Xerox Star is a data point supporting what I call "The Law of Inertia of Good Ideas." This says: the better an idea in its time, the more it holds back progress in the future. It is relatively easy to displace bad ideas and bad design. Really good ideas, such as the QWERTY keyboard or the GUI, however, take hold and are extremely difficult to replace.
One of the motivations for this essay is to put forward a view on how we can bring the progress of the design and benefits of computational devices more in line with the progress of the underlying technologies and their unfulfilled potential. In order to accomplish this, we need to delve a little deeper into the nature of the changes that have been taking place.
A little practical fieldwork will help us here. The exercise is this: ask 10 people what they think the most significant changes have been in computers over the past 15 years. If the list that you thus obtain is like mine, it will look something like:
1. Smaller: computers are much smaller than they were, making them portable, among other things
2. Faster: we can do things on small machines that used to take huge machines
3. Cheaper: the cost of computation is falling dramatically
4. More of them: the number of computers has exploded. The population of microprocessors in the world is now about three times the population of humans.
5. Networked: our machines can communicate with one another, with faster speed and increasingly using wireless means
6. Location/Motion Sensing: our devices are starting to have the capacity to know where they are, both geographically (such as the GPS-equipped navigation computer in some cars) and "socially". Social awareness comes in two forms: technological and human. Devices are developing the capacity to have an awareness of what other devices are in (what Microsoft calls) the "Network Neighbourhood" (what I call the "society of appliances"), and in terms of the human social context.
7. Input/Output (I/O) Transducers: the input/output devices available are changing dramatically, offering a range of options from printers, scanners, etc. to the user, thereby opening the opportunity to redefine the nature of what constitutes a "computer terminal".
When I have done this exercise, I typically get the same basic results whether it is a layperson or a technology professional that I poll. The greatest consistency is with the first three or four items. Further down the list the responses are fuzzier and less consistent.
This is significant, since I would argue that the items are listed in inverse order of importance. The things that come first to mind are the least important, and the things that come last to mind and are most vague are the most important. I see this discrepancy between consciousness and importance as pointing to the root of the stagnation in computer design. Am I suggesting that the improvements in size, speed and cost of microelectronics are not important? No. Rather, my argument is that there are so many resources allocated to solving the underlying problems along these dimensions that the improvements will happen regardless of what you or I do. They have momentum, and verge on the inevitable. What is not inevitable, at least in the short term, are some of the opportunities that they afford when coupled with the things at the bottom of the list - things that do not have adequate resources or attention being paid to them.
This brings us to the trap inherent in the above list of changes.
The models and language that we use to articulate, or discuss, things frame our perceptions and ways of thinking. As long as we discuss design in terms of this list, our perspective, like the list itself, will have a technocentric bias. To break out of our current rut, we need to recast our list of changes using a human-centric perspective, which reflects the importance of usage and activity rather than technology:
• Who is using the computer
• What they are doing
• Where they are doing it
• When they are able to do it
• Why they are doing it
• How they do it
These are the questions that matter most and can guide design towards the right solution in the right form for the right person in the right location at the right time and at the right cost. They prompt a concern for design that reflects respect for human skill at all three levels: motor-sensory, cognitive and social (Buxton, 1994).
Bridging the Two Solitudes of the Physical and the Virtual
Figure 3: Bridging between the two solitudes of the physical and the virtual domains. (Image courtesy of Gray Holland)
In the previous section, I argued that changes in input and output (I/O) technologies constituted perhaps the most important dimension of change in terms of defining the nature of future technologies. Why is that?
One of the most significant issues confronting computer users, as illustrated in Figure 3, is the problem of bridging the gap between the physical and virtual worlds. For most activities, most current systems make it too difficult to move the artefacts back and forth between the two worlds, the physical and virtual. Hence, the relevant documents, designs, etc. are isolated in one or the other, or split between the two.

With appropriate design, however, the I/O technologies of future systems will be designed so as to absorb and desorb the artefacts relevant to the intended activity, thereby providing a much more seamless bridge between the two solitudes.
Tuning the I/O for specific activities is contrary to most current design, which follows what might be called The Henry Ford School of Design. Just as Ford is reputed to have said about his automobiles, "You can have it in any colour you want as long as it is black," so today's computer designers say, "You can have it in any form you want as long as it has a keyboard, a display and a mouse, regardless of what you are doing." Hence, temporarily assuming the role of an anthropologist examining the tools of a society, we notice that there is little significant difference in the tools used by the physician, accountant, photographer, secretary, stockbroker, architect, or ticket agent. Instead of specialisation, a one-size-fits-all approach to design prevails.
As we shall see going forward, I/O devices are key to our ability to tailor computer systems to the specific needs of various users. Hence, their place at the top of my list in importance, at least insofar as technology is concerned. But technology is only of secondary importance. It is the human's capability, intent and need that should be (but too often is not) the driving function in all of this.
On Complexity, Skill and Human Limitations
If the human should be the centre of focus in the design of technology, how can we get a better understanding of the relevant, but too often neglected, issues, especially as they relate to complexity and design?
Let us begin with a series of examples that I have used over the past few years, beginning with the graph shown in Figure 4. This is an approximation of the often-cited Moore's Law, which states that the number of transistors that can fit on a chip will double every 18 months. The graph simplifies this to state simply that there will be more technology tomorrow than there is today. So far, so good.
Figure 4: Moore's Law: the growth of technology as a function of time. The simple interpretation is that there will be more technology tomorrow than there is today.
The next graph in the sequence is shown in Figure 5. It illustrates what I immodestly will call Buxton's Law of Promised Functionality, which states that the functionality promised by technology will grow proportionally with Moore's Law. In layperson's terms, this simply means there is going to be more functionality promised/offered tomorrow than there is today.
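For those who prefer formulas to graphs, the two laws can be given a rough symbolic form. The formalisation below is purely illustrative, and the symbols are introduced here for convenience: $T_0$ is the transistor count at an arbitrary reference date $t_0$, $t$ is measured in years, and $k$ is an unspecified positive constant.

\[
T(t) = T_0 \cdot 2^{(t - t_0)/1.5} \qquad \text{(Moore's Law: doubling every 18 months)}
\]
\[
F(t) = k \cdot T(t) \qquad \text{(Law of Promised Functionality: growth proportional to Moore's Law)}
\]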
At this point, readers will be excused if they are wondering why I am wasting time and space stating the seemingly obvious, and using quasi-scientific graphs in the process.