Sunday, September 9, 2007

Interfaces - what are they and what do they do?

Interfaces...what are they?
by Andre Hayter (3233282)

Interfaces are everywhere. Every time we interact with something in order to achieve something else, we are using an interface. This statement is intentionally vague, reflecting the fact that interfaces can take an almost unlimited number of forms. Interfaces between humans and machines are cleverly called HMIs (human-machine interfaces). Computer interfaces today are characterized by the use of GUIs (graphical user interfaces). However, these are just the tip of the iceberg, even within their own classification: Wikipedia lists twenty different types of interfaces, and that is not meant to be an exhaustive list. There is a whole spectrum, from cryptic CLIs (command-line interfaces) that require the user to learn abstract commands, to what are called zero-input interfaces, which attempt to ascertain what users want without any direct input, through the use of sensors. I will illustrate several of these types with the following examples.

Benjamin Wooley gives us a quick overview of the development of the graphical user interface, or GUI ("gooey"). He points out that the main problem of "interfacing" with computers is "the point of contact between the user and the machine" (1). Rectifying this problem, however, involves a number of factors: social, economic, and technological. These lead to situations where the interface adopted by the mainstream is not necessarily the best form. An example is the keyboard. So much had been invested in developing the QWERTY keyboard and spreading its use (in terms of hardware costs as well as training costs) that even though a better keyboard (the Dvorak keyboard, named after one of its developers, Dr. August Dvorak) was designed and patented in 1936, it was never widely adopted.

  • According to Wikipedia, Barbara Blackburn holds the Guinness World Record for speed typing: she maintained 150 wpm for 50 minutes and has a peak recorded speed of 212 wpm. (3)
There are many reasons for this, but some are less obvious than simple retraining issues. The gaming industry has co-opted certain keys on the keyboard for other functions, like W, A, S, and D for moving characters forward, left, backward, and right. Using a Dvorak keyboard means changing these key assignments, as they no longer form the inverted-T pattern that they do on the QWERTY keyboard (2). The point is that interfaces are not always chosen for their ultimate usability, and that interfaces become co-opted for uses they were not originally designed for. This highlights one of the points made by Matthew Fuller: that interfaces cannot be seen as separate from the hardware environments they reside in. In other words, we can't separate the act of typing as a mode of "interacting" with the computer from the keyboard itself, not to mention the processor, the screen, and so on.
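To make the remapping problem concrete, here is a small Python sketch of what a game expecting QWERTY's W/A/S/D would receive when the standard US Dvorak layout is active. The translation table below covers only the keys relevant here; it is an illustration, not a full layout table:

```python
# Characters the OS reports for a few physical key positions when the
# standard US Dvorak layout is active (keys named by their QWERTY labels).
QWERTY_TO_DVORAK = {
    "q": "'", "w": ",", "e": ".", "r": "p", "t": "y",
    "a": "a", "s": "o", "d": "e", "f": "u", "g": "i",
}

def keys_seen_by_game(physical_keys):
    """Return the characters a game receives for these physical keys."""
    return "".join(QWERTY_TO_DVORAK[k] for k in physical_keys)

print(keys_seen_by_game("wasd"))  # → ,aoe
```

The inverted-T keys report comma, A, O, and E, so a game bound to character input (rather than physical key codes) must be remapped.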
As Bill Buxton (a pioneer in interface technology) points out, a similar situation occurred with the mouse, which was invented by Engelbart and English in 1965 but did not come into mainstream use until the advent of Windows 95. Yes, there are exceptions like the Xerox Star and PERQ workstations, but statistically they are not significant.
(See his website for a great overview of all things interface-related.) (4)

So what am I saying with all of this? That the interface is a slippery beast. No interface is good at everything; the best we can do is make sure it is good at something. The primary method used today to achieve this is the metaphor. Metaphors are used to give context to commands that might otherwise be very cryptic, like the example Wooley gives of the copy command in the CP/M operating system: PIP. This seems silly to us, acclimated as we are to well-thought-out metaphors, but we should not be so quick to judge. The legacy of development comes back to haunt most operating systems.

Tim Rohrer (5) wrote "Metaphors we compute by: bringing magic into interface design" in 1995, in which he comments on the inconsistency of the Mac OS Trash Can as part of its desktop metaphor. Even if we ignore the fact that trash cans do not literally sit on desktops, he points out that ejecting a disk from the machine by dragging it to the trash can stretches the limits of the metaphor. It is a metaphor, or magic, that goes too far: it makes sense to delete a document that way, but not to eject a disk. He illustrates this point with his personal experience teaching users the interface and the obvious discomfort it brought up. He argues that the problem stems from the idea that the mind is separate from the body and as such can be transmitted through language to another person. We can explain a concept to someone and thus have the "same idea" in mind, so an interface designer can get someone to think the same way they do about the interface. The problem is that our minds are not separate from our bodies; they are informed and affected by them. Therefore we should not try to ignore the "gut feeling" that ejecting a disk by trashing it is wrong. Matthew Fuller brings this idea full circle by saying that interfaces cannot be separated from the physical systems they are a part of, much as we cannot dissociate our minds from our bodies. (6) Our nervous system is actually bidirectional: it sends signals out to the sense organs that affect the way we perceive. In a similar way, interfaces are affected by their hardware environments. We have all experienced the shock of printing a picture we have loaded or created on our computer and seeing that the colors do not match. This is because we have not "normalized" the interface by taking into account the hardware of the monitor.
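To give a sense of what that "normalization" involves, here is a small sketch (an illustration only; real color management uses full ICC profiles and device gamuts) of the sRGB transfer function that relates the pixel values we store to the light a monitor actually emits:

```python
def srgb_to_linear(c):
    """Decode an sRGB channel value (0..1) to linear light output."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode linear light (0..1) back to an sRGB channel value."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# A "mid-grey" stored value of 0.5 corresponds to only ~21% of full
# light output — one reason screen and print can disagree so badly
# when the device's transfer function is ignored.
print(round(srgb_to_linear(0.5), 3))  # → 0.214
```

A printer driver that treated stored pixel values as linear ink amounts, without this decoding step, would produce exactly the kind of color mismatch described above.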

To sum up: a good interface must follow at least two basic rules. It must be self-consistent, and the metaphors it uses must not contradict the user's intuition or social context.

This intuition can be physical as well as mental. The military has been developing an exoskeleton that enables a soldier to carry far more than he or she normally could. In addition, as University of California, Berkeley mechanical engineering professor H. Kazerooni explains, "the pilot is not 'driving' the exoskeleton. Instead, the control algorithms in the computer constantly calculate how to move the exoskeleton so that it moves in concert with the human." (7) The interaction with the interface is rendered not only invisible but, by design, in accord with the user's intuitive understanding of how to interact with it.
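The flavor of such a control loop can be sketched very simply. This is purely illustrative — a proportional "assist" law with stand-in numbers, not Kazerooni's actual algorithm:

```python
def assist_torque(human_torque, amplification=10.0):
    """Motor supplies (amplification - 1) times the sensed human effort,
    so the wearer feels only 1/amplification of the total load."""
    return (amplification - 1.0) * human_torque

load = 100.0   # torque the carried load demands (N*m, made-up figure)
felt = 10.0    # effort the human actually supplies
motor = assist_torque(felt)

# Human effort plus machine assist carries the full load: the machine
# follows the human's movement rather than being "driven" by commands.
assert felt + motor == load
```

The key design point matches the quote: the human never issues commands; the controller measures the human's own effort and amplifies it.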

[images from Machine Design, published 12/8/2005] (7)

This type of physical augmentation is found in surgery, where surgeons use robotic implements guided by their natural movements. This can even be done remotely, from a continent away. That leads us into the realm of augmented reality, where the user is interacting with the real world (usually a physical element of an interface) and the interface itself. This blurs the line of where one begins and the other ends, and further illustrates Fuller's point that the interface cannot be separated from the elements that make it up and the hardware environment in which it resides. In the example of remote surgery, ignoring the fact that the data is traveling down a fiber-optic cable in the ocean could lead to disastrous consequences. Even if the user does not need to be aware of this, the designers surely must.

Mixed Reality:
Here is an example of mixed reality that tries to work within our standard senses to manipulate the metaphors on screen. Most of these interface types make use of what are called tangible user interfaces, meaning that they require the manipulation of physical objects. This is also an example of what is called haptic feedback, which is likewise found in the gaming industry.
YouTube clip provided by glueckauf. This technology is being designed by an Austrian company, Kommerz.

[ video courtesy of youtube:] (8)

Here is an art piece/video from Metacafe, posted by zloyshaman, that plays with this idea of mixed reality (9):
This is a great piece that shows the dichotomous nature of immediacy and hypermediacy. It highlights the effort to render the interface "invisible" or "natural" through the use of metaphors for real physical objects. The irony is that those objects can themselves be interfaces, and metaphors for other "real" objects. We find this situation spreading further and further as we realize that interfaces are everywhere and in most things.

[video courtesy of metacafe:] (9)

Perceptive Pixel: a company set up to commercialize Jeff Han's research into multi-touch interfaces using FTIR (frustrated total internal reflection).
Perceptive Pixel (10)
"The interface kinda disappears." - Jeff Han, ~06:24 (11)

[video courtesy of youtube:] (11)
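For a rough sense of what happens behind an FTIR surface: a camera watches the underside of the screen for the bright spots where fingertips frustrate the internal reflection, so touch detection reduces to thresholding and blob-finding in each camera frame. Here is a toy sketch (frame as a list of rows of brightness values; real trackers are far more sophisticated):

```python
def find_touches(frame, threshold=128):
    """Return (row, col) centroids of bright blobs in a grayscale frame.

    A toy stand-in for the blob tracking behind an FTIR touch surface:
    threshold the image, then group bright pixels by flood fill.
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    touches = []
    for r in range(h):
        for c in range(w):
            if frame[r][c] >= threshold and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:                      # flood-fill one blob
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and
                                not seen[ny][nx] and frame[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                touches.append((cy, cx))          # blob centroid = touch point
    return touches
```

Each centroid becomes a touch point that the interface layer can track from frame to frame.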

These are good examples of the drive to render the interface invisible: to absorb the metaphorical nature of computer interfaces into what is considered a standard. The two-finger approach to zooming in on a picture is becoming standard in this type of multi-touch interface, even though it is (or because it is) a metaphor within a metaphor. We are familiar with seeing photos and being able to bring them into the computer and manipulate them with software like Photoshop, but there is no real-world equivalent of zooming with your fingers; it is a metaphor for the zoom tool used in software. So there is a kind of feedback loop between natural tendencies and learned behaviors that is a common theme throughout interfaces.
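The pinch gesture itself is simple geometry: the zoom factor is just the ratio of the fingers' current separation to their separation when the gesture began. A minimal sketch:

```python
import math

def pinch_zoom(initial_points, current_points):
    """Scale factor implied by a two-finger pinch: the ratio of the
    current finger separation to the separation at gesture start."""
    def dist(pts):
        (x1, y1), (x2, y2) = pts
        return math.hypot(x2 - x1, y2 - y1)
    return dist(current_points) / dist(initial_points)

# Fingers start 100 px apart and spread to 200 px: the image doubles.
print(pinch_zoom([(0, 0), (100, 0)], [(0, 0), (200, 0)]))  # → 2.0
```

Part of why the gesture feels "natural" is that this mapping is continuous: the image tracks the fingers at every instant rather than jumping in steps.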

Multi-touch interfaces are starting to be sold in the mainstream with products like the Lemur and Microsoft's Surface, but they have been in use in the arts and in research for quite some time. Here is an example of an interface used in a more "mainstream" art application: a Bjork concert at Coachella 2007 - "Pluto":

[video courtesy of youtube:] (12)

MAX/MSP, PureData, vvvv:
These are all examples of programs with very different interfaces for creating software. They are generally referred to as "patchers," as they work by patching blocks together; the blocks represent code snippets, actions, or datatypes. They still require the user to understand the logic and structure of coding, but there is less burden from having to learn language-specific syntax.
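The patching idea can be sketched in a few lines of Python. This toy mimics the dataflow model — blocks wired together by patch cords, with values pulled through the graph — not the actual implementation of MAX/MSP or Pd:

```python
# A toy "patcher": each block wraps a function; patch cords connect the
# outputs of upstream blocks to a block's inlets.
class Block:
    def __init__(self, fn):
        self.fn = fn
        self.inlets = []          # upstream blocks patched into this one

    def patch_from(self, *sources):
        self.inlets = list(sources)
        return self

    def pull(self):
        # Evaluate upstream blocks first, then apply this block's function.
        return self.fn(*(src.pull() for src in self.inlets))

# Two "number boxes" feeding a multiply block, as in a frequency patch:
freq = Block(lambda: 440.0)
ratio = Block(lambda: 1.5)
mult = Block(lambda a, b: a * b).patch_from(freq, ratio)
print(mult.pull())  # → 660.0
```

The user of a patcher rearranges the wiring rather than editing text, which is exactly the syntax burden these environments remove.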

This image comes from the University at Albany (State University of NY). It is part of their listing of software programs available to students in their electronic music program.

They all work in a similar way, and some have tools for working with video as well as sound. They allow artists to work in ways they might not be able to without the help of a programmer, much as Microsoft Word or Adobe InDesign lets users do typography without special training. Many would argue that without training you still can't do layout or typography even with these programs, but the interfaces let us attempt it more easily.

An example from DJ Timski's website showing the integration of MAX, Ableton Live and the Wiimote controller. An example of a reworking of interfaces to create a new result.

[video courtesy of youtube:] (14)

Processing is a more traditional coding environment (although with a simplified language) that is geared toward doing visual "processing." It is interesting that many people are using it to make interfaces, especially experimental interfaces that aim to change the way we interact with data, particularly data from the internet. The software is being created under an open-source model and is available online. (13)
The Universe by Jonathan Harris of Daylife is an example of an interface created using this software.

Interactive Spaces:

Christopher Janney's Sonic Forest, as exhibited at the Bonnaroo Festival in Manchester, TN, is an example of hypermediacy in an interface. The interface is brought directly into the foreground, as "users" are allowed to wander through the forest and interact with the "trees," making sounds and triggering lights. Where this exhibit fails, in my opinion, is that the "interaction" is very vague: we have no metaphor for "interacting" with trees, or for creating sound by merely moving, and it is very difficult to tell who is doing what. Although technically it is an inspiring and interesting work, and sonically it is pleasing, in terms of interface it is frustrating.

[video courtesy of youtube:] (15)
Chunky Move Dance Company recently created the piece "Glow," which (invisibly) creates a user interface that lets the dancer generate the sound and light elements of the performance in real time. The dancer's movements create and manipulate the sound and the projected images. The nature of the interaction is much more direct and obvious; although it is second-hand, it is a much less frustrating interaction experience than the Sonic Forest.
Glow video (16)

The next example is a project built by the MESO Digital Interiors design team for an exhibit in which visitors to a fair learned facts and figures about the current energy debate. The nice thing about this interface idea is that the entire room is incorporated, using lighting that reacts to the state of the interactive table. This provides feedback in a deeper way than is usually experienced, and achieves a sort of virtual-reality immersion in a very non-intrusive way. Intrusiveness is what seems to be holding back VR; as Wooley quotes Krueger in "Interface," "the idea that people are going to put on gloves and scuba gear to go to work in the morning at least requires some skepticism" (1). (17)

Playing with the concept of interface or "Food for Thought":
These examples are works by artists who, intentionally or not, bring the idea of the interface to the foreground. They force us to think about the ways in which interfaces work, both successfully and unsuccessfully.

  1. Benjamin Wooley, “Interface”, Virtual Worlds (Oxford: Blackwell, 1997), 138-149
  2. “Dvorak Simplified Keyboard - Wikipedia, the free encyclopedia,” (accessed September 20, 2007).
  3. “Barbara Blackburn - Wikipedia, the free encyclopedia,” (accessed September 20, 2007).
  4. “Multi-Touch Systems that I Have Known and Loved,” (accessed September 20, 2007).
  5. “Metaphors we compute by: bringing magic into interface design,” (accessed September 20, 2007).
  6. Matthew Fuller, “The Impossibility of Interface”, Behind the Blip (New York: Autonomedia, 2003), 99-120
  7. “Giving soldiers a high-tech leg up,” (accessed September 20, 2007).
  8. “YouTube - Mixed Reality Interface,” (accessed September 20, 2007).
  9. “Reality Desktop Video,” (accessed September 20, 2007).
  10. “Perceptive Pixel Video Player,” (accessed September 20, 2007).
  11. “YouTube - Jeff Han on TED Talks,” (accessed September 20, 2007).
  12. “YouTube - Bjork uses the reacTable for Pluto,” (accessed September 20, 2007).
  13. “The Software,” (accessed September 20, 2007).
  14. “YouTube - WiiJ Timski demonstration video 1,” (accessed September 20, 2007).
  15. “YouTube - Sonic Forest Bonnaroo 2005,” (accessed September 20, 2007).
  16. “MySpaceTV: Chunky Move,” (accessed September 20, 2007).
  17. “vvvv: a multipurpose toolkit : meso-ISH 07,” (accessed September 20, 2007).