Disappearing Computer (2002)


In 2002, I wrote this dissertation as part of my B.Sc. at UEA. I've kept this edition as close to the original as possible. I've added links (where they still survive) and inserted a few comments where I was ludicrously wrong or unexpectedly right. This paper is not especially well-written and, if memory serves, received only an adequate mark. Terence Eden - 2016

Executive Summary

This project report draws on many disparate sources to investigate:

  • The future of the computer.
  • How computers can adapt to meet our needs.
  • How computers are currently combining with our environment.
  • How computers will combine with our environment in the future.
  • The potential consequences of this merge.
  • What issues need to be addressed now to prevent mistakes in the future?
  • Whether increased integration between computers and the environment can be beneficial to society.

This report concludes that, while current ideas on The Disappearing Computer have some significant flaws, the prospect of a digitally integrated world can be beneficial, if the projected difficulties can be overcome.

This project has been akin to finding a cutlery-set in a silver mine. This report is a distillation of a great deal of information done by a great many people.

I've no idea what I was trying to say with that!

1. Introduction

1.1. What is The Disappearing Computer?

“The personal computer has evolved historically to become the standard tool for doing things, despite its many flaws, despite its complexity, despite the fact that it is ill-suited for many of the tasks that it performs.” Norman 1993

At the time of writing (2001 - 2002) computers are at their most visible. They are the dull beige boxes sitting beneath our desks, they are the automated systems that incorrectly tell us the balance of our bank accounts and they are the shiny tools that require reams of instructions and weeks of lessons before productivity can begin. They are everywhere, trying to do everything; and it's very noticeable.

There are some computers that have ceased to be noticeable. They have vanished inside watches that so cleverly display the time, into fridges that keep food at a satisfactory temperature, and into telephones that convert analogue speech into a stream of digital data.

Over the last 60 years our relationship with computers has evolved from the monolithic box which served several people, to a slightly more equitable relationship where one computer serves one person. Mark Weiser, the “father” of The Disappearing Computer, believes that these first two waves of computing are coming to an end.

"The third wave, just beginning, has many computers serving each person everywhere in the world. I call this last wave 'ubiquitous computing'." Weiser 1996

The key to this concept of Ubiquitous Computing is to create a mesh of devices and services that do not require us to alter the way in which we live. Once our computers are as non-intrusive as possible, we can get on with our work without having to worry about them.

1.2. What The Disappearing Computer is not.

“Ubiquitous computing is roughly the opposite of virtual reality” Weiser 1995

Many of the original references have succumbed to link-rot. Where possible, I've linked to the Archive.org snapshot from 2002.

There have been many depictions of what our relationship with computers will be like in the future. Popular science fiction such as “Star Trek” and “2001: A Space Odyssey” has presented us with the image of the computer as an entity existing only to obey commands. Some take this “slave” idea further – in Asimov’s “I, Robot” the computer is anthropomorphised into a highly intelligent but ultimately servile role. Indeed, the writer Josef Čapek derived the word “Robot” from the Czech word “robota” meaning “drudgery” or “servitude”. The goal of Ubiquitous Computing is not to clutter up the world with superfluous automata obeying our will; it is to seamlessly integrate intelligent features into our existing environment.

For an existing example let us look at how we augment the lives of the disabled to cope with the challenges of modern day life. People with hearing impediments are not able to hear a ringing telephone; therefore an alert must be brought to them in an alternative manner. Such alternatives include toggling the lighting of the room or bringing up an alert on the television or even vibrating a Personal Digital Unit (PDU) such as a watch.

PDUs would be quickly supplanted by the mobile phone. Despite owning a mobile in 2002, I didn't quite make the leap to see how useful they'd become.

What could be done to make this feature more useful? Having the house keep track of where the user was would eliminate the need to provide alerts that are not appropriate (if there is no one watching the television there is no need to generate an alert to it).

Amusingly, this is a problem I now face! My phone pauses my TV when I receive a call. But if I'm cooking in the kitchen and my wife is watching TV, she is the one who is interrupted!

This begins to scratch the surface of what Ubiquitous Computing can and will do for the Digitally Integrated World.

1.3. The Goals of The Disappearing Computer

The first step to making the computer disappear is to create zero-maintenance machines. This means creating devices that rely on themselves and their peers for common maintenance tasks. If it is possible to make the drudgery of operating, configuring and repairing disappear from the end user, the high visibility that computers currently have will be reduced. This might be as simple as keeping all the different timekeeping devices in a house synchronised against one master clock, intelligent light bulbs which shut down when the room is empty, or even car tyres which know when they are wearing thin and book an appointment with the local garage.

Most of these are now a reality! I have smart lightbulbs. My car knows its tyre pressure and can schedule a service. I'm still waiting for my microwave to update its time...
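
A sketch of how the master-clock idea might work: a toy version of Cristian's clock-synchronisation algorithm, in which a device estimates the master's time by allowing for the round-trip delay of its request. The master_time() function is a stand-in for a real query over the home network.

    import time

    def master_time():
        # Stand-in for a network request to the house's master clock.
        return time.time()

    def synchronise():
        # Cristian's algorithm: ask the master for its time and assume
        # the reply took half of the round trip to arrive.
        request_sent = time.monotonic()
        reported = master_time()
        round_trip = time.monotonic() - request_sent
        estimated_master = reported + (round_trip / 2)
        # The offset to add to this device's local clock readings.
        return estimated_master - time.time()

    print("Clock offset: %.6f seconds" % synchronise())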

One of the other goals of The Disappearing Computer is Ambient Intelligence.

"The main purpose of thinking is to abolish thinking... Once you hit a familiar road, you can stop figuring it out from the map." de Bono 1982

Mark Weiser gives an example of ambient intelligence in Weiser and Brown 1996. Suppose one wished to know the amount of traffic flowing through a computer network. Weiser suggests wiring up a piezoelectric motor to a network connection and attaching a piece of string to the motor. The more traffic flowing through the network, the more violently the string will twitch allowing an instant appraisal of when it is suitable to place a burden on the network.

This is a somewhat partial application; the string has one limited purpose and intrudes on the environment. An Ambient Intelligent Environment should be able to make the best use of the current tools present in the environment to convey the appropriate amount of data in the most acceptable fashion. The other problem with ambient intelligence is that it can be seen as contradictory to The Disappearing Computer – in this particular application the information is being conveyed even when it is not needed. Ambient Intelligence should be available instantly, but it should not interfere or distract except to alert the user to a significant event. This may be as simple as a chime when a task has been completed or as wide-ranging as toggling the lights and changing the colour of the wallpaper to alert the user to a fire alarm.

I use IFTTT to change the colour of my Lifx bulbs when my Internet-connected smoke alarm generates an alert!

A twitching string does have the advantage of being an excellent metaphor for network traffic; unfortunately it is continually distracting with its constant movement. Perhaps a better way of using the string is to invert the motor so the string twitches violently when the network has low traffic. This alerts the user to the desired event without the constant distraction of redundant data. Or, perhaps the network connection point itself (if viewable) should be photo-chromic and change colour as traffic levels change, or have a display that shows the percentage of network capacity that is available. As is explained later in this chapter – metaphors are never as good as the real thing. In the same way, Ambient Intelligence may be better off giving us instant access to the information rather than an abstracted version that supplants our knowledge of the world.
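
A modern sketch of that inverted indicator: poll the machine's network counters and signal only when traffic drops below a threshold - that is, when it is a good time to place a burden on the network. This assumes the third-party psutil library; chime() is a placeholder for whatever ambient signal the environment provides.

    import time
    import psutil  # third-party: pip install psutil

    THRESHOLD = 10_000  # bytes per second considered "quiet"

    def chime():
        # Placeholder for an ambient signal: a sound, a lamp, a twitch.
        print("Network is quiet - a good time to start that transfer.")

    def watch_network(interval=5):
        last = psutil.net_io_counters()
        while True:  # runs until interrupted
            time.sleep(interval)
            now = psutil.net_io_counters()
            rate = (now.bytes_sent + now.bytes_recv
                    - last.bytes_sent - last.bytes_recv) / interval
            if rate < THRESHOLD:
                chime()
            last = now

    watch_network()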

Ubiquitous Computing.

Ubiquitous Computing is the process of having computers everywhere and anywhere. Every flat surface can function as an information provider by sound or vision, every coffee machine can be wired up to report its usage, and email can be checked at any time and in any place. In short, computers embedded everywhere to such a degree that we become fish swimming in a digital sea.

The over-riding hope is that when technology has permeated our environment it should be possible to use it to augment and enhance our work.

1.4. An Analogy Is Not As Good As The Real Thing

By its nature an analogy is a reduction of the original situation. With ubiquitous computing we can allow computers to aid us with our tasks without getting caught up in the stylisation of a computer’s representation of the activity. Take, for example, the act of writing a letter. The task, at its heart, is

  1. Organise thoughts
  2. Place thoughts onto paper (or more properly, fashion thoughts into a form of communication)
  3. Make revisions if needed
  4. Send.

What it does not involve is

  1. Switching on a box
  2. Opening a specific program
  3. Choosing a particular font and paragraph styling
  4. Remembering which button automatically inserts your address
  5. Hunting for the “A” key on a keyboard. Etc.

Simpler tasks require simpler tools. If computers can permeate our lives and environment to the point where they become invisible then the task becomes simpler. One of the barriers to invisibility is interfacing. A user must have some level of control over the technology in order to input data or control its non-automated functions. The Personal Digital Assistant (PDA) is a palm-sized computer suitable for keeping track of appointments and notes. To be effective, any device must cause the minimum amount of disruption to the user’s life. Ideally, a PDA should seamlessly recognise the predominant input method of humans: handwriting. Handwriting recognition is still relatively primitive Plamondon et al 1995, so designers of modern PDAs need a method that allows a human to write both naturally and in a style that the PDA can understand. From this need, Palm Inc. devised Graffiti. Graffiti is a method of inputting text based upon a user handwriting stylised text. In Graffiti, /\ represents A, 3 is B, etc. Graffiti’s designer, Jeff Hawkins, gave his reasons for implementing a partial character recognition system.

“People like to learn how to use things that work." Jeff Hawkins quoted in “The Next Small Thing” – by Pat Dillon. As printed in Fast Company issue 15, page 97 “We told them with Graffiti if they write this particular way, it works. We kept that promise. If you do what we tell you to do, it works. If you make a mistake in that situation, then you blame yourself.” Jeff Hawkins (founder of Palm Inc.) interview with USNews.com

This approach does make sense when you consider that pen and paper based input is a popular method of creating data. But many people today have learned a different paradigm: the QWERTY keyboard. Jeff Hawkins has recently acknowledged the need for a key-based input system.

“We all felt that the keyboard didn't belong on the PDA [...] and therefore Graffiti or handwriting recognition was the right thing to do. [...] We learned, now that you can actually make a small keyboard and they work really well” Jeff Hawkins (founder of Palm Inc.) from his Keynote Speech at Comdex 2001.

In 2016 handwriting is dead. Touchscreen QWERTY rules as the input method - thanks in part to technology like SwiftKey. It allows you to trace out individual words on a keyboard, rather than recognising letters. As we move into 2017, voice is looking like "the next big thing" for input.
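
To illustrate the idea of partial recognition (and emphatically not Palm's actual algorithm), here is a toy recogniser that reduces a single stroke to a sequence of compass directions and matches it against a few invented templates.

    # Toy single-stroke recogniser: a stroke is a list of (x, y) points,
    # with y increasing upwards.
    TEMPLATES = {
        # Invented direction patterns, loosely in the spirit of Graffiti.
        "A": ["NE", "SE"],  # /\ drawn left to right
        "V": ["SE", "NE"],  # \/ drawn left to right
        "L": ["S", "E"],    # down, then right
    }

    def direction(p, q):
        dx, dy = q[0] - p[0], q[1] - p[1]
        ns = "N" if dy > 0 else "S" if dy < 0 else ""
        ew = "E" if dx > 0 else "W" if dx < 0 else ""
        return ns + ew

    def recognise(points):
        # Collapse the stroke into its sequence of distinct directions.
        dirs = []
        for p, q in zip(points, points[1:]):
            d = direction(p, q)
            if d and (not dirs or dirs[-1] != d):
                dirs.append(d)
        for letter, pattern in TEMPLATES.items():
            if dirs == pattern:
                return letter
        return None  # unrecognised: the user, per Hawkins, blames themselves

    print(recognise([(0, 0), (1, 1), (2, 0)]))  # -> "A"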

However, whatever our approach to the task we wish to accomplish, the computer should be under our control and aid us. Whether it be computerised paper that automatically saves and recalls our work, birthday cards that sing personalised songs or envelopes that look up our friends’ addresses; we can be assured that the technology will, for perhaps the first time since its inception, tailor itself to our needs, rather than forcing us to obey its demands.

In short, we gain the benefit of technology without having to submit to its failures.

2. Current Levels

“We are analog beings trapped in a digital world, and the worst part is, we did it to ourselves.” Norman 1993

2.1. The Technology Behind The Revolution.

Moore’s Law states “The number of transistors that can be placed on a silicon chip of constant size doubles every 18 months”. Digital technology, in its conventional sense, is shrinking. As components get smaller, they can be integrated into a greater number of objects. Connectivity also allows integrated peripherals to communicate with each other – this allows them to, potentially, expand on their original design.

2.2. Working Examples Of The Disappearing Computer.

PARCTAB

What is a PARCTAB?

The PARCTAB system was a research prototype developed at Xerox PARC to explore the capabilities and impact of mobile computers in an office setting. The PARCTAB system consists of palm-sized mobile computers that can communicate wirelessly through infrared transceivers to workstation-based applications. The first general use PARCTAB system was released in March 1993. By early 1994 up to 41 members of the laboratory were using the system.

What was it designed for?

As well as being a proof of concept design, the PARCTAB was created to allow for multiple functions.

It allowed for information access to remote data repositories.
The ability to access centrally held information is a strong prerequisite for mobile computing as it reduces the burden on both the designer and the user to ensure that their data is both complete and synchronised.

It was equipped to let communication happen via electronic messaging; communication is the method by which our ideas become realities.

It had the capability to provide for Computer Supported Collaboration (CSC). This allowed for taking care of tasks such as note taking in meetings and anonymous voting. Not only does CSC speed up the collaborative process, but also it can be done without having to radically impact working practices.

The PARCTAB also acted as a Remote Control to engage with the multitude of appliances that accepted infrared commands. Although one of the aims of The Disappearing Computer is to create specific tools for specific jobs, it is often easier and more convenient to the user to make a “Swiss army knife” application i.e. one that contains the best available tools in the limited environment.

The PARCTAB also came with a range of simple applications such as a digital notepad and calculator.

How was it used?

Generally it was well received. There were a number of problems that shed light on the acceptance of Portable Digital Devices. The first discovery was that wearability has a great effect on acceptability. If a device is easy to transport without having to make adjustments to one’s usual carrying habits, it is more readily accepted.

In 2016 everyone is happy to carry a digital assistant in the form of a mobile phone.

The second discovery was that, similar to the issue of wearability, non-interference is crucial. People have established work patterns that, on the whole, provide a barrier to learning new systems. Furthermore, work patterns that are effective enough do not warrant a change in habit. Any new technology must enhance without encroaching, integrate without supplanting and work without significant effort.

What can be learned?

People like objects that enhance their work without significantly intruding on their time or methodology. People also have a strong psychological attachment to objects. The original idea was that once a user had finished with a PARCTAB, another user could pick it up and use it. What happened was that once a PARCTAB was finished being used, it was put in a shirt pocket or locked in a drawer.

The rise of the mobile telephone has taught us that, above all other advantages, personalisation and ownership are crucial to the success of a ubiquitous technology. Because people do not all work in the same manner, it is important for technology to conform to our individual needs. This may be as simple as higher contrast displays for those with vision difficulties, or personalised tones to distinguish our alerts from those of the people around us.

The Ubiquitous Computing Environments of the future will have to incorporate this need for ownership. This can be accomplished in two ways.

The first is actual ownership of devices: if the devices we need are convenient enough for us to keep with us at all times, our personalisation is assured.

The alternative method is to utilise portable personalisation – if one uses a telephone, anywhere on the planet, it should recognise the user and conform to their personal profile; showing their personal address book, preferred user interface, etc.

We're still not quite there with portable profiles. Most modern mobile phones let a user sign in and then automatically copy most of the user's preferences over.

2.3. Observations on the current state.

The visibility of computers has begun to diminish. Microchips have been embedding themselves into the environment at a rapid pace since the early 1970s. Here is a brief but surprising summary of the penetration of the microchip.

Designers have a long history of disappearing the complex workings of thermostats. In 1625 the invention of the thermostat allowed people to control their environment without having to be knowledgeable about the system they controlled. The company that became Honeywell invented the electric automatic thermostat in 1886.

In 1972 the first digital watches (the Hamilton Pulsar) were marketed to the public. In 1994 Timex created a watch that could communicate with a PC to enable the local archival of phone numbers.

I loved my Timex Datalink - it used optical flashes from a CRT to transmit numbers. Overtaken rather rapidly by Infrared and then Bluetooth.

Televisions have often incorporated microelectronics that are transparent to the end user – they are often used to stabilise or enhance picture quality or to decode Teletext. As digital broadcasting replaces analogue transmission more computing power is needed both to receive and operate digital television.

In the scope of monetary transactions, many credit card providers are embedding chips into their cards to make fraudulent transactions harder. Systems such as Mondex aim to embed microchips into every form of money transaction.

Toys that use simple computers are nothing new. Indeed, the modern computing revolution was born from home entertainment systems. Despite the first computer game appearing in 1958, it was not until 1991 that a computer-based toy (Nintendo’s GameBoy) won the Toy of the Year award. From then on, the toy market was inundated with computer-based toys. In 2001 a new style of computer-based toys came to the market that used analogue circuitry rather than digital logic to dictate their behaviour; this allowed them to crudely simulate the behaviour of insects and other simple forms of life.

Microchips implanted directly into living tissue have recently become a practical reality. Animals can have a subcutaneous microchip (often using the FDX-A/FDX-B protocols defined in ISO 11784/11785) that can be used to store details about the owner, vaccination or travel details Eradus 2001. Trials of a similar device for humans are already underway and in April 2002, the American Food and Drug Administration stated that such chips are not liable to their regulations unless they are used for medical purposes.

Here in the future, implantable chips still aren't popular. I only know one person who has one. Other medical devices are increasingly common - like pacemakers and cochlear implants.

Many of the above devices can be considered computers-without-interfaces. An interface allows us to extend some degree of control over a device, but an interface is an abstraction of what the device is actually doing. This abstraction exists for a good reason; it would be tedious to manipulate bit-registers and memory addresses when all that is required is a simple operation. At the same time the degree of abstraction must not be so great as to confuse the user as to the result of an action.

A truly disappeared computer has an interface that could, at the least, be described as transparent (“Vacuum cleaner, move forward.”), better, described as intelligent (“Vacuum cleaner, clean this room.”) and, at best, described as proactive (“Vacuum cleaner, thank you for seeing the room was dirty and tidying it.”)

My Roomba vacuum cleaner does the first two! And, if I can be bothered to schedule it, it can do a reasonable simulation of the third!
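
Those three levels differ mainly in how much intent the user must supply. A minimal sketch, with method names invented for the example (this is not any real robot's API):

    class VacuumCleaner:
        # Transparent: the user supplies every individual action.
        def move_forward(self, metres):
            print(f"Moving forward {metres}m")

        # Intelligent: the user supplies a goal; the device plans the actions.
        def clean_room(self, room):
            print(f"Planning a route around the {room} and cleaning it")

        # Proactive: no user input; the device notices dirt and acts.
        def patrol(self, dirt_sensor):
            if dirt_sensor():
                self.clean_room("room I am in")

    v = VacuumCleaner()
    v.move_forward(2)        # transparent
    v.clean_room("kitchen")  # intelligent
    v.patrol(lambda: True)   # proactive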

There are three fundamental barriers to achieving such a system.

The first is physical constraints: size, power and robustness play a large part in the ability for technology to disappear. As has been discussed, the issue of size relies more on the constraints of time than of technology. In chapter 3.1 the issue of power is discussed.

The second constraint is the ability to communicate. Without interconnectivity, devices are left in a state of asynchronous autonomy that reduces the effectiveness of any system. The technology to allow effective data transmission between components is discussed in chapter 3.2.

Finally comes the issue of intelligence. Without the ability to discern behavioural patterns, to recognise needs and wants, and to evaluate the character of the users, an integrated environment is little better than having no integrated appliances. The development and success of Artificial Intelligence is beyond the scope of this paper; however, in the ISTAG Scenarios for Ambient Intelligence in 2010 Ducatel et al 2001, the authors include a roadmap describing the advances in technology necessary to facilitate The Disappearing Computer. It identifies several key points in the development of computer systems. The paper predicts that fuzzy matching techniques will progress sufficiently to allow the realisation of so-called “Smart Agents”. Smart Agents are computer programs that exhibit intelligent behaviour in a specified realm Nwana 1996.

Or, as Ted Selker of IBM Almaden Research Centre phrased it:

"An agent is a software thing that knows how to do things that you could probably do yourself if you had the time." Ted Selker of IBM Almaden Research Centre, as quoted in Janca, Peter. ”Pragmatic Application of Information Agents”. BIS Strategic Decisions, Norwell, United States, May 1995

3. Proposed Future Levels - The Technology In Development.

3.1. Powering The Future.

The difficulty in producing such systems comes not from computer evolution, but from the ability to power such devices. Developments in energy storage have matched neither the pace of progress in computing nor the needs of the devices they power. Currently battery developers expect a 5% power capacity improvement every two years, in stark contrast to the computer world where Moore’s Law shows a 100% performance gain every eighteen months.

Battery performance is still a limiting factor. Electronics manufacturers are battling to make their devices as efficient as possible while faced with a stagnating energy capacity.
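
Compounding those two rates shows how stark the gap is. A back-of-the-envelope calculation, taking the figures above at face value:

    # Compare compounded improvement over a decade.
    years = 10

    battery = 1.05 ** (years / 2)  # 5% gain every two years
    compute = 2 ** (years / 1.5)   # doubling every eighteen months

    print(f"Battery capacity after {years} years: x{battery:.2f}")  # ~x1.28
    print(f"Computing power after {years} years: x{compute:.1f}")   # ~x101.6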

One proposed alternative for powering personal devices is to harness the energy given off by the human body. The human body generates a great deal of surplus kinetic and thermal energy. A sleeping human generates around 81 watts, not all of which can be captured unobtrusively. However, since the 18th century the movement of the human body has been used to perpetually power devices such as wristwatches Friedberg 1999.

Piezoelectric devices can capture impact motion from the joints and from feet hitting the ground. It is quite possible to generate substantial amounts of energy Marsden and Montgomery 1971; Fletcher 1996.

At current levels it is possible to obtain sufficient energy from walking to power a radio frequency identification transmitter Shenck and Paradiso 2001.

It is also possible to use thermoelectric materials to harvest surplus heat from the human body. Currently these operate at only 10% of their theoretical maximum efficiency Siegal 1999 and so are not yet ideally suited to providing power to personal devices.

We're still not there. I would have hoped that low power wearables could be charged from kinetic energy - but it looks like they will stay tethered to a charging cable for several years yet.

3.2. Communication

Communication between computers is the key to effective ubiquity. It is not necessary to have all processing performed at the location at which an action is going to occur. Nor is it important for every component to reproduce information when it could be queried from other local components.

There are already several standards for short-range yet high bandwidth wireless communication that allow computers to share information.

Bluetooth is a wireless communications protocol that utilises the 2.4GHz radio spectrum. Due to its ability to switch frequency to avoid interference and to maintain up to 7 connections from a single device, it is ideally suited to provide an information exchange framework between current intelligent devices.

An interesting side note to the development of short-range communication is its ability to usurp expensive long-range communication systems.

A standard cordless telephone communicates with a base unit using a short-range radio transmitter. The base unit then connects to the global telephone system. Similarly, a digital mobile phone uses a high power transmitter to connect to a base station that is, potentially, many miles away. The base station then communicates with other base stations and relays the transmission through them to the receiver’s phone. In an integrated short-range environment, the system might work something like this.

  1. If the two devices are within range of each other, they communicate directly.
  2. Otherwise, the device connects to a base unit that connects to the communication network. The receiver’s local base unit relays the message.
  3. If a suitable base unit cannot be found, the message could travel via local networks until the receiver is found. This could be done using VoIP.

No idea why I thought VoIP was the answer. It's interesting that Bluetooth and WiFi are now ubiquitous but it is rare that smart devices use both. The general trend seems to be for hub-and-spoke communications, where "hub" is often the fragile cloud.
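
A sketch of that fallback logic. Every helper here (in_range, find_base_unit, relay_via_mesh) is a stand-in for real radio or network code:

    # Hypothetical helpers - stand-ins for real radio/network code.
    def in_range(receiver):
        return False

    def transmit_direct(message, receiver):
        return f"direct: {message} -> {receiver}"

    class BaseUnit:
        def relay(self, message, receiver):
            return f"base unit relay: {message} -> {receiver}"

    def find_base_unit():
        return BaseUnit()  # would return None if no base unit answered

    def relay_via_mesh(message, receiver):
        return f"mesh relay: {message} -> {receiver}"

    def send(message, receiver):
        # 1. Direct, device-to-device, if the receiver is in radio range.
        if in_range(receiver):
            return transmit_direct(message, receiver)
        # 2. Otherwise hand the message to a base unit on the fixed network.
        base = find_base_unit()
        if base is not None:
            return base.relay(message, receiver)
        # 3. Failing that, hop across local networks until the receiver is found.
        return relay_via_mesh(message, receiver)

    print(send("hello", "receiver-phone"))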

The ability to dynamically reroute data through the most efficient path will be key in providing ubiquity in information retrieval.

Intel has announced its decision to develop computer chips to take advantage of wireless communication. Intel describes this as “Expanding Moore’s Law” as it should allow the continued expansion of computer power without running into the physical limitations that Moore’s Law encounters. Coupled with products that already make intelligent use of communication systems to function, we should start to see a surge in connected appliances within a short space of time. The ability for a device to seamlessly roam between locations means that portability becomes appreciably easier. More communication means a greater dissemination of data that should provide for a seamless information environment.

Software defined radios are everywhere these days. They're often found on a System-on-a-Chip (SoC) - components which give even the cheapest device access to WiFi, 4G, and Bluetooth.

3.3. Examples Of Proposed Ambient Intelligence.

The First Phase of an Integrated Environment is Self-Aware Objects (in the sense that they know about their state, not that they start quoting Descartes).

It is not necessary to create a community of highly intelligent objects; rather, the intelligence should be distributed throughout the community. This distribution of intelligence allows computational load to be effectively balanced Litow et al 1995, provides redundant protection from component failure, reduces initial component cost and, in conjunction with neural networks, can allow the objects to form pseudo-democratic decisions for greater accuracy in their work Battiti and Colla 1994.

The Intelli-Fridge

ARGH! Smart Fridge!

The idea behind the Intelli-Fridge is to have a central hub that regulates the decisions about food storage and purchases based on information gathered from semi-aware packaging. When a jar of mayonnaise is purchased, it is aware of certain properties about itself; its best-before date, how full it is, whether its lid is secured etc. It can then send periodic updates on its status to the fridge. If the jar noted that it was nearly empty, the fridge could decide to order more of the product based on the household’s consumption frequency. If the lid had been left off the product, the problem would be reported to the Intelli-Fridge. The household would then be notified by, say, a message appearing on the television. If the household consumed a lot of apples, the Intelli-Fridge could search local shops and find the best prices. Seeing how favourably a product was received in the household, the Intelli-Fridge could search for similar products that might be of interest to them. If the Intelli-Fridge was asked to purchase the ingredients needed for a certain recipe, it could advise on items that would complement it.
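
A sketch of the reporting side of such a system, with all names invented for this example:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class SmartJar:
        product: str
        best_before: date
        fill_level: float  # 1.0 = full, 0.0 = empty
        lid_secured: bool

        def status_report(self):
            # The periodic update sent to the fridge hub.
            return {
                "product": self.product,
                "expired": date.today() > self.best_before,
                "nearly_empty": self.fill_level < 0.1,
                "lid_off": not self.lid_secured,
            }

    def fridge_hub(report):
        # The hub decides what, if anything, to do about each report.
        if report["nearly_empty"]:
            print(f"Ordering more {report['product']}")
        if report["lid_off"]:
            print(f"TV alert: the {report['product']} lid is off")

    jar = SmartJar("mayonnaise", date(2026, 6, 1), 0.05, False)
    fridge_hub(jar.status_report())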

The “Intelli-Fridge” is just an example of the way in which certain home services could be intelligently integrated. The goal is to have every object embedded with a technology that can report certain state-based conditions. From a car that will inform the garage about the problems with its tyres, to the dish that tells the microwave that it is not microwave-proof. From the pen that realises it is running out of ink, to the video-recorder that automatically records programmes without prompting. From a toy that can report to parents that it hasn’t been tidied away and is languishing under a bed, to the briefcase left on the train that, when it is out of range of its owner, gently reminds them where it is.

Wow! So many ideas! And so few which came to fruition. Bluetooth smart tags are available to prevent lost items. The TiVo is a smart PVR. Cars can report their status direct to a manufacturer. But the integrated kitchen is still a pipe-dream.

Another phase of the Integrated Environment is the ability to be aware of the user. A roaming profile of a user (“John Doe”) could enable both convenience and safety. A car could automatically adjust seat height, mirror positioning and preferred radio stations. It would also compensate for both the driving conditions and John’s driving style.

Most high-end cars have this! They will adjust based on the user sat in the driving seat. Sadly, there's no standard for this - so you can't carry your profile over to a hire car.

When John checks into hospital, his profile could inform the staff that he has a history of heart disease and is allergic to penicillin.

There are three possible ways to store and transmit this personal profile:

  • A wearable or implanted computer that contains, records, evaluates and broadcasts the profile to those services that request it.
  • A tracking system that follows John around, automatically reporting his preferences to devices in his vicinity.
  • Local evaluation: when John walks into a room, his physical characteristics are evaluated and his preferences discerned. This can only make use of physical characteristics, which could make it suitable for automatically adjusting room ergonomics.

Devices like Kinect and Facial Recognition make this last point possible. Rarely used at the moment though.

Here is a comparison of the pros and cons of these options:

  • Computer has “disappeared”? Wearable: not entirely. Tracking: yes. Instant evaluation: yes.
  • Full range of preferences? Wearable: yes. Tracking: probably. Instant evaluation: no - limited range.
  • Privacy concerns? Wearable: few, if the computer can be user controlled. Tracking: few, if the computer can be user controlled. Instant evaluation: some; potential for civil liberties abuse.
  • Profile controllable? Wearable: yes, assuming that user control is regulated so as not to provide false information. Tracking: as wearable, but dependent on propagation of profile. Instant evaluation: simplicity should ensure correctness, but faulty assumptions could be made.

The final phase is objects being aware of their environment. Adaptive technologies are needed to cope not only with the wide range of shapes, sizes and abilities of humans – but also the wide varieties of environments into which they may be placed. An audible alert is of no use if the user is deaf – it is also of no use if the sound levels in the immediate locality prevent it being heard. If there are very low light levels, or if the display is dirty, or if the user has vision impairment, then the display should take into account these factors and compensate.

“We wear clothes, put on jewellery, sit on chairs, and walk on carpets that all share the same profound failing: they are blind, deaf, and very dumb. Cuff links don't, in fact, link with anything else. Fabrics look pretty, but should have a brain, too. Glasses help sight, but they don't see. Hardware and software should merge into "underware". Your shoes should be retrieving the day's personalized news from the carpet before you even have time to take off your coat. We must expect more from our environment.” Things That Think, MIT Media Lab

While this proposal from the MIT Media Lab is a little fanciful, it does highlight the fact that a lot of our technology is “dumb”: it is neither proactive nor reactive, and it is dead without our intervention. How would our lives improve if our previously inert world mobilised itself for our benefit?

Is it fanciful? Google Glass may have flopped - but it shows how smart glasses could work. Shoes are more likely to track your fitness these days.

There are two paths technology can take. It can integrate into our lives, reducing complexity and allowing us to hold on to our regular ways of working, or it can radically change the way we think, work, educate and live.

4. Schools – An Investigation.

This chapter is based on the Interacting With Computers (IwC) paper “Supporting educational activities through dynamic web interfaces” Pimentel et al 2001 and information gathered at The BETT 2002 Show.

One erroneous notion is that education only takes place in the classroom, mostly through books and lectures.

“The basic recipe for education of a nation is very simple:

  • Take young children
  • Open up the tops of their heads
  • Pour in all the information they are ever going to need to know to get along with life
  • Continue as long as possible -- for 12 to 20 years

Now let them loose upon society, to spend the next 60 - 80 years as productive citizens, never having to be educated again. A very simple scheme, practiced by nations throughout the world. Simple and simple-minded.” Norman 2001

Education, it would seem, is in a permanent state of crisis: whether it is the inability to recruit suitable teachers, the battles fought over examinations and their grades, or the warnings from industry about a lack of skills. There is growing evidence that traditional educational techniques need either to be augmented or replaced with a student-centric methodology that takes full advantage of the growing power of computers.

4.1. Computers In Schools – How Ubiquitous Technology Can Aid Teaching And Enrich The Learning Experience.

It is important to remember that Ubiquitous Computing does not involve putting a computer on the desk of every child; rather, it involves using computers as a back end to support teaching.

The IwC paper primarily concerns itself with “the web” – using existing World-Wide-Web structures to augment learning. While Internet technology has not been designed to be truly ubiquitous, it does serve as a useful metaphor. The classroom is a multimedia environment where text, images, video and discussion all form an educational experience – all of which can be very overwhelming when confronted with only a paper and pen to record this information for later analysis.

I think I was mostly annoyed about how my computer science lectures were delivered on sheets of paper.

“Students' personal notes, in isolation of the rest of the lecture, are still hard to use as an anchor for class-wide discussion of lecture activities.” Pimentel et al 2001

What is needed, it would seem, is an automated tool to simultaneously record the information that the student should be accessing; a way to encapsulate the information imparted from the classroom. The first way to capture this information relies on non-complex solutions; using the teacher’s lesson plan, the computer can inform each student of the materials taught, assignments set and resources used. This data can become part of a wider record of the students’ educational life. Another way is to use the ubiquity of computers to act as a virtual note-taker. As the lesson progresses, the information imparted by the teacher is distilled by the computer. As resources are accessed and assignments set, the computer keeps track of the data that a student would need to access based on the materials taught.

The Digital White Board (DWB) caters expertly for both of these situations. DWBs are computer-controlled surfaces upon which a user can write, display video and interact. They allow all the information that is written or displayed on them to be stored, converting handwriting to text if necessary, and replayed at any time. This has several key advantages over a regular lesson display:

  • It allows pupils to engage in 24 hour learning or catch up on missed lessons.
  • If there is a shortage of skilled teachers, a suitable lesson could be downloaded from another school and used to teach – either relayed live or merely replayed.

This is not an optimal solution, but as an aide to teaching it could be invaluable.

DWBs also allow a range of interactive features. A teacher can set a class test and display it on the board; the pupils’ answers are transmitted back to the whiteboard, whereupon the teacher (and the class) can see the results. This allows the teacher to have an instant appraisal of the class’s abilities.

There is a supplemental problem of information overload. If every mention of, say, Napoleon links to his biography and every mention of Waterloo links to a military strategist’s description of the battle, the text will be swamped with links and the student could find it very difficult to extract useful information.

This is a problem writers today face. Wikipedia had only just launched when I was writing this paper. Their Manual of Style provides a good template for how links should appear in a document.

4.2. Redesigning the classroom.

In his address to Northwestern University School of Education, Donald Norman argues the case for making education as much like play as possible. As we can see from the animal kingdom and through sociological studies, humans learn best through play and social interaction; so called “Informal Learning”.

Most of the tasks that humans engage in are social activities, practical activities or a combination of the two. How can computers help education to capture this ability to learn through interaction rather than through pedagogy? The way to teach is to teach the tasks, not the tools. Learning to operate a cash register is far less important than teaching the art of monetary competence – there are thousands of different cash registers and thousands of different currencies; there is only one underlying principle of monetary manipulation.

5. The Social Computer – An Investigation

5.1. What Are Social Computers?

The Social Computer is a subset of The Disappearing Computer. It relates to two areas of computing.

  1. Making computers socially acceptable in their usage.
  2. Enhancing sociability within inter-related spheres.

One could argue that once computers have been socially accepted, they actually become invisible. Glasses and digital wristwatches are two good examples. Neither needs much maintenance; neither causes a fuss by either their absence or presence, both exhibit a relatively high degree of technical competence to create and, most importantly, both improve the quality of life for the user without significant change to their environment or habits.

Watches are a prime example of disappeared computing. Glasses are... Well, Google Glass flopped - but SnapChat's Spectacles might succeed.

When considering the verb “disappear”, we must ask how far we want the device to disappear.

Devices should conform to the user, but this is not always practical. So the user must learn to conform to the device. The ideal solution is to make the advantages of conforming to the device greater than the disadvantages. With a mobile telephone the advantage of being able to indulge in telephony is greater than the disadvantage of carrying around a 100g device. Similarly the advantage of correct vision is greater than the disadvantage of having a glass and metal structure resting on one’s face.

This is still a problem. Mobile phones are portable computers - their advantages (Internet, games, sex, socialising) outweigh their disadvantages (expensive, fragile, power-hungry).

5.2. How Can Computers Enhance Socialisation?

Data acquisition computers generally go through three stages of evolution:

  1. Post-Fact. The information is collated and analysed on computer after the data are collected.
  2. Real-time. As the data are collected, they are collated and analysed on computer.
  3. Pre-Fact. Before the information is requested, the computer seeks out likely sources of information and has the processed data available on demand.

Let us take the sphere of an address book. The book itself has already gained social acceptance. We now need a way of placing that data in the digital realm.

In the first phase of evolution, once everyone’s business cards have been acquired, the data contained therein can be placed on computer.

In the second phase, information is beamed from one user to another on a per-request basis. The current models of portable computers have infrared links that enable data to be transmitted between them.

In the third phase, the computer queries the whole data set for information that may be useful and, based on the information in the set, retains or discards data. In a business situation, everyone could have set their “business cards” to broadcast only pertinent business data: name, company, contact information etc. This information could then be augmented on a case-by-case basis to include further information based on the social interactions occurring.

I got this right! I predicted the Bluetooth Business Card (which I own) and the .tel address book. These are still not ubiquitous. Most address books are either stagnant, or use a service like WhatsApp to maintain updates.

The upshot is that only data pertinent to the event would be broadcast – the information that could be queried would be different at a nightclub (where a doorman may use it to verify age) from a social gathering (where personal interests could be broadcast to help match people who share interests).
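
A sketch of that per-context filtering, with fields and contexts invented for the example:

    FULL_PROFILE = {
        "name": "John Doe",
        "company": "Example Ltd",
        "contact": "john@example.com",
        "date_of_birth": "1980-01-01",
        "interests": ["jazz", "climbing"],
    }

    # Which fields may be broadcast in which social setting.
    CONTEXT_FIELDS = {
        "business": ["name", "company", "contact"],
        "nightclub": ["date_of_birth"],  # enough for a doorman to verify age
        "party": ["name", "interests"],
    }

    def broadcast(profile, context):
        allowed = CONTEXT_FIELDS.get(context, [])
        return {field: profile[field] for field in allowed}

    print(broadcast(FULL_PROFILE, "business"))
    print(broadcast(FULL_PROFILE, "nightclub"))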

One current application of this is “The Gaydar”, a device that broadcasts a signal that will activate other Gaydars when they are within a 40-foot radius. The device is intended as an “ice-breaker” for gays and lesbians to help them avoid embarrassment when attempting contact with a potential partner.

There's no need for a discrete device when your phone can run apps like Grindr.

5.3. Real-Time Context-Sensitive Information.

If computers become as cheap and ubiquitous as, say, light bulbs; we enter a world where self-aware objects can be instantly queried on their status.

The Raspberry Pi Zero is only £5. There are other, cheaper computers available. Meanwhile, the cost of Internet connected lightbulbs is still high.

The technology behind this is already in place and forms what is known as “Augmented Reality” Azuma 2001.

Augmented reality describes the practice of supplementing our senses with a digital overlay. This is often done by placing a transparent display in front of the eyeball to overlay graphics. It can also take the form of enhanced audio information when, say, touring a museum.

The uses of optical overlays range from the mundane (seeing when the last post is on a post box) to the informative (detailed biographical information about a famous painting), to lifesaving (a fire-fighter seeing a map unhampered by smoke), to social-evolutionary (warning a parent when a convicted sex-offender is nearby).

Boeing uses an Augmented Reality system to display information about wiring on an aeroplane Feiner 2002. Rather than continually querying a manual as to the location of wires in conduits, the conduit can be asked to provide the information. The information is then relayed to the technician’s glasses and overlaid in real time. Similarly, a surgeon could get a real time overlay of the skeletal damage on a patient as he operated and have an addendum of pulse and breath rates placed on the periphery of his vision.

With the advent of biometric recognition combined with a powerful database, it would be possible to pull up and privately display information about a social acquaintance upon seeing them. The information could be as simple as the business card they broadcast or could include private notes about personality, credentials and medical needs.

Eventually we could have a situation where we have information saturation. There is little doubt that Intelligent Agents could filter and direct the flow of data to make it relevant, but there is a very real probability that we could end up with Data Smog. Data Smog occurs when the signal-to-noise ratio of data becomes too low, or even when the sheer volume of useful data is too great to effectively analyse.

Spam, push notifications, continual updates - all digital smog in our lives. Google's Gmail does a relatively good job of helping people out with this.

Ideally the computer, and by association, the data spewing from it should be as transparent as possible. However, we can already see that it is possible for a ubiquitous technology to become invasive rather than pervasive. Large cities suffer the problem of light pollution. Although streetlights minimise the dangers inherent in surviving in a world without sufficient light, the unfortunate by-products include disruption of local wildlife and disruption of the view of the night sky. Neither of these is serious enough to provoke a rethink in city light planning.

It should be apparent that we are in a position to directly anticipate the problems likely to occur with future technologies. Work needs to be done now to prevent Data Smog obscuring the quality of data we wish to see.

6. Other Revolutions In History – What Is The Impact Of Radically Changing A Paradigm?

6.1. Post-It Notes.

While the impact of the Post-It Note© has been hugely beneficial economically and, to a degree, socially, it does have a rather more sinister side that has caused it to be known in certain sections of the British Government as “The Yellow Peril” (Q34).

Before the invention of the Post-It Note, proposed changes, comments and clarifications to government documents were written directly on the said documents. With the introduction of “The Yellow Peril” the risk is run of such notes being conveniently “lost” and such corrections never happening. The impact on parliamentary historians is also great – without penned superscriptions to government documents, insight into history in the making is lost. Draft parliamentary bills actively seek to make such use of Post-It Notes illegal! Hansard. Q24

The lesson to be drawn from this is that enhancements to working life should strive not to neglect periphery functions that, at first glance, may not seem important.

I'm not sure I agree with myself here. If there is a need to make notes, a digital system should incorporate that. It depresses me that we still see acres of paper used due to the perceived inadequacies of IT systems.

When examining the rise of the Mobile Phone versus the Fixed Line Phone we see a number of progressions that have resulted from a competitive new economy; reduction in costs, a new economy around accessories, and technological advances.

But with these have come unforeseen drawbacks. Health issues associated with mobile phone transmitters have come to light and have caused great concern for sections of society. The privacy of the connected individual has also come under scrutiny. The environmental intrusion on other users has also caused problems.

Citation needed! I've written about the effect phone manufacturing has on the factory workers who assemble them.

6.2. What Are The Problems Faced By People When Faced With Ambient Intelligence?

It is not easy to foresee the problems that come with advances in technology. This section is not intended to be a comprehensive list, in part because many problems will not come to light until the technology is in wide use.

Donald Norman presents the following views of the differences in humans and machines Norman 1993.

The Machine-Centered View

  People          Machines
  Vague           Precise
  Disorganized    Orderly
  Distractible    Undistractible
  Emotional       Unemotional
  Illogical       Logical

The Human-Centered View

  People                 Machines
  Creative               Unoriginal
  Compliant              Rigid
  Attentive to change    Insensitive to change
  Resourceful            Unimaginative

As we can see, there is a central dichotomy between the biological and the synthetic. It could be argued that we should make our synthetic world more like our biological one – but competing with billions of years of evolution is far from trivial. It could also be argued that we should become more like the synthetic – but that would deny the very nature of what makes us human.

What is needed is a symbiosis whereby our vagueness is actively complemented by the precision of a computer, and where the lack of imagination in the digital world can be dynamically altered by our creativity and passion.

The current level of computing doesn’t allow for this to be done easily. To use a computer one has to learn ways of working that are often obscure and counter-intuitive to the human brain. A word processor does the hard work of typesetting and spell checking – but it cannot take speech and ideas and reproduce them. Instead the user is forced to use an unfamiliar system of reproducing text to get the computer to understand them. Barriers like this must be broken down.

Voice recognition is slowly improving. Most "Digital Assistants" still have a limited user interface and will only respond to specific commands.

Practical Issues

It is far from trivial to create devices that are simple to use, provide complex functions, interact well with humans and provide a useful service.

“The problem is that the same machines that we invented to aid us also demand precise, accurate information from us. As a result, we often end up the slaves of the very machines we have invented to serve us, slaves to their relentless demand for precision, accuracy, and continual supervision.” Norman 1992

Learning Issues

How do we learn how to use this new technology? More importantly, how does this new technology learn how to use us?

In the book “Turn Signals are the Facial Expressions of Automobiles” Donald Norman proposes “The Teddy”, a small personal companion that is introduced at as early an age as possible. He then goes on to explain that, while The Teddy may change its form or become surgically implanted in us, it would always remain physically close.

The Teddy does provide for a great learning tool, and could be instrumental in the acceptance of intelligent devices. Indeed, there are several “intelligent toys” in existence that lay the groundwork for acceptance from a younger audience.

This linked to a review of "My Buddy Bubba" which has since vanished. A copy is available on request.

“The Windows on the World of Information, Communication and Entertainment” (WWICE) project in Eindhoven is a home network fitted out with a range of connected appliances all of which are accessed through a virtual pet named “Bello”. This construct could be useful because “An interactive buddy like the WWICE dog Bello can help prevent the Big Brother effect.” Aarts 2000

I can find no trace of this Philips research project online. A copy of the original press release is available on request.

But is this personification necessary? Children have a great capacity for learning – if by the age of five a child has a vocabulary of thousands of words, would they not also be able to interact with the digital world with some success?

I'd now argue that it isn't necessary. Kids are adept at picking up new technology and don't seem to be restricted by the need for metaphors.

Doubtless the toys that children will play with will have a digital element to them, but to make something so feature rich and so visible as an anthropomorphised representation of the computer is not only counterproductive, it is in contradiction to the principles of the disappearing computer.

The danger may come from inadvertently indoctrinating people into anthropomorphising the underlying technology Shoham and Cousins 1994. While people are happy to endow current “dumb” machines with personalities (cash dispensers are often said to be “willing” to dispense money Nass and Steuer 1993) what happens when the metaphors are stretched to breaking point?

Unless the machines truly have intelligence there is a danger that users will delegate tasks to them in the same manner as they would to a human Cohill 1999 with less than desirable results. While people are keen to take their frustrations out on machines, what occurs when the machine’s personality reveals defects or seems to have a vendetta against a user?

Coping With A Paradigm Shift

Will people accept new ways of learning? Will people be able to learn the new ways of learning?

“Sometimes we should leave things as nature made them be. Millions of years of evolution have led us to a very special niche. The human mind is very specially adapted to its environment. It is dangerous to tinker, whether by drugs, by physical manipulation, or by mind-control. It is too complex to be understood, and the effects of the tinkering are more apt to lead to harm than good.” Norman 1992

In his satirical piece “The Usable Planet”, Gary Bunker described a future of ubiquitous technology that is a seeming paradise… for those accustomed to it. He describes a world where computers control even the most mundane tasks, for those who understand how the computer reacts and responds. The hero of the piece only has knowledge of the world as he understands it; he does not understand the tasks and commands that have become second nature to those who have grown up in this world.

I think we've all suffered the frustration of trying to teach an elderly relative how to "swipe", "click", or "double-click", right?

It is impossible to design for every possibility. Even a low technology solution such as writing with pen and paper requires a user to understand certain fundamental concepts. Similarly, working in a world where a shake of the hand is enough to transmit a business card, the paradigm of communication needs some fundamental knowledge. Where Gary Bunker’s spoof world fails is in neglecting the idea that fundamentals should be able to control the majority of devices. Requiring specific interaction in equally specific ways is not the nature of humans – devices must be adept at recognising our vagaries and producing something usable from them.

I still can't use SnapChat properly.

Privacy and Personalisation.

“[Ambient Intelligence] should be controllable by ordinary people – i.e. the ‘off-switch’ should be within reach: these technologies could very easily acquire an aspect of ‘them controlling us’. The experts involved in constructing the scenarios therefore underlined the essential need that people are given the lead in way that systems, services and interfaces are implemented.” “2010 Future Visions” Ducatel et al 2001

However, is the idea of an “off-switch” compatible with the idea of a disappearing computer? Undoubtedly, even the most “invisible” of our current technologies has to be operated in some fashion. It will, ultimately, be possible for a computer system to be so pre-emptive and “intelligent” that the need for human interaction would be strictly limited.

This brings us back to the introduction of this paper – what is the disappearing computer? Should its use become as ubiquitous as breathing – controlled largely by the subconscious? Like a wristwatch – needing only a bezel to operate? Or like a telephone – a basic control system used to access advanced features?

Trust Issues.

Will people accept what could amount to their lives being tracked and analysed every minute of every day? Can the benefits of such a system ever outweigh the concerns people have about their privacy? Can such data collation ever be made secure from prying eyes?

People have accepted Facebook's endless quest to track them. Data collection is getting more secure - but there are still regular leaks of information.

  • If an ambient intelligence felt that a member of its population had committed a crime – should it report them?
  • If an ambient intelligence felt that a member of its population was planning on committing a crime – should it report them? Should it prevent them?

This is now a practical concern.

There is a real possibility that such a system, if left unchecked, could be used to establish a totalitarian dictatorship. If we look to both literature and life for examples we see that, broadly speaking, such a regime would cure a great deal of the world’s ills: a society where crime is detected and prevented before it has a chance to do damage, a society where your comfort and ease of use are prime concerns – a society where the only fear left is of that which watches us.

This was written a few months before the Minority Report movie was released.

7. Conclusions

The computer cannot really disappear unless it is interconnected. A servomotor may be imperceptibly tiny - but unless it connects to something, it is useless. Only when the thousands of computers that permeate our life can communicate intelligibly will the computer truly have vanished.

Connected computers can be thought of as equivalent to one computer. A modern PC consists of many processors: a separate computer for audio, for video, for networking, etc. Yet all of these specialised components are lumped together into a single computer. Clustering systems like MOSIX and Beowulf are able to treat thousands of disparate but connected systems as one large CPU.
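
As an illustration – and this is only a rough sketch using Python's standard library, not the actual interface of MOSIX or Beowulf – here is how many workers can be driven through a single interface, much as a clustering system presents many machines as one computer:

    # A rough sketch: a local process pool stands in for a cluster of
    # networked machines. Many workers are addressed through one
    # interface, much as MOSIX/Beowulf-style clustering presents many
    # machines as one large computer.
    from concurrent.futures import ProcessPoolExecutor

    def work(chunk):
        # Each "node" processes its share of the data independently.
        return sum(chunk)

    if __name__ == "__main__":
        chunks = [range(i, i + 1000) for i in range(0, 10000, 1000)]
        with ProcessPoolExecutor() as pool:
            totals = list(pool.map(work, chunks))
        print(sum(totals))  # one answer from many "computers"

Swap the local processes for machines on a network and you have, conceptually, the one giant computer described below.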

This means that there may be a significant flaw in Weiser's definition of Ubiquitous Computing Weiser 1996. Technological growth, both in and around the computing arena, has an Expanding/Contracting relationship.

Many People :   1 Computer  (First Wave)
1 Person    :   1 Computer  (Second wave)
1 Person    :   Many Computers  (Third wave) as defined by Weiser.

But as these computers will be able to communicate with each other, they should probably be considered as one giant computer, which leads us to:

1 Person    :   1 Computer  (Fourth wave)

But all people will use this one computer (system), which returns us to:

Many People :   1 Computer

This leads us back to where we started.

So-called "Cloud Computing" is similar to the old paradigm of "mainframe" computing.

The actual value of the ratio is not important. What is important is that, because the ratio has become so hard to define, the very concept of a person-to-computer relationship has begun to disappear.

7.1. Can The Disappearing Computer Improve Quality Of Work, Efficiency And Happiness?

“As benign as a tape recorder seems, it is – like almost all technologies – a double-edged sword: While it provides near-perfect reproductions of sound for a cost of next to nothing, it allows one’s memory to slip and encourages dependence on the recording” Shenk 1997

Ubiquitous Computing could lead to Ubiquitous Information. Shenk’s concept of “data smog” shows how a saturation of information can significantly disrupt effective working practices. The current pace of information bombardment is not creating happiness – it is creating stress. A recent report by the European Union estimates that “junk” email costs worldwide Internet users €10 billion per annum Gauthronet and Drouard 2001. Quality of data and efficiency of work cannot be stimulated under these conditions.

The change in quality of life provided by computers was discussed in a Virginia Polytechnic Institute teleconference held in 1995. The general consensus was that technology is very enabling, has already provided great leaps in social quality, and should continue to do so. The teleconference also found growing unease over the prevalence of technology – not only as it supplanted more traditional ways of living, but as it became more restrictive and controlling.

If the Disappearing Computer is to be effective, it must heed the lessons of technological history. To improve work quality, efficiency and happiness, it must:

  • Be non-disruptive. This is not to say that it should not cause change – but it must resist adding to our conventions, streamlining them instead.
  • Increase reliability.
  • Provide value without saturation.

7.2. Final Remarks.

Traditionally, the pain of upheaval has accompanied change. But humans are excellent at adapting to new situations. Less than 60 years after man first achieved powered flight, transatlantic crossings by aeroplane were commonplace; 15 years after the invention of the microprocessor, computers were finding their way into homes; and within 20 years of the first commercial mobile phone, more than 66% of the British population owned one.

These technologies have become part of our life by providing enhancement without, on the whole, supplanting our established paradigms. It is surely inconceivable that new technology will not be as rapidly assimilated into our lives and culture.

Indeed, the methodology of low-impact computing should radically reduce the shock of having to establish new ways of working.

The lesson to be learned from this investigation is that we should not conform to technology – technology must conform to us! And in doing so, fade into the background.

We have in our sights a way to blend computers in to the normal fabric of life without causing negative disruption, a way to streamline paradigms rather than replace them and a way to finally break the cycle of working harder to make life easier.

“Along this path [of Human-Centred Development] lies a new attitude of increased acceptance of technology, a technology that is invisible and thoroughly integrated into tools for the tasks and activities people wish to perform. Nothing worthwhile is easy.” Norman 1993

The End! If you ignore my florid writing style, I think it's a solid piece of writing. Several of the concepts in it are a reality, and many of the problems I identified have still not been resolved. I missed out on wireless Internet - WiFi wasn't even a thing when I was writing this. While I talked about how computers might "betray" criminal users, I didn't spot that hackers might try to compromise these systems. If you have any feedback, stick it in the comments box.



One thought on “Disappearing Computer (2002)”

  1. Andrew McGlashan says:

    We are at huge risk from the Internet of Traitors today. Too many devices are connected to the Internet that do not offer any kind of acceptable security. It is expected that we'll have 50 billion devices by 2020, and it would only take a very small proportion of those to create even more havoc than the many years of spam and malware have done in the past.

    When a security professional can set up monitoring for a new device (a camera) and watch it be taken over in 98 seconds, it is clear that neither the consumer nor the manufacturer cares enough about security. We are doomed by IoT, not blessed by it.

