Timeline: A History Of Touch-Screen Technology
Originally published on Mon December 26, 2011 9:15 pm
1948 The Electronic Sackbut
The history of touch technology begins with touch-sensitive music synthesizers. According to the Canada Science and Technology Museum, Hugh Le Caine's Electronic Sackbut, completed in 1948, is widely considered to be the first music synthesizer. The Sackbut is played with the right hand on the keyboard and the left hand on a control board above the keyboard. The right hand controls volume by applying more or less pressure to the keys, while the left hand controls four different sound-texture options.
1965 E.A. Johnson Touch Screen
Believed by some to be the world's first touch screen, the screen developed by E.A. Johnson of England's Royal Radar Establishment uses touch technology similar to that found in today's tablets but can read only one touch at a time. Johnson's touch technology was used for air traffic control in the U.K. up until around 1995 and serves as a precursor to the screens found on today's ATMs, ticketing machines and outdoor kiosks.
1972 PLATO IV Touch-Screen Terminal
The University of Illinois' PLATO IV terminal, part of the PLATO educational computer systems the school started developing in the '60s, has an infrared touch panel that allows students to answer questions by touching the screen. Though other touch-screen devices existed before PLATO IV, it is the first to be widely known and used in Illinois classrooms.
1982 First Multitouch Device
Nimish Mehta of the University of Toronto develops a touch tablet that can read multiple points of contact thanks to a video camera that can communicate with a computer.
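As a rough illustration of how a camera-based tablet can resolve several simultaneous contacts, the sketch below (a hypothetical example, not Mehta's actual system) flood-fills each bright blob in a binarized camera frame and reports one centroid per blob — in effect, one point per finger.

```python
# Hypothetical sketch: finding multiple touch points in a binary
# "camera frame", in the spirit of a camera-based multitouch tablet.

def find_touches(frame):
    """Return one (row, col) centroid per connected blob of 1s."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] and not seen[r][c]:
                # Flood-fill this blob and average its pixel coordinates.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and frame[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                touches.append((cy, cx))
    return touches

# Two separate fingers produce two distinct contact points.
frame = [
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 1],
]
print(len(find_touches(frame)))  # 2
```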
1983 Video Place
Myron Krueger develops an optical system that tracks hand movements so users can interact using gestures. According to multitouch pioneer Bill Buxton, Krueger's research essentially wrote the book on rich gestural interaction.
1984 First Multitouch Screen
Bob Boie of Bell Labs develops the first multitouch screen, which allows users to manipulate graphics with their fingers.
1984 Casio AT-550 Watch
The face of Casio's AT-550 watch is a touch screen that complements the watch's built-in calculator. The user can put the watch in calculator mode by pushing a button on the lower-left side, then use a finger to draw each digit and mathematical operation (plus sign, minus sign, numerals, etc.) on the screen.
1987 Apple Desktop Bus
The Apple Desktop Bus, or ADB, is an early peripheral connector that prefigures USB. It first appears on Apple's Macintosh II and Macintosh SE and allows multiple devices — mouse, keyboard, joystick — to be plugged in at the same time. The development of multiple-input technology like ADB can be linked to the two-point pinching and stretching functions seen on today's smartphones and tablets because, as Buxton puts it, "the software just needs you to specify two points, and doesn't really care how you specify them, in a language that it understands." The ADB is the first system that allows someone to connect two pointing devices at the same time.
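Buxton's point — that the software only needs two points, however they arrive — can be shown with a small sketch (hypothetical code, not any shipping implementation): the zoom factor of a pinch or stretch is simply the ratio of the distance between the two points at the end of the gesture to the distance at the start.

```python
from math import hypot

def pinch_scale(p1_start, p2_start, p1_end, p2_end):
    """Zoom factor = ratio of point separation after vs. before the gesture."""
    d_start = hypot(p1_start[0] - p2_start[0], p1_start[1] - p2_start[1])
    d_end = hypot(p1_end[0] - p2_end[0], p1_end[1] - p2_end[1])
    return d_end / d_start

# Two points move from 100 px apart to 200 px apart: a 2x zoom in.
print(pinch_scale((100, 300), (200, 300), (50, 300), (250, 300)))  # 2.0
```

The function never asks where the points came from — two fingers, a mouse plus a trackball, or any other pair of pointing devices would do.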
1993 Simon Personal Communicator Phone
In 1993, IBM and telephone company BellSouth team up to release Simon, a cellphone that one 1993 press release describes as "a wireless machine, a pager, an electronic mail device, a calendar, an appointment scheduler, an address book, a calculator and a pen-based sketchpad." It is the first product to combine touch-screen technology with a telephone.
1998 FingerWorks
University of Delaware academics John Elias and Wayne Westerman co-found FingerWorks in 1998 and start producing, among other things, a gesture-operated keyboard and the iGesture Pad, a concept similar to Apple's Magic Trackpad. Apple buys FingerWorks in 2005 and puts its technology and its innovators to work on development of the iPhone and iPad.
2005 JazzMutant Lemur
In 2003, three friends from Bordeaux, France, create a multitouch screen capable of tracking any number of fingers. By 2005, their company, JazzMutant, releases the Lemur, a music controller with a multitouch screen interface. JazzMutant is the first to make transparent multitouch screen technology — a display whose graphics can be manipulated by touching it directly — available to consumers.
2005 TactaPad
The TactaPad uses a camera to project the user's hand movements onto the computer desktop, while a touch pad lets the user carry out actions. The touch pad gives tactile responses to those actions, like a firm, fall-through sensation when pressing an enabled button or a stiff, buzzing sensation when pressing a disabled one. Though introduced in 2005, the TactaPad never makes it to the consumer market.
2007 Apple iPhone
Apple is the first to successfully release a touch-screen smartphone into the marketplace. Its iPhone has a sleek, user-friendly design and limited multitouch capability, so initially someone can't hold the shift key with one finger and type an uppercase character with another finger in keyboard mode. It does, however, support the pinching function — first developed by researcher Myron Krueger — used to zoom in and out of maps or photos.
2010 Apple iPad
With the iPad, Apple brings touch-screen technology to everyday computing by providing a larger screen. The iPad uses multitouch technology and allows users to surf the Web, read and send email, look at photos, watch videos, listen to music, play games and read e-books, among other things.
2011 Surface 2.0
Samsung's SUR40 uses Microsoft's Surface software, developed in 2007, to create an interactive tabletop. The SUR40 uses optical sensors to track multiple fingers and hands, as well as objects, on its surface, meaning that it can actually "see" what someone puts on it.