April 28, 2017
 

Touching the future: the rise of multitouch interfaces

Johannes Schöning
Advances in sensor and hardware implementation enable expressive gestural control and fluid multi-user collaboration in human/computer interactions.

Multitouch technology with computationally enhanced surfaces has attracted considerable attention in recent years, not least because of its potential to improve human/computer interaction. Optical approaches to system design use image processing to determine the locations of interactions with the surface. Infrared (IR) illumination and simple setups mean that these systems can potentially be very robust. Hardware implementations, such as frustrated total internal reflection (FTIR) and diffused illumination (DI), have enabled low-cost development of multitouch surfaces. Laser-light plane and diffused-screen illumination offer other advantages.1
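Whatever the illumination scheme, these optical approaches share the same image-processing core: finger contacts appear to an IR camera as bright blobs, which are thresholded and grouped into connected components whose centroids become touch points. A minimal sketch of that step, using a hypothetical 8×8 grayscale frame rather than real camera data:

```python
def detect_touches(frame, threshold=128):
    """Return centroids (row, col) of bright blobs via 4-connected flood fill."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one blob, collecting its member pixels.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid approximates the touch location.
                touches.append((sum(p[0] for p in blob) / len(blob),
                                sum(p[1] for p in blob) / len(blob)))
    return touches

# Two synthetic "finger" blobs on an otherwise dark frame.
frame = [[0] * 8 for _ in range(8)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2)]:   # blob A
    frame[y][x] = 200
for y, x in [(5, 5), (5, 6), (6, 5), (6, 6)]:   # blob B
    frame[y][x] = 220

print(detect_touches(frame))  # [(1.5, 1.5), (5.5, 5.5)]
```

Production systems add background subtraction and frame-to-frame blob tracking on top of this, but the threshold-and-label loop is the common denominator of FTIR and DI sensing.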

Buxton's multitouch Web page2 provides a thorough overview of the underlying technologies and history of multitouch surfaces and interaction. Despite its recent surge in popularity,3 the technology has actually been available in different forms since the late 1970s. However, Han's 2005 presentation of a low-cost, camera-based sensing technique using FTIR truly highlighted the role the technology could play in developing the next generation of human/computer interfaces.4 Han's system was both cheap and easy to build, and he used it to illustrate a range of creative interaction techniques. His YouTube demonstration captured the imagination of experts and laymen alike. Interest increased further in 2007, when Apple released details of the iPhone,5 a mobile phone with a multitouch screen as its user interface. The iPhone's interface and interaction techniques received considerable media attention and brought multitouch to the forefront of the consumer-electronics market.

Later in 2007, Microsoft announced its Surface multitouch table,6 which has the appearance of a coffee table with an embedded interactive screen. Like the HoloWall,7 the Surface uses a DI approach with a diffuser attached to the projection surface and IR illumination from below. Reflections of hands and objects are captured by cameras inside the table. A grid of cameras gives the Surface a sensing resolution sufficient to track objects augmented with visual markers.

The technology's growth is evidenced by the considerable amount of research exploring the benefits of multitouch interaction and surfaces.8–17 In addition, several conferences are held annually in this and related fields. Interactive Tabletops and Surfaces (ITS)18 is the premier conference for presenting research in the design and use of new and emerging tabletop and interactive-surface technologies. As a young community, ITS embraces the discipline's growth in a wide variety of areas, including innovations in hardware, software, and interactive design, and in studies expanding our understanding of design considerations for applications in modern society.

Our goal is to facilitate better application of the technology by highlighting the many disappointing presentations of multitouch-enabled surfaces and by addressing the lack of awareness of multitouch interaction's genuine advantages: multitouch can do more than just rotate and scale pictures and videos on a screen (see Figure 1). Of course, there are also many good examples of applications. Domains such as education, entertainment, and command-and-control scenarios have the potential to highlight the benefits of multitouch interaction. For example, the SMART Table19 represents the first multitouch, multi-user interactive learning experience that allows groups of early-education students to work simultaneously on its surface. Jeff Han's company Perceptive Pixel20 also presents its interactive wall as a tool for command-and-control scenarios. Microsoft's Surface is an ideal platform for various kinds of games (for example, the ‘Firefly’ game or the ‘Dungeons and Dragons’ concept).
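For context, the rotate-and-scale baseline that so many demonstrations stop at is computationally trivial: given two touch points at the start and end of a gesture, the change in their separation gives the scale factor and the change in the angle of the line between them gives the rotation. A hedged sketch (function and variable names are illustrative, not from any particular toolkit):

```python
import math

def two_finger_transform(p0, p1, q0, q1):
    """Scale factor and rotation (radians) taking segment p0-p1 to q0-q1."""
    dx0, dy0 = p1[0] - p0[0], p1[1] - p0[1]
    dx1, dy1 = q1[0] - q0[0], q1[1] - q0[1]
    scale = math.hypot(dx1, dy1) / math.hypot(dx0, dy0)
    rotation = math.atan2(dy1, dx1) - math.atan2(dy0, dx0)
    return scale, rotation

# Fingers move apart while the pair turns 90 degrees counter-clockwise.
scale, rot = two_finger_transform((0, 0), (2, 0), (0, 0), (0, 4))
print(round(scale, 3), round(math.degrees(rot), 1))  # 2.0 90.0
```

That this fits in a dozen lines underlines the article's point: the interesting research questions lie beyond the pinch-to-zoom gesture, in richer gestural vocabularies and multi-user coordination.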


Figure 1. Multitouch interaction in different domains. (A), (B), and (C) Interaction with a virtual globe. (D) Exploration of volumetric medical data.

In our own work, we focus on how multitouch can be used to interact with geospatial information. As a demonstration, we installed our own FTIR-based multitouch wall in a pedestrian underpass and allowed users to navigate through a virtual globe and explore points of interest (see Figure 1). We also examine how other modalities can be combined to enrich interaction with spatial information.21,22 In more recent work, we focus on new paradigms and interactions that combine traditional 2D and novel 3D interactions on a touch surface to form a new class of systems that we refer to as interscopic multitouch surfaces (iMUTS). iMUTS-based user interfaces support interaction with both 2D content displayed in monoscopic mode and 3D content, usually shown stereoscopically.23

Buxton2 says, “Remember that it took 30 years between when the mouse was invented by Engelbart and English in 1965 to when it became ubiquitous.” In time, multitouch will successfully pass through the inevitable media hype and become a genuinely useful technology, a powerful way in which people interact with next-generation computers.


Author

Johannes Schöning
German Research Centre for Artificial Intelligence (DFKI)

Johannes Schöning is a senior researcher in the Innovative Retail Laboratory. He received a diploma in geoinformatics at the Institute for Geoinformatics of the University of Münster (Germany) in 2007. His research interests focus on new methods and interfaces for intuitive navigation through spatial information or, in general, new intelligent interfaces that help people solve daily tasks more effectively.


References
  1. Johannes Schöning, Peter Brandl, Florian Daiber, Florian Echtler, Otmar Hilliges, Jonathan Hook, Markus Löchtefeld, Nima Motamedi, Laurence Muller, Patrick Olivier, Tim Roth and Ulrich von Zadow, Multi-Touch Surfaces: A Technical Guide, Technical University of Munich (Germany), 2008.

  2. Bill Buxton, Multi-touch systems that I have known and loved. http://www.billbuxton.com/multitouchOverview.html Accessed 28 March 2010.

  3. Johannes Schöning, Antonio Krüger and Patrick Olivier, Multi-touch is dead, long live multi-touch, Proc. 27th Int'l Conf. Human Fact. Comput. Syst. (CHI '09), pp. 1-4, 2009.

  4. Jefferson Y. Han, Low-cost multi-touch sensing through frustrated total internal reflection, Proc. 18th Annu. Assoc. Comput. Machinery Symp. User Interf. Softw. Technol. (UIST '05), pp. 115-118, 2005.

  5. http://www.apple.com/iphone Apple's website for the iPhone. Accessed 28 March 2010.

  6. http://www.microsoft.com/surface Microsoft website for the Surface interactive computer. Accessed 28 March 2010.

  7. Nobuyuki Matsushita and Jun Rekimoto, HoloWall: designing a finger, hand, body, and object sensitive wall, Proc. 10th Annu. Assoc. Comput. Machinery Symp. User Interf. Softw. Technol. (UIST '97), pp. 209-210, 1997.

  8. Paul Dietz and Darren Leigh, DiamondTouch: a multi-user touch technology, Proc. 14th Annu. Assoc. Comput. Machinery Symp. User Interf. Softw. Technol. (UIST '01), pp. 219-226, 2001.

  9. Jun Rekimoto, SmartSkin: an infrastructure for freehand manipulation on interactive surfaces, Proc. Spec. Interest Group Comput./Human Interact. Conf. Human Fact. Comput. Syst. (CHI '02), pp. 113-120, 2002.

  10. Tomer Moscovich, Multi-touch interaction, Extend. Abstr. Human Fact. Comput. Syst. (CHI '06), pp. 1775-1778, 2006.

  11. Alessandro Valli and Lorenzo Linari, Natural interaction sensitivetable, Extend. Abstr. Human Fact. Comput. Syst. (CHI '08), pp. 2315-2318, 2008.

  12. Johannes Schöning, Brent Hecht, Martin Raubal, Antonio Krüger, Meredith Marsh and Michael Rohs, Improving interaction with virtual globes through spatial thinking: helping users ask “why?”, Proc. 13th Int'l Conf. Intell. User Interf. (IUI '08), pp. 129-138, 2008.

  13. Tomer Moscovich and John F. Hughes, Indirect mappings of multi-touch input using one and two hands, Proc. 26th Annu. Spec. Interest Group Comput./Human Interact. Conf. Human Fact. Comput. Syst. (CHI '08), pp. 1275-1284, 2008.

  14. Andrew D. Wilson, Shahram Izadi, Otmar Hilliges, Armando Garcia-Mendoza and David Kirk, Bringing physics to the surface, Proc. 21st Annu. Assoc. Comput. Machinery Symp. User Interf. Softw. Technol. (UIST '08), 2008.

  15. Jacob O. Wobbrock, Meredith Ringel Morris and Andrew D. Wilson, User-defined gestures for surface computing, Proc. 27th Int'l Conf. Human Fact. Comput. Syst. (CHI '09), pp. 1083-1092, 2009.

  16. Julien Epps, Serge Lichman and Mike Wu, A study of hand shape use in tabletop gesture interaction, Extend. Abstr. Human Fact. Comput. Syst. (CHI '06), pp. 748-753, 2006.

  17. Clifton Forlines, Alan Esenther, Chia Shen, Daniel Wigdor and Kathy Ryall, Multi-user, multi-display interaction with a single-user, single-display geospatial application, Proc. 19th Annu. Assoc. Comput. Machinery Symp. User Interf. Softw. Technol. (UIST '06), pp. 273-276, 2006.

  18. http://www.its2010.org/ Interactive Tabletops and Surfaces Annu. Conf.

  19. http://www2.smarttech.com/st/en-US/Products/SMART+Table/ SMART Technologies interactive learning centre. Accessed 28 March 2010.

  20. http://www.perceptivepixel.com Perceptive Pixel. Accessed 28 March 2010.

  21. Florian Daiber, Johannes Schöning and Antonio Krüger, Whole body interaction with geospatial data, Smart Graph., pp. 81-92, 2009.

  22. Dimitar Valkov, Frank Steinicke, Gerd Bruder and Klaus H. Hinrichs, Navigation through geospatial environments with a multi-touch enabled human-transporter metaphor, Geoinformatik. In press.

  23. Johannes Schöning, Frank Steinicke, Antonio Krüger, Klaus Hinrichs and Dimitar Valkov, Bimanual interaction with interscopic multi-touch surfaces, Proc. Human-Comput. Interact. (INTERACT 2009), pp. 40-53, 2009.


 
DOI:  10.2417/2201003.002864