Interaction with most traditional computer applications comes down to using a mouse and keyboard. The popularity of mobile devices, especially smartphones and tablets, has increased the demand for applications where the interaction happens directly on the surface where the information is displayed.
The combination of gestures and interface elements that can be operated through those gestures is currently represented by the acronym NUI (Natural User Interface). Although this new way of interacting with electronic devices is increasingly present in everyday life, few professionals know how to develop interfaces and applications that extract the true potential of this style of interaction.
In this context, this article presents how to work with multi-touch interfaces in Java through the use of a framework. The aim is first to present the applications, hardware, software and possibilities that multi-touch interaction, combined with a suitable interface, can provide. The article also shows how to install, configure and work with the MT4J framework for the development of platform-independent multi-touch applications. Based on what is presented here, the reader will have an overview of how to implement interfaces that recognize gestures and will be able to adapt what was seen to their own projects.
This second part explains the hardware and software alternatives available for programming multi-touch applications.
Devices and Hardware
The development of interfaces that enable multi-touch interaction is directly dependent on the hardware. This means that the way to program both the interface elements and the user interaction largely depends on the device used to capture the touches and on the screen used to display the information. There are also technical details involving comfort, the ability to recognize pressure, dealing with the natural oils left on the screen by the user's hands, the possibility of recognizing visual elements, among others.
Currently there are several hardware options that enable multi-touch interaction and more complex gestural interactions. This section presents some examples of this hardware; the next section discusses how it can be used with the framework described in this article.
Perhaps the first examples that come to mind when touch interfaces are mentioned are smartphones, cell phones and tablets, which combine a touchscreen with a mobile device. Although these devices can be used in various locations, two features they all share are the limited screen size and the fact that they serve only one user at a time.
The entertainment industry, particularly the producers of video game consoles, is always looking for new ways for consumers to interact with media. In this scenario we can highlight the Nintendo Wii's motion controller, the WiiMote; Sony's solution for the PlayStation console, which uses the PlayStation Eye camera in conjunction with the PlayStation Move controller; and the recently released Microsoft Kinect, a camera that not only captures the image but also recognizes parts of the body and provides depth data for each pixel. Each of these controllers can be used to build multi-touch interfaces, even without the video game console, thus bringing the possibilities of such interfaces to systems and applications outside the entertainment area.
Another classic example is computers with touch-sensitive screens or devices operated with a tablet pen called a stylus, such as HP's TouchSmart line of computers or IBM ThinkPad laptops. There are also many multi-touch tables that can be assembled at home at low cost or purchased directly from a manufacturer, such as the Microsoft Surface.
Most devices that enable multi-touch interaction are based on combining the surface that captures the touches with the screen that shows the interface elements to the user. However, these two surfaces are not always coupled. Moreover, most of these devices are based on rectangular screens, but there are approaches that go beyond this limitation. For example, Figure 1 shows a sphere-shaped interaction surface (used to ease navigation in maps), an interaction surface in the form of a cube, a touch-sensitive mouse, a desk, a floor for interaction with the feet and even the use of the palm of the hand as an interaction surface.
Figure 1. Different surfaces that support multi-touch interaction.
While each type of hardware used to capture touches and display the interface demands different development resources, the basic principles of interaction are usually the same. Due to the differences between hardware devices, it is important to have some level of abstraction that allows developers to create the interface without worrying about the technical details of a specific hardware implementation. As will be shown in the following sections, the MT4J framework abstracts the hardware in use by relying on the open protocol called TUIO, which allows developers to build their interfaces working only with the events generated by the interface controls, independent of the details of the hardware used. This lets the developer focus on the software and not worry about hardware details while developing.
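To make this idea of abstraction concrete, the sketch below shows a hypothetical listener-based design in plain Java: the application registers listeners for touch events, and the hardware-specific layer (for example, a TUIO client) feeds them normalized coordinates. All names here are illustrative; they are not part of the MT4J API, which is covered in part three.

```java
// Hypothetical sketch of hardware abstraction: the application registers
// listeners for touch events and never deals with device specifics.
// All names are illustrative, not taken from MT4J.

import java.util.ArrayList;
import java.util.List;

interface TouchListener {
    void onTouch(int cursorId, float x, float y);
}

class TouchDispatcher {
    private final List<TouchListener> listeners = new ArrayList<>();

    void addListener(TouchListener l) { listeners.add(l); }

    // Called by the hardware-specific layer (e.g. a TUIO client);
    // coordinates are normalized to [0, 1] regardless of screen size.
    void dispatch(int cursorId, float x, float y) {
        for (TouchListener l : listeners) l.onTouch(cursorId, x, y);
    }
}

public class AbstractionDemo {
    public static void main(String[] args) {
        TouchDispatcher dispatcher = new TouchDispatcher();
        dispatcher.addListener((id, x, y) ->
            System.out.println("cursor " + id + " at (" + x + ", " + y + ")"));
        dispatcher.dispatch(0, 0.5f, 0.5f); // simulated touch from the tracker
    }
}
```

The benefit of this design is that swapping the capture hardware only changes who calls `dispatch()`; the application code that reacts to touches stays the same.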
When someone starts to study alternatives for multi-touch development, one of the first findings is that there are several platforms, software development kits (SDKs), frameworks, libraries and other development resources, and that these are generally tied to a particular device.
Consider the case of developing native applications for smartphones and tablets. Apple products (iPhone, iPod touch and iPad) require a development environment exclusive to the Macintosh (the iPhone SDK uses the Objective-C language). Devices running the Android operating system, on the other hand, have their own Java-based SDK. Windows 7 also has features for developing applications with multi-touch interaction, likely to be exploited in future Nokia phones running Windows Phone. Other manufacturers (Motorola, Intel, HP, Sony, Nintendo, etc.) also have exclusive development platforms with relatively steep learning curves.
Due to this variety of platforms and approaches for developing multi-touch applications, a developer can easily get lost. While hardware manufacturers recommend their platform-specific development resources, there is now an alternative based on a standard protocol: TUIO (Tangible User Interface Objects). This open protocol, which encodes its messages in the Open Sound Control (OSC) format, works as follows: first the device captures the touches according to the details of the hardware. Then a hardware-specific TUIO implementation transforms the details of each touch (coordinates, duration, etc.) into events that are sent over the network (typically UDP, with 3333 as the default port) to a specific address and port. It is the developer's responsibility to configure the address to which the TUIO implementation sends messages and to receive the events from that connection. Figure 2 shows a schematic of how the TUIO protocol interacts with a multi-touch table that uses computer vision as the mechanism for capturing multiple touches.
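The address-and-port delivery step described above can be illustrated with a small, self-contained Java sketch: a "tracker" socket sends a single simulated touch event to a "client" socket over the loopback interface. The plain-text message format is ours, purely for illustration; real TUIO trackers encode events as OSC messages.

```java
// Minimal sketch of the tracker -> client hop over UDP (TUIO's usual
// transport). A plain string stands in for the real OSC-encoded event;
// all names and the message format here are illustrative.

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class TuioFlowSketch {

    // Send one event from a "tracker" socket to a "client" socket on
    // localhost and return what the client received.
    static String roundTrip(String event) throws Exception {
        try (DatagramSocket client = new DatagramSocket(0);   // ephemeral port
             DatagramSocket tracker = new DatagramSocket()) {

            // "Tracker" side: send the encoded touch event to the
            // address/port the developer configured (here: the client).
            byte[] payload = event.getBytes(StandardCharsets.UTF_8);
            tracker.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getLoopbackAddress(), client.getLocalPort()));

            // "Client" side: receive and decode the event.
            byte[] buf = new byte[1024];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            client.setSoTimeout(2000);
            client.receive(packet);
            return new String(packet.getData(), 0, packet.getLength(),
                    StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("received: " + roundTrip("cursor 1 0.42 0.58"));
    }
}
```

In a real deployment the tracker and client usually run on different machines (or at least different processes), which is exactly why the protocol decouples the capture hardware from the application.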
Figure 2. Schematic of an application that uses the TUIO protocol on a multi-touch table.
In Figure 2, a TUIO implementation (the TUIO tracker application) interacts with a computer vision system represented by the camera. This implementation sends touch events over the network, encoded according to the TUIO protocol standard. These events are received by the client application (the TUIO client application), which contains the gestural interface and the application itself. In this example, the application is shown to the user through a projector facing the transparent underside of the multi-touch table.
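For the curious, the sketch below hand-encodes a single, simplified OSC message of the kind a TUIO tracker emits for a cursor update (`/tuio/2Dcur`). Real trackers send OSC bundles containing "alive", "set" and "fseq" messages with more arguments; this sketch keeps only a session id and the x/y position so as to show the wire format: NUL-padded, 4-byte-aligned strings followed by big-endian 32-bit arguments.

```java
// Hedged sketch of the OSC wire format used by TUIO trackers. The
// argument list is simplified (real "set" messages also carry velocity
// and acceleration); this is an illustration, not a full implementation.

import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class OscSketch {

    // Pad a string with NULs to a multiple of 4 bytes, as OSC requires.
    static byte[] oscString(String s) {
        byte[] raw = s.getBytes(StandardCharsets.US_ASCII);
        int padded = (raw.length / 4 + 1) * 4;   // always at least one NUL
        byte[] out = new byte[padded];
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;
    }

    // Encode "/tuio/2Dcur set <sessionId> <x> <y>" (simplified arguments).
    static byte[] encodeCursor(int sessionId, float x, float y) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(oscString("/tuio/2Dcur"));   // address pattern (padded to 12)
        out.write(oscString(",siff"));         // type tags: string, int32, 2 floats
        out.write(oscString("set"));
        out.write(ByteBuffer.allocate(12)
                .putInt(sessionId).putFloat(x).putFloat(y).array());
        return out.toByteArray();
    }

    // Decode just the x/y floats back out of the packet for demonstration.
    static float[] decodeXY(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet);   // big-endian by default
        // skip address (12 bytes), type tags (8), "set" (4), int32 (4)
        buf.position(12 + 8 + 4 + 4);
        return new float[] { buf.getFloat(), buf.getFloat() };
    }

    public static void main(String[] args) throws Exception {
        byte[] packet = encodeCursor(1, 0.42f, 0.58f);
        float[] xy = decodeXY(packet);
        System.out.println("decoded x=" + xy[0] + " y=" + xy[1]);
    }
}
```

In practice the client never hand-decodes these bytes; a TUIO client library (such as the one MT4J builds on) does it and hands the application ready-made events, as the previous paragraph describes.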
In this article we discuss the MT4J (Multitouch for Java) framework, an open framework written in Java and released under the GPL license. This framework is useful for developing applications that receive TUIO protocol messages independent of the hardware used. The official MT4J web site, where anyone can download the latest version, documentation, examples and other resources, is http://www.mt4j.org/.
Interaction with an application through gestures, combined with an interface suited to that interaction, has been increasingly explored in devices such as smartphones, tablets, kiosks, tables and other multi-touch devices. Due to the demand for applications with this type of interaction, this article focuses on developing multi-touch applications in Java using the MT4J framework. It first presented the applications, hardware and software possibilities that multi-touch interaction can provide, and it also describes how to install, configure and work with the MT4J framework for the development of platform-independent multi-touch applications.
This second part of the article discussed some alternatives for programming multi-touch applications and presented the TUIO protocol, focusing on a more abstract level of programming.
In part three, we will see the MT4J framework.
To see the first part, access: http://mrbool.com/p/Multi-Touch-Programming-with-Java-Part-1/23158
Specification and implementations of the protocol TUIO.
Information on the TouchSmart line of computers from HP.
Touch Gesture Reference Guide.
Information about the Microsoft Sphere.
Information about the Gesture Cube.
Information about the Microsoft Touch Mouse.
Information about the BendDesk.
Information about the multi-touch floor.
Information about the SixthSense project.
A guide for creating multi-touch table with low cost.
Information about the Microsoft Surface.