With the advent of dual- and multi-touch pointer devices, users can now use gestures to interact with the operating system and applications. The major desktop operating systems all cater for gesture interaction and promote the use of gestures within applications through well-defined gesture application interfaces.
UPDD can be configured to support single, dual and multi-touch devices. For each stylus in use, the positional data is made available on the UPDD API and can optionally be passed to the OS and/or applications to cater for gesture utilisation.
The injection of single, dual and multi-touch data into operating systems and their applications is dictated by the interface methods implemented within the OS environment. These are the interface methods utilised by the UPDD driver:
The interpretation of a gesture is dependent on the OS and application. There are many articles on the web describing gesture utilisation by the OS desktop (window manager) and key standard applications. However, a very useful gesture reference guide can be found here.
It is our intention to build full gesture support into the driver and/or supporting utilities, but in some cases we currently utilise external UPDD API based apps to handle gesture functionality. The current implementation of gesture support for the various operating systems is as follows:
Under Windows Vista and Windows 7 the driver has a user setting in the UPDD Console called Extended touch. If enabled, all touches are fed to the OS via the virtual HID device to invoke the extended touch functionality (gestures etc.) built into these operating systems. If disabled, all single touches and the touch data from the first stylus (of a multi-touch device) are passed to the OS via the mouse port interface (mouse emulation).
UPDD CE 4.1.10 handles touch via the CE standard GWES interface, so CE gesture support can be utilised by any touch device using the UPDD CE driver.
For this OS we use a standalone application, available here, which supports Snow Leopard, Lion and Mountain Lion gestures and inking.
Important Note: Gesture functionality will only work on a production version of the driver. If you run the software on an evaluation version of the driver the following message will be displayed: “The UPDD Gestures software requires a licenced version of the UPDD driver. The driver currently installed is an evaluation version. To purchase the full version of the driver, please visit http://touch-base.com/”
Installation and running the gesture software
To utilise gestures and inking in the Mac environment you simply need to run the gesture application on a system with a production version of UPDD 4.1.10 or above installed. Note that some UPDD driver installs automatically install the gesture software and create a startup item to invoke it at user logon.
With the latest version the compressed file expands to create the application file ‘UPDD Gestures’. It is highly recommended that this file be moved to the standard Utilities folder along with the other UPDD applications.
Once invoked, a Menu Bar icon indicates that the gesture application is loaded and running and can be used to quit the gesture touch function. The latest version also has a Settings menu item to invoke the gesture graphical user interface and a link to show the gesture log. Since May 2013 there is also a revision history option to view the changes in each release.
The application interacts with the core UPDD driver or TUIO server to receive all touches, calculates the gesture being performed, injects it into the OS as a touch event, and also passes single touch events to the tablet interface.
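The gesture calculation step described above can be sketched as follows. This is a hypothetical, minimal classifier (not UPDD's actual implementation) that distinguishes a pinch, rotate or two finger drag by comparing the distance and angle between two touch points at the start and end of a movement:

```python
import math

def classify_two_finger_gesture(p0_start, p1_start, p0_end, p1_end,
                                dist_tol=5.0, angle_tol=0.05):
    """Classify a two-finger movement as 'pinch', 'rotate' or 'drag'.

    Points are (x, y) tuples in screen pixels. The thresholds are
    illustrative only; a real gesture engine tracks movement over time.
    """
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    # How much the gap between the fingers grew/shrank, and how much
    # the line joining them turned, over the course of the movement.
    d_change = dist(p0_end, p1_end) - dist(p0_start, p1_start)
    a_change = angle(p0_end, p1_end) - angle(p0_start, p1_start)

    if abs(d_change) > dist_tol:
        return "pinch"   # fingers moved apart or together
    if abs(a_change) > angle_tol:
        return "rotate"  # the line between the fingers turned
    return "drag"        # both fingers moved together in parallel
```

For example, two fingers moving apart (`(0,0),(100,0)` to `(-25,0),(125,0)`) classify as a pinch, while both fingers translating by the same offset classify as a drag.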
The schematic for the gesture interface is as follows:
The gesture and inking functions are described below.
Gestures are performed on the touch screen exactly as they are on a track-pad. The action associated with each gesture can be defined in the gesture settings dialog. To utilise all available gestures you will need a multi-touch screen that supports up to 5 touches; otherwise you will be restricted to the gestures that relate to the number of styli supported by the touch screen.
To view the gestures being calculated by the gesture engine in real time, invoke the Show Gestures Log option in the menu bar.
Invoking Gesture feature at startup
Until we merge gesture support into the driver it will remain an external function that needs to be invoked, manually or automatically, at startup. The easiest way to get UPDDGestures to start at boot is one of the following:
· Enable via the Gesture GUI
1. Invoke the Gesture GUI via the Gesture Menu item, Settings entry.
2. Select the Other Settings dialog
3. Check the ‘Start UPDDGestures at login’ option
· Manually set up a start up item
1. Open System Preferences (in the Apple menu)
2. In the System Preferences window, select "Users & Groups"
3. Select the "Login Items" section
4. Add "Open me to start UPDD Gestures" to the login items. (You can drag its icon onto the list, or click the "+" button and add it using a file browser dialog.)
The Two Finger Tap invokes a right click, which by default is generated under the left stylus. This behaviour can be changed to generate the right click under the right-most stylus. A time threshold is also configurable, specifying the period within which a two finger tap must occur.
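The time threshold and left/right-most behaviour can be sketched as below. This is an illustrative model only (the field names and default threshold are assumptions, not UPDD's actual settings): both fingers must go down and come up within the threshold, and the click position is taken from the left-most or right-most touch:

```python
def two_finger_tap_click_x(touches, threshold_ms=250, use_rightmost=False):
    """Decide whether two touch records form a two finger tap and,
    if so, return the x position at which to generate the right click.

    `touches` is a list of two dicts with 'x', 'down_ms' and 'up_ms'
    keys (hypothetical field names). Returns None when the touches do
    not complete within the configured time threshold.
    """
    if len(touches) != 2:
        return None
    start = min(t["down_ms"] for t in touches)  # first finger down
    end = max(t["up_ms"] for t in touches)      # last finger up
    if end - start > threshold_ms:
        return None  # too slow: treat as something other than a tap
    # Default: click under the left-most stylus; optionally right-most.
    pick = max if use_rightmost else min
    return pick(t["x"] for t in touches)
```

For example, two touches at x=100 and x=300 completing within 150 ms yield a click at 100 by default, or at 300 when `use_rightmost` is set.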
The Press and Tap invokes a right click, which by default is generated under the first stylus. This behaviour can be changed to generate the right click under the second stylus.
In OS X Snow Leopard, four finger swipes typically invoke one of the "Expose" features, or invoke the application switcher. Unfortunately there is no supported way to programmatically activate these features, so UPDDGesturesmacosx posts keystrokes that trigger them. Since the hot key for the "Expose" feature can be configured, UPDDGesturesmacosx reads in the Apple hot key preferences to determine which keystroke is the correct one to press. We believe this works quite successfully, and in our tests these features are activated consistently. We are keen to find out if it works consistently for our users – any feedback much appreciated!
Single Touch gestures
Mac OS gestures utilise 2 or more touches. However, in some circumstances users may wish to map single touches to simulate flicks and swipes, especially when using single-touch screens.
With the latest gesture software each gesture can now be configured on a gesture-by-gesture basis. A typical gesture configuration for single-touch screens would be as follows:
Set Tap -> Click*
Set Press -> Click and drag*
Set Drag -> Scroll
* For these gestures this action is the default.
Finally, check the "Disable multitouch gestures" option in the "Other Settings" section.
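The single touch configuration above can be represented as a simple gesture-to-action map. The gesture and action names below are illustrative only, not UPDD's actual setting identifiers:

```python
# Default actions for the gestures mentioned above (illustrative names).
DEFAULT_ACTIONS = {"tap": "click", "press": "click_and_drag"}

def single_touch_config():
    """Build the gesture configuration described above for a
    single-touch screen: tap and press keep their default actions,
    drag is mapped to scroll, and multi-touch gestures are disabled."""
    config = dict(DEFAULT_ACTIONS)
    config["drag"] = "scroll"
    config["disable_multitouch"] = True
    return config
```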
The iOS Simulator allows applications built for iOS devices (such as the iPhone and iPad) to be developed and tested on a Mac system. To test gestures in this environment you normally hold down the alt/apple key on the keyboard and use a mouse. For users wishing to test touch gestures with a dual/multi-touch screen we have introduced an option in the gesture engine to run in ‘iOS simulation mode’. This option can be enabled in the gesture settings dialog.
When running in iOS simulator mode please note the following:
1. There is a setting in Mac OS X that allows applications to use "accessibility" features for interacting with windows and other elements on the screen, and it must be enabled for gestures to work in iOS Simulator mode. Here's how to turn it on:
2. When starting a two finger gesture it is necessary to send an event releasing the first finger before sending an event pressing both fingers down. This did not have any noticeable effect in our tests with iOS apps.
3. At the start of a two finger gesture there will be a little visual "blip". This is because the mouse is being repositioned so that the touches in the iOS Simulator match the touches on the touch screen.
4. It is difficult to send the exact movement of both fingers into the simulator, so it is possible for the touches in the simulator to diverge slightly from the touches on the touch screen. However, performing individual pinch, rotate and two finger drag gestures works as expected, and the divergence did not have a noticeable effect in the tests we performed.
Qt – cross-platform development tool
Some multi-touch applications are built with a cross-platform development tool called Qt and use the QTouchEvent interface to receive system-level touches. Unfortunately the standard way Qt determines the screen location of touches in Mac OS X is incompatible with UPDDGestures: Qt assumes that all system touches originate from a trackpad. In that case Qt starts the touch at the mouse cursor location (which is not what is needed in a touch screen environment) and calculates the touch's movement speed using the physical dimensions of the trackpad. Since no trackpad is present, Qt gets no dimensions; it therefore treats the trackpad as having a width and height of 0, and consequently the touches never move.
For touch-enabled Qt applications to work with UPDDGestures they must use the normalized position of the touches, not the screen position. Sadly, we suspect most Qt apps use a touch's screen position.
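The distinction matters because a normalized position is expressed as a fraction of the screen (0..1 on each axis) and so does not depend on trackpad dimensions. A minimal sketch of the conversion, assuming a touch reports screen pixels and the screen size is known (Qt itself exposes this via QTouchEvent's touch points):

```python
def normalized_position(screen_x, screen_y, screen_w, screen_h):
    """Convert a touch's screen position (in pixels) to the
    normalized 0..1 position a Qt app should use with UPDDGestures.

    Unlike trackpad-relative speed calculations, this value is
    well defined even when no trackpad is present.
    """
    return (screen_x / screen_w, screen_y / screen_h)
```

A touch at the centre of a 1920x1080 screen normalizes to (0.5, 0.5) regardless of what input hardware produced it.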
One such popular multi-touch enabled application is Snowflake from NUITEQ. The developers of Snowflake are working on changing the interface to utilise the normalized position, but until this change is made you will need to use our UPDD TUIO bridge to utilise Snowflake’s TUIO interface. This has been tested and works well, but it does mean that the gesture software cannot be used at the same time as the TUIO interface, as doing so causes a phantom touch in Snowflake.
Multi-touch browser applications
If you are writing touch-enabled web pages (e.g. http://alteredqualia.com/touchtoy/) for use or testing on Mac OS X with touch devices, one solution is to add some extra code to allow the page to receive touches from TUIO, using a TUIO interface.
Here is a page that we found that outlines how it can be achieved: http://smus.com/multi-touch-browser-patch/
When using the UPDD driver, the only difference is that rather than running "TongSeng TUIO tracker" (which converts Magic Trackpad touches into TUIO) you run the UPDD TUIO server, and the browser receives touches from the UPDD driver.
Ideally there is a need for a browser extension that receives touches from TUIO or UPDD and posts them into the browser as W3C TouchEvents (which we believe is the correct name for the in-browser touch events supported by iOS and Android). That would allow all of these touch-enabled web pages to work with TUIO or UPDD touch devices out of the box. We are experimenting with a browser hack to see if this can be achieved.
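The core of such a bridge is a coordinate mapping: TUIO 2D cursors carry normalized 0..1 coordinates plus a session id, while a W3C Touch object wants an identifier and page-pixel positions. A sketch of that mapping (a model of the idea only, not a real browser extension):

```python
def tuio_cursor_to_touch(session_id, nx, ny, page_w, page_h):
    """Map a TUIO /tuio/2Dcur cursor (session id plus normalized
    0..1 x/y) to the key fields of a W3C Touch object in page pixels.

    The returned dict mirrors Touch.identifier/pageX/pageY; a real
    extension would dispatch these inside touchstart/touchmove events.
    """
    return {
        "identifier": session_id,  # stable id for the finger's lifetime
        "pageX": nx * page_w,
        "pageY": ny * page_h,
    }
```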
Inking allows drawings and handwriting on tablet-type devices to interact with applications. When a real or virtual tablet is seen by the OS the inking function is enabled. With the latest version of the gesture application, available since 14/9/11, the inking function is also enabled and touch data is passed to the tablet interface. Real tablets pass more data than the X and Y co-ordinates, such as stylus angle, but when touch is being used this type of data has a fixed value.
After installing the software, and if Inking is available on the system, the Inking option is shown in the System preferences:
Launch the Ink settings panel to enable handwriting recognition.
Once enabled, the Ink floating windows will be displayed.
In the following example the touch screen has been used to write “Touch” on to the Inking paper and has been translated ready to be sent to the waiting application:
With Inking enabled, writing into any ink aware application will invoke an inking area in which to write, as in this example:
In addition to hand writing recognition and drawing, gestures can be used to perform various app functions, as listed below:
Given that the UPDD inking function is implemented at a software level and does not create a virtual tablet device, some inking applications may not enable their inking capabilities due to the lack of a real tablet device on the system.
Further, given that there is no dedicated ‘tablet stylus’ in use, the "hold lower button to ink" and "hold upper button to ink" settings have no meaning when inking with UPDD.
By default the gesture actions mimic the gestures associated with an Apple multi-touch trackpad for the host version of Mac OS X, but they can be defined as required. Gesture actions and other gesture settings are held in the UPDD settings file. Further, the latest version now provides a graphical user interface for maintaining and updating the settings.
Gesture GUI and UPDD settings file
With the latest gesture software, first released in Jan 2013, the gesture settings are stored in the UPDD settings file and a graphical interface is available for defining and maintaining them. Settings in the UPDD settings file can also be updated with the UPDD command line interface.
1. Some of the actions that can be invoked by a gesture, such as Notification Center, require that "Enable Access for Assistive Devices" is turned on in the "Universal Access" system preferences.
2. Really important point: when using gestures in Mac OS X, the gestures are processed by the application window under the mouse cursor. Dual and multi-touch gestures can be performed on any part of the touch screen but will be processed by the window under the cursor. So, for example, if you have a Preview window open and the cursor is in the viewing area, that area will respond to gestures. If the cursor is on the Preview dialog but not in the viewing area, gestures will be ignored.
Driver setting considerations
The gesture application turns off the UPDD mouse interface and receives all touch data. There are a number of UPDD utilities, such as calibration and test, that re-enable the mouse interface when they terminate. Until we change these utilities to retain the current mouse port state they should only be used with the gesture application disabled. Since the April 2013 release the gesture software caters for this situation.
Lion Full Screen Mode
A user reported that ‘in Lion, moving the "mouse arrow" to the top of the screen may not reveal the Mac OS window bar necessary to get out of Full Screen mode, necessitating a proper mouse‘. This may be because the cursor, being under the stylus, stops short of the top of the screen. You can force the cursor to the top by using the Edge Acceleration settings in the UPDD Console, Properties dialog, described here.
When the gesture software loads, a gesture menu bar item will be shown. This indicates that the program is running and, hopefully, working as expected. If this is not the case you can run the gesture application from a terminal window and see if any error messages are issued when the program is invoked.
To run the gesture application execute the following command line:
/Applications/Utilities/UPDD\ Gestures.app/Contents/MacOS/UPDD\ Gestures
Please report any error message shown to email@example.com.
Creates a virtual touch device that is registered with the system as a multi-touch capable device through which all stylus touch data is passed.
The standalone application utilised in Mac OS X calculates individual gestures from the incoming stylus data streams and as such can be considered a ‘gesture engine’. It is our intention in a future release of UPDD to build this gesture engine into the driver so that in all cases this gesture information is made available on UPDD’s API, giving a common interface across all platforms supported by the driver (all individual stylus information is also available).
For further information or technical assistance please email the technical support team at firstname.lastname@example.org.