With the advent of dual and multi-touch pointer devices, users can now use gestures to interact with the operating system and applications. The major desktop operating systems all cater for gesture interaction and promote the use of gestures within applications via well-defined gesture application program interfaces.
UPDD can be configured to support single, dual and multi-touch devices. For each stylus in use, the positional data is made available on the UPDD API and can optionally be passed to the OS and/or applications to cater for gesture utilisation.
The injection of single, dual and multi-touch data into operating systems and their applications is dictated by the interface methods implemented within the OS environment. These are the interface methods utilised by the UPDD driver:
The interpretation of the gesture is dependent on the OS and application. There are many articles on the web describing gesture utilisation by the OS desktop (window manager) and key standard applications. However, a very useful gesture reference guide can be found here.
It is our intention to build full gesture support into the driver and/or supporting utilities, but in some cases we currently utilise external UPDD API based apps to handle gesture functionality. The current implementation of gesture support for the various operating systems is as follows:
Under Windows Vista and Windows 7 the driver has a user setting in the UPDD Console called Extended touch. If enabled, all touches are fed to the OS via the virtual HID device to invoke the extended touch functionality (gestures etc.) built into these operating systems. If disabled, all single touches, and the touch data from the first stylus of a multi-touch device, are passed to the OS via the mouse port interface (mouse emulation).
UPDD CE 4.1.10 handles touch via the CE standard GWES interface, so CE gesture support can be utilised by any touch device using the UPDD CE driver.
For this OS we use a standalone application which supports gestures and inking in Snow Leopard (10.6), Lion (10.7), Mountain Lion (10.8) and Mavericks (10.9).
This document refers to gesture version 2.0.14 and above, issued 11th July 2013.
Installing and running the gesture software
To utilise gestures and inking in the Mac environment, simply download the gesture .zip file below and expand the compressed file to create the application file ‘UPDD Gestures’:
It is highly recommended that this file be moved to the standard Utilities folder along with the other UPDD Mac applications.
Simply click on the application to run gestures. When running, a gesture menu bar item, if enabled, will be shown in the menu bar.
If this is enabled but does not appear see the troubleshooting section below.
If the menu bar item is disabled but the gesture software is loaded then running the gesture software again will invoke the gesture settings dialog.
Important installation notes:
1. Retain original name
2. Requires production version of the driver
3. Automatically installed with driver
4. Invoking gesture software at startup
Enable via the Gesture GUI
Manually set up a startup item
There are two simple steps to manually uninstall the gesture software:
1. Remove the startup item if gestures have been configured to be invoked at startup:
· Invoke the Gesture GUI via the Gesture Menu item, Settings entry.
· Select the Other Settings dialog
· Uncheck the ‘Start UPDD Gestures at startup’ option
2. Delete the application
· Locate and drag the application to the trash can.
Once invoked, and if enabled, a Menu Bar icon indicates that the gesture application is loaded and running; it can also be used to quit the gesture touch function.
Since version 2.0.14 the Menu Bar item can be optionally disabled, and with some OEM versions of the gesture software this is the default state, so no Menu Bar item is shown. The Menu Bar item can be enabled or disabled as required in the gesture settings dialog.
With gesture version 2.x.x there is now a Settings menu item to invoke the gesture graphical user interface and a link to show the gesture log. Since 2.0.10 there is also a revision history option to view the changes in each release.
The application interacts with the core UPDD driver or a TUIO server to receive all touches, calculates the gesture being performed, injects it into the OS as a touch event, and also passes single touch events to the tablet interface.
The schematic for the gesture interface is as follows:
The use of the gestures and inking functions is described below.
Gestures are performed on the touch screen exactly as they are on a trackpad. The action associated with each gesture can be defined in the gesture settings dialog. To utilise all available gestures you will need a multi-touch touch screen that supports up to 5 touches; otherwise you will be restricted to the gestures that relate to the number of styli supported by the touch screen.
A number of videos have been posted on the web by end users, such as this one here.
To view the gestures being calculated by the gesture engine in real-time invoke the Show Gestures Log option in the menu bar:
For each “Detected gesture” there should be a corresponding “Effective gesture” indicating the gesture performed.
The example log above is from version 2.0.23, which has additional logging implemented to track down reported occurrences where some "detected" gestures have no "effective" gesture. For example, in the case of three finger drags and swipes an "effective" gesture should be chosen for one or the other every single time, yet we have had reports of no effective gesture being selected. Should you experience this, please include the log output with your support email.
A more detailed explanation of the log entries follows:
Detected gesture: *** ---> means the analysis detected a basic gesture, like three finger drag.
...Detected: *** ---> reveals new information from the analysis about the current gesture, e.g. the three finger drag is swiping left
Effective gesture: *** Performing action: *** ---> shows which of the configurable gestures in the GUI is matched to the gesture from the analysis, and which action is being performed; e.g. it picks "three finger swipe left" instead of "three finger drag" and performs the appropriate action
Inoperative gesture: *** ---> shows which of the configurable gestures could have been matched to the analysis but wasn't because of the user's settings; e.g. "three finger drag" because the effective gesture was "three finger swipe left"
(gesture ended) ---> means the analysis detected the end of a gesture
The Two Finger Tap invokes a right click, which is generated by default under the left stylus. This behaviour can be changed to generate the right click under the rightmost stylus. A time threshold is also configurable to specify the time within which a two finger tap must occur.
The Press and Tap invokes a right click, which is generated by default under the first stylus. This behaviour can be changed to generate the right click under the second stylus.
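For illustration, a right click of this kind can be synthesised with the Core Graphics event API. The sketch below is a hedged illustration of the general mechanism, not UPDD's actual implementation; the function name is ours, and the point passed in would be the location of whichever stylus is configured to receive the click.

```cpp
// Hedged sketch: synthesising a right click at a given screen point with
// Core Graphics. Illustrative only; not UPDD's actual implementation.
#include <ApplicationServices/ApplicationServices.h>

void postRightClick(CGPoint where) {
    CGEventRef down = CGEventCreateMouseEvent(nullptr, kCGEventRightMouseDown,
                                              where, kCGMouseButtonRight);
    CGEventRef up = CGEventCreateMouseEvent(nullptr, kCGEventRightMouseUp,
                                            where, kCGMouseButtonRight);
    CGEventPost(kCGHIDEventTap, down);   // press the right button...
    CGEventPost(kCGHIDEventTap, up);     // ...and release it
    CFRelease(down);
    CFRelease(up);
}
```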
In OS X Snow Leopard, four finger swipes typically invoke one of the Exposé features or the application switcher. Unfortunately there is no supported way to programmatically activate these features, so UPDDGesturesmacosx posts keystrokes that trigger them. Since the hot key for an Exposé feature can be configured, UPDDGesturesmacosx reads the Apple hot key preferences to determine the correct keystroke to press. We believe this works quite successfully, and in our tests these features were activated consistently. We are keen to find out if it works consistently for our users – any feedback is much appreciated!
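As an illustration of this keystroke-posting approach, the sketch below posts a key press and release with the Core Graphics event API. The key code shown is a hypothetical example (F9, a common default Exposé hot key in 10.6); the software itself determines the real code from the Apple hot key preferences.

```cpp
// Hedged sketch: posting a keystroke to trigger an Exposé feature.
// The key code used is a hypothetical example; the real one would be
// read from the Apple hot key preferences.
#include <ApplicationServices/ApplicationServices.h>

void postKeystroke(CGKeyCode keyCode) {
    CGEventRef down = CGEventCreateKeyboardEvent(nullptr, keyCode, true);
    CGEventRef up = CGEventCreateKeyboardEvent(nullptr, keyCode, false);
    CGEventPost(kCGHIDEventTap, down);  // key down
    CGEventPost(kCGHIDEventTap, up);    // key up
    CFRelease(down);
    CFRelease(up);
}

// Example: postKeystroke(101); // 101 = F9, a common Exposé default in 10.6
```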
Single Touch gestures
Mac OS gestures utilise 2 or more touches. However, in some circumstances users may wish to map single touches to simulate flicks and swipes, especially when using single touch touch screens.
With the latest gesture software each gesture can now be configured on a gesture-by-gesture basis. A typical gesture configuration for single touch touch screens would be as follows:
Set Tap -> Click*
Set Press -> Click and drag*
Set Drag -> Scroll
* For these gestures this action is the default.
Finally, check the "Disable multitouch gestures" option in the "Other Settings" section.
Observed gesture action delay
These issues were initially raised by users of Avid Pro Tools and Propellerhead Reason who observed them when using small audio faders or adjusting controls to fine-tune their values.
Gestures cannot be certain a single touch is actually a tap until the finger has been released. You may observe a slight delay when tapping on buttons that's not present when Gestures isn't running or when using a mouse.
By default, a single touch needs to move a minimum of 10 screen pixels for Gestures to detect it as a "drag" gesture, so applications won't receive a mouse event until those 10 pixels have been traversed, creating the slight lag. Since version 2.0.22 the pixel threshold is configurable in the settings program (Drags and Swipes dialog); a smaller value reduces the lag or can eliminate it altogether.
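To make this behaviour concrete, here is a minimal sketch of this kind of movement threshold. The names and structure are illustrative only, not UPDD's actual code:

```cpp
// Hedged sketch: a touch only becomes a "drag" once it has moved more than
// the configured pixel threshold; until then no mouse events are emitted.
#include <cmath>

struct TouchState {
    float startX, startY;   // where the touch began, in screen pixels
    bool isDrag = false;
};

bool updateDragState(TouchState &t, float x, float y, float thresholdPx = 10.0f) {
    if (!t.isDrag && std::hypot(x - t.startX, y - t.startY) >= thresholdPx)
        t.isDrag = true;    // mouse events start flowing from this point on
    return t.isDrag;
}
```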
Understanding Swipes and drags
As far as the gesture analysis function is concerned, a swipe is fundamentally a drag gesture moving in one of the cardinal directions, which is to say swipes are a subset of drag gestures. The speed of the fingers and the time for which they contact the touch screen are not considered, so a quick swipe on the screen and a slower, well defined movement should both be detected as a drag and, depending on the direction of movement, also as a swipe. It is for this reason that the Gestures GUI has a setting for choosing between the two.
What's going on behind the scenes in Gestures is that when the analysis determines that the touches are performing some manner of drag gesture, it also reports the swipe direction, if one is detected. This way an app using the analysis can decide which specific gesture it wants to respond to.
We designed the analysis to be more general purpose, reporting all of the relevant information about a current gesture. There's separate logic in gestures to take the results of the analysis and decide which of the specific gestures in the GUI to respond to (like swipes vs. drags) based on the current gesture settings.
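A minimal sketch of the swipe-as-a-subset-of-drag idea described above follows. This is illustrative only (the actual analysis considers more state than a single net movement), and the names are ours:

```cpp
// Hedged sketch: classify a drag's net movement into a cardinal swipe
// direction. Speed and contact time are deliberately ignored, matching
// the behaviour described above. Assumes screen coordinates where y
// increases downwards.
#include <cmath>

enum class SwipeDirection { None, Left, Right, Up, Down };

SwipeDirection swipeDirectionForDrag(float dx, float dy) {
    if (dx == 0.0f && dy == 0.0f)
        return SwipeDirection::None;                  // no net movement
    if (std::fabs(dx) >= std::fabs(dy))               // horizontal dominates
        return dx < 0 ? SwipeDirection::Left : SwipeDirection::Right;
    return dy < 0 ? SwipeDirection::Up : SwipeDirection::Down;
}
```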
Starting with gesture release 2.0.12 you can configure the system onscreen keyboard to be automatically invoked.
Using the Gesture settings program, Other Settings tab, you can configure the keyboard to be displayed when a text input field has focus or restrict the usage to secure text fields only:
Unfortunately this feature has some limitations:
· It does not work at system login – we may be able to overcome this limitation if required – please contact us.
· It only works with applications that support OS X's accessibility features. Two prominent examples of apps that don't support it are Chrome and Firefox; fortunately it works well in Safari. Qt apps also do not currently support OS X accessibility, though apparently Qt 5.1 addresses this issue.
In order for this feature to work in OS X 10.6, the "Show Keyboard & Character Viewers in menu bar" button must be checked in the Keyboard system preferences.
A user reported that the pop-up keyboard does not latch modifier keys, i.e. the Shift, Ctrl, Command and Alt keys. An online discussion suggested activating Sticky Keys in System Preferences, Universal Access, Keyboard.
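For background, the accessibility limitation above stems from the OS X accessibility API. The sketch below is our illustration of the kind of query such a feature depends on, not the actual UPDD code: an app can ask the system-wide accessibility element which UI element has focus and check its role.

```cpp
// Hedged sketch: using the OS X accessibility API to test whether the
// focused UI element is a text field. Illustrative only; apps that do not
// support accessibility (e.g. Chrome, Firefox) yield nothing useful here.
#include <ApplicationServices/ApplicationServices.h>

bool focusedElementIsTextField() {
    AXUIElementRef systemWide = AXUIElementCreateSystemWide();
    CFTypeRef focused = nullptr;
    bool isText = false;
    if (AXUIElementCopyAttributeValue(systemWide, kAXFocusedUIElementAttribute,
                                      &focused) == kAXErrorSuccess && focused) {
        CFTypeRef role = nullptr;
        if (AXUIElementCopyAttributeValue((AXUIElementRef)focused,
                                          kAXRoleAttribute,
                                          &role) == kAXErrorSuccess && role) {
            isText = CFEqual(role, kAXTextFieldRole) ||
                     CFEqual(role, kAXTextAreaRole);
            CFRelease(role);
        }
        CFRelease(focused);
    }
    CFRelease(systemWide);
    return isText;
}
```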
The gesture software can be used to hide the mouse cursor. The feature uses an undocumented API (presumably a feature not encouraged by Apple) to hide the cursor, so there is no guarantee it works 100%. It can be enabled in the gesture settings dialog, Other Settings tab, via the ‘Hide Mouse Cursor during touches’ checkbox. The cursor is shown again as soon as it is moved by something other than the touch screen, such as a mouse or trackpad.
We have found that in some circumstances the cursor is made visible anyway, such as when switching apps, touching into Flash movies or moving over the system dock, so since gesture version 2.0.21 the cursor is periodically re-hidden if it is found to be visible. The periodic check starts when the "hide cursor during touches" feature is enabled, but it will only hide the cursor when it is supposed to be hidden; if the cursor is being shown due to mouse/trackpad movement it will stay visible until the next time a touch begins.
Unfortunately, if an application constantly forces the cursor to be visible, such as if the cursor is over a flash movie, then the cursor will stay visible.
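As a rough illustration of the hide-and-periodically-re-hide behaviour, the sketch below uses the documented CGDisplayHideCursor/CGDisplayShowCursor and CGCursorIsVisible calls. The gesture software reportedly uses an undocumented API, so treat this as an approximation, not the shipped implementation:

```cpp
// Hedged sketch: hide the cursor during touches and periodically re-hide
// it if something has made it visible again. Approximation only; UPDD
// Gestures reportedly uses an undocumented API for this.
#include <ApplicationServices/ApplicationServices.h>

static bool gShouldBeHidden = false;

void setCursorHiddenForTouches(bool hide) {
    gShouldBeHidden = hide;
    if (hide)
        CGDisplayHideCursor(kCGDirectMainDisplay);
    else
        CGDisplayShowCursor(kCGDirectMainDisplay);
}

// Called from a periodic timer (as in 2.0.21 and later): re-hide the
// cursor only if it is supposed to be hidden but has become visible.
void periodicCursorCheck() {
    if (gShouldBeHidden && CGCursorIsVisible())
        CGDisplayHideCursor(kCGDirectMainDisplay);
}
```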
Some applications check the device id of the device generating a gesture before processing it. One such feature, which can only be enabled if a real trackpad is connected, is the Asian trackpad handwriting feature: it works only if a trackpad is connected and the gesture is performed on that trackpad.
Since gesture version 2.0.21 the gesture software will utilise the device id of a connected trackpad to cater for any features or applications that test the source of the gesture. If no trackpad is found, Gestures uses its own ‘dummy’ device id. Gestures can be forced to use its own id, even if a trackpad is connected, by defining the setting gesturedefaultdeviceid=1 in the UPDD settings file.
TUIO Server interface
The gesture software can be configured to receive touch data from a TUIO server and generate gesture functions. This is useful on systems where touch co-ordinate and stylus data is delivered on the TUIO interface, such as a touch screen device that is not supported by the UPDD driver but does have Mac OS X software to create a TUIO server interface. Such a setup would normally be created to run TUIO client applications.
Using the Gesture settings program, Other Settings tab, you can setup the TUIO interface as shown below:
When used with a touch device not directly supported by our UPDD driver, the driver must still be installed on the system because the gesture software checks that it is working alongside our UPDD driver.
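For background, a TUIO server simply broadcasts OSC messages that any client can consume. The sketch below is a minimal TUIO client built on the open-source liblo OSC library, offered purely as an illustration of the data the gesture software receives; it is not the gesture software's own TUIO code.

```cpp
// Hedged sketch: a minimal TUIO client using the liblo OSC library.
// TUIO cursor profiles arrive as OSC messages on /tuio/2Dcur (port 3333
// by default); "set" messages carry normalised 0..1 positions.
#include <lo/lo.h>
#include <cstdio>
#include <cstring>

static int tuioHandler(const char *path, const char *types, lo_arg **argv,
                       int argc, lo_message msg, void *userData) {
    if (argc > 0 && types[0] == 's' && std::strcmp(&argv[0]->s, "set") == 0) {
        // arguments after "set": session id, x, y (0..1), then velocities
        std::printf("cursor %d at %.3f, %.3f\n",
                    argv[1]->i, argv[2]->f, argv[3]->f);
    }
    return 0; // message handled
}

int main() {
    lo_server_thread st = lo_server_thread_new("3333", nullptr);
    lo_server_thread_add_method(st, "/tuio/2Dcur", nullptr, tuioHandler, nullptr);
    lo_server_thread_start(st);
    std::getchar(); // run until Enter is pressed
    lo_server_thread_free(st);
    return 0;
}
```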
The iOS simulator allows applications built for iOS devices (such as the iPhone and iPad) to be developed and tested on a Mac system. To test gestures in this environment you normally hold down the alt/apple key on the keyboard and use a mouse. For users wishing to test touch gestures with a dual/multi-touch touch screen we have introduced an option in the gesture engine to run in ‘iOS simulation mode’. This option can be enabled in the gesture settings dialog.
When running in iOS simulator mode please note the following:
1. There is a setting in Mac OS X that allows applications to use "accessibility" features for interacting with windows and other elements on the screen, and it must be enabled for gestures to work in iOS Simulator mode ("Enable Access for Assistive Devices" in the "Universal Access" system preferences).
2. When starting a two finger gesture, it was necessary to send an event releasing the first finger before sending an event to press both fingers down. This didn't have any noticeable effect in our tests with iOS apps.
3. At the start of a two finger gesture there will be a little visual "blip". This is because the mouse is being repositioned so that the touches in the iOS Simulator match the touches on the touch screen.
4. It is difficult to send the exact movement of both fingers into the simulator, so it is possible for the touches in the simulator to diverge slightly from the touches on the touch screen. However, performing individual pinch, rotate, and two finger drag gestures works as expected, and this divergence did not have a noticeable effect in the tests we performed.
Since version 2.0.24 the gesture extension has added support for pen devices that offer proximity, left click (via the nib), right click (via the barrel button), pressure and eraser features. The driver alone supports the left (nib) and right clicks; pressure and eraser support is currently built into the gesture extension software.
Tablet/Pen device inputs are always passed into the system as tablet events. Pen nib and co-ordinate information bypasses the gesture engine, i.e. is not processed for gesture consideration.
Currently there is no support for the pen upper side switch (as seen on Wacom pen devices).
Qt – cross-platform development tool
Some multi-touch applications are built with a cross-platform development tool called Qt and use the QTouchEvent interface to receive system-level touches. Unfortunately the standard way Qt determines the screen location of touches in Mac OS X is incompatible with UPDDGestures: Qt assumes that all system touches originate from a trackpad. It therefore has the touch start at the mouse cursor location (which is not what is needed in a touch screen environment) and calculates the touch's movement speed using the physical dimensions of the trackpad. Since no trackpad is present it obtains no dimensions, calculates that the trackpad has a width and height of 0, and consequently the touches won't move anywhere.
For touch-enabled Qt applications to work with UPDDGestures they must use the normalized position of the touches, not the screen position. Sadly, we suspect most Qt apps use a touch's screen position.
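As a hedged sketch of what that change looks like in Qt 5 (our illustration, not any particular app's actual code): read each touch point's normalized position and map it to the screen yourself, instead of relying on screenPos(). The function name and screen-size parameters are ours.

```cpp
// Hedged sketch (Qt 5): derive screen positions from normalised touch
// positions rather than trusting QTouchEvent's trackpad-based screenPos().
#include <QTouchEvent>
#include <QPointF>

void handleTouchEvent(QTouchEvent *event, int screenWidth, int screenHeight) {
    for (const QTouchEvent::TouchPoint &tp : event->touchPoints()) {
        QPointF norm = tp.normalizedPos();       // 0..1 across the device
        QPointF screen(norm.x() * screenWidth,   // map to screen pixels
                       norm.y() * screenHeight);
        // ... use 'screen' instead of tp.screenPos()
    }
}
```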
One such popular multi-touch application is Snowflake from NUITEQ. The developers of Snowflake are working on changing the interface to utilise the normalised position, but until this change is made you will need to use our UPDD TUIO bridge to drive Snowflake’s TUIO interface. This has been tested and works well, but it does mean the gesture software cannot be used at the same time as the TUIO interface, as doing so causes a phantom touch in Snowflake.
Programmers have reported that when using Gestures the initial touch does not trigger MouseDown code within their application. Further investigation showed that if Gestures were to invoke a mouse down on touch, there would be no way to "cancel" that mouse down if the single touch turned out to be a gesture, without causing a click at the point of touch. Clicking at the point of touch when performing, say, a scroll could cause highly irritating and confusing situations: all sorts of things on the screen would end up being unintentionally clicked! This is why Gestures withholds mouse events until it has determined which gesture is being performed, and hence a mouse down event is not immediately triggered.
We recommend that developers use the system-wide touch events that Gestures generates (rather than mouse events) to act on a ‘mouse down’ when a touch is occurring. These touch events are delivered as soon as a touch occurs, so there is no delay, and they won't trigger buttons to be clicked at inappropriate times.
Multi-touch browser applications
Starting with JavaFX 2.2, users can interact with JavaFX applications using touches and gestures on touch-enabled devices. Touches and gestures can involve a single point or multiple points of contact. The type of event generated is determined by the type of touch or gesture that the user makes. This link describes working with events from touch-enabled devices. We have compiled and tested the Gesture Events example program with our Mac OS X gesture implementation and found it all to work as expected; therefore any JavaFX touch event driven application should work as expected.
Logic Pro X
A Logic Pro user reported that they were trying to make Logic learn commands coming from the touch screen.
We were able to control Logic Pro X using Gestures by assigning different gestures to the "Keystroke" action, and then setting the keystroke either to an existing key command for Logic, or a new keystroke, and then teaching it to Logic using its "Key Commands" editor.
Inking allows drawings and handwriting on tablet type devices to interact with applications. When a real or virtual tablet is seen by the OS, the Inking function is enabled. With the latest version of the gesture application, available since 14/9/11, the inking function is also enabled and touch data is passed to the tablet interface. Real tablets pass more data than just the X and Y co-ordinates, such as the stylus angle, but when touch is being used this type of data has a fixed value.
After installing the software, and if Inking is available on the system, the Inking option is shown in the System preferences:
Note: The "Ink" system preference usually only appears when a tablet is connected. To make Inking configurable with Gestures when a tablet is not connected, we make a copy of the Ink preference pane that appears only when the original one is hidden. In this situation Ink will appear in the "Other" section of System Preferences rather than the "Hardware" section.
Launch the Ink settings panel to enable handwriting recognition
Once enabled, the Ink floating windows will be displayed
In the following example the touch screen has been used to write “Touch” on to the Inking paper and has been translated ready to be sent to the waiting application:
With Inking enabled, writing into any ink aware application will invoke an inking area in which to write, as in this example:
In addition to hand writing recognition and drawing, gestures can be used to perform various app functions, as listed below:
Given that the UPDD inking function is implemented at a software level and does not create a virtual tablet device there may be some Inking applications that do not enable their inking capabilities due to the lack of a real tablet device on the system.
Further, given that there is no dedicated ‘tablet stylus’ in use the "hold lower button to ink" and "hold upper button to ink" settings have no meaning when inking with UPDD.
By default the gesture actions mimic those associated with an Apple multi-touch trackpad for the host version of Mac OS X, but they can be redefined as required. Gesture actions and other gesture settings are held in the UPDD settings file. Further, the latest version now utilises a graphical user interface for maintaining and updating the settings.
Gesture GUI and UPDD settings file
With version 2 of the gesture software, first released in Dec 2012, the gesture settings are stored in the UPDD settings file and a graphical interface is available for defining and maintaining them. Settings in the UPDD settings file can also be updated with the UPDD command line interface.
Previous GUI here.
1. Really important point: when using gestures in Mac OS X the gestures are processed by the application window under the mouse cursor. Dual and multi-touch gestures can be performed on any part of the touch screen but will be processed by the window under the cursor. So, for example, if you have a Preview window open and the cursor is in the viewing area then that area will respond to gestures. If the cursor is on the Preview dialog but not in the view area then gestures will be ignored.
· Some of the actions that can be invoked by a gesture, such as Notification Center, require that "Enable Access for Assistive Devices" is turned on in the "Universal Access" system preferences, as described below.
· Any mouse events created by Gestures through "Click" and "Click and drag" actions are also tablet events. This is because in OS X tablet events are actually special mouse events with extra tablet data included.
The "Ink" preference pane will be
visible in System Preferences, if it was not already, allowing Inking to be
configured. If Inking is enabled, it can be used with Gestures through the
"Click" and "Click and drag" actions.
Driver setting considerations
The gesture application turns off the UPDD mouse interface and receives all touch data. A number of UPDD utilities, such as calibration and test, re-enable the mouse interface when they terminate. Prior to the April 2013 release these utilities should only have been used with the gesture application disabled; since that release the gesture software caters for this situation.
Lion Full Screen Mode
A user reported that ‘in Lion, moving the "mouse arrow" to the top of the screen may not reveal the Mac OS window bar necessary to get out of Full Screen mode, necessitating the need for a proper mouse’. This may be because the cursor, being under the stylus, is stopping short of the top of the screen. You can force the cursor to the top by using the Edge Acceleration settings in the UPDD Console, Properties dialog, described here.
Gestures does not load (no error issued)
When the gesture software loads, a gesture menu bar item will be shown (if enabled in the gesture settings). This indicates that the program is running and, hopefully, working as expected. If the gesture software does not appear to be running or loading correctly, you can run the gesture application from a terminal window to see if any error messages are issued when the program is invoked.
To run the gesture application execute the following command line:
/Applications/Utilities/UPDD\ Gestures.app/Contents/MacOS/UPDD\ Gestures
Please report any error message shown to email@example.com.
Gestures stop working
If the gesture menu bar item is enabled but ‘disappears’ from the menu bar and/or gestures stop working, it is possible that the gesture program has crashed. If this is the case there should be a crash log located in the following path:
Usually the Library folder is hidden, so you may need to do the following to open it:
1. In the Finder pick the "Go to Folder..." menu item in the "Go" menu, or press Shift-Command-G.
2. Type in the following:
Difficulty generating gesture
The performed gesture does not work as expected. In this instance, view the gesture log, selected from the Menu Bar item, to see what gesture is being calculated by the gesture engine. If the correct gesture is shown and it is consistent with the gesture being performed, ensure your usage of the gesture is correct for the application or desktop function.
If the log shows that the gesture is being generated inconsistently, check whether a driver setting is causing the issue, such as the lift-off time needing to be increased so that short breaks in the touch are ignored (UPDD Console, Properties, Lift off time).
Creates a virtual touch device that is registered with the system as a multi-touch capable device through which all stylus touch data is passed.
The standalone application utilised in Mac OS X calculates individual gestures from the incoming stylus data streams and as such can be considered a ‘gesture engine’. It is our intention in a future release of UPDD to build this gesture engine into the driver so that in all cases the gesture information is made available on UPDD’s API, giving a common interface across all platforms supported by the driver (all individual stylus information is also available).
For further information or technical assistance please email the technical support team at firstname.lastname@example.org.