
Multi Touch


WM_GestureNotify, WM_Gesture & WM_Touch

In Windows 7 we have WM_GestureNotify, WM_Gesture and WM_Touch Messages. We get WM_GestureNotify and WM_Gesture by default (like mouse messages). But WM_Touch messages only fire if you enable them by calling RegisterTouchWindow().

WM_Touch gives you more processing power than WM_Gesture. For example, tracking multiple fingers (up, down, move) on multiple targets. You can pipe the touch events through manipulation and inertia processing to get standard gesture-type ability. Enabling WM_Touch messages for a control will stop that control from receiving WM_Gesture messages (you can't have both), but WM_GestureNotify messages will always fire. Calling UnregisterTouchWindow() will restore WM_Gesture messages for the control. There is no requirement to call UnregisterTouchWindow() once for every RegisterTouchWindow() call.
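For older Delphi versions the touch APIs are not declared in the RTL, so you can load RegisterTouchWindow() dynamically from user32.dll. A minimal sketch (the constant and function names come from winuser.h; EnableTouch is my own hypothetical helper):

```pascal
type
  TRegisterTouchWindow = function(Wnd: HWND; ulFlags: ULONG): BOOL; stdcall;

var
  _RegisterTouchWindow: TRegisterTouchWindow = nil;

// Enable WM_TOUCH for a window; returns False on pre-Windows 7 systems
// where user32.dll does not export RegisterTouchWindow.
function EnableTouch(Wnd: HWND): Boolean;
begin
  if not Assigned(_RegisterTouchWindow) then
    _RegisterTouchWindow := GetProcAddress(GetModuleHandle('user32.dll'),
      'RegisterTouchWindow');
  Result := Assigned(_RegisterTouchWindow) and _RegisterTouchWindow(Wnd, 0);
end;
```

Remember that once this succeeds, the window gets WM_Touch but no longer gets WM_Gesture.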

In the WM_Touch event we call GetTouchInputInfo() to get an array of TouchInput records. WParam = number of records. LParam = touch input handle (passed to GetTouchInputInfo). Each TouchInput record tells us a finger position and state (up, down, move) etc. Make sure you call DefWindowProc() if you don't handle the message, or CloseTouchInputHandle() if you do, so the handle gets closed.
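A sketch of that handler (assuming you have ported the TTouchInput record and the TOUCHEVENTF_ constants from winuser.h into your own unit):

```pascal
const
  TOUCHEVENTF_MOVE = $0001;
  TOUCHEVENTF_DOWN = $0002;
  TOUCHEVENTF_UP   = $0004;

procedure TMyForm.WMTouch(var Msg: TMessage);
var
  inputs: array of TTouchInput;   // TTouchInput ported from winuser.h
  i, count: Integer;
begin
  count := LoWord(Msg.WParam);    // low word of WParam = number of touch points
  SetLength(inputs, count);
  if GetTouchInputInfo(Msg.LParam, count, @inputs[0], SizeOf(TTouchInput)) then
  begin
    for i := 0 to count - 1 do
      // x,y are in 100ths of a physical screen pixel; dwID tracks each finger
      if (inputs[i].dwFlags and TOUCHEVENTF_DOWN) <> 0 then
        ; // finger inputs[i].dwID down at (inputs[i].x div 100, inputs[i].y div 100)
    CloseTouchInputHandle(Msg.LParam);   // we handled it, so close the handle
    Msg.Result := 0;
  end
  else
    Msg.Result := DefWindowProc(Handle, Msg.Msg, Msg.WParam, Msg.LParam);
end;
```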

WM_GestureNotify fires as soon as you touch the screen. It always precedes the other touch & gesture messages. You get the screen position of the first finger touch, and the handle of the screen control you touched. When the GestureNotify event fires you can call SetGestureConfig() to control which WM_Gesture messages are sent and whether the Pan gesture should include inertia, gutter effect, etc.
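For example, here is a sketch that asks for plain panning with no gutter and no inertia (the GC_ constants and GESTURECONFIG record are ported from winuser.h; adjust dwWant/dwBlock to taste):

```pascal
const
  GID_PAN             = 4;
  GC_PAN              = $0001;
  GC_PAN_WITH_GUTTER  = $0008;
  GC_PAN_WITH_INERTIA = $0010;

type
  TGestureConfig = record
    dwID: DWORD;      // gesture to configure
    dwWant: DWORD;    // options we want
    dwBlock: DWORD;   // options we don't want
  end;

procedure TMyForm.WMGestureNotify(var Msg: TWMGestureNotify);
var
  cfg: TGestureConfig;
begin
  cfg.dwID := GID_PAN;
  cfg.dwWant := GC_PAN;
  cfg.dwBlock := GC_PAN_WITH_GUTTER or GC_PAN_WITH_INERTIA;
  SetGestureConfig(Handle, 0, 1, @cfg, SizeOf(TGestureConfig));
  // Always pass WM_GESTURENOTIFY on
  Msg.Result := DefWindowProc(Handle, Msg.Msg, Msg.Unused, Longint(Msg.NotifyStruct));
end;
```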

WM_Gesture gives you a great variety of standard gestures. The message WParam gives you the gesture id (GID_...), while LParam is a handle used to call GetGestureInfo(Msg.LParam, gestureInfo) to get the gesture detail. It's important to free this handle by calling DefWindowProc() or CloseGestureInfoHandle(Msg.LParam). If you don't handle the message then make sure you call DefWindowProc() so Windows can process it.
  • GID_Pan (tracking of one or two fingers)  - You get the live distance between fingers (zero for single finger), and the center point between fingers (which can move). So we have both pan & zoom messages together. Pan options include: X and Y gutter, and inertia (messages continue after your finger leaves the screen if you flick). 
  • GID_Rotate - The first message gives you the angle made by your two fingers. Other messages provide the angle delta (change of angle since first message). You also get the screen position between your fingers (which can move). So we have both rotation & pan messages together.
  • GID_Zoom - Gives distance between fingers and screen position between fingers (which can move). So we have both zoom & pan.
  • GID_PressAndTap - Gives the position of the first finger (which can move), and the x,y delta offset of the second finger (when you tap). So again we have panning as a kind of after touch event.
  • GID_TwoFingerTap - Gives you the screen position between both fingers. Now since WM_GestureNotify gives you the first finger position, you can calculate the second finger position if required.
  • GID_Begin, GID_END - These messages come in before and after the above messages. MSDN help tells us that applications should ignore these two messages, but you must pass them on to DefWindowProc(). 
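Putting that together, a minimal WM_Gesture dispatcher might look like this (TGestureInfo and the GID_ constants ported from winuser.h; note we close the handle ourselves only when we handle the message, otherwise DefWindowProc does it for us):

```pascal
procedure TMyForm.WMGesture(var Msg: TMessage);
var
  gi: TGestureInfo;
  handled: Boolean;
begin
  FillChar(gi, SizeOf(gi), 0);
  gi.cbSize := SizeOf(gi);
  handled := False;
  if GetGestureInfo(Msg.LParam, gi) then
    case gi.dwID of
      GID_ZOOM:
        begin
          // gi.ullArguments = distance between fingers,
          // gi.ptsLocation  = centre point (physical screen coords)
          handled := True;
        end;
      // GID_BEGIN and GID_END fall through to DefWindowProc below
    end;
  if handled then
  begin
    CloseGestureInfoHandle(Msg.LParam);  // we own the handle now
    Msg.Result := 0;
  end
  else
    Msg.Result := DefWindowProc(Handle, Msg.Msg, Msg.WParam, Msg.LParam);
end;
```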

Typical Debug - Zoom

Here is some typical debug output for a Zoom (pinch) gesture. After calling GetGestureInfo(Msg.LParam, gestureInfo) we can inspect dwFlags, hwndTarget etc. Physical screen coordinates usually need to be converted to client coordinates (using Delphi's Ctrl.ScreenToClient() method) to be useful.

WM_GESTURENOTIFY, $0000, $12E3F0 // PGestureNotifyStruct($12E3F0) = handle of control touched + Screen location of touch
WM_GESTURE, WParam=1=GID_BEGIN, LParam=Handle($2DD40000) // ignore
WM_GESTURE, WParam=3=GID_ZOOM,  LParam=Handle($2DDA0000) >> dwFlags:$0001=GF_BEGIN, hwndTarget:000105A6, ptsLocation(702,355)
WM_GESTURE, WParam=3=GID_ZOOM,  LParam=Handle($2DE30000) >> dwFlags:$0000=nothing,  hwndTarget:000105A6, ptsLocation(704,350)
WM_GESTURE, WParam=3=GID_ZOOM,  LParam=Handle($2DE60000) >> dwFlags:$0000=nothing,  hwndTarget:000105A6, ptsLocation(704,350)
WM_GESTURE, WParam=3=GID_ZOOM,  LParam=Handle($2DE00000) >> dwFlags:$0004=GF_END,   hwndTarget:000105A6, ptsLocation(704,350)
WM_GESTURE, WParam=2=GID_END,   LParam=Handle($2DE60000) // ignore

Typical Debug - Pan with inertia

For Pan we see similar output, but notice we also get inertia messages (if enabled). This is when your finger flicks and leaves the screen but the location continues to move (in the direction of the flick), gradually slowing to a stop. The inertia data gives you a kind of x,y countdown. You can see why you might want to turn off inertia if you don't need it. Notice the Y values don't change? This is because Pan Gutter mode is enabled: although my finger is moving erratically, the gesture system knows I'm mainly moving in a -ve X direction and is keeping the location in an X gutter.

WM_GESTURENOTIFY, $0000, $12F94C >> hwndTarget=$20592, ptsLocation=(1016,149)
WM_GESTURE, WParam=1=GID_BEGIN, LParam=Handle($2E380000) // ignore
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E3A0000) >> dwFlags:$0001=GF_BEGIN,   hwndTarget:00020592, ptsLoc:(1016,149)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E3C0000) >> dwFlags:$0000=nothing,    hwndTarget:00020592, ptsLoc:(1001,149)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E3E0000) >> dwFlags:$0000=nothing,    hwndTarget:00020592, ptsLoc:(929,149)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E400000) >> dwFlags:$0000=nothing,    hwndTarget:00020592, ptsLoc:(881,149)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E420000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(813,149), ullArg:(-1630,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E440000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(748,149), ullArg:(-1520,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E460000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(593,149), ullArg:(-1409,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E480000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(563,149), ullArg:(-1333,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E4A0000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(454,149), ullArg:(-1174,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E4C0000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(401,149), ullArg:(-1026,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E4E0000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(352,149), ullArg:(-950,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E500000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(261,149), ullArg:(-950,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E520000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(219,149), ullArg:(-250,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E540000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(141,149), ullArg:(-120,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E560000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(107,149), ullArg:(-30,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E580000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(44,149),  ullArg:(0,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E5A0000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(16,149),  ullArg:(0,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E5C0000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(-8,149),  ullArg:(0,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E5E0000) >> dwFlags:$0002=GF_INERTIA, hwndTarget:00020592, ptsLoc:(-52,149), ullArg:(0,0)
WM_GESTURE, WParam=4=GID_PAN, LParam=Handle($2E6E0000) >> dwFlags:$0006=GF_END+GF_INERTIA, Hwnd:00020592,ptsLoc:(-131,149),ullArg:(0,0)
WM_GESTURE, WParam=2=GID_END, LParam=Handle($2E700000) // ignore

WM_Tablet_Flick

Flicks are also a standard gesture but come in via the older tablet message WM_Tablet_Flick and not via WM_GESTURE.

The Problem and the Solution

If your form is listening for WM_Tablet_Flick messages and the form is empty of child controls, you will get the messages. However, unlike WM_Gesture above, once you cover your form with panels and controls the messages stop arriving. This is because flick messages are sent only to the control you touch, whereas WM_Gesture messages go to the form even if you touch a control on the form.

This is why WM_GESTURE is such a nice event. You can simply listen at the form level and you will receive all WM_Gesture events including the handle of the control touched.

So WM_Tablet_Flick is a problem pre-Delphi 2010, since we don't really want to sub-class all our controls (make new controls) just to get flick events.

The solution (work-around) is to forget about WM_Tablet_Flick and instead listen to the follow-on key events. The way Windows works is that when a gesture occurs, Windows also sends out various mouse, scroll and key events so that touch works with standard controls (no extra programming required).
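A sketch of the work-around: with KeyPreview set, the form sees the key events Windows synthesizes for a flick, whichever control was touched. Note the exact keys depend on the flick assignments in the Pen and Touch control panel, so the VK_ values below are only an assumption for illustration:

```pascal
// Assumes TMyForm.KeyPreview := True (e.g. set in FormCreate).
procedure TMyForm.FormKeyDown(Sender: TObject; var Key: Word;
  Shift: TShiftState);
begin
  case Key of
    VK_PRIOR: ; // e.g. flick up assigned to Page Up
    VK_NEXT:  ; // e.g. flick down assigned to Page Down
  end;
end;
```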

Delphi & Multi-Touch

I'm going to talk from a Delphi 3/4/5/6/7 perspective since that's what I'm currently working in. 
First a couple of notes:
  • Before calling multi-touch functions make sure you detect Windows 7 (Windows version 6.1) or greater. 
  • Touch APIs are win32 so .NET framework apps must use Interoperability to access touch. 
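The version check is a one-liner with Delphi's Win32MajorVersion/Win32MinorVersion globals. You can also ask whether a touch digitizer is actually present and ready via GetSystemMetrics(SM_DIGITIZER); TouchReady is my own hypothetical helper name:

```pascal
const
  SM_DIGITIZER = 94;    // not declared in older Delphi Windows units
  NID_READY    = $80;   // digitizer is ready for input

// True on Windows 7 (version 6.1) or later.
function Windows7OrLater: Boolean;
begin
  Result := (Win32MajorVersion > 6) or
            ((Win32MajorVersion = 6) and (Win32MinorVersion >= 1));
end;

// True if the OS supports touch and a digitizer is present and ready.
function TouchReady: Boolean;
begin
  Result := Windows7OrLater and
            ((GetSystemMetrics(SM_DIGITIZER) and NID_READY) <> 0);
end;
```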

Delphi 2010 

D2010 now natively supports multi-touch, with each control offering touch events. And I believe you can even define your own gestures. So if you are lucky enough to be using D2010 or later you can forget the rest of this chat and go and read the Delphi manual.

Delphi 4/5/6/7...2009

D2/D3 were 32-bit Windows but I haven't played that far back. Our demos may or may not work.
D4 was the last Delphi to have binary forms. You will need to use a Delphi utility to convert the demo .DFM form files to binary.
D4/5/6 did not have a UInt64 type, but did have Int64, which is close enough when porting the touch header files.

Adding touch messages to a Delphi form is fairly trivial. Note for WM_Gesture and WM_Touch we get a handle via Msg.LParam. Once you have finished with these handles you must close them to avoid memory leaks. Either call DefWindowProc(), or if you handle the message, explicitly close the handle by calling CloseGestureInfoHandle(LParam) for WM_Gesture, or CloseTouchInputHandle(LParam) for WM_Touch.

const
  WM_GESTURENOTIFY    = $011A;
  WM_GESTURE          = $0119;
  WM_TOUCH            = $0240;

  WM_TABLET_DEFBASE   = $02C0;
  WM_TABLET_FLICK = WM_TABLET_DEFBASE + 11;

type
  TGestureNotifyStruct = record
    cbSize: UINT;                    // size, in bytes, of this structure
    dwFlags: DWORD;                  // unused
    hwndTarget: HWND;                // handle to window targeted by the gesture
    ptsLocation: TSmallPoint;        // starting location
    dwInstanceID: DWORD;             // internally used
  end;
  PGestureNotifyStruct = ^TGestureNotifyStruct;

  TWMGestureNotify = packed record
    Msg: Cardinal;
    Unused: WPARAM;
    NotifyStruct: PGestureNotifyStruct;
    Result: Integer;
  end;

type
  TMyForm = class(TForm)
    ...
  protected
    procedure WMGestureNotify(var Msg: TWMGestureNotify); message WM_GESTURENOTIFY;
    procedure WMGesture(var Msg: TMessage); message WM_GESTURE;
    procedure WMTabletFlick(var Msg: TMessage); message WM_TABLET_FLICK;
    ...
  end;

procedure TMyForm.WMGestureNotify(var Msg: TWMGestureNotify);
begin
  ...
  Msg.Result := DefWindowProc(Handle, Msg.Msg, Msg.Unused, Longint(Msg.NotifyStruct));
end;

procedure TMyForm.WMGesture(var Msg: TMessage);
begin
  ...
  Msg.Result := DefWindowProc(Handle, Msg.Msg, Msg.WParam, Msg.LParam);
end;

procedure TMyForm.WMTabletFlick(var Msg: TMessage);
begin
  ...
  Msg.Result := DefWindowProc(Handle, Msg.Msg, Msg.WParam, Msg.LParam);
end;

You can of course design controls that listen to these multi-touch messages. For special controls such as graphical displays it's probably worth the trouble if you have the source. For a spin edit control you might sub-class it so you can touch the control and pan up and down to change the spin edit value. You code it once and all instances of the control behave the same.

Microsoft have done a great job implementing touch. Touch just works without you having to write extra code (as long as your controls inherit from standard Windows controls). There may be a few places where you do need touch messages. Instead of writing new controls, just listen at the form level. As you see in the code above it's clean and simple to do this. Even at the form level you will get the handle of the control receiving the event.

Tips

Converting X, Y Points

XY touch points are given in physical screen coordinates. To convert to logical control coordinates do the following. The function PhysicalToLogicalPoint() is a new function for Windows 7. Here winControl is the control touched. For WM_Touch messages we get X,Y positions in 100th of a pixel, so we need to do an integer divide by 100. _TouchInputs[] is the array of TouchInput we received from the WM_TOUCH message using GetTouchInputInfo().

var p: TPoint;
begin
  // WM_TOUCH coordinates arrive in 100ths of a physical screen pixel
  p.x := _TouchInputs[i].x div 100; 
  p.y := _TouchInputs[i].y div 100; 
  PhysicalToLogicalPoint(winControl.Handle, p); // undo DPI virtualization (Windows 7+)
  p := winControl.ScreenToClient(p);            // screen -> client coordinates
end;

Converting a Project to Touch

At the moment we are converting our graph controls over to touch. In this case we plan to edit our graph component and enable WM_TOUCH messages for just that control.
  • Pinch (zoom) will change just the x or y scale, depending on the pinch direction:
  • Two finger pinch vertically - changes y axis scale.
  • Two finger pinch horizontally - changes x axis scale.
  • Two finger pinch in between - changes both x & y proportionally to the amount of x & y pinched.
  • One or two finger panning (obviously you need to be zoomed in to pan).
  • Two finger tap - reset the graph zoom to defaults (auto scale).
The rest of the project:
  • Our spin number control may use touch and pan to spin the value. Like Adobe do with a mouse click and drag.
  • Multi-Page Controls will use swipe to change the page.
  • Everything else we get for free with Windows 7 touch (since touch controls the mouse).
  • We need to check all controls with scrollbars and fix if they can't be controlled using touch.
  • Some buttons and spacing may need to be adjusted to work with touch.


