In Windows 7 we have WM_GestureNotify, WM_Gesture and WM_Touch messages. We get WM_GestureNotify and WM_Gesture by default (like mouse messages), but WM_Touch messages only fire if you enable them by calling RegisterTouchWindow().
WM_Touch gives you more raw information than WM_Gesture, for example tracking multiple fingers (up, down, move) on multiple targets. You can pipe the touch events through manipulation and inertia processing to get standard gesture-type behaviour. Enabling WM_Touch messages for a control stops that control from receiving WM_Gesture messages (you can't have both), but WM_GestureNotify messages will always fire. Calling UnregisterTouchWindow() restores WM_Gesture messages for the control. There is no requirement to call UnregisterTouchWindow() once for every RegisterTouchWindow() call.
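Enabling raw touch is a one-liner at form creation. Here is a minimal sketch, assuming a standard Delphi form; the user32 imports are declared by hand because pre-D2010 headers don't include them:

```
// RegisterTouchWindow/UnregisterTouchWindow are not in older Delphi
// headers, so declare them ourselves (user32.dll, Windows 7+).
function RegisterTouchWindow(hWnd: HWND; ulFlags: ULONG): BOOL; stdcall;
  external 'user32.dll';
function UnregisterTouchWindow(hWnd: HWND): BOOL; stdcall;
  external 'user32.dll';

procedure TForm1.FormCreate(Sender: TObject);
begin
  // Opt in to raw WM_Touch messages. From now on this window gets
  // WM_Touch instead of WM_Gesture, until UnregisterTouchWindow().
  RegisterTouchWindow(Handle, 0);
end;
```

Note that static `external` binding means the EXE won't load on pre-Windows 7 systems; bind via GetProcAddress() if you need to run there.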
In the WM_Touch event we call GetTouchInputInfo() to get an array of TouchInput records. The low word of WParam gives the number of records; LParam is the touch input handle passed to GetTouchInputInfo(). Each TouchInput record tells us a finger's position and state (up, down, move), etc. Make sure you call DefWindowProc() or CloseTouchInputHandle() to close the handle and pass on the touch event.
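A sketch of the handler, assuming the form class declares `procedure WMTouch(var Msg: TMessage); message WM_TOUCH;` and that the constants and TouchInput record are hand-translated from winuser.h (check the field layout against your own SDK translation):

```
const
  WM_TOUCH = $0240;
  TOUCHEVENTF_MOVE = $0001;
  TOUCHEVENTF_DOWN = $0002;
  TOUCHEVENTF_UP   = $0004;

type
  TTouchInput = packed record
    x, y: Integer;            // hundredths of a pixel, physical screen coords
    hSource: THandle;
    dwID: DWORD;              // identifies a finger across down/move/up
    dwFlags: DWORD;           // TOUCHEVENTF_... state bits
    dwMask, dwTime: DWORD;
    dwExtraInfo: DWORD;       // ULONG_PTR in the SDK (4 bytes on Win32)
    cxContact, cyContact: DWORD;
  end;

function GetTouchInputInfo(hTouchInput: THandle; cInputs: UINT;
  pInputs: Pointer; cbSize: Integer): BOOL; stdcall; external 'user32.dll';
function CloseTouchInputHandle(hTouchInput: THandle): BOOL; stdcall;
  external 'user32.dll';

procedure TForm1.WMTouch(var Msg: TMessage);
var
  Inputs: array of TTouchInput;
  Count, i: Integer;
begin
  Count := LoWord(Msg.WParam);  // low word of WParam = number of records
  SetLength(Inputs, Count);
  if GetTouchInputInfo(Msg.LParam, Count, @Inputs[0], SizeOf(TTouchInput)) then
  begin
    for i := 0 to Count - 1 do
      if Inputs[i].dwFlags and TOUCHEVENTF_DOWN <> 0 then
        { finger Inputs[i].dwID went down at (x div 100, y div 100) };
    CloseTouchInputHandle(Msg.LParam);  // we handled it: close the handle
  end
  else
    inherited;  // not handled: DefWindowProc cleans up and passes it on
end;
```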
WM_GestureNotify fires as soon as you touch the screen. It always precedes the other touch and gesture messages. You get the screen position of the first finger touch and the handle of the control you touched. When the GestureNotify event fires you can call SetGestureConfig() to control which WM_Gesture messages are sent and whether the Pan gesture should include inertia, the gutter effect, etc.
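For example, a WM_GestureNotify handler that asks for Pan with inertia but blocks the gutter effect. This is a sketch; the constants and record are hand-translated from winuser.h:

```
const
  WM_GESTURENOTIFY    = $011A;
  GID_PAN             = 4;
  GC_PAN              = $00000001;
  GC_PAN_WITH_GUTTER  = $00000008;
  GC_PAN_WITH_INERTIA = $00000010;

type
  TGestureConfig = packed record
    dwID: DWORD;     // which gesture (0 = all)
    dwWant: DWORD;   // messages we want
    dwBlock: DWORD;  // messages we don't
  end;

function SetGestureConfig(hwnd: HWND; dwReserved, cIDs: UINT;
  pGestureConfig: Pointer; cbSize: UINT): BOOL; stdcall;
  external 'user32.dll';

// declared in the form class as:
//   procedure WMGestureNotify(var Msg: TMessage); message WM_GESTURENOTIFY;
procedure TForm1.WMGestureNotify(var Msg: TMessage);
var
  Cfg: TGestureConfig;
begin
  Cfg.dwID := GID_PAN;
  Cfg.dwWant := GC_PAN or GC_PAN_WITH_INERTIA;  // pan, with inertia...
  Cfg.dwBlock := GC_PAN_WITH_GUTTER;            // ...but no gutter
  SetGestureConfig(Handle, 0, 1, @Cfg, SizeOf(TGestureConfig));
  inherited;  // always let WM_GestureNotify continue on its way
end;
```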
WM_Gesture gives you a good variety of standard gestures. The message WParam gives you the gesture id (GID_...), while LParam is a handle used to call GetGestureInfo(Msg.LParam, gestureInfo) to get the gesture detail. It's important to free this handle by calling DefWindowProc() or CloseGestureInfoHandle(Msg.LParam). If you don't handle the message, make sure you call DefWindowProc() so Windows can process it.
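A sketch of a form-level WM_Gesture handler. The record is hand-translated from winuser.h; note the explicit padding that keeps ullArguments 8-byte aligned as in the C struct. If GetGestureInfo() fails with an invalid-parameter error, double-check this layout against your own SDK translation:

```
const
  WM_GESTURE = $0119;
  GID_ZOOM   = 3;
  GID_PAN    = 4;

type
  TGestureInfo = packed record
    cbSize: UINT;
    dwFlags: DWORD;
    dwID: DWORD;               // which gesture (GID_...)
    hwndTarget: HWND;          // the control actually touched
    ptsLocation: TSmallPoint;  // physical screen coordinates
    dwInstanceID: DWORD;
    dwSequenceID: DWORD;
    _pad1: DWORD;              // aligns ullArguments to 8 bytes, as in C
    ullArguments: Int64;       // UINT64 in the SDK; Int64 will do here
    cbExtraArgs: UINT;
    _pad2: DWORD;              // rounds the size up to the C sizeof
  end;

function GetGestureInfo(hGestureInfo: THandle;
  var GestureInfo: TGestureInfo): BOOL; stdcall; external 'user32.dll';
function CloseGestureInfoHandle(hGestureInfo: THandle): BOOL; stdcall;
  external 'user32.dll';

// declared in the form class as:
//   procedure WMGesture(var Msg: TMessage); message WM_GESTURE;
procedure TForm1.WMGesture(var Msg: TMessage);
var
  Info: TGestureInfo;
begin
  FillChar(Info, SizeOf(Info), 0);
  Info.cbSize := SizeOf(Info);
  if GetGestureInfo(Msg.LParam, Info) then
  begin
    case Info.dwID of
      GID_ZOOM: { ullArguments holds the distance between the fingers };
      GID_PAN:  { ptsLocation tracks the pan position };
    end;
    CloseGestureInfoHandle(Msg.LParam);  // handled: free the handle ourselves
  end
  else
    inherited;  // unhandled: DefWindowProc frees it and does the default thing
end;
```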
See also Windows Touch Gesture Overview (MSDN).
Here is some typical debug output for a Zoom (pinch) gesture. After calling GetGestureInfo(Msg.LParam, gestureInfo) we can inspect dwFlags, hwndTarget, etc. Physical screen coordinates usually need to be converted to client coordinates (using Delphi's Ctrl.ScreenToClient() method) to be useful.
For Pan we see similar output, but notice we also get inertia messages (if enabled). Inertia is when your finger flicks and leaves the screen but the location continues to move in the direction of the flick, gradually slowing to a stop. The inertia data gives you a kind of x,y countdown; you can see why you might want to turn inertia off if you don't need it. Notice the Y values don't change? This is because Pan Gutter mode is enabled: although my finger is moving erratically, the gesture system knows I'm mainly moving in a negative X direction and keeps the location in an X gutter.
Flicks are also a standard gesture but come in via the older tablet message WM_Tablet_Flick, not via WM_Gesture.
If your form is listening to WM_Tablet_Flick messages and the form is empty of child controls, you will get the messages. However, unlike WM_Gesture above, once you cover your form with panels and controls the messages stop arriving. This is because flick messages are sent only to the control you touch, whereas WM_Gesture messages go to the form even if you touch a control on the form.
This is why WM_Gesture is such a nice message. You can simply listen at the form level and you will receive all WM_Gesture events, including the handle of the control touched.
So WM_Tablet_Flick is a problem for pre-Delphi-2010 code, since we don't really want to sub-class all our controls (make new controls) just to get flick events.
The solution (work-around) is to forget about WM_Tablet_Flick and instead listen to the follow-on key events. The way Windows works is that when a gesture occurs, Windows also sends out various mouse, scroll and key events so that touch works with standard controls (no extra programming required).
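On any old Delphi version that boils down to setting KeyPreview and watching what arrives. Which keys a flick produces depends on the user's flick-to-action assignments in the control panel, so this sketch just logs the key codes rather than assuming a mapping:

```
procedure TForm1.FormCreate(Sender: TObject);
begin
  KeyPreview := True;  // form sees keys even when a child control has focus
end;

procedure TForm1.FormKeyDown(Sender: TObject; var Key: Word;
  Shift: TShiftState);
begin
  // Flick in each direction and note the virtual-key codes that arrive,
  // then act on those instead of WM_Tablet_Flick.
  Caption := Format('Key down: %d', [Key]);
end;
```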
I'm going to talk from a Delphi 3/4/5/6/7 perspective since that's what I'm currently working in.
First a couple of notes:
D2010 now natively supports multi-touch, with each control offering touch events. And I believe you can even define your own gestures. So if you are lucky enough to be using D2010 or later you can forget the rest of this chat and go and read the Delphi manual.
D2/D3 were 32-bit Windows compilers, but I haven't played that far back. Our demos may work, or may not.
D4 was the last Delphi to use binary form files. You will need to use a Delphi utility to convert the demo .DFM form files to binary.
D4/5/6 did not have a UInt64 type, but did have Int64, which is close enough when porting the touch header files.
Adding touch messages to a Delphi form is fairly trivial. Note that for WM_Gesture and WM_Touch we get a handle via Msg.LParam. Once you have finished with these handles you must close them to avoid memory leaks: either call DefWindowProc(), or if you handle the message, explicitly close the handle by calling CloseGestureInfoHandle(LParam) for WM_Gesture or CloseTouchInputHandle(LParam) for WM_Touch.
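If you want the same EXE to keep loading on pre-Windows 7 machines, skip static `external 'user32.dll'` declarations and bind the cleanup functions at runtime instead. A sketch; the message constants are hand-translated from the SDK headers, and the WM_Tablet_Flick value is my reading of tpcshrd.h (WM_TABLET_DEFBASE + 11):

```
const
  WM_GESTURE       = $0119;
  WM_GESTURENOTIFY = $011A;
  WM_TOUCH         = $0240;
  WM_TABLET_FLICK  = $02CB;  // WM_TABLET_DEFBASE ($02C0) + 11

var
  // these stay nil on systems older than Windows 7 - check before calling
  _CloseGestureInfoHandle: function(hGestureInfo: THandle): BOOL; stdcall;
  _CloseTouchInputHandle: function(hTouchInput: THandle): BOOL; stdcall;

procedure BindTouchApi;
var
  User32: HMODULE;
begin
  User32 := GetModuleHandle('user32.dll');
  @_CloseGestureInfoHandle := GetProcAddress(User32, 'CloseGestureInfoHandle');
  @_CloseTouchInputHandle  := GetProcAddress(User32, 'CloseTouchInputHandle');
end;
```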
You can of course design controls that listen to these multi-touch messages. For special controls such as graphical displays it's probably worth the trouble, if you have the source. For a spin edit control you might sub-class so you can touch the control and pan up and down to change its value. You code it once and all instances of the control behave the same.
Microsoft have done a great job implementing touch. Touch just works without you having to write extra code (as long as your controls inherit from standard Windows controls). There may be a few places where you do need touch messages. Instead of writing new controls, just listen at the form level. As you see in the code above it's clean and simple to do this. Even at the form level you will get the handle of the control receiving the event.
XY touch points are given in physical screen coordinates. To convert to logical control coordinates do the following. The function PhysicalToLogicalPoint() was introduced in Windows Vista. Here winControl is the control touched. For WM_Touch messages we get X,Y positions in hundredths of a pixel, so we need to do an integer divide by 100. _TouchInputs is the array of TouchInput records we received from the WM_Touch message using GetTouchInputInfo().
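Putting that together as a small helper (winControl and _TouchInputs are the names used above; PhysicalToLogicalPoint() is declared by hand since older Delphi headers lack it):

```
function PhysicalToLogicalPoint(hWnd: HWND; var lpPoint: TPoint): BOOL;
  stdcall; external 'user32.dll';  // Vista+; bind dynamically for older OSes

function TouchToClient(winControl: TWinControl;
  TouchX, TouchY: Integer): TPoint;
begin
  // WM_Touch coordinates arrive in hundredths of a physical screen pixel
  Result.X := TouchX div 100;
  Result.Y := TouchY div 100;
  // physical -> logical screen pixels (matters under DPI virtualisation)
  PhysicalToLogicalPoint(winControl.Handle, Result);
  // logical screen pixels -> the touched control's client coordinates
  Result := winControl.ScreenToClient(Result);
end;

// e.g. Pt := TouchToClient(winControl, _TouchInputs[0].x, _TouchInputs[0].y);
```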
The rest of the project: