HP TouchSmart 9100 - Business PC User guide

Resources for Developing Touch-Friendly Applications for HP Business Touch-Enabled Desktops
Table of Contents:
Overview
Scope
Hardware Platforms
Operating Systems
Developing Touch-Friendly Applications on an HP Touch Platform
Software Development Practice
General User Experience Guidelines for a Touch-Friendly Application
Usability Testing for Touch Applications
Touch Support in Windows XP®
Touch Support in Windows Vista®
System Gestures
Flick Gestures
Application Gestures
Enabling/Disabling Touch Pointer
Disabling System Gestures
Touch Support in Windows 7®
Gestures
Touch Input Messages
WPF Stylus Events
What to Consider When Developing Touch Applications for Multiple Windows Versions
Stylus
HP dx9000 Specifics
HP Touch Screen Configuration
Enabling/Disabling the Touch Device
Disabling the HP TouchSmart Button
Disabling the Volume and Mute Controls
HP TouchSmart 9100 Specifics
Disabling the HP TouchSmart Button
Changing TouchSmart Style with HP TouchSmart Style Utility
HP TouchSmart Software Basic Style
HP TouchSmart Software Ultimate Style
Putting a System to Cleaning/Maintaining Mode with HP Maintenance Utility
Disabling/Retasking Volume Side Buttons
HP System Configuration Schema Version 1.0
Schema
Example 1 – XML to Disable All Side Volume Buttons
Example 2 – XML to Disable Side Mute Volume Button and Re-task Side Volume Up and Down Buttons to Launch Applications
Example 3 – XML to Re-task Side Volume Buttons to Control Display Brightness and Toggle the Display On/Off
Example 4 – XML to Retask Volume Mute Button to Cleaning Mode
Button Element Attribute Summary
References
For More Information
Call to Action

Overview
Touching a screen is a natural and engaging way to interact with computers. HP
has expanded its portfolio to offer business customers touch-enabled desktops.
These products help businesses provide services and environments where touch
improves the user experience and satisfaction over the traditional mouse and
keyboard experience. Software applications designed with touch in mind get the
most out of touch-enabled platforms, and software developers can take their
applications to another level by providing a touch-friendly experience. This
paper offers developers high-level information and resources to jump-start
their development of touch-friendly applications for HP business touch-enabled
desktops.
Touch-friendly applications allow users to perform most of the important
operations with their fingertips in place of a keyboard and/or a mouse. The
paper describes the elements needed to develop a touch-friendly application:
the user experience design guidelines, the testing practices for a touch
application, and the touch support in Microsoft Windows Vista® and Microsoft
Windows 7®. How to develop a fully touch-optimized application that provides
multi-touch, manipulation and inertia is beyond the scope of this paper.
The paper also provides platform-specific information about the supported HP
touch-enabled products and features such as the touch device and functional
buttons. You may find this information useful for configuring these systems
for your environment.
Scope
Hardware Platforms
HP dx9000 TouchSmart Business PC
HP TouchSmart 9100 Business PC
Operating Systems
Microsoft Windows XP® Professional
Microsoft Windows Vista® Business (32-bit and 64-bit)
Microsoft Windows 7® Professional (32-bit and 64-bit)
Note
This paper discusses the touch features of the operating systems in
scope, but each hardware platform has its own list of supported
operating systems. Be sure to check the platform-specific sections
towards the end of the document for the HP-recommended operating
system for your hardware.

Developing Touch-Friendly Applications on an HP
Touch Platform
Hardware: A supported system listed above
Software: Microsoft Windows 7®, Windows Vista® or Windows XP® (see the
platform-specific sections towards the end of the document for the
HP-recommended operating system for each platform)
Touch driver and touch support: The following table summarizes the touch
support in different Windows versions and whether a touch driver is required
for the touch support.
Operating System    Single Touch    Multi-touch
Windows 7           X               Xd
Windows Vista       X               -
Windows XP          X               -

X: The touch support is native in the Windows version.
Xd: An appropriate touch driver is required for the touch support.
The touch support of each hardware platform and each Windows version is
different. Sections later in this paper discuss specific hardware platforms
and the recommended Windows version for each platform in detail.
What to do:
1. Examine the touch support of available hardware platforms and operating
system versions.
2. Define the level of touch support and the user experience your application
will provide.
3. If you are updating an existing application for touch, evaluate the
application user interface and flow for touch. If developing a new touch
application, design it carefully with touch in mind. The following section
Software Development Practice can be helpful for the evaluation and the
design.
4. Implement the development needed to support touch. Most single-touch
gestures are synthesized into mouse messages by the operating system, so in
many cases legacy applications may just need a face lift, rearranging
objects and the flow of the user interface, without implementing touch
programming.
Software Development Practice
Refer to the “Windows User Experience Interaction Guidelines” for
applications running on Windows, available on the Microsoft Developer
Network (MSDN). These articles include the recommended guidelines for all
Windows applications as well as for touch applications.
Pay particular attention to the guidelines for touch applications.

Note: The preinstalled image of HP business touch-enabled desktops may come
with HP TouchSmart software. This software is a framework hosting other registered
applications to provide the unique HP touch experience. The instructions to write a
hosted application in HP TouchSmart software are available at
http://www.touchsmartdevzone.com/download/60/HP-TouchSmart-Software-
Developer-Guidelines/. The TouchSmart Community website has been divided into
two sites - nextBench and TouchSmart Dev Zone.
The following sections summarize the main points for developing a touch-friendly
application. Most of the information was collected from MSDN articles. Links to
these articles are located in the References section of this paper.
General User Experience Guidelines for a Touch-Friendly Application
The majority of the work required to ensure a touch-friendly application goes
into designing the user experience. In an existing Windows application with
small controls designed for the mouse and keyboard, users find it difficult
to touch the controls and give input. Additionally, an application that
requires constantly alternating between touching, clicking, and typing can
lead to user aggravation and frustration. The following table summarizes
recommendations by Microsoft for a better touch experience.
Aspect: Touch target/control size
  Recommendation: At least 23x23 pixels (13x13 dialog units (DLUs)).
  Frequently used controls can be bigger, 40x40 pixels (23x22 DLUs).
  Reasoning: Fingers can hit larger targets easily.

Aspect: Space between controls
  Recommendation: At least 5 pixels of space between controls.
  Reasoning: Fingers should not accidentally hit controls close to the
  target(s).

Aspect: Control location
  Recommendation: Place controls in a way that avoids cross-screen movements
  when touching controls. Also, avoid placing small controls near window
  edges, and avoid placing controls at the leftmost or rightmost edge of the
  screen.
  Reasoning: Arms tire quickly from cross-screen movements. Users may
  inadvertently hit the area outside the window instead of the targets.
  Because people are either left-handed or right-handed, placing controls at
  the leftmost or rightmost edge may be inconvenient for one group or the
  other.

Aspect: Common control usage
  Recommendation: Use common controls wherever you can because they are
  designed to support a good touch experience. If you really need to use
  custom controls, make sure that they are well implemented for touch
  interactions.

Aspect: Hover
  Recommendation: Must not require users to hover their fingers over the
  screen to perform an action.
  Reasoning: Hover is not supported by the touch technology of the hardware
  platforms.

Aspect: Forgiveness
  Recommendation: Allow users to easily undo/reverse their actions.
  Reasoning: Finger touch is not as accurate as a pen, stylus or other input
  device.

Aspect: Text input
  Recommendation: Minimize text input by providing selections like sliders,
  checkboxes, option buttons, auto-text, appropriate default values, etc.
  Reasoning: It is cumbersome to switch between finger touch and keyboard
  typing.

Aspect: Touch pointer
  Recommendation: Should not rely on the touch pointer to provide input.
  Reasoning: The touch pointer is not as easy and natural to use as direct
  touch input.

Aspect: Feedback
  Recommendation: Provide clear visual feedback right after each user
  interaction.
  Reasoning: Minimizes chances of confusion and dissatisfaction.
For more details, please read the Windows User Experience Guidelines for Touch
on MSDN.
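To illustrate the target-size guideline, here is a hypothetical C++ helper (our own sketch, not part of any Windows API) that expands a control's hit-test rectangle so it meets the recommended 23x23-pixel minimum:

```cpp
#include <cassert>

// Hypothetical helper (not a Windows API): expand a control's hit-test
// rectangle symmetrically around its center so it meets the recommended
// 23x23-pixel minimum touch target.
struct Rect { int left, top, right, bottom; };

Rect EnsureMinimumTouchTarget(Rect r, int minSide = 23)
{
    int w = r.right - r.left;
    int h = r.bottom - r.top;
    if (w < minSide) {
        int grow = minSide - w;
        r.left  -= grow / 2;          // half of the growth on the left...
        r.right += grow - grow / 2;   // ...and the rest on the right
    }
    if (h < minSide) {
        int grow = minSide - h;
        r.top    -= grow / 2;
        r.bottom += grow - grow / 2;
    }
    return r;
}
```

A control already larger than the minimum is returned unchanged, so the helper can be applied uniformly during hit-testing.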
Usability Testing for Touch Applications
Usability tests for touch applications require user experience tests beyond
the standard application tests. Below are basic points to consider when
writing a test plan for a touch application.
- Ensure the use of fingertips (not fingernails) of both normal and large
sizes
- Include right-handed and left-handed user tests
- Make sure tests cover the same amount of time a typical user will spend on
your application, to determine whether there is physical discomfort to hands
and arms
- Include tests in which users touch the boundary of controls and touch
targets
- Test how forgiving your application is when users accidentally hit
unwanted targets, in particular for destructive operations like delete,
erase, etc.
- If applicable, test your application with different screen resolutions and
DPI settings to ensure no controls or touch targets are obscured and that
the user interface displays correctly
- Depending on your application, perform usability tests with a variety of
focus groups
- As new touch gestures are defined and grow in popularity, make sure your
application behaves gracefully even with unsupported gestures.

Touch Support in Windows XP®
Only single touches work on Windows XP. System gestures are synthesized to the
equivalent mouse messages shown below so your applications only need to
respond to these mouse messages.
System gesture                              Equivalent mouse message
Tap (down and up)                           Mouse left-click
Double tap (down and up twice)              Mouse double click
Press and hold (down, pause, then up)       Mouse right-click
Drag (down, move and up)                    Mouse left-drag
Select (down, move over target(s), up)      Mouse select
Touch Support in Windows Vista®
Windows Tablet and Touch Technology is a standard component of Windows
Vista. Hence, applications running on Windows Vista and a hardware platform
supporting touch input like the HP dx9000 can and should take advantage of the
touch technology. This section highlights the gestures supported by Windows
Vista and provides links for more advanced touch input manipulation.
System Gestures
A touch gesture is a quick movement of a finger on the screen that a computer
understands. The following table lists the system gestures defined by Windows and
supported by the hardware platform referenced in this paper. Windows Vista maps
the following touch gestures directly into equivalent mouse events. If your
application responds to any of the mouse events below, it will automatically
respond to the corresponding system gesture on the supported hardware platform.
System gesture                              Equivalent mouse message
Tap (down and up)                           Mouse left-click
Double tap (down and up twice)              Mouse double click
Press and hold (down, pause, then up)       Mouse right-click
Drag (down, move and up)                    Mouse left-drag
Hold and drag (down, pause, move and up)    Mouse right-drag
Select (down, move over target(s), up)      Mouse select

The following code fragment from a Windows Presentation Foundation (WPF)
application shows that the code handling a user touching/tapping the exit
button is the very same code that handles a mouse click on the exit button.
public Window1()
{
    InitializeComponent();
    //add the event handler to handle users clicking/tapping the exit button
    button1.Click += new RoutedEventHandler(exitButton_Click);
}

private void exitButton_Click(object sender, RoutedEventArgs e)
{
    //close the window to exit the application
    this.Close();
}
You may wonder why there is nothing relating to touch in this code. The
operating system synthesizes the tap or touch on the exit button into a
mouse left-click message, so no extra code is needed to handle the tap/touch
beyond the usual left mouse click event handler. When users left-click the
button with the mouse, exitButton_Click is invoked, and the very same
exitButton_Click handler is invoked when users touch the button.
Flick Gestures
A flick is a simple touch gesture that can be interpreted as a keystroke command.
The following table lists the flick gestures and default assignment in Windows Vista.
Flick                         Equivalent command
Navigational flicks (enabled by default):
  Flick left                  Back command
  Flick right                 Forward command
  Flick up                    Keyboard Scroll Down one screenful
  Flick down                  Keyboard Scroll Up one screenful
Editing flicks (not enabled by default, because these flicks are not
natural and require more precision):
  Flick up-left diagonal      Keyboard Delete
  Flick down-left diagonal    Keyboard Undo
  Flick up-right diagonal     Keyboard Copy
  Flick down-right diagonal   Keyboard Paste

When a flick is detected, Windows first sends a flick message
(WM_TABLET_FLICK). If the flick message is not handled, Windows follows up
by sending the applicable WM_APPCOMMAND notification. Your application can
either handle flick messages directly or respond to the follow-up command
notifications. Please note that users can re-assign these flicks to commands
of their choice in Control Panel > Pen and Input Devices. Hence, if your
application responds to commands instead of flick events, it is recommended
that it respond to any application command that can potentially be assigned
to a flick gesture. A more detailed discussion of flick gestures is
available on MSDN.
If you would like to support flick gestures in your application, refer to
the “Flicks API Reference” on MSDN.
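To give an intuition for how the eight flick directions partition finger movement, the following illustrative C++ sketch (not the Windows flick recognizer; the enum names are our own) classifies a flick's displacement vector by rounding its angle to the nearest 45-degree sector:

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch only: classify a flick's displacement vector
// (dx, dy in screen coordinates, y growing downward) into one of the
// eight flick directions. The enum names are our own, not Windows
// constants.
enum FlickDirection {
    FlickRight, FlickUpRight, FlickUp, FlickUpLeft,
    FlickLeft, FlickDownLeft, FlickDown, FlickDownRight
};

FlickDirection ClassifyFlick(double dx, double dy)
{
    const double kPi = 3.14159265358979323846;
    double angle = std::atan2(-dy, dx);                   // -pi..pi; screen "up" is positive
    int sector = (int)std::lround(angle / (kPi / 4.0));   // round to nearest 45-degree sector, -4..4
    if (sector < 0) sector += 8;                          // wrap to 0..7
    return (FlickDirection)(sector % 8);
}
```

A real flick recognizer also checks speed and straightness; this sketch shows only the directional bucketing.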
If your application handles command messages, the window procedure (WndProc)
of the window that handles the command message should be similar to the
following pseudo-code:
if (message == WM_APPCOMMAND)
{
    switch (GET_APPCOMMAND_LPARAM(lParam))
    {
    case APPCOMMAND_BROWSER_BACKWARD:
        //do something
        //return TRUE after processing the APPCOMMAND message
        return 1;
    case APPCOMMAND_BROWSER_FORWARD:
        //do something
        //return TRUE after processing the APPCOMMAND message
        return 1;
    }
}
The constants and macros relating to WM_APPCOMMAND are defined in
WinUser.h of Microsoft Windows Software Development Kit (SDK).
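As a portable illustration of what those macros do, the snippet below mirrors the WinUser.h definitions to show how the command id and the originating device are unpacked from lParam (the constant values match the SDK headers; the function names are our own):

```cpp
#include <cassert>
#include <cstdint>

// Portable illustration of how the GET_APPCOMMAND_LPARAM and
// GET_DEVICE_LPARAM macros unpack lParam; the constant values below
// mirror their WinUser.h definitions.
const uint32_t FAPPCOMMAND_MASK  = 0xF000;
const uint32_t FAPPCOMMAND_MOUSE = 0x8000;   // command came from a mouse button
const int APPCOMMAND_BROWSER_BACKWARD = 1;
const int APPCOMMAND_BROWSER_FORWARD  = 2;

// HIWORD of lParam holds the command id in its low 12 bits and the
// originating device (key, mouse or OEM) in its high 4 bits.
inline uint32_t HiWord(uint32_t v) { return (v >> 16) & 0xFFFFu; }

inline int GetAppCommand(uint32_t lParam)
{
    return (int)(HiWord(lParam) & ~FAPPCOMMAND_MASK);
}

inline uint32_t GetAppCommandDevice(uint32_t lParam)
{
    return HiWord(lParam) & FAPPCOMMAND_MASK;
}
```

In real code you would use the SDK macros directly; the point here is only the bit layout they decode.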

Application Gestures
In addition to system gestures and flick gestures, Windows Vista also
supports application gestures. You can use the Microsoft gesture recognizer,
create your own custom gesture recognizer, or use a combination of both. The
Microsoft gesture recognizer can recognize over forty gestures. The
following table lists a few examples of gestures it recognizes.
Gesture name: Scratch-out
  Suggested behavior: Erase content
  Notes: Make the strokes as horizontal as possible, and draw at least three
  strokes. If the height of the gesture increases, the number of
  back-and-forth strokes also needs to increase.

Gesture name: Triangle
  Suggested behavior: Insert
  Notes: Draw the triangle in a single stroke, without lifting the finger.
  Make sure that the top of the triangle points upward.

Gesture name: Square
  Suggested behavior: Action item
  Notes: Draw the square in a single stroke, starting at the upper left
  corner, without lifting the finger.
A complete list of application gestures recognized by Microsoft Gesture Recognizer
is available on MSDN at http://msdn.microsoft.com/en-
us/library/ms698540(VS.85).aspx.
The following code fragment of a WPF application gives you an idea of how
Windows-defined application gestures work. The application has an ink canvas
control that accepts only the enabled application gestures: triangle, square
and circle. When a user runs the application and draws a stroke on the
canvas with a finger, the inkCanvas_OnGesture method is called. If the
stroke is a single-stroke triangle, square or circle and is recognized as
one of the enabled gestures, the stroke is removed from the canvas.
Otherwise, the stroke remains on the canvas.
public Window1()
{
    InitializeComponent();
    if (inkCanvas.IsGestureRecognizerAvailable)
    {
        inkCanvas.EditingMode = InkCanvasEditingMode.InkAndGesture;
        inkCanvas.Gesture += new InkCanvasGestureEventHandler(inkCanvas_OnGesture);
        //enable only certain application gestures
        inkCanvas.SetEnabledGestures(new ApplicationGesture[]
            {ApplicationGesture.Triangle,
             ApplicationGesture.Square,
             ApplicationGesture.Circle});
    }
}

private void inkCanvas_OnGesture(object sender, InkCanvasGestureEventArgs e)
{
    ApplicationGesture gesture =
        e.GetGestureRecognitionResults()[0].ApplicationGesture;
    if (gesture == ApplicationGesture.Triangle ||
        gesture == ApplicationGesture.Square ||
        gesture == ApplicationGesture.Circle)
    {
        //remove the stroke on the ink canvas
        StrokeCollection strokesToDelete =
            inkCanvas.Strokes.HitTest(e.Strokes.GetBounds(), 10);
        inkCanvas.Strokes.Remove(strokesToDelete);
        //do some action here
    }
    else
    {
        //not the expected gestures
        e.Cancel = true;
    }
}
Enabling/Disabling Touch Pointer
The touch pointer is a floating Windows graphic that looks like a mouse
pointer on the screen. The pointer helps you target small objects, since
touching small targets with a finger is not accurate. To show the touch
pointer when touching the screen, open Control Panel > Pen and Input
Devices > Touch tab, check Show the touch pointer when I’m interacting with
items on the screen, and click Apply.
If you need to disable and/or enable the touch pointer in your code, you can
intercept operating system window messages using a window procedure
(WndProc) and modify the Windows messages. The C# pseudo-code below shows
how to enable and disable the touch pointer:
const int WM_TABLET_QUERY_SYSTEM_GESTURE_STATUS = 716;
const uint SYSTEM_GESTURE_STATUS_TOUCHUI_FORCEON = 0x00000100;
const uint SYSTEM_GESTURE_STATUS_TOUCHUI_FORCEOFF = 0x00000200;

protected override void WndProc(ref Message msg)
{
    switch (msg.Msg)
    {
        case WM_TABLET_QUERY_SYSTEM_GESTURE_STATUS:
        {
            uint result = 0;
            if (...)
            {
                //enable the touch pointer
                result |= SYSTEM_GESTURE_STATUS_TOUCHUI_FORCEON;
            }
            if (...)
            {
                //disable the touch pointer
                result |= SYSTEM_GESTURE_STATUS_TOUCHUI_FORCEOFF;
            }
            //return the modified message result
            msg.Result = (IntPtr)result;
        }
        break;
        default:
            base.WndProc(ref msg);
            break;
    }
}
Refer to the article “Touch Input Support in Windows Vista” on MSDN for
more details.
Disabling System Gestures
By default, a window receives all system gesture events. You can disable
some system gesture events by responding to the
WM_TABLET_QUERYSYSTEMGESTURESTATUS message in your window procedure.
Refer to the article “WM_TABLET_QUERYSYSTEMGESTURESTATUS Message” on MSDN
for more details.
Additional advanced touch input manipulation is available on MSDN at
http://msdn.microsoft.com/.
Touch Support in Windows 7®
Windows 7 provides richer support for the development of touch applications.
Support includes:
- Default behavior for many gestures that touch-unaware applications receive
automatically
- Gesture messages (WM_GESTURE) to get the gestures performed by users
- Touch input messages (WM_TOUCH) to get information about specific contact
points
- Manipulation APIs (using a manipulation processor and an inertia
processor) on top of touch input messages to get the actions performed by
users as well as how those actions can be interpreted

An application can receive either gesture messages or touch input messages,
but not both at the same time. By default, Windows sends WM_GESTURE
notifications to an application. If the application does not handle these
events, they bubble up and Windows sends back equivalent legacy messages
like mouse left button click, mouse left double click, mouse right button
up, mouse right button down, etc. This is why touch-unaware applications can
still get good touch support.
Not all gestures have equivalent legacy messages. To provide a better touch
user experience, you may need to handle gesture messages that have no legacy
messages, or you may need to override the default behavior of the legacy
messages in your application.
You can also choose to wow your users with the best touch experience by
having your application register for and handle touch input messages. The
lowest-level raw touch data provided by the Windows Touch API lets you
detect the action and identify each touch point, its position and its
contact area. You can then calculate the physical aspects of user touch
actions and simulate lively physical responses to them.
With touch input messages, you can also choose to provide event sinks for
manipulation processors and inertia processors. This mechanism helps animate
complex gestures like simultaneous rotation and translation, decelerate
objects along their trajectories, produce bouncing effects when hitting a
boundary, etc.
At the release of Windows 7, development support for the new multi-touch
features is available to native code via Win32 and COM. WPF 4.0 is expected
to support all multi-touch features; in the interim, WPF 3.5 SP1 application
developers have the following options:
- Use the .NET wrappers in the Windows 7 Multitouch .NET Interop Sample
Library, or
- Use stylus events
The following sub-sections summarize the basics of handling gesture and
touch input messages using unmanaged code, and then show how to get
multi-touch data in a WPF application. Code excerpts from MSDN and other
sources are included for illustration.
Gestures
The following table lists the gestures defined in Windows 7 and the default
synthesized messages for applications that do not handle gesture messages.

Gesture                                                Equivalent message
Tap (down and up quickly)                              Mouse left-click
Double tap (down and up quickly twice)                 Mouse double click
Select (down, move over target(s), up)                 Mouse select
Drag (down, move and up)                               Mouse left-drag
Press and hold (down, pause until the blue ring
appears around your finger, then up)                   Mouse right-click
Press and tap (press one finger on the target, then
quickly tap down the other finger; the first finger
needs to stay on the target)                           Mouse right-click
Two-finger tap (tap two fingers at the same time;
the target is the midpoint between the two fingers)    Not applicable
Panning (drag one or two fingers up and/or down)       Mouse scroll wheel
Zoom (move two fingers apart or together)              Control key + mouse scroll wheel
Rotate (move two fingers in opposite directions, or
move one finger pivoting around another finger)        Not applicable

Flicks (down and move quickly in a direction):
Navigational flicks:
  Flick left                  Forward command
  Flick right                 Back command
  Flick up                    Keyboard Scroll Down one screenful
  Flick down                  Keyboard Scroll Up one screenful
Editing flicks:
  Flick up-left diagonal      Keyboard Delete
  Flick down-left diagonal    Keyboard Undo
  Flick up-right diagonal     Keyboard Copy
  Flick down-right diagonal   Keyboard Paste
Note: Editing flicks are not enabled by default. You can enable them by
configuring the Pen and Touch settings in Windows.
The following MSDN unmanaged code excerpt shows how to catch and decode
gesture messages into a gesture information structure. Basically, you check
for WM_GESTURE messages, then call GetGestureInfo to retrieve the gesture
information from lParam. The dwID field of the gesture information structure
identifies the type of gesture.
//catch gesture messages in WndProc
LRESULT CALLBACK WndProc(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam)
{
    //. . .
    switch (message)
    {
    case WM_GESTURE:
        // Insert handler code here to interpret the gesture.
        return DecodeGesture(hWnd, message, wParam, lParam);
    //. . .
    }
}

//decode gesture messages using GetGestureInfo
LRESULT DecodeGesture(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam){
    // Create a structure to populate and retrieve the extra message info.
    GESTUREINFO gi;
    ZeroMemory(&gi, sizeof(GESTUREINFO));
    gi.cbSize = sizeof(GESTUREINFO);

    BOOL bResult = GetGestureInfo((HGESTUREINFO)lParam, &gi);
    BOOL bHandled = FALSE;
    if (bResult){
        // now interpret the gesture
        switch (gi.dwID){
        case GID_ZOOM:
            // Code for zooming goes here
            bHandled = TRUE;
            break;
        case GID_PAN:
            // Code for panning goes here
            bHandled = TRUE;
            break;
        case GID_ROTATE:
            // Code for rotation goes here
            bHandled = TRUE;
            break;
        case GID_TWOFINGERTAP:
            // Code for two-finger tap goes here
            bHandled = TRUE;
            break;
        case GID_PRESSANDTAP:
            // Code for roll over goes here
            bHandled = TRUE;
            break;
        default:
            // A gesture was not recognized
            break;
        }
    }else{
        DWORD dwErr = GetLastError();
        if (dwErr > 0){
            //MessageBoxW(hWnd, L"Error!", L"Could not retrieve a GESTUREINFO structure.", MB_OK);
        }
    }
    if (bHandled){
        return 0;
    }else{
        return DefWindowProc(hWnd, message, wParam, lParam);
    }
}
The gesture information structure GESTUREINFO contains additional
information that may be helpful to your application:
- dwFlags shows the state of the gesture: GID_BEGIN, GID_INERTIA or GID_END.
- pstLocation shows the location related to the gesture (for example,
pstLocation indicates the center of a zoom gesture).
- ullArguments shows more information about the gesture (for example, the
distance between the two points of a zoom).
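For example, a hypothetical tracker (our own sketch, not a Windows API) can turn the ullArguments distances of successive GID_ZOOM messages into an incremental scale factor for the zoomed object:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch (not a Windows API): for GID_ZOOM, ullArguments
// carries the distance between the two fingers. Remembering the distance
// from the previous zoom message yields the incremental scale factor to
// apply to the zoomed object.
struct ZoomTracker {
    uint64_t lastDistance = 0;

    // Feed the ullArguments distance of each GID_ZOOM message; returns
    // the factor to multiply the current zoom level by.
    double Update(uint64_t distance)
    {
        double factor = 1.0;
        if (lastDistance != 0)
            factor = (double)distance / (double)lastDistance;
        lastDistance = distance;     // remember for the next message
        return factor;
    }

    void Reset() { lastDistance = 0; }   // call when the gesture ends
};
```

Resetting on GID_BEGIN/GID_END keeps a stale distance from one gesture from distorting the first delta of the next.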
You can configure which gesture messages Windows sends by calling
SetGestureConfig. Typical usages are disabling/enabling gestures and
blocking/receiving vertical/horizontal panning, gutter panning, and inertia
for single-finger panning.
A few things to consider when handling WM_GESTURE:
- Do not call RegisterTouchWindow, or your application will stop receiving
WM_GESTURE messages.
- Ignore the gesture starting and ending messages (GID_BEGIN and GID_END).
The default gesture handler uses these; application behavior is undefined
if your application consumes them.
- Pass unconsumed messages to DefWindowProc to ensure all messages are
handled appropriately.
- Call CloseGestureInfoHandle to close the gesture input handle for
messages that your application handles.
- Note that compound gestures are coalesced. For example, a touch gesture
starting as a zoom and then changing to panning results in only a zoom
gesture message. If you want to respond exactly to zooming and then
panning, you need to handle WM_TOUCH instead of WM_GESTURE and detect the
gesture change yourself.
More details about touch support in Windows 7® are available on MSDN at
http://msdn.microsoft.com/en-us/library/dd562197(VS.85).aspx.
Touch Input Messages
Touch input messages give you the raw touch data and all the freedom to
interpret it: associate the raw touch points with target objects and figure
out what action the user or users intend. The following unmanaged MSDN code
excerpts show the different steps in getting touch input messages: test the
capabilities of the input digitizer, register to receive touch input
messages, and handle the messages.

Test the capability of the touchscreen:
#include <windows.h>
. . .
// test for touch
int value = GetSystemMetrics(SM_DIGITIZER);
if (value & NID_READY){ /* stack ready */ }
if (value & NID_MULTI_INPUT){
    /* digitizer is multitouch */
    MessageBoxW(hWnd, L"Multitouch found", L"IsMulti!", MB_OK);
}
if (value & NID_INTEGRATED_TOUCH){ /* Integrated touch */ }

Register to receive touch input messages (by calling RegisterTouchWindow on
the window) and pass them to a touch handler in the window procedure:
// the window was registered earlier with RegisterTouchWindow(hWnd, 0);
LRESULT CALLBACK WndProc(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam)
{
    int wmId, wmEvent;
    PAINTSTRUCT ps;
    HDC hdc;
    switch (message)
    {
    // pass touch messages to the touch handler
    case WM_TOUCH:
        OnTouch(hWnd, wParam, lParam);
        break;
Handle the messages:
LRESULT OnTouch(HWND hWnd, WPARAM wParam, LPARAM lParam){
    BOOL bHandled = FALSE;
    UINT cInputs = LOWORD(wParam);
    PTOUCHINPUT pInputs = new TOUCHINPUT[cInputs];
    if (pInputs){
        if (GetTouchInputInfo((HTOUCHINPUT)lParam, cInputs, pInputs,
                              sizeof(TOUCHINPUT))){
            for (UINT i=0; i < cInputs; i++){
                TOUCHINPUT ti = pInputs[i];
                //do something with each touch input entry
            }
            bHandled = TRUE;
        }else{
            /* handle the error here */
        }
        delete [] pInputs;
    }else{
        /* handle the error here, probably out of memory */
    }

    if (bHandled){
        // if you handled the message, close the touch input handle and return
        CloseTouchInputHandle((HTOUCHINPUT)lParam);
        return 0;
    }else{
        // if you didn't handle the message, let DefWindowProc handle it
        return DefWindowProc(hWnd, WM_TOUCH, wParam, lParam);
    }
}
Consider the following when handling WM_TOUCH:
- Pass unconsumed messages to DefWindowProc to ensure all messages are
handled appropriately.
- Call CloseTouchInputHandle to close the touch input handle for messages
that your application handles.
- Be aware that by default, touch message coalescing and palm detection are
enabled. Palm detection may affect the performance of your application
unnecessarily; you can turn both of these off in the call to
RegisterTouchWindow.
- Note that WM_TOUCH messages are “greedy”: once Windows sends the first
touch message to a window, all subsequent touch messages are sent to that
window until another window gets focus.
Interpreting touch input messages is usually complex. You may want to use the COM-
based manipulation processors and inertia processors to make handling touch input
messages easier. You can feed raw touch input messages to a manipulation
processor and receive manipulation delta events in return, with all the deltas you
need to respond to touch messages: translation delta, scale delta, expansion delta,
rotation delta, and so on. With this help, you can respond to gesture combinations
such as zoom-and-rotate or rotate-and-translate without doing a lot of calculations
yourself. You can also use an inertia processor together with a manipulation
processor to simulate object physics.
More on manipulations and inertia is available on MSDN, in the Manipulations
and Inertia section at http://msdn.microsoft.com/en-us/library/dd317309(VS.85).aspx.
You may also refer to the Implementing Windows Touch with Multiple Manipulation
Processors blog post at http://gclassy.com/2009/08/06/implementing-windows-touch-
with-multiple-manipulation-processors/.
WPF Stylus Events
Besides handling gesture and touch input messages in unmanaged code, you
can also use stylus events in WPF to get touch data. This path is for those who
want to develop with WPF prior to WPF 4.0. Andrew Eichacker showed how to get
two contact points using stylus events in his blog. First, he enabled touch events
for the application. Then he subscribed to stylus events and identified the two touch
points by their touch IDs. In the example, two squares, one black and one red,
appear at the touch-down locations. The squares follow the touch
contacts and disappear when the touches are released.
Window1.xaml
<Window x:Class="MultitouchTest.Window1"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    Title="Window1" Height="800" Width="1200">
    <Canvas>
        <Rectangle Canvas.Left="-20" Canvas.Top="0" Height="20"
                   Name="Touch1" Stroke="Black" Fill="Black" Width="20" />
        <Rectangle Canvas.Left="-20" Canvas.Top="0" Height="20"
                   Name="Touch2" Stroke="Red" Width="20" Fill="Red" />
    </Canvas>
</Window>
Window1.xaml.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;
using System.Windows.Interop;
using System.Runtime.InteropServices;
using System.Diagnostics;

namespace MultitouchTest
{
    public partial class Window1 : Window
    {
        #region Class Variables
        private int Touch1ID = 0; // id for first touch contact
        private int Touch2ID = 0; // id for second touch contact
        #endregion

        #region P/Invoke
        // just a little interop. It's different this time!
        [DllImport("user32.dll")]
        public static extern bool SetProp(IntPtr hWnd, string lpString,
            IntPtr hData);