<!--
2 Copyright 2011 The Android Open Source Project
3
4 Licensed under the Apache License, Version 2.0 (the "License");
5 you may not use this file except in compliance with the License.
6 You may obtain a copy of the License at
7
8 http://www.apache.org/licenses/LICENSE-2.0
9
10 Unless required by applicable law or agreed to in writing, software
11 distributed under the License is distributed on an "AS IS" BASIS,
12 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 See the License for the specific language governing permissions and
14 limitations under the License.
15-->
16
17# Touch Devices #
18
19Android supports a variety of touch screens and touch pads, including
20stylus-based digitizer tablets.
21
22Touch screens are touch devices that are associated with a display such that
23the user has the impression of directly manipulating items on screen.
24
25Touch pads are touch devices that are not associated with a display such as a
26digitizer tablet. Touch pads are typically used for pointing or for
27absolute indirect positioning or gesture-based control of a user interface.
28
29Touch devices may have buttons whose functions are similar to mouse buttons.
30
31Touch devices can sometimes be manipulated using a variety of different tools
32such as fingers or a stylus depending on the underlying touch sensor technology.
33
34Touch devices are sometimes used to implement virtual keys. For example, on
35some Android devices, the touch screen sensor area extends beyond the edge of
36the display and serves dual purpose as part of a touch sensitive key pad.
37
38Due to the great variety of touch devices, Android relies on a large number of
39configuration properties to describe the characteristics and desired behavior
40of each device.
41
42## Touch Device Classification ##
43
44An input device is classified as a *multi-touch* device if both of
45the following conditions hold:
46
47* The input device reports the presence of the `ABS_MT_POSITION_X` and
48 `ABS_MT_POSITION_Y` absolute axes.
49
50* The input device does not have any gamepad buttons. This condition
51 resolves an ambiguity with certain gamepads that report axes with codes
  that overlap those of the MT axes.
53
54An input device is classified as a *single-touch* device if both of the
55following conditions hold:
56
57* The input device is not classified as a multi-touch device. An input device
58 is either classified as a single-touch device or as a multi-touch device,
59 never both.
60
61* The input device reports the presence of the `ABS_X` and `ABS_Y` absolute
62 axes, and the presence of the `BTN_TOUCH` key code.
63
64Once an input device has been classified as a touch device, the presence
65of virtual keys is determined by attempting to load the virtual key map file
66for the device. If a virtual key map is available, then the key layout
67file for the device is also loaded.
68
69Refer to the section below about the location and format of virtual key map
70files.
71
72Next, the system loads the input device configuration file for the touch device.
73
74**All built-in touch devices should have input device configuration files.**
75If no input device configuration file is present, the system will
76choose a default configuration that is appropriate for typical general-purpose
77touch peripherals such as external USB or Bluetooth HID touch screens
78or touch pads. These defaults are not designed for built-in touch screens and
79will most likely result in incorrect behavior.
80
After the input device configuration file has been loaded, the system will classify the
82input device as a *touch screen*, *touch pad* or *pointer* device.
83
84* A *touch screen* device is used for direct manipulation of objects on the
85 screen. Since the user is directly touching the screen, the system does
86 not require any additional affordances to indicate the objects being
87 manipulated.
88
89* A *touch pad* device is used to provide absolute positioning information
90 to an application about touches on a given sensor area. It may be useful
91 for digitizer tablets.
92
93* A *pointer* device is used for indirect manipulation of objects on the
94 screen using a cursor. Fingers are interpreted as multi-touch pointer
95 gestures. Other tools, such as styluses, are interpreted using
96 absolute positions.
97
98 See [Indirect Multi-touch Pointer Gestures](#indirect-multi-touch-pointer-gestures)
99 for more information.
100
101The following rules are used to classify the input device as a *touch screen*,
102*touch pad* or *pointer* device.
103
104* If the `touch.deviceType` property is set, then the device type will be
105 set as indicated.
106
107* If the input device reports the presence of the `INPUT_PROP_DIRECT`
108 input property (via the `EVIOCGPROP` ioctl), then the device type will
109 be set to *touch screen*. This condition assumes that direct input touch
110 devices are attached to a display that is also connected.
111
112* If the input device reports the presence of the `INPUT_PROP_POINTER`
113 input property (via the `EVIOCGPROP` ioctl), then the device type will
114 be set to *pointer*.
115
116* If the input device reports the presence of the `REL_X` or `REL_Y` relative
117 axes, then the device type will be set to *touch pad*. This condition
118 resolves an ambiguity for input devices that consist of both a mouse and
119 a touch pad. In this case, the touch pad will not be used to control
120 the pointer because the mouse already controls it.
121
122* Otherwise, the device type will be set to *pointer*. This default ensures
123 that touch pads that have not been designated any other special purpose
124 will serve to control the pointer.
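
Below is a hypothetical helper based on the rules above (excluding the `touch.deviceType`
override) that classifies an already-opened evdev device by reading its input properties
with `EVIOCGPROP` and its relative axes with `EVIOCGBIT`. The function name and structure
are illustrative only; error handling is omitted.

    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/input.h>

    #define BITS_PER_LONG (8 * sizeof(unsigned long))
    #define NLONGS(x)     (((x) + BITS_PER_LONG - 1) / BITS_PER_LONG)

    static int testBit(int bit, const unsigned long *array) {
        return (array[bit / BITS_PER_LONG] >> (bit % BITS_PER_LONG)) & 1;
    }

    static const char *classifyTouchDeviceType(int fd) {
        unsigned long propBits[NLONGS(INPUT_PROP_MAX + 1)];
        unsigned long relBits[NLONGS(REL_MAX + 1)];

        memset(propBits, 0, sizeof(propBits));
        memset(relBits, 0, sizeof(relBits));
        ioctl(fd, EVIOCGPROP(sizeof(propBits)), propBits);
        ioctl(fd, EVIOCGBIT(EV_REL, sizeof(relBits)), relBits);

        if (testBit(INPUT_PROP_DIRECT, propBits))
            return "touchScreen";
        if (testBit(INPUT_PROP_POINTER, propBits))
            return "pointer";
        if (testBit(REL_X, relBits) || testBit(REL_Y, relBits))
            return "touchPad";   /* device also acts as a mouse */
        return "pointer";        /* default */
    }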
125
126## Buttons ##
127
128Buttons are *optional* controls that may be used by applications to perform
129additional functions. Buttons on touch devices behave similarly to mouse
130buttons and are mainly of use with *pointer* type touch devices or with a
131stylus.
132
133The following buttons are supported:
134
135* `BTN_LEFT`: mapped to `MotionEvent.BUTTON_PRIMARY`.
136
137* `BTN_RIGHT`: mapped to `MotionEvent.BUTTON_SECONDARY`.
138
139* `BTN_MIDDLE`: mapped to `MotionEvent.BUTTON_MIDDLE`.
140
141* `BTN_BACK` and `BTN_SIDE`: mapped to `MotionEvent.BUTTON_BACK`.
142 Pressing this button also synthesizes a key press with the key code
143 `KeyEvent.KEYCODE_BACK`.
144
145* `BTN_FORWARD` and `BTN_EXTRA`: mapped to `MotionEvent.BUTTON_FORWARD`.
146 Pressing this button also synthesizes a key press with the key code
147 `KeyEvent.KEYCODE_FORWARD`.
148
149* `BTN_STYLUS`: mapped to `MotionEvent.BUTTON_SECONDARY`.
150
151* `BTN_STYLUS2`: mapped to `MotionEvent.BUTTON_TERTIARY`.
152
153## Tools and Tool Types ##
154
155A *tool* is a finger, stylus or other apparatus that is used to interact with
156the touch device. Some touch devices can distinguish between different
157types of tools.
158
159Elsewhere in Android, as in the `MotionEvent` API, a *tool* is often referred
160to as a *pointer*.
161
162The following tool types are supported:
163
164* `BTN_TOOL_FINGER` and `MT_TOOL_FINGER`: mapped to `MotionEvent.TOOL_TYPE_FINGER`.
165
166* `BTN_TOOL_PEN` and `MT_TOOL_PEN`: mapped to `MotionEvent.TOOL_TYPE_STYLUS`.
167
168* `BTN_TOOL_RUBBER`: mapped to `MotionEvent.TOOL_TYPE_ERASER`.
169
170* `BTN_TOOL_BRUSH`: mapped to `MotionEvent.TOOL_TYPE_STYLUS`.
171
172* `BTN_TOOL_PENCIL`: mapped to `MotionEvent.TOOL_TYPE_STYLUS`.
173
174* `BTN_TOOL_AIRBRUSH`: mapped to `MotionEvent.TOOL_TYPE_STYLUS`.
175
176* `BTN_TOOL_MOUSE`: mapped to `MotionEvent.TOOL_TYPE_MOUSE`.
177
178* `BTN_TOOL_LENS`: mapped to `MotionEvent.TOOL_TYPE_MOUSE`.
179
180* `BTN_TOOL_DOUBLETAP`, `BTN_TOOL_TRIPLETAP`, and `BTN_TOOL_QUADTAP`:
181 mapped to `MotionEvent.TOOL_TYPE_FINGER`.
182
183## Hovering vs. Touching Tools ##
184
185Tools can either be in contact with the touch device or in range and hovering
186above it. Not all touch devices are able to sense the presence of a tool
187hovering above the touch device. Those that do, such as RF-based stylus digitizers,
188can often detect when the tool is within a limited range of the digitizer.
189
190The `InputReader` component takes care to distinguish touching tools from hovering
191tools. Likewise, touching tools and hovering tools are reported to applications
192in different ways.
193
194Touching tools are reported to applications as touch events
using `MotionEvent.ACTION_DOWN`, `MotionEvent.ACTION_MOVE`, `MotionEvent.ACTION_UP`,
`MotionEvent.ACTION_POINTER_DOWN` and `MotionEvent.ACTION_POINTER_UP`.
197
198Hovering tools are reported to applications as generic motion events using
199`MotionEvent.ACTION_HOVER_ENTER`, `MotionEvent.ACTION_HOVER_MOVE`
200and `MotionEvent.ACTION_HOVER_EXIT`.
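
As a driver-side illustration of the distinction, a stylus digitizer driver might report an
in-range but hovering pen by declaring the tool present while keeping `BTN_TOUCH` and the
pressure at zero. This is a hedged sketch in kernel-driver context, not code from any
particular AOSP driver; the helper name and parameters are assumptions.

    #include <linux/input.h>

    /* Hypothetical kernel-driver helper: report a pen that is in range but
     * hovering. The tool is declared present (BTN_TOOL_PEN = 1) while
     * BTN_TOUCH and pressure remain zero, so the InputReader reports hover
     * events rather than touch events. */
    static void report_pen_hover(struct input_dev *dev, int x, int y, int distance)
    {
        input_report_key(dev, BTN_TOOL_PEN, 1);
        input_report_key(dev, BTN_TOUCH, 0);
        input_report_abs(dev, ABS_X, x);
        input_report_abs(dev, ABS_Y, y);
        input_report_abs(dev, ABS_PRESSURE, 0);
        input_report_abs(dev, ABS_DISTANCE, distance);
        input_sync(dev);
    }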
201
202## Touch Device Driver Requirements ##
203
2041. Touch device drivers should only register axes and key codes for the axes
205 and buttons that they actually support. Registering excess axes or key codes
206 may confuse the device classification algorithm or cause the system to incorrectly
207 detect the capabilities of the device.
208
209 For example, if the device reports the `BTN_TOUCH` key code, the system will
210 assume that `BTN_TOUCH` will always be used to indicate whether the tool is
211 actually touching the screen or is merely in range and hovering.
212
2132. Single-touch devices use the following Linux input events:
214
215 * `ABS_X`: *(REQUIRED)* Reports the X coordinate of the tool.
216
217 * `ABS_Y`: *(REQUIRED)* Reports the Y coordinate of the tool.
218
219 * `ABS_PRESSURE`: *(optional)* Reports the physical pressure applied to the tip
220 of the tool or the signal strength of the touch contact.
221
222 * `ABS_TOOL_WIDTH`: *(optional)* Reports the cross-sectional area or width of the
223 touch contact or of the tool itself.
224
225 * `ABS_DISTANCE`: *(optional)* Reports the distance of the tool from the surface of
226 the touch device.
227
228 * `ABS_TILT_X`: *(optional)* Reports the tilt of the tool from the surface of the
229 touch device along the X axis.
230
231 * `ABS_TILT_Y`: *(optional)* Reports the tilt of the tool from the surface of the
232 touch device along the Y axis.
233
234 * `BTN_TOUCH`: *(REQUIRED)* Indicates whether the tool is touching the device.
235
236 * `BTN_LEFT`, `BTN_RIGHT`, `BTN_MIDDLE`, `BTN_BACK`, `BTN_SIDE`, `BTN_FORWARD`,
237 `BTN_EXTRA`, `BTN_STYLUS`, `BTN_STYLUS2`:
238 *(optional)* Reports [button](#buttons) states.
239
240 * `BTN_TOOL_FINGER`, `BTN_TOOL_PEN`, `BTN_TOOL_RUBBER`, `BTN_TOOL_BRUSH`,
241 `BTN_TOOL_PENCIL`, `BTN_TOOL_AIRBRUSH`, `BTN_TOOL_MOUSE`, `BTN_TOOL_LENS`,
242 `BTN_TOOL_DOUBLETAP`, `BTN_TOOL_TRIPLETAP`, `BTN_TOOL_QUADTAP`:
243 *(optional)* Reports the [tool type](#tools-and-tool-types).
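
    As a hedged illustration (not taken from an actual AOSP driver), a kernel driver using
    the single-touch protocol might report a finger press and release as follows; the helper
    names and the use of `ABS_PRESSURE` are assumptions:

        #include <linux/input.h>

        /* Illustrative single-touch reporting; `dev` is a registered input_dev
         * whose ABS_X / ABS_Y / ABS_PRESSURE ranges were declared with
         * input_set_abs_params() at probe time. */
        static void report_finger_down(struct input_dev *dev, int x, int y, int pressure)
        {
            input_report_key(dev, BTN_TOUCH, 1);
            input_report_abs(dev, ABS_X, x);
            input_report_abs(dev, ABS_Y, y);
            input_report_abs(dev, ABS_PRESSURE, pressure);
            input_sync(dev);
        }

        static void report_finger_up(struct input_dev *dev)
        {
            input_report_key(dev, BTN_TOUCH, 0);
            input_report_abs(dev, ABS_PRESSURE, 0);
            input_sync(dev);
        }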
244
2453. Multi-touch devices use the following Linux input events:
246
247 * `ABS_MT_POSITION_X`: *(REQUIRED)* Reports the X coordinate of the tool.
248
249 * `ABS_MT_POSITION_Y`: *(REQUIRED)* Reports the Y coordinate of the tool.
250
251 * `ABS_MT_PRESSURE`: *(optional)* Reports the physical pressure applied to the
252 tip of the tool or the signal strength of the touch contact.
253
254 * `ABS_MT_TOUCH_MAJOR`: *(optional)* Reports the cross-sectional area of the
255 touch contact, or the length of the longer dimension of the touch contact.
256
257 * `ABS_MT_TOUCH_MINOR`: *(optional)* Reports the length of the shorter dimension of the
258 touch contact. This axis should not be used if `ABS_MT_TOUCH_MAJOR` is reporting an
259 area measurement.
260
261 * `ABS_MT_WIDTH_MAJOR`: *(optional)* Reports the cross-sectional area of the tool itself,
262 or the length of the longer dimension of the tool itself.
263 This axis should not be used if the dimensions of the tool itself are unknown.
264
265 * `ABS_MT_WIDTH_MINOR`: *(optional)* Reports the length of the shorter dimension of
266 the tool itself. This axis should not be used if `ABS_MT_WIDTH_MAJOR` is reporting
267 an area measurement or if the dimensions of the tool itself are unknown.
268
269 * `ABS_MT_ORIENTATION`: *(optional)* Reports the orientation of the tool.
270
271 * `ABS_MT_DISTANCE`: *(optional)* Reports the distance of the tool from the
272 surface of the touch device.
273
274 * `ABS_MT_TOOL_TYPE`: *(optional)* Reports the [tool type](#tools-and-tool-types) as
275 `MT_TOOL_FINGER` or `MT_TOOL_PEN`.
276
277 * `ABS_MT_TRACKING_ID`: *(optional)* Reports the tracking id of the tool.
278 The tracking id is an arbitrary non-negative integer that is used to identify
279 and track each tool independently when multiple tools are active. For example,
280 when multiple fingers are touching the device, each finger should be assigned a distinct
281 tracking id that is used as long as the finger remains in contact. Tracking ids
282 may be reused when their associated tools move out of range.
283
284 * `ABS_MT_SLOT`: *(optional)* Reports the slot id of the tool, when using the Linux
285 multi-touch protocol 'B'. Refer to the Linux multi-touch protocol documentation
286 for more details.
287
288 * `BTN_TOUCH`: *(REQUIRED)* Indicates whether the tool is touching the device.
289
290 * `BTN_LEFT`, `BTN_RIGHT`, `BTN_MIDDLE`, `BTN_BACK`, `BTN_SIDE`, `BTN_FORWARD`,
291 `BTN_EXTRA`, `BTN_STYLUS`, `BTN_STYLUS2`:
292 *(optional)* Reports [button](#buttons) states.
293
294 * `BTN_TOOL_FINGER`, `BTN_TOOL_PEN`, `BTN_TOOL_RUBBER`, `BTN_TOOL_BRUSH`,
295 `BTN_TOOL_PENCIL`, `BTN_TOOL_AIRBRUSH`, `BTN_TOOL_MOUSE`, `BTN_TOOL_LENS`,
296 `BTN_TOOL_DOUBLETAP`, `BTN_TOOL_TRIPLETAP`, `BTN_TOOL_QUADTAP`:
297 *(optional)* Reports the [tool type](#tools-and-tool-types).
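
    The following kernel-driver sketch shows how these events might be reported using the
    multi-touch protocol 'B' helpers from `<linux/input/mt.h>`. It assumes the device was
    registered with `input_mt_init_slots()`; the helper names and structure are illustrative
    only:

        #include <linux/input.h>
        #include <linux/input/mt.h>

        /* Illustrative protocol 'B' reporting. One hardware scan updates each
         * slot whose state changed and then finishes the frame. */
        static void report_contact(struct input_dev *dev, int slot,
                                   bool down, int x, int y, int pressure)
        {
            input_mt_slot(dev, slot);
            input_mt_report_slot_state(dev, MT_TOOL_FINGER, down);
            if (down) {
                input_report_abs(dev, ABS_MT_POSITION_X, x);
                input_report_abs(dev, ABS_MT_POSITION_Y, y);
                input_report_abs(dev, ABS_MT_PRESSURE, pressure);
            }
        }

        static void finish_frame(struct input_dev *dev)
        {
            input_mt_sync_frame(dev);  /* pointer emulation, e.g. BTN_TOUCH */
            input_sync(dev);           /* SYN_REPORT: end of this scan */
        }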
298
4. If axes for both the single-touch and multi-touch protocols are defined, then
300 only the multi-touch axes will be used and the single-touch axes will be ignored.
301
3025. The minimum and maximum values of the `ABS_X`, `ABS_Y`, `ABS_MT_POSITION_X`
303 and `ABS_MT_POSITION_Y` axes define the bounds of the active area of the device
304 in device-specific surface units. In the case of a touch screen, the active area
305 describes the part of the touch device that actually covers the display.
306
307 For a touch screen, the system automatically interpolates the reported touch
308 positions in surface units to obtain touch positions in display pixels according
309 to the following calculation:
310
311 displayX = (x - minX) * displayWidth / (maxX - minX + 1)
312 displayY = (y - minY) * displayHeight / (maxY - minY + 1)
313
314 A touch screen may report touches outside of the reported active area.
315
316 Touches that are initiated outside the active area are not delivered to applications
317 but may be used for virtual keys.
318
319 Touches that are initiated inside the active area, or that enter and exit the display
320 area are delivered to applications. Consequently, if a touch starts within the
321 bounds of an application and then moves outside of the active area, the application
322 may receive touch events with display coordinates that are negative or beyond the
323 bounds of the display. This is expected behavior.
324
325 A touch device should never clamp touch coordinates to the bounds of the active
326 area. If a touch exits the active area, it should be reported as being outside of
327 the active area, or it should not be reported at all.
328
329 For example, if the user's finger is touching near the top-left corner of the
330 touch screen, it may report a coordinate of (minX, minY). If the finger continues
331 to move further outside of the active area, the touch screen should either start
332 reporting coordinates with components less than minX and minY, such as
333 (minX - 2, minY - 3), or it should stop reporting the touch altogether.
334 In other words, the touch screen should *not* be reporting (minX, minY)
335 when the user's finger is really touching outside of the active area.
336
337 Clamping touch coordinates to the display edge creates an artificial
338 hard boundary around the edge of the screen which prevents the system from
339 smoothly tracking motions that enter or exit the bounds of the display area.
340
3416. The values reported by `ABS_PRESSURE` or `ABS_MT_PRESSURE`, if they
342 are reported at all, must be non-zero when the tool is touching the device
343 and zero otherwise to indicate that the tool is hovering.
344
345 Reporting pressure information is *optional* but strongly recommended.
346 Applications can use pressure information to implement pressure-sensitive drawing
347 and other effects.
348
3497. The values reported by `ABS_TOOL_WIDTH`, `ABS_MT_TOUCH_MAJOR`, `ABS_MT_TOUCH_MINOR`,
350 `ABS_MT_WIDTH_MAJOR`, or `ABS_MT_WIDTH_MINOR` should be non-zero when the tool
351 is touching the device and zero otherwise, but this is not required.
352 For example, the touch device may be able to measure the size of finger touch
353 contacts but not stylus touch contacts.
354
355 Reporting size information is *optional* but strongly recommended.
    Applications can use size information to implement size-sensitive drawing
357 and other effects.
358
3598. The values reported by `ABS_DISTANCE` or `ABS_MT_DISTANCE` should approach
360 zero when the tool is touching the device. The distance may remain non-zero
361 even when the tool is in direct contact. The exact values reported depend
362 on the manner in which the hardware measures distance.
363
364 Reporting distance information is *optional* but recommended for
365 stylus devices.
366
3679. The values reported by `ABS_TILT_X` and `ABS_TILT_Y` should be zero when the
368 tool is perpendicular to the device. A non-zero tilt is taken as an indication
369 that the tool is held at an incline.
370
371 The tilt angles along the X and Y axes are assumed to be specified in degrees
372 from perpendicular. The center point (perfectly perpendicular) is given
373 by `(max + min) / 2` for each axis. Values smaller than the center point
374 represent a tilt up or to the left, values larger than the center point
375 represent a tilt down or to the right.
376
377 The `InputReader` converts the X and Y tilt components into a perpendicular
378 tilt angle ranging from 0 to `PI / 2` radians and a planar orientation angle
379 ranging from `-PI` to `PI` radians. This representation results in a
380 description of orientation that is compatible with what is used to describe
381 finger touches.
382
383 Reporting tilt information is *optional* but recommended for stylus devices.
384
10. If the tool type is reported by `ABS_MT_TOOL_TYPE`, it will supersede any tool
386 type information reported by `BTN_TOOL_*`.
387 If no tool type information is available at all, the tool type defaults to
388 `MotionEvent.TOOL_TYPE_FINGER`.
389
39011. A tool is determined to be active based on the following conditions:
391
392 * When using the single-touch protocol, the tool is active if `BTN_TOUCH`,
393 or `BTN_TOOL_*` is 1.
394
395 This condition implies that the `InputReader` needs to have at least some
396 information about the nature of the tool, either whether it is touching,
397 or at least its tool type. If no information is available,
398 then the tool is assumed to be inactive (out of range).
399
400 * When using the multi-touch protocol 'A', the tool is active whenever it
401 appears in the most recent sync report. When the tool stops appearing in
402 sync reports, it ceases to exist.
403
404 * When using the multi-touch protocol 'B', the tool is active as long as
      it has an active slot. When the slot is cleared, the tool ceases to exist.
406
40712. A tool is determined to be hovering based on the following conditions:
408
409 * If the tool is `BTN_TOOL_MOUSE` or `BTN_TOOL_LENS`, then the tool
      is not hovering, even if either of the following conditions is true.
411
412 * If the tool is active and the driver reports pressure information,
413 and the reported pressure is zero, then the tool is hovering.
414
415 * If the tool is active and the driver supports the `BTN_TOUCH` key code and
416 `BTN_TOUCH` has a value of zero, then the tool is hovering.
417
41813. The `InputReader` supports both multi-touch protocol 'A' and 'B'. New drivers
419 should use the 'B' protocol but either will work.
420
42114. **As of Android Ice Cream Sandwich 4.0, touch screen drivers may need to be changed
422 to comply with the Linux input protocol specification.**
423
424 The following changes may be required:
425
426 * When a tool becomes inactive (finger goes "up"), it should stop appearing
427 in subsequent multi-touch sync reports. When all tools become inactive
428 (all fingers go "up"), the driver should send an empty sync report packet,
429 such as `SYN_MT_REPORT` followed by `SYN_REPORT`.
430
431 Previous versions of Android expected "up" events to be reported by sending
432 a pressure value of 0. The old behavior was incompatible with the
433 Linux input protocol specification and is no longer supported.
434
435 * Physical pressure or signal strength information should be reported using
436 `ABS_MT_PRESSURE`.
437
438 Previous versions of Android retrieved pressure information from
439 `ABS_MT_TOUCH_MAJOR`. The old behavior was incompatible with the
440 Linux input protocol specification and is no longer supported.
441
442 * Touch size information should be reported using `ABS_MT_TOUCH_MAJOR`.
443
444 Previous versions of Android retrieved size information from
445 `ABS_MT_TOOL_MAJOR`. The old behavior was incompatible with the
446 Linux input protocol specification and is no longer supported.
447
448 Touch device drivers no longer need Android-specific customizations.
449 By relying on the standard Linux input protocol, Android can support a
450 wider variety of touch peripherals, such as external HID multi-touch
451 touch screens, using unmodified drivers.
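
    As a concrete illustration of the first change listed above, a protocol 'A' driver might
    report the moment when all fingers go up as an empty sync packet. This is a hypothetical
    fragment, not code from an actual driver:

        #include <linux/input.h>

        /* Hypothetical protocol 'A' fragment: when the last contact lifts, send
         * an empty sync packet instead of a zero-pressure contact. */
        static void report_all_fingers_up(struct input_dev *dev)
        {
            input_mt_sync(dev);   /* SYN_MT_REPORT with no contact data */
            input_sync(dev);      /* SYN_REPORT */
        }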
452
453## Touch Device Operation ##
454
The following is a brief summary of touch device operation on Android.
456
4571. The `EventHub` reads raw events from the `evdev` driver.
458
4592. The `InputReader` consumes the raw events and updates internal state about
460 the position and other characteristics of each tool. It also tracks
461 button states.
462
4633. If the BACK or FORWARD buttons were pressed or released, the `InputReader`
464 notifies the `InputDispatcher` about the key event.
465
4664. The `InputReader` determines whether a virtual key press occurred. If so,
467 it notifies the `InputDispatcher` about the key event.
468
4695. The `InputReader` determines whether the touch was initiated within the
470 bounds of the display. If so, it notifies the `InputDispatcher` about
471 the touch event.
472
4736. If there are no touching tools but there is at least one hovering tool,
474 the `InputReader` notifies the `InputDispatcher` about the hover event.
475
4767. If the touch device type is *pointer*, the `InputReader` performs pointer
477 gesture detection, moves the pointer and spots accordingly and notifies
478 the `InputDispatcher` about the pointer event.
479
4808. The `InputDispatcher` uses the `WindowManagerPolicy` to determine whether
481 the events should be dispatched and whether they should wake the device.
482 Then, the `InputDispatcher` delivers the events to the appropriate applications.
483
484## Touch Device Configuration ##
485
486Touch device behavior is determined by the device's axes, buttons, input properties,
487input device configuration, virtual key map and key layout.
488
489Refer to the following sections for more details about the files that
participate in touch device configuration:
491
492* [Input Device Configuration Files](/tech/input/input-device-configuration-files.html)
493* [Virtual Key Map Files](#virtual-key-map-files)
494
495### Properties ###
496
497The system relies on many input device configuration properties to configure
498and calibrate touch device behavior.
499
500One reason for this is that the device drivers for touch devices often report
501the characteristics of touches using device-specific units.
502
503For example, many touch devices measure the touch contact area
504using an internal device-specific scale, such as the total number of
505sensor nodes that were triggered by the touch. This raw size value would
not be meaningful to applications because they would need to know about the
507physical size and other characteristics of the touch device sensor nodes.
508
509The system uses calibration parameters encoded in input device configuration
510files to decode, transform, and normalize the values reported by the touch
511device into a simpler standard representation that applications can understand.
512
513### Documentation Conventions ###
514
515For documentation purposes, we will use the following conventions to describe
516the values used by the system during the calibration process.
517
518#### Raw Axis Values ####
519
520The following expressions denote the raw values reported by the touch
521device driver as `EV_ABS` events.
522
523`raw.x`
524: The value of the `ABS_X` or `ABS_MT_POSITION_X` axis.
525
526`raw.y`
527: The value of the `ABS_Y` or `ABS_MT_POSITION_Y` axis.
528
529`raw.pressure`
530: The value of the `ABS_PRESSURE` or `ABS_MT_PRESSURE` axis, or 0 if not available.
531
532`raw.touchMajor`
533: The value of the `ABS_MT_TOUCH_MAJOR` axis, or 0 if not available.
534
535`raw.touchMinor`
536: The value of the `ABS_MT_TOUCH_MINOR` axis, or `raw.touchMajor` if not available.
537
538`raw.toolMajor`
539: The value of the `ABS_TOOL_WIDTH` or `ABS_MT_WIDTH_MAJOR` axis, or 0 if not available.
540
541`raw.toolMinor`
542: The value of the `ABS_MT_WIDTH_MINOR` axis, or `raw.toolMajor` if not available.
543
544`raw.orientation`
545: The value of the `ABS_MT_ORIENTATION` axis, or 0 if not available.
546
547`raw.distance`
548: The value of the `ABS_DISTANCE` or `ABS_MT_DISTANCE` axis, or 0 if not available.
549
550`raw.tiltX`
551: The value of the `ABS_TILT_X` axis, or 0 if not available.
552
553`raw.tiltY`
554: The value of the `ABS_TILT_Y` axis, or 0 if not available.
555
556#### Raw Axis Ranges ####
557
558The following expressions denote the bounds of raw values. They are obtained
559by calling `EVIOCGABS` ioctl for each axis.
560
561`raw.*.min`
562: The inclusive minimum value of the raw axis.
563
564`raw.*.max`
565: The inclusive maximum value of the raw axis.
566
567`raw.*.range`
568: Equivalent to `raw.*.max - raw.*.min`.
569
570`raw.*.fuzz`
: The accuracy of the raw axis, e.g. fuzz = 1 implies values are accurate to +/- 1 unit.
572
573`raw.width`
574: The inclusive width of the touch area, equivalent to `raw.x.range + 1`.
575
576`raw.height`
577: The inclusive height of the touch area, equivalent to `raw.y.range + 1`.
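
For illustration, the raw range of an axis can be queried from user space with the
`EVIOCGABS` ioctl, which fills a `struct input_absinfo`. This sketch assumes `fd` is an
open evdev device node and omits error handling:

    #include <sys/ioctl.h>
    #include <linux/input.h>

    /* Obtain the raw range of the X axis with the EVIOCGABS ioctl. */
    static int getRawWidth(int fd)
    {
        struct input_absinfo info;
        ioctl(fd, EVIOCGABS(ABS_MT_POSITION_X), &info);
        /* raw.x.min = info.minimum, raw.x.max = info.maximum,
         * raw.x.fuzz = info.fuzz */
        return info.maximum - info.minimum + 1;   /* raw.width */
    }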
578
579#### Output Ranges ####
580
581The following expressions denote the characteristics of the output coordinate system.
582The system uses linear interpolation to translate touch position information from
583the surface units used by the touch device into the output units that will
be reported to applications, such as display pixels.
585
586`output.width`
587: The output width. For touch screens (associated with a display), this
588 is the display width in pixels. For touch pads (not associated with a display),
589 the output width equals `raw.width`, indicating that no interpolation will
590 be performed.
591
592`output.height`
593: The output height. For touch screens (associated with a display), this
594 is the display height in pixels. For touch pads (not associated with a display),
595 the output height equals `raw.height`, indicating that no interpolation will
596 be performed.
597
598`output.diag`
599: The diagonal length of the output coordinate system, equivalent to
  `sqrt(output.width^2 + output.height^2)`.
601
602### Basic Configuration ###
603
604The touch input mapper uses many configuration properties in the input device
605configuration file to specify calibration values. The following table describes
606some general purpose configuration properties. All other properties are described
607in the following sections along with the fields they are used to calibrate.
608
609#### `touch.deviceType` ####
610
611*Definition:* `touch.deviceType` = `touchScreen` | `touchPad` | `pointer` | `default`
612
613Specifies the touch device type.
614
615* If the value is `touchScreen`, the touch device is a touch screen associated
616 with a display.
617
618* If the value is `touchPad`, the touch device is a touch pad not associated
619 with a display.
620
621* If the value is `pointer`, the touch device is a touch pad not associated
622 with a display, and its motions are used for
623 [indirect multi-touch pointer gestures](#indirect-multi-touch-pointer-gestures).
624
625* If the value is `default`, the system automatically detects the device type
626 according to the classification algorithm.
627
628Refer to the [Classification](#touch-device-classification) section for more details
629about how the device type influences the behavior of the touch device.
630
631Prior to Honeycomb, all touch devices were assumed to be touch screens.
632
633#### `touch.orientationAware` ####
634
635*Definition:* `touch.orientationAware` = `0` | `1`
636
637Specifies whether the touch device should react to display orientation changes.
638
639* If the value is `1`, touch positions reported by the touch device are rotated
640 whenever the display orientation changes.
641
642* If the value is `0`, touch positions reported by the touch device are immune
643 to display orientation changes.
644
645The default value is `1` if the device is a touch screen, `0` otherwise.
646
647The system distinguishes between internal and external touch screens and displays.
648An orientation aware internal touch screen is rotated based on the orientation
649of the internal display. An orientation aware external touch screen is rotated
650based on the orientation of the external display.
651
652Orientation awareness is used to support rotation of touch screens on devices
653like the Nexus One. For example, when the device is rotated clockwise 90 degrees
654from its natural orientation, the absolute positions of touches are remapped such
655that a touch in the top-left corner of the touch screen's absolute coordinate system
656is reported as a touch in the top-left corner of the display's rotated coordinate system.
657This is done so that touches are reported with the same coordinate system that
658applications use to draw their visual elements.
659
660Prior to Honeycomb, all touch devices were assumed to be orientation aware.
661
662#### `touch.gestureMode` ####
663
664*Definition:* `touch.gestureMode` = `pointer` | `spots` | `default`
665
666Specifies the presentation mode for pointer gestures. This configuration property
667is only relevant when the touch device is of type *pointer*.
668
669* If the value is `pointer`, the touch pad gestures are presented by way of a cursor
670 similar to a mouse pointer.
671
672* If the value is `spots`, the touch pad gestures are presented by an anchor
673 that represents the centroid of the gesture and a set of circular spots
674 that represent the position of individual fingers.
675
676The default value is `pointer` when the `INPUT_PROP_SEMI_MT` input property
677is set, or `spots` otherwise.
678
679### `X` and `Y` Fields ###
680
681The X and Y fields provide positional information for the center of the contact area.
682
683#### Calculation ####
684
685The calculation is straightforward: positional information from the touch driver is
686linearly interpolated to the output coordinate system.
687
688 xScale = output.width / raw.width
689 yScale = output.height / raw.height
690
691 If not orientation aware or screen rotation is 0 degrees:
692 output.x = (raw.x - raw.x.min) * xScale
693 output.y = (raw.y - raw.y.min) * yScale
694 Else If rotation is 90 degrees:
695 output.x = (raw.y - raw.y.min) * yScale
696 output.y = (raw.x.max - raw.x) * xScale
697 Else If rotation is 180 degrees:
698 output.x = (raw.x.max - raw.x) * xScale
699 output.y = (raw.y.max - raw.y) * yScale
700 Else If rotation is 270 degrees:
701 output.x = (raw.y.max - raw.y) * yScale
702 output.y = (raw.x - raw.x.min) * xScale
703 End If
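
For illustration only, the mapping above translates directly into C. The function name and
parameter list are assumptions; the real implementation lives in the `InputReader`:

    /* Direct C translation of the mapping above. Rotation is 0, 90, 180, or
     * 270 degrees; the other values are the raw and output ranges defined in
     * the Documentation Conventions section. */
    static void mapPosition(float rawX, float rawY,
                            float rawXMin, float rawXMax,
                            float rawYMin, float rawYMax,
                            float outputWidth, float outputHeight,
                            int orientationAware, int rotation,
                            float *outX, float *outY)
    {
        float xScale = outputWidth / (rawXMax - rawXMin + 1);
        float yScale = outputHeight / (rawYMax - rawYMin + 1);

        if (!orientationAware || rotation == 0) {
            *outX = (rawX - rawXMin) * xScale;
            *outY = (rawY - rawYMin) * yScale;
        } else if (rotation == 90) {
            *outX = (rawY - rawYMin) * yScale;
            *outY = (rawXMax - rawX) * xScale;
        } else if (rotation == 180) {
            *outX = (rawXMax - rawX) * xScale;
            *outY = (rawYMax - rawY) * yScale;
        } else { /* 270 degrees */
            *outX = (rawYMax - rawY) * yScale;
            *outY = (rawX - rawXMin) * xScale;
        }
    }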
704
705### `TouchMajor`, `TouchMinor`, `ToolMajor`, `ToolMinor`, `Size` Fields ###
706
707The `TouchMajor` and `TouchMinor` fields describe the approximate dimensions
708of the contact area in output units (pixels).
709
710The `ToolMajor` and `ToolMinor` fields describe the approximate dimensions
711of the [tool](#tools-and-tool-types) itself in output units (pixels).
712
713The `Size` field describes the normalized size of the touch relative to
714the largest possible touch that the touch device can sense. The smallest
715possible normalized size is 0.0 (no contact, or it is unmeasurable), and the largest
716possible normalized size is 1.0 (sensor area is saturated).
717
718When both the approximate length and breadth can be measured, then the `TouchMajor` field
719specifies the longer dimension and the `TouchMinor` field specifies the shorter dimension
720of the contact area. When only the approximate diameter of the contact area can be measured,
721then the `TouchMajor` and `TouchMinor` fields will be equal.
722
723Likewise, the `ToolMajor` field specifies the longer dimension and the `ToolMinor`
724field specifies the shorter dimension of the tool's cross-sectional area.
725
726If the touch size is unavailable but the tool size is available, then the tool size
727will be set equal to the touch size. Conversely, if the tool size is unavailable
728but the touch size is available, then the touch size will be set equal to the tool size.
729
730Touch devices measure or report the touch size and tool size in various ways.
731The current implementation supports three different kinds of measurements:
732diameter, area, and geometric bounding box in surface units.
733
734#### `touch.size.calibration` ####
735
736*Definition:* `touch.size.calibration` = `none` | `geometric` | `diameter`
737| `area` | `default`
738
739Specifies the kind of measurement used by the touch driver to report the
740touch size and tool size.
741
742* If the value is `none`, the size is set to zero.
743
744* If the value is `geometric`, the size is assumed to be specified in the same
745 surface units as the position, so it is scaled in the same manner.
746
747* If the value is `diameter`, the size is assumed to be proportional to
748 the diameter (width) of the touch or tool.
749
750* If the value is `area`, the size is assumed to be proportional to the
751 area of the touch or tool.
752
753* If the value is `default`, the system uses the `geometric` calibration if the
754 `raw.touchMajor` or `raw.toolMajor` axis is available, otherwise it uses
755 the `none` calibration.
756
757#### `touch.size.scale` ####
758
759*Definition:* `touch.size.scale` = &lt;a non-negative floating point number&gt;
760
761Specifies a constant scale factor used in the calibration.
762
763The default value is `1.0`.
764
765#### `touch.size.bias` ####
766
767*Definition:* `touch.size.bias` = &lt;a non-negative floating point number&gt;
768
769Specifies a constant bias value used in the calibration.
770
771The default value is `0.0`.
772
773#### `touch.size.isSummed` ####
774
775*Definition:* `touch.size.isSummed` = `0` | `1`
776
777Specifies whether the size is reported as the sum of the sizes of all
778active contacts, or is reported individually for each contact.
779
780* If the value is `1`, the reported size will be divided by the number
781 of contacts prior to use.
782
783* If the value is `0`, the reported size will be used as is.
784
785The default value is `0`.
786
Some touch devices, particularly "Semi-MT" devices, cannot distinguish the
individual dimensions of multiple contacts, so they report a size measurement
789that represents their total area or width. This property should only be set to
790`1` for such devices. If in doubt, set this value to `0`.
791
792#### Calculation ####
793
794The calculation of the `TouchMajor`, `TouchMinor`, `ToolMajor`, `ToolMinor`
795and `Size` fields depends on the specified calibration parameters.
796
797 If raw.touchMajor and raw.toolMajor are available:
798 touchMajor = raw.touchMajor
799 touchMinor = raw.touchMinor
800 toolMajor = raw.toolMajor
801 toolMinor = raw.toolMinor
802 Else If raw.touchMajor is available:
803 toolMajor = touchMajor = raw.touchMajor
804 toolMinor = touchMinor = raw.touchMinor
805 Else If raw.toolMajor is available:
806 touchMajor = toolMajor = raw.toolMajor
807 touchMinor = toolMinor = raw.toolMinor
808 Else
809 touchMajor = toolMajor = 0
810 touchMinor = toolMinor = 0
811 size = 0
812 End If
813
    size = average(touchMajor, touchMinor)
815
816 If touch.size.isSummed == 1:
817 touchMajor = touchMajor / numberOfActiveContacts
818 touchMinor = touchMinor / numberOfActiveContacts
819 toolMajor = toolMajor / numberOfActiveContacts
820 toolMinor = toolMinor / numberOfActiveContacts
821 size = size / numberOfActiveContacts
822 End If
823
824 If touch.size.calibration == "none":
825 touchMajor = toolMajor = 0
826 touchMinor = toolMinor = 0
827 size = 0
828 Else If touch.size.calibration == "geometric":
829 outputScale = average(output.width / raw.width, output.height / raw.height)
830 touchMajor = touchMajor * outputScale
831 touchMinor = touchMinor * outputScale
832 toolMajor = toolMajor * outputScale
833 toolMinor = toolMinor * outputScale
834 Else If touch.size.calibration == "area":
835 touchMajor = sqrt(touchMajor)
836 touchMinor = touchMajor
837 toolMajor = sqrt(toolMajor)
838 toolMinor = toolMajor
839 Else If touch.size.calibration == "diameter":
840 touchMinor = touchMajor
841 toolMinor = toolMajor
842 End If
843
844 If touchMajor != 0:
845 output.touchMajor = touchMajor * touch.size.scale + touch.size.bias
846 Else
847 output.touchMajor = 0
848 End If
849
850 If touchMinor != 0:
851 output.touchMinor = touchMinor * touch.size.scale + touch.size.bias
852 Else
853 output.touchMinor = 0
854 End If
855
856 If toolMajor != 0:
857 output.toolMajor = toolMajor * touch.size.scale + touch.size.bias
858 Else
859 output.toolMajor = 0
860 End If
861
862 If toolMinor != 0:
863 output.toolMinor = toolMinor * touch.size.scale + touch.size.bias
864 Else
865 output.toolMinor = 0
866 End If
867
868 output.size = size
869
870### `Pressure` Field ###
871
872The `Pressure` field describes the approximate physical pressure applied to the
873touch device as a normalized value between 0.0 (no touch) and 1.0 (full force).
874
875A zero pressure indicates that the tool is hovering.
876
877#### `touch.pressure.calibration` ####
878
879*Definition:* `touch.pressure.calibration` = `none` | `physical` | `amplitude` | `default`
880
881Specifies the kind of measurement used by the touch driver to report the pressure.
882
883* If the value is `none`, the pressure is unknown so it is set to 1.0 when
884 touching and 0.0 when hovering.
885
886* If the value is `physical`, the pressure axis is assumed to measure the actual
887 physical intensity of pressure applied to the touch pad.
888
889* If the value is `amplitude`, the pressure axis is assumed to measure the signal
890 amplitude, which is related to the size of the contact and the pressure applied.
891
892* If the value is `default`, the system uses the `physical` calibration if the
  pressure axis is available, otherwise uses `none`.
894
895#### `touch.pressure.scale` ####
896
897*Definition:* `touch.pressure.scale` = &lt;a non-negative floating point number&gt;
898
899Specifies a constant scale factor used in the calibration.
900
901The default value is `1.0 / raw.pressure.max`.
902
903#### Calculation ####
904
905The calculation of the `Pressure` field depends on the specified calibration parameters.
906
907 If touch.pressure.calibration == "physical" or "amplitude":
908 output.pressure = raw.pressure * touch.pressure.scale
909 Else
910 If hovering:
911 output.pressure = 0
912 Else
913 output.pressure = 1
914 End If
915 End If
916
917### `Orientation` and `Tilt` Fields ###
918
919The `Orientation` field describes the orientation of the touch and tool as an
920angular measurement. An orientation of `0` indicates that the major axis is
oriented vertically, `-PI/2` indicates that the major axis is oriented to the left,
and `PI/2` indicates that the major axis is oriented to the right. When a stylus
923tool is present, the orientation range may be described in a full circle range
from `-PI` to `PI`.
925
926The `Tilt` field describes the inclination of the tool as an angular measurement.
927A tilt of `0` indicates that the tool is perpendicular to the surface.
928A tilt of `PI/2` indicates that the tool is flat on the surface.
929
930#### `touch.orientation.calibration` ####
931
932*Definition:* `touch.orientation.calibration` = `none` | `interpolated` | `vector` | `default`
933
934Specifies the kind of measurement used by the touch driver to report the orientation.
935
936* If the value is `none`, the orientation is unknown so it is set to 0.
937
938* If the value is `interpolated`, the orientation is linearly interpolated such that a
939 raw value of `raw.orientation.min` maps to `-PI/2` and a raw value of
940 `raw.orientation.max` maps to `PI/2`. The center value of
941 `(raw.orientation.min + raw.orientation.max) / 2` maps to `0`.
942
* If the value is `vector`, the orientation is interpreted as a packed vector consisting
944 of two signed 4-bit fields. This representation is used on Atmel Object Based Protocol
945 parts. When decoded, the vector yields an orientation angle and confidence
946 magnitude. The confidence magnitude is used to scale the size information,
947 unless it is geometric.
948
949* If the value is `default`, the system uses the `interpolated` calibration if the
  orientation axis is available, otherwise uses `none`.
951
952#### Calculation ####
953
954The calculation of the `Orientation` and `Tilt` fields depends on the specified
955calibration parameters and available input.
956
    If raw.tiltX and raw.tiltY are available:
        tiltXCenter = average(raw.tiltX.min, raw.tiltX.max)
        tiltYCenter = average(raw.tiltY.min, raw.tiltY.max)
        tiltXAngle = (raw.tiltX - tiltXCenter) * PI / 180
        tiltYAngle = (raw.tiltY - tiltYCenter) * PI / 180
        output.orientation = atan2(-sin(tiltXAngle), sin(tiltYAngle))
        output.tilt = acos(cos(tiltXAngle) * cos(tiltYAngle))
    Else If touch.orientation.calibration == "interpolated":
        center = average(raw.orientation.min, raw.orientation.max)
        output.orientation = (raw.orientation - center) * PI / (raw.orientation.max - raw.orientation.min)
967 output.tilt = 0
968 Else If touch.orientation.calibration == "vector":
969 c1 = (raw.orientation & 0xF0) >> 4
970 c2 = raw.orientation & 0x0F
971
972 If c1 != 0 or c2 != 0:
973 If c1 >= 8 Then c1 = c1 - 16
974 If c2 >= 8 Then c2 = c2 - 16
975 angle = atan2(c1, c2) / 2
976 confidence = sqrt(c1*c1 + c2*c2)
977
978 output.orientation = angle
979
980 If touch.size.calibration == "diameter" or "area":
981 scale = 1.0 + confidence / 16
982 output.touchMajor *= scale
983 output.touchMinor /= scale
984 output.toolMajor *= scale
985 output.toolMinor /= scale
986 End If
987 Else
988 output.orientation = 0
989 End If
990 output.tilt = 0
991 Else
992 output.orientation = 0
993 output.tilt = 0
994 End If
995
996 If orientation aware:
997 If screen rotation is 90 degrees:
998 output.orientation = output.orientation - PI / 2
999 Else If screen rotation is 270 degrees:
1000 output.orientation = output.orientation + PI / 2
1001 End If
1002 End If
1003
1004### `Distance` Field ###
1005
1006The `Distance` field describes the distance between the tool and the touch device
1007surface. A value of 0.0 indicates direct contact and larger values indicate
1008increasing distance from the surface.
1009
1010#### `touch.distance.calibration` ####
1011
1012*Definition:* `touch.distance.calibration` = `none` | `scaled` | `default`
1013
1014Specifies the kind of measurement used by the touch driver to report the distance.
1015
1016* If the value is `none`, the distance is unknown so it is set to 0.
1017
1018* If the value is `scaled`, the reported distance is multiplied by a
1019 constant scale factor.
1020
1021* If the value is `default`, the system uses the `scaled` calibration if the
  distance axis is available, otherwise uses `none`.
1023
1024#### `touch.distance.scale` ####
1025
1026*Definition:* `touch.distance.scale` = &lt;a non-negative floating point number&gt;
1027
1028Specifies a constant scale factor used in the calibration.
1029
1030The default value is `1.0`.
1031
1032#### Calculation ####
1033
1034The calculation of the `Distance` field depends on the specified calibration parameters.
1035
1036 If touch.distance.calibration == "scaled":
1037 output.distance = raw.distance * touch.distance.scale
1038 Else
1039 output.distance = 0
1040 End If
1041
1042### Example ###
1043
1044 # Input device configuration file for a touch screen that supports pressure,
1045 # size and orientation. The pressure and size scale factors were obtained
1046 # by measuring the characteristics of the device itself and deriving
1047 # useful approximations based on the resolution of the touch sensor and the
1048 # display.
1049 #
1050 # Note that these parameters are specific to a particular device model.
1051 # Different parameters will need to be used for other devices.
1052
1053 # Basic Parameters
1054 touch.deviceType = touchScreen
1055 touch.orientationAware = 1
1056
1057 # Size
1058 # Based on empirical measurements, we estimate the size of the contact
1059 # using size = sqrt(area) * 28 + 0.
1060 touch.size.calibration = area
1061 touch.size.scale = 28
1062 touch.size.bias = 0
1063 touch.size.isSummed = 0
1064
1065 # Pressure
1066 # Driver reports signal strength as pressure.
1067 #
1068 # A normal index finger touch typically registers about 80 signal strength
1069 # units although we don't expect these values to be accurate.
1070 touch.pressure.calibration = amplitude
1071 touch.pressure.scale = 0.0125
1072
1073 # Orientation
1074 touch.orientation.calibration = vector
1075
1076### Compatibility Notes ###
1077
1078The configuration properties for touch devices changed significantly in
1079Android Ice Cream Sandwich 4.0. **All input device configuration files for touch
1080devices must be updated to use the new configuration properties.**
1081
1082Older touch device [drivers](#touch-device-driver-requirements) may also need to be
1083updated.
1084
1085## Virtual Key Map Files ##
1086
1087Touch devices are often used to implement virtual keys.
1088
1089There are several ways of doing this, depending on the capabilities of the
1090touch controller. Some touch controllers can be directly configured to implement
1091soft keys by setting firmware registers. Other times it is desirable to perform
1092the mapping from touch coordinates to key codes in software.
1093
1094When virtual keys are implemented in software, the kernel must export a virtual key map
1095file called `virtualkeys.<devicename>` as a board property. For example,
if the touch screen device driver reports its name as "touchyfeely" then
1097the virtual key map file must have the path `/sys/board_properties/virtualkeys.touchyfeely`.
1098
1099A virtual key map file describes the coordinates and Linux key codes of virtual keys
1100on the touch screen.
1101
1102In addition to the virtual key map file, there must be a corresponding key layout
1103file and key character map file to map the Linux key codes to Android key codes and
1104to specify the type of the keyboard device (usually `SPECIAL_FUNCTION`).
1105
1106### Syntax ###
1107
1108A virtual key map file is a plain text file consisting of a sequence of virtual key
layout descriptions separated either by newlines or by colons.
1110
1111Comment lines begin with '#' and continue to the end of the line.
1112
1113Each virtual key is described by 6 colon-delimited components:
1114
1115* `0x01`: A version code. Must always be `0x01`.
1116* &lt;Linux key code&gt;: The Linux key code of the virtual key.
1117* &lt;centerX&gt;: The X pixel coordinate of the center of the virtual key.
1118* &lt;centerY&gt;: The Y pixel coordinate of the center of the virtual key.
1119* &lt;width&gt;: The width of the virtual key in pixels.
1120* &lt;height&gt;: The height of the virtual key in pixels.
1121
1122All coordinates and sizes are specified in terms of the display coordinate system.
1123
1124Here is a virtual key map file all written on one line.
1125
1126 # All on one line
1127 0x01:158:55:835:90:55:0x01:139:172:835:125:55:0x01:102:298:835:115:55:0x01:217:412:835:95:55
1128
1129The same virtual key map file can also be written on multiple lines.
1130
1131 # One key per line
1132 0x01:158:55:835:90:55
1133 0x01:139:172:835:125:55
1134 0x01:102:298:835:115:55
1135 0x01:217:412:835:95:55
1136
1137In the above example, the touch screen has a resolution of 480x800. Accordingly, all of
1138the virtual keys have a &lt;centerY&gt; coordinate of 835, which is a little bit below
1139the visible area of the touch screen.
1140
1141The first key has a Linux scan code of `158` (`KEY_BACK`), centerX of `55`,
1142centerY of `835`, width of `90` and height of `55`.
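
For illustration, a single entry in this format could be parsed as follows. This is a
hypothetical sketch, not the parser Android actually uses:

    #include <stdio.h>

    /* Illustrative parser for one virtual key entry in the format described
     * above (version:scanCode:centerX:centerY:width:height). */
    struct VirtualKey {
        int scanCode, centerX, centerY, width, height;
    };

    static int parseVirtualKey(const char *entry, struct VirtualKey *key)
    {
        int version = 0;
        int n = sscanf(entry, "%i:%d:%d:%d:%d:%d", &version, &key->scanCode,
                       &key->centerX, &key->centerY, &key->width, &key->height);
        return n == 6 && version == 0x01;   /* version code must be 0x01 */
    }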
1143
1144### Example ###
1145
1146Virtual key map file: `/sys/board_properties/virtualkeys.touchyfeely`.
1147
1148 0x01:158:55:835:90:55
1149 0x01:139:172:835:125:55
1150 0x01:102:298:835:115:55
1151 0x01:217:412:835:95:55
1152
1153Key layout file: `/system/usr/keylayout/touchyfeely.kl`.
1154
1155 key 158 BACK
1156 key 139 MENU
1157 key 102 HOME
1158 key 217 SEARCH
1159
1160Key character map file: `/system/usr/keychars/touchyfeely.kcm`.
1161
1162 type SPECIAL_FUNCTION
1163
1164## Indirect Multi-touch Pointer Gestures ##
1165
1166In pointer mode, the system interprets the following gestures:
1167
11681. Single finger tap: click.
1169
11702. Single finger motion: move the pointer.
1171
11723. Single finger motion plus button presses: drag the pointer.
1173
4. Two finger motion, both fingers moving in the same direction: drag the area under the pointer
1175 in that direction. The pointer itself does not move.
1176
5. Two finger motion, both fingers moving towards each other or apart in
1178 different directions: pan/scale/rotate the area surrounding the pointer.
1179 The pointer itself does not move.
1180
11816. Multiple finger motion: freeform gesture.
1182
1183## Further Reading ##
1184
11851. [Linux multi-touch protocol](http://www.kernel.org/doc/Documentation/input/multi-touch-protocol.txt)
11862. [ENAC list of available multitouch devices on Linux](http://lii-enac.fr/en/architecture/linux-input/multitouch-devices.html)