Device implementations:
If device implementations include support for third-party Input Method Editor (IME) applications, they:
MUST declare the android.software.input_methods feature flag and MUST support the Input Management Framework.
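For context, the android.software.input_methods flag maps to PackageManager.FEATURE_INPUT_METHODS, and the Input Management Framework is exposed to applications through InputMethodManager. A minimal sketch of how an application can observe both (the helper class and method names are illustrative, not part of the CDD):

    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.view.inputmethod.InputMethodInfo;
    import android.view.inputmethod.InputMethodManager;
    import java.util.List;

    // Illustrative helper: checks the feature flag and lists the enabled IMEs.
    final class ImeSupport {
        // True when the device declares android.software.input_methods.
        static boolean supportsThirdPartyImes(Context context) {
            return context.getPackageManager()
                    .hasSystemFeature(PackageManager.FEATURE_INPUT_METHODS);
        }

        // Enabled input method editors as reported by the Input Management Framework.
        static List<InputMethodInfo> enabledImes(Context context) {
            InputMethodManager imm = (InputMethodManager)
                    context.getSystemService(Context.INPUT_METHOD_SERVICE);
            return imm.getEnabledInputMethodList();
        }
    }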
Device implementations:
Android includes support for d-pad, trackball, and wheel as mechanisms for non-touch navigation.
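Applications can see which of these mechanisms is present through the Configuration.navigation field. A minimal sketch, assuming an application only needs to branch on the navigation type (the helper name is hypothetical):

    import android.content.Context;
    import android.content.res.Configuration;

    // Illustrative helper: reports whether a non-touch navigation mechanism exists.
    final class NavigationType {
        static boolean hasNonTouchNavigation(Context context) {
            int nav = context.getResources().getConfiguration().navigation;
            // NAVIGATION_DPAD, NAVIGATION_TRACKBALL and NAVIGATION_WHEEL are the
            // non-touch mechanisms; NAVIGATION_NONAV means none is present.
            return nav == Configuration.NAVIGATION_DPAD
                    || nav == Configuration.NAVIGATION_TRACKBALL
                    || nav == Configuration.NAVIGATION_WHEEL;
        }
    }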
Device implementations:
If device implementations lack non-touch navigations, they:
The Home, Recents, and Back functions, typically provided via an interaction with a dedicated physical button or a distinct portion of the touch screen, are essential to the Android navigation paradigm and therefore, device implementations:
MUST provide a user affordance to launch installed applications that have an activity with the <intent-filter> set with ACTION=MAIN and CATEGORY=LAUNCHER, or CATEGORY=LEANBACK_LAUNCHER for Television device implementations. The Home function SHOULD be the mechanism for this user affordance (a sketch of such a query appears below).

If the Home, Recents, or Back functions are provided, they:
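To make the launch-affordance requirement above concrete, a Home implementation can enumerate the qualifying activities with a standard PackageManager query. A minimal sketch (the helper name is illustrative):

    import android.content.Context;
    import android.content.Intent;
    import android.content.pm.ResolveInfo;
    import java.util.List;

    // Illustrative helper: finds activities declaring ACTION_MAIN plus
    // CATEGORY_LAUNCHER (or CATEGORY_LEANBACK_LAUNCHER on Television devices).
    final class LaunchableApps {
        static List<ResolveInfo> query(Context context, boolean leanback) {
            Intent main = new Intent(Intent.ACTION_MAIN);
            main.addCategory(leanback
                    ? Intent.CATEGORY_LEANBACK_LAUNCHER
                    : Intent.CATEGORY_LAUNCHER);
            return context.getPackageManager().queryIntentActivities(main, 0);
        }
    }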
Device implementations:
If device implementations provide the Menu function, they:
If device implementations do not provide the Menu function, for backwards compatibility, they:
MUST make the Menu function available to applications when targetSdkVersion is less than 10, either by a physical button, a software key, or gestures. This Menu function should be accessible unless hidden together with other navigation functions (a sketch of the version check follows below).

If device implementations provide the Assist function, they:
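The targetSdkVersion check behind the backwards-compatibility requirement above can be expressed directly against an application's ApplicationInfo. A minimal sketch (the helper name is illustrative):

    import android.content.pm.ApplicationInfo;
    import android.os.Build;

    // Illustrative helper: apps targeting API levels below 10 still expect the
    // system to offer a Menu affordance.
    final class LegacyMenu {
        static boolean needsMenuAffordance(ApplicationInfo appInfo) {
            return appInfo.targetSdkVersion < Build.VERSION_CODES.GINGERBREAD_MR1; // API 10
        }
    }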
If device implementations use a distinct portion of the screen to display the navigation keys, they:
MUST honor the flags set by the app through the View.setSystemUiVisibility() API method, so that this distinct portion of the screen (a.k.a. the navigation bar) is properly hidden away as documented in the SDK (see the sketch following the gesture-navigation requirements below).

If the navigation function is provided as an on-screen, gesture-based action:
WindowInsets#getMandatorySystemGestureInsets() MUST only be used to report the Home gesture recognition area.
Gestures that start within an exclusion rect provided by the foreground application via View#setSystemGestureExclusionRects(), but outside of WindowInsets#getMandatorySystemGestureInsets(), MUST NOT be intercepted for the navigation function as long as the exclusion rect is allowed within the max exclusion limit as specified in the documentation for View#setSystemGestureExclusionRects().
The foreground app MUST be sent a MotionEvent.ACTION_CANCEL event once touches start being intercepted for a system gesture, if the foreground app was previously sent a MotionEvent.ACTION_DOWN event.
WindowInsets#getMandatorySystemGestureInsets() SHOULD NOT be affected by exclusion rects provided by the foreground application via View#setSystemGestureExclusionRects().
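From the application side, these APIs fit together roughly as follows: View.setSystemUiVisibility() (mentioned above) with the immersive flags (discussed just below) hides the navigation bar, View#setSystemGestureExclusionRects() marks areas that should not be intercepted as edge gestures, and WindowInsets#getMandatorySystemGestureInsets() reports the area that exclusions cannot override. A minimal sketch, with an illustrative activity name and an arbitrary 64-pixel edge strip:

    import android.app.Activity;
    import android.graphics.Insets;
    import android.graphics.Rect;
    import android.os.Bundle;
    import android.view.View;
    import android.view.WindowInsets;
    import java.util.Collections;

    // Illustrative activity: immersive mode plus a gesture-exclusion strip.
    public class GestureAwareActivity extends Activity {
        private View content;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            content = new View(this);
            setContentView(content);

            // Exclude a strip along the view's left edge from back-gesture
            // interception; the system honors this only up to the documented
            // max exclusion limit and never inside the mandatory insets.
            content.addOnLayoutChangeListener((v, l, t, r, b, ol, ot, oldR, oldB) ->
                    v.setSystemGestureExclusionRects(Collections.singletonList(
                            new Rect(0, 0, 64, v.getHeight()))));
        }

        @Override
        public void onWindowFocusChanged(boolean hasFocus) {
            super.onWindowFocusChanged(hasFocus);
            if (hasFocus) {
                // Immersive-sticky mode hides the navigation bar until the user
                // swipes the system bars back in.
                getWindow().getDecorView().setSystemUiVisibility(
                        View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY
                                | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
                                | View.SYSTEM_UI_FLAG_FULLSCREEN);
            }
            WindowInsets insets = content.getRootWindowInsets();
            if (insets != null) {
                // The Home gesture recognition area that exclusion rects cannot affect.
                Insets mandatory = insets.getMandatorySystemGestureInsets();
            }
        }
    }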
If a navigation function is provided from anywhere on the left and right edges of the current orientation of the screen:
When the foreground application has either the View.SYSTEM_UI_FLAG_IMMERSIVE or View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY flag set, swiping from the edges MUST behave as implemented in AOSP, which is documented in the SDK. When the foreground application has either the View.SYSTEM_UI_FLAG_IMMERSIVE or View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY flag set, custom swipeable system panels MUST be hidden until the user brings in the system bars (a.k.a. navigation and status bar) as implemented in AOSP.

Android includes support for a variety of pointer input systems, such as touchscreens, touch pads, and fake touch input devices. Touchscreen-based device implementations are associated with a display such that the user has the impression of directly manipulating items on screen. Since the user is directly touching the screen, the system does not require any additional affordances to indicate the objects being manipulated.
Device implementations:
If device implementations include a touchscreen (single-touch or better), they:
MUST report TOUCHSCREEN_FINGER for the Configuration.touchscreen API field.
MUST report the android.hardware.touchscreen and android.hardware.faketouch feature flags.

If device implementations include a touchscreen that can track more than a single touch, they:
MUST report the appropriate feature flags android.hardware.touchscreen.multitouch, android.hardware.touchscreen.multitouch.distinct, or android.hardware.touchscreen.multitouch.jazzhand corresponding to the type of the specific touchscreen on the device.

If device implementations do not include a touchscreen (and rely on a pointer device only) and meet the fake touch requirements in section 7.2.5, they:
MUST NOT report any feature flag beginning with android.hardware.touchscreen and MUST report only android.hardware.faketouch.

A fake touch interface provides a user input system that approximates a subset of touchscreen capabilities. For example, a mouse or remote control that drives an on-screen cursor approximates touch, but requires the user to first point or focus then click. Numerous input devices like the mouse, trackpad, gyro-based air mouse, gyro-pointer, joystick, and multi-touch trackpad can support fake touch interactions. Android includes the feature constant android.hardware.faketouch, which corresponds to a high-fidelity non-touch (pointer-based) input device such as a mouse or trackpad that can adequately emulate touch-based input (including basic gesture support), and indicates that the device supports an emulated subset of touchscreen functionality.
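The interplay between these feature flags and the Configuration.touchscreen field can be observed from an application as sketched below (the helper class and logging tag are illustrative):

    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.content.res.Configuration;
    import android.util.Log;

    // Illustrative helper: dumps the touch-related flags and Configuration value.
    final class TouchCapabilities {
        static void dump(Context context) {
            PackageManager pm = context.getPackageManager();
            // A real touchscreen device reports both flags; a faketouch-only
            // device reports only android.hardware.faketouch.
            boolean touchscreen = pm.hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN);
            boolean fakeTouch = pm.hasSystemFeature(PackageManager.FEATURE_FAKETOUCH);
            boolean multitouchDistinct = pm.hasSystemFeature(
                    PackageManager.FEATURE_TOUCHSCREEN_MULTITOUCH_DISTINCT);

            // TOUCHSCREEN_FINGER on touchscreen devices, TOUCHSCREEN_NOTOUCH otherwise.
            int touchscreenType = context.getResources().getConfiguration().touchscreen;
            boolean finger = touchscreenType == Configuration.TOUCHSCREEN_FINGER;

            Log.d("TouchCapabilities", "touchscreen=" + touchscreen
                    + " faketouch=" + fakeTouch
                    + " multitouchDistinct=" + multitouchDistinct
                    + " finger=" + finger);
        }
    }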
If device implementations do not include a touchscreen but include another pointer input system which they want to make available, they:
SHOULD declare support for the android.hardware.faketouch feature flag.

If device implementations declare support for android.hardware.faketouch, they:
MUST report TOUCHSCREEN_NOTOUCH for the Configuration.touchscreen API field.

If device implementations declare support for android.hardware.faketouch.multitouch.distinct, they:
MUST declare support for android.hardware.faketouch.

If device implementations declare support for android.hardware.faketouch.multitouch.jazzhand, they:
MUST declare support for android.hardware.faketouch.

Device implementations:
MUST be capable of mapping HID events to the associated InputEvent constants as listed in the below tables. The upstream Android implementation satisfies this requirement.

If device implementations embed a controller or ship with a separate controller in the box that would provide means to input all the events listed in the below tables, they:
MUST declare the android.hardware.gamepad feature flag.
Button/Control | HID Usage | Android Button/Axis
D-pad down | 0x01 0x0039 | AXIS_HAT_Y
D-pad right | 0x01 0x0039 | AXIS_HAT_X
Left Joystick | 0x01 0x0030 / 0x01 0x0031 | AXIS_X / AXIS_Y
Right Joystick | 0x01 0x0032 / 0x01 0x0035 | AXIS_Z / AXIS_RZ
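On the application side, the axis constants in the table above are read from generic motion events. A minimal sketch (the activity name is illustrative, and the axis values would normally feed game logic):

    import android.app.Activity;
    import android.view.InputDevice;
    import android.view.KeyEvent;
    import android.view.MotionEvent;

    // Illustrative activity: reads d-pad and joystick axes from a gamepad.
    public class GamepadInputActivity extends Activity {
        @Override
        public boolean onGenericMotionEvent(MotionEvent event) {
            boolean fromJoystick = (event.getSource() & InputDevice.SOURCE_JOYSTICK)
                    == InputDevice.SOURCE_JOYSTICK;
            if (fromJoystick && event.getAction() == MotionEvent.ACTION_MOVE) {
                float dpadX = event.getAxisValue(MotionEvent.AXIS_HAT_X);  // d-pad left/right
                float dpadY = event.getAxisValue(MotionEvent.AXIS_HAT_Y);  // d-pad up/down
                float leftX = event.getAxisValue(MotionEvent.AXIS_X);      // left joystick
                float leftY = event.getAxisValue(MotionEvent.AXIS_Y);
                float rightX = event.getAxisValue(MotionEvent.AXIS_Z);     // right joystick
                float rightY = event.getAxisValue(MotionEvent.AXIS_RZ);
                return true;
            }
            return super.onGenericMotionEvent(event);
        }

        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
            // Digital gamepad buttons arrive as KEYCODE_BUTTON_* key events.
            if (keyCode == KeyEvent.KEYCODE_BUTTON_A) {
                return true;
            }
            return super.onKeyDown(keyCode, event);
        }
    }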
See Section 2.3.1 for device-specific requirements.