Remove activation gestures from being reported and add a touch exploration requesting flag.

1. Delegating activation gestures has several issues that we should
   decide how to handle, if possible, before allowing an accessibility
   service to take them over:

   A) It requires that every view that can be clicked or long pressed
      reacts to performClick and performLongClick, which is not
      necessarily true since the view may watch the touch events and do
      its own click/long-click detection. As a result there may be views
      a user cannot interact with in touch exploration mode but can
      interact with outside of that mode (see the first sketch below).

   B) Clicking or long pressing at different locations in a view may yield
      different results, for example in NumberPicker. Ideally such views
      should implement AccessibilityNodeProvider, whose provider correctly
      handles click/long-click requests on virtual nodes (see the second
      sketch below). Some apps, however, just fire different hover
      accessibility events when the user is over a specific semantic
      portion of the view but do not expose virtual nodes. Hence, a user
      would not be able to interact with such semantic regions, while the
      system can achieve that by sending the click/long click at the
      precise location in the view that was last touch explored.
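
   For illustration, a minimal sketch of the problem in A): a hypothetical
   view (not part of this change) that does its own tap detection in
   onTouchEvent and therefore does not react to a delegated performClick:

      import android.content.Context;
      import android.view.MotionEvent;
      import android.view.View;

      public class RawTapView extends View {

          public RawTapView(Context context) {
              super(context);
          }

          @Override
          public boolean onTouchEvent(MotionEvent event) {
              // Tap detection is done directly on the touch stream, so a
              // delegated performClick() would never reach handleTap().
              if (event.getAction() == MotionEvent.ACTION_UP) {
                  handleTap(event.getX(), event.getY());
              }
              return true;
          }

          private void handleTap(float x, float y) {
              // React to the tap here.
          }
      }

   And a sketch of the direction suggested in B): a hypothetical view that
   exposes its semantic regions as virtual nodes through
   AccessibilityNodeProvider so that click requests can be handled per
   region (the region handling itself is only outlined in comments):

      import android.content.Context;
      import android.os.Bundle;
      import android.view.View;
      import android.view.accessibility.AccessibilityNodeInfo;
      import android.view.accessibility.AccessibilityNodeProvider;

      public class RegionPickerView extends View {

          public RegionPickerView(Context context) {
              super(context);
          }

          @Override
          public AccessibilityNodeProvider getAccessibilityNodeProvider() {
              return new AccessibilityNodeProvider() {
                  @Override
                  public AccessibilityNodeInfo createAccessibilityNodeInfo(int virtualViewId) {
                      AccessibilityNodeInfo info = AccessibilityNodeInfo.obtain();
                      // Populate bounds, text and actions for the virtual
                      // region identified by virtualViewId.
                      info.addAction(AccessibilityNodeInfo.ACTION_CLICK);
                      return info;
                  }

                  @Override
                  public boolean performAction(int virtualViewId, int action, Bundle arguments) {
                      if (action == AccessibilityNodeInfo.ACTION_CLICK) {
                          // Act on the semantic region, e.g. increment or
                          // decrement for a NumberPicker-like widget.
                          return true;
                      }
                      return false;
                  }
              };
          }
      }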

2. Adding a flag on accessibility service info to request explore by touch
   mode. There is no need to put the device in this mode if none of the
   currently enabled accessibility services supports it. Now the problem is
   inverted and the service has to explicitly state its capability (see the
   sketch below).
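
   A minimal sketch of a service opting into explore by touch, assuming the
   flag is exposed as AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE
   and using illustrative event/feedback settings:

      import android.accessibilityservice.AccessibilityService;
      import android.accessibilityservice.AccessibilityServiceInfo;
      import android.view.accessibility.AccessibilityEvent;

      public class TouchExploringService extends AccessibilityService {

          @Override
          protected void onServiceConnected() {
              AccessibilityServiceInfo info = new AccessibilityServiceInfo();
              info.eventTypes = AccessibilityEvent.TYPES_ALL_MASK;
              info.feedbackType = AccessibilityServiceInfo.FEEDBACK_SPOKEN;
              // Explicitly request explore by touch; the device enters the
              // mode only while at least one enabled service asks for it.
              info.flags = AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
              setServiceInfo(info);
          }

          @Override
          public void onAccessibilityEvent(AccessibilityEvent event) {
              // Handle events here.
          }

          @Override
          public void onInterrupt() {
              // No-op for the sketch.
          }
      }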

3. Fixing a bug where includeImportantViews was ignored for automation
   services.

Change-Id: I3b29a19f24ab5e26ee29f974bbac2197614c9e2a
7 files changed