As you may know, Android 4.0 introduced accessibility of the touch interface for users with visual impairments, a feature called “Explore by touch”. In the next release, Android 4.1, this functionality was improved and new control gestures were added, giving blind users more ways to interact with the system.

In this article we will look at the touch control model implemented in Android 4.1 Jelly Bean as part of its accessibility support for blind users.

The main improvement to adaptive touch control in Jelly Bean is the addition of so-called linear navigation to the existing touch-exploration functionality, in which the screen reader simply reads the objects under the finger. Linear navigation is sequential movement through all on-screen objects without the user having to know their exact coordinates. …

Now the main scenarios for interacting with the touch interface are as follows:

  • To get a general idea of the contents of the screen and the layout of its elements, the user can explore by touch.
  • To quickly activate an object whose location on the screen is already familiar, the user can rely on muscle memory, still working in the same touch-exploration mode.
  • To find an object whose exact location is not known for some reason, or simply to review the contents of the screen without the risk of missing anything, the user can now use linear navigation.

For example, in the Play Market application you can quickly find the “Search” button at the top of the screen from muscle memory while exploring by touch. When it comes to installing an application, you can of course keep looking for the corresponding button in the same way, knowing which elements it sits next to, but you can also switch to linear navigation and, with a couple of sequential-movement gestures, land exactly on it …

Thus, wherever muscle memory can be used, the user can keep working in touch-exploration mode; when that is inconvenient for some reason, for example when many objects are packed closely together or the user has difficulty with spatial orientation, it is always possible to switch to linear navigation.

To support linear navigation, a new entity was introduced: the “accessibility focus”, a virtual cursor that points to the most recently selected item, the one to which the activation action will be applied when the user issues the corresponding command. As the user moves through objects with linear navigation, each of them in turn receives this accessibility focus.

It should also be noted that the gesture for activating the selected element has changed. It is no longer a tap with a second finger while holding down the first, but simply a quick double tap, which activates the object in focus.
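
For developers, this accessibility focus is exposed through the AccessibilityNodeInfo API that arrived with the same release. The snippet below is a minimal illustrative sketch (not TalkBack’s actual code) of how an accessibility service could look up the node that currently holds accessibility focus and activate it, roughly what the double tap triggers:

```java
import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityNodeInfo;

// Illustrative helper for a custom accessibility service (API 16+):
// find the node that currently holds accessibility focus and "click" it,
// roughly what happens when the user double-taps the screen.
public class FocusActivator {
    static boolean activateFocusedNode(AccessibilityService service) {
        AccessibilityNodeInfo root = service.getRootInActiveWindow();
        if (root == null) {
            return false;
        }
        AccessibilityNodeInfo focused =
                root.findFocus(AccessibilityNodeInfo.FOCUS_ACCESSIBILITY);
        boolean clicked = false;
        if (focused != null) {
            clicked = focused.performAction(AccessibilityNodeInfo.ACTION_CLICK);
            focused.recycle();
        }
        root.recycle();
        return clicked;
    }
}
```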

When a finger touches the screen, the element at the touch point is spoken and given accessibility focus. Sliding a finger across the screen lets the user explore by touch, reading the objects that fall under the finger. If the movement is sharper and more deliberate, the system treats it not as touch exploration but as an accessibility gesture, that is, a separate command for controlling screen access.
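
Under the hood, touch exploration reaches an accessibility service as hover events. The sketch below is a simplified illustration (the service declaration and touch-exploration setup in the manifest are omitted) of how such a service could move accessibility focus to whatever item the finger passes over:

```java
import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

// Simplified sketch: during touch exploration the item under the finger
// produces a hover event; a screen reader can then give it accessibility
// focus (and speak it). Manifest setup is omitted here.
public class TouchExplorationService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_HOVER_ENTER) {
            AccessibilityNodeInfo node = event.getSource();
            if (node != null) {
                node.performAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS);
                node.recycle();
            }
        }
    }

    @Override
    public void onInterrupt() {
        // Nothing to clean up in this sketch.
    }
}
```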

There are four basic accessibility gestures, based on quickly moving a finger in one direction: up, down, left, or right. In addition, there are gestures based on a combination of two such movements, for example up then left or down then right, which gives twelve more commands.

Thus, in total, the system has sixteen accessibility gestures.
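
On the API side, each of these sixteen gestures has a constant on the AccessibilityService class introduced in API level 16. The listing below simply enumerates them to show the 4 + 12 breakdown; it is a reference sketch rather than production code:

```java
import android.accessibilityservice.AccessibilityService;

// Reference sketch: the sixteen gesture codes defined on AccessibilityService
// in API level 16: four single-direction swipes plus twelve combinations.
public class GestureCatalog {
    static final int[] BASIC = {
            AccessibilityService.GESTURE_SWIPE_UP,
            AccessibilityService.GESTURE_SWIPE_DOWN,
            AccessibilityService.GESTURE_SWIPE_LEFT,
            AccessibilityService.GESTURE_SWIPE_RIGHT,
    };
    static final int[] COMBINED = {
            AccessibilityService.GESTURE_SWIPE_UP_AND_LEFT,
            AccessibilityService.GESTURE_SWIPE_UP_AND_RIGHT,
            AccessibilityService.GESTURE_SWIPE_UP_AND_DOWN,
            AccessibilityService.GESTURE_SWIPE_DOWN_AND_LEFT,
            AccessibilityService.GESTURE_SWIPE_DOWN_AND_RIGHT,
            AccessibilityService.GESTURE_SWIPE_DOWN_AND_UP,
            AccessibilityService.GESTURE_SWIPE_LEFT_AND_UP,
            AccessibilityService.GESTURE_SWIPE_LEFT_AND_DOWN,
            AccessibilityService.GESTURE_SWIPE_LEFT_AND_RIGHT,
            AccessibilityService.GESTURE_SWIPE_RIGHT_AND_UP,
            AccessibilityService.GESTURE_SWIPE_RIGHT_AND_DOWN,
            AccessibilityService.GESTURE_SWIPE_RIGHT_AND_LEFT,
    };
    // 4 basic + 12 combined = 16 gestures in total.
}
```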

The four main gestures are used to navigate between objects, that is, to move the accessibility focus:

  • Up or Left – Move to the previous item.
  • Down or Right – Move to the next item.

In other words, these four gestures map to just two commands, duplicating each other in pairs, which keeps them working unambiguously in either orientation of the device (portrait and landscape).
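
As an illustration of that pairing, a service receiving these gestures through the onGesture() callback could collapse the four codes into two logical commands. The moveFocusBackward() and moveFocusForward() helpers below are hypothetical placeholders for the screen reader’s own traversal logic:

```java
import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;

// Sketch only: the four one-direction gestures collapse into two logical
// navigation commands. moveFocusBackward()/moveFocusForward() are
// hypothetical stand-ins for the real traversal of the reading order.
public class LinearNavigationService extends AccessibilityService {
    @Override
    protected boolean onGesture(int gestureId) {
        switch (gestureId) {
            case GESTURE_SWIPE_UP:
            case GESTURE_SWIPE_LEFT:
                return moveFocusBackward();  // previous item
            case GESTURE_SWIPE_DOWN:
            case GESTURE_SWIPE_RIGHT:
                return moveFocusForward();   // next item
            default:
                return super.onGesture(gestureId);
        }
    }

    private boolean moveFocusBackward() { /* traversal logic omitted */ return true; }

    private boolean moveFocusForward() { /* traversal logic omitted */ return true; }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}
```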

It should be emphasized that these gestures are not exact analogues of the control commands issued through hardware elements such as a joystick, cursor keys, or a trackpad/trackball.

A hardware controller moves not the accessibility focus but the system focus, which only interactive objects can receive, that is, objects meant for direct interaction: buttons, checkboxes, edit boxes, and the like. The gestures described here move the accessibility focus, which can land not only on interactive objects but on all others as well, for example static labels and graphic icons.
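
In API terms, the two kinds of focus coexist on the same node tree and are queried with different constants. A small sketch, assuming the root node comes from getRootInActiveWindow():

```java
import android.view.accessibility.AccessibilityNodeInfo;

// Sketch: querying the two kinds of focus on the node tree (API 16+).
public class FocusKinds {
    static void inspectFocus(AccessibilityNodeInfo root) {
        // System (input) focus: held only by interactive nodes such as
        // buttons, checkboxes and edit boxes.
        AccessibilityNodeInfo inputFocus =
                root.findFocus(AccessibilityNodeInfo.FOCUS_INPUT);
        // Accessibility focus: can also rest on static labels, icons and
        // other non-interactive items.
        AccessibilityNodeInfo a11yFocus =
                root.findFocus(AccessibilityNodeInfo.FOCUS_ACCESSIBILITY);
        if (inputFocus != null) inputFocus.recycle();
        if (a11yFocus != null) a11yFocus.recycle();
    }
}
```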

In addition, how well an application can be operated through the system focus depends directly on whether the developer has provided for that possibility, whereas the usefulness of the accessibility focus does not depend on the application’s keyboard support.

On the other hand, when working from the keyboard through the system focus, the user does not constantly stumble over potentially uninformative and useless objects such as the application logo or a title caption. So it cannot be said unequivocally that one control method is better than the other; a great deal depends on the context.

In addition to the basic navigation gestures, the blind user is given commands for a number of general actions (the node actions these could correspond to are sketched after the list):

  • Up and down – increase the level of reading detail.
  • Down and up – decrease the level of reading detail.
  • Right and left – scroll the list forward one screen.
  • Left and right – scroll the list back one screen.
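
For the curious, these gestures line up with node actions that also appeared in API level 16: reading at a chosen movement granularity and scrolling a container by one screen. The sketch below shows the kind of calls involved; exactly how the screen reader wires the gestures to them internally is an assumption on my part:

```java
import android.os.Bundle;
import android.view.accessibility.AccessibilityNodeInfo;

// Rough sketch of the node actions (API 16+) that this family of gestures
// could drive: text traversal at a given granularity and page-wise scrolling.
public class GranularityAndScroll {
    // Read the next chunk of text at word granularity.
    static boolean nextWord(AccessibilityNodeInfo node) {
        Bundle args = new Bundle();
        args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_MOVEMENT_GRANULARITY_INT,
                AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD);
        return node.performAction(
                AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY, args);
    }

    // Scroll a scrollable container (e.g. a list) forward or back one screen.
    static boolean scrollByOneScreen(AccessibilityNodeInfo list, boolean forward) {
        return list.performAction(forward
                ? AccessibilityNodeInfo.ACTION_SCROLL_FORWARD
                : AccessibilityNodeInfo.ACTION_SCROLL_BACKWARD);
    }
}
```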

There is also a group of four gestures whose functions the user can redefine. Initially, however, the following commands are assigned to them (the corresponding API calls are sketched after the list):

  • Down and left – a command equivalent to pressing the Back key.
  • Up and left – a command equivalent to pressing the Home key.
  • Up and right – open the notification panel.
  • Down and right – open the panel of recently used applications, equivalent to a long press of the Home key.
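
These four default assignments map directly onto the global actions an accessibility service can trigger with performGlobalAction(), also new in API level 16. A sketch of such a mapping (the handleCornerGesture() helper is my own naming):

```java
import android.accessibilityservice.AccessibilityService;

// Sketch: mapping the four redefinable gestures to global actions (API 16+).
public class GlobalGestureActions {
    static boolean handleCornerGesture(AccessibilityService service, int gestureId) {
        switch (gestureId) {
            case AccessibilityService.GESTURE_SWIPE_DOWN_AND_LEFT:
                return service.performGlobalAction(
                        AccessibilityService.GLOBAL_ACTION_BACK);           // Back key
            case AccessibilityService.GESTURE_SWIPE_UP_AND_LEFT:
                return service.performGlobalAction(
                        AccessibilityService.GLOBAL_ACTION_HOME);           // Home key
            case AccessibilityService.GESTURE_SWIPE_UP_AND_RIGHT:
                return service.performGlobalAction(
                        AccessibilityService.GLOBAL_ACTION_NOTIFICATIONS);  // notification panel
            case AccessibilityService.GESTURE_SWIPE_DOWN_AND_RIGHT:
                return service.performGlobalAction(
                        AccessibilityService.GLOBAL_ACTION_RECENTS);        // recent apps
            default:
                return false;
        }
    }
}
```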

The remaining four gestures are not yet used in Jelly Bean and are waiting in the wings.

As TV Raman, project manager for Android OS accessibility, says: “Gestures for manipulating and working with Accessibility Focus are an evolving part of the Android Accessibility; we will continue to refine these based on user experience.”
