Many Android users have physical disabilities, so the way they interact with their devices may differ from the usual patterns. This group includes, in particular, people with visual or motor impairments and elderly users, whose physical limitations make it hard to operate the touchscreen or perceive the information displayed on it.

Android includes dedicated accessibility functionality, also called the accessibility layer, that makes the platform more comfortable for these users. It covers features such as text-to-speech, haptic feedback, and navigation with a trackball or directional pad.

To make an application as accessible as possible for these users, its developer should follow the guidelines below.

Basic concepts
Following these two rules addresses most basic accessibility issues in an application:

Make sure every control can be operated with a trackball or directional controller.
Label ImageButton, EditText, and other input elements with the contentDescription attribute.
Enabling navigation with a directional controller
Many Android devices ship with a directional controller, such as:

A clickable trackball, which provides movement in any direction plus a select action;
A clickable directional pad (D-pad), which provides four directions plus a select action;
A set of arrow keys and an enter/select key, which together behave much like a D-pad.
All of these controllers let the user move the input focus without touching the screen. On some devices, the user can also jump to the top or bottom of a list by pressing up or down while holding the Alt key.

The directional controller is the primary navigation tool for users with visual or certain motor impairments, as well as on devices without a touchscreen. Make sure that every important element can be reached without the touchscreen, and that controller input has the same effect as activating the focused element by touch.

Whether an element can be reached with the directional controller is reported by its isFocusable() method. To make an element focusable, call setFocusable(boolean) or set the android:focusable attribute in the XML layout file.
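A minimal sketch of enabling focusability programmatically, assuming a hypothetical layout and a custom view with the ID chart:

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.View;

public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);  // hypothetical layout

        // R.id.chart is a hypothetical custom view that is not focusable by default.
        View chart = findViewById(R.id.chart);
        chart.setFocusable(true);  // make it reachable with the directional controller
    }
}
```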

Focus movement follows an algorithm that finds the nearest neighbor in the requested direction. Occasionally the default order does not match the developer's intent; in such cases you can override the focus order by specifying it explicitly in the XML layout file with the following attributes (see the sketch after this list):

Down – nextFocusDown;
Left – nextFocusLeft;
Right – nextFocusRight;
Up – nextFocusUp.
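The same ordering can also be set programmatically with the View.setNextFocus*Id() methods. A minimal sketch, assuming two hypothetical views with the IDs top_button and bottom_button:

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.View;

public class FocusOrderActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.focus_order);  // hypothetical layout

        View top = findViewById(R.id.top_button);      // hypothetical IDs
        View bottom = findViewById(R.id.bottom_button);

        // Pressing "down" on the controller while top_button is focused
        // moves focus to bottom_button, and "up" moves it back.
        top.setNextFocusDownId(R.id.bottom_button);
        bottom.setNextFocusUpId(R.id.top_button);
    }
}
```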
Pressing the directional controller
On most devices, pressing the controller (the select/enter action) sends a KeyEvent with KEYCODE_DPAD_CENTER. Make sure this event has the same effect as tapping the focused control on the touchscreen. The standard Android views already handle KEYCODE_DPAD_CENTER appropriately.

KEYCODE_DPAD_CENTER also has an equivalent, KEYCODE_ENTER, which is more common on devices with a QWERTY keyboard; it should be handled in the same way.
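For a custom view that reacts to touch, a minimal sketch (the TappableView class is hypothetical) of treating a controller press like a tap could look like this:

```java
import android.content.Context;
import android.util.AttributeSet;
import android.view.KeyEvent;
import android.view.View;

public class TappableView extends View {
    public TappableView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setFocusable(true);  // make the view reachable with the directional controller
    }

    @Override
    public boolean onKeyUp(int keyCode, KeyEvent event) {
        // Treat the controller press (and Enter on QWERTY devices) like a tap.
        if (keyCode == KeyEvent.KEYCODE_DPAD_CENTER || keyCode == KeyEvent.KEYCODE_ENTER) {
            performClick();  // triggers the OnClickListener, if one is set
            return true;
        }
        return super.onKeyUp(keyCode, event);
    }
}
```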

Labeling input elements
Many input elements rely on visual cues to convey their purpose. For example, an application might use an ImageButton with a plus sign to indicate that the user can add a record to a table, or place a graphic label next to an EditText to indicate what the field is for. For a blind or visually impaired user, such cues may be of no help.

The contentDescription attribute sets a textual description for such elements. It should contain text that clearly describes the element's purpose and, ideally, matches the meaning of the icon. Specify this attribute on every ImageButton, EditText, and any other input widget that supports it.
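A minimal sketch of setting the description programmatically, assuming a hypothetical add_record ImageButton and an add_record_description string resource:

```java
import android.app.Activity;
import android.os.Bundle;
import android.widget.ImageButton;

public class RecordsActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.records);  // hypothetical layout

        // The "+" image button; screen readers announce the description instead of the image.
        ImageButton addButton = (ImageButton) findViewById(R.id.add_record);
        addButton.setContentDescription(getString(R.string.add_record_description));
    }
}
```

The same description can also be declared directly in the layout with the android:contentDescription attribute.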

Following Android UI Best Practices
An interface designed according to the Android UI guidelines is easier for everyone to learn and use. Compliance is especially important for users with disabilities, who may lack the contextual information needed to make sense of an unfamiliar interface.

Whenever possible, use the standard views provided by the Android SDK, as they have built-in accessibility support.

Dispatching AccessibilityEvents from custom views
If the application needs custom views, they can be made accessible by implementing the AccessibilityEventSource interface (android.view.accessibility.AccessibilityEventSource) and sending the appropriate AccessibilityEvent (android.view.accessibility.AccessibilityEvent) at the right moments.

The standard View classes already implement the AccessibilityEventSource interface. It provides the mechanism for dispatching events to registered accessibility services (android.accessibilityservice).

There are several types of accessibility events that should be sent:

Clicking a view – TYPE_VIEW_CLICKED;
Long-pressing a view – TYPE_VIEW_LONG_CLICKED;
Selecting an item, usually within an AdapterView – TYPE_VIEW_SELECTED;
Focusing a view – TYPE_VIEW_FOCUSED;
Changing the text of a focused view – TYPE_VIEW_TEXT_CHANGED.
Each event type requires specific properties to be populated so that accessibility services can respond to it appropriately; the details are described in the AccessibilityEvent documentation.
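A minimal sketch of a custom view (the ColorPickerView class and its selection logic are hypothetical) that reports a selection change to accessibility services:

```java
import android.content.Context;
import android.util.AttributeSet;
import android.view.View;
import android.view.accessibility.AccessibilityEvent;

public class ColorPickerView extends View {
    private int selectedIndex = 0;  // hypothetical internal selection state

    public ColorPickerView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setFocusable(true);
    }

    public void selectColor(int index) {
        if (index != selectedIndex) {
            selectedIndex = index;
            invalidate();
            // Notify registered accessibility services about the new selection.
            // View already implements AccessibilityEventSource, so sendAccessibilityEvent()
            // builds and dispatches the event for us.
            sendAccessibilityEvent(AccessibilityEvent.TYPE_VIEW_SELECTED);
        }
    }
}
```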

Testing the accessibility of the application
To test the accessibility of an application, you can either enlist users with disabilities or simulate their experience yourself using the platform's accessibility features (and, starting with API level 18, the UiAutomation testing facility). One such tool is TalkBack, a screen reader that comes preinstalled on many devices and is also available for free on Android Market.

TalkBack requires a text-to-speech engine. On some devices you may need to install one first, for example the free eSpeak for Android.

Once everything is installed, enable the screen reader in the Accessibility section of the system settings. Then try to use the device and the applications installed on it relying only on the spoken feedback, navigating solely with the directional controller and other hardware controls.
