Publication
ACM TACCESS
Paper
Insights on assistive orientation and mobility of people with visual impairment based on large-scale longitudinal data
Abstract
Assistive applications for orientation and mobility promote independence for people with visual impairment (PVI). While typical design and evaluation of such applications involves small-sample iterative studies, we analyze large-scale longitudinal data from a geographically diverse population. Our publicly released dataset from iMove, a mobile app supporting orientation of PVI, contains millions of interactions by thousands of users over a year. Our analysis (i) examines common functionalities, settings, assistive features, and movement modalities in the iMove dataset and (ii) discovers user communities based on interaction patterns. We find that the most popular interaction mode is passive, where users receive more notifications, often verbose, while in motion and perform fewer actions. The use of built-in assistive features such as enlarged text indicates a high presence of users with residual sight. Users fall into three distinct groups: (C1) users interested in surrounding points of interest, (C2) users interacting in short bursts to inquire about their current location, and (C3) users with long active sessions while in motion. iMove was designed with C3 in mind, and one strength of our contribution is providing meaningful semantics for the unanticipated groups, C1 and C2. Our analysis reveals insights that can be generalized to other assistive orientation and mobility applications.
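The abstract does not state how the user communities were discovered, so the following is only a minimal illustrative sketch of the general approach (aggregating per-user interaction features and clustering them into three groups), here using k-means from scikit-learn. The feature names and the random data are hypothetical placeholders, not drawn from the iMove dataset or the authors' actual method.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical per-user features aggregated from interaction logs, e.g.:
# [poi_queries, location_queries, session_length_s, actions_per_session, in_motion_ratio]
rng = np.random.default_rng(0)
features = rng.random((1000, 5))  # placeholder for real aggregated per-user data

# Standardize features so no single scale dominates, then cluster into 3 groups
X = StandardScaler().fit_transform(features)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

labels = kmeans.labels_  # one cluster assignment per user, analogous to C1-C3

In practice, the resulting clusters would then be interpreted by inspecting each group's dominant features (e.g., frequent point-of-interest queries versus short location-checking bursts versus long in-motion sessions), which is the kind of semantic labeling the abstract describes for C1, C2, and C3.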