SmartWheels: Detecting urban features for wheelchair users’ navigation

Abstract

People with mobility impairments have heterogeneous needs and abilities when moving in an urban environment and hence require personalized navigation instructions. Providing these instructions requires knowledge of urban features such as curb ramps, steps, or other obstacles along the way. Since these urban features are generally not available from maps and change over time, crowdsourcing this information from end-users is a scalable and promising solution. However, it is inconvenient for wheelchair users to input data while on the move, so an automatic crowdsourcing mechanism is needed. In this contribution we present SmartWheels, a solution that detects urban features by analyzing inertial sensor data produced by wheelchair movements. Activity recognition techniques are used to process the sensor data stream. SmartWheels is evaluated on data collected from 17 real wheelchair users navigating in a controlled environment (10 users) and in the wild (7 users). Experimental results show that SmartWheels is a viable solution for detecting urban features, in particular when applying specific strategies based on the confidence the classifier assigns to its predictions.
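To make the approach concrete, the sketch below illustrates the general kind of pipeline the abstract describes: sliding-window segmentation of an inertial data stream, hand-crafted features, a supervised classifier, and a confidence threshold on its predictions. All names, window sizes, the feature set, the choice of classifier, and the threshold are illustrative assumptions, not the actual SmartWheels parameters.

```python
# Hypothetical activity-recognition sketch (not the SmartWheels implementation):
# segment an inertial stream into overlapping windows, extract simple
# statistical features, classify each window, and only report an urban
# feature when the classifier's confidence exceeds a threshold.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def sliding_windows(signal, size=128, step=64):
    """Yield overlapping windows from a (samples x axes) inertial stream."""
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]

def extract_features(window):
    """Per-axis summary statistics; real systems use richer feature sets."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

def detect_urban_features(stream, clf, threshold=0.8):
    """Label each window, falling back to 'unknown' on low confidence
    instead of emitting an unreliable detection."""
    detections = []
    for window in sliding_windows(stream):
        probs = clf.predict_proba(extract_features(window).reshape(1, -1))[0]
        best = int(np.argmax(probs))
        detections.append(clf.classes_[best] if probs[best] >= threshold
                          else "unknown")
    return detections

# Training on labeled windows would precede detection, e.g. with labels
# such as "curb_ramp", "step", or "smooth_surface":
# clf = RandomForestClassifier().fit(X_train, y_train)
```

Thresholding on predicted probability is one simple instance of the confidence-based strategies the abstract alludes to; it trades recall for precision, which matters when false detections would mislead a navigation system.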
