Invoking Visual Search With Device Camera Using Intuitive Physical Gestures

Abstract

Invoking a visual search typically requires a user to open an application that supports the functionality before pointing the camera at the space or objects for which visual search is desired. Such an approach to invoking a visual search is time-consuming and cumbersome, and it limits the types of actions these capabilities can support. This disclosure describes techniques that enable users to invoke visual searches from their device cameras via intuitive physical gestures. Users can then take further actions on objects within view.
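The gesture-based invocation described above could be sketched as a simple gesture-to-action dispatcher. The gesture names, the toy steadiness heuristic, and the function names below are illustrative assumptions, not details from the disclosure:

```python
# Hypothetical sketch: map a recognized physical gesture to a visual-search
# invocation, so the user never opens a separate app first.
# The "steady point" heuristic and action names are assumptions.

def classify_gesture(motion_trace):
    """Classify a raw (x, y) motion trace into a named gesture (toy heuristic)."""
    if not motion_trace:
        return None
    xs = [p[0] for p in motion_trace]
    ys = [p[1] for p in motion_trace]
    # Holding the camera steadily pointed at a scene counts as a "steady point".
    if max(xs) - min(xs) < 5 and max(ys) - min(ys) < 5:
        return "steady_point"
    return "other"

def dispatch(gesture):
    """Return the device action triggered by a recognized gesture."""
    actions = {"steady_point": "launch_visual_search"}
    return actions.get(gesture, "ignore")
```

A usage example: a near-motionless trace such as `[(0, 0), (1, 1), (2, 0)]` classifies as `steady_point` and dispatches to `launch_visual_search`, while a sweeping motion is ignored.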