
    Pointing Devices for Wearable Computers

    We present a survey of pointing devices for wearable computers, which are body-mounted devices that users can access at any time. Since traditional pointing devices (i.e., mouse, touchpad, and trackpoint) were designed to be used on a steady and flat surface, they are inappropriate for wearable computers. Just as the advent of laptops resulted in the development of the touchpad and trackpoint, the emergence of wearable computers is leading to the development of pointing devices designed for them. However, unlike laptops, since wearable computers are operated from different body positions under different environmental conditions for different uses, researchers have developed a variety of innovative pointing devices for wearable computers characterized by their sensing mechanism, control mechanism, and form factor. We survey a representative set of pointing devices for wearable computers using an “adaptation of traditional devices” versus “new devices” dichotomy and study devices according to their control and sensing mechanisms and form factor. The objective of this paper is to showcase a variety of pointing devices developed for wearable computers and bring structure to the design space for wearable pointing devices. We conclude that a de facto pointing device for wearable computers, unlike laptops, is not likely to emerge.

    Using a cognitive prosthesis to assist foodservice managerial decision-making

    The artificial intelligence community has been notably unsuccessful in producing intelligent agents that think for themselves. However, there is an obvious need for increased information processing power in real life situations. An example of this can be witnessed in the training of a foodservice manager, who is expected to solve a wide variety of complex problems on a daily basis. This article explores the possibility of creating an intelligence aid, rather than an intelligent agent, to assist novice foodservice managers in making decisions that are congruent with a subject matter expert's decision schema.

    The Design, Implementation, and Evaluation of a Pointing Device for a Wearable Computer

    U.S. Air Force special tactics operators at times use small wearable computers (SWCs) for mission objectives. The primary pointing device of a SWC is either a touchpad or trackpoint embedded into the chassis of the SWC. In situations where the user cannot directly interact with these pointing devices, the utility of the SWC is decreased. We developed a pointing device called the G3 for SWCs used by operators. The device utilizes gyroscopic sensors attached to the user’s index finger to move the computer cursor according to the angular velocity of the finger. We showed that, as measured by Fitts’s law, the overall performance and accuracy of the G3 were better than those of the touchpad and trackpoint. These findings suggest that the G3 can adequately be used with SWCs. Additionally, we investigated the G3’s utility as a control device for operating micro remotely piloted aircraft.
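
    A back-of-the-envelope sketch of the Fitts's-law comparison mentioned above: the paper does not state which formulation it used, so this snippet assumes the common Shannon form of the index of difficulty, and the device names, target geometry and movement times are purely illustrative.

        import math

        def index_of_difficulty(distance: float, width: float) -> float:
            """Shannon formulation of Fitts's index of difficulty, in bits."""
            return math.log2(distance / width + 1)

        def throughput(distance: float, width: float, movement_time_s: float) -> float:
            """Pointing throughput in bits per second for a single trial."""
            return index_of_difficulty(distance, width) / movement_time_s

        # Hypothetical trials on the same target (240 px away, 32 px wide):
        # a shorter mean movement time yields a higher throughput score.
        for device, mt in [("G3", 0.9), ("touchpad", 1.2), ("trackpoint", 1.4)]:
            print(f"{device}: {throughput(240, 32, mt):.2f} bits/s")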

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate messages non-visually. A range of different parameters can be used for Tacton construction, including frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
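
    The construction parameters listed above (frequency, amplitude, duration, rhythm, location) map naturally onto a small data structure. The sketch below is only one possible representation; the field names and example values are illustrative and not taken from the paper.

        from dataclasses import dataclass

        @dataclass
        class Tacton:
            """One structured tactile message (illustrative fields only)."""
            frequency_hz: float   # vibration frequency of the pulse
            amplitude: float      # normalised intensity, 0.0 to 1.0
            duration_ms: int      # length of a single pulse
            rhythm_ms: tuple      # on/off pattern of successive pulses
            body_location: str    # actuator site, e.g. "wrist"

        # Two messages distinguished only by rhythm, rendered at the same site.
        incoming_call = Tacton(250, 0.8, 100, (100, 50, 100), "wrist")
        low_battery = Tacton(250, 0.8, 100, (300,), "wrist")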

    Pointing Without a Pointer

    We present a method for performing selection tasks based on continuous control of multiple, competing agents that try to determine the user's intentions from the user's control behaviour, without requiring an explicit pointer. The entropy in the selection process decreases in a continuous fashion; we provide experimental evidence of selection from 500 initial targets. The approach allows adaptation over time to best make use of the multimodal communication channel between the human and the system. This general approach is well suited to mobile and wearable applications, shared displays and security-conscious settings.
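
    The continuous fall in selection entropy can be pictured with a simple belief-update loop. The paper's method relies on competing agents rather than the explicit Bayesian update sketched here, so treat this only as an illustration; the likelihood values are invented.

        import math

        def entropy_bits(belief):
            """Shannon entropy of the belief over candidate targets."""
            return -sum(p * math.log2(p) for p in belief if p > 0)

        def update(belief, likelihoods):
            """Reweight each target by how well it explains the latest control input."""
            posterior = [p * l for p, l in zip(belief, likelihoods)]
            total = sum(posterior)
            return [p / total for p in posterior]

        # 500 equally likely targets to start, as in the reported experiment (~8.97 bits).
        belief = [1 / 500] * 500
        likelihoods = [0.9] + [0.1] * 499   # invented evidence favouring target 0
        for step in range(5):
            belief = update(belief, likelihoods)
            print(f"step {step}: {entropy_bits(belief):.2f} bits")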

    Evaluating the development of wearable devices, personal data assistants and the use of other mobile devices in further and higher education institutions

    This report presents a technical evaluation and case studies of the use of wearable and mobile computing devices in further and higher education. The first section provides a technical evaluation of the current state of the art in wearable and mobile technologies and reviews several innovative wearable products that have been developed in recent years. The second section examines three scenarios for further and higher education where wearable and mobile devices are currently being used: (i) the delivery of lectures over mobile devices, (ii) the augmentation of the physical campus with a virtual and mobile component, and (iii) the use of PDAs and mobile devices in field studies. The first scenario explores the use of web lectures, including an evaluation of IBM's Web Lecture Services and 3Com's learning assistant. The second scenario explores models for a campus without walls, evaluating the Handsprings to Learning projects at East Carolina University and ActiveCampus at the University of California, San Diego. The third scenario explores the use of wearable and mobile devices for field trips, examining the San Francisco Exploratorium's tool for capturing museum visits and the Cybertracker field computer. The third section of the report explores the uses and purposes of wearable and mobile devices in tertiary education, identifying key trends and issues to be considered when piloting the use of these devices in educational contexts.

    The Evolution of First Person Vision Methods: A Survey

    The emergence of new wearable technologies such as action cameras and smart glasses has increased the interest of computer vision scientists in the First Person perspective. Nowadays, this field is attracting attention and investments from companies aiming to develop commercial devices with First Person Vision recording capabilities. Due to this interest, an increasing demand for methods to process these videos, possibly in real time, is expected. Current approaches present particular combinations of different image features and quantitative methods to accomplish specific objectives like object detection, activity recognition, user-machine interaction and so on. This paper summarizes the evolution of the state of the art in First Person Vision video analysis between 1997 and 2014, highlighting, among others, the most commonly used features, methods, challenges and opportunities within the field.
    Comment: First Person Vision, Egocentric Vision, Wearable Devices, Smart Glasses, Computer Vision, Video Analytics, Human-machine Interaction