
    An electronic architecture for mediating digital information in a hallway façade

    Ubiquitous computing requires the integration of physical space with digital information. This presents the challenges of integrating electronics, physical space, software, and the interaction tools that can effectively communicate with the audience. Many research groups have embraced different techniques, depending on location, context, space, and the availability of the necessary skills, to turn the world around us into an interface to the digital world. Encouraged by early successes and fostered by projects undertaken by the Tangible Visualization group, we introduce an architecture of Blades and Tiles for the development and realization of interactive wall surfaces. It provides an inexpensive, open-ended platform for constructing large-scale tangible and embedded interfaces. In this paper, we propose tiles built using inexpensive pegboards and a gateway for each of these tiles to provide access to digital information. The paper describes the architecture using a corridor façade application. The corridor façade uses full-spectrum LEDs, physical labels and stencils, and capacitive touch sensors to provide mediated representation, monitoring, and querying of physical and digital content. Example contents include the physical and online status of people and the activity and dynamics of online research content repositories. Several complementary devices, such as Microsoft PixelSense and smart devices, can support additional user interaction with the system. This enables interested people in synergistic physical environments to observe, explore, understand, and engage in ongoing activities and relationships. This paper describes the hardware architecture and software libraries employed and how they are used in our research center hallway and in academic semester projects.
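The tile-and-gateway decomposition described in this abstract can be sketched roughly as follows. All class, method, and field names here are illustrative assumptions for the sketch, not the authors' actual API.

```python
# Illustrative sketch of the Blades-and-Tiles idea: each pegboard tile
# hosts addressable cells (full-spectrum LED plus capacitive touch),
# and a gateway maps digital information onto cells.
# All names are hypothetical, not the authors' real interfaces.

class Cell:
    def __init__(self):
        self.color = (0, 0, 0)   # RGB state of the cell's LED
        self.touched = False     # capacitive touch state

class Tile:
    """One pegboard tile: a small grid of LED/touch cells."""
    def __init__(self, rows, cols):
        self.cells = [[Cell() for _ in range(cols)] for _ in range(rows)]

    def set_color(self, r, c, color):
        self.cells[r][c].color = color

class Gateway:
    """Mediates between digital information sources and a set of tiles."""
    def __init__(self, tiles):
        self.tiles = tiles

    def show_status(self, tile_idx, r, c, online):
        # e.g. green = person online, red = offline
        color = (0, 255, 0) if online else (255, 0, 0)
        self.tiles[tile_idx].set_color(r, c, color)

gw = Gateway([Tile(4, 4)])
gw.show_status(0, 1, 2, online=True)
print(gw.tiles[0].cells[1][2].color)  # (0, 255, 0)
```

The point of the sketch is the separation of concerns the abstract describes: tiles stay simple and cheap, while each tile's gateway carries the mapping to digital content.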

    Clique: Perceptually Based, Task Oriented Auditory Display for GUI Applications

    Screen reading is the prevalent approach for presenting graphical desktop applications in audio. The primary function of a screen reader is to describe what the user encounters when interacting with a graphical user interface (GUI). This straightforward method allows people with visual impairments to hear exactly what is on the screen, but it has significant usability problems in a multitasking environment. Screen reader users must infer the state of ongoing tasks spanning multiple graphical windows from a single, serial stream of speech. In this dissertation, I explore a new approach to enabling auditory display of GUI programs. With this method, the display describes concurrent application tasks using a small set of simultaneous speech and sound streams. The user listens to and interacts solely with this display, never with the underlying graphical interfaces. Scripts support this level of adaptation by mapping GUI components to task definitions. Evaluation of this approach shows improvements in user efficiency, satisfaction, and understanding with little development effort. To develop this method, I studied the literature on existing auditory displays, user working behavior, and theories of human auditory perception and processing. I then conducted a user study to observe the problems encountered and techniques employed by users interacting with an ideal auditory display: another human being. Based on my findings, I designed and implemented a prototype auditory display, called Clique, along with scripts adapting seven GUI applications. I concluded my work by conducting a variety of evaluations on Clique. The results of these studies show the following benefits of Clique over the state of the art for users with visual impairments (1-5) and mobile sighted users (6): 1. Faster, more accurate access to speech utterances through concurrent speech streams. 2. Better awareness of peripheral information via concurrent speech and sound streams. 3. Increased information bandwidth through concurrent streams. 4. More efficient information seeking enabled by ubiquitous tools for browsing and searching. 5. Greater accuracy in describing unfamiliar applications learned using a consistent, task-based user interface. 6. Faster completion of email tasks in a standard GUI after exposure to those tasks in audio.
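The script idea in this abstract, mapping GUI components to task definitions so that each active task can be rendered on its own speech or sound stream, can be sketched as follows. The task names, widget names, and function are assumptions for illustration, not Clique's actual scripting interface.

```python
# Hypothetical sketch of Clique-style scripting: a script maps an
# application's GUI widgets onto named tasks, and each active task is
# assigned to a concurrent audio stream. Names are illustrative only.

task_script = {
    "compose_message": {"widgets": ["To:", "Subject:", "Body"], "stream": "primary"},
    "check_inbox":     {"widgets": ["MessageList"],             "stream": "peripheral"},
}

def streams_for(active_tasks, script):
    """Assign each active task's widgets to its concurrent audio stream."""
    return {script[t]["stream"]: script[t]["widgets"] for t in active_tasks}

print(streams_for(["compose_message", "check_inbox"], task_script))
# {'primary': ['To:', 'Subject:', 'Body'], 'peripheral': ['MessageList']}
```

The sketch shows the key inversion relative to screen reading: the user interacts with tasks and streams, never with the raw serial order of widgets on screen.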

    Supporting collaborative work using interactive tabletop

    PhD Thesis. Collaborative working is a key to success for organisations. People work together around tables at work, home, school, and coffee shops. With the explosion of the internet and computer systems, there is a variety of tools to support collaboration in groups, such as groupware and tools that support online meetings. However, in co-located, face-to-face meetings, facial expressions, body language, and verbal communication have a significant influence on the group decision-making process. People often have a natural preference for traditional pen-and-paper-based decision support solutions in such situations. Thus, it is a challenge to implement tools that rely on advanced technological interfaces, such as interactive multi-touch tabletops, to support collaborative work. This thesis proposes a novel tabletop application to support group work and investigates the effectiveness and usability of the proposed system. The requirements for the developed system are based on a review of previous literature and on requirements elicited from potential users. The innovative aspect of our system is that it allows the use of personal devices that afford some level of privacy for the participants in the group work. We expect that the personal devices may contribute to the effectiveness of the use of tabletops to support collaborative work. For the evaluation experiments, we chose the collaborative development of mind maps by groups, which has been investigated earlier as a representative form of collaborative work. Two controlled laboratory experiments were designed to examine the usability features and associated emotional attitudes for the tabletop mind-map application in comparison with the conventional pen-and-paper approach in the context of collaborative work. The evaluation clearly indicates that the combination of the tabletop and personal devices supports and encourages multiple people working collaboratively. The comparison of the associated emotional attitudes indicates that the interactive tabletop facilitates the active involvement of participants in group decision making significantly more than the pen-and-paper conditions. The work reported here contributes significantly to our understanding of the usability and effectiveness of interactive tabletop applications in the context of supporting collaborative work. The Royal Thai Government.

    Tangible interaction with anthropomorphic smart objects in instrumented environments

    A major technological trend is to augment everyday objects with sensing, computing, and actuation power in order to provide new services beyond the objects' traditional purpose, indicating that such smart objects might become an integral part of our daily lives. To be able to interact with smart object systems, users will obviously need appropriate interfaces that respect their distinctive characteristics. Concepts of tangible and anthropomorphic user interfaces are combined in this dissertation to create a novel paradigm for smart object interaction. This work provides an exploration of the design space, introduces design guidelines, and provides a prototyping framework to support the realisation of the proposed interface paradigm. Furthermore, novel methods for expressing personality and emotion by auditory means are introduced and elaborated, constituting essential building blocks for anthropomorphised smart objects. Two experimental user studies are presented, confirming the endeavours to reflect personality attributes through prosody-modelled synthetic speech and to express emotional states through synthesised affect bursts. The dissertation concludes with three example applications, demonstrating the potential of the concepts and methodologies elaborated in this thesis. The integration of information technology into everyday objects is a current technological trend that enables everyday objects to offer, through sensors, actuators, and wireless communication, new services beyond the object's original purpose. Using these so-called smart objects requires novel user interfaces that take into account the special characteristics and application areas of such systems. Concepts from the fields of tangible interaction and anthropomorphic user interfaces are combined in this dissertation to develop a new interaction paradigm for smart objects. To this end, this work examines the design space and highlights relevant aspects from related disciplines. Building on this, guidelines are introduced to accompany and support the design of user interfaces following the approach presented here. For a prototypical implementation of such user interfaces, an architecture is presented that takes into account the requirements of smart object systems in instrumented environments. An important component is sensor processing, which among other things enables interaction detection on the object itself and thus physical input. Furthermore, novel methods for the auditory expression of emotion and personality are developed, which constitute essential building blocks for anthropomorphised smart objects and were examined in user studies. The dissertation concludes with the description of three applications that were developed in the course of this work and reflect the potential of the concepts and methods elaborated here.

    Auditory interfaces: Using sound to improve the HSL metro ticketing interface for the visually impaired

    Around 252 million trips by public transport are taken in Helsinki every year, and about 122 million passengers travel with Helsinki City Transport (tram, metro, and ferry) in and around Finland's capital. Given these numbers, it is important that the system be as efficient, inclusive, and easy to use as possible. In my master's thesis, I examine Helsinki Region Transport's ticketing and information system. I pay special attention to their new touch-screen card readers, framing them in the context of increasing usability and accessibility through the use of sound design. I look at what design decisions have been made and compare these with the variety of technology available today, as well as with solutions being used in other cities. Throughout my research, I have placed an emphasis on sonic cues and sound design, as this is my area of study. Everything is assessed against the requirements and perspective of Helsinki's public transportation end users who are blind or visually impaired. My methodology comprises desk research, field research, user testing, and stakeholder interviews. Taking into account the learnings from my research, I put forth suggestions on how to improve the current system. I have looked at key points around people with disabilities and how sound can be used to improve accessibility and general functionality for all. I also hope to share this thesis with HSL and HKL, who may use it to inform future optimization of their systems.

    Designing for Cross-Device Interactions

    Driven by technological advancements, we now own and operate an ever-growing number of digital devices, leading to an increased amount of digital data we produce, use, and maintain. However, while there is a substantial increase in computing power and availability of devices and data, many tasks we conduct with our devices are not well connected across multiple devices. We conduct our tasks sequentially instead of in parallel, while collaborative work across multiple devices is cumbersome to set up or simply not possible. To address these limitations, this thesis is concerned with cross-device computing. In particular, it aims to conceptualise, prototype, and study interactions in cross-device computing. This thesis contributes to the field of Human-Computer Interaction (HCI), and more specifically to the area of cross-device computing, in three ways: first, this work conceptualises previous work through a taxonomy of cross-device computing, resulting in an in-depth understanding of the field that identifies underexplored research areas and enables the transfer of key insights into the design of interaction techniques. Second, three case studies were conducted that show how cross-device interactions can support curation work as well as augment users' existing devices for individual and collaborative work. These case studies incorporate novel interaction techniques for supporting cross-device work. Third, through studying cross-device interactions and group collaboration, this thesis provides insights into how researchers can understand and evaluate multi- and cross-device interactions for individual and collaborative work. We provide a visualization and querying tool that facilitates interaction analysis of spatial measures and video recordings to support such evaluations of cross-device work. Overall, the work in this thesis advances the field of cross-device computing with its taxonomy guiding research directions, novel interaction techniques and case studies demonstrating cross-device interactions for curation, and insights into and tools for effective evaluation of cross-device systems.

    glueTK: A Framework for Multi-modal, Multi-display Interaction

    This thesis describes glueTK, a framework for human-machine interaction that allows the integration of multiple input modalities and interaction across different displays. Building upon the framework, several contributions for integrating pointing gestures into interactive systems are presented. To address the design of interfaces for the wide range of supported displays, a concept for transferring interaction performance from one system to another is defined.

    A white paper: NASA virtual environment research, applications, and technology

    Research support for Virtual Environment technology development has been a part of NASA's human factors research program since 1985. Under the auspices of the Office of Aeronautics and Space Technology (OAST), initial funding was provided to the Aerospace Human Factors Research Division, Ames Research Center, which resulted in the origination of this technology. Since 1985, other Centers have begun using and developing this technology. At each research and space flight center, NASA missions have been major drivers of the technology. This White Paper was the joint effort of all the Centers which have been involved in the development of the technology and its applications to their unique missions. Appendix A is the list of those who have worked to prepare the document, directed by Dr. Cynthia H. Null, Ames Research Center, and Dr. James P. Jenkins, NASA Headquarters. This White Paper describes the technology and its applications in NASA Centers (Chapters 1, 2 and 3), the potential roles it can take in NASA (Chapters 4 and 5), and a roadmap for the next 5 years (FY 1994-1998). The audience for this White Paper consists of managers, engineers, scientists, and the general public with an interest in Virtual Environment technology. Those who read the paper will determine whether this roadmap, or another, is to be followed.

    Cruiser and PhoTable: Exploring Tabletop User Interface Software for Digital Photograph Sharing and Story Capture

    Digital photography has not only changed the nature of photography and the photographic process, but also the manner in which we share photographs and tell stories about them. Some traditional methods, such as the family photo album or passing around piles of recently developed snapshots, are lost to us unless the digital photos are printed. The current, purely digital, methods of sharing do not provide the same experience as printed photographs, and they do not provide effective face-to-face social interaction around photographs, as experienced during storytelling. Research has found that people are often dissatisfied with sharing photographs in digital form. The recent emergence of the tabletop interface as a viable multi-user, direct-touch, interactive large horizontal display has provided hardware with the potential to improve collocated activities such as digital photograph sharing. However, while some software to communicate with various tabletop hardware technologies exists, the software aspects of tabletop user interfaces are still at an early stage and require careful consideration in order to provide an effective, multi-user immersive interface that arbitrates the social interaction between users, without the necessary computer-human interaction interfering with the social dialogue. This thesis presents PhoTable, a social interface allowing people to effectively share, and tell stories about, recently taken, unsorted digital photographs around an interactive tabletop. In addition, the computer-arbitrated digital interaction allows PhoTable to capture the stories told and associate them as audio metadata with the appropriate photographs. By leveraging the tabletop interface and providing a highly usable and natural interaction, we can enable users to become immersed in their social interaction, telling stories about their photographs, and allow the computer interaction to occur as a side-effect of the social interaction.
Correlating the computer interaction with the corresponding audio allows PhoTable to annotate an automatically created digital photo album with audible stories, which may then be archived. These stories remain useful for future sharing -- both collocated sharing and remote (e.g. via the Internet) -- and also provide a personal memento both of the event depicted in the photograph (e.g. as a reminder) and of the enjoyable photo sharing experience at the tabletop. To provide the necessary software to realise an interface such as PhoTable, this thesis explored the development of Cruiser: an efficient, extensible and reusable software framework for developing tabletop applications. Cruiser contributes a set of programming libraries and the necessary application framework to facilitate the rapid and highly flexible development of new tabletop applications. It uses a plugin architecture that encourages code reuse, stability and easy experimentation, and leverages the dedicated computer graphics hardware and multi-core processors of modern consumer-level systems to provide a responsive and immersive interactive tabletop user interface that is agnostic to the tabletop hardware and operating platform, using efficient, native cross-platform code. Cruiser's flexibility has allowed a variety of novel interactive tabletop applications to be explored by other researchers using the framework, in addition to PhoTable. To evaluate Cruiser and PhoTable, this thesis follows recommended practices for systems evaluation. The design rationale is framed within the above scenario and vision which we explore further, and the resulting design is critically analysed based on user studies, heuristic evaluation and a reflection on how it evolved over time. The effectiveness of Cruiser was evaluated in terms of its ability to realise PhoTable, use of it by others to explore many new tabletop applications, and an analysis of performance and resource usage. 
Usability, learnability, and effectiveness of PhoTable were assessed on three levels: careful usability evaluations of elements of the interface; informal observations of usability when Cruiser was available to the public in several exhibitions and demonstrations; and a final evaluation of PhoTable in use for storytelling, where this had the side effect of creating a digital photo album, consisting of the photographs users interacted with on the table and the associated audio annotations which PhoTable automatically extracted from the interaction. We conclude that our approach to design has resulted in an effective framework for creating new tabletop interfaces. The parallel goal of exploring the potential for tabletop interaction as a new way to share digital photographs was realised in PhoTable. It is able to support the envisaged goal of an effective interface for telling stories about one's photos. As a serendipitous side-effect, PhoTable was effective in the automatic capture of the stories about individual photographs for future reminiscence and sharing. This work provides foundations for future work on new ways to interact at a tabletop and on ways to capture personal stories around digital photographs for sharing and long-term preservation.
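The plugin architecture attributed to Cruiser in this abstract, where the core framework dispatches input events to independently registered plugins, can be illustrated with a minimal sketch. Cruiser itself is described as native cross-platform code; the Python below, and every name in it, is an illustrative assumption about the general pattern, not Cruiser's real API.

```python
# A minimal sketch of a plugin-style event framework of the kind the
# abstract attributes to Cruiser: plugins register handlers for events,
# and the core dispatches touch input to them. Names are illustrative.

class Framework:
    def __init__(self):
        self.handlers = {}

    def register(self, event, handler):
        """A plugin registers interest in an event type."""
        self.handlers.setdefault(event, []).append(handler)

    def dispatch(self, event, *args):
        """The core forwards an input event to every registered plugin."""
        for h in self.handlers.get(event, []):
            h(*args)

fw = Framework()
log = []
fw.register("touch", lambda x, y: log.append((x, y)))  # a toy "plugin"
fw.dispatch("touch", 120, 45)
print(log)  # [(120, 45)]
```

Decoupling plugins from the core in this way is what lets a framework support easy experimentation and code reuse: new applications add handlers without modifying the dispatch machinery.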

    Future bathroom: A study of user-centred design principles affecting usability, safety and satisfaction in bathrooms for people living with disabilities

    Research and development work relating to assistive technology 2010-11 (Department of Health). Presented to Parliament pursuant to Section 22 of the Chronically Sick and Disabled Persons Act 1970.