
    Wearable and mobile devices

    Information and Communication Technologies (ICT) have undergone dramatic changes in the last 25 years. The 1980s was the decade of the Personal Computer (PC), which brought computing into the home and, in an educational setting, into the classroom. The 1990s gave us the World Wide Web (the Web), building on the infrastructure of the Internet, which has revolutionized the availability and delivery of information. In the midst of this information revolution, we are now confronted with a third wave of novel technologies (i.e., mobile and wearable computing), in which computing devices are already small enough to be carried at all times and, in addition, can interact with devices embedded in the environment. The development of wearable technology is perhaps a logical product of the convergence between the miniaturization of microchips (nanotechnology) and an increasing interest in pervasive computing, where mobility is the main objective. The miniaturization of computers is largely due to the decreasing size of semiconductors and switches; molecular manufacturing will allow for “not only molecular-scale switches but also nanoscale motors, pumps, pipes, machinery that could mimic skin” (Page, 2003, p. 2). This shift in the size of computers has obvious implications for human-computer interaction, introducing the next generation of interfaces. Neil Gershenfeld, the director of the Media Lab’s Physics and Media Group, argues, “The world is becoming the interface. Computers as distinguishable devices will disappear as the objects themselves become the means we use to interact with both the physical and the virtual worlds” (Page, 2003, p. 3). Ultimately, this will lead to a move away from desktop user interfaces and toward mobile interfaces and pervasive computing.

    Emerging technologies for learning (volume 2)


    Master of Science

    Computing and data acquisition have become an integral part of everyday life. From reading emails on a cell phone to kids playing with motion-sensing game consoles, we are surrounded by sensors and mobile devices. As the availability of powerful mobile computing devices expands, the road is paved for applications in previously limited environments. Rehabilitative devices are emerging that embrace these mobile advances. Research has explored the use of smartphones in rehabilitation as a means to process data and provide feedback in conjunction with established rehabilitative methods. Smartphones, combined with sensor-embedded insoles, provide a powerful tool for the clinician in gathering data and may act as a standalone training technique. This thesis presents continuing research on a sensor-integrated insole system that provides real-time feedback through a mobile platform, the Adaptive Real-Time Instrumentation System for Tread Imbalance Correction (ARTISTIC). The system interfaces a wireless instrumented insole with an Android smartphone application to receive gait data and provide sensory feedback to modify gait patterns. Revisions to the system hardware, software, and feedback modes brought about the introduction of the ARTISTIC 2.0. The number of sensors in the insole was increased from two to 10. The microprocessor and a vibrotactile motor were embedded in the insole, and the communications box was reduced in size and weight by more than 50%. Stance time measurements were validated against force plate equipment and found to be within 13.5 ± 3.3% error of force plate time measurements. Human subjects were tested using each of the feedback modes to alter gait symmetry. Results from the testing showed that more than one mode of feedback caused a statistically significant change in gait symmetry ratios (p < 0.05). Preference of feedback modes varied among subjects, with the majority agreeing that several feedback modes made a difference in their gait. Further improvements will prepare the ARTISTIC 2.0 for testing in a home environment for extended periods of time and improve data capture techniques, such as including a database in the smartphone application.
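    The abstract does not reproduce the ARTISTIC 2.0 algorithms, but the stance-time and symmetry computation it describes can be sketched roughly as below. This is a minimal illustration, not the thesis implementation: the pressure threshold, the function names, and the send_vibration_command() placeholder are all assumptions.

```python
# Hedged sketch of stance-time detection from insole pressure and a symmetry
# ratio used to trigger feedback. Not the ARTISTIC 2.0 source; names and the
# 0.2 threshold are illustrative assumptions.

def stance_times(pressure, timestamps, threshold=0.2):
    """Estimate stance durations (s) from one insole's summed pressure signal.

    A stance is assumed to span a run of samples where normalized pressure
    exceeds `threshold`; each above-threshold run is one stance phase.
    """
    stances, start = [], None
    for t, p in zip(timestamps, pressure):
        if p > threshold and start is None:
            start = t                      # assumed heel strike
        elif p <= threshold and start is not None:
            stances.append(t - start)      # assumed toe off ends the stance
            start = None
    return stances

def symmetry_ratio(left_stances, right_stances):
    """Ratio of mean stance times; 1.0 indicates symmetric gait."""
    left = sum(left_stances) / len(left_stances)
    right = sum(right_stances) / len(right_stances)
    return left / right

# Hypothetical usage: trigger vibrotactile feedback when asymmetry exceeds 10%.
# ratio = symmetry_ratio(stance_times(lp, lt), stance_times(rp, rt))
# if abs(ratio - 1.0) > 0.10:
#     send_vibration_command()  # placeholder for the insole's feedback channel
```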

    Master of Science

    Computing and data acquisition have become an integral part of everyday life. From reading emails on cell phones to kids playing with motion-sensing game consoles, we are surrounded by sensors and mobile computing devices. As the availability of powerful computing devices increases, applications in previously limited environments become possible. Training devices in rehabilitation are becoming increasingly common and more mobile. Community-based rehabilitative devices are emerging that embrace these mobile advances. To further the flexibility of devices used in rehabilitation, research has explored the use of smartphones as a means to process data and provide feedback to the user. In combination with sensor-embedded insoles, smartphones provide a powerful tool for the clinician in gathering data and as a standalone training tool in rehabilitation. This thesis presents continuing research on sensor-based insoles and feedback systems, extending the capabilities of the Adaptive Real-Time Instrumentation System for Tread Imbalance Correction (ARTISTIC) with the introduction of the ARTISTIC 2.0. To increase the capabilities of the ARTISTIC, an Inertial Measurement Unit (IMU) was added, which gave the system the ability to quantify the motion of the gait cycle and, more specifically, determine stride length. The number of sensors in the insole was increased from two to ten, and the microprocessor and a vibratory motor were placed in the insole. The transmission box weight was reduced by over 50 percent and the volume by over 60 percent. Stride length estimates were validated against a motion capture system, and the average stride length was found to be within 2.7 ± 6.9 percent of the motion capture measurements. To continue the improvement of the ARTISTIC 2.0, future work will include implementing real-time stride length feedback.
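    The stride-length algorithm itself is not given in the abstract. One common approach for a foot-mounted IMU, and a plausible hedged sketch of what such a computation might look like, is double integration of forward acceleration with zero-velocity resets at each stance, as below; the function name, inputs, and stationarity detection are assumptions, not the thesis method.

```python
# Hedged sketch: stride length from gravity-compensated forward acceleration
# using zero-velocity updates (ZUPT) during stance. Illustrative only.

def stride_lengths(accel_forward, dt, stationary):
    """Estimate stride lengths (m) from a foot-mounted IMU.

    accel_forward : sequence of forward acceleration samples (m/s^2)
    dt            : sample period in seconds
    stationary    : sequence of booleans, True while the foot is flat (stance)
    """
    velocity = 0.0
    distance = 0.0
    strides = []
    for a, still in zip(accel_forward, stationary):
        if still:
            # Zero-velocity update: finish the stride and reset drift at stance
            if distance > 0.0:
                strides.append(distance)
            velocity, distance = 0.0, 0.0
        else:
            velocity += a * dt          # integrate acceleration -> velocity
            distance += velocity * dt   # integrate velocity -> displacement
    return strides
```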

    Resonating Experiences of Self and Others enabled by a Tangible Somaesthetic Design

    Digitalization is penetrating every aspect of everyday life, including a human's heart beating, which can easily be sensed by wearable sensors and displayed for others to see, feel, and potentially "bodily resonate" with. Previous work studying human interactions and interaction designs with physiological data, such as a heart's pulse rate, has argued that feeding it back to users may, for example, support their mindfulness and self-awareness during various everyday activities and ultimately support their wellbeing. Inspired by Somaesthetics as a discipline, which focuses on an appreciation of the living body's role in all our experiences, we designed and explored mobile tangible heartbeat displays, which enable rich forms of bodily experiencing oneself and others in social proximity. In this paper, we first report on the design process of the tangible heart displays and then present the results of a field study with 30 pairs of participants. Participants were asked to use the tangible heart displays while watching movies together and to report their experience in three different heart display conditions (i.e., displaying their own heart beat, their partner's heart beat, and watching a movie without a heart display). We found, for example, that participants reported significant effects on experienced sensory immersion when they felt their own heart beats compared to the condition without any heart beat display, and that feeling their partner's heart beats resulted in significant effects on social experience. We refer to resonance theory to discuss the results, highlighting the potential of how ubiquitous technology could utilize physiological data to provide resonance in a modern society facing social acceleration.
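    The paper does not describe the display's firmware; purely as a hedged illustration of the basic loop such a tangible heartbeat display might run, the sketch below pulses a haptic actuator once per sensed beat. read_ppg_bpm() and pulse_actuator() are hypothetical placeholders, not the authors' hardware interface.

```python
# Hedged sketch: mirror a sensed heart rate as tactile pulses on a tangible
# display. Sensor and actuator functions are placeholders, not a real API.
import time

def read_ppg_bpm():
    """Placeholder for a wearable heart-rate (PPG) reading in beats per minute."""
    return 72.0  # assumed constant value for illustration

def pulse_actuator():
    """Placeholder for driving one haptic pulse on the tangible heart display."""
    pass

def mirror_heartbeat(run_seconds=10):
    """Emit one tactile pulse per heartbeat of the wearer (or their partner)."""
    end = time.time() + run_seconds
    while time.time() < end:
        bpm = read_ppg_bpm()
        pulse_actuator()
        time.sleep(60.0 / bpm)  # wait one beat interval before the next pulse
```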

    Ambient Intelligence for Next-Generation AR

    Next-generation augmented reality (AR) promises a high degree of context-awareness: a detailed knowledge of the environmental, user, social, and system conditions in which an AR experience takes place. This will facilitate both the closer integration of the real and virtual worlds and the provision of context-specific content or adaptations. However, environmental awareness in particular is challenging to achieve using AR devices alone; not only is these mobile devices' view of an environment spatially and temporally limited, but the data obtained by onboard sensors is frequently inaccurate and incomplete. This, combined with the fact that many aspects of core AR functionality and user experiences are impacted by properties of the real environment, motivates the use of ambient IoT devices, wireless sensors and actuators placed in the surrounding environment, for the measurement and optimization of environment properties. In this book chapter we categorize and examine the wide variety of ways in which these IoT sensors and actuators can support or enhance AR experiences, including quantitative insights and proof-of-concept systems that will inform the development of future solutions. We outline the challenges and opportunities associated with several important research directions which must be addressed to realize the full potential of next-generation AR.
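    The chapter surveys many such sensor and actuator roles; as one minimal, hedged illustration of the general idea, the sketch below fuses an (assumed noisy) onboard illuminance estimate with readings from ambient IoT light sensors and maps the result to a brightness level for rendered content. The function names, weighting, and lux thresholds are assumptions, not an API from the chapter.

```python
# Hedged sketch: combine ambient IoT light-sensor readings with an AR device's
# own illuminance estimate to adapt virtual-content brightness. Illustrative only.

def fused_illuminance(ar_estimate_lux, iot_readings_lux, iot_weight=0.7):
    """Weighted fusion of the AR device's estimate with ambient sensor readings."""
    if not iot_readings_lux:
        return ar_estimate_lux
    iot_mean = sum(iot_readings_lux) / len(iot_readings_lux)
    return iot_weight * iot_mean + (1.0 - iot_weight) * ar_estimate_lux

def content_brightness(lux, low=50.0, high=500.0):
    """Map scene illuminance (lux) to a 0-1 brightness level for rendered content."""
    return min(1.0, max(0.0, (lux - low) / (high - low)))

# Hypothetical usage:
# level = content_brightness(fused_illuminance(120.0, [310.0, 295.0]))
```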

    Avebury Portal – A Location-Based Augmented Reality Treasure Hunt for Archaeological Sites

    Many archaeological sites attract fewer visits from younger audiences and are overall less popular than the majority of other heritage sites. They are often not enhanced by supporting media in the way museums or historic buildings are. Many augmented reality (AR) systems have been developed for archaeological sites and have been shown to benefit user engagement. However, most result in superimposing a virtual reconstruction of the site for users to passively observe, and they lack exploration of other methods for designing an interactive, engaging experience. In this paper, we present the development of a location-based treasure hunt AR app, Avebury Portal, for the heritage site of Avebury in England. Avebury Portal uses puzzles that draw on the environment to give clues, and a narrative that responds to the user’s location. We developed Avebury Portal with the Unity Engine and Vuforia to demonstrate the effectiveness of using AR to enhance visitors’ learning experiences.
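    Avebury Portal itself is built with Unity and Vuforia; purely as a hedged illustration of the kind of location-based trigger a position-aware narrative or treasure-hunt clue relies on, the sketch below unlocks a clue when the visitor comes within an assumed radius of a point of interest. The coordinates, the 25 m radius, and the function names are illustrative assumptions, not the app's code.

```python
# Hedged sketch: GPS proximity check for unlocking a location-based clue.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def puzzle_unlocked(user_lat, user_lon, site_lat, site_lon, radius_m=25.0):
    """Unlock a clue when the visitor is within `radius_m` of a site feature."""
    return haversine_m(user_lat, user_lon, site_lat, site_lon) <= radius_m
```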