    Design and semantics of form and movement (DeSForM 2006)

    Design and Semantics of Form and Movement (DeSForM) grew from applied research exploring emerging design methods and practices to support new-generation product and interface design. The products and interfaces are concerned with the context of ubiquitous computing and ambient technologies, and with the need for greater empathy in the pre-programmed behaviour of the ‘machines’ that populate our lives. Such explorative research in the CfDR has been led by Young, supported by Kyffin, Visiting Professor from Philips Design, and sponsored by Philips Design over a period of four years (research funding £87k). DeSForM1 was the first of a series of three conferences that enabled the presentation and debate of international work within this field:
    • 1st European Conference on Design and Semantics of Form and Movement (DeSForM1), Baltic, Gateshead, 2005, Feijs L., Kyffin S. & Young R.A. eds.
    • 2nd European Conference on Design and Semantics of Form and Movement (DeSForM2), Evoluon, Eindhoven, 2006, Feijs L., Kyffin S. & Young R.A. eds.
    • 3rd European Conference on Design and Semantics of Form and Movement (DeSForM3), New Design School Building, Newcastle, 2007, Feijs L., Kyffin S. & Young R.A. eds.
    Philips' sponsorship of practice-based enquiry led to research by three teams of research students over three years, and to ongoing sponsorship of research through the Northumbria University Design and Innovation Laboratory (nuDIL). Young has been invited onto the steering panel of the UK Thinking Digital Conference, concerning the latest developments in digital and media technologies. Informed by this research is the work of PhD student Yukie Nakano, who examines new technologies in relation to eco-design textiles.

    Online Shaming

    Online shaming is a subject of import for social philosophy in the Internet age, and not simply because shaming seems generally bad. I argue that social philosophers are well placed to address the imaginal relationships we entertain when we engage in social media; activity in cyberspace results in more relationships than one previously had, entailing new and more responsibilities, and our relational behaviors admit of ethical assessment. I consider the stresses of social media, including the indefinite expansion of our relationships and responsibilities, and the gap between the experiences of those shamed and the shamers’ appreciation of the magnitude of what they do when they shame; I connect these to the literature suggesting that some intuitions fail to guide our ethics. I conclude that we each have more power than we believe we do, or than we think carefully about exerting, in our online imaginal relations. Whether we are the shamers or the shamed, we are unable to control the extent to which intangible words in cyberspace take the form of imaginal relationships that burden or brighten our self-perceptions.

    ShapeClip: towards rapid prototyping with shape-changing displays for designers

    This paper presents ShapeClip: a modular tool capable of transforming any computer screen into a z-actuating shape-changing display. This enables designers to produce dynamic physical forms by "clipping" actuators onto screens. ShapeClip displays are portable, scalable, fault-tolerant, and support runtime re-arrangement. Users are not required to have knowledge of electronics or programming, and can develop motion designs with presentation software, image editors, or web technologies. To evaluate ShapeClip we carried out a full-day workshop with expert designers. Participants were asked to generate shape-changing designs and then construct them using ShapeClip. ShapeClip enabled participants to rapidly and successfully transform their ideas into functional systems.
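The idea of driving physical actuation from ordinary screen content can be sketched in a few lines. The mapping below is a hypothetical illustration, assuming each clipped-on module samples the grayscale brightness of the pixels beneath it and converts that to a target height; the travel range and sampling scheme are assumptions, not the published hardware specification.

```python
# Hypothetical sketch: mapping screen pixel brightness under a clipped-on
# module to a z-actuator height, in the spirit of ShapeClip.

MAX_TRAVEL_MM = 20.0  # assumed full actuation range of one module


def brightness_to_height(pixel_value: int) -> float:
    """Map an 8-bit grayscale pixel (0-255) to a target height in mm."""
    if not 0 <= pixel_value <= 255:
        raise ValueError("expected an 8-bit grayscale value")
    return (pixel_value / 255.0) * MAX_TRAVEL_MM


# A designer animates a gradient in presentation software; each frame's
# pixel values become per-module height targets.
frame = [0, 64, 128, 192, 255]  # one row of pixels under five modules
heights = [brightness_to_height(p) for p in frame]
```

In this model, any tool that can draw moving grayscale shapes (slides, image editors, web pages) becomes a motion-design tool, which matches the abstract's claim that no electronics or programming knowledge is needed.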

    An Intuitive Control API for Catroid

    In this research, the main objective is to develop an intuitive control API in Catroid to enhance its usability as a graphical programming tool for children, and to study the human-mobile interaction and experience made possible with this control API. A further objective is to develop this control API using an open-source development method and benchmark it against a typical software development method. It would greatly enrich the user experience if Catroid could provide support for implementing intuitive control concepts to enhance its usability for children, but currently Catroid has no control API support for developing intuitive user interaction with an application. In brief, an intuitive control API is missing in Catroid; without such an API, the potential of Catroid as a programming tool cannot be unleashed. This research studies how to maximize the programming power of Catroid and advance its control API into a more intuitive form, and examines the Open Source Development Model used to develop the control API. The scope of the prototype covers only locating direction, tilting, turning, and shaking motions as the new intuitive controls made possible in Catroid. The research methodology is the Open Source Development Methodology (OSDM), and Test-Driven Development with Extreme Programming is used for code development. The objective of OSDM is to utilize the online community, who are the users and developers of Catroid, to review and test source code and thereby improve software quality. The intuitive control API, into which phone sensors are integrated, further improves the user interaction and experience both in using Catroid and its applications. The intuitive control API consists of sensor variables and an If-Then-Else Command Block: the If-Then-Else Command Block acts as the control, and the sensor variables make the control intuitive.
    The accelerometer and orientation sensor are implemented in this control API, with each sensor contributing three values that act as sensor variables: X-Sensor Acceleration, Y-Sensor Acceleration, Z-Sensor Acceleration, Azimuth, Pitch, and Roll. These sensor variables can be assigned to or removed from any text field in the Command Blocks using the Formula Editor. Usage of the intuitive control API is simple and straightforward: when a sensor variable is assigned to one of the fields in an If-Then-Else Command Block, an intuitive control is created. The Command Blocks between the If-Statement Command Block and the End-of-If Command Block are executed whenever the logic condition in the If-Statement is true. Various intuitive user interactions can be developed depending on the creativity of users; the most popular are locating direction, tilting, turning, and shaking motions. The Open Source Development Method allows developers to refine the user requirements alongside the software development, which reduces the risk of software failure at the end of development.
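The If-Then-Else control flow described above can be modelled in a short sketch. This is a hypothetical illustration, not Catroid's actual API: the function names, the shake threshold, and the command representation are all assumptions; only the idea of a sensor variable gating two lists of command blocks comes from the abstract.

```python
# Hypothetical model of an If-Then-Else Command Block gated by sensor
# variables (X/Y/Z acceleration), as described in the abstract. Names and
# the threshold are illustrative, not Catroid's real API.
import math

SHAKE_THRESHOLD = 15.0  # assumed acceleration magnitude (m/s^2) for a shake


def is_shaking(x_accel, y_accel, z_accel):
    """True when the total acceleration magnitude exceeds the threshold."""
    return math.sqrt(x_accel**2 + y_accel**2 + z_accel**2) > SHAKE_THRESHOLD


def run_if_block(sensor_reading, then_commands, else_commands):
    """Model an If-Then-Else Command Block: run the 'then' commands while
    the sensor condition holds, otherwise the 'else' commands."""
    x, y, z = sensor_reading
    branch = then_commands if is_shaking(x, y, z) else else_commands
    return [cmd() for cmd in branch]


# A vigorous shake (magnitude ~15.5 m/s^2) triggers the 'then' branch:
result = run_if_block((12.0, 9.0, 4.0),
                      then_commands=[lambda: "play sound"],
                      else_commands=[lambda: "wait"])
```

A phone at rest reads roughly gravity alone (about 9.8 m/s² on one axis), which stays below the assumed threshold, so the else branch runs; only deliberate motion flips the condition.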

    Game Implementation in Real-Time using the Project Tango

    The goal of this senior project is to spread awareness of augmented reality, which Google defines as “a technology that superimposes a computer-generated image on a user's view of the real world, thus providing a composite view.” It is a topic rarely known to those outside a technology-related field or without a vested interest in technology. Games can be ideal tools to help educate the public on any subject matter, so the task is to create an augmented reality game using a “learn by doing” method. The game will introduce players to augmented reality, and thus demonstrate how this technology can be combined with the world around them. The Tango, Unity, and Vuforia are the tools to be used for development. The game itself is a coin-collecting game that changes dynamically with the world around the player.
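The core mechanic of such a coin-collecting AR game can be sketched independently of the engine: the Tango tracks the device's pose in world space, and a coin is collected when the player moves within a pickup radius of it. This is an illustrative sketch under assumed names and distances, not the project's actual Unity code.

```python
# Illustrative coin-collection check: the device pose (from motion
# tracking) is compared against world-anchored coin positions. The pickup
# radius and data shapes are assumptions for this sketch.
import math

PICKUP_RADIUS_M = 0.5  # assumed collection distance in metres


def collect_coins(player_pos, coins):
    """Return the coins still uncollected after a proximity check."""
    px, py, pz = player_pos
    remaining = []
    for cx, cy, cz in coins:
        dist = math.sqrt((px - cx)**2 + (py - cy)**2 + (pz - cz)**2)
        if dist > PICKUP_RADIUS_M:
            remaining.append((cx, cy, cz))
    return remaining


coins = [(0.2, 0.0, 0.3), (3.0, 0.0, 4.0)]
# The first coin is ~0.36 m away (inside the radius); the second is 5 m away.
left = collect_coins((0.0, 0.0, 0.0), coins)
```

Running this check every frame against the tracked pose is what makes the game "change dynamically with the world around the player": the coins stay fixed in physical space while the player walks to them.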

    Mobile 2D and 3D Spatial Query Techniques for the Geospatial Web

    The increasing availability of abundant geographically referenced information in the Geospatial Web provides a variety of opportunities for developing value-added LBS applications. However, the large data volumes of the Geospatial Web and small mobile device displays impose a data visualization problem, as the amount of searchable information overwhelms the display when too many query results are returned. Excessive returned results clutter the mobile display, making it harder for users to prioritize information and causing confusion and usability problems. Mobile Spatial Interaction (MSI) research into this “information overload” problem is ongoing, where map personalization and other semantics-based filtering mechanisms are essential to de-clutter and adapt the exploration of the real world to the processing/display limitations of mobile devices. In this thesis, we propose that another way to filter this information is to intelligently refine the search space. 3DQ (3-Dimensional Query) is our novel MSI prototype for information discovery on today’s location- and orientation-aware smartphones within 3D Geospatial Web environments. Our application incorporates human interactions (interpreted from embedded sensors) in the geospatial query process by determining the shape of the user's actual visibility space as a query “window” in a spatial database, e.g. an Isovist in 2D and a Threat Dome in 3D. This effectively applies hidden query removal (HQR) functionality in 360° 3D that takes into account both the horizontal and vertical dimensions when calculating the 3D search space, significantly reducing display clutter and information overload on mobile devices. The effect is a more accurate and expected search result for mobile LBS applications, returning information on only those objects visible within a user’s 3D field-of-view.
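The search-space refinement idea can be illustrated with a minimal 2D stand-in: keep only objects that fall inside the user's view sector, derived from the device's position and compass heading. This is a hedged sketch, not the 3DQ implementation; a full Isovist or Threat Dome would additionally test occlusion by buildings, and the angles and ranges below are assumptions.

```python
# Minimal 2D field-of-view filter as a stand-in for visibility-shaped
# query windows. A real Isovist would also clip against obstructions.
import math


def in_fov(user_pos, heading_deg, fov_deg, max_range, obj_pos):
    """True if obj_pos lies within the view sector centred on heading_deg."""
    dx = obj_pos[0] - user_pos[0]
    dy = obj_pos[1] - user_pos[1]
    dist = math.hypot(dx, dy)
    if dist > max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    diff = (bearing - heading_deg + 180) % 360 - 180  # signed angle difference
    return abs(diff) <= fov_deg / 2


# User at the origin facing east (heading 0), 90-degree view, 100 m range:
visible = in_fov((0, 0), 0, 90, 100, (10, 5))   # bearing ~26.6 deg
hidden = in_fov((0, 0), 0, 90, 100, (-10, 5))   # behind the user
```

Filtering query results through a predicate like this, before they reach the display, is what reduces the clutter described above: objects outside the sector are never returned, rather than returned and then hidden.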

    AirConstellations: In-Air Device Formations for Cross-Device Interaction via Multiple Spatially-Aware Armatures

    AirConstellations supports a unique semi-fixed style of cross-device interactions via multiple self-spatially-aware armatures to which users can easily attach (or detach) tablets and other devices. In particular, AirConstellations affords highly flexible and dynamic device formations where the users can bring multiple devices together in-air - with 2-5 armatures poseable in 7DoF within the same workspace - to suit the demands of their current task, social situation, app scenario, or mobility needs. This affords an interaction metaphor where relative orientation, proximity, attaching (or detaching) devices, and continuous movement into and out of ad-hoc ensembles can drive context-sensitive interactions. Yet all devices remain self-stable in useful configurations even when released in mid-air. We explore flexible physical arrangement, feedforward of transition options, and layering of devices in-air across a variety of multi-device app scenarios. These include video conferencing with flexible arrangement of the person-space of multiple remote participants around a shared task-space, layered and tiled device formations with overview+detail and shared-to-personal transitions, and flexible composition of UI panels and tool palettes across devices for productivity applications. A preliminary interview study highlights user reactions to AirConstellations, such as for minimally disruptive device formations, easier physical transitions, and balancing "seeing and being seen" in remote work.
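The interaction metaphor above, where relative orientation and proximity of devices drive context-sensitive behaviour, can be sketched as a simple classifier. This is a hypothetical illustration: the distance and angle thresholds and the mode names are assumptions, not values from the paper, which only states that such spatial relationships select among formations like layered and tiled arrangements.

```python
# Hypothetical classifier: relative distance and angle between two
# armature-mounted devices select a context-sensitive formation mode.
# Thresholds and mode names are illustrative assumptions.

def formation_mode(dist_m, angle_between_deg):
    """Classify a two-device formation by separation and relative angle."""
    if dist_m < 0.05 and angle_between_deg < 10:
        return "tiled"      # near-coplanar and touching: one large canvas
    if dist_m < 0.05:
        return "layered"    # stacked at an angle: overview + detail
    if dist_m < 0.6:
        return "ensemble"   # close by: shared task-space
    return "personal"       # far apart: independent devices


mode = formation_mode(0.03, 5)  # two adjacent, aligned tablets
```

Because the armatures report their own pose, a function like this could run continuously, so sliding a device in or out of an ensemble transitions its UI without any explicit mode switch, matching the "continuous movement into and out of ad-hoc ensembles" described above.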