Towards a crowdsourced solution for the authoring bottleneck in interactive narratives
Interactive Storytelling research has produced a wealth of technologies that can be
employed to create personalised narrative experiences, in which the audience takes
a participating rather than an observing role. So far, however, this technology has
not led to the production of large-scale playable interactive story experiences that
realise the ambitions of the field. One main reason for this state of affairs is the
difficulty of authoring interactive stories, a task that requires describing a huge
number of story building blocks in a machine-friendly fashion. This is not only
technically and conceptually more challenging than traditional narrative authoring
but also a scalability problem.
This thesis examines the authoring bottleneck through a case study and a literature
survey and advocates a solution based on crowdsourcing. Prior work has already
shown that combining a large number of example stories collected from crowd workers
with a system that merges these contributions into a single interactive story can be
an effective way to reduce the authorial burden. As a refinement of such an approach,
this thesis introduces the novel concept of Crowd Task Adaptation. It argues that,
in order to maximise the usefulness of the collected stories, a system should
dynamically and intelligently analyse the corpus of collected stories and, based on
this analysis, modify the tasks handed out to crowd workers.
Two authoring systems, ENIGMA and CROSCAT, which demonstrate two radically different
approaches to applying the Crowd Task Adaptation paradigm, have been implemented and
are described in this thesis. While ENIGMA adapts tasks through a realtime dialogue
between crowd workers and the system that is based on what has been learned from
previously collected stories, CROSCAT modifies the backstory given to crowd workers
in order to optimise the distribution of branching points in the tree structure that
combines all collected stories. Two experimental studies of crowdsourced authoring
are also presented. They lead to guidelines on how to employ crowdsourced authoring
effectively, but more importantly the results of one of the studies demonstrate the
effectiveness of the Crowd Task Adaptation approach.
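A minimal sketch of how the distribution of branching points in such a story tree might be measured; the tree encoding and function name are illustrative assumptions, not CROSCAT's actual implementation:

```python
from collections import defaultdict

def branching_profile(children):
    """Count branching points (nodes with more than one child) at each
    depth of a story tree given as {node: [child, ...]}."""
    profile = defaultdict(int)
    stack = [("root", 0)]
    while stack:
        node, depth = stack.pop()
        kids = children.get(node, [])
        if len(kids) > 1:
            profile[depth] += 1
        for kid in kids:
            stack.append((kid, depth + 1))
    return dict(profile)

# Toy tree: the root branches into two stories; one branch splits again deeper.
tree = {
    "root": ["a", "b"],
    "a": ["a1"],
    "a1": ["a2", "a3"],
    "b": ["b1"],
}
print(branching_profile(tree))  # {0: 1, 2: 1}
```

A system optimising for an even spread of branching points could compare such profiles before deciding which backstory to hand out next.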
Manipulability Optimization of a Rehabilitative Collaborative Robotic System
The use of collaborative robots (or cobots) in rehabilitation therapies is aimed at assisting and shortening the patient's recovery after neurological injuries. Cobots are inherently safe when interacting with humans and can be programmed in different working modalities based on the patient's needs and the level of the injury. This study presents a design optimization of a robotic system for upper limb rehabilitation based on the manipulability ellipsoid method. The human-robot system is modeled as a closed kinematic chain in which the human hand grasps a handle attached to the robot's end effector. The manipulability ellipsoids are determined for both the human and the robotic arm and compared by calculating an index that quantifies the alignment of the principal axes. The optimal position of the robot base with respect to the patient is identified by a first global optimization and by a further local refinement, seeking the best alignment of the manipulability ellipsoids in a series of points uniformly distributed within the shared workspace.
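The ellipsoid comparison can be illustrated with a small sketch: the manipulability ellipsoid's principal axes are the eigenvectors of J·Jᵀ, and one possible alignment index averages the absolute cosines between corresponding human and robot axes. The Jacobian values and the index formula below are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

def principal_axes(J):
    """Principal axes of the manipulability ellipsoid for Jacobian J:
    eigenvectors of J @ J.T, ordered by decreasing eigenvalue."""
    w, V = np.linalg.eigh(J @ J.T)
    order = np.argsort(w)[::-1]
    return V[:, order]

def alignment_index(J_human, J_robot):
    """Mean absolute cosine between corresponding principal axes
    (1.0 = perfectly aligned ellipsoids)."""
    A = principal_axes(J_human)
    B = principal_axes(J_robot)
    return float(np.mean(np.abs(np.sum(A * B, axis=0))))

# Toy 2-DOF planar Jacobians (illustrative values only).
J_h = np.array([[1.0, 0.2], [0.1, 0.8]])
J_r = np.array([[0.9, 0.1], [0.2, 1.1]])
print(alignment_index(J_h, J_r))
```

Taking absolute cosines sidesteps the sign ambiguity of eigenvectors; an optimiser would evaluate this index at sampled workspace points for each candidate robot-base position.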
Managing delays for realtime error correction and compensation of an industrial robot in an open network
The calibration of articulated arms presents a substantial challenge within the manufacturing domain, necessitating sophisticated calibration systems often reliant on the integration of costly metrology equipment for ensuring high precision. However, the logistical complexities and financial burden associated with deploying these devices across diverse systems hinder their widespread adoption. In response, Industry 4.0 emerges as a transformative paradigm by enabling the integration of manufacturing devices into networked environments, thereby providing access through cloud-based infrastructure. Nonetheless, this transition introduces a significant concern in the form of network-induced delays, which can significantly impact realtime calibration procedures. To address this challenge, the present study introduces a framework that manages and mitigates network-induced delays. This framework leverages two key components: a controller and optimiser, specifically a Model Predictive Controller (MPC) in conjunction with an Extended Kalman Filter (EKF), and a predictor, characterised as the Dead Reckoning Model (DRM). Collectively, these methodologies are integrated to ameliorate the temporal delays experienced during the calibration process. Expanding upon antecedent investigations, the study implements an advanced realtime error correction system across networked environments, with particular emphasis on the management of delays originating from network traffic dynamics. The aim of this research extension is twofold: first, to enhance realtime system performance on open networks, and second, to achieve error correction precision of 0.02 mm. The proposed methodologies are anticipated to surmount the challenges associated with network-induced delays.
Subsequently, this endeavour serves to catalyse accurate and efficient calibration procedures in the context of realtime manufacturing scenarios. This research significantly advances the landscape of error correction systems and lays a robust groundwork for the optimised utilisation of networked manufacturing devices within the dynamic realm of Industry 4.0 applications.
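As a rough illustration of the predictor idea, a constant-velocity dead-reckoning step extrapolates the last reported state across the measured network delay. The function name, units, and values here are hypothetical; the paper's DRM, combined with the MPC and EKF, is considerably more elaborate:

```python
def dead_reckon(position, velocity, delay):
    """Constant-velocity dead reckoning: extrapolate the last reported
    position (mm) across a measured network delay (seconds)."""
    return [p + v * delay for p, v in zip(position, velocity)]

# The last pose report arrived 40 ms late; predict the current position.
pos = [100.0, 250.0, 75.0]   # mm
vel = [5.0, -2.5, 0.0]       # mm/s
print(dead_reckon(pos, vel, 0.040))
```

The predicted pose can then be fed to the controller in place of the stale measurement, so the correction loop acts on an estimate of the present rather than the delayed past.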
Supporting virtuosity and flow in computer music
As we begin to realise the sonic and expressive potential of the computer, HCI researchers face the challenge of designing rewarding and accessible user experiences that enable individuals to explore complex creative domains such as music.
In performance-based music systems such as sequencers, a disjunction exists between the musician’s specialist skill with performance hardware and the generic usability techniques applied in the design of the software. The creative process is not only fragmented across multiple physical (and virtual) devices, but divided across creativity and productivity phases separated by the act of recording.
Integrating psychologies of expertise and intrinsic motivation, this thesis proposes a design shift from usability to virtuosity, using theories of “flow” (Csikszentmihalyi, 1996) and feedback “liveness” (Tanimoto, 1990) to identify factors that facilitate learning and creativity in digital notations and interfaces, leading to a set of design heuristics to support virtuosity in notation use. Using the cognitive dimensions of notations framework (Green, 1996), models of the creative user experience are developed, working towards a theoretical framework for HCI in music systems, and specifically computer-aided composition.
Extensive analytical methods are used to look at corollaries of virtuosity and flow in real-world computer music interaction, notably in soundtracking, a software-based composing environment offering a rapid edit-audition feedback cycle, enabled by the user’s skill in manipulating the text-based notation (and program) through the computer keyboard. The interaction and development of more than 1,000 sequencer and tracker users was recorded over a period of 2 years, to investigate the nature and development of skill and technique, look for evidence of flow experiences, and establish the use and role of both visual and musical feedback in music software. Quantitative analyses of interaction data are supplemented with a detailed video study of a professional tracker composer, and a user survey that draws on psychometric methods to evaluate flow experiences in the use of digital music notations, such as sequencers and trackers.
Empirical findings broadly support the proposed design heuristics, and enable the development of further models of liveness and flow in notation use. Implications for UI design are discussed in the context of existing music systems, and supporting digitally-mediated creativity in other domains based on notation use.
SORA Methodology for Multi-UAS Airframe Inspections in an Airport
Deploying Unmanned Aircraft Systems (UAS) in safety- and business-critical operations
requires demonstrating compliance with applicable regulations and a comprehensive understanding
of the residual risk associated with the UAS operation. To support these activities and enable the
safe deployment of UAS into civil airspace, the European Union Aviation Safety Agency (EASA) has
established a UAS regulatory framework that mandates the execution of safety risk assessment for
UAS operations in order to gain authorization to carry out certain types of operations. Driven by
this framework, the Joint Authorities for Rulemaking on Unmanned Systems (JARUS) released the
Specific Operation Risk Assessment (SORA) methodology that guides the systematic risk assessment
for UAS operations. However, existing work on SORA and its applications focuses mainly on single
UAS operations, offering limited support for assuring operations conducted with multiple UAS and
with autonomous features. Therefore, the work presented in this paper analyzes the application of
SORA for a Multi-UAS airframe inspection (AFI) operation, which involves deploying multiple UAS
with autonomous features inside an airport. We present the decision-making process of each SORA
step and its application to a multiple-UAS scenario. The results show that the procedures and safety
features included in the Multi-AFI operation, such as workspace segmentation, the independent
multi-UAS AFI crew proposed, and the mitigation actions, provide confidence that the operation can
be conducted safely and can receive a positive evaluation from the competent authorities. We also
present our key findings from the application of SORA and discuss how it can be extended to better
support multi-UAS operations.
Serious game for integration in higher education
Integrated Master's thesis. Informatics and Computing Engineering. Universidade do Porto. Faculdade de Engenharia. 201
Life Cycle Engineering 4.0: A Proposal to Conceive Manufacturing Systems for Industry 4.0 Centred on the Human Factor (DfHFinI4.0)
Engineering 4.0 environments are characterised by the digitisation, virtualisation, and connectivity of products, processes, and facilities composed of reconfigurable and adaptive socio-technical cyber-physical manufacturing systems (SCMS), in which Operator 4.0 works in real time in VUCA (volatile, uncertain, complex and ambiguous) contexts and markets. This situation gives rise to the interest in developing a framework for the conception of SCMS that allows the integration of the human factor, management, training, and development of the competencies of Operator 4.0 as fundamental aspects of the aforementioned system. The present paper is focused on answering how to conceive the adaptive manufacturing systems of Industry 4.0 through the operation, growth, and development of human talent in VUCA contexts. With this objective, exploratory research is carried out, whose contribution is specified in a framework called Design for the Human Factor in Industry 4.0 (DfHFinI4.0). From among the conceptual frameworks employed therein, the connectivist paradigm, Ashby's law of requisite variety, and Vygotsky's activity theory are taken into consideration, in order to enable the affective-cognitive and timeless integration of the human factor within the SCMS. DfHFinI4.0 can be integrated into the life cycle engineering of the enterprise reference architectures, thereby obtaining manufacturing systems for Industry 4.0 focused on the human factor. The suggested framework is illustrated as a case study for the Purdue Enterprise Reference Architecture (PERA) methodology, which transforms it into PERA 4.0.
Software for the collaborative editing of the Greek New Testament
This project was responsible for developing the Virtual Manuscript Room Collaborative Research Environment (VMR CRE), which offers a facility for the critical editing workflow from raw data collection, through processing, to publication, within an open and online collaborative framework for the Institut für Neutestamentliche Textforschung (INTF) and their global partners while editing the Editio Critica Maior (ECM), the paramount critical edition of the Greek New Testament, which analyses over 5,600 Greek witnesses and includes a comprehensive apparatus of chosen manuscripts, weighted by quotations and early translations. Additionally, this project produced the first digital edition of the ECM. This case study, transitioning the workflow at the INTF to an online collaborative research environment, seeks to convey successful methods and lessons learned by describing a professional software engineer's foray into the world of academic digital humanities. It compares development roles and practices in the software industry with the academic environment, offers insights into how this software engineer found a software team therein, suggests how a fledgling online community can successfully achieve critical mass, provides an outsider's perspective on what a digital critical scholarly edition might be, and hopes to offer useful software, datasets, and a thriving online community for manuscript researchers.