Unsupervised three-dimensional reconstruction of small rocks from a single two-dimensional image
Surfaces covered with pebbles and small rocks are common in both natural and human-shaped environments.
Generating an accurate three-dimensional model of such surfaces from a reference image can be challenging,
especially if one wants to be able to animate each pebble individually. Undertaking this kind of work manually
is time consuming and impractical for dynamic terrain animations.
The method described in this paper allows unsupervised, automatic generation of three-dimensional textured rocks
from a two-dimensional image, aiming to match the original image as closely as possible.
MAVIS: Mobile Acquisition and VISualization - a professional tool for video recording on a mobile platform
Professional video recording is a complex process which often requires expensive cameras and large amounts of ancillary equipment. With the advancement of mobile technologies, cameras on mobile devices have improved to the point where the quality of their output is sometimes comparable to that obtained from a professional video camera, and they are often used in professional productions. However, tools that allow professional users to access the information they need to control the technical quality of their filming, and to make an informed decision about what they are recording, are missing on mobile platforms. In this paper we present MAVIS (Mobile Acquisition and VISualization), a tool for professional filming on a mobile platform. MAVIS allows users to access information such as a colour vectorscope, waveform monitor, false colouring, focus peaking and all other information needed to produce high-quality professional videos. This is achieved by exploiting the capabilities of modern mobile GPUs through the use of a number of vertex and fragment shaders. Evaluation with professionals in the film industry shows that the app and its functionalities are well received and that the output and usability of the application align with professional standards.
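The waveform monitor mentioned in the abstract is, at its core, a per-column histogram of pixel luma. As an illustration only (the paper implements this in GPU shaders, not in Python, and the function name and Rec. 709 weights here are our assumptions), the computation could be sketched as:

```python
import numpy as np

def luma_waveform(rgb, bins=256):
    """Per-column luma histogram, the core of a waveform monitor.

    rgb: H x W x 3 float array with values in [0, 1].
    Returns a (bins, W) array: entry [b, x] counts how many pixels in
    image column x have a luma value falling into bin b.
    """
    # Rec. 709 luma weights (an assumption; other standards differ)
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])  # shape H x W
    idx = np.clip((luma * (bins - 1)).astype(int), 0, bins - 1)
    wf = np.zeros((bins, rgb.shape[1]), dtype=int)
    for x in range(rgb.shape[1]):
        wf[:, x] = np.bincount(idx[:, x], minlength=bins)
    return wf
```

On a GPU the same per-column reduction would be done in a fragment shader pass; the serial loop above is only for clarity.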
Drift-diffusion based real-time dynamic terrain deformation
In the natural world, terrains are dynamic entities that change their morphology through their interaction with other agents in the environment. However, in real-time applications terrains are often represented as static meshes, which offer no interaction capabilities. This paper presents a novel real-time 2D method for dynamic terrain simulation, aimed at applications in the entertainment industry. The method is based on a Dynamically-Displaced Height-map and on numerical solutions, obtained using an Euler method, of a modified drift-diffusion equation. The method allows objects to interact with the terrain and to deform it in real time; it is easy to implement and generates different kinds of realistic tracks depending on the soil composition.
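To make the Euler update concrete, here is a minimal sketch of one explicit-Euler drift-diffusion step on a height-map. This is not the paper's modified equation, whose exact form the abstract does not give; the standard form used here, the periodic boundaries, and the parameter names are all our assumptions.

```python
import numpy as np

def terrain_step(h, D=0.1, drift=(0.0, 0.0), dt=0.1):
    """One explicit-Euler step of dh/dt = D * lap(h) - v . grad(h).

    h: 2D array of terrain heights on a unit grid.
    D: diffusion coefficient (explicit Euler is stable for dt * D <= 0.25).
    drift: (vy, vx) drift velocity carrying displaced soil sideways.
    np.roll gives periodic boundaries, an assumption for simplicity.
    """
    # 5-point Laplacian: the diffusion term smooths the deformation
    lap = (np.roll(h, 1, 0) + np.roll(h, -1, 0)
           + np.roll(h, 1, 1) + np.roll(h, -1, 1) - 4 * h)
    # Central-difference gradient: the drift term transports material
    gy = (np.roll(h, -1, 0) - np.roll(h, 1, 0)) / 2
    gx = (np.roll(h, -1, 1) - np.roll(h, 1, 1)) / 2
    return h + dt * (D * lap - drift[0] * gy - drift[1] * gx)
```

An interacting object would first stamp its footprint into `h` (lowering heights where it presses down), after which repeated calls to `terrain_step` relax the imprint into a soft, soil-like track.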
“I always wanted to see the night sky”: blind user preferences for Sensory Substitution Devices
Sensory Substitution Devices (SSDs) convert visual information into another sensory channel (e.g. sound) to improve the everyday functioning of blind and visually impaired persons (BVIP). However, the range of possible functions and options for translating vision into sound is largely open-ended. To provide constraints on the design of this technology, we interviewed ten BVIPs who were briefly trained in the use of three novel devices that, collectively, showcase a large range of design permutations. The SSDs include the ‘Depth-vOICe,’ ‘Synaestheatre’ and ‘Creole,’ which offer high spatial, temporal, and colour resolutions respectively via a variety of sound outputs (electronic tones, instruments, vocals). The participants identified a range of practical concerns in relation to the devices (e.g. curb detection, recognition, mental effort) but also highlighted experiential aspects. These included both curiosity about the visual world (e.g. understanding shades of colour, the shape of cars, seeing the night sky) and the desire for the substituting sound to be responsive to movement of the device and to be aesthetically engaging.
Design fiction film-making: a pipeline for communicating experiences
The use of films in the early stages of technology design is a practice that is becoming increasingly common. However, the focus of these films is usually centred on exploring the technology and its specifications rather than on the experiences that the technology can potentially create for its users. Previous research emphasises the relevance of the experiences that technology creates in users, arguing that emotions should be taken into account during early design stages and made part of the design itself. In this paper we provide a step-by-step production pipeline for making your own design fiction film and getting the experiences across. For this purpose we focus on the experiences and emotions that a specific interaction medium elicits. We drew inspiration from the increased exploration of olfactory experiences in HCI, and used a classification of smell experiences as a starting point to produce a design fiction film for the automotive context, not limited by technology but inspired by experiences.
A user-centred approach to developing bWell, a mobile app for arm and shoulder exercises after breast cancer treatment
Purpose: The study aim was to develop a mobile application (app) supported by user preferences to optimise self-management of arm and shoulder exercises for upper-limb dysfunction (ULD) after breast cancer treatment.
Methods: Focus groups with breast cancer patients were held to identify user needs and requirements. Behaviour change techniques were explored by researchers and discussed during the focus groups. Concepts for content were identified by thematic analysis. A rapid review was conducted to inform the exercise programme. Preliminary testing was carried out to obtain user feedback from breast cancer patients who used the app for 8 weeks post-surgery.
Results: Breast cancer patients’ experiences with ULD and exercise advice and routines varied widely. They identified and prioritised several app features: tailored information, video demonstrations of the exercises, push notifications, and tracking and progress features. An evidence-based programme was developed with a physiotherapist with progressive exercises for passive and active mobilisation, stretching and strengthening. The exercise demonstration videos were filmed with a breast cancer patient. Early user testing demonstrated ease of use, and clear and motivating app content.
Conclusions: bWell, a novel app for arm and shoulder exercises, was developed by breast cancer patients, health care professionals and academics. Further research is warranted to confirm its clinical effectiveness.
Implications for Cancer Survivors: Mobile health has great potential to provide patients with information specific to their needs. bWell is a promising way to support breast cancer patients with exercise routines after treatment and may improve future self-management of clinical care.
Enhancing Student Engagement and Support with Digital Video
No description supplied
A SystemC based virtual prototyping methodology for embedded systems
No description supplied
Blurring the Boundaries of Simulation: the DORIS Multimedia Living Extensions
No description supplied
The effects of video lecture delivery formats on student engagement
Video lectures are the main teaching tool used in e-learning platforms. Different video lecture delivery formats are used to disseminate course content among students. However, there are a limited number of studies that investigate if and how different video lecture delivery formats affect the way a viewer feels engaged with the video content.
This paper presents results from a pilot study aimed at further investigating this area of research.
During the experiment, participants were exposed to five lecture delivery formats: a one-to-one tutoring session; SussexDL, a novel video delivery format being developed at the University of Sussex; and three other video lecture delivery formats commonly used in e-learning platforms. Participants were asked to rate the level of engagement they felt with each format. The results suggest that there is a link between the video lecture delivery format and the engagement felt with the video content.