12 research outputs found

    3D movies at home

    Get PDF
    Three-dimensional images may well be the future of home cinema. However, for this "magic" idea to be realised in practice, two main objectives must be achieved: equipping the film industry with standard equipment for capturing and post-processing 3D video content, and developing affordable end-user tools for watching films in the immersive 3D Full HD format. Although three-dimensional imaging has not yet conquered the home theatre, the moment when every home cinema-goer will be able to watch three-dimensional movies at home is getting closer. When you are citing the document, use the following link http://essuir.sumdu.edu.ua/handle/123456789/2222

    The Sword, September 2003

    Get PDF
    Volume 38, Issue 2, published September 24, 2003. This issue of The Sword is from the 2003-2004 academic year

    ‘Tell no one’: Cinema auditorium as game-space – Audience participation, performance and play

    Get PDF
    Taking Secret Cinema as its site for analysis, this article engages with the question of what is ludic at the cinema. Secret Cinema delivers live, immersive, participatory cinema-going experiences and is a complex interaction between film, game, theatre and social media. Through the expansion and reimagining of a film’s milieu in both virtual and real spaces, Secret Cinema experiences encourage spectatorial performativity and ludic participation. Through the use of multiple methods, this article presents the formation of a dramatic and playful community in which the impact of game cultures and a ludic aesthetic upon cinematic audience spectatorship is illuminated. Cross-disciplinary in its approach, this article connects the registers of both game and film studies in order to account for this emerging playful engagement with cinematic texts. Through its use of empirical methods, we move towards a fuller understanding of audience experience and affective engagement

    Contributions to virtual reality

    Get PDF
    153 p. The thesis contributes to three Virtual Reality areas:
    • Visual perception: a calibration algorithm is proposed to estimate stereo projection parameters in head-mounted displays, so that correct shapes and distances can be perceived, and calibration and control procedures are proposed to obtain the desired accommodation stimuli at different virtual distances.
    • Immersive scenarios: the thesis analyses several use cases demanding varying degrees of immersion, and special, innovative visualization solutions are proposed to fulfil their requirements. Contributions focus on machinery simulators, weather radar volumetric visualization and manual arc welding simulation.
    • Ubiquitous visualization: contributions are presented for scenarios where users access interactive 3D applications remotely. The thesis follows the evolution of Web3D standards and technologies to propose original visualization solutions for volume rendering of weather radar data, e-learning on energy efficiency, virtual e-commerce and visual product configurators
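
    For orientation, a minimal sketch (not the calibration algorithm from the thesis) of how off-axis stereo projection matrices can be assembled from an interpupillary distance and screen geometry; the helper names, the ±IPD/2 eye offsets and the example numbers are illustrative assumptions, and a real head-mounted display additionally needs the per-device calibration the thesis addresses.

        import numpy as np

        def off_axis_projection(l, r, b, t, n, f):
            """OpenGL-style asymmetric (off-axis) frustum matrix."""
            return np.array([
                [2*n/(r-l), 0.0,        (r+l)/(r-l),  0.0],
                [0.0,       2*n/(t-b),  (t+b)/(t-b),  0.0],
                [0.0,       0.0,       -(f+n)/(f-n), -2*f*n/(f-n)],
                [0.0,       0.0,       -1.0,          0.0],
            ])

        def stereo_frusta(ipd, screen_w, screen_h, screen_dist, near=0.1, far=100.0):
            """Illustrative left/right frusta for a screen of size screen_w x screen_h
            placed screen_dist metres in front of the viewer; each eye is shifted
            laterally by +/- ipd/2, which makes the two frusta asymmetric."""
            s = near / screen_dist                      # project screen edges onto the near plane
            frusta = {}
            for eye, ex in (("left", -ipd / 2.0), ("right", +ipd / 2.0)):
                l = (-screen_w / 2.0 - ex) * s
                r = (+screen_w / 2.0 - ex) * s
                b = (-screen_h / 2.0) * s
                t = (+screen_h / 2.0) * s
                frusta[eye] = off_axis_projection(l, r, b, t, near, far)
            return frusta

        # example numbers only: 64 mm IPD, 0.52 x 0.32 m screen, 0.7 m away
        print(stereo_frusta(ipd=0.064, screen_w=0.52, screen_h=0.32, screen_dist=0.7)["left"].round(3))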

    Espacialización sonora con Wavefield Synthesis y Vector Base Amplitude Panning. Estudio comparativo

    Full text link
    The spatial sound systems Wave Field Synthesis (WFS) and Vector Base Amplitude Panning (VBAP) are able to reproduce localized sounds and sound environments. VBAP is based on traditional amplitude panning, whereas WFS uses loudspeaker arrays to reproduce the wave fronts of the sound sources. Correct reproduction of sound distance, so that the listener perceives the specific distance at which a source is located, is not an easy property for spatial sound systems to generate. For this work, a subjective perception test was designed to compare the ability of the WFS and VBAP systems to reproduce sound distance. The test evaluates several factors (different distances, reverberation, sound type, listening angle) in order to study possible differences in the distance reproduction of the two systems. The collected data indicate that, although both systems are able to reproduce sound distance, WFS obtains better results than VBAP. Gutiérrez Parera, P. (2013). Espacialización sonora con Wavefield Synthesis y Vector Base Amplitude Panning. Estudio comparativo. Universitat Politècnica de València. http://hdl.handle.net/10251/28791
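
    As a point of reference for the panning law mentioned above, a minimal sketch of two-dimensional pairwise VBAP in the sense of Pulkki (1997); the ±30° loudspeaker pair and the function name are illustrative assumptions and do not correspond to the array used in the reported experiment.

        import numpy as np

        def vbap_gains_2d(source_azimuth_deg, spk_azimuths_deg=(-30.0, 30.0)):
            """Pairwise 2-D VBAP: the virtual-source direction is expressed as a
            linear combination of the two loudspeaker direction vectors, and the
            resulting gains are power-normalized. The +/-30 degree pair is just an
            example stereo setup."""
            az = np.radians(source_azimuth_deg)
            p = np.array([np.cos(az), np.sin(az)])              # unit vector towards the virtual source
            L = np.array([[np.cos(np.radians(a)), np.sin(np.radians(a))]
                          for a in spk_azimuths_deg])           # rows: loudspeaker unit vectors
            g = p @ np.linalg.inv(L)                            # solve p = g @ L for the gains
            if np.any(g < 0):
                raise ValueError("source direction lies outside the loudspeaker pair")
            return g / np.linalg.norm(g)                        # constant-power normalization

        print(vbap_gains_2d(0.0))    # centred source -> equal gains
        print(vbap_gains_2d(20.0))   # source pulled towards the +30 degree loudspeaker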

    Lost and Found: Studies in Confusing Films

    Get PDF

    Manipulation and Verification of Longitudinal Electric Fields for Nonlinear Optical Microscopy

    Get PDF
    Polarization, which is a fundamental property of light, describes the direction of oscillation of the electric component of the optical field. It is often assumed to be transverse to the direction of propagation of the optical wave. This is, for instance, the case for paraxial, i.e., collimated or weakly focused, laser beams. For nonparaxial, i.e., tightly focused, laser beams, however, the polarization shows a three-dimensional behavior, manifested by the generation of non-vanishing field components directed along the longitudinal direction within the focal volume. These longitudinal fields have tremendous effects in the context of optical microscopy, and especially nonlinear microscopy, because of the tensorial and symmetry dependence of the nonlinear response. So far, techniques able to precisely tailor the longitudinal field components at focus have relied on cumbersome setups, and their capabilities have been hindered by the lack of appropriate probes that can unambiguously and directly detect such longitudinal fields. This Thesis aims to meet this challenge and to provide new ways to control and probe longitudinal electric fields at the focus of a high numerical aperture objective. Relying on state-of-the-art spatial phase-shaping of an incident optical field, we manage to control various parameters of the longitudinal electric field within the focal volume, including its transverse spatial distribution and depth of field, as demonstrated by collecting second-harmonic generation from vertically aligned GaAs nanowires. The results presented in this Thesis suggest that the strength and spatial distribution of longitudinal fields can be both controlled and probed using our techniques. This work also opens up new opportunities for better understanding optical responses at the nanoscale and is expected to provide alternative imaging techniques for different types of nanostructures, and possibly later for biological samples. Finally, our phase-shaping techniques provide alternative tools towards more advanced control of polarization in three dimensions at focus
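
    As background only (this expression comes from the standard Richards–Wolf treatment of tightly focused, radially polarized beams and is not quoted from the thesis), the longitudinal focal-field component behind an objective with maximum focusing angle \alpha, apodization \ell_0(\theta) and wavenumber k is commonly written as

        E_z(\rho, z) \propto \int_0^{\alpha} \sqrt{\cos\theta}\, \sin^{2}\theta\, \ell_0(\theta)\, J_0(k\rho\sin\theta)\, e^{i k z \cos\theta}\, \mathrm{d}\theta ,

    where the \sin^{2}\theta weighting is the reason a large numerical aperture (large \alpha) is needed before the longitudinal field becomes significant.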

    Intuitive Robot Teleoperation Based on Haptic Feedback and 3D Visualization

    Get PDF
    Robots are required in many jobs. Tele-operation tasks can be very challenging and often require reaching a destination quickly and with a minimum of collisions. To carry out these tasks, human operators tele-operate a robot manually through a user interface. The design of the user interface, and of the information presented in it, therefore becomes a critical element for the successful completion of robot tele-operation tasks. Effective and timely robot tele-navigation relies mainly on the intuitiveness of the interface and on the richness and presentation of the feedback given. This project investigated the use of both haptic and visual feedback in a user interface for robot tele-navigation. The aim was to overcome some of the limitations observed in state-of-the-art works, turning what is sometimes described as contrasting into an added value that improves tele-navigation performance. The key issue is to combine different human sensory modalities in a coherent way and to benefit from 3-D vision as well. The proposed new approach was inspired by how visually impaired people use walking sticks to navigate. Haptic feedback can help a user comprehend the distances to surrounding obstacles and the obstacle distribution. This was achieved by relying entirely on on-board range sensors and by processing their input through a simple scheme that regulates the magnitude and direction of the environmental force feedback provided to the haptic device. A specific algorithm was also used to render the distribution of very close objects and provide appropriate touch sensations. Scene visualization was provided by the system and shown to the user coherently with the haptic sensation. Different visualization configurations, from multi-viewpoint observation to 3-D visualization, were proposed and rigorously assessed through experimentation in order to understand the advantages of the proposed approach and the performance variations among different 3-D display technologies. Over twenty users were invited to participate in a usability study composed of two major experiments. The first experiment focused on a comparison between the proposed haptic-feedback strategy and a typical state-of-the-art approach, and included testing with multi-viewpoint visual observation. The second experiment investigated the performance of the proposed haptic-feedback strategy when combined with three different stereoscopic 3-D visualization technologies. The results of the experiments were encouraging, showing good performance with the proposed approach and an improvement over literature approaches to haptic feedback in robot tele-operation. It was also demonstrated that 3-D visualization can be beneficial for robot tele-navigation and does not conflict with haptic feedback if the two are properly aligned. Performance may vary across 3-D visualization technologies, which is also discussed in the presented work
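
    Purely as an illustration of the idea (the thesis describes its own scheme, which is not reproduced here), a "walking stick" force cue of this kind could be derived from a planar range scan by summing repulsive contributions from nearby readings, with magnitude growing as obstacles get closer; the function, thresholds and toy scan below are assumptions.

        import numpy as np

        def obstacle_force(ranges, angles, influence_dist=1.5, max_force=3.0):
            """Illustrative haptic cue: each range reading closer than influence_dist
            (metres) pushes back along its bearing, more strongly the closer the
            obstacle; the summed vector is clipped to the device limit max_force
            (newtons). All thresholds are made-up example values."""
            ranges = np.asarray(ranges, dtype=float)
            angles = np.asarray(angles, dtype=float)
            near = (ranges > 0.0) & (ranges < influence_dist)
            # weight in [0, 1]: 0 at the influence boundary, 1 at contact
            w = (influence_dist - ranges[near]) / influence_dist
            # unit vectors pointing from each obstacle back towards the robot
            dirs = -np.stack([np.cos(angles[near]), np.sin(angles[near])], axis=1)
            force = max_force * (dirs * w[:, None]).sum(axis=0) / max(len(w), 1)
            norm = np.linalg.norm(force)
            return force if norm <= max_force else force * (max_force / norm)

        # toy scan: 360 readings, with a close obstacle cluster roughly straight ahead
        angles = np.linspace(-np.pi, np.pi, 360, endpoint=False)
        ranges = np.full(360, 5.0)
        ranges[175:200] = 0.6            # obstacles at ~0.6 m, around -0.1 to +0.3 rad
        print(obstacle_force(ranges, angles))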