1,108 research outputs found
Comparing Virtual Reality to Conventional Simulator Visuals: Effects of Peripheral Visual Cues in Roll-Axis Tracking Tasks
This paper compares the effects of peripheral visual cues on manual control between a conventional fixed-base simulator and virtual reality. The results were also compared with those from a previous experiment conducted in a motion-base simulator. Fifteen participants controlled a system with second-order dynamics in a disturbance-rejection task. Tracking performance, control activity, simulator sickness questionnaire answers, and biometrics were collected. Manual control behavior was modeled for the first time in a virtual reality environment. Virtual reality did not degrade participants' manual control performance or alter their control behavior. However, peripheral cues were significantly more effective in virtual reality. Control activity decreased in all conditions with peripheral cues. The trends introduced by the peripheral visual cues in the previous experiment were replicated. Finally, VR was not more nauseogenic than the conventional simulator. These results suggest that virtual reality might be a good alternative to conventional fixed-base simulators for training manual control skills.
Rotational and Translational Velocity and Acceleration Thresholds for the Onset of Cybersickness in Virtual Reality
This paper determined rotational and translational velocity and acceleration thresholds for the onset of cybersickness. Cybersickness causes discomfort and discourages the widespread use of virtual reality systems for both recreational and professional purposes. Visual motion, or optic flow, is known to be one of the main causes of cybersickness due to the sensory conflict it creates with the vestibular system. The aim of this experiment was to detect the rotational and translational velocity and acceleration thresholds that cause the onset of cybersickness. Participants were exposed to a moving particle field in virtual reality for a few seconds per run. The field moved in different directions (longitudinal, lateral, roll, and yaw), with different velocity profiles (steady and accelerating) and different densities. Using a staircase procedure that controlled the speed or acceleration of the field, we detected the threshold at which participants started to feel temporary symptoms of cybersickness. The optic flow was quantified for each motion type, and by adjusting the number of features, the same amount of optic flow was presented in each scene. Having the same optic flow in each scene allows a direct comparison of the thresholds. The results show that the velocity and acceleration thresholds for rotational optic flow were significantly lower than for translational optic flow. The thresholds tended to decrease with decreasing particle density of the scene. Finally, all the rotational and translational thresholds were found to correlate strongly with each other. While the mean values of the thresholds could be used as guidelines for developing virtual reality applications, the high inter-individual variability implies that individual tuning of motion controls would be more effective at reducing cybersickness while minimizing the impact on the experience of immersion.
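The abstract does not specify the staircase rules used; a minimal sketch of a classic 1-up/1-down staircase (all parameters and the simulated-observer model are hypothetical, for illustration only) shows how the procedure converges on a detection threshold from yes/no responses:

```python
import random

def staircase_threshold(true_threshold, start=10.0, step=1.0,
                        n_reversals=8, seed=0):
    """1-up/1-down staircase: decrease the stimulus level after a
    'detected' response, increase it after a 'not detected' response,
    and estimate the threshold as the mean of the reversal points."""
    rng = random.Random(seed)
    level = start
    direction = -1          # start by decreasing the stimulus
    reversals = []
    while len(reversals) < n_reversals:
        # Simulated observer: detects the stimulus when it exceeds the
        # true threshold; Gaussian noise stands in for response variability.
        detected = level + rng.gauss(0, 0.5) > true_threshold
        new_direction = -1 if detected else +1
        if new_direction != direction:
            reversals.append(level)   # the staircase changed direction
            direction = new_direction
        level = max(0.0, level + new_direction * step)
    return sum(reversals) / len(reversals)

print(staircase_threshold(true_threshold=4.0))
```

With a simulated threshold of 4.0, the reversal points oscillate around 4 and their mean gives the threshold estimate; in the experiment the "stimulus level" would be the velocity or acceleration of the particle field.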
Control Force Compensation in Ground-Based Flight Simulators
This paper presents the results of a study that investigated whether controller force compensation, accounting for the inertial force and moment due to aircraft motion during flight, has a significant effect on pilot control behavior and performance. Seven rotorcraft pilots performed a side-step and a precision hovering task in light turbulence in the Vertical Motion Simulator. The effects of force compensation were examined for two different simulated rotorcraft models (linear and UH-60 dynamics), each with a different force gradient of the lateral control stick. Four motion configurations were used: large motion, hexapod motion, fixed-base, and fixed-base with compensation. Control-input variables and task performance measures, such as the time to translate to the designated hover position, station-keeping position errors, and handling qualities ratings, were used as metrics. Control force compensation made pilot control behavior and performance only somewhat more similar to that observed under high- or medium-fidelity motion. Control force compensation did not improve overall task performance when both rotorcraft models were considered together. The compensation had effects on the linear model with the lighter force gradient, but only a minimal effect on pilots' control behavior and task performance for the UH-60 model, which had a higher force gradient. This suggests that control force compensation has limited benefits for controllers with higher stiffness.
A distinctive response to concanavalin A-mediated agglutination shown by cells from two different slime strains
ASTEF: A Simple Tool for Examining Fixation
In human factors and ergonomics research, the analysis of eye movements has gained popularity as a method for obtaining information about an operator's cognitive strategies and for drawing inferences about the cognitive state of an individual. For example, recent studies have shown that the distribution of eye fixations is sensitive to variations in mental workload: dispersed when workload is high, and clustered when workload is low. Spatial statistics algorithms can be used to obtain information about the type of distribution and can be applied over fixations recorded during small epochs of time to assess online changes in the level of mental load experienced by individuals. In order to ease the computation of the statistical index and to encourage research on the spatial properties of visual scanning, A Simple Tool for Examining Fixations has been developed. The software application implements functions for fixation visualization, management, and analysis, and includes a tool for fixation identification from raw gaze point data. Updated information can be obtained online at www.astef.info, where the installation package is freely downloadable.
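The abstract does not name the spatial statistic ASTEF computes; one common index for classifying a point pattern as clustered versus dispersed is the nearest neighbor index, sketched below (the function name, the uniform rectangular display area, and the fixation coordinates are illustrative assumptions, not ASTEF's actual implementation):

```python
import math

def nearest_neighbor_index(fixations, area):
    """Nearest neighbor index for a set of fixation points (x, y).
    NNI < 1 indicates clustering (e.g. low workload), NNI ~ 1 a random
    pattern, and NNI > 1 dispersion (e.g. high workload)."""
    n = len(fixations)
    # Mean observed distance from each fixation to its nearest neighbor.
    d_obs = sum(
        min(math.dist(p, q) for j, q in enumerate(fixations) if j != i)
        for i, p in enumerate(fixations)
    ) / n
    # Expected mean nearest-neighbor distance for a uniformly random
    # pattern of n points over the given display area.
    d_exp = 0.5 * math.sqrt(area / n)
    return d_obs / d_exp
```

Computed over short epochs of fixations, such an index gives a single scalar whose drift over time can be tracked as an online workload indicator, in the spirit of the approach the abstract describes.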
Yes, construction cost, time and scope are important, but there is more: a new action plan for infrastructure success
Purpose: During planning and delivery, the iron triangle criteria are essential for internal stakeholders (e.g. owner, sponsors, and delivery company), while external stakeholders such as local communities (often perceived as an inconvenience) or end users are mostly ignored. In the medium to long term, infrastructure costs and benefits are far more important for external stakeholders and the environment. Design/methodology/approach: The iron triangle criteria, i.e. delivering on time, budget, and quality/scope, are the traditional perspective for assessing the success of infrastructure projects. Delivering on cost and time is significant, but particularly for infrastructure there are more relevant success criteria. The authors argue which criteria are important, and explain why. Findings: The authors challenge the traditional view of judging projects by adherence to time, budget, and quality/scope. They explain that considering social value and the contribution to achieving the UN Sustainable Development Goals (SDGs) is extremely relevant. Crucially, these metrics keep changing, even after the project has ended. Originality/value: The authors provide a new seven-step action plan for decision-makers to improve infrastructure provision by reflecting on the SDGs and engaging with external stakeholders, particularly minorities and the weaker members of their communities. The action plan focuses on the cost and value for different stakeholders over different timeframes, and on progress toward social value and achieving the SDGs.
Planck Low Frequency Instrument: Beam Patterns
The Low Frequency Instrument on board the ESA Planck satellite is coupled to the Planck 1.5-meter off-axis dual-reflector telescope by an array of 27 corrugated feed horns operating at 30, 44, 70, and 100 GHz. We briefly present here a detailed study of the optical interface, devoted to optimizing the angular resolution (with a goal of 10 arcmin at 100 GHz) while at the same time minimizing all the systematics coming from the sidelobes of the radiation pattern. Through optical simulations, we provide shapes, locations on the sky, angular resolutions, and polarization properties of each beam.
Comment: On behalf of the Planck collaboration. 3 pages, 1 figure. Article published in the Proceedings of the 2K1BC Experimental Cosmology at millimetre wavelengths
Storm-Surge Flooding on the Yukon-Kuskokwim Delta, Alaska
Coastal regions of Alaska are regularly affected by intense storms of ocean origin, the frequency and intensity of which are expected to increase as a result of global climate change. The Yukon-Kuskokwim Delta (YKD), situated in western Alaska on the eastern edge of the Bering Sea, is one of the largest deltaic systems in North America. Its low relief makes it especially susceptible to storm-driven flood tides and increases in sea level. Little information exists on the extent of flooding caused by storm surges in western Alaska and its effects on salinization, shoreline erosion, permafrost thaw, vegetation, wildlife, and the subsistence-based economy. In this paper, we summarize storm flooding events in the Bering Sea region of western Alaska during 1913–2011 and map both the extent of inland flooding caused by autumn storms on the central YKD, using Radarsat-1 and MODIS satellite imagery, and the drift lines, using high-resolution IKONOS satellite imagery and field surveys. The largest storm surges occurred in autumn and were associated with high tides and strong (> 65 km h⁻¹) southwest winds. Maximum inland extent of flooding from storm surges was 30.3 km in 2005, 27.4 km in 2006, and 32.3 km in 2011, with the total flood area covering 47.1%, 32.5%, and 39.4% of the 6730 km² study area, respectively. Peak stages for the 2005 and 2011 storms were 3.1 m and 3.3 m above mean sea level, respectively, almost as high as the 3.5 m amsl elevation estimated for the largest storm observed (in November 1974). Several historically abandoned village sites lie within the area of inundation of the largest flood events. With projected sea level rise, large storms are expected to become more frequent and cover larger areas, with deleterious effects on freshwater ponds, non-saline habitats, permafrost, and landscapes used by nesting birds and local people.
Dynamic validation of the Planck/LFI thermal model
The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44, and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits on acceptable thermal fluctuations in the 20 K focal plane, are a critical element in achieving the instrument's scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals; its validation is therefore extremely important in the context of the Planck mission. Our analysis shows that the measured thermal properties of the instrument exhibit a thermal damping level better than predicted, further reducing the expected systematic effects induced in the LFI maps. We then propose an explanation of the increased damping in terms of non-ideal thermal contacts.
Comment: Planck LFI technical papers published by JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
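The damping the abstract refers to can be illustrated with the simplest possible idealization: a first-order thermal transfer function, in which slow temperature fluctuations pass through while fast ones are attenuated. This sketch is a generic textbook model, not the LFI thermal model (which is a detailed numerical model), and the time constant used in the example is hypothetical:

```python
import math

def first_order_damping(freq_hz, tau_s):
    """Amplitude of a first-order thermal transfer function,
    |H(f)| = 1 / sqrt(1 + (2*pi*f*tau)^2).
    Fluctuations much slower than 1/tau pass through (|H| -> 1);
    faster fluctuations are progressively damped (|H| -> 0)."""
    return 1.0 / math.sqrt(1.0 + (2.0 * math.pi * freq_hz * tau_s) ** 2)

# Example: with an (assumed) time constant of 1000 s, a 0.01 Hz
# fluctuation reaching the focal plane is attenuated by roughly 60x.
print(first_order_damping(0.01, 1000.0))
```

Measuring |H(f)| at several fluctuation frequencies, as the on-ground tests did for the real instrument, is what allows a numerical thermal model to be validated against (or corrected by) the observed damping.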
Role of the cell wall on the expression of osmotic-sensitive (os-1) and temperature-sensitive (cot-1) phenotypes of N. crassa. A comparative study on mycelial and wall-less phenotypes of the slime variant
Ascospore segregants (slime-like) of the triple mutant fz (fuzzy); sg (spontaneous germination); os-1 (osmotic) (slime; Emerson 1963. Genetica 34:162-182) of Neurospora crassa germinate as a plasmodium which, after some time, gives rise to a morphologically abnormal mycelium. If the mycelium of a slime-like isolate is cultured under high osmotic pressure (Nelson et al. 1975. Neurospora Newsl. 22:15-16), it releases wall-less cells that proliferate as spheroplasts.