906 research outputs found

    ScaleTrotter: Illustrative Visual Travels Across Negative Scales

    We present ScaleTrotter, a conceptual framework for an interactive, multi-scale visualization of biological mesoscale data and, specifically, genome data. ScaleTrotter allows viewers to smoothly transition from the nucleus of a cell to the atomistic composition of the DNA, while bridging several orders of magnitude in scale. The challenges in creating an interactive visualization of genome data are fundamentally different in several ways from those in other domains like astronomy that require a multi-scale representation as well. First, genome data has intertwined scale levels---the DNA is an extremely long, connected molecule that manifests itself at all scale levels. Second, elements of the DNA do not disappear as one zooms out---instead the scale levels at which they are observed group these elements differently. Third, we have detailed information and thus geometry for the entire dataset and for all scale levels, posing a challenge for interactive visual exploration. Finally, the conceptual scale levels for genome data are close in scale space, requiring us to find ways to visually embed a smaller scale into a coarser one. We address these challenges by creating a new multi-scale visualization concept. We use a scale-dependent camera model that controls the visual embedding of the scales into their respective parents, the rendering of a subset of the scale hierarchy, and the location, size, and scope of the view. In traversing the scales, ScaleTrotter roams between 2D and 3D visual representations that are depicted in integrated visuals. We discuss, specifically, how this form of multi-scale visualization follows from the specific characteristics of the genome data and describe its implementation. Finally, we discuss the implications of our work for the general illustrative depiction of multi-scale data.
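    The abstract's scale-dependent camera bridges several orders of magnitude between scale levels. A minimal sketch of one ingredient of such a traversal, interpolating the viewing scale logarithmically so that zoom speed feels uniform across magnitudes (the function name and scale values are illustrative assumptions, not the paper's implementation):

    ```python
    import math

    def scale_space_interpolate(s_start, s_end, t):
        """Interpolate a viewing scale between two levels that lie several
        orders of magnitude apart. Interpolating in log-scale (rather than
        the raw scale value) gives a perceptually uniform zoom speed."""
        log_s = (1 - t) * math.log10(s_start) + t * math.log10(s_end)
        return 10 ** log_s

    # Travelling from a cell nucleus (~1e-5 m) toward DNA base pairs (~1e-9 m),
    # the halfway point is the geometric mean of the two scales:
    halfway = scale_space_interpolate(1e-5, 1e-9, 0.5)
    ```

    A linear interpolation of the raw scale would spend almost the entire transition at the coarsest level; the logarithmic form devotes equal animation time to each order of magnitude.
    
    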

    Visualization of scientific data in multi-user augmented reality

    Humanity has always strived to learn more about the origins of our neighboring celestial bodies. With the help of modern rover systems, unknown areas are explored through scientific measurements. With increasingly better sensors, this data becomes more extensive and complex, creating an evident need for new and improved tools. These tools should support the scientists in the collaborative analysis of the recorded measurements. Scientists from different disciplinary backgrounds work together on this analysis. Exploring the data can be made more efficient with the help of intuitive visualization, interaction, and collaborative tools. At the same time, misunderstandings among the experts can be minimized. This thesis investigates how modern augmented reality approaches can support the process of collaborative rover data analysis. Three main aspects are considered: the three-dimensional visualization of high-resolution terrain data, the visualization of and interaction with rover data, and the integration of multi-user collaboration tools for collaborative discussion. A mobile augmented reality device, the Microsoft HoloLens 2, is used to input, output, and process the data. In order to evaluate the implemented visualization and interaction concepts, an expert interview and several experiments for a user study are prepared in this work. Due to the current COVID-19 pandemic restrictions, neither the interview nor the user study could be conducted. Based on promising informal preliminary user tests, potential improvements of the presented concepts are discussed.

    No Habeas For You! Al Maqaleh v. Gates, The Bagram Detainees, and the Global Insurgency


    NO MAN’S SKY: UTILIZING MARITIME LAW TO ADDRESS THE NEED FOR SPACE DEBRIS REMOVAL TECHNOLOGY


    Investigation of visual pathways in honeybees (Apis mellifera) and desert locusts (Schistocerca gregaria): anatomical, ultrastructural, and physiological approaches

    Many insect species demonstrate sophisticated abilities regarding spatial orientation and navigation, despite their small brain size. The behaviors that are based on spatial orientation differ dramatically between individual insect species according to their lifestyle and habitat. Central place foragers like bees and ants, for example, orient themselves in their surroundings and navigate back to the nest after foraging for food or water. Insects like some locust and butterfly species, on the other hand, use spatial orientation during migratory phases to keep a stable heading in a certain direction over a long period of time. In both scenarios, homing and long-distance migration, vision is the primary source of orientation cues, even though additional features like wind direction, the earth's magnetic field, and olfactory cues can be taken into account as well. Visual cues that are used for orientational purposes range from landmarks and the panorama to celestial cues. In diurnal insects, the latter consist of the position of the sun itself, the sun-based polarization pattern, and the sky's intensity and spectral gradients, collectively referred to as the sky-compass system. For reliable sky-compass orientation, the animal needs, in addition to the perception of celestial cues, to compensate for the daily movement of the sun across the sky. It is likely that a connection from the circadian pacemaker system to the sky-compass network could provide the necessary circuitry for this time compensation. The present thesis focuses on the sky-compass system of honeybees and locusts. There is a large body of work on the navigational abilities of honeybees from a behavioral perspective, but the underlying neuronal anatomy and physiology has received less attention so far. Therefore, the first two chapters of this thesis reveal a large part of the anatomy of the anterior sky-compass pathway in the bee brain.
To this end, dye injections, immunohistochemical stainings, and ultrastructural examinations were conducted. The third chapter describes a novel methodological protocol for physiological investigations of neurons involved in the sky-compass system, using calcium imaging in behaving animals. The fourth chapter of this thesis deals with the anatomical basis of time compensation in the sky-compass system of locusts. Accordingly, the ultrastructure of synaptic connections was investigated in a brain region of the desert locust where contact between the two systems could be feasible.

    Using reconstructed visual reality in ant navigation research

    Insects have low-resolution eyes and a tiny brain, yet they continuously solve very complex navigational problems; an ability that underpins fundamental biological processes such as pollination and parental care. Understanding the methods they employ would have profound impact on the fields of machine vision and robotics. As our knowledge of insect navigation grows, our physical, physiological and neural models get more complex and detailed. To test these models we need to perform increasingly sophisticated experiments. Evolution has optimised the animals to operate in their natural environment. To probe the fine details of the methods they utilise, we need to use natural visual scenery which, for experimental purposes, we must be able to manipulate arbitrarily. Performing physiological experiments on insects outside the laboratory is not practical, and our ability to modify the natural scenery for outdoor behavioural experiments is very limited. The solution is reconstructed visual reality: a projector that can present the visual aspect of the natural environment to the animal with high fidelity, taking the peculiarities of insect vision into account. While projectors have been used in insect research before, during my candidature I designed and built a projector specifically tuned to insect vision. To allow the ant to experience a full panoramic view, the projector completely surrounds her. The device (Antarium) is a polyhedral approximation of a sphere. It contains 20,000 pixels made of light-emitting diodes (LEDs) that match the spectral sensitivity of Myrmecia. Insects have a much higher fusion frequency limit than humans, therefore the device has a very high flicker frequency (9 kHz) and also a high frame rate (190 fps). In the Antarium the animal is placed in the centre of the projector on a trackball.
To test the trackball and to collect reference data, outdoor experiments were performed where ants were captured, tethered and placed on the trackball. The apparatus with the ant on it was then placed at certain locations relative to the nest and the foraging tree, and the movements of the animal on the ball were recorded and analysed. The outdoor experiments proved that the trackball was well suited for our ants, and also provided the baseline behaviour reference for the subsequent Antarium experiments. To assess the Antarium, the natural habitat of the experimental animals was recreated as a 3-dimensional model. That model was then projected for the ants and their movements on the trackball were recorded, just like in the outdoor experiments. Initial feasibility tests were performed by projecting a static image, which matches what the animals experienced during the outdoor experiments. To assess whether the ant was orienting herself relative to the scene, we rotated the projected scene around her and monitored her response. Statistical methods were used to compare the outdoor and in-Antarium behaviour. The results proved that the concept was solid, but they also uncovered several shortcomings of the Antarium. Nevertheless, even with its limitations the Antarium was used to perform experiments that would be very hard to do in a real environment. In one experiment the foraging tree was repositioned in or deleted from the scene to see whether the animals go to where the tree is or to where, by their knowledge, it should be. The results suggest the latter, but the absence or altered location of the foraging tree certainly had a significant effect on the animals. In another experiment the scene, including the sky, was re-coloured to see whether colour plays a significant role in navigation. Results indicate that even a very small amount of UV information improves the navigation of the animals in a statistically significant way.
To rectify the device limitations discovered during the experiments, a new, improved projector was designed and is currently being built.
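    The abstract describes recording an ant's movements on a trackball and analysing them against outdoor reference behaviour. A common step in such analyses is reconstructing a 2-D walking path from per-frame ball readings; the following is a minimal sketch under assumed conventions (function name, input format, and ball radius are illustrative, not the thesis's actual pipeline):

    ```python
    import math

    def integrate_path(rotations, ball_radius_mm=25.0):
        """Reconstruct a 2-D walking path from trackball readings.
        `rotations` is a sequence of (d_forward, d_turn) tuples per frame:
        forward ball rotation (radians) and change of heading (radians).
        Returns the list of (x, y) positions in millimetres."""
        x = y = heading = 0.0
        path = [(x, y)]
        for d_forward, d_turn in rotations:
            heading += d_turn
            step = d_forward * ball_radius_mm  # arc length rolled under the ant
            x += step * math.cos(heading)
            y += step * math.sin(heading)
            path.append((x, y))
        return path
    ```

    Feeding in the rotations logged during a trial yields a trajectory that can be compared statistically between outdoor and in-Antarium conditions, e.g. by heading direction relative to the (real or projected) foraging tree.
    
    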