20,430 research outputs found

    Unmanned Aerial Vehicles (UAVs) in environmental biology: A Review

    Acquiring information about the environment is a key step in every study in the field of environmental biology, at levels ranging from an individual species to communities and biomes. However, obtaining this information is frequently difficult because of, for example, phenological timing, the spatial distribution of a species or the limited accessibility of a particular area for field survey. Moreover, remote sensing technology, which enables observation of the Earth’s surface and is currently very common in environmental research, has many limitations, such as insufficient spatial, spectral and temporal resolution and a high cost of data acquisition. Since the 1990s, researchers have been exploring the potential of different types of unmanned aerial vehicles (UAVs) for monitoring the Earth’s surface. The present study reviews recent scientific literature dealing with the use of UAVs in environmental biology. Amongst numerous papers, short communications and conference abstracts, we selected 110 original studies showing how UAVs can be used in environmental biology and which organisms can be studied in this manner. Most of these studies concerned the use of UAVs to measure vegetation parameters such as crown height, volume and number of individuals (14 studies) and to quantify the spatio-temporal dynamics of vegetation changes (12 studies). UAVs were also frequently applied to count birds and mammals, especially those living in the water. The analytical part of the present study was divided into the following sections: (1) detecting, assessing and predicting threats to vegetation, (2) measuring the biophysical parameters of vegetation, (3) quantifying the dynamics of changes in plants and habitats and (4) population and behaviour studies of animals. Finally, we synthesised all the information, showing, amongst other things, the advances in environmental biology enabled by UAV application. Considering that 33% of the studies found and included in this review were published in 2017 and 2018, the number and variety of applications of UAVs in environmental biology are expected to increase in the future.

    Sensors Application in Agriculture

    Novel technologies are playing an important role in the development of crop and livestock farming and have the potential to be key drivers of the sustainable intensification of agricultural systems. In particular, new sensors are now available with reduced dimensions, reduced costs and increased performance, which can be implemented and integrated into production systems, providing more data and ultimately more information. Such sensors are of great importance in supporting digital transformation, precision agriculture and smart farming, and may eventually enable a revolution in the way food is produced. In order to exploit these advances, authoritative studies from the research community are still needed to support the development and implementation of new solutions and best practices. This Special Issue aims to bring together recent developments related to novel sensors and their proven or potential applications in agriculture.

    Spartan Daily, January 26, 2017

    Volume 148, Issue 1
    https://scholarworks.sjsu.edu/spartan_daily_2017/1000/thumbnail.jp

    Robust Hand Motion Capture and Physics-Based Control for Grasping in Real Time

    Hand motion capture technologies are being explored due to high demand in fields such as video games, virtual reality, sign language recognition, human-computer interaction, and robotics. However, existing systems suffer from several limitations: they are high-cost (expensive capture devices), intrusive (additional wear-on sensors or complex configurations), and restrictive (limited motion varieties and restricted capture space). This dissertation focuses on exploring algorithms and applications for a hand motion capture system that is low-cost, non-intrusive, low-restriction, high-accuracy, and robust. More specifically, we develop a real-time, fully automatic hand tracking system using a low-cost depth camera. We first introduce an efficient shape-indexed cascaded pose regressor that directly estimates 3D hand poses from depth images. A unique property of our hand pose regressor is that it utilizes a low-dimensional parametric hand geometry model to learn 3D shape-indexed features robust to variations in hand shape, viewpoint and hand pose. We further introduce a hybrid tracking scheme that effectively complements our hand pose regressor with model-based hand tracking. In addition, we develop a rapid 3D hand shape modeling method that uses a small number of depth images to accurately construct a subject-specific skinned mesh model for hand tracking. This step not only automates the whole tracking system but also improves the robustness and accuracy of model-based tracking and hand pose regression. We also propose a physically realistic human grasping synthesis method that is capable of grasping a wide variety of objects. Given an object to be grasped, our method computes the required controls (e.g. forces and torques) that advance the simulation to achieve realistic grasping. Our method combines the power of data-driven synthesis and physics-based grasping control: we first introduce a data-driven method to synthesize a realistic grasping motion from large sets of prerecorded grasping motion data, and we then transform the synthesized kinematic motion into a physically realistic one using our online physics-based motion control method. Finally, we provide a performance interface that allows the user to act out motions in front of a depth camera to control a virtual object.
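
    As a rough illustration only of the cascaded, shape-indexed pose regression idea summarised above (not the dissertation's actual implementation), the following Python sketch shows how a pose estimate could be refined stage by stage from a depth image; the feature sampler, stage regressors and pose layout are hypothetical placeholders.

        import numpy as np

        def shape_indexed_features(depth_image, pose, num_features=128):
            """Hypothetical placeholder: sample depth values at pixel offsets
            indexed relative to the current pose estimate (around the projected
            joint positions), so that the features move with the pose."""
            rng = np.random.default_rng(0)                 # fixed sampling pattern
            h, w = depth_image.shape
            joints_2d = pose.reshape(-1, 3)[:, :2]         # assume (x, y, z) per joint
            picks = rng.integers(0, len(joints_2d), num_features)
            offsets = rng.integers(-8, 9, size=(num_features, 2))
            coords = np.clip((joints_2d[picks] + offsets).astype(int),
                             [0, 0], [w - 1, h - 1])
            return depth_image[coords[:, 1], coords[:, 0]]

        def cascaded_pose_regression(depth_image, mean_pose, stage_regressors):
            """Start from a mean pose and apply a sequence of trained regressors,
            each predicting an additive pose update from features indexed by the
            current estimate (any regressor exposing a predict() method works)."""
            pose = mean_pose.copy()
            for regressor in stage_regressors:
                feats = shape_indexed_features(depth_image, pose)
                pose = pose + regressor.predict(feats[None, :])[0]
            return pose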

    Clinical Decision Support Systems with Game-based Environments, Monitoring Symptoms of Parkinson’s Disease with Exergames

    Parkinson’s Disease (PD) is a disorder caused by progressive neuronal degeneration, resulting in several physical and cognitive symptoms that worsen with time. Like many other chronic diseases, it requires constant monitoring to perform medication and therapeutic adjustments, owing to the significant variability in PD symptomatology and progression between patients. At the moment, this monitoring requires substantial participation from caregivers and numerous clinic visits. Personal diaries and questionnaires are used as data sources for medication and therapeutic adjustments, but the subjectivity of these data sources leads to suboptimal clinical decisions. Therefore, more objective data sources are required to better monitor the progress of individual PD patients. One potential contribution towards more objective monitoring of PD comes from clinical decision support systems. These systems employ sensors and classification techniques to provide caregivers with objective information for their decision-making. This leads to more objective assessments of patient improvement or deterioration, resulting in better-adjusted medication and therapeutic plans. However, the need to encourage patients to actively and regularly provide data for remote monitoring remains a significant challenge. To address this challenge, the goal of this thesis is to combine clinical decision support systems with game-based environments. More specifically, serious games in the form of exergames, active video games that involve physical exercise, are used to deliver objective data for PD monitoring and therapy. Exergames increase engagement while combining physical and cognitive tasks. This combination, known as dual-tasking, has been shown to improve rehabilitation outcomes in PD: recent randomized clinical trials on exergame-based rehabilitation in PD show improvements in clinical outcomes that are equal or superior to those of traditional rehabilitation. In this thesis, we present an exergame-based clinical decision support system model to monitor symptoms of PD. This model provides both objective information on PD symptoms and an engaging environment for patients. The model is elaborated, prototypically implemented and validated in the context of two of the most prominent symptom groups of PD: (1) balance and gait, and (2) hand tremor and slowness of movement (bradykinesia). While balance and gait impairments increase the risk of falling, hand tremor and bradykinesia affect hand dexterity. We employ Wii Balance Boards and Leap Motion sensors, and digitalize aspects of current clinical standards used to assess PD symptoms. In addition, we present two dual-tasking exergames: PDDanceCity for balance and gait, and PDPuzzleTable for tremor and bradykinesia. We evaluate the capability of our system to assess the risk of falling and the severity of tremor in comparison with clinical standards, and we explore the statistical significance and effect size of the data we collect from PD patients and healthy controls. We demonstrate that the presented approach can predict an increased risk of falling and estimate tremor severity, and that the target population shows good acceptance of PDDanceCity and PDPuzzleTable. In summary, our results indicate that implementing this system for PD is clearly feasible. Nevertheless, long-term randomized clinical trials are required to evaluate the potential of PDDanceCity and PDPuzzleTable for physical and cognitive rehabilitation effects.
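
    As a hedged sketch only (the thesis's actual feature set, classifiers and cut-offs are not specified in this abstract), the Python snippet below illustrates how centre-of-pressure data from a balance board might be turned into simple postural-sway features and a basic fall-risk flag; the sampling rate, feature names and threshold are illustrative assumptions.

        import numpy as np

        def sway_features(cop_xy, sample_rate_hz=100.0):
            """Simple postural-sway features from centre-of-pressure samples.

            cop_xy: array of shape (n_samples, 2) holding (x, y) positions in cm,
            as could be derived from the four load cells of a Wii Balance Board."""
            cop = cop_xy - cop_xy.mean(axis=0)                      # remove mean offset
            path_length = np.sum(np.linalg.norm(np.diff(cop, axis=0), axis=1))
            sway_area = np.pi * cop[:, 0].std() * cop[:, 1].std()   # rough ellipse area
            mean_velocity = path_length / (len(cop) / sample_rate_hz)
            return {"path_length_cm": path_length,
                    "sway_area_cm2": sway_area,
                    "mean_velocity_cm_s": mean_velocity}

        def fall_risk_flag(features, velocity_cutoff_cm_s=3.0):
            """Illustrative rule: flag increased fall risk when mean sway velocity
            exceeds an assumed cut-off; a real system would rely on a validated
            clinical scale or a classifier trained on patient data."""
            return features["mean_velocity_cm_s"] > velocity_cutoff_cm_s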

    MolecularRift, a Gesture Based Interaction Tool for Controlling Molecules in 3-D

    Visualization of molecular models is a vital part of modern drug design. Improved visualization methods increase conceptual understanding and enable faster and better decision making. The introduction of virtual reality goggles such as the Oculus Rift has opened up new opportunities for such visualisations. A new interactive visualization tool (MolecularRift), which lets the user experience molecular models in a virtual reality environment, was developed in collaboration with AstraZeneca. In an attempt to create a more natural way to interact with the tool, users can steer and control molecules through hand gestures. The gestures are recorded using depth data from a Microsoft Kinect v2 sensor and interpreted using per-pixel algorithms that focus only on the captured frames, thus freeing the user from additional devices such as a mouse, keyboard, touchpad or even piezoresistive gloves. MolecularRift was developed from a usability perspective using an iterative development process and test-group evaluations. The iterations allowed an agile process in which features could easily be evaluated to monitor behavior and performance, resulting in a user-optimized tool. We conclude with reflections on virtual reality's capabilities in chemistry and possibilities for future projects.

    Virtual reality is the future. New technologies are constantly being developed, and in parallel with improvements in computing capacity we are finding new ways to use them together. We have developed a new interactive visualization tool (Molecular Rift) that lets the user experience molecular models in a virtual reality. Today's pharmaceutical industry is in constant need of new methods for visualizing potential drugs in 3-D. Several tools are in use today for visualizing molecules in 3-D stereo. Our newly developed virtual reality techniques offer drug developers the opportunity to "step inside" the molecular structures and experience them in an entirely new way.
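
    For illustration only (MolecularRift's own per-pixel gesture pipeline is not reproduced in this abstract), the Python sketch below shows one straightforward way palm positions obtained from a depth sensor could be mapped to a rotation of a molecular model; the coordinate convention and sensitivity constant are assumptions.

        import numpy as np

        def rotation_from_hand_motion(prev_palm, curr_palm, sensitivity=0.01):
            """Map palm displacement between two depth-camera frames to small
            rotation angles: horizontal motion yaws the molecule, vertical motion
            pitches it. Positions are assumed to be (x, y, z) in millimetres."""
            delta = np.asarray(curr_palm, dtype=float) - np.asarray(prev_palm, dtype=float)
            dx, dy, _ = delta
            return sensitivity * dx, sensitivity * dy      # (yaw, pitch) in radians

        def apply_rotation(atom_coords, yaw, pitch):
            """Rotate molecule atom coordinates (n_atoms, 3) about their centroid."""
            cy, sy = np.cos(yaw), np.sin(yaw)
            cp, sp = np.cos(pitch), np.sin(pitch)
            r_yaw = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
            r_pitch = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
            centroid = atom_coords.mean(axis=0)
            return (atom_coords - centroid) @ (r_yaw @ r_pitch).T + centroid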