
    KOLAM: human computer interfaces for visual analytics in big data imagery

    In the present day, we are faced with a deluge of disparate and dynamic information from multiple heterogeneous sources. Among these are the big data imagery datasets that are rapidly being generated via mature acquisition methods in the geospatial, surveillance (specifically, Wide Area Motion Imagery or WAMI) and biomedical domains. The need to interactively visualize these imagery datasets using multiple types of views into the data (as needed) is common to these domains. Furthermore, researchers in each domain have additional needs: users of WAMI datasets also need to interactively track objects of interest using algorithms of their choice, visualize the resulting object trajectories and interactively edit these results as needed. While software tools that fulfill each of these requirements individually are available and well used at present, there is still a need for tools that can combine the desired aspects of visualization, human computer interaction (HCI), data analysis, data management, and (geo-)spatial and temporal data processing into a single flexible and extensible system. KOLAM is an open, cross-platform, interoperable, scalable and extensible framework for visualization and analysis that we have developed to fulfill the above needs. The novel contributions in this thesis are the following: 1) spatio-temporal caching for animating both giga-pixel and Full Motion Video (FMV) imagery; 2) human computer interfaces purposefully designed to accommodate big data visualization; 3) human-in-the-loop interactive video object tracking, i.e., ground-truthing of moving objects in wide area imagery using algorithm-assisted, human-in-the-loop coupled tracking; 4) coordinated visualization using stacked layers, side-by-side layers/video sub-windows and embedded imagery; 5) efficient one-click manual tracking, editing and data management of trajectories; 6) efficient labeling of image segmentation regions and passing of these results to desired modules; 7) visualization of image processing results generated by non-interactive operators using layers; 8) extension of interactive imagery and trajectory visualization to multi-monitor wall display environments; 9) geospatial applications: rapid roam, zoom and hyper-jump spatial operations, interactive blending, colormap and histogram enhancement, spherical projection and terrain maps; 10) biomedical applications: visualization and target tracking of cell motility in time-lapse cell imagery, collection of ground truth from experts on whole-slide imagery (WSI) for developing histopathology analytic algorithms and computer-aided diagnosis for cancer grading, and easy-to-use tissue annotation features. Includes bibliographical references.
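    The first contribution above, spatio-temporal caching for animating giga-pixel and FMV imagery, can be illustrated with a minimal sketch: an LRU cache that keys tiles by pyramid level, tile position and frame index, and prefetches along the time axis so playback does not stall. This is a hypothetical illustration under those assumptions; the class and method names are ours, not KOLAM's actual API.

    # Minimal sketch of a spatio-temporal tile cache; all names are illustrative.
    from collections import OrderedDict

    class TileCache:
        """LRU cache keyed by (pyramid level, tile column, tile row, frame index)."""

        def __init__(self, capacity=1024, loader=None):
            self.capacity = capacity          # maximum number of resident tiles
            self.loader = loader              # callable that decodes one tile from storage
            self._tiles = OrderedDict()       # insertion order doubles as recency order

        def get(self, level, tx, ty, frame):
            key = (level, tx, ty, frame)
            if key in self._tiles:
                self._tiles.move_to_end(key)  # mark as most recently used
                return self._tiles[key]
            tile = self.loader(level, tx, ty, frame)   # decode on a cache miss
            self._tiles[key] = tile
            if len(self._tiles) > self.capacity:
                self._tiles.popitem(last=False)        # evict the least recently used tile
            return tile

        def prefetch(self, level, tx, ty, frame, lookahead=8):
            """Warm the cache along the time axis ahead of the current playback frame."""
            for f in range(frame, frame + lookahead):
                self.get(level, tx, ty, f)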

    Tracking icebergs with time-lapse photography and sparse optical flow, LeConte Bay, Alaska, 2016–2017

    We present a workflow to track icebergs in proglacial fjords using oblique time-lapse photos and the Lucas-Kanade optical flow algorithm. We employ the workflow at LeConte Bay, Alaska, where we ran five time-lapse cameras between April 2016 and September 2017, capturing more than 400 000 photos at frame rates of 0.5–4.0 min⁻¹. Hourly to daily average velocity fields in map coordinates illustrate dynamic currents in the bay, with dominant downfjord velocities (exceeding 0.5 m s⁻¹ intermittently) and several eddies. Comparisons with simultaneous Acoustic Doppler Current Profiler (ADCP) measurements yield best agreement for the uppermost ADCP levels (∼12 m and above), in line with prevalent small icebergs that trace near-surface currents. Tracking results from multiple cameras compare favorably, although cameras with lower frame rates (0.5 min⁻¹) tend to underestimate high flow speeds. Tests to determine the requisite temporal and spatial image resolution confirm the importance of high image frame rates, while spatial resolution is of secondary importance. Application of our procedure to other fjords will be successful if iceberg concentrations are high enough and if the camera frame rates are sufficiently rapid (at least 1 min⁻¹ for conditions similar to LeConte Bay). This work was funded by the U.S. National Science Foundation (OPP-1503910, OPP-1504288, OPP-1504521 and OPP-1504191).
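    As a concrete illustration of the core step in this workflow, the sketch below runs sparse Lucas-Kanade feature tracking between two consecutive grayscale time-lapse frames with OpenCV. It omits the georeferencing of pixel displacements into map coordinates and the averaging into hourly/daily velocity fields that the full workflow performs; the function name and parameter values are illustrative choices, not those of the published pipeline.

    import cv2
    import numpy as np

    def track_features(prev_gray, next_gray, max_corners=2000):
        # Detect corner features (e.g., iceberg edges and textured ice) in the earlier frame.
        p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                     qualityLevel=0.01, minDistance=7)
        if p0 is None:
            return np.empty((0, 2)), np.empty((0, 2))
        # Pyramidal Lucas-Kanade optical flow from the earlier to the later frame.
        p1, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, next_gray, p0, None,
            winSize=(21, 21), maxLevel=3,
            criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
        good = status.ravel() == 1
        # Matched point pairs; displacement divided by the frame interval gives apparent speed.
        return p0[good].reshape(-1, 2), p1[good].reshape(-1, 2)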

    Towards automated sample collection and return in extreme underwater environments

    © The Author(s), 2022. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published in Billings, G., Walter, M., Pizarro, O., Johnson-Roberson, M., & Camilli, R., Towards automated sample collection and return in extreme underwater environments, Journal of Field Robotics, 2(1), (2022): 1351–1385, https://doi.org/10.55417/fr.2022045. In this report, we present the system design, operational strategy, and results of coordinated multivehicle field demonstrations of autonomous marine robotic technologies in search-for-life missions within the Pacific shelf margin of Costa Rica and the Santorini-Kolumbo caldera complex, which serve as analogs to environments that may exist in oceans beyond Earth. This report focuses on the automation of remotely operated vehicle (ROV) manipulator operations for targeted biological sample collection and return from the seafloor. In the context of future extraterrestrial exploration missions to ocean worlds, an ROV is an analog to a planetary lander, which must be capable of high-level autonomy. Our field trials involve two underwater vehicles, the SuBastian ROV and the Nereid Under Ice (NUI) hybrid ROV, used for mixed-initiative (i.e., teleoperated or autonomous) missions and both equipped with seven-degree-of-freedom hydraulic manipulators. We describe an adaptable, hardware-independent computer vision architecture that enables high-level automated manipulation. The vision system provides a three-dimensional understanding of the workspace to inform manipulator motion planning in complex unstructured environments. We demonstrate the effectiveness of the vision system and control framework through field trials in increasingly challenging environments, including the automated collection and return of biological samples from within the active undersea volcano Kolumbo. Based on our experiences in the field, we discuss the performance of our system and identify promising directions for future research. This work was funded under NASA PSTAR grant NNX16AL08G and by the National Science Foundation under grants IIS-1830660 and IIS-1830500. The authors would like to thank the Costa Rican Ministry of Environment and Energy and National System of Conservation Areas for permitting research operations at the Costa Rican shelf margin, and the Schmidt Ocean Institute (including the captain and crew of the R/V Falkor and ROV SuBastian) for their generous support and for making the FK181210 expedition safe and highly successful. Additionally, the authors would like to thank the Greek Ministry of Foreign Affairs for permitting the 2019 Kolumbo Expedition to the Kolumbo and Santorini calderas, as well as Prof. Evi Nomikou and Dr. Aggelos Mallios for their expert guidance and tireless contributions to the expedition.
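    The abstract gives no implementation detail, but the handoff it describes, from a vision system's three-dimensional understanding of the workspace to manipulator motion planning, can be sketched in simplified form. This is a hypothetical illustration only: the pinhole back-projection and centroid computation are standard, while the segmentation mask, camera intrinsics and the plan_to_pose call are assumed placeholders, not the paper's actual architecture.

    import numpy as np

    def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
        """Back-project a pixel with known depth into camera coordinates (pinhole model)."""
        return np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth])

    def target_centroid(depth_image, mask, intrinsics):
        """3-D centroid (camera frame) of a segmented sample target."""
        fx, fy, cx, cy = intrinsics
        vs, us = np.nonzero(mask)                      # pixels labeled as the target
        points = np.array([pixel_to_camera(u, v, depth_image[v, u], fx, fy, cx, cy)
                           for u, v in zip(us, vs) if depth_image[v, u] > 0])
        return points.mean(axis=0)

    # A motion planner for the 7-DOF arm would then be asked to reach the target,
    # e.g. with a small approach offset along the camera axis:
    #   goal = target_centroid(depth, mask, (fx, fy, cx, cy)) + np.array([0.0, 0.0, -0.05])
    #   plan_to_pose(goal)   # hypothetical call into the motion-planning layer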

    Management and display of four-dimensional environmental data sets using McIDAS

    Over the past four years, great strides have been made in the management and display of 4-D meteorological data sets. A survey was conducted of available and planned 4-D meteorological data sources, and the data types were evaluated for their impact on the data management and display system. The database management requirements generated by the 4-D data display system were analyzed, and the suitability of the existing database management procedures and file structures was evaluated in light of the new requirements. Where needed, new database management tools and file procedures were designed and implemented. The quality of the basic 4-D data sets was assured, and interpolation and extrapolation techniques for the 4-D data were investigated. The 4-D data from various sources were combined to make a uniform and consistent data set for display purposes. Data display software was designed to create abstract line-graphic 3-D displays, and realistic shaded 3-D displays were created. Animation routines for these displays were developed in order to produce a dynamic 4-D presentation. A prototype dynamic color stereo workstation was implemented, and a computer functional design specification was produced based on interactive studies and user feedback.
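    To make the interpolation and data-combination steps concrete, the sketch below resamples a gridded 4-D field (time, level, latitude, longitude) onto a common set of axes with linear interpolation. It is an illustrative modern Python/SciPy rendering of the kind of operation described, not McIDAS code, and the function name is our own.

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    def regrid_4d(values, coords, target_coords):
        """Linearly interpolate a 4-D field onto new (time, level, lat, lon) axes.

        values: array of shape (nt, nz, ny, nx)
        coords, target_coords: tuples of four 1-D, monotonically increasing axes.
        """
        interp = RegularGridInterpolator(coords, values,
                                         bounds_error=False, fill_value=np.nan)
        # Build the full target sample grid and evaluate it in one call.
        mesh = np.meshgrid(*target_coords, indexing="ij")
        pts = np.stack([m.ravel() for m in mesh], axis=-1)
        out_shape = tuple(len(axis) for axis in target_coords)
        return interp(pts).reshape(out_shape)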

    Implementation of Unmanned aerial vehicles (UAVs) for assessment of transportation infrastructure - Phase II

    Technological advances continue to make unmanned aerial vehicles (UAVs) easier to use, more economical, and more applicable to transportation-related operations, maintenance, and asset management, while also increasing safety and decreasing cost. This Phase 2 project continued to test and evaluate five main UAV platforms with a combination of optical, thermal, and lidar sensors to determine how to implement them in MDOT workflows. Field demonstrations were completed at bridges, a construction site, road corridors, and along highways, with data processed and analyzed using customized algorithms and tools. Additionally, a cost-benefit analysis was conducted comparing manual and UAV-based inspection methods. The project team also gave a series of technical demonstrations and conference presentations, enabling outreach to interested audiences, who gained an understanding of the potential implementation of this technology and of the advanced research that MDOT is moving toward implementation. The outreach efforts and research activities performed under this contract demonstrated how implementing UAV technologies in MDOT workflows can provide many benefits to MDOT and the motoring public, such as improved cost-effectiveness, operational management, and timely maintenance of Michigan’s transportation infrastructure.

    A Survey on Video-based Graphics and Video Visualization
