    Real-time Realistic Rain Rendering

    Artistic outdoor filming and rendering must choose specific weather conditions in order to elicit the intended audience reaction; rain, for instance, one of the most common conditions, is usually employed to convey a sense of unrest. Synthetic methods for recreating weather are an important avenue for simplifying filming and reducing its cost, but such simulations are challenging because of the variety of phenomena that must be computed: rain alone involves raindrops, splashes on the ground, fog, clouds, lightning, etc. We propose a new rain rendering algorithm that uses and extends state-of-the-art approaches in this field. The goal of our method is to render rain streaks and splashes on the ground in real time, while accounting for complex illumination effects and allowing artistic direction of the drops' placement. Our algorithm takes as input an artist-defined rain distribution and density, and then creates particles in the scene following these indications. No restrictions are imposed on the dimensions of the rain area, so direct rendering approaches could rapidly overwhelm current computational capabilities with enormous particle counts. To address this, we propose techniques that adaptively sample the generated particles at render time, selecting only those in the regions that actually need to be simulated and rendered. Particle simulation is executed entirely on the graphics hardware. The algorithm first places the particles at their updated coordinates. It then checks whether each particle is still falling as a rain streak, has reached the ground and become a splash, or should be discarded because it has entered a solid object in the scene. Different rendering techniques are used for each case. Complex illumination parameters are computed for rain streaks in order to select matching textures. These textures are generated in a preprocessing step and realistically simulate light interacting with the optical properties of the water drops.
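
    The per-particle logic described above amounts to a simple state classification each frame. The following is a minimal Python sketch of that step, written CPU-side for clarity (the paper runs the simulation on graphics hardware); the scene queries ground_height_at and inside_solid are hypothetical placeholders, not functions from the paper.

    # Minimal sketch of the per-particle state classification described above.
    # The paper executes this step on graphics hardware; this is plain Python.
    # `ground_height_at` and `inside_solid` are hypothetical scene queries.
    from dataclasses import dataclass
    from enum import Enum

    class State(Enum):
        STREAK = 1   # still falling: render as a rain streak
        SPLASH = 2   # reached the ground: render as a splash
        DISCARD = 3  # entered a solid object: do not render

    @dataclass
    class Particle:
        x: float
        y: float     # height along the up axis
        z: float
        vy: float    # vertical velocity, negative while falling

    def update(p: Particle, dt: float, ground_height_at, inside_solid) -> State:
        p.y += p.vy * dt                          # place particle at its updated coordinates
        if inside_solid(p.x, p.y, p.z):
            return State.DISCARD                  # entered a solid object
        if p.y <= ground_height_at(p.x, p.z):
            return State.SPLASH                   # reached the ground
        return State.STREAK                       # still airborne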

    Formal Analysis and Redesign of a Neural Network-Based Aircraft Taxiing System with VerifAI

    We demonstrate a unified approach to rigorous design of safety-critical autonomous systems using the VerifAI toolkit for formal analysis of AI-based systems. VerifAI provides an integrated toolchain for tasks spanning the design process, including modeling, falsification, debugging, and ML component retraining. We evaluate all of these applications in an industrial case study on an experimental autonomous aircraft taxiing system developed by Boeing, which uses a neural network to track the centerline of a runway. We define runway scenarios using the Scenic probabilistic programming language, and use them to drive tests in the X-Plane flight simulator. We first perform falsification, automatically finding environment conditions causing the system to violate its specification by deviating significantly from the centerline (or even leaving the runway entirely). Next, we use counterexample analysis to identify distinct failure cases, and confirm their root causes with specialized testing. Finally, we use the results of falsification and debugging to retrain the network, eliminating several failure cases and improving the overall performance of the closed-loop system. Comment: Full version of a CAV 2020 paper
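
    The falsification step can be pictured as a search over scenario parameters followed by a specification check on each simulation trace. The sketch below is a generic random-search falsifier, not the actual VerifAI or Scenic API; run_xplane_taxi is a hypothetical simulator interface, and the 1.5 m deviation threshold is an illustrative assumption.

    # Generic falsification loop in the spirit of the workflow described above.
    # This is NOT the VerifAI API: `run_xplane_taxi` is a hypothetical function
    # returning the aircraft's lateral deviation from the centerline over time.
    import random

    MAX_DEVIATION_M = 1.5   # assumed specification threshold (illustrative)

    def sample_scenario(rng: random.Random) -> dict:
        # Environment parameters analogous to those a Scenic scenario would vary.
        return {
            "time_of_day": rng.uniform(6.0, 20.0),     # hours
            "cloud_cover": rng.uniform(0.0, 1.0),
            "start_offset_m": rng.uniform(-8.0, 8.0),
            "start_heading_deg": rng.uniform(-30.0, 30.0),
        }

    def falsify(num_trials: int, run_xplane_taxi, seed: int = 0) -> list[dict]:
        rng = random.Random(seed)
        counterexamples = []
        for _ in range(num_trials):
            scenario = sample_scenario(rng)
            deviations = run_xplane_taxi(scenario)     # one simulated taxi run
            if max(abs(d) for d in deviations) > MAX_DEVIATION_M:
                counterexamples.append(scenario)       # spec violated: keep for debugging
        return counterexamples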

    Rain Removal in Traffic Surveillance: Does it Matter?

    Varying weather conditions, including rainfall and snowfall, are generally regarded as a challenge for computer vision algorithms. One proposed solution to the challenges induced by rain and snowfall is to artificially remove the rain from images or video using rain removal algorithms. The promise of these algorithms is that the rain-removed image frames will improve the performance of subsequent segmentation and tracking algorithms. However, rain removal algorithms are typically evaluated only on their ability to remove synthetic rain from a small subset of images; their behavior on real-world video, when integrated into a typical computer vision pipeline, is currently unknown. In this paper, we review existing rain removal algorithms and propose a new dataset of 22 traffic surveillance sequences under a broad variety of weather conditions, all of which include either rain or snowfall. We propose a new evaluation protocol that evaluates rain removal algorithms on their ability to improve the performance of subsequent segmentation, instance segmentation, and feature tracking algorithms under rain and snow. If successful, the de-rained frames produced by a rain removal algorithm should improve segmentation performance and increase the number of accurately tracked features. The results show that a recent single-frame rain removal algorithm increases segmentation performance by 19.7% on our proposed dataset, but it decreases feature tracking performance and shows mixed results with recent instance segmentation methods. The best video-based rain removal algorithm, however, improves feature tracking accuracy by 7.72%. Comment: Published in IEEE Transactions on Intelligent Transportation Systems
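
    The proposed protocol judges a rain removal algorithm by the change it induces in downstream metrics rather than by pixel similarity to a rain-free reference. Below is a minimal sketch of that idea, with segment and derain as hypothetical stand-ins for an actual segmentation network and rain removal method.

    # Minimal sketch of the evaluation idea: score a rain removal method by the
    # change it causes in a downstream metric (here, segmentation IoU), not by
    # pixel similarity to a rain-free image. All callables are hypothetical.
    import numpy as np

    def iou(pred: np.ndarray, gt: np.ndarray) -> float:
        # Intersection-over-union between two binary masks.
        inter = np.logical_and(pred, gt).sum()
        union = np.logical_or(pred, gt).sum()
        return float(inter) / float(union) if union > 0 else 1.0

    def downstream_gain(frames, gts, segment, derain) -> float:
        """Mean IoU improvement of `segment` when frames are de-rained first."""
        rainy = [iou(segment(f), g) for f, g in zip(frames, gts)]
        derained = [iou(segment(derain(f)), g) for f, g in zip(frames, gts)]
        return float(np.mean(derained) - np.mean(rainy))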

    A Survey of Ocean Simulation and Rendering Techniques in Computer Graphics

    This paper presents a survey of ocean simulation and rendering methods in computer graphics. To model and animate the ocean surface, these methods rely on two main approaches. On the one hand, some approximate ocean dynamics with parametric, spectral, or hybrid models and use empirical laws from oceanographic research; we will see that this type of method essentially allows the simulation of ocean scenes in the deep-water domain, without breaking waves. On the other hand, physically based methods use the Navier-Stokes equations (NSE) to represent breaking waves and, more generally, the ocean surface near the shore. We also describe ocean rendering methods in computer graphics, with special attention to the simulation of phenomena such as foam and spray, and to the interaction of light with the ocean surface.
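
    As a concrete example of the spectral family of methods, a Tessendorf-style FFT ocean samples a Phillips spectrum and animates it with the deep-water dispersion relation ω(k) = √(gk). The following is a minimal sketch under illustrative parameters, not code from any surveyed paper.

    # Minimal sketch of a spectral ocean heightfield in the Tessendorf style:
    # sample a Phillips spectrum, animate with the deep-water dispersion
    # relation w(k) = sqrt(g*k), and invert with an FFT. Values are illustrative.
    import numpy as np

    N, L, g = 64, 100.0, 9.81           # grid size, patch size (m), gravity
    wind = np.array([10.0, 0.0])        # wind velocity (m/s)

    k1 = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)
    kx, ky = np.meshgrid(k1, k1)
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1e-6                      # avoid division by zero at k = 0

    def phillips(kx, ky, k):
        A = 4e-3                        # overall amplitude (illustrative)
        Lw = wind @ wind / g            # largest wave arising from the wind speed
        kdw = (kx * wind[0] + ky * wind[1]) / (k * np.linalg.norm(wind))
        P = A * np.exp(-1.0 / (k * Lw) ** 2) / k**4 * kdw**2
        P[0, 0] = 0.0                   # no DC component
        return P

    rng = np.random.default_rng(0)
    xi = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    h0 = xi * np.sqrt(phillips(kx, ky, k) / 2.0)   # initial spectrum

    idx = (-np.arange(N)) % N           # index map for k -> -k on the FFT grid
    h0_minus_k = h0[np.ix_(idx, idx)]

    def height(t: float) -> np.ndarray:
        w = np.sqrt(g * k)              # deep-water dispersion relation
        ht = h0 * np.exp(1j * w * t) + np.conj(h0_minus_k) * np.exp(-1j * w * t)
        return np.real(np.fft.ifft2(ht))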

    Methods of Nature: Landscapes from the Gettysburg College Collection

    Methods of Nature: Landscapes from the Gettysburg College Collection is the third annual exhibition curated by students enrolled in the Art History Methods course. The exhibition is an exciting academic endeavor and an incredible opportunity for engaged learning, research, and curatorial experience. The five student curators are Molly Chason ’17, Leah Falk ’18, Shannon Gross ’17, Bailey Harper ’19, and Laura Waters ’19. The selection of artworks in this exhibition includes depictions of landscape in the nineteenth- and twentieth-century French, American, and East Asian cultural traditions, in art forms ranging from the traditional media of paintings and prints to utilitarian artifacts of porcelain and a paper folding fan. The landscape paintings in this exhibition are inspired by nature, specific locales, and literature. Each object carries a distinctive characteristic, a mood, and an ambience. Collectively, they present a multifaceted view of the landscape in the hearts and minds of the artists and intended viewers. [excerpt]

    Rain rendering for evaluating and improving robustness to bad weather

    Rain fills the atmosphere with water particles, which breaks the common assumption that light travels unaltered from the scene to the camera. While it is well known that rain affects computer vision algorithms, quantifying its impact is difficult. In this context, we present a rain rendering pipeline that enables systematic evaluation of common computer vision algorithms under controlled amounts of rain. We present three different ways to add synthetic rain to existing image datasets: completely physics-based, completely data-driven, and a combination of both. The physics-based rain augmentation combines a physical particle simulator with accurate rain photometric modeling. We validate our rendering methods with a user study, demonstrating that our rain is judged as much as 73% more realistic than the state of the art. Using our generated rain-augmented KITTI, Cityscapes, and nuScenes datasets, we conduct a thorough evaluation of object detection, semantic segmentation, and depth estimation algorithms, and show that their performance decreases in degraded weather: on the order of 15% for object detection, 60% for semantic segmentation, and a 6-fold increase in depth estimation error. Fine-tuning on our augmented synthetic data yields improvements of 21% on object detection, 37% on semantic segmentation, and 8% on depth estimation. Comment: 19 pages, 19 figures, IJCV 2020 preprint. arXiv admin note: text overlap with arXiv:1908.1033
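
    The compositing half of such a pipeline can be reduced, at its simplest, to photometrically blending simulated streaks into an image. The sketch below is a heavily simplified stand-in for the paper's photometric model: streak positions are random rather than simulator-driven, and the blending constants are illustrative assumptions.

    # Much-simplified sketch of compositing synthetic rain streaks onto an image.
    # The paper couples a physical particle simulator with accurate photometric
    # modeling; here streaks are just alpha-blended bright segments (illustrative).
    import numpy as np

    def add_streak(img: np.ndarray, x: int, y: int, length: int,
                   alpha: float = 0.35, brightness: float = 0.9) -> None:
        """Blend one vertical motion-blurred streak into img (H, W, 3) in [0, 1]."""
        h, w, _ = img.shape
        for dy in range(length):
            yy = y + dy
            if 0 <= yy < h and 0 <= x < w:
                # Streaks appear brighter than the background they occlude.
                img[yy, x] = (1 - alpha) * img[yy, x] + alpha * brightness

    def augment(img: np.ndarray, n_streaks: int = 300, seed: int = 0) -> np.ndarray:
        rng = np.random.default_rng(seed)
        out = img.astype(np.float64).copy()
        for _ in range(n_streaks):
            x = int(rng.integers(0, img.shape[1]))
            y = int(rng.integers(0, img.shape[0]))
            add_streak(out, x, y, length=int(rng.integers(8, 24)))
        return out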