Real-time Realistic Rain Rendering
Artistic outdoor filming and rendering need specific weather conditions in order to trigger the intended audience reaction; for instance, rain, one of the most common conditions, is usually employed to convey a sense of unrest. Synthetic methods to recreate weather are an important avenue to simplify filming and reduce its cost, but the simulations are challenging due to the variety of phenomena that must be computed. Rain alone involves raindrops, splashes on the ground, fog, clouds, lightning, etc. We propose a new rain rendering algorithm that uses and extends state-of-the-art approaches in this field. The goal of our method is to achieve real-time rendering of rain streaks and splashes on the ground, while considering complex illumination effects and allowing artistic direction of drop placement.
Our algorithm takes as input an artist-defined rain distribution and density, and then creates particles in the scene following these indications. No restrictions are imposed on the dimensions of the rain area, so direct rendering approaches could rapidly overwhelm current computational capabilities with huge particle counts. To address this, we propose techniques that, at render time, adaptively sample the generated particles so that only those in regions that actually need to be simulated and rendered are selected.
Particle simulation is executed entirely on the graphics hardware. The algorithm proceeds by placing the particles at their updated coordinates. It then checks whether each particle is still falling as a rain streak, has reached the ground and become a splash, or should be discarded because it has entered a solid object in the scene. Different rendering techniques are used for each case. Complex illumination parameters are computed for rain streaks to select matching textures. These textures are generated in a preprocessing step and realistically simulate the interaction of light with the optical properties of the water drops.
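The per-particle update and three-way classification described in the abstract can be sketched as follows. This is a minimal CPU-side illustration in Python, not the paper's GPU implementation; the names `Particle`, `step`, and `classify` are our own, and the solid-object test is abstracted as a caller-supplied predicate.

```python
from dataclasses import dataclass
from enum import Enum

class ParticleState(Enum):
    STREAK = "streak"    # still falling: rendered as a rain streak
    SPLASH = "splash"    # reached the ground: rendered as a splash
    DISCARD = "discard"  # entered a solid object: removed from the scene

@dataclass
class Particle:
    x: float
    y: float   # height above the ground plane
    z: float
    vy: float  # vertical velocity (negative while falling)

def step(p: Particle, dt: float, gravity: float = -9.8) -> Particle:
    # Advance one simulation step (performed on the GPU in the paper).
    vy = p.vy + gravity * dt
    return Particle(p.x, p.y + vy * dt, p.z, vy)

def classify(p: Particle, ground_y: float, inside_solid) -> ParticleState:
    # The three cases from the abstract, checked after the position update.
    if inside_solid(p.x, p.y, p.z):
        return ParticleState.DISCARD
    if p.y <= ground_y:
        return ParticleState.SPLASH
    return ParticleState.STREAK
```

Each state then drives a different rendering path: streaks pick a precomputed illumination texture, while splashes use a separate ground-impact technique.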
Rain rendering for evaluating and improving robustness to bad weather
Rain fills the atmosphere with water particles, which breaks the common assumption that light travels unaltered from the scene to the camera. While it is well known that rain affects computer vision algorithms, quantifying its impact is difficult. In this context, we present a rain rendering pipeline that enables the systematic evaluation of common computer vision algorithms under controlled amounts of rain. We present three different ways to add synthetic rain to existing image datasets: completely physics-based, completely data-driven, and a combination of both. The physics-based rain augmentation combines a physical particle simulator with accurate rain photometric modeling. We validate our rendering methods with a user study, demonstrating that our rain is judged as much as 73% more realistic than the state of the art. Using our generated rain-augmented KITTI, Cityscapes, and nuScenes datasets, we conduct a thorough evaluation of object detection, semantic segmentation, and depth estimation algorithms and show that their performance decreases in degraded weather, on the order of 15% for object detection, 60% for semantic segmentation, and a 6-fold increase in depth estimation error. Fine-tuning on our augmented synthetic data yields improvements of 21% on object detection, 37% on semantic segmentation, and 8% on depth estimation.
Comment: 19 pages, 19 figures, IJCV 2020 preprint. arXiv admin note: text overlap with arXiv:1908.1033
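As a toy illustration of photometric rain augmentation (not the paper's pipeline, which couples a particle simulator with a full photometric model), a rain-streak intensity layer can be blended over an image so that streak pixels are pulled toward white, since raindrops scatter ambient light toward the camera. The function `composite_rain` and the simple linear blend below are our assumptions.

```python
def composite_rain(image, rain_layer, alpha=0.7):
    """Blend a rain-streak layer (values in [0, 1]) over a grayscale
    image (values in [0, 255]) with per-pixel opacity alpha * streak."""
    return [[(1.0 - alpha * r) * px + alpha * r * 255.0
             for px, r in zip(img_row, rain_row)]
            for img_row, rain_row in zip(image, rain_layer)]
```

Varying `alpha` (or the density of the streak layer) gives the controlled rain amounts the evaluation relies on.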
Formal Analysis and Redesign of a Neural Network-Based Aircraft Taxiing System with VerifAI
We demonstrate a unified approach to the rigorous design of safety-critical autonomous systems using the VerifAI toolkit for formal analysis of AI-based systems. VerifAI provides an integrated toolchain for tasks spanning the design process, including modeling, falsification, debugging, and ML component retraining. We evaluate all of these applications in an industrial case study on an experimental autonomous aircraft taxiing system developed by Boeing, which uses a neural network to track the centerline of a runway. We define runway scenarios using the Scenic probabilistic programming language and use them to drive tests in the X-Plane flight simulator. We first perform falsification, automatically finding environment conditions that cause the system to violate its specification by deviating significantly from the centerline (or even leaving the runway entirely). Next, we use counterexample analysis to identify distinct failure cases and confirm their root causes with specialized testing. Finally, we use the results of falsification and debugging to retrain the network, eliminating several failure cases and improving the overall performance of the closed-loop system.
Comment: Full version of a CAV 2020 paper
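The falsification step has a simple generic shape: sample environment conditions, run the closed-loop system, and record every run that violates the specification. The sketch below is our own illustration, not the VerifAI API; the environment parameters (`sun_angle`, `cloud_cover`) are hypothetical, and a real Scenic scenario would declare their distributions declaratively.

```python
import random

def falsify(simulate, max_deviation, trials=100, seed=0):
    """Keep every sampled run whose centerline deviation breaks the spec."""
    rng = random.Random(seed)
    counterexamples = []
    for _ in range(trials):
        # Hypothetical environment parameters for illustration only.
        env = {"sun_angle": rng.uniform(0.0, 90.0),
               "cloud_cover": rng.uniform(0.0, 1.0)}
        deviation = simulate(env)  # max lateral deviation from the centerline
        if deviation > max_deviation:
            counterexamples.append((env, deviation))
    return counterexamples
```

The returned counterexamples are what the subsequent debugging and retraining stages consume.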
Understanding and controlling the ingress of driven rain through exposed, solid wall masonry structures
Long-term performance of historic buildings can be affected by many environmental factors, some of which become more apparent as the competence of the fabric deteriorates. Many tall historic buildings suffer from water ingress when exposed to driving rain conditions, particularly church towers in the south west of England. It is important to recognise that leakage can occur not only through flaws in the roof of a building but also through significant thicknesses of solid masonry. Identification of the most appropriate intervention requires an understanding of the way in which water might enter the structure and the assessment of potential repair options. While the full work schedule used an integrated assessment involving laboratory, field and archival work to assess the repairs which might be undertaken on these solid wall structures, this paper focuses on the laboratory work done to inform the writing of a Technical Advice Note on the effects of wind-driven rain and moisture movement in historic structures (English Heritage, 2012). The laboratory work showed that grouting and rendering were effective at reducing water penetration without retarding drying rates, but that the use of internal plastering also had a very beneficial effect.
Rain Removal in Traffic Surveillance: Does it Matter?
Varying weather conditions, including rainfall and snowfall, are generally regarded as a challenge for computer vision algorithms. One proposed solution to the challenges induced by rain and snowfall is to artificially remove the rain from images or video using rain removal algorithms. The promise of these algorithms is that the rain-removed image frames will improve the performance of subsequent segmentation and tracking algorithms. However, rain removal algorithms are typically evaluated on their ability to remove synthetic rain from a small subset of images. Currently, their behavior on real-world videos, when integrated with a typical computer vision pipeline, is unknown. In this paper, we review the existing rain removal algorithms and propose a new dataset that consists of 22 traffic surveillance sequences under a broad variety of weather conditions, all of which include either rain or snowfall. We propose a new evaluation protocol that evaluates the rain removal algorithms on their ability to improve the performance of subsequent segmentation, instance segmentation, and feature tracking algorithms under rain and snow. If successful, the de-rained frames of a rain removal algorithm should improve segmentation performance and increase the number of accurately tracked features. The results show that a recent single-frame-based rain removal algorithm increases segmentation performance by 19.7% on our proposed dataset, but it ultimately decreases feature tracking performance and shows mixed results with recent instance segmentation methods. However, the best video-based rain removal algorithm improves feature tracking accuracy by 7.72%.
Comment: Published in IEEE Transactions on Intelligent Transportation Systems
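The core of the proposed protocol is to score a rain removal algorithm by the change it induces in downstream task metrics, rather than by pixel similarity to synthetic ground truth. A minimal sketch of that scoring step (the function name and dictionary layout are our own):

```python
def evaluate_derain(metrics_rainy, metrics_derained):
    """Percent change per downstream task (e.g. segmentation, tracking)
    induced by de-raining the input; positive means de-raining helped."""
    return {task: (metrics_derained[task] - m) / m * 100.0
            for task, m in metrics_rainy.items()}
```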
Methods of Nature: Landscapes from the Gettysburg College Collection
Methods of Nature: Landscapes from the Gettysburg College Collection is the third annual exhibition curated by students enrolled in the Art History Methods course. The exhibition is an exciting academic endeavor and incredible opportunity for engaged learning, research, and curatorial experience. The five student curators are Molly Chason ’17, Leah Falk ’18, Shannon Gross ’17, Bailey Harper ’19 and Laura Waters ’19. The selection of artworks in this exhibition includes the depiction of landscape in the nineteenth- and twentieth-century French, American and East Asian cultural traditions in various art forms from traditional media of paintings and prints to utilitarian artifacts of porcelain and a paper folding fan. Landscape paintings in this exhibition are inspired by nature, specific locales and literature. Each object carries a distinctive characteristic, a mood, and an ambience. Collectively, they present a multifaceted view of the landscape in the heart and mind of the artists and intended viewers. [excerpt]