Seeing the Wind: Visual Wind Speed Prediction with a Coupled Convolutional and Recurrent Neural Network
Wind energy resource quantification, air pollution monitoring, and weather
forecasting all rely on rapid, accurate measurement of local wind conditions.
Visual observations of the effects of wind---the swaying of trees and flapping
of flags, for example---encode information regarding local wind conditions that
can potentially be leveraged for visual anemometry that is inexpensive and
ubiquitous. Here, we demonstrate a coupled convolutional neural network and
recurrent neural network architecture that extracts the wind speed encoded in
visually recorded flow-structure interactions of a flag and tree in naturally
occurring wind. Predictions for wind speeds ranging from 0.75-11 m/s showed
agreement with measurements from a cup anemometer on site, with a
root-mean-squared error approaching the natural wind speed variability due to
atmospheric turbulence. Generalizability of the network was demonstrated by
successful prediction of wind speed based on recordings of other flags in the
field and in a controlled wind tunnel test. Furthermore, physics-based scaling
of the flapping dynamics accurately predicts the dependence of the network
performance on the video frame rate and duration.Comment: NeurIPS 2019 (to appear). The dataset has been expanded to include
videos of a tree canopy in addition to flags. The models were retrained, and
results were updated accordingly. The introduction and related work sections
were also expand upon. Clarifying details were added to explain author
choices such as time averaging windows and to further discuss test set
result
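The coupled architecture described above can be illustrated with a minimal sketch: a per-frame convolutional feature extractor feeds a recurrent network that integrates flapping dynamics across frames, ending in a scalar wind-speed regression. This is an assumption-laden toy in plain numpy, not the authors' actual model; all dimensions, weights, and function names here are hypothetical.

```python
import numpy as np

# Hypothetical sketch of a coupled CNN + RNN wind-speed regressor.
# Per-frame conv features (CNN stand-in) -> vanilla RNN over time ->
# scalar prediction. Not the paper's trained model.

rng = np.random.default_rng(0)

def conv_features(frame, kernels):
    """Valid 3x3 convolution per kernel, ReLU, then global average pooling."""
    h, w = frame.shape
    feats = []
    for k in kernels:
        out = np.zeros((h - 2, w - 2))
        for i in range(h - 2):
            for j in range(w - 2):
                out[i, j] = np.sum(frame[i:i + 3, j:j + 3] * k)
        feats.append(np.maximum(out, 0).mean())  # ReLU + pool to one number
    return np.array(feats)

def rnn_predict(video, kernels, Wxh, Whh, Why):
    """Run a vanilla RNN over per-frame CNN features; map last state to speed."""
    h = np.zeros(Whh.shape[0])
    for frame in video:
        x = conv_features(frame, kernels)     # CNN stage: spatial features
        h = np.tanh(Wxh @ x + Whh @ h)        # RNN stage: temporal dynamics
    return float(Why @ h)                      # scalar wind-speed estimate (m/s)

# Toy input: 16 grayscale frames of 8x8 pixels; 4 kernels; 8 hidden units.
video = rng.standard_normal((16, 8, 8))
kernels = rng.standard_normal((4, 3, 3))
Wxh = rng.standard_normal((8, 4)) * 0.1
Whh = rng.standard_normal((8, 8)) * 0.1
Why = rng.standard_normal(8) * 0.1

speed = rnn_predict(video, kernels, Wxh, Whh, Why)
print(speed)
```

In practice the paper's finding that performance depends on frame rate and duration corresponds here to the number and spacing of frames fed through the recurrent loop.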