
    Improving accuracy on wave height estimation through machine learning techniques

    Estimation of wave agitation plays a key role in natural disaster prediction, path optimization, and safe harbor operation. The Spanish agency Puertos del Estado (PdE) operates several oceanographic measurement networks equipped with sensors for different physical variables and manages forecast systems based on numerical models. In recent years, there has been growing interest in wave parameter estimation using machine learning models, due to the large amount of oceanographic data available for training as well as their proven efficacy in estimating physical variables. In this study, we propose to use machine learning techniques to improve the accuracy of the current PdE forecast system. We focus on four physical wave variables: spectral significant height, mean spectral period, peak period, and mean direction of origin. Two different machine learning models have been explored, multilayer perceptron and gradient boosting decision trees, as well as ensemble methods that combine both models. These models reduce the error of the numerical model's predictions by 36% on average, demonstrating the potential gains of combining machine learning and numerical models.
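
    As a rough illustration of the approach described above, the sketch below corrects a numerical wave-height forecast with an averaged multilayer-perceptron / gradient-boosting ensemble in scikit-learn. This is a minimal sketch, not the PdE system: the feature set, the "observed" target, and all data are synthetic placeholders.

```python
# Minimal sketch (not the PdE system): correcting a numerical wave-height
# forecast with an MLP / gradient-boosting ensemble. All data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(0.5, 6.0, n),    # hypothetical forecast significant wave height (m)
    rng.uniform(0.0, 25.0, n),   # hypothetical wind speed (m/s)
    rng.uniform(0.0, 360.0, n),  # hypothetical wind direction (deg)
])
# Synthetic "observed" wave height: forecast plus a wind-dependent bias and noise.
y = X[:, 0] + 0.05 * X[:, 1] + rng.normal(0.0, 0.2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0).fit(X_tr, y_tr)
gbt = GradientBoostingRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Simple ensemble: average the two corrected predictions.
y_ens = 0.5 * (mlp.predict(X_te) + gbt.predict(X_te))

print("numerical forecast MAE:", mean_absolute_error(y_te, X_te[:, 0]))
print("ML ensemble MAE:       ", mean_absolute_error(y_te, y_ens))
```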

    ANN wave prediction model for winter storms and hurricanes

    Currently available wind-wave prediction models require a prohibitive amount of computing time to simulate non-linear wave-wave interactions. Moreover, some parts of the wind-wave generation process are not yet fully understood, so accurate predictions are not always guaranteed. In contrast, Artificial Neural Network (ANN) techniques are designed to recognize the patterns between input and output, saving considerable computing time so that real-time wind-wave forecasts can be made available to the navy and commercial ships. This study therefore uses ANN techniques to predict waves for winter storms and hurricanes with much less computing time at five National Oceanic and Atmospheric Administration (NOAA) wave stations along the U.S. East Coast from Florida to Maine (stations 44007, 44013, 44025, 44009, and 41009). To identify the prediction error sources of an ANN model, completely known wind-wave events simulated with the SMB model were used. The ANN predicted even untrained wind-wave events accurately, implying that it could be used for winter-storm and hurricane wave predictions. For the prediction of winter-storm waves, 1999 and 2001 winter-storm events with 403 data points and 1998 winter-storm events with 78 points were prepared as the training and validation data sets, respectively. Because winter storms are relatively evenly distributed over a large area and move slowly, wind information (u and v wind components) over a large domain was used as the ANN input. When a 24-hour time delay was used to represent the time required for seas to become fully developed, the ANN predicted wave heights accurately (r = 0.88), but the prediction accuracy for zero-crossing wave periods was much lower (r = 0.61). For the prediction of hurricane waves, 15 hurricanes from 1995 to 2001 and Hurricane Bertha in 1998 were prepared as the training and validation data sets, respectively. Because hurricanes affect a relatively small domain, move quickly, and change dramatically with time, the location of the hurricane center, the maximum wind speed, the central pressure, and the longitudinal and latitudinal distances between the wave stations and the hurricane center were used as inputs. The ANN predicted wave heights accurately when a 24-hour time delay was used (r = 0.82), but the prediction accuracy for peak-wave periods was much lower (r = 0.50), because the physical processes governing wave periods are more complicated than those governing wave heights. This study demonstrates the potential of ANN techniques as winter-storm and hurricane-wave prediction models. If more winter-storm and hurricane data become available and hurricane tracks can be predicted, real-time wind-waves could be forecast more accurately with less computing time.
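
    The time-delay idea above can be sketched very simply: a feed-forward ANN maps u and v wind components over a small domain to significant wave height 24 hours later. This is an illustrative sketch, not the study's SMB/ANN setup; the wind field, the wave response, and the network size are assumptions.

```python
# Minimal sketch of a time-delayed wind-to-wave ANN; all data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_hours, n_points, lag = 3000, 5, 24          # hourly wind field, 24 h delay
u = rng.normal(0, 8, (n_hours, n_points))     # zonal wind component (m/s)
v = rng.normal(0, 8, (n_hours, n_points))     # meridional wind component (m/s)
speed = np.sqrt(u**2 + v**2).mean(axis=1)     # domain-mean wind speed

# Hypothetical wave response: Hs 24 h later is driven by the earlier wind speed.
X = np.hstack([u[:-lag], v[:-lag]])                                   # winds at time t
y = 0.05 * speed[:-lag] ** 1.5 + rng.normal(0, 0.15, n_hours - lag)   # Hs at t + 24 h

split = int(0.8 * len(X))
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=1)
ann.fit(X[:split], y[:split])
r = np.corrcoef(y[split:], ann.predict(X[split:]))[0, 1]
print("correlation r on held-out hours:", round(r, 2))
```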

    Storm Tide and Wave Simulations and Assessment

    In this Special Issue, seven high-quality papers covering the application and development of many high-end techniques for studies on storm tides, surges, and waves have been published, for instance: the employment of an artificial neural network for predicting coastal freak waves [1]; a reproduction of super-typhoon-created extreme waves [2]; a numerical analysis of nonlinear interactions among storm waves, tides, and currents [3]; wave simulation for an island using a circulation-wave coupled model [4]; an analysis of typhoon-induced waves along typhoon tracks in the western North Pacific Ocean [5]; an understanding of how a storm surge prevents or severely restricts aeolian supply [6]; and an investigation of coastal settlements and an assessment of their vulnerability [7].

    Using an Artificial Neural Network for Wave Height Forecasting in the Red Sea

    Artificial Neural Networks (ANNs) are widely used in the field of wave forecasting as data-based soft-computing techniques that do not require prior knowledge of the relationships between the forecasted waves and the controlling physical mechanisms. Among ANN techniques is the Nonlinear Auto-Regressive network with eXogenous inputs (NARX), based on which two models were developed in this study to predict significant wave heights in the Eastern Central Red Sea for the next 3, 6, 12, and 24 h. The two NARX-based models differ only in whether the difference between wind and wave directions is included as an input. Both models have shown the ability to efficiently predict significant wave heights up to 12 hours in advance. However, the better performance of the model that included the wind-wave direction difference indicates the significance of including such an input term.
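
    A NARX-style setup can be sketched by regressing significant wave height a few hours ahead on its own recent values plus exogenous inputs, here wind speed and a wind-wave direction difference term. This is a hedged sketch, not the paper's model; all series, lags, and network settings are assumptions.

```python
# Minimal NARX-style sketch: Hs at t+3h from past Hs plus exogenous wind inputs.
# All series below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n, lags, horizon = 4000, 6, 3                  # hourly series, 6 past lags, 3 h ahead
wind = 8 + 4 * np.sin(np.arange(n) / 24) + rng.normal(0, 1, n)   # wind speed (m/s)
dir_diff = rng.normal(0, 30, n)                # wind-wave direction difference (deg)
hs = 0.4 + 0.05 * wind + rng.normal(0, 0.1, n) # significant wave height (m)

rows, targets = [], []
for t in range(lags, n - horizon):
    past_hs = hs[t - lags:t]                   # autoregressive terms
    exog = [wind[t], dir_diff[t]]              # exogenous inputs at time t
    rows.append(np.concatenate([past_hs, exog]))
    targets.append(hs[t + horizon])            # value 3 h ahead
X, y = np.array(rows), np.array(targets)

split = int(0.8 * len(X))
narx = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=2)
narx.fit(X[:split], y[:split])
print("held-out R^2:", round(narx.score(X[split:], y[split:]), 2))
```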

    Radio Galaxy Zoo: Knowledge Transfer Using Rotationally Invariant Self-Organising Maps

    With the advent of large-scale surveys, the manual analysis and classification of individual radio source morphologies becomes impossible, as existing approaches do not scale. The analysis of complex morphological features in the spatial domain is a particularly important task. Here we discuss the challenges of transferring crowdsourced labels obtained from the Radio Galaxy Zoo project and introduce a proper transfer mechanism via quantile random forest regression. Using parallelized rotation- and flipping-invariant Kohonen maps, image cubes of Radio Galaxy Zoo-selected galaxies, formed from the FIRST radio continuum and WISE infrared all-sky surveys, are first projected down to a two-dimensional embedding in an unsupervised way. This embedding can be seen as a discretised space of shapes, with the coordinates reflecting morphological features as expressed by the automatically derived prototypes. We find that these prototypes reconstruct physically meaningful processes across the two-channel images at radio and infrared wavelengths in an unsupervised manner. In the second step, images are compared with these prototypes to create a heat-map, which is the morphological fingerprint of each object and the basis for transferring the user-generated labels. These heat-maps reduce the feature space by a factor of 248 and can serve as the basis for subsequent machine learning methods. Using an ensemble of decision trees on these heat-maps, we achieve upwards of 85.7% and 80.7% accuracy when predicting the number of components and peaks in an image, respectively. We also question the currently used discrete classification schema and introduce a continuous scale that better reflects the uncertainty in the transition between two classes, caused by sensitivity and resolution limits.
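
    The prototype, heat-map, and tree-ensemble pipeline described above can be sketched very roughly as follows. A plain k-means codebook stands in for the rotation- and flipping-invariant Kohonen map, a random forest classifier stands in for the quantile random forest regression, and random vectors stand in for the FIRST/WISE image cubes; everything here is illustrative, not the paper's implementation.

```python
# Rough stand-in for the prototype / heat-map / label-transfer pipeline.
# k-means replaces the SOM, a random forest replaces the quantile random forest,
# and the "images" and "labels" are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics.pairwise import euclidean_distances
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_images, n_pixels, n_prototypes = 1000, 32 * 32, 25
images = rng.normal(0, 1, (n_images, n_pixels))      # stand-in image vectors
n_components = rng.integers(1, 4, n_images)          # stand-in crowdsourced labels

# Step 1: unsupervised prototype layer (k-means here, a Kohonen map in the paper).
codebook = KMeans(n_clusters=n_prototypes, n_init=5, random_state=3).fit(images)

# Step 2: heat-map fingerprint = distance of each image to every prototype.
heatmaps = euclidean_distances(images, codebook.cluster_centers_)

# Step 3: transfer labels with a tree ensemble trained on the fingerprints.
X_tr, X_te, y_tr, y_te = train_test_split(heatmaps, n_components, random_state=3)
clf = RandomForestClassifier(n_estimators=200, random_state=3).fit(X_tr, y_tr)
print("component-count accuracy (synthetic data):", clf.score(X_te, y_te))
```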

    Integrating Deep Learning and Hydrodynamic Modeling to Improve the Great Lakes Forecast

    The Laurentian Great Lakes, one of the world's largest surface freshwater systems, pose a modeling challenge for seasonal forecasting and climate projection. While physics-based hydrodynamic modeling is a fundamental approach, improving forecast accuracy remains critical. In recent years, machine learning (ML) has quickly emerged in geoscience applications, but its application to Great Lakes hydrodynamic prediction is still in its early stages. This work is the first to explore a deep learning approach to predicting the spatiotemporal distribution of lake surface temperature (LST) in the Great Lakes. Our study shows that a Long Short-Term Memory (LSTM) neural network, trained with limited data from hypothetical monitoring networks, can provide consistent and robust performance. The LSTM prediction captured the spatiotemporal variability of LST across the five Great Lakes well, suggesting an effective and efficient way to design monitoring networks that support ML-based forecasting. Furthermore, we employed an explainable artificial intelligence (XAI) technique named SHapley Additive exPlanations (SHAP) to uncover how the input features influence the LSTM prediction. Our XAI analysis shows that air temperature is the most influential feature for predicting LST in the trained LSTM. The relatively large bias in the LSTM prediction during spring and fall was associated with the substantial heterogeneity of air temperature during those two seasons. In contrast, the physics-based hydrodynamic model performed better in spring and fall yet exhibited relatively large biases during the summer stratification period. Finally, we developed a statistical integration of the hydrodynamic modeling and deep learning results based on the Best Linear Unbiased Estimator (BLUE). The integration further enhanced prediction accuracy, suggesting its potential for next-generation Great Lakes forecast systems.
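
    The BLUE-style integration mentioned at the end of this abstract amounts to a weighted combination of the two predictions, with weights inversely proportional to their estimated error variances under the assumption of independent, unbiased errors. The sketch below illustrates this on synthetic lake surface temperature series; it is not the paper's implementation.

```python
# Minimal BLUE-style combination of two LST predictions; all series are synthetic.
import numpy as np

rng = np.random.default_rng(4)
truth = 10 + 8 * np.sin(np.linspace(0, 2 * np.pi, 365))   # "true" daily LST (deg C)
hydro = truth + rng.normal(0, 1.5, truth.size)            # hydrodynamic model output
lstm = truth + rng.normal(0, 1.0, truth.size)             # LSTM prediction

# Estimate error variances on a validation slice where observations are available.
val = slice(0, 180)
var_h = np.var(hydro[val] - truth[val])
var_l = np.var(lstm[val] - truth[val])

# BLUE weights: inversely proportional to the error variance of each estimator.
w_h = var_l / (var_h + var_l)
w_l = var_h / (var_h + var_l)
blue = w_h * hydro + w_l * lstm

test = slice(180, None)
for name, pred in [("hydro", hydro), ("LSTM", lstm), ("BLUE", blue)]:
    rmse = np.sqrt(np.mean((pred[test] - truth[test]) ** 2))
    print(f"{name:>5} RMSE: {rmse:.2f} deg C")
```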