3,404 research outputs found

    Numerical Computations of a Two-Layer Model for Estuaries

    The research described in this paper is aimed at improving the predictive capability of numerical models for estuarine circulation. An improved two-layer model has been developed that is applicable to the entire estuary, including the areas near the river mouth and the estuary inlet. The model is applied to Apalachicola Bay, Florida, and its calibration and verification are accomplished with available prototype data. Horizontal density gradient terms have been added to the model and provide significantly improved salinity predictions near the river mouths. A theoretical approach to the internal wave boundary condition has also been developed within this two-layer model.

    Resuspension and Transport of Fine Sediments by Waves

    Although waves are the primary cause of sediment resuspension in the nearshore zone, existing theoretical models often take the large-scale currents induced by the mean wind to be the only agent for the diffusion and convection of resuspended sediments. We present here theoretical examples in which waves play a direct role in all aspects of sediment transport. Details are given for the simple case where only waves are present; the wave-induced current and diffusivity are shown to be no less important than the corresponding factors in the wind-driven current. Hence, a comprehensive model should include not only the current forced directly by the mean wind but also the current forced by waves, which may or may not be forced by the local wind.
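
    The transport mechanism the abstract argues for can be illustrated with a generic one-dimensional advection-diffusion step for suspended-sediment concentration. This is a minimal sketch, not the paper's model: the current `u` and diffusivity `K` are assumed totals that, per the abstract's argument, would combine wind-driven and wave-induced contributions.

    ```python
    import numpy as np

    # Illustrative sketch only: 1D advection-diffusion of a patch of
    # resuspended sediment on a periodic domain. Values are assumed.
    nx, dx, dt = 200, 1.0, 0.2
    u = 0.5          # total current, m/s (wind-driven plus wave-driven)
    K = 1.0          # total diffusivity, m^2/s (including wave mixing)

    c = np.zeros(nx)
    c[90:110] = 1.0  # initial patch of resuspended sediment

    for _ in range(100):
        # upwind advection + central diffusion (explicit, stable for these
        # parameters: u*dt/dx = 0.1, K*dt/dx^2 = 0.2)
        adv = -u * (c - np.roll(c, 1)) / dx
        dif = K * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
        c = c + dt * (adv + dif)

    print(f"total mass: {c.sum():.3f}, peak concentration: {c.max():.3f}")
    ```

    Both operators conserve total sediment mass on the periodic domain, so the patch spreads and translates downstream without loss; dropping either `u` or `K` shows how convection and diffusion each shape the plume.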

    Comparative Data Mining Analysis for Information Retrieval of MODIS Images: Monitoring Lake Turbidity Changes at Lake Okeechobee, Florida

    In the remote sensing field, a frequently recurring question is: which computational intelligence or data mining algorithms are most suitable for retrieving essential information, given that most natural systems exhibit very high non-linearity? Potential candidates include empirical regression, neural network models, support vector machines, genetic algorithms/genetic programming, and analytical equations. This paper compares three data mining techniques (multiple non-linear regression, artificial neural networks, and genetic programming) for estimating multi-temporal turbidity changes following hurricane events at Lake Okeechobee, Florida. This retrospective analysis aims to identify how the major hurricanes of 2003-2004 impacted water quality management. Moderate Resolution Imaging Spectroradiometer (MODIS) Terra 8-day composite imagery was used to retrieve the spatial patterns of turbidity distributions for comparison against the patterns discernible in the in-situ observations. Based on an evaluation of four statistical parameters, the genetic programming model was selected as the most suitable data mining tool for classification, and the MODIS band 1 image and wind speed were recognized as the major determinants by the model. The multi-temporal turbidity maps generated before and after the major hurricane events of 2003-2004 show that turbidity levels were substantially higher after hurricane episodes. The spatial patterns of turbidity confirm that sediment-laden water travels to the shore, where it reduces the intensity of the light that submerged plants need for photosynthesis; this reduction results in a substantial loss of biomass during the post-hurricane period.
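
    The simplest of the three compared techniques, multiple non-linear regression, can be sketched on synthetic data. Everything below is assumed for illustration: the predictors (band-1 reflectance and wind speed, the two determinants the abstract highlights), the interaction form, and the coefficients are hypothetical, not the paper's fitted model.

    ```python
    import numpy as np

    # Hypothetical sketch: regress turbidity on MODIS band-1 reflectance and
    # wind speed with an interaction term. The data are synthetic.
    rng = np.random.default_rng(42)
    n = 200
    b1 = rng.uniform(0.01, 0.20, n)    # band-1 reflectance (synthetic)
    wind = rng.uniform(0.0, 15.0, n)   # wind speed, m/s (synthetic)

    # Assumed "true" relation, used only to generate the sample.
    turbidity = 5.0 + 300.0 * b1 + 0.8 * wind + 40.0 * b1 * wind
    turbidity += rng.normal(0.0, 1.0, n)  # observation noise

    # The interaction column makes the model non-linear in the predictors
    # while remaining linear in the coefficients, so least squares applies.
    X = np.column_stack([np.ones(n), b1, wind, b1 * wind])
    coef, *_ = np.linalg.lstsq(X, turbidity, rcond=None)

    rmse = float(np.sqrt(np.mean((X @ coef - turbidity) ** 2)))
    print(f"fit RMSE = {rmse:.2f}")
    ```

    In the paper's comparison, this kind of fixed-form regression is the baseline that the neural network and genetic programming models are judged against; genetic programming differs in that it searches over the functional form itself rather than assuming one.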

    Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Background: Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While commercial and freely licensed software tools are available to assist with these tasks, researchers usually have to use multiple software packages because each generally focuses on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public.
    Results: In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all stages of NMR-based metabonomics studies. It provides high-throughput automatic modules implementing recently proposed algorithms, together with powerful manual modules, for 1D NMR spectral processing. In addition to spectral processing functions, features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied, covering feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means), and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile.
    Conclusion: Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can in most cases complete spectral processing and data analysis within one software package. Moreover, with its open source architecture, interested researchers can further develop and extend the software based on the existing infrastructure.
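
    Of the data-reduction methods the abstract lists, PCA is the most generic and can be sketched without Automics itself. The synthetic "spectra" below, the peak position, and the group intensities are all assumed for illustration; the sketch only shows why PCA separates two metabolic groups when one peak dominates the variance.

    ```python
    import numpy as np

    # Minimal sketch (not Automics code): PCA on synthetic 1D NMR-like spectra
    # from two groups that differ in the intensity of one "metabolite" peak.
    rng = np.random.default_rng(0)
    n_samples, n_points = 40, 500

    ppm = np.linspace(0, 10, n_points)
    peak = np.exp(-((ppm - 3.5) ** 2) / 0.01)     # assumed peak at 3.5 ppm
    labels = np.array([0] * 20 + [1] * 20)        # two metabolic groups
    spectra = rng.normal(0.0, 0.05, (n_samples, n_points))
    spectra += np.outer(np.where(labels == 1, 2.0, 0.5), peak)

    # PCA via SVD of the mean-centred data matrix.
    centred = spectra - spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    scores = U * S   # sample coordinates in principal-component space

    # PC1 should separate the groups, since the peak dominates the variance.
    sep = abs(scores[labels == 0, 0].mean() - scores[labels == 1, 0].mean())
    print(f"group separation along PC1: {sep:.2f}")
    ```

    In a metabonomics workflow this unsupervised view typically precedes the supervised methods (PLS-DA, SVM, etc.) that the abstract lists, since it reveals group structure without using the labels.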