
    Water residence time in Chesapeake Bay for 1980-2012

    Concern has grown over the increasing loads of nutrients and pollutants discharged into estuaries and coastal seas. The retention and export of these materials within a system depend on the residence time (RT). A long-term simulation of the time-varying RT of the Chesapeake Bay was conducted for the period 1980-2012. The 33-year simulation shows that the mean RT of the entire Chesapeake Bay system ranges from 110 to 264 days, with an average of 180 days. The RT was larger in the bottom layers than in the surface layers owing to the persistent stratification and estuarine circulation. A clear seasonal cycle of RT was found, with a much smaller RT in winter than in summer, indicating that materials discharged in winter are quickly transported out of the estuary by the winter-spring high flow. Large interannual variability of the RT was highly correlated with the variability of river discharge (R² = 0.92). The monthly variability of RT can be partially attributed to the variability of estuarine circulation: a strengthened estuarine circulation produces a larger bottom influx and thus reduces the RT. Wind also exerts a significant impact on the RT, with the upstream wind being more important in controlling the lateral pattern of RT in the mainstem. © 2016 Elsevier B.V. All rights reserved.
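    The reported discharge-RT relationship can be sketched in a few lines. The numbers below are synthetic stand-ins (not the paper's data), chosen only to illustrate how a squared correlation coefficient such as R² = 0.92 is computed from annual means.

```python
import numpy as np

# Hypothetical annual mean river discharge (m^3/s) and residence time (days);
# the values are invented, chosen only to illustrate the R^2 computation.
discharge = np.array([2300., 1800., 2900., 1500., 2600., 2000., 3100., 1700.])
residence_time = np.array([150., 210., 120., 250., 135., 190., 112., 230.])

# High flow flushes the estuary faster, so RT and discharge are anticorrelated;
# the squared Pearson correlation summarizes the strength of that relationship.
r = np.corrcoef(discharge, residence_time)[0, 1]
r_squared = r ** 2
print(f"r = {r:.3f}, R^2 = {r_squared:.3f}")
```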

    Tidal Response to Sea-Level Rise in Different Types of Estuaries: The Importance of Length, Bathymetry, and Geometry

    Tidal response to sea-level rise (SLR) varies across coastal systems. To provide a generic pattern of tidal response to SLR, a systematic investigation was conducted using numerical techniques applied to idealized and realistic estuaries, with model results cross-checked against analytical solutions. Our results reveal that the response of tidal range to SLR is nonlinear, spatially heterogeneous, and strongly affected by the length and bathymetry of an estuary, but only weakly affected by estuary convergence, except in cases of strong convergence. Contrary to the common assumption that SLR weakens bottom friction and thereby increases tidal amplitude, we demonstrate that tidal range is likely to decrease in short estuaries and in estuaries with a narrow channel and large low-lying shallow areas.
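    The length- and depth-dependence described above can be illustrated with the classical linear solution for a co-oscillating tide in a frictional prismatic channel closed at its head. This is only a sketch: the friction coefficient, depth, and lengths are assumed values, and the solution omits the low-lying flood areas that drive the tidal-range decrease reported in the paper.

```python
import numpy as np

g = 9.81          # gravity (m/s^2)
omega = 1.4e-4    # M2 tidal frequency (rad/s)
r = 2.0e-3        # linearized friction coefficient (m/s), an assumed value

def head_amplification(h, L):
    """Tidal amplitude at the closed head of a prismatic channel, relative to
    the mouth, from the classical linear co-oscillation solution with a
    complex wavenumber that encodes friction."""
    c0 = np.sqrt(g * h)
    k = (omega / c0) * np.sqrt(1 - 1j * r / (omega * h))
    return 1.0 / abs(np.cos(k * L))

# SLR is mimicked by deepening the channel; the response depends on length,
# which is the qualitative point of the abstract.
for L in (50e3, 150e3, 250e3):
    before = head_amplification(10.0, L)
    after = head_amplification(10.5, L)   # +0.5 m of sea-level rise
    print(f"L = {L/1e3:.0f} km: {before:.3f} -> {after:.3f}")
```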

    Cascaded two-photon nonlinearity in a one-dimensional waveguide with multiple two-level emitters

    We propose and theoretically investigate a model for realizing cascaded optical nonlinearity with a few atoms and photons in one dimension (1D). The optical nonlinearity in our system is mediated by resonant interactions of photons with two-level emitters, such as atoms or quantum dots, in a 1D photonic waveguide. Multi-photon transmission in the waveguide is nonreciprocal when the emitters have different transition energies. Our theory provides a clear physical understanding of the origin of nonreciprocity in the presence of cascaded nonlinearity. We show how various two-photon nonlinear effects, including spatial attraction and repulsion between photons and background fluorescence, can be tuned by changing the number of emitters and the coupling between emitters (controlled by their separation).

    Near-field examination of perovskite-based superlenses and superlens-enhanced probe-object coupling

    A planar slab of negative-index material works as a superlens with sub-diffraction-limited imaging resolution, since propagating waves are focused and, moreover, evanescent waves are reconstructed in the image plane. Here, we demonstrate a low-loss superlens for electric evanescent fields using perovskites in the mid-infrared regime. The combination of near-field microscopy with a tunable free-electron laser allows us to address precisely the polariton modes, which are critical for super-resolution imaging. We spectrally study the lateral and vertical distributions of evanescent waves around the image plane of such a lens, and achieve an imaging resolution of wavelength/14 at the superlensing wavelength. Interestingly, at certain distances between the probe and the sample surface, we observe a maximum of these evanescent fields. Comparisons with numerical simulations indicate that this maximum originates from an enhanced coupling between probe and object, which might be applicable for multifunctional circuits, infrared spectroscopy, and thermal sensors. Published as an open-access article in Nature Communications (see http://www.nature.com/ncomms/).

    Supermodeling: Improving Predictions with an Ensemble of Interacting Models

    The modeling of weather and climate has been a success story. The skill of forecasts continues to improve and model biases continue to decrease. Combining the output of multiple models has further improved forecast skill and reduced biases. But are we exploiting the full capacity of state-of-the-art models in making forecasts and projections? Supermodeling is a recent step forward in the multimodel ensemble approach. Instead of combining model output after the simulations are completed, in a supermodel the individual models exchange state information as they run, influencing each other's behavior. By learning from past observations the optimal parameters that determine how the models influence each other, model errors are reduced at an early stage, before they propagate into larger scales and affect other regions and variables. The models synchronize on a common solution that, through learning, remains closer to the observed evolution. Effectively, a new dynamical system has been created: a supermodel that optimally combines the strengths of the constituent models. The supermodel approach has the potential to rapidly improve current state-of-the-art weather forecasts and climate predictions. In this paper we introduce supermodeling, demonstrate its potential in examples of various complexity, and discuss learning strategies. We conclude with a discussion of the remaining challenges for a successful application of supermodeling in the context of state-of-the-art models. The supermodeling approach is not limited to the modeling of weather and climate; it can be applied to improve the prediction capabilities of any complex system for which a set of different models exists.
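    The core mechanism, models exchanging state information as they run, can be sketched with two imperfect Lorenz-63 models coupled by nudging terms. The parameter perturbations and the coupling strength below are illustrative assumptions; in a real supermodel the connection coefficients would be learned from observations rather than set by hand.

```python
import numpy as np

def lorenz(state, sigma, rho, beta):
    """Lorenz-63 tendencies; each imperfect model has its own parameters."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def supermodel_step(s1, s2, dt=0.01, k=10.0):
    """Advance two imperfect models one Euler step, each nudged toward the
    other with connection coefficient k (hand-set here; learned in a real
    supermodel)."""
    d1 = lorenz(s1, 10.0, 28.0, 8.0 / 3.0) + k * (s2 - s1)
    d2 = lorenz(s2, 9.0, 29.0, 2.5) + k * (s1 - s2)
    return s1 + dt * d1, s2 + dt * d2

s1 = np.array([1.0, 1.0, 1.0])
s2 = np.array([5.0, -5.0, 20.0])
for _ in range(2000):
    s1, s2 = supermodel_step(s1, s2)

# With sufficiently strong coupling the two models synchronize on a common
# trajectory despite their different parameters and initial conditions.
print("state distance after coupling:", np.linalg.norm(s1 - s2))
```

    Setting k = 0 recovers two independent, diverging chaotic trajectories; learning the connection coefficients from data is what would turn this toy into a supermodel.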

    The identification of informative genes from multiple datasets with increasing complexity

    Background: In microarray data analysis, factors such as data quality, biological variation, and the increasingly multi-layered nature of complex biological systems complicate the modelling of regulatory networks that can represent and capture the interactions among genes. We believe that the use of multiple datasets derived from related biological systems leads to more robust models. We therefore developed a novel framework for modelling regulatory networks that involves training and evaluation on independent datasets. Our approach comprises the following steps: (1) ordering the datasets by their level of noise and informativeness; (2) selecting a Bayesian classifier of appropriate complexity by evaluating predictive performance on independent datasets; (3) comparing the different gene selections and the influence of increasing model complexity; and (4) functional analysis of the informative genes. Results: We identify the most appropriate model complexity using cross-validation and independent test-set validation for predicting gene expression in three published datasets related to myogenesis and muscle differentiation. Furthermore, we demonstrate that models trained on simpler datasets can be used to identify interactions among genes and to select the most informative ones. We also show that these models explain the myogenesis-related genes (genes of interest) significantly better than others (P < 0.004), since the improvement in their rankings is much more pronounced. Finally, after further evaluating our results on synthetic datasets, we show that our approach outperforms the concordance method of Lai et al. in identifying informative genes from multiple datasets of increasing complexity, while additionally modelling the interactions between genes.
    Conclusions: We show that Bayesian networks derived from simpler controlled systems perform better than those trained on datasets from more complex biological systems. Furthermore, we show that genes which are highly predictive and consistent across independent datasets, out of the pool of differentially expressed genes, are more likely to be fundamentally involved in the biological process under study. We conclude that networks trained on simpler controlled systems, such as in vitro experiments, can be used to model and capture interactions among genes in more complex datasets, such as in vivo experiments, where these interactions would otherwise be concealed by a multitude of other ongoing events.
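    One ingredient of the approach, keeping only genes that score consistently across independent datasets of increasing noise, can be sketched in numpy. This is not the paper's Bayesian-network pipeline; the datasets, scores, and cutoffs below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three synthetic expression datasets of increasing noise, a stand-in for the
# in vitro -> in vivo progression discussed above. Genes 0-4 are "informative":
# their expression is shifted between the two conditions.
n_genes, n_per_group = 50, 10

def make_dataset(noise):
    control = rng.normal(0.0, noise, (n_per_group, n_genes))
    treated = rng.normal(0.0, noise, (n_per_group, n_genes))
    treated[:, :5] += 2.0            # up-regulate the informative genes
    return control, treated

def gene_scores(control, treated):
    """Simple two-sample score per gene: |mean shift| / pooled std."""
    diff = treated.mean(axis=0) - control.mean(axis=0)
    pooled = np.sqrt((treated.var(axis=0) + control.var(axis=0)) / 2.0) + 1e-9
    return np.abs(diff) / pooled

# Keep only genes ranked in the top 10 in *every* dataset -- a toy version of
# the "consistent across independent datasets" criterion.
top_sets = []
for noise in (0.3, 0.6, 0.9):
    scores = gene_scores(*make_dataset(noise))
    top_sets.append(set(np.argsort(scores)[-10:]))

consistent = set.intersection(*top_sets)
print(sorted(int(g) for g in consistent))
```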

    A specific case in the classification of woods by FTIR and chemometrics: discrimination of Fagales from Malpighiales

    Fourier transform infrared (FTIR) spectroscopic data were used to classify wood samples from nine species within the Fagales and Malpighiales using a range of multivariate statistical methods. Taxonomic classification of the families Fagaceae and Betulaceae from the Angiosperm Phylogenetic System Classification (APG II System) was successfully performed using supervised pattern-recognition techniques. A methodology for wood-sample discrimination was developed using both sapwood and heartwood samples. Ten and eight biomarkers emerged from the dataset to discriminate order and family, respectively. In the species studied, FTIR in combination with multivariate analysis highlighted significant chemical differences in hemicelluloses, cellulose, and guaiacyl (lignin), and shows promise as a suitable approach for wood-sample classification.
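    The spectra-plus-chemometrics workflow can be sketched with a toy example: synthetic "spectra" for two wood orders that differ at a few biomarker bands, reduced by PCA and separated with a nearest-class-mean rule. The band positions, shifts, and noise level are invented; only the workflow (dimensionality reduction followed by supervised pattern recognition) mirrors the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for FTIR spectra: 200-band "absorbance" vectors for two
# wood orders that differ at a handful of biomarker bands (the real study
# identified ten order-level biomarkers; these positions are arbitrary).
n_bands, n_samples = 200, 20
biomarkers = [30, 55, 90, 120, 160]

def spectra(shift):
    base = rng.normal(0.0, 0.05, (n_samples, n_bands))
    base[:, biomarkers] += shift
    return base

X = np.vstack([spectra(0.4), spectra(-0.4)])   # Fagales-like, Malpighiales-like
labels = np.array([0] * n_samples + [1] * n_samples)

# Unsupervised step: project the mean-centred spectra onto PC1 (via SVD).
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ vt[0]

# Supervised step: nearest class mean along PC1 separates the two orders.
centroids = [pc1[labels == c].mean() for c in (0, 1)]
pred = np.array([int(np.argmin([abs(p - m) for m in centroids])) for p in pc1])
print("PC1 nearest-centroid accuracy:", (pred == labels).mean())
```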