
    Hydroelectric Dams and the Decline of Chinook Salmon in the Columbia River Basin

    The decline of Chinook salmon runs into the mouth of the Columbia River in recent decades is thought to be partly attributable to the construction of hydroelectric dams. The purpose of this article is to estimate, using regression analysis, the magnitude of the losses in Chinook salmon runs caused by hydroelectric dams. Such estimates are not only of historical interest but can also affect the extent of efforts to mitigate salmon losses from hydropower operations: Congress has mandated that the Northwest Power Planning Council consider the magnitude of run losses caused by hydroelectric operations in determining the extent of mitigation efforts.

    Keywords: habitat, Northwest Power Planning Council, hydroelectric dams, Chinook salmon, smolt production, Environmental Economics and Policy, Resource/Energy Economics and Policy
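
    The abstract does not give the regression specification, so the sketch below is only a rough illustration of the approach: an ordinary least-squares fit of annual run size against the number of dams in operation. The variable names and the synthetic data are assumptions for illustration, not the article's model or data.

```python
# Hedged OLS sketch: regress annual Chinook run size on the number of
# hydroelectric dams in operation. All names and data below are invented
# for illustration; the article's actual specification, covariates, and
# data are not given in the abstract.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1940, 1990)
n_dams = np.minimum((years - 1935) // 5, 8)          # hypothetical dam count
run_size = 600 - 30 * n_dams + rng.normal(0, 40, years.size)  # thousands of fish

X = np.column_stack([np.ones(years.size), n_dams])   # intercept + dam count
beta, *_ = np.linalg.lstsq(X, run_size, rcond=None)
print(f"estimated change per additional dam: {beta[1]:.1f} thousand fish")
```

    The coefficient on the dam count is the kind of per-dam loss estimate the article aims to supply to the Council's mitigation deliberations.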

    Comparative Monte Carlo Efficiency by Monte Carlo Analysis

    We propose a modified power method for computing the subdominant eigenvalue $\lambda_2$ of a matrix or continuous operator. Here we focus on defining simple Monte Carlo methods for its application. The methods presented use random walkers of mixed signs to represent the subdominant eigenfunction. Accordingly, the methods must cancel these signs properly in order to sample this eigenfunction faithfully. We present a simple procedure to solve this sign problem and then test our Monte Carlo methods by computing $\lambda_2$ for various Markov chain transition matrices. We first computed $\lambda_2$ for several one- and two-dimensional Ising models, which have a discrete phase space, and compared the relative efficiencies of the Metropolis and heat-bath algorithms as a function of temperature and applied magnetic field. Next, we computed $\lambda_2$ for a model of an interacting gas trapped by a harmonic potential, which has a multidimensional continuous phase space, and studied the efficiency of the Metropolis algorithm as a function of temperature and the maximum allowable step size $\Delta$. Based on the $\lambda_2$ criterion, we found for the Ising models that small lattices appear to give an adequate picture of comparative efficiency and that the heat-bath algorithm is more efficient than the Metropolis algorithm only at low temperatures, where both algorithms are inefficient. For the harmonic trap problem, we found that the traditional rule of thumb of adjusting $\Delta$ so that the Metropolis acceptance rate is around 50% is often sub-optimal. In general, as a function of temperature or $\Delta$, $\lambda_2$ for this model displayed trends defining optimal efficiency that the acceptance ratio does not. The cases studied also suggested that Monte Carlo simulations for a continuum model are likely more efficient than those for a discretized version of the model.

    Comment: 23 pages, 8 figures
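
    The abstract only outlines the mixed-sign random-walker method, so the sketch below illustrates the underlying $\lambda_2$ criterion differently: for a 1D Ising chain small enough to enumerate, it builds the single-spin-flip Metropolis transition matrix exactly and reads off the subdominant eigenvalue with dense linear algebra. The lattice size, coupling, and temperature are arbitrary illustrative choices.

```python
# Dense-matrix illustration of the lambda_2 efficiency criterion:
# enumerate all 2^N states of a small 1D Ising chain, build the
# single-spin-flip Metropolis transition matrix, and compute its
# subdominant eigenvalue directly (feasible only for tiny lattices).
import numpy as np

N, beta = 6, 0.5                        # chain length, inverse temperature

def energy(s):
    return -np.sum(s * np.roll(s, 1))   # periodic nearest-neighbour bonds, J = 1

states = [np.array([1 if (k >> i) & 1 else -1 for i in range(N)])
          for k in range(2**N)]
P = np.zeros((2**N, 2**N))
for k, s in enumerate(states):
    for i in range(N):                  # propose flipping spin i with prob 1/N
        t = s.copy()
        t[i] *= -1
        j = sum((1 if v == 1 else 0) << b for b, v in enumerate(t))
        P[k, j] = min(1.0, np.exp(-beta * (energy(t) - energy(s)))) / N
    P[k, k] = 1.0 - P[k].sum()          # rejected proposals stay put

lam = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
print(f"lambda_2 = {lam[1]:.4f}")       # smaller => faster equilibration
```

    A smaller $|\lambda_2|$ means the slowest relaxation mode decays faster, which is the sense in which the paper compares the efficiency of the Metropolis and heat-bath algorithms.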

    Northeastern Atlantic benthic foraminifera during the last 45,000 years: Changes in productivity seen from the bottom up

    We studied benthic foraminifera from the last 45 kyr in the >63 μm size fraction in Biogeochemical Ocean Flux Studies (BOFS) cores 5K (50°41.3'N, 21°51.9'W, depth 3547 m) and 14K (58°37.2'N, 19°26.2'W, depth 1756 m), at a time resolution of several hundred to a thousand years. The deeper site showed the largest fluctuations in faunal composition, species richness, and benthic foraminiferal accumulation rates; the fluctuations resulted from changes in the abundance of Epistominella exigua and Alabaminella weddellensis. In the present oceans, these species bloom opportunistically when a spring plankton bloom results in seasonal deposition of phytodetritus on the seafloor. The "phytodetritus species" had very low relative abundances and accumulation rates during the last glacial maximum. A strong increase in the absolute and relative abundance of E. exigua and A. weddellensis during deglaciation paralleled the decrease in abundance of the polar planktonic foraminifer Neogloboquadrina pachyderma (s) and the increase in abundance of warmer-water planktonic species such as Globigerina bulloides. This strong increase in the relative abundance of the "phytodetritus species" and the coeval increase in benthic foraminiferal accumulation rate were thus probably caused by an increase in the deposition of phytodetritus to the seafloor (and thus probably in surface productivity) when the polar front retreated to higher latitudes. The abundance of "phytodetritus species" decreased during the Younger Dryas, but not to the low levels of fully glacial conditions. During Heinrich events (periods of excessive meltwater formation and ice rafting), benthic accumulation rates were very low, as were the absolute and relative abundances of the "phytodetritus species", supporting suggestions that surface productivity was very low during these events. In both cores, Pullenia and Cassidulina species were common during isotope stages 2, 3, and 4, as were bolivinid, buliminid, and uvigerinid species. High relative abundances of these species have been interpreted as indicative either of sluggish deep-water circulation or of high organic carbon fluxes to the seafloor. In our cores, the relative abundances of these species are negatively correlated with benthic foraminiferal accumulation rates, and we thus cannot interpret them as indicative of increased productivity during glacials. The percentage of these "low oxygen" species, calculated on a "phytodetritus species"-free basis, decreased slightly at deglaciation at 5K, but not at 14K. This suggests that decreased production of North Atlantic Deep Water during the last glacial might have slightly affected benthic foraminiferal faunas in the eastern North Atlantic at 3547 m depth, but not at 1756 m. In conclusion, major changes in deep-sea benthic foraminiferal faunas over the last 45,000 years in our cores from the northeastern Atlantic were the result of changes in surface water productivity, not of changes in deep-water circulation; productivity was lower during the glacial, probably because of extensive ice cover.
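
    For clarity, the "phytodetritus species"-free percentage used above is simply a relative abundance recomputed after excluding the E. exigua and A. weddellensis counts from the assemblage total. A minimal sketch, with invented counts:

```python
# "Phytodetritus-species-free" percentage: relative abundance of a species
# group after removing the phytodetritus species (E. exigua and
# A. weddellensis) from the assemblage total. All counts are invented.
def pct_phyto_free(group_count, total_count, phyto_count):
    return 100.0 * group_count / (total_count - phyto_count)

total = 500        # all benthic foraminifera counted in a sample
phyto = 180        # E. exigua + A. weddellensis
low_oxygen = 60    # bolivinids, buliminids, uvigerinids, etc.

print(f"raw percentage:     {100.0 * low_oxygen / total:.1f}%")
print(f"phytodetritus-free: {pct_phyto_free(low_oxygen, total, phyto):.1f}%")
```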

    New techniques for experimental generation of two-dimensional blade-vortex interaction at low Reynolds numbers

    An experimental investigation of two-dimensional blade-vortex interaction was conducted at NASA Langley Research Center. The first phase was a flow visualization study to document the approach of a two-dimensional vortex as it encountered a loaded blade model. To accomplish the flow visualization study, a method for generating two-dimensional vortex filaments was required. The numerical study used to define a new vortex generation process, and the use of this process in the flow visualization study, are documented. Additionally, the photographic techniques and data analysis methods used in the flow visualization study are examined.

    Software Engineering Laboratory Ada performance study: Results and implications

    The SEL is an organization sponsored by NASA/GSFC to investigate the effectiveness of software engineering technologies applied to the development of applications software. The SEL was created in 1977 and has three organizational members: NASA/GSFC, Systems Development Branch; The University of Maryland, Computer Sciences Department; and Computer Sciences Corporation, Systems Development Operation. The goals of the SEL are as follows: (1) to understand the software development process in the GSFC environment; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include the Ada Performance Study Report. This paper describes the background of Ada in the Flight Dynamics Division (FDD), the objectives and scope of the Ada Performance Study, the measurement approach used, the performance tests performed, the major test results, and the implications for future FDD Ada development efforts.

    Risk-Based Urban Watershed Management Under Conflicting Objectives

    Ecological impairment and flooding caused by urbanization can be expressed numerically by calculating the risks throughout the watershed (floodplain) and along the main stems of the streams. The risks can be evaluated in terms of the present and/or the future. This article describes methodologies for ascertaining these risks in a Geographical Information Systems (GIS) environment. The objectives of urban flood control and of ecological preservation/restoration of urban waters are often conflicting, and, in the past, the sole emphasis on flood control led to destruction of habitat and deterioration of water quality. An optimal solution to these two problems may be achieved by linking the risks to the concepts of risk communication, risk perception, and public willingness to pay for projects leading to ecological restoration and ecologically sustainable flood control. This approach is appropriate because, in each case, public funds are used and the projects require the approval and backing of policy makers and stakeholders. This article briefly describes a research project that attempts to resolve the conflict between flood protection and stream ecological preservation and restoration, and it suggests alternative ways of expressing the benefits of urban stream flood control and restoration projects.
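
    The abstract does not spell out the risk formulation; one standard way to express flood risk numerically for a floodplain cell is expected annual damage, the integral of damage over annual exceedance probability. The sketch below assumes that formulation, with invented design-flood damages; it is not the article's actual GIS methodology.

```python
# Hedged sketch: expected annual damage (EAD) for one floodplain cell,
# integrating damage over annual exceedance probability with the
# trapezoidal rule. Probabilities and damages are invented.
import numpy as np

p = np.array([0.5, 0.2, 0.1, 0.04, 0.02, 0.01])           # exceedance probabilities
damage = np.array([0.0, 5.0, 20.0, 60.0, 120.0, 200.0])   # damage in $1000s

# Trapezoidal integration of damage over probability (p is decreasing).
ead = np.sum(0.5 * (damage[:-1] + damage[1:]) * (p[:-1] - p[1:]))
print(f"expected annual damage: ${ead:.1f}k per year")
```

    Summing such cell-level figures over the watershed, for present and projected conditions, gives one plausible version of the GIS risk surface the article describes; ecological risk could be mapped analogously and the two compared.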

    Integrative model-based clustering of microarray methylation and expression data

    In many fields, researchers are interested in large and complex biological processes. Two important examples are gene expression and DNA methylation in genetics. One key problem is to identify aberrant patterns in these processes and discover biologically distinct groups. In this article we develop a model-based method for clustering such data. The basis of our method is the construction of a likelihood for any given partition of the subjects. We introduce cluster-specific latent indicators that, along with some standard assumptions, impose a specific mixture distribution on each cluster. Estimation is carried out using the EM algorithm. The methods extend naturally to multiple data types of a similar nature, which leads to an integrated analysis over multiple data platforms, resulting in higher discriminating power.

    Comment: Published at http://dx.doi.org/10.1214/11-AOAS533 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
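
    The abstract gives only the outline of the method, so the sketch below shows the general flavor rather than the paper's exact model: EM for a two-component Gaussian mixture over two platforms (say, expression and methylation), treated as conditionally independent given cluster membership. It omits the paper's cluster-specific latent indicator layer, and all data are simulated.

```python
# Minimal EM sketch for model-based clustering across two data platforms.
# Plain Gaussian mixture with diagonal covariance; the paper's
# cluster-specific latent indicator layer is omitted. Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
n, K = 200, 2
z_true = rng.integers(0, K, n)
expr = rng.normal(np.where(z_true == 0, -1.0, 1.0), 1.0)   # platform 1
meth = rng.normal(np.where(z_true == 0, 0.2, 0.8), 0.15)   # platform 2
X = np.column_stack([expr, meth])                          # n subjects x 2 features

pi = np.full(K, 1.0 / K)                 # mixing weights
mu = X[rng.choice(n, K, replace=False)]  # K x 2 initial means
var = np.ones((K, 2))                    # K x 2 per-feature variances

for _ in range(100):
    # E-step: responsibilities; per-platform log-densities add
    diff = X[:, None, :] - mu                                # n x K x 2
    logp = np.log(pi) - 0.5 * (diff**2 / var + np.log(2 * np.pi * var)).sum(-1)
    r = np.exp(logp - logp.max(1, keepdims=True))
    r /= r.sum(1, keepdims=True)
    # M-step: weighted updates of weights, means, and variances
    nk = r.sum(0)
    pi = nk / n
    mu = (r.T @ X) / nk[:, None]
    var = (r[:, :, None] * (X[:, None, :] - mu) ** 2).sum(0) / nk[:, None]

print("recovered cluster sizes:", np.bincount(r.argmax(1), minlength=K))
```

    The abstract's integrated analysis over platforms corresponds, loosely, to the per-platform density factors multiplying inside the E-step: each additional platform contributes another factor, sharpening the responsibilities and hence the partition.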