
    Assessments at multiple levels of biological organization allow for an integrative determination of physiological tolerances to turbidity in an endangered fish species.

    Turbidity can influence trophic levels by altering species composition and can potentially affect fish feeding strategies and predator-prey interactions. The estuarine turbidity maximum, described as an area of increased suspended particles, phytoplankton and zooplankton, generally represents a zone with higher turbidity and enhanced food sources important for successful feeding and growth in many fish species. The delta smelt (Hypomesus transpacificus) is an endangered, pelagic fish species endemic to the San Francisco Estuary and Sacramento-San Joaquin River Delta, USA, where it is associated with turbid waters. Turbidity is known to play an important role in the completion of the species' life cycle; however, turbidity ranges in the Delta are broad, and specific requirements for this fish species are still unknown. To evaluate turbidity requirements for early life stages, late-larval delta smelt were maintained at environmentally relevant turbidity levels ranging from 5 to 250 nephelometric turbidity units (NTU) for 24 h, after which a combination of physiological endpoints (molecular biomarkers and cortisol), behavioural indices (feeding) and whole-organism measures (survival) were determined. All endpoints delivered consistent results and identified turbidities between 25 and 80 NTU as preferential. Delta smelt survival rates were highest between 12 and 80 NTU and feeding rates were highest between 25 and 80 NTU. Cortisol levels indicated minimal stress between 35 and 80 NTU and were elevated at low turbidities (5, 12 and 25 NTU). Expression of stress-related genes indicated significant responses for gst, hsp70 and glut2 at high turbidity (250 NTU), and principal component analysis on all measured genes revealed a clustering of the 25, 35, 50 and 80 NTU treatments, separating the medium-turbidity treatments from the low- and high-turbidity treatments. Taken together, these data demonstrate that turbidity levels that are either too low or too high affect delta smelt physiological performance, causing significant effects on overall stress, food intake and mortality. They also highlight the need for turbidity to be considered in habitat and water management decisions.
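    The treatment clustering the authors describe can be reproduced with an ordinary principal component analysis of treatment-level expression profiles. The sketch below is a minimal illustration only, assuming a small placeholder expression matrix (treatments x genes) rather than the study's actual data.

```python
# Minimal PCA sketch: project treatment-level gene-expression profiles onto two
# principal components. Expression values are placeholders, not the study's data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

turbidity_ntu = [5, 12, 25, 35, 50, 80, 250]      # treatment levels (NTU)
genes = ["gst", "hsp70", "glut2"]                 # example stress-related genes

# Rows: treatments; columns: relative expression per gene (placeholder values).
rng = np.random.default_rng(0)
expression = rng.normal(size=(len(turbidity_ntu), len(genes)))

# Standardize each gene, then reduce to the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(expression))

for ntu, (pc1, pc2) in zip(turbidity_ntu, scores):
    print(f"{ntu:>4} NTU  PC1={pc1:+.2f}  PC2={pc2:+.2f}")
```

    With the study's real expression matrix, treatments that plot close together in this PC1/PC2 space would correspond to the 25-80 NTU grouping reported above.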

    Learner-Centered Assignments in Computer Literacy

    Literacy is a concept that is understood to be the identifier of an educated populace. In today's world, literacy includes computer literacy, as well as language and quantitative literacy. This paper describes exercises developed to improve first-year students' computer literacy through more learner-centered engagement. Exercises are designed to support the learner-centered goals of independent and responsible learners, appropriate breadth and depth of content, teacher as facilitator, and assessment woven into learning. Exercise topics include the purchase of a personal computer, basic logic via spreadsheets, an annotated bibliography built with electronic resources, and an integrated assignment customized by and for each student.

    Enabling EASEY deployment of containerized applications for future HPC systems

    The upcoming exascale era will push computing architecture from classical CPU-based systems to hybrid, GPU-heavy systems with much higher levels of complexity. While such clusters are expected to improve the performance of certain optimized HPC applications, they will also increase the difficulties for those users who have yet to adapt their codes or are starting from scratch with new programming paradigms. Since there are still no comprehensive automatic assistance mechanisms to enhance application performance on such systems, we are proposing a support framework for future HPC architectures, called EASEY (Enable exAScale for EverYone). The solution builds on a layered software architecture, which offers different mechanisms on each layer for different tuning tasks. This enables users to adjust the parameters on each of the layers, thereby enhancing specific characteristics of their codes. We introduce the framework with a Charliecloud-based solution, showcasing the LULESH benchmark on the upper layers of our framework. Our approach can automatically deploy optimized container computations with negligible overhead and at the same time reduce the time a scientist needs to spend on manual job submission configurations. (Comment: International Conference on Computational Science, ICCS 2020, 13 pages.)
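    As a rough sketch of what an automated deployment layer of this kind might emit, the snippet below builds an unprivileged container image and submits a containerized LULESH run through a batch system. It assumes a current Charliecloud CLI (ch-image, ch-run, ch-convert) and Slurm are available; the image name, paths, resource values and LULESH arguments are illustrative and are not taken from the paper.

```python
# Hypothetical sketch: build a Charliecloud image and launch a containerized
# LULESH run via Slurm. Names, paths and resources are illustrative only.
import subprocess, textwrap

IMAGE = "lulesh"              # assumed image tag, built from a Dockerfile in the CWD
IMG_DIR = "/var/tmp/lulesh"   # unpacked image directory used by ch-run

# Build the unprivileged container image and unpack it to a directory.
subprocess.run(["ch-image", "build", "-t", IMAGE, "."], check=True)
subprocess.run(["ch-convert", IMAGE, IMG_DIR], check=True)

# Generate a minimal Slurm job script that runs the benchmark inside the container.
job = textwrap.dedent(f"""\
    #!/bin/bash
    #SBATCH --job-name=lulesh-easey
    #SBATCH --nodes=1
    #SBATCH --ntasks-per-node=8
    #SBATCH --time=00:30:00
    srun ch-run {IMG_DIR} -- lulesh2.0 -s 30 -i 100
    """)

with open("lulesh.sbatch", "w") as f:
    f.write(job)
subprocess.run(["sbatch", "lulesh.sbatch"], check=True)
```

    In the framework described above, the point would be that such build, convert and submit steps are generated and tuned automatically rather than written by hand for every run.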

    Copyright & Privacy - Through the Technology Lens, 4 J. Marshall Rev. Intell. Prop. L. 242 (2005)

    How is new technology impacting the more general question of privacy in cyberspace? Is the original notion of an expectation of anonymity on the internet still viable? Can technology pierce through the expectation of privacy even without judicial interference? Do individuals need protection from such technology? Is there technology available to protect the individual? Should these technological tools be regulated? Should the law differentiate between various types of alleged “illegal” behavior, e.g., IP infringement, defamation, possession of pornography and terrorism? Are there international standards that can assist in regulating the intersection between technology and privacy in cyberspace?

    Introduction to the Special Issue on the 2011 Tohoku Earthquake and Tsunami

    The 11 March 2011 Tohoku earthquake (05:46:24 UTC) involved a massive rupture of the plate‐boundary fault along which the Pacific plate thrusts under northeastern Honshu, Japan. It was the fourth‐largest recorded earthquake, with seismic‐moment estimates of 3–5 × 10^22 N·m (Mw 9.0). The event produced widespread strong ground shaking in northern Honshu; in some locations ground accelerations exceeded 2g. Rupture extended ∼200 km along dip, spanning the entire width of the seismogenic zone from the Japan trench to below the Honshu coastline, and the aftershock‐zone length extended ∼500 km along strike of the subduction zone. The average fault slip over the entire rupture area was ∼10 m, but some estimates indicate ∼25 m of slip around the hypocentral region and extraordinary slip of up to 60–80 m in the shallow megathrust extending to the trench. The faulting‐generated seafloor deformation produced a devastating tsunami that resulted in 5–10 km inundation of the coastal plains, runup of up to 40 m along the Sanriku coastline, and catastrophic failure of the backup power systems at the Fukushima Daiichi nuclear power station, which precipitated a reactor meltdown and radiation release. About 18,131 lives appear to have been lost, 2829 people are still missing, and 6194 people were injured (as reported on 28 September 2012 by the Fire and Disaster Management Agency of Japan); over half a million people were displaced, mainly due to the tsunami impact on coastal towns, where tsunami heights significantly exceeded harbor tsunami walls and coastal berms.
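    The quoted magnitude is consistent with the standard moment-magnitude relation Mw = (2/3)(log10 M0 − 9.1) for M0 in N·m. The short check below is only a sketch using that textbook relation, not anything from the special issue itself; it confirms that the reported 3–5 × 10^22 N·m range maps to Mw ≈ 9.0.

```python
# Check: convert the reported seismic-moment range to moment magnitude.
from math import log10

def moment_magnitude(m0_newton_meters: float) -> float:
    """Standard Mw relation (Hanks & Kanamori), with M0 in N*m."""
    return (2.0 / 3.0) * (log10(m0_newton_meters) - 9.1)

for m0 in (3e22, 5e22):
    print(f"M0 = {m0:.0e} N*m  ->  Mw = {moment_magnitude(m0):.2f}")
# Prints Mw of about 8.92 and 9.07, consistent with the reported Mw 9.0.
```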

    Verification of the Parallel Pin-Wise Core Simulator pCTF/PARCSv3.2 in Operational Control Rod Drop Transient Scenarios

    This is an Accepted Manuscript of an article published by Taylor & Francis in Nuclear Science and Engineering in 2017, available online: https://www.tandfonline.com/doi/full/10.1080/00295639.2017.1320892
    Thanks to advances in computer technology, it is feasible to obtain detailed reactor core descriptions for safety analysis of the light water reactor (LWR), in order to realistically represent the fuel element design, as is the case for three-dimensional coupled simulations of local neutron kinetics and thermal hydraulics. This scenario requires an efficient thermal-hydraulic code that can produce a response in a reasonable time for large-scale, detailed models. In two-fluid codes, such as the thermal-hydraulic subchannel code COBRA-TF, the time restriction is even more important, since the set of equations to be solved is more complex. We have developed a message passing interface (MPI) parallel version of COBRA-TF, called pCTF. The parallel code is based on a cell-oriented domain decomposition approach and performs well in models that consist of many cells. The Jacobian matrix is computed in parallel, with each processor in charge of calculating the coefficients related to a subset of the cells. Furthermore, the resulting system of linear equations is also solved in parallel, by exploiting solvers and preconditioners from PETSc. The goal of this study is to demonstrate the capability of the recently developed pCTF/PARCS coupled code to simulate large cores with a pin-by-pin level of detail in an acceptable computational time, using for this purpose two control rod drop operational transients that took place in the core of a three-loop pressurized water reactor. As a result, the main safety parameters of the core hot channel have been calculated by the coupled code at a pin level of detail, obtaining best-estimate results for this transient. This work has been partially supported by the Universitat Politecnica de Valencia under Projects COBRA_PAR (PAID-05-11-2810) and OpenNUC (PAID-05-12), and by the Spanish Ministerio de Economia y Competitividad under Projects SLEPc-HS (TIN2016-75985-P) and NUC-MULTPHYS (ENE2012-34585).
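    As a rough illustration of the parallel solve strategy described above (each process assembling the Jacobian rows for its own cells, with the linear system solved through PETSc Krylov solvers and preconditioners), the following petsc4py sketch sets up a distributed placeholder matrix and solves it with GMRES and block-Jacobi preconditioning. The matrix values, size and solver choices are illustrative and are not the actual pCTF configuration.

```python
# Hypothetical sketch of a distributed Jacobian solve with petsc4py.
# Run under MPI, e.g.: mpiexec -n 4 python solve_sketch.py
from petsc4py import PETSc

n = 1000                               # total number of cell equations (illustrative)
A = PETSc.Mat().createAIJ([n, n])      # sparse "Jacobian", rows distributed over ranks
A.setUp()

rstart, rend = A.getOwnershipRange()   # this rank's block of cells
for i in range(rstart, rend):          # simple tridiagonal stand-in for the Jacobian
    A[i, i] = 2.0
    if i > 0:
        A[i, i - 1] = -1.0
    if i < n - 1:
        A[i, i + 1] = -1.0
A.assemble()

b = A.createVecLeft()                  # right-hand side (placeholder values)
b.set(1.0)
x = A.createVecRight()                 # solution vector

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType("gmres")                   # Krylov solver
ksp.getPC().setType("bjacobi")         # block-Jacobi preconditioner across ranks
ksp.setFromOptions()
ksp.solve(b, x)

PETSc.Sys.Print(f"converged in {ksp.getIterationNumber()} iterations")
```

    Each MPI rank owns a contiguous block of rows, which mirrors the cell-oriented domain decomposition the abstract describes; solver and preconditioner types can be swapped at run time through PETSc command-line options.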

    “Aftershock Faults” and What They Could Mean for Seismic Hazard Assessment

    We study stress‐loading mechanisms for the California faults used in rupture forecasts. Stress accumulation drives earthquakes, and that accumulation mechanism governs recurrence. Most moment release in California occurs because of relative motion between the Pacific plate and the Sierra Nevada block; we calculate relative motion directions at fault centers and compare them with fault displacement directions. Dot products between these vectors reveal that some displacement directions are poorly aligned with plate motions. We displace a 3D finite‐element model according to relative motions and resolve stress tensors onto defined fault surfaces, which reveal that poorly aligned faults receive no tectonic loading. Because these faults are known to be active, we search for other loading mechanisms. We find that nearly all faults with no tectonic loading show an increase in stress caused by slip on the San Andreas fault, according to an elastic dislocation model. Globally, faults that receive a sudden stress change respond with triggered earthquakes that obey an Omori-law rate decay with time. We therefore term this class of faults “aftershock faults.” These faults account for ∼4% of the moment release in California, have a ∼0.1%–5% probability of M 6.7 earthquakes in 30 yr, and have a 0.001%–1% probability of M 7.7 earthquakes over the same 30 yr window.
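    Both steps in that workflow, the dot-product alignment test and the resolution of a stress tensor onto a fault surface, reduce to a few lines of linear algebra. The sketch below is a generic illustration with made-up vectors and stress values, not output from the study's finite-element or dislocation models.

```python
# Generic sketch: fault/plate-motion alignment and resolving stress on a fault plane.
# Vectors and stress values are illustrative, not the study's model output.
import numpy as np

def unit(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

# 1) Alignment: dot product between plate-motion and fault-displacement directions.
plate_motion = unit([0.7, 0.7, 0.0])          # relative plate-motion direction (example)
fault_slip   = unit([0.0, 1.0, 0.0])          # fault displacement direction (example)
alignment = float(plate_motion @ fault_slip)  # 1 = parallel, 0 = orthogonal
print(f"alignment = {alignment:.2f}")

# 2) Resolve a stress tensor onto a fault surface with unit normal n.
sigma = np.array([[-20.0,   5.0,   0.0],      # example stress tensor, MPa
                  [  5.0, -10.0,   0.0],
                  [  0.0,   0.0, -15.0]])
n = unit([1.0, 1.0, 0.0])                     # fault-plane normal (example)

traction = sigma @ n                          # traction vector acting on the plane
normal_stress = float(traction @ n)           # component normal to the fault
shear_stress = float(np.linalg.norm(traction - normal_stress * n))
print(f"normal stress = {normal_stress:.1f} MPa, shear stress = {shear_stress:.1f} MPa")
```

    A low alignment value flags the poorly aligned faults discussed above, and the resolved normal and shear components are what determine whether a given fault surface receives tectonic loading.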