    Response of river delta hydrological connectivity to changes in river discharge and atmospheric frontal passage

    Atmospheric frontal passage is a common meteorological event that can significantly affect hydrodynamics in coastal environments, including the hydrological connectivity between channels and floodplains that regulates material transport in river deltas. This study focuses on the influence of atmospheric cold fronts on the hydrological connectivity between channels and floodplains within the Wax Lake Delta (WLD) using the Delft3D FM model. The results demonstrate a substantial effect of passing cold fronts on the exchange of water and the transport fraction between the primary channels and the floodplains. This impact is closely tied to the morphodynamic characteristics of the floodplains, the intensity of the cold fronts, river discharge, the Coriolis force, and tidal currents. Passing cold fronts can enhance or reverse the direction of water exchange between channels and floodplains, and for the floodplains they can increase the rate of water exchange by as much as five times. In the WLD, a substantial fraction of the water, 39-58% depending on the prevailing discharge, flows through the floodplains to the bay at the delta front, although there is significant spatial heterogeneity. Passing cold fronts can alter this transport distribution, depending on the phase of the front. An increase in river discharge tends to strengthen floodplain connectivity and lessen the effects of cold fronts, whereas decreased river discharge reduces connectivity and amplifies the fluctuations induced by cold fronts. The findings also indicate that, moving downstream from the apex, the contribution of the channels decreases as they become shallower while the role of the floodplains increases, leading to a less distinct demarcation between channels and floodplains. Finally, an increase in river discharge correlates with an increased contribution of the floodplains to transferring water to the bay.
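    To make the "transport fraction" idea concrete, below is a minimal sketch (not taken from the study) of how a floodplain share of seaward transport, like the 39-58% figure quoted above, could be computed from modeled discharge time series. The function name floodplain_fraction and the hourly series are invented placeholders; the actual analysis uses fluxes extracted from the Delft3D FM simulations.

    import numpy as np

    def floodplain_fraction(q_channel, q_floodplain):
        """Share of total seaward transport that passes over the floodplains.

        q_channel, q_floodplain: time series (m^3/s) of seaward discharge
        integrated across the channel and floodplain portions of a
        delta-front transect; positive values are directed toward the bay.
        """
        channel_total = np.clip(q_channel, 0, None).sum()
        floodplain_total = np.clip(q_floodplain, 0, None).sum()
        return floodplain_total / (channel_total + floodplain_total)

    # Hypothetical hourly series spanning a three-day window.
    hours = np.arange(72)
    q_chan = 2000 + 200 * np.sin(2 * np.pi * hours / 24)         # m^3/s
    q_flood = 1200 + 600 * np.sin(2 * np.pi * (hours - 6) / 24)  # m^3/s
    print(f"floodplain share of seaward transport: {floodplain_fraction(q_chan, q_flood):.0%}")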

    Drivers and impacts of water level fluctuations in the Mississippi River delta: Implications for delta restoration

    This review synthesizes knowledge of the environmental forces affecting water level variability in the coastal waters of the Mississippi River delta and relates these fluctuations to planned river diversions. Water level fluctuations vary significantly across temporal and spatial scales and are influenced by river flow, tides, vegetation, atmospheric forcing, climate change, and anthropogenic activities. Human impacts have strongly affected water level variability in the Mississippi River delta and in other deltas worldwide. Collectively, the research reviewed in this article is important for enhancing environmental, economic, and social resilience and sustainability in the economically and environmentally important Mississippi River delta by assessing, mitigating, and adapting to geophysical changes that will cascade to societal systems in the coming decades. Specifically, this information provides a context within which to evaluate the impacts of diversions on the hydrology of the Mississippi delta and creates a benchmark for evaluating the impact of water level fluctuations on coastal restoration projects worldwide.

    Habits of Mind: Designing Courses for Student Success

    Although content knowledge remains at the heart of college teaching and learning, forward-thinking instructors recognize that we must also provide 21st-century college students with transferable skills (sometimes called portable intellectual abilities) to prepare them for their futures (Vazquez, 2020; Ritchhart, 2015; Venezia & Jaeger, 2013; Hazard, 2012). To “grow their capacity as efficacious thinkers to navigate and thrive in the face of unprecedented change” (Costa et al., 2023), students must learn and improve important study skills and academic dispositions throughout their educational careers. If we do not focus on skill-building in college courses, students will not be prepared for the challenges that await them after they leave institutions of higher education. If students are not prepared for the challenges that follow postsecondary education, then it is fair to say that college faculty have failed them.

    Assemblathon 2: evaluating de novo methods of genome assembly in three vertebrate species

    Background: The process of generating raw genome sequence data continues to become cheaper, faster, and more accurate. However, assembly of such data into high-quality, finished genome sequences remains challenging. Many genome assembly tools are available, but they differ greatly in terms of their performance (speed, scalability, hardware requirements, acceptance of newer read technologies) and in their final output (composition of assembled sequence). More importantly, it remains largely unclear how to best assess the quality of assembled genome sequences. The Assemblathon competitions are intended to assess current state-of-the-art methods in genome assembly. Results: In Assemblathon 2, we provided a variety of sequence data to be assembled for three vertebrate species (a bird, a fish, and a snake). This resulted in a total of 43 submitted assemblies from 21 participating teams. We evaluated these assemblies using a combination of optical map data, Fosmid sequences, and several statistical methods. From over 100 different metrics, we chose ten key measures by which to assess the overall quality of the assemblies. Conclusions: Many current genome assemblers produced useful assemblies, containing a significant representation of their genes and overall genome structure. However, the high degree of variability between the entries suggests that there is still much room for improvement in the field of genome assembly and that approaches that work well in assembling the genome of one species may not necessarily work well for another.
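    The abstract does not name the ten key measures, but size-based contiguity statistics such as N50 are typical of the metrics used to compare assemblies. The short sketch below, with made-up contig lengths, is only meant to illustrate that class of measure and is not part of the Assemblathon 2 evaluation code.

    def n50(contig_lengths):
        """Return the N50: the length L such that contigs of length >= L
        together cover at least half of the total assembly size."""
        lengths = sorted(contig_lengths, reverse=True)
        half_total = sum(lengths) / 2
        running = 0
        for length in lengths:
            running += length
            if running >= half_total:
                return length
        return 0

    # Hypothetical assembly: contig lengths in base pairs.
    contigs = [1_200_000, 850_000, 400_000, 390_000, 120_000, 45_000]
    print("N50 =", n50(contigs))  # 850000: the two largest contigs already cover half the total

    In practice such contiguity statistics are combined with correctness checks, which is the role the optical map data and Fosmid sequences play in the evaluation described above.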

    Clinical Sequencing Exploratory Research Consortium: Accelerating Evidence-Based Practice of Genomic Medicine

    Despite rapid technical progress and demonstrable effectiveness for some types of diagnosis and therapy, much remains to be learned about clinical genome and exome sequencing (CGES) and its role within the practice of medicine. The Clinical Sequencing Exploratory Research (CSER) consortium includes 18 extramural research projects, one National Human Genome Research Institute (NHGRI) intramural project, and a coordinating center funded by the NHGRI and the National Cancer Institute. The consortium is exploring analytic and clinical validity and utility, as well as the ethical, legal, and social implications of sequencing, via multidisciplinary approaches; it has thus far recruited 5,577 participants across a spectrum of symptomatic and healthy children and adults by utilizing both germline and cancer sequencing. The CSER consortium is analyzing data and creating publicly available procedures and tools related to participant preferences and consent, variant classification, disclosure and management of primary and secondary findings, health outcomes, and integration with electronic health records. Future research directions will refine measures of the clinical utility of CGES in both germline and somatic testing, evaluate the use of CGES for screening in healthy individuals, explore the penetrance of pathogenic variants through extensive phenotyping, reduce discordances in public databases of genes and variants, examine social and ethnic disparities in the provision of genomics services, explore regulatory issues, and estimate the value and downstream costs of sequencing. The CSER consortium has established a shared community of research sites by using diverse approaches to pursue the evidence-based development of best practices in genomic medicine.

    A SARS-CoV-2 protein interaction map reveals targets for drug repurposing

    The novel coronavirus SARS-CoV-2, the causative agent of COVID-19 respiratory disease, has infected over 2.3 million people, killed over 160,000, and caused worldwide social and economic disruption [1,2]. There are currently no antiviral drugs with proven clinical efficacy, nor are there vaccines for its prevention, and efforts to develop them are hampered by limited knowledge of the molecular details of SARS-CoV-2 infection. To address this, we cloned, tagged, and expressed 26 of the 29 SARS-CoV-2 proteins in human cells and used affinity-purification mass spectrometry (AP-MS) to identify the human proteins physically associated with each, yielding 332 high-confidence SARS-CoV-2-human protein-protein interactions (PPIs). Among these, we identify 66 druggable human proteins or host factors targeted by 69 compounds (29 FDA-approved drugs, 12 drugs in clinical trials, and 28 preclinical compounds). Screening a subset of these in multiple viral assays identified two sets of pharmacological agents that displayed antiviral activity: inhibitors of mRNA translation and predicted regulators of the Sigma1 and Sigma2 receptors. Further studies of these host-factor-targeting agents, including their combination with drugs that directly target viral enzymes, could lead to a therapeutic regimen to treat COVID-19.
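    As a toy illustration of the kind of filtering and cross-referencing described above, the sketch below scores AP-MS interactions, keeps high-confidence pairs, and intersects the human preys with a drug-target table. The score cutoff, the tiny tables, and all protein and compound names are invented for illustration; the study itself relied on dedicated interaction-scoring tools and curated chemoinformatic databases.

    # Hypothetical AP-MS records: (viral bait, human prey, confidence score in [0, 1]).
    interactions = [
        ("nsp1", "HUMAN_PROT_A", 0.97),
        ("orf9b", "HUMAN_PROT_B", 0.42),
        ("nsp6", "HUMAN_PROT_C", 0.88),
    ]

    # Hypothetical drug-target table: human protein -> known or candidate compounds.
    drug_targets = {
        "HUMAN_PROT_A": ["compound_1"],
        "HUMAN_PROT_C": ["compound_2", "compound_3"],
    }

    SCORE_CUTOFF = 0.7  # assumed threshold for "high confidence"

    high_confidence = [(bait, prey) for bait, prey, score in interactions if score >= SCORE_CUTOFF]
    druggable = {prey: drug_targets[prey] for _, prey in high_confidence if prey in drug_targets}

    print(f"{len(high_confidence)} high-confidence PPIs, {len(druggable)} druggable host factors: {druggable}")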

    TXSAMC (transport cross sections from applied Monte Carlo): a new tool for generating shielded multigroup cross sections

    This thesis describes a tool called TXSAMC (Transport Cross Sections from Applied Monte Carlo) that produces shielded and homogenized multigroup cross sections for small fast reactor systems. The motivation for this tool comes from a desire to investigate reactor systems that are not characterized well by existing tools. Proper investigation usually requires the use of deterministic codes to characterize the time-dependent reactor behavior and to link reactor neutronics codes with thermal-hydraulics and/or other physics codes. Deterministic codes require an accurate set of multigroup cross section libraries. The current process for generating these libraries is time consuming; TXSAMC offers a shorter route by linking three external codes together. The tool creates an MCNP (Monte Carlo N-Particle) model of the reactor and calculates the zone-averaged scalar flux in various tally regions and a core-averaged scalar flux tallied by energy bin. The core-averaged scalar flux provides a weighting function for NJOY, while the zone-averaged scalar flux data are used in TRANSX for homogenization and shielding. TXSAMC runs NJOY to produce multigroup cross sections tabulated by nuclide, temperature, and background cross section in MATXS (Material-wise cross section) format. This library is read by TRANSX, which, in conjunction with the RZFLUX (Regular Zone-averaged Flux) files, shields the cross sections and homogenizes them. The result is a macroscopic cross section for the cell within the reactor from which the RZFLUX file was written. The cross sections produced by this process have been tested in five different sample problems and have been shown to be reasonably accurate. For reactor cells containing fuel pins, the typical error in the overall fission, nusigf, (n,2n), absorption, and total RRD is only a few percent and is often less than one percent. The error appears to be smaller for hexagonal lattices than for square lattices, while a significant amount of error is associated with threshold reactions such as (n,2n) in the sodium coolant. For the square lattice test problems, a reduction in error occurs when smaller tally regions are selected; this reduction was not observed for hexagonal lattice reactors. Overall, the cross sections produced by TXSAMC performed very well.
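    To make the flux-weighting step concrete, the sketch below shows the standard flux-weighted group collapse that NJOY-style processing performs: sigma_g = sum_i(sigma_i * flux_i * dE_i) / sum_i(flux_i * dE_i), where i runs over the fine energy bins assigned to broad group g. The fine-bin cross sections, flux spectrum, bin widths, and group assignments are placeholders, not TXSAMC or MCNP output.

    import numpy as np

    def collapse_to_multigroup(sigma, flux, dE, group_index):
        """Flux-weighted collapse of fine-bin cross sections to broad-group constants."""
        n_groups = group_index.max() + 1
        collapsed = np.empty(n_groups)
        for g in range(n_groups):
            sel = group_index == g
            collapsed[g] = np.sum(sigma[sel] * flux[sel] * dE[sel]) / np.sum(flux[sel] * dE[sel])
        return collapsed

    # Placeholder data: six fine bins collapsed into two broad groups.
    sigma = np.array([4.2, 3.9, 3.1, 2.4, 2.0, 1.8])   # barns
    flux  = np.array([0.5, 1.0, 2.0, 2.5, 1.5, 0.5])   # arbitrary units
    dE    = np.ones(6)                                  # equal bin widths for simplicity
    group_index = np.array([0, 0, 0, 1, 1, 1])          # broad-group assignment per fine bin
    print(collapse_to_multigroup(sigma, flux, dE, group_index))  # ~[3.49, 2.2] barns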