12 research outputs found

    Protein Polymer-Based Nanoparticles: Fabrication and Medical Applications

    Nanoparticles are particles that range in size from about 1–1000 nanometers in diameter, roughly one thousand times smaller than the average human cell. Their small size, flexible fabrication, and high surface-area-to-volume ratio make them ideal systems for drug delivery. Nanoparticles can be made from a variety of materials, including metals, polysaccharides, and proteins. Protein-based nanoparticles such as silk, keratin, collagen, elastin, corn zein, and soy protein nanoparticles offer biodegradability, bioavailability, and relatively low cost. Many protein nanoparticles are easy to process and can be modified to achieve desired specifications such as size, morphology, and weight. Protein nanoparticles are used in a variety of settings and are replacing many materials that are not biocompatible and that harm the environment. Here we review the literature on protein-based nanoparticles, with a focus on their applications in drug delivery and other biomedical fields. Additional details on governing nanoparticle parameters, specific protein nanoparticle applications, and fabrication methods are also provided.
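    To make the surface-area-to-volume claim concrete, here is a minimal sketch assuming idealized spherical particles; for a sphere the ratio reduces to 6/d, so shrinking the diameter a hundredfold raises the ratio a hundredfold:

```python
import math

def sa_to_volume_ratio(diameter_nm: float) -> float:
    """Surface-area-to-volume ratio of a sphere, in nm^-1 (reduces to 6/d)."""
    r = diameter_nm / 2
    surface = 4 * math.pi * r ** 2
    volume = (4 / 3) * math.pi * r ** 3
    return surface / volume

# Compare a 100 nm nanoparticle with a 10 um (10,000 nm) cell-sized sphere.
for d in (100, 10_000):
    print(f"d = {d:>6} nm  ->  SA/V = {sa_to_volume_ratio(d):.4f} nm^-1")
# The 100 nm particle exposes 100x more surface area per unit volume.
```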

    Development and Application of STORMTOOLS Design Load (SDL) Maps

    Under the STORMTOOLS initiative, maps of the impact of sea level rise (SLR) (0 to 12 ft), nuisance flooding (1–10 yr), 25, 50, and 100 yr storms, and hindcasts of the four top-ranked tropical storms have been developed for the coastal waters of Rhode Island (RI). Estimates of the design elevations, expressed in terms of the Base Flood Elevation (BFE) and thus incorporating surge and associated wave conditions, have also been developed, including the effects of SLR, to facilitate structural design. Finally, Coastal Environmental Risk Index (CERI) maps have been developed to estimate the risk to individual structures and infrastructure. CERI employs the BFE maps in concert with damage curves for residential and commercial structures to estimate damage to individual structures. All maps are available via an ArcGIS Hub. The objective of this senior design capstone project was to develop STORMTOOLS Design Load (SDL) maps with the goal of estimating the hydrostatic, hydrodynamic, wave, and debris loading on residential structures in the RI coastal floodplain, based on the ASCE/SEI 7-16 Minimum Design Standards methods. The resulting maps display unitized loads and thus can be scaled for any structure of interest. The goal of the maps is to provide environmental loads that support structural design and to reduce the time and cost of design and permitting, while also improving the accuracy and consistency of the designs. SDL maps were generated for all loads, including the effects of SLR, for a test case: Watch Hill/Misquamicut Beach in Westerly, along the southern RI coast. The Autodesk Robot Structural Analysis Professional software, together with the SDL loading, was used to evaluate the designs of selected on-grade and pile-elevated residential structures. Damage curves were generated for each and shown to be consistent with the US Army Corps of Engineers empirical damage curves currently used in CERI.
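    As a minimal sketch of the kind of unitized load such maps report, the following computes hydrostatic and hydrodynamic forces per unit width of wall from standard fluid-statics and drag relations of the kind used in the ASCE/SEI 7-16 flood-load approach; the depth, velocity, and drag coefficient below are illustrative assumptions, not project values:

```python
GAMMA_SALTWATER = 64.0  # specific weight of saltwater, lb/ft^3 (assumed)
G = 32.2                # gravitational acceleration, ft/s^2

def hydrostatic_per_ft(depth_ft: float, gamma: float = GAMMA_SALTWATER) -> float:
    """Resultant of the triangular hydrostatic pressure distribution on a
    vertical wall, per unit width: F = 0.5 * gamma * d^2 (lb/ft)."""
    return 0.5 * gamma * depth_ft ** 2

def hydrodynamic_per_ft(depth_ft: float, velocity_fps: float,
                        drag_coeff: float = 1.25,
                        gamma: float = GAMMA_SALTWATER) -> float:
    """Drag force of moving floodwater per unit width of obstruction:
    F = 0.5 * Cd * rho * V^2 * d (lb/ft), with rho = gamma / g."""
    rho = gamma / G  # mass density, slug/ft^3
    return 0.5 * drag_coeff * rho * velocity_fps ** 2 * depth_ft

# Illustrative (assumed) conditions: 4 ft stillwater depth, 6 ft/s flow.
print(f"hydrostatic load : {hydrostatic_per_ft(4.0):7.1f} lb per ft of wall")
print(f"hydrodynamic load: {hydrodynamic_per_ft(4.0, 6.0):7.1f} lb per ft of wall")
```

    Because both loads scale directly with wall width, reporting them per unit width is what lets a single map be rescaled to any structure of interest.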

    The Importance of Standards for Sharing of Computational Models and Data

    The target article by Lee et al. (in review) highlights the ways in which ongoing concerns about research reproducibility extend to model-based approaches in cognitive science. Whereas Lee et al. focus primarily on the importance of research practices to improve model robustness, we propose that the transparent sharing of model specifications, including their inputs and outputs, is also essential to improving the reproducibility of model-based analyses. We outline an ongoing effort (within the context of the Brain Imaging Data Structure community) to develop standards for sharing the structure of computational models and their outputs.
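    As a purely hypothetical sketch of what a shareable, machine-readable model specification could contain (the field names below are illustrative assumptions, not fields from the actual BIDS standard):

```python
import json

# Hypothetical model specification; keys are illustrative assumptions,
# not the actual BIDS computational-model schema.
model_spec = {
    "Name": "example_drift_diffusion",
    "Description": "Decision model fit to choice and response-time data",
    "Input": {"Events": "sub-01_task-choice_events.tsv"},
    "Parameters": {"drift_rate": "free", "boundary_separation": "free"},
    "Software": {"Name": "example_fitter", "Version": "0.1.0"},
    "Output": {"Estimates": "sub-01_model-ddm_estimates.tsv"},
}

# Writing the specification alongside the data lets other researchers
# audit or re-run a model-based analysis without reverse-engineering code.
with open("model-spec.json", "w") as f:
    json.dump(model_spec, f, indent=2)
```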

    Accelerating Medicines Partnership® Schizophrenia (AMP® SCZ): Rationale and Study Design of the Largest Global Prospective Cohort Study of Clinical High Risk for Psychosis

    This article describes the rationale, aims, and methodology of the Accelerating Medicines Partnership® Schizophrenia (AMP® SCZ). This is the largest international collaboration to date that will develop algorithms to predict trajectories and outcomes of individuals at clinical high risk (CHR) for psychosis and to advance the development and use of novel pharmacological interventions for CHR individuals. We present a description of the participating research networks and the data processing, analysis, and coordination center; their processes for data harmonization across 43 sites from 13 participating countries (recruitment across North America, Australia, Europe, Asia, and South America); data flow and quality assessment processes; data analyses; and the transfer of data to the National Institute of Mental Health (NIMH) Data Archive (NDA) for use by the research community. In an expected sample of approximately 2000 CHR individuals and 640 matched healthy controls, AMP SCZ will collect clinical, environmental, and cognitive data along with multimodal biomarkers, including neuroimaging, electrophysiology, fluid biospecimens, speech and facial expression samples, and novel measures derived from digital health technologies, including smartphone-based daily surveys, passive sensing, and actigraphy. The study will investigate a range of clinical outcomes over a 2-year period, including transition to psychosis, remission or persistence of CHR status, attenuated positive symptoms, persistent negative symptoms, mood and anxiety symptoms, and psychosocial functioning. The global reach of AMP SCZ and its harmonized, innovative methods promise to catalyze the development of new treatments that address critical unmet clinical and public health needs in CHR individuals.

    Late postnatal maturation of excitatory synaptic transmission permits adult-like expression of hippocampal-dependent behaviors

    No full text

    Linear Collider Physics Resource Book for Snowmass 2001, 2: Higgs and Supersymmetry Studies

    No full text
    This Resource Book reviews the physics opportunities of a next-generation e+e- linear collider and discusses options for the experimental program. Part 2 reviews the possible experiments on Higgs bosons and supersymmetric particles that can be done at a linear collider.

    DUNE Offline Computing Conceptual Design Report

    No full text
    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate, and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.
