Polymer-Ionic Liquid Hybrid Electrolytes for Lithium Batteries
Intellectual Merit:
The goal of this dissertation is to investigate the electrochemical properties and microstructure of thin film polymer electrolytes with enhanced electrochemical performance. Solid electrolyte architectures have been produced by blending novel room temperature ionic liquid (RTIL) chemistries with ionically conductive polymer matrices. A variety of microstructure and electrical characterization tools have been employed to understand the hybrid electrolyte's performance.
Lithium-ion batteries are limited by the safety of their electrolyte. The current generation of batteries uses organic solvents to conduct lithium between the electrodes. Occasionally, the low boiling point and high combustibility of these solvents lead to pressure build-ups and fires within cells. Additionally, there are issues with electrolyte loss and decreased performance that must be accounted for in daily use. Replacing this system with a solid polymer electrolyte that can match the properties of an organic solvent is therefore of great interest in battery research. However, a polymer electrolyte by itself cannot meet the required performance characteristics; adding an RTIL allows it to reach the necessary threshold values.
With the development of the novel sulfur-based ionic liquid compounds, improved performance characteristics were realized for the polymer electrolyte. The synthesized RTILs were blended with ionically conductive polymer matrices (polyethylene oxide (PEO) or block copolymers of PEO) to produce solid electrolytes. Such shape-conforming materials could lead to unique battery morphologies, but more importantly the safety of these new batteries will greatly exceed that of batteries based on traditional organic carbonate electrolytes.
Broader Impacts:
The broader impact of this research is that it will ultimately help push forward an attractive alternative to carbonate-based liquid electrolyte systems. Development of these alternatives has been slow; however, moving beyond the current commercial options will lead to safer and more powerful batteries. The polymer electrolyte system offers flexibility in both mechanical properties and product design. In due course, this will lead to batteries unlike any currently available on the market. RTILs offer an attractive option, and the electrochemical understanding of novel sulfur-based architectures will lead to further potential uses for these compounds.
Prostaglandin E2 increases fibroblast gene-specific and global DNA methylation via increased DNA methyltransferase expression
Disambiguation of Social Polarization Concepts and Measures
ABSTRACT
This article distinguishes nine senses of polarization and provides formal measures for each one to refine the methodology used to describe polarization in distributions of attitudes. Each distinct concept is explained through a definition, formal measures, examples, and references. We then apply these measures to GSS data regarding political views, opinions on abortion, and religiosity, topics described as revealing social polarization. Previous breakdowns of polarization include domain-specific assumptions and focus on a subset of the distribution's features. This has conflated multiple, independent features of attitude distributions. The current work aims to extract the distinct senses of polarization and demonstrate that by becoming clearer on these distinctions we can better focus our efforts on substantive issues in social phenomena.
Understanding Polarization: Meaning, Measures, and Model Evaluation
Polarization is a topic of intense interest among social scientists, but there is significant
disagreement regarding the character of the phenomenon and little understanding of underlying mechanics. A first problem, we argue, is that polarization appears in the literature as not one concept but many. In the first part of the article, we distinguish nine phenomena that may be considered polarization, with suggestions of appropriate measures for each. In the second part of the article, we apply this analysis to evaluate the types of polarization generated by the three major families of computational models proposing specific mechanisms of opinion polarization.
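As a toy illustration of what distribution-based measures of polarization can look like, the sketch below (in Python) computes two simple quantities on a single attitude item: the dispersion of responses and the distance between the mean positions of two groups. These are hypothetical stand-ins for exposition, not the nine measures defined in these articles.

import numpy as np

def spread(attitudes):
    """Dispersion of responses on a single attitude scale (e.g., a 7-point GSS item)."""
    return float(np.std(attitudes))

def group_divergence(attitudes, groups):
    """Distance between the mean attitudes of exactly two groups (e.g., party ID)."""
    attitudes = np.asarray(attitudes, dtype=float)
    groups = np.asarray(groups)
    labels = np.unique(groups)
    assert len(labels) == 2, "this toy measure assumes exactly two groups"
    means = [attitudes[groups == g].mean() for g in labels]
    return float(abs(means[0] - means[1]))

# Toy data: a 7-point political-views item for ten respondents.
views = [1, 2, 2, 3, 5, 6, 6, 7, 7, 7]
party = ["A", "A", "A", "A", "B", "B", "B", "B", "B", "B"]
print(spread(views))                   # overall dispersion of attitudes
print(group_divergence(views, party))  # distance between the two group means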
Scientific Networks on Data Landscapes: Question Difficulty, Epistemic Success, and Convergence
A scientific community can be modeled as a collection of epistemic agents attempting to answer questions, in part by communicating about their hypotheses and results. We can treat the pathways of scientific communication as a network. When we do, it becomes clear that the interaction between the structure of the network and the nature of the question under investigation affects epistemic desiderata, including accuracy and speed to community consensus. Here we build on previous work, both our own and others', in order to get a firmer grasp on precisely which features of scientific communities interact with which features of scientific questions in order to influence epistemic outcomes. We introduce a measure on the landscape meant to capture some aspects of the difficulty of answering an empirical question. We then investigate both how different communication networks affect whether the community finds the best answer and the time it takes for the community to reach consensus on an answer. We measure these two epistemic desiderata on a continuum of networks sampled from the Watts-Strogatz spectrum. It turns out that finding the best answer and reaching consensus exhibit radically different patterns. The time it takes for a community to reach a consensus in these models roughly tracks mean path length in the network. Whether a scientific community finds the best answer, on the other hand, tracks neither mean path length nor clustering coefficient.
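The network statistics named above are standard and easy to reproduce. The following sketch, assuming Python with networkx and illustrative parameter choices rather than the paper's, samples communication networks along the Watts-Strogatz rewiring spectrum and reports mean path length and clustering coefficient for each.

import networkx as nx

n_agents, degree = 30, 4  # illustrative community size and connectivity
for p_rewire in [0.0, 0.1, 0.5, 1.0]:  # from regular lattice to random network
    G = nx.connected_watts_strogatz_graph(n_agents, degree, p_rewire, tries=100)
    mean_path = nx.average_shortest_path_length(G)  # roughly tracks time to consensus
    clustering = nx.average_clustering(G)           # local redundancy of communication
    print(f"p={p_rewire:.1f}  mean path length={mean_path:.2f}  clustering={clustering:.2f}")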
Development of a Novel ex vivo Nasal Epithelial Cell Model Supporting Colonization With Human Nasal Microbiota
The nasal mucosa provides first-line defense against inhaled pathogens while creating a unique microenvironment for bacterial communities. Studying the impact of microbiota in the nasal cavity has been difficult due to limitations with current models including explant cultures, primary cells, or neoplastic cell lines. Most notably, none have been shown to support reproducible colonization by bacterial communities from human donors. Therefore, to conduct controlled studies of the human nasal ecosystem, we have developed a novel ex vivo mucosal model that supports bacterial colonization of a cultured host mucosa created by immortalized human nasal epithelial cells (NEC). For this model, immortalized NEC established from 5 male and 5 female donors were cultured with an air-interfaced, apical surface on a porous transwell membrane. NEC were grown from nasal turbinate tissues harvested from willed bodies or from discarded tissue collected during sinonasal procedures. Immortalized cells were evaluated through molecular verification of cell type, histological confirmation of tissue differentiation including formation of tight junctions, NEC multilayer viability, metabolism, physiology and imaging of the luminal surface by scanning electron microscopy. Results showed proper differentiation and multilayer formation at 7 to 10 days after air interface that was maintained for up to 3 weeks. The optimized mucosal cultures created an environment necessary to sustain colonization by nasal microbiomes (NMBs) that were collected from healthy volunteers, cryogenically preserved and characterized with customized quantitative polymerase chain reaction (qPCR) arrays. Polymicrobial communities of nasal bacteria associated with healthy and inflamed states were consistently reproduced in matured NEC co-cultures by transplant of NMBs from multiple community types. The cultured NMBs were stable after an initial period of bacterial replication and equilibration. This novel ex vivo culture system is the first model that supports controlled cultivation of NMBs, allowing for lab-based causation studies and further experimentation to explore the complexities of host-microbe and microbe-microbe interactions.
Whole-exome sequencing and clinical interpretation of FFPE tumor samples to guide precision cancer medicine
Translating whole exome sequencing (WES) for prospective clinical use may impact the care of cancer patients; however, multiple innovations are necessary for clinical implementation. These include: (1) rapid and robust WES from formalin-fixed paraffin embedded (FFPE) tumor tissue, (2) analytical output similar to data from frozen samples, and (3) clinical interpretation of WES data for prospective use. Here, we describe a prospective clinical WES platform for archival FFPE tumor samples. The platform employs computational methods for effective clinical analysis and interpretation of WES data. When applied retrospectively to 511 exomes, the interpretative framework revealed a "long tail" of somatic alterations in clinically important genes. Prospective application of this approach identified clinically relevant alterations in 15/16 patients. In one patient, previously undetected findings guided clinical trial enrollment leading to an objective clinical response. Overall, this methodology may inform the widespread implementation of precision cancer medicine.
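As a purely hypothetical sketch of one step such an interpretive framework might include, the Python snippet below flags somatic alterations that fall in a curated list of clinically important genes. The gene list and variant records are invented for illustration and do not reproduce the platform's actual methods.

ACTIONABLE_GENES = {"BRAF", "EGFR", "ERBB2", "PIK3CA"}  # invented, illustrative list

somatic_variants = [                      # invented example records
    {"gene": "BRAF", "change": "V600E"},
    {"gene": "TTN",  "change": "A123T"},
    {"gene": "EGFR", "change": "L858R"},
]

# Keep only alterations that fall in the curated gene list.
clinically_relevant = [v for v in somatic_variants if v["gene"] in ACTIONABLE_GENES]
for v in clinically_relevant:
    print(f"{v['gene']} {v['change']}: candidate for clinical review")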
Time to get personal? The impact of researchers' choices on the selection of treatment targets using the experience sampling methodology
OBJECTIVE: One of the promises of the experience sampling methodology (ESM) is that a statistical analysis of an individual's emotions, cognitions and behaviors in everyday life could be used to identify relevant treatment targets. A requisite for clinical implementation is that outcomes of such person-specific time-series analyses are not wholly contingent on the researcher performing them. METHODS: To evaluate this, we crowdsourced the analysis of one individual patient's ESM data to 12 prominent research teams, asking them what symptom(s) they would advise the treating clinician to target in subsequent treatment. RESULTS: Variation was evident at different stages of the analysis, from preprocessing steps (e.g., variable selection, clustering, handling of missing data) to the type of statistics and rationale for selecting targets. Most teams did include a type of vector autoregressive model, examining relations between symptoms over time. Although most teams were confident their selected targets would provide useful information to the clinician, no two recommendations were alike: both the number (0-16) and nature of selected targets varied widely. CONCLUSION: This study makes transparent that the selection of treatment targets based on personalized models using ESM data is currently highly conditional on subjective analytical choices and highlights key conceptual and methodological issues that need to be addressed in moving towards clinical implementation.
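For readers unfamiliar with the modeling approach most teams used, here is a minimal sketch of fitting a vector autoregressive (VAR) model to ESM-style time series, assuming Python with statsmodels. The symptom names and simulated data are placeholders, not the patient's actual items.

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n_obs = 200  # e.g., several ESM prompts per day over a few weeks
data = pd.DataFrame({          # placeholder symptom ratings, not real patient data
    "sad":     rng.normal(size=n_obs),
    "worry":   rng.normal(size=n_obs),
    "fatigue": rng.normal(size=n_obs),
})

model = VAR(data)
results = model.fit(maxlags=3, ic="aic")  # choose the lag order by AIC
print(results.summary())                  # lagged effects of each symptom on the others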
Catching Element Formation In The Act
Gamma-ray astronomy explores the most energetic photons in nature to address
some of the most pressing puzzles in contemporary astrophysics. It encompasses
a wide range of objects and phenomena: stars, supernovae, novae, neutron stars,
stellar-mass black holes, nucleosynthesis, the interstellar medium, cosmic rays
and relativistic-particle acceleration, and the evolution of galaxies. MeV
gamma-rays provide a unique probe of nuclear processes in astronomy, directly
measuring radioactive decay, nuclear de-excitation, and positron annihilation.
Gamma-ray photons carry substantial information: they allow us to see deeper into these objects, the bulk of the power in these sources is often emitted at gamma-ray energies, and radioactivity provides a natural physical clock that adds unique information. New science will be driven by time-domain population studies at
gamma-ray energies. This science is enabled by next-generation gamma-ray
instruments with one to two orders of magnitude better sensitivity, larger sky
coverage, and faster cadence than all previous gamma-ray instruments. This
transformative capability permits: (a) the accurate identification of the
gamma-ray emitting objects and correlations with observations taken at other
wavelengths and with other messengers; (b) construction of new gamma-ray maps
of the Milky Way and other nearby galaxies where extended regions are
distinguished from point sources; and (c) considerable serendipitous science of
scarce events -- nearby neutron star mergers, for example. Advances in
technology push the performance of new gamma-ray instruments to address a wide
set of astrophysical questions.
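As a worked illustration of the "natural physical clock" mentioned above, the Python sketch below evaluates the radioactive decay law for an illustrative gamma-ray-emitting isotope (44Ti, half-life roughly 60 years, a well-known supernova tracer); the isotope choice and numbers are examples, not taken from the text.

import math

def remaining_fraction(t_years, half_life_years):
    """Fraction of the original nuclei still present after t_years."""
    return math.exp(-math.log(2) * t_years / half_life_years)

t_half_ti44 = 60.0  # years, approximate half-life of 44Ti
for age in [30, 60, 120, 300]:
    print(f"after {age:3d} yr: {remaining_fraction(age, t_half_ti44):.3f} of the 44Ti remains")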
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel
camera. The standard observing sequence will consist of pairs of 15-second
exposures in a given field, with two such visits in each pointing in a given
night. With these repeats, the LSST system is capable of imaging about 10,000
square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The
project is in the construction phase and will begin regular survey operations
by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy,
covering the wavelength range 320–1050 nm. About 90% of the observing time
will be devoted to a deep-wide-fast survey mode which will uniformly observe a
18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The remaining 10% of the observing time will be allocated to projects such as a
Very Deep and Fast time domain survey. The goal is to make LSST data products,
including a relational database of about 32 trillion observations of 40 billion
objects, available to the public and scientists around the world.
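A back-of-envelope check of the survey numbers quoted in the abstract, ignoring field overlaps, weather, and downtime; the arithmetic below (in Python) is illustrative only.

field_of_view_deg2 = 9.6        # from the abstract
main_survey_area_deg2 = 18_000  # deep-wide-fast survey region
visits_per_field = 800          # summed over all six bands
survey_years = 10

fields = main_survey_area_deg2 / field_of_view_deg2     # ~1,875 pointings
total_visits = fields * visits_per_field                # ~1.5 million visits
visits_per_night = total_visits / (survey_years * 365)  # averaged over every night

print(f"fields: {fields:.0f}, total visits: {total_visits:.2e}, "
      f"about {visits_per_night:.0f} visits per night on average")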
- âŠ