2,201 research outputs found
pncA and bptA Are Not Sufficient To Complement Ixodes scapularis Colonization and Persistence by Borrelia burgdorferi in a Linear Plasmid lp25-Deficient Background
The complex segmented genome of Borrelia burgdorferi comprises a linear chromosome along with numerous linear and circular plasmids essential for tick and/or mammalian infectivity. The pathogenic necessity for specific borrelial plasmids has been identified; most notably, infections of the tick vector and mammalian host both require linear plasmid 25 (lp25). Genes carried on lp25, specifically bptA and pncA, are postulated to enable B. burgdorferi to infect and persist in Ixodes ticks. In this study, we complemented an lp25-deficient borrelial strain with pncA alone or pncA accompanied by bptA to evaluate the ability of the complemented strains to restore larval colonization and persistence through transstadial transmission relative to that of wild-type B. burgdorferi. The acquisition of the complemented strains by tick larvae from infected mice and/or the survival of these strains was significantly decreased when assayed by cultivation and quantitative PCR (qPCR). Only 10% of the pncA-complemented strain organisms were found by culture to survive 17 days following larval feeding, while 45% of the pncA- and bptA-complemented strain organisms survived, with similar results by PCR. However, neither of the complemented B. burgdorferi strains was capable of persisting through the molt to the nymphal stage as analyzed by culture. qPCR analyses of unfed nymphs detected B. burgdorferi genomes in several nymphs at low copy numbers, likely indicating the presence of DNA from dead or dying cells. Overall, the data indicate that pncA and bptA cannot independently support infection, suggesting that lp25 carries additional gene(s) or regulatory elements critical for B. burgdorferi survival and pathogenesis in the Ixodes vector.
Towards Improved Quantum Simulations and Sensing with Trapped 2D Ion Crystals via Parametric Amplification
Improving coherence is a fundamental challenge in quantum simulation and
sensing experiments with trapped ions. Here we discuss, experimentally
demonstrate, and estimate the potential impacts of two different protocols that
enhance, through motional parametric excitation, the coherent spin-motion
coupling of ions obtained with a spin-dependent force. The experiments are
performed on 2D crystal arrays of approximately one hundred 9Be+ ions
confined in a Penning trap. By modulating the trapping potential at close to
twice the center-of-mass mode frequency, we squeeze the motional mode and
enhance the spin-motion coupling while maintaining spin coherence. With a
stroboscopic protocol, we measure motional squeezing below the
ground-state motion, from which theory predicts an enhancement in
the sensitivity for measuring small displacements using a recently demonstrated
protocol [Science 373, 673 (2021)]. With a continuous squeezing
protocol, we measure and accurately calibrate the parametric coupling strength.
Theory suggests this protocol can be used to improve quantum spin squeezing,
limited in our system by off-resonant light scatter. We illustrate numerically
the trade-offs between strong parametric amplification and motional dephasing
in the form of center-of-mass frequency fluctuations for improving quantum spin
squeezing in our set-up.
Comment: 11 pages, 6 figures
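As a back-of-the-envelope companion to the squeezing figures discussed in this abstract (and not the paper's own analysis), the textbook relation between the squeezing parameter r and quadrature noise, together with the idealized r = g t for a resonant parametric drive of strength g, can be sketched as follows; the numeric values of g and t below are hypothetical:

```python
import math

def squeezing_db(r: float) -> float:
    """Noise reduction of the squeezed quadrature below the motional
    ground state, in dB. Var_squeezed / Var_vacuum = exp(-2 r), so the
    reduction is 10*log10(exp(2 r)) dB."""
    return 10.0 * math.log10(math.exp(2.0 * r))

def squeezing_param(g: float, t: float) -> float:
    """Idealized squeezing parameter for a resonant parametric drive of
    coupling strength g (rad/s) applied for a time t (s): r = g t."""
    return g * t

# A hypothetical drive of g = 1000 rad/s for 1 ms gives r = 1,
# i.e. roughly 8.7 dB of squeezing below the ground-state motion.
r = squeezing_param(g=1.0e3, t=1.0e-3)
reduction = squeezing_db(r)
```

In practice, as the abstract notes, motional dephasing (e.g. center-of-mass frequency fluctuations) limits how much of this idealized gain is usable.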
Evaluation of work-based screening for early signs of alcohol-related liver disease in hazardous and harmful drinkers: the PrevAIL study
Background
The direct cost of excessive alcohol consumption to health services is substantial but dwarfed by the cost borne by the workplace as a result of lost productivity. The workplace is also a promising setting for health interventions. The Preventing Alcohol Harm in Liverpool and Knowsley (PrevAIL) project aimed to evaluate a mechanism for detecting the prevalence of alcohol-related liver disease using fibrosis biomarkers. Secondary aims were to identify the additive effect of obesity as a risk factor for early liver disease and to assess other impacts of alcohol on work, using a cross-sectional survey.
Methods
Participants (aged 36-55y) from 13 workplaces participated (March 2011–April 2012). BMI, waist circumference, blood pressure, and self-reported alcohol consumption in the previous week were recorded. Those consuming more than the accepted UK threshold (men: >21 units; women: >14 units of alcohol) provided a 20 ml venous blood sample for a biomarker test (Southampton Traffic Light Test) and completed an alcohol questionnaire (incorporating the Severity of Alcohol Dependence Questionnaire).
Results
The screening mechanism enrolled 363 individuals (52 % women), 39 % of whom drank above the threshold and participated in the liver screen (n = 141; complete data, 124 persons). Workplaces with successful participation were those where employers actively promoted, encouraged and facilitated attendance. Biomarkers detected that 30 % had liver disease (25 % intermediate; 5 % probable). Liver disease was associated with the frequency of visits to the family physician (P = 0.036) and obesity (P = 0.052).
Conclusions
The workplace is an important setting for addressing alcohol harm, but there are barriers to voluntary screening that need to be addressed. Early detection and support of cases in the community could avert deaths and save health and social costs. Alcohol and obesity should be addressed simultaneously, because of their known multiplicative effect on liver disease risk, and because employers preferred a general health intervention to one that focused solely on alcohol consumption.
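The screening gate described in Methods (self-reported weekly units checked against the UK thresholds in use at the time: >21 for men, >14 for women) is a simple rule. A minimal sketch, with function and field names invented for illustration:

```python
# UK weekly-unit thresholds applied by the PrevAIL screen (study-era guidance)
THRESHOLDS = {"male": 21.0, "female": 14.0}

def eligible_for_liver_screen(sex: str, weekly_units: float) -> bool:
    """True if self-reported weekly consumption exceeds the sex-specific
    threshold, i.e. the participant proceeds to the biomarker test."""
    return weekly_units > THRESHOLDS[sex.lower()]
```

For example, a man reporting 25 units in the previous week would be offered the biomarker test, while a woman reporting exactly 14 units would not.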
Exploring the Local Milky Way: M Dwarfs as Tracers of Galactic Populations
We have assembled a spectroscopic sample of low-mass dwarfs observed as part
of the Sloan Digital Sky Survey along one Galactic sightline, designed to
investigate the observable properties of the thin and thick disks. This sample
of ~7400 K and M stars also has measured ugriz photometry, proper motions, and
radial velocities. We have computed UVW space motion distributions, and
investigate their structure with respect to vertical distance from the Galactic
Plane. We place constraints on the velocity dispersions of the thin and thick
disks, using two-component Gaussian fits. We also compare these kinematic
distributions to a leading Galactic model. Finally, we investigate other
possible observable differences between the thin and thick disks, such as
color, active fraction, and metallicity.
Comment: 11 pages, 12 figures, Accepted by A
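The two-component Gaussian fits mentioned above can be reproduced in miniature with a standard expectation-maximization (EM) fit of a 1D two-Gaussian mixture to a velocity component. Everything here (component count, initialization, the simulated dispersions) is a generic sketch, not the authors' pipeline:

```python
import numpy as np

def fit_two_gaussians(v, iters=200, seed=0):
    """Fit a two-component 1D Gaussian mixture to velocities v via EM.
    Returns (weights, means, sigmas) sorted so the narrower
    (thin-disk-like) component comes first."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(v, 2, replace=False)            # crude mean init
    sig = v.std() * np.array([0.5, 1.5])            # one narrow, one broad
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each star
        pdf = (w / (sig * np.sqrt(2 * np.pi))) * \
              np.exp(-0.5 * ((v[:, None] - mu) / sig) ** 2)
        resp = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: reweighted moments
        n = resp.sum(axis=0)
        w = n / len(v)
        mu = (resp * v[:, None]).sum(axis=0) / n
        sig = np.sqrt((resp * (v[:, None] - mu) ** 2).sum(axis=0) / n)
    order = np.argsort(sig)
    return w[order], mu[order], sig[order]
```

With simulated data (e.g. 80% of stars drawn from a kinematically cold component and 20% from a hotter, lagging one), the fit recovers the two dispersions and the asymmetric-drift offset between the means.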
Early-branching gut fungi possess a large, comprehensive array of biomass-degrading enzymes
Available in PMC 2016 November 07.
The fungal kingdom is the source of almost all industrial enzymes in use for lignocellulose bioprocessing. We developed a systems-level approach that integrates transcriptomic sequencing, proteomics, phenotype, and biochemical studies of relatively unexplored basal fungi. Anaerobic gut fungi isolated from herbivores produce a large array of biomass-degrading enzymes that synergistically degrade crude, untreated plant biomass and are competitive with optimized commercial preparations from Aspergillus and Trichoderma. Compared to these model platforms, gut fungal enzymes are unbiased in substrate preference due to a wealth of xylan-degrading enzymes. These enzymes are universally catabolite-repressed and are further regulated by a rich landscape of noncoding regulatory RNAs. Additionally, we identified several promising sequence-divergent enzyme candidates for lignocellulosic bioprocessing.
Funding: United States Department of Energy, Office of Science, Biological and Environmental Research (BER) program; United States Department of Energy (DOE Grant DE-SC0010352); United States Department of Agriculture (Award 2011-67017-20459); Institute for Collaborative Biotechnologies (Grant W911NF-09-0001).
THE COMMUNITY LEVERAGED UNIFIED ENSEMBLE (CLUE) IN THE 2016 NOAA/HAZARDOUS WEATHER TESTBED SPRING FORECASTING EXPERIMENT
One primary goal of annual Spring Forecasting Experiments (SFEs), which are co-organized by NOAA's National Severe Storms Laboratory and Storm Prediction Center and conducted in the National Oceanic and Atmospheric Administration's (NOAA) Hazardous Weather Testbed, is documenting performance characteristics of experimental, convection-allowing modeling systems (CAMs). Since 2007, the number of CAMs (including CAM ensembles) examined in the SFEs has increased dramatically, peaking at six different CAM ensembles in 2015. Meanwhile, major advances have been made in creating, importing, processing, verifying, and developing tools for analyzing and visualizing these large and complex datasets. However, progress toward identifying optimal CAM ensemble configurations has been inhibited because the different CAM systems have been independently designed, making it difficult to attribute differences in performance characteristics. Thus, for the 2016 SFE, a much more coordinated effort among many collaborators was made by agreeing on a set of model specifications (e.g., model version, grid spacing, domain size, and physics) so that the simulations contributed by each collaborator could be combined to form one large, carefully designed ensemble known as the Community Leveraged Unified Ensemble (CLUE). The 2016 CLUE was composed of 65 members contributed by five research institutions and represents an unprecedented effort to enable an evidence-driven decision process to help guide NOAA's operational modeling efforts. Eight unique experiments were designed within the CLUE framework to examine issues directly relevant to the design of NOAA's future operational CAM-based ensembles. This article will highlight the CLUE design and present results from one of the experiments examining the impact of single versus multicore CAM ensemble configurations.
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m
effective) primary mirror, a 9.6 deg^2 field of view, and a 3.2 Gigapixel
camera. The standard observing sequence will consist of pairs of 15-second
exposures in a given field, with two such visits in each pointing in a given
night. With these repeats, the LSST system is capable of imaging about 10,000
square degrees of sky in a single filter in three nights. The typical 5σ
point-source depth in a single visit in r will be ~24.5 (AB). The
project is in the construction phase and will begin regular survey operations
by 2022. The survey area will be contained within 30,000 deg^2 with
δ < +34.5°, and will be imaged multiple times in six bands, ugrizy,
covering the wavelength range 320–1050 nm. About 90% of the observing time
will be devoted to a deep-wide-fast survey mode which will uniformly observe an
18,000 deg^2 region about 800 times (summed over all six bands) during the
anticipated 10 years of operations, and yield a coadded map to r ~ 27.5. The
remaining 10% of the observing time will be allocated to projects such as a
Very Deep and Fast time domain survey. The goal is to make LSST data products,
including a relational database of about 32 trillion observations of 40 billion
objects, available to the public and scientists around the world.
Comment: 57 pages, 32 color figures, version with high-resolution figures
available from https://www.lsst.org/overvie
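The cadence arithmetic quoted in this abstract can be checked directly. The field of view, sky area, and exposure lengths come from the text above; the per-visit overhead is an assumed placeholder, and the tiling ignores field overlap and dithering:

```python
import math

FOV_DEG2 = 9.6       # field of view per pointing (from the abstract)
SKY_DEG2 = 10_000    # area covered in a single filter in three nights
EXPOSURE_S = 15.0    # one exposure; a visit is a pair of exposures
OVERHEAD_S = 9.0     # assumed readout + slew per visit (placeholder)

# Distinct pointings needed to tile the area (no overlap assumed)
pointings = math.ceil(SKY_DEG2 / FOV_DEG2)

# Each pointing gets two visits on its night; pointings are spread
# over the three nights of the cycle.
visits_total = 2 * pointings
time_per_visit = 2 * EXPOSURE_S + OVERHEAD_S            # seconds
hours_per_night = visits_total * time_per_visit / 3 / 3600.0
```

Under these assumptions the tiling needs on the order of a thousand pointings, and the implied observing load is several hours per night, consistent with the three-night, single-filter coverage claimed above.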
LSST Science Book, Version 2.0
A survey that can cover the sky in optical bands over wide fields to faint
magnitudes with a fast cadence will enable many of the exciting science
opportunities of the next decade. The Large Synoptic Survey Telescope (LSST)
will have an effective aperture of 6.7 meters and an imaging camera with field
of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over
20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with
fifteen second exposures in six broad bands from 0.35 to 1.1 microns, to a
total point-source depth of r~27.5. The LSST Science Book describes the basic
parameters of the LSST hardware, software, and observing plans. The book
discusses educational and outreach opportunities, then goes on to describe a
broad range of science that LSST will revolutionize: mapping the inner and
outer Solar System, stellar populations in the Milky Way and nearby galaxies,
the structure of the Milky Way disk and halo and other objects in the Local
Volume, transient and variable objects both at low and high redshift, and the
properties of normal and active galaxies at low and high redshift. It then
turns to far-field cosmological topics, exploring properties of supernovae to
z~1, strong and weak lensing, the large-scale distribution of galaxies and
baryon oscillations, and how these different probes may be combined to
constrain cosmological models and the physics of dark energy.
Comment: 596 pages. Also available at full resolution at
http://www.lsst.org/lsst/sciboo