b-Initiated processes at the LHC: a reappraisal
Several key processes at the LHC in the standard model and beyond that
involve bottom quarks, such as single-top, Higgs, and weak vector boson
associated production, can be described in QCD either in a 4-flavor or
5-flavor scheme. In the former, b quarks appear only in the final state and
are typically considered massive. In 5-flavor schemes, calculations include b
quarks in the initial state, are simpler, and allow the resummation of
possibly large initial state logarithms of the type log(Q^2/m_b^2) into the b
parton distribution function (PDF), Q being the typical scale of the hard
process. In this work we critically reconsider the rationale for using
5-flavor improved schemes at the LHC. Our motivation stems from the
observation that the effects of initial state logs are rarely very large in
hadron collisions: 4-flavor computations are perturbatively well behaved and
a substantial agreement between predictions in the two schemes is found. We
identify two distinct reasons that explain this behaviour: the resummation of
the initial state logarithms into the b-PDF is relevant only at large Bjorken
x, and the possibly large ratios Q^2/m_b^2 are always accompanied by
universal phase space suppression factors. Our study paves the way to using
both schemes for the same process so as to exploit their complementary
advantages for different observables, such as employing a 5-flavor scheme to
accurately predict the total cross section at NNLO and the corresponding
4-flavor computation at NLO for fully exclusive studies.
Comment: Fixed typo in Eq. (A.10) and a few typos in Eqs. (C.2) and (C.3)
(Correcting) misdiagnoses of asthma: A cost effectiveness analysis
Background: The prevalence of physician-diagnosed asthma has risen over the past three decades, and misdiagnosis of asthma is potentially common.
Objective: To determine whether a secondary screening program to establish a correct diagnosis of asthma in those who report a physician diagnosis of asthma is cost effective.
Method: Randomly selected physician-diagnosed asthmatic subjects from 8 Canadian cities were studied with an extensive diagnostic algorithm to rule in, or rule out, a correct diagnosis of asthma. Subjects in whom the diagnosis of asthma was excluded were followed up for 6 months, and data on asthma medications and health care utilization were obtained. An economic analysis was performed to estimate the incremental lifetime costs associated with secondary screening of previously diagnosed asthmatic subjects. The analysis was from the perspective of the Canadian healthcare system and is reported in Canadian dollars.
Results: Of 540 randomly selected patients with physician-diagnosed asthma, 150 (28%; 95% CI 19-37%) did not have asthma when objectively studied. 71% of these misdiagnosed patients were on some asthma medications. Incorporating the incremental cost of secondary screening for the diagnosis of asthma, we found an average cost saving per 100 individuals screened (95% CI $4,588-$69,278).
Conclusion: Cost savings primarily resulted from lifetime costs of medication use averted in those who had been misdiagnosed.
This work was funded by the Canadian Institutes of Health Research, Canada, and the University of Ottawa Division of Respiratory Medicine.
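The headline misdiagnosis rate follows from the reported counts by simple arithmetic; a quick check (note that the interval quoted in the abstract is wider than a naive binomial one, presumably reflecting the clustered, multi-city sampling):

```python
# Fraction of physician-diagnosed "asthmatics" in whom asthma was excluded,
# using the counts reported in the abstract.
misdiagnosed = 150
sampled = 540

rate = misdiagnosed / sampled
print(f"{rate:.1%}")  # 27.8%, reported as 28% in the abstract
```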
The Thermal Electrical Conductivity Probe (TECP) for Phoenix
The Thermal and Electrical Conductivity Probe (TECP) is a component of the Microscopy, Electrochemistry, and Conductivity Analyzer (MECA) payload on the Phoenix Lander. TECP will measure the temperature, thermal conductivity and volumetric heat capacity of the regolith. It will also detect and quantify the population of mobile H2O molecules in the regolith, if any, throughout the polar summer, by measuring the electrical conductivity of the regolith, as well as the dielectric permittivity. In the vapor phase, TECP is capable of measuring the atmospheric H2O vapor abundance, as well as augmenting the wind velocity measurements from the meteorology instrumentation. TECP is mounted near the end of the 2.3 m Robotic Arm, and can be placed either in the regolith material or held aloft in the atmosphere. This paper describes the development and calibration of the TECP. In addition, substantial characterization of the instrument has been conducted to identify behavioral characteristics that might affect landed surface operations. The greatest potential issue identified in characterization tests is the extraordinary sensitivity of the TECP to placement. Small gaps alter the contact between the TECP and regolith, complicating data interpretation. Testing with the Phoenix Robotic Arm identified mitigation techniques that will be implemented during flight. A flight model of the instrument was also field tested in the Antarctic Dry Valleys during the 2007-2008 International Polar Year.
Differences in transcription between free-living and CO_2-activated third-stage larvae of Haemonchus contortus
Background:
The disease caused by Haemonchus contortus, a blood-feeding nematode of small ruminants, is of major economic importance worldwide. The infective third-stage larva (L3) of this gastric nematode is enclosed in a cuticle (sheath) and, once ingested with herbage by the host, undergoes an exsheathment process that marks the transition from the free-living (L3) to the parasitic (xL3) stage. This study explored changes in gene transcription associated with this transition and predicted, based on comparative analysis, functional roles for key transcripts in the metabolic pathways linked to larval development.
Results:
Totals of 101,305 (L3) and 105,553 (xL3) expressed sequence tags (ESTs) were determined using 454 sequencing technology, and then assembled and annotated; the most abundant transcripts encoded transthyretin-like, calcium-binding EF-hand, NAD(P)-binding and nucleotide-binding proteins as well as homologues of Ancylostoma-secreted proteins (ASPs). Using an in silico subtractive analysis, 560 and 685 sequences were shown to be uniquely represented in the L3 and xL3 stages, respectively; the transcripts encoded ribosomal proteins, collagens and elongation factors (in L3), and mainly peptidases and other enzymes of amino acid catabolism (in xL3). Caenorhabditis elegans orthologues of transcripts that were uniquely transcribed in each of L3 and xL3 were predicted to interact with a total of 535 other genes, all of which were involved in embryonic development.
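The in silico subtractive step reduces, in essence, to a set difference over the annotated transcript identifiers of the two stages; a minimal sketch (the identifiers below are invented placeholders, not data from the study):

```python
# In silico subtractive analysis: transcripts represented in only one stage.
# Identifiers here are hypothetical placeholders, not real H. contortus data.
l3_transcripts = {"ttr-1", "efh-2", "col-3", "asp-4"}
xl3_transcripts = {"asp-4", "pep-5", "cat-6"}

unique_l3 = l3_transcripts - xl3_transcripts    # represented only in L3
unique_xl3 = xl3_transcripts - l3_transcripts   # represented only in xL3
shared = l3_transcripts & xl3_transcripts       # common to both stages

print(sorted(unique_l3))   # ['col-3', 'efh-2', 'ttr-1']
print(sorted(unique_xl3))  # ['cat-6', 'pep-5']
```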
Conclusion:
The present study indicated that some key transcriptional alterations taking place during the transition from the L3 to the xL3 stage of H. contortus involve genes predicted to be linked to the development of neuronal tissue (L3 and xL3), formation of the cuticle (L3) and digestion of host haemoglobin (xL3). Future efforts using next-generation sequencing and bioinformatic technologies should provide the efficiency and depth of coverage required for the determination of the complete transcriptomes of different developmental stages and/or tissues of H. contortus as well as the genome of this important parasitic nematode. Such advances should lead to a significantly improved understanding of the molecular biology of H. contortus and, from an applied perspective, to novel methods of intervention.
The small x gluon and b\bar{b} production at the LHC
We study open b\bar{b} production at large rapidity at the LHC in an attempt
to pin down the gluon distribution at very low x. For the LHC energy of 7 TeV,
at next-to-leading order (NLO), there is a large factorization scale
uncertainty. We show that the uncertainty can be greatly reduced if events are
selected in which the transverse momenta of the two B-mesons balance each other
to some accuracy, that is |\vec p_{1T}+\vec p_{2T}| < k_0. This will fix the
scale \mu_F \simeq k_0, and will allow the LHCb experiment, in particular, to
study the x-behaviour of the gluon distribution down to x ~ 10^{-5}, at rather low
scales, \mu ~ 2 GeV. We evaluate the expected cross sections using, for
illustrative purposes, various recent sets of parton distribution functions.
Comment: 13 pages, 5 figures
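The quoted x ~ 10^{-5} can be checked with leading-order 2 → 2 kinematics, x_{1,2} = (m_T/√s) e^{±y}; for b quarks with a transverse mass of a few GeV near the forward edge of the LHCb acceptance (illustrative numbers, not values from the paper):

```python
import math

# LO 2->2 kinematics: x2 = (m_T / sqrt(s)) * exp(-y) for a pair at rapidity y.
# Illustrative inputs: m_T ~ 5 GeV (roughly the b-quark mass for low-p_T
# pairs) and y ~ 4.5, near the forward edge of the LHCb acceptance.
sqrt_s = 7000.0   # GeV, the LHC energy quoted in the abstract
m_T = 5.0         # GeV
y = 4.5

x2 = (m_T / sqrt_s) * math.exp(-y)
print(f"x2 ~ {x2:.1e}")  # x2 ~ 7.9e-06, i.e. the x ~ 10^-5 regime quoted
```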
Ecological and Genomic Attributes of Novel Bacterial Taxa That Thrive in Subsurface Soil Horizons.
While most bacterial and archaeal taxa living in surface soils remain undescribed, this problem is exacerbated in deeper soils, owing to the unique oligotrophic conditions found in the subsurface. Additionally, previous studies of soil microbiomes have focused almost exclusively on surface soils, even though the microbes living in deeper soils also play critical roles in a wide range of biogeochemical processes. We examined soils collected from 20 distinct profiles across the United States to characterize the bacterial and archaeal communities that live in subsurface soils and to determine whether there are consistent changes in soil microbial communities with depth across a wide range of soil and environmental conditions. We found that bacterial and archaeal diversity generally decreased with depth, as did the degree of similarity of microbial communities to those found in surface horizons. We observed five phyla that consistently increased in relative abundance with depth across our soil profiles: Chloroflexi, Nitrospirae, Euryarchaeota, and candidate phyla GAL15 and Dormibacteraeota (formerly AD3). Leveraging the unusually high abundance of Dormibacteraeota at depth, we assembled genomes representative of this candidate phylum and identified traits that are likely to be beneficial in low-nutrient environments, including the synthesis and storage of carbohydrates, the potential to use carbon monoxide (CO) as a supplemental energy source, and the ability to form spores. Together, these attributes likely allow members of the candidate phylum Dormibacteraeota to flourish in deeper soils and provide insight into the survival and growth strategies employed by the microbes that thrive in oligotrophic soil environments.
IMPORTANCE: Soil profiles are rarely homogeneous. Resource availability and microbial abundances typically decrease with soil depth, but microbes found in deeper horizons are still important components of terrestrial ecosystems.
By studying 20 soil profiles across the United States, we documented consistent changes in soil bacterial and archaeal communities with depth. Deeper soils harbored communities distinct from those of the more commonly studied surface horizons. Most notably, we found that the candidate phylum Dormibacteraeota (formerly AD3) was often dominant in subsurface soils, and we used genomes from uncultivated members of this group to identify why these taxa are able to thrive in such resource-limited environments. Simply digging deeper into soil can reveal a surprising number of novel microbes with unique adaptations to oligotrophic subsurface conditions.
The Fate of Arsenic in Noah’s Flood
One potential consequence of Noah’s Flood would be the mobilization of toxic elements such as arsenic (As), a group 15 metalloid with a significant solubility and redox chemistry in water and a high toxicity to human beings. This paper discusses the likely chemistry of arsenic during the Flood. The Flood would have released arsenic through hydrothermal activity, volcanic eruptions, and weathering of crustal rock. Arsenic in hydrothermal fluid would likely be rapidly precipitated by sulfides. Likewise, much of the arsenic in volcanoes would actually be deposited sub-surface as sulfides. In the presence of oxygen-rich waters, these sulfide minerals can undergo oxidative dissolution, releasing the arsenic back into the water to join that liberated by the weathering of the surface. Iron oxyhydroxides would form in such an environment, however, and these would sorb and remove arsenic from the water once again. In waters rich in organic carbon, reducing conditions can return periodically. This would lead to reductive dissolution to liberate the arsenic from the iron oxyhydroxides. However, these conditions can also reduce sulfates to sulfides and thus reprecipitate the arsenic sulfide minerals. Furthermore, the extremely rapid formation of sedimentary rock during the Flood would likely bury both the original sulfide minerals and the arsenic-sorbed iron oxyhydroxides before they could be significantly dissolved. The modern distribution of arsenic gives evidence of this; the element is often concentrated in large sedimentary basins adjacent to orogenic belts. It appears that arsenic sulfides (formed during the Flood) were in some cases subject to uplift during orogenesis associated with the Flood and underwent oxidation, resulting in the arsenic being sorbed to iron minerals and clays. These eroded into the foreland basins and were buried before the arsenic could leach into local waters to a major degree.
In modern times, however, reductive dissolution of these deposits has resulted in arsenic poisoning. While arsenic does not threaten the Flood model (rather, the Flood explains the modern distribution of arsenic), modern arsenic contamination is an ongoing result of the judgement of the Flood.
A practical, bioinformatic workflow system for large data sets generated by next-generation sequencing
Transcriptomics (at the level of single cells, tissues and/or whole organisms) underpins many fields of biomedical science, from understanding the basic cellular function in model organisms, to the elucidation of the biological events that govern the development and progression of human diseases, and the exploration of the mechanisms of survival, drug-resistance and virulence of pathogens. Next-generation sequencing (NGS) technologies are contributing to a massive expansion of transcriptomics in all fields and are reducing the cost, time and performance barriers presented by conventional approaches. However, bioinformatic tools for the analysis of the sequence data sets produced by these technologies can be daunting to researchers with limited or no expertise in bioinformatics. Here, we constructed a semi-automated, bioinformatic workflow system, and critically evaluated it for the analysis and annotation of large-scale sequence data sets generated by NGS. We demonstrated its utility for the exploration of differences in the transcriptomes among various stages and both sexes of an economically important parasitic worm (Oesophagostomum dentatum) as well as the prediction and prioritization of essential molecules (including GTPases, protein kinases and phosphatases) as novel drug target candidates. This workflow system provides a practical tool for the assembly, annotation and analysis of NGS data sets, including for researchers with limited bioinformatic expertise. The custom-written Perl, Python and Unix shell computer scripts used can be readily modified or adapted to suit many different applications. This system is now utilized routinely for the analysis of data sets from pathogens of major socio-economic importance and can, in principle, be applied to transcriptomics data sets from any organism.
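The abstract describes chaining assembly, annotation and analysis steps with glue scripts; a minimal sketch of such a step-runner in Python (the step names and shell commands below are hypothetical placeholders, not the authors' actual scripts):

```python
import subprocess

# Hypothetical three-stage NGS workflow: each step is a shell command run in
# order, stopping at the first failure. Commands here are echo placeholders.
STEPS = [
    ("assemble", "echo assembling reads"),
    ("annotate", "echo annotating contigs"),
    ("analyse",  "echo analysing annotations"),
]

def run_workflow(steps):
    """Run each named step in sequence; return the names that completed."""
    completed = []
    for name, command in steps:
        result = subprocess.run(command, shell=True)
        if result.returncode != 0:
            break  # abort the pipeline on the first failing step
        completed.append(name)
    return completed

if __name__ == "__main__":
    print(run_workflow(STEPS))  # ['assemble', 'annotate', 'analyse']
```

In a real workflow each placeholder command would invoke an external assembler, annotator or analysis script, which is why the runner checks the return code of every step before moving on.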
Disk-Jet Connection in the Radio Galaxy 3C 120
We present the results of extensive multi-frequency monitoring of the radio
galaxy 3C 120 between 2002 and 2007 at X-ray, optical, and radio wave bands, as
well as imaging with the Very Long Baseline Array (VLBA). Over the 5 yr of
observation, significant dips in the X-ray light curve are followed by
ejections of bright superluminal knots in the VLBA images. Consistent with
this, the X-ray flux and 37 GHz flux are anti-correlated, with the X-ray
variations leading the radio ones. This implies that, in this radio galaxy,
the radiative state of the accretion disk plus corona system, where the
X-rays are produced, has a direct effect on the events in the jet, where the
radio emission originates.
The X-ray power spectral density of 3C 120 shows a break, with a steeper
slope at shorter timescales, and the break timescale is commensurate with the
mass of the central black hole, based on observations of Seyfert galaxies and black hole
X-ray binaries. These findings provide support for the paradigm that black hole
X-ray binaries and active galactic nuclei are fundamentally similar systems,
with characteristic time and size scales linearly proportional to the mass of
the central black hole. The X-ray and optical variations are strongly
correlated in 3C 120, which implies that the optical emission in this object
arises from the same general region as the X-rays, i.e., in the accretion
disk-corona system. We numerically model multi-wavelength light curves of 3C
120 from such a system with the optical-UV emission produced in the disk and
the X-rays generated by scattering of thermal photons by hot electrons in the
corona. Comparing the temporal properties of the model light curves to those
of the observed variability, we constrain the physical size of the corona and
the distances of the emitting regions from the central BH.
Comment: Accepted for publication in the Astrophysical Journal. 28 pages, 21 figures, 2 tables
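The linear time-mass scaling invoked above can be illustrated by rescaling a stellar-mass system; the masses and reference timescale below are illustrative assumptions (a mass of order 5 x 10^7 solar masses is commonly quoted for 3C 120), not fitted values from the paper:

```python
# Linear scaling of characteristic variability timescale with black hole
# mass: T_agn = T_xrb * (M_agn / M_xrb). Inputs are illustrative only.
m_xrb = 10.0      # solar masses, typical stellar-mass black hole binary
t_xrb = 10.0      # seconds, assumed PSD-break timescale for the binary
m_agn = 5.0e7     # solar masses, order of magnitude often quoted for 3C 120

t_agn_seconds = t_xrb * (m_agn / m_xrb)
t_agn_days = t_agn_seconds / 86400.0
print(f"{t_agn_days:.0f} days")  # 579 days: months-to-years, as expected
```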