Reducing Greenhouse Gas Emissions in Transport: All in One Basket?
Analysis after analysis has shown consistently that if policy-makers aiming to meet climate goals are looking for the most-efficient, least-distortionary way to target emissions growth, there is simply nothing better than abandoning all emissions regulations except for one: a straight, revenue-neutral carbon tax. Nothing works through more channels, at a lower cost. Alas, policy-makers are not always looking for the most-efficient, least-distortionary way to target emissions growth. That’s because many of those same analyses show that in order to reach emissions targets, the price on carbon would have to be so punitive as to be politically unbearable, raising the price of gasoline, for example, by about a dollar a litre. That leads politicians to mix in other policies that are less visible to the consumer but also less efficient, less effective and more expensive in abating carbon dioxide. The recently negotiated Pan-Canadian Framework on Clean Growth and Climate Change intends to follow that model, relying on a blend of different policies to help reach Canada’s Paris climate targets. But while the government seems therefore determined to rule out the possibility of a nothing-but-a-carbon-tax plan, it is possible, through the careful application of just the right sort of emission-reduction approaches, to reduce the costs of abatement in a key policy target — namely, road transportation — to a level that at least approaches the lower cost of a carbon tax.
The government will likely consider several options in trying to reduce emissions from road transportation. Typical tools include requiring manufacturers to meet standards for new vehicles that mandate fuel economy and greenhouse gas emissions; gasoline taxes; taxes on emissions-intensive vehicles; subsidies for low-emission or zero-emission vehicles; and subsidies for public transit. Indications are that a low-carbon fuel standard (LCFS) will play a significant role in the Pan-Canadian Framework. Applied carefully, an LCFS combined with a mandate for automakers to sell more electric vehicles would be an appropriate policy for Canada to achieve meaningful emissions reductions at a tolerable cost, given other policy measures already committed to. Subsidies for electric vehicles, however, should be avoided, as they turn out to be one of the least cost-effective policies to reduce emissions. Requiring car makers to sell more electric vehicles will lead to higher prices for standard internal-combustion vehicles as automakers are forced to spread the cost of the electric-vehicle mandate across their non-electric models. That in turn will make electric cars relatively cheaper and non-electric cars pricier, making it likelier that consumers will gravitate to electric vehicles, helping reduce emissions. Meanwhile, as the LCFS is tightened, drivers of internal-combustion vehicles will face an even higher cost of filling up, again prompting more drivers to consider switching to electric vehicles. The combined effect — achievable at close to the cost of a carbon tax — will make it more expensive to drive a gasoline-powered car, similar to the effect of a carbon tax on drivers, but less visible and so less politically risky.
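To make the cross-subsidy channel concrete, here is a minimal back-of-the-envelope sketch. The mandate share and per-vehicle compliance cost below are invented for illustration only and are not figures from the Pan-Canadian Framework or the underlying analyses; the sketch simply shows how a sales mandate splits into an implicit tax on internal-combustion vehicles and an implicit subsidy on electric ones.

# Toy arithmetic for an electric-vehicle (EV) sales mandate.
# All numbers are hypothetical and chosen only for illustration.

mandate_share = 0.10      # required EV share of new-vehicle sales (assumed)
per_ev_subsidy = 8000.0   # implicit discount needed to sell one more EV, in dollars (assumed)

# If automakers recover the cost of discounting EVs by marking up their
# internal-combustion (ICE) models, the markup per ICE vehicle is:
markup_per_ice = mandate_share * per_ev_subsidy / (1.0 - mandate_share)

print(f"Implicit markup on each ICE vehicle: ${markup_per_ice:,.0f}")
# Revenue-neutral check: EV discounts paid out equal ICE markups collected.
print(mandate_share * per_ev_subsidy, (1.0 - mandate_share) * markup_per_ice)

Under these made-up numbers the relative price gap between an electric and a comparable gasoline vehicle shifts by roughly the subsidy plus the markup, which is the mechanism the abstract describes: pricier gasoline cars, cheaper electric cars, and no explicit tax visible at the pump.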
Dengue Virus and Autophagy
Several independent groups have reported that autophagy is required for optimal RNA replication of dengue virus (DENV). Initially, it was postulated that autophagosomes might play a structural role in replication complex formation. However, cryo-EM tomography of DENV replication complexes showed that DENV replicates on endoplasmic reticulum (ER) cisternae invaginations and not on classical autophagosomes. Recently, it was reported that autophagy plays an indirect role in DENV replication by modulating cellular lipid metabolism. DENV-induced autophagosomes deplete cellular triglycerides that are stored in lipid droplets, leading to increased β-oxidation and energy production. This is the first example of a virus triggering autophagy to modulate cellular physiology. In this review, we summarize these data and discuss new questions and implications for autophagy during DENV replication.
Including autapomorphies is important for paleontological tip-dating with clocklike data, but not with non-clock data
Tip-dating, where fossils are included as dated terminal taxa in Bayesian dating inference, is an increasingly popular method. Data for these studies often come from morphological character matrices originally developed for non-dated, and usually parsimony, analyses. In parsimony, only shared derived characters (synapomorphies) provide grouping information, so many character matrices have an ascertainment bias: they omit autapomorphies (unique derived character states), which are considered uninformative. There has been no study of the effect of this ascertainment bias in tip-dating, but autapomorphies can be informative in model-based inference. We expected that excluding autapomorphies would shorten the morphological branch lengths of terminal branches, and thus bias downwards the time branch lengths inferred in tip-dating. We tested for this effect using a matrix for Carboniferous-Permian eureptiles where all autapomorphies had been deliberately coded. Surprisingly, date estimates are virtually unchanged when autapomorphies are excluded, although we find large changes in morphological rate estimates and small effects on topological and dating confidence. We hypothesized that the puzzling lack of effect on dating was caused by the non-clock nature of the eureptile data. We confirm this explanation by simulating strict clock and non-clock datasets, showing that autapomorphy exclusion biases dating only in the clocklike case. A theoretical solution to ascertainment bias is to compute an ascertainment bias correction (Mkparsinf); we explore this correction in detail and show that it is computationally impractical for typical datasets with many character states and taxa. Therefore we recommend that palaeontologists collect autapomorphies whenever possible when assembling character matrices.
Discovery Early Career Researcher Award (DECRA): DE150101773.
National Institute for Mathematical and Biological Synthesis (NIMBioS), an institute sponsored by the National Science Foundation, the US Department of Homeland Security, and the US Department of Agriculture through NSF Awards EF-0832858 and DBI-1300426, with additional support from The University of Tennessee, Knoxville.
NESCent.
The University of Utah.
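For readers unfamiliar with the correction discussed in the tip-dating abstract above, the sketch below gives the generic form of an ascertainment-bias-corrected likelihood. It is the standard conditioning used in Mkv-style models, written here from first principles rather than taken from the paper, so the exact Mkparsinf formulation may differ in detail.

% Likelihood of character pattern x_i under tree and model parameters \theta,
% conditioned on the character being observable (here, parsimony-informative):
P(x_i \mid \theta, \mathrm{informative})
  = \frac{P(x_i \mid \theta)}{1 - \sum_{y \in \mathcal{U}} P(y \mid \theta)}

Here \mathcal{U} denotes the set of all unobservable patterns (constant characters, autapomorphies and other parsimony-uninformative patterns). Because the number of such patterns grows rapidly with the number of taxa and character states, evaluating the denominator is what makes the correction computationally impractical for typical matrices, as the abstract notes.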
Mass Beyond Measure: Eccentric Searches for Black Hole Populations
Stellar mass binary black holes of unknown formation mechanism have been
observed, motivating new methods for distinguishing distinct black hole
populations. This work explores how the orbital eccentricity of stellar mass
binary black holes is a viable conduit for making such distinctions. Four
different production mechanisms, and their corresponding eccentricity
distributions, are studied in the context of an experimental landscape composed
of mHz (LISA), dHz (DECIGO), and Hz (LIGO) range gravitational wave detectors.
We expand on prior work considering these effects at fixed population
eccentricity. We show that a strong signal corresponding to subsets of
eccentric populations is effectively hidden from the mHz and dHz range
gravitational wave detectors without the incorporation of high eccentricity
waveform templates. Even with sufficiently large eccentricity templates, we
find dHz range experiments with a LISA-like level of sensitivity are unlikely
to aid in distinguishing different populations. We consider the degree to which
a mHz range detector like LISA can differentiate among black hole populations
independently and in concert with follow-up merger detection for binaries
coalescing within a 10 year period. We find that mHz range detectors, with only coarse
(nearly circular) eccentricity sensitivity, can successfully discern eccentric
sub-populations except when attempting to distinguish very low eccentricity
distributions. In these cases, where coarse eccentricity sensitivity is insufficient,
we find that the increase in event counts resulting from finer eccentricity sensitivity
provides a statistically significant signal for discerning even these low
eccentricity sub-populations. While the improvements offered by modest eccentricity
sensitivity can generally be increased by refining that sensitivity further, going
beyond this in eccentricity sensitivity provides negligible enhancement.
Comment: 13 pages, 10 figures
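As a purely illustrative aside, and not the statistical machinery used in the paper above, the toy sketch below shows the kind of test one might use to ask whether two assumed eccentricity populations can be told apart once a detector's eccentricity sensitivity floor is imposed; the distributions, sample sizes and floors are all invented.

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Two hypothetical low-eccentricity populations at some reference frequency.
pop_a = rng.uniform(0.0, 1e-3, size=400)   # nearly circular formation channel (assumed)
pop_b = rng.uniform(0.0, 3e-3, size=400)   # slightly more eccentric channel (assumed)

def observe(ecc, floor):
    """Eccentricities below the detector's sensitivity floor read as zero."""
    return np.where(ecc < floor, 0.0, ecc)

# With a coarse floor the two samples look identical; with a finer floor the
# populations become statistically distinguishable.
for floor in (1e-2, 1e-4):
    stat, p = ks_2samp(observe(pop_a, floor), observe(pop_b, floor))
    print(f"floor = {floor:.0e}: KS statistic = {stat:.2f}, p-value = {p:.2e}")

The toy only mirrors the qualitative claim of the abstract: below some eccentricity floor, very low eccentricity distributions cannot be separated, while finer sensitivity restores the distinction.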
The importance of good coding practices for data scientists
Many data science students and practitioners are reluctant to adopt good
coding practices as long as the code "works". However, code standards are an
important part of modern data science practice, and they play an essential role
in the development of "data acumen". Good coding practices lead to more
reliable code and often save more time than they cost, making them important
even for beginners. We believe that principled coding practices are vital for
statistics and data science. To instill these practices within academic
programs, it is important for instructors and programs to begin establishing
these practices early, to reinforce them often, and to hold themselves to a
higher standard while guiding students. We describe key aspects of coding
practices (both good and bad), focusing primarily on the R language, though
similar standards are applicable to other software environments. The lessons
are organized into a top ten list.
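To give a flavor of the contrast the abstract above describes, here is a small hedged illustration. The paper's own examples target R; this sketch uses Python and made-up data, so it is only an analogue of the kind of good-versus-bad comparison the paper draws.

# Hard to reuse or test: cryptic names, a magic number, no documentation.
x = [72.5, 69.1, 80.2]
y = sum(x) / len(x) * 2.20462
print(y)

# Clearer: a named constant and a documented, reusable, testable function.
KG_TO_LB = 2.20462

def mean_weight_lb(weights_kg):
    """Return the mean of weights_kg (kilograms), converted to pounds."""
    return sum(weights_kg) / len(weights_kg) * KG_TO_LB

print(mean_weight_lb([72.5, 69.1, 80.2]))

Both fragments compute the same number; the second is simply easier to read, test, and maintain, which is the sense in which good practices "often save more time than they cost".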
Probe Brane Dynamics and the Cosmological Constant
Recently a brane world perspective on the cosmological constant and the
hierarchy problems was presented. Here, we elaborate on some aspects of that
particular scenario and discuss the stability of the stationary brane solution
and the dynamics of a probe brane. Even though the brane is unstable under a
small perturbation from its stationary position, such instability is harmless
when the 4-D cosmological constant is very small, as is the case for our
universe. One may also introduce radion stabilizing potentials in a more
realistic scenario.
Comment: 13 pages, 1 figure, REVTeX
Mapping the Corporate Blogosphere: Linking Audience, Content, and Management to Blog Visibility
Blogs have been a common part of the Web for many years. Individuals create most blogs for their own purposes, but corporations have also begun to develop corporate blogs as a means for communicating with their stakeholders (e.g., customers, partners, investors). In this paper, we extend theory by generating what Gregor (2006) would call a type I theory. Specifically, we develop a theoretical framework for classifying and analyzing corporate blogs that examines blogs’ target audience, their content (focus and function), and how one should manage them. We use this framework to analyze the impact of these characteristics on the visibility of blogs operated by a sample of Fortune 500 companies. Our results show that a blog’s target audience and how its content and management fit with this audience can have significant impacts on blog visibility. We believe this framework provides a useful foundation for studying corporate blogs in the future.
Canagliflozin lowers blood sugar, but does it also lower cardiovascular risk? Maybe not
For the last 25 years it has been widely accepted that diabetes mellitus is associated with a twofold or greater risk of clinical atherosclerotic disease. Long-standing elevated blood sugar levels, as measured by the hemoglobin A1c level, have been shown to be independent of major cardiovascular risk factors, including age, body mass index, systolic blood pressure, serum cholesterol, cigarette smoking, and history of cardiovascular disease.
Differential roles of Aβ42/40, p-tau231 and p-tau217 for Alzheimer's trial selection and disease monitoring
Blood biomarkers indicative of Alzheimer's disease (AD) pathology are altered in both preclinical and symptomatic stages of the disease. Distinctive biomarkers may be optimal for the identification of AD pathology or monitoring of disease progression. Blood biomarkers that correlate with changes in cognition and atrophy during the course of the disease could be used in clinical trials to identify successful interventions and thereby accelerate the development of efficient therapies. When disease-modifying treatments become approved for use, efficient blood-based biomarkers might also inform on treatment implementation and management in clinical practice. In the BioFINDER-1 cohort, plasma phosphorylated (p)-tau231 and amyloid-β42/40 ratio were more changed at lower thresholds of amyloid pathology. Longitudinally, however, only p-tau217 demonstrated marked amyloid-dependent changes over 4-6 years in both preclinical and symptomatic stages of the disease, with no such changes observed in p-tau231, p-tau181, amyloid-β42/40, glial fibrillary acidic protein or neurofilament light. Only longitudinal increases of p-tau217 were also associated with clinical deterioration and brain atrophy in preclinical AD. The selective longitudinal increase of p-tau217 and its associations with cognitive decline and atrophy were confirmed in an independent cohort (Wisconsin Registry for Alzheimer's Prevention). These findings support the differential association of plasma biomarkers with disease development and strongly highlight p-tau217 as a surrogate marker of disease progression in preclinical and prodromal AD, with impact for the development of new disease-modifying treatments.
High temperature nanoindentation up to 800°C for characterizing high temperature properties of materials
One of the primary motivations for the development of instrumented indentation was to measure the mechanical properties of thin films. Characterization of thin film mechanical properties as a function of temperature is of immense industrial and scientific interest. The major bottlenecks in variable temperature measurements have been thermal drift, signal stability (noise) and oxidation of, or condensation on, the surfaces. Thermal drift is a measurement artifact that arises due to thermal expansion/contraction of the indenter tip and loading column. This drift is superimposed on the mechanical behavior data, precluding accurate extraction of the mechanical properties of the sample at elevated temperatures. Vacuum is essential to prevent sample/tip oxidation at elevated temperatures.
In this poster, the design and development of a novel nanoindentation system that can perform reliable load-displacement measurements over a wide temperature range (from -150 to 800 °C) will be presented, emphasizing the procedures and techniques for carrying out accurate nanomechanical measurements. This system is based on the Ultra Nanoindentation Tester (UNHT), which utilizes an active surface referencing technique comprising two independent axes, one for surface referencing and another for indentation. The differential depth measurement technology results in negligible compliance of the system and very low thermal drift rates at high temperatures. The sample, indenter and reference tip are heated/cooled separately and their surface temperatures matched to obtain drift rates as low as 1 nm/min at 800 °C without correction. Instrumentation development, system characterization, experimental protocols, operational refinements and thermal drift characteristics over the temperature range will be presented, together with a range of results on different materials.
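For context on why thermal drift matters, the sketch below shows a generic linear drift correction of the sort commonly applied to elevated-temperature indentation data: a drift rate is estimated from the tail of a constant-load hold and subtracted from the displacement record. The numbers are invented, and this is not the UNHT's differential surface-referencing scheme described above, which is designed to avoid the need for such a correction.

import numpy as np

# Synthetic displacement record (nm) during a 60 s constant-load hold,
# contaminated by a steady thermal drift. All values are made up.
time_s = np.linspace(0.0, 60.0, 121)
true_creep = 2.0 * np.log1p(time_s)     # material response we want to keep
drift_rate = 0.5                        # assumed thermal drift, nm/s
measured = true_creep + drift_rate * time_s

# Estimate the drift rate from the end of the hold, where creep has slowed,
# then remove the linear drift from the whole record.
tail = time_s > 40.0
est_rate = np.polyfit(time_s[tail], measured[tail], 1)[0]
corrected = measured - est_rate * time_s

# The estimate carries a small residual creep contribution, which is one
# reason hardware-level drift suppression is preferable to post-hoc correction.
print(f"estimated drift rate: {est_rate:.2f} nm/s (true drift 0.50 nm/s)")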