760 research outputs found
A methodology for measuring the sustainability of car transport systems
Measuring the sustainability of car fleets, an important task in developing transport policy, can be accomplished with an appropriate set of indicators. We applied the Process Analysis Method of sustainability assessment to generate an indicator set in a systematic and transparent way that is consistent with a declared definition of a sustainable transport system. Our method identifies stakeholder groups, the full range of impacts across the environmental, economic and human/social domains of sustainability, and those who generate and receive those impacts. The analysis shows that car users have dual roles, both as individual makers of decisions and as beneficiaries/sufferers of the impacts resulting from communal choice. Thus car users, through their experience of service quality, are a potential force for system change. Our method addresses many of the well-known flaws in measuring transport sustainability. The indicator set created is independent of national characteristics and will be useful to transport policy practitioners and sustainable mobility researchers globally. © 2013 Elsevier Ltd
Dogs as Sources and Sentinels of Parasites in Humans and Wildlife, Northern Canada
A minimum of 11 genera of parasites, including 7 known or suspected to cause zoonoses, were detected in dogs in 2 northern Canadian communities. Dogs in remote settlements receive minimal veterinary care and may serve as sources and sentinels for parasites in persons and wildlife, and as parasite bridges between wildlife and humans.
Generalized pricing formulas for stochastic volatility jump diffusion models applied to the exponential Vasicek model
Path integral techniques for the pricing of financial options are mostly based on models that can be recast in terms of a Fokker-Planck differential equation and that, consequently, neglect jumps and only describe drift and diffusion. We present a method to adapt formulas for both the path-integral propagators and the option prices themselves, so that jump processes are taken into account in conjunction with the usual drift and diffusion terms. In particular, we focus on stochastic volatility models, such as the exponential Vasicek model, and extend the pricing formulas and propagator of this model to incorporate jump diffusion with a given jump size distribution. This model is of importance to include non-Gaussian fluctuations beyond the Black-Scholes model, and moreover yields a lognormal distribution of the volatilities, in agreement with results from superstatistical analysis. The results obtained in the present formalism are checked with Monte Carlo simulations.
Comment: 9 pages, 2 figures, 1 table
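The closing remark, that the pricing formulas are checked against Monte Carlo simulations, can be illustrated with a minimal Monte Carlo pricer for a European call under a Merton-style jump diffusion. This is a sketch only: it uses constant volatility rather than the paper's exponential-Vasicek stochastic volatility, and all parameter names and values are illustrative, not taken from the paper.

```python
import numpy as np

def mc_call_jump_diffusion(S0, K, T, r, sigma,
                           jump_rate, jump_mean, jump_std,
                           n_paths=200_000, seed=0):
    """Monte Carlo price of a European call under a Merton-style
    jump diffusion: geometric Brownian motion plus compound-Poisson
    lognormal jumps. Illustrative sketch, not the paper's model."""
    rng = np.random.default_rng(seed)
    # Martingale correction so the discounted asset price has drift r.
    kappa = np.exp(jump_mean + 0.5 * jump_std**2) - 1.0
    drift = (r - 0.5 * sigma**2 - jump_rate * kappa) * T
    z = rng.standard_normal(n_paths)
    n_jumps = rng.poisson(jump_rate * T, n_paths)
    # Sum of n_jumps normal log-jump sizes, drawn in one shot per path.
    jump_sum = (jump_mean * n_jumps
                + jump_std * np.sqrt(n_jumps) * rng.standard_normal(n_paths))
    ST = S0 * np.exp(drift + sigma * np.sqrt(T) * z + jump_sum)
    payoff = np.maximum(ST - K, 0.0)
    return np.exp(-r * T) * payoff.mean()
```

With the jump intensity set to zero the estimator reduces to plain Black-Scholes dynamics, which gives a convenient sanity check against the closed-form price.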
Provably Secure Double-Block-Length Hash Functions in a Black-Box Model
In CRYPTO '89, Merkle presented three double-block-length hash functions based on DES. They are optimally collision resistant in a black-box model, that is, the time complexity of any collision-finding algorithm for them is Ω(2^(l/2)) if DES is a random block cipher, where l is the output length. Their drawback is that their rates are low. In this article, new double-block-length hash functions with higher rates are presented which are also optimally collision resistant in the black-box model. They are composed of block ciphers whose key length is twice their block length.
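The "rate" discussed above is, in the usual convention, the number of message bits processed per block-cipher invocation, normalized by the cipher's block length. A trivial sketch of that bookkeeping (the function name and example figures are illustrative, not taken from the paper):

```python
def hash_rate(message_bits_per_round, block_len_bits, cipher_calls_per_round):
    """Rate of a block-cipher-based hash function: message bits absorbed
    per round, divided by (block length * cipher calls per round)."""
    return message_bits_per_round / (block_len_bits * cipher_calls_per_round)

# A single-block-length scheme absorbing one n-bit block per cipher call
# has rate 1; a double-block-length scheme that spends two cipher calls
# on the same n-bit block only achieves rate 1/2.
rate_sbl = hash_rate(128, 128, 1)  # 1.0
rate_dbl = hash_rate(128, 128, 2)  # 0.5
```

Higher-rate double-block-length constructions, as in the article, absorb more message bits per pair of cipher calls while keeping the 2n-bit chaining state.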
Hot Spots and Transition from d-Wave to Another Pairing Symmetry in the Electron-Doped Cuprate Superconductors
We present a simple theoretical explanation for a transition from d-wave to another superconducting pairing observed in the electron-doped cuprates. The d_{x^2-y^2} pairing potential Delta, which has the maximal magnitude and opposite signs at the hot spots on the Fermi surface, becomes suppressed with the increase of electron doping, because the hot spots approach the Brillouin zone diagonals, where Delta vanishes. Then, the d_{x^2-y^2} pairing is replaced by either singlet s-wave or triplet p-wave pairing. We argue in favor of the latter and discuss experiments to uncover it.
Comment: 6 pages, 4 figures, RevTeX 4. V.2: Extra figure and many references added. V.3: Minor update of references for the proof.
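The suppression mechanism can be made concrete with the standard d_{x^2-y^2} form factor Delta(k) ∝ (cos kx − cos ky)/2 on a square lattice: it is maximal at the antinodes and vanishes on the zone diagonals kx = ±ky. A minimal sketch (the path along which the hot spots move is illustrative, not the paper's actual Fermi surface):

```python
import numpy as np

def d_wave_gap(kx, ky, delta0=1.0):
    """d_{x^2-y^2} gap on a square lattice (lattice constant a = 1);
    delta0 is an illustrative overall scale, not a fitted value."""
    return 0.5 * delta0 * (np.cos(kx) - np.cos(ky))

# Slide a point from an antinodal position (pi, 0) toward the
# Brillouin-zone diagonal at (pi/2, pi/2), mimicking hot spots
# moving toward the diagonal as electron doping increases.
ts = np.linspace(0.0, 1.0, 5)
kx = (1.0 - ts) * np.pi + ts * (np.pi / 2.0)
ky = ts * (np.pi / 2.0)
gaps = np.abs(d_wave_gap(kx, ky))
# |Delta| shrinks monotonically and vanishes on the diagonal,
# which is the suppression invoked in the abstract.
```

The sign change of Delta between hot spots related by kx <-> ky reflects the opposite-sign property the abstract mentions.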
Brookhaven Fastbus/Unibus Interface
A typical high energy physics experiment requires both a high-speed data acquisition and processing system for data collection and reduction, and a general-purpose computer to handle further reduction, bookkeeping and mass storage. Broad differences in architecture, format or technology will often exist between these two systems, and interface design can become a formidable task. The PDP-11 series minicomputer is widely used in physics research, and the Brookhaven FASTBUS is the only standard high-speed data acquisition system which is fully implemented in a current high energy physics experiment. This paper will describe the design and operation of an interface between these two systems. The major issues are elucidated by a preliminary discussion of the basic principles of bus systems, and their application to Brookhaven FASTBUS and UNIBUS.
Come back Marshall, all is forgiven? : Complexity, evolution, mathematics and Marshallian exceptionalism
Marshall was the great synthesiser of neoclassical economics. Yet with his qualified assumption of self-interest, his emphasis on variation in economic evolution and his cautious attitude to the use of mathematics, Marshall differs fundamentally from other leading neoclassical contemporaries. Metaphors inspire more specific analogies and ontological assumptions, and Marshall used the guiding metaphor of Spencerian evolution. But unfortunately, the further development of a Marshallian evolutionary approach was undermined in part by theoretical problems within Spencer's theory. Yet some things can be salvaged from the Marshallian evolutionary vision. They may even be placed in a more viable Darwinian framework.
Brookhaven Segment Interconnect
We have performed a high energy physics experiment using a multisegment Brookhaven FASTBUS system. The system was composed of three crate segments and two cable segments. We discuss the segment interconnect module, which permits communication between the various segments.
The Medieval Climate Anomaly and Little Ice Age in Chesapeake Bay and the North Atlantic Ocean
This paper is not subject to U.S. copyright. The definitive version was published in Palaeogeography, Palaeoclimatology, Palaeoecology 297 (2010): 299-310, doi:10.1016/j.palaeo.2010.08.009. A new 2400-year paleoclimate reconstruction from Chesapeake Bay (CB) (eastern US) was compared to other paleoclimate records in the North Atlantic region to evaluate climate variability during the Medieval Climate Anomaly (MCA) and Little Ice Age (LIA). Using Mg/Ca ratios from ostracodes and oxygen isotopes from benthic foraminifera as proxies for temperature and precipitation-driven estuarine hydrography, results show that warmest temperatures in CB reached 16–17 °C between 600 and 950 CE (Common Era), centuries before the classic European Medieval Warm Period (950–1100 CE) and peak warming in the Nordic Seas (1000–1400 CE). A series of centennial warm/cool cycles began about 1000 CE with temperature minima of ~8 to 9 °C about 1150, 1350, and 1650–1800 CE, and intervening warm periods (14–15 °C) centered at 1200, 1400, 1500 and 1600 CE. Precipitation variability in the eastern US included multiple dry intervals from 600 to 1200 CE, which contrasts with wet medieval conditions in the Caribbean. The eastern US experienced a wet LIA between 1650 and 1800 CE when the Caribbean was relatively dry. Comparison of the CB record with other records shows that the MCA and LIA were characterized by regionally asynchronous warming and complex spatial patterns of precipitation, possibly related to ocean–atmosphere processes.
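The Mg/Ca paleothermometry used above rests on a species-specific calibration mapping shell Mg/Ca to water temperature. A minimal sketch of such a linear calibration; the coefficients a and b here are placeholders for illustration only and are not the calibration used in the paper:

```python
def mgca_to_temperature(mgca_mmol_mol, a=0.9, b=0.06):
    """Linear Mg/Ca paleothermometer: T = (Mg/Ca - a) / b.
    a (intercept, mmol/mol) and b (slope, mmol/mol per degree C)
    are hypothetical placeholder values; real ostracode calibrations
    are species-specific and determined empirically."""
    return (mgca_mmol_mol - a) / b
```

In practice a downcore series of Mg/Ca measurements would be passed through such a calibration to produce the temperature reconstruction that is then compared with other North Atlantic records.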
Measurement of the polarisation of W bosons produced with large transverse momentum in pp collisions at sqrt(s) = 7 TeV with the ATLAS experiment
This paper describes an analysis of the angular distribution of W->enu and W->munu decays, using data from pp collisions at sqrt(s) = 7 TeV recorded with the ATLAS detector at the LHC in 2010, corresponding to an integrated luminosity of about 35 pb^-1. Using the decay lepton transverse momentum and the missing transverse energy, the W decay angular distribution projected onto the transverse plane is obtained and analysed in terms of helicity fractions f0, fL and fR over two ranges of W transverse momentum (ptw): 35 < ptw < 50 GeV and ptw > 50 GeV. Good agreement is found with theoretical predictions. For ptw > 50 GeV, the values of f0 and fL-fR, averaged over charge and lepton flavour, are measured to be: f0 = 0.127 +/- 0.030 +/- 0.108 and fL-fR = 0.252 +/- 0.017 +/- 0.030, where the first uncertainties are statistical, and the second include all systematic effects.
Comment: 19 pages plus author list (34 pages total), 9 figures, 11 tables, revised author list, matches European Physical Journal C version
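Because the three helicity fractions are normalized, f0 + fL + fR = 1, the pair (f0, fL - fR) quoted in the abstract fully determines fL and fR individually. A minimal sketch of that algebra, applied to the central values quoted above (uncertainties ignored):

```python
def helicity_fractions(f0, fl_minus_fr):
    """Recover fL and fR from f0 and (fL - fR), using the
    normalization constraint f0 + fL + fR = 1."""
    fl = 0.5 * (1.0 - f0 + fl_minus_fr)
    fr = 0.5 * (1.0 - f0 - fl_minus_fr)
    return fl, fr

# Central values for ptw > 50 GeV from the abstract:
fl, fr = helicity_fractions(0.127, 0.252)
```

The recovered fL and fR satisfy both the normalization and the quoted difference by construction; propagating the quoted uncertainties would require the covariance between f0 and fL - fR, which the abstract does not give.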