Deterministic reaction models with power-law forces
We study a one-dimensional particle system, in the overdamped limit, where
nearest-neighbour particles attract with a force inversely proportional to
a power of their distance and coalesce upon encounter. The detailed shape of the
distribution function for the gap between neighbouring particles serves to
discriminate between different laws of attraction. We develop an exact
Fokker-Planck approach for the infinite hierarchy of distribution functions for
multiple adjacent gaps and solve it exactly, at the mean-field level, where
correlations are ignored. The crucial role of correlations and their effect on
the gap distribution function is explored both numerically and analytically.
Finally, we analyse a random input of particles, which results in a
stationary state where the effect of correlations is largely diminished.
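The coalescence dynamics described above can be sketched numerically. This is a minimal illustration under stated assumptions, not the authors' method: the forward-Euler step, the clipping of the power-law force at small gaps, and the merge rule (dropping the right-hand member of any pair whose gap closes) are all simplifications chosen to keep the sketch stable.

```python
import numpy as np

def simulate_coalescence(n=200, exponent=1.0, dt=1e-6, steps=2000,
                         merge_gap=1e-3, seed=0):
    """Overdamped 1D particles: nearest neighbours attract with a force
    ~ 1/gap**exponent and coalesce when their gap closes."""
    rng = np.random.default_rng(seed)
    x = np.sort(rng.uniform(0.0, 1.0, n))
    for _ in range(steps):
        gaps = np.diff(x)
        # clip tiny gaps so the power-law force stays finite in this sketch
        f = np.clip(gaps, merge_gap, None) ** (-exponent)
        v = np.zeros_like(x)
        v[:-1] += f   # each particle pulled toward its right neighbour
        v[1:] -= f    # and toward its left neighbour
        x = x + dt * v  # overdamped: velocity proportional to net force
        # coalesce: drop the right-hand particle of any pair closer than merge_gap
        keep = np.concatenate(([True], np.diff(x) > merge_gap))
        x = x[keep]
        if x.size < 3:
            break
    return x

final = simulate_coalescence()
```

Collecting `np.diff(final)` over many runs gives an empirical gap distribution, whose shape (per the abstract) discriminates between different force exponents.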
Market Efficiency after the Financial Crisis: It's Still a Matter of Information Costs
Compared to the worldwide financial carnage that followed the Subprime Crisis of 2007-2008, it may seem of small consequence that it is also said to have demonstrated the bankruptcy of an academic financial institution: the Efficient Capital Market Hypothesis (“ECMH”). Two things make this encounter between theory and seemingly inconvenient facts of consequence. First, the ECMH had moved beyond academia, fueling decades of a deregulatory agenda. Second, when economic theory moves from academics to policy, it also enters the realm of politics, and is inevitably refashioned to serve the goals of political argument. This happened starkly with the ECMH. It was subject to its own bubble – as a result of politics, it expanded from a narrow but important academic theory about the informational underpinnings of market prices to a broad ideological preference for market outcomes over even measured regulation. In this Article we examine the Subprime Crisis as a vehicle to return the ECMH to its information cost roots that support a more modest but sensible regulatory policy. In particular, we argue that the ECMH addresses informational efficiency, which is a relative, not an absolute measure. This focus on informational efficiency leads to a more focused understanding of what went wrong in 2007-2008. Yet informational efficiency is related to fundamental efficiency – if all information relevant to determining a security’s fundamental value is publicly available and the mechanisms by which that information comes to be reflected in the securities market price operate without friction, fundamental and informational efficiency coincide. But where all value relevant information is not publicly available and/or the mechanisms of market efficiency operate with frictions, the coincidence is an empirical question both as to the information efficiency of prices and their relation to fundamental value. 
Properly framing market efficiency focuses our attention on the frictions that drive a wedge between relative efficiency and efficiency under perfect market conditions. So framed, relative efficiency is a diagnostic tool that identifies the information costs and structural barriers that reduce price efficiency which, in turn, provides part of a realistic regulatory strategy. While it will not prevent future crises, improving the mechanisms of market efficiency will make prices more efficient, frictions more transparent, and the influence of politics on public agencies more observable, which may allow us to catch the next problem earlier. Recall that on September 8, 2008, the Congressional Budget Office publicly stated its uncertainty about whether there would be a recession and predicted 1.5 percent growth in 2009. Eight days later, Lehman Brothers had failed, and AIG was being nationalized.
Sterile neutrino production via active-sterile oscillations: the quantum Zeno effect
We study several aspects of the kinetic approach to sterile neutrino
production via active-sterile mixing. We obtain the neutrino propagator in
the medium, including self-energy corrections, from which we extract the
dispersion relations and damping rates of the propagating modes. The
dispersion relations are the usual ones in terms of the index of refraction
in the medium, and the damping rates are given in terms of the active
neutrino scattering rate and the mixing angle in the medium. We provide a
generalization of the transition probability in the medium from expectation
values in the density matrix, and study the conditions for its quantum Zeno
suppression directly in real time. We find the general conditions for
quantum Zeno suppression, which for sterile neutrinos may only be fulfilled
near an MSW resonance. We discuss the implications for sterile neutrino
production and argue that in the early Universe the wide separation of
relaxation scales far away from MSW resonances suggests the breakdown of
the current kinetic approach. Comment: version to appear in JHEP
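The quantum Zeno suppression discussed above can be illustrated with a generic two-level toy model: an active-sterile doublet whose flavour coherence dephases at a rate `gamma`, standing in for the active scattering rate. This is a hedged sketch, not the paper's kinetic framework; the Lindblad dephasing form, the forward-Euler integrator, and all parameter values are assumptions for illustration. When `gamma` far exceeds the oscillation frequency, the sterile population at fixed time is suppressed.

```python
import numpy as np

def sterile_population(gamma, delta_e=1.0, sin_2theta=0.8,
                       t_final=5.0, dt=1e-3):
    """Evolve a two-level active-sterile density matrix with dephasing
    of the flavour coherence at rate `gamma`; return P(active->sterile)
    at t_final."""
    cos_2theta = np.sqrt(1.0 - sin_2theta**2)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    # two-level mixing Hamiltonian in the flavour basis
    H = 0.5 * delta_e * (-cos_2theta * sz + sin_2theta * sx)
    rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start fully active
    for _ in range(int(t_final / dt)):
        comm = H @ rho - rho @ H
        # Lindblad dephasing: damps off-diagonal (coherence) terms at rate gamma
        dephase = 0.5 * gamma * (sz @ rho @ sz - rho)
        rho = rho + dt * (-1j * comm + dephase)
    return rho[1, 1].real

p_weak = sterile_population(gamma=0.1)    # damping << oscillation frequency
p_strong = sterile_population(gamma=50.0)  # damping >> oscillation frequency
```

With strong dephasing the coherent build-up of the sterile amplitude is repeatedly interrupted, so `p_strong` comes out well below `p_weak`: the real-time face of the Zeno suppression analysed in the abstract.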
Climate Change Meets the Law of the Horse
The climate change policy debate has only recently turned its full attention to adaptation - how to address the impacts of climate change we have already begun to experience and that will likely increase over time. Legal scholars have in turn begun to explore how the many different fields of law will and should respond. During this nascent period, one overarching question has gone unexamined: how will the legal system as a whole organize around climate change adaptation? Will a new distinct field of climate change adaptation law and policy emerge, or will legal institutions simply work away at the problem through unrelated, duly self-contained fields, as in the famous Law of the Horse? This Article is the first to examine that question comprehensively, to move beyond thinking about the law and climate change adaptation to consider the law of climate change adaptation. Part I of the Article lays out our methodological premises and approach. Using a model we call Stationarity Assessment, Part I explores how legal fields are structured and sustained based on assumptions about the variability of natural, social, and economic conditions, and how disruptions to that regime of variability can lead to the emergence of new fields of law and policy. Case studies of environmental law and environmental justice demonstrate the model’s predictive power for the formation of new distinct legal regimes. Part II applies the Stationarity Assessment model to the topic of climate change adaptation, using a case study of a hypothetical coastal region and the potential for climate change impacts to disrupt relevant legal doctrines and institutions. We find that most fields of law appear capable of adapting effectively to climate change. 
In other words, without some active intervention, we expect the law and policy of climate change adaptation to follow the path of the Law of the Horse - a collection of fields independently adapting to climate change - rather than organically coalescing into a new distinct field. Part III explores why, notwithstanding this conclusion, it may still be desirable to seek a different trajectory. Focusing on the likelihood of systemic adaptation decisions with perverse, harmful results, we identify the potential benefits offered by intervening to shape a new and distinct field of climate change adaptation law and policy. Part IV then identifies the contours of such a field, exploring the distinct purposes of reducing vulnerability, ensuring resiliency, and safeguarding equity. These features provide the normative policy components for a law of climate change adaptation that would be more than just a Law of the Horse. This new field would not replace or supplant any existing field, however, as environmental law did with regard to nuisance law, and it would not be dominated by substantive doctrine. Rather, like the field of environmental justice, this new legal regime would serve as a holistic overlay across other fields to ensure more efficient, effective, and just climate change adaptation solutions.