
    A Field Range Bound for General Single-Field Inflation

    We explore the consequences of a detection of primordial tensor fluctuations for general single-field models of inflation. Using the effective theory of inflation, we propose a generalization of the Lyth bound. Our bound applies to all single-field models with two-derivative kinetic terms for the scalar fluctuations and is always stronger than the corresponding bound for slow-roll models. This shows that non-trivial dynamics cannot evade the Lyth bound. We also present a weaker, but completely universal, bound that holds whenever the Null Energy Condition (NEC) is satisfied at horizon crossing. Comment: 16 pages
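
    For context, the classic slow-roll form of the Lyth bound that this abstract generalizes can be stated as follows (this is the standard textbook result, not the paper's stronger bound):

        \frac{\Delta\phi}{M_{\rm Pl}} \gtrsim N_* \sqrt{\frac{r}{8}}

    where \Delta\phi is the inflaton field excursion, r the tensor-to-scalar ratio, and N_* the number of e-folds over which r remains roughly constant.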

    Market Efficiency after the Financial Crisis: It's Still a Matter of Information Costs

    Compared to the worldwide financial carnage that followed the Subprime Crisis of 2007-2008, it may seem of small consequence that it is also said to have demonstrated the bankruptcy of an academic financial institution: the Efficient Capital Market Hypothesis ("ECMH"). Two things make this encounter between theory and seemingly inconvenient facts of consequence. First, the ECMH had moved beyond academia, fueling decades of a deregulatory agenda. Second, when economic theory moves from academics to policy, it also enters the realm of politics, and is inevitably refashioned to serve the goals of political argument. This happened starkly with the ECMH. It was subject to its own bubble: as a result of politics, it expanded from a narrow but important academic theory about the informational underpinnings of market prices to a broad ideological preference for market outcomes over even measured regulation. In this Article we examine the Subprime Crisis as a vehicle to return the ECMH to its information cost roots, which support a more modest but sensible regulatory policy. In particular, we argue that the ECMH addresses informational efficiency, which is a relative, not an absolute, measure. This focus on informational efficiency leads to a sharper understanding of what went wrong in 2007-2008. Yet informational efficiency is related to fundamental efficiency: if all information relevant to determining a security's fundamental value is publicly available and the mechanisms by which that information comes to be reflected in the securities market price operate without friction, fundamental and informational efficiency coincide. But where all value-relevant information is not publicly available and/or the mechanisms of market efficiency operate with frictions, the coincidence is an empirical question, both as to the informational efficiency of prices and as to their relation to fundamental value. Properly framing market efficiency focuses our attention on the frictions that drive a wedge between relative efficiency and efficiency under perfect market conditions. So framed, relative efficiency is a diagnostic tool that identifies the information costs and structural barriers that reduce price efficiency, which, in turn, provides part of a realistic regulatory strategy. While it will not prevent future crises, improving the mechanisms of market efficiency will make prices more efficient, frictions more transparent, and the influence of politics on public agencies more observable, which may allow us to catch the next problem earlier. Recall that on September 8, 2008, the Congressional Budget Office publicly stated its uncertainty about whether there would be a recession and predicted 1.5 percent growth in 2009. Eight days later, Lehman Brothers had failed, and AIG was being nationalized.

    Scale-Invariance and the Strong Coupling Problem

    The effective theory of adiabatic fluctuations around arbitrary Friedmann-Robertson-Walker backgrounds (both expanding and contracting) allows for more than one way to obtain scale-invariant two-point correlations. However, as we show in this paper, it is challenging to produce scale-invariant fluctuations that are weakly coupled over the range of wavelengths accessible to cosmological observations. In particular, requiring the background to be a dynamical attractor, the curvature fluctuations are scale-invariant and weakly coupled for at least 10 e-folds only if the background is close to de Sitter space. In this case, the time-translation invariance of the background guarantees time-independent n-point functions. For non-attractor solutions, any predictions depend on assumptions about the evolution of the background even when the perturbations are outside of the horizon. For the simplest such scenario we identify the regions of the parameter space that avoid both classical and quantum-mechanical strong coupling problems. Finally, we present extensions of our results to backgrounds in which higher-derivative terms play a significant role. Comment: 17 pages + appendices, 3 figures; v2: typos fixed
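
    For reference, the scale invariance discussed above is conventionally phrased in terms of the dimensionless power spectrum of the curvature perturbation \zeta (standard definitions, not notation specific to this paper):

        \Delta^2_\zeta(k) \equiv \frac{k^3}{2\pi^2} P_\zeta(k) \propto k^{\,n_s - 1}

    with exact scale invariance corresponding to a spectral index n_s = 1.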

    Circular Dichroism in Atomic Resonance-Enhanced Few-Photon Ionization

    We investigate few-photon ionization of lithium atoms prepared in the polarized 2p(mℓ = +1) state when subjected to femtosecond light pulses with left- or right-handed circular polarization at wavelengths between 665 and 920 nm. We consider whether ionization proceeds more favorably for the electric field co- or counter-rotating with the initial electronic current density. Strong asymmetries are found and quantitatively analyzed in terms of circular dichroism (CD). While the intensity dependence of the measured CD values is rather weak throughout the investigated regime, a very strong sensitivity to the center wavelength of the incoming radiation is observed. While the co-rotating situation prevails overall, the counter-rotating geometry is strongly favored around 800 nm due to the 2p-3s resonant transition, which can only be driven by counter-rotating fields. The observed features provide insights into the helicity dependence of light-atom interactions and into the possible control of electron emission in atomic few-photon ionization by polarization-selective resonance enhancement.
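
    The circular dichroism quantified above is conventionally defined as the normalized asymmetry between the ionization yields Y_+ and Y_- obtained with the two circular polarizations (this is the standard definition; the paper's exact sign convention may differ):

        \mathrm{CD} = \frac{Y_+ - Y_-}{Y_+ + Y_-}

    so that the sign of CD indicates which helicity, co- or counter-rotating with the initial electron current, is favored.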

    Validity of Robot-based Assessments of Upper Extremity Function

    Objective To examine the validity of 5 robot-based assessments of arm motor function post-stroke. Design Cross-sectional. Setting Outpatient clinical research center. Participants Volunteer sample of 40 participants, age >18 years, 3-6 months post-stroke, with arm motor deficits that had plateaued. Intervention None. Main Outcome Measures Clinical standards included the Fugl-Meyer Arm Motor Scale (FMA) and 5 secondary motor outcomes: the hand/wrist subsection of the FMA; the Action Research Arm Test (ART); the Box & Blocks test (B/B); the hand subscale of the Stroke Impact Scale-2 (SIS); and the Barthel Index (BI). Robot-based assessments included: wrist targeting; finger targeting; finger movement speed; reaction time; and a robotic version of the B/B test. Anatomical measures included percentage injury to the corticospinal tract (CST) and the primary motor cortex (M1, hand region) obtained from MRI. Results Subjects had moderate-severe impairment (arm FMA scores = 35.6±14.4, range 13.5-60). Performance on the robot-based tests, including speed (r=0.82, p<0.0001), wrist targeting (r=0.72, p<0.0001), and finger targeting (r=0.67, p<0.0001), correlated significantly with FMA scores. Wrist targeting (r=0.57-0.82) and finger targeting (r=0.49-0.68) correlated significantly with all 5 secondary motor outcomes and with percent CST injury. The robotic version of the B/B test correlated significantly with the clinical B/B test but was less prone to a floor effect. Robot-based assessments were comparable to the FMA score in relation to percent CST injury and superior in relation to M1 hand injury. Conclusions The current findings support using a battery of robot-based methods for assessing upper extremity motor function in subjects with chronic stroke.
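
    A minimal sketch of the kind of validity analysis reported above, correlating a robot-based measure with the clinical FMA score via Pearson's r; the score arrays below are made-up placeholders for illustration, not the study's data:

        # Hypothetical illustration of the reported correlation analysis;
        # the score arrays are invented placeholders, not study data.
        import numpy as np
        from scipy.stats import pearsonr

        fma = np.array([13.5, 22.0, 35.0, 41.5, 52.0, 60.0])    # Fugl-Meyer arm motor scores
        speed = np.array([0.11, 0.19, 0.32, 0.38, 0.51, 0.58])  # robot-measured finger movement speed

        r, p = pearsonr(fma, speed)  # Pearson correlation coefficient and two-sided p-value
        print(f"r = {r:.2f}, p = {p:.4g}")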