2,913 research outputs found

    The dynamical stability of the static real scalar field solutions to the Einstein-Klein-Gordon equations revisited

    We re-examine the dynamical stability of the nakedly singular, static, spherically symmetric solutions of the Einstein-Klein-Gordon system. We correct an earlier proof of the instability of these solutions and demonstrate that there are solutions to the massive Klein-Gordon system that are perturbatively stable. Comment: 13 pages, uses Elsevier style files. To appear in Phys. Lett.

    Critical collapse of collisionless matter - a numerical investigation

    In recent years the threshold of black hole formation in spherically symmetric gravitational collapse has been studied for a variety of matter models. In this paper the corresponding issue is investigated for a matter model significantly different from those considered so far in this context. We study the transition from dispersion to black hole formation in the collapse of collisionless matter when the initial data is scaled. This is done by means of a numerical code similar to those commonly used in plasma physics. The result is that for the initial data for which the solutions were computed, most of the matter falls into the black hole whenever a black hole is formed. This results in a discontinuity in the mass of the black hole at the onset of black hole formation. Comment: 22 pages, LaTeX, 7 figures (ps-files, automatically included using psfig)
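
The scaling study above amounts to locating a threshold amplitude separating dispersing from collapsing initial data. A minimal sketch of the standard bisection search over the scaling parameter, with a hypothetical `evolves_to_black_hole` predicate standing in for a full Vlasov-Einstein evolution code:

```python
def find_critical_amplitude(evolves_to_black_hole, a_lo=0.0, a_hi=10.0, tol=1e-10):
    """Bisect on the scaling amplitude A of the initial data:
    data below the threshold disperse, data above it collapse."""
    # Bisection needs a dispersing lower bracket and a collapsing upper bracket.
    assert not evolves_to_black_hole(a_lo) and evolves_to_black_hole(a_hi)
    while a_hi - a_lo > tol:
        mid = 0.5 * (a_lo + a_hi)
        if evolves_to_black_hole(mid):
            a_hi = mid   # collapse: threshold lies below mid
        else:
            a_lo = mid   # dispersion: threshold lies above mid
    return 0.5 * (a_lo + a_hi)

# Toy stand-in for an evolution code: pretend collapse occurs iff A > 3.7.
a_star = find_critical_amplitude(lambda a: a > 3.7)
```

Each evolution here is one call to the predicate; in practice each call is a full numerical evolution of the scaled data, which is why such threshold searches are expensive.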

    Smooth Inequalities and Equilibrium Inefficiency in Scheduling Games

    We study coordination mechanisms for Scheduling Games (with unrelated machines). In these games, each job represents a player, who needs to choose a machine for its execution and aims to complete as early as possible. Our goal is to design scheduling policies that always admit a pure Nash equilibrium and guarantee a small price of anarchy for the l_k-norm social cost: this objective balances overall quality of service and fairness. We consider policies with different amounts of knowledge about jobs: non-clairvoyant, strongly-local and local. The analysis relies on a smoothness argument together with suitable inequalities, called smooth inequalities. With this unified framework, we are able to prove the following results. First, we study the inefficiency in l_k-norm social costs of a strongly-local policy SPT and a non-clairvoyant policy EQUI. We show that the price of anarchy of policy SPT is O(k). We also prove a lower bound of Omega(k/log k) for all deterministic, non-preemptive, strongly-local and non-waiting policies (non-waiting policies produce schedules without idle times). These results show that SPT is close to optimal with respect to the class of l_k-norm social costs. Moreover, we prove that the non-clairvoyant policy EQUI has price of anarchy O(2^k). Second, we consider the makespan (l_infty-norm) social cost by making a connection with the l_k-norm functions. We revisit some local policies and provide simpler, unified proofs from the framework's point of view. As a highlight of the approach, we derive a local policy Balance. This policy guarantees a price of anarchy of O(log m), which makes it the currently best known policy among the anonymous local policies that always admit a pure Nash equilibrium. Comment: 25 pages, 1 figure
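
As a concrete illustration of the objects involved, here is a toy sketch of the SPT policy and the l_k-norm social cost, written for identical machines for simplicity (the paper studies unrelated machines); the job set and assignment are made up for the example:

```python
def spt_completion_times(assignment):
    """assignment: one list of job processing times per machine.
    Under SPT each machine runs its jobs shortest-first; a job's
    individual cost is its completion time on its machine."""
    times = []
    for jobs in assignment:
        t = 0.0
        for p in sorted(jobs):   # shortest processing time first
            t += p
            times.append(t)
    return times

def lk_social_cost(assignment, k):
    """l_k-norm social cost: (sum of completion times^k)^(1/k).
    Letting k grow recovers the makespan (l_infty) objective."""
    return sum(c ** k for c in spt_completion_times(assignment)) ** (1.0 / k)

# Two machines; jobs with processing times {2, 1} on the first and {3}
# on the second. SPT yields completion times 1, 3 and 3.
cost = lk_social_cost([[2, 1], [3]], k=2)
```

The price-of-anarchy results above compare this cost at a worst Nash assignment against the assignment minimizing the same norm.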

    Improving the Price of Anarchy for Selfish Routing via Coordination Mechanisms

    We reconsider the well-studied Selfish Routing game with affine latency functions. The Price of Anarchy for this class of games takes maximum value 4/3; this maximum is attained already for a simple network of two parallel links, known as Pigou's network. We improve upon the value 4/3 by means of Coordination Mechanisms. We increase the latency functions of the edges in the network, i.e., if \ell_e(x) is the latency function of an edge e, we replace it by \hat{\ell}_e(x) with \ell_e(x) \le \hat{\ell}_e(x) for all x. Then an adversary fixes a demand rate as input. The engineered Price of Anarchy of the mechanism is defined as the worst-case ratio of the Nash social cost in the modified network over the optimal social cost in the original network. Formally, if C_M(r) denotes the cost of the worst Nash flow in the modified network for rate r and C_opt(r) denotes the cost of the optimal flow in the original network for the same rate, then ePoA = \max_{r \ge 0} C_M(r) / C_opt(r). We first exhibit a simple coordination mechanism that achieves for any network of parallel links an engineered Price of Anarchy strictly less than 4/3. For the case of two parallel links our basic mechanism gives 5/4 = 1.25. Then, for the case of two parallel links, we describe an optimal mechanism; its engineered Price of Anarchy lies between 1.191 and 1.192. Comment: 17 pages, 2 figures, preliminary version appeared at ESA 201
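
The 4/3 value quoted above can be checked directly on Pigou's network. The following sketch computes the Nash and optimal social costs for unit demand; it is a worked check of the classical example, not the paper's coordination mechanism:

```python
# Pigou's network: two parallel links with latencies l1(x) = x and
# l2(x) = 1, and one unit of flow to route.

def social_cost(x):
    """Total cost when flow x uses link 1 and 1 - x uses link 2."""
    return x * x + (1 - x) * 1

# Nash flow: every user takes link 1, since l1(x) <= 1 = l2 for all
# x <= 1, so no single user can improve by switching.
nash_cost = social_cost(1.0)   # = 1

# Optimal flow: minimize x^2 + (1 - x); setting the derivative
# 2x - 1 to zero gives x = 1/2.
opt_cost = social_cost(0.5)    # = 3/4

poa = nash_cost / opt_cost     # = 4/3
```

A coordination mechanism in the paper's sense raises the latency functions so that the worst Nash cost of the modified network, divided by this original optimum, drops below 4/3.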

    Magnetic resonance multitasking for motion-resolved quantitative cardiovascular imaging.

    Quantitative cardiovascular magnetic resonance (CMR) imaging can be used to characterize fibrosis, oedema, ischaemia, inflammation and other disease conditions. However, the need to reduce artefacts arising from body motion through a combination of electrocardiography (ECG) control, respiration control, and contrast-weighting selection makes CMR exams lengthy. Here, we show that physiological motions and other dynamic processes can be conceptualized as multiple time dimensions that can be resolved via low-rank tensor imaging, allowing for motion-resolved quantitative imaging with up to four time dimensions. This continuous-acquisition approach, which we name cardiovascular MR multitasking, captures - rather than avoids - motion, relaxation and other dynamics to efficiently perform quantitative CMR without the use of ECG triggering or breath holds. We demonstrate that CMR multitasking allows for T1 mapping, T1-T2 mapping and time-resolved T1 mapping of myocardial perfusion without ECG information and/or in free-breathing conditions. CMR multitasking may provide a foundation for the development of setup-free CMR imaging for the quantitative evaluation of cardiovascular health.
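
The low-rank tensor idea can be illustrated with a small NumPy sketch on synthetic numbers (not imaging data): an image series varying along two time dimensions unfolds into a voxel-by-time matrix of low rank, so a truncated SVD recovers it from a few spatial and temporal basis functions:

```python
import numpy as np

# Toy dimensions: 50 voxels, 12 cardiac phases, 8 respiratory phases,
# and an assumed rank of 3. All values here are illustrative.
rng = np.random.default_rng(0)
n_vox, n_card, n_resp, rank = 50, 12, 8, 3

# Synthesize a rank-3 tensor: spatial maps times separable dynamics.
U = rng.standard_normal((n_vox, rank))               # spatial factors
Phi = rng.standard_normal((rank, n_card * n_resp))   # temporal factors
tensor = (U @ Phi).reshape(n_vox, n_card, n_resp)

# Unfold to voxels x (cardiac * respiratory) and truncate the SVD.
casorati = tensor.reshape(n_vox, -1)
u, s, vt = np.linalg.svd(casorati, full_matrices=False)
approx = (u[:, :rank] * s[:rank]) @ vt[:rank]

# Relative error of the rank-3 reconstruction (near machine precision
# here, since the synthetic tensor is exactly rank 3).
err = np.linalg.norm(approx - casorati) / np.linalg.norm(casorati)
```

In the actual method the factors are estimated from undersampled k-space data rather than from a fully known tensor; this sketch only shows why a low-rank model compresses multi-dimensional dynamics.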

    Best Approximation to a Reversible Process in Black-Hole Physics and the Area Spectrum of Spherical Black Holes

    The assimilation of a quantum (finite size) particle by a Reissner-Nordström black hole inevitably involves an increase in the black-hole surface area. It is shown that this increase can be minimized if one considers the capture of the lightest charged particle in nature. The unavoidable area increase is attributed to two physical reasons: the Heisenberg quantum uncertainty principle and a Schwinger-type charge emission (vacuum polarization). The fundamental lower bound on the area increase is 4\hbar, which is smaller than the value given by Bekenstein for neutral particles. Thus, this process is a better approximation to a reversible process in black-hole physics. The universality of the minimal area increase is further evidence in favor of a uniformly spaced area spectrum for spherical quantum black holes. Moreover, this universal value is in excellent agreement with the area spacing predicted by Mukhanov and Bekenstein and independently by Hod. Comment: 10 pages

    A Remark on Boundary Effects in Static Vacuum Initial Data sets

    Let (M, g) be an asymptotically flat static vacuum initial data set with non-empty compact boundary. We prove that (M, g) is isometric to a spacelike slice of a Schwarzschild spacetime under the mere assumption that the boundary of (M, g) has zero mean curvature, hence generalizing a classic result of Bunting and Masood-ul-Alam. In the case that the boundary has constant positive mean curvature and satisfies a stability condition, we derive an upper bound on the ADM mass of (M, g) in terms of the area and mean curvature of the boundary. Our discussion is motivated by Bartnik's quasi-local mass definition. Comment: 10 pages, to be published in Classical and Quantum Gravity

    Development of probabilistic models for quantitative pathway analysis of plant pest introduction for the EU territory

    This report demonstrates a probabilistic quantitative pathway analysis model that can be used in risk assessment for plant pest introduction into EU territory on a range of edible commodities (apples, oranges, stone fruits and wheat). Two types of model were developed: a general commodity model that simulates distribution of an imported infested/infected commodity to and within the EU from source countries by month; and a consignment model that simulates the movement and distribution of individual consignments from source countries to destinations in the EU. The general pathway model has two modules. Module 1 is a trade pathway model, with a Eurostat database of five years of monthly trade volumes for each specific commodity into the EU28 from all source countries and territories. Infestation levels based on interception records, commercial quality standards or other information determine the volume of infested commodity entering and transhipped within the EU. Module 2 allocates commodity volumes to processing, retail use and waste streams and overlays the distribution onto EU NUTS2 regions based on population densities and processing unit locations. Transfer potential to domestic host crops is a function of the distribution of imported infested product, the area of domestic production in NUTS2 regions, pest dispersal potential, and the phenology of susceptibility in domestic crops. The consignment model covers several routes in the supply chains for processing and retail use. The output of the general pathway model is a distribution of estimated volumes of infested produce by NUTS2 region across the EU28, by month or annually; this is then related to the accessible susceptible domestic crop. Risk is expressed as a potential volume of infested fruit in potential contact with an area of susceptible domestic host crop. The output of the consignment model is the volume of infested produce retained at each stage along the specific consignment trade chain.
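
The two-module structure can be sketched as a toy calculation; all numbers, region names and rates below are illustrative placeholders, not the report's Eurostat data or parameter estimates:

```python
def infested_volume(trade_tonnes, infestation_rate):
    """Module 1 (trade pathway): expected infested volume entering
    on the pathway in a month, from trade volume and infestation level."""
    return trade_tonnes * infestation_rate

def allocate_to_regions(volume, region_populations):
    """Module 2 (distribution): spread the infested volume over regions
    in proportion to population, used here as a proxy for retail demand."""
    total = sum(region_populations.values())
    return {r: volume * p / total for r, p in region_populations.items()}

monthly_imports = 10_000   # tonnes of one commodity imported in a month
rate = 0.002               # fraction of the consignment infested

# Hypothetical NUTS2 regions with their populations.
regions = {"NUTS2-A": 5_000_000, "NUTS2-B": 3_000_000, "NUTS2-C": 2_000_000}

by_region = allocate_to_regions(infested_volume(monthly_imports, rate), regions)
# 20 tonnes of infested produce, split 10 : 6 : 4 across the regions
```

The full model additionally splits volumes into processing, retail and waste streams, and combines the regional allocation with host-crop area and pest phenology to estimate transfer potential.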

    Singularity Formation in 2+1 Wave Maps

    We present numerical evidence that singularities form in finite time during the evolution of 2+1 wave maps from spherically equivariant initial data of sufficient energy. Comment: 5 pages, 3 figures

    General K=-1 Friedman-Lemaître models and the averaging problem in cosmology

    We introduce the notion of general K=-1 Friedman-Lemaître (compact) cosmologies and the notion of averaged evolution by means of an averaging map. We then analyze the Friedman-Lemaître equations and the role of gravitational energy in the universe's evolution. We distinguish two asymptotic behaviors: radiative and mass gap. We discuss the averaging problem in cosmology for them through precise definitions. We then describe the radiative case in quantitative detail, with precise estimates of the evolution of the gravitational energy and its effect on the universe's deceleration. Also in the radiative case we present a smoothing property which says that the long-time H^{3} x H^{2} stability of the flat K=-1 FL models implies H^{i+1} x H^{i} stability independently of how big the initial state was in H^{i+1} x H^{i}, i.e. there is long-time smoothing of the space-time. Finally we discuss the existence of initial "big-bang" states of large gravitational energy, showing that there is no mathematical obstruction to assuming it is low at the beginning of time. Comment: Revised version. 32 pages, 1 figure