
    Volume and complexity for warped AdS black holes

    We study the Complexity=Volume conjecture for warped AdS$_3$ black holes. We compute the spatial volume of the Einstein-Rosen bridge and find that its growth rate is proportional to the Hawking temperature times the Bekenstein-Hawking entropy. This is consistent with expectations about computational complexity in the boundary theory. Comment: 18 pages, 3 figures, V2: refs added
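    For orientation, a schematic statement of the Complexity=Volume relation and the late-time growth behaviour described above; this is the standard CV form as a sketch, with the bulk length scale $\ell$ and overall normalization left convention-dependent, not the paper's specific warped-AdS result:

```latex
% Schematic CV relation (conventions for the length scale \ell vary):
\[
  \mathcal{C}_V(t) \;=\; \max_{\Sigma}\,\frac{V(\Sigma)}{G_N\,\ell},
  \qquad
  \frac{d\mathcal{C}_V}{dt}\Big|_{t\to\infty} \;\propto\; T_H\, S_{BH},
\]
% where \Sigma is the maximal-volume slice through the Einstein-Rosen bridge,
% T_H is the Hawking temperature and S_{BH} the Bekenstein-Hawking entropy.
```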

    Numerical modelling of heat transfer and experimental validation in Powder-Bed Fusion with the Virtual Domain Approximation

    Among metal additive manufacturing technologies, powder-bed fusion features very thin layers and rapid solidification rates, leading to long build jobs and a highly localized process. Many efforts are being devoted to accelerating simulation times for practical industrial applications. The new approach suggested here, the virtual domain approximation, is a physics-based rationale for spatial reduction of the domain in the thermal finite-element analysis at the part scale. Computational experiments address, among others, validation against a large physical experiment of 17.5 $\mathrm{cm^3}$ of deposited volume in 647 layers. For fast and automatic parameter estimation at such a level of complexity, a high-performance computing framework is employed: it couples FEMPAR-AM, a specialized parallel finite-element software, with Dakota for the parametric exploration. Compared to the previous state of the art, this formulation provides higher accuracy at the same computational cost. This sets the path towards a fully virtualized model, considering an upwards-moving domain covering the last printed layers.
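    To illustrate the kind of spatial reduction the virtual domain approximation aims at (not the FEMPAR-AM formulation itself), here is a minimal 1D finite-difference sketch: only the last few printed layers are solved explicitly, while older material below the active window is replaced by a fixed-temperature boundary standing in for the reduced domain. All parameter values and the simple Dirichlet substitute are assumptions for illustration.

```python
import numpy as np

# Illustrative material / process parameters (assumed, not from the paper)
alpha = 5e-6                # thermal diffusivity [m^2/s]
layer_h = 50e-6             # layer thickness [m]
nodes_per_layer = 5
dz = layer_h / nodes_per_layer
dt = 0.2 * dz**2 / alpha    # respects the explicit stability limit
T_melt, T_substrate = 1700.0, 350.0   # [K]
active_layers = 10          # "virtual domain": solve only the last layers

def deposit_and_cool(n_layers, steps_per_layer=200):
    """Deposit layers one by one; material below the active window is
    replaced by a fixed-temperature boundary (a crude stand-in for the
    virtual domain approximation)."""
    T = np.array([T_substrate])
    for _ in range(n_layers):
        # add a freshly molten layer on top
        T = np.concatenate([T, np.full(nodes_per_layer, T_melt)])
        # truncate: keep only the active window; its bottom node acts as boundary
        if T.size > active_layers * nodes_per_layer:
            T = T[-active_layers * nodes_per_layer:]
            T[0] = T_substrate
        # explicit diffusion steps during the inter-layer time
        for _ in range(steps_per_layer):
            lap = np.zeros_like(T)
            lap[1:-1] = T[2:] - 2 * T[1:-1] + T[:-2]
            T = T + alpha * dt / dz**2 * lap
            T[0] = T_substrate          # reduced-domain boundary condition
    return T

print(deposit_and_cool(30)[-5:])  # temperatures near the top after 30 layers
```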

    A Fully Polynomial-Time Approximation Scheme for Speed Scaling with Sleep State

    We study classical deadline-based preemptive scheduling of tasks in a computing environment equipped with both dynamic speed scaling and sleep state capabilities: each task is specified by a release time, a deadline and a processing volume, and has to be scheduled on a single, speed-scalable processor that is supplied with a sleep state. In the sleep state, the processor consumes no energy, but a constant wake-up cost is required to transition back to the active state. In contrast to speed scaling alone, the addition of a sleep state makes it sometimes beneficial to accelerate the processing of tasks in order to transition the processor to the sleep state for longer periods of time and incur further energy savings. The goal is to output a feasible schedule that minimizes the energy consumption. Since the introduction of the problem by Irani et al. [16], its exact computational complexity has been repeatedly posed as an open question (see e.g. [2,8,15]). The best currently known upper and lower bounds are a 4/3-approximation algorithm and NP-hardness, due to [2] and [2,17], respectively. We close this gap between the upper and lower bound on the computational complexity of speed scaling with sleep state by presenting a fully polynomial-time approximation scheme for the problem. The scheme is based on a transformation to a non-preemptive variant of the problem, and a discretization that exploits a carefully defined lexicographical ordering among schedules.
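    A small numerical sketch of the underlying energy trade-off, using the standard power model $P(s) = s^\alpha + p_{\text{static}}$ plus a wake-up cost (all numbers are illustrative assumptions, not from the paper): for a single idle interval it compares stretching the pending work over the whole interval, keeping the processor active, against racing through the work at the critical speed and paying the wake-up cost in order to sleep for the remainder.

```python
# Energy trade-off behind speed scaling with a sleep state (illustrative).
# Power while running at speed s: P(s) = s**alpha (dynamic) + p_static (leakage).
# Sleeping costs nothing, but waking up costs C_wake.

alpha = 3.0      # typical cube-law exponent (assumed)
p_static = 1.0   # static power while active (assumed units)
C_wake = 4.0     # wake-up cost (assumed units)

def energy_keep_active(work, interval):
    """Stretch the work over the whole interval at the minimum feasible speed."""
    s = work / interval
    return (s**alpha + p_static) * interval

def energy_race_to_sleep(work, interval, s):
    """Run at speed s >= work/interval, then sleep and pay the wake-up cost."""
    busy = work / s
    return (s**alpha + p_static) * busy + C_wake

work, interval = 2.0, 10.0
# critical speed minimizing energy per unit of work:
# d/ds [(s**alpha + p_static) / s] = 0  =>  s_crit = (p_static / (alpha - 1))**(1/alpha)
s_crit = (p_static / (alpha - 1)) ** (1 / alpha)
s = max(s_crit, work / interval)

print("keep active  :", energy_keep_active(work, interval))
print("race to sleep:", energy_race_to_sleep(work, interval, s))
```

    With these illustrative numbers, racing to sleep is cheaper, which is exactly the effect the abstract points to: the sleep state can make it worthwhile to accelerate processing.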

    The supernova-regulated ISM. I. The multi-phase structure

    We simulate the multi-phase interstellar medium randomly heated and stirred by supernovae, with gravity, differential rotation and other parameters of the solar neighbourhood. Here we describe in detail both the numerical and physical aspects of the model, including the injection of thermal and kinetic energy by SN explosions, radiative cooling, photoelectric heating and various transport processes. With a 3D domain extending 1 kpc^2 horizontally and 2 kpc vertically, the model routinely spans gas number densities of 10^-5 to 10^2 cm^-3, temperatures of 10 to 10^8 K, and local velocities up to 10^3 km s^-1 (with Mach numbers up to 25). The thermal structure of the modelled ISM is classified by inspection of the joint probability density of the gas number density and temperature. We confirm that most of the complexity can be captured in terms of just three phases, separated by temperature borderlines at about 10^3 K and 5x10^5 K. The probability distribution of gas density within each phase is approximately lognormal. We clarify the connection between the fractional volume of a phase and its various proxies, and derive an exact relation between the fractional volume and the filling factors defined in terms of the volume and probabilistic averages. These results are discussed in both observational and computational contexts. The correlation scale of the random flows is calculated from the velocity autocorrelation function; it is of order 100 pc and tends to grow with distance from the mid-plane. We use two distinct parameterizations of radiative cooling to show that the multi-phase structure of the gas is robust, as it does not depend significantly on this choice. Comment: 28 pages, 22 figures and 8 tables
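    As a concrete illustration of the phase decomposition described above, the following sketch bins cells into three phases by the quoted temperature borderlines (about 10^3 K and 5x10^5 K) and computes each phase's fractional volume and the moments of its log-density. The mock data and the assumption of equal-volume cells are stand-ins for actual simulation output.

```python
import numpy as np

# Mock snapshot standing in for simulation output (assumed, for illustration):
# equal-volume cells with temperatures and log-normally distributed densities.
rng = np.random.default_rng(0)
n_cells = 100_000
T = 10 ** rng.uniform(1, 8, n_cells)                        # temperature [K]
n_gas = rng.lognormal(mean=-1.0, sigma=2.0, size=n_cells)   # number density [cm^-3]

# Phase borderlines quoted in the abstract: ~1e3 K and ~5e5 K.
phases = {
    "cold": T < 1e3,
    "warm": (T >= 1e3) & (T < 5e5),
    "hot":  T >= 5e5,
}

for name, mask in phases.items():
    f_V = mask.mean()                   # fractional volume (equal-volume cells)
    ln_n = np.log(n_gas[mask])
    print(f"{name:4s}  f_V = {f_V:.2f}   ln(n): mean = {ln_n.mean():+.2f}, "
          f"std = {ln_n.std():.2f}")
```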

    Tenfold your photons -- a physically-sound approach to filtering-based variance reduction of Monte-Carlo-simulated dose distributions

    X-ray dose is of growing interest in the interventional suite. With dose generally difficult to monitor reliably, fast computational methods are desirable. A major drawback of the gold standard based on Monte Carlo (MC) methods is its computational complexity. Besides common variance reduction techniques, filter approaches are often applied to achieve conclusive results within a fraction of the time. Inspired by these methods, we propose a novel approach. We down-sample the target volume based on the fraction of mass, simulate the imaging situation, and then revert the down-sampling. To this end, the dose is weighted by the mass energy absorption, up-sampled, and distributed using a guided filter. Eventually, the weighting is inverted, resulting in accurate high-resolution dose distributions. The approach has the potential to considerably speed up MC simulations, since fewer photons and boundary checks are necessary. First experiments substantiate these assumptions. We achieve a median accuracy of 96.7% and 97.4% in the dose estimate with the proposed method at down-sampling factors of 8 and 4, respectively. While maintaining high accuracy, the proposed method provides a tenfold speed-up. The overall findings suggest that the proposed method has the potential to enable further efficiency gains. Comment: 6 pages, 3 figures, Bildverarbeitung für die Medizin 202
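    A schematic of the described pipeline in NumPy/SciPy, as a sketch under assumptions: a 2D slice instead of a full volume, block-average down-sampling, a hypothetical `coarse_mc` callback standing in for the cheap Monte Carlo run, and Gaussian smoothing standing in for the guided filter. None of this reproduces the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def block_mean(a, f):
    """Down-sample a 2D array by averaging f x f blocks (mass-style pooling)."""
    h, w = (a.shape[0] // f) * f, (a.shape[1] // f) * f
    return a[:h, :w].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def variance_reduced_dose(density, mu_en, coarse_mc, factor=4):
    """Sketch of the down-sample / simulate / revert pipeline described above.

    coarse_mc: callable returning a coarse MC dose estimate for the
    down-sampled density (hypothetical stand-in for the real simulation)."""
    dens_c = block_mean(density, factor)          # down-sample the volume
    mu_c = block_mean(mu_en, factor)              # matching coarse absorption map
    dose_c = coarse_mc(dens_c)                    # cheap coarse MC simulation

    energy_c = dose_c * mu_c                      # weight by mass energy absorption
    energy_f = zoom(energy_c, factor, order=1)    # up-sample to the fine grid
    energy_f = gaussian_filter(energy_f, sigma=factor)  # redistribute
                                                        # (placeholder for guided filter)

    # invert the weighting on the fine grid to recover a dose distribution
    mu_fine = mu_en[:energy_f.shape[0], :energy_f.shape[1]]
    return energy_f / np.maximum(mu_fine, 1e-12)
```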

    Complexity change under conformal transformations in AdS$_3$/CFT$_2$

    Using the volume proposal, we compute the change of complexity of holographic states caused by a small conformal transformation in AdS$_3$/CFT$_2$. This computation is done perturbatively to second order. We give a general result and discuss some of its properties. As operators generating such conformal transformations can be explicitly constructed in CFT terms, these results allow for a comparison between holographic methods of defining and computing computational complexity and purely field-theoretic proposals. A comparison of our results to one such proposal is given. Comment: v2: 23 pages, 5 figures, added references and one entirely new section about a comparison to a field theory proposal; v3: 27 pages, 5 figures, minor improvements. Matches published version
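    Schematically, the perturbative setup takes the following generic form, with $\epsilon$ parametrizing the small conformal transformation; this is only the structure of a second-order expansion of the volume functional, not the paper's explicit coefficients:

```latex
% Volume proposal and its expansion under a small conformal transformation
% parametrized by \epsilon (generic second-order structure):
\[
  \mathcal{C} = \frac{V}{G_N\,\ell}, \qquad
  V(\epsilon) = V^{(0)} + \epsilon\, V^{(1)} + \epsilon^{2} V^{(2)}
                + \mathcal{O}(\epsilon^{3}),
  \qquad
  \delta\mathcal{C} = \frac{\epsilon\, V^{(1)} + \epsilon^{2} V^{(2)}}{G_N\,\ell}.
\]
```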