
    The Bullet cluster at its best: weighing stars, gas and dark matter

    We present a new strong-lensing mass reconstruction of the Bullet cluster (1E 0657-56) at z=0.296, based on WFC3 and ACS HST imaging and VLT/FORS2 spectroscopy. The strong-lensing constraints have undergone substantial revision compared to previously published analyses: there are now 14 multiply-imaged systems (six new and eight previously known), of which three have spectroscopically confirmed redshifts (including one newly measured in this work). The reconstructed mass distribution explicitly includes the combination of three mass components: i) the intra-cluster gas mass derived from X-ray observations, ii) the cluster galaxies modeled by their fundamental-plane scaling relations, and iii) dark matter. The model that includes the intra-cluster gas has the best Bayesian evidence, with a total RMS of 0.158" between the predicted and measured positions of the 14 multiple images considered. The proximity of this RMS to the resolution of HST/WFC3 and ACS (0.07-0.15" FWHM) demonstrates the excellent precision of our mass model. The derived mass model confirms the spatial offset between the X-ray gas and dark-matter peaks. The fraction of mass in galaxy halos relative to the total mass is f_s=11+/-5% for a total mass of 2.5+/-0.1 x 10^14 solar masses within a 250 kpc radial aperture. Comment: Accepted by A&A; 15 pages, 12 figures
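The quality metric quoted in this abstract, the total RMS between predicted and observed image positions, is just the root-mean-square of the 2-D offsets over all multiple images. A minimal sketch, using made-up illustrative positions rather than the paper's 14 systems:

```python
import math

# Hypothetical predicted vs. observed multiple-image positions (arcsec);
# the real model uses 14 systems, the values here are illustrative only.
predicted = [(1.20, 0.45), (-3.10, 2.02), (0.75, -1.33)]
observed = [(1.26, 0.41), (-3.05, 2.10), (0.70, -1.30)]

def total_rms(pred, obs):
    """Total RMS of the 2-D offsets between predicted and observed positions."""
    sq = [(px - ox) ** 2 + (py - oy) ** 2
          for (px, py), (ox, oy) in zip(pred, obs)]
    return math.sqrt(sum(sq) / len(sq))

rms = total_rms(predicted, observed)
```

A smaller RMS (here relative to the instrument resolution, 0.07-0.15" FWHM) indicates a mass model whose predicted image positions better match the observations.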

    Neural Decision Boundaries for Maximal Information Transmission

    We consider here how to separate multidimensional signals into two categories such that the binary decision transmits the maximum possible information about those signals. Our motivation comes from the nervous system, where neurons process multidimensional signals into a binary sequence of responses (spikes). In a small-noise limit, we derive a general equation for the decision boundary that locally relates its curvature to the probability distribution of inputs. We show that for Gaussian inputs the optimal boundaries are planar, but for non-Gaussian inputs the curvature is nonzero. As an example, we consider exponentially distributed inputs, which are known to approximate a variety of signals from the natural environment. Comment: 5 pages, 3 figures
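The small-noise limit has a simple one-dimensional intuition: a deterministic binary response carries information equal to its own entropy, which is maximized when the boundary splits the input probability mass in half. A sketch, assuming a single Gaussian input and a threshold (i.e. planar) boundary; the candidate thresholds are illustrative:

```python
import math
import random

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

def binary_entropy(p):
    """Entropy in bits of a binary variable with P(spike) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def info_of_threshold(theta):
    # In the small-noise (deterministic) limit, the information the binary
    # response carries about the input equals the response entropy itself.
    p = sum(x > theta for x in samples) / len(samples)
    return binary_entropy(p)

# The information is maximized by the boundary that splits the mass in half:
# for a zero-mean Gaussian input, the threshold at 0.
best = max((-2.0, -1.0, 0.0, 1.0, 2.0), key=info_of_threshold)
```

For non-Gaussian inputs the optimal boundary is no longer planar, which is the regime where the paper's curvature equation applies.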

    Locality-Adaptive Parallel Hash Joins Using Hardware Transactional Memory

    Previous work [1] has claimed that the best-performing implementation of in-memory hash joins is based on (radix-)partitioning of the build-side input. Indeed, despite the overhead of partitioning, the benefits of increased cache locality and synchronization-free parallelism in the build phase outweigh the costs when the input data is randomly ordered. However, many datasets already exhibit significant spatial locality (i.e., non-randomness) due to the way data items enter the database: through periodic ETL or trickle loading in the form of transactions. In such cases, the first benefit of partitioning, increased locality, is largely irrelevant. In this paper, we demonstrate how hardware transactional memory (HTM) can render the other benefit, freedom from synchronization, irrelevant as well. Specifically, using careful analysis and engineering, we develop an adaptive hash join implementation that outperforms parallel radix-partitioned hash joins as well as sort-merge joins on data with high spatial locality. In addition, we show how, through lightweight (less than 1% overhead) runtime monitoring of the transaction abort rate, our implementation can detect inputs with low spatial locality and dynamically fall back to radix-partitioning of the build-side input. The result is a hash join implementation that is more than 3 times faster than the state-of-the-art on high-locality data and never more than 1% slower.
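The abort-rate fallback described in this abstract can be sketched as a policy function over runtime transaction counters; the threshold value and strategy names below are illustrative assumptions, not figures from the paper:

```python
# Hypothetical sketch of the adaptive policy: start with the HTM-based
# no-partition build, watch the transaction abort rate, and fall back to
# radix partitioning when contention suggests low spatial locality.
ABORT_RATE_THRESHOLD = 0.05  # illustrative cutoff, not from the paper

def choose_strategy(commits, aborts):
    """Pick a build-phase strategy from lightweight transaction counters."""
    attempts = commits + aborts
    if attempts == 0:
        return "htm-no-partition"      # nothing observed yet: optimistic start
    abort_rate = aborts / attempts
    if abort_rate > ABORT_RATE_THRESHOLD:
        return "radix-partition"       # low locality: partition build input
    return "htm-no-partition"          # high locality: skip partitioning
```

The point of the design is that the monitoring itself stays cheap (under 1% overhead in the paper), so the adaptive join never loses much relative to either fixed strategy.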

    Can we avoid high coupling?

    It is considered good software design practice to organize source code into modules and to favour within-module connections (cohesion) over between-module connections (coupling), leading to the oft-repeated maxim "low coupling/high cohesion". Prior research applying network theory to software systems has found evidence that many important properties of real software systems, including coupling, exhibit approximately scale-free structure; researchers have claimed that such scale-free structures are ubiquitous. This implies that high coupling must be unavoidable, statistically speaking, apparently contradicting standard ideas about software structure. We present a model that predicts that approximately scale-free structures ought to arise both for between-module connectivity and for overall connectivity, and not as the result of poor design or optimization shortcuts. These predictions are borne out by our large-scale empirical study. Hence we conclude that high coupling is not avoidable -- and that this is in fact quite reasonable.
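The mechanism claimed here, heavy-tailed coupling arising without poor design, is commonly illustrated with preferential attachment: if each new module links to existing ones roughly in proportion to their existing coupling, a few modules inevitably accumulate many connections. A generic sketch of that dynamic (not the paper's actual model):

```python
import random

random.seed(1)

# Minimal preferential-attachment sketch: each new "module" links to an
# existing one chosen with probability proportional to its degree, so a
# few modules accumulate many inbound couplings.
edges = [(0, 1)]
endpoints = [0, 1]  # multiset: each node appears once per edge endpoint
for new in range(2, 2000):
    old = random.choice(endpoints)  # degree-proportional choice of target
    edges.append((new, old))
    endpoints += [new, old]

degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1

# A small number of hub modules end up far more coupled than the median,
# even though every step of the growth process was locally reasonable.
max_deg = max(degree.values())
```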

    Nonlinear force-free modeling of the solar coronal magnetic field

    The coronal magnetic field is an important quantity because it dominates the structure of the solar corona. Unfortunately, direct measurements of coronal magnetic fields are usually not available. The photospheric magnetic field, by contrast, is measured routinely with vector magnetographs, and these photospheric measurements are extrapolated into the solar corona. The extrapolated coronal magnetic field depends on assumptions regarding the coronal plasma, e.g. force-freeness. Force-free means that all non-magnetic forces, such as pressure gradients and gravity, are neglected. This approach is well justified in the solar corona due to the low plasma beta. One has to take care, however, of ambiguities, noise, and non-magnetic forces in the photosphere, where the magnetic field vector is measured. Here we review different numerical methods for nonlinear force-free coronal magnetic field extrapolation: Grad-Rubin codes, the upward integration method, MHD relaxation, optimization, and the boundary element approach. We briefly discuss the main features of the different methods and concentrate mainly on recently developed new codes. Comment: 33 pages, 3 figures; review article
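For reference, the force-free assumption underlying all of the extrapolation methods listed above can be written compactly in the standard textbook form (this is the general condition, not code from any of the reviewed implementations):

```latex
% Nonlinear force-free field (NLFFF) conditions in the low-beta corona:
\nabla \times \mathbf{B} = \alpha(\mathbf{r})\,\mathbf{B}, \qquad
\nabla \cdot \mathbf{B} = 0.
% Taking the divergence of the first equation yields
\mathbf{B} \cdot \nabla \alpha = 0,
% i.e. the force-free parameter \alpha is constant along each field line;
% "nonlinear" refers to \alpha varying from field line to field line.
```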