
    Detection of Dark Matter Concentrations in the Field of Cl 1604+4304 from Weak Lensing Analysis

    We present a weak-lensing analysis of a region around the galaxy cluster Cl 1604+4304 (z=0.897), based on deep observations with the HST/WFPC2. We apply a variant of Schneider's aperture mass technique to the observed WFPC2 field and obtain the distribution of the weak-lensing signal-to-noise (S/N) ratio within the field. The resulting S/N map reveals a pronounced peak located about 1.7 arcmin (850 h_{50}^{-1} kpc at z=0.897) southwest of the optical cluster center determined from the dynamical analysis of Postman et al., as well as a second peak associated with the cluster center. A non-linear finite-field inversion method is used to reconstruct the projected mass distribution from the observed shear field. The reconstructed mass map shows a super-critical feature at the location of the S/N peak as well as in the cluster central region. Assuming a redshift distribution for the field galaxies with \langle z \rangle = 1.0, we obtain a total mass in the observed field of 1.0 h_{50}^{-1} \times 10^{15} M_sun. The estimated mass within a circular aperture of radius 280 h_{50}^{-1} kpc centered on the dark clump is 2.4 h_{50}^{-1} \times 10^{14} M_sun. We have confirmed the existence of the 'dark' mass concentration in another deep HST observation with a slightly different (~20 arcsec) pointing.
    Comment: 7 pages, 3 figures
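
    As a rough illustration of the aperture-mass approach named above, here is a minimal Python sketch of an S/N map built from a galaxy catalog with a compensated radial filter. The filter polynomial, the catalog layout (x, y, e1, e2), and the function names are assumptions for illustration, not the authors' actual pipeline.

        # Hedged sketch of an aperture-mass S/N map in the spirit of
        # Schneider's M_ap statistic; all names and choices below are
        # illustrative assumptions.
        import numpy as np

        def tangential_ellipticity(x, y, e1, e2, x0, y0):
            """Ellipticity component tangential to the aperture center (x0, y0)."""
            phi = np.arctan2(y - y0, x - x0)
            return -(e1 * np.cos(2 * phi) + e2 * np.sin(2 * phi))

        def q_filter(r, R):
            """A simple compensated radial weight, zero outside the aperture."""
            u = r / R
            w = u**2 * (1 - u**2)          # one common polynomial choice
            w[u >= 1.0] = 0.0
            return w

        def sn_map(x, y, e1, e2, grid, R, sigma_eps=0.3):
            """Aperture-mass signal-to-noise at each candidate center in `grid`."""
            sn = np.zeros(len(grid))
            for k, (x0, y0) in enumerate(grid):
                r = np.hypot(x - x0, y - y0)
                Q = q_filter(r, R)
                et = tangential_ellipticity(x, y, e1, e2, x0, y0)
                denom = np.sqrt(np.sum(Q**2) * sigma_eps**2 / 2.0)
                sn[k] = np.sum(Q * et) / denom if denom > 0 else 0.0
            return sn

    The normalization of the filter cancels in the S/N ratio, so only its shape matters; a peak in the returned map plays the role of the dark-clump detection described in the abstract.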

    Strong Lensing Reconstruction

    We present a general linear algorithm for measuring the surface mass density 1-\kappa from the observable reduced shear g = \gamma/(1-\kappa) in the strong lensing regime. We show that, in general, the observed polarization field can be decomposed into "electric" and "magnetic" components, which have independent, redundant solutions with perfectly orthogonal noise properties. By combining these solutions, one can increase the signal-to-noise ratio by a factor of \sqrt{2}. The solutions allow dynamic optimization of signal and noise, both in real and Fourier space (using arbitrary smoothing windows). Boundary conditions have no effect on the reconstructions, apart from their effect on the signal-to-noise. Many existing reconstruction techniques are recovered as special cases of this framework. The magnetic solution has the added benefit of yielding the global and local parity of the reconstruction in a single step.
    Comment: final accepted version for ApJ
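
    To make the "electric"/"magnetic" split concrete, here is a minimal sketch of a Fourier-space (Kaiser-Squires-style) inversion that returns both the E-mode convergence and the B-mode map from a gridded shear field. Using plain \gamma rather than the reduced shear g, and the periodic-grid FFT setup, are simplifying assumptions; the paper's actual algorithm handles the strong regime and boundary conditions more carefully.

        # Hedged sketch: E/B decomposition of a gridded shear field via a
        # Fourier-space inversion (weak-regime stand-in for the paper's method).
        import numpy as np

        def eb_decompose(gamma1, gamma2):
            """Decompose a gridded shear field into kappa_E and kappa_B."""
            ny, nx = gamma1.shape
            k1 = np.fft.fftfreq(nx)[np.newaxis, :]
            k2 = np.fft.fftfreq(ny)[:, np.newaxis]
            k_sq = k1**2 + k2**2
            k_sq[0, 0] = 1.0                  # avoid division by zero at k=0
            g_hat = np.fft.fft2(gamma1 + 1j * gamma2)
            # Conjugate of the lensing operator D = (k1^2 - k2^2 + 2i k1 k2) / k^2
            D_conj = (k1**2 - k2**2 - 2j * k1 * k2) / k_sq
            kappa_hat = D_conj * g_hat
            kappa_hat[0, 0] = 0.0             # the mean is unconstrained by shear
            kappa = np.fft.ifft2(kappa_hat)
            return kappa.real, kappa.imag     # E-mode, B-mode

    For a noise-free single lens the B-mode output vanishes; its noise, per the abstract, is orthogonal to that of the E-mode, which is what permits the \sqrt{2} gain when the two solutions are combined.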

    High resolution, high capacity, spatial specificity in perceptual learning.

    Research on perceptual learning has received significant interest due to findings that training on perceptual tasks can yield learning effects that are specific to the stimulus features of that task. However, recent studies have demonstrated that while training on a single stimulus at a single location can yield a high degree of stimulus specificity, training on multiple features or at multiple locations can reveal broad transfer of learning to untrained features or stimulus locations. We devised a high-resolution, high-capacity perceptual learning procedure to test whether spatial specificity can be found when observers are highly trained to discriminate stimuli at many different locations in the visual field. We found a surprising degree of location-specific learning: performance was significantly better when target stimuli were presented at one of the 24 trained locations than at one of the 12 untrained locations. This result is particularly striking given that the untrained locations were within a couple of degrees of visual angle of the trained ones. Given the large number of trained locations, the fact that trained and untrained locations were interspersed, and the high degree of spatial precision of the learning, we suggest that these results are difficult to account for by attention or decision strategies; instead, learning may have taken place for each location separately in retinotopically organized visual cortex.
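
    A minimal sketch of the trained-versus-untrained comparison described above, assuming per-trial accuracy tagged by stimulus location; the data layout and the one-sided permutation test are illustrative choices, not the study's actual analysis.

        # Hedged sketch: does accuracy at the 24 trained locations exceed
        # accuracy at the 12 untrained ones? (Illustrative test only.)
        import numpy as np

        rng = np.random.default_rng(0)

        def location_specificity(correct, location, trained_ids, n_perm=10_000):
            """One-sided permutation test on the trained-minus-untrained
            difference in mean accuracy."""
            is_trained = np.isin(location, trained_ids)
            observed = correct[is_trained].mean() - correct[~is_trained].mean()
            count = 0
            for _ in range(n_perm):
                shuffled = rng.permutation(is_trained)
                diff = correct[shuffled].mean() - correct[~shuffled].mean()
                count += diff >= observed
            return observed, count / n_perm   # effect size, permutation p-value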

    Double lenses

    The analysis of the shear induced by a single cluster on the images of a large number of background galaxies centers on the curl-free character of a well-known vector field that can be derived from the data. This basic property breaks down when the source galaxies happen to be observed through two clusters at different redshifts, partially aligned along the line of sight. In this paper we address the study of double lenses and obtain five main results. (i) First, we generalize the procedure for extracting the information contained in the observed shear field from the case of a single lens to that of a double lens. (ii) Then we evaluate the possibility of detecting the signature of double lensing, given the known properties of the distribution of clusters of galaxies. (iii) As a different astrophysical application, we demonstrate how the method can be used to detect the presence of a dark cluster that happens to be partially aligned with a bright cluster studied in terms of statistical lensing. (iv) In addition, we show that the redshift distribution of the source galaxies, which in principle might also contribute to breaking the curl-free character of the shear field, actually produces systematic effects typically two orders of magnitude smaller than the double-lensing effects we focus on. (v) Remarkably, a discussion of the relevant contributions to the noise of the shear measurement brings up an intrinsic limitation of weak-lensing analyses: one specific contribution, associated with the presence of a non-vanishing two-galaxy correlation function, turns out not to decrease with the density of source galaxies (and thus with the depth of the observations).
    Comment: 40 pages, 15 figures. Accepted for publication in ApJ main journal
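
    A minimal sketch of the curl diagnostic implied above: for a single lens, the vector field derived from the shear data should be curl-free, so a statistically significant curl is a candidate signature of double lensing. The gridded field, uniform pixel scale, and noise normalization are illustrative assumptions.

        # Hedged sketch: finite-difference curl of the shear-derived vector
        # field as a double-lensing indicator (illustrative only).
        import numpy as np

        def curl_map(ux, uy, dx=1.0):
            """Finite-difference curl of a 2-D vector field sampled on a grid."""
            duy_dx = np.gradient(uy, dx, axis=1)
            dux_dy = np.gradient(ux, dx, axis=0)
            return duy_dx - dux_dy

        def curl_statistic(ux, uy, noise_sigma, dx=1.0):
            """Mean squared curl relative to the expected noise level;
            values near 1 are consistent with a single lens plus noise."""
            c = curl_map(ux, uy, dx)
            return np.mean(c**2) / noise_sigma**2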

    Summer Residency of Pacific Halibut in Glacier Bay National Park

    Glacier Bay National Park, as a Marine Protected Area (MPA), is phasing out commercial fishing of Pacific halibut (Hippoglossus stenolepis) within the park. The species continues to be commercially harvested outside of the bay.

    The noise of cluster mass reconstructions from a source redshift distribution

    The parameter-free reconstruction of the surface mass density of clusters of galaxies is one of the principal applications of weak gravitational lensing. From the observable ellipticities of images of background galaxies, the tidal gravitational field (shear) of the mass distribution is estimated, and the corresponding surface mass density is constructed. The noise of the resulting mass map is investigated here, generalizing previous work which mainly included the noise due to intrinsic galaxy ellipticities. Whereas this dominates the noise budget if the lens is very weak, other sources of noise become important, or even dominant, in the medium-strong lensing regime close to the centers of clusters. In particular, shot noise due to the Poisson distribution of galaxy images, and increased shot noise owing to the correlation of galaxies in angular position and redshift, can yield significantly higher noise levels than the intrinsic ellipticities alone. We estimate the contributions from these various effects for two widely used smoothing operations, showing that one of them effectively removes the Poisson and correlation noise related to the angular positions of galaxies. Noise sources due to the spread in galaxy redshifts are still present in the optimized estimator and are shown to be relevant in many cases. We show how (even approximate) redshift information can be used profitably to reduce the noise in the mass map. The dependence of the various noise terms on the relevant parameters (lens redshift, strength, smoothing length, redshift distribution of background galaxies) is calculated explicitly, and simple estimates are provided.
    Comment: 18 pages, A&A in press
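
    For the intrinsic-ellipticity term discussed above, the rms noise of a Gaussian-smoothed convergence map follows the standard estimate sigma_kappa = sigma_eps / sqrt(4 pi n theta_s^2). A minimal sketch, with function name and unit conventions as illustrative assumptions:

        # Hedged sketch: intrinsic-ellipticity noise of a Gaussian-smoothed
        # convergence (kappa) map.
        import numpy as np

        def intrinsic_noise_rms(sigma_eps, n_gal, theta_s):
            """sigma_eps: intrinsic ellipticity dispersion
            n_gal:     source density [galaxies / arcmin^2]
            theta_s:   Gaussian smoothing scale [arcmin]"""
            return sigma_eps / np.sqrt(4.0 * np.pi * n_gal * theta_s**2)

        # Example: sigma_eps = 0.3, 30 gal/arcmin^2, 0.5 arcmin smoothing
        print(intrinsic_noise_rms(0.3, 30.0, 0.5))   # ~0.031

    The Poisson and correlation terms analyzed in the paper add to this floor and depend on the smoothing operation, which is why the choice of estimator matters near cluster centers.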

    Weak Lensing Reconstruction and Power Spectrum Estimation: Minimum Variance Methods

    Large-scale structure distorts the images of background galaxies, which allows one to measure directly the projected distribution of dark matter in the universe and determine its power spectrum. Here we address the question of how to extract this information from the observations. We derive minimum variance estimators for the projected density reconstruction and its power spectrum and apply them to simulated data sets, showing that they agree well with the theoretical minimum variance expectations. The same estimator can also be applied to cluster reconstruction, where it remains a useful technique, although it is no longer optimal for every application. The method can be generalized to include nonlinear cluster reconstruction and photometric redshift information on background galaxies in the analysis. We also address the question of how to obtain the 3-d power spectrum directly from the weak lensing data. We derive a minimum variance quadratic estimator, which maximizes the likelihood function for the 3-d power spectrum and can be computed either from the measurements directly or from the 2-d power spectrum. The estimator correctly propagates the errors and provides a full correlation matrix of the estimates. It can be generalized to the case where the redshift distribution depends on the photometric properties of the galaxies, which allows one to measure both the 3-d power spectrum and its time evolution.
    Comment: revised version, 36 pages, AAS LaTeX, submitted to ApJ
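
    As a toy stand-in for the 2-d power spectrum step described above (not the paper's full minimum-variance machinery), here is a flat-sky sketch that bins the power of a gridded convergence map into band powers with mode-counting error bars; the Gaussian-variance scaling noted in the comments is what makes an inverse-variance combination of bands or maps minimum-variance.

        # Hedged sketch: azimuthally averaged band powers of a square kappa map.
        import numpy as np

        def band_powers(kappa, n_bins=10):
            """Band-power estimates and simple mode-counting errors."""
            n = kappa.shape[0]
            k_hat = np.fft.fft2(kappa) / n**2
            power = np.abs(k_hat)**2
            kx = np.fft.fftfreq(n)[np.newaxis, :]
            ky = np.fft.fftfreq(n)[:, np.newaxis]
            k = np.hypot(kx, ky)
            bins = np.linspace(0.0, k.max(), n_bins + 1)
            which = np.digitize(k.ravel(), bins) - 1
            p = power.ravel()
            pk = np.array([p[which == b].mean() if np.any(which == b) else 0.0
                           for b in range(n_bins)])
            counts = np.array([(which == b).sum() for b in range(n_bins)])
            # For ~Gaussian modes var(P_b) ~ 2 P_b^2 / N_b, so weighting by
            # N_b / P_b^2 is the minimum-variance combination.
            err = pk * np.sqrt(2.0 / np.maximum(counts, 1))
            return pk, err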

    On the Reverse Engineering of the Citadel Botnet

    Citadel is an advanced information-stealing malware family that targets financial information. It poses a real threat to the confidentiality and integrity of personal and business data. A joint operation was recently conducted by the FBI and the Microsoft Digital Crimes Unit to take down Citadel command-and-control servers. The operation caused some disruption in the botnet but has not stopped it completely. Because of its complex structure and advanced anti-reverse-engineering techniques, analyzing the Citadel malware is both challenging and time-consuming, which allows cyber criminals to carry on with their attacks while the analysis is still in progress. In this paper, we present the results of reverse engineering Citadel and provide additional insight into the functionality, inner workings, and open-source components of the malware. To accelerate the reverse engineering process, we propose a clone-based analysis methodology: Citadel is an offspring of the previously analyzed Zeus malware, so, using Zeus as a reference, we can measure and quantify the similarities and differences of the new variant. The methodology provides two types of code analysis techniques, namely assembly-to-source-code matching and binary clone detection, and can help reduce the number of functions requiring manual analysis. The analysis results show that the approach is promising for Citadel malware analysis, and the same approach is applicable to similar malware analysis scenarios.
    Comment: 10 pages, 17 figures. This is an updated / edited version of a paper that appeared in FPS 2013
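
    A minimal sketch of the clone-detection idea described above: fingerprint each disassembled function by hashing n-grams of its normalized mnemonic sequence, then score Citadel functions against a Zeus reference set so that only poorly matched functions need manual analysis. The input format, n-gram length, and Jaccard scoring are illustrative assumptions, not the authors' exact techniques.

        # Hedged sketch: n-gram fingerprints over normalized disassembly,
        # scored with Jaccard similarity (illustrative, not the paper's tools).
        import hashlib

        def normalize(instructions):
            """Keep mnemonics only, dropping operands (registers, immediates),
            so relocation and register-allocation differences do not break
            matching."""
            return [ins.split()[0].lower() for ins in instructions if ins.strip()]

        def fingerprint(instructions, n=4):
            """Set of hashed n-grams over the normalized mnemonic stream."""
            mnems = normalize(instructions)
            grams = {" ".join(mnems[i:i + n]) for i in range(len(mnems) - n + 1)}
            return {hashlib.sha1(g.encode()).hexdigest()[:16] for g in grams}

        def similarity(func_a, func_b, n=4):
            """Jaccard similarity between two functions' fingerprints."""
            fa, fb = fingerprint(func_a, n), fingerprint(func_b, n)
            if not fa or not fb:
                return 0.0
            return len(fa & fb) / len(fa | fb)

    A Citadel function scoring near 1.0 against some Zeus function is a likely clone and can be deprioritized during manual reverse engineering, which is the labor saving the methodology aims at.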