
    Improving the accuracy of mass reconstructions from weak lensing: local shear measurements

    Several options are available for measuring the shear from observations in the context of weak lensing. Here we introduce new methods in which the isotropy assumption for the distribution of the source galaxies is implemented directly on the observed quadrupole moments. A quantitative analysis of the error associated with the finite number of source galaxies and with their ellipticity distribution is provided, applicable even when the shear is not weak. Monte Carlo simulations based on a realistic sample of source galaxies show that our procedure generally leads to errors ~30% smaller than those associated with the standard method of Kaiser and Squires (1993). Comment: 9 pages and 3 PostScript figures, uses A&A TeX macros. To be published in A&A.
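
    A minimal sketch, in Python, of the generic quantities the abstract refers to: the brightness-weighted quadrupole moments of a galaxy image, the complex ellipticity built from them, and the weak-limit shear estimate obtained by averaging over sources. These are the standard weak-lensing relations only, not the paper's new estimator; all function names are ours.

        import numpy as np

        def quadrupole_moments(image):
            # Brightness-weighted second moments about the image centroid.
            y, x = np.indices(image.shape, dtype=float)
            flux = image.sum()
            xc = (image * x).sum() / flux
            yc = (image * y).sum() / flux
            dx, dy = x - xc, y - yc
            q11 = (image * dx * dx).sum() / flux
            q22 = (image * dy * dy).sum() / flux
            q12 = (image * dx * dy).sum() / flux
            return q11, q22, q12

        def complex_ellipticity(q11, q22, q12):
            # chi = (Q11 - Q22 + 2i Q12) / (Q11 + Q22)
            return (q11 - q22 + 2j * q12) / (q11 + q22)

        def mean_shear_estimate(chis):
            # For isotropically oriented sources, <chi> ~= 2 g in the weak
            # limit, so averaging over many galaxies removes the intrinsic
            # ellipticity and leaves an estimate of the shear g.
            return np.mean(chis) / 2.0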

    Improving the accuracy of mass reconstructions from weak lensing: from the shear map to the mass distribution

    In this paper we provide a statistical analysis of the parameter-free method often used in weak lensing mass reconstructions. It is found that a proper assessment of the errors involved in such a non-local analysis requires the study of the relevant two-point correlation functions. After calculating the two-point correlation function for the reduced shear, we determine the expected error on the inferred mass distribution and on other related quantities, such as the total mass, and derive the error power spectrum. This allows us to optimize the reconstruction method with respect to the kernel used in the inversion procedure. In particular, we find that curl-free kernels are bound to lead to more accurate mass reconstructions. Our analytical results clarify the arguments and the numerical simulations of Seitz & Schneider (1996). Comment: 11 pages and 2 PostScript figures, uses A&A TeX macros. Submitted to A&A. Changed content.
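
    For context, a sketch of the standard Kaiser & Squires (1993) Fourier-space kernel inversion whose error properties analyses of this kind address; this is a generic implementation of the well-known formula, not code from the paper.

        import numpy as np

        def kaiser_squires(gamma1, gamma2):
            # Fourier-space inversion: kappa_hat = D1*g1_hat + D2*g2_hat,
            # with D1 = (l1^2 - l2^2)/l^2 and D2 = 2*l1*l2/l^2.
            ny, nx = gamma1.shape
            l1 = np.fft.fftfreq(nx)[np.newaxis, :]
            l2 = np.fft.fftfreq(ny)[:, np.newaxis]
            l_sq = l1**2 + l2**2
            l_sq[0, 0] = 1.0        # avoid 0/0 at the zero mode
            kappa_hat = ((l1**2 - l2**2) * np.fft.fft2(gamma1)
                         + 2.0 * l1 * l2 * np.fft.fft2(gamma2)) / l_sq
            # The mean of kappa is unconstrained (mass-sheet degeneracy).
            kappa_hat[0, 0] = 0.0
            return np.fft.ifft2(kappa_hat).real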

    A fast direct method of mass reconstruction for gravitational lenses

    Statistical analyses of observed galaxy distortions are often used to reconstruct the mass distribution of an intervening cluster responsible for gravitational lensing. In current projects, distortions of thousands of source galaxies have to be handled efficiently; much larger databases and more massive investigations are envisaged for new major observational initiatives. In this article we present an efficient mass reconstruction procedure, a direct method that solves a variational principle noted in an earlier paper and that, for rectangular fields, reduces the relevant execution time by a factor of 100 to 1000 with respect to the fastest methods currently used, so that for grid numbers N = 400 the required CPU time on a good workstation can be kept on the order of 1 second. The acquired speed also opens the way to long-term projects based on simulated observations (addressing statistical or cosmological questions) that would at present be practically unviable for intrinsically slow reconstruction methods. Comment: 6 pages, 2 figures. Uses A&A macros. Accepted for publication in A&A.
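
    As a generic illustration of why direct methods on rectangular grids can be this fast (the paper's own variational algorithm is not reproduced here): an elliptic equation such as the one relating the lensing potential to the convergence, del^2 psi = 2 kappa, can be solved directly in O(N^2 log N) operations by diagonalising the discrete Laplacian with fast sine transforms.

        import numpy as np
        from scipy.fft import dstn, idstn

        def poisson_direct(rhs, h=1.0):
            # Solve del^2 psi = rhs on a rectangle with psi = 0 on the
            # boundary: DST-I diagonalises the 5-point Laplacian, so the
            # solve is one forward transform, a pointwise division by the
            # eigenvalues, and one inverse transform.
            ny, nx = rhs.shape
            j = np.arange(1, ny + 1)[:, None]
            k = np.arange(1, nx + 1)[None, :]
            lam = (2.0 * np.cos(np.pi * j / (ny + 1)) - 2.0
                   + 2.0 * np.cos(np.pi * k / (nx + 1)) - 2.0) / h**2
            return idstn(dstn(rhs, type=1) / lam, type=1)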

    Looking at the Fundamental Plane through a gravitational lens

    We consider the Fundamental Plane of elliptical galaxies lensed by the gravitational field of a massive deflector (typically, a cluster of galaxies). We show that the Fundamental Plane relation provides a straightforward measurement of the projected mass distribution of the lens with a typical accuracy of ~0.15 in the dimensionless column density kappa. The proposed technique breaks the mass-sheet degeneracy completely and is thus expected to serve as an important complement to other lensing-based analyses. Moreover, its ability to measure directly the mass distribution on the small pencil beams that characterize the size of background galaxies may lead to crucial tests for current scenarios of structure formation. Comment: ApJL, in press.
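
    A back-of-envelope illustration of why a size standard like the Fundamental Plane breaks the mass-sheet degeneracy: it fixes the intrinsic effective radius, so comparing it with the lensed image size gives the magnification mu and hence kappa directly. The snippet assumes the shear term is negligible and is a hypothetical example, not the paper's actual procedure.

        import math

        def kappa_from_magnification(mu):
            # mu = 1 / ((1 - kappa)^2 - |gamma|^2); neglecting the shear
            # term, the sub-critical solution is kappa = 1 - 1/sqrt(mu).
            return 1.0 - 1.0 / math.sqrt(mu)

        print(kappa_from_magnification(1.8))   # mu = 1.8 -> kappa ~ 0.25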

    Weak lensing and cosmology

    Recently, it has been shown that it is possible to reconstruct the projected mass distribution of a cluster from weak lensing provided that both the geometry of the universe and the probability distribution of galaxy redshifts are known; actually, when additional photometric data are taken to be available, the galaxy redshift distribution could be determined jointly with the cluster mass from the weak lensing analysis. In this paper we develop, in the spirit of a ``thought experiment,'' a method to constrain the geometry of the universe from weak lensing, provided that the redshifts of the source galaxies are measured. The quantitative limits and merits of the method are discussed analytically and with a set of simulations, in relation to point estimation, interval estimation, and test of hypotheses for homogeneous Friedmann-Lemaitre models. The constraints turn out to be significant when a few thousand source galaxies are used.Comment: 17 pages, 8 figures. Uses A&A LaTeX style. Accepted for pubblication by A&A. Several changes made: new model for the lens; Sect. 7 and App. A. adde
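
    A sketch of the geometric quantity that source redshifts constrain in such an analysis: the lensing efficiency, i.e. the distance ratio D_ls/D_s, computed here for a flat Friedmann-Lemaitre model. Illustrative only, with parameter choices of our own.

        import numpy as np
        from scipy.integrate import quad

        def comoving_distance(z, omega_m):
            # Dimensionless comoving distance (units of c/H0), flat model.
            integrand = lambda zp: 1.0 / np.sqrt(
                omega_m * (1.0 + zp)**3 + 1.0 - omega_m)
            return quad(integrand, 0.0, z)[0]

        def lensing_efficiency(z_lens, z_source, omega_m):
            # D_ls / D_s reduces to (chi_s - chi_l) / chi_s when flat.
            chi_l = comoving_distance(z_lens, omega_m)
            chi_s = comoving_distance(z_source, omega_m)
            return (chi_s - chi_l) / chi_s

        # The ratio shifts with cosmology, which is what measured source
        # redshifts allow one to constrain.
        for om in (0.3, 1.0):
            print(om, lensing_efficiency(0.3, 1.0, om))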

    2-Hydroxyglutarate as a biomarker in glioma patients

    Background: mutation of the IDH1 gene is a prognostic factor and a diagnostic hallmark of gliomas. The mutant IDH1 enzyme can convert α-KG into 2-hydroxyglutarate (2HG), and mutated gliomas have elevated amounts of intracellular 2HG. Since 2HG is a small molecule, it could plausibly reach the systemic circulation and be excreted in urine. We therefore analyzed 2HG concentrations in the plasma and urine of glioma patients to identify a surrogate biomarker of IDH1 gene mutation. Materials and Methods: all patients had a prior histological confirmation of glioma and a recent brain MRI (within 2 weeks) showing the neoplastic lesions. The exclusion criteria were any chemotherapy within the prior 28 days and other neoplastic or metabolic diseases. Plasma and urine samples were taken from all patients and 2HG concentrations were determined by liquid chromatography-tandem mass spectrometry; exon 4 of the IDH1 gene was analyzed by Sanger sequencing; differences in metabolite concentrations between mutant and wild-type IDH1 patients were examined with the Mann-Whitney U test for non-parametric data, and Student's t-test was used to compare parametric data. A ROC curve was used to evaluate the cut-off value of the 2HG biomarker. Results: 84 patients were enrolled: 38 IDH1-mutant and 46 IDH1 wild-type. All the mutations were R132H. Among patients with mutant IDH1 there were 21 high-grade gliomas (HGG) and 17 low-grade gliomas (LGG); among patients with wild-type IDH1 there were 35 HGG and 11 LGG. In all patients we analyzed the mean 2HG concentration in plasma (P_2HG), in urine (U_2HG), and the ratio between P_2HG and U_2HG (R_2HG). We found a significant difference in R_2HG between glioma patients with and without IDH1 mutation (22.2 versus 15.6, respectively, p<0.0001). The optimal cut-off value of R_2HG to distinguish glioma patients with and without IDH mutation was 19 (sensitivity 63%, specificity 76%, accuracy 70%); in HGG patients only, the optimal cut-off value was 20 (sensitivity 76%, specificity 89%, accuracy 84%, positive predictive value 80%, negative predictive value 86%). No associations between the grade or size of the tumor and R_2HG were found. In 7 patients with high-grade gliomas we found a correlation between the R_2HG value and response to treatment. Conclusions: by analyzing R_2HG, derived from individual plasma and urine 2HG levels, it is possible to discriminate glioma patients with and without IDH mutation, in particular in high-grade gliomas. Larger samples need to be analyzed to investigate this method in patient follow-up for recurrence detection and to monitor treatment efficacy.
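
    A sketch of the reported statistical pipeline run on synthetic numbers (no real patient data): a Mann-Whitney U test on the plasma/urine ratio R_2HG, followed by a ROC curve with a Youden-index cut-off. The group means and sizes are taken from the abstract; the spread is an assumption.

        import numpy as np
        from scipy.stats import mannwhitneyu
        from sklearn.metrics import roc_curve, auc

        rng = np.random.default_rng(0)
        r2hg_mut = rng.normal(22.2, 5.0, 38)   # synthetic IDH1-mutant ratios
        r2hg_wt = rng.normal(15.6, 5.0, 46)    # synthetic wild-type ratios

        u_stat, p_value = mannwhitneyu(r2hg_mut, r2hg_wt,
                                       alternative="two-sided")

        labels = np.r_[np.ones(38), np.zeros(46)]
        scores = np.r_[r2hg_mut, r2hg_wt]
        fpr, tpr, thresholds = roc_curve(labels, scores)
        best = np.argmax(tpr - fpr)            # Youden's J -> optimal cut-off
        print(f"p = {p_value:.2g}, AUC = {auc(fpr, tpr):.2f}, "
              f"cut-off = {thresholds[best]:.1f}")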

    Carcinogen and mutagen risk assessment for workers potentially exposed in the research laboratories of “Sapienza” University of Rome, for health surveillance purposes

    This work evaluates, through the use of algorithms, the risk factors for the health of exposed workers arising from the handling of carcinogenic and mutagenic substances. In workplaces such as research laboratories, a theoretical and practical methodology (an algorithm) that allows a "timely" exposure assessment is the more suitable approach. The methodology developed and used here can determine the level of exposure risk due to a single agent and/or to multiple agents. The algorithm yielded an exposure level above 1 for formaldehyde (Lcanc = 1.32), while for acrylamide it yielded a level below 1 (Lcanc = 0.528). Although the overall exposure level of the studied workers is above 1 (Lcanc = 1.848), the Occupational Medicine Centre of "Sapienza" - University of Rome, in agreement with the position taken by the Italian Society of Occupational Medicine and Industrial Hygiene, applies health surveillance even in the presence of a merely potential health risk, including it among the general measures protecting the health and safety of workers.
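
    The reported figures are consistent with a simple additive combination of per-agent levels (1.32 + 0.528 = 1.848). A minimal sketch of that logic follows, with the caveat that the abstract does not spell out the actual algorithm or its weighting factors, so the combination rule is an inference.

        # Per-agent exposure levels taken from the abstract; the additive
        # combination rule below is an assumption inferred from the numbers.
        AGENT_LEVELS = {"formaldehyde": 1.32, "acrylamide": 0.528}
        THRESHOLD = 1.0

        for agent, lcanc in AGENT_LEVELS.items():
            status = "above" if lcanc > THRESHOLD else "below"
            print(f"{agent}: Lcanc = {lcanc} ({status} threshold)")

        overall = sum(AGENT_LEVELS.values())   # 1.848
        if overall > THRESHOLD:
            print(f"overall Lcanc = {overall:.3f} -> surveillance applied")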

    An ERP study of low and high relevance semantic features

    It is believed that the N400 elicited by concepts belonging to the Living category is larger than the N400 to Non-living concepts. This is considered evidence that concepts are organized in the brain on the basis of categories. We conducted a feature-verification experiment in which Living and Non-living concepts were matched for the relevance of their semantic features. Relevance is a measure of the contribution of semantic features to the "core" meaning of a concept. We found that when relevance is low the N400 is large. In addition, we found that when the two categories of Living and Non-living are equated for relevance, the seeming category effect at the behavioral and neural levels disappears. In sum, the N400 is sensitive to semantic features rather than to categories, showing that previously reported effects of semantic categories may arise as a consequence of the differing relevance of concepts belonging to the Living and Non-living categories.

    Feature type effects in semantic memory: an event-related potentials study

    It is believed that the N400 elicited by concepts belonging to the Living category is larger than the N400 to Objects. This is considered evidence that concepts are organized in the brain on the basis of categories. Similarly, differential N400 responses to sensory and non-sensory semantic features have been taken as evidence for a neural organisation of conceptual memory based on semantic features. We conducted a feature-verification experiment in which Living and Non-living concepts, described by sensory and non-sensory features, were matched for age of acquisition, typicality, familiarity, and the relevance of their semantic features. Relevance is a measure of the contribution of semantic features to the "core" meaning of a concept. We found that when relevance is low the N400 is larger. In addition, we found that when the two categories of Living and Non-living concepts are matched for relevance, the seeming category effect at the neural level disappears. No difference between sensory and non-sensory descriptions was detected when relevance was matched either. In sum, the N400 does not differ between categories or feature types. Previously reported effects of semantic category and feature type may have arisen as a consequence of the differing relevance of concepts belonging to the Living and Non-living categories.

    Managing electrical risk in construction site activities

    Analysing occupational accidents in the full complexity of the factors that generate them, it emerges that recurrent critical elements, often induced by procedural errors and/or omissions, condition the hazardousness of work activities, especially with reference to electrical risk and in the context of construction site activities. It is also evident that, in general, achieving a "safe" work process is an expected performance requiring skills, investment and dedication, which are more often found in large enterprises than in small and medium ones, in mature workers rather than in the youngest or oldest ones, and in national workforces more than in foreign ones. These statistical findings provide a first set of indications useful for characterising the risk of a specific work activity and managing its effects by designing appropriate prevention systems, especially where coordination is particularly relevant, since relationships between the parties involved are more often aimed at establishing boundaries of responsibility than at outlining common prevention policies. For such predictive purposes, occupational accidents are frequently interpreted through rather informal characterisations. A preliminary objective of the analysis presented here is therefore to formalise the description of accident events, available in sector databases as detailed descriptive records, into algebraic cases representable in the space R^n of the determinants (accident causes), so that descriptive statistical treatments can be applied to obtain homogeneous and predictive information. This explanatory capability in turn allows multivariate statistical analysis techniques to be applied to samples stratified by accident mode, with the aim of identifying recurrent accident patterns in support of the prevention of total risk (the expected value of the harm) and the management of residual risk. On the basis of the collection of fatal accidents attributable to electrical risk available in the Infor.MO database, which effectively constitutes a historical series of the observed phenomenon, the accident cases were then aggregated using cluster analysis and multiple correspondence analysis (MCA) techniques which, with reference to the prodromal causes of the hazard flow, can highlight peculiar and recurrent accident modes, i.e. preferential geneses, explanatory of the recorded accident phenomenon and predictive of its future realisations; a sketch of this formalisation follows. In particular, by applying these models to the fatal accidents in the AtEco F3 sector related to "electrical risk", prevention and/or protection interventions can be targeted rationally, with a view to maximum managerial efficiency.
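
    A sketch of the formalisation described above: each fatal-accident record becomes a vector in R^n over its determinants, and the cases are then grouped to surface recurrent accident modes. One-hot encoding plus PCA and k-means stand in here for the MCA and cluster analysis actually applied to the Infor.MO records; all field names and values are hypothetical.

        import pandas as pd
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        # Hypothetical records: one row per fatal accident, with
        # categorical determinants (accident causes).
        records = pd.DataFrame({
            "contact_mode": ["direct", "arc", "direct",
                             "indirect", "arc", "direct"],
            "missing_protection": ["yes", "yes", "no", "yes", "no", "yes"],
            "activity": ["maintenance", "construction", "construction",
                         "maintenance", "construction", "maintenance"],
        })

        X = pd.get_dummies(records).astype(float)       # cases in R^n
        factors = PCA(n_components=2).fit_transform(X)  # factor space
        clusters = KMeans(n_clusters=2, n_init=10).fit_predict(factors)
        print(clusters)                                 # recurrent "modes"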