
    Lower mass normalization of the stellar initial mass function for dense massive early-type galaxies at z ~ 1.4

    This paper aims to understand whether the normalization of the stellar initial mass function (IMF) of massive early-type galaxies (ETGs) varies with cosmic time and/or with mean stellar mass density Sigma (= M*/2\pi Re^2). For this purpose we collected a sample of 18 dense (Sigma > 2500 M_sun/pc^2) ETGs at 1.2 < z < 1.6 with available velocity dispersions sigma_e. We constrained their mass normalization by comparing their true stellar masses (M_true), derived through the virial theorem and hence IMF independent, with those inferred from fits of the photometry assuming a reference IMF (M_ref). Adopting the virial estimator as a proxy for the true stellar mass amounts to assuming zero dark matter (DM) for these ETGs; however, dynamical models and numerical simulations of galaxy evolution have shown that the DM fraction within Re in dense high-z ETGs is negligible. We also considered the possible bias of the virial theorem in recovering total masses and show that for dense ETGs the virial masses agree with those derived through more sophisticated dynamical models. The variation of the parameter Gamma = M_true/M_ref with sigma_e shows that, on average, dense ETGs at <z> = 1.4 follow the same IMF-sigma_e trend as typical local ETGs, but with a lower mass normalization. Nonetheless, once the IMF-sigma_e trend we found for high-z dense ETGs is compared with that of local ETGs with similar Sigma and sigma_e, the two turn out to be consistent.
The similarity between the IMF-sigma_e trends of dense high-z and low-z ETGs over 9 Gyr of evolution, together with their lower mass normalization with respect to the mean value of local ETGs, suggests that, independently of formation redshift, the physical conditions characterizing the formation of a dense spheroid lead to a mass spectrum of newly formed stars with a higher ratio of high- to low-mass stars than the IMF of normal local ETGs.
Comment: 9 pages, 4 figures, accepted for publication in A&A, updated to match final journal version
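The quantities in this abstract combine into a short calculation: the virial mass M_vir = beta * sigma_e^2 * Re / G plays the role of the IMF-independent M_true, and Gamma = M_true/M_ref measures the mass normalization. A minimal sketch of that arithmetic (beta = 5.0 and the example numbers are illustrative assumptions, not the paper's calibration):

```python
import math

G = 4.301e-3  # gravitational constant [pc * M_sun^-1 * (km/s)^2]

def virial_mass(sigma_e, re_kpc, beta=5.0):
    """IMF-independent virial mass M_vir = beta * sigma_e^2 * Re / G [M_sun]."""
    return beta * sigma_e**2 * (re_kpc * 1e3) / G

def stellar_mass_density(m_star, re_kpc):
    """Mean stellar mass density Sigma = M* / (2 pi Re^2) [M_sun / pc^2]."""
    return m_star / (2 * math.pi * (re_kpc * 1e3) ** 2)

def gamma(sigma_e, re_kpc, m_ref):
    """Mass normalization Gamma = M_true / M_ref, with M_true = M_vir."""
    return virial_mass(sigma_e, re_kpc) / m_ref

# Illustrative dense ETG: sigma_e = 250 km/s, Re = 1 kpc,
# M_ref = 1e11 M_sun from a photometric fit with a reference IMF.
m_vir = virial_mass(250.0, 1.0)
print(stellar_mass_density(m_vir, 1.0) > 2500)  # satisfies the density cut
```

A Gamma below unity for such a galaxy would indicate a lighter mass normalization than the reference IMF, which is the sense of the trend reported here.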

    The population of early-type galaxies: how it evolves with time and how it differs from passive and late-type galaxies

    The aim of our analysis is twofold. On the one hand, we address whether a sample of morphologically selected ETGs differs, in terms of galaxy statistics, from a sample of passive galaxies. On the other hand, we study how the relative abundance of galaxies, the number density, and the stellar mass density change for different morphological types over the redshift range 0.6<z<2.5. From the 1302 galaxies brighter than Ks=22 selected from the GOODS-MUSIC catalogue, we classified ETGs on the basis of their morphology and passive galaxies on the basis of their sSFR. We show how the definition of a passive galaxy depends on the IMF adopted in the models and on the assumed sSFR threshold. We find that ETGs cannot be distinguished from the other morphological classes on the basis of their low sSFR, irrespective of the IMF adopted in the models. Using the sample of 1302 galaxies morphologically classified into spheroidal (ETGs) and non-spheroidal (LTGs) galaxies, we find that their fractions are constant over the redshift range 0.6<z<2.5 (20-30% ETGs vs 70-80% LTGs). However, at z<1 these fractions change among the population of the most massive (M*>=10^(11) M_sol) galaxies, with the fraction of massive ETGs rising to 40% and that of massive LTGs decreasing to 60%. Moreover, we find that the number density and the stellar mass density of the whole population of massive galaxies increase by a factor of ~10 over 0.6<z<2.5, with a faster increase of these densities for the ETGs than for the LTGs. Finally, we find that the number density of the highest-mass (M*>3-4x10^(11) M_sol) galaxies, both ETGs and LTGs, has not increased since z~2.5, in contrast to lower-mass galaxies. This suggests that the population of the most massive galaxies formed at z>2.5-3 and that the assembly of such high-mass galaxies is not effective at lower redshifts.
Comment: 15 pages, 14 figures. Published in A&A
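The abundance statistics in this study reduce to counts over a classified catalogue: fractions per morphological type, and comoving number densities above a mass cut. A toy sketch of those two operations (the sample values and survey volume below are made up for illustration; they are not the GOODS-MUSIC numbers):

```python
def morphological_fractions(types):
    """Fractions of ETGs and LTGs in a morphologically classified sample."""
    n_etg = sum(t == "ETG" for t in types)
    n = len(types)
    return {"ETG": n_etg / n, "LTG": (n - n_etg) / n}

def number_density(masses, m_min, volume_mpc3):
    """Comoving number density [Mpc^-3] of galaxies with M* >= m_min."""
    return sum(m >= m_min for m in masses) / volume_mpc3

# Toy redshift slice: 4 classified galaxies in a 1e5 Mpc^3 comoving volume.
sample_types = ["ETG", "LTG", "LTG", "LTG"]
sample_masses = [2e11, 5e10, 1.2e11, 8e9]  # M_sun
print(morphological_fractions(sample_types))     # {'ETG': 0.25, 'LTG': 0.75}
print(number_density(sample_masses, 1e11, 1e5))  # 2e-05
```

Repeating these counts per redshift bin, per mass threshold, yields the evolution of fractions and densities described above.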

    A fast-Monte Carlo toolkit on GPU for treatment plan dose recalculation in proton therapy

    In the context of particle therapy, a crucial role is played by Treatment Planning Systems (TPSs), tools aimed at computing and optimizing the treatment plan. Nowadays one of the major issues related to TPSs in particle therapy is the large CPU time needed. We developed a software toolkit (FRED) for reducing dose recalculation time by exploiting Graphics Processing Unit (GPU) hardware. Thanks to their high parallelization capability, GPUs significantly reduce the computation time, by up to a factor of 100 with respect to software running on a standard CPU. The transport of proton beams in the patient is accurately described through Monte Carlo methods. The physical processes reproduced are multiple Coulomb scattering, energy straggling, and nuclear interactions of protons with the main nuclei composing biological tissues. The FRED toolkit does not rely on the water-equivalent translation of tissues, but exploits the Computed Tomography anatomical information by reconstructing and simulating the atomic composition of each crossed tissue. FRED can be used as an efficient tool for dose recalculation on the day of the treatment. In fact, in about one minute on standard hardware, it can provide the dose map obtained by combining the treatment plan, earlier computed by the TPS, with the current patient anatomic arrangement.
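Of the physical processes listed, multiple Coulomb scattering has a compact closed form: the Highland parameterization gives the RMS scattering angle per transport step. A toy 2D step under that formula (a sketch for illustration only; FRED's actual GPU kernels and material model are more involved):

```python
import math
import random

def highland_theta0(p_mev_c, beta, x_over_x0, z=1):
    """Highland RMS multiple-Coulomb-scattering angle [rad] for a particle
    of charge z, momentum p [MeV/c], and speed beta crossing x/X0
    radiation lengths of material."""
    return (13.6 / (beta * p_mev_c)) * z * math.sqrt(x_over_x0) * (
        1.0 + 0.038 * math.log(x_over_x0)
    )

def scatter_step(theta, p_mev_c, beta, step_x_over_x0, rng=random):
    """One Monte Carlo transport step: add a Gaussian angular deflection
    with the Highland sigma to the current direction angle theta."""
    return theta + rng.gauss(0.0, highland_theta0(p_mev_c, beta, step_x_over_x0))

# 150 MeV proton (p ~ 551 MeV/c, beta ~ 0.51) crossing 1% of a
# radiation length: the RMS deflection is a few milliradians.
theta0 = highland_theta0(551.0, 0.51, 0.01)
```

Iterating such steps through the CT-derived material map, together with energy straggling and nuclear interactions, is what accumulates into the dose map.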

    The perception of lexical tone contrasts in Cantonese children with and without Specific Language Impairment (SLI)

    Purpose: This study examined the perception of fundamental frequency (f0) patterns by Cantonese children with and without specific language impairment (SLI). Method: Participants were 14 five-year-old children with SLI, and 14 age-matched (AM) and 13 four-year-old vocabulary-matched (VM) controls. The children identified a word from familiar word pairs that illustrated the 8 minimally contrastive pairs of the 6 lexical tones. They discriminated the f0 patterns within contrastive tonal pairs in speech and nonspeech stimuli. Results: In tone identification, the SLI group performed worse than the AM group but not the VM group. In tone discrimination, the SLI group did worse than the AM group on 2 contrasts and showed a nonsignificant trend of poorer performance on all contrasts combined. The VM group generally did worse than the AM group. There were no group differences in discrimination performance between speech and nonspeech stimuli. No correlation was found between identification and discrimination performance. Only the normal controls showed a moderate correlation between vocabulary scores and performance in the 2 perception tasks. Conclusion: The SLI group's poor tone identification cannot be accounted for by vocabulary knowledge alone. The group's tone discrimination performance suggests that some children with SLI have a deficit in f0 processing. © American Speech-Language-Hearing Association.

    User study on 3D multitouch interaction (3DMi) and gaze on surface computing

    On a multitouch table, users' interactions with 3D virtual representations of real objects are influenced by the task and by the objects' perceived physical characteristics. This article describes the development and user study of an interactive 3D application that allows users to explore virtual heritage objects on a surface device. To date, most multitouch research has focused on 2D or 2.5D systems. We report a user study in which we analyse users' multimodal behaviour: specifically, how they interact on a surface device with objects that have similar properties to their physical versions, and the gaze patterns associated with touch. The study reveals that gaze characteristics differ according to interaction intention, in terms of the position and duration of visual attention. We found that virtual objects afford the perception of haptic attributes ascribed to their equivalent physical objects, and that the summary statistics of gaze showed consistent characteristics between people and differences between natural and task-based activities. An awareness of user behaviours with natural gestures can inform the design of interactive 3D applications that complement the user's model of past experience with physical objects and with GUI interaction.

    Ontology-Driven Food Category Classification in Images

    The self-management of chronic diseases related to dietary habits requires tracking what people eat. Most of the approaches proposed in the literature classify food pictures by labels describing the whole recipe. The main drawback of this kind of strategy is that a wrong prediction of the recipe leads to a wrong prediction of every ingredient of that recipe. In this paper we present a multi-label food classification approach, exploiting deep neural networks, in which each food picture is classified with labels describing the food categories of the ingredients of its recipe. The aim of our approach is to support the detection of food categories in order to identify which ones might be dangerous for a user affected by a chronic disease. Our approach relies on background knowledge in which recipes, food categories, and their relatedness with chronic diseases are modeled within a state-of-the-art ontology. Experiments conducted on a new publicly released dataset demonstrated the effectiveness of the proposed approach with respect to state-of-the-art classification strategies.
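The mechanics of this approach can be sketched in a few lines: an ontology maps each recipe to the food categories of its ingredients, training targets become binary vectors over categories, and predicted categories are intersected with the categories risky for a given user. A minimal sketch (the recipes, categories, and risk sets below are hypothetical stand-ins, not entries from the paper's ontology):

```python
# Hypothetical recipe -> food-category mapping standing in for the ontology.
ONTOLOGY = {
    "carbonara": {"pasta", "egg", "cured_meat", "cheese"},
    "margherita": {"bread", "cheese", "vegetable"},
}
CATEGORIES = sorted({c for cats in ONTOLOGY.values() for c in cats})

def multilabel_target(recipe):
    """Binary multi-label target over food categories for one recipe."""
    cats = ONTOLOGY[recipe]
    return [1 if c in cats else 0 for c in CATEGORIES]

def flag_risky(predicted, risky):
    """Categories predicted in the picture that are risky for this user."""
    present = {CATEGORIES[i] for i, p in enumerate(predicted) if p}
    return present & risky

target = multilabel_target("carbonara")
print(flag_risky(target, {"egg", "fish"}))  # {'egg'}
```

Because the network predicts categories directly, a partially wrong recipe prediction no longer discards every ingredient at once, which is the advantage claimed over whole-recipe labels.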

    tDCS changes in motor excitability are specific to orientation of current flow

    BACKGROUND: Measurements and models of current flow in the brain during transcranial Direct Current Stimulation (tDCS) indicate stimulation of regions in between the electrodes. Moreover, the folded cortex produces local fluctuations in current flow intensity and direction, and animal studies suggest that the direction of current flow relative to cortical columns determines the response to tDCS. METHODS: Here we test this idea by using Transcranial Magnetic Stimulation Motor Evoked Potentials (TMS-MEP) to measure changes in corticospinal excitability following tDCS applied with electrodes aligned orthogonal (across) or parallel to M1 in the central sulcus. RESULTS: Current flow models predicted that the orthogonal electrode montage produces a consistently oriented current across the hand region of M1 that flows along cortical columns, while the parallel electrode montage produces non-uniform current directions across the M1 cortical surface. We find that orthogonally, but not parallel, oriented tDCS modulates TMS-MEPs. We also show that modulation is sensitive to the orientation of the TMS coil (PA or AP), which is thought to select different afferent pathways to M1. CONCLUSIONS: Our results are consistent with tDCS producing directionally specific neuromodulation in brain regions between the electrodes, but show nuanced changes in excitability that are presumably specific to the current direction relative to cortical columns and axon pathways. We suggest that the direction of current flow through cortical target regions should be considered for the targeting and dose control of tDCS.
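The directional argument can be made concrete with a small vector decomposition: what the orthogonal montage keeps uniform is the component of the local current density along the cortical-column axis. A toy sketch (the vectors are arbitrary examples; the study's models compute current flow on a realistic head model):

```python
import math

def column_component(j_vec, column_axis):
    """Component of the current-density vector along the cortical-column
    axis. Large values mean current flows along the columns; near-zero
    values mean it flows tangentially across them."""
    dot = sum(a * b for a, b in zip(j_vec, column_axis))
    norm = math.sqrt(sum(b * b for b in column_axis))
    return dot / norm

# Current aligned with the column axis vs. tangential to it.
print(column_component((0.0, 0.0, 2.0), (0.0, 0.0, 1.0)))  # 2.0
print(column_component((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # 0.0
```

Under this picture, a montage that yields the same sign and magnitude of this component across the target region should produce the more consistent neuromodulation.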

    Multiple isolated aneurysms in a case of “burned out” Takayasu aortitis

    Takayasu aortitis (TA) is a chronic inflammatory disease predominantly seen in young Asian women. The disease is idiopathic and largely affects the aorta and its major branches. The basic pathologic changes in TA are fibrosis and subsequent occlusion of the large arteries. TA is classically termed "pulseless" disease, with manifestations during the occlusive stage including limb ischemia, renovascular hypertension, and heart failure. Arterial dilation and aneurysm are largely unappreciated manifestations of TA, but they occur in as many as 32% of affected patients. We report chronic "burned out" TA in a 23-year-old Hispanic woman with isolated aneurysms of the descending thoracic aorta, abdominal aorta, and common iliac arteries, without occlusive disease. (J Vasc Surg 2003;37:1094-7.)
