
    The universal power spectrum of Quasars in optical wavelengths: Break timescale scales directly with both black hole mass and accretion rate

    Aims: Establish the dependence of variability properties, such as characteristic timescales and variability amplitude, on basic quasar parameters such as black hole mass and accretion rate, controlling for the rest-frame wavelength of emission. Methods: Using large catalogs of quasars, we selected the g-band light curves of 4770 objects from the Zwicky Transient Facility archive. All selected objects fall into a narrow redshift bin, 0.6 < z < 0.7, but cover a wide range of accretion rates in Eddington units (R_Edd) and black hole masses (M). We grouped these objects into 26 independent bins according to these parameters, calculated low-resolution g-band variability power spectra for each bin, and approximated the power spectra with a simple analytic model that features a break at a timescale t_b. Results: We found a clear dependence of the break timescale t_b on R_Edd, on top of the known dependence of t_b on the black hole mass M. In our fits, t_b ∝ M^(0.65–0.55) R_Edd^(0.35–0.3), where the ranges in the exponents correspond to the best-fitting parameters of different power spectrum models. Scaling t_b to the orbital timescale of the innermost stable circular orbit (ISCO), t_ISCO, results approximately in t_b/t_ISCO ∝ (R_Edd/M)^0.35. The observed values of t_b are ~10 times longer than the orbital timescale at the light-weighted average radius of the disc region emitting in the (observer-frame) g-band. The different scaling of the break frequency with M and R_Edd shows that the shape of the variability power spectrum cannot be solely a function of the quasar luminosity, even for a single rest-frame wavelength. Finally, the best-fitting models have slopes above the break in the range -2.5 to -3. A slope of -2, as in damped random walk models, fits the data significantly worse. Comment: Accepted for publication in A&A.
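    Since the abstract quotes only the proportionality t_b ∝ M^(0.65–0.55) R_Edd^(0.35–0.3) and no absolute normalization, the fitted relation can only be sketched as a ratio between two objects. The midpoint exponents a = 0.6 and b = 0.325 below are an assumption within the quoted ranges, not values from the paper:

    ```python
    # Relative break-timescale scaling implied by t_b ∝ M^a * R_Edd^b.
    # The exponent ranges quoted in the abstract are a ≈ 0.55-0.65 and
    # b ≈ 0.3-0.35, depending on the power-spectrum model; midpoints
    # are used here as an illustrative assumption.

    def tb_ratio(m1, redd1, m2, redd2, a=0.6, b=0.325):
        """Ratio t_b(object 1) / t_b(object 2) for black hole masses
        m1, m2 (same units) and Eddington ratios redd1, redd2."""
        return (m1 / m2) ** a * (redd1 / redd2) ** b

    # A 10x more massive quasar at the same Eddington ratio has a break
    # timescale longer by a factor 10**0.6, i.e. about 4.
    print(round(tb_ratio(1e9, 0.1, 1e8, 0.1), 2))  # → 3.98
    ```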

    Partial order label decomposition approaches for melanoma diagnosis

    Melanoma is a type of cancer that develops from the pigment-containing cells known as melanocytes. It usually occurs on the skin, and early detection and diagnosis are strongly related to survival rates. Melanoma recognition is a challenging task that is nowadays performed by well-trained dermatologists, who may nevertheless produce varying diagnoses due to the task's complexity. This motivates the development of automated diagnosis tools, in spite of the inherent difficulties (intra-class variation and visual similarity between melanoma and non-melanoma lesions, among others). In the present work, we propose a system combining image analysis and machine learning to detect melanoma presence and severity. The severity is assessed in terms of melanoma thickness, which is measured by the Breslow index. Previous works mainly focus on the binary problem of detecting the presence of melanoma. However, the system proposed in this paper goes a step further by also considering the stage of the lesion in the classification task. To do so, we extract 100 features that describe the shape, colour, pigment network and texture of the benign and malignant lesions. The problem is tackled as a five-class classification problem, where the first class represents benign lesions and the remaining four classes represent the different stages of the melanoma (via the Breslow index). Based on the problem definition, we identify the learning setting as a partial order problem, in which the patterns belonging to the different melanoma stages present an order relationship, but where there is no order arrangement with respect to the benign lesions. Under this assumption about the class topology, we design several proposals to exploit this structure and improve data preprocessing.
In this sense, we experimentally demonstrate that the proposals exploiting the partial order assumption achieve better performance than 12 baseline nominal and ordinal classifiers (including a deep learning model) that do not consider this partial order. To deal with class imbalance, we additionally propose specific over-sampling techniques that consider the structure of the problem for the creation of synthetic patterns. The experimental study is carried out with clinician-curated images from the Interactive Atlas of Dermoscopy, which eases the reproducibility of experiments. Concerning the results obtained, in spite of having augmented the complexity of the classification problem with more classes, the performance of our proposals in the binary problem is similar to that reported in the literature.
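    A partial-order label decomposition of this kind can be sketched as follows. This is a hypothetical illustration only: labels 0 = benign and 1–4 = ordered Breslow stages are assumed, and the paper's actual decompositions are not specified in the abstract:

    ```python
    # Hypothetical partial-order decomposition of the 5-class label space:
    # benign (0) has no order relation to the melanoma stages (1..4),
    # while the stages themselves are totally ordered.

    def decompose(label):
        """Map one 5-class label to binary targets:
        - is_melanoma: benign vs. any melanoma stage;
        - stage_gt_k: cumulative ordinal targets 'stage > k' over the
          ordered stages, defined only for melanoma samples (benign
          carries no stage order)."""
        is_melanoma = int(label > 0)
        stage_gt_k = [int(label > k) for k in (1, 2, 3)] if label > 0 else None
        return is_melanoma, stage_gt_k

    print(decompose(0))  # → (0, None)
    print(decompose(3))  # → (1, [1, 1, 0])
    ```

    One binary classifier would then handle benign vs. melanoma, and one classifier per cumulative target would recover the stage, respecting the partial order instead of treating the five classes as nominal.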

    Nuclear astrophysics with radioactive ions at FAIR

    R. Reifarth et al.; 12 pages; 9 figures; Open Access funded by Creative Commons Attribution Licence 3.0; Nuclear Physics in Astrophysics VI (NPA6). The nucleosynthesis of elements beyond iron is dominated by neutron captures in the s and r processes. However, 32 stable, proton-rich isotopes cannot be formed during those processes, because they are shielded from the s-process flow and from the r-process β-decay chains. These nuclei are attributed to the p and rp processes. For all those processes, current research in nuclear astrophysics addresses the need for more precise reaction data involving radioactive isotopes. Depending on the particular reaction, direct or inverse kinematics and the forward or time-reversed direction are investigated to determine, or at least constrain, the desired reaction cross sections. The Facility for Antiproton and Ion Research (FAIR) will offer unique, unprecedented opportunities to investigate many of the important reactions. The high yield of radioactive isotopes, even far away from the valley of stability, allows the investigation of isotopes involved in processes as exotic as the r or rp processes. This project was supported by the HGF Young Investigators Project VH-NG-327, EMMI, H4F, HGS-HIRe, JINA, NAVI, DFG and ATHENA. Peer Reviewed

    Resistance to autosomal dominant Alzheimer's disease in an APOE3 Christchurch homozygote: a case report.

    We identified a PSEN1 (presenilin 1) mutation carrier from the world's largest autosomal dominant Alzheimer's disease kindred who did not develop mild cognitive impairment until her seventies, three decades after the expected age of clinical onset. The individual had two copies of the APOE3 Christchurch (R136S) mutation, unusually high brain amyloid levels, and limited tau and neurodegenerative measurements. Our findings have implications for the role of APOE in the pathogenesis, treatment and prevention of Alzheimer's disease.

    Modular Composition of Gene Transcription Networks

    Predicting the dynamic behavior of a large network from that of the composing modules is a central problem in systems and synthetic biology. Yet, this predictive ability is still largely missing because modules display context-dependent behavior. One cause of context-dependence is retroactivity, a phenomenon similar to loading that influences in non-trivial ways the dynamic performance of a module upon connection to other modules. Here, we establish an analysis framework for gene transcription networks that explicitly accounts for retroactivity. Specifically, a module's key properties are encoded by three retroactivity matrices: internal, scaling, and mixing retroactivity. All of them have a physical interpretation and can be computed from macroscopic parameters (dissociation constants and promoter concentrations) and from the modules' topology. The internal retroactivity quantifies the effect of intramodular connections on an isolated module's dynamics. The scaling and mixing retroactivity establish how intermodular connections change the dynamics of connected modules. Based on these matrices and on the dynamics of modules in isolation, we can accurately predict how loading will affect the behavior of an arbitrary interconnection of modules. We illustrate implications of internal, scaling, and mixing retroactivity on the performance of recurrent network motifs, including negative autoregulation, combinatorial regulation, two-gene clocks, the toggle switch, and the single-input motif. We further provide a quantitative metric that determines how robust the dynamic behavior of a module is to interconnection with other modules. This metric can be employed both to evaluate the extent of modularity of natural networks and to establish concrete design guidelines to minimize retroactivity between modules in synthetic systems. United States Air Force Office of Scientific Research (FA9550-12-1-0129).
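    Schematically, retroactivity frameworks of this kind modify a module's isolated dynamics through a state-dependent matrix factor. The notation below is a hedged sketch in generic symbols, not the paper's exact matrices:

    ```latex
    % Isolated module dynamics (x: species concentrations, u: external inputs):
    \[ \dot{x} = f(x, u) \]
    % Schematic dynamics with retroactivity, where R(x) is a retroactivity
    % matrix assembled from dissociation constants, promoter concentrations
    % and the network topology:
    \[ \dot{x} = \bigl(I + R(x)\bigr)^{-1} f(x, u) \]
    ```

    The factor (I + R(x))^{-1} captures how binding-site loading slows and mixes the isolated dynamics; with R = 0 the unconnected behavior is recovered.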

    Cooperative development of logical modelling standards and tools with CoLoMoTo.

    The identification of large regulatory and signalling networks involved in the control of crucial cellular processes calls for proper modelling approaches. Indeed, models can help elucidate properties of these networks, understand their behaviour and provide (testable) predictions by performing in silico experiments. In this context, qualitative, logical frameworks have emerged as relevant approaches, as demonstrated by a growing number of published models, along with new methodologies and software tools. This productive activity now requires a concerted effort to ensure model reusability and interoperability between tools. Following an outline of the logical modelling framework, we present the most important achievements of the Consortium for Logical Models and Tools, along with future objectives. Our aim is to advertise this open community, which welcomes contributions from all researchers interested in logical modelling or in related mathematical and computational developments.

    Palaeoecological data indicates land-use changes across Europe linked to spatial heterogeneity in mortality during the Black Death pandemic

    Historical accounts of the mortality outcomes of the Black Death plague pandemic are variable across Europe, with much higher death tolls suggested in some areas than others. Here the authors use a 'big data palaeoecology' approach to show that land-use change following the pandemic was spatially variable across Europe, confirming heterogeneous responses with empirical data. The Black Death (1347–1352 CE) is the most renowned pandemic in human history, believed by many to have killed half of Europe's population. However, despite advances in ancient DNA research that conclusively identified the pandemic's causative agent (the bacterium Yersinia pestis), our knowledge of the Black Death remains limited, based primarily on qualitative remarks in medieval written sources available for some areas of Western Europe. Here, we remedy this situation by applying a pioneering new approach, 'big data palaeoecology', which, starting from palynological data, evaluates the scale of the Black Death's mortality on a regional scale across Europe. We collected pollen data on landscape change from 261 radiocarbon-dated coring sites (lakes and wetlands) located across 19 modern-day European countries. We used two independent methods of analysis to evaluate whether the changes we see in the landscape at the time of the Black Death agree with the hypothesis that a large portion of the population, upwards of half, died within a few years in the 21 historical regions we studied. While we can confirm that the Black Death had a devastating impact in some regions, we found that it had negligible or no impact in others. These inter-regional differences in the Black Death's mortality across Europe demonstrate the significance of cultural, ecological, economic, societal and climatic factors that mediated the dissemination and impact of the disease. The complex interplay of these factors, along with the historical ecology of plague, should be a focus of future research on historical pandemics.

    Genome-wide association analysis of dementia and its clinical endophenotypes reveal novel loci associated with Alzheimer's disease and three causality networks: The GR@ACE project

    Introduction: Large variability among Alzheimer's disease (AD) cases might impact genetic discoveries and complicate dissection of underlying biological pathways. Methods: Genome Research at Fundació ACE (GR@ACE) is a genome-wide study of dementia and its clinical endophenotypes, defined based on AD's clinical certainty and vascular burden. We assessed the impact of known AD loci across endophenotypes to generate loci categories. We incorporated gene co-expression data and conducted pathway analysis per category. Finally, to evaluate the effect of heterogeneity in genetic studies, the GR@ACE series were meta-analyzed with additional genome-wide association study data sets. Results: We classified known AD loci into three categories, which might reflect the disease's clinical heterogeneity. Vascular processes were only detected as a causal mechanism in probable AD. The meta-analysis strategy revealed the ANKRD31-rs4704171 and NDUFAF6-rs10098778 signals and confirmed SCIMP-rs7225151 and CD33-rs3865444. Discussion: The regulation of vasculature is a prominent causal component of probable AD. The GR@ACE meta-analysis revealed novel AD genetic signals, strongly driven by the presence of clinical heterogeneity in the AD series.