
    A new panchromatic classification of unclassified Burst Alert Telescope active galactic nuclei

    We collect data at all frequencies for the new sources classified as unknown active galactic nuclei (AGNs) in the latest Burst Alert Telescope (BAT) all-sky hard X-ray catalog. Focusing on the 36 sources with measured redshift, we compute their spectral energy distributions (SEDs) from radio to γ-rays with the aim of classifying these objects. We apply emission models that attempt to reproduce the obtained SEDs, including: i) a standard thin accretion disk together with an obscuring torus and an X-ray corona; ii) a two-temperature thick advection-dominated flow; iii) an obscured AGN model, accounting for absorption along the line of sight at kiloelectronvolt energies and in the optical band; and iv) a phenomenological model describing the jet emission in blazar-like objects. We integrate the models with the SWIRE template libraries to account for the emission of the host galaxy. For every source we found good agreement between the data and our models. Since the sources were selected in the hard X-ray band, which is largely unaffected by absorption, we expected and found a large fraction of absorbed radio-quiet AGNs (31 out of 36) alongside some rarer radio-loud sources (5 out of 36); the jet emission in hard X-rays is important for aligned jets owing to the boost produced by relativistic beaming. Our work confirms the hypothesis that a number of galaxies whose optical spectra lack AGN emission features host an obscured active nucleus. The approach proved efficient in rapidly identifying objects that commonly used methods were not able to classify.
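
    As a rough illustration of the classification step, the sketch below compares a broadband SED against a small library of template shapes by chi-square and keeps the best fit. The frequencies, template shapes, and uncertainties are placeholders invented for this example; they are not the paper's physical models or the SWIRE library.

```python
import numpy as np

# Hypothetical classify-by-template sketch: pick the template whose shape
# (up to a free normalisation) best reproduces the observed SED.
log_nu = np.linspace(9, 19, 11)          # log10(frequency/Hz), radio to hard X-ray

# Placeholder template SEDs in log10(nu*F_nu); real ones would come from the
# physical emission models and the SWIRE host-galaxy library.
templates = {
    "absorbed radio-quiet AGN": -0.05 * (log_nu - 18.0) ** 2,
    "blazar-like (jet)":        -0.02 * (log_nu - 14.0) ** 2 + 0.1 * log_nu,
}

rng = np.random.default_rng(0)
observed = templates["absorbed radio-quiet AGN"] + rng.normal(0.0, 0.1, log_nu.size)
sigma = np.full(log_nu.size, 0.1)        # placeholder measurement uncertainties

def chi2(template, data, err):
    norm = np.mean(data - template)      # free vertical shift of the template
    return np.sum(((data - template - norm) / err) ** 2)

scores = {name: chi2(t, observed, sigma) for name, t in templates.items()}
best = min(scores, key=scores.get)
print(f"best-fitting class: {best}  (chi2 = {scores[best]:.1f})")
```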

    Generalized Disparate Impact for Configurable Fairness Solutions in ML

    We make two contributions in the field of AI fairness over continuous protected attributes. First, we show that the Hirschfeld-Gebelein-Rényi (HGR) indicator (the only one currently available for such a case) is valuable but subject to a few crucial limitations regarding semantics, interpretability, and robustness. Second, we introduce a family of indicators that are: 1) complementary to HGR in terms of semantics; 2) fully interpretable and transparent; 3) robust over finite samples; and 4) configurable to suit specific applications. Our approach also allows us to define fine-grained constraints that selectively permit certain types of dependence and forbid others. By expanding the available options for continuous protected attributes, our approach represents a significant contribution to the area of fair artificial intelligence.

    Design and Implementation of a Domain-Specific Language for the Construction of Gene Networks

    The work presented in this thesis falls within the field of synthetic biology, an interdisciplinary discipline built on biology and engineering whose goals are the design and construction of biological systems, often aimed at synthesizing particular chemical compounds that are difficult to obtain in nature. Given the complexity of building such systems, we chose to exploit the capabilities of modern programming languages, which increasingly blend the object-oriented and functional paradigms, to develop a tool suited to describing these systems and, subsequently, to simulating their behavior. In this context we present Another Genetic Circuit Transcriber, a domain-specific language for the construction of gene networks, designed to be as close as possible to natural language and equipped with an extensible set of generators able to export the network content to external software.

    Predicting and Optimizing Microswimmer Performance from the Hydrodynamics of Its Components: The Relevance of Interactions

    Interest in the design of bioinspired robotic microswimmers is growing rapidly, motivated by the spectacular capabilities of their unicellular biological templates. Predicting the swimming speed and efficiency of such devices in a reliable way is essential for their rational design and for optimizing their performance. The hydrodynamic simulations needed for this purpose are demanding, so simplified models that neglect nonlocal hydrodynamic interactions (e.g., resistive force theory for slender, filament-like objects, the typical propulsive apparatus of unicellular swimmers) are commonly used. Through a detailed case study of a model robotic system consisting of a spherical head powered by a rotating helical flagellum, we show that (a) the errors one makes in predicting swimming speed and efficiency by neglecting hydrodynamic interactions are never quite acceptable, and (b) there are simple ways to correct the predictions of the simplified theories to make them more accurate. We also formulate optimal design problems for the length of the helical flagellum giving maximal energetic efficiency, maximal distance traveled per motor turn, or maximal distance traveled per unit of work expended, and exhibit optimal solutions.
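
    For orientation, the sketch below computes the resistive-force-theory baseline that the abstract refers to: a force-free and torque-free balance for a spherical head driven by a rotating helix, with all hydrodynamic interactions between head and flagellum neglected. The geometry, viscosity, and motor speed are placeholder values rather than the paper's, and the Gray-Hancock drag coefficients are one common choice among several.

```python
import numpy as np

# Resistive-force-theory (RFT) estimate for a sphere + rotating helical
# flagellum, neglecting hydrodynamic interactions between the two parts
# (the simplification whose errors the paper quantifies).
mu      = 1.0e-3          # fluid viscosity [Pa s] (water)
a       = 1.0e-6          # head radius [m]
R       = 0.25e-6         # helix radius [m]
pitch   = 2.0e-6          # helix pitch [m]
L_ax    = 5.0e-6          # axial length of the helix [m]
r_f     = 0.02e-6         # filament radius [m]
omega_m = 2 * np.pi * 100.0   # motor rotation rate (100 Hz) [rad/s]

# Geometry: beta is the angle between the local tangent and the helix axis.
k    = pitch / (2 * np.pi)
cosb = k / np.sqrt(R**2 + k**2)
sinb = R / np.sqrt(R**2 + k**2)
Lam  = L_ax / cosb                          # contour length of the filament

# Gray-Hancock drag coefficients per unit length (one standard choice).
xi_par  = 2 * np.pi * mu / (np.log(2 * pitch / r_f) - 0.5)
xi_perp = 4 * np.pi * mu / (np.log(2 * pitch / r_f) + 0.5)

# Propulsion matrix of the helix:  F = -A*U + B*w,   T = B*U - C*w
A = Lam * (xi_par * cosb**2 + xi_perp * sinb**2)
B = Lam * (xi_perp - xi_par) * R * sinb * cosb
C = Lam * (xi_par * sinb**2 + xi_perp * cosb**2) * R**2

# Drag of the spherical head, treated as isolated (no interactions).
A0 = 6 * np.pi * mu * a
C0 = 8 * np.pi * mu * a**3

# Force-free and torque-free swimming with motor speed omega_m = w - W,
# where w is the flagellum and W the (counter-rotating) head angular velocity.
w = omega_m / (1 + (C - B**2 / (A0 + A)) / C0)   # flagellum angular velocity
U = B * w / (A0 + A)                             # swimming speed
print(f"RFT swimming speed ~ {U * 1e6:.3f} um/s at {omega_m / (2 * np.pi):.0f} Hz")
```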

    Effects of Increasing Doses of Intracoronary Adenosine on the Assessment of Fractional Flow Reserve

    Objectives: The purpose of this study was to investigate the effects of increasing doses of intracoronary adenosine on fractional flow reserve (FFR) measurement. Background: FFR is a validated method for assessing the severity of coronary artery stenosis. It is based on the change in the pressure gradient across the stenosis after achieving maximal hyperemia of the coronary microcirculation, which may be obtained by either an intracoronary bolus or an intravenous infusion of adenosine. No study has so far explored the effects of very high doses of intracoronary adenosine on FFR. Methods: FFR was assessed in 46 patients with 50 intermediate lesions during cardiac catheterization with a pressure-recording guidewire (PrimeWire, Volcano, San Diego, California). FFR was calculated as the ratio of the distal coronary pressure to the aortic pressure at hyperemia. Increasing doses of adenosine (60, 120, 180, 360, and 720 μg) were administered as intracoronary boluses. Exclusion criteria were: 1) allergy to adenosine; 2) baseline bradycardia (heart rate <50 beats/min); 3) hypotension (blood pressure <90 mm Hg); and 4) refusal to provide signed informed consent. Results: High doses of intracoronary adenosine were well tolerated, with no major side effects. Increasing doses up to 720 μg progressively decreased FFR values and increased the percentage of patients showing an FFR <0.75. Among angiographic parameters, both percent stenosis and lesion length were independently associated with lower FFR values. Conclusions: This study shows that high doses of intracoronary adenosine (up to 720 μg) increased the sensitivity of FFR in detecting hemodynamically relevant coronary stenoses. Furthermore, lesion length and stenosis severity were independent angiographic determinants of FFR.
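
    As a concrete reading of the index used throughout the study, the following minimal sketch computes FFR as the ratio of mean distal coronary pressure to mean aortic pressure at peak hyperemia and applies the 0.75 cut-off mentioned in the abstract; the pressure values are illustrative placeholders, not study data.

```python
# FFR = mean distal coronary pressure / mean aortic pressure at maximal
# hyperemia; values below the 0.75 cut-off are flagged as hemodynamically
# relevant.  Pressure readings below are placeholders, not patient data.
def fractional_flow_reserve(p_distal_mmHg, p_aortic_mmHg):
    """Return FFR from mean pressures (mmHg) recorded at peak hyperemia."""
    return p_distal_mmHg / p_aortic_mmHg

for pd_mmHg, pa_mmHg in [(68.0, 95.0), (80.0, 98.0)]:   # placeholder readings
    ffr = fractional_flow_reserve(pd_mmHg, pa_mmHg)
    relevant = ffr < 0.75                                # cut-off used in the study
    print(f"Pd={pd_mmHg:.0f}  Pa={pa_mmHg:.0f}  FFR={ffr:.2f}  significant={relevant}")
```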

    Deal2lkit: a Toolkit Library for High Performance Programming in deal.II

    We propose a software design for the efficient and flexible handling of the building blocks used in high performance finite element simulations, through the pervasive use of parameters (parsed from parameter files). In the proposed design, all the building blocks of a high performance finite element program are built according to the command and composite design patterns. We present version 1.1.0 of the deal2lkit (deal.II ToolKit) library, a collection of modules and classes aimed at providing high-level interfaces to several deal.II classes and functions, obeying the command and composite design patterns and controlled via parameter files. Keywords: Object-orientation, Software design, Finite element methods, C++
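
    The following Python sketch illustrates the design idea in language-agnostic form: each building block declares and parses its own subsection of a parameter file, and blocks can be nested into a composite that forwards both operations to its children. Class and parameter names are invented for illustration; this is not the deal2lkit or deal.II API.

```python
# Sketch of the command + composite idea described above: every building
# block declares its own parameters and parses them back, and a composite
# simulation forwards both operations to its children, so one parameter
# file configures the whole program.  Names are hypothetical.
import configparser

class ParameterAcceptor:
    """A building block that owns one subsection of the parameter file."""
    def __init__(self, section):
        self.section = section
    def declare_parameters(self, prm):    # command: write defaults
        prm[self.section] = self.defaults()
    def parse_parameters(self, prm):      # command: read values back
        self.values = dict(prm[self.section])
    def defaults(self):
        return {}

class Quadrature(ParameterAcceptor):
    def defaults(self):
        return {"order": "2"}

class Solver(ParameterAcceptor):
    def defaults(self):
        return {"tolerance": "1e-8", "max iterations": "1000"}

class Simulation(ParameterAcceptor):
    """Composite: forwards declare/parse to all of its children."""
    def __init__(self, section, children):
        super().__init__(section)
        self.children = children
    def declare_parameters(self, prm):
        for child in self.children:
            child.declare_parameters(prm)
    def parse_parameters(self, prm):
        for child in self.children:
            child.parse_parameters(prm)

prm = configparser.ConfigParser()
sim = Simulation("Simulation", [Quadrature("Quadrature"), Solver("Solver")])
sim.declare_parameters(prm)   # write defaults; a real run would also read a
sim.parse_parameters(prm)     # user-supplied parameter file before parsing
print({child.section: child.values for child in sim.children})
```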

    Profiling residential water users’ routines by eigenbehavior modelling

    Developing effective demand-side management strategies is essential to meet future residential water demands, pursue water conservation, and reduce costs for water utilities. The effectiveness of water demand management strategies relies on our understanding of water consumers' behavior and of their consumption habits and routines, which can be monitored through the deployment of smart metering technologies and the adoption of data analytics and machine learning techniques. This work contributes a novel modeling procedure, based on a combination of clustering and principal component analysis, that segments water users on the basis of their eigenbehaviors (i.e., recurrent water consumption behaviors) automatically identified from smart-metered consumption data. The approach is tested on a dataset of smart-metered water consumption data from 175 households in the municipality of Tegna (Switzerland). Numerical results demonstrate the potential of the method for identifying typical profiles of water consumption, which constitute essential information to support residential water demand management.
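
    A minimal sketch of this kind of pipeline is given below: daily consumption profiles are decomposed with principal component analysis, the leading components act as eigenbehaviors, and households are segmented by clustering their component loadings. The array shapes, component count, and cluster count are assumptions for illustration, and random numbers stand in for the smart-metered data.

```python
# Eigenbehavior-style segmentation sketch: PCA on daily consumption profiles,
# then clustering of each household's loadings on the leading components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_households, hours_per_day = 175, 24
profiles = rng.random((n_households, hours_per_day))  # placeholder for metered data

pca = PCA(n_components=5)
loadings = pca.fit_transform(profiles)     # each household: 5 eigenbehavior weights
eigenbehaviors = pca.components_           # recurrent daily consumption patterns

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(loadings)
for k in range(4):
    print(f"cluster {k}: {np.sum(labels == k)} households")
```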

    Dark matter effects in vacuum spacetime

    We analyze a toy model describing an empty spacetime in which the motion of a test mass and the trajectories of photons suggest the presence of a continuous and homogeneous distribution of matter; however, since the energy-momentum tensor vanishes, no real matter or energy distribution is present at all. A hypothetical observer would therefore conclude that they are immersed in some sort of dark matter, even though they have no way of detecting it directly. This suggests yet another possibility for explaining the elusive dark matter as a purely dynamical effect due to the curvature of spacetime.

    Baseline characteristics associated with NEDA-3 status in fingolimod-treated patients with relapsing-remitting multiple sclerosis

    Background: Fingolimod is an efficacious treatment for relapsing-remitting multiple sclerosis (RRMS), and there is class I evidence that it is superior to standard care in reducing relapse rate. However, real-world data investigating its effectiveness and potential predictors of response are still scarce. Objective: To estimate (i) the proportion of fingolimod-treated patients who achieved no evidence of disease activity (NEDA-3) status and (ii) which baseline (i.e., at treatment start) clinical and magnetic resonance imaging (MRI) variables were associated with better outcomes. Methods: We collected clinical and MRI data of RRMS patients treated with fingolimod and followed up for 24 months. The proportion of patients who had NEDA-3, i.e., absence of relapses, of sustained Expanded Disability Status Scale (EDSS) worsening, and of radiological activity on MRI, was estimated. A Cox proportional hazards model was used to investigate which baseline characteristics were associated with NEDA status at follow-up. Results: We collected data on 201 patients who started fingolimod. Of them, 24 (12%) were treatment-naïve, 115 (58%) were switched after failing a self-injectable drug, and 60 (30%) were switched from natalizumab. Five patients who discontinued fingolimod early (within 3 months; bradycardia, n = 2; leukopaenia, n = 2; macular oedema, n = 1) were removed from the analysis. At follow-up, 118 (60%) patients achieved NEDA-3 status, while 78 experienced clinical and/or MRI activity. The risk of not achieving NEDA-3 status was associated with a higher baseline EDSS score (hazard ratio [HR] = 1.18, p = 0.024) and more relapses in the 12 months before fingolimod start (HR = 1.61, p = 0.014). Conclusion: Our findings suggest that fingolimod may lead to better control of the disease if started in patients with less aggressive disease (i.e., fewer pre-treatment relapses and a milder disability level), thus supporting its possible role as an early treatment for MS.
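
    For illustration, the sketch below sets up a Cox proportional-hazards analysis of the kind described in the abstract, relating baseline EDSS and prior relapses to the time until loss of NEDA-3 status. The data frame is synthetic and the column names are assumptions; it shows only the shape of the analysis, not the study's data or results.

```python
# Cox proportional-hazards sketch with synthetic data (lifelines library).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 196                                        # patients retained in the analysis
df = pd.DataFrame({
    "baseline_edss":      rng.uniform(0, 6, n),     # hypothetical covariate names
    "relapses_prior_12m": rng.poisson(1.0, n),
    "months_to_activity": rng.uniform(1, 24, n),    # follow-up or time to first activity
    "activity_observed":  rng.integers(0, 2, n),    # 1 = clinical/MRI activity, 0 = censored
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_activity", event_col="activity_observed")
cph.print_summary()        # hazard ratios per covariate, as reported in the abstract
```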