
    Knowledge Representation with Multiple Logical Theories and Time

    We present a knowledge representation framework in which a collection of logic programs can be combined by means of meta-level program composition operations. Each object-level program is composed of a collection of extended clauses, equipped with a time interval representing the period in which they hold. The interaction between program composition operations and time yields a powerful knowledge representation language in which many applications can be naturally developed. The language is given a meta-level semantics which also provides an executable specification. Moreover, we define an abstract semantics by extending the immediate consequence operator from a single logic program to compositions of logic programs, taking time intervals into account. The operational, meta-level semantics is proven sound and complete with respect to the abstract bottom-up semantics. The approach is further extended to cope with the problem of reasoning over joined intervals of time. Three applications in the field of business regulations are shown.
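
    The interaction of composition and time can be made concrete with a small interpreter. The Python sketch below is a simplification of the paper's semantics, not its definition: an immediate consequence operator for ground clauses annotated with time intervals, in which a derived atom holds on the intersection of its clause's interval with the intervals of its body atoms. The toy program and all names are hypothetical.

        # Simplified interval-annotated immediate consequence operator.
        # A clause is (head, [body atoms], (start, end)).

        def intersect(a, b):
            """Intersection of two closed intervals, or None if disjoint."""
            lo, hi = max(a[0], b[0]), min(a[1], b[1])
            return (lo, hi) if lo <= hi else None

        def tp(clauses, facts):
            """One step: a head holds on the intersection of its clause's
            interval with the intervals on which all body atoms already hold.
            Simplification: only the first known interval per atom is used."""
            new = set(facts)
            for head, body, clause_iv in clauses:
                common = clause_iv
                for atom in body:
                    holds = [iv for (name, iv) in facts if name == atom]
                    common = intersect(common, holds[0]) if holds else None
                    if common is None:
                        break
                if common is not None:
                    new.add((head, common))
            return new

        def fixpoint(clauses):
            facts = set()
            while True:
                nxt = tp(clauses, facts)
                if nxt == facts:
                    return facts
                facts = nxt

        # Toy program: a discount holds only while both a promotion and a
        # membership are in force.
        program = [
            ("promo", [], (1, 10)),
            ("member", [], (5, 20)),
            ("discount", ["promo", "member"], (0, 30)),
        ]
        print(fixpoint(program))  # discount holds on (5, 10)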

    The renin-angiotensin system in refractory heart failure: clinical, hemodynamic and hormonal effects of captopril and enalapril

    Studies using a competitive inhibitor of angiotensin II (saralasin) or converting enzyme inhibitors (teprotide, captopril, enalapril) have established that the renin-angiotensin system participates in the control of vascular tone in congestive heart failure, both in experimental settings and in patients. In man, the marked decrease in left ventricular filling pressure and the variable increase in stroke volume induced by renin-angiotensin blockade suggest that angiotensin II actively constricts venous as well as arteriolar vascular beds. Captopril, in doses of 25 to 150 mg p.o. TID, maintains its efficacy during chronic administration, with persistent clinical and hemodynamic improvement as well as increased exercise tolerance. In our experience, enalapril, 10 mg p.o., improves cardiac function within 4 to 6 h, as reflected by a 30% decrease in left ventricular filling pressure and a 28% increase in stroke volume with unchanged heart rate. Clinical improvement, enhanced exercise tolerance and characteristic hormonal responses suggest that enalapril also maintains its efficacy during long-term treatment. Chronic angiotensin II converting enzyme inhibition appears to be a major advance in the treatment of patients with severe congestive heart failure refractory to digitalis and diuretics.

    Oral administration of chestnut tannins to reduce the duration of neonatal calf diarrhea

    Background: Neonatal calf diarrhea is generally caused by infectious agents and is a very common disease in bovine practice, leading to substantial economic losses. Tannins are known for their astringent and anti-inflammatory properties in the gastro-enteric tract. The aim of this study was to evaluate whether the oral administration of chestnut tannins (Castanea sativa Mill.) reduces the duration of neonatal calf diarrhea. Twenty-four Italian Friesian calves affected by neonatal diarrhea were included. The duration of the diarrheic episode (DDE) was recorded, and the animals were divided into a control group (C), which received Effydral® in 2 l of warm water, and a tannin-treated group (T), which received Effydral® in 2 l of warm water plus 10 g of chestnut tannin extract powder. A Mann-Whitney test was performed to test for differences in DDE values between the two groups. Results: The DDE was significantly higher in group C than in group T (p = 0.02): 10.1 ± 3.2 and 6.6 ± 3.8 days, respectively. Conclusions: Phytotherapeutic treatments for various diseases have become more common in both human and veterinary medicine, in order to reduce the presence of antibiotic molecules in the food chain and in the environment. Administration of tannins to calves with diarrhea seemed to shorten the DDE in group T by almost 4 days compared to group C, suggesting an effective astringent action of chestnut tannins in the calf, as already reported in humans. The use of chestnut tannins in calves could represent an effective, low-impact treatment for neonatal diarrhea.
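
    The group comparison reported above is a standard two-sample Mann-Whitney test. A hedged sketch follows; the DDE values are invented for illustration (the paper reports only the group summaries 10.1 ± 3.2 and 6.6 ± 3.8 days), so only the shape of the analysis is meaningful here.

        # Mann-Whitney U test on the duration of the diarrheic episode (DDE).
        # The data below are hypothetical, not the study's measurements.
        from scipy.stats import mannwhitneyu

        dde_control = [12, 9, 14, 8, 11, 10, 7, 13, 6, 12, 10, 9]  # group C, days
        dde_tannin = [5, 7, 4, 10, 6, 3, 8, 11, 5, 6, 7, 7]        # group T, days

        stat, p = mannwhitneyu(dde_control, dde_tannin, alternative="two-sided")
        print(f"U = {stat}, p = {p:.3f}")  # p < 0.05 would mirror the reported p = 0.02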

    The Explanation Dialogues: Understanding How Legal Experts Reason About XAI Methods

    The Explanation Dialogues project is an expert focus study that aims to uncover the expectations, reasoning, and rules of legal experts and practitioners towards explainable artificial intelligence (XAI). We examine legal perceptions and disputes that arise in a fictional scenario resembling a daily-life situation: a bank’s use of an automated decision-making (ADM) system to decide on credit allocation to individuals. Through this simulation, the study aims to provide insights into the legal value and validity of explanations of ADMs, identify potential gaps and issues that may arise in the context of compliance with European legislation, and provide guidance on how to address these shortcomings.

    Opening the black box: a primer for anti-discrimination

    The pervasive adoption of Artificial Intelligence (AI) models in the modern information society requires counterbalancing the growing decision power delegated to AI models with risk assessment methodologies. In this paper, we consider the risk of discriminatory decisions and review approaches for discovering discrimination and for designing fair AI models. We highlight the tight relations between discrimination discovery and explainable AI, with the latter being a more general approach for understanding the behavior of black boxes.
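
    As a concrete instance of discrimination discovery, one of the simplest measures in this literature is the risk difference between a protected and an unprotected group. The sketch below is illustrative only: the decision and group data are hypothetical, and this measure is a common one in the field rather than necessarily the paper's own formulation.

        # Risk difference: rate of negative decisions for the protected group
        # minus the rate for the unprotected group. Data are hypothetical.
        def risk_difference(decisions, protected):
            """decisions: 1 = denied, 0 = granted; protected: bool per case."""
            prot = [d for d, p in zip(decisions, protected) if p]
            unprot = [d for d, p in zip(decisions, protected) if not p]
            return sum(prot) / len(prot) - sum(unprot) / len(unprot)

        decisions = [1, 0, 1, 1, 0, 0, 1, 0]
        protected = [True, True, True, True, False, False, False, False]
        print(risk_difference(decisions, protected))  # 0.75 - 0.25 = 0.5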

    Evaluation of Ultrasound Measurement of Subcutaneous Fat Thickness in Dairy Jennies during the Periparturient Period

    The body condition score (BCS) represents a practical but subjective method for assessing body fat reserves. Real-time ultrasonography (RTU) has been proposed as an accurate method to objectively measure subcutaneous fat (SF) thickness and predict body fat reserves in cows, horses and donkeys. The aim of the present study was to describe RTU measurements of SF thickness during the periparturient period in jennies. This prospective cohort study evaluated six dairy jennies. SF RTU examinations were performed at 15 and 7 days before the presumptive delivery and at 2, 15 and 30 days after delivery. A portable ultrasound machine with a multifrequency linear transducer (5–7.5 MHz) was used. RTU images were obtained at six sites (S1–S6). Results at each time point were reported as mean ± standard deviation and compared over time. A total of 180 images were evaluated. The RTU technique was easy to perform and well tolerated. No statistically significant differences were found for any site over time, except for S2 and S6a: values for S2 at T2 and for S6a at T1 differed significantly from those obtained at T5. The RTU mean values were above those reported by others, suggesting major physiological challenges related to energy balance and fat mobilization in pregnant jennies bred for milk production. BCS and the measurement sites showed a good and reliable association over the observation period. Our study provides preliminary indications on fat reserves at different body locations evaluated by RTU and shows no significant variation of SF thickness in pregnant and lactating jennies.
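
    For readers unfamiliar with the reporting format, the per-site summaries amount to a mean ± standard deviation per site and time point. A minimal sketch with invented SF thickness values (the study's measurements are not reproduced here):

        # Mean ± SD of subcutaneous fat thickness at one site and time point.
        # Values are hypothetical.
        import statistics

        s2_at_t2 = [1.8, 2.1, 1.9, 2.3, 2.0, 1.7]  # cm, six jennies
        mean, sd = statistics.mean(s2_at_t2), statistics.stdev(s2_at_t2)
        print(f"S2 at T2: {mean:.2f} ± {sd:.2f} cm")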

    GLocalX - From Local to Global Explanations of Black Box AI Models

    Artificial Intelligence (AI) has come to prominence as one of the major components of our society, with applications in most aspects of our lives. In this field, complex and highly nonlinear machine learning models such as ensemble models, deep neural networks, and Support Vector Machines have consistently shown remarkable accuracy in solving complex tasks. Although accurate, AI models are often “black boxes” that we are not able to understand. Relying on these models has a multifaceted impact and raises significant concerns about their transparency. Applications in sensitive and critical domains are a strong motivational factor in trying to understand the behavior of black boxes. We propose to address this issue by providing an interpretable layer on top of black box models by aggregating “local” explanations. We present GLOCALX, a “local-first” model-agnostic explanation method. Starting from local explanations expressed in the form of local decision rules, GLOCALX iteratively generalizes them into global explanations by hierarchically aggregating them. Our goal is to learn accurate yet simple interpretable models to emulate the given black box and, if possible, replace it entirely. We validate GLOCALX in a set of experiments in standard and constrained settings with limited or no access to either data or local explanations. Experiments show that GLOCALX is able to accurately emulate several models with simple and small models, reaching state-of-the-art performance against natively global solutions. Our findings show how it is often possible to achieve a high level of both accuracy and comprehensibility of classification models, even in complex domains with high-dimensional data, without necessarily trading one property for the other. This is a key requirement for trustworthy AI, necessary for adoption in high-stakes decision-making applications.
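
    The hierarchical aggregation step can be illustrated with a toy sketch; the real GLOCALX algorithm is considerably more involved, and the rule representation, fidelity threshold, and merge policy below are simplifying assumptions. A rule maps each feature to an interval and predicts a label; two same-label rules are generalized by taking the per-feature interval hull, and the merge is kept only if the generalized rule still mimics the black box.

        # Toy sketch of merging local decision rules into a more general one,
        # keeping the merge only if fidelity to the black box is preserved.
        # All names, data shapes, and the threshold are hypothetical.

        def merge(r1, r2):
            """Per-feature interval hull of two same-label rules."""
            prem = {}
            for f in set(r1["prem"]) | set(r2["prem"]):
                a = r1["prem"].get(f, (float("-inf"), float("inf")))
                b = r2["prem"].get(f, (float("-inf"), float("inf")))
                prem[f] = (min(a[0], b[0]), max(a[1], b[1]))
            return {"prem": prem, "label": r1["label"]}

        def covers(rule, x):
            return all(lo <= x[f] <= hi for f, (lo, hi) in rule["prem"].items())

        def fidelity(rule, X, black_box):
            hits = [black_box(x) == rule["label"] for x in X if covers(rule, x)]
            return sum(hits) / len(hits) if hits else 0.0

        def merge_if_faithful(r1, r2, X, black_box, theta=0.9):
            m = merge(r1, r2)
            return m if fidelity(m, X, black_box) >= theta else None

        # Example shapes: a rule {"prem": {"age": (30, 50)}, "label": 1},
        # a sample {"age": 42}, and black_box a callable on samples. One would
        # repeatedly merge rule pairs, replacing a pair whenever the merged
        # rule passes the fidelity check, until no merge passes.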

    Mammary cistern size during the dry period in healthy dairy cows: A preliminary study for an ultrasonographic evaluation

    We evaluated udder cistern (UC) size during the dry period using ultrasound. Forty healthy quarters were evaluated in both longitudinal and cross-sections of the UC. Quarters were evaluated at drying-off (T0) and 24 h later (T1), then regularly until the end of the dry period (T7–T58), during the colostrum production phase (TCPP) and at 7 days in milk (T7PP). A Spearman test was applied to assess the correlation between the ultrasonographic UC size (UUCS) and time. The Friedman test, with Dunn’s test for multiple comparisons as a post-hoc test, was performed to compare the forequarter and hindquarter cross-sections (FQCSs and HQCSs, respectively) and the forequarter and hindquarter longitudinal sections (FQLSs and HQLSs, respectively) at T0 vs. T58 vs. TCPP vs. T7PP. A total of 440 images were evaluated. A strong negative correlation was found between time and the FQCS and FQLS (r = −0.95; p < 0.0004) and between time and the HQCS and HQLS (r = −0.90; p < 0.002). The UUCS decreased throughout the dry period and started to increase at the beginning of the next lactation. Measuring the UUCS provides useful information for monitoring the dry period.
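
    The two statistics reported above are straightforward to reproduce in form. The sketch below uses invented UUCS series (the study's measurements are not reproduced), so only the structure of the analysis carries over.

        # Spearman correlation of cistern size with time, and a Friedman test
        # across repeated time points. All values are hypothetical.
        from scipy.stats import spearmanr, friedmanchisquare

        days = [0, 1, 7, 14, 28, 58]            # dry-period time points
        fqcs = [4.1, 3.9, 3.4, 2.8, 2.2, 1.5]   # forequarter cross-section, cm

        rho, p = spearmanr(days, fqcs)
        print(f"rho = {rho:.2f}, p = {p:.4f}")  # strongly negative, as with r = -0.95

        # Each argument is one time point, with one value per quarter measured:
        t0 = [4.1, 4.3, 3.9, 4.0]
        t58 = [1.5, 1.7, 1.4, 1.6]
        tcpp = [2.9, 3.1, 2.8, 3.0]
        t7pp = [3.8, 4.0, 3.7, 3.9]
        stat, p = friedmanchisquare(t0, t58, tcpp, t7pp)
        print(f"Friedman chi2 = {stat:.2f}, p = {p:.4f}")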

    A novel background reduction strategy for high level triggers and processing in gamma-ray Cherenkov detectors

    Gamma-ray astronomy is now at the leading edge of studies related both to fundamental physics and to astrophysics. The sensitivity of gamma detectors is limited by the huge amount of background, constituted by hadronic cosmic rays (typically two to three orders of magnitude more than the signal) and by accidental background in the detectors. By using the information on the temporal evolution of the Cherenkov light, the background can be reduced. We present here the results obtained within the MAGIC experiment using a new technique for the reduction of the background. Particle showers produced by gamma rays show a different temporal distribution with respect to showers produced by hadrons; the background due to accidental counts shows no dependence on time. This novel strategy can increase the sensitivity of present instruments.
    Comment: 4 pages, 3 figures, Proc. of the 9th Int. Symposium "Frontiers of Fundamental and Computational Physics" (FFP9) (AIP, Melville, New York, 2008, in press).
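
    The idea behind a timing cut can be sketched in a few lines: gamma showers produce Cherenkov photons tightly clustered in arrival time, hadronic showers a broader spread, and accidental noise is flat in time, so the spread of arrival times is a discriminating variable. The sketch below is a schematic illustration with invented numbers, not the MAGIC analysis itself.

        # Schematic timing cut: accept events whose photon arrival times are
        # tightly clustered. The threshold and data are hypothetical.
        import statistics

        def keep_event(arrival_times_ns, max_rms_ns=3.0):
            return statistics.pstdev(arrival_times_ns) <= max_rms_ns

        gamma_like = [10.1, 10.4, 10.2, 10.6, 10.3]  # ns, narrow spread
        hadron_like = [8.0, 14.5, 10.2, 17.9, 6.3]   # ns, wide spread
        print(keep_event(gamma_like), keep_event(hadron_like))  # True False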

    A New Scintillator Tile/Fiber Preshower Detector for the CDF Central Calorimeter

    A detector designed to measure early particle showers has been installed in front of the central CDF calorimeter at the Tevatron. This new preshower detector is based on scintillator tiles coupled to wavelength-shifting fibers read out by multi-anode photomultipliers and has a total of 3,072 readout channels. The replacement of the old gas detector was required due to an expected increase in the instantaneous luminosity of the Tevatron collider in the next few years. Calorimeter coverage, jet energy resolution, and electron and photon identification are among the expected improvements. The final detector design is presented, together with the R&D studies that led to the choice of scintillator and fiber, the mechanical assembly, and quality control. The detector was installed in the fall 2004 Tevatron shutdown and started collecting colliding-beam data by the end of the same year. First measurements indicate a light yield of 12 photoelectrons/MIP, a more than two-fold increase over the design goals.
    Comment: 5 pages, 10 figures (changes are minor; this is the final version published in IEEE Trans. Nucl. Sci.).