    Two new feature selection algorithms with rough sets theory

    Rough Sets Theory has opened new trends in the development of Incomplete Information Theory. Within it, the notion of a reduct is highly significant, but computing a reduct of a decision system is an expensive process, although a very important one in data analysis and knowledge discovery. For this reason, different variants for calculating reducts have been developed. The present work looks into the utility that the Rough Sets Model and Information Theory offer for feature selection, and a new method is presented with the purpose of calculating a good reduct. This method consists of a greedy algorithm that uses heuristics to work out a good reduct in acceptable time. We also propose another method to find good reducts, which combines elements of Genetic Algorithms with Estimation of Distribution Algorithms. The new methods are compared with others implemented within Pattern Recognition and Ant Colony Optimization algorithms, and the results of the statistical tests are shown. IFIP International Conference on Artificial Intelligence in Theory and Practice - Knowledge Acquisition and Data Mining. Red de Universidades con Carreras en Informática (RedUNCI)
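    The abstract does not specify which heuristics the authors use, so the sketch below is only a baseline illustration of greedy reduct search in the rough-set sense: attributes are added one at a time by dependency-degree gain until the subset discerns decisions as well as the full attribute set. The decision-table representation and names such as greedy_reduct are assumptions, not taken from the paper.

        # Baseline greedy reduct search; the evaluation function is the
        # rough-set dependency degree (size of the positive region).
        def dependency(rows, attrs, decision):
            # Group objects by their values on `attrs`; a class counts
            # toward the positive region when all of its objects share
            # a single decision value.
            classes = {}
            for row in rows:
                key = tuple(row[a] for a in attrs)
                classes.setdefault(key, []).append(row[decision])
            pos = sum(len(ds) for ds in classes.values() if len(set(ds)) == 1)
            return pos / len(rows)

        def greedy_reduct(rows, attrs, decision):
            # Add, one attribute at a time, whichever yields the largest
            # dependency gain, until the subset matches the full set.
            target = dependency(rows, attrs, decision)
            reduct = []
            while dependency(rows, reduct, decision) < target:
                best = max((a for a in attrs if a not in reduct),
                           key=lambda a: dependency(rows, reduct + [a], decision))
                reduct.append(best)
            return reduct

        # Toy decision table: attribute "b" alone already separates the
        # decision classes, so it forms a one-attribute reduct.
        rows = [{"a": 1, "b": 0, "c": 1, "d": "yes"},
                {"a": 1, "b": 1, "c": 0, "d": "no"},
                {"a": 0, "b": 1, "c": 1, "d": "no"},
                {"a": 0, "b": 0, "c": 1, "d": "yes"}]
        print(greedy_reduct(rows, ["a", "b", "c"], "d"))  # ['b']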

    Impact of empagliflozin on subclinical left ventricular dysfunctions and on the mechanisms involved in myocardial disease progression in type 2 diabetes: rationale and design of the EMPA-HEART trial.

    BACKGROUND: Asymptomatic left ventricular (LV) dysfunction is highly prevalent in type 2 diabetes patients. Unlike other hypoglycemic drugs, SGLT2 inhibitors have shown potential benefits in reducing cardiovascular death and risk factors, beyond lowering plasma glucose levels. With this study we aim to determine whether treatment with empagliflozin is associated with an improvement in LV function in diabetic patients with asymptomatic LV dysfunction, compared with sitagliptin, which is presumably neutral on myocardial function. To determine changes in LV systolic and diastolic function we will use speckle-tracking echocardiography, a novel, sensitive, non-invasive bedside method allowing the calculation of LV global longitudinal strain (GLS), an index of myocardial deformability, as well as 3D echocardiography, which allows a better evaluation of LV volumes and mass. METHODS: The EMPA-HEART trial will be a phase III, open-label, active-controlled, parallel-group, single-centre, exploratory study conducted in Pisa, Italy. A cohort of 75 diabetic patients with normal LV systolic (2D-Echo EF > 50%) and renal (eGFR by MDRD > 60 ml/min/1.73 m²) function and no evidence of valvular and/or ischemic heart disease will be randomized to either empagliflozin 10 mg/day or sitagliptin 100 mg/day. The primary outcome is the change in GLS from baseline to 1 and 6 months after treatment initiation. The secondary outcomes include changes from baseline to 6 months in 3D echocardiography EF, left atrial volume and E/E', VO2max measured at cardiopulmonary exercise testing, cardiac autonomic function tests (R-R interval during Valsalva manoeuvre, deep breathing, lying-to-standing), and the determination of a set of plasma biomarkers aimed at studying volume, inflammation, oxidative stress, matrix remodelling, myocyte strain and injury. DISCUSSION: SGLT2 inhibitors might affect myocardial function through mechanisms acting both directly and indirectly on the myocardium. The set of instrumental and biohumoral tests in our study might detect the presence and extent of empagliflozin's beneficial effects on the myocardium and shed light on the mechanisms involved. Further, this study might eventually provide information to design a clinical strategy, based on echocardiography and/or biomarkers, to select the patients who might benefit most from this intervention. Trial registration: EudraCT code 2016-0022250-10
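    Since GLS is the trial's primary endpoint, a short worked example of the underlying arithmetic may help: longitudinal strain of a segment is its systolic shortening relative to end-diastolic length, and GLS is conventionally the average of the segmental peak strains. The sketch below uses invented segment lengths; it is not the speckle-tracking vendors' computation, only the Lagrangian strain formula.

        # Lagrangian strain per segment: (L_sys - L_dia) / L_dia, in percent;
        # negative values indicate systolic shortening. GLS averages the
        # segmental peak strains. All lengths below are hypothetical.
        def segment_strain(l_dia: float, l_sys: float) -> float:
            return (l_sys - l_dia) / l_dia * 100.0

        diastolic = [42.0, 40.5, 38.0, 41.2, 39.4, 40.8]  # mm, invented
        systolic = [33.6, 32.8, 31.2, 33.0, 31.9, 32.4]   # mm, invented

        strains = [segment_strain(d, s) for d, s in zip(diastolic, systolic)]
        gls = sum(strains) / len(strains)
        print(f"GLS = {gls:.1f}%")  # about -19%, a typically normal value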

    Speckle-Tracking Imaging, Principles and Clinical Applications: A Review for Clinical Cardiologists

    Evaluation of myocardial mechanics, although complex, has now entered the clinical arena, thanks to the introduction of bedside imaging techniques such as speckle-tracking echocardiography.

    Bracket bonding to polymethylmethacrylate-based materials for computer-aided design/manufacture of temporary restorations: influence of mechanical treatment and chemical treatment with universal adhesives

    Objective: To assess the shear bond strength and failure mode (Adhesive Remnant Index, ARI) of orthodontic brackets bonded to polymethylmethacrylate (PMMA) blocks for computer-aided design/manufacture (CAD/CAM) fabrication of temporary restorations, following chemical or mechanical treatment of the substrate. Methods: Two types of PMMA blocks were tested: CAD-Temp® (VITA) and Telio® CAD (Ivoclar-Vivadent). The substrate was roughened with 320-grit sandpaper, simulating a fine-grit diamond bur. Two universal adhesives, Scotchbond Universal Adhesive (SU) and Assure Plus (AP), and a conventional adhesive, Transbond XT Primer (XTP; control), were used in combination with Transbond XT Paste to bond the brackets. Six experimental groups were formed: (1) CAD-Temp®/SU; (2) CAD-Temp®/AP; (3) CAD-Temp®/XTP; (4) Telio® CAD/SU; (5) Telio® CAD/AP; (6) Telio® CAD/XTP. Shear bond strength and ARI were assessed. On one extra block of each PMMA-based material, surfaces were roughened with 180-grit sandpaper, simulating a normal/medium-grit (100 µm) diamond bur, and brackets were bonded. Shear bond strengths and ARI scores were compared with those of groups 3 and 6. Results: Significantly higher bracket bond strengths were recorded on CAD-Temp® than on Telio® CAD. With XTP, significantly lower levels of adhesion were reached than with SU or AP. Roughening with a coarser bur resulted in a significant increase in adhesion. Conclusions: Bracket bonding to CAD/CAM PMMA can be promoted by grinding the substrate with a normal/medium-grit bur or by coating the intact surface with universal adhesives. With appropriate pretreatments, bracket adhesion to CAD/CAM PMMA temporary restorations can be enhanced to clinically satisfactory levels.
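    For readers unfamiliar with the outcome measure, shear bond strength is simply the debonding force divided by the bracket base area, reported in MPa. The sketch below shows that arithmetic with invented numbers; the paper's actual forces and base areas are not given in the abstract, and the 6-8 MPa range in the comment is a threshold often cited as clinically adequate, not a result of this study.

        # SBS = debonding force / bracket base area; N/mm^2 equals MPa.
        def shear_bond_strength(force_n: float, base_area_mm2: float) -> float:
            return force_n / base_area_mm2

        force = 112.0     # N, hypothetical debonding force
        base_area = 10.6  # mm^2, hypothetical bracket base area

        print(f"SBS = {shear_bond_strength(force, base_area):.1f} MPa")
        # -> SBS = 10.6 MPa, above the 6-8 MPa range often cited as
        #    clinically adequate for orthodontic bonding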

    Data mining to assess organizational transparency across technology processes: an approach from IT governance and knowledge management

    Information quality and organizational transparency are relevant issues for corporate governance and the sustainability of companies, as they contribute to reducing information asymmetry, decreasing risk, and improving the conduct of decision-makers, ensuring an ethical standard of organizational control. This work uses the COBIT framework of IT governance, knowledge management, and machine learning techniques to evaluate organizational transparency based on the maturity levels of technology processes in 285 companies in southern Brazil. Data mining techniques were applied to analyze the 37 processes in four domains: planning and organization, acquisition and implementation, delivery and support, and monitoring. Four learning techniques for knowledge discovery were used to build a computational model for evaluating the organizational transparency level. The results show the importance of IT performance monitoring and assessment, and of internal control processes, in enabling organizations to improve their levels of transparency. These processes depend directly on the establishment of IT strategic plans and quality management, as well as IT risk and project management; an improvement in the maturity of these processes therefore implies an increase in the levels of organizational transparency, with reputational, financial, and accountability impact.
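    The abstract does not name the four learning techniques used, so the sketch below stands in with a single decision tree: per-process maturity scores (0-5, as in COBIT maturity scales) as features and a transparency label as the target. The feature set, data, and labels are all invented for illustration.

        # Hypothetical model: classify a company's transparency level from
        # the maturity (0-5) of a few COBIT-style processes, e.g. IT
        # strategic planning, quality management, risk management, and
        # performance monitoring. A real study would use all 37 processes.
        from sklearn.tree import DecisionTreeClassifier

        X = [
            [1, 2, 1, 1],
            [3, 3, 2, 3],
            [4, 4, 4, 5],
            [2, 1, 2, 2],
            [5, 4, 5, 4],
            [1, 1, 1, 2],
        ]  # one row per company, invented maturity scores
        y = ["low", "medium", "high", "low", "high", "low"]

        model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(model.predict([[4, 3, 4, 4]]))  # e.g. ['high']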
