
    Digging by Debating: Linking massive datasets to specific arguments

    We will develop and implement a multi-scale workbench, called "InterDebates", with the goal of digging into data provided by hundreds of thousands, eventually millions, of digitized books, bibliographic databases of journal articles, and comprehensive reference works written by experts. Our hypotheses are: that detailed and identifiable arguments drive many aspects of research in the sciences and the humanities; that argumentative structures can be extracted from large datasets using a mixture of automated and social computing techniques; and that the availability of such analyses will enable innovative interdisciplinary research, and may also play a role in supporting better-informed critical debates among students and the general public. A key challenge tackled by this project is thus to uncover and represent the argumentative structure of digitized documents, allowing users to find and interpret detailed arguments in the broad semantic landscape of books and articles.

    Treatment of white coat HYpertension in the Very Elderly Trial (HYVET 2) - feasibility of a randomized controlled trial (study protocol)

    The results of the HYpertension in the Very Elderly Trial (HYVET) were crucial in providing evidence of the benefit of treating hypertension in those 80 years or older. A subsequent sub-study analysis of the HYVET data suggested that 50% of patients in the main study had White Coat Hypertension (WCH), defined as clinic BP readings >140/90 mmHg and ambulatory BP readings <135/85 mmHg. Currently, definitive evidence in support of treatment for such individuals is not available. HYVET 2 has been designed to assess the feasibility of conducting a randomized controlled trial that might determine whether the treatment of WCH in the very elderly is clinically beneficial. One hundred participants aged ≥75 years diagnosed with WCH will be recruited from General Practices (GPs) in the UK. Randomization will be 1:1 to a treatment arm (indapamide and perindopril) and a control arm (no treatment), and follow-up will be for 52 weeks. HYVET 2 will report on feasibility outcomes including participant recruitment, adherence and withdrawal rates, willingness of GPs to recruit and randomize patients, and the frequency of a composite of cardiovascular events. Simple descriptive statistics will be presented.

    From Big Data to Argument Analysis and Automated Extraction: A Selective Study of Argument in the Philosophy of Animal Psychology from the Volumes of the Hathi Trust Collection

    The Digging by Debating (DbyD) project aimed to identify, extract, model, map and visualise philosophical arguments in very large text repositories such as the Hathi Trust. The project has: 1) developed a method for visualizing points of contact between philosophy and the sciences; 2) used topic modeling to identify the volumes, and pages within those volumes, which are ‘rich’ in a chosen topic; 3) used a semi-formal discourse analysis technique to manually identify key arguments in the selected pages; 4) used the OVA argument mapping tool to represent and map the key identified arguments and provide a framework for comparative analysis; 5) devised and used a novel analysis framework applied to the mapped arguments covering role, content and source of propositions, and the importance, context and meaning of arguments; 6) created a prototype tool for identifying propositions, using naive Bayes classifiers, and for identifying argument structure in chosen texts, using propositional similarity; 7) created tools to apply topic modeling to tasks of rating similarity of papers in the PhilPapers repository. Methods 1 to 5 above have enabled us to locate and extract the key arguments from each text. It is significant that, in applying the methods, a non-expert with limited or no domain knowledge of philosophy has both identified the volumes of interest from a key ‘Big Data’ set (the Hathi Trust) and identified key arguments within these texts. This provided several key insights about the nature and form of arguments in historical texts, and is a proof-of-concept design for a tool that will be usable by scholars. We have further created a dataset with which to train and test prototype tools for both proposition and argument extraction. Though at an early stage, these preliminary results are promising given the complexity of the task.
Specifically, we have prototyped a set of tools and methods that allow scholars to move between macro-scale, global views of the distributions of philosophical themes in such repositories, and micro-scale analyses of the arguments appearing on specific pages in texts belonging to the repository. Our approach spans bibliographic analysis, science mapping, and LDA topic modeling conducted at Indiana University, and machine-assisted argument markup into the Argument Interchange Format (AIF) using the OVA (Online Visualization of Argument) tool from the University of Dundee; the latter has been used to analyse and represent arguments by the team based at the University of East London, who also performed a detailed empirical analysis of arguments in selected texts. This work has been articulated as a proof-of-concept tool, linked to the repository PhilPapers, designed by members linked to the University of London. This project is showing for the first time how big data text processing techniques can be combined with deep structural analysis to provide researchers and students with navigation and interaction tools for engaging with the large and rich resources provided by datasets such as the Hathi Trust and PhilPapers. Ultimately our efforts show how the computational humanities can bridge the gulf between the “big data” perspective of first-generation digital humanities and the close readings of text that are the “bread and butter” of more traditional scholarship in the humanities.
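
The proposition-identification prototype described in point 6 rests on a standard technique: a naive Bayes classifier over sentence words. A minimal sketch of that idea, using a toy bag-of-words model with invented training sentences (not the project's actual annotated Hathi Trust data):

```python
# Minimal naive Bayes sentence classifier for "proposition vs. other", in the
# spirit of the prototype described above. The training examples are invented
# placeholders, not the project's annotated dataset.
import math
from collections import Counter, defaultdict

def tokenize(sentence):
    return sentence.lower().split()

def train(examples):
    """examples: list of (sentence, label) pairs -> (word counts, label counts, vocab)."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for sentence, label in examples:
        label_counts[label] += 1
        for word in tokenize(sentence):
            word_counts[label][word] += 1
            vocab.add(word)
    return word_counts, label_counts, vocab

def classify(sentence, model):
    """Return the label with the highest log-posterior under naive Bayes."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best_label, best_logprob = None, float("-inf")
    for label in label_counts:
        logprob = math.log(label_counts[label] / total)       # prior
        n_words = sum(word_counts[label].values())
        for word in tokenize(sentence):
            # Laplace smoothing over the shared vocabulary
            logprob += math.log((word_counts[label][word] + 1)
                                / (n_words + len(vocab)))
        if logprob > best_logprob:
            best_label, best_logprob = label, logprob
    return best_label

examples = [
    ("animals therefore possess reason", "proposition"),
    ("it follows that instinct cannot explain this behaviour", "proposition"),
    ("chapter four the psychology of animals", "other"),
    ("see the preceding volume for details", "other"),
]
model = train(examples)
print(classify("therefore animals possess instinct", model))  # → proposition
```

In the project's setting, the training examples would come from the manually analysed pages of step 3, with propositional similarity then used to link classified propositions into argument structures.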

    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric (d̂_t) and chromomagnetic (μ̂_t) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected in the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb⁻¹. The linearized variable A_FB^(1) is used to approximate the asymmetry. Candidate tt̄ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for tt̄ final states. The values found for the parameters are A_FB^(1) = 0.048 +0.095/−0.087 (stat) +0.020/−0.029 (syst) and μ̂_t = −0.024 +0.013/−0.009 (stat) +0.016/−0.011 (syst), and a limit is placed on the magnitude of |d̂_t| < 0.03 at 95% confidence level.

    MUSiC: a model-unspecific search for new physics in proton-proton collisions at √s = 13 TeV

    Results of the Model Unspecific Search in CMS (MUSiC), using proton-proton collision data recorded at the LHC at a centre-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 35.9 fb⁻¹, are presented. The MUSiC analysis searches for anomalies that could be signatures of physics beyond the standard model. The analysis is based on the comparison of observed data with the standard model prediction, as determined from simulation, in several hundred final states and multiple kinematic distributions. Events containing at least one electron or muon are classified based on their final state topology, and an automated search algorithm surveys the observed data for deviations from the prediction. The sensitivity of the search is validated using multiple methods. No significant deviations from the predictions have been observed. For a wide range of final state topologies, agreement is found between the data and the standard model simulation. This analysis complements dedicated search analyses by significantly expanding the range of final states covered using a model-independent approach with the largest data set to date to probe phase space regions beyond the reach of previous general searches.
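
The core of such an automated scan, comparing observed event counts with the simulated standard-model expectation across many final-state classes, can be sketched in a few lines. This is a toy illustration with invented event classes and counts, using a plain one-sided Poisson p-value; the actual MUSiC algorithm scans full kinematic distributions, includes systematic uncertainties, considers deficits as well as excesses, and corrects for the look-elsewhere effect:

```python
# Toy model-unspecific scan: rank final-state classes by how improbable the
# observed count is under the standard-model expectation. All classes and
# counts below are invented placeholders.
import math

def poisson_p_value(observed, expected):
    """One-sided Poisson probability of observing `observed` or more events."""
    # P(N >= n) = 1 - P(N <= n-1), computed in log space so that large
    # counts do not overflow.
    tail = sum(math.exp(-expected + k * math.log(expected) - math.lgamma(k + 1))
               for k in range(observed))
    return max(1.0 - tail, 0.0)

# final-state class -> (observed count, SM expectation from simulation)
classes = {
    "1 muon + 2 jets":     (1040, 1000.0),
    "1 electron + 3 jets": (480,  500.0),
    "2 muons + MET":       (75,   50.0),
}

# report classes from most to least significant excess
for name, (obs, exp) in sorted(classes.items(),
                               key=lambda kv: poisson_p_value(*kv[1])):
    print(f"{name}: observed={obs}, expected={exp:.0f}, "
          f"p={poisson_p_value(obs, exp):.2e}")
```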

    Search for new particles in events with energetic jets and large missing transverse momentum in proton-proton collisions at √s = 13 TeV

    A search is presented for new particles produced at the LHC in proton-proton collisions at √s = 13 TeV, using events with energetic jets and large missing transverse momentum. The analysis is based on a data sample corresponding to an integrated luminosity of 101 fb⁻¹, collected in 2017-2018 with the CMS detector. Machine learning techniques are used to define separate categories for events with narrow jets from initial-state radiation and events with large-radius jets consistent with a hadronic decay of a W or Z boson. A statistical combination is made with an earlier search based on a data sample of 36 fb⁻¹, collected in 2016. No significant excess of events is observed with respect to the standard model background expectation determined from control samples in data. The results are interpreted in terms of limits on the branching fraction of an invisible decay of the Higgs boson, as well as constraints on simplified models of dark matter, on first-generation scalar leptoquarks decaying to quarks and neutrinos, and on models with large extra dimensions. Several of the new limits, specifically for spin-1 dark matter mediators, pseudoscalar mediators, colored mediators, and leptoquarks, are the most restrictive to date.
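
The limit-setting step in such a search can be illustrated with the simplest possible counting experiment: find the signal yield that would make the observed count improbably low, then convert that yield into a cross section. All numbers below are invented placeholders, and real CMS limits use the more sophisticated CLs method with a full treatment of systematic uncertainties:

```python
# Toy 95% CL upper limit for a single-bin counting experiment.
import math

def poisson_cdf(n, mu):
    """P(N <= n) for a Poisson mean mu, computed in log space."""
    if mu <= 0:
        return 1.0  # with no expected events, P(N <= n) = 1 for n >= 0
    return sum(math.exp(-mu + k * math.log(mu) - math.lgamma(k + 1))
               for k in range(n + 1))

def upper_limit_events(observed, background, cl=0.95):
    """Smallest signal yield s such that P(N <= observed | b + s) <= 1 - cl."""
    s = 0.0
    while poisson_cdf(observed, background + s) > 1.0 - cl:
        s += 0.01  # coarse scan step; fine for an illustration
    return s

# invented inputs: 3 observed events, 2.5 expected background,
# 10% signal efficiency, 101 fb^-1 of luminosity as quoted above
s_up = upper_limit_events(observed=3, background=2.5)
sigma_up = s_up / (0.10 * 101.0)  # upper limit in fb
print(f"95% CL upper limit: {s_up:.2f} signal events, sigma < {sigma_up:.3f} fb")
```

With zero background and zero observed events this scan reproduces the textbook value of about 3 signal events at 95% CL.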

    Measurement of prompt open-charm production cross sections in proton-proton collisions at √s = 13 TeV

    The production cross sections for prompt open-charm mesons in proton-proton collisions at a center-of-mass energy of 13 TeV are reported. The measurement is performed using a data sample collected by the CMS experiment corresponding to an integrated luminosity of 29 nb⁻¹. The differential production cross sections of the D*±, D±, and D⁰ (D̄⁰) mesons are presented in the transverse momentum and pseudorapidity ranges 4 < p_T < 100 GeV and |η| < 2.1, respectively. The results are compared to several theoretical calculations and to previous measurements.
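
The relation behind such a differential measurement is simple to state: in each p_T bin, dσ/dp_T is the background-subtracted signal yield divided by the selection efficiency, the integrated luminosity, and the bin width. A toy numerical sketch, where only the 29 nb⁻¹ luminosity is taken from the abstract and all yields and efficiencies are invented (a real measurement adds branching-fraction corrections, unfolding, and systematic uncertainties):

```python
# Toy differential cross section: dsigma/dp_T = N / (eff * L * bin width).
integrated_luminosity_nb = 29.0  # nb^-1, as quoted in the abstract

# p_T bin edges (GeV), invented background-subtracted signal yields,
# and invented selection efficiencies per bin
bins = [(4, 6), (6, 10), (10, 20)]
yields = [5200.0, 3100.0, 900.0]
efficiencies = [0.10, 0.15, 0.20]

results = []
for (lo, hi), n_signal, eff in zip(bins, yields, efficiencies):
    width = hi - lo  # GeV
    dsigma_dpt = n_signal / (eff * integrated_luminosity_nb * width)  # nb/GeV
    results.append(dsigma_dpt)
    print(f"{lo}-{hi} GeV: dsigma/dp_T = {dsigma_dpt:.1f} nb/GeV")
```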

    Combined searches for the production of supersymmetric top quark partners in proton-proton collisions at √s = 13 TeV

    A combination of searches for top squark pair production using proton-proton collision data at a center-of-mass energy of 13 TeV at the CERN LHC, corresponding to an integrated luminosity of 137 fb⁻¹ collected by the CMS experiment, is presented. Signatures with at least 2 jets and large missing transverse momentum are categorized into events with 0, 1, or 2 leptons. New results are presented for regions of parameter space where the kinematical properties of top squark pair production and top quark pair production are very similar. Depending on the model, the combined result excludes a top squark mass up to 1325 GeV for a massless neutralino, and a neutralino mass up to 700 GeV for a top squark mass of 1150 GeV. Top squarks with masses from 145 to 295 GeV, for neutralino masses from 0 to 100 GeV, with a mass difference between the top squark and the neutralino in a window of 30 GeV around the mass of the top quark, are excluded for the first time with CMS data. The results of these searches are also interpreted in an alternative signal model of dark matter production via a spin-0 mediator in association with a top quark pair. Upper limits are set on the cross section for mediator particle masses of up to 420 GeV.

    Search for Physics beyond the Standard Model in Events with Overlapping Photons and Jets

    Results are reported from a search for new particles that decay into a photon and two gluons, in events with jets. Novel jet substructure techniques are developed that allow photons to be identified in an environment densely populated with hadrons. The analyzed proton-proton collision data were collected by the CMS experiment at the LHC, in 2016 at √s = 13 TeV, and correspond to an integrated luminosity of 35.9 fb⁻¹. The spectra of total transverse hadronic energy of candidate events are examined for deviations from the standard model predictions. No statistically significant excess is observed over the expected background. The first cross section limits on new physics processes resulting in such events are set. The results are interpreted as upper limits on the rate of gluino pair production, utilizing a simplified stealth supersymmetry model. The excluded gluino masses extend up to 1.7 TeV, for a neutralino mass of 200 GeV, and exceed previous mass constraints set by analyses targeting events with isolated photons.