121 research outputs found

    Mining HCI Data for Theory of Mind Induction

    Human-computer interaction (HCI) produces enormous amounts of data, bearing potential for understanding a human user's intentions, goals, and desires. Knowing what users want and need is key to intelligent system assistance. The theory-of-mind concept known from studies in animal behavior is adopted and adapted for expressive user modeling. Theories of mind are hypothetical user models representing, to some extent, a human user's thoughts. A theory of mind may even reveal tacit knowledge. In this way, user modeling becomes knowledge discovery, going beyond the human's knowledge and covering domain-specific insights. Theories of mind are induced by mining HCI data. Data mining turns out to be inductive modeling. Intelligent assistant systems that inductively model a human user's intentions, goals, and the like, as well as domain knowledge, are by nature learning systems. To cope with the risk of getting it wrong, learning systems are equipped with the skill of reflection.

    Using spreadsheets as learning tools for neural network simulation

    The article supports the need for training techniques for neural network computer simulations in a spreadsheet context. Their use in simulating artificial neural networks is systematically reviewed. The authors distinguish between fundamental methods for addressing the issue of network computer simulation training in the spreadsheet environment: joint application of spreadsheets and tools for neural network simulation, application of third-party add-ins to spreadsheets, development of macros using the embedded languages of spreadsheets, use of standard spreadsheet add-ins for non-linear optimization, and creation of neural networks in the spreadsheet environment without add-ins. In the article, methods for creating neural network models in Google Sheets, a cloud-based spreadsheet, are discussed. The classification of multidimensional data presented in R. A. Fisher's "The Use of Multiple Measurements in Taxonomic Problems" served as the model's primary inspiration. Various idiosyncrasies of data selection are discussed, as well as Edgar Anderson's participation in data preparation and collection in the 1920s and 1930s. The approach of displaying multidimensional data in the form of an ideograph, created by Anderson and regarded as one of the first effective methods of data visualization, is also discussed.
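The kind of network that fits in a spreadsheet without add-ins is a single-layer perceptron: a weighted sum, a step activation, and a delta-rule weight update, each expressible as a cell formula. A minimal sketch in Python, using toy petal (length, width) pairs loosely mirroring the Fisher iris setosa/versicolor split mentioned above (the data values here are illustrative, not Fisher's actual measurements):

```python
# A single-layer perceptron of the sort expressible with plain spreadsheet
# formulas: weighted sum, step activation, delta-rule update.

def train_perceptron(samples, labels, lr=0.1, epochs=20):
    """Train weights w1, w2 and bias b with the classic delta rule."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = y - pred          # -1, 0, or +1
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

def predict(w1, w2, b, x1, x2):
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# Toy petal (length, width) pairs: class 0 ~ setosa, class 1 ~ versicolor.
X = [(1.4, 0.2), (1.3, 0.2), (1.5, 0.3), (4.5, 1.5), (4.1, 1.3), (4.7, 1.4)]
y = [0, 0, 0, 1, 1, 1]
w1, w2, b = train_perceptron(X, y)
```

In a spreadsheet, the inner loop corresponds to one row per sample, with the weight cells updated iteratively; the classes here are linearly separable, so the delta rule converges after a few epochs.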

    Effects of Dimethyl Fumarate on Brain Atrophy in Relapsing-Remitting Multiple Sclerosis: Pooled Analysis of the Phase 3 DEFINE and CONFIRM Studies

    OBJECTIVE: In the pivotal DEFINE and CONFIRM trials for dimethyl fumarate (DMF), patterns of brain volume changes were different, potentially due to low sample sizes and because MRIs were analyzed at two different reading centers. We evaluated effects of DMF on brain volume change in patients with multiple sclerosis (MS) through reanalysis of pooled images from the DEFINE/CONFIRM trials in one reading center. METHODS: MRIs from DEFINE/CONFIRM at weeks 0, 24, 48, and 96 from patients randomized to twice-daily DMF or placebo (PBO) were reanalyzed at the Cleveland Clinic to measure brain parenchymal fraction (BPF). To account for pseudoatrophy, brain volume estimates were re-baselined to calculate changes for weeks 48–96. RESULTS: Across studies, 301 and 314 patients receiving DMF and PBO, respectively, had analyzable MRIs. In weeks 0–48, the mean ± SE percentage change in BPF was −0.44 ± 0.04% vs. −0.34 ± 0.04% for DMF vs. PBO, respectively, whereas in weeks 48–96 it was −0.27 ± 0.03% vs. −0.41 ± 0.04%, respectively. The mixed-effect model for repeated measures showed similar results: in weeks 48–96, the estimated change (95% confidence interval) in BPF was −0.0021 (−0.0027, −0.0016) for DMF vs. −0.0033 (−0.0039, −0.0028) for PBO (35.9% reduction; p = 0.0025). CONCLUSIONS: The lower rate of whole brain volume loss with DMF in this pooled BPF analysis in the second year vs. PBO is consistent with its effects on relapses, disability, and MRI lesions. Brain volume changes in the first year may be explained by pseudoatrophy effects also described in other MS clinical trials.
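The quoted 35.9% reduction is the relative difference between the two model-estimated BPF changes for weeks 48–96. A quick check from the rounded estimates reported in the abstract (the small discrepancy with 35.9% presumably reflects the unrounded model values):

```python
# Relative reduction in brain volume loss, weeks 48-96:
# 1 - (DMF slope / placebo slope), using the rounded BPF changes above.
dmf_change = -0.0021   # estimated BPF change, DMF
pbo_change = -0.0033   # estimated BPF change, placebo

relative_reduction = 1 - dmf_change / pbo_change
print(f"{relative_reduction:.1%}")  # prints 36.4% from the rounded values
```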

    Azimuthal anisotropy of charged jet production in √s_NN = 2.76 TeV Pb-Pb collisions

    We present measurements of the azimuthal dependence of charged jet production in central and semi-central √s_NN = 2.76 TeV Pb-Pb collisions with respect to the second harmonic event plane, quantified as v_2^{ch jet}. Jet finding is performed employing the anti-k_T algorithm with a resolution parameter R = 0.2 using charged tracks from the ALICE tracking system. The contribution of the azimuthal anisotropy of the underlying event is taken into account event by event. The remaining (statistical) region-to-region fluctuations are removed on an ensemble basis by unfolding the jet spectra for different event plane orientations independently. Significant non-zero v_2^{ch jet} is observed in semi-central collisions (30–50% centrality) for 20 < p_T^{ch jet} < 90 GeV/c. The azimuthal dependence of the charged jet production is similar to the dependence observed for jets comprising both charged and neutral fragments, and compatible with measurements of the v_2 of single charged particles at high p_T. Good agreement between the data and predictions from JEWEL, an event generator simulating parton shower evolution in the presence of a dense QCD medium, is found in semi-central collisions. (C) 2015 CERN for the benefit of the ALICE Collaboration. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
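At its core, the observable is the second Fourier harmonic of the jet yield relative to the event plane, v_2 = ⟨cos 2(φ_jet − Ψ_2)⟩. A minimal sketch of that average on toy angles (the actual analysis additionally unfolds background fluctuations and corrects for event-plane resolution, both omitted here):

```python
import math

def v2_event_plane(jet_phis, psi2):
    """Second-harmonic coefficient of the jet yield relative to the
    event plane: v2 = < cos 2*(phi_jet - Psi_2) >."""
    return sum(math.cos(2.0 * (phi - psi2)) for phi in jet_phis) / len(jet_phis)

# Toy sample: more jets in-plane (phi near Psi_2) than out-of-plane
# yields a positive v2, as observed in semi-central collisions.
psi2 = 0.0
phis = [0.0, 0.1, -0.1, 0.2, math.pi / 2, math.pi]
v2 = v2_event_plane(phis, psi2)
```

An isotropic azimuthal distribution gives v_2 = 0; an excess of in-plane jets gives v_2 > 0.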

    The SIB Swiss Institute of Bioinformatics' resources: focus on curated databases

    The SIB Swiss Institute of Bioinformatics (www.isb-sib.ch) provides world-class bioinformatics databases, software tools, services and training to the international life science community in academia and industry. These solutions allow life scientists to turn the exponentially growing amount of data into knowledge. Here, we provide an overview of SIB's resources and competence areas, with a strong focus on curated databases and SIB's most popular and widely used resources. In particular, SIB's bioinformatics resource portal ExPASy features over 150 resources, including UniProtKB/Swiss-Prot, ENZYME, PROSITE, neXtProt, STRING, UniCarbKB, SugarBindDB, SwissRegulon, EPD, arrayMap, Bgee, SWISS-MODEL Repository, OMA, OrthoDB and other databases, which are briefly described in this article.

    Pseudorapidity and transverse-momentum distributions of charged particles in proton-proton collisions at √s = 13 TeV

    The pseudorapidity (η) and transverse-momentum (p_T) distributions of charged particles produced in proton-proton collisions are measured at the centre-of-mass energy √s = 13 TeV. The pseudorapidity distribution in |η| < 1.8 is reported for inelastic events and for events with at least one charged particle in |η| < 1. The pseudorapidity density of charged particles produced in the pseudorapidity region |η| < 0.5 is 5.31 ± 0.18 and 6.46 ± 0.19 for the two event classes, respectively. The transverse-momentum distribution of charged particles is measured in the range 0.15 < p_T < 20 GeV/c and |η| < 0.8 for events with at least one charged particle in |η| < 1. The evolution of the transverse-momentum spectra of charged particles is also investigated as a function of event multiplicity. The results are compared with calculations from the PYTHIA and EPOS Monte Carlo generators. (C) 2015 CERN for the benefit of the ALICE Collaboration. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
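Pseudorapidity is a function of the polar angle θ relative to the beam axis, η = −ln tan(θ/2), so the |η| < 0.5 window quoted above is a narrow slice around the direction perpendicular to the beams. A minimal sketch:

```python
import math

def pseudorapidity(theta):
    """eta = -ln tan(theta/2), with theta the polar angle
    relative to the beam axis."""
    return -math.log(math.tan(theta / 2.0))

# theta = 90 degrees (perpendicular to the beam) corresponds to eta = 0.
eta_mid = pseudorapidity(math.pi / 2)

# Charged-particle pseudorapidity densities reported above for |eta| < 0.5:
dndeta_inelastic = 5.31   # inelastic events
dndeta_inel_gt0 = 6.46    # events with >= 1 charged particle in |eta| < 1
```

Angles closer to the beam axis map to large |η|, which is why forward detectors cover high pseudorapidity.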

    Centrality evolution of the charged-particle pseudorapidity density over a broad pseudorapidity range in Pb-Pb collisions at √s_NN = 2.76 TeV


    Product-Service Systems: reviewing achievements and refining the research agenda

    This special issue on Product-Service Systems: reviewing achievements and refining the research agenda shows the progress that has been made in the PSS field in the last decade, including various national and international research projects and companies' initiatives, and important achievements and gaps at theoretical and practical levels. Most of the papers will be of interest to practitioners, company managers and consultants, since they present methods and tools for helping companies shift towards PSS in environmentally sound ways and evaluate the outcomes of the shift. In addition, this issue is of interest to the research community, since it evaluates the progress in the PSS field and outlines a future research agenda.

    Therapy Plan Generation as Program Synthesis

    An algorithm has been developed and implemented for the automatic synthesis of therapy plans for complex dynamic systems. This algorithm is the core of a control synthesis module embedded in a larger knowledge-based system for control, diagnosis and therapy. There are several applications. The planning algorithm may be understood as an inductive program synthesis procedure. Its fundamentals are introduced and its key ideas are sketched. The dichotomy between executability and consistency is investigated. 1 Motivation and Introduction The main intention of the present paper is to establish a new link between two areas of research: Program Synthesis and Therapy Planning. Thus, the authors wish to advance both areas of research in which they are active. For program synthesis, the intended integration may result in new and exciting problems characterized by particular constraints not investigated in the classical approaches so far. Our approach may widen the view of automat...