
    Archetypes for histogram-valued data

    The main innovative contribution of this work is to propose an extension of archetypal analysis to histogram-valued data. As regards the methodological framework for analysing histogram data, which are complex in nature, the work draws on the insights of Symbolic Data Analysis (SDA) and on the intrinsic relationships between interval-valued and histogram-valued data. After discussing the technique, developed in Matlab, and illustrating its behaviour and properties on a toy example, the applied section proposes it as a tool for quantitative benchmarking. Specifically, we present the main results obtained from an application of archetypes for histogram data to a case of internal benchmarking of the school system, using data from the INVALSI test for the 2015/2016 school year. In this context the unit of analysis is the individual school, operationally defined through the distributions of its pupils' scores, considered jointly as histogram-valued symbolic objects.
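
    As a point of reference for the technique summarized above, classical archetypal analysis on point-valued data alternates between two constrained least-squares problems: each observation is expressed as a convex combination of the archetypes, and each archetype as a convex combination of the observations. The sketch below assumes this classical setting rather than the paper's histogram-valued extension; it softly enforces the sum-to-one constraints with an augmented non-negative least-squares trick, and all names and parameters are illustrative.

        import numpy as np
        from scipy.optimize import nnls

        def simplex_nnls(F, g, penalty=200.0):
            # min ||F w - g|| with w >= 0 and (softly) sum(w) = 1, via an augmented row.
            F_aug = np.vstack([F, penalty * np.ones((1, F.shape[1]))])
            g_aug = np.append(g, penalty)
            w, _ = nnls(F_aug, g_aug)
            return w

        def archetypal_analysis(X, k, n_iter=50, seed=0):
            """Classical archetypal analysis for point-valued data (alternating scheme)."""
            rng = np.random.default_rng(seed)
            n, _ = X.shape
            Z = X[rng.choice(n, size=k, replace=False)].copy()  # initial archetypes (k x d)
            for _ in range(n_iter):
                # Each observation as a convex combination of the current archetypes.
                A = np.array([simplex_nnls(Z.T, x) for x in X])          # (n x k)
                # Intermediate archetypes, projected back onto the convex hull of the data.
                Z_tilde = np.linalg.pinv(A) @ X
                B = np.array([simplex_nnls(X.T, z) for z in Z_tilde])    # (k x n)
                Z = B @ X
            return A, Z

    One way to adapt such a scheme to histogram data would be to replace the Euclidean fit with a dissimilarity between histograms; the paper's actual extension instead builds on SDA and on the relation between interval-valued and histogram-valued data, as summarized above.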

    Estimation of parameters for an archetypal model of cardiomyocyte membrane potentials

    Contemporary realistic mathematical models of single-cell cardiac electrical excitation are immensely detailed. Model complexity leads to parameter uncertainty, high computational cost and barriers to mechanistic understanding. There is a need for reduced models that are conceptually and mathematically simple but physiologically accurate. To this end, we consider an archetypal model of single-cell cardiac excitation that replicates the phase-space geometry of detailed cardiac models, but at the same time has a simple piecewise-linear form and a relatively low-dimensional configuration space. In order to make this archetypal model practically applicable, we develop and report a robust method for estimation of its parameter values from the morphology of single-stimulus action potentials derived from detailed ionic current models and from experimental myocyte measurements. The procedure is applied to five significant test cases, and excellent agreement with target biomarkers is achieved. Action potential duration restitution curves are also computed and compared to those of the target test models and data, demonstrating conservation of dynamical pacing behaviour by the fine-tuned archetypal model. An archetypal model that accurately reproduces a variety of wet-lab and synthetic electrophysiology data offers a number of specific advantages, such as computational efficiency, as demonstrated in the study. Open-source numerical code of the models and methods used is provided.
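
    One of the biomarkers referred to above, action potential duration (APD), can be read directly off a single-stimulus voltage trace. The following is a minimal, illustrative calculation of APD at a chosen repolarization level (0.9 gives APD90); it is not the estimation procedure of the paper, and the conventions used (activation taken at the maximum upstroke velocity, rest taken from the start of the trace) are assumptions.

        import numpy as np

        def apd(t, v, level=0.9):
            """Action potential duration at the given repolarization level (0.9 -> APD90).

            t, v: time and membrane-potential samples of a single action potential,
            assumed to start at rest and to contain exactly one upstroke.
            """
            v_rest = v[0]
            i_peak = int(np.argmax(v))
            v_thresh = v[i_peak] - level * (v[i_peak] - v_rest)
            i_act = int(np.argmax(np.diff(v)))             # activation ~ maximum upstroke
            below = np.nonzero(v[i_peak:] <= v_thresh)[0]  # first crossing after the peak
            return np.nan if below.size == 0 else t[i_peak + below[0]] - t[i_act]

    A restitution curve, as mentioned in the abstract, is then obtained by pacing at progressively shorter cycle lengths and plotting APD against the preceding diastolic interval.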

    Benchmarking of Gaussian boson sampling using two-point correlators

    Gaussian boson sampling is a promising scheme for demonstrating a quantum computational advantage using photonic states that are accessible in a laboratory and, thus, offer scalable sources of quantum light. In this contribution, we study two-point photon-number correlation functions to gain insight into the interference of Gaussian states in optical networks. We investigate the characteristic features of statistical signatures which enable us to distinguish classical from quantum interference. In contrast to the typical implementation of boson sampling, we find additional contributions to the correlators under study which stem from the phase dependence of Gaussian states and which are not observable when Fock states interfere. Using the first three moments, we formulate the tools required to experimentally observe signatures of quantum interference of Gaussian states using two outputs only. By considering the current architectural limitations in realistic experiments, we further show that a statistically significant discrimination between quantum and classical interference is possible even in the presence of loss, noise, and a finite photon-number resolution. Therefore, we formulate and apply a theoretical framework to benchmark the quantum features of Gaussian boson sampling under realistic conditions.
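
    A straightforward sample-based estimate of a two-point photon-number correlator is the covariance of the detected photon numbers between pairs of output modes. The sketch below assumes this (unnormalized) convention, which may differ from the exact normalization used in the paper; the array layout is illustrative.

        import numpy as np

        def two_point_correlators(samples):
            """Estimate C_ij = <n_i n_j> - <n_i><n_j> from photon-number samples.

            samples: array of shape (num_shots, num_modes) holding the photon number
            detected in each output mode on each shot (possibly clipped by a finite
            photon-number resolution).
            """
            n = np.asarray(samples, dtype=float)
            means = n.mean(axis=0)
            return (n.T @ n) / n.shape[0] - np.outer(means, means)

    With only two outputs monitored, the single off-diagonal entry of this matrix, together with low-order moments of the marginal counts, is the kind of statistic from which the classical-versus-quantum discrimination described above is built.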

    Maturity Models Development in IS Research: A Literature Review

    Maturity models are widespread in IS research and, in particular, in IT practitioner communities. However, theoretically sound, methodologically rigorous and empirically validated maturity models are quite rare. This literature review paper focuses on the challenges faced during the development of maturity models. Specifically, it explores the maturity models literature in IS and standard guidelines, if any, for developing maturity models, the challenges identified, and the solutions proposed. Our systematic literature review of IS publications revealed over one hundred and fifty articles on maturity models. Extant literature reveals that researchers have primarily focused on developing new maturity models pertaining to domain-specific problems and/or new enterprise technologies. We find rampant re-use of the design structure of widely adopted models such as Nolan's Stage of Growth Model, Crosby's Grid, and the Capability Maturity Model (CMM). Only recently have there been some research efforts to standardize maturity model development. We also identify three dominant views of maturity models and provide guidelines for various approaches to constructing maturity models with a standard vocabulary. We finally propose using process theories and configurational approaches to address the main theoretical criticisms with regard to maturity models and conclude with some recommendations for maturity model developers.

    Quality-diversity in dissimilarity spaces

    The theory of magnitude provides a mathematical framework for quantifying and maximizing diversity. We apply this framework to formulate quality-diversity algorithms in generic dissimilarity spaces. In particular, we instantiate and demonstrate a very general version of Go-Explore with promising performance.
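
    For a finite metric space, magnitude has a closed form: with similarity matrix Z, where Z_ij = exp(-d(x_i, x_j)), the magnitude is the sum of the entries of Z^{-1} whenever Z is invertible, and one way to formalize "maximizing diversity" is to select subsets of large magnitude. A minimal sketch of that computation (illustrative, not the paper's code):

        import numpy as np

        def magnitude(D):
            """Magnitude of a finite metric space given its pairwise distance matrix D.

            Builds the similarity matrix Z_ij = exp(-D_ij); when Z is invertible the
            magnitude equals the sum of the entries of Z^{-1}.
            """
            Z = np.exp(-np.asarray(D, dtype=float))
            w = np.linalg.solve(Z, np.ones(len(Z)))  # weighting vector
            return float(w.sum())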

    IT Revolutionizing the Supply chain Transformation: A Case Study of Unilever Pakistan Ltd.

    Supply chain management combines three words: supply, chain and management. Supply is about meeting the needs, wants and demands of customers, whereas the chain represents connectivity. Management, in turn, is about planning the transit of supplies to meet demand, organizing the sequence of processes, controlling and ensuring process quality through checkpoints and gate-control systems, leading by defining process ownership, and staffing by right-sizing at every step. Together, these three words form supply chain management. Supply chain management also aims to provide the best possible service to the customer with maximum cost effectiveness. Another recent development in the field is the Supply Chain Operations Reference (SCOR) model, first introduced by the Supply Chain Council (SCC), an organization known globally for excellence in supply chain practices and systems formulation. The SCOR model is a reference model with standardized terminology and processes [42] that emphasizes benchmarking. This benchmarking ties operational measurement to a portfolio of improvements that is directly linked to the company's balance sheet, improving performance and the bottom line. Information technology (IT) applications in supply chain management (SCM) have achieved significance by virtue of their ability to reduce costs and enhance responsiveness across supply chain functions [41], [15], [19], [36], [45], [43].

    Maturity Models Architecture: A large systematic mapping

    Maturity models are widespread in research and, in particular, in IT practitioner communities. However, theoretically sound, methodologically rigorous and empirically validated maturity models are quite rare. This systematic mapping paper focuses on the challenges faced during the development of maturity models. More specifically, it explores the literature on maturity models and standard guidelines to develop maturity models, the challenges identified, and the solutions proposed. Our systematic mapping revealed over six hundred articles on maturity models. Extant literature reveals that researchers have primarily focused on developing new maturity models pertaining to domain-specific problems and/or new enterprise technologies. We find rampant re-use of the design structure of widely adopted models such as Nolan's Stage of Growth Model, Crosby's Grid, and the Capability Maturity Model (CMM). We also identify three dominant views of maturity models and provide guidelines for various approaches to constructing maturity models with a standard vocabulary. We finally propose using process theories and configurational approaches to address the main theoretical criticisms with regard to maturity models and conclude with some recommendations for maturity model developers.

    Interpretable Machine Learning for Electro-encephalography

    While behavioral, genetic and psychological markers can provide important information about brain health, research in this area over the last decades has largely focused on imaging devices such as magnetic resonance imaging (MRI) to provide non-invasive information about cognitive processes. Unfortunately, MRI-based approaches, which capture the slow changes in blood oxygenation levels, cannot capture electrical brain activity, which plays out on a time scale up to three orders of magnitude faster. Electroencephalography (EEG), which has been available in clinical settings for over 60 years, measures brain activity through rapidly changing electrical potentials recorded non-invasively on the scalp. Compared to MRI-based research into neurodegeneration, EEG-based research has, over the last decade, received much less interest from the machine learning community. Yet EEG in combination with sophisticated machine learning offers great potential, such that neglecting this source of information, compared to MRI or genetics, is not warranted. When collaborating with clinical experts, the ability to link any results provided by machine learning to the existing body of research is especially important, as it ultimately provides an intuitive or interpretable understanding. Here, interpretable means that medical experts can translate the insights provided by a statistical model into a working hypothesis relating to brain function. To this end, our first contribution proposes a method for ultra-sparse regression, applied to EEG data in order to identify a small subset of important diagnostic markers highlighting the main differences between healthy brains and brains affected by Parkinson's disease. Our second contribution builds on the idea that in Parkinson's disease impaired functioning of the thalamus causes changes in the complexity of the EEG waveforms. The thalamus is a small region in the center of the brain affected early in the course of the disease. Furthermore, it is believed that the thalamus functions as a pacemaker, akin to a conductor of an orchestra, such that changes in complexity are expressed and quantifiable based on EEG. We use these changes in complexity to show their association with future cognitive decline. In our third contribution we propose an extension of archetypal analysis embedded into a deep neural network. This generative version of archetypal analysis learns a representation in which every sample of a data set can be decomposed into a weighted sum of extreme representatives, the so-called archetypes. This opens up the interesting possibility of interpreting a data set relative to its most extreme representatives, whereas clustering algorithms describe a data set relative to its most average representatives. For Parkinson's disease, we show, based on deep archetypal analysis, that healthy brains produce archetypes that differ from those produced by brains affected by neurodegeneration.
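
    As a rough analogue of the first contribution, even a standard L1-penalized classifier illustrates how a sparsity penalty isolates a small subset of candidate diagnostic markers; the ultra-sparse method proposed in the thesis is presumably stricter, and the data, feature layout and hyperparameters below are purely hypothetical.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical data: per-subject EEG-derived features (e.g., spectral power per
        # channel and frequency band) and diagnostic labels (0 = healthy, 1 = Parkinson's).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(80, 200))
        y = rng.integers(0, 2, size=80)

        # The L1 penalty drives most coefficients to exactly zero, leaving a small set
        # of retained features that can be discussed with clinical experts.
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.05)
        clf.fit(X, y)
        selected = np.flatnonzero(clf.coef_[0])
        print(f"retained {selected.size} of {X.shape[1]} candidate markers")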