    DPpack: An R Package for Differentially Private Statistical Analysis and Machine Learning

    Differential privacy (DP) is the state-of-the-art framework for guaranteeing privacy for individuals when releasing aggregated statistics or building statistical/machine learning models from data. We develop the open-source R package DPpack, which provides a large toolkit for differentially private analysis. The current version of DPpack implements three popular mechanisms for ensuring DP: Laplace, Gaussian, and exponential. Beyond that, DPpack provides easily accessible privacy-preserving descriptive statistics functions, including mean, variance, covariance, and quantiles, as well as histograms and contingency tables. Finally, DPpack provides user-friendly implementations of privacy-preserving logistic regression, SVM, and linear regression, as well as differentially private hyperparameter tuning for each of these models. This extensive collection of differentially private statistics and models permits hassle-free application of differential privacy principles in commonly performed statistical analyses. We plan to continue developing DPpack and to make it more comprehensive by adding further differentially private machine learning, statistical modeling, and inference techniques in the future.
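
    As a minimal illustration of the Laplace mechanism the package implements, the sketch below adds noise scaled to sensitivity/epsilon. It is written in Python for brevity (DPpack itself is an R package), and the function name and data are illustrative, not DPpack's API.

```python
import numpy as np

# epsilon-DP release of a numeric statistic via the Laplace mechanism:
# add noise drawn from Laplace(0, sensitivity / epsilon).
def laplace_mechanism(true_value, sensitivity, epsilon):
    return true_value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Private mean of n values known to lie in [0, 1]: the mean changes by at
# most 1/n when one record changes, so its sensitivity is 1/n.
data = [0.2, 0.4, 0.9, 0.5]
true_mean = sum(data) / len(data)
private_mean = laplace_mechanism(true_mean, sensitivity=1 / len(data), epsilon=1.0)
```

    Smaller epsilon means stronger privacy but noisier output; the Gaussian and exponential mechanisms mentioned above differ in the noise distribution and the class of queries they support.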

    Mechanics of the Developing Brain: From Smooth-walled Tube to the Folded Cortex

    Over the course of human development, the brain undergoes dramatic physical changes to achieve its final, convoluted shape. However, the forces underlying every cinch, bulge, and fold remain poorly understood. This doctoral research focuses on the mechanical processes responsible for early (embryonic) and late (preterm) brain development. First, we examine early brain development in the chicken embryo, which is similar to the human brain at these stages. Research has primarily focused on molecular signals to describe morphogenesis, but mechanical analysis can also provide important insights. Using a combination of experiments and finite element modeling, we find that actomyosin contraction is responsible for initial segmentation of the forebrain. By considering mechanical forces from the internal and external environment, we propose a role for mechanical feedback in maintaining these segments during subsequent inflation and bending. Next, we extend our analysis to the division of the right and left cerebral hemispheres. In this case, we discover that morphogen signals and mechanical feedback act synergistically to shape the hemispheres. In humans, the cerebral hemispheres go on to form complex folds through a mechanical process that involves rapid expansion of the cortical surface. However, the spatiotemporal dynamics of cortical growth in humans remain unknown. Here, we develop a novel strain energy minimization approach to measure regional growth in complex surfaces. By considering brain surfaces of preterm subjects, reconstructed from magnetic resonance imaging (MRI), this analysis reveals distinct patterns of cortical growth that evolve over the third trimester. This information provides a comprehensive view of cortical growth and folding, connecting what is known about patterns of development at the cellular and folding scales. Abnormal brain morphogenesis can lead to serious structural defects and neurological disorders such as epilepsy and autism. By integrating mechanics, biology, and neuroimaging, we gain a more complete understanding of brain development. By studying physical changes from the simple, microscopic embryo to the macroscopic, folded cortex, we gain insight into the relevant biological and physical mechanisms across developmental stages.
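
    The notion of measuring regional growth between two reconstructed surfaces can be conveyed with the simplest possible proxy, per-triangle areal expansion. The mesh below is hand-made, and this is not the thesis's strain energy minimization method, only a toy version of the quantity it estimates.

```python
import numpy as np

# Relative areal expansion of each mesh triangle between an "early" and
# a "late" surface sharing the same connectivity (faces).
def triangle_areas(verts, faces):
    a, b, c = verts[faces[:, 0]], verts[faces[:, 1]], verts[faces[:, 2]]
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1)

faces = np.array([[0, 1, 2]])
early = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0]])
late = np.array([[0.0, 0, 0], [2, 0, 0], [0, 2, 0]])   # edge lengths doubled

growth = triangle_areas(late, faces) / triangle_areas(early, faces)
# growth[0] == 4.0: doubling lengths quadruples the area
```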

    Front Propagation in Random Media

    This PhD thesis deals with the problem of the propagation of fronts under random circumstances. A statistical model to represent the motion of fronts as they evolve in a medium characterized by microscopic randomness is discussed and expanded, in order to cope with three distinct applications: wild-land fire simulation, turbulent premixed combustion, and biofilm modeling. In the studied formalism, the position of the average front is computed by making use of a sharp-front evolution method, such as the level set method. The microscopic spread of particles that takes place around the average front is given by the probability density function linked to the underlying diffusive process, which is assumed to be known in advance. The adopted statistical front propagation framework allowed a deeper understanding of each studied field of application. The application of this model eventually introduced parameters whose impact on the physical observables of the front spread has been studied with Uncertainty Quantification and Sensitivity Analysis tools. In particular, metamodels for the front propagation system have been constructed in a non-intrusive way, by making use of generalized Polynomial Chaos expansions and Gaussian Processes. The thesis received funding from the Basque Government through the BERC 2014-2017 program. It was also funded by the Spanish Ministry of Economy and Competitiveness MINECO via the BCAM Severo Ochoa SEV-2013-0323 accreditation. The PhD was funded by La Caixa Foundation through the PhD grant “La Caixa 2014”. Funding from “Programma Operativo Nazionale Ricerca e Innovazione” (PONRI 2014-2020), “Innovative PhDs with Industrial Characterization”, is kindly acknowledged for a research visit at the Department of Mathematics and Applications “Renato Caccioppoli” of the University “Federico II” of Naples.
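
    The level set idea this formalism relies on can be sketched in one dimension: the front is the zero crossing of a function phi advected at speed F. The constant deterministic speed and grid parameters below are illustrative assumptions; the thesis couples the mean front motion to the statistics of the random medium.

```python
import numpy as np

# 1-D level set evolution: phi_t + F * |phi_x| = 0, front located at phi = 0.
n, dx, dt, F = 200, 0.05, 0.01, 1.0    # toy grid and constant speed
x = np.arange(n) * dx
phi = x - 1.0                          # signed distance; initial front at x = 1

for _ in range(100):
    # Upwind (Osher-Sethian) gradient for F > 0
    dminus = np.diff(phi, prepend=phi[0]) / dx
    dplus = np.diff(phi, append=phi[-1]) / dx
    grad = np.sqrt(np.maximum(dminus, 0)**2 + np.minimum(dplus, 0)**2)
    phi = phi - dt * F * grad

# Front position = zero crossing; it moved from x = 1 to about x = 2
front = x[np.argmin(np.abs(phi))]
```

    In the statistical framework, this sharp front is then smeared by the probability density function of the microscopic diffusive process around it.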

    High-level synthesis of dataflow programs for heterogeneous platforms:design flow tools and design space exploration

    The growing complexity of digital signal processing applications implemented in programmable logic and embedded processors makes a compelling case for the use of high-level methodologies in their design and implementation. Past research has shown that for complex systems, raising the level of abstraction does not necessarily come at a cost in terms of performance or resource requirements. As a matter of fact, high-level synthesis tools supporting such a high abstraction often rival, and on occasion improve on, low-level design. In spite of these successes, high-level synthesis still relies on programs being written with the target, and often the synthesis process, in mind. In other words, imperative languages such as C or C++, the languages most used for high-level synthesis, are either modified or restricted to a subset in order to make parallelism explicit. In addition, a proper behavioral description that permits a unified approach to hardware and software design is still an elusive goal for heterogeneous platforms. A promising behavioral description capable of expressing both sequential and parallel applications is RVC-CAL, a dataflow programming language that permits design abstraction, modularity, and portability. The objective of this thesis is to provide a high-level synthesis solution for RVC-CAL dataflow programs and an RVC-CAL design flow for heterogeneous platforms. The main contributions of this thesis are: a high-level synthesis infrastructure that supports the full specification of RVC-CAL; an action selection strategy supporting parallel reads and writes of lists of tokens in hardware synthesis; dynamic fine-grain profiling of synthesized dataflow programs; an iterative design space exploration framework that permits the performance estimation, analysis, and optimization of heterogeneous platforms; and finally a clock gating strategy that reduces dynamic power consumption. Experimental results on all stages of the provided design flow demonstrate the capabilities of the tools for high-level synthesis, software/hardware co-design, design space exploration, and power optimization for reconfigurable hardware. Consequently, this work proves the viability of complex system design and implementation using dataflow programming, not only for system-level simulation but also for real heterogeneous implementations.
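
    The dataflow-actor semantics that RVC-CAL builds on, actors firing when enough tokens sit on their input FIFOs, can be mimicked in a few lines of Python. The class and scheduler below are a hypothetical illustration, not RVC-CAL syntax or the thesis's infrastructure.

```python
from collections import deque

# An actor fires when its firing rule (enough input tokens) is satisfied,
# consumes tokens, runs its action, and emits the result downstream.
class Actor:
    def __init__(self, needed, action):
        self.inbox = deque()          # input FIFO of tokens
        self.needed = needed          # tokens required to fire
        self.action = action

    def fire(self, out=None):
        if len(self.inbox) < self.needed:
            return False              # firing rule not satisfied
        tokens = [self.inbox.popleft() for _ in range(self.needed)]
        result = self.action(tokens)
        if out is not None:
            out.inbox.append(result)
        return True

results = []
adder = Actor(needed=2, action=sum)                       # 2 tokens in, 1 out
sink = Actor(needed=1, action=lambda t: results.append(t[0]))

adder.inbox.extend([1, 2, 3, 4])
while adder.fire(sink) or sink.fire():                    # run until quiescent
    pass
# results now holds the pairwise sums: [3, 7]
```

    Hardware synthesis of such a network maps each actor to logic and each FIFO to a buffer; the action selection strategy mentioned above decides, each cycle, which firing rules to evaluate.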

    Temporal Markov Decision Problems : Formalization and Resolution

    This thesis addresses the question of planning under uncertainty within a time-dependent, changing environment. The original motivation for this work came from the problem of building an autonomous agent able to coordinate with its uncertain environment, this environment being composed of other agents communicating their intentions or of non-controllable processes for which some discrete-event model is available. We investigate several approaches for modeling continuous time-dependency in the framework of Markov Decision Processes (MDPs), leading us to a definition of Temporal Markov Decision Problems. Our approach then focuses on two separate paradigms. First, we investigate time-dependent problems as implicit-event processes and describe them through the formalism of Time-dependent MDPs (TMDPs). We extend the existing results concerning optimality equations and present a new Value Iteration algorithm based on piecewise polynomial function representations in order to solve a more general class of TMDPs. This paves the way to a more general discussion on parametric actions in MDPs with hybrid state and action spaces and continuous time. Second, we investigate the option of separately modeling the concurrent contributions of exogenous events. This explicit-event modeling approach leads to the use of Generalized Semi-Markov Decision Processes (GSMDPs). We establish a link between the general framework of Discrete Event Systems Specification (DEVS) and the formalism of GSMDPs, allowing us to build sound discrete-event compatible simulators. We then introduce a simulation-based Policy Iteration approach for explicit-event Temporal Markov Decision Problems. This algorithmic contribution brings together results from simulation theory, forward search in MDPs, and statistical learning theory. The implicit-event approach was tested on a specific version of the Mars rover planning problem and on a drone patrol mission planning problem, while the explicit-event approach was evaluated on a subway network control problem.
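
    The Value Iteration scheme underlying the TMDP contribution can be sketched for an ordinary discrete MDP; the thesis's algorithm replaces the value vector below with piecewise polynomial functions of continuous time. The two-state MDP here is a made-up toy example.

```python
import numpy as np

# Toy 2-state, 2-action MDP (hypothetical numbers).
P = np.array([  # P[a, s, s'] = transition probability
    [[0.9, 0.1], [0.2, 0.8]],   # action 0
    [[0.5, 0.5], [0.0, 1.0]],   # action 1
])
R = np.array([[0.0, 1.0],       # R[a, s] = expected immediate reward
              [0.5, 2.0]])
gamma = 0.9                     # discount factor

V = np.zeros(2)
for _ in range(500):
    Q = R + gamma * P @ V       # Q[a, s] = R[a, s] + gamma * sum_s' P[a,s,s'] V[s']
    V_new = Q.max(axis=0)       # Bellman optimality backup
    if np.max(np.abs(V_new - V)) < 1e-8:
        break                   # converged to the optimal value function
    V = V_new

policy = Q.argmax(axis=0)       # greedy policy w.r.t. the converged values
```

    For this toy MDP, action 1 is optimal in both states (staying in state 1 yields reward 2 per step, so V(1) converges to 2/(1-gamma) = 20).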

    DevOps for Trustworthy Smart IoT Systems

    ENACT is a research project funded by the European Commission under its H2020 program. The project consortium consists of twelve industry and research member organisations spread across the whole EU. The overall goal of the ENACT project was to provide a novel set of solutions to enable DevOps in the realm of trustworthy Smart IoT Systems. Smart IoT Systems (SIS) are complex systems involving not only sensors but also actuators, with control loops distributed all across the IoT, Edge, and Cloud infrastructure. Since SIS typically operate in a changing and often unpredictable environment, the ability of these systems to continuously evolve and adapt to their new environment is decisive to ensure and increase their trustworthiness, quality, and user experience. DevOps has established itself as a software development life-cycle model that encourages developers to continuously bring new features to the system under operation without sacrificing quality. This book reports on the ENACT work to empower the development and operation, as well as the continuous and agile evolution, of SIS, which is necessary to adapt the system to changes in its environment, such as newly appearing trustworthiness threats.

    Simulation and Optimization Models for Scheduling Multi-step Sequential Procedures in Nuclear Medicine

    The rise in demand for specialized medical services in the U.S. has been recognized as one of the contributors to increased health care costs. Nuclear medicine is a specialized service that uses relatively new technologies and radiopharmaceuticals with a short half-life for the diagnosis and treatment of patients. Nuclear medicine procedures are multi-step and have to be performed under restrictive time constraints. Consequently, managing patients in nuclear medicine clinics is a challenging problem that has received little research attention. In this work we present simulation and optimization models for improving patient and resource scheduling in health care specialty clinics such as nuclear medicine departments. We first derive a Discrete Event System Specification (DEVS) simulation model for nuclear medicine patient service management that considers both patient and management perspectives. DEVS is a formal modeling and simulation framework based on dynamical systems theory that provides well-defined concepts for coupling components, hierarchical and modular model construction, and an object-oriented substrate supporting repository reuse. Secondly, we derive algorithms for scheduling nuclear medicine patients and resources and validate these algorithms using the simulation model. We obtain computational results that provide useful insights into patient service management in nuclear medicine. For example, the number of patients seen at the clinic during a year increases when a group of stations is reserved to serve procedures with higher demand. Finally, we derive a stochastic online scheduling (SOS) algorithm for patient and resource management in nuclear medicine clinics. The algorithm makes scheduling decisions by taking into account stochastic information about future patient arrivals. We compare the results obtained using the SOS algorithm with algorithms that do not take stochastic information into consideration. The SOS algorithm provides a balanced utilization of resources and a 10% improvement in the number of patients served.
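
    A stripped-down discrete-event simulation conveys the flavor of the patient-flow model, though not the DEVS formalism itself: patients arrive at random, wait for one of a fixed pool of stations, and occupy it for a random procedure time. All rates and names below are hypothetical.

```python
import heapq
import random

# Minimal discrete-event sketch of a specialty clinic with a fixed pool of
# stations; Poisson arrivals, exponential procedure durations (in minutes).
random.seed(0)
STATIONS, SIM_END = 2, 480.0           # two stations, one 8-hour day

events = []                            # priority queue of (time, kind)
t, n_arrivals = 0.0, 0
while True:                            # generate arrivals, ~1 every 20 min
    t += random.expovariate(1 / 20)
    if t >= SIM_END:
        break
    heapq.heappush(events, (t, "arrive"))
    n_arrivals += 1

free, queue, served = STATIONS, 0, 0
while events:
    t, kind = heapq.heappop(events)
    if kind == "arrive":
        queue += 1
    else:                              # a procedure finished
        free += 1
        served += 1
    while free and queue:              # start waiting patients (~30 min each)
        free -= 1
        queue -= 1
        heapq.heappush(events, (t + random.expovariate(1 / 30), "depart"))
```

    A realistic model would add the multi-step procedure protocols and radiopharmaceutical half-life constraints described above, which is where the DEVS coupling and modularity concepts pay off.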