
    Transparently Mixing Undo Logs and Software Reversibility for State Recovery in Optimistic PDES

    The rollback operation is a fundamental building block to support the correct execution of a speculative Time Warp-based Parallel Discrete Event Simulation. In the literature, several solutions to reduce the execution cost of this operation have been proposed, based either on the creation of checkpoints of previous simulation state images, or on the execution of negative copies of simulation events which are able to undo the updates on the state. In this paper, we explore the practical design and implementation of a state recoverability technique which allows a previous simulation state to be restored either by relying on checkpointing or on the reverse execution of the state updates that occurred while processing events in forward mode. Unlike other proposals, we address the issue of executing backward updates in a fully transparent and event-granularity-independent way, by relying on static software instrumentation (targeting the x86 architecture and Linux systems) to generate reverse update code blocks at runtime (not to be confused with reverse events, proper to the reverse computing approach). These blocks are able to undo the effects of a forward execution while minimizing the cost of the undo operation. We also present experimental results related to our implementation, which is released as free software and fully integrated into the open source ROOT-Sim (ROme OpTimistic Simulator) package. The experimental data support the viability and effectiveness of our proposal.
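
    As a rough illustration of the undo-log half of this trade-off (not the authors' actual instrumentation, which operates at the x86 instruction level), a minimal sketch in C follows: every write to the simulation state is preceded by logging the old value, and a rollback replays the log in reverse order. All names here are hypothetical.

```c
#include <stdlib.h>
#include <string.h>

/* One undo record: where the write happened and what was there before. */
typedef struct {
    void  *addr;
    size_t size;
    char   old[16];   /* enough for the small scalar updates sketched here */
} undo_rec;

typedef struct {
    undo_rec *recs;
    size_t    count, cap;
} undo_log;

/* Log the current content of [addr, addr+size) before the forward write. */
static void undo_log_write(undo_log *log, void *addr, size_t size) {
    if (log->count == log->cap) {
        log->cap  = log->cap ? 2 * log->cap : 64;
        log->recs = realloc(log->recs, log->cap * sizeof(undo_rec));
    }
    undo_rec *r = &log->recs[log->count++];
    r->addr = addr;
    r->size = size;
    memcpy(r->old, addr, size);
}

/* Rollback: replay the records in reverse order, restoring old values. */
static void undo_log_rollback(undo_log *log) {
    while (log->count > 0) {
        undo_rec *r = &log->recs[--log->count];
        memcpy(r->addr, r->old, r->size);
    }
}

/* Forward event processing, instrumented by hand for this sketch. */
static void process_event(undo_log *log, double *state_var, double delta) {
    undo_log_write(log, state_var, sizeof *state_var); /* save before write */
    *state_var += delta;                               /* forward update    */
}
```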

    Development of high sensitivity and high specificity strategies for tissue microproteomics

    Mass spectrometry-based proteomics has become an indispensable tool for molecular biology and clinical research because of its ability to identify and quantify thousands of proteins. When combined with laser capture microdissection (LCM), MS-based proteomics may be used to investigate disease-associated changes in the proteome of specific tissue regions or cell populations. Such specificity is essential because different anatomical regions often have distinct and diverse functions and may behave differently under pathological conditions. However, the number of proteins that may be identified and quantified decreases with smaller sample amounts. Strict anatomical/cellular specification usually yields micrograms or submicrograms of protein, and thus ultrasensitive microproteomics protocols are required to analyze these small sample amounts while maintaining high proteome coverage. Recent advances in liquid chromatography (LC) and MS equipment have improved the analysis of low sample amounts. The development of mass spectrometers with increased sequencing speed and ion transmission has resulted in an increase in dynamic range and sensitivity. Advances in ultra-high-pressure liquid chromatography (UHPLC) have enabled the routine use of long columns (≥ 50 cm) with smaller internal diameters and smaller particle sizes (< 5 μm), further increasing peptide separation resolution. However, the developments in LC-MS sensitivity have outpaced developments in sensitive sample preparation protocols. In this PhD thesis, I present my four years of research on the development of ultrasensitive microproteomics strategies for the molecular characterization of tissues. During my PhD I developed and optimized an ultrasensitive proteomic workflow for the analysis of small sample amounts, and I applied it to biomedical case studies. First, I compared the digestion efficiency of the Filter-Aided Sample Preparation protocol (FASP) and the Single-Pot, Solid Phase-enhanced Sample Preparation protocol (SP3) with the conventional urea-based in-solution digestion (ISD) method for different amounts of HeLa cells. The SP3 protocol, based on the use of carboxylate-coated paramagnetic beads, outperformed the FASP and ISD protocols for the analysis of small sample amounts, providing the identification of about 3000 protein groups from 1 μg of HeLa lysate. As a proof of principle, I applied the optimized SP3 protocol to the characterization of the brain of a mouse model of glioblastoma. Laser capture microdissection provided the specificity required to isolate different anatomical regions of the brain (healthy, border and tumor regions), while the SP3 digestion protocol provided the sensitivity required for the analysis of heterogeneous and complex samples. To ensure accurate relative quantification and increase the proteome coverage, I optimized in-solution and on-column Tandem Mass Tag (TMT) labeling and peptide fractionation of low sample amounts. Preliminary experiments revealed very low labeling efficiency when standard labeling conditions were applied to volume-limited samples. Following an exhaustive optimization of in-solution and on-column TMT labeling, the final conditions provided a TMT labeling efficiency (for 1 μg of HeLa digest) even greater than that obtained using standard methods on high sample amounts (25-50 μg of digest). Moreover, high-pH reversed-phase fractionation increased proteome coverage by approximately 140% relative to single long-gradient analyses.
    One of the challenges of working with microdissected tissues, or other samples characterized by low total protein content, is the need to estimate the total protein content (essential knowledge for accurate quantitation). Previously, adjacent sections were used just for protein content estimation, which is non-ideal because tissue histology may differ between sections (especially for small pathological features with a specific histology). To address this, I developed a colorimetric assay for protein content estimation. I modified the microBCA protein assay to measure proteins in just 1 μL and in the presence of the reagents commonly used in lysis buffers (such as SDS, EDTA and EGTA). This modified microBCA assay allowed reproducible quantification of the protein content of each individual sample down to a concentration of 15 ng/μL. The final optimized quantitative workflow for the proteomic analysis of tissue samples comprised laser capture microdissection, protein content estimation with the modified microBCA assay, SP3 digestion, TMT labeling, high-pH reversed-phase fractionation and injection into a nanoLC system coupled with an Orbitrap Fusion mass spectrometer. As a proof of principle, I applied the optimized workflow to the proteomic characterization of mouse kidney substructures. Finally, I applied the optimized workflow to the characterization of the central and peripheral nervous system of a mouse model of Krabbe disease (the Twitcher mouse). I compared the proteomes extracted from the corpus callosum, motor cortex and sciatic nerves of five Twitcher and five control wild-type mice. The results on the proteome changes in the Twitcher mouse provided new insights into the molecular mechanisms of Krabbe disease, showing neuroinflammation, activation of the immune response, accumulation of lysosomal proteins, demyelination, disruption of membrane rafts and reduced nervous system development. Altogether, the microproteomic protocol developed during my PhD represents a powerful tool for the proteomic characterization of pathological tissues. Moreover, the research study on Krabbe disease represents the first in-depth proteomic characterization of the Twitcher mouse and a starting point for future functional experiments to study the pathogenesis of Krabbe disease and possible new therapies.
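
    Purely as a generic illustration of how a colorimetric assay of this kind converts absorbance into concentration (the abstract does not spell out the calibration model; the linear form below is an assumption), one fits a standard curve to standards of known concentration and inverts it:

```latex
% Linear standard curve fitted to protein standards of known concentration
% c_i with measured absorbances A_i (BCA-type assays are typically read
% around 562 nm):
\[
  A = m\,c + b, \qquad \hat{c}_s = \frac{A_s - b}{m},
\]
% where (m, b) come from a least-squares fit over the standards and A_s is
% the absorbance of the unknown sample. The estimate is valid only within
% the calibrated range (here, down to about 15 ng/\mu L).
```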

    Mixing Hardware and Software Reversibility for Speculative Parallel Discrete Event Simulation

    Speculative parallel discrete event simulation requires support for reversing processed events, also called state recovery, when causal inconsistencies are revealed. In this article we present an approach where state recovery relies on a mix of hardware- and software-based techniques. We exploit the Hardware Transactional Memory (HTM) support, as offered by Intel Haswell CPUs, to process events as in-memory transactions, which are committed only once their causal consistency has been verified. At the same time, we exploit an innovative software-based reversibility technique, fully relying on transparent software instrumentation targeting x86/ELF objects, which enables undoing the side effects of events with no actual backward re-computation. Each thread within our speculative processing engine dynamically (on a per-event basis) selects which recovery mode to rely on (hardware vs software) depending on varying runtime dynamics. The latter are captured by a lightweight analytic model indicating to what extent the HTM support (which pays no instrumentation cost) is efficient, and beyond what level of event parallelism its performance starts degrading, e.g., due to excessive data conflicts while manipulating causality metadata within HTM-based transactions. We released our implementation as open source software and provide experimental results for an assessment of its effectiveness. © Springer International Publishing Switzerland 2016
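
    A minimal sketch of the hardware side of such a dual-mode scheme, using Intel's RTM intrinsics (compile with -mrtm on an HTM-capable CPU). The event handler, consistency check, and software-reversibility fallback are hypothetical placeholders, not the paper's actual engine:

```c
#include <immintrin.h>   /* _xbegin, _xend, _xabort, _XBEGIN_STARTED */
#include <stdbool.h>

/* Hypothetical hooks standing in for the engine's actual logic. */
static void process_event_forward(void *event)       { (void)event; /* apply model updates */ }
static bool causally_consistent(void *event)         { (void)event; return true; }
static void process_event_with_undo_log(void *event) { (void)event; /* instrumented path */ }

/* Try to process one event as a hardware transaction; if the transaction
 * cannot start or commit (conflict, capacity overflow, ...), fall back to
 * the software-reversibility path. */
void process_event_dual_mode(void *event) {
    unsigned status = _xbegin();
    if (status == _XBEGIN_STARTED) {
        process_event_forward(event);
        if (!causally_consistent(event))
            _xabort(0xff);   /* hardware rollback: all writes discarded */
        _xend();             /* commit: event effects become visible    */
        return;
    }
    /* HTM unavailable or aborted: the software undo log pays the
     * instrumentation cost but always succeeds. */
    process_event_with_undo_log(event);
}
```

    A per-event mode choice like the paper's would sit above this function, consulting its analytic model of abort rates before deciding whether to attempt the transactional path at all.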

    A different kind of string

    In U(1) lattice gauge theory in three spacetime dimensions, the problem of confinement can be studied analytically in a semi-classical approach, in terms of a gas of monopoles with Coulomb-like interactions. In addition, this theory can be mapped to a spin model via an exact duality transformation, which allows one to perform high-precision numerical studies of the confining potential. Taking advantage of these properties, we carried out an accurate investigation of the effective string describing the low-energy properties of flux tubes in this confining gauge theory. We found striking deviations from the expected Nambu-Goto-like behavior and, for the first time, evidence for contributions that can be described by a term proportional to the extrinsic curvature of the effective string worldsheet. Such a term is allowed by Lorentz invariance, and its presence in the infrared regime of the U(1) model was indeed predicted by Polyakov several years ago. Our results show that this term scales as expected according to Polyakov's solution and becomes the dominant contribution to the effective string action in the continuum limit. We also demonstrate analytically that the corrections to the confining potential induced by the extrinsic curvature term can be related to the partition function of the massive perturbation of a c=1 bosonic conformal field theory. The implications of our results for SU(N) Yang-Mills theories in three and in four spacetime dimensions are discussed.
    Comment: 1+21 pages, 2 figures; v2 (1+24 pages, 2 figures): improved the discussion in the conclusions section, added an appendix, included new references, updated the affiliation details for one of the authors, corrected typos; version published in the journal
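
    Schematically, the effective action being probed has the form below; the extrinsic-curvature (rigidity) term is the contribution the paper detects, and the coupling name γ is my notation, not necessarily the paper's:

```latex
% Nambu-Goto area term plus the extrinsic-curvature term allowed by
% Lorentz invariance and predicted by Polyakov for the 3D U(1) model:
\[
  S_{\mathrm{eff}}
  = \sigma \int \mathrm{d}^2\xi \,\sqrt{g}
  \;+\; \gamma \int \mathrm{d}^2\xi \,\sqrt{g}\, \mathcal{K}^2
  \;+\; \dots
\]
% g: determinant of the metric induced on the string worldsheet;
% K: extrinsic curvature of the worldsheet; gamma: stiffness coupling.
```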

    Fine structure of the confining string in an analytically solvable 3D model

    In U(1) lattice gauge theory in three spacetime dimensions, confinement can be analytically shown to persist at all values of the coupling. Furthermore, the explicit predictions for the dependence of the string tension σ and the mass gap m_0 on the coupling allow one to tune their ratio at will. These features, and the possibility of obtaining high-precision numerical results via an exact duality map to a spin model, make this theory an ideal laboratory to test the effective string description of confining flux tubes. In this contribution, we discuss our investigation of next-to-leading-order corrections to the confining potential and of the finite-temperature behavior of the flux tube width. Our data provide a very stringent test of the theoretical predictions for these quantities and allow one to test their dependence on the m_0/√σ ratio.
    Comment: Presented at the 31st International Symposium on Lattice Field Theory (Lattice 2013), 29 July - 3 August 2013, Mainz, Germany
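
    For context, the Nambu-Goto benchmark against which such next-to-leading-order corrections are usually measured is the large-R expansion of the Arvis potential (in three spacetime dimensions, d − 2 = 1):

```latex
% Nambu-Goto ground-state energy of a string of length R (Arvis potential)
% and its expansion in powers of 1/(sigma R^2):
\[
  E_0(R) = \sigma R \sqrt{1 - \frac{\pi (d-2)}{12\,\sigma R^2}}
         = \sigma R - \frac{\pi (d-2)}{24 R}
           - \frac{\pi^2 (d-2)^2}{1152\,\sigma R^3} + \dots
\]
% The 1/R piece is the universal Luscher term; deviations at and beyond
% the 1/R^3 order probe the "fine structure" discussed in the abstract.
```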

    Evaluating the impact of innovation incentives: evidence from an unexpected shortage of funds

    To evaluate the effect of an R&D subsidy one needs to know what the subsidized firms would have done without the incentive. This paper studies an Italian programme of subsidies for the applied development of innovations, exploiting a discontinuity in programme financing due to an unexpected shortage of public money. To identify the effect of the programme, the study implements a regression discontinuity design and compares firms that applied for funding before and after the shortage occurred. The results indicate that the programme was not effective in stimulating innovative investment.
    Keywords: R&D, public policy, evaluation
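
    In generic notation (mine, not necessarily the paper's), the regression discontinuity design sketched above estimates:

```latex
% Y_i: innovative investment of firm i; X_i: running variable (application
% date); c: cutoff at which funds ran out; D_i: treatment (funded) indicator.
\[
  Y_i = \alpha + \tau\, D_i + f(X_i) + \varepsilon_i,
  \qquad D_i = \mathbf{1}\{X_i < c\},
\]
% tau is the local effect of receiving the subsidy at the cutoff, and f is
% a flexible control (e.g., a local polynomial) in the running variable.
```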

    Effective string description of the interquark potential in the 3D U(1) lattice gauge theory

    The U(1) lattice gauge theory in three dimensions is a perfect laboratory to study the properties of the confining string. On the one hand, thanks to the mapping to a Coulomb gas of monopoles, the confining properties of the model can be studied semi-classically. On the other hand, high-precision numerical estimates of Polyakov loop correlators can be obtained via a duality map to a spin model. This allowed us to perform high-precision tests of the universal behavior of the effective string and to find macroscopic deviations with respect to the expected Nambu-Goto predictions. These corrections could be fitted with very good precision by including a contribution (which is consistent with Lorentz symmetry) proportional to the square of the extrinsic curvature in the effective string action, as originally suggested by Polyakov. Performing our analysis at different values of β, we were able to show that this term scales as expected from Polyakov's solution and dominates in the continuum limit. We also discuss the interplay between the extrinsic curvature contribution and the boundary correction induced by the Polyakov loops.
    Comment: 7 pages, 2 pdf figures; contribution to the 32nd International Symposium on Lattice Field Theory "Lattice 2014" (23-28 June 2014, Columbia University, New York, NY, USA)
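
    The quantity computed on the dual side is the Polyakov loop correlator, from which the interquark potential (and hence the corrections above) is extracted in the standard way; this is the textbook relation, not a formula quoted from the paper:

```latex
% Two Polyakov loops at spatial separation R, on a lattice with temporal
% extent L = a N_t, define the interquark potential V(R):
\[
  G(R) \;=\; \langle P(0)\, P^{\dagger}(R) \rangle \;\sim\; e^{-L\, V(R)},
  \qquad
  V(R) \;=\; -\frac{1}{L}\,\ln G(R).
\]
```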

    Optimizing simulation on shared-memory platforms: The smart cities case

    Modern advancements in computing architectures have been accompanied by new paradigms to run Parallel Discrete Event Simulation models efficiently. Indeed, many paradigms to effectively use the available underlying hardware have been proposed in the literature. Among these, the Share-Everything paradigm targets massively-parallel shared-memory machines, supporting speculative simulation while taking into account the limits and benefits of this family of architectures. Previous results have shown how this paradigm outperforms traditional speculative strategies (such as data-separated Time Warp systems) whenever the granularity of executed events is small. In this paper, we show the performance implications of this simulation-engine organization when simulation models have a variable granularity. To this end, we have selected a traffic model tailored for smart-city-oriented simulation. Our assessment illustrates the effects of the various tuning parameters of the approach, paving the way to a deeper understanding of this innovative paradigm.
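
    A minimal sketch of the share-everything idea, in C with POSIX threads: all workers pull the globally lowest-timestamp event from one shared pool, instead of owning disjoint partitions of the model as in data-separated Time Warp. The pool here is a toy mutex-protected array with a linear scan; all names are hypothetical, and a real engine would use non-blocking data structures.

```c
#include <pthread.h>
#include <stddef.h>

typedef struct {
    double ts;                 /* event timestamp */
    void (*handler)(void *);   /* model-specific event code */
    void *payload;
} event;

#define POOL_CAP 4096

/* One fully shared event pool (toy version: linear scan under one lock). */
static event           pool[POOL_CAP];
static size_t          pool_len;
static pthread_mutex_t pool_mtx = PTHREAD_MUTEX_INITIALIZER;

/* Extract the lowest-timestamp event, whichever simulation object it
 * belongs to: this is what "share-everything" means in this sketch. */
static int pool_extract_min(event *out) {
    pthread_mutex_lock(&pool_mtx);
    if (pool_len == 0) { pthread_mutex_unlock(&pool_mtx); return 0; }
    size_t min = 0;
    for (size_t i = 1; i < pool_len; i++)
        if (pool[i].ts < pool[min].ts) min = i;
    *out = pool[min];
    pool[min] = pool[--pool_len];  /* swap-remove */
    pthread_mutex_unlock(&pool_mtx);
    return 1;
}

/* Worker loop run by every thread: fetch and execute events greedily. */
static void *worker(void *arg) {
    (void)arg;
    event e;
    while (pool_extract_min(&e))
        e.handler(e.payload);      /* speculation/rollback would hook here */
    return NULL;
}
```

    With fine-grained events the single shared pool keeps every core busy; the tuning parameters studied in the paper govern how such an engine behaves once event granularity grows and contention on the shared structures starts to matter.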

    Factors affecting smartphone shopping.

    In recent years, the telecommunication sector has seen its market leaders change. Today, the market is headed by 11 manufacturers, even though two main companies, Samsung and Apple, hold 42% of the market share. Furthermore, hundreds of models incorporating new functionalities are launched every year. This research is one of the first attempts to investigate functional evaluation in smartphone shopping and to predict the future course of this turbulent market. Using 264 surveys of real smartphone owners and users, collected online in the first fortnight of May 2015, together with Conjoint Analysis (CA), we highlight the major attributes consumers take into consideration when buying smartphones. Results show that consumers who decide to buy a smartphone consider Price, Camera performance, Battery life and Brand. In fact, we find that, in smartphone shopping, consumers' brand awareness is less important than technical characteristics. Nevertheless, running the CA on subgroups defined by the brand of the smartphone owned, we find differences in the attributes' relative importance. Results show that Apple owners have stronger brand awareness than Samsung owners. These implications can help manufacturers develop smartphone features while rationalizing invested resources, interpret customer preferences and reinforce competitive advantages.
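
    For reference, the additive part-worth model underlying conjoint analysis, and the derived relative importance of an attribute; this is the generic CA formulation, not the paper's exact specification:

```latex
% Utility of a smartphone profile x as a sum of part-worths beta_{jk}
% (attribute j, level k), estimated from the survey responses:
\[
  U(x) \;=\; \mu + \sum_{j} \sum_{k} \beta_{jk}\, x_{jk} + \varepsilon
\]
% Relative importance of attribute j (e.g., Price, Camera, Battery, Brand)
% as the range of its part-worths over the sum of all attribute ranges:
\[
  I_j \;=\; \frac{\max_k \beta_{jk} - \min_k \beta_{jk}}
                 {\sum_{j'} \bigl( \max_k \beta_{j'k} - \min_k \beta_{j'k} \bigr)}
\]
```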

    Co-Advertising, E-Wom and Social Responsibility

    Following the digital revolution, the traditional divide between value creation (R&D, production and advertising) and value distribution and consumption (sales, use and post-use) is blurring. Individuals and companies are called to exchange multiple inputs and outputs before, during and after the sale. The new contemporaneity of value processes is gradually leading to a new convergence among the parties. Companies are enabled to promote, intermediate and intercept customers' conversations; individuals are committed to the new social game and keep companies under non-contractual observation. This study investigates the effects of individuals' e-WOM (electronic word of mouth) through a netnography of 20 worldwide crowdsourcing platforms. Findings demonstrate that the new overlap between dialogue and sale can generate a positive loop between companies' and individuals' responsibility, and reduce the distance between market and society.