
    A Flexible Framework For Implementing Multi-Nested Software Transaction Memory

    Programming with locks in multi-threaded programs is notoriously difficult. Concurrency control of access to shared data limits the scalable locking strategies that software transactional memory otherwise provides. This work addresses the problem of creating dependable software in the face of imminent failures. In the past, programmers who used lock-based synchronization to implement concurrent access to shared data had to grapple with the problems of conventional locking techniques, such as deadlocks, convoying, and priority inversion. This paper proposes an advanced feature for Dynamic Software Transactional Memory that extends the concepts of transaction processing to provide a nesting mechanism together with efficient lock-free synchronization, recoverability, and restorability. In addition, an implementation has been designed, coded, and tested to achieve the desired objectives.
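
    As a hedged illustration only (not the implementation described in the paper), the sketch below shows one way closed nesting can compose: an inner transaction buffers its writes and, on commit, merges them into its parent rather than into shared memory, so the whole nest is published atomically by the outermost commit. All names (Store, Transaction, and so on) are hypothetical, and conflict detection and retry are omitted.

        import threading

        class Store:
            """Shared memory: a dict published under a single commit lock."""
            def __init__(self):
                self.data = {}
                self.lock = threading.Lock()

        class Transaction:
            """Toy closed-nested transaction with per-level write buffering."""
            def __init__(self, store, parent=None):
                self.store = store
                self.parent = parent
                self.writes = {}   # visible only inside this nest until commit

            def read(self, key):
                # Read-your-own-writes, then enclosing buffers, then shared memory.
                tx = self
                while tx is not None:
                    if key in tx.writes:
                        return tx.writes[key]
                    tx = tx.parent
                return self.store.data.get(key)

            def write(self, key, value):
                self.writes[key] = value

            def commit(self):
                if self.parent is not None:
                    # Inner commit folds into the enclosing transaction only.
                    self.parent.writes.update(self.writes)
                else:
                    # Outermost commit publishes the nest atomically.
                    with self.store.lock:
                        self.store.data.update(self.writes)

        store = Store()
        outer = Transaction(store)
        outer.write("balance", 100)
        inner = Transaction(store, parent=outer)
        inner.write("balance", inner.read("balance") - 30)
        inner.commit()   # nothing published yet
        outer.commit()   # publishes {'balance': 70} to shared memory
        print(store.data)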

    Some models are useful, but how do we know which ones? Towards a unified Bayesian model taxonomy

    Probabilistic (Bayesian) modeling has experienced a surge of applications in almost all quantitative sciences and industrial areas. This development is driven by a combination of several factors, including better probabilistic estimation algorithms, flexible software, increased computing power, and a growing awareness of the benefits of probabilistic learning. However, a principled Bayesian model building workflow is far from complete and many challenges remain. To aid future research and applications of a principled Bayesian workflow, we pose and answer what we perceive as two fundamental questions of Bayesian modeling, namely (a) "What actually is a Bayesian model?" and (b) "What makes a good Bayesian model?". As an answer to the first question, we propose the PAD model taxonomy that defines four basic kinds of Bayesian models, each representing some combination of the assumed joint distribution of all (known or unknown) variables (P), a posterior approximator (A), and training data (D). As an answer to the second question, we propose ten utility dimensions according to which we can evaluate Bayesian models holistically, namely, (1) causal consistency, (2) parameter recoverability, (3) predictive performance, (4) fairness, (5) structural faithfulness, (6) parsimony, (7) interpretability, (8) convergence, (9) estimation speed, and (10) robustness. Further, we propose two example utility decision trees that describe hierarchies and trade-offs between utilities depending on the inferential goals that drive model building and testing.
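
    As a loose, hedged illustration of the P/A/D decomposition (not the paper's formal taxonomy), the sketch below separates a conjugate Beta-Binomial model into its joint specification (P), a posterior approximator (A, here the exact conjugate update), and the training data (D). All names and numbers are invented for illustration.

        from dataclasses import dataclass

        @dataclass
        class JointP:
            """P: assumed joint distribution, a Beta(a, b) prior with a Binomial likelihood."""
            a: float = 1.0
            b: float = 1.0

        def approximator(p, successes, failures):
            """A: posterior approximator; exact for this conjugate pair."""
            return JointP(a=p.a + successes, b=p.b + failures)

        # D: training data -- 7 successes out of 10 trials.
        posterior = approximator(JointP(), successes=7, failures=3)
        post_mean = posterior.a / (posterior.a + posterior.b)
        print(f"posterior mean success probability: {post_mean:.3f}")  # about 0.667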

    Using quality models in software package selection

    The growing importance of commercial off-the-shelf software packages requires adapting some software engineering practices, such as requirements elicitation and testing, to this emergent framework. Some specific new activities also arise, among which the selection of software packages plays a prominent role. All the methodologies that have been proposed recently for choosing software packages compare user requirements with the packages' capabilities. There are different types of requirements, such as managerial, political, and, of course, quality requirements. Quality requirements are often difficult to check. This is partly due to their nature, but there is another reason that can be mitigated, namely the lack of structured and widespread descriptions of package domains (that is, categories of software packages such as ERP systems, graphical or data structure libraries, and so on). This absence hampers the accurate description of software packages and the precise statement of quality requirements, and consequently overall package selection and confidence in the result of the process. Our methodology for building structured quality models helps solve this problem.
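
    As a hedged sketch of the general idea (not the methodology proposed in the paper), a structured quality model can be read as a hierarchy of quality characteristics with measurable attributes against which candidate packages are scored; the characteristics, weights, and package scores below are invented for illustration.

        # Hypothetical quality model: characteristic -> {attribute: weight}; weights sum to 1.
        quality_model = {
            "functionality": {"feature_coverage": 0.4, "interoperability": 0.1},
            "reliability":   {"fault_tolerance": 0.2},
            "usability":     {"learnability": 0.1, "documentation": 0.2},
        }

        # Invented per-attribute scores (0 to 1) for two candidate packages.
        candidates = {
            "package_a": {"feature_coverage": 0.9, "interoperability": 0.6,
                          "fault_tolerance": 0.7, "learnability": 0.5, "documentation": 0.8},
            "package_b": {"feature_coverage": 0.7, "interoperability": 0.9,
                          "fault_tolerance": 0.9, "learnability": 0.8, "documentation": 0.6},
        }

        def weighted_score(scores):
            # Aggregate attribute scores with the weights from the quality model.
            return sum(weight * scores[attr]
                       for attrs in quality_model.values()
                       for attr, weight in attrs.items())

        for name, scores in candidates.items():
            print(f"{name}: {weighted_score(scores):.2f}")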

    Prediction can be safely used as a proxy for explanation in causally consistent Bayesian generalized linear models

    Bayesian modeling provides a principled approach to quantifying uncertainty in model parameters and model structure and has seen a surge of applications in recent years. Within the context of a Bayesian workflow, we are concerned with model selection for the purpose of finding models that best explain the data, that is, help us understand the underlying data generating process. Since we rarely have access to the true process, all we are left with during real-world analyses is incomplete causal knowledge from sources outside of the current data and model predictions of said data. This leads to the important question of when the use of prediction as a proxy for explanation for the purpose of model selection is valid. We approach this question by means of large-scale simulations of Bayesian generalized linear models in which we investigate various causal and statistical misspecifications. Our results indicate that the use of prediction as a proxy for explanation is valid and safe only when the models under consideration are sufficiently consistent with the underlying causal structure of the true data generating process.
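
    As a hedged, non-Bayesian sketch of the underlying issue (ordinary least squares stands in for full posterior inference), the simulation below draws data from a simple confounded structure and shows that a model which omits the confounder can still predict the outcome well while badly misestimating the treatment effect, which is why prediction is a safe proxy for explanation only when the model is consistent with the causal structure. All variable names and coefficients are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000

        # True process: confounder z drives both x and y; the causal effect of x on y is 1.0.
        z = rng.normal(size=n)
        x = 2.0 * z + rng.normal(size=n)
        y = 1.0 * x + 3.0 * z + rng.normal(size=n)

        def ols(design, target):
            # Least-squares coefficients via numpy (a stand-in for posterior means).
            coef, *_ = np.linalg.lstsq(design, target, rcond=None)
            return coef

        adjusted = ols(np.column_stack([x, z]), y)   # causally consistent: adjusts for z
        naive = ols(x.reshape(-1, 1), y)             # misspecified: ignores z

        r2_naive = 1 - np.var(y - x.reshape(-1, 1) @ naive) / np.var(y)
        print("adjusted effect of x:  ", round(adjusted[0], 2))  # close to the true 1.0
        print("unadjusted effect of x:", round(naive[0], 2))     # biased, about 2.2
        print("naive model R^2:       ", round(r2_naive, 2))     # still high, about 0.9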

    Recovery Management of Long Running eBusiness Transactions

    eBusiness collaboration and an eBusiness process are introduced as the context of a long running eBusiness transaction. The nature of eBusiness collaboration sets requirements for long running transactions: the ACID properties of the classical database transaction must be relaxed for the eBusiness transaction. Many techniques have been developed to handle the execution of long running business transactions, such as the classical Saga and the business transaction model (BTM) of the business transaction framework. These classic techniques cannot adequately address the recovery needs of long running eBusiness transactions and need to be developed further. The expectations for a new service composition and recovery model are defined and described, and the DeltaGrid service composition and recovery model (DGM) and the constraint rules-based recovery mechanism (CM) are introduced as examples of the new model. The classic models and the new models are compared, and it is analysed how each model meets the expectations. Neither of the new models uses the unconventional classification of atomicity, even though the BTM includes such a classification. The recovery models of the new approaches improve the ability to take data and control dependencies into account during backward recovery. The new models present two different strategies for recovering a failed service: the strategy of the CM increases flexibility and efficiency compared to the Saga or the BTF, while the DGM defines characteristics that the CM does not have, namely a Delta-Enabled rollback and mechanisms for pre-commit and post-commit recoverability, and extends the concepts of shallow compensation and deep compensation. Their use guarantees that an eBusiness process always recovers to a consistent state, something the Saga, the BTM, and the CM cannot prove. The DGM also provides algorithms for its key mechanisms. ACM Computing Classification System (CCS): C.2.4 [Distributed Systems]: Distributed applications.
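
    As a hedged sketch of the classic Saga idea referenced above (not the DGM or CM algorithms from the paper), each step of a long running transaction is paired with a compensating action, and when a step fails the already committed steps are compensated in reverse order, i.e. backward recovery. The step names and actions are invented.

        def run_saga(steps):
            """Execute (action, compensation) pairs; compensate in reverse order on failure."""
            completed = []
            try:
                for action, compensation in steps:
                    action()
                    completed.append(compensation)
            except Exception as exc:
                print(f"step failed ({exc}); starting backward recovery")
                for compensation in reversed(completed):
                    compensation()   # undo the effect of each committed step
                return False
            return True

        # Invented eBusiness steps: reserve stock, charge the customer, book shipping.
        def reserve_stock():   print("stock reserved")
        def release_stock():   print("stock released")
        def charge_customer(): print("customer charged")
        def refund_customer(): print("customer refunded")
        def book_shipping():   raise RuntimeError("carrier unavailable")
        def cancel_shipping(): print("shipping cancelled")

        ok = run_saga([
            (reserve_stock, release_stock),
            (charge_customer, refund_customer),
            (book_shipping, cancel_shipping),
        ])
        print("saga committed" if ok else "saga rolled back")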

    Regional economic resilience in the European Union: a numerical general equilibrium analysis

    Using a spatial general equilibrium model, this paper investigates the resilience of EU regions under three alternative recessionary shocks, each activating different economic adjustments and mechanisms. We measure the vulnerability, resistance, and recoverability of regions and identify key regional features affecting the ability of regions to withstand and recover from unexpected external shocks. The analysis reveals that the response of regions varies according to the nature of the external disturbance and to the pre-shock regional characteristics. Finally, it appears that resilience also depends on factor mobility.