Tuning the Level of Concurrency in Software Transactional Memory: An Overview of Recent Analytical, Machine Learning and Mixed Approaches
Synchronization transparency offered by Software Transactional Memory (STM) must not come at the expense of run-time efficiency, which demands that the STM designer include mechanisms oriented to performance and other quality indexes. In particular, one core issue in STM is exploiting parallelism while avoiding thrashing phenomena due to excessive transaction rollbacks, caused by overly high contention on logical resources, namely concurrently accessed data portions. One means of addressing run-time efficiency is to dynamically determine the best-suited level of concurrency (number of threads) for running the application (or specific application phases) on top of the STM layer. Too low a level of concurrency hampers parallelism; conversely, over-dimensioning the concurrency level may give rise to the aforementioned thrashing caused by excessive data contention, which also reduces energy efficiency. In this chapter we survey a set of recent techniques for building "application-specific" performance models that can be exploited to dynamically tune the level of concurrency to its best-suited value. Although these techniques share some base concepts in modeling system performance versus the degree of concurrency, they rely on disparate methods, such as machine learning, analytic methods, or combinations of the two, and achieve different tradeoffs between the precision of the performance model and the latency of model instantiation. Implications of these tradeoffs in real-life scenarios are also discussed.
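The core idea of the abstract above, picking the thread count that maximizes throughput before contention-induced thrashing sets in, can be sketched with a simple hill climber. This is an illustrative sketch, not a technique from the chapter: `measure_throughput` is a hypothetical callback that runs a workload phase at a given thread count and reports commits per second, and the `demo` curve is synthetic.

```python
# Minimal sketch of dynamic concurrency tuning: probe neighbouring thread
# counts and keep the one with the best measured throughput.

def tune_concurrency(measure_throughput, min_threads=1, max_threads=16):
    """Hill-climb over the number of threads. measure_throughput(n) is
    assumed to run a workload phase with n threads and return its
    throughput (e.g. commits per second)."""
    best_n = min_threads
    best_tp = measure_throughput(best_n)
    improved = True
    while improved:
        improved = False
        for cand in (best_n - 1, best_n + 1):
            if min_threads <= cand <= max_threads:
                tp = measure_throughput(cand)
                if tp > best_tp:
                    best_n, best_tp = cand, tp
                    improved = True
    return best_n

# Synthetic workload: throughput peaks at 6 threads, then rollbacks make
# it fall off -- the thrashing regime described in the abstract.
demo = lambda n: n * 100 - max(0, n - 6) ** 2 * 120
print(tune_concurrency(demo))  # -> 6 under this synthetic curve
```

A real controller would amortize measurement cost and smooth noisy samples; the sketch only shows the search structure.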
Analytical/ML Mixed Approach for Concurrency Regulation in Software Transactional Memory
In this article we exploit a combination of analytical and machine learning (ML) techniques to build a performance model that allows the level of concurrency of applications based on Software Transactional Memory (STM) to be tuned dynamically. Our mixed approach has the advantage of reducing the training time of pure machine-learning methods while avoiding the approximation errors that typically affect pure analytical approaches. It therefore allows very fast construction of highly reliable performance models, which can be promptly and effectively exploited to optimize actual application runs. We also present a real implementation of a concurrency-regulation architecture, based on the mixed modeling approach, integrated with the open-source TinySTM package, together with experimental data from runs of applications taken from the STAMP benchmark suite that demonstrate the effectiveness of our proposal. © 2014 IEEE
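The mixed analytical/ML idea can be sketched as follows. Everything here is illustrative, not the paper's actual model: the analytical prior `n * (1 - c)**(n - 1)` is an assumed toy contention formula, and a linear-interpolation correction table stands in for a trained regressor. The point it demonstrates is the division of labour: the analytical curve supplies the overall shape from few parameters, so the "learning" step only needs a handful of observed samples to correct it.

```python
# Hedged sketch of an analytical/ML mix: an analytical throughput curve
# gives a prior, and a learned per-thread-count correction refines it
# from a few observed samples -- shortening training vs. pure ML.

def analytical_tp(n, c):
    """Toy analytical prior: each of n threads commits with probability
    (1 - c) per concurrent peer, so throughput ~ n * (1 - c)**(n - 1)."""
    return n * (1.0 - c) ** (n - 1)

def build_mixed_model(samples, c):
    """samples: {n_threads: observed_throughput}. The 'ML' part here is
    a minimal correction table, linearly interpolated between sampled
    thread counts (a stand-in for a trained regressor)."""
    ratios = {n: obs / analytical_tp(n, c) for n, obs in samples.items()}
    keys = sorted(ratios)

    def predict(n):
        if n <= keys[0]:
            r = ratios[keys[0]]
        elif n >= keys[-1]:
            r = ratios[keys[-1]]
        else:
            lo = max(k for k in keys if k <= n)
            hi = min(k for k in keys if k >= n)
            r = ratios[lo] if lo == hi else (
                ratios[lo] + (ratios[hi] - ratios[lo]) * (n - lo) / (hi - lo))
        return analytical_tp(n, c) * r

    return predict

# Three measured samples are enough to instantiate the model and pick a
# concurrency level for the whole range 1..8.
model = build_mixed_model({1: 95.0, 4: 300.0, 8: 320.0}, c=0.05)
print(max(range(1, 9), key=model))
```

At sampled thread counts the model reproduces the observations exactly; elsewhere the analytical shape fills the gap.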
Report: Review of the literature: maintenance and rehabilitation costs for roads (risk-based analysis)
Realistic estimates of short- and long-term (strategic) budgets for maintenance and
rehabilitation in road asset management should consider the stochastic
characteristics of asset conditions across the road network, so that the overall
variability of road asset condition data is taken into account.
Probability theory has been used for assessing life-cycle costs for bridge
infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and
Frangopol (2004), Noortwijk and Frangopol (2004), and Novick (1993).
Salem et al. (2003) cited the importance of collecting and analysing existing
data on total costs for all life-cycle phases of existing infrastructure, including
bridges and roads, and of using realistic methods for calculating the probable
useful life of these infrastructures. Zayed et al. (2002) reported conflicting
results in life-cycle cost analysis using deterministic and stochastic methods.
Frangopol et al. (2001) suggested that additional research was required to develop
better life-cycle models and tools to quantify the risks and benefits associated with
infrastructure.
It is evident from the review of the literature that there is very limited information on
methodologies that use the stochastic characteristics of asset condition data for
assessing budgets/costs for road maintenance and rehabilitation (Abaza 2002;
Salem et al. 2003; Zhao et al. 2004). Given this gap in the research literature,
this report describes and summarises the methodologies presented in each
publication and also suggests a methodology for the current research project
funded under the Cooperative Research Centre for Construction Innovation
(CRC CI), project no. 2003-029-C.
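The stochastic treatment the review calls for can be illustrated with a small Monte Carlo budget estimate. This sketch is my own illustration, not a method from any cited paper: the deterioration rates, trigger level, and rehabilitation cost are invented parameters.

```python
# Illustrative Monte Carlo sketch: treat pavement deterioration as a
# random process rather than a deterministic curve, and estimate the
# expected rehabilitation budget over a planning horizon.
import random

def simulate_budget(years=20, trigger=40.0, rehab_cost=1.0e6,
                    n_runs=10_000, seed=42):
    """A condition index starts at 100 and degrades by a random amount
    each year; when it falls below `trigger`, a rehabilitation restores
    it to 100 and incurs `rehab_cost`. Returns the mean total cost."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        condition, total = 100.0, 0.0
        for _ in range(years):
            condition -= rng.gauss(5.0, 2.0)   # stochastic deterioration
            if condition < trigger:
                total += rehab_cost
                condition = 100.0
        totals.append(total)
    return sum(totals) / len(totals)

print(f"expected 20-year rehab cost: ${simulate_budget():,.0f}")
```

Unlike a deterministic estimate, repeating the simulation also yields the spread of outcomes, which is what a risk-based budget needs.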
Vectorwise: Beyond Column Stores
This paper tells the story of Vectorwise, a high-performance analytical database system, from multiple perspectives: its history from academic project to commercial product, the evolution of its technical architecture, customer reactions to the product, and its future research and development roadmap. One take-away from this story is that the novelty in Vectorwise is much more than just column storage: it boasts many query-processing innovations in its vectorized execution model and an adaptive mixed row/column data storage model with indexing support tailored to analytical workloads. Another is that the road from research prototype to commercial product is long, though database research continues to exert a strong innovative influence on product development.
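The vectorized execution model mentioned above can be illustrated with a toy pipeline. This is my own sketch, not Vectorwise code: each operator consumes fixed-size "vectors" of column values per call instead of one tuple at a time, so interpretation overhead is amortized over a whole batch.

```python
# Toy vectorized query pipeline: scan -> filter -> aggregate, where each
# operator call processes a fixed-size chunk (vector) of a column.

VECTOR_SIZE = 1024

def scan(column, vector_size=VECTOR_SIZE):
    """Yield the column in fixed-size chunks (vectors)."""
    for i in range(0, len(column), vector_size):
        yield column[i:i + vector_size]

def select_gt(vectors, threshold):
    """Filter each vector in one tight loop -- per-vector, not per-tuple."""
    for vec in vectors:
        yield [v for v in vec if v > threshold]

def agg_sum(vectors):
    """Sum the surviving values, one partial sum per vector."""
    return sum(sum(vec) for vec in vectors)

prices = list(range(10_000))
print(agg_sum(select_gt(scan(prices), 9_000)))  # sum of 9001..9999
```

In a real engine the per-vector inner loops run over contiguous arrays of primitive values, which is where the speedup over tuple-at-a-time interpretation comes from; the sketch only shows the batching structure.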