Extending Transactional Memory with Atomic Deferral
This paper introduces atomic deferral, an extension to TM that allows programmers to move long-running or irrevocable operations out of a transaction while maintaining serializability: the transaction and its deferred operation appear to execute atomically from the perspective of other transactions. Thus, programmers can adapt lock-based programs to exploit TM with relatively little effort and without sacrificing scalability by atomically deferring the problematic operations. We demonstrate this with several use cases for atomic deferral, as well as an in-depth analysis of its use on the PARSEC dedup benchmark, where we show that atomic deferral enables TM to be competitive with well-designed lock-based code.
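The core idea can be illustrated with a toy sketch (this is an illustration of the concept under assumed names, not the paper's implementation): a transaction collects deferred operations during its speculative phase and runs them after its writes are published, while a commit lock keeps the transaction and its deferred operations atomic with respect to other committing transactions.

```python
import threading

class Transaction:
    """Toy transaction supporting atomic deferral (hypothetical API):
    irrevocable operations (e.g. I/O) are moved out of the speculative
    phase, but still execute inside the atomic section other
    transactions observe."""

    _commit_lock = threading.Lock()  # serializes commit + deferred ops

    def __init__(self):
        self.writes = {}    # speculative write set: (store id, key) -> update
        self.deferred = []  # operations registered for atomic deferral

    def write(self, store, key, value):
        self.writes[(id(store), key)] = (store, key, value)

    def defer(self, op):
        """Register a long-running/irrevocable operation to run at commit."""
        self.deferred.append(op)

    def commit(self):
        with Transaction._commit_lock:
            for store, key, value in self.writes.values():
                store[key] = value      # publish speculative writes
            for op in self.deferred:    # deferred ops run while the commit
                op()                    # lock is still held: atomic to others

store = {}
log = []
tx = Transaction()
tx.write(store, "balance", 100)
tx.defer(lambda: log.append("wrote receipt"))  # stands in for slow I/O
tx.commit()
```

Running the deferred operation under the commit lock is what makes the pair appear as one atomic step; a real TM runtime would of course use its own commit protocol rather than a single global lock.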
Tailoring Transactional Memory to Real-World Applications
Transactional Memory (TM) promises to provide a scalable mechanism for synchronization in concurrent programs, and to offer ease-of-use benefits to programmers. Since multiprocessor architectures have dominated CPU design, exploiting parallelism in program
Supporting Time-Based QoS Requirements in Software Transactional Memory
Software Transactional Memory (STM) is an optimistic concurrency control mechanism that simplifies parallel programming. Still, there has been little interest in its applicability to reactive applications in which there is a required response time for certain operations. We propose supporting such applications by allowing programmers to associate time with atomic blocks in the form of deadlines and QoS requirements. Based on statistics of past executions, we adjust the execution mode of transactions by decreasing the level of optimism as the deadline approaches. In the presence of concurrent deadlines, we propose different conflict resolution policies. Execution mode switching mechanisms allow meeting multiple deadlines in a consistent manner, with potential QoS degradations being split fairly among several threads as contention increases, and avoiding starvation. Our implementation consists of extensions to an STM runtime that allow gathering statistics and switching execution modes. We also propose novel contention managers adapted to transactional workloads subject to deadlines. The experimental evaluation shows that our approaches significantly improve the likelihood of a transaction meeting its deadline and QoS requirement, even in cases where progress is hampered by conflicts and other concurrent transactions with deadlines.
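The mode-switching policy can be sketched as follows (a minimal illustration of "decreasing optimism as the deadline approaches"; the function names and the slack heuristic are assumptions, not the paper's runtime): while there is enough slack before the deadline, keep retrying an optimistic attempt that may abort on conflict; once slack falls below a safety margin, fall back to a pessimistic lock-based run that cannot be aborted.

```python
import threading
import time

def run_with_deadline(body, attempt_optimistic, deadline, est_duration, lock):
    """Deadline-aware execution mode switching (sketch, hypothetical API).
    body:               the transaction's effect, run irrevocably if needed
    attempt_optimistic: returns (committed, result); may abort on conflict
    est_duration:       estimated run time from past-execution statistics
    """
    while True:
        slack = deadline - time.monotonic()
        if slack < 2.0 * est_duration:   # not enough slack left to speculate
            with lock:                   # pessimistic mode: cannot abort
                return body(), "pessimistic"
        ok, result = attempt_optimistic()
        if ok:
            return result, "optimistic"  # speculation succeeded in time

# Usage: an optimistic path that always conflicts forces the fallback,
# so the transaction still meets its deadline in pessimistic mode.
lock = threading.Lock()
result, mode = run_with_deadline(
    body=lambda: 42,
    attempt_optimistic=lambda: (False, None),  # simulated repeated conflicts
    deadline=time.monotonic() + 0.05,
    est_duration=0.03,
    lock=lock,
)
```

The `2.0 *` margin is an arbitrary placeholder; the paper derives the switch point from gathered statistics rather than a fixed factor.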
Hardware extensions to make lazy subscription safe
Transactional Lock Elision (TLE) uses Hardware Transactional Memory (HTM) to execute unmodified critical sections concurrently, even if they are protected by the same lock. To ensure correctness, the transactions used to execute these critical sections "subscribe" to the lock by reading it and checking that it is available. A recent paper proposed using the tempting "lazy subscription" optimization for a similar technique in a different context, namely transactional systems that use a single global lock (SGL) to protect all transactional data. We identify several pitfalls that show that lazy subscription is not safe for TLE because unmodified critical sections executing before subscribing to the lock may behave incorrectly in a number of subtle ways. We also show that recently proposed compiler support for modifying transaction code to ensure subscription occurs before any incorrect behavior could manifest is not sufficient to avoid all of the pitfalls we identify. We further argue that extending such compiler support to avoid all pitfalls would add substantial complexity and would usually limit the extent to which subscription can be deferred, undermining the effectiveness of the optimization. Hardware extensions suggested in the recent proposal also do not address all of the pitfalls we identify. In this extended version of our WTTM 2014 paper, we describe hardware extensions that make lazy subscription safe, both for SGL-based transactional systems and for TLE, without the need for special compiler support. We also explain how nontransactional loads can be exploited, if available, to further enhance the effectiveness of lazy subscription.
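The "subscription" the abstract refers to can be sketched in a toy model (a simplification under assumed names; real TLE runs the speculative path inside a hardware transaction, which this plain-Python sketch cannot do): with *eager* subscription, the elided path reads the lock first and retries if it is held, so the critical section never runs concurrently with a lock holder; lazy subscription would postpone that check, which is exactly what the paper shows to be unsafe.

```python
class ElidableLock:
    """Toy lock whose state speculative paths subscribe to."""
    def __init__(self):
        self.held = False

def tle_execute(lock, critical_section, max_attempts=3):
    """Sketch of TLE with eager subscription (hypothetical API)."""
    for _ in range(max_attempts):
        if lock.held:                  # subscription: lock must be free
            continue                   # "abort" the elided attempt, retry
        return critical_section()      # speculative (elided) execution
    lock.held = True                   # fallback path: actually take the lock
    try:
        return critical_section()
    finally:
        lock.held = False

# Usage: the lock is free, so the critical section runs on the elided path.
lock = ElidableLock()
counter = []
tle_execute(lock, lambda: counter.append(1))
```

Moving the `lock.held` check after `critical_section()` would model lazy subscription, and the section could then observe state being mutated by a concurrent lock holder, which is the class of pitfall the paper analyzes.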
Data Management for Dynamic Multimedia Analytics and Retrieval
Multimedia data in its various manifestations poses a unique challenge from a data storage and data management perspective, especially if search, analysis and analytics in large data corpora is considered. The inherently unstructured nature of the data itself and the curse of dimensionality that afflicts the representations we typically work with in its stead are cause for a broad range of issues that require sophisticated solutions at different levels. This has given rise to a huge corpus of research that puts focus on techniques that allow for effective and efficient multimedia search and exploration. Many of these contributions have led to an array of purpose-built, multimedia search systems.
However, recent progress in multimedia analytics and interactive multimedia retrieval has demonstrated that several of the assumptions usually made for such multimedia search workloads do not hold once a session has a human user in the loop. Firstly, many of the required query operations cannot be expressed by mere similarity search and, since the concrete requirement cannot always be anticipated, one needs a flexible and adaptable data management and query framework. Secondly, the widespread notion of staticity of data collections does not hold if one considers analytics workloads, whose purpose is to produce and store new insights and information. And finally, it is impossible even for an expert user to specify exactly how a data management system should produce and arrive at the desired outcomes of the potentially many different queries.
Guided by these shortcomings and motivated by the fact that similar questions have once been answered for structured data in classical database research, this Thesis presents three contributions that seek to mitigate the aforementioned issues. We present a query model that generalises the notion of proximity-based query operations and formalises the connection between those queries and high-dimensional indexing. We complement this by a cost-model that makes the often implicit trade-off between query execution speed and results quality transparent to the system and the user. And we describe a model for the transactional and durable maintenance of high-dimensional index structures.
All contributions are implemented in the open-source multimedia database system Cottontail DB, on top of which we present an evaluation that demonstrates the effectiveness of the proposed models. We conclude by discussing avenues for future research in the quest for converging the fields of databases on the one hand and (interactive) multimedia retrieval and analytics on the other.
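The proximity-based query operations the thesis generalises can be illustrated with a toy operator (a linear-scan sketch with hypothetical names, not Cottontail DB's API; the thesis maps such queries onto high-dimensional index structures and a cost model instead of scanning):

```python
import math

def knn(collection, query, k, distance=None):
    """Toy proximity-based query operator: return the k vectors of
    `collection` closest to `query` under `distance` (Euclidean by
    default). A real system would answer this via a high-dimensional
    index, trading result quality against execution speed."""
    dist = distance or (lambda a, b: math.dist(a, b))
    return sorted(collection, key=lambda v: dist(v, query))[:k]

# Usage: nearest-neighbour search over a tiny 2-D collection.
result = knn([(0, 0), (1, 1), (5, 5)], (0.2, 0.2), k=2)
```

Swapping `distance` for another metric (cosine, Hamming, a learned metric) is what makes the operator a generalisation of plain similarity search; the exact scan here is the quality-optimal but slowest point in the speed/quality trade-off the cost model exposes.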
Reconfiguring the Real: Art & Aesthetic Innovation in Kazuo Ishiguro's Axiomatic Fictions
This study approaches six of Ishiguro's novels (A Pale View of Hills (1982), An Artist of the Floating World (1986), The Remains of the Day (1989), The Unconsoled (1995), When We Were Orphans (2000), and Never Let Me Go (2005)) through a treatment of these works as novelistic works of art. It derives its theoretical inspiration from aesthetic theories of art by Étienne Gilson, Graham Gordon, Peter Lamarque, Susanne Langer, and Noël Carroll, as well as concepts found within the disciplines of philosophy of mind (especially phenomenology), post-classical narratology (possible world theory applied to literary studies), and studies on memory as well as narrative immersion. The point of departure of this study lies in its drawing attention to and placing greater focus on the artistic character of Ishiguro's style: an examination of the aesthetic construction of his novels through close readings of his novels and early short stories. The thesis focuses on the ways in which the narrative form, voice, and structuration of Ishiguro's works flaunt and flout the analytic unreal/real fictional paradox through a signposting of their fictionality, whilst paradoxically and simultaneously producing an intensified feeling of the real through directed formal techniques. In so doing, my study also seeks to highlight a deceptive aspect of Ishiguro's seemingly transparent prose, to show how it harbours a subtle experimental dimension working at the formal level of his novels. The duplicitous nature of his prose, I postulate, is the source of its original quality. What this results in, the thesis postulates, is a mode of fiction that I term "axiomatic fiction": fiction that deftly negotiates fictional self-consciousness at the same time that it works to maintain a seamless semblance of poetic illusion that is able to captivate and enthral readers.
Tracing the Compositional Process. Sound art that rewrites its own past: formation, praxis and a computer framework
The domain of this thesis is electroacoustic computer-based music and sound art. It investigates a facet of composition which is often neglected or ill-defined: the process of composing itself and its embedding in time. Previous research mostly focused on instrumental composition or, when electronic music was included, the computer was treated as a tool which would eventually be subtracted from the equation. The aim was either to explain a resultant piece of music by reconstructing the intention of the composer, or to explain human creativity by building a model of the mind.
Our aim instead is to understand composition as an irreducible unfolding of material traces which takes place in its own temporality. This understanding is formalised as a software framework that traces creation time as a version graph of transactions. The instantiation and manipulation of any musical structure implemented within this framework is thereby automatically stored in a database. Not only can it be queried ex post by an external researcher (providing a new quality for the empirical analysis of the activity of composing), but it is an integral part of the composition environment. Therefore it can recursively become a source for the ongoing composition and introduce new ways of aesthetic expression. The framework aims to unify creation and performance time, fixed and generative composition, human and algorithmic "writing", a writing that includes indeterminate elements which condense as concurrent vertices in the version graph.
The second major contribution is a critical epistemological discourse on the question of observability and the function of observation. Our goal is to explore a new direction of artistic research which is characterised by a mixed methodology of theoretical writing, technological development and artistic practice. The form of the thesis is an exercise in becoming process-like itself, wherein the epistemic thing is generated by translating the gaps between these three levels. This is my idea of the new aesthetics: that through the operation of a re-entry one may establish a sort of process "form", yielding works which go beyond a categorical either "sound-in-itself" or "conceptualism".
Exemplary processes are revealed by deconstructing a series of existing pieces, as well as through the successful application of the new framework in the creation of new pieces.
Critical Pragmatism: Peirce and Marcuse on the Socio-Political Influences on Human Development in Advanced Industrial Societies
My dissertation brings together representatives from two otherwise antagonistic traditions: Charles Peirce of the pragmatists and Herbert Marcuse of the critical theorists. I demonstrate the affinities between the two philosophers with a focus on their contributions to socio-political thought in advanced industrial societies. After addressing the antagonisms between the two traditions, I offer a reading that allows for a Peircean complement to Marcuse's One-Dimensional Man and a Marcusean complement to Peirce's critique of the method of authority in his seminal essay, "The Fixation of Belief".