TANDEM: taming failures in next-generation datacenters with emerging memory
The explosive growth of online services, leading to unforeseen scales, has made modern datacenters highly prone to failures. Taming these failures hinges on fast and correct recovery, minimizing service interruptions.
To support recovery, applications must take additional measures during failure-free execution to maintain a recoverable state of their data and computation logic. However, these precautionary measures have severe implications for performance, correctness, and programmability, making recovery incredibly challenging to realize in practice.
Emerging memory, particularly non-volatile memory (NVM) and disaggregated memory (DM), offers a promising opportunity to achieve fast recovery with maximum performance. However, incorporating these technologies into datacenter architecture presents significant challenges: their architectural attributes, which differ significantly from those of traditional memory devices, introduce new semantic challenges for implementing recovery, complicating correctness and programmability.
Can emerging memory enable fast, performant, and correct recovery in the datacenter? This thesis aims to answer this question while addressing the associated challenges.
When architecting datacenters with emerging memory, system architects face four key challenges: (1) how to guarantee correct semantics; (2) how to enforce correctness efficiently, with optimal performance; (3) how to validate end-to-end correctness, including recovery; and (4) how to preserve programmer productivity (programmability).
This thesis aims to address these challenges through the following approaches: (a)
defining precise consistency models that formally specify correct end-to-end semantics
in the presence of failures (consistency models also play a crucial role in programmability); (b) developing new low-level mechanisms to efficiently enforce the prescribed models given the capabilities of emerging memory; and (c) creating robust testing frameworks to validate end-to-end correctness and recovery.
We start our exploration with non-volatile memory (NVM), which offers fast persistence capabilities directly accessible through the processor’s load-store (memory) interface. Notably, these capabilities can be leveraged to enable fast recovery for Log-Free Data Structures (LFDs) while maximizing performance. However, due to the complexity of modern cache hierarchies, data rarely persist in any specific order, jeopardizing recovery and correctness. Recovery therefore needs primitives that explicitly control the order of updates to NVM (known as persistency models). We outline the precise specification of a novel persistency model, Release Persistency (RP), which provides a consistency guarantee for LFDs on what remains in non-volatile memory upon failure. To enforce RP efficiently, we propose a novel microarchitectural mechanism, lazy release persistence (LRP). Using standard LFD benchmarks, we show that LRP achieves fast recovery while incurring minimal performance overhead.
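The ordering hazard described above can be illustrated with a small simulation. This is a toy model under assumed semantics, not the RP/LRP mechanism from the thesis: it considers a single append that writes a node payload and then the head pointer making it reachable, and compares which crash states are possible with and without a persist barrier between the two writes.

```python
from itertools import chain, combinations

# Toy model of persist ordering for a log-free list append.
# An append issues two writes: the new node's payload, then the head
# pointer that makes the node reachable. Caches may evict (persist)
# these in any order unless a persist barrier enforces ordering.

WRITES = ["payload", "head"]

def crash_states(ordered):
    """All sets of writes that may be in NVM when a crash hits.
    With a barrier, only prefixes of the write sequence can persist;
    without one, any subset of the writes may have reached NVM."""
    if ordered:
        return [set(WRITES[:i]) for i in range(len(WRITES) + 1)]
    subsets = chain.from_iterable(
        combinations(WRITES, r) for r in range(len(WRITES) + 1))
    return [set(s) for s in subsets]

def recoverable(state):
    """Recovery invariant: a reachable node must have a valid payload,
    i.e. if 'head' persisted then 'payload' must have persisted too."""
    return "payload" in state or "head" not in state

# Without ordering, recovery can observe a reachable node with garbage.
assert not all(recoverable(s) for s in crash_states(ordered=False))
# With a persist barrier between the writes, every crash state recovers.
assert all(recoverable(s) for s in crash_states(ordered=True))
```

On real hardware the barrier would be a cache-line write-back plus a fence between the two stores; the simulation only captures why such ordering control is necessary for correct recovery.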
We continue our discussion with memory disaggregation, which decouples memory from traditional monolithic servers and offers a promising pathway to very high availability in replicated in-memory data stores. Achieving such availability hinges on transaction protocols that can efficiently handle recovery in this setting, where compute and memory are independent. However, there is a challenge: disaggregated memory (DM) cannot support RPC-style protocols, mandating one-sided transaction protocols. Exacerbating the problem, one-sided transactions expose critical low-level ordering decisions to architects, posing a threat to correctness. We present a highly available transaction protocol, Pandora, that is specifically designed to achieve fast recovery in disaggregated key-value stores (DKVSes). Pandora is the first one-sided transactional protocol that ensures correct, non-blocking, and fast recovery in a DKVS. Our experimental implementation demonstrates that Pandora achieves fast recovery and high availability while causing minimal disruption to services.
Finally, we introduce a novel targeted litmus-testing framework, DART, to validate the end-to-end correctness of transactional protocols with recovery. Using DART’s targeted testing capabilities, we found several critical bugs in Pandora, highlighting the need for robust end-to-end testing in the design loop to iteratively fix correctness bugs. Crucially, DART is lightweight and black-box, requiring no intervention from programmers.
Exponential-time approximation schemes via compression
In this paper, we give a framework to design exponential-time approximation schemes for basic graph partitioning problems such as k-way cut, Multiway Cut, Steiner k-cut and Multicut, where the goal is to minimize the number of edges going across the parts. Our motivation to focus on approximation schemes for these problems comes from the fact that while it is possible to solve them exactly in $2^n n^{\mathcal{O}(1)}$ time…
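To make the objective concrete, a brute-force solver for the minimum k-way cut can be sketched in a few lines. This exhaustive enumeration runs in roughly k^n time and is purely illustrative of the problem definition, far from the paper's techniques; the function name is hypothetical.

```python
from itertools import product

def min_kway_cut(n, edges, k):
    """Exhaustively find a minimum k-way cut: assign each of the n
    vertices to one of k parts (all parts non-empty) so that the number
    of edges going across the parts is minimized."""
    best = None
    for assign in product(range(k), repeat=n):
        if len(set(assign)) != k:   # every part must be non-empty
            continue
        cost = sum(1 for u, v in edges if assign[u] != assign[v])
        if best is None or cost < best:
            best = cost
    return best

# A 4-cycle split into 2 parts must cut at least 2 edges.
assert min_kway_cut(4, [(0, 1), (1, 2), (2, 3), (3, 0)], 2) == 2
```

Exact exponential algorithms improve on this naive k^n enumeration; the paper's point is that approximation schemes can push the exponential base down further.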
LIPIcs, Volume 251, ITCS 2023, Complete Volume
Subgroup discovery for structured target concepts
The main object of study in this thesis is subgroup discovery, a theoretical framework for finding subgroups in data (i.e., named sub-populations) whose behaviour with respect to a specified target concept is exceptional when compared to the rest of the dataset. This is a powerful tool that conveys crucial information to a human audience, but despite past advances it has been limited to simple target concepts. In this work we propose algorithms that bring this framework to novel application domains. We introduce the concept of representative subgroups, which we use not only to ensure the fairness of a sub-population with regard to a sensitive trait, such as race or gender, but also to go beyond known trends in the data. For entities with additional relational information that can be encoded as a graph, we introduce a novel measure of robust connectedness which improves on established alternative measures of density; we then provide a method that uses this measure to discover which named sub-populations are better connected. Our contributions within subgroup discovery culminate in the introduction of kernelised subgroup discovery: a novel framework that enables the discovery of subgroups on i.i.d. target concepts with virtually any kind of structure. Importantly, our framework additionally provides a concrete and efficient tool that works out-of-the-box without any modification, apart from specifying the Gramian of a positive definite kernel. For use within kernelised subgroup discovery, but also in any other kind of kernel method, we additionally introduce a novel random walk graph kernel. Our kernel allows fine-tuning of the alignment between the vertices of the two compared graphs while counting the random walks, and we also propose meaningful structure-aware vertex labels to utilise this new capability.
With these contributions we thoroughly extend the applicability of subgroup discovery and ultimately re-define it as a kernel method.
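The random walk graph kernel mentioned above can be illustrated with the standard textbook construction: counting common walks on the direct product graph of the two inputs. This is a generic truncated-walk variant, not the thesis's kernel; the label-matching step merely stands in for the vertex-alignment capability described in the abstract, and all names are illustrative.

```python
import numpy as np

def product_adjacency(A1, A2, labels1, labels2):
    """Adjacency matrix of the direct product graph: its vertices are
    the pairs (i, j) whose labels match, and (i, j) ~ (u, v) iff
    i ~ u in G1 and j ~ v in G2."""
    n2 = len(A2)
    keep = [i * n2 + j for i in range(len(A1)) for j in range(n2)
            if labels1[i] == labels2[j]]
    return np.kron(A1, A2)[np.ix_(keep, keep)]

def walk_kernel(A1, A2, labels1, labels2, lam=0.1, L=10):
    """Truncated random-walk kernel: sum over l <= L of lam^l times
    the number of common label-matching walks of length l."""
    Ax = product_adjacency(A1, A2, labels1, labels2)
    total, P = 0.0, np.eye(len(Ax))
    for l in range(L + 1):
        total += (lam ** l) * P.sum()   # P[x, y] counts length-l walks
        P = P @ Ax
    return total
```

Structure-aware vertex labels (e.g. vertex degrees) restrict the product graph to aligned vertex pairs, which is the kind of fine-tuning of vertex alignment the abstract refers to.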
Mooring the global archive: a Japanese ship and its migrant histories
Martin Dusinberre follows the Yamashiro-maru steamship across Asian and Pacific waters in an innovative history of Japan's engagement with the outside world in the late-nineteenth century. His compelling in-depth analysis reconstructs the lives of some of the thousands of male and female migrants who left Japan for work in Hawai'i, Southeast Asia and Australia. These stories bring together transpacific historiographies of settler colonialism, labour history and resource extraction in new ways. Drawing on an unconventional and deeply material archive, from gravestones to government files, paintings to song, and from digitized records to the very earth itself, Dusinberre addresses key questions of method and authorial positionality in the writing of global history. This engaging investigation into archival practice asks, what is the global archive, where is it cited, and who are 'we' as we cite it? This title is also available as Open Access.
Sensing Collectives: Aesthetic and Political Practices Intertwined
Are aesthetics and politics really two different things? The book takes a new look at how they intertwine by turning from theory to practice. Case studies trace how sensory experiences are created and how collective interests are shaped. They investigate how aesthetics and politics are entangled, both in building and disrupting collective orders, in governance and innovation. The cases range from populist rallies and artistic activism through alternative lifestyles and consumer culture to corporate PR and governmental policies. The authors are academics and artists. The result is a new mapping of the intermingling and co-constitution of aesthetics and politics in engagements with collective orders.
A New Deterministic Algorithm for Fully Dynamic All-Pairs Shortest Paths
We study the fully dynamic All-Pairs Shortest Paths (APSP) problem in undirected edge-weighted graphs. Given an $n$-vertex graph with non-negative edge lengths that undergoes an online sequence of edge insertions and deletions, the goal is to support approximate distance queries and shortest-path queries. We provide a deterministic algorithm for this problem,
that, for a given precision parameter $\epsilon$, achieves an approximation factor and an amortized update time per operation that depend on $\epsilon$, on $n$, and on the ratio of longest to shortest edge length. The algorithm supports distance queries and shortest-path queries, with the query time for a shortest-path query depending additionally on the length of the path that the algorithm returns. To the best of our knowledge, no adaptive-update algorithms with comparable amortized update time and query time were known prior to this work, even allowing significantly larger approximation factors. We also note that our guarantees are stronger than the best current guarantees for APSP in decremental graphs in the adaptive-adversary setting.
Comment: arXiv admin note: text overlap with arXiv:2109.0562
Fault-Tolerant Spanners against Bounded-Degree Edge Failures: Linearly More Faults, Almost For Free
We study a new and stronger notion of fault-tolerant graph structures whose size bounds depend on the degree of the failing edge set, rather than on the total number of faults. For a subset of faulty edges $F$, the faulty-degree $\deg(F)$ is the largest number of faults in $F$ incident to any given vertex. We design new fault-tolerant structures with size comparable to previous constructions, but which tolerate every fault set of small faulty-degree $\deg(F)$, rather than only fault sets of small size $|F|$. Our main results are:
- New FT-Certificates: For every $n$-vertex graph $G$ and degree threshold $f$, one can compute a sparse connectivity certificate $H \subseteq G$ with the following guarantee: for any edge set $F$ with faulty-degree at most $f$ and every vertex pair $s, t$, it holds that $s$ and $t$ are connected in $H \setminus F$ iff they are connected in $G \setminus F$. The size of our certificates is nearly tight. Since our certificates handle some fault sets whose total size far exceeds their faulty-degree, prior work did not imply any nontrivial upper bound for this problem.
- New FT-Spanners: We show that every $n$-vertex graph admits a $(2k-1)$-spanner that tolerates any fault set of faulty-degree at most $f$, with a size bound that is optimal up to its hidden dependence on $k$, and close to the bound known for the case where the total number of faults is $f$ [Bodwin, Dinitz, Robelle SODA '22]. Our proof of this theorem is non-constructive, but by following a proof strategy of Dinitz and Robelle [PODC '20], we show that the runtime can be made polynomial by paying an additional factor in spanner size.
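For context, the classic greedy construction of a (non-fault-tolerant) $(2k-1)$-spanner shows the baseline object that these fault-tolerant results strengthen; the constructions in the paper are substantially more involved. A minimal sketch, with illustrative function names, for unweighted graphs:

```python
from collections import deque

def bfs_dist(adj, s, t, limit):
    """Distance from s to t in the current spanner, capped at limit
    (returns limit + 1 if t is farther than limit or unreachable)."""
    if s == t:
        return 0
    dist = {s: 0}
    q = deque([s])
    while q:
        x = q.popleft()
        if dist[x] >= limit:
            continue
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                if y == t:
                    return dist[y]
                q.append(y)
    return limit + 1

def greedy_spanner(n, edges, k):
    """Classic greedy (2k-1)-spanner: keep an edge (u, v) only if u and
    v are currently at distance greater than 2k-1 in the spanner."""
    adj = [[] for _ in range(n)]
    spanner = []
    for u, v in edges:
        if bfs_dist(adj, u, v, limit=2 * k - 1) > 2 * k - 1:
            adj[u].append(v)
            adj[v].append(u)
            spanner.append((u, v))
    return spanner
```

On the complete graph $K_5$ with $k = 2$, the greedy rule keeps only a star of 4 edges, yet every pair stays within the stretch bound of 3; tolerating bounded-degree edge failures requires keeping carefully chosen extra edges on top of such a baseline.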
Theologische Zugänge zu Technik und Künstlicher Intelligenz
The publication of this work was supported by the Open Access Publication Fund of Humboldt-Universität zu Berlin.

Technology and artificial intelligence are among the most pressing topics in contemporary theology. How can theology contribute to technology and AI? The discourse around technology is charged with religious motifs, and technologies such as robots pose new challenges for theology, for example for its conception of the human being, its ethics, and its religious practice. This edited volume explores these urgent themes of our time from a theological perspective, with theology entering into dialogue with the engineering sciences. The contributions examine how robots change the conception of the human being, religious robots, optimisation of the body, medical technologies, autoregulative weapons systems, and how theology is transformed by technologisation. From an interdisciplinary perspective, new international research results are presented and new paths are opened.
Advertising as a Creative Industry: Regime of Paradoxes
At the crossroads of culture and commerce, the advertising industry is a regime of paradoxes. This book examines the place of advertising in today’s creative industries, exploring the major challenges advertisers confront as they engage with other creative sectors. Izabela Derda, author, media scholar, and industry expert, offers insights into how the industry keeps deconstructing its own creative processes and collaborative models as it attempts to stay relevant. Through extensive case studies and interviews with industry professionals and thought leaders, this book examines the sector’s struggle to adapt to new business models and to monetize creativity in today’s media landscape, from re-engaging audiences through media more typical of arts and entertainment to managing intricate cross-sectoral creative collaborations. From redesigning workplaces to satisfy the expectations of the youngest generations of creatives to reconsidering the paradigm of conventional creative teams, the advertising sector has swiftly adjusted to the seismic changes in today’s media landscape. The book will be of interest to scholars and students of creative media, advertising, and media studies, as well as those interested in understanding the changing complexities and latest innovations of the creative industries. Advertising professionals, artists, and policymakers will find relevant insights and possible solutions for the major challenges facing the advertising industry today. The Open Access version of this book, available at www.taylorfrancis.com, has been made available under a CC-BY license.
- …