TANDEM: taming failures in next-generation datacenters with emerging memory
The explosive growth of online services to unforeseen scales has made modern datacenters highly prone to failures. Taming these failures hinges on fast and correct recovery, minimizing service interruptions.
To be recoverable, applications must take additional measures to maintain a recoverable state of data and computation logic during failure-free execution. However, these precautionary measures have
severe implications for performance, correctness, and programmability, making recovery incredibly challenging to realize in practice.
Emerging memory, particularly non-volatile memory (NVM) and disaggregated memory (DM), offers a promising opportunity to achieve fast recovery with maximum performance. However, incorporating these technologies into datacenter architecture presents significant challenges: their architectural attributes, which differ significantly from those of traditional memory devices, introduce new semantic challenges for
implementing recovery, complicating correctness and programmability.
Can emerging memory enable fast, performant, and correct recovery in the datacenter? This thesis aims to answer this question while addressing the associated challenges.
When architecting datacenters with emerging memory, system architects face four key challenges: (1) how to guarantee correct semantics; (2) how to efficiently enforce correctness with optimal performance; (3) how to validate end-to-end correctness, including recovery; and (4) how to preserve programmer productivity (programmability).
This thesis aims to address these challenges through the following approaches: (a)
defining precise consistency models that formally specify correct end-to-end semantics
in the presence of failures (consistency models also play a crucial role in programmability); (b) developing new low-level mechanisms to efficiently enforce the prescribed models given the capabilities of emerging memory; and (c) creating robust testing frameworks to validate end-to-end correctness and recovery.
We start our exploration with non-volatile memory (NVM), which offers fast persistence capabilities directly accessible through the processor's load-store (memory) interface. Notably, these capabilities can be leveraged to enable fast recovery for Log-Free Data Structures (LFDs) while maximizing performance. However, due to the complexity of modern cache hierarchies, data hardly persist in any specific order, jeopardizing recovery and correctness. Therefore, recovery needs primitives that explicitly control the order of updates to NVM (known as persistency models). We outline the precise specification of a novel persistency model, Release Persistency (RP), which provides a consistency guarantee for LFDs on what remains in non-volatile memory upon failure. To efficiently enforce RP, we propose a novel microarchitectural mechanism,
lazy release persistence (LRP). Using standard LFD benchmarks, we show that LRP achieves fast recovery while incurring minimal performance overhead.
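To make the ordering problem concrete, the sketch below shows the kind of explicit flush-and-fence ordering a programmer would otherwise impose on x86 when durably linking a node into a log-free structure; it illustrates neither RP nor LRP, and all names and the memory layout are illustrative assumptions.

```cpp
// Minimal sketch (not the thesis' RP model or LRP hardware): manually ordering
// persists on x86 NVM with a cache-line write-back plus a store fence.
// Names and layout are illustrative; compile with e.g. g++ -O2 -mclwb.
#include <immintrin.h>
#include <atomic>
#include <cstdint>

struct Node {
    uint64_t payload;
    Node*    next;
};

// Stand-in for a persistent region; on real NVM this would live in a DAX-mapped file.
std::atomic<Node*> head{nullptr};

inline void persist(void* addr) {
    _mm_clwb(addr);   // write the cache line back towards (non-volatile) memory
    _mm_sfence();     // order the write-back before subsequent stores
}

void durable_push(Node* n, uint64_t value) {
    n->payload = value;
    n->next    = head.load(std::memory_order_relaxed);
    persist(n);                                // 1) persist the node's contents first
                                               //    (assumes the node fits in one cache line)
    head.store(n, std::memory_order_release);  // 2) only then publish it
    persist(&head);                            // 3) and persist the new head pointer
}
```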
We continue our discussion with memory disaggregation, which decouples memory from traditional monolithic servers and offers a promising pathway to very high availability in replicated in-memory data stores. Achieving such availability hinges on transaction protocols that can efficiently handle recovery in this setting, where compute and memory are independent. However, there is a challenge: disaggregated memory (DM) does not work with RPC-style protocols, mandating one-sided transaction protocols. Exacerbating the problem, one-sided transactions expose critical low-level ordering to architects, posing a threat to correctness. We present a highly available transaction protocol, Pandora, that is specifically designed to achieve fast recovery in disaggregated key-value stores (DKVSes).
Pandora is the first one-sided transactional protocol that ensures correct, non-blocking, and fast recovery in DKVSes. Our experimental evaluation demonstrates that Pandora achieves fast recovery and high availability while causing minimal disruption to services.
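For intuition only, the sketch below shows the general shape of an optimistic one-sided commit over remote memory; it is not Pandora's protocol, every name and the lock-word layout are assumptions for illustration, and the abort rollback and recovery logic that the thesis actually addresses are omitted.

```cpp
// Illustrative sketch only, not Pandora: an optimistic one-sided commit over a
// toy "remote memory" (a local array standing in for one-sided READ/WRITE/CAS).
#include <cstddef>
#include <cstdint>
#include <vector>

struct RemoteMem {                                   // stand-in for disaggregated memory
    std::vector<uint64_t> words;
    uint64_t read(std::size_t a) const { return words[a]; }      // one-sided READ
    void write(std::size_t a, uint64_t v) { words[a] = v; }      // one-sided WRITE
    bool cas(std::size_t a, uint64_t exp, uint64_t des) {        // one-sided CAS
        if (words[a] != exp) return false;
        words[a] = des;
        return true;
    }
};

struct WriteOp { std::size_t lock_addr, data_addr; uint64_t new_val; };
struct ReadOp  { std::size_t lock_addr; uint64_t version_seen; };  // captured unlocked

// Lock word layout: LSB = locked, upper bits = version. Read and write sets
// are assumed disjoint to keep the sketch short.
bool commit(RemoteMem& m, const std::vector<ReadOp>& rs, const std::vector<WriteOp>& ws) {
    for (const auto& w : ws) {                       // 1) lock the write set via CAS
        uint64_t v = m.read(w.lock_addr);
        if ((v & 1) || !m.cas(w.lock_addr, v, v | 1)) return false;  // abort (rollback omitted)
    }
    for (const auto& r : rs) {                       // 2) validate the read set
        uint64_t v = m.read(r.lock_addr);
        if (v != r.version_seen) return false;       // version changed or locked by another
    }
    for (const auto& w : ws) m.write(w.data_addr, w.new_val);        // 3) install new values
    for (const auto& w : ws) {                       // 4) bump version and release the lock
        uint64_t v = m.read(w.lock_addr);
        m.write(w.lock_addr, (v & ~1ull) + 2);
    }
    return true;
}
```

The ordering of these lock, validate, write-back, and unlock steps is an example of the kind of low-level ordering that one-sided protocols expose to the designer.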
Finally, we introduce a novel targeted litmus-testing framework, DART, to validate the end-to-end correctness of transactional protocols with recovery. Using DART's targeted testing capabilities, we have found several critical bugs in Pandora, highlighting the need for robust end-to-end testing methods in the design loop to iteratively fix correctness bugs. Crucially, DART is lightweight and black-box, thereby eliminating any intervention from programmers.
Planetary Hinterlands: Extraction, Abandonment and Care
This open access book considers the concept of the hinterland as a crucial tool for understanding the global and planetary present as a time defined by the lasting legacies of colonialism, increasing labor precarity under late capitalist regimes, and looming climate disasters. Traditionally seen to serve a (colonial) port or market town, the hinterland here becomes a lens to attend to the times and spaces shaped and experienced across the received categories of the urban, rural, wilderness or nature. In straddling these categories, the concept of the hinterland foregrounds the human and more-than-human lively processes and forms of care that go on even in sites defined by capitalist extraction and political abandonment. Bringing together scholars from the humanities and social sciences, the book rethinks hinterland materialities, affectivities, and ecologies across places and cultural imaginations, Global North and South, urban and rural, and land and water
Evaluating the sustainability and resiliency of local food systems
With an ever-rising global population and looming environmental challenges such as climate change and soil degradation, it is imperative to increase the sustainability of food production. The drastic rise in food insecurity during the COVID-19 pandemic has further shown a pressing need to increase the resiliency of food systems. One strategy to reduce the dependence on complex, vulnerable global supply chains is to strengthen local food systems, such as by producing more food in cities. This thesis uses an interdisciplinary, food systems approach to explore aspects of sustainability and resiliency within local food systems.
Lifecycle assessment (LCA) was used to evaluate how farm scale, distance to consumer, and management practices influence environmental impacts for different local agriculture models in two case study locations: Georgia, USA and England, UK. Farms were grouped based on urbanisation level and management practices, including: urban organic, peri-urban organic, rural organic, and rural conventional. A total of 25 farms and 40 crop lifecycles were evaluated, focusing on two crops (kale and tomatoes) and including impacts from seedling production through final distribution to the point of sale. Results were extremely sensitive to the allocation of composting burdens (decomposition emissions), with impact variation between organic farms driven mainly by levels of compost use. When composting burdens were attributed to compost inputs, the rural conventional category in the U.S. and the rural organic category in the UK had the lowest average impacts per kg sellable crop produced, including the lowest global warming potential (GWP). However, when subtracting avoided burdens from the municipal waste stream from compost inputs, trends reversed entirely, with urban or peri-urban farm categories having the lowest impacts (often negative) for GWP and marine eutrophication. Overall, farm management practices were the most important factor driving environmental impacts from local food supply chains.
A soil health assessment was then performed on a subset of the UK farms to provide insight to ecosystem services that are not captured within LCA frameworks. Better soil health was observed in organically-farmed and uncultivated soils compared to conventionally farmed soils, suggesting higher ecosystem service provisioning as related to improved soil structure, flood mitigation, erosion control, and carbon storage. However, relatively high heavy metal concentrations were seen on urban and peri-urban farms, as well as those located in areas with previous mining activity. This implies that there are important services and disservices on farms that are not captured by LCAs.
Zooming out from a focus on food production, a qualitative methodology was used to explore experiences of food insecurity and related health and social challenges during the COVID-19 pandemic. Fourteen individuals receiving emergency food parcels from a community food project in Sheffield, UK were interviewed. Results showed that maintaining food security in times of crisis requires a diverse set of individual, household, social, and place-based resources, which were largely diminished or strained during the pandemic. Drawing upon social capital and community support was essential to cope with a multiplicity of hardship, highlighting a need to develop community food infrastructure that supports ideals of mutual aid and builds connections throughout the food supply chain. Overall, this thesis shows that a range of context-specific solutions are required to build sustainable and resilient food systems. This can be supported by increasing local control of food systems and designing strategies to meet specific community needs, whilst still acknowledging a shared global responsibility to protect ecosystem, human, and planetary health
Parallel and Flow-Based High Quality Hypergraph Partitioning
Balanced hypergraph partitioning is a classic NP-hard optimization problem that is a fundamental tool in such diverse disciplines as VLSI circuit design, route planning, sharding distributed databases, optimizing communication volume in parallel computing, and accelerating the simulation of quantum circuits.
Given a hypergraph and an integer k, the task is to divide the vertices into k disjoint blocks of bounded size, while minimizing an objective function on the hyperedges that span multiple blocks.
In this dissertation we consider the most commonly used objective, the connectivity metric, where we aim to minimize the number of different blocks connected by each hyperedge.
The most successful heuristic for balanced partitioning is the multilevel approach, which consists of three phases.
In the coarsening phase, vertex clusters are contracted to obtain a sequence of structurally similar but successively smaller hypergraphs.
Once the hypergraph is sufficiently small, an initial partition is computed.
Lastly, the contractions are successively undone in reverse order, and an iterative improvement algorithm is employed to refine the projected partition on each level.
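For concreteness, the connectivity objective is commonly computed as the sum over all hyperedges of (number of blocks the hyperedge spans minus one). The self-contained toy below wires the three phases together on a tiny hypergraph; its pair-merging coarsening, round-robin initial partitioning, and brute-force refinement (with the balance constraint ignored) are deliberately naive stand-ins for illustration only, not the algorithms developed in this dissertation.

```cpp
// Toy sketch of the three-phase multilevel scheme on a tiny hypergraph.
#include <cstdint>
#include <iostream>
#include <unordered_set>
#include <vector>

struct HG { int n; std::vector<std::vector<int>> edges; };   // pins per hyperedge

// Connectivity objective: sum over hyperedges of (#blocks spanned - 1).
int64_t connectivity(const HG& hg, const std::vector<int>& part) {
    int64_t obj = 0;
    for (const auto& pins : hg.edges) {
        std::unordered_set<int> blocks;
        for (int v : pins) blocks.insert(part[v]);
        obj += static_cast<int64_t>(blocks.size()) - 1;
    }
    return obj;
}

// Coarsening step: contract vertices 2i and 2i+1 into coarse vertex i.
HG coarsen(const HG& hg, std::vector<int>& map) {
    map.resize(hg.n);
    for (int v = 0; v < hg.n; ++v) map[v] = v / 2;
    HG coarse{(hg.n + 1) / 2, {}};
    for (const auto& pins : hg.edges) {
        std::unordered_set<int> cp;
        for (int v : pins) cp.insert(map[v]);
        if (cp.size() > 1) coarse.edges.emplace_back(cp.begin(), cp.end());
    }
    return coarse;
}

std::vector<int> initial_partition(const HG& hg, int k) {
    std::vector<int> part(hg.n);
    for (int v = 0; v < hg.n; ++v) part[v] = v % k;            // round-robin
    return part;
}

// Refinement: try moving each vertex to every other block, keep strict improvements.
void refine(const HG& hg, std::vector<int>& part, int k) {
    for (int v = 0; v < hg.n; ++v)
        for (int b = 0; b < k; ++b) {
            int old_block = part[v];
            if (b == old_block) continue;
            int64_t before = connectivity(hg, part);
            part[v] = b;
            if (connectivity(hg, part) >= before) part[v] = old_block;   // undo
        }
}

std::vector<int> multilevel(const HG& hg, int k) {
    if (hg.n <= 2 * k) {                        // coarse enough: initial partitioning
        auto part = initial_partition(hg, k);
        refine(hg, part, k);
        return part;
    }
    std::vector<int> map;
    HG coarse = coarsen(hg, map);               // coarsening phase
    auto coarse_part = multilevel(coarse, k);   // recurse on the coarser hypergraph
    std::vector<int> part(hg.n);                // project partition to the finer level
    for (int v = 0; v < hg.n; ++v) part[v] = coarse_part[map[v]];
    refine(hg, part, k);                        // refinement phase
    return part;
}

int main() {
    HG hg{6, {{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}}};
    std::vector<int> part = multilevel(hg, 2);
    std::cout << "connectivity = " << connectivity(hg, part) << "\n";
}
```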
An important aspect in designing practical heuristics for optimization problems is the trade-off between solution quality and running time.
The appropriate trade-off depends on the specific application, the size of the data sets, and the computational resources available to solve the problem.
Existing algorithms either are slow and sequential but offer high solution quality, or are simple, fast, and easy to parallelize but offer low quality.
While this trade-off cannot be avoided entirely, our goal is to close the gaps as much as possible.
We achieve this by improving the state of the art in all non-trivial areas of the trade-off landscape with only a few techniques, but employed in two different ways.
Furthermore, most research on parallelization has focused on distributed memory, which neglects the greater flexibility of shared-memory algorithms and the wide availability of commodity multi-core machines.
In this thesis, we therefore design and revisit fundamental techniques for each phase of the multilevel approach, and develop highly efficient shared-memory parallel implementations thereof.
We consider two iterative improvement algorithms, one based on the Fiduccia-Mattheyses (FM) heuristic, and one based on label propagation.
For these, we propose a variety of techniques to improve the accuracy of gains when moving vertices in parallel, as well as low-level algorithmic improvements.
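For intuition, the sequential sketch below performs one label-propagation-style refinement round using the standard connectivity gain of a vertex move; the incidence-list and pin-count representation is an assumption, and the balance handling and parallel gain-accuracy techniques that this thesis contributes are not shown.

```cpp
// Sequential sketch of label-propagation-style refinement with connectivity gains.
// phi[e][b] = number of pins of hyperedge e currently in block b.
#include <vector>

using Incidence = std::vector<std::vector<int>>;   // hyperedges incident to each vertex
using PinCounts = std::vector<std::vector<int>>;   // phi[e][b]

// Gain of moving v from its current block to `to` (positive = connectivity drops).
int gain(int v, int to, const std::vector<int>& part,
         const Incidence& inc, const PinCounts& phi) {
    int g = 0, from = part[v];
    for (int e : inc[v]) {
        if (phi[e][from] == 1) ++g;   // v is the last pin of e in `from`: e loses a block
        if (phi[e][to] == 0)   --g;   // e would newly connect block `to`
    }
    return g;
}

// One round: move each vertex to its best block and keep phi consistent.
void label_propagation_round(int k, std::vector<int>& part,
                             const Incidence& inc, PinCounts& phi) {
    for (int v = 0; v < static_cast<int>(part.size()); ++v) {
        int best_block = part[v], best_gain = 0;
        for (int b = 0; b < k; ++b) {
            if (b == part[v]) continue;
            int g = gain(v, b, part, inc, phi);
            if (g > best_gain) { best_gain = g; best_block = b; }
        }
        if (best_block != part[v]) {            // apply the move and update phi
            for (int e : inc[v]) { --phi[e][part[v]]; ++phi[e][best_block]; }
            part[v] = best_block;
        }
    }
}
```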
For coarsening, we present a parallel variant of greedy agglomerative clustering with a novel method to resolve cluster join conflicts on-the-fly.
Combined with a preprocessing phase for coarsening based on community detection, a portfolio of from-scratch partitioning algorithms, as well as recursive partitioning with work-stealing, we obtain our first parallel multilevel framework.
It is the fastest partitioner known and achieves medium-high quality, beating all parallel partitioners and coming close to the highest-quality sequential partitioner.
Our second contribution is a parallelization of an n-level approach, where only one vertex is contracted and uncontracted on each level.
This extreme approach aims at high solution quality via very fine-grained, localized refinement, but seems inherently sequential.
We devise an asynchronous n-level coarsening scheme based on a hierarchical decomposition of the contractions, as well as a batch-synchronous uncoarsening, and later fully asynchronous uncoarsening.
In addition, we adapt our refinement algorithms, and also use the preprocessing and portfolio.
This scheme is highly scalable, and achieves the same quality as the highest quality sequential partitioner (which is based on the same components), but is of course slower than our first framework due to fine-grained uncoarsening.
The last ingredient for high quality is an iterative improvement algorithm based on maximum flows.
In the sequential setting, we first improve an existing idea by solving incremental maximum flow problems, which leads to smaller cuts and is faster due to engineering efforts.
Subsequently, we parallelize the maximum flow algorithm and schedule refinements in parallel.
Beyond striving for the highest quality, we present a deterministically parallel partitioning framework.
We develop deterministic versions of the preprocessing, coarsening, and label propagation refinement.
Experimentally, we demonstrate that the penalties for determinism in terms of partition quality and running time are very small.
All of our claims are validated through extensive experiments, comparing our algorithms with state-of-the-art solvers on large and diverse benchmark sets.
To foster further research, we make our contributions available in our open-source framework Mt-KaHyPar.
While it seems inevitable that, with ever-increasing problem sizes, we must transition to distributed-memory algorithms, the study of shared-memory techniques is not in vain.
With the multilevel approach, even the inherently slow techniques have a role to play in fast systems, as they can be employed to boost quality on coarse levels at little expense.
Similarly, techniques for shared-memory parallelism are important, both as soon as a coarse graph fits into memory and as local building blocks in the distributed algorithm.
LIPIcs, Volume 261, ICALP 2023, Complete Volume
Conclusion: Youth aspirations, trajectories, and farming futures
This book commenced with a question of global importance: in a world in which farming populations are ageing, who is going to provide the planet's peoples with the "sufficient, safe and nutritious food" that is needed to meet the "dietary needs and food preferences for an active and healthy life" (FAO 2006)? In other words, where are the people who are needed to generationally renew farming? As explained in the introduction, addressing this question meant going against the grain of much research on youth and agriculture. Rather than seeking to understand youth's apparent disinterest in farming and their exodus from the countryside, the research teams focused on those youth and young adults who stayed in, returned, or relocated to rural areas and were involved in farming (often alongside various other economic activities). Thereby, the case studies presented in this book have put the next generation of farmers in the spotlight. In this concluding chapter, we draw out some important issues emerging from across the chapters and reflect on key differences. In doing so, we reiterate the various pathways of becoming a farmer, the main challenges experienced by these young farming women and men, and the roles that policies and organizations could play in facilitating the process of becoming a farmer.
Microbe Hunters
Microbe Hunters by Paul de Kruif was first published in 1926 by Harcourt, Brace and Company, New York. It dramatically recounts the breakthrough discoveries of the fundamental elements of bacteriology. It features exciting profiles of Antony Leeuwenhoek, Lazzaro Spallanzani, Louis Pasteur, Robert Koch, Émile Roux, Emil Behring, Élie Metchnikoff, Theobald Smith, David Bruce, Ronald Ross, Battista Grassi, Walter Reed, and Paul Ehrlich. Their development of germ theory and its scientific proofs led to the first effective treatments for human diseases like anthrax, rabies, diphtheria, malaria, sleeping sickness, syphilis, and yellow fever. They also made discoveries that saved the dairy, wine, beer, silk, and cattle industries. These determined experimenters proved time and again that tiny living beings seen only by microscope can have huge impacts on human life, and they emphatically demonstrated the value of science for modern civilization. A best seller in its time, the work is an enduring classic that has inspired many scientific careers.
Paul de Kruif (1890–1971) was an American microbiologist and World War I veteran who turned to writing after his dismissal from the Rockefeller Institute for Medical Research because of his controversial opinions on current medical practice published in a book of essays. Among his other works, he also assisted Sinclair Lewis with the background of science for the novel Arrowsmith (1925).
doi: 10.32873/unl.dc.zea.1503
Basketball, culture and society in a devolved context: a qualitative analysis
This thesis investigates the potential of basketball as a tool for development in Scotland. It provides an original conceptual synthesis of knowledge that offers a critical narrative concerning the evolving relationship between basketball, development, and society alongside the limits and possibilities of basketball in Scotland. The study adopts the interpretivist paradigm alongside qualitative methodology. It consists of an exploratory mixed-methods approach which utilises audio-visuals, documents and reports alongside semi-structured interviews and embeds a case study design. The research comprises four empirical chapters: The Development and History of Basketball in Scotland; Grassroots Basketball in Scotland: basketballscotland; Community Basketball in Scotland: Blaze Basketball Club; and Professional Basketball in Scotland: Caledonia Gladiators Basketball Club. Findings indicate that basketball helps develop people, communities and nations through capability building processes. To generate optimal developmental outcomes through basketball, a collaborative, democratic, intentional, person-first, community-driven, needs-motivated, ground-level led system bound by connections alongside relationships, underpinned by passionate people is required. The earlier people are introduced to basketball and the longer they remain in basketball environments, the greater the potential for developmental outcomes. To maximise results, basketball in Scotland must address its main limitations: funding; lagging opportunities; participatory barriers; Scottish basketball's disjointed community, nature, positionality, and system; alongside Scotland's sporting culture.
Qualitative impact assessment of land management interventions on Ecosystem Services ('QEIA'). Report 3.7: Cultural Services
The focus of this project was to provide a rapid qualitative assessment of land management interventions on Ecosystem Services (ES) proposed for inclusion in Environmental Land Management (ELM) schemes. This involved a review of the current evidence base by ten expert teams drawn from the independent research community in a consistent series of ten Evidence Reviews. These reviews were undertaken rapidly at Defra's request and together captured more than 2000 individual sources of evidence. These reviews were then used to inform an Integrated Assessment (IA) to provide a more accessible summary of these evidence reviews with a focus on capturing the actions with the greatest potential magnitude of change for the intended ES and their potential co-benefits and trade-offs across the Ecosystem Services and Ecosystem Services Indicators.
The final IA table captured scores for 741 actions across 8 Themes, 33 ES and 53 ES-indicators. This produced a total possible matrix of 39,273 scores. It should be noted that this piece of work is just one element of the wider underpinning work Defra has commissioned to support the development of the ELM schemes. The project was carried out in two phases with the environmental and provisioning services commissioned in Phase 1 and cultural and regulatory services in a follow-on Phase 2.
Due to the urgency of the need for these evidence reviews, there was insufficient time for systematic reviews, and therefore the reviews relied on the team's knowledge of the peer-reviewed and grey literature, with some rapid additional checking of recent reports and papers. This limitation of the review process was clearly explained and understood by Defra. The review presented here is one of the ten evidence reviews which informed the IA.
- …