Tourism and heritage in the Chornobyl Exclusion Zone
Tourism and Heritage in the Chornobyl Exclusion Zone (CEZ) uses an ethnographic lens to explore the dissonances associated with the commodification of Chornobyl's heritage.
The book considers the role of guides as experience brokers, focusing on the synergy between tourists and guides in the performance of heritage interpretation. Banaszkiewicz proposes perceiving tour guides as important actors in the bottom-up construction of heritage discourse, contributing to a more inclusive and participatory approach to heritage management. Demonstrating that the CEZ has been undergoing a dynamic transformation into a mass tourism attraction, the book offers a critical reflection on heritagisation as a meaning-making process in which the resources of the past are interpreted, negotiated, and recognised as a valuable legacy. Applying the concept of dissonant heritage to describe the heterogeneous character of the CEZ, the book broadens the interpretative scope of dark tourism, which takes on a new dimension in the context of the war in Ukraine.
Tourism and Heritage in the Chornobyl Exclusion Zone argues that post-disaster sites such as Chornobyl can teach us a great deal about the importance of preserving cultural and natural heritage for future generations. The book will be of interest to academics and students who are engaged in the study of heritage, tourism, memory, disasters and Eastern Europe.
Performance, memory efficiency and programmability: the ambitious triptych of combining vertex-centricity with HPC
The field of graph processing has grown significantly due to the flexibility and wide applicability of the graph data structure. In the meantime, so has interest from the community in developing new approaches to graph processing applications. In 2010, Google introduced the vertex-centric programming model through their framework Pregel. This consists of expressing computation from the perspective of a vertex, whilst inter-vertex communications are achieved via data exchanges along incoming and outgoing edges, using the message-passing abstraction provided. Pregel's high-level programming interface, designed around a set of simple functions, provides ease of programmability to the user. The aim is to enable the development of graph processing applications without requiring expertise in optimisation or parallel programming. Such challenges are instead abstracted from the user and offloaded to the underlying framework. However, fine-grained synchronisation, unpredictable memory access patterns and multiple sources of load imbalance make it difficult to implement the vertex-centric model efficiently on high-performance computing platforms without sacrificing programmability.
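The vertex-centric model described above can be illustrated with a minimal sketch of a superstep loop running PageRank, expressed from a single vertex's perspective. The names here (`Vertex`, `compute`, `superstep`) are illustrative only, not the actual Pregel or iPregel API, and the sequential loop stands in for the parallel scheduling a real framework would provide.

```python
class Vertex:
    def __init__(self, vid, out_edges):
        self.vid = vid
        self.out_edges = out_edges   # ids of destination vertices
        self.value = 1.0             # initial score; a framework might use 1/n
        self.outbox = []             # messages produced this superstep

    def compute(self, messages, step, num_vertices, damping=0.85):
        """User-defined logic, written from one vertex's point of view."""
        if step > 0:
            # Combine messages received along incoming edges last superstep.
            self.value = (1 - damping) / num_vertices + damping * sum(messages)
        # Send this vertex's contribution along every outgoing edge.
        share = self.value / len(self.out_edges) if self.out_edges else 0.0
        for dst in self.out_edges:
            self.outbox.append((dst, share))


def superstep(vertices, inboxes, step):
    """The framework's job: run compute() on every vertex, then route messages."""
    next_inboxes = {v.vid: [] for v in vertices}
    for v in vertices:
        v.outbox.clear()
        v.compute(inboxes.get(v.vid, []), step, len(vertices))
        for dst, msg in v.outbox:
            next_inboxes[dst].append(msg)
    return next_inboxes


# Tiny 3-vertex cycle: 0 -> 1 -> 2 -> 0
graph = [Vertex(0, [1]), Vertex(1, [2]), Vertex(2, [0])]
inboxes = {}
for step in range(50):
    inboxes = superstep(graph, inboxes, step)
```

The user writes only `compute`; everything in `superstep` (message routing, synchronisation, and in a real system the parallelism) is the framework's responsibility, which is exactly the abstraction whose efficient HPC implementation the thesis addresses.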
This research focuses on combining vertex-centric and High-Performance Computing (HPC), resulting in the development of a shared-memory framework, iPregel, which demonstrates that performance and memory efficiency similar to those of non-vertex-centric approaches can be achieved while preserving the programmability benefits of the vertex-centric model. Non-volatile memory is then explored to extend single-node capabilities, during which multiple versions of iPregel are implemented to experiment with various data movement strategies.
Then, distributed-memory parallelism is investigated to overcome the resource limitations of single-node processing. A second framework, DiP, which ports applicable iPregel optimisations to distributed memory, prioritises performance to achieve high scalability.
This research has resulted in a set of techniques and optimisations illustrated through a shared-memory framework, iPregel, and a distributed-memory framework, DiP. The former closes a gap of several orders of magnitude in both performance and memory efficiency, and is even able to process a graph of 750 billion edges using non-volatile memory. The latter proves that this competitiveness can also be scaled beyond a single node, enabling the processing of the largest graph generated in this research, comprising 1.6 trillion edges. Most importantly, both frameworks achieve these performance and capability gains whilst preserving programmability, which is the cornerstone of the vertex-centric programming model. This research therefore demonstrates that, by combining vertex-centricity and High-Performance Computing (HPC), it is possible to maintain performance, memory efficiency and programmability.
neuroAIx-Framework: design of future neuroscience simulation systems exhibiting execution of the cortical microcircuit model 20× faster than biological real-time
Introduction: Research in the field of computational neuroscience relies on highly capable simulation platforms. With real-time capabilities surpassed for established models like the cortical microcircuit, it is time to conceive next-generation systems: neuroscience simulators providing significant acceleration, even for larger networks with natural density, biologically plausible multi-compartment models and the modeling of long-term and structural plasticity.
Methods: Stressing the need for agility to adapt to new concepts or findings in the domain of neuroscience, we have developed the neuroAIx-Framework consisting of an empirical modeling tool, a virtual prototype, and a cluster of FPGA boards. This framework is designed to support and accelerate the continuous development of such platforms driven by new insights in neuroscience.
Results: Based on design space explorations using this framework, we devised and realized an FPGA cluster consisting of 35 NetFPGA SUME boards.
Discussion: This system functions as an evaluation platform for our framework. At the same time, it resulted in a fully deterministic neuroscience simulation system surpassing the state of the art in both performance and energy efficiency. It is capable of simulating the microcircuit with 20× acceleration compared to biological real-time and achieves an energy efficiency of 48 nJ per synaptic event.
Asylum and immigration policy, policy communities and the British news media: a case study in policy-making
This research investigation examines the policy communities and networks (PC&N) perspective as a tool for understanding the influence of the news media in shaping the policy agenda. It does so by examining the evolution of two case studies in a new policy arena, asylum and immigration, from policy initiative to policy reversal. To understand how the dynamics of discourse shape the development of the policy agenda, it is fundamental first to understand the nature of information flow in social settings. Policy communities and networks provide the appropriate social setting in which to explore the role of the news media, as they facilitate the flow through which information is constructed, distributed, and absorbed. Existing literature on the influence of the news media on opinion making is extensive; however, literature on its influence on policy making is emergent. By applying the PC&N perspective to issue definition, decision making and policy change, this research investigation contributes to both bodies of literature, as well as to the emergent literature on the influence of the news media on immigration and asylum policy itself. In addition, through its empirical examination of the evolution of asylum and immigration policy reversals, it utilises a new methodology, content analysis, to identify the existence, nature and membership of policy communities and networks and of the insider groups active within them. In providing strong evidence that the policy communities and networks perspective is a valid approach for understanding the nature of policymaking and the role of the news media in shaping policy agendas, it also provides an alternative approach to examining policy making in an emergent field of policy science research: asylum and immigration policy network analysis.
Credible to Whom? The Organizational Politics of Credibility in International Relations
Why do foreign policy decision makers care about the credibility of their own state’s commitments? How does organizational identity shape policymakers’ concern for credibility, and in turn, their willingness to use force during crises? While much previous research examines how decision makers assess others’ credibility, only recently have scholars questioned when and why leaders or their advisers prioritize their own state’s credibility.
Building on classic scholarship in bureaucratic politics, I argue that organizational identity affects the dimensions of credibility that national security officials value, and ultimately, their policy advocacy around the use of force. Particular differences arise between military and diplomatic organizations; while military officials equate credibility with hard military capabilities, diplomats view credibility in terms of reputation, or demonstrating reliability and resolve to external parties.
During crises, military officials confine their advice on the use of force to what can be achieved given current capabilities, while diplomats exhibit a higher willingness to use force as a signal of a strong commitment. I test these propositions using text analysis of archival records from two collections of U.S. national security policy documents, eight case studies of American, British, and French crisis decision making, and an original survey experiment involving more than 400 current or former U.S. national security officials. I demonstrate that credibility concerns affect the balance of hawkishness in the advice that diplomats and military officials deliver to leaders as a function of organizational identity.
CITIES: Energetic Efficiency, Sustainability; Infrastructures, Energy and the Environment; Mobility and IoT; Governance and Citizenship
This book collects important contributions on smart cities. It was created in collaboration with the ICSC-CITIES2020, held in San José (Costa Rica) in 2020, and gathers articles on: energetic efficiency and sustainability; infrastructures, energy and the environment; mobility and IoT; governance and citizenship.
Bibliographic Control in the Digital Ecosystem
With the contributions of international experts, the book aims to explore the new boundaries of universal bibliographic control. Bibliographic control is radically changing because the bibliographic universe is radically changing: resources, agents, technologies, standards and practices. Among the main topics addressed are: library cooperation networks; legal deposit; national bibliographies; new tools and standards (IFLA LRM, RDA, BIBFRAME); authority control and new alliances (Wikidata, Wikibase, Identifiers); new ways of indexing resources (artificial intelligence); institutional repositories; the new book supply chain; "discoverability" in the IIIF digital ecosystem; the role of thesauri and ontologies in the digital ecosystem; bibliographic control and search engines.
COVID-19 Outbreak and Beyond
The COVID-19 pandemic drastically changed our lifestyle when, on 30 January 2020, the World Health Organization declared the coronavirus disease outbreak a public health emergency of international concern. Since then, many governments have introduced unprecedented containment measures, hoping to slow the spread of the virus. International research suggests that both the pandemic and the related protective measures, such as lockdowns, curfews, and social distancing, are having a profound impact on the mental health of the population. Among the most commonly observed psychological effects are high levels of stress, anxiety, depression, and post-traumatic symptoms, along with boredom and frustration. At the same time, the behavioral response of the population is of paramount importance to successfully containing the outbreak, creating a vicious circle in which psychological distress affects the willingness to comply with the protective measures, which, in turn, if prolonged, could exacerbate the population's distress. This book includes: i) original studies on the worldwide psychological and behavioral impact of COVID-19 on targeted individuals (e.g., parents, social workers, patients affected by physical and mental disorders); ii) studies exploring the effect of COVID-19 using advanced statistical and methodological techniques (e.g., machine learning technologies); and iii) research on practical applications that could help identify persons at risk, mitigate the negative effects of this situation, and offer insights to policymakers on managing the pandemic.