Mobile heritage practices. Implications for scholarly research, user experience design, and evaluation methods using mobile apps.
Mobile heritage apps have become one of the most popular means for audience
engagement and curation of museum collections and heritage contexts. This
raises practical and ethical questions for both researchers and practitioners, such
as: what kind of audience engagement can be built using mobile apps? What are
the current approaches? How can audience engagement with these experiences
be evaluated? How can those experiences be made more resilient, and in turn
sustainable? In this thesis I explore experience design scholarship together with
personal professional insights to analyse digital heritage practices, with a view to
accelerating thinking about, and critique of, mobile apps in particular. As a result,
the chapters that follow look at the evolution of digital heritage practices,
examining the cultural, societal, and technological contexts in which mobile
heritage apps are developed by the creative media industry and academic
institutions, and how these forces are shaping user experience design
methods. Drawing on studies in (critical) digital heritage, Human-Computer
Interaction (HCI), and design thinking, this thesis provides a critical analysis of
the development and use of mobile practices for heritage. Furthermore,
through an empirical and embedded approach to research, the thesis also
presents auto-ethnographic case studies to show that mobile
experiences conceptualised through more organic design approaches can result in
more resilient and sustainable heritage practices. In doing so, this thesis
encourages a renewed understanding of the pivotal role of these practices in the
broader sociocultural, political, and environmental changes.
Ideas and perspectives: Beyond model evaluation – combining experiments and models to advance terrestrial ecosystem science
Ecosystem manipulative experiments are a powerful tool to
understand terrestrial ecosystem responses to global change because they
measure real responses in real ecosystems and yield insights into causal
relationships. However, their scope is limited in space and time due to
cost and labour intensity. This makes generalising results from such
experiments difficult, which creates a conceptual gap between local-scale
process understanding and global-scale future predictions. Recent efforts
have seen results from such experiments used in combination with dynamic
global vegetation models, most commonly to evaluate model predictions under
global change drivers. However, there is much more potential in combining
models and experiments. Here, we discuss the value and potential of a
workflow for using ecosystem experiments together with process-based models
to enhance the potential of both. We suggest that models can be used prior
to the start of an experiment to generate hypotheses, identify data needs,
and in general guide experimental design. Models, when adequately
constrained with observations, can also predict variables which are
difficult to measure frequently or at all, and together with the data they can
provide a more complete picture of ecosystem states. Finally, models can be
used to help generalise the experimental results in space and time, by
providing a framework in which process understanding derived from site-level
experiments can be incorporated. We also discuss the potential for using
manipulative experiments together with models in formalised model–data
integration frameworks for parameter estimation and model selection, a path
made possible by the increasing number of ecosystem experiments and diverse
observation streams. The ideas presented here can provide a roadmap to
future experiment–model studies.
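The model–data integration for parameter estimation mentioned above can be sketched as a simple least-squares fit of a process model to manipulation-experiment observations. The exponential temperature-response model, parameter names, and all numbers below are hypothetical illustrations chosen for the sketch, not taken from the studies discussed:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical process model: ecosystem respiration as a Q10-style
# exponential function of soil temperature. The functional form and
# parameters are illustrative assumptions, not from the text.
def respiration(temp_c, r_base, q10):
    return r_base * q10 ** ((temp_c - 10.0) / 10.0)

# Synthetic "observations" standing in for data from a warming experiment.
rng = np.random.default_rng(0)
temp_obs = np.linspace(5, 25, 20)
resp_obs = respiration(temp_obs, r_base=2.0, q10=2.2) + rng.normal(0, 0.1, 20)

# Constrain the model parameters with the observations.
popt, pcov = curve_fit(respiration, temp_obs, resp_obs, p0=[1.0, 2.0])
r_base_fit, q10_fit = popt
print(f"fitted r_base={r_base_fit:.2f}, Q10={q10_fit:.2f}")
```

In a real workflow the fitted model, now constrained by site-level data, could then be run at other sites or under future forcing to generalise the experimental result in space and time.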
Systems Analysis for Sustainable Wellbeing. 50 years of IIASA research, 40 years after the Brundtland Commission, contributing to the post-2030 Global Agenda
This report chronicles the half-century-long history of the International Institute for Applied Systems Analysis (IIASA), established in 1972 in Laxenburg, Austria, to address common social, economic, and environmental challenges at a time when the world was politically dominated by the Cold War.
The report shows IIASA's transition from its original raison d'être as a cooperative scientific venture between East and West to its position today as a global institute engaged in exploring solutions to some of the world's most intractable problems: the interconnected problems of population, climate change, biodiversity loss, land, energy, and water use, among others.
It provides a concise overview of IIASA's key contributions to science over the last 50 years and of the advances it has made not only in analyzing existing and emerging trends but also in developing enhanced scientific tools to address them. The report also shows how IIASA is currently working with distinguished partners worldwide to establish the scientific basis for a successful transition to sustainable development.
The global mandate, to achieve the 2030 Agenda, its 17 Sustainable Development Goals (SDGs), and 169 specific targets, features prominently in the institute's work and in the report at hand: the pathways needed to achieve the SDGs have been the basis of many scientific studies by IIASA and its partners. The predominantly "bottom-up" nature of tackling the SDGs has required optimal responses to the very diverse and overlapping issues they involve, including judicious tradeoffs among the solutions that can be applied. Now, at the mid-term review point of the 2030 Agenda, this report focuses on the big picture and clarifies why, after years of scientific endeavor, the ultimate goal of this difficult global mandate should be sustainable wellbeing for all.
The report is in six parts that summarize past and current IIASA research highlights and point toward future challenges and solutions: i) Systems analysis for a challenged world; ii) Population and human capital; iii) Food security, ecosystems, and biodiversity; iv) Energy, technology, and climate change; v) Global systems analysis for understanding the drivers of sustainable wellbeing; and vi) Moving into the future: Three critical policy messages.
The three critical policy messages, necessary to trigger discussions about a post-2030 Agenda for Sustainable Development, are: (1) Suboptimization is suboptimal: mainstream a systems-analysis approach into policymaking at all levels; (2) Enhance individual agency: prioritize women's empowerment through universal female education; and (3) Strengthen collective action and governance: global cooperation and representation for the global commons.
Shake-table testing of a stone masonry building aggregate: overview of blind prediction study
City centres of Europe are often composed of unreinforced masonry structural aggregates, whose seismic response is challenging to predict. To advance the state of the art on the seismic response of these aggregates, the Adjacent Interacting Masonry Structures (AIMS) subproject of the Horizon 2020 project Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe (SERA) provides shake-table test data for a two-unit, double-leaf stone masonry aggregate subjected to two horizontal components of dynamic excitation. A blind prediction study was organized with participants from academia and industry to test modelling approaches and assumptions and to learn about the extent of uncertainty in modelling such masonry aggregates. The participants were provided with the full set of material and geometrical data, construction details, and the original seismic input, and were asked to predict, prior to the test, the expected seismic response in terms of damage mechanisms, base-shear forces, and roof displacements. The modelling approaches used differ significantly in their level of detail and modelling assumptions. This paper provides an overview of the adopted modelling approaches and their subsequent predictions. It further discusses the range of assumptions made when modelling masonry walls, floors, and connections, and aims to discover how common solutions for modelling masonry in general, and masonry aggregates in particular, affect the results. The results are evaluated in terms of damage mechanisms, base-shear forces, displacements, and interface openings in both directions, and are then compared with the experimental results. The modelling approaches featuring the Discrete Element Method (DEM) led to the best predictions in terms of displacements, while a submission using rigid block limit analysis led to the best prediction in terms of damage mechanisms.
Large coefficients of variation of the predicted displacements, and a general underestimation of displacements in comparison with the experimental results, except for the DEM models, highlight the need for further consensus building on suitable modelling assumptions for such masonry aggregates.
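The coefficient of variation used above to compare blind-prediction submissions is simply the standard deviation of the predictions divided by their mean. A minimal sketch, with hypothetical displacement values rather than the actual AIMS submissions:

```python
import statistics

# Hypothetical roof-displacement predictions (mm) from several blind
# submissions; these numbers are illustrative, not the AIMS test data.
predictions_mm = [12.0, 25.0, 8.0, 30.0, 15.0]

mean_mm = statistics.mean(predictions_mm)
cov = statistics.pstdev(predictions_mm) / mean_mm  # coefficient of variation
print(f"mean={mean_mm:.1f} mm, CoV={cov:.2f}")
```

A CoV approaching 0.5, as in this illustrative set, indicates that submissions scatter by roughly half the mean value, which is why large CoVs signal a lack of modelling consensus.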
GHOST Commissioning Science Results II: a very metal-poor star witnessing the early Galactic assembly
This study focuses on Pristine (hereafter P180956,
[Fe/H] ), a star selected from the Pristine Inner Galaxy Survey
(PIGS), and followed-up with the recently commissioned Gemini High-resolution
Optical SpecTrograph (GHOST) at the Gemini South telescope. The GHOST
spectrograph's high efficiency in the blue spectral region (~ Å)
enables the detection of elemental tracers of early supernovae (e.g. Al, Mn,
Sr, Eu), which were not accessible in the previous analysis of P180956. The
star exhibits chemical signatures resembling those found in ultra-faint dwarf
systems, characterised by very low abundances of neutron-capture elements (Sr,
Ba, Eu), which are uncommon among stars of comparable metallicity in the Milky
Way. Our analysis suggests that P180956 bears the chemical imprints of a small
number (2 or 4) of low-mass hypernovae (~10–15 M☉), which are needed to
reproduce the abundance pattern of the light elements (e.g. [Si, Ti/Mg, Ca]
), and one fast-rotating intermediate-mass supernova (~300 km/s,
~80–120 M☉). Both types of supernovae explain the high [Sr/Ba] of
P180956 (). The small pericentric (~0.7 kpc) and apocentric
(~13 kpc) distances and its orbit confined to the plane (≲2 kpc)
indicate that this star was likely accreted during the early Galactic
assembly phase. Its chemo-dynamical properties suggest that P180956 formed in a
system similar to an ultra-faint dwarf galaxy accreted either alone, as one of
the low-mass building blocks of the proto-Galaxy, or as a satellite of
Gaia-Sausage-Enceladus. The combination of Gemini's large aperture with GHOST's
high efficiency and broad spectral coverage makes this new spectrograph one of
the leading instruments for near-field cosmology investigations. Comment: Submitted to MNRAS. 8 figures, 15 pages.
On the Simulation of Supersonic Flame Holder Cavities with OpenFOAM
One of the next major advancements in the aerospace industry will be hypersonic flight. To achieve it, however, propulsion systems capable of reaching hypersonic speeds need to be developed. One of the more promising hypersonic propulsion systems is the scramjet engine, but several problems still need to be explored before reliable scramjet engines can be produced, the biggest being keeping the engine ignited. This has led to the use of flame holder cavities to create a region of subsonic flow within the engine in which combustion can occur. High experimental costs make computational fluid dynamics (CFD) simulations an attractive way to explore these problems. Numerical simulation is effective but is plagued by high computational costs. The question remains: how can we use CFD simulation to develop scramjets quickly? To address this, an OpenFOAM solver, known as rssFOAM, was developed to simulate supersonic combustion using finite-rate chemistry. rssFOAM is used here to simulate a supersonic flame holder cavity corresponding to a series of experiments from the Air Force Research Laboratory (AFRL). The effects of the turbulence model, the size of the chemical mechanism, and the geometry used for the simulation are explored. The results collected are intended to help with the transition between high-fidelity research-level simulations and lower-fidelity design-level simulations, and are compared with experimental data and prior simulation results from the AFRL. The results show that RANS turbulence models are more than capable of handling these types of simulations and that smaller, less detailed chemical mechanisms can be used. The results also show that the importance of properly capturing the boundary layer means inlet geometries cannot be ignored.
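The finite-rate chemistry underlying a solver like rssFOAM evaluates reaction rates from Arrhenius-type expressions, k = A·T^β·exp(−Ea/RT), at each cell temperature rather than assuming infinitely fast chemistry. A minimal sketch of such a rate evaluation; the constants are illustrative assumptions, not taken from any AFRL mechanism:

```python
import math

R_UNIVERSAL = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_rate(temp_k, a_factor, beta, e_activation):
    """Modified Arrhenius rate constant k = A * T**beta * exp(-Ea / (R*T))."""
    return a_factor * temp_k**beta * math.exp(-e_activation / (R_UNIVERSAL * temp_k))

# Illustrative constants (not from a real mechanism): the same reaction is
# negligible at a cool inlet temperature but fast at flame temperature,
# which is why ignition needs the hot subsonic recirculation in the cavity.
k_cold = arrhenius_rate(600.0, a_factor=1e10, beta=0.0, e_activation=1.5e5)
k_hot = arrhenius_rate(2200.0, a_factor=1e10, beta=0.0, e_activation=1.5e5)
print(k_hot / k_cold)  # many orders of magnitude faster when hot
```

The exponential sensitivity to temperature shown here is also what makes finite-rate chemistry stiff and expensive, motivating the smaller mechanisms the study explores.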
Cybersecurity: Past, Present and Future
The digital transformation has created a new digital space known as
cyberspace. This new cyberspace has improved the workings of businesses,
organizations, governments, society as a whole, and the day-to-day life of the
individual. With these improvements come new challenges, and one of the main
challenges is security. The security of the new cyberspace is called
cybersecurity. Cyberspace has created new technologies and environments such as
cloud computing, smart devices, the Internet of Things (IoT), and several others. To keep pace with
these advancements in cyber technologies there is a need to expand research and
develop new cybersecurity methods and tools to secure these domains and
environments. This book is an effort to introduce the reader to the field of
cybersecurity, highlight current issues and challenges, and provide future
directions to mitigate or resolve them. The main specializations of
cybersecurity covered in this book are software security, hardware security,
the evolution of malware, biometrics, cyber intelligence, and cyber forensics.
We must learn from the past, evolve our present and improve the future. Based
on this objective, the book covers the past, present, and future of these main
specializations of cybersecurity. The book also examines the upcoming areas of
research in cyber intelligence, such as hybrid augmented and explainable
artificial intelligence (AI). Human and AI collaboration can significantly
increase the performance of a cybersecurity system. Interpreting and explaining
machine learning models, i.e., explainable AI, is an emerging field of study and
has great potential to improve the role of AI in cybersecurity. Comment: Author's copy of the book published under ISBN 978-620-4-74421-