Modelling uncertainties for measurements of the H → γγ Channel with the ATLAS Detector at the LHC
The Higgs boson to diphoton (H → γγ) branching ratio is only 0.227 %, but this
final state has yielded some of the most precise measurements of the particle. As
measurements of the Higgs boson become increasingly precise, greater import is
placed on the factors that constitute the uncertainty. Reducing the effects of these
uncertainties requires an understanding of their causes. The research presented
in this thesis aims to illuminate how uncertainties on simulation modelling are
determined and proffers novel techniques for deriving them.
The upgrade of the FastCaloSim tool, which simulates events in the ATLAS
calorimeter at a rate far exceeding that of the nominal detector simulation,
Geant4, is described. The integration of a method that allows the toolbox to emulate the
accordion geometry of the liquid argon calorimeters is detailed. This tool allows
for the production of larger samples while using significantly fewer computing
resources.
A measurement of the total Higgs boson production cross-section multiplied
by the diphoton branching ratio (σ × Bγγ) is presented, where this value was
determined to be (σ × Bγγ)obs = 127 ± 7 (stat.) ± 7 (syst.) fb, in agreement
with the Standard Model prediction. The signal and background shape modelling
is described, and the contribution of the background modelling uncertainty to the
total uncertainty ranges from 2.4 % to 18 %, depending on the Higgs boson production
mechanism.
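Assuming the statistical and systematic components are independent, they combine in quadrature; a minimal sketch using the quoted values:

```python
import math

# Values from the measurement quoted above (fb)
xsec = 127.0
stat = 7.0   # statistical uncertainty
syst = 7.0   # systematic uncertainty

# Independent components add in quadrature
total = math.sqrt(stat**2 + syst**2)
print(f"sigma x B = {xsec} +/- {total:.1f} fb")  # sigma x B = 127.0 +/- 9.9 fb
```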
A method for estimating the number of events in a Monte Carlo background
sample required to model the shape is detailed. It was found that the nominal
γγ background sample needed to be enlarged by a factor of 3.60 to adequately
model the background at a confidence level of 68 %, or by a factor of 7.20 at a
confidence level of 95 %. Based on this estimate,
0.5 billion additional simulated events were produced, substantially reducing the
background modelling uncertainty.
A technique is detailed for emulating the effects of Monte Carlo event generator
differences using multivariate reweighting. The technique is used to estimate the
event generator uncertainty on the signal modelling of tHqb events, improving the
reliability of the estimated tHqb production cross-section. This multivariate
reweighting technique is then used to estimate the generator modelling uncertainties
on background V γγ samples for the first time. The estimated uncertainties were
found to be covered by the currently assumed background modelling uncertainty.
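As a hedged illustration of the reweighting idea (not the thesis's actual implementation, which presumably operates on full event kinematics), the sketch below derives per-event weights from the ratio of two-dimensional histograms so that a "source" generator sample mimics a "target" one; all distributions and parameters are invented toy stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for two kinematic variables from two event generators
source = rng.normal(loc=0.0, scale=1.0, size=(5000, 2))  # nominal generator
target = rng.normal(loc=0.3, scale=1.2, size=(5000, 2))  # alternative generator

# Bin both samples on the same 2D grid
hs, xe, ye = np.histogram2d(source[:, 0], source[:, 1], bins=8, range=[[-5, 5], [-5, 5]])
ht, _, _ = np.histogram2d(target[:, 0], target[:, 1], bins=8, range=[[-5, 5], [-5, 5]])

# Per-bin weight = target density / source density (1.0 where source is empty)
ratio = np.divide(ht, hs, out=np.ones_like(ht), where=hs > 0)

# Look up each source event's bin and assign its weight
ix = np.clip(np.digitize(source[:, 0], xe) - 1, 0, 7)
iy = np.clip(np.digitize(source[:, 1], ye) - 1, 0, 7)
weights = ratio[ix, iy]

# The reweighted source mean should move towards the target mean (~0.3)
print(source[:, 0].mean(), np.average(source[:, 0], weights=weights))
```

In practice multivariate reweighting replaces the histogram ratio with a classifier or boosted-decision-tree estimate of the density ratio, which scales to many variables where binning does not.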
Meaning-Making Practices of Emergent Arabic–English Bilingual Kindergarten Children in Cairo
The number of British Schools in the Middle East and North Africa (MENA) region is growing. The National Curriculum of England is used by an increasing number of such schools. As well as exporting a culturally-specific curriculum, these schools usually adopt an ideology of monolingualism, thus potentially limiting communication for emergent bilinguals and failing to acknowledge the multiple ways of meaning-making.
Current studies of translanguaging are moving the focus to multimodal forms of communication as a resource for thinking and communicating (García and Wei 2014, Wei 2018). Building on the work of Kress (1997, 2010) I explore pre-school emergent bilinguals’ wider signifying practices and create an analytical framework, which I call MMTL (multimodal translanguaging), used as a lens to illustrate meaning-making.
Valley Hill in Cairo, Egypt is a British school which encourages ‘English-only’ as the medium of instruction in the kindergarten. Using a case study methodology, this research explores the meaning-making practices of eight emergent bilingual children aged 3–4 during child-initiated play, later reduced to four in the thesis to provide a detailed multimodal analysis. The principal aim is to explore their speech, gaze, gesture, and their engagement (layout/position) with artefacts during play.
The findings of this study suggest that although there is an ‘English-only’ approach, these young emergent bilingual children are meaning-making in a variety of ways. Children are translanguaging, but never in isolation from other modes of communication. Emergent bilinguals use a range of modes to mediate their understanding and communication with others. They use gesture, gaze, and artefacts alongside translingual practices to move meaning across to more accessible modes, enabling communication and understanding. The implication is that schools should embrace such hybrid practices and that teachers should be more responsive to young children’s meaning-making to enable learning.
Structure and adsorption properties of gas-ionic liquid interfaces
Supported ionic liquids are a diverse class of materials that have been considered
as a promising approach to design new surface properties within solids for gas
adsorption and separation applications. In these materials, the surface morphology and
composition of a porous solid are modified by depositing an ionic liquid. The resulting
materials exhibit a unique combination of structural and gas adsorption properties
arising from both components: the support and the liquid. Naturally, theoretical and
experimental studies devoted to understanding the underlying principles of exhibited
interfacial properties have been an intense area of research. However, a complete
understanding of the interplay between interfacial gas-liquid and liquid-solid
interactions as well as molecular details of these processes remains elusive.
This problem is challenging, and in this thesis it is approached from
two different perspectives applying computational and experimental techniques. In
particular, molecular dynamics simulations are used to model gas adsorption in films
of ionic liquids on a molecular level. A detailed description of the modeled systems is
possible if the interfacial and bulk properties of ionic liquid films are separated. In this
study, we use a unique method that recognizes the interfacial and bulk structures of
ionic liquids and distinguishes gas adsorption from gas solubility. By combining
classical nitrogen sorption experiments with a mean-field theory, we study how liquid-solid interactions influence the adsorption of ionic liquids on the surface of the porous
support.
The developed approach was applied to a range of ionic liquids that feature
different interaction behavior with gas and porous support. Using molecular
simulations with interfacial analysis, it was discovered that gas adsorption capacity
can be directly related to gas solubility data, allowing the development of a predictive
model for the gas adsorption performance of ionic liquid films. Furthermore, it was
found that CO2 adsorption on the surface of ionic liquid films is determined by the
specific arrangement of cations and anions on the surface. A particularly important
result is that, for the first time, a quantitative relation between these structural and
adsorption properties of different ionic liquid films has been established. This link
between two types of properties determines design principles for supported ionic
liquids.
However, the proposed predictive model and design principles rely on the
assumption that the ionic liquid is uniformly distributed on the surface of the porous
support. To test how ionic liquids behave under confinement, nitrogen physisorption
experiments were conducted for micro‐ and mesopore analysis of supported ionic
liquid materials. In conjunction with mean-field density functional theory applied to
the lattice gas and pore models, we revealed different scenarios for the pore-filling
mechanism depending on the strength of the liquid-solid interactions.
In this thesis, a combination of computational and experimental studies provides
a framework for the characterization of complex interfacial gas-liquid and liquid-solid
processes. It is shown that interfacial analysis is a powerful tool for studying
molecular-level interactions between different phases. Finally, nitrogen sorption
experiments were effectively used to obtain information on the structure of supported
ionic liquids.
How to Be a God
When it comes to questions concerning the nature of Reality, philosophers and theologians have the answers.
Philosophers have the answers that can’t be proven right. Theologians have the answers that can’t be proven wrong.
Today’s designers of Massively-Multiplayer Online Role-Playing Games create realities for a living. They can’t spend centuries mulling over the issues: they have to face them head-on. Their practical experiences can indicate which theoretical proposals actually work in practice.
That’s today’s designers. Tomorrow’s will have a whole new set of questions to answer.
The designers of virtual worlds are the literal gods of those realities. Suppose Artificial Intelligence comes through and allows us to create non-player characters as smart as us. What are our responsibilities as gods? How should we, as gods, conduct ourselves?
How should we be gods?
Studies of strategic performance management for classical organizations: theory & practice
Nowadays, the activities of "performance management" have spread very broadly into virtually every part of business and management. Numerous practitioners and researchers from very different disciplines are involved in exploring the different contents of performance management. In this thesis, some relevant historic developments in performance management are first reviewed, including various theories and frameworks of performance management. Then several management science techniques are developed for assessing performance management, including new methods in Data Envelopment Analysis (DEA) and Soft Systems Methodology (SSM). A theoretical framework for performance management and its practical procedures (five phases) are developed for "classic" organizations using soft systems thinking, and the relationship with the existing theories is explored. Eventually these results are applied in three case studies to verify the theoretical development. One of the main contributions of this work is to point out, and to systematically explore, the basic idea that the effective forms and structures of performance management for an organization are likely to depend greatly on the organizational configuration, so that performance management coordinates well with other management activities in the organization. This dependence has seemingly been neglected in the existing literature of performance management research, in the sense that little known research associates particular forms of performance management with explicit assumptions about organizational configuration. By applying SSM, this thesis logically derives some main functional blocks of performance management in "classic" organizations and clarifies the relationships between performance management and other management activities.
Furthermore, it develops some new tools and procedures, which can hierarchically decompose organizational strategies and produce a practical model of specific implementation steps for "classic" organizations. This approach integrates popular types of performance management models. Last but not least, this thesis presents findings from three major cases, which are quite different organizations in terms of management styles, ownership, and operating environment, to illustrate the flexibility of the developed theoretical framework.
Anytime algorithms for ROBDD symmetry detection and approximation
Reduced Ordered Binary Decision Diagrams (ROBDDs) provide a dense and memory-efficient representation of Boolean functions. When ROBDDs are applied in logic synthesis, the problem arises of detecting both classical and generalised symmetries. The state of the art in symmetry detection is represented by Mishchenko's algorithm. Mishchenko showed how to detect symmetries in ROBDDs without the need for checking equivalence of all co-factor pairs. This work resulted in a practical algorithm for detecting all classical symmetries in an ROBDD in O(|G|³) set operations, where |G| is the number of nodes in the ROBDD. Mishchenko and his colleagues subsequently extended the algorithm to find generalised symmetries. The extended algorithm retains the same asymptotic complexity for each type of generalised symmetry. Both the classical and generalised symmetry detection algorithms are monolithic in the sense that they only return a meaningful answer when they are left to run to completion. In this thesis we present efficient anytime algorithms for detecting both classical and generalised symmetries, which output pairs of symmetric variables until a prescribed time bound is exceeded. These anytime algorithms are complete in that, given sufficient time, they are guaranteed to find all symmetric pairs. Theoretically these algorithms run in O(n³+n|G|+|G|³) and O(n³+n²|G|+|G|³) respectively, where n is the number of variables, so that in practice the advantage of anytime generality is not gained at the expense of efficiency. In fact, the anytime approach requires only very modest data structure support and offers unique opportunities for optimisation, so the resulting algorithms are very efficient. The thesis continues by considering another class of anytime algorithms for ROBDDs that is motivated by the dearth of work on approximating ROBDDs. The need for approximation arises because many ROBDD operations result in an ROBDD whose size is quadratic in the size of the inputs.
Furthermore, if ROBDDs are used in abstract interpretation, the running time of the analysis is related not only to the complexity of the individual ROBDD operations but also to the number of operations applied. The number of operations is, in turn, constrained by the number of times a Boolean function can be weakened before stability is achieved. This thesis proposes a widening that can be used both to constrain the size of an ROBDD and to ensure that the number of times it is weakened is bounded by some given constant. The widening can be used to systematically approximate an ROBDD either from above (i.e. derive a weaker function) or from below (i.e. infer a stronger function). The thesis also considers how randomised techniques may be deployed to improve the speed of computing an approximation by avoiding potentially expensive ROBDD manipulation.
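The anytime idea can be sketched in miniature: a hypothetical generator-based search that checks variable pairs for classical symmetry and yields each confirmed pair immediately, so the caller can cut it off at a time bound while keeping all results found so far. For self-containment the Boolean function is represented as a truth-table predicate rather than an ROBDD, so this sketch has exponential cost in n and only illustrates the anytime control structure, not Mishchenko-style co-factor reasoning.

```python
import itertools
import time

def is_symmetric(f, n, i, j):
    """f is classically symmetric in (x_i, x_j) iff swapping them never changes f."""
    for bits in itertools.product([0, 1], repeat=n):
        swapped = list(bits)
        swapped[i], swapped[j] = swapped[j], swapped[i]
        if f(bits) != f(tuple(swapped)):
            return False
    return True

def anytime_symmetric_pairs(f, n, deadline=None):
    """Yield symmetric variable pairs as they are confirmed; stop at the deadline.
    Run to completion, the search is complete: every symmetric pair is found."""
    for i, j in itertools.combinations(range(n), 2):
        if deadline is not None and time.monotonic() > deadline:
            return  # time bound exceeded; pairs already yielded remain valid
        if is_symmetric(f, n, i, j):
            yield (i, j)

# Example: x0*x1 + x2 is symmetric in (x0, x1) only
f = lambda b: (b[0] and b[1]) or b[2]
pairs = list(anytime_symmetric_pairs(f, 3))
print(pairs)  # [(0, 1)]
```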
Privacy-aware Smart Home Interface Framework
Smart home user interfaces are pervasive and shared by multiple users who occupy the space. Therefore, they pose a risk to interpersonal privacy of occupants because an individual’s sensitive information can be leaked to other co-occupants (information privacy), or they can be disturbed by intrusions into their personal space (physical privacy) when the co-occupant interacts with the smart home user interfaces. This thesis hypothesises that interpersonal privacy violations can be mitigated by adapting the user interface layer and presents insights into how to achieve usable user interface adaptation to mitigate or minimise interpersonal privacy violations in smart homes.
The thesis reports two case studies and two user studies. The first case study identifies the key characteristics needed to model the rich context of interpersonal privacy violations scenarios. Then it presents knowledge representation models that are required to represent the identified characteristics and evaluates them for adequacy in modelling the context information of interpersonal privacy violation scenarios. The second case study presents a software architecture and a set of algorithms that can detect interpersonal privacy violations and generate usable user interface adaptations. Then it evaluates the architecture and the algorithms for adequacy in generating usable privacy-aware user interface adaptations. The first user study (N=15) evaluates the usability of the adaptive user interfaces generated from the framework where storyboards were used as the stimulant. Extending the findings from the usability study and expanding the coverage of example scenarios, the second user study (N=23) evaluates the overall user experience of the adaptive user interfaces, using video prototypes as the stimulant.
The research demonstrates that the characteristics identified, and the respective knowledge representation models, adequately captured the context of interpersonal privacy violation scenarios. Furthermore, the software architecture and the algorithms could detect possible interpersonal privacy violations and generate usable user interface adaptations to mitigate them. The two user studies demonstrate that the adaptive user interfaces, when used in appropriate situations, were a suitable solution for addressing interpersonal privacy violations while providing high usability and a positive user experience. The thesis concludes by providing recommendations for developing privacy-aware user interface adaptations and suggesting future work that can extend this research.
Adaptive task selection using threshold-based techniques in dynamic sensor networks
Sensor nodes, like many social insect species, exist in harsh environments in large groups, yet possess very limited resources. Lasting as long as possible and fulfilling the network's purposes are the ultimate goals of sensor networks. However, these goals are inherently contradictory. Nature can be a great source of inspiration for mankind in finding methods to achieve both extended survival and effective operation. This work aims at applying the threshold-based action selection mechanisms inspired by insect societies to perform action selection within sensor nodes. The effect of this micro-model on the macro-behaviour of the network is studied in terms of durability and task performance quality. Generally, this is an example of using bio-inspiration to achieve adaptivity in sensor networks.
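A common fixed-threshold response rule from the insect-society literature (Bonabeau et al.) can illustrate the mechanism; the quadratic form and the parameter values below are illustrative assumptions, not necessarily the thesis's exact model.

```python
def engage_probability(stimulus, threshold):
    """Response-threshold rule: a node takes on a task with probability
    s^2 / (s^2 + theta^2), where s is the task stimulus intensity and
    theta is the node's response threshold for that task."""
    return stimulus**2 / (stimulus**2 + threshold**2)

# A low-threshold node responds readily even to weak stimuli; a
# high-threshold node waits until the stimulus grows strong.
print(engage_probability(5.0, 5.0))   # 0.5 at stimulus == threshold
print(engage_probability(10.0, 5.0))  # 0.8
```

Heterogeneous thresholds across nodes give a simple division of labour: only the most responsive nodes act on weak stimuli, conserving the energy of the rest.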
Examining the Potential for Isotope Analyses of Carbon, Nitrogen, and Sulphur in Burned Bone from Experimental and Archaeological Contexts.
The aim of this project was to determine whether isotope analyses of carbon, nitrogen and sulphur can be conducted on collagen extracted from burned bone. This project was conducted in two phases: a controlled heating experiment and an archaeological application. The controlled heating experiment used cow (Bos taurus) bone to test the temperature thresholds for the conservation of δ13C, δ15N, and δ34S values. These samples were also used to test the efficacy of Fourier Transform Infrared spectroscopy (FTIR) and colour analysis for determining the burning intensities experienced by bone burned in unknown conditions.
The experiment showed that δ13C values were relatively unchanged up to 400°C (<2‰ variation), while δ15N values were relatively stable up to 200°C (0.5‰ variation). Values of δ34S were also relatively stable up to 200°C (1.4‰ variation). Colour change and FTIR data were well correlated with the change in isotope ratios. Models estimating burning intensities were created from the FTIR data.
For the archaeological application, samples were selected from two early Anglo-Saxon cemetery sites: Elsham and Cleatham. Samples were selected from both inhumed and cremated individuals. Among the inhumed individuals δ13C values suggested a C3 terrestrial diet and δ15N values suggested protein derived largely from terrestrial herbivores, as expected for the early Anglo-Saxon period. However, δ34S values suggested the consumption of freshwater resources and that this consumption was related to both the age and sex of the individual.
The experimental data show that there is potential for isotope analyses of cremated remains, since heat exposure during cremation is not uniform across the body. The samples selected for the archaeological application, however, were not successful. Bone samples heated in controlled conditions produced viable collagen for isotope analysis; however, there are several differences between experiments conducted in a muffle furnace and open-air pyre cremation that need to be investigated further. Additionally, the influence of taphonomy on collagen survival in burned bone needs to be quantified. Finally, methods of sample selection need to be improved to find bone samples from archaeologically cremated remains that are most likely to retain viable collagen. While significant research must be conducted before these methods can be widely applied, there are a multitude of cultures throughout history and around the world that practised cremation and could be investigated through the analyses proposed in this project.