
    Modelling uncertainties for measurements of the H → γγ Channel with the ATLAS Detector at the LHC

    The Higgs boson to diphoton (H → γγ) branching ratio is only 0.227 %, but this final state has yielded some of the most precise measurements of the particle. As measurements of the Higgs boson become increasingly precise, greater importance is placed on the factors that constitute the uncertainty. Reducing the effects of these uncertainties requires an understanding of their causes. The research presented in this thesis aims to illuminate how uncertainties on simulation modelling are determined and proffers novel techniques for deriving them. The upgrade of the FastCaloSim tool, used for simulating events in the ATLAS calorimeter at a rate far exceeding that of the nominal detector simulation, Geant4, is described. The integration of a method that allows the toolbox to emulate the accordion geometry of the liquid argon calorimeters is detailed. This tool allows for the production of larger samples while using significantly fewer computing resources. A measurement of the total Higgs boson production cross-section multiplied by the diphoton branching ratio (σ × Bγγ) is presented, where this value was determined to be (σ × Bγγ)_obs = 127 ± 7 (stat.) ± 7 (syst.) fb, in agreement with the Standard Model prediction. The signal and background shape modelling is described, and the contribution of the background modelling uncertainty to the total uncertainty ranges from 2.4 % to 18 %, depending on the Higgs boson production mechanism. A method for estimating the number of events in a Monte Carlo background sample required to model the shape is detailed. It was found that the nominal γγ background sample required a multiplicative increase by a factor of 3.60 to adequately model the background at a confidence level of 68 %, or by a factor of 7.20 at a confidence level of 95 %. Based on this estimate, 0.5 billion additional simulated events were produced, substantially reducing the background modelling uncertainty. A technique is detailed for emulating the effects of Monte Carlo event generator differences using multivariate reweighting. The technique is used to estimate the event generator uncertainty on the signal modelling of tHqb events, improving the reliability of the estimated tHqb production cross-section. This multivariate reweighting technique is then used to estimate the generator modelling uncertainties on background V γγ samples for the first time. The estimated uncertainties were found to be covered by the currently assumed background modelling uncertainty.
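    As a rough illustration of the multivariate reweighting mentioned above, the sketch below uses the common classifier-based approach: a classifier is trained to separate events from two generators, and its score is converted into per-event weights w(x) ~ s/(1 - s). All observables, sample sizes and model settings are synthetic placeholders, not the analysis configuration used in the thesis.

```python
# A minimal sketch of classifier-based multivariate reweighting, assuming the
# goal is to reweight events from a nominal generator so that the joint
# distribution of several observables matches an alternative generator.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Toy "events": rows are events, columns are kinematic observables
# (e.g. pT, rapidity, jet multiplicity) -- purely illustrative.
nominal = rng.normal(loc=[0.0, 1.0, 2.0], scale=1.0, size=(20000, 3))
alternative = rng.normal(loc=[0.2, 1.1, 1.9], scale=1.1, size=(20000, 3))

# Train a classifier to separate the two generators in observable space.
X = np.vstack([nominal, alternative])
y = np.concatenate([np.zeros(len(nominal)), np.ones(len(alternative))])
clf = GradientBoostingClassifier(max_depth=3, n_estimators=200)
clf.fit(X, y)

# Convert the classifier score s(x) into per-event weights
# w(x) = p_alt(x) / p_nom(x) ~ s / (1 - s), applied to the nominal sample.
s = clf.predict_proba(nominal)[:, 1]
weights = s / (1.0 - s)
weights *= len(nominal) / weights.sum()   # preserve the overall normalisation

print("reweighted observable means:", np.average(nominal, axis=0, weights=weights))
print("alternative observable means:", alternative.mean(axis=0))
```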

    Visualisation of Fundamental Movement Skills (FMS): An Iterative Process Using an Overarm Throw

    Fundamental Movement Skills (FMS) are precursor gross motor skills to more complex or specialised skills and are recognised as important indicators of physical competence, a key component of physical literacy. FMS are predominantly assessed using pre-defined manual methodologies, most commonly the various iterations of the Test of Gross Motor Development. However, such assessments are time-consuming and often require a basic level of training to conduct. Therefore, the overall aim of this thesis was to utilise accelerometry to develop a visualisation concept, as part of a feasibility study, to support the learning and assessment of FMS by reducing subjectivity and the overall time taken to conduct a gross motor skill assessment. The overarm throw, an important fundamental movement skill, was specifically selected for the visualisation development as it is an acyclic movement with a distinct initiation and conclusion. Thirteen children (14.8 ± 0.3 years; 9 boys) wore an ActiGraph GT9X Link Inertial Measurement Unit device on the dominant wrist whilst performing a series of overarm throws. This thesis illustrates how the visualisation concept was developed using raw accelerometer data, which was processed and manipulated using MATLAB 2019b software to obtain and depict key throw performance data, including the trajectory and velocity of the wrist during the throw. Overall, this thesis found that the developed visualisation concept can provide strong indicators of throw competency based on the shape of the throw trajectory. Future research should seek to utilise a larger, more diverse population and to incorporate machine learning. Finally, further work is required to translate this concept to other gross motor skills.
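    As a rough Python analogue of the processing described above (the thesis itself used MATLAB 2019b), the sketch below integrates synthetic tri-axial accelerometer samples to obtain wrist velocity and an approximate trajectory. The sampling rate, the crude gravity handling and the absence of orientation and drift correction are simplifying assumptions for illustration only.

```python
# A minimal sketch: integrate raw tri-axial acceleration to velocity and
# displacement to approximate the wrist trajectory during an overarm throw.
import numpy as np
from scipy.integrate import cumulative_trapezoid

fs = 100.0                        # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)   # a 2 s window around the throw

# Synthetic acceleration (m/s^2) standing in for ActiGraph GT9X Link output.
acc = np.column_stack([
    5.0 * np.sin(2 * np.pi * 1.5 * t),
    2.0 * np.cos(2 * np.pi * 1.5 * t),
    9.81 + 0.5 * np.sin(2 * np.pi * 3.0 * t),
])
acc[:, 2] -= 9.81                 # crude gravity removal on the vertical axis

# Integrate acceleration to velocity, then velocity to displacement.
vel = cumulative_trapezoid(acc, t, axis=0, initial=0)
pos = cumulative_trapezoid(vel, t, axis=0, initial=0)

peak_speed = np.max(np.linalg.norm(vel, axis=1))
print(f"peak wrist speed: {peak_speed:.2f} m/s")
print("displacement range (x, y, z):", pos.max(axis=0) - pos.min(axis=0))
```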

    Image classification over unknown and anomalous domains

    A longstanding goal in computer vision research is to develop methods that are simultaneously applicable to a broad range of prediction problems. In contrast to this, models often perform best when they are specialized to some task or data type. This thesis investigates the challenges of learning models that generalize well over multiple unknown or anomalous modes and domains in data, and presents new solutions for learning robustly in this setting. Initial investigations focus on normalization for distributions that contain multiple sources (e.g. images in different styles like cartoons or photos). Experiments demonstrate the extent to which existing modules, batch normalization in particular, struggle with such heterogeneous data, and a new solution is proposed that can better handle data from multiple visual modes, using differing sample statistics for each. While ideas to counter the overspecialization of models have been formulated in sub-disciplines of transfer learning, e.g. multi-domain and multi-task learning, these usually rely on the existence of meta information, such as task or domain labels. Relaxing this assumption gives rise to a new transfer learning setting, called latent domain learning in this thesis, in which training and inference are carried out over data from multiple visual domains, without domain-level annotations. Customized solutions are required for this, as the performance of standard models degrades: a new data augmentation technique that interpolates between latent domains in an unsupervised way is presented, alongside a dedicated module that sparsely accounts for hidden domains in data, without requiring domain labels to do so. In addition, the thesis studies the problem of classifying previously unseen or anomalous modes in data, a fundamental problem in one-class learning, and anomaly detection in particular. While recent ideas have focused on developing self-supervised solutions for the one-class setting, in this thesis new methods based on transfer learning are formulated. Extensive experimental evidence demonstrates that a transfer-based perspective benefits new problems recently proposed in the anomaly detection literature, in particular challenging semantic detection tasks.
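    To make the idea of differing sample statistics per visual mode concrete, the sketch below contrasts normalization with a single set of batch statistics against normalization with per-mode statistics on synthetic two-mode data. The use of k-means to infer the modes, and all numbers involved, are illustrative assumptions rather than the module proposed in the thesis.

```python
# A minimal sketch: global batch statistics vs. per-mode statistics for
# features drawn from two latent visual "modes" without domain labels.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Features from two latent domains with different statistics
# (e.g. photos vs. cartoons), concatenated without domain labels.
photos = rng.normal(loc=0.0, scale=1.0, size=(500, 16))
cartoons = rng.normal(loc=3.0, scale=0.3, size=(500, 16))
features = np.vstack([photos, cartoons])

# Standard batch normalization: one mean/variance for the mixed batch.
global_norm = (features - features.mean(0)) / features.std(0)

# Mode-wise normalization: infer latent modes, then normalize each sample
# with the statistics of its own mode.
modes = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
mode_norm = np.empty_like(features)
for m in np.unique(modes):
    sel = modes == m
    mode_norm[sel] = (features[sel] - features[sel].mean(0)) / features[sel].std(0)

# The mixed-batch scheme leaves a residual shift between modes; the
# mode-wise scheme centres both modes at zero.
for m in (0, 1):
    print(m, round(global_norm[modes == m].mean(), 3), round(mode_norm[modes == m].mean(), 3))
```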

    Marvellous real in the Middle East: a comparative study of magical realism in contemporary women’s fiction

    Magical realism has been studied extensively in relation to Latin America and subsequently in other parts of the world, yet the Middle East has not received adequate attention in academic scholarship. This PhD study examines a selection of contemporary female-authored narratives from the Middle East to establish an understanding of the practice of magical realism in this region. The selected texts for this study are: Raja Alem’s Fatma and My Thousand and One Nights; Shahrnush Parsipur’s Women Without Men and Touba and the Meaning of Night; Elif Shafak’s The Bastard of Istanbul; and Gina B. Nahai’s Moonlight on the Avenue of Faith. This study firstly explores the concept of magical realism as a mode of writing and determines its relationship to the Middle Eastern context. It then evaluates the texts under scrutiny by examining how the narrative of magical realism is constructed and what the sources of the magical component in these texts are, specifically in relation to Middle Eastern mythology. It also investigates the ideological aspect behind the employment of magical realism and whether it serves any political goal. The analysis of the selected texts is approached from three standpoints, that is, from literary, mythological and ideological perspectives. I argue that magical realism serves various purposes and that it is applied from perspectives that can be regarded as marginal to their communities’ dominant values, to subvert mainstream ideology. I also demonstrate that the Middle East is a crucial place in which to investigate magical realism because of the numerous complex cultural values that interact with each other in this region, and which enrich the practice of magical realism.

    Underwater optical wireless communications in turbulent conditions: from simulation to experimentation

    Underwater optical wireless communication (UOWC) is a technology that aims to apply high speed optical wireless communication (OWC) techniques to the underwater channel. UOWC has the potential to provide high speed links over relatively short distances as part of a hybrid underwater network, along with radio frequency (RF) and underwater acoustic communications (UAC) technologies. However, there are some difficulties involved in developing a reliable UOWC link, namely the complexity of the channel. The main focus throughout this thesis is to develop a greater understanding of the effects of the UOWC channel, especially underwater turbulence. This understanding is developed from basic theory through to simulation and experimental studies in order to gain a holistic understanding of turbulence in the UOWC channel. This thesis first presents a method of modelling optical underwater turbulence through simulation that allows it to be examined in conjunction with absorption and scattering. In a stationary channel, this turbulence-induced scattering is shown to cause an increase in both spatial and temporal spreading at the receiver plane. Using the presented technique, it is also demonstrated that the relative impact of turbulence on a received signal is lower in a highly scattering channel, showing an in-built resilience of these channels. Received intensity distributions are presented, confirming that fluctuations in received power from this method follow the commonly used Log-Normal fading model. The impact of turbulence, as measured using this new modelling framework, on link performance, in terms of maximum achievable data rate and bit error rate, is also investigated. Following that, experimental studies are presented comparing both the relative impact of turbulence-induced scattering on coherent and non-coherent light propagating through water and the relative impact of turbulence in different water conditions. It is shown that the scintillation index increases with increasing temperature inhomogeneity in the underwater channel. These results indicate that a light beam from a non-coherent source has a greater resilience to temperature-inhomogeneity-induced turbulence effects in an underwater channel. These results will help researchers to simulate realistic channel conditions when modelling a light emitting diode (LED) based intensity modulation with direct detection (IM/DD) UOWC link. Finally, a comparison of different modulation schemes in still and turbulent water conditions is presented. Using an underwater channel emulator, it is shown that pulse position modulation (PPM) and subcarrier intensity modulation (SIM) have an inherent resilience to turbulence-induced fading, with SIM achieving higher data rates under all conditions. The signal processing technique termed pair-wise coding (PWC) is applied to SIM in underwater optical wireless communications for the first time. The performance of PWC is compared with the state-of-the-art bit- and power-loading optimisation algorithm. Using PWC, a maximum data rate of 5.2 Gbps is achieved in still water conditions.
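    As a small illustration of the Log-Normal fading model and the scintillation index referred to above, the sketch below draws synthetic received-power samples and computes the scintillation index. The chosen log-amplitude standard deviation is an arbitrary illustrative value, not a measured channel parameter.

```python
# A minimal sketch of Log-Normal fading and the scintillation index.
import numpy as np

rng = np.random.default_rng(2)

sigma_x = 0.2                         # assumed log-amplitude standard deviation
# Log-Normal fading: I = I0 * exp(2X), with X ~ N(-sigma_x^2, sigma_x^2)
# so that the mean received intensity equals I0 (energy conservation).
X = rng.normal(loc=-sigma_x**2, scale=sigma_x, size=100_000)
I = 1.0 * np.exp(2.0 * X)

# Scintillation index: normalised variance of the received intensity.
scint_index = I.var() / I.mean() ** 2
print(f"simulated scintillation index: {scint_index:.4f}")
print(f"theoretical value exp(4*sigma_x^2) - 1: {np.exp(4 * sigma_x**2) - 1:.4f}")
```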

    An investigation of the geothermal potential of the Upper Devonian sandstones beneath eastern Glasgow

    The urban development of the city of Glasgow is a consequence of its economic development, fuelled in part by local coalfields that exploited rocks in the same sedimentary basin in which geothermal resources, in flooded abandoned mine workings and in deeper hot sedimentary aquifers (HSA), are present. This creates an opportunity to provide geothermal heating to areas of dense urban population with high heat demand. The depth of the target HSA geothermal resource, in Upper Devonian sandstones of the Stratheden Group beneath eastern Glasgow, was determined by gravity surveying and structural geological modelling. The estimated depth of the geothermal resource ranged from c. 1500-2000 m in the eastward-deepening sedimentary basin. To reliably estimate the temperature of the geothermal resource, rigorous corrections to account for the effects of palaeoclimate and topography on heat flow were applied to boreholes in the Greater Glasgow area. The mean regional corrected heat flow was calculated as 75.7 mW m⁻², an increase of 13.8 mW m⁻² from the uncorrected value of 61.9 mW m⁻², emphasising the extent to which heat flow was previously underestimated. Extrapolation of the geothermal gradient, calculated from the mean regional corrected heat flow, results in aquifer temperatures of c. 64-79 °C at depths of c. 1500-2000 m beneath eastern Glasgow. The geothermal resource may, therefore, be capable of supporting a wide variety of direct heat use applications if sufficient matrix permeability or fracture networks are present. However, diagenetic effects such as quartz and carbonate cementation were found to restrict the porosity in Upper Devonian sandstones in a borehole and outcrop analogue study. These effects may likewise reduce porosity and intergranular permeability in the target aquifer, although this crucial aspect cannot be fully understood without deep exploratory drilling. To quantify the magnitude of the deep geothermal resource, the indicative thermal power outputs of geothermal doublet wells located in Glasgow’s East End were calculated for the first time, with outputs ranging from 1.3 to 2.1 MW depending upon the aquifer depth. This, however, is predicated upon an aquifer permeability of c. 40 mD, which, if reduced to 10 mD or less by the effects of diagenesis, significantly reduces the thermal power outputs to 230-390 kW. The lack of assured project success, given uncertainties related to the aquifer properties at depth, coupled with the high capital costs of drilling, poses barriers to the development of deep geothermal energy in Glasgow. Further investigation of the economic viability of geothermal exploration, and of alternative technological solutions, is therefore required to mitigate the technical and economic risks. However, if sufficient matrix permeability or fracture networks are present at depth in the Upper Devonian sandstone sequence, then the potential contribution that geothermal energy could make to meeting local heat demand, reducing greenhouse gas emissions, and addressing the ‘energy trilemma’ in Glasgow is significant.
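    As a back-of-the-envelope illustration of the geothermal-gradient extrapolation described above, the sketch below applies steady-state conduction, T(z) = T_surface + (q/k) z, using the quoted corrected heat flow. The thermal conductivity and mean surface temperature are assumed round values rather than the thesis parameters, so the result only roughly approaches the quoted c. 64-79 °C range.

```python
# A minimal sketch: extrapolate aquifer temperature from corrected heat flow.
heat_flow = 75.7e-3     # corrected regional heat flow, W m^-2 (from the text)
conductivity = 2.2      # assumed bulk thermal conductivity, W m^-1 K^-1
surface_temp = 9.0      # assumed mean annual surface temperature, degC

gradient = heat_flow / conductivity            # degC per metre
for depth in (1500.0, 2000.0):                 # target aquifer depths, m
    temp = surface_temp + gradient * depth
    print(f"{depth:.0f} m: ~{temp:.0f} degC (gradient {gradient * 1000:.1f} degC/km)")
```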

    Recursive Singular Spectrum Analysis for Induction Machines Unbalanced Rotor Fault Diagnosis

    One of the major challenges of diagnosing rotor symmetry faults in induction machines is the severe modulation of the fault and supply frequency components. In particular, existing techniques are not able to identify fault components at low slip. In this paper, this problem is tackled by proposing a novel approach. First, a new use of singular spectrum analysis (SSA), as a powerful spectrum analyser, is introduced for fault detection. Our idea is to treat the stator current signature of the wound rotor induction machine as a time series. In this approach, the current signature is decomposed into several eigenvalue spectra (rather than frequency spectra) to find a subspace where the fault component is recognisable. Subsequently, the fault component is detected using data-driven filters constructed from knowledge of the characteristics of the supply and fault components. Then, an inexpensive peak localisation procedure is applied to the power spectrum of the fault component to identify the exact frequency of the fault. The fault detection and localisation methods are then combined in a recursive regime to further improve diagnostic performance, particularly at high rotor speeds and for small rotor faults. The proposed approach is data-driven, is applied directly to the raw signal with no suppression or filtering of frequency harmonics, and has low computational complexity. The numerical results, obtained with real data at several rotation speeds and fault severities, demonstrate the effectiveness and real-time capability of the proposed approach.
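    The sketch below illustrates the basic SSA decomposition that underlies the approach: the signal is embedded in a trajectory matrix, decomposed by SVD into an eigenvalue spectrum, and the dominant supply component is reconstructed by diagonal averaging. The window length, frequencies, amplitudes and noise level are illustrative choices, and the recursive detection and localisation scheme of the paper is not reproduced here.

```python
# A minimal sketch of singular spectrum analysis (SSA) on a synthetic
# stator-current-like signal: 50 Hz supply plus a weak low-frequency
# "fault" component and noise.
import numpy as np

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(3)
signal = (np.sin(2 * np.pi * 50 * t)
          + 0.05 * np.sin(2 * np.pi * 2 * t)
          + 0.01 * rng.normal(size=t.size))

L = 200                                   # embedding window length
K = signal.size - L + 1
trajectory = np.column_stack([signal[i:i + L] for i in range(K)])

# Eigen-decomposition of the trajectory matrix via SVD.
U, s, Vt = np.linalg.svd(trajectory, full_matrices=False)

def reconstruct(indices):
    """Diagonal-average the selected elementary matrices back into a series."""
    X = (U[:, indices] * s[indices]) @ Vt[indices, :]
    series = np.zeros(signal.size)
    counts = np.zeros(signal.size)
    for i in range(L):
        series[i:i + K] += X[i, :]
        counts[i:i + K] += 1
    return series / counts

supply = reconstruct([0, 1])              # dominant pair: the 50 Hz supply
residual = signal - supply                # subspace where the fault component lives
print("leading singular values:", np.round(s[:5], 2))
print("residual RMS (fault + noise):", round(float(np.sqrt(np.mean(residual ** 2))), 4))
```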

    The interpretation of Islam and nationalism by the elite through the English language media in Pakistan.

    The media is constructed and interpreted through what people 'know'. That knowledge is, for the most part, created through day to day experiences. In Pakistan, Islam and nationalism are two components of this social knowledge which are intrinsically tied to the experiences of the Pakistani people. Censorship and selection are means through which this knowledge is articulated and interpreted. General conceptions of partially shared large scale bodies of knowledge and ideas reinforce, and are reinforced by, the general medium of mass communication: the print and electronic media. Focusing on the government, media institutions and Pakistani elites, I describe and analyse the different, sometimes conflicting, interpretations of Islam and Pakistani nationalism manifest in and through media productions presented in Pakistan. The media means many things, not least of which is power. It is the media as a source of power that is so frequently controlled, directed and manipulated. The terminology may be slightly different according to the context within which one is talking - propaganda, selection, etc. - but ultimately it comes down to the same thing - censorship. Each of the three groups - government, media institutions and Pakistani elites - has the power to interpret and censor media content, and consideration must be taken of each of the other power holders, consequently restricting the power of each group in relation to the other two. The processes of this manipulation and their consequences form the major themes of this thesis.

    Investigating and mitigating the role of neutralisation techniques on information security policies violation in healthcare organisations

    Healthcare organisations today rely heavily on Electronic Medical Records systems (EMRs), which have become highly crucial IT assets that require significant security efforts to safeguard patients’ information. Individuals who have legitimate access to an organisation’s assets to perform their day-to-day duties but intentionally or unintentionally violate information security policies can jeopardise their organisation’s information security efforts and cause significant legal and financial losses. In the information security (InfoSec) literature, several studies emphasised the necessity to understand why employees behave in ways that contradict information security requirements but have offered widely different solutions. In an effort to respond to this situation, this thesis addressed the gap in the information security academic research by providing a deep understanding of the problem of medical practitioners’ behavioural justifications to violate information security policies and then determining proper solutions to reduce this undesirable behaviour. Neutralisation theory was used as the theoretical basis for the research. This thesis adopted a mixed-method research approach that comprises four consecutive phases, and each phase represents a research study that was conducted in light of the results from the preceding phase. The first phase of the thesis started by investigating the relationship between medical practitioners’ neutralisation techniques and their intention to violate information security policies that protect a patient’s privacy. A quantitative study was conducted to extend the work of Siponen and Vance [1] through a study of the Saudi Arabia healthcare industry. The data was collected via an online questionnaire from 66 Medical Interns (MIs) working in four academic hospitals. The study found that six neutralisation techniques—(1) appeal to higher loyalties, (2) defence of necessity, (3) the metaphor of ledger, (4) denial of responsibility, (5) denial of injury, and (6) condemnation of condemners—significantly contribute to the justifications of the MIs in hypothetically violating information security policies. The second phase of this research used a series of semi-structured interviews with IT security professionals in one of the largest academic hospitals in Saudi Arabia to explore the environmental factors that motivated the medical practitioners to evoke various neutralisation techniques. The results revealed that social, organisational, and emotional factors all stimulated the behavioural justifications to breach information security policies. During these interviews, it became clear that the IT department needed to ensure that security policies fit the daily tasks of the medical practitioners by providing alternative solutions to ensure the effectiveness of those policies. Based on these interviews, the objective of the following two phases was to improve the effectiveness of InfoSec policies against the use of behavioural justification by engaging the end users in the modification of existing policies via a collaborative writing process. Those two phases were conducted in the UK and Saudi Arabia to determine whether the collaborative writing process could produce a more effective security policy that balanced the security requirements with daily business needs, thus leading to a reduction in the use of neutralisation techniques to violate security policies. 
The overall result confirmed that the involvement of the end users via a collaborative writing process positively improved the effectiveness of the security policy in mitigating individual behavioural justifications, showing that the process is a promising way to enhance security compliance.