
    How is VR used to support training in industry? The INTUITION network of excellence working group on education and training

    INTUITION is the European Network of Excellence on virtual reality and virtual environments applications for future workspaces. The purpose of the network is to gather expertise from partner members and determine the future research agenda for the development and use of virtual reality (VR) technologies. The working group on Education and Training (WG2.9) is specifically focused on understanding how VR is being used to support learning in educational and industrial contexts. This paper presents four case examples of VR technology currently in use or in development for training in industry. Conclusions are drawn concerning the future development of VR training applications and the barriers that need to be overcome.

    Dutch disease-cum-financialization booms and external balance cycles in developing countries

    We formally investigate the medium-to-long-run dynamics emerging out of a Dutch disease-cum-financialization phenomenon. We take inspiration from the most recent Colombian development pattern. The “pure” Dutch disease first causes deindustrialization by permanently appreciating the economy’s exchange rate in the long run. Financialization, i.e. booming capital inflows taking place in a climate of natural resource-led financial over-optimism, causes medium-run exchange rate volatility and macroeconomic instability. This jeopardizes manufacturing development even further by raising macroeconomic uncertainty. We advise the adoption of capital controls and a developmentalist monetary policy to tackle these two distinct but often intertwined phenomena.

    Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector

    A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb−1 of proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.

    Jet size dependence of single jet suppression in lead-lead collisions at √s_NN = 2.76 TeV with the ATLAS detector at the LHC

    Measurements of inclusive jet suppression in heavy ion collisions at the LHC provide direct sensitivity to the physics of jet quenching. In a sample of lead-lead collisions at √s_NN = 2.76 TeV corresponding to an integrated luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the anti-kt algorithm with values R = 0.2, 0.3, 0.4 and 0.5 of the distance parameter that determines the nominal jet radius. The centrality dependence of the jet yield is characterized by the jet "central-to-peripheral ratio", Rcp. Jet production is found to be suppressed by approximately a factor of two in the 10% most central collisions relative to peripheral collisions. Rcp varies smoothly with centrality as characterized by the number of participating nucleons. The observed suppression is only weakly dependent on jet radius and transverse momentum. These results provide the first direct measurement of inclusive jet suppression in heavy ion collisions and complement previous measurements of dijet transverse energy imbalance at the LHC.
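
    The central-to-peripheral ratio quoted above is conventionally built from per-event jet yields scaled by the mean number of binary nucleon–nucleon collisions ⟚Ncoll⟩ in each centrality class; schematically (the paper's own binning and normalisation conventions may differ):

```latex
R_{CP} \;=\;
\frac{\dfrac{1}{\langle N_{\mathrm{coll}}^{\mathrm{cent}}\rangle}\,
      \dfrac{1}{N_{\mathrm{evt}}^{\mathrm{cent}}}\,
      \dfrac{dN_{\mathrm{jet}}^{\mathrm{cent}}}{dp_{T}}}
     {\dfrac{1}{\langle N_{\mathrm{coll}}^{\mathrm{periph}}\rangle}\,
      \dfrac{1}{N_{\mathrm{evt}}^{\mathrm{periph}}}\,
      \dfrac{dN_{\mathrm{jet}}^{\mathrm{periph}}}{dp_{T}}}
```

    An Rcp of unity would indicate no nuclear modification; the factor-of-two suppression reported above corresponds to Rcp ≈ 0.5 in the most central collisions.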

    Extreme events and predictability of catastrophic failure in composite materials and in the Earth

    Despite all attempts to isolate and predict extreme earthquakes, these nearly always occur without obvious warning in real time: fully deterministic earthquake prediction is very much a ‘black swan’. On the other hand, engineering-scale samples of rocks and other composite materials often show clear precursors to dynamic failure under controlled conditions in the laboratory, and successful evacuations have occurred before several volcanic eruptions. This may be because extreme earthquakes are not statistically special, being an emergent property of the process of dynamic rupture. Nevertheless, probabilistic forecasting of the event rate above a given size, based on the tendency of earthquakes to cluster in space and time, can have significant skill compared to, say, random failure, even in real-time mode. We address several questions in this debate, using examples from the Earth (earthquakes, volcanoes) and the laboratory, including the following. How can we identify ‘characteristic’ events, i.e. beyond the power law, in model selection (do dragon-kings exist)? How do we discriminate quantitatively between stationary and non-stationary hazard models (is a dragon likely to come soon)? Does the system size (the size of the dragon’s domain) matter? Are there localising signals of imminent catastrophic failure we may not be able to access (is the dragon effectively invisible on approach)? We focus on the effects of sampling and statistical uncertainty in the identification of extreme events and their predictability, and highlight the strong influence of scaling in space and time as an outstanding issue to be addressed by quantitative studies, experimentation and models.
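
    As an illustration of the model-selection question raised here, the sketch below fits a Gutenberg–Richter b-value to a synthetic catalogue by maximum likelihood (the Aki estimator) and asks whether the largest event is surprisingly large under the fitted power law, a crude stand-in for a dragon-king test. All numbers are invented for illustration; this is not the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Gutenberg-Richter catalogue: magnitudes above a completeness
# threshold Mc follow P(M > m) = 10**(-b * (m - Mc)).  b_true and n are invented.
Mc, b_true, n = 2.0, 1.0, 5000
mags = Mc + rng.exponential(scale=np.log10(np.e) / b_true, size=n)

# Aki maximum-likelihood estimate of the b-value.
b_hat = np.log10(np.e) / (mags.mean() - Mc)

# Crude "dragon-king" check: how surprising is the largest event under the
# fitted power law?  A small p suggests an event beyond the power-law tail.
m_max = mags.max()
tail = 10.0 ** (-b_hat * (m_max - Mc))   # per-event exceedance probability
p_value = 1.0 - (1.0 - tail) ** n        # chance any of n events exceeds m_max

print(f"b_hat = {b_hat:.3f}, M_max = {m_max:.2f}, p = {p_value:.3f}")
```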

    Thermal Conductivity of Methane-Hydrate

    The thermal conductivity of the methane hydrate CH4·5.75H2O was measured in the interval 2–140 K using the steady-state technique. The thermal conductivity of the homogeneous substance was calculated from the effective thermal conductivity measured in the experiment. The temperature dependence of the thermal conductivity is typical of amorphous solids. It is shown that after separation of the hydrate into ice and methane at 240 K, the thermal conductivity of the ice exhibits a dependence typical of a heavily deformed, fine-grained polycrystal. The reason for the glass-like behavior of the thermal conductivity of clathrate compounds is discussed. The experimental results can be interpreted within the phenomenological soft-potential model with two fitting parameters.
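
    The abstract does not reproduce the soft-potential expression, so the sketch below only illustrates the mechanics of a two-parameter fit to κ(T) data. The functional form, data points, and starting values are all invented placeholders, not the model or measurements from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder two-parameter model for a glass-like kappa(T); the actual
# soft-potential-model expression is given in the paper and is NOT this one.
def kappa_model(T, A, T0):
    return A * T**2 / (T**2 + T0**2)   # rises at low T, saturates at high T

# Hypothetical measurements (temperature in K, conductivity in W/(m K)).
T_data = np.array([2, 5, 10, 20, 40, 80, 140], dtype=float)
k_data = np.array([0.02, 0.10, 0.25, 0.38, 0.45, 0.48, 0.49])

popt, pcov = curve_fit(kappa_model, T_data, k_data, p0=[0.5, 10.0])
print(f"A = {popt[0]:.3f}, T0 = {popt[1]:.1f} K")
```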

    Towards a new image processing system at Wendelstein 7-X: From spatial calibration to characterization of thermal events

    Wendelstein 7-X (W7-X) is the most advanced fusion experiment in the stellarator line and is aimed at proving that the stellarator concept is suitable for a fusion reactor. One of the most important issues for fusion reactors is the monitoring of plasma-facing components when exposed to very high heat loads, through the use of visible and infrared (IR) cameras. In this paper, a new image processing system for the analysis of the strike lines on the inboard limiters from the first W7-X experimental campaign is presented. This system builds a model of the IR cameras through the use of spatial calibration techniques, helping to characterize the strike lines by using the real spatial coordinates of each pixel. The characterization of the strike lines is made in terms of position, size, and shape, after projecting the camera image onto a 2D grid that tries to preserve the curvilinear surface distances between points. The shape of the strike line is described by means of Fourier descriptors.
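
    The abstract does not spell out the descriptor algebra, but a common construction of Fourier descriptors for a closed boundary, sketched below, treats the contour as a complex signal and normalises its FFT coefficients so the shape signature is invariant to translation, rotation, starting point, and scale. The function, sample count, and test contour are illustrative, not the authors' pipeline.

```python
import numpy as np

def fourier_descriptors(x, y, n_desc=10):
    """Invariant Fourier descriptors of a closed contour sampled at (x, y)."""
    z = x + 1j * y                 # contour as a complex signal
    Z = np.fft.fft(z)
    Z[0] = 0.0                     # drop the DC term -> translation invariance
    mags = np.abs(Z)               # discard phase -> rotation/start-point invariance
    mags /= mags[1]                # normalise by first harmonic -> scale invariance
    return mags[1:n_desc + 1]

# Example: descriptors of an ellipse-like strike-line footprint.
t = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
fd = fourier_descriptors(3.0 * np.cos(t), 1.0 * np.sin(t))
print(fd)
```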

    Forward modeling of collective Thomson scattering for Wendelstein 7-X plasmas: Electrostatic approximation

    In this paper, we present a method for the numerical computation of collective Thomson scattering (CTS). We developed a forward model, eCTS, in the electrostatic approximation and benchmarked it against a full electromagnetic model. Differences between the electrostatic and the electromagnetic models are discussed. The sensitivity of the results to the ion temperature and the plasma composition is demonstrated. We integrated the model into the Bayesian data analysis framework Minerva and used it for the analysis of noisy synthetic data sets produced by a full electromagnetic model. It is shown that eCTS can be used for the inference of the bulk ion temperature. The model has been used to infer the bulk ion temperature from the first CTS measurements on Wendelstein 7-X.
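
    To make the inversion step concrete, the toy sketch below mimics the Bayesian analysis described above: a deliberately simplified forward model (a Gaussian spectrum whose width grows as √Ti, standing in for eCTS) is evaluated on a grid with a flat prior to recover the bulk ion temperature from noisy synthetic data. Every constant here is invented; the real eCTS/Minerva analysis is far more detailed.

```python
import numpy as np

rng = np.random.default_rng(1)
freq = np.linspace(-3.0, 3.0, 200)   # normalised frequency-shift axis

# Toy forward model: Gaussian spectrum whose width scales as sqrt(Ti),
# a crude stand-in for the Doppler-broadened CTS spectrum from eCTS.
def forward(Ti):
    width = 0.8 * np.sqrt(Ti)        # hypothetical scaling constant
    return np.exp(-0.5 * (freq / width) ** 2)

Ti_true, sigma = 2.0, 0.05           # "true" temperature (keV) and noise level
data = forward(Ti_true) + rng.normal(0.0, sigma, freq.size)

# Posterior over Ti on a grid: flat prior, Gaussian likelihood.
Ti_grid = np.linspace(0.5, 5.0, 500)
log_post = np.array([-0.5 * np.sum((data - forward(Ti)) ** 2) / sigma**2
                     for Ti in Ti_grid])
post = np.exp(log_post - log_post.max())
post /= post.sum()

Ti_mean = float(np.sum(Ti_grid * post))
print(f"inferred Ti = {Ti_mean:.2f} keV (true value {Ti_true} keV)")
```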

    New insights into the genetic etiology of Alzheimer's disease and related dementias

    Characterization of the genetic landscape of Alzheimer's disease (AD) and related dementias (ADD) provides a unique opportunity for a better understanding of the associated pathophysiological processes. We performed a two-stage genome-wide association study totaling 111,326 clinically diagnosed/'proxy' AD cases and 677,663 controls. We found 75 risk loci, of which 42 were new at the time of analysis. Pathway enrichment analyses confirmed the involvement of amyloid/tau pathways and highlighted microglia implication. Gene prioritization in the new loci identified 31 genes that were suggestive of new genetically associated processes, including the tumor necrosis factor alpha pathway through the linear ubiquitin chain assembly complex. We also built a new genetic risk score associated with the risk of future AD/dementia or progression from mild cognitive impairment to AD/dementia. The improvement in prediction led to a 1.6- to 1.9-fold increase in AD risk from the lowest to the highest decile, in addition to effects of age and the APOE ε4 allele.
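
    A genetic risk score of the kind described is, at its core, a weighted allele count. The sketch below builds one from hypothetical GWAS effect sizes and genotype dosages and compares a toy logistic risk between the top and bottom score deciles; none of the numbers come from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical inputs: per-variant effect sizes (log odds ratios) and
# genotype dosages (0/1/2 risk alleles).  Invented numbers, not study data.
n_people, n_variants = 10_000, 75
beta = rng.normal(0.0, 0.05, n_variants)
dosage = rng.binomial(2, 0.3, size=(n_people, n_variants)).astype(float)

grs = dosage @ beta                   # genetic risk score: weighted allele count

# Toy logistic risk model, then the fold-change between extreme deciles.
risk = 1.0 / (1.0 + np.exp(-(-3.0 + grs)))
lo, hi = np.quantile(grs, [0.1, 0.9])
fold = risk[grs >= hi].mean() / risk[grs <= lo].mean()
print(f"top-vs-bottom decile risk ratio ~ {fold:.2f}")
```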

    Measurement of the tt̄ production cross-section using eΌ events with b-tagged jets in pp collisions at √s = 13 TeV with the ATLAS detector

    This paper describes a measurement of the inclusive top quark pair production cross-section (σtt̄) with a data sample of 3.2 fb−1 of proton–proton collisions at a centre-of-mass energy of √s = 13 TeV, collected in 2015 by the ATLAS detector at the LHC. This measurement uses events with an opposite-charge electron–muon pair in the final state. Jets containing b-quarks are tagged using an algorithm based on track impact parameters and reconstructed secondary vertices. The numbers of events with exactly one and exactly two b-tagged jets are counted and used to determine simultaneously σtt̄ and the efficiency to reconstruct and b-tag a jet from a top quark decay, thereby minimising the associated systematic uncertainties. The cross-section is measured to be σtt̄ = 818 ± 8 (stat) ± 27 (syst) ± 19 (lumi) ± 12 (beam) pb, where the four uncertainties arise from data statistics, experimental and theoretical systematic effects, the integrated luminosity and the LHC beam energy, giving a total relative uncertainty of 4.4%. The result is consistent with theoretical QCD calculations at next-to-next-to-leading order. A fiducial measurement corresponding to the experimental acceptance of the leptons is also presented.
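
    Schematically, and neglecting the background contributions that the analysis itself accounts for, the two event counts described above constrain the cross-section and the b-tagging efficiency through

```latex
N_{1} = L\,\sigma_{t\bar{t}}\,\epsilon_{e\mu}\,2\,\epsilon_{b}\left(1 - C_{b}\,\epsilon_{b}\right),
\qquad
N_{2} = L\,\sigma_{t\bar{t}}\,\epsilon_{e\mu}\,C_{b}\,\epsilon_{b}^{2}
```

    where L is the integrated luminosity, ε_eΌ the efficiency to select the opposite-charge eΌ pair, ε_b the combined probability to reconstruct and b-tag a jet from a top-quark decay, and C_b a near-unity tagging correlation coefficient. Solving the two equations simultaneously for σ_tt̄ and ε_b is what minimises the b-tagging systematic, as the abstract notes.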
    • 

    corecore