Non-inclusive searches for squarks and gluinos at the Tevatron
Recent results from the CDF and D0 Collaborations on searches for squarks and gluinos in Run II of the Fermilab Tevatron collider are presented. This review covers searches for final states arising from specific mass hierarchies. The analyzed datasets correspond to integrated luminosities of 300-1000 pb⁻¹ collected from proton-antiproton collisions at a center-of-mass energy of 1.96 TeV. No significant deviations from the Standard Model expectations are observed, and limits are set on supersymmetry parameters in generic MSSM and in specific mSUGRA scenarios.
Search for squarks and gluinos using data from the D0 detector at the Tevatron
A search for squarks and gluinos is performed in the topology of multijet events accompanied by large missing transverse energy, using 2.1 fb⁻¹ of proton-antiproton collision data collected with the D0 detector at the Fermilab Tevatron Collider at a center-of-mass energy of 1.96 TeV. About half of this dataset is additionally analyzed for events containing at least one hadronically decaying tau lepton. No deviation from the Standard Model expectation is observed, and the analyses are combined to set limits on the squark and gluino masses and on the parameters of minimal supergravity.
Legal situation and current practice of waste incineration bottom ash utilisation in Europe
Almost 500 municipal solid waste incineration plants in the EU, Norway, and Switzerland generate about 17.6 Mt/a of incinerator bottom ash (IBA). IBA contains both minerals and metals. The metals are mostly separated and sold to the scrap market, while the minerals are either disposed of in landfills or utilised in the construction sector. Since there is no uniform regulation of IBA utilisation at the EU level, countries have developed their own rules with varying requirements. Drawing on a cooperation network of European experts, this work presents an up-to-date overview of the documents regulating IBA utilisation and highlights the different requirements that have to be considered. Overall, 51 different parameters are defined for total content and 36 for emission by leaching. An analysis of these parameters reveals that significantly more leaching parameters than total-content parameters have to be considered. To assess the leaching behaviour, nine different leaching tests are in place, including batch tests, up-flow percolation tests, and one diffusion test (for monolithic materials). A further discussion of the leaching parameters shows that certain countries took over limit values initially defined for landfills for inert waste and adopted them for IBA utilisation. The overall utilisation rate of IBA in construction works is approximately 54 wt.%. The rate of utilisation does not necessarily depend on how well regulated IBA utilisation is, but rather appears to result from political commitment to IBA recycling and economically favourable conditions.
A Roadmap for HEP Software and Computing R&D for the 2020s
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer volume of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
Search for Higgs bosons beyond the Standard Model and supersymmetry at the Tevatron
Searches for Higgs bosons beyond the Standard Model at the Tevatron
Preliminary results from the CDF and D0 Collaborations on searches for Higgs bosons beyond the Standard Model at the Run II Tevatron are reviewed. These results are based on datasets corresponding to integrated luminosities of 100-200 pb⁻¹ collected from proton-antiproton collisions at a center-of-mass energy of 1.96 TeV. No evidence of a signal is observed, and limits are set on Higgs boson production cross sections times branching ratios, couplings, and masses in various models.
Graph neural networks at the Large Hadron Collider
From raw detector activations to reconstructed particles, data at the Large Hadron Collider (LHC) are sparse, irregular, heterogeneous, and highly relational in nature. Graph neural networks (GNNs), a class of algorithms belonging to the rapidly growing field of geometric deep learning (GDL), are well suited to such data because they are equipped with relational inductive biases that explicitly exploit the localized information encoded in graphs. Furthermore, graphs offer a flexible and efficient alternative to rectilinear structures for representing sparse or irregular data, and can naturally encode heterogeneous information. For these reasons, GNNs have been applied to a number of LHC physics tasks, including reconstructing particles from detector readouts and discriminating physics signals against background processes. We introduce and categorize these applications in a manner accessible to both physicists and non-physicists. Our explicit goal is to bridge the gap between the particle physics and GDL communities. After an accessible description of LHC physics, including theory, measurement, simulation, and analysis, we overview applications of GNNs at the LHC. We conclude by highlighting technical challenges and future directions that may inspire further collaboration between the physics and GDL communities.
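The graph representation of sparse detector data described in the abstract above can be illustrated with a minimal sketch: hits become nodes and edges connect nearby hits. Everything here is hypothetical (random 2-D "hit" positions, a plain k-nearest-neighbour rule); real LHC pipelines use richer features and geometry-aware edge construction.

```python
import numpy as np

# Hypothetical detector hits: each row is a (x, y) position in some
# transverse plane; real LHC hits carry many more features.
rng = np.random.default_rng(0)
hits = rng.uniform(-1.0, 1.0, size=(8, 2))

def knn_edges(points, k=2):
    """Connect each point to its k nearest neighbours (undirected edges)."""
    # Pairwise Euclidean distance matrix between all hits.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude self-loops
    nbrs = np.argsort(d, axis=1)[:, :k]  # k closest hits per node
    # Deduplicate (i, j) and (j, i) into one undirected edge.
    edges = {tuple(sorted((i, int(j)))) for i in range(len(points)) for j in nbrs[i]}
    return sorted(edges)

edges = knn_edges(hits, k=2)
print(len(hits), "nodes,", len(edges), "edges")
```

The resulting edge list is exactly the sparse, irregular adjacency structure that a GNN's message-passing layers would operate on, instead of a fixed rectilinear grid.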
Towards a realistic track reconstruction algorithm based on graph neural networks for the HL-LHC
The physics reach of the HL-LHC will be limited by how efficiently the experiments can use the available computing resources, i.e. affordable software and computing are essential. The development of novel methods for charged-particle reconstruction at the HL-LHC, incorporating machine learning techniques or based entirely on machine learning, is a vibrant area of research. In the past two years, algorithms for track pattern recognition based on graph neural networks (GNNs) have emerged as a particularly promising approach. Whereas previous work mainly aimed at establishing proof of principle, the present document describes new algorithms that can handle complex, realistic detectors. The new algorithms are implemented in ACTS, a common framework for tracking software. This work aims to deliver a realistic GNN-based algorithm that can be deployed in an HL-LHC experiment.
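A core step in GNN-based track pattern recognition, as described above, is scoring candidate edges (pairs of hits that might belong to the same track) and keeping the high-scoring ones. The sketch below is purely conceptual: the node features, edge list, and weights are random stand-ins, and the single linear layer replaces the learned message-passing network a real pipeline (e.g. one built on ACTS) would train.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical inputs: features for 5 hits and a fixed candidate edge list,
# standing in for a graph produced by an upstream tracking pipeline.
x = rng.normal(size=(5, 3))                       # per-hit features
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2)]  # candidate hit pairs

W = rng.normal(size=(6, 1))  # toy weights; a real model learns these

def edge_scores(x, edges, W):
    """Score each candidate edge from its concatenated endpoint features."""
    feats = np.array([np.concatenate([x[i], x[j]]) for i, j in edges])
    return 1.0 / (1.0 + np.exp(-(feats @ W)))  # sigmoid -> scores in (0, 1)

scores = edge_scores(x, edges, W).ravel()
keep = [e for e, s in zip(edges, scores) if s > 0.5]  # retained track segments
print(scores.round(3), keep)
```

In a full reconstruction chain, the retained edges would then be grouped into connected paths to form track candidates; the thresholding shown here is the simplest possible selection rule.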