
    Legal situation and current practice of waste incineration bottom ash utilisation in Europe

    Almost 500 municipal solid waste incineration plants in the EU, Norway, and Switzerland generate about 17.6 Mt/a of incinerator bottom ash (IBA). IBA contains minerals and metals. The metals are mostly separated and sold to the scrap market, while the minerals are either disposed of in landfills or utilised in the construction sector. Since there is no uniform regulation for IBA utilisation at EU level, countries have developed their own rules with varying requirements for utilisation. Drawing on a cooperation network of European experts, this work presents an up-to-date overview of the documents regulating IBA utilisation and highlights the different requirements that have to be considered. Overall, 51 different parameters for total content and 36 different parameters for emission by leaching are defined. An analysis of the defined parameters reveals that leaching parameters have to be considered significantly more often than total content parameters. To assess the leaching behaviour, nine different leaching tests are in place, including batch tests, up-flow percolation tests and one diffusion test (for monolithic materials). A further discussion of the leaching parameters shows that certain countries took over limit values initially defined for landfills for inert waste and adopted them for IBA utilisation. The overall utilisation rate of IBA in construction works is approximately 54 wt.%. The rate of utilisation does not necessarily depend on how well regulated IBA utilisation is, but rather appears to result from political commitment to IBA recycling and economically favourable circumstances.
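    As a rough sense of scale, the figures quoted above can be combined in a back-of-the-envelope calculation. The sketch below is illustrative only and assumes the 54 wt.% utilisation rate can be applied directly to the total IBA stream, which the abstract does not state (the rate may refer to the mineral fraction after metal separation).

        # Back-of-the-envelope scale check using the figures quoted in the abstract.
        # NOTE: illustrative only; the 54 wt.% utilisation rate may apply to the
        # mineral fraction rather than the total IBA stream.
        total_iba_mt_per_year = 17.6   # Mt/a generated by ~500 plants
        utilisation_rate = 0.54        # overall utilisation rate in construction works

        utilised = total_iba_mt_per_year * utilisation_rate
        remainder = total_iba_mt_per_year - utilised

        print(f"Roughly {utilised:.1f} Mt/a utilised in construction, "
              f"{remainder:.1f} Mt/a landfilled or otherwise managed")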

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.

    Graph neural networks at the Large Hadron Collider

    From raw detector activations to reconstructed particles, data at the Large Hadron Collider (LHC) are sparse, irregular, heterogeneous and highly relational in nature. Graph neural networks (GNNs), a class of algorithms belonging to the rapidly growing field of geometric deep learning (GDL), are well suited to tackling such data because GNNs are equipped with relational inductive biases that explicitly make use of localized information encoded in graphs. Furthermore, graphs offer a flexible and efficient alternative to rectilinear structures when representing sparse or irregular data, and can naturally encode heterogeneous information. For these reasons, GNNs have been applied to a number of LHC physics tasks, including reconstructing particles from detector readouts and discriminating physics signals against background processes. We introduce and categorize these applications in a manner accessible to both physicists and non-physicists. Our explicit goal is to bridge the gap between the particle physics and GDL communities. After an accessible description of LHC physics, including theory, measurement, simulation and analysis, we overview applications of GNNs at the LHC. We conclude by highlighting technical challenges and future directions that may inspire further collaboration between the physics and GDL communities.
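    To make the graph representation concrete, the following minimal sketch (not taken from the paper) shows one way detector hits could be encoded as a graph and processed by a single message-passing layer in PyTorch. The hit features, the distance cut used for edge building, and the layer sizes are hypothetical choices for illustration.

        # Minimal, illustrative sketch: hits as nodes, geometric proximity as edges,
        # one round of message passing. All concrete choices here are assumptions.
        import torch
        import torch.nn as nn

        def build_graph(hits: torch.Tensor, max_dist: float = 0.1) -> torch.Tensor:
            """Connect pairs of hits closer than `max_dist` (toy geometric criterion)."""
            d = torch.cdist(hits, hits)                        # pairwise distances
            src, dst = torch.nonzero((d < max_dist) & (d > 0), as_tuple=True)
            return torch.stack([src, dst])                     # edge index, shape (2, E)

        class MessagePassingLayer(nn.Module):
            """Aggregate neighbour messages, then update each node's features."""
            def __init__(self, dim: int):
                super().__init__()
                self.msg = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())
                self.upd = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())

            def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
                src, dst = edge_index
                messages = self.msg(torch.cat([x[src], x[dst]], dim=-1))
                agg = torch.zeros_like(x).index_add_(0, dst, messages)  # sum over neighbours
                return self.upd(torch.cat([x, agg], dim=-1))

        # Toy usage: 100 hits with (x, y, z) coordinates as node features.
        hits = torch.rand(100, 3)
        edge_index = build_graph(hits)
        updated = MessagePassingLayer(dim=3)(hits, edge_index)

    Relational inductive bias enters through the edge structure: each hit is updated only from the hits it is connected to, rather than from a fixed rectilinear neighbourhood.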

    Towards a realistic track reconstruction algorithm based on graph neural networks for the HL-LHC

    The physics reach of the HL-LHC will be limited by how efficiently the experiments can use the available computing resources, i.e. affordable software and computing are essential. The development of novel methods for charged particle reconstruction at the HL-LHC that incorporate machine learning techniques, or are based entirely on machine learning, is a vibrant area of research. In the past two years, algorithms for track pattern recognition based on graph neural networks (GNNs) have emerged as a particularly promising approach. Previous work mainly aimed at establishing proof of principle. In the present document we describe new algorithms that can handle complex realistic detectors. The new algorithms are implemented in ACTS, a common framework for tracking software. This work aims at implementing a realistic GNN-based algorithm that can be deployed in an HL-LHC experiment.
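    The general idea behind GNN-based track pattern recognition can be illustrated as edge classification: hits are nodes, candidate hit pairs are edges, and a network scores each edge as belonging to the same particle track or not. The code below is a hypothetical illustration under those assumptions, not the ACTS implementation described in the paper; the input features, the edge-building step and the selection threshold are placeholders.

        # Illustrative edge-classification sketch for GNN tracking (assumptions only,
        # not the ACTS implementation).
        import torch
        import torch.nn as nn

        class EdgeClassifier(nn.Module):
            """Score each candidate edge (pair of hits) as a plausible track segment."""
            def __init__(self, node_dim: int, hidden: int = 64):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Linear(2 * node_dim, hidden), nn.ReLU(),
                    nn.Linear(hidden, 1), nn.Sigmoid())

            def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
                src, dst = edge_index
                return self.net(torch.cat([x[src], x[dst]], dim=-1)).squeeze(-1)

        # Toy usage: score candidate edges and keep those above a working point.
        x = torch.rand(1000, 3)                         # e.g. (r, phi, z) per hit, hypothetical
        edge_index = torch.randint(0, 1000, (2, 5000))  # stand-in for geometric edge building
        scores = EdgeClassifier(node_dim=3)(x, edge_index)
        selected = edge_index[:, scores > 0.5]          # surviving edges seed track candidates

    In a full pipeline the surviving edges would then be grouped into track candidates and fitted; the deployment constraints mentioned above (throughput, memory, integration into a common framework) are what distinguish a realistic algorithm from a proof of principle.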