
    Hyperspectral monitoring of green roof vegetation health state in sub-Mediterranean climate: preliminary results

    In urban and industrial environments, the constant increase of impermeable surfaces has produced drastic changes in the natural hydrological cycle. Shrinking green areas produce negative effects not only from a hydrological-hydraulic perspective but also from an energy point of view, modifying the urban microclimate and, as shown in the literature, generating heat islands in our cities. In this context, green infrastructures may represent an environmental compensation action that can be used to re-equilibrate the hydrological and energy balance and to reduce the impact of the pollutant load on receiving water bodies. To ensure that a green infrastructure works properly, vegetated areas have to be continuously monitored to verify their health state. This paper presents a ground spectroscopy monitoring survey of a green roof installed at the University of Calabria, carried out through the acquisition and analysis of hyperspectral data. This study is part of a larger research project, financed by European Structural Funds, aimed at understanding the influence of green roofs on rainwater management and on energy consumption for air conditioning in the Mediterranean area. Reflectance values were acquired with a field-portable spectroradiometer operating in the 350–2500 nm wavelength range. The survey was carried out from November 2014 to June 2015, with data acquired weekly. Climatic, thermo-physical, hydrological, and hydraulic quantities were acquired as well and related to the spectral data. Broadband and narrowband spectral indices, related to chlorophyll content and to the chlorophyll–carotenoid ratio, were computed. The two narrowband indices, NDVI705 and SIPI, turned out to be the most representative for detecting plant health status.
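The two indices named in the abstract have standard literature definitions, which can be sketched as follows. The band positions below (750/705 nm for NDVI705; 800/445/680 nm for SIPI) are the commonly used ones, not necessarily the exact bands chosen in the study, and the reflectance values are illustrative rather than measured data.

```python
# Sketch of the two narrowband indices from field-spectroradiometer
# reflectance. Band choices follow common literature definitions and are
# an assumption here; reflectances below are illustrative only.

def ndvi705(r750, r705):
    """Red-edge NDVI, sensitive to chlorophyll content."""
    return (r750 - r705) / (r750 + r705)

def sipi(r800, r445, r680):
    """Structure Insensitive Pigment Index (carotenoid-to-chlorophyll ratio)."""
    return (r800 - r445) / (r800 - r680)

# Illustrative reflectances for a healthy canopy
print(round(ndvi705(0.50, 0.20), 3))      # 0.429
print(round(sipi(0.52, 0.04, 0.06), 3))   # 1.043
```

In practice each `r` value would be read off the calibrated reflectance spectrum at the corresponding wavelength; healthy vegetation gives high NDVI705 and SIPI close to 1, while stress drives NDVI705 down and SIPI up.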

    On federated single sign-on in e-government interoperability frameworks

    We consider the problem of handling digital identities within service-oriented architectures (SOA). We explore federated, single sign-on (SSO) solutions based on identity managers and service providers. After an overview of the different standards and protocols, we introduce a middleware-based architecture to simplify the integration of legacy systems within such platforms. Our solution is based on a middleware module that decouples the legacy system from the identity-management modules. We consider both standard point-to-point service architectures and complex government interoperability frameworks, and report experiments showing that our solution provides clear advantages in terms of both effectiveness and performance.
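The decoupling idea described above can be sketched as a thin middleware layer that translates a federated SSO assertion into the local credentials a legacy system expects, so the legacy code never talks to the identity provider directly. All class and method names here are hypothetical illustrations, not the paper's API; a real deployment would verify signed SAML or OIDC assertions instead of the toy token check.

```python
# Minimal sketch (hypothetical names): a middleware module sits between a
# federated identity provider and a legacy system, so the legacy system
# stays unaware of the identity-management platform.

class IdentityProvider:
    """Stand-in for a federated IdP; a real one validates signed assertions."""
    def validate(self, token: str):
        if token.startswith("sso:"):                 # toy check only
            return {"subject": token[len("sso:"):]}
        return None

class LegacySystem:
    """Legacy service that only understands local usernames."""
    def handle(self, username: str) -> str:
        return "legacy session for " + username

class SSOMiddleware:
    """Decouples the legacy system from identity management."""
    def __init__(self, idp: IdentityProvider, legacy: LegacySystem):
        self.idp, self.legacy = idp, legacy

    def request(self, token: str) -> str:
        identity = self.idp.validate(token)
        if identity is None:
            raise PermissionError("SSO assertion rejected")
        # Map the federated subject onto a local account; the mapping would
        # normally be configurable per legacy system.
        return self.legacy.handle(identity["subject"])

mw = SSOMiddleware(IdentityProvider(), LegacySystem())
print(mw.request("sso:alice"))  # legacy session for alice
```

The design point is that swapping the identity-management stack (or the legacy back end) only touches the middleware, which is what makes the approach attractive for government interoperability frameworks built from heterogeneous legacy systems.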

    Fabrication and First Full Characterisation of Timing Properties of 3D Diamond Detectors

    Tracking detectors at future high-luminosity hadron colliders are expected to withstand unprecedented levels of radiation as well as to efficiently reconstruct a huge number of tracks and primary vertices. To face the challenges posed by radiation damage, new extremely radiation-hard materials and sensor designs will be needed, while the track and vertex reconstruction problem can be significantly mitigated by the introduction of detectors with excellent timing capabilities. Indeed, the time coordinate provides extremely powerful information to disentangle overlapping tracks and hits in the harsh hadronic collision environment. Diamond 3D pixel sensors optimised for timing applications offer an appealing solution to the above problems, as the 3D geometry enhances the already outstanding radiation hardness and makes it possible to exploit the excellent timing properties of diamond. We report here the first full timing characterisation of 3D diamond sensors fabricated by electrode laser graphitisation in Florence. Results from a 270 MeV pion beam test of a first prototype and from tests with a β source on a recently fabricated 55×55 μm² pitch sensor are discussed. First results on sensor simulation are also presented.

    Fabrication and Characterisation of 3D Diamond Pixel Detectors With Timing Capabilities

    Diamond sensors provide a promising radiation-hard solution to the challenges posed by future experiments at hadron machines. A 3D geometry with thin columnar resistive electrodes orthogonal to the diamond surface, obtained by laser nanofabrication, is expected to provide significantly better time resolution than the extensively studied planar diamond sensors. We report on the development, production, and characterisation of innovative 3D diamond sensors achieving a 30% improvement in both space and time resolution with respect to sensors from the previous generation. This is the first complete characterisation of the time resolution of 3D diamond sensors, and it combines results from tests with lasers, beta rays, and high-energy particle beams. Plans and strategies for further improvement in the fabrication technology and readout systems are also discussed.

    Precision tests of the Standard Model with leptonic and semileptonic kaon decays

    We present a global analysis of leptonic and semileptonic kaon decay data, including all recent results from BNL-E865, KLOE, KTeV, ISTRA+, and NA48. Experimental results are critically reviewed and combined, taking into account theoretical (both analytical and numerical) constraints on the semileptonic kaon form factors. This analysis leads to a very accurate determination of Vus and allows us to perform several stringent tests of the Standard Model.

    A Probe into the Reform of Public Calligraphy Course in Chinese Colleges and Universities

    Chinese calligraphy is a treasure of traditional Chinese culture: it is both an art form and a carrier of profound cultural content. Offering public calligraphy courses in ordinary Chinese universities is consistent with the higher-education ideals currently advocated: quality-oriented education, the cultivation of innovative talent, and the all-round, harmonious development of the individual. Strengthening public calligraphy education in universities is one effective way to comprehensively advance quality-oriented education, and an important means of raising the humanistic literacy of today's university students. It plays a role that should not be underestimated in improving students' character, fostering their creativity, and transmitting national culture. This paper analyses the current state of public calligraphy teaching in ordinary Chinese universities and, drawing comparatively on the successful experience of calligraphy teaching in Japanese universities, proposes concrete recommendations and measures for reforming public calligraphy teaching in Chinese universities, designing and putting into practice a new, student-centred model of public calligraphy teaching with a disciplinary perspective. The thesis is divided into five parts. The introduction briefly describes the... (Degree: Master of Arts; Department of Fine Arts, College of Art Education, Art Theory; student no. 20042201)

    Search for a new gauge boson in π0\pi^{0} decays

    A search was made for a new light gauge boson X which might be produced in π⁰ → γ + X decay from neutral pions generated by 450-GeV protons in the CERN SPS neutrino target. The X's would penetrate the downstream shielding and be observed in the NOMAD detector via the Primakoff effect, in the process of X → π⁰ conversion in the external Coulomb field of a nucleus. With 1.45×10¹⁸ protons on target, 20 candidate events with energy between 8 and 140 GeV were found from the analysis of neutrino data. This number is in agreement with the expectation of 18.1 ± 2.8 background events from standard neutrino processes. A new 90% C.L. upper limit on the branching ratio Br(π⁰ → γ + X) < (3.3 to 1.9) × 10⁻⁵ for X masses ranging from 0 to 120 MeV/c² is obtained. Submitted to Physics Letters.

    Physics case for an LHCb Upgrade II - Opportunities in flavour physics, and beyond, in the HL-LHC era

    The LHCb Upgrade II will fully exploit the flavour-physics opportunities of the HL-LHC and study additional physics topics that take advantage of the forward acceptance of the LHCb spectrometer. The LHCb Upgrade I will begin operation in 2020. Consolidation, together with modest enhancements of the Upgrade I detector, will take place in Long Shutdown 3 of the LHC (2025) and is discussed here. The main Upgrade II detector will be installed in Long Shutdown 4 of the LHC (2030) and will build on the strengths of the current LHCb experiment and the Upgrade I. It will operate at a luminosity of up to 2×10³⁴ cm⁻²s⁻¹, ten times that of the Upgrade I detector. New detector components will improve the intrinsic performance of the experiment in certain key areas. An Expression of Interest proposing Upgrade II was submitted in February 2017; the physics case for the Upgrade II is presented here in more depth. CP-violating phases will be measured with precisions unattainable at any other envisaged facility. The experiment will probe b → sℓ⁺ℓ⁻ and b → dℓ⁺ℓ⁻ transitions in both muon and electron decays, in modes not accessible at Upgrade I. Minimal flavour violation will be tested with a precision measurement of the ratio B(B⁰ → μ⁺μ⁻)/B(Bs → μ⁺μ⁻). Probing charm CP violation at the 10⁻⁵ level may result in its long-sought discovery. Major advances in hadron spectroscopy will be possible, providing powerful probes of low-energy QCD. Upgrade II will potentially have the highest sensitivity of all the LHC experiments to the Higgs coupling to charm quarks. Generically, the new-physics mass scale probed, for fixed couplings, will almost double compared with the pre-HL-LHC era; this extended reach for flavour physics is similar to that which would be achieved by the HE-LHC proposal for the energy frontier.

    LHCb upgrade software and computing : technical design report

    This document reports the research and development activities carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.