
    Performance of post-processing algorithms for rainfall intensity using measurements from tipping-bucket rain gauges

    Abstract. Eight rainfall events recorded from May to September 2013 at Hong Kong International Airport (HKIA) have been selected to investigate the performance of post-processing algorithms used to calculate the rainfall intensity (RI) from tipping-bucket rain gauges (TBRGs). We assumed a drop-counter catching-type gauge as a working reference and compared rainfall intensity measurements with two calibrated TBRGs operated at a time resolution of 1 min. The two TBRGs differ in their internal mechanics, one being a traditional single-layer dual-bucket assembly, while the other has two layers of buckets. The drop-counter gauge operates at a time resolution of 10 s, while the time of tipping is recorded for the two TBRGs. The post-processing algorithms employed for the two TBRGs are based on the assumption that the tip volume is uniformly distributed over the inter-tip period. A data series for an ideal TBRG is reconstructed using the virtual time of tipping derived from the drop-counter data. From the comparison between the ideal gauge and the measurements from the two real TBRGs, the performance of different post-processing and correction algorithms is statistically evaluated over the set of recorded rain events. The improvement obtained by adopting the inter-tip time algorithm in the calculation of the RI is confirmed. However, by comparing the performance of the real and ideal TBRGs, the beneficial effect of the inter-tip algorithm is shown to be relevant for the mid to low range (6–50 mm h⁻¹) of rainfall intensity values, where the sampling errors prevail, while its role vanishes with increasing RI in the range where the mechanical errors prevail.
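The inter-tip time algorithm described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tip times and the 0.2 mm bucket depth are invented example values, and the only assumption carried over from the abstract is that each tip's volume is uniformly distributed over the inter-tip period.

```python
def inter_tip_intensity(tip_times_s, tip_depth_mm):
    """Return (interval_start_s, interval_end_s, intensity_mm_per_h) tuples.

    Each bucket tip corresponds to a fixed rainfall depth (tip_depth_mm).
    Under the uniform-distribution assumption, the rainfall intensity over
    an inter-tip period is that depth divided by the elapsed time between
    consecutive tips.
    """
    intervals = []
    for t0, t1 in zip(tip_times_s, tip_times_s[1:]):
        dt_h = (t1 - t0) / 3600.0              # inter-tip period in hours
        intervals.append((t0, t1, tip_depth_mm / dt_h))
    return intervals

# Hypothetical example: a 0.2 mm bucket tipping at 0 s, 60 s, and 90 s.
rates = inter_tip_intensity([0.0, 60.0, 90.0], tip_depth_mm=0.2)
# First interval: 0.2 mm over 60 s -> 12 mm/h; second: 0.2 mm over 30 s -> 24 mm/h
```

Note how a shorter inter-tip period maps directly to a higher intensity; at high RI the periods become so short that mechanical effects, rather than this sampling scheme, dominate the error.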

    Thermo-fluid dynamic simulation of the Hotplate precipitation gauge.

    The present study addresses the aerodynamic response of the recently developed "Hotplate" liquid/solid precipitation gauge when exposed to the wind. The Hotplate gauge employs two heated thin plates to provide a reliable method of precipitation measurement. The measuring principle is based on an algorithm that associates the latent heat needed to evaporate the snow, or the rain, falling on the instrument with the precipitation rate. However, the presence of the instrument body immersed in a wind field is expected to induce significant deformations of the airflow pattern near the gauge, with an impact on the associated catching efficiency. Indeed, the fall trajectories of the hydrometeors approaching the gauge can be deviated away from the collecting plate, resulting, in general, in some underestimation of the precipitation rate. After an initial analysis of real-world "Hotplate" measurements from a field test site located in Marshall, CO (USA) and a comparison with more traditional measurements obtained from a co-located, shielded reference gauge, the role of wind-induced errors is highlighted. The main approach used in this work is based on the numerical simulation of the airflow field around the gauge, using Computational Fluid Dynamics (CFD) to identify areas where the wind-induced updraft, local acceleration, and turbulence are significant. The performed CFD airflow simulations use the URANS SST k–ω modelling scheme and are the first modelling step towards quantifying the associated undercatch. These may be coupled in future developments with particle-tracking models to derive suitable correction curves for operational purposes. Due to the specific measurement principle exploited by the "Hotplate" gauge, which measures the heat flux needed to evaporate the collected water amount under a constant plate surface temperature, thermo-fluid dynamic simulations are addressed as well.
Dedicated tests were performed in the wind-tunnel facility at DICCA, University of Genoa, to validate the simulation results. Results indicate that the presence of wind is a relevant source of systematic bias when using the "Hotplate" gauge for the measurement of precipitation, and its effect must be corrected by adopting suitable correction curves as a function of the wind velocity. The magnitude of the correction can be derived from numerical thermo-fluid dynamic simulations, and an assessment of the airflow patterns developing around the gauge at various wind velocity regimes is provided in this work. Wind-tunnel tests allowed for a substantial validation of the numerical results, and possible improvements of the model are highlighted and proposed for future developments.
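The Hotplate measuring principle described above (latent heat flux under constant plate temperature) can be sketched with a simple energy balance. This is an illustrative assumption, not the instrument's actual retrieval algorithm: the excess power on the upward-facing plate relative to the reference plate is attributed entirely to melting and evaporating the collected precipitation, and the plate area and power values are hypothetical.

```python
RHO_W = 1000.0   # density of water, kg m^-3
L_V = 2.26e6     # latent heat of vaporization, J kg^-1
L_F = 3.34e5     # latent heat of fusion, J kg^-1 (solid precipitation must also melt)

def precip_rate_mm_per_h(p_sensing_w, p_reference_w, plate_area_m2, solid=False):
    """Infer a precipitation rate from the power difference between plates.

    The extra electrical power needed to hold the sensing plate at constant
    temperature, beyond that of the sheltered reference plate, is assumed to
    go into the latent heat of the collected precipitation.
    """
    latent = L_V + (L_F if solid else 0.0)   # J per kg of precipitation
    mass_flux = (p_sensing_w - p_reference_w) / (latent * plate_area_m2)  # kg m^-2 s^-1
    depth_rate_m_per_s = mass_flux / RHO_W
    return depth_rate_m_per_s * 1000.0 * 3600.0   # convert m/s to mm/h

# Hypothetical example: 20 W excess power on a 0.0127 m^2 plate, liquid rain.
rate = precip_rate_mm_per_h(50.0, 30.0, 0.0127)  # roughly 2.5 mm/h
```

In this simplified balance, any wind-induced change in convective heat loss would corrupt the inferred rate, which is why the airflow and thermo-fluid simulations in the study matter for the correction curves.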

    Clinical characteristics and risk factors associated with COVID-19 severity in patients with haematological malignancies in Italy: a retrospective, multicentre, cohort study

    Several small studies on patients with COVID-19 and haematological malignancies are available, showing high mortality in this population. The Italian Hematology Alliance on COVID-19 aimed to collect data from adult patients with haematological malignancies who required hospitalisation for COVID-19.

    Pattern of care and effectiveness of treatment for glioblastoma patients in the real world: Results from a prospective population-based registry. Could survival differ in a high-volume center?

    BACKGROUND: As yet, no population-based prospective studies have been conducted to investigate the incidence and clinical outcome of glioblastoma (GBM) or the diffusion and impact of the current standard therapeutic approach in newly diagnosed patients younger than 70 years. METHODS: Data on all new cases of primary brain tumors observed from January 1, 2009, to December 31, 2010, in adults residing within the Emilia-Romagna region were recorded in a prospective registry in the Project of Emilia Romagna on Neuro-Oncology (PERNO). Based on the data from this registry, a prospective evaluation was made of the treatment efficacy and outcome in GBM patients. RESULTS: Two hundred sixty-seven GBM patients (median age, 64 y; range, 29-84 y) were enrolled. The median overall survival (OS) was 10.7 months (95% CI, 9.2-12.4). The 139 patients aged ≤70 years who were given standard temozolomide treatment concomitant with and adjuvant to radiotherapy had a median OS of 16.4 months (95% CI, 14.0-18.5). With multivariate analysis, OS correlated significantly with KPS (HR = 0.458; 95% CI, 0.248-0.847; P = .0127), MGMT methylation status (HR = 0.612; 95% CI, 0.388-0.966; P = .0350), and treatment received in a high- versus low-volume center (HR = 0.56; 95% CI, 0.328-0.986; P = .0446). CONCLUSIONS: The median OS following standard temozolomide treatment concurrent with and adjuvant to radiotherapy given to (72.8% of) patients aged ≤70 years is consistent with findings reported from randomized phase III trials. The volume and expertise of the treatment center should be further investigated as a prognostic factor.

    Risk Portfolio Optimization Using the Markowitz MVO Model, Related to Human Limitations in Predicting the Future from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, owing to human inability to predict the future precisely, as written in the Al-Qur'an, surah Luqman, verse 34, they have to manage it to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios or, alternatively, to maximize the expected return among all portfolios that have at least a certain expected return. Furthermore, this study focuses on optimizing the risk portfolio via Markowitz MVO (Mean-Variance Optimization). The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Moreover, finding a minimum-variance portfolio produces a convex quadratic program: minimizing the objective function xᵀΣx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the solution of the optimal risk portfolio for some investments, which could be obtained smoothly using MATLAB R2007b software together with its graphical analysis.
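The quadratic program above can be sketched in a few lines. This is a minimal illustration, not the paper's MATLAB implementation: the three-asset expected returns and covariance matrix are invented, and SciPy's SLSQP solver stands in for a dedicated quadratic-programming routine.

```python
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.12, 0.15])          # expected returns (assumed values)
sigma = np.array([[0.10, 0.02, 0.01],
                  [0.02, 0.12, 0.03],
                  [0.01, 0.03, 0.20]])     # covariance matrix (assumed values)
r_min = 0.10                               # required minimum expected return

res = minimize(
    lambda x: x @ sigma @ x,               # portfolio variance x^T Sigma x
    x0=np.full(3, 1 / 3),                  # start from equal weights
    bounds=[(0.0, 1.0)] * 3,               # long-only weights
    constraints=[
        {"type": "eq",   "fun": lambda x: x.sum() - 1.0},   # fully invested (Ax = b)
        {"type": "ineq", "fun": lambda x: mu @ x - r_min},  # mu^T x >= r
    ],
)
weights = res.x
```

Because the covariance matrix is positive semidefinite, the objective is convex and any local solution the solver finds is the global minimum-variance portfolio for the chosen return floor.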

    Search for heavy resonances decaying to two Higgs bosons in final states containing four b quarks

    A search is presented for narrow heavy resonances X decaying into pairs of Higgs bosons (H) in proton-proton collisions collected by the CMS experiment at the LHC at √s = 8 TeV. The data correspond to an integrated luminosity of 19.7 fb⁻¹. The search considers HH resonances with masses between 1 and 3 TeV, having final states of two b quark pairs. Each Higgs boson is produced with large momentum, and the hadronization products of the pair of b quarks can usually be reconstructed as single large jets. The background from multijet and tt̄ events is significantly reduced by applying requirements related to the flavor of the jet, its mass, and its substructure. The signal would be identified as a peak on top of the dijet invariant mass spectrum of the remaining background events. No evidence is observed for such a signal. Upper limits obtained at 95% confidence level for the product of the production cross section and branching fraction σ(gg → X) B(X → HH → bb̄bb̄) range from 10 to 1.5 fb for the mass of X from 1.15 to 2.0 TeV, significantly extending previous searches. For a warped extra dimension theory with a mass scale Λ_R = 1 TeV, the data exclude radion scalar masses between 1.15 and 1.55 TeV.

    Search for supersymmetry in events with one lepton and multiple jets in proton-proton collisions at √s = 13 TeV

    Peer reviewed