217 research outputs found

    A Simple Algorithm for the Lexical Classification of Comparable Adjectives

    Abstract Lexical classification is one of the most widely investigated fields in (computational) linguistics and Natural Language Processing. Adjectives play a significant role both in classification tasks and in applications such as sentiment analysis. In this paper, a simple algorithm for the lexical classification of comparable adjectives, called MORE (coMparable fORm dEtector), is proposed. The algorithm is time-efficient and constitutes a specific unsupervised learning technique. Results are verified against a reference standard built from 80 manually annotated lists of adjectives. The algorithm exhibits an accuracy of 76%.
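    The abstract does not publish the MORE algorithm itself, so the following is only an illustrative sketch of the underlying task — deciding whether an adjective token is in comparative form — using simple morphological heuristics. The word lists and rules are hypothetical placeholders, not the authors' method.

    ```python
    # Toy comparable-form detector (illustrative only; NOT the MORE algorithm).
    # Hypothetical word lists: irregular comparatives and base adjectives that
    # happen to end in "-er" without being comparative.
    IRREGULAR = {"better", "worse", "further", "farther", "less", "more"}
    NON_COMPARATIVE_ER = {"clever", "proper", "eager", "bitter", "sober"}

    def looks_comparable(token: str) -> bool:
        """Guess whether an adjective token is a comparative form."""
        t = token.lower()
        if t in IRREGULAR:
            return True
        if t in NON_COMPARATIVE_ER:
            return False
        # Morphological heuristic: short English adjectives take the "-er" suffix.
        return len(t) > 3 and t.endswith("er")
    ```

    A real classifier would of course need to handle periphrastic forms ("more interesting") and ambiguous tokens, which is presumably where an unsupervised technique such as MORE earns its accuracy.
    
    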

    BIOMECHANICAL ANALYSIS OF THREE DIFFERENT BLOCKING FOOTWORK TECHNIQUES IN VOLLEYBALL: A PILOT STUDY

    The purpose of this study was to analyse three different blocking footwork techniques in volleyball. In particular, attention was focused on the correlation between anthropometric and kinematic parameters. Three female athletes playing in the first national league were recruited for a pilot study. Bosco tests were executed to obtain a morphological classification. A stereophotogrammetric system was used to acquire three blocking footwork techniques: slide step, running step, and jab cross-over patterns. Parameters of interest included the blocking time, the jump height, the horizontal and vertical speed of the centre of mass, the frontal position of the body with respect to the net, and the invasion angle of the hands over the net. A correlation between jump height and blocking time was observed only in the running step technique. The time of centre of mass maximum speed was significantly lower for the jab cross-over step technique. The most effective blocking technique for each athlete was finally obtained.

    It Could Rain: Weather Forecasting as a Reasoning Process

    Abstract Meteorological forecasting is the process of providing reliable predictions about the future weather within a given interval of time. Forecasters adopt a model of reasoning that can be mapped onto an integrated conceptual framework. A forecaster essentially processes data in advance by using machine learning models to extract macroscopic tendencies such as air movements, pressure, temperature, and humidity differentials, measured in ways that depend upon the model but fundamentally as gradients. Limit values are employed to transform these tendencies into fuzzy values, which are then compared to each other in order to extract indicators; these indicators are in turn evaluated by means of priorities based upon distances between fuzzy values. We formalise the method in a workflow of evaluation steps and propose an architecture that implements the reasoning techniques.
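    The tendency-to-fuzzy-value step described above can be sketched minimally as follows. The limit value, the three linguistic labels, and the membership shapes are illustrative assumptions, not the paper's actual parameters.

    ```python
    # Minimal sketch of fuzzifying a gradient (e.g., pressure tendency) and
    # extracting a dominant indicator. Thresholds are hypothetical.

    def fuzzify_gradient(g: float, limit: float = 2.0) -> dict:
        """Map a measured gradient to fuzzy memberships in three labels,
        clamping at the limit value."""
        rising = min(max(g / limit, 0.0), 1.0)
        falling = min(max(-g / limit, 0.0), 1.0)
        steady = 1.0 - max(rising, falling)
        return {"falling": falling, "steady": steady, "rising": rising}

    def indicator(memberships: dict) -> str:
        """Extract the indicator with maximum membership."""
        return max(memberships, key=memberships.get)

    # Example: a gradient of -1.5 hPa/h against a limit of 2.0
    m = fuzzify_gradient(-1.5)
    ```

    In the full workflow, several such indicators would then be ranked against each other via the distance-based priorities the abstract mentions.
    
    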

    Multi-Analyte MS Based Investigation in Relation to the Illicit Treatment of Fish Products with Hydrogen Peroxide

    Fishery products are perishable due to the action of many enzymes, both endogenous and exogenous. The latter are produced by bacteria that may contaminate the products. As fish age, massive bacterial growth causes the appearance of off-flavours. To confer a “false” freshness on fishery products, an illicit treatment with hydrogen peroxide is reportedly used. Residues of hydrogen peroxide in food may be of toxicological concern. We developed two mass spectrometry based methodologies to identify and quantify molecules related to the treatment of fish with hydrogen peroxide. With ultra-high-performance liquid chromatography–mass spectrometry (UHPLC-MS) we evaluated the concentration of trimethylamine-N-oxide (TMAO), trimethylamine (TMA), dimethylamine (DMA), and cadaverine (CAD) in fish products. After evaluating the LOQ, we measured and validated lower limits of quantification (LLOQs, defined as the first levels of the calibration curves) of 50 (TMAO), 70 (TMA), 45 (DMA), and 40 (CAD) ng/mL. A high ratio between the TMAO and TMA species indicated the freshness of the food. With a GC-MS method we confirmed the illicit treatment by measuring the levels of H2O2 after an analytical reaction with anisole to give 2-hydroxyanisole as a marker. This latter product was detected in the headspace of the homogenized sample, simplifying the work-up. An LLOQ of 50 ng/mL was checked and validated. When fish products were whitened and refreshed with hydrogen peroxide, the detected amount of 2-hydroxyanisole could be considerable (larger than 100 mg/kg). The developed analytical methods were suitable to detect the illicit treatment of fishery products with hydrogen peroxide, proving sensitive, selective, and robust.
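    The freshness criterion implied above — a high TMAO/TMA ratio — can be sketched as a simple check that also respects the stated LLOQs. The decision logic below is a hypothetical illustration; the abstract reports the LLOQ values but not a numeric ratio threshold.

    ```python
    # Sketch of the TMAO/TMA freshness check. LLOQ values (ng/mL) are taken
    # from the abstract; everything else is illustrative.
    LLOQ_NG_ML = {"TMAO": 50.0, "TMA": 70.0, "DMA": 45.0, "CAD": 40.0}

    def freshness_ratio(tmao_ng_ml: float, tma_ng_ml: float):
        """Return the TMAO/TMA ratio, or None if either measurement falls
        below its LLOQ and is therefore not reliably quantifiable."""
        if tmao_ng_ml < LLOQ_NG_ML["TMAO"] or tma_ng_ml < LLOQ_NG_ML["TMA"]:
            return None  # below quantification limit
        return tmao_ng_ml / tma_ng_ml
    ```

    In practice the ratio would be interpreted against matrix-specific reference values, since TMAO content varies strongly across fish species.
    
    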

    Cryogenic Characterization of FBK HD Near-UV Sensitive SiPMs

    We report on the characterization of near-ultraviolet high-density silicon photomultipliers (SiPMs) developed at Fondazione Bruno Kessler (FBK) at cryogenic temperature. A dedicated setup was built to measure the primary dark noise and correlated noise of the SiPMs between 40 and 300 K. Moreover, an analysis program and data acquisition system were developed to allow the precise characterization of these parameters, some of which can vary by up to 7 orders of magnitude between room temperature and 40 K. We demonstrate that it is possible to operate the FBK near-ultraviolet high-density SiPMs at temperatures lower than 100 K with a dark rate below 0.01 cps/mm² and a total correlated noise probability below 35% at an over-voltage of 6 V. These results are relevant for the development of future cryogenic particle detectors using SiPMs as photosensors.

    Compact Quantum Random Number Generator with Silicon Nanocrystals Light Emitting Device Coupled to a Silicon Photomultiplier

    A small-sized photonic quantum random number generator, easy to implement in small electronic devices for secure data encryption and other applications, is in high demand nowadays. Here, we propose a compact configuration with a silicon-nanocrystal large-area light-emitting device (LED) coupled to a silicon photomultiplier to generate random numbers. The random number generation methodology is based on photon arrival times and is robust against the non-idealities of the detector and of the source of quantum entropy. The raw data show high-quality randomness and pass all the statistical tests in the National Institute of Standards and Technology (NIST) suite without any post-processing algorithm. The highest bit rate is 0.5 Mbps, with an efficiency of 4 bits per detected photon.
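    The abstract does not detail the exact bit-extraction scheme, but a common arrival-time approach — and one way to obtain the claimed robustness to source and detector non-idealities — is to compare consecutive inter-arrival intervals, which cancels slow drifts in intensity and efficiency. The sketch below is an illustration of that generic technique, not the authors' published method.

    ```python
    # Illustrative arrival-time bit extraction (generic technique, hypothetical
    # with respect to this paper): one bit per pair of inter-arrival intervals.

    def bits_from_arrival_times(timestamps):
        """Emit 1 if the first interval of a pair is longer than the second,
        0 if shorter; equal pairs are discarded (von Neumann-style)."""
        intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
        out = []
        for t1, t2 in zip(intervals[::2], intervals[1::2]):
            if t1 != t2:  # discard ties instead of emitting a biased bit
                out.append(1 if t1 > t2 else 0)
        return out
    ```

    Because only the ordering of adjacent intervals matters, a slowly varying photon rate biases both intervals of a pair equally and the extracted bits stay balanced.
    
    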

    Long-Term Survival and Predictors of Failure of Opening Wedge High Tibial Osteotomy

    Objective: High tibial valgus osteotomy (HTO) is a widely accepted procedure indicated for the varus knee with symptomatic osteoarthritis of the medial compartment. However, there is a lack of studies evaluating the long-term results of this procedure. The primary aim of this study was to evaluate the long-term survival of opening wedge HTO for isolated osteoarthritis in the medial compartment of the knee. The secondary objective was to identify independent predictors of conversion to total knee arthroplasty (TKA). Methods: This is a long-term retrospective study of 296 opening wedge HTOs performed at a single center (level of evidence IV) between January 2005 and August 2015. Opening wedge medial HTO was always performed after diagnostic arthroscopy. Eighty-three percent of the population (233 patients, 247 procedures) was followed up at a mean of 11.6 years (range 6-17) by telephone interview to evaluate possible conversion to TKA. Mean age at the index operation was 42.8 years (range 15-70), and most patients were male (70%). Associated procedures (e.g., platelet-rich plasma supplementation, microfractures, meniscectomy) were carried out at the time of the HTO in 80 (32%) cases. Survival of HTO and its association with age, sex, body mass index, smoking habit, preoperative severity of varus deformity, cartilage status at surgery, and associated procedures were evaluated. Kaplan-Meier and Cox regression analyses were performed. Results: Thirty-three of the 247 HTOs (13.4%) were converted to knee replacement, with 86.6% of the original procedures surviving at a mean 12-year follow-up. Kaplan-Meier survival estimates at 17 years for HTO were 75.5% (95% confidence interval [CI] 66.7-84.3). There was a significant difference (P < 0.001) in the 17-year survival rate between obese (55.5%; 95% CI 35.3-75.6) and non-obese (79.7%; 95% CI 70.1-89.2) patients. The determinants of conversion to knee arthroplasty detected at multivariate Cox regression analysis were body mass index, severity of cartilage degeneration in the medial compartment (Outerbridge grade), and age. Conclusion: The long-term survival of opening wedge HTO for osteoarthritis in the medial compartment of the knee is satisfactory. The risk of conversion to TKA is significantly increased in obese patients. Advanced age and severity of pre-existing cartilage damage may also contribute to the risk of conversion to TKA.
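    The Kaplan-Meier estimate used above can be sketched as the standard product-limit calculation; the follow-up data below are invented for illustration, not the study's cohort.

    ```python
    # Minimal Kaplan-Meier product-limit estimator (no ties handled), where an
    # event is conversion to TKA and censoring means the HTO was still surviving.

    def kaplan_meier(times, events):
        """Return [(event_time, survival_probability)] pairs.
        events: 1 = converted to TKA, 0 = censored at that time."""
        data = sorted(zip(times, events))
        n_at_risk = len(data)
        s = 1.0
        curve = []
        for t, e in data:
            if e:  # at each event time, multiply by the fraction surviving
                s *= (n_at_risk - 1) / n_at_risk
                curve.append((t, s))
            n_at_risk -= 1  # censored subjects leave the risk set silently
        return curve

    # Hypothetical cohort: 5 knees, 3 conversions at years 2, 7, and 12
    curve = kaplan_meier([2, 5, 7, 9, 12], [1, 0, 1, 0, 1])
    ```

    A production analysis would use a dedicated library (e.g., lifelines) to handle tied event times and confidence intervals, and Cox regression for the multivariate predictors.
    
    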

    Estimating Overall and Cause-Specific Excess Mortality during the COVID-19 Pandemic: Methodological Approaches Compared

    During the COVID-19 pandemic, excess mortality has been reported worldwide, but its magnitude has varied depending on methodological differences that hinder between-study comparability. Our aim was to estimate the variability attributable to different methods, focusing on specific causes of death with different pre-pandemic trends. Monthly mortality figures observed in 2020 in the Veneto Region (Italy) were compared with those forecasted using: (1) the 2018–2019 monthly average number of deaths; (2) the 2015–2019 monthly average age-standardized mortality rates; (3) Seasonal Autoregressive Integrated Moving Average (SARIMA) models; (4) Generalized Estimating Equations (GEE) models. We analyzed deaths due to all causes, circulatory diseases, cancer, and neurologic/mental disorders. Excess all-cause mortality estimates in 2020 across the four approaches were: +17.2% (2018–2019 average number of deaths), +9.5% (five-year average age-standardized rates), +15.2% (SARIMA), and +15.7% (GEE). For circulatory diseases (strong pre-pandemic decreasing trend), estimates were +7.1%, −4.4%, +8.4%, and +7.2%, respectively. Cancer mortality showed no relevant variations (ranging from −1.6% to −0.1%), except with the simple comparison of age-standardized mortality rates (−5.5%). For neurologic/mental disorders (with a growing pre-pandemic trend), the estimated excess was +4.0%/+5.1% based on the first two approaches, while no major change could be detected with the SARIMA and GEE models (−1.3%/+0.3%). The magnitude of excess mortality varied largely with the method applied to forecast mortality figures. The comparison with average age-standardized mortality rates in the previous five years diverged from the other approaches due to the lack of control over pre-existing trends. Differences across the other methods were more limited, with GEE models probably representing the most versatile option.
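    The simplest of the four baselines, approach (1), can be sketched directly: expected 2020 deaths are the 2018–2019 monthly average, and excess is expressed as a percentage over that expectation. The figures in the example are invented for illustration, not the Veneto data.

    ```python
    # Sketch of excess-mortality approach (1): two-year monthly average baseline.

    def excess_pct(observed_2020, deaths_2018, deaths_2019):
        """Percent excess mortality of observed monthly deaths versus the
        2018-2019 monthly average, aggregated over the period."""
        expected = [(a + b) / 2 for a, b in zip(deaths_2018, deaths_2019)]
        return 100 * (sum(observed_2020) - sum(expected)) / sum(expected)

    # Toy three-month example with invented counts
    e = excess_pct([1200, 1100, 1300], [1000, 1000, 1000], [1100, 1050, 1050])
    ```

    As the abstract notes, this baseline ignores pre-existing trends: for a cause of death that was already declining (e.g., circulatory diseases), it understates the expected drop and therefore overstates the excess, which is what the SARIMA and GEE approaches correct for.
    
    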