
    Low-complexity dominance-based Sphere Decoder for MIMO Systems

    The sphere decoder (SD) is an attractive low-complexity alternative to maximum likelihood (ML) detection in a variety of communication systems. It is also employed in multiple-input multiple-output (MIMO) systems, where the computational complexity of the optimum detector grows exponentially with the number of transmit antennas. We propose an enhanced version of the SD based on an additional cost function derived from conditions on worst-case interference, which we call dominance conditions. The proposed detector, the king sphere decoder (KSD), has a computational complexity no larger than that of the standard SD, and numerical simulations show that the complexity reduction is usually quite significant.
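    To illustrate the general idea (this is a hedged sketch of a basic sphere decoder, not the authors' KSD), the following compares brute-force ML detection with a depth-first sphere decoder operating on an upper-triangular channel matrix, as obtained after QR preprocessing. The channel, alphabet, and initial radius are hypothetical toy values.

    ```python
    import itertools

    def ml_bruteforce(H, y, alphabet):
        # exhaustive maximum-likelihood search: argmin_x ||y - Hx||^2
        n = len(H[0])
        best_d, best_x = float("inf"), None
        for x in itertools.product(alphabet, repeat=n):
            r = [y[i] - sum(H[i][j] * x[j] for j in range(n)) for i in range(len(y))]
            d = sum(v * v for v in r)
            if d < best_d:
                best_d, best_x = d, x
        return best_x, best_d

    def sphere_decode(R, y, alphabet, radius):
        # depth-first tree search on an upper-triangular channel R (e.g. after QR),
        # pruning any branch whose partial distance already exceeds the sphere radius
        n = len(y)
        best = [radius * radius, None]
        def descend(level, fixed, d2):
            if d2 >= best[0]:
                return                               # prune: outside the current sphere
            if level < 0:
                best[0], best[1] = d2, tuple(fixed)  # full candidate: shrink the sphere
                return
            for s in alphabet:
                # residual at this level depends only on symbols already fixed
                r = y[level] - R[level][level] * s - sum(
                    R[level][j] * fixed[j - level - 1] for j in range(level + 1, n))
                descend(level - 1, [s] + fixed, d2 + r * r)
        descend(n - 1, [], 0.0)
        return best[1], best[0]

    # hypothetical toy example: BPSK symbols over a 2x2 upper-triangular channel
    R = [[2.0, 1.0], [0.0, 1.0]]
    y = [1.0, -1.0]                     # noiseless R @ (1, -1)
    x_ml, d_ml = ml_bruteforce(R, y, (-1, 1))
    x_sd, d_sd = sphere_decode(R, y, (-1, 1), 10.0)
    ```

    Both searches visit the same minimum, but the sphere decoder discards entire subtrees as soon as the accumulated partial distance exceeds the best distance found so far, which is where its complexity savings come from.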

    Impact probability computation of Near-Earth Objects using Monte Carlo Line Sampling and Subset Simulation

    This work introduces two Monte Carlo (MC)-based sampling methods, known as line sampling and subset simulation, to improve the performance of standard MC analyses in the context of asteroid impact risk assessment. Both techniques sample the initial uncertainty region in different ways, with the result of either providing a more accurate estimate of the impact probability or reducing the number of required samples during the simulation with respect to standard MC techniques. The two methods are first described and then applied to some test cases, providing evidence of the increased accuracy or the reduced computational burden with respect to a standard MC simulation. Finally, a sensitivity analysis is carried out to show how the parameter settings affect the accuracy of the results and the numerical efficiency of the two methods.
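    To illustrate the general principle (a hedged sketch, not the authors' implementation), the code below estimates a small exceedance probability for a toy one-dimensional standard-normal problem by subset simulation: the rare event is split into a chain of more probable intermediate events, each estimated conditionally via a modified-Metropolis sampler. All parameter values (sample size, level fraction `p0`, threshold) are illustrative assumptions.

    ```python
    import math
    import random

    def subset_simulation(g, sample_prior, rng, n=1000, p0=0.1,
                          threshold=3.0, max_levels=10):
        # estimate P(g(X) >= threshold) as a product of conditional probabilities
        xs = [sample_prior(rng) for _ in range(n)]
        prob = 1.0
        for _ in range(max_levels):
            xs.sort(key=g, reverse=True)
            k = int(p0 * n)
            level = g(xs[k - 1])          # p0-quantile as intermediate threshold
            if level >= threshold:
                # final stage: count failures at the true threshold
                fail = sum(1 for x in xs if g(x) >= threshold)
                return prob * fail / n
            prob *= p0
            seeds = xs[:k]                # samples already beyond the new level
            xs = []
            for s in seeds:
                x = s
                for _ in range(n // k):
                    cand = x + rng.gauss(0.0, 1.0)
                    # modified Metropolis for a standard-normal prior:
                    # accept by the prior ratio, then enforce the conditioning
                    if (rng.random() < min(1.0, math.exp((x * x - cand * cand) / 2.0))
                            and g(cand) >= level):
                        x = cand
                    xs.append(x)
        return prob

    # toy rare event: P(X >= 3) for X ~ N(0, 1), true value about 1.35e-3
    rng = random.Random(7)
    p_hat = subset_simulation(lambda x: x, lambda r: r.gauss(0.0, 1.0), rng)
    ```

    A standard MC estimate of a probability near 1e-3 with the same 1000 samples would see at most a handful of failure samples; subset simulation spends the same budget on a sequence of much more probable conditional events, which is the variance-reduction mechanism the abstract refers to.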

    Vulnerability and resilience to food and nutrition insecurity: A review of the literature towards a unified framework

    Current approaches to measuring food and nutrition security (FNS) mainly consider past access to food, while assessing vulnerability and resilience to food insecurity requires a dynamic setting and sound predictive models, conditional on the entire set of food-related multiple-scale shocks and stresses as well as on households’ characteristics. The aim of this work is twofold: i) to review the state of the relevant literature on the conceptualization and empirical measurement of vulnerability and resilience to food insecurity; ii) to frame the main coordinates of a possible unifying framework aimed at improving ex-ante targeting of policy interventions and resilience-enhancing programs. Our argument is that clarifying the relationships between vulnerability and resilience provides a better understanding and a more comprehensive picture of food insecurity, one that includes higher-order conditional moments and non-linearities. Furthermore, adopting the proposed unified framework, one can derive FNS measures that are: scalable and aggregable into higher-level dimensions (scale axiom); inherently dynamic (time axiom); conditioned on various factors (access axiom); and applicable to various measures of food and nutrition as dependent variables (outcomes axiom). The proposed unified framework nonetheless has some limitations. First, estimating conditional moments is highly data-demanding, requiring high-quality and high-frequency micro-level panel data for all the relevant FNS dimensions, not to mention the difficulty of measuring risks/shocks and their associated probabilities using short panel data. Hence, there is a general issue of applicability of the proposed approach to typically data-scarce environments such as developing contexts. Second, there is an inherent tradeoff between the proposed approach's in-sample precision and its out-of-sample predictive performance, and the latter is key to implementing effective early warning systems and fostering resilience-building programs.

    Cost-effectiveness of direct acting oral anticoagulants in the prevention of thromboembolic complications: limits and concerns of economic evaluations

    Economic evaluations have a widespread application in many areas of clinical research and play a key role in the clinical decision-making process. However, economic analyses have sometimes been used to produce new 'evidence' that has not been adequately tested in the target population. This is the case for data arising from a systematic review of clinical trials evaluating the use of direct acting oral anticoagulants for the prevention of stroke in patients with atrial fibrillation. Taking this example into account, here we discuss the concerns raised by the improper interpretation of the results. Our conclusions are three-fold. Data from economic analyses should not be turned into clinical recommendations. Simulation models should not be used to generate new 'evidence' that is not supported by experimental data and is therefore misleading. Clinical judgment is thus pivotal to interpreting results emerging from economic analyses.

    The Simple View of Reading in Children Acquiring a Regular Orthography (Italian): A Network Analysis Approach

    In the present study, we explored the unique contribution of reading accuracy, reading fluency and linguistic comprehension within the frame of the Simple View of Reading (SVR). The experimental sample included 118 3rd- to 5th-grade children learning Italian, a language with a highly regular orthography. We adopted a flexible method of analysis, Network Analysis (NA), which is particularly suited for exploring relations among different domains and for cases where the direct relations between a set of intercorrelated variables are the main interest. Results indicated an independent and unique contribution of syntactic comprehension skills, as well as reading fluency and reading accuracy, to the comprehension of a written text. The decoding measures were not directly associated with non-verbal reasoning, and the latter was not directly associated with reading comprehension but was strongly related to oral syntactic comprehension. Overall, the pattern of findings is broadly consistent with the predictions of the SVR and underscores how, in an orthographically regular language, reading fluency and reading accuracy as well as oral comprehension skills directly influence reading comprehension. Data are discussed in a cross-linguistic perspective. Implications for education and rehabilitation are also presented.
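    As a minimal illustration of what an edge in a network analysis represents (a sketch of the underlying statistic, not the authors' actual model), the snippet below computes a first-order partial correlation: the direct relation between two variables after controlling for a third. The variable names and correlation values are hypothetical.

    ```python
    import math

    def partial_corr(r_xy, r_xz, r_yz):
        # correlation between x and y after controlling for z --
        # the kind of "direct relation" that network-analysis edges represent
        return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))

    # hypothetical correlations: reading fluency vs. comprehension (x, y),
    # and each with non-verbal reasoning (z); illustrative numbers only
    direct = partial_corr(0.5, 0.3, 0.4)
    ```

    When the shared association with the controlled variable is removed, the remaining (partial) correlation is what survives as a direct edge in the network.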

    Survivin gene levels in the peripheral blood of patients with gastric cancer independently predict survival

    Background: The detection of circulating tumor cells (CTC) is considered a promising tool for improving risk stratification in patients with solid tumors. We investigated whether the expression of CTC-related genes adds any prognostic power to the TNM staging system in patients with gastric carcinoma. Methods: Seventy patients with TNM stage I to IV gastric carcinoma were retrospectively enrolled. Peripheral blood samples were tested by means of quantitative real-time PCR (qrtPCR) for the expression of four CTC-related genes: carcinoembryonic antigen (CEA), cytokeratin-19 (CK19), vascular endothelial growth factor (VEGF) and Survivin (BIRC5). Results: Gene expression of Survivin, CK19, CEA and VEGF was higher than in normal controls in 98.6%, 97.1%, 42.9% and 38.6% of cases, respectively, suggesting a potential diagnostic value of both Survivin and CK19. At multivariable survival analysis, TNM staging and Survivin mRNA levels were retained as independent prognostic factors, demonstrating that Survivin expression in the peripheral blood adds prognostic information to the TNM system. In contrast with previously published data, the transcript abundance of CEA, CK19 and VEGF was not associated with patients' clinical outcome. Conclusions: Gene expression levels of Survivin add significant prognostic value to the current TNM staging system. The validation of these findings in larger prospective and multicentric series might lead to the implementation of this biomarker in the routine clinical setting in order to optimize risk stratification and ultimately personalize the therapeutic management of these patients.

    The Wine: typicality or mere diversity? The effect of spontaneous fermentations and biotic factors on the characteristics of wine

    Wine is probably one of the main fermented beverages for which the recognition of “territoriality” is fundamental to its appreciation. The sensory profile of wine is significantly affected by microbial activities, and indigenous microorganisms may significantly contribute to the expression of wine typicality. The microbial ecology of wine is complex and includes several species and strains of yeasts, bacteria and molds. Several works have shown the positive effects of spontaneous fermentations on the quality of wine as a consequence of the growth of different species and/or strains together at high levels. Furthermore, a new style of “natural” winemaking is gaining importance, since the resulting wines are obtained thanks to the action of spontaneous autochthonous agents and the use of chemical additives is not allowed. In this context, natural winemaking could provide enhanced opportunities for products with unique characters that are popularly recognized as typical. The present work reports on the microbial ecology and molecular profiles characterizing natural large-scale vinifications, and also describes an innovative procedure, named “fortified pied de cuve”, to accelerate spontaneously performed alcoholic fermentation. Furthermore, this work reports on how biotic factors, such as migratory birds, contribute to disseminating wine-related yeasts over long distances, opening up new fields of research that will make it possible to unravel the connections between wine and environmental factors.

    Role of economic evaluations on pricing of medicines reimbursed by the Italian National Health Service

    Objective The main objective of this study was to explore the extent to which the incremental cost-effectiveness ratio (ICER), alongside other factors, predicts the final outcome of medicine price negotiation in Italy. The second objective was to estimate the mean ICER of medicines obtained after negotiation. Methods Data were extracted from company dossiers submitted to the Italian Medicines Agency (AIFA) from October 2016 to January 2021 and from AIFA’s internal database. Beta-based regression analyses were used to test the effect of the ICER and other variables on the outcome of price negotiation (ΔP), defined as the percentage difference between the list price requested by manufacturers and the final price paid by the Italian National Health Service (INHS). Results In our dataset of 48 pricing and reimbursement procedures, the ICER before negotiation was one of the variables with the greatest impact on the outcome of negotiation when it was ≥ €40,000/QALY. As shown by multiple regression analyses, the effect of the ICER on ΔP appeared to be driven by medicines for non-onco-immunological and non-rare diseases. Overall, the negotiation process granted mean incremental costs of €64,688 and mean incremental QALYs of 1.96, yielding an average ICER of €33,004/QALY. Conclusions This study provides evidence of the influence of cost-effectiveness analysis on price negotiation in the Italian context, and provides an estimate of the mean ICER of reimbursed medicines calculated using the net confidential prices charged by the INHS. The role and use of economic evaluations in medicine pricing should be further improved to get the best value for money.
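    As a quick arithmetic check on the figures reported in the abstract, the average ICER follows directly from the mean incremental costs and QALYs:

    ```python
    def icer(delta_cost, delta_qaly):
        # incremental cost-effectiveness ratio: extra cost per extra QALY gained
        return delta_cost / delta_qaly

    # mean values reported in the abstract
    mean_icer = icer(64688, 1.96)   # ~ 33004 EUR/QALY, matching the stated figure
    ```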