
    Is there an effective therapy available for non-alcoholic fatty liver disease?

    Non-alcoholic fatty liver disease (NAFLD) is defined as fat accumulation in the liver, ranging from simple steatosis to non-alcoholic steatohepatitis (NASH). Although it used to be considered a benign condition, it is now known to be associated with liver injury and the development of end-stage liver disease. NAFLD is the hepatic manifestation of metabolic syndrome (MS), its incidence rising in step with the increasing prevalence of MS, and it is considered the most common cause of liver enzyme elevation in Western countries. To date, no medications or surgical procedures have been approved for the effective treatment of NAFLD, and all of the therapies tested so far must still be regarded as experimental. It is expected that, based on the large amount of data produced in recent years and the ongoing large multicenter clinical trials, effective treatment(s) for NASH will soon be defined. Meanwhile, lifestyle interventions and behavior therapy, the only treatments shown to be effective, must be introduced into daily clinical practice and, where possible, supported by public health programs.

    Classification Trees for Ordinal Responses in R: The rpartScore Package

    This paper introduces rpartScore (Galimberti, Soffritti, and Di Maso 2012), a new R package for building classification trees for ordinal responses, which can be employed whenever a set of scores is assigned to the ordered categories of the response. The package was created to overcome some problems that produced unexpected results in the package rpartOrdinal (Archer 2010); explanations for the causes of these unexpected results are provided. The main functionalities of rpartScore are described, and its use is illustrated through some examples.
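Purely as an illustrative sketch (rpartScore's actual interface is in R; everything below is an assumed Python analogue, not the package's API), the quadratic-cost version of a score-based splitting criterion can be mimicked by fitting a regression tree on the numeric scores assigned to the ordered categories: squared-error splits on the scores correspond to a generalized Gini impurity with quadratic misclassification costs.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

# Simulated ordinal response with 3 ordered categories, scored 1 < 2 < 3
latent = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200)
scores = np.digitize(latent, bins=[-0.5, 0.5]) + 1  # categories 1, 2, 3

# Squared-error splits on the numeric scores play the role of the
# generalized Gini criterion with quadratic misclassification costs.
tree = DecisionTreeRegressor(max_depth=3).fit(X, scores)

# Round predictions back to the nearest ordered category
pred = np.clip(np.rint(tree.predict(X)), 1, 3)
```

The rounding step is the simplifying assumption here: a tree grown this way predicts a mean score per leaf, which must be mapped back to a category, whereas rpartScore handles category assignment internally.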

    «Cor Iesu». The politico-religious dimension of the Jesuit cult of the Sacred Heart of Jesus between tradition and secularization (1689-1789)

    The cult of the Sacred Heart of Jesus, a subject already addressed in important studies (Mario Rosa, Daniele Menozzi, Fulvio De Giorgi), is undoubtedly one of the keys to understanding the formation of the mentality of a substantial part of the European population in the eighteenth century. The core of this work is a careful, wide-ranging analysis of the increasingly heated confrontation between the two poles around which the relationship between politics and religion played out in the eighteenth century: a "religion of the Heart", as a reaction to the processes of secularization in eighteenth-century European society, and a "politics of Reason", the demand of modern Enlightenment culture which, at the same time, pressed for doctrinal and disciplinary reform even from within the Church itself. This was a clash between two ways of living one's time that necessarily also shaped the civil life of the countries touched by the conflict and by the different inflections the clash imposed on the political discourse of the states. The devotion to the Sacred Heart of Jesus, besides representing an episode in the broader struggle waged by the Church, and by the order founded by Ignatius of Loyola, against modernity, is also of particular interest for analyzing the paradigm shift brought about by the outbreak of the French Revolution, and how and to what extent religious propaganda sought to restore a closed world using the very discursive elements employed by its adversaries.

    Estimates of cancer population attributable fractions for multiple risk factors from a network of Italian case-control studies

    Introduction. The attributable fraction (AF), proposed by Levin, quantifies the reduction in disease prevalence that could be achieved by eliminating the exposure (or risk factor) of interest from the population. Disease etiology involves multiple risk factors that may act simultaneously in the occurrence of disease, and finding the optimal approach to quantify the individual and joint effects of different risk factors on the disease burden is one of the goals of epidemiological research. Adjusted AFs quantify the effect of one risk factor after controlling for other factors (i.e., risk factors that may act together to cause disease, adjustment variables, or confounders). Adjusted AFs may add up to more than the joint AF (i.e., the AF for eliminating all risk factors from the population) and in some situations may add up to more than 1, leading to the conclusion that adjusted AFs should not be used for the purpose of partitioning the joint effect into individual contributions. Eide and Gefeller proposed a way to accomplish this task. Sequential AFs quantify the additional effect of one risk factor on the disease risk after the preceding risk factors have already been removed, in a specified order, from the population. However, sequential AFs depend on the order in which risk factors are removed. Average AFs overcome this shortcoming by averaging the sequential AFs of a risk factor over all orders in which the risk factors can be removed from the population: an average AF thus quantifies the additional effect of a risk factor after the preceding factors, selected in random order, have already been removed. Objective. This work aims to illustrate the main methodologies for estimating AFs and the corresponding confidence intervals in the presence of multiple risk factors, with a focus on the case-control study design. Moreover, we provide AF estimates for the major risk factors using Italian case-control data on oral cavity and breast cancers.
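Levin's formula, referenced above, can be written as follows for a single dichotomous exposure (standard textbook notation, not taken from this abstract):

```latex
% p  = prevalence of the exposure in the population
% RR = relative risk of disease for exposed vs. unexposed
\[
\mathrm{AF} \;=\; \frac{p\,(\mathrm{RR}-1)}{1 + p\,(\mathrm{RR}-1)}
\]
```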
Modification of the case-control study design. In the original formulation, sequential and average AFs could not be used in the case-control design, since the ratio of controls to cases in the sample is fixed a priori and the resulting AF estimates would be biased. Ferguson et al. proposed a prevalence-based weighting approach to correct the imbalance between controls and cases; the method consists in weighting the likelihood function of the model used to estimate sequential and average AFs by the disease prevalence. Variance estimation. The main approaches for estimating AF confidence intervals (CIs) are based on asymptotic approximation (the delta method) and on simulation (the Monte Carlo method). Ferguson et al. proposed a Monte Carlo method for constructing the average AF variance, together with the "averisk" R package for calculating average AFs and the corresponding CIs in both prospective and case-control studies. In this work, we propose a modification of Ferguson's method that accounts for the contribution of sequential AF variability to the total variability. Variance comparison. We compared our method with Ferguson's for estimating the average AF variance using simulated data. We generated two classes of simulated datasets, each including four scenarios with different correlation structures, from independence (scenario 1) to strong correlation among risk factors (scenario 4). The two classes differed in the prevalence and strength of the association between risk factors: the first class had a high prevalence and modest relative risks, whereas the second had a low prevalence and very large relative risks. For both classes, the standard deviation increment (i.e., the relative difference between our method and Ferguson's) grew gradually as the number of independent risk factors increased (from two to ten). Conversely, the standard deviation increment decreased as the number of correlated risk factors increased.
Although in some situations (i.e., for correlated risk factors) the contribution of our method could have a substantial relative impact on the total AF variability (up to 88%), the absolute standard deviation differences between the two methods were very small (less than 0.15), indicating a limited contribution of our method compared with Ferguson's. Application to real data. We estimated average AFs using a case-control study conducted in Italy on 946 oral cavity cancer cases and 2492 controls. The risk factors considered for AF estimation were smoking, alcohol drinking, red meat intake, vegetable intake, fruit intake, and family history of oral cavity cancer. The final model also included terms for sex, age, study centre, years of education, BMI, and non-alcohol energy intake to account for possible confounding effects. We set the prevalence of oral cavity cancer according to statistics from the consortium of Italian cancer registries (AIRTUM) to adjust the average AFs for the case-control data structure. Eighty-eight percent (95% CI: 78%; 98%) of oral cavity cases were attributable to the considered risk factors. In particular, the average AF for smoking was 0.34 (95% CI: 0.27; 0.41), indicating that 34% of oral cavity cases would not have occurred if smoking had been removed from the population, averaging over all possible risk factor removal orders. For the remaining risk factors, the average AFs were 0.27 (95% CI: 0.17; 0.37) for alcohol drinking, 0.11 (95% CI: 0.06; 0.17) for low vegetable intake, 0.08 (95% CI: 0.02; 0.15) for low fruit intake, 0.06 (95% CI: 0.01; 0.12) for high red meat intake, and 0.009 (95% CI: -0.001; 0.02) for family history. We analyzed a further case-control study on 2569 breast cancer cases and 2588 controls, again setting a prevalence of breast cancer to adjust the average AFs for the case-control data structure.
The final model included alcohol drinking, parity, breastfeeding, use of oral contraceptives (OCs), and family history of breast cancer as risk factors, and study centre, age, years of education, smoking, age at menarche, and use of hormone replacement therapy (HRT) as adjustment factors. The joint AF was 0.49 (95% CI: 0.35; 0.63), indicating that approximately half of breast cancer cases would not have occurred if all risk factors had been simultaneously eliminated from the population. In particular, the average AFs were 0.27 (95% CI: 0.16; 0.39) for parity, 0.12 (95% CI: 0.06; 0.18) for alcohol drinking, 0.04 (95% CI: -0.02; 0.10) for breastfeeding (none or <4 months), 0.04 (95% CI: 0.03; 0.06) for family history of breast cancer, and 0.01 (95% CI: -0.01; 0.03) for OC use. Conclusions. Sequential and average AFs are useful tools for apportioning exposure-specific contributions in a population exposed to multiple risk factors. They share several mathematical properties, such as component-additivity, symmetry, marginal rationality, and internal marginal rationality. Average AFs, however, do not represent the actual amount of disease ascribable to each risk factor, because they assume that risk factors are removed from the population in random order. Nevertheless, average AFs can be useful parameters for estimating the average burden of disease for each risk factor across all possible removal orders. In this work, we proposed an alternative approach to estimating the average AF confidence interval that accounts for the contribution of sequential AF variability to the total AF variability, and we compared the performance of our method with Ferguson's for estimating the AF variance. Although our method can have a relative impact on the total AF variability, the absolute standard deviation differences suggest that its contribution is limited. However, this topic deserves further analysis.
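The order-averaging idea described above can be made concrete with a toy example (the factor names and AF values below are invented for illustration, not taken from the study): the average AF of a factor is its sequential AF averaged over all removal orders, which by construction makes the average AFs component-additive, i.e. they sum to the joint AF.

```python
from itertools import permutations

# Toy joint AFs: af[S] = fraction of cases avoided when the set S of
# risk factors is eliminated (values made up for illustration).
af = {
    frozenset(): 0.0,
    frozenset({"smoking"}): 0.30,
    frozenset({"alcohol"}): 0.20,
    frozenset({"smoking", "alcohol"}): 0.40,  # sub-additive: effects overlap
}
factors = ["smoking", "alcohol"]

def average_afs(factors, af):
    """Average each factor's sequential AF over all removal orders."""
    totals = {f: 0.0 for f in factors}
    orders = list(permutations(factors))
    for order in orders:
        removed = frozenset()
        for f in order:
            # sequential AF: extra cases avoided by also removing f
            totals[f] += af[removed | {f}] - af[removed]
            removed = removed | {f}
    return {f: t / len(orders) for f, t in totals.items()}

avg = average_afs(factors, af)
# Component-additivity: avg["smoking"] + avg["alcohol"] equals the joint AF 0.40
```

With these toy numbers, smoking gets 0.25 and alcohol 0.15: each factor's credit is split fairly over the orders in which it could be removed, exactly the Eide-Gefeller decomposition described in the abstract.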

    Chapter Longitudinal profile of a set of biomarkers in predicting Covid-19 mortality using joint models

    In survival analysis, time-varying covariates are endogenous when their measurements are directly related to the event status and incomplete information occurs at random points during the follow-up. Consequently, the time-dependent Cox model leads to biased estimates. Joint models (JM) make it possible to estimate these associations correctly by combining a survival and a longitudinal sub-model through a shared parameter (i.e., the random effects of the longitudinal sub-model enter the survival one). This study aims to show the use of JM to evaluate the association between a set of inflammatory biomarkers and COVID-19 mortality. During the COVID-19 pandemic, physicians at the Istituto Clinico Città Studi in Milan collected biomarkers (endogenous time-varying covariates) to understand which might be used as prognostic factors for mortality. Furthermore, in the first epidemic outbreak physicians did not have standard clinical protocols for the management of COVID-19, and biomarker measurements were highly incomplete, especially at baseline. Between February and March 2020, a total of 403 COVID-19 patients were admitted. Baseline characteristics included sex and age, whereas the biomarkers measured during the hospital stay included log-ferritin, log-lymphocytes, log-neutrophil granulocytes, log-C-reactive protein, glucose, and LDH. A Bayesian approach using a Markov chain Monte Carlo algorithm was used to fit the JM, with independent and non-informative priors for the fixed effects (age and sex) and for the shared parameters. The hazard ratios (HR) for log-ferritin levels were 2.10 (1.67-2.64) under the (biased) time-dependent Cox model and 1.73 (1.38-2.20) under the joint model. In the multivariable JM, a doubling of biomarker levels resulted in a significant increase in mortality risk for log-neutrophil granulocytes, HR=1.78 (1.16-2.69); log-C-reactive protein, HR=1.44 (1.13-1.83); and LDH, HR=1.28 (1.09-1.49). An increase of 100 mg/dl in glucose resulted in an HR of 2.44 (1.28-4.26). Age, however, showed the strongest effect, with mortality risk starting to rise from 60 years of age.
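The shared-parameter joint model described above is commonly written as follows; the notation is the generic textbook formulation, assumed here, not taken from the study:

```latex
% Longitudinal sub-model: observed biomarker = subject-specific trajectory + error,
% with fixed effects beta and random effects b_i.
\[
y_i(t) = m_i(t) + \varepsilon_i(t), \qquad
m_i(t) = x_i(t)^\top \beta + z_i(t)^\top b_i
\]
% Survival sub-model: the hazard depends on the current (true) biomarker value
% m_i(t), so the random effects b_i are shared between the two sub-models.
\[
h_i(t) = h_0(t)\,\exp\!\left\{ \gamma^\top w_i + \alpha\, m_i(t) \right\}
\]
```

The association parameter \(\alpha\) is what the reported hazard ratios summarize: exponentiating \(\alpha\) times a change in the (log-scale) biomarker gives the corresponding HR.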

    Arc-Jet Testing of Ultra-High-Temperature-Ceramics

    The article deals with arc-jet experiments on different Ultra High Temperature Ceramics (UHTC) models in a high-enthalpy hypersonic non-equilibrium flow. Typical geometries of interest for the nose tips or wing leading edges of hypersonic vehicles, such as a rounded wedge, a hemisphere, and a cone, are considered. Temperature measurements were performed using pyrometers, an IR thermocamera, and thermocouples, and the spectral emissivity was evaluated by suitable experimental techniques. The details of the experimental set-up, the test procedures, and the measurements are discussed in the text. The UHTC materials were tested for several minutes at temperatures up to 2050 K, showing good resistance under extreme conditions. Fundamental differences between the various model shapes are analysed and discussed. Numerical-experimental correlations were carried out with a CFD code, resulting in good agreement under proper modelling. The numerical rebuilding also made it possible to evaluate the catalytic efficiency and the emissivity of the materials at different temperatures.

    Arc-Jet Testing on HfB2-TaSi2 Models: Effect of the Geometry on the Aerothermal Behaviour

    Arc-jet experiments in a high-enthalpy hypersonic (Mach 3) non-equilibrium flow were carried out on an HfB2 composite with the addition of 15 vol% TaSi2, at temperatures exceeding 2000 K. The aerothermal behaviour was tested on models with two different geometries, i.e. hemispheric and cone-shaped. The surface temperature and emissivity of the material were evaluated during the tests. Numerical computations of the nozzle flow were carried out in order to identify the flow conditions around the models and to analyse the details of the thermal heating. The chemical-physical modifications were analysed after the exposures: the surface emissivity changed from 0.85 to 0.5 due to surface oxidation. The maximum temperatures reached on the tip were strongly dependent on the sample geometry, being around 2300 K for the hemisphere and 2800 K for the cone. Post-test SEM analyses confirmed the excellent stability of this HfB2-based material.

    An example of innovative university teaching: the model of Constructive and Collaborative Professional Participation

    [EN] This contribution presents a blended course model called Constructive and Collaborative Professional Participation (CCPP), developed since 2005. We describe the theories of reference, the course structure, the activities performed, and the methods adopted. Starting from a socio-constructivist framework, both online and offline individual and group activities were organized, together with Role Taking, "expert" and "Jigsaw" groups inspired by the Aronson method, and web-forum and in-presence discussions aimed at building various products. The model has been implemented in university courses on the Psychology of e-learning and involves companies from the field to professionalize the activities. Academic and business tutors were purposely trained to support student participation. Following the Design-Based Research methodology, various kinds of data were collected at the end of each edition: questionnaires, interviews, and focus groups with the students, and feedback from the tutors and the companies involved. The course trained students in skills related to the syllabus, together with communication, organizational, and self-assessment skills. Our results also showed how it was possible to develop identity positioning, in particular the transition from student positions towards professional positioning.
    Di Maso, R.; Ligorio, M. B. (2019). An example of innovative university teaching: the model of Constructive and Collaborative Professional Participation. In HEAD'19. 5th International Conference on Higher Education Advances. Editorial Universitat Politècnica de València. 415-421. https://doi.org/10.4995/HEAD19.2019.9293

    Reconstruction of material losses by perimeter penalization and phase-field methods

    We treat the inverse problem of determining material losses, such as cavities, in a conducting body by performing electrostatic measurements at the boundary. We develop a numerical approach, based on variational methods, to reconstruct the unknown material loss from a single boundary measurement of current and voltage type. The method is based on the use of phase-field functions to model the material losses and on a perimeter-like penalization to regularize the otherwise ill-posed problem. We justify the proposed approach by a convergence result as the error on the measurement goes to zero.
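A typical phase-field formulation with perimeter penalization has the following shape; this is an illustrative sketch with assumed notation (a phase field \(v\), its associated potential \(u_v\), measured boundary data \(g\)), not the paper's exact functional:

```latex
% v : Omega -> [0,1] marks the material loss; the Modica-Mortola term
% approximates a perimeter penalty on the loss boundary as eps -> 0.
\[
\min_{v}\;
\int_{\partial\Omega} \lvert u_v - g \rvert^2 \,\mathrm{d}\sigma
\;+\;
\lambda \int_{\Omega}
\Bigl( \varepsilon\,\lvert \nabla v \rvert^{2}
+ \tfrac{1}{\varepsilon}\, v^{2}(1-v)^{2} \Bigr)\,\mathrm{d}x
\]
```

The first term enforces fidelity to the single current-and-voltage boundary measurement; the second regularizes the otherwise ill-posed reconstruction by penalizing (an approximation of) the perimeter of the reconstructed loss.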

    Upper limb work-related musculoskeletal disorders in operating room nurses: A multicenter cross-sectional study

    This study aimed to evaluate the association between personal and job characteristics and the risk of upper limb work-related musculoskeletal disorders (WMSDs) among operating room nurses (ORNs). To this end, we collected data from 148 ORNs working at 8 Italian hospitals and measured any upper limb disabilities experienced in the previous year using the Italian version of the disabilities of the arm, shoulder and hand (DASH) questionnaire. The associations between personal and job characteristics and the risk of upper limb WMSDs were estimated by unconditional logistic regression models. The prevalence of upper limb WMSDs was 45.9%. Multivariate analysis showed "female gender" and "monthly hours spent working as a scrub nurse" to be directly associated with a higher DASH score (adjusted OR for gender = 5.37, 95% CI: 1.65-17.51, p < 0.01; adjusted OR for monthly hours as a scrub nurse = 3.09, 95% CI: 1.33-7.19, p < 0.01). Overall, our findings indicate that a full-time job (>120 h/month) as a scrub nurse significantly increases the risk of developing upper limb WMSDs among female ORNs. Thus, to reduce this risk in a particularly sensitive population, we recommend the urgent implementation of ergonomic interventions on surgical equipment, alongside job rotation and medical surveillance programs.
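As a hedged sketch of the analysis technique named above (unconditional logistic regression yielding adjusted odds ratios), the following fits a logistic model by Newton-Raphson on synthetic data; the variable names and coefficient values are assumptions for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
female = rng.integers(0, 2, n)            # binary exposure of interest
hours = rng.normal(size=n)                # standardized monthly scrub-nurse hours
X = np.column_stack([np.ones(n), female, hours])

# Assumed true coefficients used only to simulate outcomes
true_beta = np.array([-0.5, 1.2, 0.8])
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)                    # 1 = upper limb WMSD, 0 = none

# Newton-Raphson for the logistic log-likelihood
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))      # fitted probabilities
    grad = X.T @ (y - mu)                 # score vector
    hess = X.T @ (X * (mu * (1 - mu))[:, None])  # observed information
    beta += np.linalg.solve(hess, grad)

# Adjusted odds ratios: each OR is for one covariate holding the other fixed
odds_ratios = np.exp(beta[1:])
```

Exponentiating each fitted coefficient gives the covariate's adjusted OR, the same quantity the abstract reports (e.g. OR = 5.37 for gender); with both simulated effects positive, both estimated ORs come out above 1.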