
    Numerical approximation of poroelasticity with random coefficients using Polynomial Chaos and Hybrid High-Order methods

    In this work, we consider the Biot problem with uncertain poroelastic coefficients. The uncertainty is modelled using a finite set of parameters with prescribed probability distribution. We present the variational formulation of the stochastic partial differential system and establish its well-posedness. We then discuss the approximation of the parameter-dependent problem by non-intrusive techniques based on Polynomial Chaos decompositions. We specifically focus on sparse spectral projection methods, which essentially amount to performing an ensemble of deterministic model simulations to estimate the expansion coefficients. The deterministic solver is based on a Hybrid High-Order discretization supporting general polyhedral meshes and arbitrary approximation orders. We numerically investigate the convergence of the probability error of the Polynomial Chaos approximation with respect to the level of the sparse grid. Finally, we assess the propagation of the input uncertainty onto the solution considering an injection-extraction problem. Comment: 30 pages, 15 figures
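Non-intrusive spectral projection estimates each Polynomial Chaos coefficient by a quadrature over the parameter space. A minimal one-parameter sketch, assuming a single standard-normal parameter and probabilists' Hermite polynomials; the `solver` function is a hypothetical stand-in for the deterministic Biot solver:

```python
import math

import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def solver(xi):
    # Hypothetical scalar quantity of interest; a real run would call
    # the deterministic (e.g. HHO) solver for parameter value xi.
    return np.exp(0.3 * xi)

P = 5                                   # Polynomial Chaos order
nodes, weights = hermegauss(16)         # probabilists' Gauss-Hermite rule
weights = weights / np.sqrt(2 * np.pi)  # normalise to the N(0,1) measure

# Spectral projection: c_k = E[u(xi) He_k(xi)] / E[He_k^2],
# with E[He_k^2] = k! for probabilists' Hermite polynomials.
coeffs = []
for k in range(P + 1):
    ek = np.zeros(k + 1)
    ek[k] = 1.0                         # selects He_k in hermeval
    num = np.sum(weights * solver(nodes) * hermeval(nodes, ek))
    coeffs.append(num / math.factorial(k))

# coeffs[0] is the PC mean; the exact mean of exp(0.3*xi) is exp(0.045).
print(round(coeffs[0], 4))
```

In higher parameter dimensions, the tensorized rule above is replaced by a sparse (Smolyak-type) grid, which is what the abstract's "level of the sparse grid" refers to.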

    Statistical Reliability Estimation of Microprocessor-Based Systems

    What is the probability that the execution state of a given microprocessor running a given application is correct, in a certain working environment with a given soft-error rate? Trying to answer this question using fault injection can be very expensive and time consuming. This paper proposes the baseline for a new methodology, based on microprocessor error-probability profiling, that aims at estimating fault injection results without the need for a typical fault injection setup. The proposed methodology is based on two main ideas: a one-time fault-injection analysis of the microprocessor architecture to characterize the probability of successful execution of each of its instructions in the presence of a soft error, and a static and very fast analysis of the control and data flow of the target software application to compute its probability of success. The presented work goes beyond the dependability evaluation problem; it also has the potential to become the backbone for new tools able to help engineers choose the best hardware and software architecture to structurally maximize the probability of a correct execution of the target software.
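The composition step can be sketched very simply: given per-instruction success probabilities from the one-time profiling phase, and assuming independent faults, a straight-line code sequence succeeds with the product of the per-instruction terms. All numbers and instruction names below are illustrative, not taken from the paper:

```python
# Hypothetical per-instruction probabilities of correct execution
# under a given soft-error rate (illustrative values only).
p_success = {"add": 0.9999, "ld": 0.9995, "st": 0.9996, "beq": 0.9990}

# Static instruction trace of the target application (illustrative).
trace = ["ld", "add", "add", "st", "beq"]

# Assuming independent faults, the whole sequence executes correctly
# with probability equal to the product of the per-instruction terms.
prob = 1.0
for instr in trace:
    prob *= p_success[instr]

print(round(prob, 6))
```

A real analysis would additionally weight branches and loops by their execution counts from the control-flow analysis.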

    Morphology of Lipari offshore (Southern Tyrrhenian Sea)

    High-resolution multibeam bathymetry was recently collected around Lipari, the largest and most densely populated island of the Aeolian Archipelago (Southern Tyrrhenian Sea). The data were acquired within the context of marine geological studies performed in the area over the last 10 years. We present the first detailed morphological map of the Lipari offshore at 1:100,000 scale (Main Map). A rugged morphology characterizes the submarine portions of Lipari volcano, reflecting both volcanic and erosive-depositional processes. The volcanic features include cones, lava flows and bedrock outcrops. Erosive-depositional features include an insular shelf topped by submarine depositional terraces related to Late Quaternary sea-level fluctuations, as well as landslide scars, channelized features, fan-shaped deposits and wavy bedforms. The different distribution of volcanic and erosive-depositional features on the various sectors of Lipari is mainly related to the older age of the western flank with respect to the eastern one. The map also provides insights for a first marine geohazard assessment of this active volcanic area.

    GPU cards as a low cost solution for efficient and fast classification of high dimensional gene expression datasets

    The days when bioinformatics tools will be reliable enough to become a standard aid in routine clinical diagnostics are getting very close. However, it is important to remember that the more complex and advanced bioinformatics tools become, the more performance is required of the computing platforms. Unfortunately, the cost of High Performance Computing (HPC) platforms is still prohibitive for both public and private medical practices. Therefore, to promote and facilitate the use of bioinformatics tools, it is important to identify low-cost parallel computing solutions. This paper presents a successful experience in using the parallel processing capabilities of Graphics Processing Units (GPUs) to speed up the classification of gene expression profiles. Results show that using open-source CUDA programming libraries yields a significant increase in performance, thereby shortening the gap between advanced bioinformatics tools and real medical practice.
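The abstract does not name the classifier, so as a minimal sketch, here is a nearest-centroid classifier on hypothetical gene-expression data; its dense array arithmetic is exactly the kind of workload that maps well onto a GPU (on a CUDA card, `import cupy as np` is a near drop-in replacement for NumPy, assuming CuPy is installed):

```python
import numpy as np
# On a CUDA GPU, `import cupy as np` would run the same array code
# on the card (assumption: CuPy installed; data sizes are toy values).

rng = np.random.default_rng(0)

# Toy gene-expression matrix: 6 training samples x 1000 genes, 2 classes.
X_train = rng.normal(size=(6, 1000))
y_train = np.array([0, 0, 0, 1, 1, 1])
X_train[y_train == 1] += 0.5          # shift class 1 so the classes differ

# Nearest-centroid classification: one mean expression profile per class.
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

# Classify test samples by distance to the closest centroid.
X_test = rng.normal(size=(4, 1000))
X_test[2:] += 0.5                     # last two samples drawn from class 1
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
print(pred)
```

The distance computation is a single broadcasted tensor operation, which is why GPU speedups are large for high-dimensional profiles.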

    CleAir monitoring system for particulate matter. A case in the Napoleonic Museum in Rome

    Monitoring the airborne particulate concentration, both outdoors and indoors, has become an increasingly relevant issue over the past few decades. An innovative, fully automatic monitoring system called CleAir is presented. The system goes beyond the traditional technique (gravimetric analysis) by allowing a double monitoring approach: traditional gravimetric analysis as well as optical spectroscopic analysis of the scattering on the same filters in steady-state conditions. The experimental data are interpreted in terms of light percolation through highly scattering matter by means of a stretched-exponential evolution. CleAir has been applied to investigate the daily distribution of particulate matter within the Napoleonic Museum in Rome as a test case.
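The stretched exponential mentioned above is the Kohlrausch form I(t) = I0 exp(-(t/tau)^beta). A minimal sketch of recovering its parameters from noise-free synthetic data via the standard log-log linearization (all parameter values are illustrative, not the paper's):

```python
import numpy as np

def stretched_exp(t, i0, tau, beta):
    # Kohlrausch stretched exponential: I(t) = I0 * exp(-(t/tau)**beta)
    return i0 * np.exp(-(t / tau) ** beta)

# Synthetic noise-free decay (illustrative parameters: tau=2.5, beta=0.7).
t = np.linspace(0.5, 10, 40)
y = stretched_exp(t, 1.0, 2.5, 0.7)

# Linearization: ln(-ln(I/I0)) = beta*ln(t) - beta*ln(tau),
# so a straight-line fit recovers beta (slope) and then tau.
slope, intercept = np.polyfit(np.log(t), np.log(-np.log(y / 1.0)), 1)
beta = slope
tau = np.exp(-intercept / beta)
print(round(beta, 3), round(tau, 3))
```

With noisy filter data, a nonlinear least-squares fit of the original form is the more robust choice; the linearization mainly serves as an initial guess.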

    The Biagi reform of the labour market. First interpretations and readings of Legislative Decree 10 September 2003, No. 276. Transitional law and the timing of the reform

    With Legislative Decree No. 276/2003, implementing the Biagi Law, a decisive phase opened for the reform of the Italian labour market. The regulatory framework governing employment relationships is changing, in pursuit of the Lisbon objective of substantially increasing regular employment rates and the opportunities for good-quality work. It is changing with extraordinary, truly unprecedented speed: a few months of intense work and 86 articles were enough to implement the overall modernisation of Italian labour law outlined in the Biagi Law. The reform is not meant to be the final step of the modernisation project set out in the White Paper of October 2001 and confirmed in the Pact for Italy of July 2002; rather, it is the starting point, indispensable but not sufficient in itself, of the complex and delicate process of redefining and rationalising the rules governing the Italian labour market. The key words for reading the decree are employability, adaptability and equal opportunities: modern, European words, imported into the Italian legal system above all thanks to the thought and project work of Marco Biagi.
    These words translate, throughout the decree, into an efficient system of public and private employment services, authorised and accredited, which, networked through the national continuous labour exchange (borsa continua nazionale del lavoro), accompany and facilitate the matching of those seeking work with those seeking workers; into forms of flexibility regulated and negotiated with the trade unions, as alternatives to precarious and undeclared work, balancing firms' need to compete on international markets with the essential demands of protecting and valuing the worker as a person; into measures for continuing training and the relaunch of apprenticeship as a training channel towards the market and quality work; and into active policy measures for those groups of workers who today face the greatest difficulties in accessing regular, good-quality work or in reconciling working time and personal life. Through its many contributions, the commentary aims to provide an interdisciplinary reading of the reform.

    The reform of job placement and the new employment services. Commentary on Legislative Decree 19 December 2002, No. 297, and prospects for the implementation of Article 1 of Law 14 February 2003, No. 30

    The volume contains contributions by: M. Biagi, M. Bassi, M. Colucci, L. Corazza, L. Degan, L. Fantini, D. Gilli, R. Giorgetti, R. Leoni, C. Magri, G. Razzoli, C. Riviello, S. Rosato, A. Salvoni, D. Savastano, S. Scagliarini, A. Scialdone, P. Sestito, S. Spattini, P. Spinelli, G. Steurs, L. Struyven, M. Tiraboschi and S. Trapani.

    HBsAg clearance by Peg-interferon addition to a long-term nucleos(t)ide analogue therapy.

    The ideal endpoint of hepatitis B virus (HBV) antiviral therapy is HBsAg loss, a goal that is difficult to achieve, especially in HBeAg-negative patients. Herein, we report the results obtained by adding peg-interferon α-2a to a long-lasting nucleos(t)ide analogue therapy in an HBeAg-negative, genotype D patient with steadily HBV-DNA-negative/HBsAg-positive values. In 2002, our Caucasian 44-year-old male patient received lamivudine and, 4 years later, added adefovir because of a virological breakthrough. In 2011, considering his young age, liver stiffness (4.3 kPa) and HBsAg levels (3533 IU/mL), we added Peg-interferon α-2a for six months (3 mo in combination with nucleos(t)ide analogues followed by 3 mo of Peg-interferon α-2a monotherapy). A decrease of HBsAg levels was observed after 1 mo of Peg-interferon (1.21 log) and 3 mo after the discontinuation of all drugs (1.88 log). Later, a complete clearance of HBsAg was obtained, with steadily undetectable HBV-DNA serum levels (< 9 IU/mL). HBsAg clearance by the addition of a short course of Peg-interferon α-2a represents an important result with clinical and pharmaco-economic implications, considering that nucleos(t)ide analogue therapy in HBeAg-negative chronic hepatitis B patients is considered a long-lasting/life-long treatment.
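As a quick check of the figures above: a drop of x "log" means division by 10^x, so the reported log-drops convert from the 3533 IU/mL baseline into absolute HBsAg levels as follows:

```python
baseline = 3533.0                    # HBsAg at Peg-IFN start, IU/mL (from the abstract)

# A drop of x "log" means division by 10**x.
after_1mo = baseline / 10 ** 1.21    # level after 1 mo of Peg-interferon
after_stop = baseline / 10 ** 1.88   # level 3 mo after stopping all drugs

print(round(after_1mo), round(after_stop))
```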

    Short communication: Relationships between milk quality and acidification in the production of table Mozzarella without starters.

    The effect of some quality parameters of the milk (refrigeration time, pH, protein, and fat/protein ratio) on the extent of acidification in the production of table Mozzarella without starters was investigated. A screening phase carried out at the laboratory level demonstrated that variations in the milk characteristics require different levels of acidification to keep the quality of the cheese constant. Analysis of the data collected during the subsequent industrial-scale experimentation allowed us to derive a mathematical model describing the relationship between the pH of the curd at stretching time and the milk characteristics, of which the protein concentration and the refrigeration time play the main roles.
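The abstract does not report the fitted model itself; a generic sketch of how such a relationship could be obtained by least squares, with entirely hypothetical data and coefficients:

```python
import numpy as np

# Hypothetical data: milk refrigeration time (h), protein (%), and the curd
# pH at stretching that kept cheese quality constant (illustrative numbers).
refrig_h = np.array([12.0, 24.0, 36.0, 48.0, 60.0, 72.0])
protein = np.array([3.2, 3.3, 3.1, 3.4, 3.2, 3.5])
ph_stretch = np.array([5.30, 5.25, 5.22, 5.15, 5.12, 5.05])

# Least-squares fit of pH = b0 + b1 * time + b2 * protein.
A = np.column_stack([np.ones_like(refrig_h), refrig_h, protein])
coef, *_ = np.linalg.lstsq(A, ph_stretch, rcond=None)
predicted = A @ coef

print(np.round(coef, 4))
```

In this illustrative dataset the fitted time coefficient is negative, mirroring the abstract's finding that longer refrigeration calls for a lower stretching pH.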

    A low-order nonconforming method for linear elasticity on general meshes

    In this work we construct a low-order nonconforming approximation method for linear elasticity problems supporting general meshes and valid in two and three space dimensions. The method is obtained by hacking the Hybrid High-Order method, which requires the use of polynomials of degree k ≥ 1 for stability. Specifically, we show that coercivity can be recovered for k = 0 by introducing a novel term that penalises the jumps of the displacement reconstruction across mesh faces. This term plays a key role in the fulfillment of a discrete Korn inequality on broken polynomial spaces, for which a novel proof valid for general polyhedral meshes is provided. Locking-free error estimates are derived for both the energy- and the L^2-norms of the error, which are shown to converge, for smooth solutions, as h and h^2, respectively (here, h denotes the meshsize). A thorough numerical validation on a complete panel of two- and three-dimensional test cases is provided. Comment: 26 pages, 6 tables, and 4 figures
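The abstract does not spell out the penalty term; schematically, a face jump penalisation of this kind (with r_h the displacement reconstruction, F an interior face of diameter h_F, and [[.]] the jump across F; the exact scaling and spaces are assumptions here, not the paper's statement) reads:

```latex
% Schematic jump-penalisation bilinear form (notation assumed):
j_h(u_h, v_h) \;=\; \sum_{F \in \mathcal{F}_h^{\mathrm{i}}}
\frac{1}{h_F} \int_F [\![ r_h u_h ]\!] \cdot [\![ r_h v_h ]\!] \,\mathrm{d}S
```

The 1/h_F weight is the standard scaling that makes such a term dimensionally consistent with the energy norm and is what allows coercivity, and hence a discrete Korn inequality, to survive at k = 0.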