
    Extreme wind speed distribution in a mixed wind climate

    The meteorological services of mid-latitude countries record wind speeds averaged over 10 min or 1 h periods, together with the peak wind speed over the same averaging period or over a full day. Design wind speeds based on the statistical analysis of such data in a mixed wind climate may prove imprecise and unsafe, because intense, localized and short-lived extreme wind events such as thunderstorm outflows are poorly captured. Considering six years of continuous high-frequency records registered in two port areas of the Upper Tyrrhenian Sea, a preliminary but representative analysis of the extreme wind speed distribution has been carried out in a mixed wind climate area frequently struck by thunderstorms. Results show that wind speeds with a high return period are always related to thunderstorm outflows. The mixed extreme distribution asymptotically overlaps the thunderstorm distribution at high return periods and always provides the highest wind speeds. Gathering all extreme values into a single set leads to underestimation of the extreme wind speed. The Italian code provides conservative estimates of the extreme wind speed that also protect designers against thunderstorms. However, refined analyses of the local wind climate that ignore thunderstorm events may lead to severe underestimations of the design wind velocity.
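
    A minimal sketch of how such a mixed extreme value analysis can be set up, assuming Gumbel-distributed annual maxima for each wind phenomenon and statistical independence between the phenomena; the function names and all numbers below are illustrative assumptions, not the paper's method or data:

        import numpy as np

        EULER_GAMMA = 0.5772156649

        def gumbel_fit(maxima):
            # Method-of-moments fit of a Gumbel distribution to annual maxima.
            beta = np.std(maxima, ddof=1) * np.sqrt(6.0) / np.pi
            mu = np.mean(maxima) - EULER_GAMMA * beta
            return mu, beta

        def gumbel_cdf(v, mu, beta):
            return np.exp(-np.exp(-(v - mu) / beta))

        def mixed_design_speed(R, fits):
            # Wind speed whose mixed non-exceedance probability is 1 - 1/R.
            # Assuming independence, the mixed CDF is the product of the
            # component CDFs, one per phenomenon (synoptic, thunderstorm, ...).
            target = 1.0 - 1.0 / R
            lo, hi = 0.0, 200.0          # bisection bracket in m/s
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                F = np.prod([gumbel_cdf(mid, mu, b) for mu, b in fits])
                lo, hi = (mid, hi) if F < target else (lo, mid)
            return 0.5 * (lo + hi)

        # Illustrative annual maxima in m/s -- not the measured records.
        synoptic     = np.array([22.1, 24.3, 20.8, 23.5, 21.9, 25.0])
        thunderstorm = np.array([18.4, 27.9, 16.2, 30.1, 21.7, 26.4])
        fits = [gumbel_fit(synoptic), gumbel_fit(thunderstorm)]
        for R in (50, 500):
            print(f"R = {R} yr: v = {mixed_design_speed(R, fits):.1f} m/s")

    Because the thunderstorm sample has the larger Gumbel scale parameter in this toy example, its branch dominates the mixed distribution at large R, mirroring the finding that high-return-period speeds are governed by thunderstorm outflows.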

    A New Paradigm to Address Threats for Virtualized Services

    With the uptake of virtualization technologies and the growing usage of public cloud infrastructures, an ever-larger number of applications run outside the traditional enterprise perimeter and require new security paradigms that fit the agility and elasticity of cloud models in service creation and management. Though some recent proposals have integrated security appliances into the logical application topology, we argue that this approach is sub-optimal. Indeed, we believe that embedding security agents in virtualization containers and delegating the control logic to the software orchestrator provides a much more effective, flexible, and scalable solution to the problem. In this paper, we motivate our approach and outline a novel framework for assessing cyber-threats to virtualized applications and services. We also review the existing technologies that form the foundation of our proposal, which we are going to develop in the context of a joint research project.

    Predicting protein stability changes upon single-point mutation: a thorough comparison of the available tools on a new dataset

    Predicting the difference in thermodynamic stability between protein variants is crucial for protein design and for understanding genotype-phenotype relationships. Several computational tools have been created to address this task. Nevertheless, most of them have been trained or optimized on the same (essentially all available) data, making a fair comparison infeasible. Here, we introduce a novel dataset, collected and manually cleaned from the latest version of the ThermoMutDB database, consisting of 669 variants not included in the most widely used training datasets. The prediction performance and the ability to satisfy the antisymmetry property, assessed by considering both direct and reverse variants, were evaluated across 21 different tools. The Pearson correlations of the tested tools were in the ranges of 0.21-0.5 and 0-0.45 for the direct and reverse variants, respectively. When both direct and reverse variants are considered, the antisymmetric methods perform better, achieving a Pearson correlation in the range of 0.51-0.62. The tested methods seem relatively insensitive to the physiological conditions, also performing well on variants measured at more extreme pH and temperature values. A common issue with all the tested methods is the compression of the ΔΔG predictions toward zero. Furthermore, the stability change of the most significantly stabilizing variants was found to be the most challenging to predict. This study is the most extensive comparison of prediction methods on an entirely novel set of variants never tested before.
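
    A minimal sketch of the antisymmetry check described above, assuming a predictor that returns ΔΔG values for both the direct (A→B) and reverse (B→A) variants; all names and numbers are illustrative assumptions, not the paper's tools or dataset:

        import numpy as np

        def pearson(x, y):
            # Pearson correlation coefficient between two 1-D samples.
            x = np.asarray(x, float)
            y = np.asarray(y, float)
            return np.corrcoef(x, y)[0, 1]

        def evaluate(pred_dir, pred_rev, exp_dir):
            # Thermodynamics requires antisymmetry: DDG(B->A) = -DDG(A->B),
            # so the experimental value of a reverse variant is -exp_dir.
            exp_rev = -np.asarray(exp_dir, float)
            r_dir = pearson(pred_dir, exp_dir)
            r_rev = pearson(pred_rev, exp_rev)
            # Antisymmetry bias: zero for a perfectly antisymmetric method.
            bias = np.mean(np.asarray(pred_dir, float) + np.asarray(pred_rev, float))
            return r_dir, r_rev, bias

        # Illustrative DDG values in kcal/mol -- not the paper's data.
        exp_dir  = [ 1.2, -0.8,  2.5, -1.6,  0.3]
        pred_dir = [ 0.6, -0.4,  1.1, -0.9,  0.1]   # compressed toward zero
        pred_rev = [-0.5,  0.5, -0.9,  0.8, -0.2]
        r_dir, r_rev, bias = evaluate(pred_dir, pred_rev, exp_dir)
        print(f"r_dir = {r_dir:.2f}, r_rev = {r_rev:.2f}, bias = {bias:.2f}")

    The toy predictions also show the compression effect noted in the abstract: their magnitudes are systematically smaller than the experimental values, even when the sign and ranking are right.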

    Near Infra-Red Spectroscopy: a low cost device

    In vivo near infra-red spectroscopy (NIRS) is a non-invasive technique based on interrogating tissue with light. Near infra-red wavelengths penetrate tissue more deeply than visible light, and specific absorption by compounds relevant for diagnosis and monitoring enables safe and convenient in vivo measurement. NIRS can be used for the early diagnosis of cerebral pathologies of vascular origin, cortical blood flow monitoring and the analysis of cortical activity. Recent advances in microelectronics make it possible to build small, portable, low-cost NIRS instruments capable of measuring chemical compounds relevant for diagnosis and monitoring. We have developed a low-cost NIR module for spectroscopy and imaging. The device is based on emitter-detector modules using laser diodes and PIN photodiodes, which overcome the disadvantages of vacuum photomultipliers and obviate the need for optical fibre connections. We use intensity modulation with spatially resolved measurements. For measurements of oxygenated (O2Hb) and deoxygenated (HHb) haemoglobin, two wavelengths are sufficient. For each wavelength we measure the demodulation and phase shift, and hence the absorption coefficient and the reduced scattering coefficient, from which the chromophore concentrations can be determined. Detector signal analysis and sequentially controlled switching are achieved using a Pentium III computer. The module costs below a few hundred US dollars, is quite small, and can be used for cortical oxygenation maps.
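
    A minimal sketch of the final two-wavelength step, assuming the absorption coefficients have already been recovered from the demodulation and phase-shift measurements; the extinction coefficients and sample numbers below are rough illustrative values, not the device's calibration (a published tabulation should be used for real work):

        import numpy as np

        # Approximate molar extinction coefficients (cm^-1 / M) at two NIR
        # wavelengths; illustrative values only -- consult a published
        # tabulation of haemoglobin spectra for real measurements.
        #              O2Hb    HHb
        E = np.array([[ 586.0, 1548.0],   # 760 nm
                      [1058.0,  692.0]])  # 850 nm

        def chromophores(mu_a):
            # Solve mu_a(lambda) = ln(10) * E @ [O2Hb, HHb] for the molar
            # concentrations. `mu_a` holds the absorption coefficients
            # (cm^-1) at the two wavelengths, after subtracting the water
            # background; ln(10) converts base-10 extinction to base-e
            # absorption.
            return np.linalg.solve(np.log(10) * E, np.asarray(mu_a, float))

        # Illustrative absorption coefficients, not measured data.
        o2hb, hhb = chromophores([0.17, 0.19])
        print(f"O2Hb = {o2hb * 1e6:.1f} uM, HHb = {hhb * 1e6:.1f} uM")

    With two wavelengths the 2x2 linear system is exactly determined; adding wavelengths would turn it into a least-squares problem and improve robustness to noise.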