Analyzing ion distributions around DNA: sequence-dependence of potassium ion distributions from microsecond molecular dynamics
Microsecond molecular dynamics simulations of B-DNA oligomers carried out in an aqueous environment with a physiological salt concentration enable us to perform a detailed analysis of how potassium ions interact with the double helix. The oligomers studied contain all 136 distinct tetranucleotides and we are thus able to make a comprehensive analysis of base sequence effects. Using a recently developed curvilinear helicoidal coordinate method we are able to analyze the details of ion populations and densities within the major and minor grooves and in the space surrounding DNA. The results show higher ion populations than have typically been observed in earlier studies and sequence effects that go beyond the nature of individual base pairs or base pair steps. We also show that, in some special cases, ion distributions converge very slowly and, on a microsecond timescale, do not reflect the symmetry of the corresponding base sequence.
Analyzing ion distributions around DNA
We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation.
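The core idea of binning ion positions in helix-centred coordinates and converting bin counts to molarity can be illustrated with a toy sketch. The version below uses plain cylindrical coordinates (r, phi, z) around an idealized straight helical axis; the published method instead uses curvilinear helicoidal coordinates that follow the bent helical axis, which this simplified stand-in does not attempt. All dimensions and bin counts here are illustrative assumptions.

```python
import numpy as np

def cylindrical_ion_density(ion_xyz, z_min, z_max, r_max=15.0,
                            n_r=15, n_phi=36, n_z=30, n_frames=1):
    """Histogram ion positions (in Angstrom) into (r, phi, z) bins and
    convert mean counts per bin to molarity.

    ion_xyz: (N, 3) array of ion coordinates, axis assumed along z.
    """
    x, y, z = ion_xyz[:, 0], ion_xyz[:, 1], ion_xyz[:, 2]
    r = np.hypot(x, y)                      # radial distance from axis
    phi = np.mod(np.arctan2(y, x), 2 * np.pi)  # angle in [0, 2*pi)
    sample = np.stack([r, phi, z], axis=1)
    edges = [np.linspace(0.0, r_max, n_r + 1),
             np.linspace(0.0, 2 * np.pi, n_phi + 1),
             np.linspace(z_min, z_max, n_z + 1)]
    counts, _ = np.histogramdd(sample, bins=edges)
    # bin volume of a cylindrical shell sector: 0.5*(r2^2 - r1^2)*dphi*dz
    r_e, phi_e, z_e = edges
    vol = (0.5 * (r_e[1:]**2 - r_e[:-1]**2)[:, None, None]
           * np.diff(phi_e)[None, :, None]
           * np.diff(z_e)[None, None, :])           # in Angstrom^3
    # molarity = N / (N_A * V), with 1 Angstrom^3 = 1e-27 litres
    return counts / n_frames / (6.022e23 * vol * 1e-27)
```

Averaging such a grid over many trajectory frames, and integrating it over a chosen (r, phi, z) zone, gives the kind of zone-averaged ion populations described above without reference to individual DNA atoms.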
Exploring the role of ICT-Enabled Social Innovation to support the modernisation of EU Social Protection Systems: findings and insights from analysis of case studies in fourteen Member States
This report presents the results of the analysis of case studies on how ICT-enabled social innovations promoting social investment can contribute to the modernisation of social protection systems in the EU. The case studies are drawn from 14 different Member States and address diverse social services and policy domains. Evidence from the analysis points to the strong potential of using new approaches based on ICT-enabled social innovation to support public authorities, at various governance levels, in their efforts to improve the effectiveness and impact of social services delivery mechanisms and outreach. The analysis makes a first attempt to assess the relationship between different typologies of ICT-enabled social innovation and the broader social protection system in which they are embedded. However, more research is needed to better understand the potential impact these initiatives could have on enhancing the adequacy and sustainability of welfare systems in the EU.
A Limited area model intercomparison on the "Montserrat-2000" flash-flood event using statistical and deterministic methods
Within the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison is performed for intense events that caused severe damage to people and territory. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors for producing a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event, also known as the "Montserrat-2000" event.
The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered.
For the statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift of the forecast error and to identify the error sources that affected each model forecast.
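The contingency-table scores mentioned above follow a standard pattern in QPF verification: grid points (or stations) are classified against a rain threshold into hits, false alarms, misses and correct negatives, from which the usual non-parametric scores follow. A minimal sketch, with the specific threshold and score selection as illustrative assumptions rather than the paper's exact choices:

```python
import numpy as np

def contingency_table(forecast, observed, threshold):
    """2x2 contingency table for threshold exceedance."""
    f = np.asarray(forecast) >= threshold
    o = np.asarray(observed) >= threshold
    a = int(np.sum(f & o))      # hits
    b = int(np.sum(f & ~o))     # false alarms
    c = int(np.sum(~f & o))     # misses
    d = int(np.sum(~f & ~o))    # correct negatives
    return a, b, c, d

def skill_scores(a, b, c, d):
    """Common non-parametric skill scores from table elements."""
    n = a + b + c + d
    pod = a / (a + c)             # probability of detection
    far = b / (a + b)             # false alarm ratio
    bias = (a + b) / (a + c)      # frequency bias
    a_random = (a + b) * (a + c) / n          # hits expected by chance
    ets = (a - a_random) / (a + b + c - a_random)  # equitable threat score
    return {"POD": pod, "FAR": far, "BIAS": bias, "ETS": ets}
```

Evaluating such scores over a range of rain thresholds, separately for each model run, yields the kind of comparative QPF verification described here; the per-region miss/false-alarm maps come from inspecting where the b and c cells occur spatially.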
High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
White Matter Microstructural Damage on Diffusion Tensor Imaging in Cerebral Small Vessel Disease: Clinical Consequences.
The 8 and 9 September 2002 flash flood event in France: a model intercomparison
Within the framework of the European Interreg IIIb Medocc programme, the HYDROPTIMET project aims at the optimization of hydrometeorological forecasting tools in the context of intense precipitation over complex topography. To this end, several meteorological forecast models and hydrological models were tested on four Mediterranean flash-flood events. One of them occurred in France, where the south-eastern ridge of the French "Massif Central", the Gard region, experienced a devastating flood on 8 and 9 September 2002. Twenty-four people were killed during this event and the economic damage was estimated at 1.2 billion euros.
To build the next generation of the hydrometeorological forecasting chain, one able to capture such localized and fast events and the resulting discharges, the forecast rain fields need to be improved so that they become relevant for hydrological purposes.
In this context, this paper presents the results of the evaluation methodology proposed by Yates et al. (2005), which highlights the relevant hydrological scales of a simulated rain field. The simulated rain fields of seven meteorological model runs concerning the French event are therefore evaluated for different accumulation times. The dynamics of these models are based on either non-hydrostatic or hydrostatic equation systems. Moreover, these models were run under different configurations (resolution, initial conditions). A classical score analysis and an areal evaluation of the simulated rain fields are then performed in order to bring out the main simulation characteristics that improve the quantitative precipitation forecast.
The conclusions draw some recommendations on the value of quantitative precipitation forecasts and the way to use them for quantitative discharge forecasts in mountainous areas.
Personalization in BERT with Adapter Modules and Topic Modelling
As a result of the widespread use of intelligent assistants, personalization in dialogue systems has become a hot topic in both research and industry. Typically, training such systems is computationally expensive, especially when using recent large language models. To address this challenge, we develop an approach to personalize dialogue systems using adapter layers and topic modelling. Our implementation enables the model to incorporate user-specific information, achieving promising results by training only a small fraction of parameters.
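The "small fraction of parameters" claim rests on the adapter idea: small bottleneck layers are inserted into a frozen pretrained transformer and only they are trained. A minimal sketch of a Houlsby-style bottleneck adapter in PyTorch; the hidden size, bottleneck width, and the paper's topic-modelling component are assumptions and not the authors' exact architecture:

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project,
    plus a residual connection so the frozen model's representation
    passes through largely unchanged at initialization."""

    def __init__(self, hidden_size=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.GELU()

    def forward(self, hidden_states):
        # residual keeps the pretrained behaviour recoverable
        return hidden_states + self.up(self.act(self.down(hidden_states)))
```

With a BERT-base hidden size of 768 and a bottleneck of 64, one adapter holds roughly 99k parameters, versus ~110M in the full model, which is why training only the adapters (one or two per transformer layer) touches only a small fraction of the total parameters.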
Health Misinformation Detection in Web Content: A Structural-, Content-based, and Context-aware Approach based on Web2Vec
In recent years, we have witnessed the proliferation of large amounts of online content generated directly by users with virtually no form of external control, leading to the possible spread of misinformation. The search for effective solutions to this problem is still ongoing, and covers different areas of application, from opinion spam to fake news detection. A more recently investigated scenario, despite the serious risks that disinformation could entail, is that of the online dissemination of health information. Early approaches in this area focused primarily on user-based studies applied to Web page content. More recently, automated approaches have been developed for both Web pages and social media content, particularly with the advent of the COVID-19 pandemic. These approaches are primarily based on handcrafted features extracted from online content in association with Machine Learning. In this scenario, we focus on Web page content, where there is still room for research into structural-, content- and context-based features for assessing the credibility of Web pages. Therefore, this work aims to study the effectiveness of such features in association with a deep learning model, starting from an embedded representation of Web pages that has been recently proposed in the context of phishing Web page detection, i.e., Web2Vec.
