Identification of a Widespread Palmitoylethanolamide Contamination in Standard Laboratory Glassware.
Introduction: Fatty acid ethanolamides (FAEs) are a family of lipid mediators that participate in a host of biological functions. Procedures for the quantitative analysis of FAEs include organic solvent extraction from biological matrices (e.g., blood), followed by purification and subsequent quantitation by liquid chromatography-mass spectrometry (LC/MS) or gas chromatography-mass spectrometry. During the validation of a new method for LC/MS analysis of FAEs in biological samples, we observed unusually high levels of the FAE palmitoylethanolamide (PEA) in blank samples that did not contain any biological material. Materials and Methods: We investigated a possible source of this PEA artifact via liquid chromatography coupled to tandem mass spectrometry, as well as accurate mass analysis. Results: We found that high levels of a contaminant indistinguishable from PEA are present in new 5.75-inch glass Pasteur pipettes, which are routinely used by laboratories to carry out lipid extractions. This artifact might account for discrepancies found in the literature regarding PEA levels in human blood serum and other tissues. Conclusions: We recommend guarding against this pitfall by screening disposable glassware for contamination during the validation of any method used for the analysis of FAEs.
Monetary Policy Rules for the Euro Area: What Role for National Information?
Using a simple multi-country econometric model covering the three main countries of the euro area, the paper focuses on the role that information at the national level can play in defining the monetary policy of the Union. We find that the performance of a central bank that chooses the nominal interest rate to minimize a standard quadratic loss function of area-wide inflation and output gap improves significantly if the reaction function includes national variables, as opposed to the case in which the interest rate reacts to area-wide variables only. Our results suggest that asymmetries within the euro area are relevant to the central bank; overall, we interpret them as making a case for exploiting the available national information in the conduct of the single monetary policy.

Keywords: monetary policy rules, Eurosystem
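To make the loss-minimization mechanism concrete, here is a minimal one-period numerical sketch. Everything in it, from the country weights to the pass-through coefficients, is an illustrative assumption rather than the paper's calibration, and the static setup is far simpler than the paper's multi-country econometric model.

```python
import numpy as np

# Illustrative one-period setup (not the paper's model): three "countries"
# whose inflation and output gaps respond with country-specific strength
# to the single nominal interest rate. All numbers are assumptions.
rng = np.random.default_rng(0)
beta = np.array([0.4, 0.6, 0.9])   # asymmetric interest-rate pass-through
w = np.array([0.40, 0.35, 0.25])   # country weights in area-wide aggregates
lam = 0.5                          # relative weight on the output gap

def loss(rate, pi_shock, gap_shock):
    """Standard quadratic loss in area-wide inflation and output gap."""
    pi = pi_shock - beta * rate    # national inflation after the rate move
    gap = gap_shock - beta * rate  # national output gaps after the rate move
    return (w @ pi) ** 2 + lam * (w @ gap) ** 2

# The central bank picks the rate minimizing the loss given national shocks.
pi_shock = rng.normal(2.0, 0.5, size=3)
gap_shock = rng.normal(0.0, 1.0, size=3)
grid = np.linspace(-2.0, 6.0, 801)
best = grid[np.argmin([loss(r, pi_shock, gap_shock) for r in grid])]
print(f"loss-minimizing nominal rate: {best:.2f}")
```

In the paper's dynamic setting, reacting to the national variables themselves, rather than only to the aggregates entering this loss, is what improves performance when transmission is asymmetric.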
Novel perspectives in redox biology and pathophysiology of failing myocytes: modulation of the intramyocardial redox milieu for therapeutic interventions - A review article from the Working Group of Cardiac Cell Biology, Italian Society of Cardiology
The prevalence of heart failure (HF) is still increasing worldwide, with enormous human, social, and economic costs, in spite of huge efforts in understanding pathogenetic mechanisms and in developing effective therapies that have transformed this syndrome into a chronic disease. Myocardial redox imbalance is a hallmark of this syndrome, since excessive reactive oxygen and nitrogen species can behave as signaling molecules in the pathogenesis of hypertrophy and heart failure, leading to dysregulation of cellular calcium handling, of the contractile machinery, of myocardial energetics and metabolism, and of extracellular matrix deposition. Recently, following interesting new advances in understanding myocardial ROS and RNS signaling pathways, promising therapeutic approaches with antioxidant properties are being developed, keeping in mind that scavenging ROS and RNS tout court is detrimental as well, since these molecules also play a role in physiological myocardial homeostasis.
Synergetic and redundant information flow detected by unnormalized Granger causality: application to resting state fMRI
Objectives: We develop a framework for the analysis of synergy and redundancy in the pattern of information flow between subsystems of a complex network. Methods: The presence of redundancy and/or synergy in multivariate time series data makes it difficult to estimate the net flow of information from each driver variable to a given target. We show that, by adopting an unnormalized definition of Granger causality, one may reveal redundant multiplets of variables influencing the target by maximizing the total Granger causality to a given target over all possible partitions of the set of driving variables. Consequently, we introduce a pairwise index of synergy which, unlike previous definitions of synergy, is zero when two independent sources additively influence the future state of the system. Results: We report the application of the proposed approach to resting state fMRI data from the Human Connectome Project, showing that redundant pairs of regions arise mainly due to spatial contiguity and interhemispheric symmetry, whilst synergy occurs mainly between non-homologous pairs of regions in opposite hemispheres. Conclusions: Redundancy and synergy, in healthy resting brains, display characteristic patterns, revealed by the proposed approach. Significance: The pairwise synergy index introduced here maps the informational character of the system at hand into a weighted complex network; the same approach can be applied to other complex systems whose normal state corresponds to a balance between redundant and synergetic circuits.
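A minimal linear sketch of the unnormalized definition (order-1 autoregression, demeaned series assumed) may help. The function names and the way the pairwise index is assembled below are illustrative assumptions; the point they illustrate is the one stated above: with the unnormalized (variance-reduction) definition, two independent additive sources contribute additively, so the pairwise index vanishes for them.

```python
import numpy as np

def resid_var(X, y):
    """Variance of the least-squares residual of y on X (demeaned data)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def unnormalized_gc(target, drivers):
    """Unnormalized Granger causality at order 1: the drop in the target's
    prediction-error variance when the drivers' past is added to its own past.
    target: (T,) series; drivers: (T, k) array of driver series."""
    y_future = target[1:]
    own_past = target[:-1, None]
    full_past = np.hstack([own_past, drivers[:-1]])
    return resid_var(own_past, y_future) - resid_var(full_past, y_future)

def pairwise_synergy(target, d1, d2):
    """Illustrative pairwise index: causality of the pair minus the sum of the
    individual causalities; zero for independent additive sources."""
    pair = np.column_stack([d1, d2])
    return (unnormalized_gc(target, pair)
            - unnormalized_gc(target, d1[:, None])
            - unnormalized_gc(target, d2[:, None]))
```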
Information transfer of an Ising model on a brain network
We implement the Ising model on a structural connectivity matrix describing the brain at a coarse scale. Tuning the model temperature to its critical value, i.e. at the susceptibility peak, we find a maximal amount of total information transfer between the spin variables. At this point the amount of information that can be redistributed by some nodes reaches a limit, and the net dynamics exhibits signatures of the law of diminishing marginal returns, a fundamental principle connected to saturated levels of production. Our results extend the recent analysis of dynamical oscillator models on the connectome structure, taking into account lagged and directional influences and focusing only on the nodes that are most prone to become bottlenecks of information. The ratio between the outgoing and the incoming information at each node is related to the number of incoming links.
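As a concrete starting point, the sketch below runs Metropolis dynamics for an Ising model on an arbitrary adjacency matrix and measures the susceptibility, whose peak locates the critical temperature referred to above. The sweep counts and the use of a symmetric binary matrix are illustrative assumptions; the paper's connectome and its information-transfer measures are not reproduced here.

```python
import numpy as np

def metropolis_sweep(spins, A, beta, rng):
    """One Metropolis sweep of the Ising model on adjacency matrix A."""
    for i in rng.permutation(len(spins)):
        dE = 2.0 * spins[i] * (A[i] @ spins)     # energy cost of flipping i
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins

def susceptibility(A, beta, sweeps=3000, burn=1000, seed=0):
    """chi = beta * N * (<m^2> - <|m|>^2) from post-burn-in magnetizations."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    spins = rng.choice([-1, 1], size=n)
    mags = []
    for t in range(sweeps):
        spins = metropolis_sweep(spins, A, beta, rng)
        if t >= burn:
            mags.append(abs(spins.mean()))
    m = np.asarray(mags)
    return beta * n * (np.mean(m**2) - np.mean(m)**2)

# Scanning beta over a grid and locating the susceptibility peak identifies
# the critical temperature at which total information transfer is maximal.
```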
Consensus clustering approach to group brain connectivity matrices
A novel approach rooted in the notion of consensus clustering, a strategy developed for community detection in complex networks, is proposed to cope with the heterogeneity that characterizes connectivity matrices in health and disease. The method can be summarized as follows: (i) define, for each node, a distance matrix for the set of subjects by comparing the connectivity pattern of that node in all pairs of subjects; (ii) cluster the distance matrix for each node; (iii) build the consensus network from the corresponding partitions; (iv) extract groups of subjects by finding the communities of the consensus network thus obtained. Unlike previous implementations of consensus clustering, we thus propose to use the consensus strategy to combine the information arising from the connectivity patterns of each node. The proposed approach may be seen either as an exploratory technique or as an unsupervised pre-training step to help the subsequent construction of a supervised classifier. Applications to a toy model and two real data sets show the effectiveness of the proposed methodology, which represents the heterogeneity of a set of subjects in terms of a weighted network, the consensus matrix.
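A compact sketch of steps (i)-(iv) might look as follows. The Euclidean node-pattern distance, the hierarchical clustering used in steps (ii) and (iv), and the fixed number of groups k are illustrative assumptions; in particular, step (iv) in the paper relies on community detection on the consensus network, for which average-linkage clustering is only a stand-in.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

def consensus_groups(conn, k=2):
    """Group subjects from per-subject connectivity matrices.
    conn: array of shape (n_subjects, n_nodes, n_nodes)."""
    n_subj, n_nodes, _ = conn.shape
    consensus = np.zeros((n_subj, n_subj))
    for node in range(n_nodes):
        # (i) distances between subjects from this node's connectivity pattern
        D = pdist(conn[:, node, :])            # condensed Euclidean distances
        # (ii) cluster the subjects for this node
        labels = fcluster(linkage(D, method="average"), k, criterion="maxclust")
        # (iii) accumulate co-assignments into the consensus matrix
        consensus += labels[:, None] == labels[None, :]
    consensus /= n_nodes
    # (iv) communities of the consensus network give the final groups
    # (stand-in: average-linkage clustering of 1 - consensus)
    D_cons = squareform(1.0 - consensus, checks=False)
    groups = fcluster(linkage(D_cons, method="average"), k, criterion="maxclust")
    return groups, consensus
```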
Rough volatility via the Lamperti transform
We study the roughness of the log-volatility process by testing the self-similarity of the process obtained by de-Lampertizing the realized volatility. The added value of our analysis rests on the application of a distribution-based estimator providing results that are more robust than those deduced from the scaling of the individual moments of the process. Our findings confirm the roughness of the log-volatility process.
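For context, the moment-scaling benchmark the abstract contrasts with can be sketched in a few lines. The lag range and moment orders below are illustrative assumptions, and the paper's distribution-based self-similarity estimator is not reproduced here.

```python
import numpy as np

def hurst_from_moments(x, qs=(0.5, 1.0, 1.5, 2.0), lags=range(1, 21)):
    """Roughness via moment scaling: E|x(t+L) - x(t)|^q ~ L^(q*H), so the
    log-log slope zeta(q) is linear in q with slope H."""
    lags = np.asarray(list(lags))
    zetas = []
    for q in qs:
        m = [np.mean(np.abs(x[l:] - x[:-l]) ** q) for l in lags]
        zetas.append(np.polyfit(np.log(lags), np.log(m), 1)[0])
    return np.polyfit(np.asarray(qs), np.asarray(zetas), 1)[0]

# Inverse Lamperti transform: if x is H-self-similar, y(t) = exp(-H*t) * x(exp(t))
# is stationary; sampling the realized volatility on an exponential time grid
# is the "de-Lampertizing" step whose output the self-similarity test targets.
```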
Synergy as a warning sign of transitions: the case of the two-dimensional Ising model
We consider the formalism of information decomposition of target effects from multi-source interactions, i.e. the problem of defining redundant and synergistic components of the information that a set of source variables provides about a target, and apply it to the two-dimensional Ising model as a paradigm of a critically transitioning system. Intuitively, synergy is the information about the target variable that is obtained only by taking the sources together, not by considering them alone; redundancy is the information which is shared by the sources. To disentangle the components of the information both at the static level and at the dynamical one, the decomposition is applied respectively to the mutual information and to the transfer entropy between a given spin, the target, and a pair of neighbouring spins (taken as the drivers). We show that a key signature of an impending phase transition (approached from the disordered side) is the fact that the synergy peaks in the disordered phase, both in the static and in the dynamic case: the synergy can thus be considered a precursor of the transition. The redundancy, instead, reaches its maximum at the critical temperature. The peak of the synergy of the transfer entropy is far more pronounced than that of the static mutual information. We show that these results are robust with respect to the details of the information decomposition approach, as we find the same results using two different methods; moreover, with respect to previous literature rooted in the notion of Global Transfer Entropy, our results demonstrate that considering as few as three variables is sufficient to construct a precursor of the transition, and provide a paradigm for the investigation of a variety of systems prone to crisis, like financial markets, social media, or epileptic seizures.
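One concrete decomposition that fits this three-variable setting is the minimal-mutual-information (MMI) redundancy, sketched below for discrete samples. Treating MMI as one of the two methods compared in the abstract is our assumption, the plug-in entropy estimates are only adequate for long samples, and for the dynamic case y would be the target spin's next state and x1, x2 the drivers' current states (up to the conditioning on the target's own past, omitted here).

```python
import numpy as np
from collections import Counter

def mutual_info(xs, ys):
    """Plug-in mutual information (nats) between two discrete sequences."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * np.log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

def pid_mmi(y, x1, x2):
    """Information decomposition with the minimal-mutual-information (MMI)
    redundancy: red = min_i I(Xi;Y); synergy = I(X1,X2;Y) - max_i I(Xi;Y).
    Returns (redundancy, unique1, unique2, synergy) for discrete samples."""
    i1, i2 = mutual_info(x1, y), mutual_info(x2, y)
    i12 = mutual_info(list(zip(x1, x2)), y)
    red = min(i1, i2)
    return red, i1 - red, i2 - red, i12 - max(i1, i2)
```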
PROPAGATE: a seed propagation framework to compute Distance-based metrics on Very Large Graphs
We propose PROPAGATE, a fast approximation framework to estimate distance-based metrics on very large graphs, such as the (effective) diameter, the (effective) radius, or the average distance, within a small error. The framework assigns seeds to nodes and propagates them in a BFS-like fashion, computing the neighborhood sets until we obtain either the whole vertex set (for the diameter) or a given percentage of it (for the effective diameter). At each iteration, we derive compressed Boolean representations of the neighborhood sets discovered so far. The PROPAGATE framework yields two algorithms: PROPAGATE-P, which propagates all the seeds in parallel, and PROPAGATE-S, which propagates the seeds sequentially. For each node, the compressed representation of the PROPAGATE-P algorithm requires one bit per seed, while that of PROPAGATE-S requires only a single bit.
Both algorithms compute the average distance, the effective diameter, the diameter, and the connectivity rate within a small error with high probability: for any ε > 0 and using Θ(log n / ε²) sample nodes, the error for the average distance is bounded by εΔ/α, the errors for the effective diameter and the diameter are bounded by ε/α, and the error for the connectivity rate is bounded by ε, where Δ is the diameter and α is a measure of the connectivity of the graph. The time complexity is O(mΔ(log n)/ε²), where m is the number of edges of the graph. The experimental results show that the PROPAGATE framework improves the current state of the art both in accuracy and speed. Moreover, we experimentally show that PROPAGATE-S is also very efficient for solving the All-Pairs Shortest Path problem on very large graphs.
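To make the propagation step concrete, here is a minimal sketch of the parallel variant, using Python integers as the per-node seed bitmasks (node u's mask holds one bit per seed that has reached u). The uncompressed masks, the synchronous rounds, and the function name are our simplifications; the framework's compressed Boolean representations and its estimators are more refined.

```python
def propagate_p(adj, seeds):
    """Sketch of PROPAGATE-P-style propagation. Each node keeps a bitmask of
    the seeds that have reached it; one synchronous round per distance level.
    adj: list of neighbor lists; seeds: list of seed node ids.
    Returns counts, where counts[d] is the number of (node, seed) pairs
    first reached at distance d + 1."""
    n = len(adj)
    mask = [0] * n
    for j, s in enumerate(seeds):
        mask[s] |= 1 << j                     # seed j starts at node s
    counts = []
    while True:
        new = list(mask)
        for u in range(n):
            for v in adj[u]:
                new[v] |= mask[u]             # push seed bits one hop outward
        gained = sum(bin(new[u] & ~mask[u]).count("1") for u in range(n))
        if gained == 0:                       # no new (node, seed) pair: done
            break
        counts.append(gained)
        mask = new
    return counts
```

From counts one can then estimate, for example, the average distance as sum((d + 1) * c for d, c in enumerate(counts)) divided by the total number of reached pairs, and read off the (effective) diameter from the cumulative counts.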