Itraconazole-induced Torsade de Pointes in a patient receiving methadone substitution therapy
Issues. Methadone, a pharmacological agent used to treat heroin dependence, is relatively safe but may cause cardiac arrhythmias in the concurrent presence of other risk factors. Approach and Key Findings. This case report highlights the risk of Torsade de Pointes, a life-threatening cardiac arrhythmia, in a heroin-dependent patient receiving methadone substitution therapy who was prescribed itraconazole for vaginal thrush. The patient presented to the accident and emergency department with chest discomfort and an episode of syncope following two doses of itraconazole (200 mg). Electrocardiogram monitoring at the accident and emergency department showed a prolonged rate-corrected QT interval leading to Torsade de Pointes. The patient was admitted for cardiac monitoring, and the electrocardiogram returned to normal upon discontinuation of methadone. Implication. This cardiac arrhythmia was most likely the result of a drug interaction between methadone and itraconazole, because the patient presented with no other risk factors. Conclusion. Given the benefits of methadone as a substitution treatment for heroin-dependent individuals, the association between methadone and cardiac arrhythmias is of great concern. Physicians treating heroin-dependent patients on methadone substitution therapy should therefore be cautious of the potential risk of drug interactions that may lead to fatal cardiac arrhythmias.
Asymptotics of Harish-Chandra expansions, bounded hypergeometric functions associated with root systems, and applications
A series expansion for Heckman-Opdam hypergeometric functions $\varphi_\lambda$ is obtained for all spectral parameters $\lambda$. As a consequence, estimates for $\varphi_\lambda$ away from the walls of a Weyl chamber are established. We also characterize the bounded hypergeometric functions and thus prove an analogue of the celebrated theorem of Helgason and Johnson on the bounded spherical functions on a Riemannian symmetric space of the noncompact type. The $L^p$-theory for the hypergeometric Fourier transform is developed for a range of exponents $p$; in particular, an inversion formula is proved for suitable $p$.
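For orientation, the classical Helgason-Johnson theorem referred to above (stated here from standard references, not quoted from this paper, and up to the usual normalization convention for the spectral parameter) characterizes the bounded spherical functions on a Riemannian symmetric space of the noncompact type by

\[
\varphi_\lambda \ \text{is bounded} \iff \lambda \in \mathfrak{a}^* + i\,C(\rho),
\]

where $C(\rho)$ is the convex hull of the Weyl group orbit $\{ w\rho : w \in W \}$ of the half-sum of positive roots $\rho$. The abstract above establishes an analogue of this statement for Heckman-Opdam hypergeometric functions.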
struc2vec: Learning Node Representations from Structural Identity
Structural identity is a concept of symmetry in which network nodes are
identified according to the network structure and their relationship to other
nodes. Structural identity has been studied in theory and practice over the
past decades, but only recently has it been addressed with representational
learning techniques. This work presents struc2vec, a novel and flexible
framework for learning latent representations for the structural identity of
nodes. struc2vec uses a hierarchy to measure node similarity at different
scales, and constructs a multilayer graph to encode structural similarities and
generate structural context for nodes. Numerical experiments indicate that state-of-the-art techniques for learning node representations fail to capture stronger notions of structural identity, while struc2vec exhibits much superior performance on this task, as it overcomes limitations of prior approaches. As a consequence, numerical experiments also indicate that struc2vec improves performance on classification tasks that depend more on structural identity. Comment: 10 pages, KDD2017, Research Track
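As a rough illustration of comparing nodes by structure rather than by graph proximity, the hedged Python sketch below (using networkx; all function and variable names are ours, not the paper's implementation) compares two nodes via the ordered degree sequences of their k-hop neighbourhood rings, which is the kind of multi-scale structural similarity from which struc2vec builds its hierarchy.

# Illustrative sketch only: a simplified structural-distance measure in the
# spirit of struc2vec's hierarchy (not the paper's actual algorithm).
import networkx as nx

def ring_degree_sequence(G, node, k):
    """Sorted degrees of the nodes exactly k hops away from `node`."""
    lengths = nx.single_source_shortest_path_length(G, node, cutoff=k)
    ring = [v for v, d in lengths.items() if d == k]
    return sorted(G.degree(v) for v in ring)

def structural_distance(G, u, v, max_scale=2):
    """Accumulate a crude distance between the ring degree sequences of u and v
    over increasing scales k = 0..max_scale."""
    total = 0.0
    for k in range(max_scale + 1):
        su, sv = ring_degree_sequence(G, u, k), ring_degree_sequence(G, v, k)
        # Pad the shorter sequence with zeros and compare element-wise.
        n = max(len(su), len(sv), 1)
        su += [0] * (n - len(su))
        sv += [0] * (n - len(sv))
        total += sum(abs(a - b) for a, b in zip(su, sv)) / n
    return total

if __name__ == "__main__":
    G = nx.barbell_graph(5, 2)  # two 5-cliques joined by a 2-node path
    # Nodes 0 and 9 occupy symmetric structural positions, so their structural
    # distance is small even though they are far apart in the graph.
    print(structural_distance(G, 0, 9))
    print(structural_distance(G, 0, 5))  # clique node vs. path node: larger

In the full framework this kind of multi-scale similarity is used to weight a multilayer graph on which random walks generate structural context; the sketch above only shows the similarity idea itself.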
Disordered social media use and risky drinking in young adults: Differential associations with addiction-linked traits
Background. Excessive or compulsive use of social media has been likened to an addiction, similar to other behavioural addictions such as pathological gambling or Internet addiction. This investigation sought to determine the degree to which personality traits associated with such disordered social media use overlap with those known to predict problematic substance use, with alcohol, the most commonly abused legal substance, as an example of the latter. Method. Well-known indices of disordered social media use, risky or problematic alcohol use, and the personality traits alexithymia, reward sensitivity, narcissism, and impulsivity were administered online to 143 men and women aged 18-35 years who were regular users of social media. The traits examined had previously been linked, as presumed predisposing factors, to the misuse of a variety of substances, including alcohol. Results. After controlling for age, gender, and social desirability in hierarchical regressions, disordered social media use was predicted by narcissism, reward sensitivity, and impulsivity, whereas risky alcohol use was predicted by narcissism, alexithymia, and impulsivity. The ability of narcissism to predict disordered social media use was mediated by reward sensitivity, which was not the case for risky drinking. Conclusions. Present results point to similarities and differences in addiction-linked traits when comparing disordered social media use to risky or problematic substance use.
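As a hedged illustration of the two-step (hierarchical) regression strategy described in the Method, the Python sketch below fits a covariates-only model followed by a covariates-plus-traits model and reports the incremental variance explained. It runs on synthetic data; the variable names and simulated effect sizes are ours and are not taken from the study.

# Illustrative sketch only: hierarchical regression on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 143  # sample size reported in the abstract
df = pd.DataFrame({
    "age": rng.integers(18, 36, n),
    "gender": rng.integers(0, 2, n),
    "social_desirability": rng.normal(size=n),
    "narcissism": rng.normal(size=n),
    "reward_sensitivity": rng.normal(size=n),
    "impulsivity": rng.normal(size=n),
})
# Synthetic outcome loosely mimicking the reported pattern of predictors.
df["disordered_social_media_use"] = (
    0.3 * df["narcissism"] + 0.3 * df["reward_sensitivity"]
    + 0.2 * df["impulsivity"] + rng.normal(size=n)
)

# Step 1: covariates only; Step 2: covariates plus the trait predictors.
step1 = smf.ols("disordered_social_media_use ~ age + gender + social_desirability",
                data=df).fit()
step2 = smf.ols("disordered_social_media_use ~ age + gender + social_desirability"
                " + narcissism + reward_sensitivity + impulsivity", data=df).fit()

# Incremental R-squared contributed by the traits, and an F-test of that increment.
print(step2.rsquared - step1.rsquared)
print(step2.compare_f_test(step1))  # (F statistic, p-value, df difference)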
A Rate-Distortion Exponent Approach to Multiple Decoding Attempts for Reed-Solomon Codes
Algorithms based on multiple decoding attempts of Reed-Solomon (RS) codes
have recently attracted new attention. Choosing decoding candidates based on
rate-distortion (R-D) theory, as proposed previously by the authors, currently
provides the best performance-versus-complexity trade-off. In this paper, an
analysis based on the rate-distortion exponent (RDE) is used to directly
minimize the exponential decay rate of the error probability. This enables
rigorous bounds on the error probability for finite-length RS codes and leads
to modest performance gains. As a byproduct, a numerical method is derived that
computes the rate-distortion exponent for independent non-identical sources.
Analytical results are given for errors/erasures decoding. Comment: accepted for presentation at the 2010 IEEE International Symposium on Information Theory (ISIT 2010), Austin, TX, USA
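As background for "choosing decoding candidates based on rate-distortion (R-D) theory", recall the textbook rate-distortion function for a memoryless source $X$ with single-letter distortion measure $d$ (quoted here only for orientation; the specific distortion measure between error and erasure patterns is defined in the works themselves):

\[
R(D) \;=\; \min_{p(\hat{x} \mid x)\,:\; \mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X}),
\]

so that, asymptotically, roughly $2^{nR(D)}$ reproduction sequences (here: erasure patterns) suffice to cover the source sequences (here: error patterns) within distortion $D$.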
On Multiple Decoding Attempts for Reed-Solomon Codes: A Rate-Distortion Approach
One popular approach to soft-decision decoding of Reed-Solomon (RS) codes is
based on using multiple trials of a simple RS decoding algorithm in combination
with erasing or flipping a set of symbols or bits in each trial. This paper
presents a framework based on rate-distortion (RD) theory to analyze these
multiple-decoding algorithms. By defining an appropriate distortion measure
between an error pattern and an erasure pattern, the successful decoding
condition, for a single errors-and-erasures decoding trial, becomes equivalent
to distortion being less than a fixed threshold. Finding the best set of
erasure patterns also turns into a covering problem which can be solved
asymptotically by rate-distortion theory. Thus, the proposed approach can be
used to understand the asymptotic performance-versus-complexity trade-off of
multiple errors-and-erasures decoding of RS codes.
This initial result is also extended in a few directions. The rate-distortion
exponent (RDE) is computed to give more precise results for moderate
blocklengths. Multiple trials of algebraic soft-decision (ASD) decoding are
analyzed using this framework. Analytical and numerical computations of the RD
and RDE functions are also presented. Finally, simulation results show that
sets of erasure patterns designed using the proposed methods outperform other
algorithms with the same number of decoding trials. Comment: to appear in the IEEE Transactions on Information Theory (Special Issue on Facets of Coding Theory: from Algorithms to Networks)
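As background for the "distortion below a fixed threshold" condition mentioned above, a single errors-and-erasures decoding trial of an RS code with minimum distance d_min succeeds whenever 2e + s < d_min, where e is the number of unerased symbol errors and s the number of erasures. The hedged Python sketch below (our own illustrative helper, not the papers' code) checks this classical condition for a given error pattern and erasure pattern.

# Illustrative sketch: success condition for one errors-and-erasures
# decoding trial of a Reed-Solomon code, with patterns given as position sets.
def trial_succeeds(error_positions, erasure_positions, d_min):
    """Return True if errors-and-erasures decoding corrects this error pattern.

    error_positions:   positions whose received symbols are wrong
    erasure_positions: positions the decoder erases before decoding
    d_min:             minimum distance (n - k + 1 for an (n, k) RS code)
    """
    errors = set(error_positions)
    erasures = set(erasure_positions)
    e = len(errors - erasures)   # errors the decoder must locate and correct
    s = len(erasures)            # erased positions (locations known)
    # Classical errors-and-erasures condition: 2*e + s < d_min
    return 2 * e + s < d_min

if __name__ == "__main__":
    d_min = 17  # e.g. a (255, 239) RS code: d_min = 255 - 239 + 1
    errors = {3, 40, 41, 100, 101, 102, 103, 200, 201}     # 9 symbol errors
    # Without erasures: 2*9 = 18 >= 17, so this trial fails.
    print(trial_succeeds(errors, set(), d_min))            # False
    # Erasing three of the error positions: e = 6, s = 3, 2*6 + 3 = 15 < 17.
    print(trial_succeeds(errors, {100, 101, 102}, d_min))  # True

Choosing a set of erasure patterns so that every likely error pattern is "covered" by at least one pattern satisfying this condition is exactly the covering problem that the rate-distortion framework above analyzes asymptotically.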
Polymerase chain reaction in clinical practice
One of the most heralded developments in basic science to reach clinical application in recent years has been the Polymerase Chain Reaction (PCR). PCR has been applied in various areas of clinical medicine, including the rapid diagnosis of viral, bacterial, fungal and parasitic disease, the diagnosis and prediction of inherited disease, the detection of an association between certain viruses and specific cancers, the detection of organ transplant rejection, and HLA subtyping. In basic research, PCR is useful in the identification of point mutations, deletions, insertions, rearrangements, amplifications and translocations.
Numerical computation of the beta function of large N SU(N) gauge theory coupled to an adjoint Dirac fermion
We use a single site lattice in four dimensions to study the scaling of large
N Yang-Mills field coupled to a single massless Dirac fermion in the adjoint
representation. We use the location of the strong to weak coupling transition
defined through the eigenvalues of the folded Wilson loop operator to set a
scale. We do not observe perturbative scaling in the region studied in this
paper. Instead, we observe that the scale changes very slowly with the bare
coupling. The lowest eigenvalue of the overlap Dirac operator is another scale
that shows similar behavior as a function of the lattice coupling. We speculate
that this behavior is due to the beta function coming close to a zero. Comment: 16 pages, 9 figures, revised version DOES NOT match the published version in Physical Review
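For context, the lattice beta function referred to here measures how the bare coupling must change with the lattice scale to keep physics fixed; a zero of this function would mean the scale stops changing with the coupling, consistent with the slow scale change reported above. A standard schematic definition (notation and sign convention are the usual textbook ones, not taken from this paper) is

\[
\beta(g) \;=\; -\,a\,\frac{\partial g}{\partial a}\bigg|_{\text{physics fixed}},
\qquad \beta(g_*) = 0 \;\Longrightarrow\; \text{the coupling stops running at } g_* .
\]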
Vacuum Polarization and Chiral Lattice Fermions
The vacuum polarization due to chiral fermions on a 4--dimensional Euclidean
lattice is calculated according to the overlap prescription. The fermions are
coupled to weak and slowly varying background gauge and Higgs fields, and the
polarization tensor is given by second order perturbation theory. In this order
the overlap constitutes a gauge invariant regularization of the fermion vacuum
amplitude. Its low energy -- long wavelength behaviour can be computed
explicitly and we verify that it coincides with the Feynman graph result
obtainable, for example, by dimensional regularization of continuum gauge
theory. In particular, the Standard Model Callan--Symanzik RG functions are
recovered. Moreover, there are no residual lattice artefacts such as a
dependence on Wilson--type mass parameters. Comment: 23 pages, LaTeX
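As a reminder of what gauge invariance of the regularization implies here (a textbook fact, not a statement quoted from this paper), the second-order vacuum polarization tensor must be transverse, so that in the low-energy, long-wavelength limit it takes the Euclidean form

\[
\Pi_{\mu\nu}(q) \;=\; \left( q^2\, \delta_{\mu\nu} - q_\mu q_\nu \right) \Pi(q^2),
\qquad q_\mu\, \Pi_{\mu\nu}(q) = 0 ,
\]

with the regulator details, in particular any Wilson-type mass parameters, absent from the scalar function $\Pi(q^2)$ apart from the usual renormalization, in agreement with the continuum result described above.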
