Telemedicine in chronic disease management: a Public Health perspective
Introduction
In 2014, the School of Hygiene of the University of Padua carried out an evaluation of home telemonitoring (HTM) programs for the management of chronic diseases. Our aims were to verify their efficacy, and to identify a model of care that could be integrated into the current health system. Our analysis addressed both organizational and clinical matters.
Methods
Our evaluation involved 19 reviews and 53 randomized controlled trials (RCTs). The main selection criteria were: papers published in the last 15 years, HTM performed through a sensor system, data sent remotely to physicians, and health outcomes and monitored parameters clearly stated. The diseases included were: heart failure, hypertension, COPD, asthma and diabetes.
Results
Several critical issues were highlighted. Because the scientific literature tends to report in favor of HTM efficacy, there is no conclusive evidence as to whether telemedicine actually improves clinical outcomes (e.g. decreased disease-specific/all-cause mortality, lower disease-specific/all-cause hospitalization rates, improved biological parameters and quality of life) or organizational outcomes (shorter hospital stays, reduced emergency room/other service use, lower costs).
Discussion
From a Public Health perspective, discrepancies and weaknesses may affect the published results, since the best method for organizing and delivering telemedicine programs has not yet been identified. There is still no consensus on the following topics:
- Setting: which context best expresses the potential of the technology? No studies were found comparing, e.g., rural with urban communities. Within urban scenarios, samples do not discriminate users by their ability to access the healthcare network (e.g. residents of peripheral areas with limited transportation resources, or users with reduced mobility).
- Target: it is unclear which demographic or socioeconomic characteristics users should have to gain the most benefit from HTM.
- Duration and frequency: there are significant differences in RCT (and HTM program) duration. It has not been established whether HTM is more effective when implemented permanently or only in the early stages of disease (i.e. until stabilization). There is no agreement on the optimal frequency of HTM measurements, nor on whether patients should also receive traditional interventions (e.g. nurse home visits).
- Scope: it has not been determined whether measurements should be disclosed to patients as an educational means of improving disease management.
However, past literature does include some indications that the effectiveness of HTM programs may be attributable to care intensification (or to an intensification perceived by the patient, as per the "Hawthorne effect" described in sociology) or to the empowerment process.
Conclusions
HTM management of chronic diseases is a promising and remarkable strategy, still hampered by a lack of evidence. Its reported efficacy, although modest, probably has a multifactorial origin. Our hypothesis is that it may result not from the technology itself, but from the impact of the process on multiple components of care, emphasizing patients' involvement and autonomy, and increasing the intensity of monitoring. Further studies are needed to clarify the roles played by the different HTM components (target, setting, etc.). The application of HTM as a tool for prevention, empowerment and the reduction of healthcare access remains little explored.
Sensitivity to initial conditions at bifurcations in one-dimensional nonlinear maps: rigorous nonextensive solutions
Using the Feigenbaum renormalization group (RG) transformation we work out exactly the dynamics and the sensitivity to initial conditions for unimodal maps of nonlinearity $z > 1$ at both their pitchfork and tangent bifurcations. These functions have the form of $q$-exponentials, as proposed in Tsallis' generalization of statistical mechanics. We determine the $q$-indices that characterize these universality classes and perform for the first time the calculation of the $q$-generalized Lyapunov coefficient $\lambda_q$. The pitchfork and the left-hand side of the tangent bifurcations display weak insensitivity to initial conditions, while the right-hand side of the tangent bifurcations presents a `super-strong' (faster than exponential) sensitivity to initial conditions. We corroborate our analytical results with {\em a priori} numerical calculations.
Comment: latex, 4 figures. Updated references and some general presentation improvements. To appear published in Europhysics Letters
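As an illustrative aside (not part of the original abstract), the $q$-exponential central to this line of work can be sketched numerically; the function name and the cutoff convention below are assumptions made for the example:

```python
import math

def q_exp(x, q):
    """q-exponential e_q^x = [1 + (1-q) x]^(1/(1-q)); recovers exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        # cutoff convention for q < 1; divergence for q > 1
        return 0.0 if q < 1.0 else math.inf
    return base ** (1.0 / (1.0 - q))

# For q < 1 the growth in x is slower than exponential (weak sensitivity);
# for q > 1 it diverges at finite x = 1/(q-1) (the `super-strong' regime).
```

For instance, `q_exp(2.0, 0.5)` gives $(1 + 0.5 \cdot 2)^2 = 4$, while `q_exp(x, 1.0)` is the ordinary exponential.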
Is depression a real risk factor for acute myocardial infarction mortality? A retrospective cohort study
Background: Depression has been associated with a higher risk of cardiovascular events and higher mortality in patients with one or more comorbidities. This study investigated whether continuative use of antidepressants (ADs), considered a proxy of a state of depression, prior to acute myocardial infarction (AMI) is associated with higher mortality afterwards. The outcome assessed was mortality by AD use. Methods: A retrospective cohort study was conducted in the Veneto Region on hospital discharge records with a primary diagnosis of AMI in 2002-2015. Subsequent deaths were ascertained from mortality records. Drug purchases were used to identify AD users. A descriptive analysis was conducted on patients' demographics and clinical data. Survival after discharge was assessed with Kaplan-Meier survival analysis and Cox's multiple regression model. Results: Among 3985 hospital discharge records considered, 349 (8.8%) patients were classified as AD users. The mean AMI-related hospitalization rate was 164.8/100,000 population/year, and declined significantly from 204.9 in 2002 to 130.0 in 2015, but only for AD users (-40.4%). The mean overall follow-up was 4.6 ± 4.1 years. Overall, 523 patients (13.1%) died within 30 days of their AMI. The remainder survived a mean of 5.3 ± 4.0 years. After adjusting for potential confounders, use of antidepressants was independently associated with mortality (adj. OR = 1.75, 95% CI: 1.40-2.19). Conclusions: Our findings show that AD users hospitalized for AMI have a worse prognosis in terms of mortality. The use of routinely-available records can prove an efficient way to monitor trends in the state of health of specific subpopulations, enabling the early identification of AMI survivors with a history of antidepressant use.
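The survival analysis mentioned in the Methods rests on the Kaplan-Meier product-limit estimator; a minimal self-contained sketch follows (the function name and the five-patient example are illustrative assumptions, not data from the study):

```python
import numpy as np

def kaplan_meier(durations, events):
    """Product-limit estimate of the survival curve S(t).

    durations: follow-up time for each patient
    events: 1 if death was observed, 0 if the patient was censored
    Returns the distinct event times and S(t) just after each of them.
    """
    durations = np.asarray(durations, dtype=float)
    events = np.asarray(events, dtype=int)
    event_times = np.unique(durations[events == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(durations >= t)                  # still under follow-up at t
        deaths = np.sum((durations == t) & (events == 1))
        s *= 1.0 - deaths / at_risk                       # product-limit update
        surv.append(s)
    return event_times, np.array(surv)

# Five hypothetical patients (times in years; 0 = censored at that time):
times, s = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0])
```

Censored patients leave the risk set without triggering a drop in S(t), which is what distinguishes this estimator from a naive survival fraction.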
Chaos edges of $z$-logistic maps: Connection between the relaxation and sensitivity entropic indices
Chaos thresholds of the $z$-logistic maps are numerically analysed at the accumulation points of cycles 2, 3 and 5. We verify that the nonextensive $q$-generalization of a Pesin-like identity is preserved through averaging over the entire phase space. More precisely, we computationally verify $K_q = \lambda_q$, where $K_q$ is the growth rate of the entropy $S_q \equiv (1 - \sum_i p_i^q)/(q - 1)$ and $\lambda_q$ is the $q$-generalized Lyapunov coefficient governing the sensitivity to initial conditions $\xi(t) = e_q^{\lambda_q t}$, with $e_q^x \equiv [1 + (1-q)x]^{1/(1-q)}$ ($e_1^x = e^x$). The entropic index $q_{sen}$ depends on both $z$ and the cycle. We also study the relaxation that occurs if we start with an ensemble of initial conditions homogeneously occupying the entire phase space. The associated Lebesgue measure asymptotically decreases in a manner characterized by a second entropic index, $q_{rel}$. These results led to (i) the first illustration of the connection (conjectured by one of us) between the sensitivity and relaxation entropic indices, namely a relation between $q_{sen}$ and $q_{rel}$ whose positive coefficients depend on the cycle; (ii) an unexpected and new scaling law.
Comment: 5 pages, 5 figures
Linear instability and statistical laws of physics
We show that a meaningful statistical description is possible in conservative and mixing systems with zero Lyapunov exponent in which the dynamical instability is only linear in time. More specifically, (i) the sensitivity to initial conditions is given by $\xi = [1 + (1-q)\lambda_q t]^{1/(1-q)}$ with $q = 0$; (ii) the statistical entropy $S_q = (1 - \sum_i p_i^q)/(q - 1)$ in the infinitely fine graining limit (i.e., $W \to \infty$, where $W$ is the {\it number of cells into which the phase space has been partitioned}) increases linearly with time only for $q = 0$; (iii) a nontrivial, $q$-generalized, Pesin-like identity is satisfied, namely $\lim_{t \to \infty} S_0(t)/t = \lambda_0$. These facts (which are in analogy to the usual behaviour of strongly chaotic systems, for which $q = 1$) seem to open the door for a statistical description of conservative many-body nonlinear systems whose Lyapunov spectrum vanishes.
Comment: 7 pages including 2 figures. The present version is accepted for publication in Europhysics Letters
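By way of contrast with the zero-Lyapunov case above, the ordinary ($q = 1$) exponent of a strongly chaotic map can be estimated by averaging the log-derivative along an orbit. The sketch below (map, parameter values and function name chosen purely for illustration) uses the fully chaotic logistic map, whose exact exponent is $\ln 2$:

```python
import math

def lyapunov_logistic(a, x0=0.3, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of x -> a x (1 - x) by averaging
    log|f'(x)| = log|a (1 - 2x)| along a single trajectory."""
    x = x0
    for _ in range(burn):            # discard the transient
        x = a * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        # guard against log(0) if the orbit lands extremely close to x = 1/2
        total += math.log(max(abs(a * (1.0 - 2.0 * x)), 1e-300))
        x = a * x * (1.0 - x)
    return total / n

# For a = 4 the estimate approaches ln 2 ~ 0.6931.
```

At the edge of chaos this average tends to zero, which is precisely why the $q$-generalized quantities of these abstracts are needed there.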
May car washing represent a risk for Legionella infection?
Background. Legionella is a ubiquitous Gram-negative bacterium naturally found in aquatic environments. It can pose a health problem when it grows and spreads in man-made water systems. Legionella pneumophila is nowadays the most common cause of Legionnaires' disease, a community-acquired pneumonia whose pulmonary symptoms and chest radiography are indistinguishable from those of any other infectious pneumonia. Legionella monitoring is important for public health reasons, including the identification of unusual environmental sources of Legionella.
Methods. We report two cases of Legionnaires' disease associated with two different car wash installations in the province of Vicenza, in the Veneto region, northeastern Italy. The patients were not employees of the car wash installations, but users of the service. In both cases, Legionella antigen was detected in urine using the Alere BinaxNOW® Legionella Urinary Antigen test, and Legionella antibodies were detected in serum using the SERION ELISA classic Legionella pneumophila 1-7 IgG and IgM assays. Water samples were also analyzed as part of the surveillance program for Legionella prevention and control, in compliance with the Italian guidelines.
Results. Both patients had clinical symptoms and chest radiography compatible with pneumonia, and only one of them had diabetes as a risk factor. Urinary antigen and serological tests were positive for Legionella in both patients, albeit much weaker in case A, because the retrospective serological investigation was performed a year after the episode, once the second clinical case had occurred in the same district. The environmental investigations identified two different car wash plants as potential sources of infection. A certified company using shock hyperchlorination was asked to disinfect the two plants and, subsequently, control samples tested negative for Legionella pneumophila.
Conclusions. Any water source producing aerosols should be considered at risk for the transmission of Legionella bacteria, including car wash installations, which are frequently used by large numbers of customers and where poor maintenance probably creates favorable conditions for Legionella overgrowth and spreading. Additional research is needed to ascertain optimal strategies for Legionella monitoring and control, but environmental surveillance, paying careful attention to possible unconventional sources, should remain an important component of any Legionnaires' disease prevention program. Additionally, all available diagnostic methods should be recommended for the confirmation of all cases, even in the event of non-serogroup 1 Legionella pneumophila infection, which is probably underestimated at present.
Comment on "Critique of q-entropy for thermal statistics" by M. Nauenberg
M. Nauenberg [1] recently published a quite long list of objections to the physical validity, for thermal statistics, of the theory sometimes referred to in the literature as {\it nonextensive statistical mechanics}. This generalization of Boltzmann-Gibbs (BG) statistical mechanics is based on the following expression for the entropy:
$$S_q = k \, \frac{1 - \sum_{i=1}^W p_i^q}{q-1} \qquad \Big( q \in {\cal R};\quad S_1 = S_{BG} \equiv -k \sum_{i=1}^W p_i \ln p_i \Big).$$
The author of [1] had already presented the essence of his arguments orally in 1993, during a scientific meeting in Buenos Aires. I am now replying simultaneously to the paper just cited and to the 1993 objections (essentially, the violation of "fundamental thermodynamic concepts", as stated in the Abstract of [1]).
Comment: 7 pages including 2 figures. This is a reply to M. Nauenberg, Phys. Rev. E 67, 036114 (2003).
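The entropy defined in this abstract lends itself to a short numerical check; the function below (its name and the limit check are illustrative assumptions) verifies that $S_q$ reduces to the BG form as $q \to 1$:

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """S_q = k (1 - sum_i p_i^q) / (q - 1); the q -> 1 limit is the
    Boltzmann-Gibbs entropy S_1 = -k sum_i p_i ln p_i."""
    if abs(q - 1.0) < 1e-12:
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0.0)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Equiprobable distribution over W = 4 states:
p = [0.25] * 4
# S_2 = (1 - 4 * 0.25**2) / (2 - 1) = 0.75, while S_q -> ln 4 as q -> 1.
```

For equal probabilities $p_i = 1/W$, $S_q$ equals the $q$-logarithm of $W$, $(W^{1-q} - 1)/(1 - q)$, which is what makes the index $q$ tune the degree of nonextensivity.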
Universal renormalization-group dynamics at the onset of chaos in logistic maps and nonextensive statistical mechanics
We uncover the dynamics at the chaos threshold $\mu_\infty$ of the logistic map and find that it consists of trajectories made of intertwined power laws that reproduce the entire period-doubling cascade occurring for $\mu < \mu_\infty$. We corroborate this structure analytically via the Feigenbaum renormalization group (RG) transformation and find that the sensitivity to initial conditions has precisely the form of a $q$-exponential, of which we determine the $q$-index and the $q$-generalized Lyapunov coefficient $\lambda_q$. Our results are an unequivocal validation of the applicability of the non-extensive generalization of Boltzmann-Gibbs (BG) statistical mechanics to critical points of nonlinear maps.
Comment: Revtex, 3 figures. Updated references and some general presentation improvements. To appear published as a Rapid Communication of PRE
Infinite ergodic theory and Non-extensive entropies
We bring into account a series of results in infinite ergodic theory that we believe are relevant to the theory of non-extensive entropies
- …