
    Thermodynamic analysis of the Quantum Critical behavior of Ce-lattice compounds

    A systematic analysis of the low-temperature magnetic phase diagrams of Ce compounds is performed in order to identify the thermodynamic conditions that such systems must fulfill to reach a quantum critical regime and, alternatively, to identify other kinds of low-temperature behavior. Based on specific heat ($C_m$) and entropy ($S_m$) results, three different types of phase diagrams are recognized: i) those with the entropy contained in the ordered phase ($S_{MO}$) decreasing proportionally to the ordering temperature ($T_{MO}$); ii) those showing a transfer of degrees of freedom from the ordered phase to a non-magnetic component, with the specific-heat jump $\Delta C_m$ at $T_{MO}$ vanishing at finite temperature; and iii) those ending in a critical point at finite temperature because their $\Delta C_m$ does not decrease with $T_{MO}$, producing an entropy accumulation at low temperature. Only systems belonging to the first case, i.e. with $S_{MO}\to 0$ as $T_{MO}\to 0$, can be regarded as candidates for quantum critical behavior. Their magnetic phase boundaries deviate from the classical negative curvature below $T\approx 2.5$ K, revealing how extrapolations down to $T=0$ are frequently misleading. Different characteristic concentrations are recognized and analyzed for Ce-ligand alloyed systems. In particular, a pre-critical region is identified where the nature of the magnetic transition undergoes significant modifications, with its $\partial C_m/\partial T$ discontinuity strongly affected by magnetic field and an increasing remnant entropy as $T\to 0$. Physical constraints arising from the third law at $T\to 0$ are discussed and identified in the experimental results.
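
    The classification above rests on tracking the magnetic entropy $S_m(T) = \int_0^T (C_m/T')\,dT'$ as the ordering temperature is tuned. As a hedged illustration (not the authors' code), the following sketch computes $S_m$ from synthetic specific-heat data by trapezoidal integration; the data and function names are invented for the example.

    ```python
    import numpy as np

    def magnetic_entropy(T, Cm):
        """Integrate S_m(T) = int (C_m / T') dT' with the trapezoidal rule.

        T  : array of temperatures (K), strictly increasing, T > 0
        Cm : magnetic specific heat at those temperatures (J / mol K)
        """
        integrand = Cm / T
        # cumulative trapezoidal integration, with S_m(T[0]) = 0
        dS = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T)
        return np.concatenate(([0.0], np.cumsum(dS)))

    # Synthetic example: a lambda-like anomaly near T_MO = 2.0 K
    T = np.linspace(0.1, 10.0, 500)
    Cm = 5.0 * np.exp(-((T - 2.0) / 0.5) ** 2) + 0.1 * T
    Sm = magnetic_entropy(T, Cm)
    print(f"S_m(10 K) = {Sm[-1]:.2f} J/(mol K)")  # compare with R*ln(2) = 5.76 for a doublet
    ```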

    Rethinking the patient: using Burden of Treatment Theory to understand the changing dynamics of illness

    Background: In this article we outline Burden of Treatment Theory, a new model of the relationship between sick people, their social networks, and healthcare services. Health services face the challenge of growing populations with long-term and life-limiting conditions. They have responded by delegating to sick people and their networks routine work aimed at managing symptoms and at retarding, and sometimes preventing, disease progression. This is the new proactive work of patienthood for which patients are increasingly accountable: it is founded on ideas about self-care, self-empowerment, and self-actualization, and on new technologies and treatment modalities that can be shifted from the clinic into the community. These place new demands on sick people, which they may experience as burdens of treatment.

    Discussion: As the burdens accumulate, some patients are overwhelmed, and the likely consequences are poor healthcare outcomes for individual patients, increasing strain on caregivers, and rising demand for and costs of healthcare services. In the face of these challenges we need to better understand the resources that patients draw upon as they respond to the demands of both burdens of illness and burdens of treatment, and the ways those resources interact with healthcare utilization.

    Summary: Burden of Treatment Theory is oriented to understanding how capacity for action interacts with the work that stems from healthcare. It is a structural model that focuses on the work that patients and their networks do, and it thus helps explain variations in healthcare utilization and adherence in different healthcare settings and clinical contexts.

    Baseline mitral regurgitation predicts outcome in patients referred for dobutamine stress echocardiography

    Purpose: A number of parameters recorded during dobutamine stress echocardiography (DSE) are associated with worse outcome, but the relative importance of baseline mitral regurgitation (MR) is unknown. The aim of this study was to assess the prevalence of functional MR and its association with long-term mortality in a large cohort of patients referred for DSE. Methods: 6745 patients (mean age 64.9 ± 12.2 years) were studied. Demographic, baseline, and peak DSE data were collected, and all-cause mortality was analyzed retrospectively. DSE was completed successfully in all patients with no adverse outcomes. Results: MR was present in 1019 (15.1%) patients. During a mean follow-up of 5.1 ± 1.8 years, 1642 (24.3%) patients died, and MR was significantly associated with increased all-cause mortality (p < 0.001). On Kaplan-Meier analysis, survival was significantly worse for patients with moderate or severe MR (p < 0.001). On multivariate Cox regression analysis, moderate and severe MR (HR 2.78, 95% CI 2.17–3.57, and HR 3.62, 95% CI 2.89–4.53, respectively) were independently associated with all-cause mortality. The addition of MR to C-statistic models significantly improved discrimination. Conclusions: MR is associated with all-cause mortality and adds incremental prognostic information among patients referred for DSE. The presence of MR should be taken into account when evaluating the prognostic significance of DSE results.
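
    For readers unfamiliar with the survival methods reported above, the sketch below shows how a multivariate Cox proportional-hazards model of this kind can be fitted with the lifelines library in Python. The column names and the synthetic cohort are assumptions for illustration; this is not the study's analysis code.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    # Synthetic cohort (illustrative only): follow-up time in years, death
    # event flag, and indicator covariates for moderate and severe MR.
    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "followup_years": rng.exponential(5.0, n),
        "died": rng.integers(0, 2, n),
        "mr_moderate": rng.integers(0, 2, n),
        "mr_severe": rng.integers(0, 2, n),
        "age": rng.normal(65, 12, n),
    })

    # Fit the Cox model; exp(coef) in the summary gives the hazard ratio
    # (HR) with its 95% confidence interval for each covariate.
    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="died")
    cph.print_summary()
    ```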

    Digital hyperplane fitting

    This paper addresses the problem of fitting a hyperplane to discrete points in any dimension (i.e. in $\mathbb{Z}^d$). For that purpose, we consider a digital model of a hyperplane, namely the digital hyperplane, and present a combinatorial approach to find the optimal solution of the fitting problem. The method consists in computing all possible digital hyperplanes from a set $S$ of $n$ points; an exhaustive search then finds the optimal hyperplane that best fits $S$. The method has, however, a high complexity of $O(n^d)$ and thus cannot be applied to big datasets. To overcome this limitation, we propose another method relying on the Delaunay triangulation of $S$. By generating and verifying not all possible digital hyperplanes but only those arising from the elements of the triangulation, we obtain a lower complexity of $O(n^{\frac{d}{2}+1})$. Experiments in 2D, 3D, and 4D illustrate the efficiency of the proposed method.
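
    As a toy illustration of the combinatorial idea (in 2D only, with a simplified candidate set anchored at point pairs, and without the paper's Delaunay-based optimization), the sketch below enumerates candidate digital lines through pairs of points and keeps the one covering the most points of $S$. The thickness convention ($\omega = |a| + |b|$, a naive digital line) and all names are assumptions for the example.

    ```python
    from itertools import combinations

    def fit_digital_line(points):
        """Brute-force digital line fitting in Z^2.

        A candidate line through points p, q has normal (a, b) with
        a = q[1] - p[1], b = -(q[0] - p[0]); a point (x, y) is an inlier
        if 0 <= a*x + b*y + mu < omega, where omega = |a| + |b|.
        """
        best = (0, None)
        for p, q in combinations(points, 2):
            a, b = q[1] - p[1], -(q[0] - p[0])
            omega = abs(a) + abs(b)
            if omega == 0:
                continue  # coincident points define no line
            mu = -(a * p[0] + b * p[1])  # anchor the line at p
            count = sum(0 <= a * x + b * y + mu < omega for x, y in points)
            if count > best[0]:
                best = (count, (a, b, mu, omega))
        return best

    pts = [(0, 0), (1, 1), (2, 1), (3, 2), (4, 3), (0, 3)]
    inliers, line = fit_digital_line(pts)
    print(inliers, line)  # number of covered points and (a, b, mu, omega)
    ```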

    Valorizing the 'Irulas' traditional knowledge of medicinal plants in the Kodiakkarai Reserve Forest, India

    A mounting body of critical research is raising the credibility of Traditional Knowledge (TK) in scientific studies. These studies have gained credibility because their claims are supported by methods that are repeatable and provide data for quantitative analyses that can be used to assess confidence in the results. The theoretical contribution of our study is to test the consensus (reliability/replicability) of TK within one ancient culture, the Irulas of the Kodiakkarai Reserve Forest (KRF), India. We calculated the relative frequency (RF) and informant consensus factor (Fic) of TK from 120 Irulas informants knowledgeable about medicinal plants. Our research indicates a high consensus of Irulas TK concerning medicinal plants. The Irulas revealed a diversity of plants that have medicinal and nutritional utility in their culture, as well as specific ethnotaxa used to treat a variety of illnesses and to promote general good health in their communities. Throughout history, aboriginal people have been the custodians of biodiversity and have sustained healthy lifestyles in an environmentally sustainable manner. However, this knowledge has not been transferred to modern society. We suggest this may be due to the asymmetry between scientific knowledge and TK, which demands a new approach that considers the assemblage of TK and scientific knowledge. A greater understanding of TK is beginning to emerge from our research with both the Irulas and the Malasars: they believe that a healthy lifestyle is founded on a healthy environment. These aboriginal groups chose to share this knowledge with society at large in order to promote a global lifestyle of health and environmental sustainability.
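
    The informant consensus factor mentioned above is conventionally computed as $F_{ic} = (N_{ur} - N_t)/(N_{ur} - 1)$, where $N_{ur}$ is the number of use reports in a use category and $N_t$ the number of taxa cited for it. A minimal sketch, with invented counts (the study's actual data are not reproduced here):

    ```python
    def informant_consensus_factor(use_reports, taxa):
        """F_ic = (N_ur - N_t) / (N_ur - 1); approaches 1 when informants agree."""
        if use_reports <= 1:
            raise ValueError("need at least two use reports")
        return (use_reports - taxa) / (use_reports - 1)

    # Hypothetical category: 86 use reports citing 7 taxa
    print(round(informant_consensus_factor(86, 7), 2))  # 0.93
    ```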

    NC-Algorithms for Minimum Link Path and Related Problems

    The link metric, defined on a constrained region R of the plane, sets the distance between a pair of points in R equal to the minimum number of line segments, or links, needed to construct a path in R between the points. The minimum link path problem is to compute a path consisting of the minimum number of links between two points in R, when R is the inside of an n-sided simple polygon. The minimum nested polygon problem asks for a minimum link closed path (girth) when R is an annular region defined by a pair of nested simple polygons. Efficient sequential algorithms based on greedy methods have been described for both problems; however, neither problem was known to be in NC. In this paper we present algorithms for both problems that require $O(\log n \log\log n)$ time and $O(n)$ space using $O(n)$ processors. The approach involves new results on the parallel ($NC^1$) computation of the complete visibility polygon of a simple polygon from a set of points inside it, along with an algebraic technique based on fractional linear transforms that permits effective parallelization of the greedy computations. The complexity results of this paper are with respect to the CREW-PRAM model of computation. The time × processor product of these algorithms is within a small polylog factor of the best known sequential algorithms for the respective problems.
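
    To make the link metric concrete, the toy sketch below computes a link distance on a grid with obstacles via 0-1 BFS (continuing straight costs nothing, each turn starts a new link). This simplified setting and all names are assumptions for illustration and are unrelated to the polygon algorithms of the paper.

    ```python
    from collections import deque

    def link_distance(grid, start, goal):
        """Minimum number of axis-parallel segments joining start to goal.

        grid: list of strings, '#' marks blocked cells. Runs 0-1 BFS over
        (row, col, direction) states: moving in the current direction adds
        no cost, while choosing a new direction adds one link.
        """
        rows, cols = len(grid), len(grid[0])
        dirs = [(0, 1), (1, 0), (0, -1), (-1, 0)]
        INF = float("inf")
        dist = {}
        dq = deque([(0, start[0], start[1], -1)])  # -1: no direction yet
        while dq:
            d, r, c, k = dq.popleft()
            if dist.get((r, c, k), INF) < d:
                continue  # stale queue entry
            if (r, c) == goal:
                return d
            for nk, (dr, dc) in enumerate(dirs):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != '#':
                    nd = d + (0 if nk == k else 1)
                    if nd < dist.get((nr, nc, nk), INF):
                        dist[(nr, nc, nk)] = nd
                        (dq.appendleft if nk == k else dq.append)((nd, nr, nc, nk))
        return INF

    maze = ["....",
            ".##.",
            "...."]
    print(link_distance(maze, (0, 0), (2, 3)))  # 2 links: right, then down
    ```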


    The Power of Orthogonal Duals (Invited Talk)

    Triangle meshes have found widespread acceptance in computer graphics as a simple, convenient, and versatile representation of surfaces. In particular, computing on such simplicial meshes is a workhorse in a variety of graphics applications. In this context, mesh duals (tied to Poincaré duality and extending the well-known relationship between Delaunay triangulations and Voronoi diagrams) are often useful, be it for physical simulation of fluids or for parameterization. However, the precise embedding of a dual diagram with respect to its triangulation (i.e., the placement of dual vertices) has mostly remained a matter of taste or a numerical afterthought, and barycentric versus circumcentric duals are often the only options chosen in practice. In this chapter we discuss the notion of orthogonal dual diagrams, and we show through a series of recent works that exploring the full space of orthogonal dual diagrams of a given simplicial complex is not only powerful and numerically beneficial, but also reveals (using tools from algebraic topology and computational geometry) discrete analogs of continuous properties.
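
    As a small, hedged illustration of the two default dual-vertex placements mentioned above (not code from the talk), the sketch below computes the barycenter and the circumcenter of a triangle; the circumcenter is found by solving the two perpendicular-bisector equations.

    ```python
    import numpy as np

    def barycenter(A, B, C):
        """Arithmetic mean of the three vertices."""
        return (A + B + C) / 3.0

    def circumcenter(A, B, C):
        """Point equidistant from A, B, C: solve 2(B-A).x = |B|^2 - |A|^2, etc."""
        M = 2.0 * np.array([B - A, C - A])
        rhs = np.array([B @ B - A @ A, C @ C - A @ A])
        return np.linalg.solve(M, rhs)

    A, B, C = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([0.0, 3.0])
    print(barycenter(A, B, C))    # [1.333..., 1.0]
    print(circumcenter(A, B, C))  # [2.0, 1.5] (midpoint of the hypotenuse)
    ```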