555 research outputs found

    Prevention and treatment of venous thromboembolism with low-molecular-weight heparins: Clinical implications of the recent European guidelines

    Venous thromboembolism (VTE) is an important cause of avoidable morbidity and mortality. However, routine prophylaxis for at-risk patients is underused. Recent guidelines issued by an international consensus group, including the International Union of Angiology (IUA), recommend the use of low-molecular-weight heparins (LMWHs) for the treatment of acute VTE and the prevention of recurrence, and for prophylaxis in surgical and medical patients. This review highlights current inadequacies in the provision of thromboprophylaxis and considers the clinical implications of the European guidelines on the prevention and treatment of VTE.

    Syncope in Elderly People: A Threatening Presentation of Pulmonary Embolism: A Case Report

    Syncope is a frequent event in old age. Its prognosis is largely unpredictable, depending on the origin of the sudden loss of consciousness. We report the case of an elderly woman affected by severe chronic heart failure, who died soon after an episode of syncope that was eventually attributed to pulmonary embolism. Anticoagulant therapy, promptly instituted, was ineffective. In the differential diagnosis of syncope, pulmonary embolism should always be considered, especially in old patients with risk factors for venous thromboembolism such as severe heart failure. In patients at high risk of death according to the widely adopted risk stratification scores, aggressive therapy may be considered even in elderly people to prevent unfavourable outcomes.

    The Post Thrombotic Syndrome


    Optimal segmentation techniques for piecewise stationary signals

    The concept of stationarity is central to signal processing; it indeed guarantees that the deterministic spectral properties of linear time-invariant systems are also applicable to realizations of stationary random processes. In almost all practical settings, however, the signal at hand is just a finite-size vector whose underlying generating process, if we are willing to admit one, is unknown; in this case, invoking stationarity is tantamount to stating that a single linear system (however complex) suffices to model the data effectively, be it for analysis, processing, or compression purposes. It is intuitively clear that if the complexity of the model can grow unchecked, its accuracy can increase arbitrarily (short of computational limitations); this defines a tradeoff in which, for a given data vector, a set of complexity/accuracy pairs is defined for each possible model. In general one is interested in parsimonious modeling; by identifying complexity with "rate" and model misadjustment with "distortion", the situation becomes akin to an operational rate-distortion (R/D) problem in which, for each possible "rate", the goal is to find the model yielding the minimum distortion. In practice, however, only a finite palette of models is available, the complexity of which is limited for computational reasons; therefore, the entire data vector often proves too "non-stationary" for any single model. If the process is just slowly drifting, adaptive systems are probably the best choice; on the other hand, a wide class of signals exhibits a series of rather abrupt transitions between locally regular portions (e.g. speech, images). In this case a common solution is to partition the data uniformly so that the resulting pieces are small enough to appear stationary with respect to the available models. However, if the goal is again to obtain an overall modeling which is optimal in the above R/D sense, it is necessary that the segmentation be a free variable in the modeling process; this is however not the case if a fixed-size time windowing is used. Note that now the reachable points in the R/D plane are in fact indexed not just by a model but by a segmentation/model-sequence pair; their number therefore grows exponentially with the size of the data vector. This thesis is concerned with the development of efficient techniques to explore this R/D set and to determine its operational lower bound for specific signal processing problems. It will be shown that, under very mild hypotheses, many practical applications dealing with nonstationary data sets can be cast as an R/D optimization problem involving segmentation, which can in turn be solved using polynomial-time dynamic programming techniques. The flexibility of the approach will be demonstrated by a very diverse set of examples: after a general overview of the various facets of the dynamic segmentation problem in Chapter 2, Chapter 3 will use the framework to determine an operational R/D bound for the approximation of piecewise polynomial functions with respect to wavelet-based approximation; Chapter 4 will show its relevance to compression problems, and in particular to speech coding based on linear prediction and to arithmetic coding for binary sources; finally, in Chapter 5, an efficient data hiding scheme for PCM audio signals will be described, in which the optimal power allocation for the hidden data is determined with respect to the time-varying characteristics of the host signal.
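
    The dynamic-programming search described above is simple enough to sketch. The following Python toy is an illustrative reconstruction, not the thesis code: the least-squares polynomial segment model, the fixed per-segment rate charge, and the Lagrangian weight lam are all assumptions made here for the example. It finds the segmentation of a 1-D signal minimizing total distortion plus lam times the model rate:

        import numpy as np

        def segment_cost(x, i, j, deg=1):
            # Squared residual of a degree-`deg` least-squares polynomial fit to x[i:j].
            if j - i <= deg + 1:
                return 0.0  # segment short enough to be fit exactly
            t = np.arange(j - i)
            coeffs = np.polyfit(t, x[i:j], deg)
            resid = x[i:j] - np.polyval(coeffs, t)
            return float(resid @ resid)

        def optimal_segmentation(x, lam=1.0, deg=1):
            # best[j] = minimal cost of modelling x[:j]; prev[j] = start of its last segment.
            n = len(x)
            rate = deg + 1                  # toy rate: one unit per model coefficient
            best = np.full(n + 1, np.inf)
            best[0] = 0.0
            prev = np.zeros(n + 1, dtype=int)
            for j in range(1, n + 1):
                for i in range(j):
                    c = best[i] + segment_cost(x, i, j, deg) + lam * rate
                    if c < best[j]:
                        best[j], prev[j] = c, i
            cuts, j = [], n                 # backtrack the optimal breakpoints
            while j > 0:
                cuts.append(j)
                j = prev[j]
            return sorted(cuts), float(best[n])

    Sweeping lam from small to large trades distortion against model rate, tracing the kind of operational R/D curve the abstract refers to; the double loop over (i, j) is what makes the search polynomial-time rather than exponential in the number of candidate segmentations.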

    Warfarin Versus Low-Molecular-Weight Heparin Therapy in Cancer Patients

    Learning Objectives: After completing this course, the reader will be able to: define characteristics of the interface between deep vein thrombosis and malignancy; evaluate patient factors that may complicate long-term warfarin use in patients with cancer; and list advantages that may be realized with low-molecular-weight heparin (versus warfarin) therapy in cancer patients. Access and take the CME test online and receive 1 hour of AMA PRA Category 1 credit at CME.TheOncologist.com. The association between cancer and venous thromboembolism (VTE) is well established. Importantly, VTE is a significant cause of mortality in cancer patients. Although long-term warfarin (Coumadin™; Bristol-Myers Squibb; New York, NY) therapy is the mainstay of treatment for cancer patients with VTE, there are many practical problems with its use in this population. In particular, achieving therapeutic drug levels is difficult in cancer patients due to the increased risk of drug interactions, malnutrition, vomiting, and liver dysfunction in these patients. Moreover, cancer patients are at an increased risk of adverse effects of warfarin therapy. In contrast, low-molecular-weight heparins (LMWHs) are associated with a lower risk of adverse events than warfarin in patients with cancer. These agents also offer practical advantages over warfarin, including more predictable anticoagulant effects and ease of administration, in addition to possible antineoplastic effects. Several LMWHs have demonstrated superior efficacy to warfarin in the secondary prevention of VTE. In particular, the LMWH dalteparin (Fragmin®; Pfizer; New York, NY) has recently been shown to have superior efficacy to warfarin in a large trial of patients with cancer and VTE without increasing the risk of bleeding. A randomized trial of dalteparin has also shown improved response rates and survival in patients with small cell lung cancer. In view of the availability of more effective and reliable alternatives to warfarin therapy in cancer patients, it is appropriate to reassess the role of warfarin in patients with cancer and VTE. Further evaluation of the LMWHs for effects on cancer outcome is indicated.

    From Lagrange to Shannon...and Back: Another Look at Sampling

    Classical digital signal processing (DSP) lore tells us the tale of a continuous-time primeval signal, of its brutal sampling, and of the magic sinc interpolation that, under the aegis of bandlimitedness, brings the original signal back to (continuous) life. In this article, we would like to switch the conventional viewpoint and cast discrete-time sequences in the lead role, with continuous-time signals entering the scene as a derived version of their gap-toothed archetypes. To this end, we will bring together some well-known but seldom-taught facts about interpolation and vector spaces and recall how the classic sinc reconstruction formula derives naturally from the Lagrange interpolation method. Such an elegant and mathematically simple result can have great educational value in building a solid yet very intuitive grasp of the interplay between analog and digital signals.
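
    The derivation the article alludes to fits in two lines; what follows is the standard argument stated from general knowledge, not a quotation from the text. For unit-spaced samples at the integers, the Lagrange basis polynomial anchored at n = 0 over the nodes -N, ..., N is

        L_0^{(N)}(t) = \prod_{k=1}^{N} \frac{(t-k)(t+k)}{(0-k)(0+k)}
                     = \prod_{k=1}^{N} \left( 1 - \frac{t^2}{k^2} \right)
        \;\xrightarrow{\;N \to \infty\;}\; \prod_{k=1}^{\infty} \left( 1 - \frac{t^2}{k^2} \right)
                     = \frac{\sin \pi t}{\pi t} = \operatorname{sinc}(t),

    by Euler's product formula for the sine. By the same shift argument, L_n^{(N)}(t) \to \operatorname{sinc}(t - n), so the Lagrange interpolant of the samples x[n] converges to the classic reconstruction formula x(t) = \sum_n x[n] \, \operatorname{sinc}(t - n).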

    R/D optimal data hiding
