
    Rare case of autonomic instability of the lower limb presenting as painless Complex Regional Pain Syndrome type I following hip surgery: two case reports

    Introduction: According to the 1994 criteria of the International Association for the Study of Pain, pain is a diagnostic requirement for Complex Regional Pain Syndrome type I. However, other authors have suggested that patients can, in rare cases, present with the sensory and vascular symptoms of Complex Regional Pain Syndrome without pain. This entity has not been reported following hip surgery in the English medical literature.
    Case presentation: We present two cases of Complex Regional Pain Syndrome-like symptoms following hip surgery with a total absence of pain. The first case was a 29-year-old Caucasian woman who had a reattachment of the greater trochanter following non-union of an intertrochanteric osteotomy of the hip. Five weeks later, the patient presented with features of Complex Regional Pain Syndrome but without pain. The second patient was a 20-year-old Caucasian woman who had undergone an open debridement and repair of a torn acetabular labrum. Ten days later, the patient presented with features suggestive of Complex Regional Pain Syndrome, again painless. Both patients were non-weight bearing at presentation, and the symptoms resolved following recommencement of weight bearing.
    Conclusions: The authors believe these symptoms are manifestations of vascular changes in the lower limb resulting from non-weight-bearing status. Painless Complex Regional Pain Syndrome-like symptoms may occur in patients who are kept non-weight bearing following hip surgery. However, vascular insufficiency and deep venous thrombosis must be excluded before this diagnosis is made. If the clinical situation permits, early weight bearing may relieve symptoms. Orthopaedic and vascular surgeons should be aware of this entity when a postoperative patient presents with the above clinical picture. This is also relevant to general practitioners, who are likely to see these patients in the postoperative period.

    Characterization of complex networks: A survey of measurements

    Each complex network (or class of networks) presents specific topological features which characterize its connectivity and strongly influence the dynamics of processes executed on the network. The analysis, discrimination, and synthesis of complex networks therefore rely on measurements capable of expressing the most relevant topological features. This article presents a survey of such measurements. It includes general considerations about complex network characterization, a brief review of the principal models, and a presentation of the main existing measurements. Important related issues covered in this work comprise the representation of the evolution of complex networks as trajectories in several measurement spaces, the analysis of the correlations between some of the most traditional measurements, perturbation analysis, and the use of multivariate statistics for feature selection and network classification. Depending on the network and the analysis task one has in mind, a specific set of features may be chosen. It is hoped that the present survey will help the proper application and interpretation of measurements. (Comment: a working manuscript with 78 pages and 32 figures; suggestions of measurements for inclusion are welcomed by the author.)
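To make the survey's subject concrete, here is a minimal sketch (not taken from the survey itself) computing two of the classic topological measurements it covers, node degree and the local clustering coefficient, on a toy undirected graph. The graph and function names are illustrative assumptions.

```python
def degrees(adj):
    """Degree of each node: the number of neighbours in the adjacency dict."""
    return {v: len(nbrs) for v, nbrs in adj.items()}

def clustering(adj, v):
    """Local clustering coefficient of node v: the fraction of pairs of
    v's neighbours that are themselves connected by an edge."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    # Count edges between neighbours, each pair once (a < b).
    links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
    return 2.0 * links / (k * (k - 1))

# Toy network: a triangle (0, 1, 2) plus a pendant node 3 attached to node 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(degrees(adj))        # node degrees
print(clustering(adj, 2))  # only one of node 2's three neighbour pairs is linked
```

The same quantities (and many more of the survey's measurements) are available in graph libraries such as NetworkX; this hand-rolled version only illustrates the definitions.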

    Semi-automated non-target processing in GC × GC–MS metabolomics analysis: applicability for biomedical studies

    Because typical metabolomics samples are complex, and because obtaining quantitative data from GC × GC–MS requires many steps (deconvolution, peak picking, peak merging, and integration), unbiased non-target quantification of GC × GC–MS data still poses a major challenge in metabolomics analysis. The feasibility of using commercially available software for non-target processing of GC × GC–MS data was assessed. For this purpose, a set of mouse liver samples (24 study samples and five quality control (QC) samples prepared from the study samples) was measured with GC × GC–MS and GC–MS to study the development and progression of insulin resistance, a primary characteristic of type 2 diabetes. A total of 170 and 691 peaks were quantified in, respectively, the GC–MS and GC × GC–MS data for all study and QC samples. The quantitative results for the QC samples were compared to assess the quality of semi-automated GC × GC–MS processing against targeted GC–MS processing, which involved time-consuming manual correction of all wrongly integrated metabolites and was considered the gold standard. The relative standard deviations (RSDs) obtained with GC × GC–MS were somewhat higher than with GC–MS, owing to less accurate processing. Still, the biological information in the study samples was preserved, and the added value of GC × GC–MS was demonstrated: many additional candidate biomarkers were found with GC × GC–MS compared to GC–MS.
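The quality measure compared between the two platforms, the relative standard deviation of a metabolite's peak areas across repeated QC-sample measurements, follows a standard formula. This is a minimal illustrative sketch, not the authors' pipeline, and the peak-area values are hypothetical.

```python
from statistics import mean, stdev

def rsd_percent(values):
    """Relative standard deviation in percent: sample standard
    deviation divided by the mean, times 100."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical peak areas for one metabolite in five QC samples.
qc_areas = [1020.0, 980.0, 1005.0, 995.0, 1000.0]
print(round(rsd_percent(qc_areas), 2))
```

A lower RSD across QC samples indicates more reproducible quantification, which is why the metric served to compare semi-automated GC × GC–MS processing with the manually corrected GC–MS results.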

    Rationalization and Design of the Complementarity Determining Region Sequences in an Antibody-Antigen Recognition Interface

    Protein-protein interactions are critical determinants in biological systems. Engineered proteins binding to specific areas on protein surfaces could lead to therapeutics or diagnostics for treating diseases in humans. However, designing epitope-specific protein-protein interactions based on computational atomistic interaction free energies remains a difficult challenge. Here we show that, with the antibody-VEGF (vascular endothelial growth factor) interaction as a model system, the experimentally observed amino acid preferences in the antibody-antigen interface can be rationalized with 3-dimensional distributions of interacting atoms derived from the database of protein structures. Machine learning models established on this rationalization can be generalized to design amino acid preferences in antibody-antigen interfaces, for which experimental validations are tractable with current high-throughput synthetic antibody display technologies. Leave-one-out cross validation on the benchmark system yielded accuracy, precision, recall (sensitivity), and specificity of 0.69, 0.45, 0.63, and 0.71, respectively, for the overall binary predictions, and the overall Matthews correlation coefficient across the 20 amino acid types in the 24 interface CDR positions was 0.312. The structure-based computational antibody design methodology was further tested with other antibodies binding to VEGF. The results indicate that the methodology could provide an alternative to current antibody technologies based on animal immune systems for engineering therapeutic and diagnostic antibodies against predetermined antigen epitopes.
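The evaluation metrics reported above (accuracy, precision, recall, specificity, and the Matthews correlation coefficient) follow standard confusion-matrix formulas. The sketch below, which is not the authors' code and uses hypothetical counts, shows how they are computed from the four cell counts.

```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Standard binary classification metrics from confusion-matrix counts:
    tp/fp/tn/fn = true/false positives and true/false negatives."""
    acc = (tp + tn) / (tp + fp + tn + fn)
    prec = tp / (tp + fp)
    rec = tp / (tp + fn)   # recall, also called sensitivity
    spec = tn / (tn + fp)  # specificity
    # Matthews correlation coefficient: balanced even for skewed classes.
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return acc, prec, rec, spec, mcc

# Hypothetical counts chosen only to illustrate the formulas.
print(binary_metrics(tp=30, fp=20, tn=40, fn=10))
```

MCC ranges from -1 to +1 and is often preferred over accuracy when, as in interface-residue prediction, the positive and negative classes are imbalanced.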

    Individual-environment interactions in swimming: The smallest unit for analysing the emergence of coordination dynamics in performance?

    Displacement in competitive swimming is highly dependent on fluid characteristics, since athletes use these properties to propel themselves. It is essential for sport scientists and practitioners to clearly identify the interactions that emerge between each individual swimmer and the properties of the aquatic environment. Traditionally, the two protagonists in these interactions have been studied separately. Determining the impact of each swimmer's movements on fluid flow, and vice versa, is a major challenge. Classic biomechanical research approaches have focused on swimmers' actions, decomposing stroke characteristics for analysis without exploring perturbations to fluid flows. Conversely, fluid mechanics research has sought to record fluid behaviours in isolation from the constraints of competitive swimming environments (e.g. two-dimensional analyses, or fluid flows studied passively on mannequins or robotic effectors). With improvements in technology, however, recent investigations have focused on the emergent circular couplings between swimmers' movements and fluid dynamics. Here, we provide insights into concepts and tools that can explain these ongoing dynamical interactions in competitive swimming within the theoretical framework of ecological dynamics.

    Pb(II) Induces Scramblase Activation and Ceramide-Domain Generation in Red Blood Cells

    The mechanisms of Pb(II) toxicity have been studied in human red blood cells using confocal microscopy, immunolabeling, fluorescence-activated cell sorting and atomic force microscopy. The process follows a sequence of events, starting with calcium entry, followed by potassium release, a morphological change, generation of ceramide, lipid flip-flop and finally cell lysis. When clotrimazole blocks the potassium channels, the whole process is inhibited. Immunolabeling reveals the generation of ceramide-enriched domains linked to the cell morphological change, while a neutral sphingomyelinase inhibitor greatly delays the process beyond the morphological change and significantly reduces lipid flip-flop. These facts point to three major checkpoints in the process: first the upstream exchange of calcium and potassium, then ceramide-domain formation, and finally the downstream scramblase activation necessary for cell lysis. In addition, partial non-cytotoxic cholesterol depletion of red blood cells accelerates the process, as the morphological change occurs faster. Cholesterol could have a role in modulating the properties of the ceramide-enriched domains. This work is relevant in the context of cell death, heavy metal toxicity and sphingolipid signaling.
    Funding: AGA was a predoctoral student supported by the Basque Government and later by the University of the Basque Country (UPV/EHU). This work was also supported in part by grants from the Spanish Government (FEDER/MINECO BFU 2015-66306-P to F.M.G. and A.A.) and the Basque Government (IT849-13 to F.M.G. and IT838-13 to A.A.), and by the Swiss National Science Foundation.

    Snake Cytotoxins Bind to Membranes via Interactions with Phosphatidylserine Head Groups of Lipids

    The major representatives of Elapidae snake venom, cytotoxins (CTs), share a similar three-fingered fold and exert a diverse range of biological activities against various cell types. CT-induced cell death starts from the membrane recognition process, whose molecular details remain unclear. It is known, however, that the presence of anionic lipids in cell membranes is one of the important factors determining CT-membrane binding. In this work, we therefore investigated specific interactions between one of the most abundant such lipids, phosphatidylserine (PS), and CT 4 of Naja kaouthia using a combined experimental and modeling approach. It was shown that incorporation of PS into zwitterionic liposomes greatly increased the membrane-damaging activity of CT 4, measured by the release of the liposome-entrapped calcein fluorescent dye. The CT-induced leakage rate depends on the PS concentration, with a maximum at approximately 20% PS. Interestingly, the effects observed for PS were much more pronounced than those measured for another anionic lipid, sulfatide. To delineate the potential PS-binding sites on CT 4 and estimate their relative affinities, a series of computer simulations was performed for systems containing the head group of PS and different spatial models of CT 4, in aqueous solution and in an implicit membrane. This was done using an original hybrid computational protocol combining docking, Monte Carlo and molecular dynamics simulations. As a result, at least three putative PS-binding sites with different affinities for the PS molecule were delineated. Being located in different parts of the CT molecule, these anion-binding sites can potentially facilitate and modulate the multi-step process of toxin insertion into lipid bilayers. This feature, together with the diverse binding affinities of the sites for a wide variety of anionic targets on the membrane surface, appears to be functionally meaningful and may tune CT action against different types of cells.

    OVERHEATED SECURITY? The Securitisation of Climate Change and the Governmentalisation of Security

    Since the mid-2000s, climate change has become one of the defining security issues in political as well as academic debates and has, among other venues, repeatedly been discussed in the UN Security Council and in countless high-level government reports in various countries. Beyond the question of whether the characterisation as a ‘security issue’ is backed up by robust empirical findings, this raises the question of whether the ‘securitisation’ of climate change itself has had tangible political consequences. Moreover, within this research area there is still a lively discussion about which security conceptions apply, how to conceptualise (successful) securitisation, and whether securitisation is a politically and normatively desirable approach to dealing with climate change. The aim of this dissertation is to shed light on these issues and particularly to contribute to a more thorough understanding of different forms or ‘discourses’ of securitisation and their political effects, on both a theoretical and an empirical level. Theoretically, it conceptualises securitisation as resting on different forms of power, which are derived from Michel Foucault’s governmentality lectures. The main argument is that this framework allows me to better capture the ambiguous and diverse variants of securitisation and the ever-changing concept of security, as well as to come to a more thorough understanding of the political consequences and powerful effects of constructing issues in terms of security. Empirically, the thesis looks at three country cases, namely the United States, Germany and Mexico. This comparative angle allows me to go beyond the existing literature on the securitisation of climate change, which mostly looks at the global level, and to come to a more comprehensive and detailed understanding of different climate security discourses and their political consequences.
    Concerning the main results, the thesis finds that climate change has indeed been securitised very differently in the three countries and has thus facilitated diverse political consequences. These range from the incorporation of climate change into the defence sector in the US, to the legitimisation of far-reaching climate policies in Germany, to the integration of climate change into several civil protection and agricultural insurance schemes in Mexico. Moreover, resting on different forms of power, the securitisation of climate change has played a key role in constructing specific actors and forms of knowledge as legitimate, as well as in shaping certain identities in the face of the dangers of climate change. From a normative perspective, none of these political consequences is purely good or bad; each is highly ambiguous and necessitates a careful, contextual assessment.