
    The Suitability of 3D Data: 3D Digitisation of Human Remains

    The use of 3D data in the analysis of skeletal and fossil materials has conferred numerous advantages in many fields; however, as the availability and use of 3D scanning equipment rapidly increase, it is important for researchers to consider whether these methods are suitable for the proposed study. The issue of suitability has been largely overlooked in previous research; for instance, casts and reconstruction methods are frequently used to increase sample sizes without sufficient assessment of the effect this may have on the accuracy and reliability of results. Furthermore, the reliability of geometric morphometric methods and the implications of virtual curation have not received sufficient consideration. This paper discusses the suitability of 3D research with regard to the accuracy, reliability, and accessibility of methods and materials, as well as the effects of the current learning environment. Areas where future work will progress 3D research are proposed.

    IDEAL-D Framework for Device Innovation: A Consensus Statement on the Preclinical Stage

    OBJECTIVE: To extend the IDEAL Framework for device innovation, IDEAL-D, to include the preclinical stage of development (Stage 0). BACKGROUND: In previous work, the IDEAL collaboration has proposed frameworks for new surgical techniques and complex therapeutic technologies, the central tenet being that development and evaluation can and should proceed together in an ordered and logical manner that balances innovation and safety. METHODS: Following agreement at the IDEAL Collaboration Council, a multidisciplinary working group was formed comprising 12 representatives from healthcare, academia, industry, and a patient advocate. The group conducted a series of discussions following the principles used in the development of the original IDEAL Framework. Importantly, IDEAL aims for maximal transparency, optimal validity in the evaluation of primary effects, and minimisation of potential risk to patients or others. The proposals were subjected to further review and editing by members of the IDEAL Council before a final consensus version was adopted. RESULTS: In considering which studies are required before a first-in-human study, we have: (1) classified devices according to what they do and the risks they carry, (2) classified studies according to what they show about the device, and (3) made recommendations based on the principle that the more invasive and high-risk a device is, the greater the proof required of its safety and effectiveness before progression to clinical studies (Stage 1). CONCLUSIONS: The proposed recommendations for preclinical evaluation of medical devices represent a proportionate and pragmatic approach that balances the de-risking of first-in-human translational studies against the benefits of rapid translation of new devices into clinical practice.

    Aortic dissection at the University hospital of the West Indies: A 20-year clinicopathological study of autopsy cases

    Background: An autopsy study of aortic dissection (AD) at our institution was previously reported. In the approximately 20 years since then, however, many aspects of the diagnosis and treatment of this disease have changed, with a fall in mortality reported in many centers around the world. An impression amongst our pathologists that there might be an increase in the prevalence of AD in the autopsy service at our hospital since that earlier report led to this repeated study, in an attempt to validate that notion. We also sought to identify any changes in clinicopathological features between the two series, or any occurring during this study period itself. Findings: All cases of AD identified at autopsy during the 20-year period since the conclusion of the last study were collected, and pertinent clinical and pathological data were analyzed and compared, both within the two decades of this study period and against the results of the last study. Fifty-six cases comprised this study group, including 36 males and 20 females, with a mean age of 63.9 years. There were more patients in the second decade (n = 33; 59%) than in the first (n = 23; 41%). Hypertension as a risk factor was identified in 52 (93%) cases and rupture occurred in 49 (88%) cases. A clinical diagnosis of AD was considered prior to surgery or autopsy in 25 (45%) cases overall, more during the second decade. Surgery was attempted in 25% of all cases, with an increase in the second decade compared with the first. Conclusions: Compared with the earlier review, a variety of changes in the profile of patients with AD in the autopsy service has been noted, including a reversal of the female predominance seen previously. Other observations include an increase in cases where the correct clinical diagnosis was considered and in which surgical treatment was attempted, changes also evident when the second decade of the present study was compared with the earlier decade. Overall, there were many positive trends. However, areas that could still be improved include an increased index of suspicion for the diagnosis of AD and perhaps earlier initiation of treatment in those cases where the correct diagnosis was considered.

    Rates of glycaemic deterioration in a real-world population with type 2 diabetes

    Aims/hypothesis: There is considerable variability in how diabetes progresses after diagnosis. Progression modelling has largely focused on 'time to failure' methods, yet determining a 'coefficient of failure' has many advantages. We derived a rate of glycaemic deterioration in type 2 diabetes, using a large real-world cohort, and aimed to investigate the clinical, biochemical, pharmacological and immunological variables associated with fast and slow rates of glycaemic deterioration. Methods: An observational cohort study was performed using the electronic medical records from participants in the Genetics of Diabetes Audit and Research in Tayside Study (GoDARTS). A model was derived based on an individual's observed HbA(1c) measures from the first eligible HbA(1c) after the diagnosis of diabetes through to the study end (defined as insulin initiation, death, leaving the area or end of follow-up). Each HbA(1c) measure was time-dependently adjusted for the effects of non-insulin glucose-lowering drugs, changes in BMI and corticosteroid use. GAD antibody (GADA) positivity was defined as GAD titres above the 97.5th centile of the population distribution. Results: The mean (95% CI) glycaemic deterioration for type 2 diabetes and GADA-positive individuals was 1.4 (1.3, 1.4) and 2.8 (2.4, 3.3) mmol/mol HbA(1c) per year, respectively. A younger age of diagnosis, lower HDL-cholesterol concentration, higher BMI and earlier calendar year of diabetes diagnosis were independently associated with higher rates of glycaemic deterioration in individuals with type 2 diabetes. The rate of deterioration in those diagnosed at over 70 years of age was very low, with 66% having a rate of deterioration of less than 1.1 mmol/mol HbA(1c) per year, and only 1.5% progressing more rapidly than 4.4 mmol/mol HbA(1c) per year. Conclusions/interpretation: We have developed a novel approach for modelling the progression of diabetes in observational data across multiple drug combinations. 
This approach highlights how glycaemic deterioration in those diagnosed at over 70 years of age is minimal, supporting a stratified approach to diabetes management.
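The "coefficient of failure" idea described above can be sketched simply: rather than modelling time to an endpoint, fit a linear slope to each individual's HbA(1c) trajectory. The sketch below uses synthetic data and plain least squares; it is an illustration of the concept only, not the GoDARTS pipeline, which additionally adjusts each measure for glucose-lowering drugs, BMI change and corticosteroid use.

```python
# Illustrative sketch (not the authors' code): a per-individual rate of
# glycaemic deterioration as the linear slope of HbA1c (mmol/mol) over time.
import numpy as np

rng = np.random.default_rng(1)

def deterioration_rate(years, hba1c):
    """Slope of HbA1c vs. time, in mmol/mol HbA1c per year (least squares)."""
    slope, _intercept = np.polyfit(years, hba1c, deg=1)
    return slope

# Simulate one patient: baseline 52 mmol/mol, deteriorating ~1.4 mmol/mol/year
# (the mean rate reported for type 2 diabetes), with measurement noise.
years = np.arange(0, 8, 0.5)
hba1c = 52 + 1.4 * years + rng.normal(scale=1.0, size=years.size)

rate = deterioration_rate(years, hba1c)
print(f"estimated deterioration: {rate:.2f} mmol/mol HbA1c per year")
```

In a real analysis the slope would be estimated within a mixed model across all patients, with each HbA(1c) measure time-dependently adjusted as the abstract describes.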

    Uncoupling the functions of CALM in VAMP sorting and clathrin-coated pit formation.

    CALM (clathrin assembly lymphoid myeloid leukemia protein) is a cargo-selective adaptor for the post-Golgi R-SNAREs VAMPs 2, 3, and 8, and it also regulates the size of clathrin-coated pits and vesicles at the plasma membrane. The present study has two objectives: to determine whether CALM can sort additional VAMPs, and to investigate whether VAMP sorting contributes to CALM-dependent vesicle size regulation. Using a flow cytometry-based endocytosis efficiency assay, we demonstrate that CALM is also able to sort VAMPs 4 and 7, even though they have sorting signals for other clathrin adaptors. CALM homologues are present in nearly every eukaryote, suggesting that the CALM family may have evolved as adaptors for retrieving all post-Golgi VAMPs from the plasma membrane. Using a knockdown/rescue system, we show that wild-type CALM restores normal VAMP sorting in CALM-depleted cells, but that two non-VAMP-binding mutants do not. However, when we assayed the effect of CALM depletion on coated pit morphology, using a fluorescence microscopy-based assay, we found that the two mutants were as effective as wild-type CALM. Thus, we can uncouple the sorting function of CALM from its structural role.

    Monocytes regulate the mechanism of T-cell death by inducing Fas-mediated apoptosis during bacterial infection.

    Monocytes and T-cells are critical to the host response to acute bacterial infection, but monocytes are primarily viewed as amplifying the inflammatory signal. The mechanisms of cell death regulating T-cell numbers at sites of infection are incompletely characterized. T-cell death in cultures of peripheral blood mononuclear cells (PBMC) showed 'classic' features of apoptosis following exposure to pneumococci. Conversely, purified CD3(+) T-cells cultured with pneumococci demonstrated necrosis with membrane permeabilization. The death of purified CD3(+) T-cells was not inhibited by necrostatin, but required the bacterial toxin pneumolysin. Apoptosis of CD3(+) T-cells in PBMC cultures required 'classical' CD14(+) monocytes, which enhanced T-cell activation. CD3(+) T-cell death was enhanced in HIV-seropositive individuals. Monocyte-mediated CD3(+) T-cell apoptotic death was Fas-dependent both in vitro and in vivo. In the early stages of the T-cell-dependent host response to pneumococci, reduced Fas ligand-mediated T-cell apoptosis was associated with decreased bacterial clearance in the lung and increased bacteremia. In summary, monocytes converted pathogen-associated necrosis into Fas-dependent apoptosis and regulated levels of activated T-cells at sites of acute bacterial infection. These changes were associated with enhanced bacterial clearance in the lung and reduced levels of invasive pneumococcal disease.

    Automatic structure classification of small proteins using random forest

    Background: Random forest, an ensemble-based supervised machine learning algorithm, is used to predict the SCOP structural classification for a target structure, based on the similarity of its structural descriptors to those of a template structure with an equal number of secondary structure elements (SSEs). An initial assessment of random forest is carried out for domains consisting of three SSEs. The usability of random forest in classifying larger domains is demonstrated by applying it to domains consisting of four, five and six SSEs. Results: Random forest, trained on SCOP version 1.69, achieves a predictive accuracy of up to 94% on an independent and non-overlapping test set derived from SCOP version 1.73. For classification to the SCOP Class, Fold, Superfamily or Family levels, the predictive quality of the model in terms of the Matthews correlation coefficient (MCC) ranged from 0.61 to 0.83. As the number of constituent SSEs increases, the MCC for classification to different structural levels decreases. Conclusions: The utility of random forest in classifying domains from the placeholder classes of SCOP to the true Class, Fold, Superfamily or Family levels is demonstrated. Issues such as the introduction of a new structural level in SCOP and the merger of singleton levels can also be addressed using random forest. A real-world scenario is mimicked by predicting the classification for those protein structures from the PDB which are yet to be assigned to the SCOP classification hierarchy.
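The workflow the abstract describes — train a random forest on fixed-length structural descriptors, predict a SCOP-style label, score with the Matthews correlation coefficient — can be sketched as follows. The feature vectors and labels here are synthetic stand-ins (the paper derives real descriptors from SSE geometry and trains on SCOP 1.69); this is a minimal illustration of the technique, not the authors' implementation.

```python
# Hypothetical sketch: random-forest classification of protein domains from
# structural descriptors, evaluated with the Matthews correlation coefficient.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(0)

# Synthetic "structural descriptors" for domains with three SSEs, e.g.
# inter-SSE distances and angles flattened into a fixed-length vector.
n_train, n_test, n_features = 300, 100, 12
X_train = rng.normal(size=(n_train, n_features))
y_train = rng.integers(0, 3, size=n_train)   # three placeholder fold labels
X_test = rng.normal(size=(n_test, n_features))
y_test = rng.integers(0, 3, size=n_test)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

# MCC is the quality measure reported in the paper (0.61-0.83 on real data);
# on random synthetic labels it will hover near zero.
mcc = matthews_corrcoef(y_test, pred)
print(f"MCC on held-out set: {mcc:.2f}")
```

Because descriptors are only comparable between domains with the same number of SSEs, the paper effectively trains one such model per SSE count (three, four, five and six).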

    Diversity of sympathetic vasoconstrictor pathways and their plasticity after spinal cord injury

    Sympathetic vasoconstrictor pathways pass through paravertebral ganglia, carrying ongoing and reflex activity arising within the central nervous system to their vascular targets. The pattern of reflex activity is selective for particular vascular beds and appropriate for the physiological outcome (vasoconstriction or vasodilation). The preganglionic signals are distributed to most postganglionic neurones in ganglia via synapses that are always suprathreshold for action potential initiation (like skeletal neuromuscular junctions). Most postganglionic neurones receive only one of these "strong" inputs, other preganglionic connections being ineffective. Pre- and postganglionic neurones discharge normally at frequencies of 0.5–1 Hz and maximally in short bursts at <10 Hz. Animal experiments have revealed unexpected changes in these pathways following spinal cord injury. (1) After destruction of preganglionic neurones or axons, surviving terminals in ganglia sprout and rapidly re-establish strong connections, probably even to inappropriate postganglionic neurones. This could explain aberrant reflexes after spinal cord injury. (2) Cutaneous (tail) and splanchnic (mesenteric) arteries taken from below a spinal transection show dramatically enhanced responses in vitro to norepinephrine released from perivascular nerves. However, the mechanisms that are modified differ between the two vessels, being mostly postjunctional in the tail artery and mostly prejunctional in the mesenteric artery. The changes are mimicked when postganglionic neurones are silenced by removal of their preganglionic input. Whether or not other arteries are also hyperresponsive to reflex activation, these observations suggest that the greatest contribution to raised peripheral resistance in autonomic dysreflexia follows the modifications of neurovascular transmission.

    Control of Cyclin C Levels during Development of Dictyostelium

    Background: Cdk8 and its partner cyclin C form part of the mediator complex, which links the basal transcription machinery to regulatory proteins. The pair are required for correct regulation of a subset of genes and have been implicated in the control of development in a number of organisms, including the social amoeba Dictyostelium discoideum. When feeding, Dictyostelium amoebae are unicellular, but upon starvation they aggregate to form a multicellular structure which develops into a fruiting body containing spores. Cells in which the gene encoding Cdk8 has been deleted fail to enter aggregates due to a failure of early gene expression. Principal Findings: We have monitored the expression levels of cyclin C protein during development and find that levels decrease after the multicellular mound is formed. This decrease is triggered by extracellular cAMP that, in turn, is working in part through an increase in intracellular cAMP. The loss of cyclin C is coincident with a reduction in the association of Cdk8 with a high molecular weight complex in the nucleus. Overexpression of cyclin C and Cdk8 leads to an increased rate of early development, consistent with the levels being rate limiting. Conclusions: Overall, these results show that both cyclin C and Cdk8 are regulated during development in response to extracellular signals, and that the levels of these proteins are important in controlling the timing of developmental processes. These findings have important implications for the role of these proteins in controlling development, suggesting that they are targets for developmental signals to regulate gene expression.