
    Portable Three-Dimensional imaging to monitor small volume enhancement in face, vulva and hand: a comparative study

    Multiple handheld 3-dimensional systems are available on the market, but data regarding their use in detecting small volumes are limited. The aim of this study was to compare different portable 3D technologies in detecting small volumetric enhancement on a mannequin model and in a series of patients. Five portable 3D systems (Artec Eva, Crisalix, Go!Scan, LifeViz Mini, and Vectra H1) were tested in a controlled environment with standardised volumes and in a clinical setting with patients undergoing small-volume fat grafting to the face, vulva and hand. Accuracy was assessed with absolute and relative technical error of measurement (TEM and rTEM); precision with intra- and inter-observer reliability (rp and ICC); and usability in clinical practice with the following parameters: portability, suitability for use in the operating theatre/clinic, ease of use of hardware and software, speed of capture, image quality, patient comfort, and cost. All tested devices showed overall good accuracy in detecting small volumetric changes ranging from 0.5 to 4 cc. Structured-light scanners (Artec Eva and Go!Scan) showed high accuracy, but their use in clinical practice was limited by longer capture time, multiple wiring, and complex analysis software. Crisalix was considered the most user-friendly, the least bothersome to patients, and truly portable, but its use was limited to the face because the software does not include the vulva and hand. 3D technologies exploiting the principle of passive stereophotogrammetry, such as LifeViz Mini and Vectra H1, were the most versatile for accurately assessing multiple body areas, representing overall the best long-term value for money. Portable 3D technology is a non-invasive, accurate, and reproducible method to assess the volumetric outcome after facial, vulval and hand injectables. The choice of 3D system should be based on the clinical need and the resources available
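The accuracy metrics named above can be computed directly from paired repeat measurements; a minimal sketch of the standard TEM and rTEM formulas (the example volumes are made up, not the study's data):

```python
import math

def tem(first, second):
    """Absolute technical error of measurement for paired repeat
    measurements of the same volumes: sqrt(sum(d^2) / 2n)."""
    n = len(first)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(first, second)) / (2 * n))

def rtem(first, second):
    """Relative TEM: TEM as a percentage of the grand mean measurement."""
    grand_mean = (sum(first) + sum(second)) / (2 * len(first))
    return 100 * tem(first, second) / grand_mean

# hypothetical repeat captures (in cc) of the same 2 cc standardised volume
scan1 = [2.1, 1.9, 2.2, 2.0]
scan2 = [2.0, 2.1, 2.1, 1.9]
print(round(tem(scan1, scan2), 3), round(rtem(scan1, scan2), 2))  # 0.094 4.59
```

A lower rTEM indicates that measurement error is small relative to the volumes being measured, which matters when the enhancement of interest is only 0.5 to 4 cc.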

    Single-Cell Analysis of ADSC Interactions with Fibroblasts and Endothelial Cells in Scleroderma Skin

    Adipose-derived stem cells (ADSCs) as part of autologous fat grafting have anti-fibrotic and anti-inflammatory effects, but the exact mechanisms of action remain unknown. By simulating the interaction of ADSCs with fibroblasts and endothelial cells (EC) from scleroderma (SSc) skin in silico, we aim to unravel these mechanisms. Publicly available single-cell RNA sequencing data from the stromal vascular fraction of 3 lean patients and biopsies from the skin of 10 control and 12 patients with SSc were obtained from the GEO and analysed using R and Seurat. Differentially expressed genes were used to compare the fibroblast and EC transcriptome between controls and SSc. GO and KEGG functional enrichment was performed. Ligand–receptor interactions of ADSCs with fibroblasts and ECs were explored with LIANA. Pro-inflammatory and extracellular matrix (ECM) interacting fibroblasts were identified in SSc. Arterial, capillary, venous and lymphatic ECs showed a pro-fibrotic and pro-inflammatory transcriptome. Most interactions with both cell types were based on ECM proteins. Differential interactions identified included NTN1, VEGFD, MMP2, FGF2, and FNDC5. The ADSC secretome may disrupt vascular and perivascular inflammation hubs in scleroderma by promoting angiogenesis and especially lymphangiogenesis. Key phenomena observed after fat grafting remain unexplained, including modulation of fibroblast behaviour

    Study protocol: a randomized controlled trial of a computer-based depression and substance abuse intervention for people attending residential substance abuse treatment

    Background: A large proportion of people attending residential alcohol and other substance abuse treatment have a co-occurring mental illness. Empirical evidence suggests that it is important to treat both the substance abuse problem and the co-occurring mental illness concurrently and in an integrated fashion. However, the majority of residential alcohol and other substance abuse services do not address mental illness in a systematic way. It is likely that computer-delivered interventions could improve the ability of substance abuse services to address co-occurring mental illness. This protocol describes a study in which we will assess the effectiveness of adding a computer-delivered depression and substance abuse intervention for people who are attending residential alcohol and other substance abuse treatment. Methods/Design: Participants will be recruited from residential rehabilitation programs operated by the Australian Salvation Army. All participants who satisfy the diagnostic criteria for an alcohol or other substance dependence disorder will be asked to participate in the study. After completion of a baseline assessment, participants will be randomly assigned to either a computer-delivered substance abuse and depression intervention (treatment condition) or a computer-delivered typing tutorial (active control condition). All participants will continue to complete The Salvation Army residential program, a predominantly 12-step-based treatment program. Randomisation will be stratified by gender (male, female), length of time in the program at the commencement of the study (4 weeks or less, more than 4 weeks), and use of antidepressant medication (currently prescribed, not prescribed). Participants in both conditions will complete computer sessions twice per week over a five-week period. Research staff blind to treatment allocation will complete the assessments at baseline, and then 3, 6, 9, and 12 months post-intervention. Participants will also complete weekly self-report measures during the treatment period. Discussion: This study will provide comprehensive data on the effect of introducing a computer-delivered, cognitive behavioural therapy-based co-morbidity treatment program within a residential substance abuse setting. If shown to be effective, this intervention can be disseminated within other residential substance abuse programs. Trial registration: Australian New Zealand Clinical Trials Registry (ANZCTR): ACTRN12611000618954
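Stratified randomisation of the kind described can be sketched with permuted blocks within each stratum; the function and variable names below are illustrative, not the trial's actual allocation software, and the block size of 4 is an assumption:

```python
import random

def make_allocator(block_size=4, seed=42):
    """Permuted-block randomisation, one running block per stratum, so
    arms stay balanced within each gender x time-in-program x
    antidepressant-use stratum."""
    rng = random.Random(seed)
    blocks = {}  # stratum -> remaining allocations in the current block

    def allocate(gender, in_program_over_4wk, on_antidepressant):
        stratum = (gender, in_program_over_4wk, on_antidepressant)
        if not blocks.get(stratum):
            # refill with equal numbers of each arm, in random order
            block = (["treatment"] * (block_size // 2)
                     + ["control"] * (block_size // 2))
            rng.shuffle(block)
            blocks[stratum] = block
        return blocks[stratum].pop()

    return allocate

allocate = make_allocator()
arms = [allocate("F", True, False) for _ in range(8)]
print(arms.count("treatment"), arms.count("control"))  # 4 4
```

Because each stratum draws from its own permuted blocks, every completed block contributes equal numbers to both conditions regardless of how participants arrive.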

    Using combined diagnostic test results to hindcast trends of infection from cross-sectional data

    Infectious disease surveillance is key to limiting the consequences of infectious pathogens and maintaining animal and public health. Following the detection of a disease outbreak, a response in proportion to the severity of the outbreak is required. It is thus critical to obtain accurate information concerning the origin of the outbreak and its forward trajectory. However, there is often a lack of situational awareness that may lead to over- or under-reaction. There is a widening range of tests available for detecting pathogens, with typically different temporal characteristics, e.g. in terms of when peak test response occurs relative to time of exposure. We have developed a statistical framework that combines response-level data from multiple diagnostic tests and is able to ‘hindcast’ (infer the historical trend of) an infectious disease epidemic. Assuming diagnostic test data from a cross-sectional sample of individuals infected with a pathogen during an outbreak, we use a Bayesian Markov chain Monte Carlo (MCMC) approach to estimate time of exposure, and the overall epidemic trend in the population prior to the time of sampling. We evaluate the performance of this statistical framework on simulated data from epidemic trend curves and show that we can recover the parameter values of those trends. We also apply the framework to epidemic trend curves taken from two historical outbreaks: a bluetongue outbreak in cattle, and a whooping cough outbreak in humans. Together, these results show that hindcasting can estimate the time since infection for individuals and provide accurate estimates of epidemic trends, and can be used to distinguish whether an outbreak is increasing or past its peak. We conclude that if temporal characteristics of diagnostics are known, it is possible to recover epidemic trends of both human and animal pathogens from cross-sectional data collected at a single point in time
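A toy illustration of the core idea (not the paper's actual model): two diagnostic tests with different peak-response times are combined, and Metropolis-Hastings sampling recovers time since exposure from a single cross-sectional observation. The response-curve shape, peak times, and noise level are all illustrative assumptions:

```python
import math
import random

def response(t, peak_time):
    """Simple rise-and-decay test-response curve, peaking at peak_time."""
    return (t / peak_time) * math.exp(1.0 - t / peak_time)

PEAK_TIMES = (10.0, 20.0)   # e.g. an early-peaking and a late-peaking assay
SIGMA = 0.05                # assumed measurement noise

def log_lik(t, observed):
    if t <= 0:
        return float("-inf")
    return sum(-(o - response(t, p)) ** 2 / (2 * SIGMA ** 2)
               for o, p in zip(observed, PEAK_TIMES))

def hindcast(observed, n_iter=20000, seed=1):
    """Metropolis-Hastings sampling of time since exposure."""
    rng = random.Random(seed)
    t, samples = 5.0, []
    for _ in range(n_iter):
        prop = t + rng.gauss(0.0, 1.0)
        if math.log(rng.random()) < log_lik(prop, observed) - log_lik(t, observed):
            t = prop
        samples.append(t)
    return samples[n_iter // 2:]  # crude burn-in removal

true_t = 6.0
obs = [response(true_t, p) for p in PEAK_TIMES]
post = hindcast(obs)
print(round(sum(post) / len(post), 1))  # posterior mean, close to true_t
```

Note that a single test would leave the estimate ambiguous (the same response level occurs once on the rising and once on the falling side of the curve); combining tests with different peak times is what makes the time of exposure identifiable, mirroring the paper's use of multiple diagnostics.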

    Insights into Candida tropicalis nosocomial infections and virulence factors

    Candida tropicalis is considered the first or second most frequently isolated non-Candida albicans Candida (NCAC) species in candidosis, mainly in patients admitted to intensive care units (ICUs), especially those with cancer, requiring prolonged catheterization, or receiving broad-spectrum antibiotics. The proportion of candiduria and candidemia caused by C. tropicalis varies widely with geographical area and patient group. Indeed, in certain countries, C. tropicalis is more prevalent than C. albicans or the other NCAC species. Although prophylactic treatment with fluconazole reduces the frequency of candidosis caused by C. tropicalis, the species increasingly shows a moderate level of fluconazole resistance. The propensity of C. tropicalis for dissemination and the high mortality associated with its infections may be strongly related to the virulence factors exhibited by this species, such as adhesion to different host surfaces, biofilm formation, infection and dissemination, and enzyme secretion. Therefore, the aim of this review is to outline the present knowledge on all the above-mentioned C. tropicalis virulence traits. The authors acknowledge Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES), Brazil, for supporting Melyssa Negri (BEX 4642/06-6), and Fundação para a Ciência e a Tecnologia (FCT), Portugal, for supporting Sónia Silva (SFRH/BPD/71076/2010); support from the European Community fund FEDER, through Program COMPETE, under Project FCOMP-01-0124-FEDER-007025 (PTDC/AMB/68393/2006) is also gratefully acknowledged

    The Mitochondrial Chaperone Protein TRAP1 Mitigates α-Synuclein Toxicity

    Overexpression or mutation of α-Synuclein is associated with protein aggregation and interferes with a number of cellular processes, including mitochondrial integrity and function. We used a whole-genome screen in the fruit fly Drosophila melanogaster to search for novel genetic modifiers of human [A53T]α-Synuclein–induced neurotoxicity. Decreased expression of the mitochondrial chaperone protein tumor necrosis factor receptor associated protein-1 (TRAP1) was found to enhance age-dependent loss of fly head dopamine (DA) and DA neuron number resulting from [A53T]α-Synuclein expression. In addition, decreased TRAP1 expression in [A53T]α-Synuclein–expressing flies resulted in enhanced loss of climbing ability and sensitivity to oxidative stress. Overexpression of human TRAP1 was able to rescue these phenotypes. Similarly, human TRAP1 overexpression in rat primary cortical neurons rescued [A53T]α-Synuclein–induced sensitivity to rotenone treatment. In human neuronal and non-neuronal cell lines, small interfering RNA directed against TRAP1 enhanced [A53T]α-Synuclein–induced sensitivity to oxidative stress treatment. [A53T]α-Synuclein directly interfered with mitochondrial function, as its expression reduced Complex I activity in HEK293 cells. These effects were blocked by TRAP1 overexpression. Moreover, TRAP1 was able to prevent alteration in mitochondrial morphology caused by [A53T]α-Synuclein overexpression in human SH-SY5Y cells. These results indicate that [A53T]α-Synuclein toxicity is intimately connected to mitochondrial dysfunction and that toxicity reduction in fly and rat primary neurons and human cell lines can be achieved using overexpression of the mitochondrial chaperone TRAP1. Interestingly, TRAP1 has previously been shown to be phosphorylated by the serine/threonine kinase PINK1, thus providing a potential link from PINK1, via TRAP1, to α-Synuclein

    Study protocol for the translating research in elder care (TREC): building context – an organizational monitoring program in long-term care project (project one)

    Background: While there is a growing awareness of the importance of organizational context (or the work environment/setting) to successful knowledge translation, and of successful knowledge translation to better patient, provider (staff), and system outcomes, little empirical evidence supports these assumptions. Further, little is known about the factors that enhance knowledge translation and better outcomes in residential long-term care facilities, where care has been shown to be suboptimal. The project described in this protocol is one of the two main projects of the larger five-year Translating Research in Elder Care (TREC) program. Aims: The purpose of this project is to establish the magnitude of the effect of organizational context on knowledge translation, and subsequently on resident, staff (unregulated, regulated, and managerial) and system outcomes, in long-term care facilities in the three Canadian Prairie Provinces (Alberta, Saskatchewan, Manitoba). Methods/Design: This study protocol describes the details of a multi-level – spanning provinces, regions, facilities, units within facilities, and individuals who receive care (residents) or work (staff) in facilities – and longitudinal (five-year) research project. The sample will comprise a stratified random sample of 36 residential long-term care facilities (30 urban and 6 rural) from the Canadian Prairie Provinces. Caregivers and care managers within these facilities will be asked to complete the TREC survey – a suite of survey instruments designed to assess organizational context and related factors hypothesized to be important to successful knowledge translation and to achieving better resident, staff, and system outcomes. Facility- and unit-level data will be collected using standardized data collection forms, and resident outcomes using the Resident Assessment Instrument-Minimum Data Set version 2.0. A variety of analytic techniques will be employed, including descriptive analyses, psychometric analyses, multi-level modeling, and mixed-method analyses. Discussion: Three key challenges associated with conducting this project are discussed: sampling, participant recruitment, and sample retention; survey administration (with unregulated caregivers); and the provision of a stable set of study definitions to guide the project

    Human plague: An old scourge that needs new answers

    Yersinia pestis, the bacterial causative agent of plague, remains an important threat to human health. Plague is a rodent-borne disease that has historically shown an outstanding ability to colonize and persist across different species, habitats, and environments while provoking sporadic cases, outbreaks, and deadly global epidemics among humans. Between September and November 2017, an outbreak of urban pneumonic plague was declared in Madagascar, which refocused the attention of the scientific community on this ancient human scourge. Given recent trends and plague’s resilience to control in the wild, its high fatality rate in humans without early treatment, and its capacity to disrupt social and healthcare systems, human plague should be considered as a neglected threat. A workshop was held in Paris in July 2018 to review current knowledge about plague and to identify the scientific research priorities to eradicate plague as a human threat. It was concluded that an urgent commitment is needed to develop and fund a strong research agenda aiming to fill the current knowledge gaps structured around 4 main axes: (i) an improved understanding of the ecological interactions among the reservoir, vector, pathogen, and environment; (ii) human and societal responses; (iii) improved diagnostic tools and case management; and (iv) vaccine development. These axes should be cross-cutting, translational, and focused on delivering context-specific strategies. Results of this research should feed a global control and prevention strategy within a “One Health” approach

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was prospectively to develop a pragmatic prognostic model to stratify patients according to risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors into the model. Internal model validation was carried out by bootstrap validation. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability
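A c-statistic of 0·65 means the model ranks a randomly chosen AKI patient above a randomly chosen non-AKI patient 65 per cent of the time. A minimal sketch of that concordance calculation (the predicted risks are illustrative, not study data):

```python
def c_statistic(event_scores, nonevent_scores):
    """Concordance probability: P(score_event > score_nonevent),
    counting ties as half-concordant."""
    concordant = 0.0
    for e in event_scores:
        for n in nonevent_scores:
            if e > n:
                concordant += 1.0
            elif e == n:
                concordant += 0.5
    return concordant / (len(event_scores) * len(nonevent_scores))

# hypothetical predicted risks for patients with and without postoperative AKI
aki = [0.40, 0.70, 0.30]
no_aki = [0.20, 0.50, 0.10, 0.35]
print(c_statistic(aki, no_aki))  # 0.75
```

In practice the study's bootstrap validation repeats model fitting within resampled data to estimate how much this apparent discrimination is optimistic, which is why internally validated c-statistics are reported rather than the raw in-sample value.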

    Increased mitochondrial DNA diversity in ancient Columbia River basin Chinook salmon Oncorhynchus tshawytscha

    The Columbia River and its tributaries provide essential spawning and rearing habitat for many salmonid species, including Chinook salmon (Oncorhynchus tshawytscha). Chinook salmon were historically abundant throughout the basin and Native Americans in the region relied heavily on these fish for thousands of years. Following the arrival of Europeans in the 1800s, salmon in the basin experienced broad declines linked to overfishing, water diversion projects, habitat destruction, connectivity reduction, introgression with hatchery-origin fish, and hydropower development. Despite historical abundance, many native salmonids are now at risk of extinction. Research and management related to Chinook salmon are usually explored under what are termed “the four H’s”: habitat, harvest, hatcheries, and hydropower; here we explore a fifth H, history. Patterns of prehistoric and contemporary mitochondrial DNA variation from Chinook salmon were analyzed to characterize and compare population genetic diversity prior to recent alterations and, thus, elucidate a deeper history for this species. A total of 346 ancient and 366 contemporary samples were processed during this study. Species was determined for 130 of the ancient samples and control region haplotypes of 84 of these were sequenced. Diversity estimates from these 84 ancient Chinook salmon were compared to 379 contemporary samples. Our analysis provides the first direct measure of reduced genetic diversity for Chinook salmon from the ancient to the contemporary period, as measured both in direct loss of mitochondrial haplotypes and reductions in haplotype and nucleotide diversity. However, these losses do not appear equal across the basin, with higher losses of diversity in the mid-Columbia than in the Snake subbasin. The results are unexpected, as the two groups were predicted to share a common history as parts of the larger Columbia River Basin, and instead indicate that Chinook salmon in these subbasins may have divergent demographic histories
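The two diversity measures compared between ancient and contemporary samples follow standard formulas; a sketch using made-up toy haplotypes, not the study's control-region data:

```python
from collections import Counter
from itertools import combinations

def haplotype_diversity(haplotypes):
    """Nei's unbiased haplotype diversity: (n/(n-1)) * (1 - sum(p_i^2)),
    where p_i are haplotype frequencies in the sample."""
    n = len(haplotypes)
    freqs = [c / n for c in Counter(haplotypes).values()]
    return (n / (n - 1)) * (1 - sum(p * p for p in freqs))

def nucleotide_diversity(haplotypes):
    """Average pairwise nucleotide differences per site over all
    pairs of aligned sequences."""
    n, length = len(haplotypes), len(haplotypes[0])
    total = sum(sum(a != b for a, b in zip(s1, s2)) / length
                for s1, s2 in combinations(haplotypes, 2))
    return total / (n * (n - 1) / 2)

# toy 4-bp "control region" haplotypes for four individuals
sample = ["ACGT", "ACGT", "ACGA", "TCGA"]
print(round(haplotype_diversity(sample), 3),
      round(nucleotide_diversity(sample), 3))  # 0.833 0.292
```

A loss of haplotypes over time lowers both statistics, but nucleotide diversity also captures how genetically distinct the surviving haplotypes are, which is why the study reports the two measures separately.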