
    Characterization of multinucleated giant cells in synovium and subchondral bone in knee osteoarthritis and rheumatoid arthritis

    Background: Multinucleated giant cells have been observed in diverse arthritic conditions since their first description in rheumatoid synovium, yet their role in the pathogenesis of osteoarthritis (OA) and rheumatoid arthritis (RA) remains largely unknown. We aimed to study the presence and characteristics of multinucleated giant cells (MGC) in both synovium and subchondral bone of patients with OA or RA. Methods: Knee synovial and subchondral bone samples were obtained from age-matched patients undergoing total joint replacement for OA or RA, or from non-arthritic post mortem (PM) controls. OA synovium was stratified by histological inflammation grade using index tissue sections, and synovitis was assessed by Krenn score. Histological studies employed specific antibodies against macrophage markers or cathepsin K, or a TRAP enzymatic assay. Results: Inflamed OA and RA synovia displayed more multinucleated giant cells than non-inflamed OA and PM synovia, and MGC numbers were significantly associated with synovitis severity. A TRAP-negative/cathepsin K-negative Langhans-like subtype was predominant in OA, whereas both Langhans-like and TRAP-positive/cathepsin K-negative foreign-body-like subtypes were most commonly detected in RA. Plasma-like and foam-like subtypes were also observed in OA and RA synovia, the latter surrounding adipocytes. TRAP-positive/cathepsin K-positive osteoclasts were identified only adjacent to subchondral bone surfaces, and TRAP-positive osteoclasts were significantly increased in subchondral bone in OA and RA compared with PM controls. Conclusions: Multinucleated giant cells are associated with synovitis severity, and subchondral osteoclast numbers are increased in OA as well as in RA. Further research targeting multinucleated giant cells is warranted to elucidate their contributions to the symptoms and joint damage associated with arthritis.
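    The abstract reports a significant association between MGC numbers and synovitis severity as graded by the Krenn score. As a purely illustrative sketch (not the authors' analysis), a rank-based association test of the kind commonly applied to graded histology scores could look like the following; the counts and scores are hypothetical.

```python
# Illustrative only: hypothetical MGC counts per synovial section and
# hypothetical Krenn synovitis scores (0-9); these are not study data.
from scipy.stats import spearmanr

mgc_counts = [0, 1, 1, 3, 5, 8, 12, 15]    # multinucleated giant cells per section (hypothetical)
krenn_scores = [1, 2, 2, 3, 5, 6, 7, 8]    # Krenn synovitis score per section (hypothetical)

# Spearman's rank correlation suits the ordinal, non-normal nature of both measures.
rho, p_value = spearmanr(mgc_counts, krenn_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```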

    Nucleic acid-based fluorescent probes and their analytical potential

    It is well known that nucleic acids play an essential role in living organisms: they store and transmit genetic information and use that information to direct the synthesis of proteins. However, less is known about the ability of nucleic acids to bind specific ligands and about the application of oligonucleotides as molecular probes or biosensors. Oligonucleotide probes are single-stranded nucleic acid fragments that can be tailored to have high specificity and affinity for diverse targets, including nucleic acids, proteins, small molecules, and ions. Oligonucleotide-based probes fall into two main categories: hybridization probes, which rely on the formation of complementary base pairs, and aptamer probes, which exploit selective recognition of non-nucleic acid analytes and may be compared with immunosensors. The design and construction of hybridization and aptamer probes are similar. Typically, an oligonucleotide (DNA or RNA) with a predefined base sequence and length is modified by covalent attachment of reporter groups (one or more fluorophores in fluorescence-based probes). The fluorescent labels act as transducers that transform biorecognition events (hybridization, ligand binding) into a fluorescence signal. Fluorescent labels offer several advantages, for example high sensitivity and multiple transduction approaches (fluorescence quenching or enhancement, fluorescence anisotropy, fluorescence lifetime, fluorescence resonance energy transfer (FRET), and excimer-monomer light switching). These signaling options, combined with the design flexibility of the recognition element (DNA, RNA, PNA, LNA) and various labeling strategies, contribute to the development of numerous selective and sensitive bioassays. This review covers the fundamentals of the design and engineering of oligonucleotide probes, describes typical construction approaches, and discusses examples of probes used both in hybridization studies and in aptamer-based assays.
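    The FRET transduction mentioned above relies on the steep distance dependence of energy transfer between a donor and an acceptor fluorophore. As a hedged illustration using the standard Förster relation (not a formula taken from the review), efficiency scales with the sixth power of the donor-acceptor separation r relative to the Förster radius R0, so a hybridization or ligand-binding event that changes r by a few nanometres can switch the probe between low- and high-FRET states.

```python
# Sketch of the standard Foerster (FRET) efficiency relation,
# E = R0**6 / (R0**6 + r**6). The distances below are hypothetical.

def fret_efficiency(r_nm: float, r0_nm: float) -> float:
    """Energy-transfer efficiency for donor-acceptor distance r and Foerster radius R0 (same units)."""
    return r0_nm**6 / (r0_nm**6 + r_nm**6)

# With a hypothetical Foerster radius of 5 nm, a conformational change that
# brings the labels from 8 nm to 3 nm apart turns a weak signal into a strong one.
for r in (8.0, 5.0, 3.0):
    print(f"r = {r:.0f} nm -> E = {fret_efficiency(r, 5.0):.2f}")
```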

    Global patient outcomes after elective surgery: prospective cohort study in 27 low-, middle- and high-income countries.

    BACKGROUND: As global initiatives increase patient access to surgical treatments, there remains a need to understand the adverse effects of surgery and define appropriate levels of perioperative care. METHODS: We designed a prospective international 7-day cohort study of outcomes following elective adult inpatient surgery in 27 countries. The primary outcome was in-hospital complications. Secondary outcomes were death following a complication (failure to rescue) and death in hospital. Process measures were admission to critical care immediately after surgery or to treat a complication, and duration of hospital stay. A single definition of critical care was used for all countries. RESULTS: A total of 474 hospitals in 19 high-, 7 middle- and 1 low-income country were included in the primary analysis. Data included 44 814 patients with a median hospital stay of 4 (range 2-7) days. A total of 7508 patients (16.8%) developed one or more postoperative complications and 207 died (0.5%). The overall mortality among patients who developed complications was 2.8%. Mortality following complications ranged from 2.4% for pulmonary embolism to 43.9% for cardiac arrest. A total of 4360 (9.7%) patients were routinely admitted to a critical care unit immediately after surgery, of whom 2198 (50.4%) developed a complication, with 105 (2.4%) deaths. A total of 1233 patients (16.4%) were admitted to a critical care unit to treat complications, with 119 (9.7%) deaths. Despite lower baseline risk, outcomes were similar in low- and middle-income compared with high-income countries. CONCLUSIONS: Poor patient outcomes are common after inpatient surgery. Global initiatives to increase access to surgical treatments should also address the need for safe perioperative care. STUDY REGISTRATION: ISRCTN5181700
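    As a quick arithmetic check, the reported rates follow directly from the raw counts, assuming (as the abstract's definitions imply) that failure to rescue means death among patients who developed at least one complication.

```python
# Arithmetic check of the rates reported in the abstract. Assumption: all
# 207 in-hospital deaths occurred in patients who developed a complication,
# matching the abstract's failure-to-rescue definition.
total_patients = 44_814
patients_with_complication = 7_508
in_hospital_deaths = 207

print(f"complication rate : {patients_with_complication / total_patients:.1%}")     # ~16.8%
print(f"in-hospital deaths: {in_hospital_deaths / total_patients:.1%}")             # ~0.5%
print(f"failure to rescue : {in_hospital_deaths / patients_with_complication:.1%}") # ~2.8%
```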

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
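    The reported checklist-use percentages, and rough counterparts of the odds ratios, can be reproduced from the raw counts. Note that the abstract's ORs are adjusted for patient and disease factors, so the crude values computed below are only expected to land in the same neighbourhood; this is a sanity-check sketch, not the study's model.

```python
# Crude (unadjusted) checklist-use proportions and odds ratios versus the
# high-HDI group, computed from the counts in the abstract. The published
# ORs are covariate-adjusted, so these crude figures are approximations.
groups = {
    "high-HDI":   (2455, 2741),   # (used checklist, total emergency laparotomies)
    "middle-HDI": (753, 1242),
    "low-HDI":    (363, 860),
}

used_high, total_high = groups["high-HDI"]
odds_high = used_high / (total_high - used_high)

for name, (used, total) in groups.items():
    odds = used / (total - used)
    print(f"{name:10s} use {used / total:.1%}, crude OR vs high-HDI {odds / odds_high:.2f}")
```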

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6.9 per cent) from low-HDI, 254 (15.5 per cent) from middle-HDI and 1268 (77.6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57.5, 40.9 and 35.4 per cent; P < 0.001) and subsequent use of end colostomy (52.2, 24.8 and 18.9 per cent; P < 0.001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3.20, 95 per cent c.i. 1.35 to 7.57; P = 0.008) after risk adjustment for malignant disease (OR 2.34, 1.65 to 3.32; P < 0.001), emergency surgery (OR 4.08, 2.73 to 6.10; P < 0.001), time to operation at least 48 h (OR 1.99, 1.28 to 3.09; P = 0.002) and disease perforation (OR 4.00, 2.81 to 5.69; P < 0.001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection according to country income, and these differences went beyond case mix alone.
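    To make the adjusted odds ratios concrete, the sketch below shows how an OR is applied on the odds scale to shift a probability. The 18.9 per cent high-HDI colostomy rate from the abstract is used only as a hypothetical baseline; the study's estimates come from a multilevel, covariate-adjusted model, so this illustrates the arithmetic rather than reproducing its results.

```python
# Hedged illustration: applying an odds ratio to a baseline probability.
# The 18.9% baseline is borrowed from the abstract purely as an example;
# an OR acts on the odds scale, p -> OR*odds / (1 + OR*odds).

def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    """Return the probability obtained by multiplying the baseline odds by odds_ratio."""
    odds = baseline_prob / (1.0 - baseline_prob)
    shifted = odds_ratio * odds
    return shifted / (1.0 + shifted)

baseline = 0.189  # high-HDI end colostomy rate reported in the abstract
for label, or_value in [("low-HDI setting", 3.20),
                        ("emergency surgery", 4.08),
                        ("disease perforation", 4.00)]:
    print(f"{label:20s} OR {or_value:.2f}: {baseline:.1%} -> {apply_odds_ratio(baseline, or_value):.1%}")
```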