
    Tissue‐engineered tendon constructs for rotator cuff repair in sheep

    Current rotator cuff repair commonly involves single- or double-row suture techniques, and despite successful outcomes, failure rates still range from 20 to 95%. Failure to regenerate the native biomechanical properties of the enthesis is thought to contribute to these failure rates, so technologies that improve structural healing of the enthesis after rotator cuff repair are urgently needed. To address this issue, our lab has previously demonstrated enthesis regeneration using a tissue-engineered graft approach in a sheep anterior cruciate ligament (ACL) repair model. We hypothesized that our tissue-engineered graft designed for ACL repair would also be effective in rotator cuff repair. The goal of this study was to test the efficacy of our Engineered Tissue Graft for Rotator Cuff (ETG-RC) in a rotator cuff tear model in sheep and to compare this novel graft technology to the commonly used double-row suture repair technique. Following a 6-month recovery, the grafted and contralateral shoulders were removed, imaged using X-ray, and tested biomechanically. Additionally, the infraspinatus muscle, myotendinous junction, enthesis, and humeral head were preserved for histological analysis of muscle, tendon, and enthesis structure. Our results showed that the ETG-RCs reached 31% of the native tendon tangent modulus, a modest, non-significant 11% increase over that of the suture-only repairs. However, histological analysis showed regeneration of a native-like enthesis in the ETG-RC-repaired animals. This advanced structural healing may improve over longer times, diminish recurrence rates of rotator cuff tears, and lead to better clinical outcomes. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 36:289–299, 2018.

    Experimental Saltwater Intrusion Drives Rapid Soil Elevation and Carbon Loss in Freshwater and Brackish Everglades Marshes

    Increasing rates of sea-level rise (SLR) threaten to submerge coastal wetlands unless they increase soil elevation at a similar pace, often by storing soil organic carbon (OC). Coastal wetlands face increasing salinity, marine-derived nutrients, and inundation depths from increasing rates of SLR. To quantify the effects of SLR on soil OC stocks and fluxes and on elevation change, we conducted two 1-year mesocosm experiments using the foundation species sawgrass (Cladium jamaicense) and organic soils from freshwater and brackish Florida Everglades marshes. In freshwater mesocosms, we compared ambient and elevated salinity (fresh, 9 ppt) and phosphorus (ambient, +1 g P m−2 year−1) treatments in a 2 × 2 factorial design. Salinity addition reduced root biomass (48%), driving 2.8 ± 0.3 cm year−1 of elevation loss, while soil elevation was maintained under freshwater conditions. Added P increased root productivity (134%) but also increased breakdown rates (k) of roots (31%) and leaves (42%), with no effect on root biomass or soil elevation. In brackish mesocosms, we compared ambient and elevated salinity (10, 19 ppt) and inundated and exposed conditions (water level 5 cm below and 4 cm above the soil surface). Elevated salinity decreased root productivity (70%) and root biomass (37%) and increased k in litter (33%) and surface roots (11%), whereas inundation decreased subsurface root k (10%). All brackish marshes lost elevation at similar rates (0.6 ± 0.2 cm year−1). In conclusion, saltwater intrusion in freshwater and brackish wetlands may reduce net OC storage and increase vulnerability to SLR despite inundation or marine P supplies.

    Risk of COVID-19 after natural infection or vaccination

    Background: While vaccines have established utility against COVID-19, phase 3 efficacy studies have generally not comprehensively evaluated the protection provided by previous infection or hybrid immunity (previous infection plus vaccination). Individual patient data from US government-supported harmonized vaccine trials provide an unprecedented sample population to address this issue. We characterized the protective efficacy of previous SARS-CoV-2 infection and hybrid immunity against COVID-19 early in the pandemic over three- to six-month follow-up and compared it with vaccine-associated protection. Methods: In this post-hoc cross-protocol analysis of the Moderna, AstraZeneca, Janssen, and Novavax COVID-19 vaccine clinical trials, we allocated participants into four groups based on previous-infection status at enrolment and treatment: no previous infection/placebo; previous infection/placebo; no previous infection/vaccine; and previous infection/vaccine. The main outcome was RT-PCR-confirmed COVID-19 >7–15 days (per original protocols) after the final study injection. We calculated crude and adjusted efficacy measures. Findings: Previous infection/placebo participants had a 92% decreased risk of future COVID-19 compared to no previous infection/placebo participants (overall hazard ratio [HR]: 0.08; 95% CI: 0.05–0.13). Among single-dose Janssen participants, hybrid immunity conferred greater protection than vaccine alone (HR: 0.03; 95% CI: 0.01–0.10). Too few infections were observed to draw statistical inferences comparing hybrid immunity to vaccine alone for the other trials. Vaccination, previous infection, and hybrid immunity all provided near-complete protection against severe disease. Interpretation: Previous infection, any hybrid immunity, and two-dose vaccination all provided substantial protection against symptomatic and severe COVID-19 through the early Delta period. Thus, rather than relying on natural infection, vaccination remains the safest approach to protection. Funding: National Institutes of Health
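    As a quick check on the figures quoted above, the percent risk reduction implied by a hazard ratio is simply 1 − HR. A minimal sketch (the HR values are those reported in the abstract; the function name is illustrative, not from the study):

    ```python
    def risk_reduction(hr: float) -> float:
        """Percent decrease in risk implied by a hazard ratio (HR)."""
        return round((1 - hr) * 100, 1)

    # Previous infection vs. no previous infection (placebo arms):
    print(risk_reduction(0.08))  # 92.0 -> the reported "92% decreased risk"
    # Hybrid immunity vs. vaccine alone among single-dose Janssen participants:
    print(risk_reduction(0.03))  # 97.0
    ```

    The 95% confidence interval translates the same way: an HR CI of 0.05–0.13 corresponds to roughly 87–95% protection.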