4,926 research outputs found

    Landscape variation in plant leaf flammability is driven by leaf traits responding to environmental gradients

    © 2018 Krix and Murray. Landscape differences in environmental conditions select for divergences among plant species in strategically important leaf traits such as leaf mass per area (LMA) and leaf area (LA). Interspecific variation in some of these same leaf traits has been associated to varying degrees with differences among species in leaf flammability, including the attributes ignitibility, sustainability, and combustibility. Yet, how environmentally selected variation in leaf traits drives variation in leaf flammability at landscape scales remains largely unknown. Here, we compared leaf traits and flammability attributes between species of sheltered forest vegetation (low light, moist habitat) and plant species of exposed woodland vegetation (high light, dry habitat) in a fire-prone landscape of south-eastern Australia. We found that leaves of sheltered forest species were significantly more flammable via both higher ignitibility and combustibility compared with exposed woodland species. These significant differences in leaf ignitibility and combustibility were underpinned by sheltered forest species having leaves with significantly larger LA and lower LMA compared with exposed woodland species. Further, multiple regression analyses revealed that both LA and LMA were significantly and uniquely related to faster time to ignition (TTI; ignitibility) and higher mean mass loss rate (combustibility). Most notably, although significantly higher fuel moisture content (FMC) of leaves of sheltered forest species significantly lengthened TTI, the lower LMA of these species played a more critical role in reducing TTI, with low LMA explaining more unique variation (partial r2 = 0.78) in high leaf ignitibility than low FMC (partial r2 = 0.49). Our findings provide the first evidence that landscape-scale variation in leaf flammability is tightly coordinated with the primary strategic response of the leaf traits LMA and LA to an environmental gradient. 
Furthermore, wildfires in the region are projected to increase in frequency and intensity, and will likely overcome the protection once afforded by topography to sheltered forest vegetation. The higher leaf flammability of sheltered forest species therefore has the potential to compound the effects of changing weather conditions, placing sheltered forest habitat, and its plants and animals, at even higher risk of catastrophic wildfire.
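The partial r² comparison described above rests on nested regression models: the unique variance a predictor explains is the drop in residual error when it is added to a model already containing the other predictors. A minimal sketch of that calculation in plain Python, using invented leaf data and coefficients rather than the study's measurements:

```python
# Partial r^2 of one predictor from nested least-squares fits.
# All data and coefficients below are invented for illustration.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols_sse(X, y):
    """Sum of squared errors of the least-squares fit of y on the columns of X."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    beta = solve(XtX, Xty)
    return sum((yi - sum(b * xi for b, xi in zip(beta, r))) ** 2
               for r, yi in zip(X, y))

def partial_r2(X_full, X_reduced, y):
    """Unique variance explained by the predictors in X_full absent from X_reduced."""
    sse_full, sse_red = ols_sse(X_full, y), ols_sse(X_reduced, y)
    return (sse_red - sse_full) / sse_red

# Hypothetical leaf data: time to ignition driven mostly by LMA, weakly by FMC.
lma = [60, 80, 100, 120, 140, 160, 180, 200]
fmc = [1.2, 0.9, 1.1, 0.8, 1.0, 0.7, 0.9, 0.6]
noise = [0.3, -0.2, 0.1, 0.2, -0.1, 0.0, 0.1, -0.3]
tti = [0.9 * m + 5.0 * f + e for m, f, e in zip(lma, fmc, noise)]

full = [[1.0, m, f] for m, f in zip(lma, fmc)]   # intercept, LMA, FMC
no_lma = [[1.0, f] for f in fmc]                 # intercept, FMC only
print(round(partial_r2(full, no_lma, tti), 3))   # near 1: LMA carries the unique variance
```

By construction the statistic lies between 0 and 1 for nested models, which is what makes the study's comparison of partial r² = 0.78 (LMA) against 0.49 (FMC) interpretable as relative unique contributions.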

    Selecting low-flammability plants as green firebreaks within sustainable urban garden design

    In response to an increasing risk of property loss from wildfires at the urban–wildland interface, there has been growing interest around the world in the plant characteristics of urban gardens that can be manipulated to minimize the chances of property damage or destruction. To date, considerable discussion of this issue can be found in the ‘grey’ literature, covering garden characteristics such as the spatial arrangement of plants in relation to each other, proximity of plants to houses, plant litter and fuel reduction, and the use of low-flammability plants as green firebreaks [1,2,3,4]. Recently, scientific studies from a geographically wide range of fire-prone regions including Europe [5], the USA [6], Australia [7], South Africa [8], and New Zealand [9] have explicitly sought to quantify variation among plant species with respect to different aspects of their flammability and to identify low-flammability horticultural species appropriate for implementation as green firebreaks in urban landscapes. The future prospects of this scientific work will ultimately depend on how successfully the results are integrated into the broader context of garden design in fire-prone regions at the urban–wildland interface. Although modern design of urban gardens must consider more than just the issue of green firebreaks, we and others [10,11] believe that selection of low-flammability plants should be high on the priority list of plant selection criteria in fire-prone regions.

    A Predictive Model of Leaf Flammability Using Leaf Traits and Radiant Heat Flux for Plants of Fire-Prone Dry Sclerophyll Forest

    The differential flammability of individual plant species is an important consideration in landscape-scale fire behaviour, but one that is often overlooked, in part because of the relative scarcity of plant flammability data. Here, we present a highly accurate predictive model of the likelihood of plant leaves entering flaming combustion as a function of leaf mass per area (LMA), leaf area (LA) and radiant heat flux, using species of fire-prone dry sclerophyll forests of south-eastern Australia. We validated the performance of the model on two separate datasets, and on plant species not included in the model building process. Our model gives accurate predictions (75–84%) of leaf flaming, with potential application in the next generation of fire behaviour models. Given the global wealth of species data for LMA and LA, in stark contrast to leaf flammability data, our model has the potential to improve understanding of forest flammability in the absence of leaf flammability information.
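As a sketch of the general approach (not the published model or its coefficients), a logistic classifier of leaf flaming from LMA, LA and radiant heat flux, fit to synthetic data, could look like this:

```python
import math
import random

# Logistic model of whether a leaf enters flaming combustion, fit to
# SYNTHETIC data; the coefficients are not the published model's values.

def predict(w, x):
    """Probability of flaming under weights w = [intercept, w_lma, w_la, w_flux]."""
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.2, epochs=1000):
    """Plain batch gradient descent on the logistic log-loss."""
    w = [0.0] * (len(X[0]) + 1)
    n = len(X)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for x, yi in zip(X, y):
            err = predict(w, x) - yi
            grad[0] += err
            for j, xj in enumerate(x):
                grad[j + 1] += err * xj
        w = [wi - lr * g / n for wi, g in zip(w, grad)]
    return w

random.seed(1)
X, y = [], []
for _ in range(200):
    lma, la, flux = (random.uniform(0.5, 2.5) for _ in range(3))  # standardised units
    # Lower LMA and higher LA / heat flux raise the chance of flaming.
    p_true = 1.0 / (1.0 + math.exp(-(-2.0 * lma + 1.5 * la + 1.5 * flux)))
    X.append([lma, la, flux])
    y.append(1 if random.random() < p_true else 0)

w = fit_logistic(X, y)
acc = sum((predict(w, x) >= 0.5) == (yi == 1) for x, yi in zip(X, y)) / len(y)
print(round(acc, 2))
```

The sign convention (negative weight on LMA, positive on LA and flux) mirrors the abstract's finding that low-LMA, large-LA leaves ignite more readily.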

    The role of the patellar tendon angle and patellar flexion angle in the interpretation of sagittal plane kinematics of the knee after knee arthroplasty: A modelling analysis

    BACKGROUND: Many different measures have been used to describe knee kinematics. This study investigated the changes of two measures, the patellar tendon angle and the patellar flexion angle, in response to variations in the geometry of the knee due to surgical technique or implant design. METHODS: A mathematical model was developed to calculate the equilibrium position of the extensor mechanism for a particular tibiofemoral position. Calculating the position of the extensor mechanism allowed for the determination of the patellar tendon angle and patellar flexion angle relationships to the knee flexion angle. The model was used to investigate the effect of anterior-posterior position of the femur, change in joint line, patellar thickness (overstuffing, understuffing), and patellar tendon length; these parameters were varied to determine the effect on the patellar tendon angle/knee flexion angle and patellar flexion angle/knee flexion angle relationships. RESULTS: The patellar tendon angle was a good indicator of anterior-posterior femoral position and change in patellar thickness, and the patellar flexion angle a good indicator of change in joint line and patellar tendon length. CONCLUSIONS: The patellar tendon angle/knee flexion angle relationship was found to be an effective means of identifying abnormal kinematics post-knee arthroplasty. However, the use of both the patellar tendon angle and patellar flexion angle together provided a more informative overview of the sagittal plane kinematics of the knee.

    Posterior Bearing Overhang Following Medial and Lateral Mobile Bearing Unicompartmental Knee Replacements

    This study explores the extent of bearing overhang following mobile bearing Oxford unicompartmental knee replacement (OUKR) (Oxford Phase 3, Zimmer Biomet). The Oxford components are designed to be fully congruent; however, knee movements involve femoral rollback, which may result in bearing overhang at the posterior margin of the tibial implant, with potential implications for pain, wear, and dislocation. Movement is known to be greater, and therefore posterior overhang more likely to occur, with lateral compared to medial implants, with anterior cruciate ligament deficiency, and at extremes of movement. Twenty-four medial and 20 domed lateral OUKRs underwent sagittal plane knee fluoroscopy during step-up and forward lunge exercises. The bearing position was inferred from the relative position of the femoral and tibial components. Based on the individual component sizes and geometry, the extent to which the posterior part of the bearing overhung the posterior edge of the tibial component was calculated. There was no significant posterior overhang in knees with medial implants. Knees with lateral domed implants exhibited overhang at flexion angles beyond 60°, the magnitude of which increased with increasing flexion angle, reaching a maximum of 50% of the bearing length at 140° (range 0-140°). This demonstrates a clear difference in the kinematics, and in the prevalence and extent of posterior bearing overhang, between medial and lateral OUKRs.
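The overhang calculation described above can be illustrated with a toy one-dimensional geometry; the distances (in mm) are hypothetical, not actual Oxford component dimensions:

```python
# Hedged sketch of the overhang calculation: given the inferred
# anterior-posterior (AP) position of the bearing on the tibial tray,
# report posterior overhang as a percentage of bearing length.
# All geometry here is invented for illustration.

def posterior_overhang_pct(bearing_centre_ap, bearing_length, tibial_post_edge_ap):
    """AP axis: positive is posterior. Overhang is the part of the bearing
    lying behind the posterior edge of the tibial component."""
    bearing_post_edge = bearing_centre_ap + bearing_length / 2.0
    overhang = max(0.0, bearing_post_edge - tibial_post_edge_ap)
    return 100.0 * min(overhang, bearing_length) / bearing_length

print(posterior_overhang_pct(10.0, 20.0, 25.0))   # 0.0  (bearing fully supported)
print(posterior_overhang_pct(20.0, 20.0, 20.0))   # 50.0 (half the bearing overhangs)
```

The second case corresponds to the maximum reported in the lateral domed group: the bearing centre sitting at the tibial component's posterior edge leaves 50% of the bearing length unsupported.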

    Notes on the Contributors

    Periprosthetic fracture (PF) after primary total hip replacement (THR) is an uncommon but potentially devastating complication. We analysed data on 257,202 primary THRs with cemented stems and 390 linked first revisions for PF recorded in the National Joint Registry (NJR) of England and Wales to determine whether cemented femoral stem brand was associated with the risk of revision for PF after primary THR. All cemented femoral stem brands with more than 10,000 primary operations recorded in the NJR were identified. The four most commonly used cemented femoral stems were the Exeter V40 (n = 146,409), CPT (n = 24,300), C-Stem (n = 15,113) and Charnley (n = 20,182). We compared revision rate ratios for PF among the stems using a Poisson regression model adjusting for patient factors. Compared with the Exeter V40, the age-, gender- and ASA grade-adjusted revision rate ratio was 3.89 (95% CI 3.07–4.93) for the cemented CPT stem, 0.89 (95% CI 0.57–1.41) for the C-Stem and 0.41 (95% CI 0.24–0.70) for the Charnley stem. Limitations of the study include incomplete data capture, analysis only of PF requiring revision, and the fact that the observed association does not imply causality. Nevertheless, this study demonstrates that the choice of cemented stem is associated with the risk of revision for PF.
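As a simplified stand-in for the adjusted Poisson model used in the study, an unadjusted revision-rate ratio with a Wald 95% confidence interval can be computed directly; the event counts and person-time below are invented for illustration:

```python
import math

def rate_ratio(events_a, time_a, events_b, time_b):
    """Unadjusted rate ratio of group A vs group B with a Wald 95% CI."""
    rr = (events_a / time_a) / (events_b / time_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical: 60 PF revisions over 100,000 stem-years for one brand
# versus 30 over 200,000 stem-years for another.
rr, lo, hi = rate_ratio(60, 100_000, 30, 200_000)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 4.0 2.58 6.2
```

A CI excluding 1 (as here, and as for the CPT and Charnley stems in the abstract) indicates a statistically significant difference in revision rate relative to the reference stem; the registry analysis additionally adjusts these ratios for age, gender and ASA grade.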

    Development of a patient-reported outcome measure (PROM) and change measure for use in early recovery following hip or knee replacement

    Background Hip and knee replacement are effective procedures for end-stage arthritis that has not responded to medical management. Until now, however, there have been no validated patient-reported tools to measure early recovery in this growing patient population. We report the development and psychometric evaluation of the Oxford Arthroplasty Early Recovery Score (OARS), a 14-item patient-reported outcome measure (PROM) measuring health status, and the Oxford Arthroplasty Early Change Score (OACS), a 14-item measure of change, for use during the first 6 weeks following surgery. Patients and methods A five-phase, best-practice, iterative approach was used. From a literature-based starting point, qualitative interviews with orthopaedic healthcare professionals were performed to ascertain whether and how clinicians would use such a PROM and change measure. Analysis of in-depth patient interviews in Phase One identified patient-reported factors important in early recovery, which provided the questionnaire themes. In Phase Two, candidate items were generated from the Phase One interviews, and pilot questionnaires were developed and tested. Exploratory factor analysis with item reduction and final testing of the questionnaires was performed in Phase Three, and Phase Four involved validation testing. Results Qualitative interviews (n = 22) with orthopaedic healthcare professionals helped determine the views of potential users and guide questionnaire structure. In Phase One, factors from patient interviews (n = 30) were used to identify questionnaire themes and generate items. Pilot questionnaires were developed and tested in Phase Two, with items refined through cognitive debrief interviews (n = 34) for potential inclusion in the final tools. Final testing of questionnaire properties with item reduction (n = 168) was carried out in Phase Three.
Validation of the OARS and OACS was performed in Phase Four: both measures were administered to consecutive patients (n = 155) in an independent cohort, and validity and reliability were assessed. Psychometric testing showed positive results in terms of internal consistency, sensitivity to change, content validity and relevance to patients and clinicians. In addition, the measures were found to be acceptable to patients throughout early recovery, with validation across the 6-week period. Conclusions These brief, easy-to-use tools could be of great use in assessing recovery pathways and interventions in arthroplasty surgery.
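One common index behind the internal-consistency testing mentioned above is Cronbach's alpha, sketched here with invented item responses (not OARS/OACS data; the abstract does not name the specific statistic used):

```python
# Cronbach's alpha for a multi-item score: ratio of summed item variances
# to the variance of the total score, scaled by k/(k-1).
# Item responses below are invented for illustration.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    item_vars = sum(variance(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]
    return (k / (k - 1)) * (1.0 - item_vars / variance(totals))

# Three highly correlated items across five respondents.
items = [
    [1, 2, 3, 4, 5],
    [2, 2, 3, 5, 5],
    [1, 3, 3, 4, 4],
]
print(round(cronbach_alpha(items), 2))  # 0.95
```

Values of alpha around 0.9 or above, as in this toy example, are typically read as strong internal consistency among items measuring the same construct.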

    Unicompartmental knee arthroplasty: is the glass half full or half empty?

    There is a large amount of evidence available about the relative merits of unicompartmental and total knee arthroplasty (UKA and TKA). Based on the same evidence, different people draw different conclusions and, as a result, there is great variability in the usage of UKA. The revision rate of UKA is much higher than that of TKA, and so some surgeons conclude that UKA should not be performed. Other surgeons believe that the main reason for the high revision rate is that UKA is easy to revise and, therefore, the threshold for revision is low. They also believe that UKA has many advantages over TKA, such as faster recovery, lower morbidity and mortality, and better function. They therefore conclude that UKA should be undertaken whenever appropriate. The solution to this argument is to minimise the revision rate of UKA, thereby addressing its main disadvantage. The evidence suggests that this will be achieved if surgeons use UKA for at least 20% of their knee arthroplasties and use implants that are appropriate for these broad indications.

    Outcomes After Metal-on-metal Hip Revision Surgery Depend on the Reason for Failure: A Propensity Score-matched Study

    Background Metal-on-metal hip replacement (MoMHR) revision surgery for adverse reactions to metal debris (ARMD) has been associated with an increased risk of early complications and reoperation and inferior patient-reported outcome scores compared with non-ARMD revisions. As a result, early revision specifically for ARMD with adoption of a lower surgical threshold has been widely recommended with the goal of improving the subsequent prognosis after ARMD revisions. However, no large cohorts have compared the risk of complications and reoperation after MoMHR revision surgery for ARMD (an unanticipated revision indication) with those after non-ARMD revisions (which represent conventional modes of arthroplasty revision). Questions/purposes (1) Does the risk of intraoperative complications differ between MoMHRs revised for ARMD compared with non-ARMD indications? (2) Do mortality rates differ after MoMHRs revised for ARMD compared with non-ARMD indications? (3) Do rerevision rates differ after MoMHRs revised for ARMD compared with non-ARMD indications? (4) How do implant survival rates differ after MoMHR revision when performed for specific non-ARMD indications compared with ARMD? Methods This retrospective observational study involved all patients undergoing MoMHR from the National Joint Registry (NJR) for England and Wales subsequently revised for any indication between 2008 and 2014. The NJR achieves high levels of patient consent (93%) and linked procedures (ability to link serial procedures performed on the same patient and hip; 95%). Furthermore, recent validation studies have demonstrated that when revision procedures have been captured within the NJR, the data completion and accuracy were excellent. Revisions for ARMD and non-ARMD indications were matched one to one for multiple potential confounding factors using propensity scores. 
The propensity score summarizes the many patient and surgical factors that were used in the matching process (including sex, age, type of primary arthroplasty, time to revision surgery, and details about the revision procedure performed such as the approach, specific components revised, femoral head size, bearing surface, and use of bone graft) using one single score for each revised hip. The patient and surgical factors within the ARMD and non-ARMD groups subsequently became much more balanced once the groups had been matched based on the propensity scores. The matched cohort included 2576 MoMHR revisions with each study group including 1288 revisions (mean followup of 3 years for both groups; range, 1-7 years). Intraoperative complications, mortality, and rerevision surgery were compared between matched groups using univariable regression analyses. Implant survival rates in the non-ARMD group were calculated for each specific revision indication with each individual non-ARMD indication subsequently compared with the implant survival rate in the ARMD group using Cox regression analyses. Results There was no difference between the ARMD and non-ARMD MoMHR revisions in terms of intraoperative complications (odds ratio, 0.97; 95% confidence interval [CI], 0.59-1.59; p = 0.900). Mortality rates were lower after ARMD revision compared with non-ARMD revision (hazard ratio [HR], 0.43; CI, 0.21-0.87; p = 0.019); however, there was no difference when revisions performed for infection were excluded from the non-ARMD indication group (HR, 0.69; CI, 0.35-1.37; p = 0.287). Rerevision rates were lower after ARMD revision compared with non-ARMD revision (HR, 0.52; CI, 0.36-0.75; p < 0.001); this difference persisted even after removing revisions performed for infection (HR, 0.59; CI, 0.40-0.89; p = 0.011). 
Revisions for infection (5-year survivorship = 81%; CI, 55%-93%; p = 0.003) and dislocation/subluxation (5-year survivorship = 82%; CI, 69%-90%; p < 0.001) had the lowest implant survival rates when compared with revisions for ARMD (5-year survivorship = 94%; CI, 92%-96%). Conclusions Contrary to previous observations, MoMHRs revised for ARMD have approximately half the risk of rerevision compared with non-ARMD revisions. We suspect worldwide regulatory authorities have positively influenced rerevision rates after ARMD revision by recommending that surgeons exercise a lower revision threshold and that such revisions are now being performed at an earlier stage. The high risk of rerevision after MoMHR revision for infection and dislocation is concerning. Infected MoMHR revisions were responsible for the increased mortality risk observed after non-ARMD revision, which parallels findings in non-MoMHR revisions for infection. Level of Evidence Level III, therapeutic study
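The matching step described in the Methods can be sketched as greedy 1:1 nearest-neighbour matching on the propensity score with a caliper; the scores and caliper value below are invented for illustration, not the study's actual matching specification:

```python
# Greedy 1:1 nearest-neighbour propensity-score matching with a caliper.
# Scores and the caliper are hypothetical.

def match_one_to_one(treated, controls, caliper=0.05):
    """treated/controls: lists of (id, propensity score). Each control is
    used at most once; pairs further apart than the caliper are discarded."""
    available = list(controls)
    pairs = []
    for tid, ps in sorted(treated, key=lambda t: t[1]):
        if not available:
            break
        best = min(available, key=lambda c: abs(c[1] - ps))
        if abs(best[1] - ps) <= caliper:
            pairs.append((tid, best[0]))
            available.remove(best)
    return pairs

armd = [("A1", 0.31), ("A2", 0.48), ("A3", 0.90)]
non_armd = [("N1", 0.30), ("N2", 0.50), ("N3", 0.52), ("N4", 0.10)]
print(match_one_to_one(armd, non_armd))  # [('A1', 'N1'), ('A2', 'N2')]
```

Note that A3 goes unmatched because no control falls within the caliper; discarding unmatchable cases is the price of the balance that matching buys, which is why matched analyses like this one report the size of the final matched cohort (here, 1288 revisions per group).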