
    Genome Sequence of Striga asiatica Provides Insight into the Evolution of Plant Parasitism

    Parasitic plants in the genus Striga, commonly known as witchweeds, cause major crop losses in sub-Saharan Africa and pose a threat to agriculture worldwide. An understanding of Striga parasite biology, which could lead to agricultural solutions, has been hampered by the lack of genome information. Here, we report the draft genome sequence of Striga asiatica with 34,577 predicted protein-coding genes, which reflects gene family contractions and expansions that are consistent with a three-phase model of parasitic plant genome evolution. Striga seeds germinate in response to host-derived strigolactones (SLs) and then develop a specialized penetration structure, the haustorium, to invade the host root. A family of SL receptors has undergone a striking expansion, suggesting a molecular basis for the evolution of broad host range among Striga spp. We found that genes involved in lateral root development in non-parasitic model species are coordinately induced during haustorium development in Striga, suggesting a pathway that was partly co-opted during the evolution of the haustorium. In addition, we found evidence for horizontal transfer of host genes as well as retrotransposons, indicating gene flow to S. asiatica from hosts. Our results provide valuable insights into the evolution of parasitism and a key resource for the future development of Striga control strategies. Peer reviewed

    Key Proliferative Activity in the Junction between the Leaf Blade and Leaf Petiole of Arabidopsis

    Leaves are the most fundamental units of organogenesis in plants. Although the basic form of a leaf is clearly divided into the leaf blade and leaf petiole, no study has yet revealed how these are differentiated from a leaf primordium. We analyzed the spatiotemporal pattern of mitotic activity in leaf primordia of Arabidopsis (Arabidopsis thaliana) in detail using molecular markers in combination with clonal analysis. We found that the proliferative zone is established after a short interval following the occurrence of a rod-shaped early leaf primordium; it is separated spatially from the shoot apical meristem, is seen at the junction region between the leaf blade and leaf petiole, and produces both leaf-blade and leaf-petiole cells. This proliferative region in leaf primordia is marked by activity of the ANGUSTIFOLIA3 (AN3) promoter as a whole and seems to be differentiated into several spatial compartments: activities of the CYCLIN D4;2 promoter and SPATULA enhancer mark parts of it specifically. Detailed analyses of the an3 and blade-on-petiole mutations further support the idea that organogenesis of the leaf blade and leaf petiole is critically dependent on the correct spatial regulation of the proliferative region of leaf primordia. Thus, the proliferative zone of leaf primordia is spatially differentiated and supplies both the leaf-blade and leaf-petiole cells.

    Application of first-generation high- and low-dose drug-coated balloons to the femoropopliteal artery disease: a sub-analysis of the POPCORN registry

    Background: Drug-coated balloons (DCBs) have significantly changed endovascular therapy (EVT) for femoropopliteal artery (FPA) disease by expanding the indications for EVT in symptomatic lower extremity arterial disease (LEAD). However, whether individual DCBs differ in performance has not been fully examined. This sub-analysis of real-world data from a prospective trial of first-generation DCBs compared clinical outcomes between high- and low-dose DCBs using propensity score matching. The primary endpoint was the restenosis-free and revascularization-free rate at 1 year.
    Results: Among a total of 2,507 cases treated with first-generation DCBs (592 Lutonix low-dose and 1,808 In.PACT Admiral high-dose cases), we compared 592 pairs matched for patient and lesion characteristics by propensity score matching. There were no differences in patient/lesion characteristics, procedural success rates, or complications between the two groups. The low-dose DCB group had significantly lower patency (73.3% [95% confidence interval, 69.6%–77.3%] versus 86.2% [84.1%–88.3%] in the high-dose DCB group; P < 0.001) and revascularization-free (84.9% [81.9%–88.1%] versus 92.5% [90.8%–94.1%]; P < 0.001) rates. Chronic kidney disease on dialysis, cilostazol use, anticoagulant use, and severe calcification showed significant interaction effects on the association (all P < 0.05).
    Conclusions: In this cohort, EVT of the FPA with first-generation low-dose DCBs yielded inferior patency outcomes compared with high-dose DCBs.
    Level of evidence: Sub-analysis of a prospective multicenter study.
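The 1:1 matching step described in this abstract can be sketched in a few lines. The following is a minimal, illustrative greedy nearest-neighbor matcher on precomputed propensity scores, not the registry's actual matching procedure; the function name, caliper value, and toy data are all hypothetical.

```python
# Illustrative sketch of 1:1 propensity score matching (greedy
# nearest-neighbor without replacement). NOT the POPCORN registry's
# actual algorithm; names and numbers below are hypothetical.

def greedy_match(treated, control, caliper=0.05):
    """Pair each treated subject with the nearest unmatched control
    whose propensity score lies within the caliper; treated subjects
    with no eligible control are dropped."""
    pairs = []
    available = dict(control)              # id -> score, still unmatched
    # Greedy matching is order-dependent; sort for determinism.
    for tid, tscore in sorted(treated, key=lambda x: x[1]):
        best_id, best_gap = None, caliper
        for cid, cscore in available.items():
            gap = abs(tscore - cscore)
            if gap <= best_gap:
                best_id, best_gap = cid, gap
        if best_id is not None:
            pairs.append((tid, best_id))
            del available[best_id]         # match without replacement
    return pairs

# Toy cohort: high-dose arm as "treated", low-dose arm as "control".
treated = [("T1", 0.31), ("T2", 0.62), ("T3", 0.90)]
control = [("C1", 0.30), ("C2", 0.59), ("C3", 0.10)]
print(greedy_match(treated, control))      # T3 has no control in caliper
```

In practice the propensity scores would come from a logistic model of treatment assignment on the patient and lesion characteristics, and outcome comparisons would then be made within the matched pairs.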

    Oral Iron Absorption of Ferric Citrate Hydrate and Hepcidin-25 in Hemodialysis Patients: A Prospective, Multicenter, Observational Riona-Oral Iron Absorption Trial

    Oral ferric citrate hydrate (FCH) is effective for iron deficiencies in hemodialysis patients; however, how iron balance in the body affects iron absorption in the intestinal tract remains unclear. This prospective observational study (Riona-Oral Iron Absorption Trial, R-OIAT, UMIN 000031406) was conducted at 42 hemodialysis centers in Japan, wherein 268 hemodialysis patients without inflammation were enrolled and treated with a fixed amount of FCH for 6 months. We assessed the predictive value of hepcidin-25 for iron absorption and iron shift between ferritin (FTN) and red blood cells (RBCs) following FCH therapy. Serum iron changes at 2 h (ΔFe2h) after FCH ingestion were evaluated as iron absorption. The primary outcome was the quantitative delineation of iron variables with respect to ΔFe2h, and the secondary outcome was the description of the predictors of the body’s iron balance. Generalized estimating equations (GEEs) were used to identify the determinants of iron absorption during each phase of FCH treatment. In the GEEs, ΔFe2h increased when hepcidin-25 and transferrin saturation (TSAT) decreased (−0.459, −0.643 to −0.276, p = 0.000; −0.648, −1.099 to −0.197, p = 0.005, respectively). FTN increased when RBCs decreased (−1.392, −1.749 to −1.035, p = 0.000) and hepcidin-25 increased (0.297, 0.239 to 0.355, p = 0.000). Limiting erythropoiesis to maintain hemoglobin levels induces RBC reduction in hemodialysis patients, resulting in increased hepcidin-25 and FTN levels. Hepcidin-25 production may prompt an iron shift from RBC iron to FTN iron, inhibiting iron absorption even with continued FCH intake.
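As a loose illustration of the association the GEEs quantify, a plain least-squares slope on made-up data shows the kind of negative hepcidin-25 to ΔFe2h relationship the abstract reports. This is a sketch only: a real GEE additionally models the correlation of repeated measurements within each patient, and every value below is hypothetical.

```python
# Illustrative only: an ordinary least-squares slope on hypothetical
# data, standing in for the study's GEE fit (which also accounts for
# repeated measures per patient). All numbers are invented.

def ols_slope(x, y):
    """Slope b of the least-squares line y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

hepcidin = [5.0, 10.0, 20.0, 40.0, 80.0]   # ng/mL, hypothetical
dfe2h = [90.0, 70.0, 55.0, 30.0, 10.0]     # serum iron rise at 2 h, hypothetical
print(round(ols_slope(hepcidin, dfe2h), 3))
```

A negative slope here mirrors the sign of the reported coefficient for hepcidin-25; the study's actual estimates come from GEE models over the full cohort, not from a simple regression like this.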