
    Successful treatment of primary bone marrow Hodgkin lymphoma with brentuximab vedotin: a case report and review of the literature

    Abstract Background Hodgkin lymphoma usually presents with sequential enlargement of peripheral lymph nodes; bone marrow invasion is rare (approximately 3–5% of cases). However, several cases of “primary” bone marrow Hodgkin lymphoma have been reported, especially among patients with human immunodeficiency virus infection and the elderly. This type of Hodgkin lymphoma is characterized by the absence of peripheral lymphadenopathy and has been reported to carry a poorer prognosis. Case presentation A 38-year-old Japanese man was admitted to our hospital because of fever of unknown origin and pancytopenia without lymphadenopathy. Bone marrow examination revealed abnormal cells mimicking Hodgkin cells. These were positive for CD30, EBER-1, CD15, PAX-5, and Bob-1, and negative for Oct-2, CD3, CD20, surface immunoglobulin, and CD56. On the basis of systemic evaluation and bone marrow examination, he was diagnosed with primary bone marrow Hodgkin lymphoma. We initiated DeVIC (dexamethasone, etoposide, ifosfamide, and carboplatin) therapy, but remission was not achieved. The patient was then treated with brentuximab vedotin combined with systemic chemotherapy (Adriamycin, vinblastine, and dacarbazine), which was effective. Conclusions There is no established treatment strategy for primary bone marrow Hodgkin lymphoma, and therapeutic outcomes with ABVD (Adriamycin, bleomycin, vinblastine, and dacarbazine)-like or CHOP (cyclophosphamide, Adriamycin, vincristine, and prednisone)-like regimens are reportedly poor. Only a few patients have been reported to achieve long-term remission. Through this case report, we suggest an alternative therapeutic option for primary bone marrow Hodgkin lymphoma.

    Reduction of high levels of internal radio-contamination by dietary intervention in residents of areas affected by the Fukushima Daiichi nuclear plant disaster: a case series.

    Maintaining low levels of chronic internal contamination among residents of radiation-contaminated areas after a nuclear disaster is a major public health concern. However, the efficacy of measures to reduce individual internal contamination remains unknown. To reduce high levels of internal radiation exposure in a group of individuals exposed through environmental sources, we performed a careful dietary intervention, with identification of suspected contaminated foods, as part of a mass voluntary radiation contamination screening and counseling program at Minamisoma Municipal General Hospital and Hirata Central Hospital. Of a total of 30,622 study participants, only 9 residents displayed internal cesium-137 (Cs-137) levels of more than 50 Bq/kg. The median level of internal Cs-137 contamination in these residents at the initial screening was 4,830 Bq/body (range: 2,130-15,918 Bq/body), or 69.6 Bq/kg (range: 50.7-216.3 Bq/kg). All of these residents consumed homegrown produce without radiation inspection, and often collected mushrooms in the wild or cultivated them on bed-logs at their homes. They were advised to consume mainly distributed food and to refrain from consuming potentially contaminated foods that had not undergone radiation inspection, as well as local produce under shipment restrictions, such as mushrooms, mountain vegetables, and wild game meat. A few months after the intervention, re-examination of Cs levels revealed a remarkable reduction in internal contamination in all residents. Although levels of internal radiation exposure appear to be minimal among most residents of Fukushima, a subset of the population, who unknowingly consumed highly contaminated foodstuffs, experienced high levels of internal contamination. There appear to be similarities in dietary preferences among residents with high internal contamination levels, and intervention based on pre- and post-test counseling and dietary advice from medical care providers about risky food intake appears to be a feasible option for changing residents' dietary practices, subsequently reducing internal Cs contamination levels.
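    The two units reported above are related by body mass: a whole-body burden in Bq/body divided by the resident's weight gives the concentration in Bq/kg. A minimal Python sketch of this conversion follows; the 69.4 kg body mass is a hypothetical value chosen so that the output mirrors the abstract's median figures, and is not study data.

        # Convert a whole-body Cs-137 burden (Bq/body) to a per-mass
        # concentration (Bq/kg) by dividing by body mass.
        def concentration_bq_per_kg(body_burden_bq: float, body_mass_kg: float) -> float:
            return body_burden_bq / body_mass_kg

        # Hypothetical 69.4 kg adult carrying the median burden from the abstract:
        print(concentration_bq_per_kg(4830, 69.4))  # ~69.6 Bq/kg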

    Pathophysiology of Lung Injury Induced by Common Bile Duct Ligation in Mice

    Background Liver dysfunction and cirrhosis affect the vasculature of several organ systems and impair organ function, thereby increasing morbidity and mortality. Establishment of a mouse model of hepatopulmonary syndrome (HPS) would provide greater insight into the genetic basis of the disease. Our objectives were to establish a mouse model of lung injury after common bile duct ligation (CBDL) and to investigate pulmonary pathogenesis for application in future therapeutic approaches. Methods Eight-week-old Balb/c mice were subjected to CBDL. Immunohistochemical analyses and real-time quantitative reverse transcription polymerase chain reaction were performed on pulmonary tissues. HPS markers were detected by western blot and microarray analyses. Results We observed extensive proliferation of CD31-positive pulmonary vascular endothelial cells at 2 weeks after CBDL and identified 10 upregulated and 9 downregulated proteins associated with angiogenesis. TNF-α and MMP-9 were highly expressed at 3 weeks after CBDL and were less expressed in the lungs of the control group. Conclusions We constructed a mouse lung injury model using CBDL. Contrary to our expectations, lung pathology in our mouse model differed from that of rat models, and the mechanisms responsible for these differences are unknown. This phenomenon may be explained by contrasting processes related to TNF induction of angiogenic signaling pathways in the inflammatory phase. Thus, we suggest that our mouse model can be applied to pulmonary pathological analyses in the inflammatory phase, i.e., to systemic inflammatory response syndrome, acute lung injury, and multiple organ dysfunction syndrome.

    Comparison between Direct Measurements and Modeled Estimates of External Radiation Exposure among School Children 18 to 30 Months after the Fukushima Nuclear Accident in Japan

    After a major radiological incident, accurate dose reconstruction is important for evaluating health risks and setting appropriate radiation protection policies. After the 2011 Fukushima nuclear incident in Japan, we assessed the level of agreement between modeled and directly measured doses and estimated the associated uncertainties. The study population comprised 520 school children from Minamisoma city, located 20 km north of the nuclear plant. The annual dose 18–30 months after the incident was assessed using two approaches: estimation using the model proposed by the Japanese government and direct measurement with individual radiation dosimeters. The ratio of the modeled to the measured dose was 3.0 on average (standard deviation (SD): 2.0). The reduction coefficient, an index of radiation attenuation, was 0.3 (SD: 0.1) on average, whereas the value used in the government model was 0.6. After adjusting for covariates, the coefficient had a significant negative correlation with the air dose rate at the dwelling location (p < 0.001), indicating that building shielding effects are stronger in areas with higher air contamination levels. The present study demonstrated that some of the overestimation may be related to uncertainties in radiation reduction effects, and that the air contamination level might provide a more important indicator of these effects.
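    The reduction coefficient can be read as a time-weighted ratio of personal dose to ambient air dose. A minimal Python sketch follows, assuming the occupancy pattern commonly attributed to the government model (8 h/day outdoors at the full ambient rate, 16 h/day indoors at an indoor reduction factor of 0.4 for a wooden house, and a natural background of 0.04 uSv/h); these parameter values are assumptions for illustration, not taken from the study.

        # Time-weighted ratio of personal dose to ambient air dose under an
        # assumed occupancy pattern (8 h outdoors, 16 h indoors per day).
        def reduction_coefficient(hours_out: float = 8.0,
                                  hours_in: float = 16.0,
                                  indoor_reduction: float = 0.4) -> float:
            return (hours_out * 1.0 + hours_in * indoor_reduction) / 24.0

        # Modeled annual additional external dose (mSv) from an ambient dose
        # rate (uSv/h), after subtracting an assumed natural background.
        def modeled_annual_dose_msv(air_dose_rate_usv_h: float,
                                    background_usv_h: float = 0.04) -> float:
            excess = air_dose_rate_usv_h - background_usv_h
            return excess * reduction_coefficient() * 24 * 365 / 1000.0

        print(reduction_coefficient())        # 0.6, the model value cited above
        print(modeled_annual_dose_msv(0.23))  # ~1.0 mSv/year

    Under these assumptions, halving the coefficient from the model's 0.6 to the measured average of 0.3 halves the modeled dose, which accounts for part, though not all, of the threefold overestimation reported above.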