Concomitant primary breast carcinoma and primary choroidal melanoma: a case report
Introduction: Choroidal melanoma and choroidal metastasis are distinct pathological entities with very different treatments and prognoses. They may be difficult for the untrained observer to distinguish. Case presentation: A case of choroidal melanoma concomitant with primary breast carcinoma in a woman is described. The choroidal lesion was initially thought to be a metastasis and was treated with external beam radiotherapy. The tumour did not regress but remained stable in size for three years. Following referral to an ophthalmologist, the diagnosis was revised after re-evaluation of the clinical, ultrasonographic and angiographic findings. Conclusion: Although metastases are the most common ocular tumour, a concurrent primary ocular malignancy should always be considered in the differential diagnosis, even in patients with known malignant disease. Thorough ophthalmic evaluation is important, as multiple primary malignancies may occur concomitantly. Accurate diagnosis by an ophthalmologist has profound prognostic and therapeutic implications for affected patients and their families.
Beyond experiments
It is often claimed that only experiments can support strong causal inferences and that they should therefore be privileged in the behavioral sciences. We disagree. Overvaluing experiments results in their overuse both by researchers and decision-makers, and in an underappreciation of their shortcomings. Neglecting other methods often follows. Experiments can suggest whether X causes Y in a specific experimental setting; however, they often fail to elucidate either the mechanisms responsible for an effect, or the strength of an effect in everyday natural settings. In this paper, we consider two overarching issues. First, experiments have important limitations. We highlight problems with external, construct, statistical conclusion, and internal validity; with replicability; and with conceptual issues associated with simple X-causes-Y thinking. Second, quasi-experimental and non-experimental methods are absolutely essential. As well as estimating causal effects themselves, these other methods can provide information and understanding that go beyond what experiments provide. A research program progresses best when experiments are not treated as privileged but instead are combined with these other methods.
Incidence of ocular complications in patients with multibacillary leprosy after completion of a 2 year course of multidrug therapy
Aim: To evaluate the incidence of and risk factors for ocular complications in multibacillary (MB) leprosy patients following completion of 2 year, fixed duration, multidrug therapy (MDT). Methods: Biannual eye examinations were conducted prospectively on a cohort of MB patients who had completed MDT and were followed up for 5 years. The incidence of ocular pathology was calculated as the number of events per person year of event free follow up among patients who did not have the specific finding before completion of MDT. Results: 278 patients had one or more follow up visits after completion of MDT. The incidence of lagophthalmos was 0.24%/patient year (95% CI 0.10% to 0.37%); corneal opacity, 5.35%/patient year (95% CI 4.27% to 6.70%); uveal involvement, 3.78%/patient year (95% CI 2.96% to 4.83%); and cataract that reduced vision to 6/18 or less, 2.4%/patient year (95% CI 1.77% to 3.26%). Overall, 5.65%/patient year (95% CI 4.51% to 7.09%) developed leprosy related ocular disease and 3.86%/patient year (95% CI 3.00% to 4.95%) developed leprosy related, potentially blinding ocular pathology during the period following MDT. Age and other disability also predicted incident eye disease. Conclusions: Every year, approximately 5.6% of MB patients who have completed MDT can be expected to develop new ocular complications of leprosy, which are often (3.9%) potentially vision threatening. Because many of these complications cannot be detected without slit lamp examination, periodic monitoring, particularly of older patients and those with other disability, is recommended in order to detect and treat ocular complications satisfactorily. It is estimated that by the end of the year 2005, more than 14 million leprosy patients will have completed a standard course of anti-leprosy multidrug therapy (MDT). 1 Although the incidence of leprosy is declining in some areas, approximately half a million new patients are diagnosed with leprosy each year.
Recent changes in the epidemiology of leprosy include a gradual shift in the proportion of the type of leprosy from the paucibacillary to the multibacillary (MB) form, as well as a shift to an older age at diagnosis of disease. 2 Improving health care and socioeconomic conditions predict increasing survival, with the fortunate result that there will be more antimicrobially "cured" leprosy patients than at any time in history. There is evidence that even after adequate treatment with MDT, a sizeable proportion of cured leprosy patients continue to manifest progressive impairment of nerve function. Although the pathophysiology of this process is not fully understood, it is thought to be related to continuing immunological reactions and slow evolution of pre-existing nerve damage. Ocular complications are frequently observed in newly diagnosed leprosy patients and in patients who are undergoing MDT. 7-9 However, little information exists on the incidence of ocular complications in MB patients after completion of the recommended course of MDT. Knowledge of the risk and nature of ocular morbidity in leprosy patients after treatment with MDT is needed to prevent and/or manage such complications promptly and effectively in programmes worldwide. Such information could potentially identify risk factors that may be amenable to intervention and help prioritise groups for more active follow up. In our previous reports, we described a cohort of newly diagnosed MB leprosy patients who were followed for ocular complications during 2 year, fixed duration MDT. 8 10 These patients were followed up for a further 5 years. In this paper, we report information on ocular complications that were incident during the post-MDT period.
MATERIAL AND METHODS All new clinically diagnosed MB patients starting on 2 year MDT and living within the leprosy control area of the Schieffelin Leprosy Research and Training Centre in southern India were invited to participate. Recruitment began in 1991 and was completed in 1997. Consenting patients received a baseline ocular examination followed by biannual examinations during MDT and for a period of at least 5 years after completion of MDT. Based on sample size calculations taking into account possible losses to follow up resulting from migration and mortality, 301 MB leprosy patients were enrolled over a period of 6 years. Research methods and protocols were approved by the institutional review board of the Schieffelin Leprosy Research and Training Centre. All patients were examined and given treatment free of charge. At enrolment the following leprosy characteristics were recorded: the type of MB leprosy, based on the clinical classification of Ridley and Jopling 11 ; deformity grading of hands and legs, based on the WHO classification 12 ; the bacterial index, calculated from the results of acid fast staining of smears from specific skin sites 13 ; presence or
Abbreviations: LROP, leprosy related ocular pathology; MB, multibacillary; MDT, multidrug therapy; PBLROP, potentially blinding leprosy related ocular pathology
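The incidence figures reported in the results above are rates per 100 patient-years with confidence intervals. A minimal sketch of how such a rate and a normal-approximation (Poisson) interval are computed; the event count and person-years below are hypothetical, since the abstract reports only the resulting rates:

```python
import math

def incidence_rate(events, person_years, z=1.96):
    """Incidence per 100 person-years with a normal-approximation
    (Poisson) confidence interval."""
    rate = events / person_years
    se = math.sqrt(events) / person_years  # SE of a Poisson count over exposure
    return 100 * rate, 100 * (rate - z * se), 100 * (rate + z * se)

# Hypothetical illustration: 15 incident cases over 6,250 person-years
rate, lo, hi = incidence_rate(15, 6250)
print(f"{rate:.2f}%/patient year (95% CI {lo:.2f}% to {hi:.2f}%)")
```

With small event counts an exact Poisson interval would be preferable; the normal approximation is shown only because it is the simplest to state.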
Thrombolytic removal of intraventricular haemorrhage in treatment of severe stroke: results of the randomised, multicentre, multiregion, placebo-controlled CLEAR III trial
Background:
Intraventricular haemorrhage is a subtype of intracerebral haemorrhage, with 50% mortality and serious disability for survivors. We aimed to test whether attempting to remove intraventricular haemorrhage with alteplase versus saline irrigation improved functional outcome.
Methods:
In this randomised, double-blinded, placebo-controlled, multiregional trial (CLEAR III), participants with a routinely placed extraventricular drain, in the intensive care unit with stable, non-traumatic intracerebral haemorrhage volume less than 30 mL, intraventricular haemorrhage obstructing the 3rd or 4th ventricles, and no underlying pathology were adaptively randomly assigned (1:1), via a web-based system, to receive up to 12 doses, 8 h apart, of 1 mg of alteplase or 0·9% saline via the extraventricular drain. The treating physician, clinical research staff, and participants were masked to treatment assignment. CT scans were obtained every 24 h throughout dosing. The primary efficacy outcome was good functional outcome, defined as a modified Rankin Scale (mRS) score of 3 or less at 180 days, per central adjudication by blinded evaluators. This study is registered with ClinicalTrials.gov, NCT00784134.
Findings:
Between Sept 18, 2009, and Jan 13, 2015, 500 patients were randomised: 249 to the alteplase group and 251 to the saline group. 180-day follow-up data were available for analysis from 246 of 249 participants in the alteplase group and 245 of 251 participants in the placebo group. The primary efficacy outcome was similar in each group (good outcome in alteplase group 48% vs saline 45%; risk ratio [RR] 1·06 [95% CI 0·88–1·28; p=0·554]). A difference of 3·5% (RR 1·08 [95% CI 0·90–1·29], p=0·420) was found after adjustment for intraventricular haemorrhage size and thalamic intracerebral haemorrhage. At 180 days, the treatment group had lower case fatality (46 [18%] vs saline 73 [29%], hazard ratio 0·60 [95% CI 0·41–0·86], p=0·006), but a greater proportion with mRS 5 (42 [17%] vs 21 [9%]; RR 1·99 [95% CI 1·22–3·26], p=0·007). Ventriculitis (17 [7%] alteplase vs 31 [12%] saline; RR 0·55 [95% CI 0·31–0·97], p=0·048) and serious adverse events (114 [46%] alteplase vs 151 [60%] saline; RR 0·76 [95% CI 0·64–0·90], p=0·002) were less frequent with alteplase treatment. Symptomatic bleeding (six [2%] in the alteplase group vs five [2%] in the saline group; RR 1·21 [95% CI 0·37–3·91], p=0·771) was similar.
Interpretation:
In patients with intraventricular haemorrhage and a routine extraventricular drain, irrigation with alteplase did not substantially improve functional outcomes at the mRS 3 cutoff compared with irrigation with saline. Protocol-based use of alteplase with extraventricular drain seems safe. Future investigation is needed to determine whether a greater frequency of complete intraventricular haemorrhage removal via alteplase produces gains in functional status.
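Several of the effect estimates in the findings are risk ratios with log-scale normal-approximation confidence intervals. As a check, the ventriculitis comparison can be reproduced from the counts reported above (17/249 alteplase vs 31/251 saline); this is the standard textbook approximation, not necessarily the trial's exact analysis:

```python
import math

def risk_ratio(a, n1, b, n2, z=1.96):
    """Risk ratio with a 95% CI via the log-RR normal approximation."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Ventriculitis counts reported in the Findings section
rr, lo, hi = risk_ratio(17, 249, 31, 251)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # matches the reported 0.55 (0.31-0.97)
```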
Self-medication of migraine and tension-type headache: summary of the evidence-based recommendations of the Deutsche Migräne und Kopfschmerzgesellschaft (DMKG), the Deutsche Gesellschaft für Neurologie (DGN), the Österreichische Kopfschmerzgesellschaft (ÖKSG) and the Schweizerische Kopfwehgesellschaft (SKG)
The current evidence-based guideline on self-medication in migraine and tension-type headache of the German, Austrian and Swiss headache societies and the German Society of Neurology is addressed to physicians engaged in primary care as well as pharmacists and patients. The guideline is especially concerned with describing the methodology used, the literature selection process, and the evidence upon which the recommendations are based. The following recommendations can be made for self-medication of migraine attacks: the efficacy of the fixed-dose combination of acetaminophen, acetylsalicylic acid and caffeine, and of monotherapy with ibuprofen, naratriptan, acetaminophen or phenazone, is scientifically proven, and these are recommended as first-line therapy. None of the substances used in self-medication for migraine prophylaxis can be regarded as effective. For self-medication of tension-type headache, the following can be recommended as first-line therapy: the fixed-dose combination of acetaminophen, acetylsalicylic acid and caffeine, the fixed combination of acetaminophen and caffeine, and monotherapy with ibuprofen, acetylsalicylic acid or diclofenac. The four scientific societies hope that this guideline will help to improve the treatment of headaches, which is largely initiated by patients themselves without any consultation with their physicians.
Hospital characteristics associated with highly automated and usable clinical information systems in Texas, United States
Background: A hospital's clinical information system may require a specific environment in which to flourish. This environment is not yet well defined. We examined whether specific hospital characteristics are associated with highly automated and usable clinical information systems. Methods: This was a cross-sectional survey of 125 urban hospitals in Texas, United States using the Clinical Information Technology Assessment Tool (CITAT), which measures a hospital's level of automation based on physician interactions with the information system. Physician responses were used to calculate a series of CITAT scores: automation and usability scores, four automation sub-domain scores, and an overall clinical information technology (CIT) score. A multivariable regression analysis was used to examine the relation between hospital characteristics and CITAT scores. Results: We received a sufficient number of physician responses at 69 hospitals (55% response rate). Teaching hospitals, hospitals with higher IT operating expenses (>75,000 annually) and hospitals with larger IT staff (≥10 full-time staff) had higher automation scores than hospitals that did not meet these criteria (p < 0.05 in all cases). These findings held after adjustment for bed size, total margin, and ownership (p < 0.05 in all cases). There were few significant associations between the hospital characteristics tested in this study and usability scores. Conclusion: Academic affiliation and larger IT operating, capital, and staff budgets are associated with more highly automated clinical information systems.
National Income and Income Inequality, Family Affluence and Life Satisfaction Among 13 year Old Boys and Girls: A Multilevel Study in 35 Countries
Adolescence is a critical period in which many patterns of health and health behaviour are formed. The objective of this study was to investigate cross-national variation in the relationship between family affluence and adolescent life satisfaction, and the impact of national income and income inequality on this relationship. Data from the 2006 Health Behaviour in School-aged Children: WHO Collaborative Study (N = 58,352 across 35 countries) were analysed using multilevel linear and logistic regression for two outcome measures: life satisfaction score and binary high/low life satisfaction. National income and income inequality were associated with aggregated life satisfaction score and prevalence of high life satisfaction. Within-country socioeconomic inequalities in life satisfaction existed even after adjustment for family structure; this relationship was curvilinear and varied cross-nationally. Socioeconomic inequalities were greatest in poor countries and in countries with unequal income distribution. GDP (PPP US$) and the Gini coefficient did not explain between-country variance in socioeconomic inequalities in life satisfaction. The existence of, and variation in, within-country socioeconomic inequalities in adolescent life satisfaction highlights the importance of identifying and addressing mediating factors during this life stage.
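The Gini coefficient used above as the national income-inequality measure is the mean absolute difference between all pairs of incomes, normalised by twice the mean (0 = perfect equality, approaching 1 = maximal inequality). A minimal sketch with toy incomes, not study data:

```python
def gini(incomes):
    """Gini coefficient via the pairwise mean-absolute-difference definition."""
    n = len(incomes)
    mean = sum(incomes) / n
    # Mean absolute difference over all ordered pairs (O(n^2), fine for small n)
    mad = sum(abs(x - y) for x in incomes for y in incomes) / (n * n)
    return mad / (2 * mean)

print(gini([10, 10, 10, 10]))  # 0.0: perfectly equal distribution
print(gini([1, 2, 3, 4, 5]))   # modest inequality
```

For large samples the usual O(n log n) formulation over sorted incomes is equivalent; the quadratic form is shown because it mirrors the definition directly.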
Refining trait resilience: identifying engineering, ecological, and adaptive facets from extant measures of resilience
The current paper presents a new measure of trait resilience derived from three common mechanisms identified in ecological theory: Engineering, Ecological and Adaptive (EEA) resilience. Exploratory and confirmatory factor analyses of five existing resilience scales suggest that the three trait resilience facets emerge and can be reduced to a 12-item scale. The conceptualization and value of EEA resilience within wider trait and well-being psychology is illustrated in terms of differing relationships with adaptive expressions of the traits of the five-factor personality model, and of the contribution to well-being after controlling for personality and coping, or over time. The current findings suggest that EEA resilience is a useful and parsimonious model and measure of trait resilience that can readily be placed within wider trait psychology and that contributes to individual well-being.
Impact of Schistosome Infection on Plasmodium falciparum Malariometric Indices and Immune Correlates in School Age Children in Burma Valley, Zimbabwe
A group of children aged 6–17 years was recruited and followed up for 12 months to study the impact of schistosome infection on malaria parasite prevalence, density, distribution and anemia. Levels of cytokines and malaria-specific antibodies in plasma, and parasite growth inhibition capacities, were assessed. Baseline results suggested an increased prevalence of malaria parasites in children co-infected with schistosomiasis (31%) compared to children infected with malaria only (25%) (p = 0.064). Moreover, children co-infected with schistosomes and malaria had a higher sexual stage geometric mean malaria parasite density (189 gametocytes/µl) than children infected with malaria only (73 gametocytes/µl) (p = 0.043). In addition, a larger percentage of co-infected children (57%) had gametocytes observed by microscopy compared to the malaria-only infected children (36%) (p = 0.06). There was no difference between the two groups in the prevalence of anemia, which was approximately 64% in both groups (p = 0.9). Plasma from malaria-infected children exhibited higher malaria antibody activity compared to the controls (p = 0.001), but activity did not differ between the malaria and the schistosome plus malaria infected groups (p = 0.44). Malaria parasite growth inhibition activity at baseline was higher in the malaria-only infected group than in the co-infected group, though this did not reach statistical significance (p = 0.5). Higher prevalence and higher mean gametocyte density in the peripheral blood may have implications for malaria transmission dynamics during co-infection with helminths.
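The geometric mean reported above for gametocyte densities is the conventional summary for right-skewed parasite counts: the exponential of the mean log density. A small sketch; the individual densities are invented for illustration, since the abstract reports only the group-level means (189 vs 73 gametocytes/µl):

```python
import math

def geometric_mean(densities):
    """Geometric mean: exp of the arithmetic mean of the log values."""
    return math.exp(sum(math.log(d) for d in densities) / len(densities))

# Hypothetical individual densities (gametocytes/ul)
print(round(geometric_mean([50, 100, 200, 400]), 1))
```

Note that zero counts must be handled separately (e.g. restricting to gametocyte-positive children), since log(0) is undefined.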