
    The Effect of Epstein-Barr Virus Latent Membrane Protein 2 Expression on the Kinetics of Early B Cell Infection

    Infection of human B cells with wild-type Epstein-Barr virus (EBV) in vitro leads to activation and proliferation that result in efficient production of lymphoblastoid cell lines (LCLs). Latent Membrane Protein 2 (LMP2) is expressed early after infection, and previous research has suggested a possible role for it in this process. We therefore generated recombinant EBVs with knockouts of either or both protein isoforms, LMP2A and LMP2B (Δ2A, Δ2B, Δ2A/Δ2B), to study the effect of LMP2 in early B cell infection. Infection of B cells with Δ2A and Δ2A/Δ2B viruses led to a marked decrease in activation and proliferation relative to wild-type (wt) virus and resulted in higher percentages of apoptotic B cells. Δ2B virus infection showed activation levels comparable to wt, but fewer proliferating B cells. Early B cell infection with wt, Δ2A, and Δ2B viruses did not result in changes in latent gene expression, with the exception of elevated LMP2B transcript levels in Δ2A virus infection. Infection with Δ2A and Δ2B viruses did not affect viral latency, as determined by changes in LMP1/Zebra expression following BCR stimulation. However, BCR stimulation of Δ2A/Δ2B cells resulted in decreased LMP1 expression, which suggests a loss of stability in viral latency. Long-term outgrowth assays revealed that LMP2A, but not LMP2B, is critical for efficient long-term growth of B cells in vitro. The lowest levels of activation, proliferation, and LCL formation were observed when both isoforms were deleted. These results suggest that LMP2A is critical for efficient activation, proliferation, and survival of EBV-infected B cells at early times after infection, which in turn affects the efficient long-term growth of B cells in culture. In contrast, LMP2B did not appear to play a significant role in these processes, and long-term growth of infected B cells was not affected by the absence of this protein. © 2013 Wasil et al.

    Specification of progression in glaucomatous visual field loss, applying locally condensed stimulus arrangements

    The goals of this work were to (i) determine patterns of progression in glaucomatous visual field loss, (ii) compare the detection rate of progression between locally condensed stimulus arrangements and the conventional 6° × 6° grid, and (iii) assess the individual frequency distribution of test locations exhibiting a local event (i.e., an abrupt local deterioration of differential luminance sensitivity (DLS) by more than 10 dB between any two examinations). The visual function of 41 glaucomatous eyes of 41 patients (16 females, 25 males, 37 to 75 years old) was examined with automated static perimetry (Tuebingen Computer Campimeter or Octopus 101-Perimeter). Stimuli were added to locally enhance the spatial resolution in suspicious regions of the visual field. The minimum follow-up was four sessions, with intervals of at least 2 months (median 6 months) between sessions. Progression was identified using a modified pointwise linear regression (PLR) method and a modified Katz criterion. The presence of events was assessed in all progressive visual fields. Eleven eyes (27%) showed progression over the study period (median 2.5 years, range 1.3–8.6 years). Six of these (55%) showed combined progression in depth and size, and five (45%) progressed in depth only. Progression in size always conformed to the course of the nerve fibers. Seven of the 11 (64%) progressive scotomata detected by spatially condensed grids would have been missed by the conventional 6° × 6° grid. At least one event occurred in 64% of all progressive eyes. Five of the 11 (45%) progressive eyes showed a cluster of events. The most common pattern of progression in glaucomatous visual fields is combined progression in depth and size of an existing scotoma. Applying individually condensed test grids markedly enhances the detection rate of glaucomatous visual field deterioration (at the expense of increased examination time) compared to conventional stimulus arrangements.
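
    The modified PLR criteria are not specified in the abstract; as a rough illustration, the pointwise approach can be sketched as below, assuming per-location sensitivity series in dB, an illustrative slope cutoff of -1 dB/year with p < 0.05, and the event definition above (a drop of more than 10 dB between any two examinations).

```python
# Sketch of pointwise linear regression (PLR) for visual field progression.
# The slope cutoff (-1 dB/year) and alpha (0.05) are illustrative
# assumptions, not the modified PLR criteria used in the study.
from scipy import stats

def plr_progressing(times_years, sensitivities_db, slope_cutoff=-1.0, alpha=0.05):
    """True if this test location shows a significant negative trend."""
    fit = stats.linregress(times_years, sensitivities_db)
    return fit.slope < slope_cutoff and fit.pvalue < alpha

def local_event(sensitivities_db, drop_db=10.0):
    """True if sensitivity drops by more than drop_db between any two exams."""
    return any(later - earlier < -drop_db
               for i, earlier in enumerate(sensitivities_db)
               for later in sensitivities_db[i + 1:])

# One test location followed over five sessions:
times = [0.0, 0.5, 1.0, 1.5, 2.0]       # years since baseline
sens = [28.0, 26.5, 25.0, 23.0, 14.0]   # dB
print(plr_progressing(times, sens))     # True: slope ~ -6.3 dB/year, p < 0.05
print(local_event(sens))                # True: 25 -> 14 dB is an 11 dB drop
```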

    Hardship financing of healthcare among rural poor in Orissa, India

    Background: This study examines health-related "hardship financing" in order to gain better insight into how poor households finance their out-of-pocket healthcare costs. We define hardship financing as having to borrow money with interest or to sell assets to pay out-of-pocket healthcare costs. Methods: Using survey data from 5,383 low-income households in Orissa, one of the poorest states of India, we investigate factors influencing the risk of hardship financing using logistic regression. Results: Overall, about 25% of the households (among those that had any healthcare cost) reported hardship financing during the year preceding the survey. Among households that experienced a hospitalization, this percentage was nearly 40%, but even among households with outpatient or maternity-related care, around 25% experienced hardship financing. Hardship financing is explained not merely by the wealth of the household (measured by assets) or by how much is spent out of pocket on healthcare, but also by when the payment occurs, its frequency, and its duration (e.g., it is more severe in cases of chronic illness). The location where a household resides remains a major predictor of the likelihood of hardship financing even with all other household features included in the model. Conclusions: Rural poor households are subjected to considerable and protracted financial hardship through the indirect and longer-term deleterious effects of how they cope with out-of-pocket healthcare costs. The social network that a household can access influences its exposure to hardship financing. Our findings point to the need for a policy solution that would limit that exposure in both amount and duration. We therefore conclude that policy interventions aiming to ensure health-related financial protection would have to demonstrate that they reduce both the frequency and the volume of hardship financing.
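
    As a minimal sketch of the kind of regression described above (all variable and file names here are hypothetical stand-ins, not the study's actual survey fields):

```python
# Logistic regression of hardship financing on household characteristics.
# Column names (hardship, assets_index, oop_spend, chronic_illness, district)
# are hypothetical placeholders for the survey variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("orissa_household_survey.csv")  # hypothetical file

# hardship = 1 if the household borrowed with interest or sold assets
# to pay out-of-pocket healthcare costs, else 0.
model = smf.logit(
    "hardship ~ assets_index + oop_spend + chronic_illness + C(district)",
    data=df,
).fit()
print(model.summary())       # coefficients on the log-odds scale
print(np.exp(model.params))  # odds ratios
```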

    Integration and fusion of standard automated perimetry and optical coherence tomography data for improved automated glaucoma diagnostics

    Background: The performance of glaucoma diagnostic systems could conceivably be improved by integrating functional and structural test measurements that provide relevant and complementary information for reaching a diagnosis. The purpose of this study was to investigate the performance of data fusion methods and of techniques for the simple combination of Standard Automated Perimetry (SAP) and Optical Coherence Tomography (OCT) data for the diagnosis of glaucoma using Artificial Neural Networks (ANNs). Methods: Humphrey 24-2 SITA standard SAP and StratusOCT tests were prospectively collected from a randomly selected population of 125 healthy persons and 135 patients with glaucomatous optic nerve heads and used as input for the ANNs. We tested commercially available standard parameters as well as novel ones (fused OCT and SAP data) that exploit the spatial relationship between visual field areas and sectors of the OCT peripapillary scan circle. We evaluated the performance of these SAP- and OCT-derived parameters both separately and in combination. Results: The diagnostic accuracy of a combination of fused SAP and OCT data (95.39%) was higher than that of the best conventional parameters of either instrument, i.e., the SAP Glaucoma Hemifield Test (p < 0.001) and OCT Retinal Nerve Fiber Layer Thickness abnormal in ≥ 1 quadrant (p = 0.031). Fused OCT and combined fused OCT and SAP data provided similar Area under the Receiver Operating Characteristic Curve (AROC) values of 0.978, significantly larger (p = 0.047) than those of ANNs using SAP parameters alone (AROC = 0.945). On the other hand, ANNs based on the OCT parameters (AROC = 0.970) did not perform significantly worse than the ANNs based on the fused or combined forms of input data. The use of fused input increased the number of tests that were correctly classified by both SAP- and OCT-based ANNs. Conclusions: Compared to the use of SAP parameters alone, input from the combination of fused OCT and SAP parameters, and from fused OCT data, significantly increased the performance of the ANNs. Integrating parameters by including a priori relevant information through data fusion may improve ANN classification accuracy compared to currently available methods.
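
    A minimal sketch of this classification setup, using scikit-learn rather than whatever ANN implementation the authors used; the feature layout (SAP values and OCT sector thicknesses concatenated per eye) and the synthetic data are assumptions for illustration only.

```python
# ANN over combined SAP + OCT features, evaluated by cross-validated AROC.
# In the study, fused features pair visual field areas with matching OCT
# peripapillary sectors; here we simply concatenate the two vectors per eye.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_eyes = 260                              # 125 healthy + 135 glaucoma
sap = rng.normal(size=(n_eyes, 52))       # placeholder 24-2 field values
oct_rnfl = rng.normal(size=(n_eyes, 12))  # placeholder RNFL sector thickness
y = rng.integers(0, 2, size=n_eyes)       # placeholder labels: 1 = glaucoma

X = np.hstack([sap, oct_rnfl])            # combined SAP + OCT input
ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000))
proba = cross_val_predict(ann, X, y, cv=10, method="predict_proba")[:, 1]
print("AROC:", roc_auc_score(y, proba))   # with real data, compare modalities
```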

    Quantification of Visual Field Loss in Age-Related Macular Degeneration

    Background: An evaluation of standard automated perimetry (SAP) and short-wavelength automated perimetry (SWAP) for the central 10–2 visual field test procedure in patients with age-related macular degeneration (AMD) is presented, in order to determine methods of quantifying the central sensitivity loss in patients at various stages of AMD. Methods: 10–2 SAP and SWAP Humphrey visual fields and stereoscopic fundus photographs were collected from 27 eyes of 27 patients with AMD and 22 eyes of 22 normal subjects. Results: Mean Deviation and Pattern Standard Deviation (PSD) varied significantly with stage of disease in SAP (both p<0.001) and SWAP (both p<0.001), but post hoc analysis revealed overlap of functional values among stages. In SWAP, indices of focal loss were more sensitive in detecting differences between AMD and normal. SWAP defects were greater in depth and area than those in SAP. Central sensitivity (within 1°) changed by −3.9 and −4.9 dB per stage in SAP and SWAP, respectively. Based on defect maps, an AMD Severity Index was derived. Conclusions: Global indices of focal loss were more sensitive in distinguishing early-stage AMD from normal. The SWAP sensitivity decline with advancing stage of AMD was greater than that in SAP. A new AMD Severity Index quantifies visual field defects on a continuous scale. Although not all patients are suitable for SWAP examinations, SWAP is of value as a tool in research studies of visual loss in AMD.
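
    The abstract does not give the derivation of the AMD Severity Index; as a purely hypothetical sketch, a defect map can be reduced to one continuous score, for example by averaging pointwise loss with extra weight on central locations.

```python
# Hypothetical defect-map severity score (NOT the study's actual index).
# Averages pointwise dB loss over the 10-2 grid, weighting central points
# more heavily, to yield one continuous number per eye.
import numpy as np

def severity_index(total_deviation_db, eccentricity_deg):
    """total_deviation_db: per-point deviation from age-matched normals
    (negative = loss); eccentricity_deg: each point's distance from fixation."""
    weights = 1.0 / (1.0 + np.asarray(eccentricity_deg))        # favor the center
    loss = np.clip(-np.asarray(total_deviation_db), 0.0, None)  # dB of loss
    return float(np.average(loss, weights=weights))

# Three points at 1 deg, 5 deg, 9 deg with -6, -2, 0 dB deviations:
print(severity_index([-6.0, -2.0, 0.0], [1.0, 5.0, 9.0]))  # ~4.35
```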

    Relationship between visual field loss and contrast threshold elevation in glaucoma

    BACKGROUND: There is a considerable body of literature indicating that contrast thresholds for the detection of sinusoidal grating patterns are abnormally high in glaucoma, though just how these elevations relate to the location of visual field loss remains unknown. Our aim, therefore, was to determine the relationship between contrast threshold elevation and visual field loss in corresponding regions of the peripheral visual field in glaucoma patients. METHODS: Contrast thresholds were measured in arcuate regions of the superior, inferior, nasal, and temporal visual field in response to laser interference fringes presented in Maxwellian view. The display consisted of vertical, green, stationary laser interference fringes of spatial frequency 1.0 c deg⁻¹, which appeared in a rotatable viewing area in the form of a truncated quadrant extending from 10° to 20° from fixation, marked with a central fixation light. Results were obtained from 36 normal control subjects in order to provide a normal reference for 21 glaucoma patients and 5 OHT (ocular hypertensive) patients for whom full clinical data, including Friedmann visual fields, had been obtained. RESULTS: Abnormally high contrast thresholds were identified in 20 of 21 glaucoma patients and in 2 of 5 OHT patients when compared with the 95% upper prediction limit for normal values from one eye of the 36 age-matched normal control subjects. Additionally, inter-ocular differences in contrast threshold were abnormally high, compared with the 95% upper prediction limit, in 18 of the 20 glaucoma patients who had vision in both eyes. Correspondence between abnormally high contrast thresholds and visual field loss in the truncated quadrants was significant in 5 patients, borderline in 4 patients, and absent in 9 patients. CONCLUSION: While the glaucoma patients tested in our study almost invariably had abnormally high contrast thresholds in one or more of the truncated quadrants in at least one eye, reasonable correspondence with the location of the visual field loss occurred in only half the patients studied. Hence, while contrast threshold elevations are indicative of glaucomatous damage to vision, they provide a different assessment of visual function from conventional visual field tests.
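
    The comparison against the 95% upper prediction limit can be sketched as follows; the abstract does not give the exact computation, so this assumes the standard one-sided prediction limit for a single new observation, mean + t(0.95, n-1) · s · sqrt(1 + 1/n).

```python
# 95% upper prediction limit for a new observation from n control values.
# Formula assumed (not stated in the abstract): mean + t * s * sqrt(1 + 1/n).
import numpy as np
from scipy import stats

def upper_prediction_limit(controls, level=0.95):
    x = np.asarray(controls, dtype=float)
    n = x.size
    t = stats.t.ppf(level, df=n - 1)
    return x.mean() + t * x.std(ddof=1) * np.sqrt(1.0 + 1.0 / n)

# Synthetic log contrast thresholds from 36 controls:
rng = np.random.default_rng(1)
controls = rng.normal(loc=-1.8, scale=0.15, size=36)
print(f"thresholds above {upper_prediction_limit(controls):.2f} are abnormal")
```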

    Climate change impacts and adaptation in forest management: a review
