Providing web-based mental health services to at-risk women
<p>Abstract</p> <p>Background</p> <p>We examined the feasibility of providing web-based mental health services, including synchronous internet video conferencing of an evidence-based support/education group, to at-risk women, specifically poor lone mothers. The objectives of this study were to: (i) adapt a face-to-face support/education group intervention to a web-based format for lone mothers, and (ii) evaluate lone mothers' response to web-based services, including an online video conferencing group intervention program.</p> <p>Methods</p> <p>Participating mothers were recruited through advertisements. To adapt the face-to-face intervention to a web-based format, we evaluated participant motivation through focus group/key informant interviews (n = 7), adapted the intervention training manual for a web-based environment and provided a computer training manual. To evaluate response to web-based services, we provided the intervention to two groups of lone mothers (n = 15). Pre-post quantitative evaluation of mood, self-esteem, social support and parenting was done. Post intervention follow up interviews explored responses to the group and to using technology to access a health service. Participants received $20 per occasion of data collection. Interviews were taped, transcribed and content analysis was used to code and interpret the data. Adherence to the intervention protocol was evaluated.</p> <p>Results</p> <p>Mothers participating in this project experienced multiple difficulties, including financial and mood problems. We adapted the intervention training manual for use in a web-based group environment and ensured adherence to the intervention protocol based on viewing videoconferencing group sessions and discussion with the leaders. 
Participant responses to the group intervention included decreased isolation, and increased knowledge and confidence in themselves and their parenting; the responses closely matched those of mothers who obtained the same service in face-to-face groups. Pre- and post-group quantitative evaluations did not show significant improvements on measures, although the study was not powered to detect these.</p> <p>Conclusions</p> <p>We demonstrated that an evidence-based group intervention program for lone mothers, developed and evaluated in a face-to-face context, transferred well to an online video conferencing format both in terms of group process and outcomes.</p>
Persistent T Cell Repertoire Perturbation and T Cell Activation in HIV After Long Term Treatment
Objective: In people living with HIV (PLHIV), we sought to test the hypothesis that long term anti-retroviral therapy restores the normal T cell repertoire, and to investigate the functional relationship of residual repertoire abnormalities to persistent immune system dysregulation. Methods: We conducted a case-control study in PLHIV and HIV-negative volunteers, of circulating T cell receptor repertoires and whole blood transcriptomes by RNA sequencing, complemented by metadata from routinely collected health care records. Results: T cell receptor sequencing revealed persistent abnormalities in the clonal T cell repertoire of PLHIV, characterized by reduced repertoire diversity and oligoclonal T cell expansion correlated with elevated CD8 T cell counts. We found no evidence that these expansions were driven by cytomegalovirus or another common antigen. Increased frequency of long CDR3 sequences and reduced frequency of public sequences among the expanded clones implicated abnormal thymic selection as a contributing factor. These abnormalities in the repertoire correlated with systems level evidence of persistent T cell activation in genome-wide blood transcriptomes. Conclusions: The diversity of T cell receptor repertoires in PLHIV on long term anti-retroviral therapy remains significantly depleted, and skewed by idiosyncratic clones, partly attributable to altered thymic output and associated with T cell mediated chronic immune activation. Further investigation of thymic function and the antigenic drivers of T cell clonal selection in PLHIV is critical to efforts to fully re-establish normal immune function.
Management and control of tuberculosis in socially complex groups: a research programme including three RCTs
Background: Socially complex groups, including people experiencing homelessness, prisoners and drug users, have very high levels of tuberculosis, often complicated by late diagnosis and difficulty in adhering to treatment.
Objective: To assess a series of interventions to improve tuberculosis control in socially complex groups.
Design: A series of observational surveys, evaluations and trials of interventions.
Setting: The pan-London Find&Treat service, which supports tuberculosis screening and case management in socially complex groups across London.
Participants: Socially complex groups with tuberculosis or at risk of tuberculosis, including people experiencing homelessness, prisoners, drug users and those at high risk of poor adherence to tuberculosis treatment.
Interventions and main outcome measures: We screened 491 people in homeless hostels and 511 people in prison for latent tuberculosis infection, human immunodeficiency virus, hepatitis B and hepatitis C. We evaluated an NHS-led prison radiographic screening programme. We conducted a cluster randomised controlled trial (2348 eligible people experiencing homelessness in 46 hostels) of the effectiveness of peer educators (22 hostels) compared with NHS staff (24 hostels) at encouraging the uptake of mobile radiographic screening. We initiated a trial of the use of point-of-care polymerase chain reaction diagnostics to rapidly confirm tuberculosis alongside mobile radiographic screening. We undertook a randomised controlled trial to improve treatment adherence, comparing face-to-face, directly observed treatment with video-observed treatment using a smartphone application. The primary outcome was completion of ≥ 80% of scheduled treatment observations over the first 2 months following enrolment. We assessed the cost-effectiveness of latent tuberculosis screening alongside radiographic screening of people experiencing homelessness. The costs of video-observed treatment and directly observed treatment were compared.
Results: In the homeless hostels, 16.5% of people experiencing homelessness had latent tuberculosis infection, 1.4% had current hepatitis B infection, 10.4% had hepatitis C infection and 1.0% had human immunodeficiency virus infection. When a quality-adjusted life-year is valued at £30,000, the latent tuberculosis screening of people experiencing homelessness was cost-effective provided treatment uptake was ≥ 25% (for a £20,000 quality-adjusted life-year threshold, treatment uptake would need to be > 50%). In prison, 12.6% of prisoners had latent tuberculosis infection, 1.9% had current hepatitis B infection, 4.2% had hepatitis C infection and 0.0% had human immunodeficiency virus infection. In both settings, levels of latent tuberculosis infection and blood-borne viruses were higher among injecting drug users. A total of 1484 prisoners were screened using chest radiography over a total of 112 screening days (new prisoner screening coverage was 43%). Twenty-nine radiographs were reported as potentially indicating tuberculosis. One prisoner began, and completed, antituberculosis treatment in prison. In the cluster randomised controlled trial of peer educators to increase screening uptake, the median uptake was 45% in the control arm and 40% in the intervention arm (adjusted risk ratio 0.98, 95% confidence interval 0.80 to 1.20). A rapid diagnostic service was established on the mobile radiographic unit but the trial of rapid diagnostics was abandoned because of recruitment and follow-up difficulties. We randomly assigned 112 patients to video-observed treatment and 114 patients to directly observed treatment. Fifty-eight per cent of those recruited had a history of homelessness, addiction, imprisonment or severe mental health problems. 
Seventy-eight (70%) of 112 patients on video-observed treatment achieved the primary outcome, compared with 35 (31%) of 114 patients on directly observed treatment (adjusted odds ratio 5.48, 95% confidence interval 3.10 to 9.68; p < 0.0001). Video-observed treatment was superior to directly observed treatment in all demographic and social risk factor subgroups. The cost for 6 months of treatment observation was £1645 for daily video-observed treatment, £3420 for directly observed treatment three times per week and £5700 for directly observed treatment five times per week.
Limitations: Recruitment was lower than anticipated for most of the studies. The peer educator study may have been contaminated by the fact that the service was already using peer educators to support its work.
Conclusions: There are very high levels of latent tuberculosis infection among prisoners, people experiencing homelessness and drug users. Screening for latent infection in people experiencing homelessness alongside mobile radiographic screening would be cost-effective, provided the uptake of treatment was 25–50%. Despite ring-fenced funding, the NHS was unable to establish static radiographic screening programmes. Although we found no evidence that peer educators were more effective than health-care workers in encouraging the uptake of mobile radiographic screening, there may be wider benefits of including peer educators as part of the Find&Treat team. Utilising polymerase chain reaction-based rapid diagnostic testing on a mobile radiographic unit is feasible. Smartphone-enabled video-observed treatment is more effective and cheaper than directly observed treatment for ensuring that treatment is observed.
Future work: Trials of video-observed treatment in high-incidence settings are needed.
Trial registration: Current Controlled Trials ISRCTN17270334 and ISRCTN26184967.
Funding: This project was funded by the National Institute for Health Research (NIHR) Programme Grants for Applied Research programme and will be published in full in Programme Grants for Applied Research; Vol. 8, No. 9. See the NIHR Journals Library website for further project information.
Evolution favors protein mutational robustness in sufficiently large populations
BACKGROUND: An important question is whether evolution favors properties such
as mutational robustness or evolvability that do not directly benefit any
individual, but can influence the course of future evolution. Functionally
similar proteins can differ substantially in their robustness to mutations and
capacity to evolve new functions, but it has remained unclear whether any of
these differences might be due to evolutionary selection for these properties.
RESULTS: Here we use laboratory experiments to demonstrate that evolution
favors protein mutational robustness if the evolving population is sufficiently
large. We neutrally evolve cytochrome P450 proteins under identical selection
pressures and mutation rates in populations of different sizes, and show that
proteins from the larger and thus more polymorphic population tend towards
higher mutational robustness. Proteins from the larger population also evolve
greater stability, a biophysical property that is known to enhance both
mutational robustness and evolvability. The excess mutational robustness and
stability is well described by existing mathematical theories, and can be
quantitatively related to the way that the proteins occupy their neutral
network.
CONCLUSIONS: Our work is the first experimental demonstration of the general
tendency of evolution to favor mutational robustness and protein stability in
highly polymorphic populations. We suggest that this phenomenon may contribute
to the mutational robustness and evolvability of viruses and bacteria that
exist in large populations.
PUGeo-Net: A Geometry-centric Network for 3D Point Cloud Upsampling
This paper addresses the problem of generating uniform dense point clouds to
describe the underlying geometric structures from given sparse point clouds.
Due to the irregular and unordered nature, point cloud densification as a
generative task is challenging. To tackle the challenge, we propose a novel
deep neural network based method, called PUGeo-Net, which learns a
linear transformation matrix for each input point. This matrix
approximates the augmented Jacobian matrix of a local parameterization and
builds a one-to-one correspondence between the 2D parametric domain and the 3D
tangent plane so that we can lift the adaptively distributed 2D samples (which
are also learned from data) to 3D space. After that, we project the samples to
the curved surface by computing a displacement along the normal of the tangent
plane. PUGeo-Net is fundamentally different from the existing deep learning
methods that are largely motivated by the image super-resolution techniques and
generate new points in the abstract feature space. Thanks to its
geometry-centric nature, PUGeo-Net works well for both CAD models with sharp
features and scanned models with rich geometric details. Moreover, PUGeo-Net
can compute the normal for the original and generated points, which is highly
desired by the surface reconstruction algorithms. Computational results show
that PUGeo-Net, the first neural network that can jointly generate vertex
coordinates and normals, consistently outperforms the state-of-the-art in terms
of accuracy and efficiency across upsampling factors.
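The lifting-and-projection step described above can be illustrated with a minimal numerical sketch. This is only an illustration of the geometric idea, not the trained network: the transformation matrix, 2D samples, and normal displacements below are hand-constructed for a unit sphere, whereas PUGeo-Net learns all three from data.

```python
import numpy as np

def upsample_point(p, jacobian, normal, uv_samples, displacements):
    """Lift 2D parametric samples to 3D around an input point p.

    p: (3,) input point on the surface
    jacobian: (3, 2) local linearization of the parameterization
    normal: (3,) unit surface normal at p
    uv_samples: (r, 2) samples in the 2D parametric domain
    displacements: (r,) offsets along the normal
    """
    # Map 2D samples onto the 3D tangent plane at p.
    tangent_pts = p + uv_samples @ jacobian.T
    # Project each tangent-plane sample onto the curved surface
    # by displacing it along the normal.
    return tangent_pts + displacements[:, None] * normal

# Toy example: upsample around the north pole of the unit sphere.
rng = np.random.default_rng(0)
p = np.array([0.0, 0.0, 1.0])
normal = np.array([0.0, 0.0, 1.0])
J = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # tangent-plane basis
uv = rng.uniform(-0.1, 0.1, size=(4, 2))
# Displacements that snap tangent-plane points back onto the sphere.
disp = np.sqrt(1.0 - (uv ** 2).sum(axis=1)) - 1.0
dense = upsample_point(p, J, normal, uv, disp)
print(np.allclose(np.linalg.norm(dense, axis=1), 1.0))  # prints True
```

In the actual network the per-point matrix, the distribution of `uv` samples, and the displacements are all predicted from local point features rather than derived analytically as here.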
The link between volcanism and plutonism in epizonal magma systems; high-precision U–Pb zircon geochronology from the Organ Mountains caldera and batholith, New Mexico
The Organ Mountains caldera and batholith expose the volcanic and epizonal plutonic record of an Eocene caldera complex. The caldera and batholith are well exposed, and extensive previous mapping and geochemical analyses have suggested a clear link between the volcanic and plutonic sections, making this an ideal location to study magmatic processes associated with caldera volcanism. Here we present high-precision thermal ionization mass spectrometry U–Pb zircon dates from throughout the caldera and batholith, and use these dates to test and improve existing petrogenetic models. The new dates indicate that Eocene volcanic and plutonic rocks in the Organ Mountains formed from ~44 to 34 Ma. The three largest caldera-related tuff units yielded weighted mean ²⁰⁶Pb/²³⁸U dates of 36.441 ± 0.020 Ma (Cueva Tuff), 36.259 ± 0.016 Ma (Achenback Park tuff), and 36.215 ± 0.016 Ma (Squaw Mountain tuff). An alkali feldspar granite, which is chemically similar to the erupted tuffs, yielded a synchronous weighted mean ²⁰⁶Pb/²³⁸U date of 36.259 ± 0.021 Ma. Weighted mean ²⁰⁶Pb/²³⁸U dates from the larger volume syenitic phase of the underlying Organ Needle pluton range from 36.130 ± 0.031 to 36.071 ± 0.012 Ma, and the youngest sample is 144 ± 20 to 188 ± 20 ka younger than the Squaw Mountain and Achenback Park tuffs, respectively. Younger plutonism in the batholith continued through at least 34.051 ± 0.029 Ma. We propose that the Achenback Park tuff, Squaw Mountain tuff, alkali feldspar granite and Organ Needle pluton formed from a single, long-lived magma chamber/mush zone. Early silicic magmas generated by partial melting of the lower crust rose to form an epizonal magma chamber. Underplating of the resulting mush zone led to partial melting and generation of a high-silica alkali feldspar granite cap, which erupted to form the tuffs.
The deeper parts of the chamber underwent continued recharge and crystallization for 144 ± 20 ka after the final eruption. Calculated magmatic fluxes for the Organ Needle pluton range from 0.0006 to 0.0030 km³/year, in agreement with estimates from other well-studied plutons. The petrogenetic evolution proposed here may be common to many small-volume silicic volcanic systems.
The duration of protection of school-aged BCG vaccination in England: a population-based case–control study
BACKGROUND: Evidence of protection from childhood Bacillus Calmette-Guerin (BCG) against tuberculosis (TB) in adulthood, when most transmission occurs, is important for TB control and resource allocation.
METHODS: We conducted a population-based case–control study of protection by BCG given to children aged 12–13 years against tuberculosis occurring 10–29 years later. We recruited UK-born White subjects with tuberculosis and randomly sampled White community controls. Hazard ratios and 95% confidence intervals (CIs) were estimated using case–cohort Cox regression, adjusting for potential confounding factors, including socio-economic status, smoking, drug use, prison and homelessness. Vaccine effectiveness (VE = 1 – hazard ratio) was assessed at successive intervals more than 10 years following vaccination.
RESULTS: We obtained 677 cases and 1170 controls after a 65% response rate in both groups. Confounding by deprivation, education and lifestyle factors was slight 10–20 years after vaccination, and more evident after 20 years. VE 10–15 years after vaccination was 51% (95% CI 21, 69%) and 57% (CI 33, 72%) at 15–20 years. Subsequently, BCG protection appeared to wane; 20–25 years VE = 25% (CI –14%, 51%) and 25–29 years VE = 1% (CI –84%, 47%). Based on multiple imputation of missing data (in 17% subjects), VE estimated in the same intervals after vaccination were similar [56% (CI 33, 72%), 57% (CI 36, 71%), 25% (–10, 48%), 21% (–39, 55%)].
CONCLUSIONS: School-aged BCG vaccination offered moderate protection against tuberculosis for at least 20 years, which is longer than previously thought. This has implications for assessing the cost-effectiveness of BCG vaccination and when evaluating new TB vaccines.
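The conversion VE = 1 − hazard ratio used in the study above can be sketched as a short helper. Note that because VE is a decreasing function of the hazard ratio, the confidence bounds swap when converted; the hazard-ratio input below is back-calculated from the reported 10–15 year estimate and is purely illustrative.

```python
def vaccine_effectiveness(hr, hr_ci):
    """Convert a hazard ratio and its CI to vaccine effectiveness.

    VE = 1 - HR; the CI bounds swap because VE decreases as HR increases.
    """
    lo, hi = hr_ci
    return 1.0 - hr, (1.0 - hi, 1.0 - lo)

# A hazard ratio of 0.49 (95% CI 0.31 to 0.79) corresponds to the
# reported 10-15 year estimate of VE 51% (95% CI 21%, 69%).
ve, (ve_lo, ve_hi) = vaccine_effectiveness(0.49, (0.31, 0.79))
print(f"VE = {ve:.0%} (95% CI {ve_lo:.0%}, {ve_hi:.0%})")
# prints: VE = 51% (95% CI 21%, 69%)
```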
Systematic assessment with I-SCAN magnification endoscopy and acetic acid improves dysplasia detection in patients with Barrett's esophagus
BACKGROUND AND STUDY AIMS: Enhanced endoscopic imaging with chromoendoscopy may improve dysplasia recognition in patients undergoing assessment of Barrett's esophagus (BE). This may reduce the need for random biopsies while detecting more dysplasia. The aim of this study was to assess the effect of magnification endoscopy with I-SCAN (Pentax, Tokyo, Japan) and acetic acid (ACA) on dysplasia detection in BE using a novel mucosal and vascular classification system. METHODS: BE segments and suspicious lesions were recorded with high definition white-light and magnification endoscopy enhanced using all I-SCAN modes in combination. We created a novel mucosal and vascular classification system based on similar previously validated classifications for narrow-band imaging (NBI). A total of 27 videos were rated before and after ACA application. Following validation, a further 20 patients had their full endoscopies recorded and analyzed to model use of the system to detect dysplasia in a routine clinical scenario. RESULTS: The accuracy of the I-SCAN classification system for BE dysplasia improved with I-SCAN magnification from 69 % to 79 % post-ACA (P = 0.01). In the routine clinical scenario model in 20 new patients, accuracy of dysplasia detection increased from 76 % using a "pull-through" alone to 83 % when ACA and magnification endoscopy were combined (P = 0.047). Overall interobserver agreement between experts for dysplasia detection was substantial (0.69). CONCLUSIONS: A new I-SCAN classification system for BE was validated against similar systems for NBI with similar outcomes. When used in combination with magnification and ACA, the classification detected BE dysplasia in clinical practice with good accuracy. Trial registered at ISRCTN (58235785).
Improvement over time in outcomes for patients undergoing endoscopic therapy for Barrett's oesophagus-related neoplasia: 6-year experience from the first 500 patients treated in the UK patient registry.
BACKGROUND: Barrett's oesophagus (BE) is a pre-malignant condition leading to oesophageal adenocarcinoma (OAC). Treatment of neoplasia at an early stage is desirable. Combined endoscopic mucosal resection (EMR) followed by radiofrequency ablation (RFA) is an alternative to surgery for patients with BE-related neoplasia. METHODS: We examined prospective data from the UK registry of patients undergoing RFA/EMR for BE-related neoplasia from 2008 to 2013. Before RFA, visible lesions were removed by EMR. Thereafter, patients had RFA 3-monthly until all BE was ablated or cancer developed (endpoints). End of treatment biopsies were recommended at around 12 months from first RFA treatment or when endpoints were reached. Outcomes for clearance of dysplasia (CR-D) and BE (CR-IM) at end of treatment were assessed over two time periods (2008-2010 and 2011-2013). Durability of successful treatment and progression to OAC were also evaluated. RESULTS: 508 patients have completed treatment. CR-D and CR-IM improved significantly between the former and later time periods, from 77% and 56% to 92% and 83%, respectively (p<0.0001). EMR for visible lesions prior to RFA increased from 48% to 60% (p=0.013). Rescue EMR after RFA decreased from 13% to 2% (p<0.0001). Progression to OAC at 12 months is not significantly different (3.6% vs 2.1%, p=0.51). CONCLUSIONS: Clinical outcomes for BE neoplasia have improved significantly over the past 6 years with improved lesion recognition and aggressive resection of visible lesions before RFA. Despite advances in technique, the rate of cancer progression remains 2-4% at 1 year in these high-risk patients. TRIAL REGISTRATION NUMBER: ISRCTN93069556