Differences in the Activity of Endogenous Bone Morphogenetic Protein Signaling Impact on the Ability of Induced Pluripotent Stem Cells to Differentiate to Corneal Epithelial-Like Cells
The cornea is the clear outermost layer of the eye that transmits light onto the retina. The transparent corneal epithelium is regenerated by limbal stem cells (LSCs), whose loss or dysfunction results in LSC deficiency (LSCD). Ex vivo expansion of autologous LSCs obtained from a patient's healthy eye, followed by transplantation onto the LSC-deficient eye, has provided a successful treatment for unilateral LSCD. However, this approach is not applicable to patients with total bilateral LSCD, in whom LSCs are lost or damaged in both eyes. We investigated the potential of human induced pluripotent stem cells (hiPSCs) to differentiate into corneal epithelial-like cells as a source of autologous stem cell treatment for patients with total bilateral LSCD. Our study showed that combined addition of bone morphogenetic protein 4 (BMP4), all-trans-retinoic acid and epidermal growth factor for the first 9 days of differentiation, followed by replating the cells on collagen IV-coated surfaces in a corneal-specific epithelial cell medium for an additional 11 days, resulted in stepwise differentiation of human embryonic stem cells (hESCs) to corneal epithelial progenitors and mature corneal epithelial-like cells. We observed differences in the ability of hiPSC lines to undergo differentiation to corneal epithelial-like cells, which were dependent on the level of endogenous BMP signaling and could be restored by activating this pathway with a specific transforming growth factor β inhibitor (SB431542). Together, our data reveal a differential ability of hiPSC lines to generate corneal epithelial cells that is underlined by the activity of the endogenous BMP signaling pathway.
Primary vs. Secondary Antibody Deficiency: Clinical Features and Infection Outcomes of Immunoglobulin Replacement
Secondary antibody deficiency can occur as a result of haematological malignancies or certain medications, but little is known about the clinical and immunological features of this group of patients as a whole. Here we describe a cohort of 167 patients with primary or secondary antibody deficiencies on immunoglobulin (Ig)-replacement treatment. The demographics, causes of immunodeficiency, diagnostic delay, clinical and laboratory features, and infection frequency were analysed retrospectively. Chemotherapy for B cell lymphoma and the use of rituximab, corticosteroids or immunosuppressive medications were the most common causes of secondary antibody deficiency in this cohort. There was no difference in diagnostic delay or bronchiectasis between primary and secondary antibody deficiency patients, and both groups experienced disorders associated with immune dysregulation. Secondary antibody deficiency patients had similar baseline levels of serum IgG, but higher IgM and IgA, and a higher frequency of switched memory B cells than primary antibody deficiency patients. Serious and non-serious infections before and after Ig-replacement were also compared in both groups. Although secondary antibody deficiency patients had more serious infections before initiation of Ig-replacement, treatment resulted in a significant reduction of serious and non-serious infections in both primary and secondary antibody deficiency patients. Patients with secondary antibody deficiency experience similar delays in diagnosis as primary antibody deficiency patients and can also benefit from immunoglobulin-replacement treatment.
Of cattle, sand flies and men: a systematic review of risk factor analyses for South Asian visceral leishmaniasis and implications for elimination
Background: Studies performed over the past decade have identified fairly consistent epidemiological patterns of risk factors for visceral leishmaniasis (VL) in the Indian subcontinent.
Methods and Principal Findings: To inform the current regional VL elimination effort and identify key gaps in knowledge, we performed a systematic review of the literature, with a special emphasis on data regarding the role of cattle, because primary risk factor studies have yielded apparently contradictory results. Because humans form the sole infection reservoir, clustering of kala-azar cases is a prominent epidemiological feature, both at the household level and on a larger scale. Subclinical infection also tends to show clustering around kala-azar cases. Within villages, areas become saturated over a period of several years; kala-azar incidence then decreases while neighboring areas see increases. More recently, post-kala-azar dermal leishmaniasis (PKDL) cases have followed kala-azar peaks. Mud walls, palpable dampness in houses, and peridomestic vegetation may increase infection risk through enhanced density and prolonged survival of the sand fly vector. Bed net use, sleeping on a cot and indoor residual spraying are generally associated with decreased risk. Poor micronutrient status increases the risk of progression to kala-azar. The presence of cattle is associated with increased risk in some studies and decreased risk in others, reflecting the complexity of the effect of bovines on sand fly abundance, aggregation, feeding behavior and leishmanial infection rates. Poverty is an overarching theme, interacting with individual risk factors on multiple levels.
Conclusions: Carefully designed demonstration projects, taking into account the complex web of interconnected risk factors, are needed to provide direct proof of principle for elimination and to identify the most effective maintenance activities to prevent a rapid resurgence when interventions are scaled back. More effective, short-course treatment regimens for PKDL are urgently needed to enable the elimination initiative to succeed.
Family treatment of child anxiety: outcomes, limitations and future directions
Childhood anxiety is a common and serious condition. The past decade has seen an increase in treatment-focussed research, with recent trials tending to give greater attention to parents in the treatment process. This review examines the efficacy of family-based cognitive behaviour therapy and attempts to delineate some of the factors that might have an impact on its efficacy. The choice and timing of outcome measures, the age and gender of the child, the level of parental anxiety, the severity and type of child anxiety, and the treatment format and content are scrutinised. The main conclusions are necessarily tentative, but it seems likely that Family Cognitive Behaviour Therapy (FCBT) is superior to no treatment and, for some outcome measures, also superior to Child Cognitive Behaviour Therapy (CCBT). Where FCBT is successful, the results are consistently maintained at follow-up. It appears that where a parent is anxious and this is not addressed, outcomes are poorer. However, for children of anxious parents, FCBT is probably more effective than CCBT. What is most clear is that large, well-designed studies, examining these factors alone and in combination, are now needed.
The Role of Recipient Characteristics in Health Video Communication Outcomes: Scoping Review
Background:
The importance of effective communication during public health emergencies has been highlighted by the World Health Organization, which has published guidelines for effective communication in such situations. With video being a popular medium, video communication has been a growing area of study over the past decades and is increasingly used across different sectors and disciplines, including health. Health-related video communication gained momentum during the SARS-CoV-2 pandemic, when video was among the most frequently used modes of communication worldwide. However, although much research has been done on the characteristics of video content (the message) and its delivery (the messenger), little is known about the role played by the characteristics of the recipients in creating effective communication.
Objective:
The aim of this review is to identify how health video communication outcomes are shaped by recipient characteristics, as such characteristics might affect the effectiveness of communication. The main research question of the study is as follows: do the characteristics of the recipients of health videos affect the outcomes of the communication?
Methods:
A scoping review describing the existing knowledge within the field was conducted. We searched for literature in 3 databases (PubMed, Scopus, and Embase) and defined eligibility criteria based on the relevance to the research question. Recipient characteristics and health video communication outcomes were identified and classified.
Results:
Of the 1040 documents initially identified, 128 (12.31%) met the criteria for full-text assessment, and 39 (3.75%) met the inclusion criteria. The included studies reported 56 recipient characteristics and 42 communication outcomes. The reported associations between characteristics and outcomes were identified, and the potential research opportunities were discussed. Contributions were made to theory development by amending the existing framework of the Integrated-Change model, which is an integrated model of motivational and behavioral change.
Conclusions:
Although several recipient characteristics and health video communication outcomes were identified, there is a lack of robust empirical evidence on the associations between them. Further research is needed to understand how recipients' characteristics might affect the various outcomes of health video communication.
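The screening figures reported in the Results above (1040 documents identified, 128 assessed in full text, 39 included) imply the stated percentages directly. A minimal sketch, using only the counts given in the abstract, shows the arithmetic:

```python
# Screening counts as reported in the scoping review above.
identified, full_text, included = 1040, 128, 39

def pct(part: int, whole: int) -> float:
    """Percentage of `part` relative to `whole`, rounded to two decimals."""
    return round(100 * part / whole, 2)

print(pct(full_text, identified))  # 12.31 — full-text assessment rate
print(pct(included, identified))   # 3.75  — final inclusion rate
```

Both values match the percentages quoted in the abstract, confirming they are expressed relative to the 1040 initially identified documents rather than to the full-text subset.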
Combined loss of the BH3-only proteins Bim and Bmf restores B-cell development and function in TACI-Ig transgenic mice.
Terminal differentiation of B cells depends on two interconnected survival pathways, elicited by the B-cell receptor (BCR) and the BAFF receptor (BAFF-R), respectively. Loss of either signaling pathway arrests B-cell development. Although BCR-dependent survival depends mainly on the activation of the v-AKT murine thymoma viral oncogene homolog 1 (AKT)/PI3-kinase network, BAFF/BAFF-R-mediated survival engages non-canonical NF-κB signaling as well as MAPK/extracellular-signal-regulated kinase and AKT/PI3-kinase modules to allow proper B-cell development. Plasma cell survival, however, is independent of BAFF-R and regulated by APRIL, which signals NF-κB activation via alternative receptors, that is, transmembrane activator and CAML interactor (TACI) or B-cell maturation antigen (BCMA). All these complex signaling events are believed to secure survival by increased expression of anti-apoptotic B-cell lymphoma 2 (Bcl2) family proteins in developing and mature B cells. Curiously, how a lack of BAFF- or APRIL-mediated signaling triggers B-cell apoptosis remains largely unexplored. Here, we show that two pro-apoptotic members of the 'Bcl2 homology domain 3-only' subgroup of the Bcl2 family, Bcl2 interacting mediator of cell death (Bim) and Bcl2 modifying factor (Bmf), mediate apoptosis in the context of TACI-Ig overexpression that effectively neutralizes BAFF as well as APRIL. Surprisingly, although Bcl2 overexpression triggers B-cell hyperplasia exceeding that observed in Bim(-/-)Bmf(-/-) mice, Bcl2 transgenic B cells remain susceptible to the effects of TACI-Ig expression in vivo, leading to ameliorated pathology in Vav-Bcl2 transgenic mice. Together, our findings shed new light on the molecular machinery restricting B-cell survival during development, normal homeostasis and under pathological conditions. Our data further suggest that Bcl2 antagonists might improve the potency of BAFF/APRIL-depletion strategies in B-cell-driven pathologies.
Shortcomings in public health authorities’ videos on COVID-19: limited reach and a creative gap
Science Communication and Society
Health authorities’ health risk communication with the public during pandemics: a rapid scoping review
Background
The responses to the H1N1 swine flu pandemic and the recent COVID-19 pandemic provide an opportunity for insight into how health authorities communicate health risk information to the public. We aimed to synthesise the existing evidence regarding the different modes of communication used by health authorities in health risk communication with the public during a pandemic.
Methods
We conducted a rapid scoping review. MEDLINE and EMBASE were searched for publications in English from January 2009 through October 2020, covering both the full H1N1 pandemic and the response phase during the COVID-19 pandemic. The search resulted in 1440 records, of which 48 studies met our eligibility criteria.
Results
The present review identified studies across a broad interdisciplinary field of health risk communication, the majority focusing on the H1N1 and COVID-19 pandemics. A content analysis of the studies identified three categories of modes of communication: i) communication channels, ii) source credibility and iii) how the message is communicated. The identified studies on social media focused mainly on content and engagement, while studies on the effect of social media use on self-protective behaviour were lacking, as were studies on modes of communication that take the diversity of recipients into account. A limited number of studies of health authorities' use of graphic and audio-visual means were identified, yet these did not consider or evaluate creative communication choices.
Conclusion
Experimental studies that investigate the effect of health authorities' videos and messages on social media platforms on self-protective behaviour are needed. More studies are needed across the fields of health risk communication and media studies, including visual communication, web design, video and digital marketing, at a time when online digital communication is central to reaching the public.
Applying the ROBINS-I tool to natural experiments: an example from public health
Background:
A new tool to assess Risk of Bias In Non-randomised Studies of Interventions (ROBINS-I) was published in Autumn 2016. ROBINS-I uses the Cochrane-approved risk of bias (RoB) approach and focusses on internal validity. As such, ROBINS-I represents an important development for those conducting systematic reviews which include non-randomised studies (NRS), including public health researchers. We aimed to establish the applicability of ROBINS-I using a group of NRS which have evaluated non-clinical public health natural experiments.
Methods:
Five researchers, all experienced in critical appraisal of non-randomised studies, used ROBINS-I to independently assess risk of bias in five studies which had assessed the health impacts of a domestic energy efficiency intervention. ROBINS-I assessments for each study were entered into a database and checked for consensus across the group. Group discussions were used to identify reasons underpinning lack of consensus for specific questions and bias domains.
Results:
ROBINS-I helped to systematically articulate sources of bias in NRS. However, the lack of consensus in assessments across all seven bias domains raised questions about ROBINS-I's reliability and applicability to natural experiment studies. The two RoB domains with the least consensus were selection (Domain 2) and performance (Domain 4). Underlying the lack of consensus were difficulties in applying an intention-to-treat or per-protocol effect of interest to the studies. This was linked to difficulties in determining whether intervention status was classified retrospectively at follow-up, i.e. post hoc. The overall risk of bias ranged from moderate to critical; this was most closely linked to the assessment of confounders.
Conclusion:
The ROBINS-I tool is a conceptually rigorous tool which focusses on risk of bias due to the counterfactual. Difficulties in applying ROBINS-I may be due to poor design and reporting of evaluations of natural experiments. While the quality of reporting may improve in the future, improved guidance on applying ROBINS-I is needed to enable existing evidence from natural experiments to be assessed appropriately and consistently. We hope future refinements to ROBINS-I will address some of the issues raised here and allow wider use of the tool.
Purchase price of tobacco in small retailers in Great Britain: the relationships with neighbourhood deprivation and urbanicity, 2016–2021
Background: Tobacco price is an important determinant of smoking behaviour. Using electronic point-of-sale (EPOS) data, this study assesses the purchase price of factory-made cigarettes (FMC) and roll-your-own (RYO) tobacco across neighbourhood deprivation and urban/rural status in Britain, considering price changes over 2016–2021 and brand price segmentation.
Methods: The analysis uses EPOS data describing 10 156 106 tobacco packs sold from 1012 convenience stores across 24 seasonally distributed weeks (2016–2021). Gross sales prices were adjusted for inflation and presented per 20 cigarette sticks of FMC or the equivalent in RYO. Tobacco brand variants were assigned to four price segments (sub-value, value, mid-price and premium).
Results: Between 2016 and 2021, the sales-weighted price of tobacco (20 sticks or equivalent) fell from £8.72 to £8.10, reflecting a shift from FMC to RYO (RYO increasing from 32% to 46% of tobacco sales). The mean price of 20 sticks of FMC in the most deprived neighbourhoods was 5% (£0.51–£0.59) lower than in the least deprived in all years; for RYO, this price difference grew from 3% to 5% (£0.13–£0.28). The greater likelihood that tobacco was from lower price segments in more deprived areas largely accounted for this price difference.
Conclusions: Differences in the average price paid for tobacco between more and less deprived neighbourhoods reflect variations in the numbers of purchases across price segments. Combined (FMC and RYO) tobacco prices per stick have fallen, reflecting increasing RYO sales. Innovative approaches are required to respond to the tobacco industry's price differentiation by both price segment and product type and the growing importance of lower-priced RYO products.
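The "sales-weighted price" used in the tobacco study above weights each transaction's price per 20 sticks (or RYO equivalent) by the number of packs sold, so high-volume cheap purchases pull the mean down. A minimal sketch of that calculation, not the authors' code, with invented transaction records for illustration:

```python
# Invented example transactions: (price per 20 sticks in GBP, packs sold, product type).
# Real EPOS data would hold millions of such records across stores and weeks.
records = [
    (9.50, 120, "FMC"),
    (8.40, 200, "FMC"),
    (7.10, 310, "RYO"),
    (6.90, 150, "RYO"),
]

def sales_weighted_price(rows):
    """Sales-weighted mean price per 20 sticks: total spend / total packs."""
    total_spend = sum(price * packs for price, packs, _ in rows)
    total_packs = sum(packs for _, packs, _ in rows)
    return total_spend / total_packs

print(round(sales_weighted_price(records), 2))  # 7.76
```

Because cheaper RYO transactions dominate the invented volumes here, the weighted mean sits nearer the RYO prices, illustrating how a shift from FMC to RYO sales can lower the headline price even without any individual product getting cheaper.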
