Quantifying mechanistic traits of influenza viral dynamics using in vitro data.
When analysing in vitro data, growth kinetics of influenza virus strains are often compared by computing their growth rates, which are sometimes used as proxies for fitness. However, analogous to mathematical models for epidemics, the growth rate can be defined as a function of mechanistic traits: the basic reproduction number (the average number of cells each infected cell infects) and the mean generation time (the average length of a replication cycle). Fitting a model to previously published and newly generated data from experiments in human lung cells, we compared estimates of growth rate, reproduction number and generation time for six influenza A strains. Of four strains in previously published data, A/Canada/RV733/2003 (seasonal H1N1) had the lowest basic reproduction number, followed by A/Mexico/INDRE4487/2009 (pandemic H1N1), then A/Indonesia/05/2005 (spill-over H5N1) and A/Anhui/1/2013 (spill-over H7N9). This ordering of strains was preserved for both generation time and growth rate, suggesting a positive biological correlation between these quantities that has not previously been observed. We further investigated these potential correlations using data from reassortant viruses with different internal proteins (from A/England/195/2009 (pandemic H1N1) and A/Turkey/05/2005 (H5N1)) and the same surface proteins (from A/Puerto Rico/8/34 (lab-adapted H1N1)). Similar correlations between traits were observed for these viruses, confirming our initial findings and suggesting that these patterns were related to the degree of human adaptation of the internal genes. The model also predicted that strains with a smaller basic reproduction number, shorter generation time and slower growth rate underwent more replication cycles by the time of peak viral load, potentially accumulating mutations more quickly. These results illustrate the utility of mathematical models in inferring the traits driving observed differences in the in vitro growth of influenza strains.
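As generic background to the mechanistic decomposition described above (this is an illustrative sketch, not the authors' fitted within-host model), the Euler-Lotka relation links exponential growth rate, basic reproduction number and the generation-interval distribution; assuming, purely for illustration, a gamma-shaped generation interval with mean Tg and shape k gives a closed form.

```python
# Illustrative only: 1/R0 = E[exp(-r*tau)] for a gamma generation interval
# with mean Tg and shape k reduces to R0 = (1 + r*Tg/k)**k, so r can be
# recovered from R0 and the mean generation time.  Values below are made up.

def growth_rate(R0, Tg_hours, k=2.0):
    """Exponential growth rate r (per hour) implied by R0 and mean generation time."""
    return k * (R0 ** (1.0 / k) - 1.0) / Tg_hours

# Example: a strain with a larger R0 and a shorter replication cycle grows faster.
print(growth_rate(R0=20, Tg_hours=8.0))    # ~0.87 per hour
print(growth_rate(R0=10, Tg_hours=12.0))   # ~0.36 per hour
```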
Dynamics of SARS-CoV-2 infection hospitalisation and infection fatality ratios over 23 months in England
The relationship between prevalence of infection and severe outcomes such as hospitalisation and death changed over the course of the COVID-19 pandemic. Reliable estimates of the infection fatality ratio (IFR) and infection hospitalisation ratio (IHR), along with the time delay between infection and hospitalisation or death, can inform forecasts of the number and timing of severe outcomes and allow healthcare services to better prepare for periods of increased demand. The REal-time Assessment of Community Transmission-1 (REACT-1) study estimated swab positivity for Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) infection in England approximately monthly from May 2020 to March 2022. Here, we analyse the changing relationship between prevalence of swab positivity and the IFR and IHR over this period in England, using publicly available data on daily deaths and hospitalisations, REACT-1 swab positivity data, time-delay models, and Bayesian P-spline models. We analyse data for all age groups together, as well as for two subgroups: those aged 65 and over and those aged 64 and under. Additionally, we analysed the relationship between swab positivity and daily case numbers to estimate the case ascertainment rate of England's mass testing programme. During 2020, we estimated the IFR to be 0.67% and the IHR to be 2.6%. By late 2021/early 2022, the IFR and IHR had decreased to 0.097% and 0.76%, respectively. The average case ascertainment rate over the entire duration of the study was estimated to be 36.1%, but continuous estimates of the case ascertainment rate varied significantly over time. Continuous estimates of the IFR and IHR increased during the periods in which the Alpha and Delta variants emerged. During periods of vaccination rollout, and during the emergence of the Omicron variant, the IFR and IHR decreased. During 2020, we estimated a time lag of 19 days between swab positivity and hospitalisation, and of 26 days between swab positivity and death. By late 2021/early 2022, these time lags had decreased to 7 days for hospitalisations and 18 days for deaths. Even though many populations have high levels of immunity to SARS-CoV-2 from vaccination and natural infection, waning immunity and the emergence of new variants will continue to exert upward pressure on the IHR and IFR. As investments in community surveillance of SARS-CoV-2 infection are scaled back, alternative methods are required to accurately track the ever-changing relationship between infection, hospitalisation, and death and hence provide vital information for healthcare provision and utilisation.
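As a rough illustration of how prevalence, severe outcomes and a time delay combine into an infection fatality ratio (a crude sketch, not the study's time-delay or Bayesian P-spline models; all argument names and default values are assumptions):

```python
import numpy as np

# Crude illustration only: convert swab positivity to approximate daily
# infection incidence, shift deaths back by an assumed infection-to-death
# lag, and take the ratio of totals.

def crude_ifr(positivity, deaths, population,
              mean_days_test_positive=10, lag_days=26):
    positivity = np.asarray(positivity, dtype=float)   # daily fraction of swabs positive
    deaths = np.asarray(deaths, dtype=float)           # daily death counts
    # Point prevalence divided by mean duration of positivity approximates incidence.
    daily_infections = positivity * population / mean_days_test_positive
    # Align deaths with the infections that caused them.
    lagged_deaths = deaths[lag_days:]
    matched_infections = daily_infections[:len(lagged_deaths)]
    return lagged_deaths.sum() / matched_infections.sum()
```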
The therapeutic potential of attentional bias modification training for insomnia: study protocol for a randomised controlled trial.
The efficacy of attentional bias modification (ABM) as a treatment for anxiety and depression has been extensively studied, with promising results. Despite some evidence of sleep-related attentional biases in insomnia, only a small number of studies, yielding mixed results, have examined the application of ABM in insomnia. This study specifically aims to determine whether ABM can reduce (i) the presence of an attentional bias for sleep-related threatening words; (ii) insomnia symptom severity; (iii) sleep onset latency; and (iv) pre-sleep cognitive arousal amongst individuals with insomnia, compared to a non-treatment control group of individuals with insomnia. We propose a randomised controlled trial of 90 individuals from the general population who meet the criteria for Insomnia Disorder. Following an initial examination for the presence of a sleep-related attentional bias using the dot-probe paradigm, participants will be randomised to an online attentional bias modification training condition or to a standard attentional bias task (non-treatment) control condition. Both conditions will be delivered online via a web platform. All participants allocated to the non-treatment control group will be offered ABM training once the study is complete. The primary outcomes will be the attentional bias indices of vigilance and disengagement, together with self-reported insomnia symptoms, sleep onset latency and pre-sleep cognitive arousal. Attentional bias and insomnia symptoms will be assessed at baseline (day 1) and post-treatment (2 days after the final training session: day 9). Insomnia symptoms will be assessed again at follow-up (day 16). Secondary outcomes include examining whether sleep-associated monitoring and worry are related to a sleep-related attentional bias in insomnia, and whether such reports reduce following ABM. All main analyses will be carried out on completion of follow-up assessments. The trial is supported by the Department of Psychology, Sociology and Politics at Sheffield Hallam University. This study will extend the research base examining the efficacy of attentional bias modification for insomnia. ISRCTN (ISRCTN11643569, registered on 5 June 2018).
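For context on the vigilance and disengagement indices named above, one common way of deriving them from dot-probe reaction times uses threat-congruent, threat-incongruent and neutral-neutral baseline trials. This is a hedged sketch of that general convention, not the trial's registered analysis plan, and every variable name is an assumption.

```python
import numpy as np

# Sketch only: bias indices from mean reaction times (RTs) on
# threat-congruent trials (probe replaces the sleep-threat word),
# threat-incongruent trials (probe replaces the neutral word) and
# neutral-neutral baseline trials.

def bias_indices(rt_congruent, rt_incongruent, rt_neutral_baseline):
    congruent = np.mean(rt_congruent)
    incongruent = np.mean(rt_incongruent)
    baseline = np.mean(rt_neutral_baseline)
    overall_bias = incongruent - congruent        # classic attentional bias score
    vigilance = baseline - congruent              # faster orienting towards threat
    disengagement = incongruent - baseline        # slower disengagement from threat
    return overall_bias, vigilance, disengagement
```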
Use of radiotherapy in patients with oesophageal, stomach, colon, rectal, liver, pancreatic, lung, and ovarian cancer: an International Cancer Benchmarking Partnership (ICBP) population-based study
BACKGROUND: There is little evidence on variation in radiotherapy use between countries, although it is a key treatment modality for some patients with cancer. Here we aimed to examine such variation. METHODS: This population-based study used data from Norway, the four UK nations (England, Northern Ireland, Scotland, and Wales), nine Canadian provinces (Alberta, British Columbia, Manitoba, New Brunswick, Newfoundland and Labrador, Nova Scotia, Ontario, Prince Edward Island, and Saskatchewan), and two Australian states (New South Wales and Victoria). Patients aged 15-99 years diagnosed with cancer in eight different sites (oesophageal, stomach, colon, rectal, liver, pancreatic, lung, or ovarian cancer), with no other primary cancer diagnosis occurring within the 5 years before to 1 year after the index cancer diagnosis or during the study period, were included in the study. We examined variation in radiotherapy use from 31 days before to 365 days after diagnosis and in time to its initiation, alongside related variation in patient-group differences. Information was obtained from cancer registry records linked to clinical or patient management system data, or hospital administration data. Random-effects meta-analyses quantified interjurisdictional variation using 95% prediction intervals (95% PIs). FINDINGS: Between Jan 1, 2012, and Dec 31, 2017, of 902 312 patients with a new diagnosis of one of the studied cancers, 115 357 (12·8%) did not meet inclusion criteria, and 786 955 were included in the analysis. There was large interjurisdictional variation in radiotherapy use, with wide 95% PIs: 17·8% to 82·4% (pooled estimate 50·2%) for oesophageal cancer, 35·5% to 55·2% (45·2%) for rectal cancer, 28·6% to 54·0% (40·6%) for lung cancer, and 4·6% to 53·6% (19·0%) for stomach cancer. For patients with stage 2-3 rectal cancer, interjurisdictional variation was greater than that for all patients with rectal cancer (95% PI 37·0% to 84·6%; pooled estimate 64·2%). Radiotherapy use was infrequent but variable in patients with pancreatic (95% PI 1·7% to 16·5%), liver (1·8% to 11·2%), colon (1·6% to 5·0%), and ovarian (0·8% to 7·6%) cancer. Patients aged 85-99 years had about a third of the odds of radiotherapy use of those aged 65-74 years, with substantial interjurisdictional variation in this age difference (odds ratio [OR] 0·38, 95% PI 0·20-0·73). Women had slightly lower odds of radiotherapy use than men (OR 0·88, 95% PI 0·77-1·01). There was large variation in median time to first radiotherapy (from diagnosis date) by cancer site, with substantial interjurisdictional variation (eg, oesophageal 95% PI 11·3 days to 112·8 days, pooled estimate 62·0 days; rectal 95% PI 34·7 days to 77·3 days, pooled estimate 56·0 days). Older patients had a shorter median time to radiotherapy, with appreciable interjurisdictional variation (-9·5 days in patients aged 85-99 years vs 65-74 years, 95% PI -26·4 to 7·4). INTERPRETATION: Large interjurisdictional variation in both radiotherapy use and time to its initiation was observed, alongside large and variable age differences. To guide efforts to improve patient outcomes, the underlying reasons for these differences need to be established.
FUNDING: International Cancer Benchmarking Partnership (funded by the Canadian Partnership Against Cancer, Cancer Council Victoria, Cancer Institute New South Wales, Cancer Research UK, Danish Cancer Society, National Cancer Registry Ireland, The Cancer Society of New Zealand, National Health Service England, Norwegian Cancer Society, Public Health Agency Northern Ireland on behalf of the Northern Ireland Cancer Registry, DG Health and Social Care Scottish Government, Western Australia Department of Health, and Public Health Wales NHS Trust)
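The study above summarises between-jurisdiction spread with 95% prediction intervals from random-effects meta-analyses. The following is a minimal sketch of that generic approach (DerSimonian-Laird pooling with the standard prediction-interval formula), not the study's own code; the per-jurisdiction estimates and standard errors are assumed inputs on the log-odds scale.

```python
import numpy as np
from scipy import stats

# Sketch only: pooled estimate and 95% prediction interval for a new
# jurisdiction from per-jurisdiction estimates and standard errors.

def prediction_interval(estimates, std_errors):
    y = np.asarray(estimates, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    w = 1.0 / se**2
    mu_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fixed) ** 2)                       # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DerSimonian-Laird
    w_star = 1.0 / (se**2 + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)                  # random-effects pooled estimate
    se_mu = np.sqrt(1.0 / np.sum(w_star))
    half = stats.t.ppf(0.975, df=k - 2) * np.sqrt(tau2 + se_mu**2)
    return mu, (mu - half, mu + half)
```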
The use of representative community samples to assess SARS-CoV-2 lineage competition: Alpha outcompetes Beta and wild-type in England from January to March 2021.
Genomic surveillance for SARS-CoV-2 lineages informs our understanding of possible future changes in transmissibility and vaccine efficacy and will be a high priority for public health for the foreseeable future. However, small changes in the frequency of one lineage over another are often difficult to interpret because surveillance samples are obtained using a variety of methods, all of which are known to contain biases. As a case study, using an approach that is largely free of such biases, we here describe the lineage dynamics and phylogenetic relationships of the Alpha and Beta variants in England during the first 3 months of 2021, using sequences obtained from a random community sample who provided a throat and nose swab for RT-PCR as part of the REal-time Assessment of Community Transmission-1 (REACT-1) study. Overall, diversity decreased during the first quarter of 2021, with the Alpha variant (first identified in Kent) becoming predominant, driven by a reproduction number 0.3 higher than that of the prior wild-type. During January, positive samples were more likely to be Alpha in those aged 18 to 54 years. Although individuals infected with the Alpha variant were no more likely to report one or more classic COVID-19 symptoms than those infected with wild-type, they were more likely to be antibody-positive 6 weeks after infection. Further, viral load was higher in those infected with the Alpha variant, as measured by cycle threshold (Ct) values. The presence of non-imported Beta variant (first identified in South Africa) infections during January, but not during February or March, suggests initial establishment in the community followed by fade-out; however, this occurred during a period of stringent social distancing. These results highlight how sequence data from representative community surveys such as REACT-1 can augment routine genomic surveillance during periods of lineage diversity.
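A common way to quantify the kind of lineage replacement described above is to fit logistic growth to the fraction of sequenced positives belonging to the emerging lineage. The sketch below illustrates that generic approach, not the REACT-1 phylogenetic analysis itself; the generation time and starting values are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch only: the logistic slope s is the emerging lineage's selection
# rate per day; multiplying by an assumed mean generation time gives a
# rough additive reproduction-number advantage when R is near one.

def logistic(t, s, t50):
    return 1.0 / (1.0 + np.exp(-s * (t - t50)))

def alpha_advantage(days, frac_alpha, generation_time_days=5.5):
    (s, t50), _ = curve_fit(logistic, days, frac_alpha,
                            p0=(0.1, float(np.median(days))))
    return s * generation_time_days   # approximate delta-R vs wild-type
```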
Appropriately smoothing prevalence data to inform estimates of growth rate and reproduction number
The time-varying reproduction number (Rt) can change rapidly over the course of a pandemic due to changing restrictions, behaviours, and levels of population immunity. Many methods exist that allow the estimation of Rt from case data. However, these are not easily adapted to point prevalence data, nor can they infer Rt across periods of missing data. We developed a Bayesian P-spline model suitable for fitting to a wide range of epidemic time-series, including point-prevalence data. We demonstrate the utility of the model by fitting to periodic daily SARS-CoV-2 swab-positivity data in England from the first 7 rounds (May 2020–December 2020) of the REal-time Assessment of Community Transmission-1 (REACT-1) study. Estimates of Rt over the period of two subsequent rounds (6–8 weeks) and single rounds (2–3 weeks) inferred using the Bayesian P-spline model were broadly consistent with estimates from a simple exponential model, with overlapping credible intervals. However, there were sometimes substantial differences in point estimates. The Bayesian P-spline model was further able to infer changes in Rt over shorter periods, tracking a temporary increase above one during late May 2020, a gradual increase in Rt over the summer of 2020 as restrictions were eased, and a reduction in Rt during England's second national lockdown followed by an increase as the Alpha variant surged. The model is robust against both under-fitting and over-fitting and is able to interpolate between periods of available data; it is a particularly versatile model when growth rate can change over small timescales, as in the SARS-CoV-2 pandemic. This work highlights the importance of pairing robust methods with representative samples to track pandemics.
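To make the P-spline idea concrete, here is a rough frequentist analogue (a sketch under assumed inputs, not the paper's Bayesian implementation): log-prevalence is modelled with a cubic B-spline basis and a second-order difference penalty on the coefficients, and the local growth rate is the derivative of the fitted curve.

```python
import numpy as np
from scipy.interpolate import BSpline

# Sketch only: penalised B-spline fit to (t, log_prev); lam controls
# smoothness, n_knots the basis resolution.  Both values are assumptions.

def pspline_fit(t, log_prev, n_knots=20, lam=10.0, degree=3):
    inner = np.linspace(t.min(), t.max(), n_knots)
    knots = np.concatenate([[inner[0]] * degree, inner, [inner[-1]] * degree])
    n_coef = len(knots) - degree - 1
    design = BSpline.design_matrix(t, knots, degree).toarray()
    d2 = np.diff(np.eye(n_coef), n=2, axis=0)                 # 2nd-order differences
    coef = np.linalg.solve(design.T @ design + lam * d2.T @ d2,
                           design.T @ log_prev)
    spline = BSpline(knots, coef, degree)
    growth_rate = spline.derivative()                         # r(t) = d/dt log prevalence
    return spline, growth_rate
```

Given a growth-rate curve from such a fit, Rt can then be obtained through an assumed generation-interval distribution, which is the step the Bayesian model handles jointly with the smoothing.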
Central role for MCP-1/CCL2 in injury-induced inflammation revealed by in vitro, in silico, and clinical studies
The translation of in vitro findings to clinical outcomes is often elusive. Trauma/hemorrhagic shock (T/HS) results in hepatic hypoxia that drives inflammation. We hypothesize that in silico methods would help bridge in vitro hepatocyte data and clinical T/HS, in which the liver is a primary site of inflammation. Primary mouse hepatocytes were cultured under hypoxia (1% O2) or normoxia (21% O2) for 1-72 h, and both the cell supernatants and protein lysates were assayed for 18 inflammatory mediators by Luminex™ technology. Statistical analysis and data-driven modeling were employed to characterize the main components of the cellular response. Statistical analyses, hierarchical and k-means clustering, Principal Component Analysis, and Dynamic Network Analysis suggested MCP-1/CCL2 and IL-1α as central coordinators of hepatocyte-mediated inflammation in C57BL/6 mouse hepatocytes. Hepatocytes from MCP-1-null mice had altered dynamic inflammatory networks. Circulating MCP-1 levels segregated human T/HS survivors from non-survivors. Furthermore, T/HS survivors with elevated early levels of plasma MCP-1 post-injury had longer total lengths of stay, longer intensive care unit lengths of stay, and prolonged requirement for mechanical ventilation vs. those with low plasma MCP-1. This study identifies MCP-1 as a main driver of the response of hepatocytes in vitro and as a biomarker for clinical outcomes in T/HS, and suggests an experimental and computational framework for discovery of novel clinical biomarkers in inflammatory diseases. © 2013 Ziraldo et al
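The clustering and principal-component steps named above can be illustrated with a short sketch (assumed inputs, not the authors' pipeline): z-score an observations-by-mediators matrix of Luminex readings, rank mediators by their loading on the first principal component, and group co-regulated mediators with k-means.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Sketch only: `data` is an (observations x mediators) matrix and
# `mediator_names` the column labels; both are assumed inputs.

def rank_mediators(data, mediator_names, n_clusters=3):
    z = StandardScaler().fit_transform(data)
    pca = PCA(n_components=2).fit(z)
    loading = np.abs(pca.components_[0])                  # weight of each mediator on PC1
    clusters = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(z.T)  # cluster mediators
    order = np.argsort(loading)[::-1]
    return [(mediator_names[i], loading[i], clusters[i]) for i in order]
```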
Terahertz All-Optical Modulation in a Silicon-Polymer Hybrid System
Although gigahertz-scale free-carrier modulators have been previously demonstrated in silicon, intensity modulators operating at terahertz speeds have not been reported because of silicon's weak ultrafast optical nonlinearity. We have demonstrated intensity modulation of light with light in a silicon-polymer integrated waveguide device, based on the all-optical Kerr effect - the same ultrafast effect used in four-wave mixing. Direct measurements of time-domain intensity modulation are made at speeds of 10 GHz. We showed experimentally, through spectral measurements, that the ultrafast mechanism of this modulation functions at the optical frequency, and that intensity modulation at frequencies in excess of 1 THz can be obtained in this device. By integrating optical polymers through evanescent coupling to high-mode-confinement silicon waveguides, we greatly increase the effective nonlinearity of the waveguide for cross-phase modulation. The combination of high mode confinement, multiple integrated optical components, and high nonlinearities produces all-optical ultrafast devices operating at continuous-wave power levels compatible with telecommunication systems. Although far from commercial radio-frequency optical modulator standards in terms of extinction, these devices are a first step in the development of large-scale integrated ultrafast optical logic in silicon, and are two orders of magnitude faster than previously reported silicon devices. Comment: Under consideration at Nature Materials.
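For reference on the cross-phase-modulation mechanism invoked above, the standard textbook expressions (not values or derivations taken from the paper) relate the waveguide nonlinear parameter to the Kerr index and mode area, and the probe phase shift to pump power and effective length:

```latex
% Textbook Kerr cross-phase modulation (illustrative, not from the paper):
% nonlinear parameter of the waveguide and probe phase shift induced by a
% co-propagating pump of power P_pump over effective length L_eff.
\gamma = \frac{2\pi n_2}{\lambda A_{\mathrm{eff}}}, \qquad
\Delta\phi_{\mathrm{XPM}} = 2\,\gamma\, P_{\mathrm{pump}}\, L_{\mathrm{eff}}
```

The factor of 2 relative to self-phase modulation is the usual co-polarized XPM enhancement, and the small effective mode area of a high-confinement silicon-polymer waveguide is what raises the effective nonlinearity, consistent with the design argument in the abstract.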
Insights from computational modeling in inflammation and acute rejection in limb transplantation
Acute skin rejection in vascularized composite allotransplantation (VCA) is the major obstacle to wider adoption in clinical practice. This study utilized computational modeling to identify biomarkers for diagnosis and targets for treatment of skin rejection. Protein levels of 14 inflammatory mediators in skin and muscle biopsies from syngeneic grafts [n = 10], allogeneic transplants without immunosuppression [n = 10] and allografts treated with tacrolimus [n = 10] were assessed by multiplexed analysis technology. Hierarchical Clustering Analysis, Principal Component Analysis, Random Forest Classification and Multinomial Logistic Regression models were used to segregate experimental groups. Based on Random Forest Classification, Multinomial Logistic Regression and Hierarchical Clustering Analysis models, IL-4, TNF-α and IL-12p70 were the best predictors of skin rejection and identified rejection well in advance of histopathological alterations. TNF-α and IL-12p70 were the best predictors of muscle rejection and also preceded histopathological alterations. Principal Component Analysis identified IL-1α, IL-18, IL-1β, and IL-4 as principal drivers of transplant rejection. Thus, inflammatory patterns associated with rejection are specific to the individual tissue and may be superior for early detection and targeted treatment of rejection. © 2014 Wolfram et al
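The random-forest step described above can be sketched as follows (assumed inputs, not the authors' code): fit a classifier to the biopsy-by-mediator matrix with the experimental group as the label, then rank mediators by feature importance to identify candidate rejection markers.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Sketch only: `X` is a (biopsies x mediators) matrix of protein levels,
# `y` the group label per biopsy (syngeneic, allogeneic, tacrolimus-treated),
# and `mediator_names` the column labels; all are assumed inputs.

def top_rejection_markers(X, y, mediator_names, n_top=3):
    forest = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
    order = np.argsort(forest.feature_importances_)[::-1][:n_top]
    return [(mediator_names[i], forest.feature_importances_[i]) for i in order]
```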
Urban women's socioeconomic status, health service needs and utilization in the four weeks after postpartum hospital discharge: findings of a Canadian cross-sectional survey
Background: Postpartum women who experience socioeconomic disadvantage are at higher risk for poor health outcomes than more advantaged postpartum women, and may benefit from access to community-based postpartum health services. This study examined socioeconomically disadvantaged (SED) postpartum women's health, health service needs and utilization patterns in the first four weeks post hospital discharge, and compared them to those of more socioeconomically advantaged (SEA) postpartum women. Methods: Data collected as part of a large Ontario cross-sectional mother-infant survey were analyzed. Women (N = 1000) who had uncomplicated vaginal births of single at-term infants at four hospitals in two large southern Ontario, Canada cities were stratified into SED and SEA groups based on income, social support and a universally administered hospital postpartum risk screen. Participants completed a self-administered questionnaire before hospital discharge and a telephone interview four weeks after discharge. Main outcome measures were self-reported health status, symptoms of postpartum depression, postpartum service needs and health service use. Results: When compared to the SEA women, the SED women were more likely to be discharged from hospital within the first 24 hours after giving birth [OR 1.49, 95% CI 1.01–2.18], less likely to report very good or excellent health [OR 0.48, 95% CI 0.35–0.67], and had higher rates of symptoms of postpartum depression [OR 2.7, 95% CI 1.64–4.4]. No differences were found between groups in relation to self-reported need for and ability to access services for physical and mental health needs, or in use of physicians, walk-in clinics and emergency departments. The SED group was more likely to accept public health nurse home visits [OR 2.24, 95% CI 1.47–3.40]. Conclusion: Although SED women experienced poorer mental and overall health, they reported similar health service needs and utilization patterns to more SEA women. The results can assist policy makers, health service planners and providers to develop and implement necessary and accessible services. Further research is needed to evaluate SED postpartum women's health service needs and barriers to service use.