
    Dietary patterns, insulin sensitivity and inflammation in older adults.

    Background/objectives: Several studies have linked dietary patterns to insulin sensitivity and systemic inflammation, which affect risk of multiple chronic diseases. The purpose of this study was to investigate the dietary patterns of a cohort of older adults, and to examine relationships of dietary patterns with markers of insulin sensitivity and systemic inflammation. Subjects/methods: The Health, Aging and Body Composition (Health ABC) Study is a prospective cohort study of 3075 older adults. In Health ABC, multiple indicators of glucose metabolism and systemic inflammation were assessed. Food intake was estimated with a modified Block food frequency questionnaire. In this study, dietary patterns of 1751 participants with complete data were derived by cluster analysis. Results: Six clusters were identified, including a 'healthy foods' cluster, characterized by higher intake of low-fat dairy products, fruit, whole grains, poultry, fish and vegetables. In the main analysis, the 'healthy foods' cluster had significantly lower fasting insulin and homeostasis model assessment of insulin resistance values than the 'breakfast cereal' and 'high-fat dairy products' clusters, and lower fasting glucose than the 'high-fat dairy products' cluster (P≤0.05). No differences were found in 2-h glucose. With respect to inflammation, the 'healthy foods' cluster had lower interleukin-6 than the 'sweets and desserts' and 'high-fat dairy products' clusters, and no differences were seen in C-reactive protein or tumor necrosis factor-α. Conclusions: A dietary pattern high in low-fat dairy products, fruit, whole grains, poultry, fish and vegetables may be associated with greater insulin sensitivity and lower systemic inflammation in older adults.
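    As a rough illustration of the clustering step described above, the hedged Python sketch below derives dietary-pattern clusters from standardised food-group intakes. The file name, food-group columns and the choice of k-means with six clusters are assumptions for illustration; the abstract states only that cluster analysis was applied to food frequency questionnaire data.

```python
# Hedged sketch (not the study's analysis code): deriving dietary-pattern
# clusters from food-frequency-questionnaire food-group intakes with k-means.
# File name, column names and k = 6 are illustrative assumptions only.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

ffq = pd.read_csv("ffq_food_groups.csv")   # one row per participant (assumed layout)
food_groups = ["low_fat_dairy", "fruit", "whole_grains", "poultry", "fish",
               "vegetables", "sweets_desserts", "high_fat_dairy", "breakfast_cereal"]

X = StandardScaler().fit_transform(ffq[food_groups])
ffq["cluster"] = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

# Mean standardised intake per cluster helps to name patterns such as 'healthy foods'.
profile = pd.DataFrame(X, columns=food_groups, index=ffq.index).groupby(ffq["cluster"]).mean()
print(profile.round(2))
```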

    Some triviality results for quasi-Einstein manifolds and Einstein warped products

    In this paper we prove a number of triviality results for Einstein warped products and quasi-Einstein manifolds, using different techniques and under assumptions of various kinds. In particular, we obtain and exploit gradient estimates for solutions of weighted Poisson-type equations and adaptations to the weighted setting of some Liouville-type theorems. Comment: 15 pages, fixed minor mistakes in Section
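    For context, the block below sketches the standard definitions these results revolve around; the signs and normalizations are the usual conventions and may differ from those adopted in the paper.

```latex
% Standard m-quasi-Einstein condition on a Riemannian manifold (M, g) with
% potential function f and constants 0 < m <= \infty, \lambda \in \mathbb{R}:
\operatorname{Ric} + \nabla^2 f - \frac{1}{m}\, df \otimes df \;=\; \lambda\, g .
% For m = \infty this is the gradient Ricci soliton equation
% \operatorname{Ric} + \nabla^2 f = \lambda g.  For finite integer m, the condition
% characterizes bases of Einstein warped products: (M, g, f) satisfies it exactly
% when the warped product metric g + e^{-2f/m} g_F, over a suitable m-dimensional
% Einstein fibre (F, g_F), is Einstein.
```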

    Functional ecology of aquatic phagotrophic protists - Concepts, limitations, and perspectives

    Special issue: Current trends in protistology – results from the VII ECOP-ISOP Joint Meeting 2015. 25 pages, 6 figures, 1 table. Functional ecology is a subdiscipline that aims to enable a mechanistic understanding of patterns and processes from the organismic to the ecosystem level. This paper addresses some main aspects of the process-oriented current knowledge on phagotrophic, i.e. heterotrophic and mixotrophic, protists in aquatic food webs. This is not an exhaustive review; rather, we focus on conceptual issues, in particular on the numerical and functional response of these organisms. We discuss the evolution of concepts and define parameters to evaluate predator–prey dynamics, ranging from Lotka–Volterra to the Independent Response Model. Since protists have extremely versatile feeding modes, we explore whether there are systematic differences related to their taxonomic affiliation and life strategies. We differentiate between intrinsic factors (nutritional history, acclimatisation) and extrinsic factors (temperature, food, turbulence) affecting feeding, growth, and survival of protist populations. We briefly consider intraspecific variability of some key parameters and constraints inherent in laboratory microcosm experiments. We then upscale the significance of phagotrophic protists in food webs to the ocean level. Finally, we discuss limitations of the mechanistic understanding of protist functional ecology resulting from the principal unpredictability of nonlinear dynamics. We conclude by defining open questions and identifying perspectives for future research on the functional ecology of aquatic phagotrophic protists. HA was supported by grants from the German Research Foundation (DFG; AR 288/16) and from the Federal Ministry for Education and Research (BMBF: 03G0237B; 02WRM1364D). Project FERMI (CGL2014-59227-R) was awarded to AC from the Spanish Ministry of Economy and Competitiveness. RA was supported by the European Union's Horizon 2020 research and innovation programme (Marie Sklodowska-Curie grant agreement No 658882). PJH was supported by the Danish Council for Independent Research, project DDF-4181-00484. TW was financially supported by the Austrian Science Fund (FWF, projects P20118-B17 and P20360-B17). DJSM received no support for his efforts on this study, other than his salary provided by the University of Liverpool. Peer reviewed.
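    To make the response concepts concrete, the hedged Python sketch below evaluates a Holling type II functional response (ingestion versus prey concentration) and a simple linked numerical response (growth versus prey concentration) for a hypothetical phagotrophic grazer; all parameter names and values are assumptions for illustration, not results from the paper.

```python
# Illustrative sketch (not from the paper): Holling type II functional response
# and a linked numerical response for a hypothetical phagotrophic protist.
I_MAX = 50.0    # maximum ingestion rate (prey predator^-1 h^-1), assumed
K = 1.0e4       # half-saturation prey concentration (prey mL^-1), assumed
Y = 1.0e-3      # conversion of ingestion into per-capita growth, assumed
R = 0.005       # basal maintenance loss (h^-1), assumed

def functional_response(prey):
    """Ingestion rate as a function of prey concentration (Holling type II)."""
    return I_MAX * prey / (K + prey)

def numerical_response(prey):
    """Per-capita growth rate: ingestion converted to growth minus maintenance."""
    return Y * functional_response(prey) - R

if __name__ == "__main__":
    for prey in (1e2, 1e3, 1e4, 1e5):
        print(f"prey={prey:9.0f}  ingestion={functional_response(prey):6.2f}  "
              f"growth={numerical_response(prey):+.4f} per h")
```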

    Laccase: a green catalyst for the biosynthesis of poly-phenols

    Laccases (benzene diol: oxidoreductases, EC 1.10.3.2) are able to catalyze the oxidation of various compounds containing phenolic and aniline structures using dissolved oxygen in water. Laccase structural features and catalytic mechanisms, with a focus on the polymerization of aromatic compounds, are reported. The most recent research on the biosynthesis of chemicals and polymers is described. Selected applications of this technology are considered, as well as the advantages, shortcomings and future needs related to the use of laccases. This study was supported by a Chinese Government Scholarship under the China Scholarship Council (No. 201606790036) and the Chinese Foundation Key projects of governmental cooperation in international scientific and technological innovation (No. 2016 YFE0115700). The authors would also like to acknowledge the Portuguese Foundation for Science and Technology (FCT) for the funding of the UID/BIO/04469/2013 unit and COMPETE 2020 (POCI-01–0145-FEDER-006684) and the BioTecNorte operation (NORTE-01–0145-FEDER-000004) funded by the European Regional Development Fund under the scope of Norte2020 – Programa Operacional Regional do Norte.
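    Where the text above mentions laccase-catalysed oxidation of phenolic substrates, a simple saturation-kinetics picture can make the rate behaviour concrete. The Python sketch below applies Michaelis-Menten kinetics with oxygen assumed non-limiting; the parameter values are placeholders, not data from the review.

```python
# Illustrative sketch (not from the review): Michaelis-Menten rate model for a
# laccase-catalysed oxidation of a phenolic substrate, oxygen assumed non-limiting.
# Parameter values are placeholders chosen only for demonstration.
V_MAX = 12.0   # maximal oxidation rate (umol min^-1 mg^-1 enzyme), assumed
K_M = 0.35     # Michaelis constant for the phenolic substrate (mM), assumed

def oxidation_rate(substrate_mM):
    """Initial oxidation rate at a given phenolic substrate concentration."""
    return V_MAX * substrate_mM / (K_M + substrate_mM)

if __name__ == "__main__":
    for s in (0.05, 0.2, 0.5, 2.0, 10.0):
        print(f"[S] = {s:5.2f} mM -> rate = {oxidation_rate(s):5.2f} umol/min/mg")
```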

    Computer Assisted Planning for Curved Laser Interstitial Thermal Therapy

    Laser interstitial thermal therapy (LiTT) is a minimally invasive alternative to conventional open surgery for drug-resistant focal mesial temporal lobe epilepsy (MTLE). Recent studies suggest that higher seizure freedom rates are correlated with maximal ablation of the mesial hippocampal head, whilst sparing of the parahippocampal gyrus (PHG) may reduce neuropsychological sequelae. Current commercially available laser catheters are inserted following manually planned straight-line trajectories, which cannot conform to curved brain structures, such as the hippocampus, without causing collateral damage or requiring multiple insertions. The clinical feasibility and potential of curved LiTT trajectories through steerable needles has yet to be investigated; this is the focus of our work. We propose a GPU-accelerated computer-assisted planning (CAP) algorithm for steerable needle insertions that generates optimized curved 3D trajectories with maximal ablation of the amygdalohippocampal complex and minimal collateral damage to nearby structures, while accounting for a variable ablation diameter (5-15 mm). Simulated trajectories and ablations were performed on 5 patients with mesial temporal sclerosis (MTS), who were identified from a prospectively managed database. The algorithm generated obstacle-free paths with significantly greater target-area ablation coverage and lower PHG ablation variance compared to straight-line trajectories. The presented CAP algorithm yields increased ablation of the amygdalohippocampal complex, with lower patient risk scores compared to straight-line trajectories. This is the first clinical application of preoperative planning for steerable-needle-based LiTT. This study suggests that steerable needles have the potential to improve LiTT procedure efficacy whilst improving safety, and should thus be investigated further.
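    To illustrate how curved and straight candidate trajectories can be compared during planning, the hedged Python sketch below scores each path by target coverage minus a weighted penalty for ablating a structure to be spared. It is not the authors' CAP algorithm: the geometry, the fixed ablation radius standing in for the variable ablation diameter, the weights and the risk model are all simplifying assumptions.

```python
import numpy as np

# Hedged sketch (not the authors' implementation): ranking candidate ablation
# trajectories by a weighted score that rewards coverage of the target volume
# and penalises overlap with a structure to be spared. Arrays and weights are
# illustrative assumptions only.

def ablation_mask(points, centres, radius):
    """Boolean mask of voxel points lying within `radius` of any trajectory sample."""
    d = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=-1)
    return (d <= radius).any(axis=1)

def score_trajectory(centres, target_pts, spare_pts, radius, risk_weight=2.0):
    """Higher is better: fraction of the target ablated minus a weighted fraction
    of the spared structure (e.g. parahippocampal gyrus) that would be ablated."""
    coverage = ablation_mask(target_pts, centres, radius).mean()
    collateral = ablation_mask(spare_pts, centres, radius).mean()
    return coverage - risk_weight * collateral

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.normal([0, 0, 0], 5.0, size=(2000, 3))   # stand-in target volume (mm)
    spare = rng.normal([8, 0, 0], 3.0, size=(1000, 3))    # stand-in structure to spare
    straight = np.linspace([-10, 0, 0], [10, 0, 0], 20)   # straight-line path samples
    curved = np.stack([np.linspace(-10, 10, 20),
                       2.5 * np.sin(np.linspace(0, np.pi, 20)),
                       np.zeros(20)], axis=1)             # gently curved path samples
    for name, path in [("straight", straight), ("curved", curved)]:
        print(name, round(score_trajectory(path, target, spare, radius=5.0), 3))
```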

    Fast Ensemble Smoothing

    Smoothing is essential to many oceanographic, meteorological and hydrological applications. The fixed-interval smoothing problem updates all desired states within a time interval using all available observations, whereas the fixed-lag smoothing problem updates only a fixed number of states prior to the observation at the current time. Fixed-lag smoothing is generally thought to be computationally faster than fixed-interval smoothing, and can be an appropriate approximation for long-interval smoothing problems. In this paper, we use an ensemble-based approach to fixed-interval and fixed-lag smoothing, and synthesize two algorithms. The first algorithm produces a fixed-interval solution whose cost grows linearly with the interval length, with a fixed overhead factor; the second produces a fixed-lag solution whose cost is independent of the lag length. Identical-twin experiments conducted with the Lorenz-95 model show that for lag lengths approximately equal to the error doubling time, or for long intervals, the proposed methods can provide significant computational savings. These results suggest that ensemble methods yield both fixed-interval and fixed-lag smoothing solutions that cost little additional effort over filtering and model propagation, in the sense that in practical ensemble applications the additional increment is a small fraction of either filtering or model propagation costs. We also show that fixed-interval smoothing can perform as fast as fixed-lag smoothing and may be advantageous when memory is not an issue.
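    The hedged NumPy sketch below shows the basic mechanism such ensemble smoothers rely on: a perturbed-observation ensemble update applied to an augmented state stacking the current and lagged states, so that lagged states are corrected through their sample cross-covariance with the predicted observations. It is a generic illustration under assumed sizes and a linear observation operator, not the paper's algorithms.

```python
import numpy as np

# Hedged sketch (not the paper's algorithm): perturbed-observation ensemble update
# of an augmented state containing lagged states, the core of ensemble fixed-lag
# smoothing. Sizes and the linear observation operator H are assumptions.

def ensemble_smoother_update(X_aug, H, y, obs_cov, rng):
    """Update an augmented ensemble X_aug (n_aug x N members) with observation y."""
    n_obs = y.size
    N = X_aug.shape[1]
    Y = H @ X_aug                                   # predicted observations (n_obs x N)
    Xa = X_aug - X_aug.mean(axis=1, keepdims=True)  # state anomalies
    Ya = Y - Y.mean(axis=1, keepdims=True)          # predicted-observation anomalies
    P_xy = Xa @ Ya.T / (N - 1)                      # state/observation cross-covariance
    P_yy = Ya @ Ya.T / (N - 1) + obs_cov            # innovation covariance
    K = P_xy @ np.linalg.solve(P_yy, np.eye(n_obs)) # Kalman gain
    y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), obs_cov, N).T
    return X_aug + K @ (y_pert - Y)                 # lagged states updated too

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_state, lag, N = 40, 3, 50
    X_aug = rng.normal(size=(n_state * (lag + 1), N))   # current + 3 lagged states
    H = np.zeros((10, n_state * (lag + 1)))
    H[np.arange(10), np.arange(10)] = 1.0               # observe 10 current-state components
    y = rng.normal(size=10)
    updated = ensemble_smoother_update(X_aug, H, y, 0.5 * np.eye(10), rng)
    print(updated.shape)
```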

    Like trainer, like bot? Inheritance of bias in algorithmic content moderation

    The internet has become a central medium through which 'networked publics' express their opinions and engage in debate. Offensive comments and personal attacks can inhibit participation in these spaces. Automated content moderation aims to overcome this problem using machine learning classifiers trained on large corpora of texts manually annotated for offence. While such systems could help encourage more civil debate, they must navigate inherently normatively contestable boundaries, and are subject to the idiosyncratic norms of the human raters who provide the training data. An important objective for platforms implementing such measures might be to ensure that they are not unduly biased towards or against particular norms of offence. This paper provides some exploratory methods by which the normative biases of algorithmic content moderation systems can be measured, by way of a case study using an existing dataset of comments labelled for offence. We train classifiers on comments labelled by different demographic subsets (men and women) to understand how differences in conceptions of offence between these groups might affect the performance of the resulting models on various test sets. We conclude by discussing some of the ethical choices facing the implementers of algorithmic moderation systems, given various desired levels of diversity of viewpoints amongst discussion participants. Comment: 12 pages, 3 figures, 9th International Conference on Social Informatics (SocInfo 2017), Oxford, UK, 13--15 September 2017 (forthcoming in Springer Lecture Notes in Computer Science).
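    A minimal sketch of the kind of experiment described, assuming a hypothetical CSV with one text column and binary offence labels from two annotator subgroups; the file, column names and model choice (TF-IDF features with logistic regression) are illustrative assumptions, not the authors' pipeline.

```python
# Hedged sketch (not the authors' code): training separate offence classifiers on
# comments labelled by two annotator subgroups and comparing them on shared test
# data. The CSV file, column names and 0/1 label coding are assumptions.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("labelled_comments.csv")   # columns assumed: text, label_men, label_women

results = {}
for group in ["men", "women"]:
    X_train, X_test, y_train, y_test = train_test_split(
        df["text"], df[f"label_{group}"], test_size=0.2, random_state=0)
    clf = make_pipeline(TfidfVectorizer(min_df=2), LogisticRegression(max_iter=1000))
    clf.fit(X_train, y_train)
    # Evaluate this group's model against both groups' labels on the held-out comments.
    for eval_group in ["men", "women"]:
        y_eval = df.loc[X_test.index, f"label_{eval_group}"]
        results[(group, eval_group)] = f1_score(y_eval, clf.predict(X_test))

for (trained_on, evaluated_on), score in results.items():
    print(f"trained on {trained_on:>5}, evaluated on {evaluated_on:>5} labels: F1={score:.3f}")
```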

    Patients’ use of information about medicine side effects in relation to experiences of suspected adverse drug reactions

    Background: Adverse drug reactions (ADRs) are common, and information about medicines is increasingly widely available to the public. However, relatively little work has explored how people use medicines information to help them assess symptoms that may be suspected ADRs. Objective: Our objective was to determine how patients use patient information leaflets (PILs) or other medicines information sources and whether information use differs depending on experiences of suspected ADRs. Method: This was a cross-sectional survey conducted in six National Health Service (NHS) hospitals in North West England involving medical in-patients taking at least two regular medicines prior to admission. The survey was administered via a questionnaire and covered use of the PIL and other medicines information sources, perceived knowledge about medicines risks/ADRs, experiences of suspected ADRs, plus demographic information. Results: Of the 1,218 respondents to the survey, 18.8 % never read the PIL, whilst 6.5 % only do so if something unexpected happens. Educational level was related to perceived knowledge about medicines risks, but not to reading the PIL or seeking further information about medicines risks. Over half the respondents (56.0 %) never sought more information about possible side effects of medicines. A total of 57.2 % claimed they had experienced a suspected ADR; of these, 85.9 % were either very sure or fairly sure this was a reaction to a medicine. Over half of those experiencing a suspected ADR (53.8 %) had read the PIL, of whom 36.2 % did so before the suspected ADR occurred and the remainder afterwards. Reading the PIL helped 84.8 % of these respondents to decide they had experienced an ADR. Educational level, general knowledge of medicines risks and number of regular medicines used all increased the likelihood of experiencing an ADR. Conclusion: More patients should be encouraged to read the PIL supplied with medicines. The results support the view that most patients feel knowledgeable about medicines risks and suspected ADRs and value information about side effects, but that reading about side effects in PILs or other medicines information sources does not lead to experiences of suspected ADRs.
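    The reported associations between ADR experience and educational level, perceived knowledge and number of regular medicines suggest a multivariable model of the kind sketched below; the file, column names and coding are assumptions for illustration, not the study's analysis.

```python
# Hedged sketch (not the study's analysis code): logistic regression relating
# reported suspected-ADR experience to the predictors named in the abstract.
# The CSV file and column names are illustrative assumptions only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("adr_survey.csv")   # columns assumed: suspected_adr (0/1),
                                     # education_level, perceived_knowledge, n_regular_medicines

model = smf.logit(
    "suspected_adr ~ C(education_level) + perceived_knowledge + n_regular_medicines",
    data=df,
).fit()
print(model.summary())
print(np.exp(model.params).round(2))   # odds ratios for each predictor
```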

    The analysis of European lacquer: optimization of thermochemolysis temperature of natural resins

    In order to optimize chromatographic analysis of European lacquer, the thermochemolysis temperature was evaluated for the analysis of natural resins. Five main ingredients of lacquer were studied: sandarac, mastic, colophony, Manila copal and Congo copal. For each, five temperature programs were tested: four fixed temperatures (350, 480, 550 and 650 °C) and one ultrafast thermal desorption (UFD), in which the temperature rises from 350 to 660 °C in 1 min. In total, the integrated signals of 27 molecules, partially characterizing the five resins, were monitored to compare the different methods. A compromise was sought between detection of compounds released at low temperatures and compounds formed at high temperatures: 650 °C is too high for both groups, 350 °C is best for the first, and 550 °C for the second. A fixed temperature of 480 °C and UFD both proved to be good compromises, detecting most marker molecules. UFD was slightly better for the molecules released at low temperatures, while 480 °C performed best for the compounds formed at high temperatures.
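    As one way to see how the 27 marker signals could be compared across the five temperature programs, the hedged pandas sketch below normalises each marker to its best-responding program and summarises per-program performance; the file name and column layout are assumptions for illustration, not the study's data handling.

```python
# Hedged sketch (not the study's data processing): comparing integrated marker-
# molecule signals across thermochemolysis temperature programs by normalising
# each marker to its maximum response and averaging per program. File name and
# column layout are illustrative assumptions only.
import pandas as pd

# Rows = marker molecules, columns = temperature programs (integrated peak areas).
areas = pd.read_csv("marker_peak_areas.csv", index_col="marker")
# e.g. columns assumed: ["350C", "480C", "550C", "650C", "UFD"]

normalised = areas.div(areas.max(axis=1), axis=0)   # each marker scaled to its best program
summary = pd.DataFrame({
    "mean_relative_response": normalised.mean(),
    "markers_detected": (areas > 0).sum(),
})
print(summary.sort_values("mean_relative_response", ascending=False))
```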