
    Glucocorticoid receptor gene polymorphisms do not affect growth in fetal and early postnatal life. The Generation R Study

    Background: Glucocorticoids have an important role in early growth and development. Glucocorticoid receptor gene polymorphisms have been identified that contribute to the variability in glucocorticoid sensitivity. We examined whether these glucocorticoid receptor gene polymorphisms are associated with growth in fetal and early postnatal life. Methods: This study was embedded in a population-based prospective cohort study from fetal life onwards. The studied glucocorticoid receptor gene polymorphisms included BclI (rs41423247), TthIIII (rs10052957), GR-9β (rs6198), N363S (rs6195) and R23K (rs6189 and rs6190). Fetal growth was assessed by ultrasound in the second and third trimesters of pregnancy. Anthropometric measurements in early childhood were performed at birth and at the ages of 6, 14 and 24 months postnatally. Analyses focused on weight, length and head circumference, and were based on 2,414 healthy, Caucasian children. Results: Glucocorticoid receptor gene polymorphisms were not associated with fetal weight, birth weight or early postnatal weight. No associations were found with length or head circumference either. Nor were these polymorphisms associated with the risks of low birth weight or growth acceleration from birth to 24 months of age. Conclusions: In this large population-based cohort we found no evidence for an effect of known glucocorticoid receptor gene polymorphisms on fetal and early postnatal growth.

    A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems

    In this paper we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from putting this platform into operation. More specifically, we aim to establish this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware-experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations, and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for maturing the model-to-hardware mapping software. The functionality and flexibility of the latter are proven with a variety of experimental results.
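    One step of such a biology-to-hardware mapping, the automated translation of biological neuron parameters into a hardware configuration, can be illustrated with a minimal sketch. The parameter names, hardware ranges and 10-bit resolution below are invented for illustration; they are not the actual specification of the device described in the paper.

```python
# Hypothetical sketch: clip biological parameters into the ranges a
# configurable hardware device supports, then discretise them onto the
# device's digital parameter levels. All names and ranges are invented.

HW_RANGES = {                    # (min, max) the hypothetical hardware supports
    "v_rest": (-80.0, -50.0),    # resting potential, mV
    "tau_m": (5.0, 50.0),        # membrane time constant, ms
    "cm": (0.1, 1.0),            # membrane capacitance, nF
}
LEVELS = 1024                    # assume 10-bit parameter storage

def to_hardware(bio_params):
    """Map each biological parameter onto the nearest representable level."""
    config = {}
    for name, value in bio_params.items():
        lo, hi = HW_RANGES[name]
        clipped = min(max(value, lo), hi)     # out-of-range values saturate
        step = (hi - lo) / (LEVELS - 1)
        config[name] = round((clipped - lo) / step)  # store the digital code
    return config

print(to_hardware({"v_rest": -65.0, "tau_m": 20.0, "cm": 0.25}))
```

    A real mapping layer would also have to handle shared parameters, calibration data and the topology of the synapse array, but clipping and discretisation of this kind are the elementary operations any such translation performs.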

    Optimal learning rules for familiarity detection

    It has been suggested that the mammalian memory system has both familiarity and recollection components. Recently, a high-capacity network to store familiarity has been proposed. Here we derive analytically the optimal learning rule for such a familiarity memory using a signal-to-noise ratio analysis. We find that in the limit of large networks the covariance rule, known to be the optimal local, linear learning rule for pattern association, is also the optimal learning rule for familiarity discrimination. The capacity is independent of the sparseness of the patterns, as long as the patterns have a fixed number of bits set. The corresponding information capacity is 0.057 bits per synapse, less than typically found for associative networks.
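    Familiarity discrimination with the covariance rule can be sketched numerically: store patterns in a weight matrix via the covariance rule, then read out a quadratic familiarity signal for stored versus novel probes. The network size, pattern count and coding level below are arbitrary choices for illustration, not the paper's analysis.

```python
import numpy as np

# Minimal sketch of covariance-rule familiarity discrimination.
rng = np.random.default_rng(0)
N, P, f = 200, 10, 0.5                       # neurons, stored patterns, coding level

patterns = (rng.random((P, N)) < f).astype(float)
centered = patterns - f

# Covariance rule: W_ij = sum over patterns of (x_i - f)(x_j - f)
W = centered.T @ centered

def familiarity(x):
    """Quadratic familiarity signal h = (x - f)^T W (x - f)."""
    c = x - f
    return c @ W @ c

novel = (rng.random((P, N)) < f).astype(float)   # patterns never stored
h_stored = np.array([familiarity(p) for p in patterns])
h_novel = np.array([familiarity(p) for p in novel])
print(h_stored.mean(), h_novel.mean())
```

    Stored patterns contribute a large self-term to their own familiarity signal, while novel patterns pick up only cross-pattern noise; the separation between the two distributions is what the signal-to-noise analysis in the paper quantifies.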

    The effect of Tai Chi Chuan in reducing falls among elderly people: design of a randomized clinical trial in the Netherlands [ISRCTN98840266]

    BACKGROUND: Falls are a significant public health problem. Thirty to fifty percent of people aged 65 years and older fall each year. Falls are the most common type of accident in this age group and can result in fractures and subsequent disabilities, increased fear of falling, social isolation, decreased mobility, and even increased mortality. Several forms of exercise have been associated with a reduced risk of falling and with a wide range of physiological as well as psychosocial health benefits. Tai Chi Chuan seems to be the most promising form of exercise in the elderly, but the evidence is still controversial. In this article the design of a randomized clinical trial is presented. The trial evaluates the effect of Tai Chi Chuan on fall prevention and physical and psychological function in older adults. METHODS/DESIGN: 270 people aged seventy years and older living at home will be identified in the files of the participating general practitioners. People will be asked to participate when meeting the following inclusion criteria: having experienced a fall in the preceding year, or suffering from two of the following risk factors: disturbed balance, mobility problems, dizziness, or the use of benzodiazepines or diuretics. People will be randomly allocated to either the Tai Chi Chuan group (13 weeks, twice a week) or the no-treatment control group. The primary outcome measure is the number of new falls, measured with a diary. The secondary outcome measures are balance, fear of falling, blood pressure, heart rate, lung function parameters, physical activity, functional status, quality of life, mental health, use of walking devices, medication, use of health care services, adjustments to the house, and severity of fall incidents and subsequent injuries. Process parameters will be measured to evaluate the Tai Chi Chuan intervention. A cost-effectiveness analysis will be carried out alongside the evaluation of the clinical results. Follow-up measurements will be collected at 3, 6 and 12 months after randomization. DISCUSSION: As far as we know, this is the first trial in Europe considering Tai Chi Chuan and fall prevention. This project will answer a pragmatic research question regarding the efficacy of Tai Chi Chuan in reducing falls.

    Democratic population decisions result in robust policy-gradient learning: A parametric study with GPU simulations

    High performance computing on the Graphics Processing Unit (GPU) is an emerging field driven by the promise of high computational power at a low cost. However, GPU programming is a non-trivial task and, moreover, architectural limitations raise the question of whether investing effort in this direction may be worthwhile. In this work, we use GPU programming to simulate a two-layer network of Integrate-and-Fire neurons with varying degrees of recurrent connectivity and investigate its ability to learn a simplified navigation task using a policy-gradient learning rule stemming from Reinforcement Learning. The purpose of this paper is twofold. First, we want to support the use of GPUs in the field of Computational Neuroscience. Second, using GPU computing power, we investigate the conditions under which the said architecture and learning rule demonstrate best performance. Our work indicates that networks featuring strong Mexican-Hat-shaped recurrent connections in the top layer, where decision making is governed by the formation of a stable activity bump in the neural population (a "non-democratic" mechanism), achieve mediocre learning results at best. In the absence of recurrent connections, where all neurons "vote" independently ("democratic") for a decision via population vector readout, the task is generally learned better and more robustly. Our study would have been extremely difficult on a desktop computer without the use of GPU programming. We present the routines developed for this purpose and show that they provide a speedup of 5x to 42x over optimised Python code. The greatest speedup is achieved when we exploit the parallelism of the GPU in the search for learning parameters. This suggests that efficient GPU programming can significantly reduce the time needed for simulating networks of spiking neurons, particularly when multiple parameter configurations are investigated. © 2011 Richmond et al
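    The "democratic" population vector readout mentioned above, in which every neuron votes for its preferred direction weighted by its firing rate, can be sketched in a few lines. The number of neurons, the cosine tuning curve and the noise level below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a population-vector ("democratic") readout: each neuron
# contributes a unit vector pointing at its preferred direction, scaled by
# its firing rate; the decoded decision is the angle of the summed vector.
n = 60
preferred = np.linspace(0, 2 * np.pi, n, endpoint=False)  # preferred angles

def decode(rates):
    """Angle of the rate-weighted sum of preferred-direction unit vectors."""
    x = np.sum(rates * np.cos(preferred))
    y = np.sum(rates * np.sin(preferred))
    return np.arctan2(y, x) % (2 * np.pi)

# Rates from an illustrative cosine tuning curve around a target direction,
# plus a little noise to mimic trial-to-trial variability.
target = 1.0  # radians
rng = np.random.default_rng(1)
rates = np.maximum(0.0, np.cos(preferred - target)) + 0.05 * rng.random(n)
print(decode(rates))
```

    Because every neuron contributes independently, a few noisy votes barely shift the decoded angle; with a recurrent activity bump, by contrast, the decision is dominated by wherever the bump happens to form.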

    3.0 T cardiovascular magnetic resonance in patients treated with coronary stenting for myocardial infarction: evaluation of short term safety and image quality

    Purpose: To evaluate the safety and image quality of cardiovascular magnetic resonance (CMR) at 3.0 T in patients with coronary stents after myocardial infarction (MI), in comparison to the clinical standard at 1.5 T. Methods: Twenty-five patients (21 men; 55 ± 9 years) with first MI treated with primary stenting underwent 18 scans at 3.0 T and 18 scans at 1.5 T. Twenty-four scans were performed 4 ± 2 days and 12 scans 125 ± 23 days after MI. Cine (steady-state free precession) and late gadolinium-enhanced (LGE, segmented inversion-recovery gradient echo) images were acquired. Patient safety and image artifacts were evaluated, and in 16 patients stent position was assessed during repeat catheterization. Additionally, image quality was scored from 1 (poor quality) to 4 (excellent quality). Results: There were no clinical events within 30 days of CMR at 3.0 T or 1.5 T, and no stent migration occurred. At 3.0 T, image quality of cine studies was clinically useful in all scans, but not sufficient for quantitative analysis in 44% of them, due to stent (6/18 scans), flow (7/18 scans) and/or dark band artifacts (8/18 scans). Image quality of LGE images at 3.0 T was not sufficient for quantitative analysis in 53% and not clinically useful in 12%. At 1.5 T, all cine and LGE images were quantitatively analyzable. Conclusion: 3.0 T is safe in the acute and chronic phase after MI treated with primary stenting. Although cine imaging at 3.0 T is suitable for clinical use, quantitative analysis and LGE imaging are less reliable than at 1.5 T. Further optimization of pulse sequences at 3.0 T is essential.

    Preparation of name and address data for record linkage using hidden Markov models

    BACKGROUND: Record linkage refers to the process of joining records that relate to the same entity or event in one or more data collections. In the absence of a shared, unique key, record linkage involves the comparison of ensembles of partially-identifying, non-unique data items between pairs of records. Data items with variable formats, such as names and addresses, need to be transformed and normalised in order to validly carry out these comparisons. Traditionally, deterministic rule-based data processing systems have been used to carry out this pre-processing, which is commonly referred to as "standardisation". This paper describes an alternative approach to standardisation, using a combination of lexicon-based tokenisation and probabilistic hidden Markov models (HMMs). METHODS: HMMs were trained to standardise typical Australian name and address data drawn from a range of health data collections. The accuracy of the results was compared to that produced by rule-based systems. RESULTS: Training of HMMs was found to be quick and did not require any specialised skills. For addresses, HMMs produced equal or better standardisation accuracy than a widely-used rule-based system. However, accuracy was worse when used with simpler name data. Possible reasons for this poorer performance are discussed. CONCLUSION: Lexicon-based tokenisation and HMMs provide a viable and effort-effective alternative to rule-based systems for pre-processing more complex variably formatted data such as addresses. Further work is required to improve the performance of this approach with simpler data such as names. Software which implements the methods described in this paper is freely available under an open source license for other researchers to use and improve.
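    The combination of lexicon-based tokenisation and HMM decoding can be sketched as a toy example: tokens are mapped to coarse classes via a lexicon, and the Viterbi algorithm assigns each token a hidden state (address component). The states, lexicon and probabilities below are invented for illustration; the real system is trained on Australian name and address data with a much richer state and token-class set.

```python
# Toy sketch of lexicon-based tokenisation + HMM address standardisation.
# All states, lexicon entries and probabilities are invented for illustration.
# Input tokens are assumed to be lowercased already.

STATES = ["number", "street_name", "street_type"]
START = {"number": 0.8, "street_name": 0.2, "street_type": 0.0}
TRANS = {
    "number":      {"number": 0.10, "street_name": 0.85, "street_type": 0.05},
    "street_name": {"number": 0.05, "street_name": 0.35, "street_type": 0.60},
    "street_type": {"number": 0.10, "street_name": 0.20, "street_type": 0.70},
}
STREET_TYPES = {"st", "street", "rd", "road", "ave", "avenue", "way"}

def token_class(tok):
    """Lexicon-based tokenisation: map a raw token to a coarse class."""
    if tok.isdigit():
        return "digits"
    if tok in STREET_TYPES:
        return "type_word"
    return "word"

EMIT = {
    "number":      {"digits": 0.90, "word": 0.05, "type_word": 0.05},
    "street_name": {"digits": 0.05, "word": 0.85, "type_word": 0.10},
    "street_type": {"digits": 0.01, "word": 0.09, "type_word": 0.90},
}

def viterbi(tokens):
    """Most likely state sequence for a token list under the toy HMM."""
    obs = [token_class(t) for t in tokens]
    delta = {s: START[s] * EMIT[s][obs[0]] for s in STATES}
    back = []
    for o in obs[1:]:
        prev, delta, ptr = delta, {}, {}
        for s in STATES:
            best = max(prev, key=lambda p: prev[p] * TRANS[p][s])
            delta[s] = prev[best] * TRANS[best][s] * EMIT[s][o]
            ptr[s] = best
        back.append(ptr)
    path = [max(delta, key=delta.get)]
    for ptr in reversed(back):       # follow back-pointers to recover the path
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi("42 wallaby way".split()))
```

    Training in the paper's approach amounts to estimating the transition and emission tables from hand-tagged examples rather than writing them by hand, which is why it requires little specialised skill compared with maintaining deterministic rules.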

    An IGF-I promoter polymorphism modifies the relationships between birth weight and risk factors for cardiovascular disease and diabetes at age 36

    OBJECTIVE: To investigate whether an IGF-I promoter polymorphism was associated with birth weight and risk factors for cardiovascular disease (CVD) and type 2 diabetes (T2DM), and whether the birth weight – risk factor relationship was the same for each genotype. DESIGN AND PARTICIPANTS: 264 subjects (mean age 36 years) had data available on birth weight, IGF-I promoter polymorphism genotype, and CVD and T2DM risk factors. Student's t-test and regression analyses were applied to analyse differences in birth weight and differences in the birth weight – risk factor relationships between the genotypes. RESULTS: Male variant carriers (VCs) of the IGF-I promoter polymorphism had a 0.2 kg lower birth weight than men with the wild-type allele (p = 0.009). Of the risk factors for CVD and T2DM, only LDL concentration was associated with the genotype for the polymorphism. Most birth weight – risk factor relationships were stronger in the VC subjects, among others the birth weight – systolic blood pressure relationship: a 1 kg lower birth weight was related to an 8.0 mmHg higher systolic blood pressure. CONCLUSION: The polymorphism in the promoter region of the IGF-I gene is related to birth weight in men only, and of the risk factors, to LDL concentration only. Furthermore, the genotype for this polymorphism modified the relationships between birth weight and the risk factors, especially for systolic and diastolic blood pressure