Representing Semantified Biological Assays in the Open Research Knowledge Graph
In the biotechnology and biomedical domains, recent text mining efforts
advocate for machine-interpretable and, preferably, semantified documentation
formats for laboratory processes. This includes wet-lab protocols, (in)organic
materials synthesis reactions, genetic manipulations and other procedures, to
enable faster computer-mediated analysis and prediction. Herein, we present our
work on the representation of semantified bioassays in the Open Research
Knowledge Graph (ORKG). In particular, we describe a work-in-progress
semantification system that automatically and quickly generates the critical
mass of semantified bioassay data needed to foster a consistent user audience
that adopts the ORKG for recording their bioassays, and to facilitate the
organisation of research according to FAIR principles.
Comment: In Proceedings of 'The 22nd International Conference on Asia-Pacific
Digital Libraries'.
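The kind of semantified record targeted here can be pictured as a small set of subject-predicate-object statements about one assay. The following is a hedged sketch only: the `assay:`/`prop:` prefixes, property names and values are illustrative assumptions, not the actual ORKG schema or API.

```python
# Hypothetical sketch of "semantifying" one bioassay as a list of
# subject-predicate-object statements. All identifiers here are made up
# for illustration; a real system would use ORKG resources and predicates.

def semantify_bioassay(title, properties):
    """Return (subject, predicate, object) triples for a single assay."""
    subject = f"assay:{title.replace(' ', '_')}"
    return [(subject, f"prop:{key}", value) for key, value in properties.items()]

triples = semantify_bioassay(
    "Kinase inhibition assay",
    {"organism": "Homo sapiens", "target": "EGFR", "readout": "IC50"},
)
for s, p, o in triples:
    print(s, p, o)
```

A real pipeline would map the free-form keys and values to controlled vocabulary terms rather than strings; this sketch only shows the statement shape.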
How Does Information Processing Speed Relate to the Attentional Blink?
Background: When observers are asked to identify two targets in rapid sequence, they often suffer profound performance deficits for the second target, even when the spatial location of the targets is known. This attentional blink (AB) is usually attributed to the time required to process a previous target, implying that a link should exist between individual differences in information-processing speed and the AB.
Methodology/Principal Findings: The present work investigated this question by examining the relationship between a rapid automatized naming task typically used to assess information-processing speed and the magnitude of the AB. The results indicated that faster processing actually resulted in a greater AB, but only when targets were presented amongst high-similarity distractors. When target-distractor similarity was minimal, processing speed was unrelated to the AB.
Conclusions/Significance: Our findings indicate that information-processing speed is unrelated to target-processing efficiency per se, but rather to individual differences in observers' ability to suppress distractors. This is consistent with evidence that individuals who are able to avoid distraction are more efficient at deploying temporal attention, but it argues against a direct link between general processing speed and efficient information selection.
Monotonicity of Fitness Landscapes and Mutation Rate Control
A common view in evolutionary biology is that mutation rates are minimised.
However, studies in combinatorial optimisation and search have shown a clear
advantage of using variable mutation rates as a control parameter to optimise
the performance of evolutionary algorithms. Much biological theory in this area
is based on Ronald Fisher's work, who used Euclidean geometry to study the
relation between mutation size and expected fitness of the offspring in
infinite phenotypic spaces. Here we reconsider this theory based on the
alternative geometry of discrete and finite spaces of DNA sequences. First, we
consider the geometric case of fitness being isomorphic to distance from an
optimum, and show how problems of optimal mutation rate control can be solved
exactly or approximately depending on additional constraints of the problem.
Then we consider the general case of fitness communicating only partial
information about the distance. We define weak monotonicity of fitness
landscapes and prove that this property holds in all landscapes that are
continuous and open at the optimum. This theoretical result motivates our
hypothesis that optimal mutation rate functions in such landscapes will
increase when fitness decreases in some neighbourhood of an optimum, resembling
the control functions derived in the geometric case. We test this hypothesis
experimentally by analysing approximately optimal mutation rate control
functions in 115 complete landscapes of binding scores between DNA sequences
and transcription factors. Our findings support the hypothesis and show that
the increase in mutation rate is more rapid in landscapes that are less
monotonic (more rugged). We discuss the relevance of these findings to living
organisms.
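The geometric case described above can be illustrated with a toy calculation. This is a hedged sketch under strong simplifying assumptions (pure expectation, no selection step, fitness taken as L minus the Hamming distance d to a single optimum), not the paper's model:

```python
# Toy model: binary sequences of length L, one optimum, per-bit mutation
# probability u. The expected distance to the optimum after mutation is
#   E[d'] = d * (1 - u) + (L - d) * u,
# since each of the d wrong bits stays wrong w.p. 1-u and each correct bit
# breaks w.p. u.

def expected_distance(d, L, u):
    """Expected Hamming distance to the optimum after mutating each bit w.p. u."""
    return d * (1 - u) + (L - d) * u

L = 10
for d in (2, 5, 8):  # below, at, and beyond half the maximal distance
    best_u = min((u / 100 for u in range(101)),
                 key=lambda u: expected_distance(d, L, u))
    print(f"d={d}: expectation-minimising per-bit rate = {best_u}")
```

In this crude sketch the optimal rate is zero while d < L/2 and jumps to the maximum once d > L/2, so the rate only grows as fitness falls; selection and partial fitness information, as studied in the paper, smooth this into increasing control functions.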
A family history of breast cancer will not predict female early onset breast cancer in a population-based setting
BACKGROUND: An increased risk of breast cancer for relatives of breast cancer patients has been demonstrated in many studies, and having a relative diagnosed with breast cancer at an early age is an indication for breast cancer screening. This indication has been derived from estimates based on data from cancer-prone families or from BRCA1/2 mutation families, and might be biased because BRCA1/2 mutations explain only a small proportion of the familial clustering of breast cancer. The aim of the current study was to determine the predictive value of a family history of cancer with regard to early onset of female breast cancer in a population-based setting.
METHODS: An unselected sample of 1,987 women with and without breast cancer was studied with regard to the age of diagnosis of breast cancer.
RESULTS: The risk of early-onset breast cancer was increased when there were: (1) at least 2 cases of female breast cancer in first-degree relatives (yes/no; HR at age 30: 3.09; 95% CI: 1.28-7.44), (2) at least 2 cases of female breast cancer in first- or second-degree relatives under the age of 50 (yes/no; HR at age 30: 3.36; 95% CI: 1.12-10.08), (3) at least 1 case of female breast cancer under the age of 40 in a first- or second-degree relative (yes/no; HR at age 30: 2.06; 95% CI: 0.83-5.12) and (4) any case of bilateral breast cancer (yes/no; HR at age 30: 3.47; 95% CI: 1.33-9.05). The positive predictive value of having 2 or more of these characteristics was 13% for breast cancer before the age of 70, 11% for breast cancer before the age of 50, and 1% for breast cancer before the age of 30.
CONCLUSION: Applying family-history-related criteria in an unselected population could result in the screening of many women who will not develop breast cancer at an early age.
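Positive predictive values like those reported above are simple ratios of screen-positive women who go on to develop disease. A minimal hedged illustration, using hypothetical counts rather than the study's data:

```python
# PPV = true positives / all screen positives. The counts below are
# invented purely to illustrate the arithmetic; they are not study data.

def ppv(true_positives, false_positives):
    """Fraction of screen-positive individuals who actually develop the disease."""
    return true_positives / (true_positives + false_positives)

# Hypothetical: of 200 women meeting >= 2 family-history criteria,
# 26 develop breast cancer before age 70.
print(f"{ppv(26, 174):.0%}")
```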
Graphene Bilayer Field-Effect Phototransistor for Terahertz and Infrared Detection
A graphene bilayer phototransistor (GBL-PT) is proposed and analyzed. The
GBL-PT under consideration has the structure of a field-effect transistor with
a GBL as the channel, together with back and top gates. A positive bias on the
back gate results in the formation of conducting source and drain sections in the
channel, while the negatively biased top gate provides the potential barrier
which is controlled by the charge of the photogenerated holes. The features of
the GBL-PT operation are associated with the variations of both the potential
distribution and the energy gap in different sections of the channel when the
gate voltages and the charge in the barrier section change. Using the developed
GBL-PT device model, the spectral characteristics, dark current, responsivity
and detectivity are calculated as functions of the applied voltages, energy of
incident photons, intensity of electron and hole scattering, and geometrical
parameters. It is shown that the GBL-PT spectral characteristics are voltage
tuned. The performance of the GBL-PT as a photodetector in the terahertz and
infrared ranges can markedly exceed that of other photodetectors.
Comment: 7 pages, 7 figures.
Evolution of Robustness to Noise and Mutation in Gene Expression Dynamics
The phenotype of a biological system needs to be robust against mutations in
order to be sustained across generations. On the other hand, the phenotype of an
individual also needs to be robust against fluctuations of both internal and
external origins that are encountered during growth and development. Is there a
relationship between these two types of robustness, one during a single
generation and the other during evolution? Could stochasticity in gene
expression have any relevance to the evolution of these types of robustness? Robustness
can be defined by the sharpness of the distribution of phenotype; the variance
of phenotype distribution due to genetic variation gives a measure of `genetic
robustness' while that of isogenic individuals gives a measure of
`developmental robustness'. Through simulations of a simple stochastic gene
expression network that undergoes mutation and selection, we show that in order
for the network to acquire both types of robustness, the phenotypic variance
induced by mutations must be smaller than that observed in an isogenic
population. As the latter originates from noise in gene expression, this
signifies that the genetic robustness evolves only when the noise strength in
gene expression is larger than some threshold. In such a case, the two
variances decrease throughout the evolutionary time course, indicating an increase
in robustness. The results reveal how noise that cells encounter during growth
and development shapes networks' robustness to stochasticity in gene
expression, which in turn shapes networks' robustness to mutation. The
condition for the evolution of robustness, as well as the relationship between
genetic and developmental robustness, is derived through the variance of
phenotypic fluctuations, which is measurable experimentally.
Comment: 25 pages.
European clinical guidelines for Tourette syndrome and other tic disorders-version 2.0. Part IV: deep brain stimulation
In 2011 the European Society for the Study of Tourette Syndrome (ESSTS) published its first European clinical guidelines for the treatment of Tourette Syndrome (TS), with part IV on deep brain stimulation (DBS). Here, we present a revised version of these guidelines with updated recommendations based on the current literature covering the last decade, as well as a survey among ESSTS experts. Currently, data from the International Tourette DBS Registry and Database, two meta-analyses, and eight randomized controlled trials (RCTs) are available. Interpretation of outcomes is limited by small sample sizes and short follow-up periods. Compared to open uncontrolled case studies, RCTs report less favorable outcomes, with conflicting results. This could be related to several factors, including methodological issues but also substantial placebo effects. These guidelines, therefore, not only present currently available data from open and controlled studies, but also include expert knowledge. Although the overall database has increased in size since 2011, definite conclusions regarding the efficacy and tolerability of DBS in TS are still open to debate. Therefore, we continue to consider DBS for TS an experimental treatment that should be used only in carefully selected, severely affected and otherwise treatment-resistant patients.
Linearized stability analysis of gravastars in noncommutative geometry
In this work, we find exact gravastar solutions in the context of
noncommutative geometry, and explore their physical properties and
characteristics. The energy density of these geometries is that of a smeared,
particle-like gravitational source, in which the mass is diffused throughout a
region of linear dimension of order √θ due to the intrinsic uncertainty
encoded in the coordinate commutator. These solutions are then matched to an
exterior Schwarzschild spacetime. We further explore the dynamical stability of
the transition layer of these gravastars, for a specific case involving the
black hole mass M, to linearized
spherically symmetric radial perturbations about static equilibrium solutions.
It is found that large stability regions exist and are, in particular, located
sufficiently close to where the event horizon is expected to form.
Comment: 6 pages, 3 figures.
Reliability and validity of three questionnaires measuring context-specific sedentary behaviour and associated correlates in adolescents, adults and older adults
BACKGROUND: Reliable and valid measures of total sedentary time, context-specific sedentary behaviour (SB) and its potential correlates are useful for the development of future interventions. The purpose was to examine test-retest reliability and criterion validity of three newly developed questionnaires on total sedentary time, context-specific SB and its potential correlates in adolescents, adults and older adults.
METHODS: Reliability and validity were tested in six different samples of Flemish (Belgium) residents. For the reliability study, 20 adolescents, 22 adults and 20 older adults filled out the age-specific SB questionnaire twice. Test-retest reliability was analysed using Kappa coefficients, Intraclass Correlation Coefficients and/or percentage agreement, separately for the three age groups. For the validity study, data were retrieved from 62 adolescents, 33 adults and 33 older adults, with activPAL as the criterion measure. Spearman correlations and Bland-Altman plots (or a non-parametric approach) were used to analyse criterion validity, separately for the three age groups and for weekday, weekend day and average day.
RESULTS: The test-retest reliability for self-reported total sedentary time indicated the following values: ICC = 0.37-0.67 in adolescents; ICC = 0.73-0.77 in adults; ICC = 0.68-0.80 in older adults. Item-specific reliability results (e.g. context-specific SB and its potential correlates) showed good-to-excellent reliability in 67.94%, 68.90% and 66.38% of the items in adolescents, adults and older adults respectively. All items belonging to sedentary-related equipment and simultaneous SB showed good reliability. The sections of the questionnaire with the lowest reliability were: context-specific SB (adolescents), potential correlates of computer use (adults) and potential correlates of motorized transport (older adults). Spearman correlations between self-reported total sedentary time and the activPAL were different for each age group: rho = 0.02-0.42 (adolescents), rho = 0.06-0.52 (adults), rho = 0.38-0.50 (older adults). Participants over-reported total sedentary time compared to the activPAL (except for weekend days in older adults), for weekday, weekend day and average day respectively, by +57.05%, +46.29%, +53.34% in adolescents; +40.40%, +19.15%, +32.89% in adults; +10.10%, -6.24%, +4.11% in older adults.
CONCLUSIONS: The questionnaires showed acceptable test-retest reliability and criterion validity. However, over-reporting of total SB was noticeable in adolescents and adults. Nevertheless, these questionnaires will be useful for obtaining context-specific information on SB.
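The validity analysis described above can be sketched minimally as a rank correlation plus a relative-bias figure. This is a hedged illustration with made-up minutes-per-day values, not study data; Spearman's rho is implemented directly (assuming no ties), and over-reporting is the mean relative difference against the criterion measure.

```python
# Toy version of the validity analysis: Spearman's rho between self-reported
# and device-measured sedentary time, plus percentage over-reporting.
# All numbers below are invented for illustration.

def ranks(xs):
    """1-based ranks of a sequence with no tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman_rho(x, y):
    """Spearman rank correlation (no tie correction, fine for this toy data)."""
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks(x), ranks(y)))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def over_reporting(self_report, criterion):
    """Mean relative difference (%) of self-report vs the criterion measure."""
    return 100 * (sum(self_report) - sum(criterion)) / sum(criterion)

self_rep = [520, 610, 480, 700, 560]   # hypothetical minutes/day, questionnaire
activpal = [430, 500, 410, 530, 450]   # hypothetical minutes/day, activPAL
print(round(spearman_rho(self_rep, activpal), 2))
print(round(over_reporting(self_rep, activpal), 1))
```

The study itself used Bland-Altman plots as well; those compare per-participant differences against means rather than the aggregate bias shown here.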
Coumarin anticoagulants and co-trimoxazole: avoid the combination rather than manage the interaction
OBJECTIVE: The objective of our study was to examine the management of the interaction between acenocoumarol or phenprocoumon and several antibiotics by anticoagulation clinics, and to compare the consequences of this interaction for users of co-trimoxazole with those for users of other antibiotics.
METHODS: A follow-up study was conducted at four anticoagulation clinics in The Netherlands. Data on measurements of the International Normalised Ratio (INR), application of a preventive dose reduction (PDR) of the coumarin anticoagulant, fever and time within or outside the therapeutic INR range were collected.
RESULTS: The study cohort consisted of 326 subjects. A PDR was given more often to users of co-trimoxazole than to users of other antibiotics. The PDR in co-trimoxazole users resulted in a significantly reduced risk of both moderate overanticoagulation (INR >4.5) and severe overanticoagulation (INR >6.0) compared with no PDR, with odds ratios (ORs) of 0.06 [95% confidence interval (CI): 0.01-0.51] and 0.09 (95% CI: 0.01-0.92), respectively. In co-trimoxazole users without a PDR, the risk of overanticoagulation was significantly increased compared with users of other antibiotics. All co-trimoxazole users spent significantly more time under the therapeutic INR range during the first 6 weeks after the course than users of other antibiotics.
CONCLUSION: PDR is effective in preventing overanticoagulation in co-trimoxazole users, but results in a significantly prolonged period of underanticoagulation after the course. Avoidance of concomitant use of co-trimoxazole with acenocoumarol or phenprocoumon seems to be a safer approach than management of the interaction between these drugs.
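Odds ratios like "OR 0.06 (95% CI: 0.01-0.51)" come from a 2x2 table of exposure versus outcome. A hedged sketch of that calculation, using the standard Woolf (log) method for the confidence interval and hypothetical cell counts, not the study's data:

```python
# 2x2 table:            overanticoagulated    not
#   with PDR                     a             b
#   without PDR                  c             d
# OR = (a*d)/(b*c); 95% CI via log(OR) +/- 1.96 * SE,
# with SE = sqrt(1/a + 1/b + 1/c + 1/d) (Woolf's method).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table [[a, b], [c, d]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 2/50 overanticoagulated with a PDR vs 20/60 without.
or_, lo, hi = odds_ratio_ci(2, 48, 20, 40)
print(f"OR={or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An OR whose CI lies entirely below 1, as here, matches the paper's finding that PDR significantly reduced the risk of overanticoagulation.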