88 research outputs found
Recognizing Speech in a Novel Accent: The Motor Theory of Speech Perception Reframed
The motor theory of speech perception holds that we perceive the speech of
another in terms of a motor representation of that speech. However, when we
have learned to recognize a foreign accent, it seems plausible that recognition
of a word rarely involves reconstruction of the speech gestures of the speaker
rather than the listener. To better assess the motor theory and this
observation, we proceed in three stages. Part 1 places the motor theory of
speech perception in a larger framework based on our earlier models of the
adaptive formation of mirror neurons for grasping, and for viewing extensions
of that mirror system as part of a larger system for neuro-linguistic
processing, augmented by the present consideration of recognizing speech in a
novel accent. Part 2 then offers a novel computational model of how a listener
comes to understand the speech of someone speaking the listener's native
language with a foreign accent. The core tenet of the model is that the
listener uses hypotheses about the word the speaker is currently uttering to
update probabilities linking the sound produced by the speaker to phonemes in
the native language repertoire of the listener. This, on average, improves the
recognition of later words. This model is neutral regarding the nature of the
representations it uses (motor vs. auditory). It serves as a reference point for
the discussion in Part 3, which proposes a dual-stream neuro-linguistic
architecture that revisits claims for and against the motor theory of speech
perception and the relevance of mirror neurons, and extracts some implications
for the reframing of the motor theory
Preparation of name and address data for record linkage using hidden Markov models
BACKGROUND: Record linkage refers to the process of joining records that relate to the same entity or event in one or more data collections. In the absence of a shared, unique key, record linkage involves the comparison of ensembles of partially-identifying, non-unique data items between pairs of records. Data items with variable formats, such as names and addresses, need to be transformed and normalised in order to validly carry out these comparisons. Traditionally, deterministic rule-based data processing systems have been used to carry out this pre-processing, which is commonly referred to as "standardisation". This paper describes an alternative approach to standardisation, using a combination of lexicon-based tokenisation and probabilistic hidden Markov models (HMMs). METHODS: HMMs were trained to standardise typical Australian name and address data drawn from a range of health data collections. The accuracy of the results was compared to that produced by rule-based systems. RESULTS: Training of HMMs was found to be quick and did not require any specialised skills. For addresses, HMMs produced equal or better standardisation accuracy than a widely-used rule-based system. However, accuracy was worse when used with simpler name data. Possible reasons for this poorer performance are discussed. CONCLUSION: Lexicon-based tokenisation and HMMs provide a viable and effort-effective alternative to rule-based systems for pre-processing more complex, variably formatted data such as addresses. Further work is required to improve the performance of this approach with simpler data such as names. Software which implements the methods described in this paper is freely available under an open source license for other researchers to use and improve
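The pipeline described above can be sketched in miniature: tokenise an address, map each token to a coarse observation class via small lexicons, then Viterbi-decode the most probable sequence of hidden labels. All states, lexicons, and probabilities below are illustrative assumptions, not the trained values or label set from the paper.

```python
# Toy sketch of lexicon-based tokenisation + HMM decoding for address
# standardisation. States, lexicons, and probabilities are assumptions.

STATES = ["house_number", "street_name", "street_type"]

# Initial distribution and transition probabilities P(next | current).
START = {"house_number": 0.8, "street_name": 0.2, "street_type": 0.0}
TRANS = {
    "house_number": {"house_number": 0.1, "street_name": 0.9, "street_type": 0.0},
    "street_name":  {"house_number": 0.0, "street_name": 0.4, "street_type": 0.6},
    "street_type":  {"house_number": 0.0, "street_name": 0.0, "street_type": 1.0},
}

STREET_TYPES = {"st", "street", "rd", "road", "ave", "avenue"}

def observation_class(token):
    """Map a raw token to a coarse observation symbol via simple lexicons."""
    if token.isdigit():
        return "number"
    if token.lower() in STREET_TYPES:
        return "type_word"
    return "word"

# Emission probabilities P(observation class | state).
EMIT = {
    "house_number": {"number": 0.90, "word": 0.05, "type_word": 0.05},
    "street_name":  {"number": 0.05, "word": 0.85, "type_word": 0.10},
    "street_type":  {"number": 0.01, "word": 0.09, "type_word": 0.90},
}

def viterbi(tokens):
    """Return the most probable hidden-state sequence for the token list."""
    obs = [observation_class(t) for t in tokens]
    # prob[s] = best probability of any path ending in state s
    prob = {s: START[s] * EMIT[s][obs[0]] for s in STATES}
    back = []
    for o in obs[1:]:
        new_prob, pointers = {}, {}
        for s in STATES:
            best_prev = max(STATES, key=lambda p: prob[p] * TRANS[p][s])
            new_prob[s] = prob[best_prev] * TRANS[best_prev][s] * EMIT[s][o]
            pointers[s] = best_prev
        prob, back = new_prob, back + [pointers]
    # Trace back from the best final state.
    state = max(STATES, key=lambda s: prob[s])
    path = [state]
    for pointers in reversed(back):
        state = pointers[state]
        path.append(state)
    return list(reversed(path))

tokens = "42 main st".split()
print(list(zip(tokens, viterbi(tokens))))
# [('42', 'house_number'), ('main', 'street_name'), ('st', 'street_type')]
```

In the paper's actual system the model is trained from hand-labelled examples rather than hand-specified as here; the appeal noted in the RESULTS is that such training is quick and needs no specialised rule-writing skills.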
The Genetic Association Between ADHD Symptoms and Reading Difficulties: The Role of Inattentiveness and IQ
Previous studies have documented the primarily genetic aetiology for the stronger phenotypic covariance between reading disability and ADHD inattention symptoms, compared to hyperactivity-impulsivity symptoms. In this study, we examined to what extent this covariation could be attributed to “generalist genes” shared with general cognitive ability or to “specialist” genes which may specifically underlie processes linking inattention symptoms and reading difficulties. We used multivariate structural equation modeling on IQ, parent and teacher ADHD ratings and parent ratings on reading difficulties from a general population sample of 1312 twins aged 7.9–10.9 years. The covariance between reading difficulties and ADHD inattention symptoms was largely driven by genetic (45%) and child-specific environment (21%) factors not shared with IQ and hyperactivity-impulsivity; only 11% of the covariance was due to genetic effects common with IQ. Aetiological influences shared among all phenotypes explained 47% of the variance in reading difficulties. The current study, using a general population sample, extends previous findings by showing, first, that the shared genetic variability between reading difficulties and ADHD inattention symptoms is largely independent from genes contributing to general cognitive ability and, second, that child-specific environment factors, independent from IQ, also contribute to the covariation between reading difficulties and inattention symptoms
Evaluating the drivers of and obstacles to the willingness to use cognitive enhancement drugs: the influence of drug characteristics, social environment, and personal characteristics
Sattler S, Mehlkop G, Graeff P, Sauer C. Evaluating the drivers of and obstacles to the willingness to use cognitive enhancement drugs: the influence of drug characteristics, social environment, and personal characteristics. Substance Abuse Treatment, Prevention, and Policy. 2014;9(1):8.
Background
The use of cognitive enhancement (CE) by means of pharmaceutical agents has been the subject of intense debate both among scientists and in the media. This study investigates several drivers of and obstacles to the willingness to use prescription drugs non-medically for augmenting brain capacity.
Methods
We conducted a web-based study among 2,877 students from randomly selected disciplines at German universities. Using a factorial survey, respondents expressed their willingness to take various hypothetical CE-drugs; the drugs were described by five experimentally varied characteristics and the social environment by three varied characteristics. Personal characteristics and demographic controls were also measured.
Results
We found that 65.3% of the respondents staunchly refused to use CE-drugs. The results of a multivariate negative binomial regression indicated that respondents’ willingness to use CE-drugs increased if the potential drugs promised a significant augmentation of mental capacity and a high probability of achieving this augmentation. Willingness decreased when there was a high probability of side effects and a high price. Prevalent CE-drug use among peers increased willingness, whereas a social environment that strongly disapproved of these drugs decreased it. Regarding the respondents’ characteristics, pronounced academic procrastination, high cognitive test anxiety, low intrinsic motivation, low internalization of social norms against CE-drug use, and past experiences with CE-drugs increased willingness. The potential severity of side effects, social recommendations about using CE-drugs, risk preferences, and competencies had no measured effects upon willingness.
Conclusions
These findings contribute to understanding factors that influence the willingness to use CE-drugs. They support the assumption of instrumental drug use and may contribute to the development of prevention, policy, and educational strategies
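The factorial-survey design described in the Methods crosses experimentally varied vignette dimensions into hypothetical scenarios that respondents rate. A minimal sketch of how such a vignette universe is built, where the dimension names and levels are hypothetical placeholders, not the study's actual wording or level counts:

```python
import itertools

# Hypothetical vignette dimensions: the study varied five drug
# characteristics and three social-environment characteristics.
drug_dims = {
    "effect_size": ["moderate", "strong"],
    "success_probability": ["low", "high"],
    "side_effect_probability": ["low", "high"],
    "side_effect_severity": ["mild", "severe"],
    "price": ["cheap", "expensive"],
}
environment_dims = {
    "peer_use": ["rare", "common"],
    "peer_disapproval": ["weak", "strong"],
    "recommendation": ["none", "recommended"],
}

dims = {**drug_dims, **environment_dims}

# Full factorial: every combination of levels yields one vignette.
vignettes = [dict(zip(dims, combo)) for combo in itertools.product(*dims.values())]
print(len(vignettes))  # 2**8 = 256 combinations
```

In practice each respondent rates only a small random "deck" drawn from this universe, which is what allows the regression to estimate each dimension's effect on willingness.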
Assessing ADHD symptoms in children and adults: Evaluating the role of objective measures
Background:
Diagnostic guidelines recommend using a variety of methods to assess and diagnose ADHD. Applying subjective measures always incorporates risks such as informant biases or large differences between ratings obtained from diverse sources. Furthermore, it has been demonstrated that ratings and tests seem to assess somewhat different constructs. The use of objective measures might thus yield valuable information for diagnosing ADHD. This study aims at evaluating the role of objective measures when trying to distinguish between individuals with ADHD and controls. Our sample consisted of children (n = 60) and adults (n = 76) diagnosed with ADHD and matched controls who completed self- and observer ratings as well as objective tasks. Diagnosis was primarily based on clinical interviews. A popular pattern recognition approach, support vector machines, was used to predict the diagnosis.
Results:
We observed relatively high accuracy of 79% (adults) and 78% (children) applying solely objective measures. Predicting an ADHD diagnosis using both subjective and objective measures exceeded the accuracy of objective measures for both adults (89.5%) and children (86.7%), with the subjective variables proving to be the most relevant.
Conclusions:
We argue that objective measures are more robust against rater bias and the errors inherent in subjective measures, and may be more replicable. Considering the high accuracy achieved in our study with objective measures alone, we think they should be incorporated in diagnostic procedures for assessing ADHD
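The classification step above can be illustrated with a toy linear SVM trained on synthetic data. To stay self-contained this sketch uses Pegasos-style subgradient descent on the hinge loss rather than the study's actual pipeline (which used real objective and subjective measures and a standard SVM implementation); the two "objective" features and all data are invented for illustration.

```python
# Toy linear SVM (Pegasos-style hinge-loss subgradient descent) predicting
# a hypothetical ADHD diagnosis from two synthetic "objective" features,
# e.g. reaction-time variability and omission errors.
import random

def train_linear_svm(X, y, lam=0.01, epochs=100, seed=0):
    """Minimise lam/2*|w|^2 + mean hinge loss by stochastic subgradient steps."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    w, b, t = [0.0] * d, 0.0, 0
    for _ in range(epochs):
        for i in rng.sample(range(n), n):   # one pass in random order
            t += 1
            eta = 1.0 / (lam * t)           # decaying step size
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            # Shrink weights (L2 regularisation), then step on the hinge
            # subgradient if this example violates the margin.
            w = [(1 - eta * lam) * wj for wj in w]
            if margin < 1:
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Synthetic sample: the "ADHD" group (+1) scores higher on both features.
rng = random.Random(1)
X, y = [], []
for _ in range(40):
    X.append([rng.gauss(1.5, 0.3), rng.gauss(1.5, 0.3)]); y.append(1)   # ADHD
    X.append([rng.gauss(0.5, 0.3), rng.gauss(0.5, 0.3)]); y.append(-1)  # control
w, b = train_linear_svm(X, y)
acc = sum(predict(w, b, x) == t for x, t in zip(X, y)) / len(X)
print(f"training accuracy: {acc:.2f}")
```

With well-separated synthetic groups the classifier should fit the sample easily; the accuracies reported in the abstract come instead from evaluating real measures, where the subjective variables proved most informative.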
Advances in understanding and treating ADHD
Attention deficit hyperactivity disorder (ADHD) is a neurocognitive behavioral developmental disorder most commonly seen in childhood and adolescence, which often extends to the adult years. Relative to a decade ago, there has been extensive research into understanding the factors underlying ADHD, leading to far more treatment options available for both adolescents and adults with this disorder. Novel stimulant formulations have made it possible to tailor treatment to the duration of efficacy required by patients, and to help mitigate the potential for abuse, misuse and diversion. Several new non-stimulant options have also emerged in the past few years. Among these, cognitive behavioral interventions have proven popular in the treatment of adult ADHD, especially within the adult population who cannot or will not use medications, along with the many medication-treated patients who continue to show residual disability
What is the level of evidence for the use of currently available technologies in facilitating the self-management of difficulties associated with ADHD in children and young people? A systematic review
A number of technologies to help self-manage Attention Deficit Hyperactivity Disorder (ADHD) in children and young people (YP) have been developed. This review will assess the level of evidence for the use of such technologies. The review was undertaken in accordance with the general principles recommended in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses. 7545 studies were screened. Fourteen studies of technology that aim to manage difficulties associated with ADHD in children and YP were included. Primary outcome measures were measures that assessed difficulties related to ADHD. Databases searched were MEDLINE, Web of Science (Core collection), CINAHL, the Cochrane Library, ProQuest ASSIA, PsycINFO and Scopus. The methodological quality of the studies was assessed. This review highlights the potential for the use of technology in paediatric ADHD self-management. However, it also demonstrates that current research lacks robustness, relying on small sample sizes, non-validated outcome measures, and little psychoeducational content. Future research is required to investigate the value of technology in supporting children and YP with ADHD, and a focus on psychoeducation is needed