123 research outputs found

    Modeling Accuracy and Variability of Motor Timing in Treated and Untreated Parkinson’s Disease and Healthy Controls

    Parkinson’s disease (PD) is characterized by difficulty with the timing of movements. Data collected using the synchronization–continuation paradigm, an established motor timing paradigm, have produced varying results, with most studies finding impairment. Some of this inconsistency comes from variation in the medication state tested, in the inter-stimulus intervals (ISI) selected, and in the changeable focus on either the synchronization (tapping in time with a tone) or continuation (maintaining the rhythm in the absence of the tone) phase. We sought to revisit the paradigm by testing across four groups of participants: healthy controls, medication-naïve de novo PD patients, and treated PD patients both “on” and “off” dopaminergic medication. Four finger-tapping intervals (ISI) were used: 250, 500, 1000, and 2000 ms. Categorical predictors (group, ISI, and phase) were used to predict accuracy and variability using a linear mixed model. Accuracy was defined as the relative error of a tap, and variability as the deviation of the participant’s tap from the group-predicted relative error. Our primary finding is that the treated PD group (PD patients “on” and “off” dopaminergic therapy) showed a significantly different pattern of accuracy compared to the de novo group and the healthy controls at the 250-ms interval. At this interval, the treated PD patients performed “ahead” of the beat, whilst the other groups performed “behind” the beat. We speculate that this “hastening” relates to the clinical phenomenon of motor festination. Across all groups, variability was smallest for both phases at the 500-ms interval, suggesting an innate preference for finger tapping within this range. Tapping variability for the two phases became increasingly divergent at the longer intervals, with worse performance in the continuation phase. The data suggest that patients with PD can best be discriminated from healthy controls on measures of motor timing accuracy rather than variability.
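    The accuracy measure described in the abstract, the relative error of a tap against the target ISI, can be sketched as follows. This is a minimal illustration, assuming that relative error is (produced interval − ISI) / ISI and that negative values correspond to tapping ahead of the beat; the paper's exact operationalization may differ.

```python
def relative_error(inter_tap_intervals_ms, isi_ms):
    """Relative error per tap: (produced interval - ISI) / ISI.

    Under this sign convention, negative values mean the participant
    tapped ahead of the beat (the "hastening" pattern), positive values
    mean behind the beat.
    """
    return [(t - isi_ms) / isi_ms for t in inter_tap_intervals_ms]

# Illustrative intervals (ms) for a 500-ms ISI condition:
relative_error([480, 510, 495], 500)  # → [-0.04, 0.02, -0.01]
```

    In the study's design, these per-tap relative errors would then enter a linear mixed model with group, ISI, and phase as categorical predictors.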

    How open science helps researchers succeed

    Open access, open data, open source, and other open scholarship practices are growing in popularity and necessity. However, widespread adoption of these practices has not yet been achieved. One reason is that researchers are uncertain about how sharing their work will affect their careers. We review literature demonstrating that open research is associated with increases in citations, media attention, potential collaborators, job opportunities, and funding opportunities. These findings are evidence that open research practices bring significant benefits to researchers relative to more traditional closed practices.

    Wildfire Risk as a Socioecological Pathology

    Wildfire risk in temperate forests has become a nearly intractable problem that can be characterized as a socioecological “pathology”: that is, a set of complex and problematic interactions among social and ecological systems across multiple spatial and temporal scales. Assessments of wildfire risk could benefit from recognizing and accounting for these interactions in terms of socioecological systems, also known as coupled natural and human systems (CNHS). We characterize the primary social and ecological dimensions of the wildfire risk pathology, paying particular attention to the governance system around wildfire risk, and suggest strategies to mitigate the pathology through innovative planning approaches, analytical tools, and policies. We caution that even with a clear understanding of the problem and possible solutions, the system by which human actors govern fire-prone forests may evolve incrementally in imperfect ways and can be expected to resist change even as we learn better ways to manage CNHS.

    Justify your alpha

    Benjamin et al. proposed changing the conventional “statistical significance” threshold (i.e., the alpha level) from p ≤ .05 to p ≤ .005 for all novel claims with relatively low prior odds. They provided two arguments for why lowering the significance threshold would “immediately improve the reproducibility of scientific research.” First, a p-value near .05 provides weak evidence for the alternative hypothesis. Second, under certain assumptions, an alpha of .05 leads to high false positive report probabilities (FPRP; the probability that a significant finding is a false positive).
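    The second argument can be made concrete with the standard FPRP formula, FPRP = α / (α + power × prior odds), where prior odds are the odds that the tested hypothesis is true. The specific numbers below are illustrative, not taken from the paper:

```python
def fprp(alpha, power, prior_odds):
    """False positive report probability: the probability that a
    statistically significant finding reflects a true null.

    Derived from Bayes' rule: alpha * P(H0) / (alpha * P(H0) + power * P(H1)),
    rewritten in terms of prior odds = P(H1) / P(H0).
    """
    return alpha / (alpha + power * prior_odds)

# With 80% power and 1:10 prior odds of a true effect:
fprp(0.05, 0.8, 0.1)   # ≈ 0.38: over a third of "significant" findings are false
fprp(0.005, 0.8, 0.1)  # ≈ 0.059: lowering alpha to .005 sharply reduces the FPRP
```

    The example shows why, under low prior odds, a stricter threshold reduces the share of significant results that are false positives.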

    Justify your alpha

    In response to recommendations to redefine statistical significance to p ≤ .005, we propose that researchers should transparently report and justify all choices they make when designing a study, including the alpha level.

    Jeffrey R. Spies

    As access to high-performance computing has increased over the years, the scientific community has in turn sought to analyze increasingly complex data. However, discovering and understanding complex relationships in multidimensional data can be a daunting task. Traditionally, exploratory data analysis has been used to discover patterns and garner substantive understanding of data by emphasizing the use of graphical representation, but methods for visualizing data beyond two or three dimensions are rarely used because of the inherent limitations of Cartesian representation, where variables are mapped to spatial dimensions. In order to confront the task of visualizing multidimensional data, it is necessary to draw from principles of human visual perception and cognition. It is well known that humans are incredibly adept at categorizing the natural world by reducing large degrees of freedom down to "oak trees" or "maple trees", for instance. The overall shape of the trees is the result of high-order relationships and interactions among individual components, yet even in the face of such complexity, a young child can make these distinctions. Lindenmayer systems use simple procedural algorithms to render realistic flora that can be categorized just as easily as natural flora. Knowing this, the goal of the current project is to exploit the human ability to easily categorize nature by assigning variables in a multidimensional data set to features of binary trees. A tool, HDTreeV (High Dimensional Tree Visualization), has been created to provide the user a way to go through a process of categorizing and remapping, in order to ascertain multidimensional relationships in the data.

    Dissertation: The Open Science Framework
