
    Combining frequency and time domain approaches to systems with multiple spike train input and output

    A frequency domain approach and a time domain approach have been combined to investigate the behaviour of the primary and secondary endings of an isolated muscle spindle in response to the activity of two static fusimotor axons, both when the parent muscle is held at a fixed length and when it is subjected to random length changes. The frequency domain analysis has an associated error process which provides a measure of how well the input processes can be used to predict the output processes, and which also specifies how the interactions between the recorded processes contribute to this error. Without assuming stationarity of the input, the time domain approach uses a sequence of probability models of increasing complexity in which the number of input processes to the model is progressively increased. This feature of the time domain approach was used to identify a preferred direction of interaction between the processes underlying the generation of the activity of the primary and secondary endings. In the presence of fusimotor activity and dynamic length changes imposed on the muscle, it was shown that the activity of the primary and secondary endings carried different information about the effects of the inputs imposed on the muscle spindle. The results presented in this work emphasise that the analysis of the behaviour of complex systems benefits from a combination of frequency and time domain methods.
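    The frequency domain error process mentioned above can be illustrated with an ordinary coherence estimate. The sketch below is not the authors' procedure; it is a minimal illustration, assuming the spike trains have been binned into counts and that auto- and cross-periodograms are averaged over disjoint segments. One minus the coherence then plays the role of the relative prediction error at each frequency.

```python
import numpy as np

def binned_coherence(x, y, seg_len=256):
    """Magnitude-squared coherence between two binned spike-count series,
    estimated by averaging auto- and cross-periodograms over disjoint
    segments.  1 - coherence acts as a relative prediction error at each
    frequency: 0 means y is perfectly linearly predictable from x there."""
    n_seg = len(x) // seg_len
    sxx = syy = 0.0
    sxy = 0.0 + 0.0j
    for k in range(n_seg):
        xs = x[k * seg_len:(k + 1) * seg_len]
        ys = y[k * seg_len:(k + 1) * seg_len]
        fx = np.fft.rfft(xs - xs.mean())   # mean-corrected segment transforms
        fy = np.fft.rfft(ys - ys.mean())
        sxx = sxx + np.abs(fx) ** 2
        syy = syy + np.abs(fy) ** 2
        sxy = sxy + fx * np.conj(fy)
    # drop the zero frequency, where the mean-corrected transforms vanish
    return (np.abs(sxy) ** 2 / (sxx * syy))[1:]
```

    With a single segment the estimate is identically 1, so averaging over several segments is essential for the error interpretation to be meaningful.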

    Rejoinder: The 2005 Neyman Lecture: Dynamic Indeterminism in Science

    Rejoinder to ``The 2005 Neyman Lecture: Dynamic Indeterminism in Science'' [arXiv:0808.0620]. Comment: Published at http://dx.doi.org/10.1214/08-STS246REJ in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    The effects of control versus no control, within a dyad, for the acquisition of a novel motor skill

    The motor learning literature has shown that alternating dyad practice (i.e. switching between Actor and Observer roles after each practice trial) is both an effective and efficient method of practice. The present experiment examined the effects of providing dyad learners with control over when to switch roles with their partner and investigated the potential differences between when the Actor had control versus the Observer. Further, this experiment investigated the different switching strategies adopted by the dyad learners when provided control. During acquisition, participants performed a speed cup-stacking task, and returned approximately twenty-four hours later for delayed retention and sequence transfer tests. The results showed that participants who controlled their role-switching schedule learned the task similarly to those who did not have control, as well as to those who practiced individually. Additionally, providing the Actors with control over their schedule resulted in equivalent learning outcomes to the Observers who were provided control. Finally, the learners who were provided control adopted various switching strategies, highlighting the dynamic nature of dyad practice. Overall, these novel findings suggest that dyad learners can control their role-switching schedule without undermining learning, and thus provide further support for dyad practice as an effective and efficient method of practice.

    Examining an Irregularly Sampled Time Series for Whiteness

    Suppose a time series has been observed at irregularly spaced times and it is of interest whether the series is white noise. The empirical Fourier transform is proposed as a means of addressing this question.
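    As a concrete reading of the proposal: for values x(t_j) observed at irregular times t_j, the empirical Fourier transform is d(f) = Σ_j x(t_j) exp(-2πi f t_j), and under whiteness the periodogram |d(f)|² should fluctuate about a constant level across frequency, with no dominant peak. The sketch below is an illustrative diagnostic in that spirit, not the paper's test statistic:

```python
import numpy as np

def empirical_ft(times, values, freqs):
    """Empirical Fourier transform d(f) = sum_j x(t_j) exp(-2*pi*i*f*t_j)
    of a mean-corrected series observed at irregular times."""
    values = values - np.mean(values)
    return np.exp(-2j * np.pi * np.outer(freqs, times)) @ values

def whiteness_ratios(times, values, freqs):
    """Crude whiteness diagnostic: periodogram values divided by their
    average level.  For white noise the ratios hover near 1; a strong
    periodic component shows up as a ratio far above 1."""
    periodogram = np.abs(empirical_ft(times, values, freqs)) ** 2
    return periodogram / np.mean(periodogram)
```

    For example, comparing the maximum ratio for a white series against one containing a sinusoid makes the hidden periodicity stand out even though the sampling times are irregular.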

    The 2005 Neyman Lecture: Dynamic Indeterminism in Science

    Jerzy Neyman's life history and some of his contributions to applied statistics are reviewed. In a 1960 article he wrote: ``Currently in the period of dynamic indeterminism in science, there is hardly a serious piece of research which, if treated realistically, does not involve operations on stochastic processes. The time has arrived for the theory of stochastic processes to become an item of usual equipment of every applied statistician.'' The emphasis in this article is on stochastic processes and on stochastic process data analysis. A number of data sets and corresponding substantive questions are addressed. The data sets concern sardine depletion, blowfly dynamics, weather modification, elk movement and seal journeying. Three of the examples are from Neyman's work and four from the author's joint work with collaborators. Comment: This paper is commented on in [arXiv:0808.0631] and [arXiv:0808.0638], with a rejoinder in [arXiv:0808.0639]. Published at http://dx.doi.org/10.1214/07-STS246 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    Asymptotic normality of finite Fourier transforms of stationary generalized processes

    This paper indicates a mixing condition under which a net of Fourier transforms, of a stationary generalized process over an abelian locally compact group, has a limiting normal distribution.

    Comparative Aspects of the Analysis of Stationary Time Series, Point Processes and Hybrids

    This paper brings out comparative aspects of the analysis of time series, point processes and hybrids such as sampled time series and marked point processes. Second- and third-order moments and spectra prove useful tools for addressing certain scientific problems involving such processes. Illustrative analyses are presented for data on tides, neurons and earthquakes.

    Synthetic plots: some history and examples

    Jerzy Neyman and Elizabeth Scott developed the idea of synthetic plots. These plots display the data values of an experiment side by side with a display of simulated data values, with the simulation based on a considered stochastic model. The Neyman and Scott work concerned the distribution of galaxies on the celestial sphere. A review of their work is presented here, followed by personal examples from hydrology, neuroscience, and animal motion.
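    A minimal numerical analogue of the synthetic-plot idea, assuming only that the fitted model can be simulated: generate several synthetic data sets and ask whether a chosen summary statistic of the observed data falls inside their spread. In the actual synthetic plots the comparison is made by eye, side by side; the function name, arguments, and default statistic below are illustrative, not from the paper.

```python
import numpy as np

def synthetic_plot_check(observed, simulate, n_synthetic=19, stat=np.std, seed=0):
    """Simulate n_synthetic data sets from a fitted model and report whether
    a summary statistic of the observed data lies within their range.
    `simulate(rng)` must return one synthetic data set comparable in size
    and structure to `observed`."""
    rng = np.random.default_rng(seed)
    synth_stats = np.array([stat(simulate(rng)) for _ in range(n_synthetic)])
    inside = bool(synth_stats.min() <= stat(observed) <= synth_stats.max())
    return inside, synth_stats
```

    With 19 synthetic data sets, an observed statistic falling outside the synthetic range is roughly a 5%-level discrepancy under the model, which mirrors the informal calibration of eyeballing one real panel among twenty.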

    Three months journeying of a Hawaiian monk seal

    Hawaiian monk seals (Monachus schauinslandi) are endemic to the Hawaiian Islands and are the most endangered species of marine mammal that lives entirely within the jurisdiction of the United States. The species numbers around 1300 and has been declining owing, among other things, to poor juvenile survival, which is evidently related to poor foraging success. Consequently, data have been collected recently on the foraging habitats, movements, and behaviors of monk seals throughout the Northwestern and main Hawaiian Islands. Our work here is directed to exploring a data set collected on a relatively shallow offshore submerged bank (Penguin Bank) in our search for a model of a seal's journey. The work ends by fitting a stochastic differential equation (SDE) that mimics some aspects of the behavior of seals, by working with location data collected for one seal. The SDE is found by developing a time-varying potential function with two points of attraction. The times of location are irregularly spaced and not close together geographically, leading to some difficulties of interpretation. Synthetic plots generated using the model are employed to assess its reasonableness spatially and temporally. One aspect is that the animal stays mainly southwest of Molokai. The work led to the estimation of the lengths and locations of the seal's foraging trips. Comment: Published at http://dx.doi.org/10.1214/193940307000000473 in the IMS Collections (http://www.imstat.org/publications/imscollections.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).
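    A sketch of the kind of model described, under simplifying assumptions: the paper's potential is time varying with two points of attraction, while the toy version below uses a fixed two-well drift and Euler-Maruyama steps, with illustrative parameter values rather than the fitted ones.

```python
import numpy as np

def simulate_two_well_track(n_steps=5000, dt=0.01, sigma=0.3,
                            attractors=((0.0, 0.0), (3.0, 1.0)), seed=0):
    """Euler-Maruyama simulation of dX = b(X) dt + sigma dW in the plane,
    where the drift b pulls toward a weighted average of two attraction
    points, the nearer point pulling harder.  This stands in for the
    gradient of a two-well potential; all parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    a = np.array(attractors, dtype=float)   # (2, 2): the two attraction points
    x = a[0].copy()                         # start at the first attractor
    path = np.empty((n_steps, 2))
    for k in range(n_steps):
        d = a - x                                    # vectors toward each point
        w = np.exp(-0.5 * np.sum(d ** 2, axis=1))    # nearer well dominates
        drift = (w[:, None] * d).sum(axis=0) / w.sum()
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=2)
        path[k] = x
    return path
```

    Tracks simulated this way can then be compared with the observed locations through synthetic plots, as the abstract describes.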

    A more precise chronology of earthquakes produced by the San Andreas Fault in southern California

    Improved methods of radiocarbon analysis have enabled us to date more precisely the earthquake ruptures of the San Andreas fault that are recorded in the sediments at Pallett Creek. Previous dates of these events had 95% confidence errors of 50–100 calendar years. New error limits are less than 23 calendar years for all but two of the dated events. This greater precision is due to larger sample size, longer counting time, lower background noise levels, more precise conversion of radiocarbon ages to calendric dates, and better stratigraphic constraints and statistical techniques. The new date ranges, with one exception, fall within the broader ranges estimated previously, but our estimate of the average interval between the latest 10 episodes of faulting is now about 132 years. Variability about the mean interval is much greater than was suspected previously. Five of the nine intervals are shorter than a century; three of the remaining four intervals are about two to three centuries long. Despite the wide range of these intervals, a pattern in the occurrence of large earthquakes at Pallett Creek is apparent in the new data. The past 10 earthquakes occur in four clusters, each of which consists of two or three events. Earthquakes within the clusters are separated by periods of several decades, but the clusters are separated by dormant periods of two to three centuries. This pattern may reflect important mechanical aspects of the fault's behavior. If this pattern continues into the future, the current period of dormancy will probably be greater than two centuries. This would mean that the section of the fault represented by the Pallett Creek site is currently in the middle of one of its longer periods of repose between clusters, and sections of the fault farther to the southeast are much more likely to produce the next great earthquake in California. 
The greater precision of dates now available for large earthquakes recorded at the Pallett Creek site enables speculative correlation of events between paleoseismic sites along the southern half of the San Andreas fault. A history of great earthquakes with overlapping rupture zones along the Mojave section of the fault remains one of the more attractive possibilities.