
    A Scalable Model of Cerebellar Adaptive Timing and Sequencing: The Recurrent Slide and Latch (RSL) Model

    From the dawn of modern neural network theory, the mammalian cerebellum has been a favored object of mathematical modeling studies. Early studies focused on the fan-out, convergence, thresholding, and learned weighting of perceptual-motor signals within the cerebellar cortex. This led, in the proposals of Albus (1971; 1975) and Marr (1969), to the still viable idea that the granule cell stage in the cerebellar cortex performs a sparse expansive recoding of the time-varying input vector. This recoding reveals and emphasizes combinations of input state variables in a distributed representation that serves as a basis for the learned, state-dependent control actions engendered by cerebellar outputs to movement-related centers. Although well-grounded as such, this perspective seriously underestimates the intelligence of the cerebellar cortex. Context and state information arrives asynchronously due to the heterogeneity of sources that contribute signals to the cerebellar input vector. These sources include radically different sensory systems - vision, kinesthesia, touch, balance and audition - as well as many stages of the motor output channel. To make optimal use of available signals, the cerebellum must be able to sift the evolving state representation for the most reliable predictors of the need for control actions, and to use those predictors even if they appear only transiently and well in advance of the optimal time for initiating the control action. Such a cerebellar adaptive timing competence has recently been experimentally verified (Perrett, Ruiz, & Mauk, 1993). This paper proposes a modification to prior population models for cerebellar adaptive timing and sequencing.
Since it replaces a population with a single element, the proposed Recurrent Slide and Latch (RSL) model is in one sense maximally efficient, and therefore optimal from the perspective of scalability.
Defense Advanced Research Projects Agency and the Office of Naval Research (N00014-92-J-1309, N00014-93-1-1364, N00014-95-1-0409)
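The sparse expansive recoding attributed to the granule cell stage can be illustrated in a few lines. The sketch below is a deliberately minimal caricature of the Marr/Albus idea, not the RSL model itself: each "granule cell" sums a small random subset of the input vector (its fan-in), and only the most strongly driven cells fire, producing a sparse, high-dimensional code. All parameter names and values (`n_granule`, `fan_in`, `k_active`) are illustrative assumptions.

```python
import numpy as np

def expansive_recode(x, n_granule=1000, fan_in=4, k_active=50, seed=0):
    """Sparse expansive recoding in the spirit of Marr/Albus: each
    'granule cell' sums a small random subset (its fan-in) of the input
    vector, and only the k most active cells fire."""
    rng = np.random.default_rng(seed)
    # random fan-in connectivity: each granule cell samples a few inputs
    idx = rng.integers(0, x.size, size=(n_granule, fan_in))
    activations = x[idx].sum(axis=1)
    # thresholding: keep only the k_active most strongly driven cells
    code = np.zeros(n_granule)
    code[np.argsort(activations)[-k_active:]] = 1.0
    return code

x = np.random.default_rng(1).normal(size=20)  # time-varying input vector
g = expansive_recode(x)                       # sparse expanded code
```

The expansion from 20 inputs to 1000 sparsely active units is what lets downstream synapses learn state-dependent actions from simple weighted sums of the code.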

    The Embedding Capacity of Information Flows Under Renewal Traffic

    Given two independent point processes and a certain rule for matching points between them, what is the fraction of matched points over infinitely long streams? In many application contexts, e.g., secure networking, a meaningful matching rule is that of a maximum causal delay, and the problem is related to embedding a flow of packets in cover traffic such that no traffic analysis can detect it. We study the best undetectable embedding policy and the corresponding maximum flow rate, which we call the embedding capacity, under the assumption that the cover traffic can be modeled as arbitrary renewal processes. We find that computing the embedding capacity requires the inversion of very structured linear systems that, for a broad range of renewal models encountered in practice, admit a fully analytical expression in terms of the renewal function of the processes. Our main theoretical contribution is a simple closed form of this relationship. This result enables us to explore properties of the embedding capacity, obtaining closed-form solutions for selected distribution families and a suite of sufficient conditions on the capacity ordering. We evaluate our solution on real network traces, which shows a noticeable match for tight delay constraints. A gap between the predicted and the actual embedding capacities appears for looser constraints, and further investigation reveals that it is caused by inaccuracy of the renewal traffic model rather than of the solution itself.
Comment: Submitted to IEEE Trans. on Information Theory on March 10, 201
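The matching rule in the first sentence can be estimated empirically. The Monte-Carlo sketch below is an assumption-laden illustration, not the paper's analytical solution: it draws Poisson (a special case of renewal) arrival times for the flow and the cover traffic, matches each flow point greedily to the earliest unused cover point within the maximum causal delay, and reports the fraction of flow points matched. Function and parameter names are invented for the example.

```python
import numpy as np

def matched_fraction(rate_flow, rate_cover, max_delay,
                     horizon=10_000.0, seed=0):
    """Fraction of flow points matched to cover points under a
    maximum-causal-delay rule, estimated over a finite horizon."""
    rng = np.random.default_rng(seed)
    # renewal (here: exponential/Poisson) arrival times for both streams
    flow = np.cumsum(rng.exponential(1 / rate_flow, int(rate_flow * horizon * 2)))
    cover = np.cumsum(rng.exponential(1 / rate_cover, int(rate_cover * horizon * 2)))
    flow, cover = flow[flow < horizon], cover[cover < horizon]
    j = matched = 0
    for t in flow:
        while j < cover.size and cover[j] < t:   # enforce causality
            j += 1
        if j < cover.size and cover[j] - t <= max_delay:
            matched += 1                         # embed this packet
            j += 1                               # each cover point used once
    return matched / flow.size
```

As expected, loosening the delay constraint raises the matched fraction, mirroring the capacity's dependence on the delay bound discussed in the abstract.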

    A novel approach for local treatment of breast cancer

    Early local recurrence of breast cancer most commonly (over 90%) occurs at the site of the primary tumour. This is true whether or not radiotherapy is given and irrespective of the margin status. Whole-organ analysis of mastectomy specimens, on the other hand, reveals that 63% of breasts harbour occult cancer foci, 80% of which are situated remote from the index quadrant. These occult cancer foci may therefore be clinically irrelevant, and it may not be necessary to treat the whole breast with radiotherapy. The 6-week-long course of post-operative radiotherapy after breast-conserving therapy is not only inconvenient and costly, but may cause many women from geographically remote areas to choose mastectomy. Targeted intraoperative radiotherapy (TARGIT) to the peri-tumoural area alone might provide adequate local control. ‘Intrabeam’ (PeC) is a portable electron-beam-driven device that can deliver therapeutic radiation (soft x-rays) in 20-30 minutes within a standard operating theatre environment. The pliable breast tissue - the target - is wrapped around a spherical applicator - the source - providing truly conformal radiotherapy. The prescribed dose is 5 Gy and 20 Gy at 1 cm and 0.2 cm, respectively, from the tumour bed. The biologically effective dose is 7-53 Gy for α/β=10 and 20-120 Gy for α/β=1.5. In our pilot study of 26 patients (age 30-80 years, T = 0.42-4.0 cm), we replaced the routine post-operative tumour bed boost with targeted intra-operative radiotherapy. There have been no major complications and no patient has developed local recurrence, although the median follow-up time is short at 34 months. The cosmetic outcome is satisfying to both the patient and the clinician. Having established the feasibility, acceptability and safety in the pilot study, we started in March 2000 a randomised trial that compares TARGIT with conventional postoperative radiotherapy for infiltrating duct carcinomas, with local recurrence and cosmesis as the main outcome measures.
Patient accrual in this trial has been excellent, and it has attracted several international collaborative groups. If proven effective, TARGIT could eliminate the need for postoperative radiotherapy, potentially saving time, money and breasts.
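The biologically effective dose figures quoted above come from the linear-quadratic model. As a minimal sketch, the standard single-fraction LQ formula is BED = d(1 + d/(α/β)); note that the ranges quoted in the abstract will not be reproduced exactly by this bare formula, as they presumably incorporate dose-rate and attenuation corrections specific to the Intrabeam source that are not modeled here.

```python
def bed_single_fraction(d_gy, alpha_beta_gy):
    """Standard linear-quadratic biologically effective dose for a
    single fraction of physical dose d: BED = d * (1 + d / (alpha/beta))."""
    return d_gy * (1 + d_gy / alpha_beta_gy)

# doses quoted at 1 cm (5 Gy) and 0.2 cm (20 Gy) from the tumour bed
for d in (5.0, 20.0):
    for ab in (10.0, 1.5):
        print(f"d={d} Gy, alpha/beta={ab}: BED={bed_single_fraction(d, ab):.1f} Gy")
```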

    Adaptive Algorithms For Classification On High-Frequency Data Streams: Application To Finance

    International Mention in the doctoral degree. In recent years, the problem of concept drift has gained importance in the financial domain. The succession of manias, panics and crashes has stressed the non-stationary nature of financial markets and the likelihood of drastic structural changes in them. The most recent literature suggests conventional machine learning and statistical approaches for this problem. However, these techniques are unable or slow to adapt to non-stationarities and may require re-training over time, which is computationally expensive and carries financial risk. This thesis proposes a set of adaptive algorithms to deal with high-frequency data streams and applies them to the financial domain. We present approaches that handle different types of concept drift and perform predictions using up-to-date models. These mechanisms are designed for fast reaction times and are thus applicable to high-frequency data. The core experiments of this thesis are based on predicting the direction of price movements at different intraday resolutions in the SPDR S&P 500 exchange-traded fund. The proposed algorithms are benchmarked against other popular methods from the data stream mining literature and achieve competitive results. We believe that this thesis opens promising research directions for financial forecasting during market instability and structural breaks. Results have shown that our proposed methods can improve prediction accuracy in many of these scenarios. Indeed, the results obtained are compatible with arguments against the efficient market hypothesis. However, we cannot claim that we consistently beat buy-and-hold; therefore, we cannot reject it.
Doctoral Programme in Computer Science and Technology, Universidad Carlos III de Madrid. Thesis committee: Gustavo Recio Isasi (chair), Pedro Isasi Viñuela (secretary), Sandra García Rodrígue (member)
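The adapt-or-forget behaviour described above can be sketched with a toy drift-aware stream classifier. This is an illustrative assumption, not one of the thesis's algorithms: it predicts the majority label over a sliding window and discards the window when windowed accuracy collapses, treating the collapse as a signal of concept drift. All names and thresholds are invented for the example.

```python
from collections import deque

class DriftAwareMajority:
    """Toy drift-adaptive stream classifier: majority vote over a
    sliding window, with a full reset when accuracy over the window
    drops below a threshold (interpreted as concept drift)."""

    def __init__(self, window=50, reset_below=0.4):
        self.labels = deque(maxlen=window)   # recent true labels
        self.hits = deque(maxlen=window)     # recent prediction outcomes
        self.reset_below = reset_below

    def predict(self):
        if not self.labels:
            return 1                          # default guess: "up"
        return 1 if 2 * sum(self.labels) >= len(self.labels) else 0

    def update(self, true_label):
        self.hits.append(int(self.predict() == true_label))
        self.labels.append(true_label)
        # drastic accuracy drop over a full window -> assume drift, forget
        if len(self.hits) == self.hits.maxlen:
            if sum(self.hits) / len(self.hits) < self.reset_below:
                self.labels.clear()
                self.hits.clear()
```

Fed a stream of upward moves followed by an abrupt switch to downward moves, the model re-learns the new regime within roughly one window, which is the kind of fast reaction time the thesis targets.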

    The Large Observatory for X-ray Timing (LOFT)

    High-time-resolution X-ray observations of compact objects provide direct access to strong-field gravity, to the equation of state of ultradense matter and to black hole masses and spins. A 10 m²-class instrument in combination with good spectral resolution is required to exploit the relevant diagnostics and answer two of the fundamental questions of the European Space Agency (ESA) Cosmic Vision Theme “Matter under extreme conditions”, namely: does matter orbiting close to the event horizon follow the predictions of general relativity? What is the equation of state of matter in neutron stars? The Large Observatory For X-ray Timing (LOFT), selected by ESA as one of the four Cosmic Vision M3 candidate missions to undergo an assessment phase, will revolutionise the study of collapsed objects in our galaxy and of the brightest supermassive black holes in active galactic nuclei. Thanks to an innovative design and the development of large-area monolithic silicon drift detectors, the Large Area Detector (LAD) on board LOFT will achieve an effective area of ~12 m² (more than an order of magnitude larger than any spaceborne predecessor) in the 2–30 keV range (up to 50 keV in expanded mode), yet still fits a conventional platform and small/medium-class launcher. With this large area and a spectral resolution of <260 eV, LOFT will yield unprecedented information on strongly curved spacetimes and matter under extreme conditions of pressure and magnetic field strength.

    Work, aging, mental fatigue, and eye movement dynamics
