
    Predictable Feature Analysis

    Every organism in an environment, whether biological, robotic or virtual, must be able to predict certain aspects of its environment in order to survive or perform whatever task is intended. It needs a model that is capable of estimating the consequences of possible actions, so that planning, control, and decision-making become feasible. For scientific purposes, such models are usually created in a problem-specific manner using differential equations and other techniques from control and systems theory. In contrast, we aim for an unsupervised approach that builds up the desired model in a self-organized fashion. Inspired by Slow Feature Analysis (SFA), our approach is to extract sub-signals from the input that behave as predictably as possible. These "predictable features" are highly relevant for modeling, because predictability is, by definition, a desired property of the needed consequence-estimating model. In our approach, we measure predictability with respect to a certain prediction model. We focus here on the solution of the arising optimization problem and present a tractable algorithm based on algebraic methods, which we call Predictable Feature Analysis (PFA). We prove that the algorithm finds the globally optimal signal if this signal can be predicted with low error. To deal with cases where the optimal signal has a significant prediction error, we provide a robust, heuristically motivated variant of the algorithm and verify it empirically. Additionally, we give formal criteria a prediction model must meet to be suitable for measuring predictability in the PFA setting, and we provide a suitable default model along with a formal proof that it meets these criteria.
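    To make the idea concrete, here is a minimal sketch (in Python) of extracting a "predictable feature": it finds the linear projection of a multivariate signal whose one-step linear AR(1) prediction error is smallest, via a generalized eigenvalue problem. This is a simplified proxy for illustration only, not the paper's PFA algorithm, and the toy data are invented.

        # Simplified illustration: the most "predictable" linear feature under an
        # AR(1) prediction model. Maximizing the lag-1 autocorrelation of the
        # extracted signal s_t = X @ w minimizes its one-step prediction error.
        import numpy as np
        from scipy.linalg import eigh

        def most_predictable_feature(X):
            """X: (T, n) signal. Returns w maximizing w^T C1 w / w^T C0 w."""
            X = X - X.mean(axis=0)
            C0 = X.T @ X / len(X)                 # zero-lag covariance
            C1 = X[:-1].T @ X[1:] / (len(X) - 1)  # lag-1 cross-covariance
            C1 = (C1 + C1.T) / 2                  # symmetrize
            vals, vecs = eigh(C1, C0)             # generalized eigenproblem
            return vecs[:, -1]                    # direction of largest ratio

        # Toy demo: a sinusoid (predictable) mixed with white noise (not).
        rng = np.random.default_rng(0)
        t = np.arange(2000)
        X = np.column_stack([np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(2000),
                             rng.standard_normal(2000)])
        w = most_predictable_feature(X)
        print(np.round(w / np.abs(w).max(), 2))   # loads almost entirely on the sinusoid

    A full treatment would, as the abstract notes, also handle the case where even the best signal has a significant prediction error; the sketch above simply returns the best direction it finds.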

    Hierarchical Information and the Rate of Information Diffusion

    The rate of information diffusion, and consequently of price discovery, is conditional not only upon the design of the market microstructure but also upon the informational structure. This paper presents a market microstructure model showing that an increasing number of information hierarchies among informed competitive traders leads to a slower information diffusion rate and to informational inefficiency. The model illustrates that informed traders may prefer trading with each other rather than with noise traders in the presence of information hierarchies. Furthermore, we show that momentum can be generated from the predictable patterns of noise traders, which are assumed to be a function of past prices. Keywords: information hierarchies, information diffusion rate, momentum
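    A toy numerical illustration (not the paper's model) of the headline result: if informed traders learn the fundamental value one tier at a time, and the price adjusts in proportion to the currently informed fraction of traders, then more tiers mean slower convergence to the fundamental. The adjustment rule and all parameters below are invented for the demo.

        import numpy as np

        def price_path(n_tiers, n_steps=30, lam=0.5, v=1.0):
            """One tier becomes informed per step; the price moves toward the
            fundamental value v in proportion to the informed fraction."""
            price, path = 0.0, []
            for t in range(n_steps):
                informed_frac = min(t + 1, n_tiers) / n_tiers
                price += lam * informed_frac * (v - price)
                path.append(price)
            return np.array(path)

        for k in (1, 4, 10):
            steps = np.argmax(price_path(k) > 0.95) + 1   # within 5% of v
            print(f"{k:2d} tier(s): price discovery in {steps} steps")

    With these (arbitrary) parameters, a single tier reaches 95% of the fundamental value in 5 steps, while ten tiers take 10, mirroring the qualitative claim that deeper hierarchies slow information diffusion.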

    A Generic Synthesis Algorithm for Well-Defined Parametric Design

    This paper aims to improve the way synthesis tools can be built by formalizing: 1) the design artefact, 2) the related knowledge, and 3) an algorithm to generate solutions. It focuses on well-defined parametric engineering design, ranging from machine elements to industrial products. A design artefact is formalized in terms of parameters and topology elements. The knowledge is classified into three types: resolving rules that determine parameter values, constraining rules that restrict parameter values, and expansion rules that add elements to the topology. A synthesis algorithm, based on an opportunistic design strategy, is described and tested on three design cases.
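    The sketch below (Python, with invented rule names and a hypothetical bolt-sizing example) shows how the three knowledge types could drive an opportunistic synthesis loop: fire any applicable resolving or expansion rule until the design stops changing, checking constraining rules along the way.

        def synthesize(params, topology, resolving, constraining, expansion):
            """Opportunistic strategy: apply whichever rule is applicable,
            until a fixed point is reached."""
            changed = True
            while changed:
                changed = False
                for rule in resolving:                    # determine parameter values
                    changed = rule(params, topology) or changed
                for rule in expansion:                    # add topology elements
                    changed = rule(params, topology) or changed
                for rule in constraining:                 # restrict parameter values
                    if not rule(params, topology):
                        raise ValueError("constraint violated")
            return params, topology

        # Hypothetical rules for sizing a bolted joint.
        def resolve_diameter(p, t):                       # resolving rule
            if "load_kN" in p and "diameter_mm" not in p:
                p["diameter_mm"] = max(6, round((p["load_kN"] * 4) ** 0.5))
                return True
            return False

        def expand_washer(p, t):                          # expansion rule
            if "diameter_mm" in p and "washer" not in t:
                t.append("washer")
                return True
            return False

        def check_stock(p, t):                            # constraining rule
            return p.get("diameter_mm", 0) <= 64          # largest stock size

        print(synthesize({"load_kN": 20}, ["bolt"],
                         [resolve_diameter], [check_stock], [expand_washer]))
        # ({'load_kN': 20, 'diameter_mm': 9}, ['bolt', 'washer'])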

    Data warehouse - opportunity for local authorities

    Local government exists in a very complex organizational environment, which has been subjected to an ever-increasing pace of change. This situation has had a huge impact on how local government reacts to two major elements of its administration: decision making and the adoption of technology. Data warehouse technology offers a solution to this problem, but various obstacles inhibit local authorities from adopting it.

    The Use of a Cap-mounted Tri-axial Accelerometer for Measurement of Distance, Lap Times and Stroke Rates in Swim Training

    This paper reports some of the findings from a trial which recorded accelerometer data from six elite-level swimmers (three female and three male, of varying primary event stroke and distance) over the course of a regular 15-week training block. Measurements from a head-mounted accelerometer are used to determine when the athlete is swimming, to mark turning points (and therefore to measure distance and lap times), and are processed by frequency analysis to determine stroke rate. Comparison with video where available, and with training plans and the literature where not, has shown this method to be accurate and reliable for determining these performance metrics. The primary objective of this project was to develop a low-cost, simple and highly usable system for use in swim coaching; feedback from elite coaches has indicated that further development could make it an extremely useful addition to their training regime.
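    As a sketch of the frequency-analysis step, the Python snippet below estimates stroke rate as the dominant spectral peak of the accelerometer magnitude within a swimming segment. The sample rate, window, and the 0.3-2 Hz search band are assumptions for illustration, not values from the paper.

        import numpy as np

        def stroke_rate(acc_xyz, fs=50.0, band=(0.3, 2.0)):
            """acc_xyz: (N, 3) tri-axial samples at fs Hz. Returns strokes per
            minute, taken as the spectral peak inside the given band."""
            mag = np.linalg.norm(acc_xyz, axis=1)
            mag = (mag - mag.mean()) * np.hanning(len(mag))   # remove DC, taper
            spectrum = np.abs(np.fft.rfft(mag))
            freqs = np.fft.rfftfreq(len(mag), d=1.0 / fs)
            in_band = (freqs >= band[0]) & (freqs <= band[1])
            return 60.0 * freqs[in_band][np.argmax(spectrum[in_band])]

        # Synthetic check: a 0.8 Hz stroke oscillation should read ~48 strokes/min.
        t = np.arange(0, 30, 1 / 50.0)
        acc = np.column_stack([0.2 * np.sin(2 * np.pi * 0.8 * t),
                               np.zeros_like(t),
                               9.8 + np.sin(2 * np.pi * 0.8 * t)])
        print(round(stroke_rate(acc)))   # 48

    In practice the same magnitude signal would first be segmented into swimming versus rest, and turning points would be detected before the spectral step, as described above.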