9,709 research outputs found

    Computer-aided processing of LANDSAT MSS data for classification of forestlands

    There are no author-identified significant results in this report

    Price dynamics, informational efficiency and wealth distribution in continuous double auction markets

    This paper studies the properties of the continuous double auction trading mechanism using an artificial market populated by heterogeneous computational agents. In particular, we investigate how changes in the population of traders and in market microstructure characteristics affect price dynamics, information dissemination and the distribution of wealth across agents. In our computer-simulated market only a small fraction of the population observe the risky asset's fundamental value with noise, while the rest of the agents try to forecast the asset's price from past transaction data. In contrast to other artificial markets, we assume that the risky asset pays no dividend, so agents cannot learn from past transaction prices and subsequent dividend payments. We find that private information can effectively disseminate in the market unless market regulation prevents informed investors from short selling or borrowing the asset and these investors do not constitute a critical mass. In such a case, not only are markets less informationally efficient, but they may even experience crashes and bubbles. Finally, increased informational efficiency has a negative impact on informed agents' trading profits and a positive impact on artificially intelligent agents' profits.
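
The continuous double auction mechanism at the heart of this abstract can be illustrated with a minimal order book: limit orders rest in the book until a counter-order crosses them, at which point a trade executes at the resting price. This is a hypothetical sketch, not the paper's simulation; the class name and the one-unit-per-order simplification are assumptions.

```python
import heapq

class DoubleAuction:
    """Minimal continuous double auction for one-unit limit orders.
    Bids are kept in a max-heap (via negated prices), asks in a min-heap."""

    def __init__(self):
        self.bids = []  # entries: (-price, trader)
        self.asks = []  # entries: (price, trader)

    def submit(self, side, price, trader):
        """Submit a one-unit limit order. Returns (buyer, seller, trade_price)
        if the order crosses the book; otherwise the order rests and None
        is returned. Crossing orders trade at the resting order's price."""
        if side == "buy":
            if self.asks and self.asks[0][0] <= price:
                ask_price, seller = heapq.heappop(self.asks)
                return (trader, seller, ask_price)
            heapq.heappush(self.bids, (-price, trader))
        else:
            if self.bids and -self.bids[0][0] >= price:
                neg_bid, buyer = heapq.heappop(self.bids)
                return (buyer, trader, -neg_bid)
            heapq.heappush(self.asks, (price, trader))
        return None
```

For example, a resting bid at 100 is hit by an incoming sell at 99, and the trade executes at the resting price of 100; an agent-based market of this kind repeatedly feeds heterogeneous agents' orders through such a book.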

    What May Visualization Processes Optimize?

    In this paper, we present an abstract model of visualization and inference processes and describe an information-theoretic measure for optimizing such processes. To obtain such an abstraction, we first examined six classes of workflows in data analysis and visualization, and identified four levels of typical visualization components, namely disseminative, observational, analytical and model-developmental visualization. We noticed a common phenomenon at different levels of visualization: the transformation of data spaces (referred to as alphabets) usually corresponds to a reduction of maximal entropy along a workflow. Based on this observation, we establish an information-theoretic measure of cost-benefit ratio that may be used as a cost function for optimizing a data visualization process. To demonstrate the validity of this measure, we examined a number of successful visualization processes in the literature, and showed that the information-theoretic measure can mathematically explain the advantages of such processes over possible alternatives.
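
The entropy-reduction idea can be illustrated numerically: a visualization step maps a large input alphabet to a smaller output alphabet, and the resulting drop in Shannon entropy, weighed against the cost of the step, gives a benefit-per-cost figure. This is a deliberately simplified sketch of the paper's measure, which also accounts for the potential distortion introduced by the mapping; the function names are hypothetical.

```python
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def cost_benefit(input_probs, output_probs, cost):
    """Illustrative cost-benefit ratio for one transformation step:
    the entropy reduction from the input alphabet to the (smaller)
    output alphabet, divided by the cost of performing the step.
    A simplification: the paper's measure also subtracts a potential-
    distortion term before dividing by cost."""
    return (entropy(input_probs) - entropy(output_probs)) / cost
```

For instance, binning eight equiprobable states (3 bits) into two equiprobable classes (1 bit) reduces entropy by 2 bits; at unit cost the ratio is 2.0.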

    A Systems Approach for Selection Between Manual and Automated Work Zones Within Assembly Lines

    Manufacturing firms continuously seek to improve and optimize their processes to meet the requirements of mass production and product customization. To meet these demands, the operations on the assembly line need to be allocated the right level of automation, such that neither the human nor the machine is underutilized. With such an emphasis being placed on assembly operations within manufacturing enterprises, there is a need for a systematic procedure that helps identify appropriate levels of automation (LoA) at different resolutions, such as the workstation and band scales. A literature review showed that research within the area of LoA is not abundant, and the few methodologies that discuss this aspect have their own benefits and limitations. The main aim of this thesis research is to develop a systematic methodology that can help determine the appropriate LoA at a systems level, by looking at factors such as production volume, production flow, the number of variants and others. To arrive at this, a set of requirements is defined for judging the most suitable method from the existing literature: one that satisfies all the requirements and helps determine the appropriate LoA at the workstation and band scales. Two methods, the B&D method and the Dynamo method, partially satisfy most of the requirements and are combined to form a new integrated method for determining the appropriate levels of automation at the workstation and band scales. Both methods are validated through four individual case studies performed at two different manufacturing firms. Based on the results obtained, both methods are useful at the workstation level but fail to determine the appropriate LoA at the band level.
    The integrated method is then applied to the operations at one of the manufacturing firms to suggest possible improvements to the levels of automation currently implemented at the firm.

    Image Segmentation Using Active Contours Driven by the Bhattacharyya Gradient Flow

    ©2007 IEEE. DOI: 10.1109/TIP.2007.908073
    This paper addresses the problem of image segmentation by means of active contours, whose evolution is driven by the gradient flow derived from an energy functional based on the Bhattacharyya distance. In particular, given the values of a photometric variable (or of a set thereof) that is to be used for classifying the image pixels, the active contours are designed to converge to the shape that results in maximal discrepancy between the empirical distributions of the photometric variable inside and outside the contours. This discrepancy is measured by means of the Bhattacharyya distance, which proves to be an extremely useful tool for solving the problem at hand. The proposed methodology can be viewed as a generalization of segmentation methods in which active contours maximize the difference between a finite number of empirical moments of the "inside" and "outside" distributions. Furthermore, it is shown that the proposed methodology is very versatile and flexible in the sense that it allows one to easily accommodate a diversity of image features on which the segmentation should be based.
    As an additional contribution, a method for automatically adjusting the smoothness properties of the empirical distributions is proposed. Such a procedure is crucial in situations where the number of data samples (supporting a certain segmentation class) varies considerably in the course of the evolution of the active contour. In this case, the smoothness properties of the empirical distributions have to be properly adjusted to avoid either over- or under-estimation artifacts. Finally, a number of relevant segmentation results are demonstrated and some further research directions are discussed.
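
The discrepancy measure at the heart of this method, the Bhattacharyya distance between the empirical distributions inside and outside the contour, is straightforward to compute from normalized histograms; the gradient flow itself is beyond a short snippet. A minimal sketch (the function name is an assumption):

```python
import numpy as np

def bhattacharyya_distance(p, q, eps=1e-12):
    """Bhattacharyya distance between two discrete distributions,
    e.g. normalized intensity histograms inside and outside a contour.
    The Bhattacharyya coefficient is BC = sum_i sqrt(p_i * q_i);
    the distance is D = -ln(BC). D is 0 for identical distributions
    and grows as the distributions become more dissimilar."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()  # normalize to probability distributions
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))
    return -np.log(max(bc, eps))  # eps guards against log(0) for disjoint supports
```

In the segmentation setting, the contour evolves so as to increase this distance between the "inside" and "outside" histograms of the chosen photometric variable.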

    The Employment Impact Of Globalisation In Developing Countries

    The relationship between globalization and employment is of growing significance to policy makers in developing countries, but it is surprisingly difficult to analyse theoretically and empirically. 'Globalization' means different things to different analysts, and it is so multi-faceted that its effects are difficult to isolate and evaluate. Received trade theory does not provide a clear guide to its employment effects, and in its most commonly used version it assumes away many factors that affect employment during globalization. Much depends, finally, on the ability of each country to cope with the liberalised trade, investment and technology flows that globalization implies. As this ability varies widely across the developing world, and is continuing to diverge between countries, it appears that no generalisation about the globalization-employment relationship is possible.

    Complexity, BioComplexity, the Connectionist Conjecture and Ontology of Complexity

    This paper develops and integrates major ideas and concepts on complexity and biocomplexity: the Connectionist Conjecture, a universal ontology of complexity, the irreducible complexity of totality and inherent randomness, the perpetual evolution of information, the emergence of criticality, and the equivalence of symmetry and complexity. The paper introduces the Connectionist Conjecture, which states that the one and only representation of Totality is the connectionist one, i.e. in terms of nodes and edges. It also introduces the idea of a Universal Ontology of Complexity and develops concepts in that direction, along with ideas on the perpetual evolution of information and the irreducibility and computability of totality, all in the context of the Connectionist Conjecture. The paper indicates that control and communication are the prime functionals responsible for the symmetry and complexity of complex phenomena. It takes the stand that the phenomenon of life (including its evolution) is probably the nearest to what we can describe with the term "complexity", and assumes that signaling and communication within the living world, and of the living world with the environment, create the connectionist structure of biocomplexity. With life and its evolution as the substrate, the paper develops ideas towards an ontology of complexity, introduces new complexity-theoretic interpretations of fundamental biomolecular parameters, and develops ideas on a methodology to determine the complexity of "true" complex phenomena.

    Governance arrangements for state owned enterprises

    The aim of this paper is to shed new light on key challenges in governance arrangements for state-owned enterprises in infrastructure sectors. The paper provides guidelines on how to classify the fuzzy and sometimes conflicting development goals of infrastructure and the governance arrangements needed to reach such goals. Three policy recommendations emerge. First, some of the structures implied by internationally adopted principles of corporate governance for state-owned enterprises, favoring a centralized ownership function over a decentralized or dual structure, have not yet been sufficiently "tested" in practice and may not suit all developing countries. Second, general corporate governance guidelines (and policy recommendations) need to be carefully adapted to infrastructure sectors, particularly in the natural monopoly segments. Because the market structure and regulatory arrangements in which state-owned enterprises operate matter, governments may want to distinguish the state-owned enterprises operating in potentially competitive sectors from those under a natural monopoly structure. Competition provides not only formidable benefits, but also unique opportunities for benchmarking and for increasing transparency and accountability. Third, governments may want to avoid partial fixes by tackling both the internal and external governance factors: focusing on only one of the governance dimensions is unlikely to improve SOE performance in a sustainable way.
    Keywords: National Governance; Banks & Banking Reform; Public Sector Economics & Finance; Debt Markets; Public Sector Expenditure Analysis & Management