    Developing an ontological sandbox: investigating multi-level modelling’s possible metaphysical structures

    One of the central concerns of the multi-level modelling (MLM) community is the hierarchy of classifications that appear in conceptual models: what these are, how they are linked, and how they should be organised into levels and modelled. Though there has been significant work done in this area, we believe it could be enhanced by introducing a systematic way to investigate the ontological nature and requirements that underlie the frameworks and tools proposed by the community to support MLM (such as the Orthogonal Classification Architecture and Melanee). In this paper, we introduce a key component for the investigation and understanding of these ontological requirements: an ontological sandbox. This is a conceptual framework for investigating and comparing multiple variations of possible ontologies – without having to commit to any of them – in isolation from a full commitment to any foundational ontology. We discuss the sandbox framework and walk through an example of how it can be used to investigate a simple ontology. The example, despite its simplicity, illustrates how the constructional approach can help to expose and explain the metaphysical structures used in ontologies, and so reveal the underlying nature of MLM levelling.

    Representing Style by Feature Space Archetypes: Description and Emulation of Spatial Styles in an Architectural Context


    Classifying Supergravity Solutions

    We review the substantial progress that has been made in classifying supersymmetric solutions of supergravity theories using G-structures. We also review the construction of supersymmetric black rings that were discovered using the classification of D=5 supergravity solutions.

    Comment: 16 pages. To appear in the Proceedings of the 37th International Symposium Ahrenshoop, "Recent Developments in String/M-Theory and Field Theory", August 23-27 2004, Berlin-Schmoeckwitz. References added.

    Manipulationism, Ceteris Paribus Laws, and the Bugbear of Background Knowledge

    According to manipulationist accounts of causal explanation, to explain an event is to show how it could be changed by intervening on its cause. The relevant change must be a ‘serious possibility’, claims Woodward (2003), distinct from mere logical or physical possibility, approximating something I call ‘scientific possibility’. This idea creates significant difficulties: background knowledge is necessary for judgments of possibility. Yet the primary vehicles of explanation in manipulationism are ‘invariant’ generalisations, and these are not well adapted to encoding such knowledge, especially in the social sciences, as some of it is non-causal. Ceteris paribus (CP) laws or generalisations labour under no such difficulty. A survey of research methods such as case and comparative studies, randomised control trials, ethnography, and structural equation modelling suggests that it would be more difficult, and in some instances impossible, to represent the output of each method in invariant generalisations; this is because in each method causal and non-causal background knowledge mesh in a way that cannot easily be accounted for in manipulationist terms. Since ceteris paribus generalisations are superior in this regard, a theory of explanation based on them is a better fit for the social sciences.

    Mapping and analysis of changes in the riparian landscape structure of the Lockyer Valley Catchment, Queensland, Australia

    A case study of the Lockyer Valley catchment in Queensland, Australia, was conducted to develop appropriate mapping and assessment techniques to quantify the nature and magnitude of riparian landscape structural changes within a catchment. The study employed digital image processing techniques to produce land cover maps from 1973 and 1997 Landsat imagery. Fixed- and variable-width buffering of streams was implemented using a geographic information system (GIS) to estimate the riparian zone and to subsequently calculate landscape patterns using the Patch Analyst (Grid) program (a FRAGSTATS interface). The nature of vegetation clearing was characterised by land tenure, slope and stream order. Using the Pearson chi-square test and Cramér’s V statistic, the relationship between vegetation clearing and land tenure was further assessed. The results show a significant decrease in woody vegetation area, mainly due to conversion to pasture. Riparian vegetation corridors have become more fragmented and isolated, and broken into much smaller patches. Land tenure was found to be significantly associated with vegetation clearing, although the strength of the association was weak. The large proportion of deforested riparian zones on steep slopes or along first-order streams raises serious questions about catchment health and the longer-term potential for land degradation through upland clearing. This study highlights the use of satellite imagery and geographic information systems in mapping and analysing landscape structural change, and identifies key issues related to sensor spatial resolution, stream buffering widths, and the quantification of land transformation processes.
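The tenure-association test named in this abstract can be sketched numerically. The counts below are purely hypothetical (the study's actual data are not reproduced here); the function implements the standard Pearson chi-square statistic and Cramér's V for a contingency table of cleared vs. retained vegetation by tenure class.

```python
import math

def cramers_v(table):
    """Pearson chi-square statistic and Cramér's V for a contingency table.

    table: list of rows, each a list of observed counts.
    Returns (chi2, v), where v is in [0, 1].
    """
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (obs - expected) ** 2 / expected
    k = min(len(table), len(table[0])) - 1  # min(rows, cols) - 1
    v = math.sqrt(chi2 / (n * k))
    return chi2, v

# Hypothetical counts: [cleared, retained] woody vegetation by tenure class
table = [[320, 180],   # freehold
         [150, 150],   # leasehold
         [ 80, 120]]   # reserve
chi2, v = cramers_v(table)  # chi2 ≈ 37.58, v ≈ 0.194
```

A large chi-square with a small V (as here) is exactly the pattern the abstract reports: the association is statistically significant but weak in strength.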

    Non-adaptive Measurement-based Quantum Computation and Multi-party Bell Inequalities

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as Measurement-based Quantum Computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities still remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focussing on deterministic computation of Boolean functions, in which natural generalisations of the Greenberger-Horne-Zeilinger (GHZ) paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

    Comment: 13 pages, 4 figures, final version accepted for publication.
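The GHZ paradox mentioned in this abstract can be checked with a small dense-statevector calculation (a sketch assuming NumPy, not the paper's own construction). For the three-qubit GHZ state, the product observable XXX has expectation +1 while XYY (and its permutations) have expectation -1, which no assignment of local ±1 values can reproduce.

```python
import numpy as np

# Single-qubit Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def expectation(ops, state):
    """Expectation value of a tensor product of single-qubit operators."""
    M = ops[0]
    for op in ops[1:]:
        M = np.kron(M, op)
    return float(np.real(state.conj() @ M @ state))

# Three-qubit GHZ state (|000> + |111>) / sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

e_xxx = expectation([X, X, X], ghz)  # +1
e_xyy = expectation([X, Y, Y], ghz)  # -1
e_yxy = expectation([Y, X, Y], ghz)  # -1
e_yyx = expectation([Y, Y, X], ghz)  # -1
```

Classically, the product of the local X and Y outcomes in the last three settings would force XXX = (XYY)(YXY)(YYX) = -1, contradicting the observed +1; this is the non-classicality the non-adaptive computation model exploits.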

    Equity trend prediction with neural networks

    This paper presents results of neural-network-based trend prediction for equity markets. Raw equity exchange data is pre-processed before being fed into a series of neural networks. The use of Self-Organising Maps (SOMs) is investigated as a data classification method to limit neural network inputs and training data requirements. The resulting primary simulation is a neural network that can predict whether the next trading period will be, on average, higher or lower than the current one. Combinations of pre-processing and feature-extracting SOMs are investigated to determine the optimal system configuration.
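The SOM classification stage described in this abstract can be illustrated with a toy sketch. Everything below is assumed for illustration (NumPy, a 1-D map of four units, synthetic two-cluster feature vectors standing in for pre-processed exchange data); the paper's actual architecture and features are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_units=4, epochs=50, lr=0.5, radius=1.0):
    """Minimal 1-D Self-Organising Map: clusters inputs onto a line of units."""
    weights = rng.standard_normal((n_units, data.shape[1]))
    for _ in range(epochs):
        for x in data:
            # Best matching unit: the weight vector closest to this input
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            # Neighbourhood function over grid distance from the BMU
            grid_dist = np.abs(np.arange(n_units) - bmu)
            h = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))
            # Pull the BMU (and, more weakly, its neighbours) toward the input
            weights += lr * h[:, None] * (x - weights)
    return weights

# Hypothetical pre-processed features: two tight clusters in 2-D
data = np.vstack([rng.normal(-1, 0.1, (20, 2)),
                  rng.normal(+1, 0.1, (20, 2))])
weights = train_som(data)
```

After training, inputs from the two clusters map to different units, which is how a SOM can compress raw feature vectors into a small set of class labels before they reach the prediction network.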