Higher-order Representation and Reasoning for Automated Ontology Evolution
Abstract: The GALILEO system aims at realising automated ontology evolution. This is necessary to enable intelligent agents to manipulate their own knowledge autonomously, and thus to reason and communicate effectively in open, dynamic digital environments characterised by the heterogeneity of data and of representation languages. Our approach is based on patterns of diagnosis of faults detected across multiple ontologies. Such patterns make it possible to identify the type of repair required when conflicting ontologies yield erroneous inferences. We assume that each ontology is locally consistent, i.e. inconsistency arises only across ontologies when they are merged together. Local consistency avoids the derivation of uninteresting theorems, so the formula for diagnosis can essentially be seen as an open theorem over the ontologies. The system's application domain is physics; we have adopted a modular formalisation of physics, structured by means of locales in Isabelle, to perform modular higher-order reasoning, and visualised by means of development graphs.
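The diagnosis pattern described in the abstract can be illustrated with a toy sketch (not the GALILEO system itself; the assertion encoding and names here are invented for illustration): each ontology is a set of assertions that is consistent on its own, and a fault only becomes detectable once the ontologies are merged.

```python
# Toy sketch (invented, not GALILEO): an "ontology" is a set of
# (entity, property, value) assertions.  It is locally consistent if no
# property of an entity takes two different values; inconsistency only
# arises across ontologies once they are merged.

def conflicts(assertions):
    """Return pairs of assertions that assign different values
    to the same (entity, property) key."""
    seen = {}
    found = []
    for e, p, v in assertions:
        key = (e, p)
        if key in seen and seen[key] != v:
            found.append(((e, p, seen[key]), (e, p, v)))
        seen.setdefault(key, v)
    return found

onto_a = {("light", "medium", "vacuum")}
onto_b = {("light", "medium", "glass")}

# each ontology is locally consistent ...
assert conflicts(onto_a) == [] and conflicts(onto_b) == []
# ... but the merged set exposes a fault whose pattern guides the repair
merged = onto_a | onto_b
print(conflicts(merged))
```

The conflicting pair returned by `conflicts` plays the role of the diagnosed fault: it pinpoints which assertions must be repaired, without requiring either ontology to be inconsistent in isolation.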
Data types
A mathematical interpretation is given to the notion of a data type. The main novelty is in the generality of the mathematical treatment, which allows procedural data types and circularly defined data types. What is meant by data type is pretty close to what any computer scientist would understand by this term, or by data structure, type, mode, cluster, or class. The mathematical treatment is the conjunction of the ideas of D. Scott on the solution of domain equations (Scott (71), (72) and (76)) and the initiality property noticed by the ADJ group (ADJ (75), ADJ (77)). The present work adds operations to the data types proposed by Scott and generalizes the data types of ADJ to procedural types and arbitrary circular type definitions. The advantages of a mathematical interpretation of data types are those of mathematical semantics in general: throwing light on some ill-understood constructs in high-level programming languages, easing the task of writing correct programs, and making possible proofs of correctness for programs or implementations.
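The ADJ-style initiality property mentioned above can be sketched concretely for a circularly defined type. In the toy Python sketch below (the encoding is illustrative, not from the paper), the type List = Nil | Cons(item, List) is defined circularly, and the fold is the unique homomorphism out of the initial algebra: choosing an interpretation for Nil and Cons determines exactly one function on all lists.

```python
# Toy sketch of "data type = initial algebra": the circular type
#   List = Nil | Cons(item, List)
# and its fold (catamorphism), which replaces the constructors Nil and
# Cons by any chosen interpretation.

class Nil:
    pass

class Cons:
    def __init__(self, head, tail):
        self.head, self.tail = head, tail

def fold(nil_case, cons_case, xs):
    """The unique homomorphism determined by an interpretation of
    Nil (a value) and Cons (a binary function)."""
    if isinstance(xs, Nil):
        return nil_case
    return cons_case(xs.head, fold(nil_case, cons_case, xs.tail))

xs = Cons(1, Cons(2, Cons(3, Nil())))
length = fold(0, lambda _, n: 1 + n, xs)  # interpret Cons as "add one"
total  = fold(0, lambda h, n: h + n, xs)  # interpret Cons as addition
assert (length, total) == (3, 6)
```

Different choices of `nil_case` and `cons_case` yield length, sum, map, and so on, each as the single function the initiality property guarantees for that choice of operations.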
Disentangling Treatment Effects of Polish Active Labor Market Policies: Evidence from Matched Samples
This paper estimates causal effects of two Polish active labor market policies - Training and Intervention Works - on employment probabilities. Using data from the 18th wave of the Polish Labor Force Survey, we discuss three stages of an appropriately designed matching procedure and demonstrate how the method succeeds in balancing relevant covariates. The validity of this approach is illustrated using the estimated propensity score as a summary measure of balance. We implement a conditional difference-in-differences estimator of treatment effects based on individual trinomial sequences of pre-treatment labor market status. Our findings suggest that Training raises employment probability, while Intervention Works seems to lead to a negative treatment effect for men. Furthermore, we find that appropriate subdivision of the matched sample for conditional treatment effect estimation can add considerable insight to the interpretation of results.
http://deepblue.lib.umich.edu/bitstream/2027.42/39831/3/wp447.pd
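The matching-plus-difference-in-differences idea can be sketched in miniature (this is an invented toy, not the paper's estimator or data): each treated unit is matched to the control with the nearest propensity-type score, and pre/post outcome changes are then compared within the matched pairs.

```python
# Toy sketch: nearest-neighbour matching on a scalar score, followed by a
# difference-in-differences comparison of pre- and post-programme outcomes
# within matched pairs.  Field names and numbers are invented.

def did_matched(treated, controls):
    """Match each treated unit to the control with the closest score and
    average the (post - pre) differences between matched pairs."""
    effects = []
    for t in treated:
        c = min(controls, key=lambda c: abs(c["score"] - t["score"]))
        effects.append((t["post"] - t["pre"]) - (c["post"] - c["pre"]))
    return sum(effects) / len(effects)

treated  = [{"score": 0.61, "pre": 0, "post": 1},
            {"score": 0.35, "pre": 0, "post": 1}]
controls = [{"score": 0.60, "pre": 0, "post": 0},
            {"score": 0.30, "pre": 0, "post": 1}]

print(did_matched(treated, controls))  # average effect on the treated
```

Matching on the score balances covariates between the groups, and differencing the pre/post changes removes time-invariant individual differences, which is the logic behind the conditional difference-in-differences estimator the abstract describes.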
Edge Enhancement Investigations by Means of Experiments and Simulations
Standard neutron imaging procedures are based on the “shadow” of the transmitted radiation, attenuated by the sample material. Under certain conditions, significant deviations from pure transmission can be found in the form of enhancement or depression at the edges of the samples. These effects can limit the quantification process in the affected regions; on the other hand, the enhancement can improve visibility, e.g. in defect analysis. In systematic studies we investigated the dependence of these effects on the specific material (mainly common metals) and on parameters such as the sample-to-detector distance, the beam collimation, the material thickness and the neutron energy. The beam lines ICON and BOA at PSI and ANTARES at TU München were used for these experiments because of their capability for neutron imaging with the highest possible spatial resolution (pixel sizes of 6.5 to 13.5 micrometres) and their cold beam spectrum. Alongside the experimental data, we used a McStas tool to describe refraction and reflection features at edges for comparison. Although minor contributions from coherent in-line propagation phase contrast are present, the major effect can be described by refraction of the neutrons at the sample-void interface. Ways to suppress and to magnify the edge effects can be derived from these findings.
Fil: Lehmann, E. Paul Scherrer Institut; Switzerland. Fil: Schulz, M. Technische Universitat Munchen; Germany. Fil: Wang, Y. China Institute of Atomic Energy; China. Fil: Tartaglione, Aureliano. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina.
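The refraction explanation can be checked with a back-of-the-envelope calculation (the numbers below are illustrative assumptions, not measured values from the experiments): the neutron refractive index of matter is n = 1 - δ with δ on the order of 10⁻⁶, so only rays grazing the sample-void interface are deflected enough to shift by roughly a detector pixel at these resolutions.

```python
# Back-of-the-envelope sketch: deflection of a neutron ray grazing a
# sample-void interface, via Snell's law.  delta, the grazing angle and
# the distance are assumed illustrative values, not measured ones.
import math

delta = 1e-6   # refractive index decrement, n = 1 - delta
alpha = 2e-3   # grazing angle of the ray to the edge (rad)
L = 0.02       # sample-to-detector distance (m)

# Snell's law at grazing incidence: cos(alpha) = n * cos(alpha_out);
# valid for alpha above the total-reflection angle sqrt(2 * delta)
alpha_out = math.acos(math.cos(alpha) / (1 - delta))
shift = (alpha - alpha_out) * L  # lateral displacement at the detector

print(f"deflection {alpha - alpha_out:.2e} rad, shift {shift * 1e6:.1f} um")
```

With these assumed values the lateral shift comes out on the order of ten micrometres, comparable to the 6.5 to 13.5 micrometre pixel sizes quoted above, which is consistent with refraction at the edge producing a visible enhancement or depression.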
Diversification and the Optimal Construction of Basis Portfolios
Nontrivial diversification possibilities arise when a factor model describes security returns. In this paper, we provide a comprehensive examination of the merits of various strategies for constructing basis portfolios that are, in principle, highly correlated with the common factors underlying security returns. Three main conclusions emerge from our study. First, increasing the number of securities included in the analysis dramatically improves basis portfolio performance. Our results indicate that factor models involving 750 securities provide markedly superior performance to those involving 30 or 250 securities. Second, comparatively efficient estimation procedures such as maximum likelihood and restricted maximum likelihood factor analysis (which imposes the APT mean restriction) significantly outperform the less efficient instrumental variables and principal components procedures that have been proposed in the literature. Third, a variant of the usual Fama-MacBeth portfolio formation procedure, which we call the minimum idiosyncratic risk portfolio formation procedure, outperformed the Fama-MacBeth procedure and proved equal to or better than more expensive quadratic programming procedures.
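The role of diversification can be illustrated with a simulated one-factor model (an invented toy, not the paper's estimation procedures): an equal-weighted portfolio averages away idiosyncratic noise, so its correlation with the common factor rises with the number of securities included, echoing the 30-versus-750 comparison.

```python
# Toy simulation: in a one-factor model r_i = f + e_i, an equal-weighted
# portfolio of n securities has idiosyncratic risk shrinking like
# 1/sqrt(n), so its correlation with the factor f grows with n.
import random

random.seed(0)
T = 200
f = [random.gauss(0, 1) for _ in range(T)]  # common factor realisations

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def basis_portfolio_corr(n_securities):
    # each security: unit factor loading plus idiosyncratic noise
    returns = [[f[t] + random.gauss(0, 2) for t in range(T)]
               for _ in range(n_securities)]
    port = [sum(r[t] for r in returns) / n_securities for t in range(T)]
    return corr(port, f)

print(basis_portfolio_corr(30), basis_portfolio_corr(750))
```

The 750-security portfolio tracks the factor far more closely than the 30-security one, which is the diversification channel behind the first conclusion of the paper; the paper's actual procedures additionally weight securities using estimated factor loadings rather than equally.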
Mutual Fund Performance Evaluation: A Comparison of Benchmarks and Benchmark Comparisons
Our primary goal in this paper is to ascertain whether the absolute and relative rankings of managed funds are sensitive to the benchmark chosen to measure normal performance. We employ the standard CAPM benchmarks and a variety of APT benchmarks to investigate this question. We found that there is little similarity between the absolute and relative mutual fund rankings obtained from alternative benchmarks, which suggests the importance of knowing the appropriate model for risk and expected return in this context. In addition, the rankings are quite sensitive to the method used to construct the APT benchmark. One would reach very different conclusions about the funds' performance using smaller numbers of securities in the analysis, or the less efficient methods for estimating the necessary factor models, than one would arrive at using the maximum likelihood procedures with 750 securities. We did, however, find that the rankings of the funds are not very sensitive to the exact number of common sources of systematic risk that are assumed to impinge on security returns. Finally, we found statistically significant measured abnormal performance using all the benchmarks. The economic explanation of this phenomenon appears to be an open question.
The Empirical Foundations of the Arbitrage Pricing Theory I: The Empirical Tests
This paper provides a detailed and extensive examination of the validity of the APT based on maximum likelihood factor analysis of large cross-sections of securities. Our empirical implementation of the theory proved incapable of explaining expected returns on portfolios composed of securities with different market capitalizations, although it provided an adequate account of the expected returns of portfolios formed on the basis of dividend yield and own variance, where risk adjustment with the CAPM employing the usual market proxies failed. In addition, it appears that the zero-beta version of the APT is sharply rejected in favor of the riskless rate model, and that there is little basis for discriminating between the five- and ten-factor versions of the theory.