Default reasoning using maximum entropy and variable strength defaults
PhD thesis. The thesis presents a computational model for reasoning with partial information
which uses default rules or information about what normally happens. The idea is
to provide a means of filling the gaps in an incomplete world view with the most
plausible assumptions while allowing for the retraction of conclusions should they
subsequently turn out to be incorrect. The model can be used both to reason from
a given knowledge base of default rules, and to aid in the construction of such
knowledge bases by allowing their designer to compare the consequences of the
design with their own default assumptions. The conclusions supported by the proposed
model are justified by the use of a probabilistic semantics for default rules
in conjunction with the application of a rational means of inference from incomplete
knowledge: the principle of maximum entropy (ME). The thesis develops
both the theory and algorithms for the ME approach and argues that it should be
considered as a general theory of default reasoning.
The argument supporting the thesis has two main threads. Firstly, the ME approach
is tested on the benchmark examples required of nonmonotonic behaviour,
and it is found to handle them appropriately. Moreover, these patterns of commonsense
reasoning emerge as consequences of the chosen semantics rather than
being design features. It is argued that this makes the ME approach more objective,
and its conclusions more justifiable, than other default systems. Secondly, the
ME approach is compared with two existing systems: the lexicographic approach
(LEX) and system Z+. It is shown that the former can be equated with ME under
suitable conditions, making it strictly less expressive, while the latter is too crude to
perform the subtle resolution of default conflicts that the ME approach allows. Finally,
a program called DRS is described which implements all systems discussed
in the thesis and provides a tool for testing their behaviours.
Engineering and Physical Sciences Research Council (EPSRC)
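The ME principle invoked by the thesis can be illustrated in isolation: among all distributions satisfying a given expectation constraint, the maximum-entropy one takes an exponential form, with the multiplier fixed by the constraint. A minimal sketch under that assumption, for a die constrained to a given mean (the function name and bisection solver are illustrative, not the thesis's DRS implementation):

```python
import math

def max_entropy_die(target_mean, lo=-10.0, hi=10.0, iters=200):
    """Maximum-entropy distribution over faces 1..6 with a fixed mean.
    The ME solution has the form p_i proportional to exp(lam * i);
    we solve for lam by bisection, since the mean is increasing in lam."""
    faces = range(1, 7)

    def mean_for(lam):
        w = [math.exp(lam * i) for i in faces]
        return sum(i * wi for i, wi in zip(faces, w)) / sum(w)

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

# Constrain the mean to 4.5; ME then tilts probability toward high faces.
p = max_entropy_die(4.5)
```

With no constraint beyond normalization, the same machinery returns the uniform distribution, which is the intuition behind ME as "the least committal" way to fill gaps in incomplete knowledge.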
Quasar Tomography: Unification of Echo Mapping and Photoionisation Models
Reverberation mapping uses time-delayed variations in photoionised emission
lines to map the geometry and kinematics of emission-line gas in active
galactic nuclei. In previous work, the light travel time delay
tau=R(1+cos(theta))/c and Doppler shift v give a 2-d map Psi(tau,v) for each
emission line. Here we combine the velocity-delay information with
photoionisation physics in a maximum entropy fit to the full reverberating
spectrum F_lam(lam,t) to recover a 5-d map of the differential covering
fraction f(R,theta,n,N,v), with n and N the density and column density of the
gas clouds. We test the method for a variety of geometries (shells, rings,
disks, clouds, jets) by recovering a 3-d map f(R,theta,n) from reverberations
in 7 uv emission lines. The best test recovers a hollow shell geometry,
defining R to 0.15 dex, n to 0.3 dex, and ionisation parameter U ~ 1/(n R^2) to
0.1 dex. The results are sensitive to the adopted distance and luminosity,
suggesting that these parameters may be measurable as well.
Comment: Accepted 4 Sep 2002 for publication in MNRAS
Perceptually-Driven Video Coding with the Daala Video Codec
The Daala project is developing a royalty-free video codec that attempts to
compete with the best patent-encumbered codecs. Part of our strategy is to
replace core tools of traditional video codecs with alternative approaches,
many of them designed to take perceptual aspects into account rather than
optimizing for simple metrics like PSNR. This paper documents some of our
experiences with these tools: which ones worked and which did not. We evaluate which tools are
easy to integrate into a more traditional codec design, and show results in the
context of the codec being developed by the Alliance for Open Media.
Comment: 19 pages, Proceedings of SPIE Workshop on Applications of Digital
Image Processing (ADIP), 201
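For reference, the "simple metric" the abstract contrasts with perceptual tools is just 10 log10(peak^2 / MSE) in decibels. A minimal sketch over flat sample sequences (function name illustrative):

```python
import math

def psnr(original, distorted, peak=255):
    """Peak signal-to-noise ratio in dB between two equal-length
    sample sequences (e.g. 8-bit pixel values): 10*log10(peak^2/MSE).
    Identical inputs have zero MSE, so PSNR is infinite."""
    assert len(original) == len(distorted)
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(peak ** 2 / mse)
```

Because PSNR penalizes every squared pixel difference equally, it can prefer a blurry reconstruction over a perceptually sharper one, which is the motivation for the perceptually driven tools the paper describes.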
Estimation of Default Probabilities with Support Vector Machines
Predicting default probabilities is important for firms and banks to operate successfully and to estimate their specific risks. There are many reasons to use nonlinear techniques for predicting bankruptcy from financial ratios. Here we propose the so-called Support Vector Machine (SVM) to estimate default probabilities of German firms. Our analysis is based on the Creditreform database. The results reveal that the eight most important predictors of bankruptcy for these German firms belong to the ratios of activity, profitability, liquidity, leverage and the percentage of incremental inventories. Based on the performance measures, the SVM tool can predict a firm's default risk and identify insolvent firms more accurately than the benchmark logit model. The sensitivity investigation and a corresponding visualization tool reveal that the classifying ability of the SVM appears to be superior over a wide range of the SVM parameters. Based on the nonparametric Nadaraya-Watson estimator, the expected returns predicted by the SVM for regression have a significant positive linear relationship with the risk scores obtained for classification. This evidence is stronger than empirical results for the CAPM based on a linear regression and confirms that higher risks need to be compensated by higher potential returns.
Support Vector Machine, Bankruptcy, Default Probabilities Prediction, Expected Profitability, CAPM.
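The Nadaraya-Watson estimator used above to relate expected returns to risk scores is a kernel-weighted average of the observed responses. A minimal sketch with a Gaussian kernel (illustrative only, not the authors' code; the bandwidth is a hypothetical choice):

```python
import math

def nadaraya_watson(x_train, y_train, x, bandwidth=1.0):
    """Nadaraya-Watson kernel regression estimate at point x:
    a Gaussian-kernel weighted average of the training responses,
    with weights decaying in the distance |x - x_i| / bandwidth."""
    weights = [math.exp(-((x - xi) / bandwidth) ** 2 / 2) for xi in x_train]
    return sum(w * yi for w, yi in zip(weights, y_train)) / sum(weights)
```

The bandwidth governs the usual bias-variance trade-off: a small bandwidth tracks local structure in the risk-score/return relationship, a large one flattens the estimate toward the global mean.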