Ontology and Formal Semantics - Integration Overdue
In this note we suggest that difficulties encountered in natural language semantics are, for the most part, due to the use of mere symbol manipulation systems that are devoid of any content. In such systems there is hardly any link with our commonsense view of the world, and it is quite difficult to envision how one can formally account for the considerable amount of content that is often implicit, but almost never explicitly stated, in our everyday discourse.
The solution, in our opinion, is a compositional semantics grounded in an ontology that reflects our commonsense view of the world and the way we talk about it in ordinary language. In the compositional logic we envision there are ontological (or first-intension) concepts and logical (or second-intension) concepts, where the ontological concepts include not only Davidsonian events but other abstract objects as well (e.g., states, processes, properties, activities, and attributes).
It will be demonstrated here that in such a framework a number of challenges in the semantics of natural language (e.g., metonymy, intensionality, and metaphor) can be properly and uniformly addressed.
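The abstracts above describe, but do not implement, a strongly typed ontology of first-intension concepts. As a purely illustrative sketch (all class and function names below are hypothetical and not taken from the papers), such a type system might be encoded as follows, with type mismatches flagging places where implicit content such as metonymy needs to be resolved:

```python
# Illustrative sketch only: a minimal strongly typed ontology in the spirit of
# the framework described above. The papers do not specify an implementation;
# every name here is a hypothetical example.

class Thing:                        # root of the ontological (first-intension) hierarchy
    pass

class AbstractObject(Thing): pass
class Event(AbstractObject): pass   # Davidsonian events
class State(AbstractObject): pass
class Process(AbstractObject): pass
class Property(AbstractObject): pass

class PhysicalObject(Thing): pass
class Artifact(PhysicalObject): pass

class Book(Artifact): pass          # "book" as a physical artifact
class Reading(Event): pass          # the event of reading


def applies(predicate_domain: type, argument) -> bool:
    """Logical (second-intension) predicates compose only with arguments of the
    ontological type they select for; a mismatch signals that some implicit
    content (e.g., an elided event, as in metonymy) must be recovered."""
    return isinstance(argument, predicate_domain)


# "The reading lasted an hour" -- 'lasted' selects for an Event: well-typed.
print(applies(Event, Reading()))    # True

# "The book lasted an hour" -- 'lasted' applied to a physical object is
# ill-typed, hinting at an implicit event (e.g., reading the book).
print(applies(Event, Book()))       # False
```

On this reading, the point of the sketch is only that, once a typed ontology is in place, phenomena like metonymy surface as type mismatches to be repaired rather than as unanalyzable exceptions.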
A Note on Ontology and Ordinary Language
We argue for a compositional semantics grounded in a strongly typed ontology that reflects our commonsense view of the world and the way we talk about it. Assuming such a structure, we show that the semantics of various natural language phenomena may become nearly trivial.
Language, logic and ontology: uncovering the structure of commonsense knowledge
The purpose of this paper is twofold: (i) we argue that the structure of commonsense knowledge must be discovered, rather than invented; and (ii) we argue that natural language, which is the best known theory of our (shared) commonsense knowledge, should itself be used as a guide to discovering the structure of commonsense knowledge. In addition to suggesting a systematic method for the discovery of the structure of commonsense knowledge, the method we propose seems also to provide an explanation for a number of phenomena in natural language, such as metaphor, intensionality, and the semantics of nominal compounds. Admittedly, our ultimate goal is quite ambitious: it is no less than the systematic ‘discovery’ of a well-typed ontology of commonsense knowledge, and the subsequent formulation of the long-awaited goal of a meaning algebra.
The effect of data preprocessing on the performance of artificial neural networks techniques for classification problems
The artificial neural network (ANN) has recently been applied in many areas, such as medicine, biology, finance, economics, and engineering. It is known as an excellent classifier of nonlinear input and output numerical data. Improving the training efficiency of ANN-based algorithms is an active area of research, and numerous papers on the topic have appeared in the literature. The performance of a Multi-Layer Perceptron (MLP) trained with the back-propagation artificial neural network (BP-ANN) method is highly influenced by the size of the data sets and the data-preprocessing techniques used. This work analyzes the advantages of preprocessing data sets with different techniques in order to improve ANN convergence. Specifically, the Min-Max, Z-Score, and Decimal Scaling normalization preprocessing techniques were evaluated. The simulation results showed that the computational efficiency of the ANN training process is greatly enhanced when coupled with these preprocessing techniques.
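The abstract names the three normalization techniques but gives no formulas. A minimal sketch of how each rescales a feature column (function names and the zero-value guards are illustrative additions, not taken from the paper) could look like this:

```python
import numpy as np

# Minimal sketches of the three normalization techniques named above.
# All function names and guard clauses are illustrative additions.

def min_max(x: np.ndarray) -> np.ndarray:
    """Rescale values linearly into the range [0, 1]."""
    span = x.max() - x.min()
    if span == 0:
        return np.zeros_like(x, dtype=float)
    return (x - x.min()) / span

def z_score(x: np.ndarray) -> np.ndarray:
    """Center values on the mean and scale by the standard deviation."""
    std = x.std()
    if std == 0:
        return np.zeros_like(x, dtype=float)
    return (x - x.mean()) / std

def decimal_scaling(x: np.ndarray) -> np.ndarray:
    """Divide by the smallest power of ten that brings every |value| below 1."""
    max_abs = np.abs(x).max()
    if max_abs == 0:
        return x.astype(float)
    j = int(np.floor(np.log10(max_abs))) + 1
    return x / (10.0 ** j)

feature = np.array([120.0, 45.0, 300.0, 7.5])
print(min_max(feature))          # values in [0, 1]
print(z_score(feature))          # mean ~0, unit standard deviation
print(decimal_scaling(feature))  # [0.12, 0.045, 0.3, 0.0075]
```

In short, Min-Max maps values into [0, 1], Z-Score centers them at zero with unit variance, and Decimal Scaling divides by the smallest power of ten that brings all magnitudes below one.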