
    VLSI ARCHITECTURES FOR VIDEO PROCESSING AND RISC-V

    The abstract is provided in the attachment.

    Ontology and Formal Semantics - Integration Overdue

    In this note we suggest that the difficulties encountered in natural language semantics are, for the most part, due to the use of mere symbol manipulation systems that are devoid of any content. In such systems there is hardly any link with our common-sense view of the world, and it is quite difficult to envision how one could formally account for the considerable amount of content that is often implicit, but almost never explicitly stated, in our everyday discourse. The solution, in our opinion, is a compositional semantics grounded in an ontology that reflects our commonsense view of the world and the way we talk about it in ordinary language. In the compositional logic we envision there are ontological (or first-intension) concepts and logical (or second-intension) concepts, where the ontological concepts include not only Davidsonian events but other abstract objects as well (e.g., states, processes, properties, activities, attributes, etc.). It is demonstrated here that in such a framework a number of challenges in the semantics of natural language (e.g., metonymy, intensionality, metaphor, etc.) can be properly and uniformly addressed.

    A Note on Ontology and Ordinary Language

    We argue for a compositional semantics grounded in a strongly typed ontology that reflects our commonsense view of the world and the way we talk about it. Assuming such a structure, we show that the semantics of various natural language phenomena may become nearly trivial.

    Logical Semantics and Commonsense Knowledge: Where Did We Go Wrong, and How to Go Forward, Again

    We argue that logical semantics might have faltered due to its failure to distinguish between two fundamentally very different types of concepts: ontological concepts, which should be types in a strongly-typed ontology, and logical concepts, which are predicates corresponding to properties of, and relations between, objects of various ontological types. We then show that accounting for these differences amounts to the integration of lexical and compositional semantics in one coherent framework, and to an embedding in our logical semantics of a strongly-typed ontology that reflects our commonsense view of the world and the way we talk about it in ordinary language. We show that in such a framework a number of challenges in natural language semantics can be adequately and systematically treated.
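    To make the proposed division concrete, the following is a minimal Python sketch, not the authors' formalism, in which ontological concepts are rendered as types in a small hierarchy and logical concepts as predicates whose arguments must carry those types; all class and function names are illustrative assumptions.

        # Ontological (first-intension) concepts: types in a small hierarchy.
        class Thing: pass
        class Event(Thing): pass        # Davidsonian events
        class State(Thing): pass
        class Process(Thing): pass
        class Human(Thing): pass
        class Artifact(Thing): pass
        class Book(Artifact): pass

        def typed_predicate(*arg_types):
            """Logical (second-intension) concept: a relation whose arguments
            must be objects of the declared ontological types."""
            def wrap(fn):
                def checked(*args):
                    for a, t in zip(args, arg_types):
                        if not isinstance(a, t):
                            raise TypeError(
                                f"{fn.__name__} expects {t.__name__}, got {type(a).__name__}")
                    return fn(*args)
                return checked
            return wrap

        @typed_predicate(Human, Book)
        def wrote(agent, work):
            # Truth-conditional content would go here; the point is that the
            # predicate is only applicable to well-typed arguments.
            return True

        wrote(Human(), Book())      # well-typed
        # wrote(Book(), Human())    # rejected as a type error, not a false sentence

    Under this reading, an ill-typed combination is ruled out at the level of the ontology rather than evaluated as a false proposition, which is the kind of uniform treatment the abstract envisions.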

    Interior Collective Optimum in a Voluntary Contribution to a Public-Goods Game: An Experimental Approach

    We run a public good experiment with four different treatments. The payoff function is chosen such that the Nash equilibrium (NE) and the collective optimum (CO) are both in the interior of the strategy space. We test the effect of varying the level of the collective optimum, which changes the "social dilemma" involved in the decision of how much to contribute to the public good. Our results show that contributions increase with the level of the interior CO. There is overcontribution in comparison to the NE and undercontribution in comparison to the CO, and contributions fall further short of the CO as its level rises. An overcontribution index that measures the effective contribution relative to both the NE and the CO shows that subjects behave consistently across treatments: they contribute a roughly constant share of the CO.
    Keywords: Public Goods, Experiments, Interior Solutions, Social Dilemma
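    As a hedged numerical sketch of such a design, the Python snippet below assumes a quadratic payoff that yields an interior NE and CO, and a linear overcontribution index normalized to 0 at the NE and 1 at the CO; the paper's actual payoff function, parameters, and index definition are not given here, so every value is purely illustrative.

        def payoff(c_i, others_sum, e=20.0, beta=1.5, gamma=0.5):
            """Assumed form: pi_i = (e - c_i) + beta*(c_i + others) - (gamma/2)*c_i**2."""
            return (e - c_i) + beta * (c_i + others_sum) - 0.5 * gamma * c_i ** 2

        def nash_contribution(beta=1.5, gamma=0.5):
            # Individual first-order condition: -1 + beta - gamma*c = 0
            return (beta - 1.0) / gamma

        def collective_optimum(n=4, beta=1.5, gamma=0.5):
            # Group first-order condition: -1 + n*beta - gamma*c = 0
            return (n * beta - 1.0) / gamma

        def overcontribution_index(c, ne, co):
            """0 at the Nash equilibrium, 1 at the collective optimum."""
            return (c - ne) / (co - ne)

        ne, co = nash_contribution(), collective_optimum()
        print(ne, co, overcontribution_index(4.0, ne, co))   # 1.0, 10.0, 0.333...

    Raising beta raises the CO relative to the NE, which is the kind of treatment variation the abstract describes.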

    The effect of data preprocessing on the performance of artificial neural networks techniques for classification problems

    The artificial neural network (ANN) has recently been applied in many areas, such as medicine, biology, finance, economics, and engineering. It is known as an excellent classifier of nonlinear input and output numerical data. Improving the training efficiency of ANN-based algorithms is an active area of research, and numerous papers on it have appeared in the literature. The performance of a Multi-layer Perceptron (MLP) trained with the back-propagation artificial neural network (BP-ANN) method is highly influenced by the size of the data sets and the data-preprocessing techniques used. This work analyzes the advantages of preprocessing data sets with different techniques in order to improve ANN convergence. Specifically, the Min-Max, Z-Score, and Decimal Scaling normalization preprocessing techniques were evaluated. The simulation results showed that the computational efficiency of the ANN training process is greatly enhanced when coupled with these preprocessing techniques.
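    For reference, here is a short NumPy sketch of the three normalization techniques named in the abstract; the paper's own data sets and ANN configuration are not reproduced, and the toy matrix below is only illustrative.

        import numpy as np

        def min_max(x, lo=0.0, hi=1.0):
            """Rescale each feature linearly into [lo, hi]."""
            xmin, xmax = x.min(axis=0), x.max(axis=0)
            return lo + (x - xmin) * (hi - lo) / (xmax - xmin)

        def z_score(x):
            """Center each feature at 0 with unit standard deviation."""
            return (x - x.mean(axis=0)) / x.std(axis=0)

        def decimal_scaling(x):
            """Divide each feature by 10**j, the smallest power of ten
            that brings all absolute values below 1."""
            j = np.ceil(np.log10(np.abs(x).max(axis=0) + 1e-12))
            return x / (10.0 ** j)

        X = np.array([[200.0, 3.0], [450.0, 7.0], [120.0, 1.0]])
        for f in (min_max, z_score, decimal_scaling):
            print(f.__name__, f(X), sep="\n")

    Each function maps raw feature values into a comparable numeric range before training, which is the mechanism the abstract credits for the improved convergence.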