49 research outputs found

    Towards a verified transformation from AADL to the formal component-based language FIACRE

    Get PDF
Over the last decade, AADL has emerged as an architecture description language for modeling embedded systems. Several research projects have shown that AADL concepts are well suited to the design of embedded systems. Moreover, AADL has a precise execution model, which has proved to be a key feature for effective early analysis. In this paper, we are concerned with the foundational aspects of verification support for AADL. More precisely, we propose a verification toolchain for AADL models through their transformation to the Fiacre language, the pivot verification language of the TOPCASED project: high-level models can be transformed into Fiacre models and then model-checked. We then investigate how to prove the correctness of the transformation from AADL into Fiacre and present the elementary ingredients involved: the semantics of AADL and Fiacre subsets expressed in a common framework, namely timed transition systems. We also briefly discuss experimental validation of the work.
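The common semantic framework the abstract mentions, timed transition systems, can be sketched minimally as transitions carrying an earliest/latest firing interval. All names here are illustrative assumptions, not the paper's actual formalization:

```python
# Minimal sketch of a timed transition system (TTS): each transition
# carries a firing interval [lo, hi]. Illustrative only.
from collections import deque

class TTS:
    def __init__(self):
        # state -> list of (label, (lo, hi), next_state)
        self.transitions = {}

    def add(self, src, label, interval, dst):
        self.transitions.setdefault(src, []).append((label, interval, dst))

    def reachable(self, start):
        """Enumerate states reachable from `start` (timing ignored)."""
        seen, frontier = {start}, deque([start])
        while frontier:
            s = frontier.popleft()
            for _, _, dst in self.transitions.get(s, []):
                if dst not in seen:
                    seen.add(dst)
                    frontier.append(dst)
        return seen

tts = TTS()
tts.add("idle", "dispatch", (0, 0), "running")   # dispatch fires immediately
tts.add("running", "complete", (1, 5), "idle")   # completes within 1..5 time units
print(sorted(tts.reachable("idle")))  # ['idle', 'running']
```

A real model checker would explore the timed state space (state plus clock valuations) rather than the untimed reachability shown here.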

    Isomorphy and Syntax-Prosody Relations in English

    Get PDF
This dissertation investigates the precise degree to which prosody and syntax are related. One possibility is that the syntax-prosody mapping is one-to-one (“isomorphic”) at an underlying level (Chomsky & Halle 1968, Selkirk 1996, 2011, Ito & Mester 2009). This predicts that prosodic units should preferably match up with syntactic units. It is also possible that the mapping between these systems is entirely non-isomorphic, with prosody being influenced by factors from language perception and production (Wheeldon & Lahiri 1997, Lahiri & Plank 2010). In this work, I argue that both perspectives are needed in order to address the full range of phonological phenomena that have been identified in English and related languages, including word-initial lenition/flapping, word-initial segment deletion, and vowel reduction in function words, as well as patterns of pitch accent assignment, final-pronoun constructions, and the distribution of null complementizer allomorphs. In the process, I develop models for both isomorphic and non-isomorphic phrasing. The former is cast within a Minimalist syntactic framework of Merge/Label and Bare Phrase Structure (Chomsky 2013, 2015), while the latter is characterized by a stress-based algorithm for the formation of phonological domains, following Lahiri & Plank (2010).

    Linguistic (ir-)realities. A heuristic critique of the meta-theoretical foundations of generativism.

    Get PDF
The thesis aims to provide a heuristic critique of the meta-theoretical foundations of Chomsky's project for an explanatory linguistics. The critique is 'heuristic' in that it attempts to take the considerations adduced to indicate how those conceptual foundations are to be re-designed on lines parallel to constructivism in the philosophy of mathematics. The net result is the provision of an outline of a meta-theoretic rationale for a process-oriented linguistic theory (e.g. Kempson et al.'s LDSNL framework). The thesis investigates, and is organized around, three central strands of the Chomskyan paradigm: 1) The mathematization of linguistics: the use of formal/mathematical systems as theory-constitutive metaphors. 2) A scientific realist (as opposed to instrumentalist) construal of linguistic theories. 3) A conceptualist/psychologist ontology for linguistic objects, with a concomitant explanation of the nature of the linguistic in terms of properties of the modularized human "mind/brain" articulated through a system of mental representations. The central conclusions drawn are: 1) There is a failure to achieve adequate warrant for a scientific realist construal of Chomskyan linguistic theories. 2) The object(s) of study posited in the Chomskyan paradigm require a Platonist or autonomist ontological status. A corollary of this is the inability to achieve an adequate explanation of the nature of linguistic phenomena. These conclusions, together with the observation of certain conceptual tensions and antinomies in generativist thinking (e.g. the relation between types and tokens), are taken to be sufficient to prompt a re-examination of the (metaphysical realist) assumptions that underlie that thinking. The solution that is canvassed, and which promises to resolve these tensions, is a linguistic version of mathematical constructivism in which the emphasis lies on linguistic phenomena being construed as primarily cognitive events whose constructive procedures are crucially constitutive of the linguistically individuating properties.

    3-D Content-Based Retrieval and Classification with Applications to Museum Data

    Get PDF
An increasing number of multimedia collections are arising in areas that were once solely the domain of text and 2-D images. Richer types of multimedia such as audio, video, and 3-D objects are becoming more and more commonplace. However, current retrieval techniques in these areas are not as sophisticated as textual and 2-D image techniques, and in many cases rely upon textual searching through associated keywords. This thesis is concerned with the retrieval of 3-D objects and with the application of these techniques to the problem of 3-D object annotation. The majority of the work in this thesis has been driven by the European project SCULPTEUR. This thesis provides an in-depth analysis of a range of 3-D shape descriptors for their suitability for general-purpose and specific retrieval tasks, using a publicly available data set, the Princeton Shape Benchmark, and real-world museum objects, evaluated with a variety of performance metrics. This thesis also investigates the use of 3-D shape descriptors as inputs to popular classification algorithms; a novel classifier agent for the SCULPTEUR system is designed and developed and its performance analysed. Several techniques are investigated to improve individual classifier performance: one set of techniques combines several classifiers, whereas the other aims to find the optimal training parameters for a classifier. The final chapter of this thesis explores a possible application of these techniques to the problem of 3-D object annotation.
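The classifier-combination idea the abstract mentions can be illustrated with the simplest such scheme, majority voting over the labels returned by several classifiers. This is a generic sketch, not the thesis's actual combination method:

```python
# Majority-vote combination of classifier outputs (illustrative sketch).
from collections import Counter

def majority_vote(predictions):
    """Return the label predicted by the most classifiers."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical classifiers labelling one museum object:
print(majority_vote(["vase", "vase", "figurine"]))  # vase
```

More elaborate schemes weight each classifier's vote by its validation accuracy, but the combining principle is the same.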

    CyberResearch on the Ancient Near East and Eastern Mediterranean

    Get PDF
CyberResearch on the Ancient Near East and Neighboring Regions provides case studies on archaeology, objects, cuneiform texts, online publishing, digital archiving, and preservation. Eleven chapters present a rich array of material, spanning the fifth through the first millennium BCE, from Anatolia, the Levant, Mesopotamia, and Iran. Customized cyber- and general glossaries support readers who lack either a technical background or familiarity with the ancient cultures. Edited by Vanessa Bigot Juloux, Amy Rebecca Gansell, and Alessandro Di Ludovico, this volume is dedicated to broadening the understanding and accessibility of digital humanities tools, methodologies, and results within Ancient Near Eastern Studies. Ultimately, this book provides a model for introducing cyber-studies to the mainstream of humanities research.

    Semantic Domains in Akkadian Text

    Get PDF
The article examines the possibilities offered by language technology for analyzing semantic fields in Akkadian. The data for our research group comes from an existing electronic corpus, the Open Richly Annotated Cuneiform Corpus (Oracc). In addition to more traditional Assyriological methods, the article explores two language-technological methods: pointwise mutual information (PMI) and Word2vec.
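Of the two methods named, PMI is simple enough to sketch directly: it scores how much more often two words co-occur than chance would predict. The toy corpus and co-occurrence window (one line of text) below are assumptions for illustration, not the article's actual setup:

```python
# Pointwise mutual information over line-level co-occurrence (sketch).
import math
from collections import Counter
from itertools import combinations

def pmi(corpus, w1, w2):
    """PMI of w1 and w2 co-occurring in the same line: log2 p(w1,w2)/(p(w1)p(w2))."""
    n = len(corpus)
    uni, pair = Counter(), Counter()
    for line in corpus:
        words = set(line.split())          # each word counted once per line
        uni.update(words)
        pair.update(frozenset(p) for p in combinations(sorted(words), 2))
    p1, p2 = uni[w1] / n, uni[w2] / n
    p12 = pair[frozenset((w1, w2))] / n
    return math.log2(p12 / (p1 * p2)) if p12 else float("-inf")

# Hypothetical three-line corpus of transliterated tokens:
corpus = ["sharru dannu", "sharru rabu", "ilu rabu"]
print(round(pmi(corpus, "sharru", "dannu"), 3))  # 0.585
```

Positive PMI indicates the pair co-occurs more often than independence predicts; applied across a corpus like Oracc, such scores help cluster words into candidate semantic fields.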

    STRICT: a language and tool set for the design of very large scale integrated circuits

    Get PDF
An essential requirement for the design of large VLSI circuits is a design methodology which allows the designer to overcome the complexity and correctness issues associated with building such circuits. We propose that many of the problems of the design of large circuits can be solved by using a formal design notation based upon the functional programming paradigm, one that embodies design concepts used extensively as the framework for software construction. The design notation should permit parallel, sequential, and recursive decompositions of a design into smaller components, and it should allow large circuits to be constructed from simpler circuits that can be embedded in a design in a modular fashion. Consistency checking should be provided as early as possible in a design. Such a methodology would structure the design of a circuit in much the same way that procedures, classes, and control structures may be used to structure large software systems. However, such a design notation must be supported by tools which automatically check the consistency of the design if the methodology is to be practical. In principle, the methodology should impose constraints upon circuit design to reduce errors and provide 'correctness by construction'. It should be possible to generate efficient and correct circuits by providing a route to a large variety of design tools commonly found in design systems: simulators, automatic placement and routing tools, module generators, schematic capture tools, and formal verification and synthesis tools.
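The modular, functional style of circuit construction described above, building larger circuits by composing simpler ones, can be sketched in a few lines. This is a generic illustration of the idea, not STRICT's actual notation:

```python
# Functional circuit composition (sketch): gates are functions,
# larger circuits are built by composing smaller ones.
def and_gate(a, b): return a & b
def or_gate(a, b):  return a | b
def xor_gate(a, b): return a ^ b

def half_adder(a, b):
    """Compose two gates into a half adder: returns (sum, carry)."""
    return xor_gate(a, b), and_gate(a, b)

def full_adder(a, b, cin):
    """Compose two half adders and an OR gate into a full adder."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, or_gate(c1, c2)

print(full_adder(1, 1, 1))  # (1, 1)
```

A consistency-checking tool of the kind the thesis calls for would verify, at composition time, that each sub-circuit's interface matches where it is embedded, analogous to type checking in functional languages.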

    Proceedings of the 9th Dutch-Belgian Information Retrieval Workshop

    Get PDF