32 research outputs found

    OM-2017: Proceedings of the Twelfth International Workshop on Ontology Matching

    Ontology matching is a key interoperability enabler for the semantic web, as well as a useful tactic in some classical data integration tasks dealing with the semantic heterogeneity problem. It takes ontologies as input and determines as output an alignment, that is, a set of correspondences between the semantically related entities of those ontologies. These correspondences can be used for various tasks, such as ontology merging, data translation, query answering or navigation on the web of data. Thus, matching ontologies enables the knowledge and data expressed with the matched ontologies to interoperate.
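
    An alignment of this kind can be pictured with a small sketch. The entity names, relation symbols, confidences and the `translate` helper below are invented for illustration; real matchers emit URIs and richer relation types.

```python
# A minimal sketch of an alignment as a set of correspondences.
# Entity names, relations and confidences here are invented examples.
alignment = {
    ("schema1:Author", "schema2:Writer", "=", 0.92),
    ("schema1:Paper", "schema2:Article", "=", 0.88),
    ("schema1:Venue", "schema2:Event", "<", 0.61),
}

def translate(entity, alignment, threshold=0.8):
    """Rewrite a source-ontology entity into the target ontology,
    keeping only equivalence correspondences above the threshold."""
    for src, tgt, rel, conf in alignment:
        if src == entity and rel == "=" and conf >= threshold:
            return tgt
    return None  # no sufficiently confident match

print(translate("schema1:Author", alignment))  # schema2:Writer
```

    Such a table of correspondences is exactly what downstream tasks like data translation or query answering consume.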

    TOWARDS BUILDING AN INTELLIGENT INTEGRATED MULTI-MODE TIME DIARY SURVEY FRAMEWORK

    Enabling true responses is an important characteristic of surveys, where the responses are free from bias and satisficing. In this thesis, we examine the current state of surveys, briefly touching upon questionnaire surveys and then time diary surveys (TDS). TDS are open-ended conversational surveys of a free-form nature, with both the interviewer and the respondent playing a part in their progress and successful completion. With limited research available on how intelligent and assistive components can affect TDS respondents, we explore ways in which intelligent systems such as Computer Adaptive Testing, Intelligent Tutoring Systems, Recommender Systems, and Decision Support Systems can be leveraged for use in TDS. The motivation for this work comes from realizing the opportunity that an enhanced web-based instrument offers the survey domain: uniting the various facets of web-based surveys to create an intelligent integrated multi-mode TDS framework. We envision the framework providing all the advantages of both web-based surveys and interviewer-assisted surveys. The two primary challenges are determining what data is to be used by the system and how to interact with the user, specifically integrating the (1) interviewer-assisted mode and (2) self-administered mode. Our proposed solution, the intelligent integrated multi-mode framework, is essentially the solution to a set of modeling problems, and we propose two sets of overarching mechanisms: (1) Knowledge Engineering Mechanisms (KEM) and (2) Interaction Mechanisms (IxM), where KEM serves the purpose of understanding what data can be created, used and stored, while IxM deals with interacting with the user. We build and study a prototype instrument in the interviewer-assisted mode based on the framework. We determine that the instrument improves the interview process as intended, increases the quality of the response data, and is able to assist the interviewer.
    We also observe that the framework's mechanisms contribute towards reducing interviewers' cognitive load, data entry times and interview time by predicting the next activity. Advisor: Leenkiat So

    Technology 2003: The Fourth National Technology Transfer Conference and Exposition, volume 2

    Proceedings from symposia of the Technology 2003 Conference and Exposition, Dec. 7-9, 1993, Anaheim, CA, are presented. Volume 2 features papers on artificial intelligence, CAD&E, computer hardware, computer software, information management, photonics, robotics, test and measurement, video and imaging, and virtual reality/simulation.

    A Systematic Literature Review of Drone Utility in Railway Condition Monitoring

    Raj Bridgelall is the program director for the Upper Great Plains Transportation Institute (UGPTI) Center for Surface Mobility Applications & Real-time Simulation environments (SMARTSeSM).
    Drones have recently become a new tool in railway inspection and monitoring (RIM) worldwide, but there is still a lack of information about the specific benefits and costs. This study conducts a systematic literature review (SLR) of the applications, opportunities, and challenges of using drones for RIM. The SLR technique yielded 47 articles filtered from 7,900 publications from 2014 to 2022. The SLR found that key motivations for using drones in RIM are to reduce costs, improve safety, save time, improve mobility, increase flexibility, and enhance reliability. Nearly all the applications fit into the categories of defect identification, situation assessment, rail network mapping, infrastructure asset monitoring, track condition monitoring, and obstruction detection. The authors assessed the open technical, safety, and regulatory challenges. The authors also contributed a cost analysis framework, identified factors that affect drone performance in RIM, and offered implications for new theories, management, and impacts to society.
    The authors conducted this work with support from North Dakota State University and the Mountain-Plains Consortium, a University Transportation Center funded by the U.S. Department of Transportation.
    https://www.ugpti.org/about/staff/viewbio.php?id=7

    3D Object Recognition Based On Constrained 2D Views

    The aim of the present work was to build a novel 3D object recognition system capable of classifying man-made and natural objects based on single 2D views. The approach to this problem has been one motivated by recent theories on biological vision and multiresolution analysis. The project's objectives were the implementation of a system that is able to deal with simple 3D scenes and constitutes an engineering solution to the problem of 3D object recognition, allowing the proposed recognition system to operate in a practically acceptable time frame. The developed system takes further the work on automatic classification of marine phytoplanktons, carried out at the Centre for Intelligent Systems, University of Plymouth. The thesis discusses the main theoretical issues that prompted the fundamental system design options. The principles and the implementation of the coarse data channels used in the system are described. A new multiresolution representation of 2D views is presented, which provides the classifier module of the system with coarse-coded descriptions of the scale-space distribution of potentially interesting features. A multiresolution analysis-based mechanism is proposed, which directs the system's attention towards potentially salient features. Unsupervised similarity-based feature grouping is introduced, which is used in coarse data channels to yield feature signatures that are not spatially coherent and provide the classifier module with salient descriptions of object views. A simple texture descriptor is described, which is based on properties of a special wavelet transform. The system has been tested on computer-generated and natural image data sets, in conditions where the inter-object similarity was monitored and quantitatively assessed by human subjects, or the analysed objects were very similar and their discrimination constituted a difficult task even for human experts. The validity of the above described approaches has been proven.
    The studies conducted with various statistical and artificial neural network-based classifiers have shown that the system is able to perform well in all of the above mentioned situations. These investigations also made it possible to take further and generalise a number of important conclusions drawn during previous work carried out in the field of 2D shape (plankton) recognition, regarding the behaviour of multiple coarse data channels-based pattern recognition systems and various classifier architectures. The system possesses the ability to deal with difficult field-collected images of objects, and the techniques employed by its component modules make possible its extension to the domain of complex multiple-object 3D scene recognition. The system is expected to find immediate applicability in the field of marine biota classification.
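
    The idea of a coarse-coded, spatially non-coherent feature signature can be sketched in a few lines. The bin centres, width and the (scale, strength) feature pairs below are invented stand-ins; the actual system derives its features from wavelet analysis and uses multiple data channels.

```python
import math

# A toy sketch of coarse coding a scale-space feature distribution.
# Features are hypothetical (scale, strength) pairs; each feature's
# strength is spread across overlapping scale bins, yielding a
# fixed-length signature with no spatial coherence.
def coarse_signature(features, centres=(1.0, 2.0, 4.0, 8.0), width=1.5):
    sig = [0.0] * len(centres)
    for scale, strength in features:
        for i, c in enumerate(centres):
            sig[i] += strength * math.exp(-((scale - c) / width) ** 2)
    return sig

# A single feature at scale 2.0 activates the bin centred at 2.0 most.
print(coarse_signature([(2.0, 1.0)]))
```

    A classifier then consumes such fixed-length signatures instead of raw, spatially arranged features.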

    Cogitator : a parallel, fuzzy, database-driven expert system

    The quest to build anthropomorphic machines has led researchers to focus on knowledge and the manipulation thereof. Recently, the expert system was proposed as a solution, working well in small, well understood domains. However, these initial attempts highlighted the tedious process associated with building systems to display intelligence, the most notable problem being the Knowledge Acquisition Bottleneck. Attempts to circumvent this problem have led researchers to propose the use of machine learning and databases as sources of knowledge. Attempts to utilise databases as sources of knowledge have led to the development of Database-Driven Expert Systems. Furthermore, it has been ascertained that a requisite for intelligent systems is powerful computation. In response to these problems and proposals, a new type of database-driven expert system, Cogitator, is proposed. It is shown to circumvent the Knowledge Acquisition Bottleneck and to possess many other advantages over both traditional expert systems and connectionist systems, whilst having no serious disadvantages.
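
    The kind of fuzzy inference such a system performs can be sketched briefly. The membership degrees, fact names and the single rule below are invented examples, not Cogitator's actual rule base; min and max are used as the standard t-norm and s-norm.

```python
# A minimal sketch of fuzzy rule evaluation of the kind a fuzzy,
# database-driven expert system might perform.  Facts and the rule
# are invented examples; degrees would come from database records.
def fuzzy_and(*degrees):  # t-norm: minimum
    return min(degrees)

def fuzzy_or(*degrees):   # s-norm: maximum
    return max(degrees)

facts = {"temperature_high": 0.8, "pressure_low": 0.3, "vibration_high": 0.6}

# Rule: IF temperature_high AND (pressure_low OR vibration_high) THEN alarm
alarm = fuzzy_and(facts["temperature_high"],
                  fuzzy_or(facts["pressure_low"], facts["vibration_high"]))
print(alarm)  # 0.6
```

    Because each rule evaluation is independent, many such rules can be evaluated in parallel, which is where powerful computation pays off.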

    Calculi for higher order communicating systems

    This thesis develops two Calculi for Higher Order Communicating Systems. Both calculi consider sending and receiving processes to be as fundamental as nondeterminism and parallel composition. The first calculus called CHOCS is an extension of Milner's CCS in the sense that all the constructions of CCS are included or may be derived from more fundamental constructs. Most of the mathematical framework of CCS carries over almost unchanged. The operational semantics of CHOCS is given as a labelled transition system and it is a direct extension of the semantics of CCS with value passing. A set of algebraic laws satisfied by the calculus is presented. These are similar to the CCS laws only introducing obvious extra laws for sending and receiving processes. The power of process passing is underlined by a result showing that the recursion operator is unnecessary in the sense that recursion can be simulated by means of process passing and communication. The CHOCS language is also studied by means of a denotational semantics. A major result is the full abstractness of this semantics with respect to the operational semantics. The denotational semantics is used to provide an easy proof of the simulation of recursion. Introducing processes as first class objects yields a powerful metalanguage. It is shown that it is possible to simulate various reduction strategies of the untyped λ-Calculus in CHOCS. As pointed out by Milner, CCS has its limitations when one wants to describe unboundedly expanding systems, e.g. an unbounded number of procedure invocations in an imperative concurrent programming language P with recursive procedures. CHOCS may neatly describe both call-by-value and call-by-reference parameter mechanisms for P. We also consider call-by-name and lazy parameter mechanisms for P. The second calculus is called Plain CHOCS. Essential to the new calculus is the treatment of restriction as a static binding operator on port names. 
This calculus is given an operational semantics using labelled transition systems which combines ideas from the applicative transition systems described by Abramsky and the transition systems used for CHOCS. This calculus enjoys algebraic properties which are similar to those of CHOCS, only needing obvious extra laws for the static nature of the restriction operator. Processes as first class objects enable description of networks with changing interconnection structure, and there is a close connection between the Plain CHOCS calculus and the π-Calculus described by Milner, Parrow and Walker: the two calculi can simulate one another. Recently, object oriented programming has grown into a major discipline in computational practice as well as in computer science. From a theoretical point of view object oriented programming presents a challenge to any metalanguage since most object oriented languages have no formal semantics. We show how Plain CHOCS may be used to give a semantics to a prototype object oriented language called O.
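
    The result that process passing can simulate recursion has a familiar functional analogue: instead of a recursively defined process, a process receives a copy of itself and re-invokes it. The sketch below illustrates the idea with plain functions standing in for processes sent over channels; the factorial example is invented for illustration.

```python
# Sketch: simulating recursion by "process passing".  The body never
# refers to itself by name; it is handed a copy of itself to re-invoke,
# mirroring how CHOCS sends a process over a channel in place of a
# recursive definition.
def factorial_body(self_proc, n):
    if n == 0:
        return 1
    return n * self_proc(self_proc, n - 1)  # pass the process a copy of itself

def factorial(n):
    return factorial_body(factorial_body, n)

print(factorial(5))  # 120
```

    The recursion operator is thus dispensable: self-application plus communication recovers its power, which is the content of the simulation result mentioned above.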

    The Future of Information Sciences : INFuture2007 : Digital Information and Heritage


    A comprehensive Chinese thesaurus system.

    by Chen Hong Yi. Thesis (M.Phil.)--Chinese University of Hong Kong, 1995. Includes bibliographical references (leaves 62-65).
    Abstract --- p.ii
    Acknowledgement --- p.iv
    List of Tables --- p.viii
    List of Figures --- p.ix
    Chapter 1 --- Introduction --- p.1
    Chapter 2 --- Background Information And Thesis Scope --- p.6
    Chapter 2.1 --- Basic Concepts and Terminologies --- p.6
    Chapter 2.1.1 --- Semantic Classification Of A Word --- p.6
    Chapter 2.1.2 --- Relationship Link And Relationship Type --- p.7
    Chapter 2.1.3 --- Semantic Closeness, Link Weight And Semantic Distance --- p.8
    Chapter 2.1.4 --- Thesaurus Model And Semantic Net --- p.9
    Chapter 2.1.5 --- Thesaurus Building And Maintaining Tool --- p.9
    Chapter 2.2 --- Chinese Information Processing --- p.9
    Chapter 2.2.1 --- The Segmentation of Chinese Words --- p.10
    Chapter 2.2.2 --- The Ambiguity of Chinese Words --- p.10
    Chapter 2.2.3 --- Multiple Chinese Character Code Set Standards --- p.11
    Chapter 2.3 --- Related Work --- p.11
    Chapter 2.4 --- Thesis Scope --- p.13
    Chapter 3 --- System Design Principles --- p.15
    Chapter 3.1 --- Application Context Of TheSys --- p.15
    Chapter 3.2 --- Overall System Architecture --- p.16
    Chapter 3.3 --- Entry-Term Construct And Thesaurus Frame --- p.19
    Chapter 3.3.1 --- Words, Entry Terms And Entry Term Construct --- p.21
    Chapter 3.3.2 --- Semanteme, Relationship And Thesaurus Frame --- p.23
    Chapter 3.3.3 --- Dealing With Term Ambiguity --- p.28
    Chapter 3.4 --- Weighting Scheme --- p.33
    Chapter 3.4.1 --- Assumption --- p.33
    Chapter 3.4.2 --- Quantify The Relevancy Between Two Directly Linked Concepts --- p.34
    Chapter 3.4.3 --- Quantify The Relevancy Between Two Indirectly Linked Concepts --- p.35
    Chapter 3.5 --- Term Ranking --- p.38
    Chapter 3.6 --- Thesaurus Module and Maintenance Module --- p.39
    Chapter 3.6.1 --- The Procedure Of Building A Thesaurus --- p.40
    Chapter 3.6.2 --- Thesaurus Nomination --- p.41
    Chapter 3.6.3 --- Semantic Classification Tree Construction --- p.41
    Chapter 3.6.4 --- Relation Type Definition --- p.42
    Chapter 3.6.5 --- Entry Term Construct Construction --- p.42
    Chapter 3.6.6 --- Thesaurus Frame Construction --- p.43
    Chapter 3.6.7 --- Thesaurus Query --- p.44
    Chapter 4 --- System Implementation --- p.45
    Chapter 4.1 --- Data Structure --- p.45
    Chapter 4.1.1 --- Entry Term Construct --- p.45
    Chapter 4.1.2 --- Thesaurus Frame --- p.49
    Chapter 4.2 --- API --- p.50
    Chapter 4.3 --- User Interface --- p.54
    Chapter 4.3.1 --- Widget And Its Callback --- p.54
    Chapter 4.3.2 --- Bilingual User Interface --- p.55
    Chapter 4.3.3 --- Chinese Character Input Method --- p.57
    Chapter 5 --- Conclusion And Future Work --- p.60
    Appendix A --- System Installation --- p.66
    A.1 --- Files In TheSys --- p.67
    A.2 --- Employ TheSys As Application Package --- p.70
    A.3 --- Set Up TheSys With UI --- p.71
    A.4 --- Verify The Word Using External Dictionary --- p.74
    Appendix B --- API Description --- p.77
    B.1 --- thesys.h File --- p.77
    B.2 --- API Reference --- p.82
    Appendix C --- User Interface Reference --- p.10
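
    The weighting scheme this outline mentions (quantifying relevancy between directly and indirectly linked concepts) is commonly realised by multiplying link weights along the connecting path. The terms and weights below are invented examples; the thesis defines its own scheme.

```python
# Sketch: relevancy over a thesaurus graph.  Directly linked concepts
# score the link weight; indirectly linked concepts score the product
# of weights along the path, so relevancy decays with semantic distance.
# Terms and weights are invented examples.
links = {
    ("computer", "hardware"): 0.9,
    ("hardware", "keyboard"): 0.8,
}

def relevancy(path):
    """Multiply link weights along a path of concepts (0.0 if a link is missing)."""
    score = 1.0
    for a, b in zip(path, path[1:]):
        score *= links.get((a, b), links.get((b, a), 0.0))
    return score

print(relevancy(["computer", "hardware", "keyboard"]))  # ~0.72
```

    Ranking candidate terms by such scores is then straightforward, which is presumably what the "Term Ranking" chapter builds on.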

    An Architecture for the Compilation of Persistent Polymorphic Reflective Higher-Order Languages

    Persistent Application Systems are potentially very large and long-lived application systems which use information technology: computers, communications, networks, software and databases. They are vital to the organisations that depend on them and have to be adaptable to organisational and technological changes and evolvable without serious interruption of service. Persistent Programming Languages are a promising technology that facilitate the task of incrementally building and maintaining persistent application systems. This thesis identifies a number of technical challenges in making persistent programming languages scalable, with adequate performance and sufficient longevity and in amortising costs by providing general services. A new architecture to support the compilation of long-lived, large-scale applications is proposed. This architecture comprises an intermediate language to be used by front-ends, high-level and machine independent optimisers, low-level optimisers and code generators of target machine code. The intermediate target language, TPL, has been designed to allow compiler writers to utilise common technology for several different orthogonally persistent higher-order reflective languages. The goal is to reuse optimisation and code-generation or interpretation technology with a variety of front-ends. A subsidiary goal is to provide an experimental framework for those investigating optimisation and code generation. TPL has a simple, clean type system and will support orthogonally persistent, reflective, higher-order, polymorphic languages. TPL allows code generation and the abstraction over details of the underlying software and hardware layers. An experiment to build a prototype of the proposed architecture was designed, developed and evaluated. The experimental work includes a language processor and examples of its use are presented in this dissertation. 
The design space was covered by describing the implications of the goals of supporting the class of languages anticipated, while ensuring long-term persistence of data and programs and sufficient efficiency. For each of the goals, the design decisions were evaluated in the face of the results.
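
    The architecture's central idea, several front-ends lowering into one intermediate language consumed by shared optimisers and code generators, can be sketched abstractly. The node names and the constant-folding pass below are hypothetical illustrations, not TPL's actual instruction set.

```python
# A hypothetical sketch of a shared intermediate representation:
# different front-ends lower into one small node set, and common,
# machine-independent passes (here, constant folding) work on it.
# Node names are illustrative only; TPL's real design differs.
from dataclasses import dataclass

@dataclass(frozen=True)
class Const:
    value: int

@dataclass(frozen=True)
class Add:
    left: object
    right: object

@dataclass(frozen=True)
class Lambda:          # higher-order: functions are first-class IR values
    param: str
    body: object

def fold(node):
    """A machine-independent optimisation pass: constant folding."""
    if isinstance(node, Add):
        l, r = fold(node.left), fold(node.right)
        if isinstance(l, Const) and isinstance(r, Const):
            return Const(l.value + r.value)
        return Add(l, r)
    return node

print(fold(Add(Const(2), Const(3))))  # Const(value=5)
```

    Writing passes once against such an IR is what lets optimisation and code-generation technology be reused across several front-end languages.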