493 research outputs found

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author's and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.

    In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962:

        Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent. (McLuhan 1962, p.5, emphasis in original)

    Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.

    Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Halfway through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope to give a sense of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

    The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies based on the WWW as a basic infrastructure. The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself, with the emergence of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central to the exploitation of those opportunities.

    The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge. AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so their capabilities will vary. As well as providing useful KM services in their own right, AKT will aim to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

    Ontologies will be a crucial tool for the SW. The AKT consortium brings together considerable expertise on ontologies, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies. Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.

    The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
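    As a deliberately naive illustration of the ontology mapping task mentioned above, the sketch below aligns concepts from two toy ontologies by label similarity alone. The ontologies, the threshold, and all names are assumptions made for this example, not AKT components.

```python
# Minimal sketch of lexical ontology mapping; the toy ontologies and the
# similarity threshold are invented for illustration, not taken from AKT.
from difflib import SequenceMatcher

ontology_a = {"Person", "Academic", "Publication", "Project"}
ontology_b = {"person", "researcher", "paper", "project"}

def label_similarity(a: str, b: str) -> float:
    """String similarity of two concept labels, in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Propose a correspondence for each concept in A: the best-scoring
# concept in B, kept only if it clears a (hypothetical) threshold.
THRESHOLD = 0.8
for concept in sorted(ontology_a):
    best = max(ontology_b, key=lambda c: label_similarity(concept, c))
    score = label_similarity(concept, best)
    if score >= THRESHOLD:
        print(f"{concept} <-> {best} (score {score:.2f})")
```

    Lexical matching of this kind is only a starting point; structural and instance-based evidence would be needed to resolve the conflicts of reference the abstract refers to.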

    High performance constraint satisfaction problem solving: State-recomputation versus state-copying.

    Constraint Satisfaction Problems (CSPs) have been an important focus of research in Artificial Intelligence and a useful model for applications such as scheduling, image processing and machine vision. A CSP asks for an assignment of values to variables that satisfies a set of constraints. There are many approaches to searching for solutions of non-binary CSPs. Traditionally, most CSP methods rely on a single processor. With the increasing availability of multiple processors, parallel search methods are becoming an alternative for speeding up the search process. Parallel search is a subfield of artificial intelligence in which the constraint satisfaction problem is centralized while the search processes are distributed among different processors. In this thesis we present a forward checking algorithm that solves non-binary CSPs by distributing different branches to different processors via a message passing interface, and we execute it on a high-performance distributed system called SHARCNET. A key problem, however, is how to efficiently communicate the state of the search among processors. Two communication models, namely state-recomputation and state-copying via message passing, are implemented and evaluated, and the behaviour of communication from one process to another is investigated. The experimental results demonstrate that the state-recomputation model obtains better performance than the state-copying model under tighter constraints, but when constraints become looser, the state-copying model is the better choice.

    Thesis (M.Sc.), University of Windsor (Canada), 2005.
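    To make the contrast concrete, here is a schematic sketch of the two communication models in plain Python rather than over MPI/SHARCNET. The encoding (binary constraints as allowed-pair sets) and all names are assumptions made for the sketch; the thesis itself targets non-binary CSPs.

```python
# Sketch, not the thesis implementation: the two payloads a master could
# send a worker when delegating a search branch.

def forward_check(domains, constraints, var, value):
    """Assign var=value and prune the other domains (binary constraints,
    given as {(x, y): set of allowed (a, b) pairs}). Returns the reduced
    domains, or None if some domain is wiped out."""
    new = {v: list(d) for v, d in domains.items()}
    new[var] = [value]
    for (x, y), allowed in constraints.items():
        if x == var:
            new[y] = [b for b in new[y] if (value, b) in allowed]
            if not new[y]:
                return None
        elif y == var:
            new[x] = [a for a in new[x] if (a, value) in allowed]
            if not new[x]:
                return None
    return new

# State-copying: the worker receives the pruned domains directly.
def branch_message_state_copy(domains, constraints, var, value):
    return {"domains": forward_check(domains, constraints, var, value)}

# State-recomputation: the worker receives only the decision path and
# replays the forward-checking steps against the initial domains.
def branch_message_recompute(path, var, value):
    return {"path": path + [(var, value)]}

def replay(initial_domains, constraints, path):
    domains = initial_domains
    for var, value in path:
        domains = forward_check(domains, constraints, var, value)
        if domains is None:
            return None
    return domains
```

    State-copying messages carry the whole pruned search state, so workers can start immediately at the cost of larger messages; recomputation messages carry only the decision path and shift work to the receiver. One plausible reading of the reported result is that tight constraints keep paths short and replay cheap, favouring recomputation.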

    Informed selection and use of training examples for knowledge refinement.

    Knowledge refinement tools seek to correct faulty rule-based systems by identifying and repairing faults indicated by training examples. This thesis proposes mechanisms that improve the effectiveness and efficiency of refinement tools through the informed selection and use of training examples. The refinement task is sufficiently complex that the space of possible refinements demands a heuristic search. Refinement tools typically use hill-climbing search to identify suitable repairs, but run the risk of getting caught in local optima. A novel contribution of this thesis is to solve the local optima problem by converting the hill-climbing search into a best-first search that can backtrack to previous refinement states. The thesis explores how different backtracking heuristics and training-example ordering heuristics affect refinement effectiveness and efficiency. Refinement tools rely on a representative set of training examples to identify faults and influence repair choices. In real environments it is often difficult to obtain a large set of training examples, since each problem-solving task must be labelled with the expert's solution. Another novel aspect introduced in this thesis is the informed selection of examples for knowledge refinement, where suitable examples are selected from a set of unlabelled examples, so that only this subset needs to be labelled. Conversely, if a large set of labelled examples is available, it still makes sense to have mechanisms that can select a representative subset of examples beneficial for the refinement task, thereby avoiding unnecessary example-processing costs. Finally, an experimental evaluation of example utilisation and selection strategies on two artificial domains and one real application is presented. Informed backtracking deals effectively with local optima by moving the search to more promising areas, while informed ordering of training examples reduces search effort by ensuring that more pressing faults are dealt with early on in the search. Additionally, the example selection methods achieve similar refinement accuracy with significantly fewer examples.
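    The move from hill-climbing to a backtracking best-first search can be sketched as follows. The refine() and accuracy() interfaces are placeholders assumed for the example, not the thesis's actual refinement tool.

```python
# Best-first refinement search: unlike hill-climbing, the frontier keeps
# earlier refinement states, so the search can backtrack out of local
# optima. refine(rb) yields candidate repaired rule bases; accuracy(rb)
# scores one against the training examples (both assumed interfaces).
import heapq
import itertools

def best_first_refine(initial_rule_base, refine, accuracy, max_steps=1000):
    counter = itertools.count()   # tie-breaker so the heap never compares rule bases
    frontier = [(-accuracy(initial_rule_base), next(counter), initial_rule_base)]
    best = initial_rule_base
    for _ in range(max_steps):
        if not frontier:
            break
        neg_score, _, rb = heapq.heappop(frontier)   # may pop an earlier
        if -neg_score > accuracy(best):              # state: backtracking
            best = rb
        for candidate in refine(rb):
            heapq.heappush(frontier,
                           (-accuracy(candidate), next(counter), candidate))
    return best
```

    The backtracking and example-ordering heuristics the thesis studies would plug in here as the scoring function and the order in which faults feed the refine() generator.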

    Generalizing backdoors

    A powerful intuition in the design of search methods is that one wants to proactively select variables that simplify the problem instance as much as possible when these variables are assigned values. The notion of “backdoor” variables follows this intuition. In this work we generalize backdoors so as to allow more general classes of sub-solvers, both complete and heuristic. To do so, Pseudo-Backdoors and Heuristic-Backdoors are formally introduced and then applied, firstly, to a simple Multiple Knapsack Problem and, secondly, to a complex combinatorial optimization problem in the area of stochastic inventory control. Our preliminary computational experience shows the effectiveness of these approaches, which are able to produce very low run times and, in the case of Heuristic-Backdoors, high-quality solutions by employing very simple heuristic rules such as greedy local search strategies.
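    For orientation, the classical notion being generalized can be sketched as a check: a set of variables is a (strong) backdoor if every assignment to it leaves a residual instance that the sub-solver decides without search. The instance encoding and the simplify/subsolver interfaces below are stand-ins assumed for the example.

```python
# Sketch of the classical strong-backdoor test; Pseudo- and Heuristic-
# Backdoors relax the guarantees demanded of subsolver(), e.g. allowing
# incomplete heuristic sub-solvers.
from itertools import product

def is_strong_backdoor(domains, candidate_vars, simplify, subsolver):
    """candidate_vars: ordered list of variables to test as a backdoor.
    simplify(assignment) returns the residual instance after plugging in
    the assignment; subsolver(residual) returns a verdict or None when it
    cannot decide the instance without search."""
    for values in product(*(domains[v] for v in candidate_vars)):
        assignment = dict(zip(candidate_vars, values))
        residual = simplify(assignment)
        if subsolver(residual) is None:   # sub-solver gave up: not a backdoor
            return False
    return True
```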

    Tree Projections and Constraint Optimization Problems: Fixed-Parameter Tractability and Parallel Algorithms

    Tree projections provide a unifying framework to deal with most structural decomposition methods for constraint satisfaction problems (CSPs). Within this framework, a CSP instance is decomposed into a number of sub-problems, called views, whose solutions are either already available or can be computed efficiently. The goal is to arrange portions of these views in a tree-like structure, called a tree projection, which determines an efficiently solvable CSP instance equivalent to the original one. Deciding whether a tree projection exists is NP-hard. Solution methods have therefore been proposed in the literature that do not require a tree projection to be given, and that either correctly decide whether the given CSP instance is satisfiable, or report that a tree projection actually does not exist. These approaches had not so far been generalized to CSP extensions for optimization problems, where the goal is to compute a solution of maximum value/minimum cost. The paper fills this gap by exhibiting a fixed-parameter polynomial-time algorithm that either disproves the existence of tree projections or computes an optimal solution, the parameter being the size of the expression of the objective function to be optimized over all possible solutions (and not the size of the whole constraint formula, as used in related works). Tractability results are also established for the problem of returning the best K solutions. Finally, parallel algorithms for such optimization problems are proposed and analyzed. Given that the classes of acyclic hypergraphs, hypergraphs of bounded treewidth, and hypergraphs of bounded generalized hypertree width are all covered as special cases of the tree projection framework, the results in this paper directly apply to these classes, which are extensively considered in the CSP setting as well as in conjunctive database query evaluation and optimization.
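    The acyclic special case mentioned at the end admits a compact illustration: on an acyclic instance, a bottom-up pass of semi-joins along a join tree decides satisfiability without search, in the style of Yannakakis. The relation encoding below, and the assumption that join-tree edges are listed leaves-first, are choices made for this sketch, not the paper's algorithm.

```python
# Semi-join pass over a join tree of an acyclic CSP instance.

def semijoin(r, s):
    """Keep the tuples of r (dicts keyed by variable) that agree with some
    tuple of s on their shared variables."""
    if not s:
        return []
    shared = set(r[0]) & set(s[0]) if r else set()
    return [t for t in r
            if any(all(t[v] == u[v] for v in shared) for u in s)]

def acyclic_satisfiable(join_tree_edges, relations, root):
    """join_tree_edges: (child, parent) pairs listed bottom-up (leaves
    first). After the upward pass, the instance is satisfiable iff the
    root relation is non-empty."""
    for child, parent in join_tree_edges:
        relations[parent] = semijoin(relations[parent], relations[child])
    return bool(relations[root])
```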

    Combining mathematical programming and SysML for component sizing as applied to hydraulic systems

    In this research, the focus is on improving a designer's capability to determine near-optimal sizes of components for a given system architecture. Component sizing is a hard problem to solve because of the presence of competing objectives, requirements from multiple disciplines, and the need for finding a solution quickly for the architecture being considered. In current approaches, designers rely on heuristics and iterate over the multiple objectives and requirements until a satisfactory solution is found. To improve on this state of practice, this research introduces advances in two areas: (a) formulating a component sizing problem in a manner that is convenient to designers, and (b) solving the component sizing problem efficiently, so that all of the imposed requirements are satisfied simultaneously and the solution obtained is mathematically optimal. In particular, an acausal, algebraic, equation-based, declarative modeling approach is taken to solve component sizing problems efficiently, because global optimization algorithms exist for algebraic models and the computation time is considerably less than for the optimization of dynamic simulations. In this thesis, the mathematical programming language GAMS (General Algebraic Modeling System) and its associated global optimization solvers are used to solve component sizing problems efficiently. Mathematical programming languages such as GAMS are not convenient for formulating component sizing problems, and therefore the Systems Modeling Language developed by the Object Management Group (OMG SysML) is used to formally capture and organize models related to component sizing into libraries that can be reused to compose new models quickly by connecting them together. Model transformations are then used to generate low-level mathematical programming models in GAMS that can be solved using commercial off-the-shelf solvers such as BARON (Branch and Reduce Optimization Navigator) to determine the component sizes that satisfy the requirements and objectives imposed on the system. This framework is illustrated by applying it to an example application: sizing a hydraulic log splitter.

    M.S. thesis. Committee Co-Chairs: Paredis, Chris; Schaefer, Dirk. Committee Member: Goel, Asho
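    To give a flavour of the algebraic, equation-based formulation, here is a toy closed-form sizing calculation for the cylinder bore of a log splitter. The thesis itself works in GAMS/SysML with BARON, not Python, and the force, pressure, and catalogue values below are invented for the example.

```python
# Toy sizing calculation: pick the smallest catalogue cylinder bore that
# delivers the required splitting force at the available pressure.
import math

F_REQUIRED = 100e3        # required splitting force, N (assumed)
P_SYSTEM = 20e6           # available system pressure, Pa (assumed)
STANDARD_BORES = [0.050, 0.063, 0.080, 0.100, 0.125]  # m, assumed catalogue

# Force balance F = P * A with piston area A = pi * d^2 / 4 gives the
# minimum bore diameter in closed form:
d_min = math.sqrt(4 * F_REQUIRED / (math.pi * P_SYSTEM))

# Select the smallest catalogue bore meeting the requirement.
bore = min(d for d in STANDARD_BORES if d >= d_min)
print(f"minimum bore {d_min * 1000:.1f} mm -> selected {bore * 1000:.0f} mm")
```

    A real sizing problem couples many such equations across disciplines, which is why the thesis hands the composed algebraic model to a global solver rather than solving each relation by hand.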

    Practical Tractability of CSPs by Higher Level Consistency and Tree Decomposition

    Constraint Satisfaction is a flexible paradigm for modeling many decision problems in Engineering, Computer Science, and Management. Constraint Satisfaction Problems (CSPs) are in general NP-complete and are usually solved with search. Research has identified various islands of tractability, which enable solving certain CSPs with backtrack-free search. For example, one sufficient condition for tractability relates the consistency level of a CSP to the treewidth of the CSP's constraint network. However, enforcing higher levels of consistency on a CSP may require the addition of constraints, thus altering the topology of the constraint network and increasing its treewidth. This thesis addresses the following question: how close can we approach in practice the tractability guaranteed by the relationship between the level of consistency in a CSP and the treewidth of its constraint network? To achieve practical tractability, this thesis proposes: (1) new local consistency properties and algorithms for enforcing them without adding constraints or altering the network's topology; (2) methods to enforce these consistency properties on the clusters of a tree decomposition of the CSP; and (3) schemes to bolster the propagation between the clusters of the tree decomposition. Our empirical evaluation shows that our techniques allow us to achieve practical tractability for a wide range of problems, and that they are both applicable (i.e., require acceptable time and space) and useful (i.e., outperform other consistency properties). We theoretically characterize the proposed consistency properties and empirically evaluate our techniques on benchmark problems. Our techniques for higher-level consistency exhibit their best performance on difficult benchmark problems: they solve a larger number of difficult problem instances than algorithms enforcing weaker consistency properties, and moreover they solve them in an almost backtrack-free manner.

    Adviser: Berthe Y. Choueir
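    For orientation, the sketch below enforces plain arc consistency (AC-3), the classical baseline that the stronger local consistency properties proposed in the thesis go beyond. The binary encoding, with each constraint stored in both orientations, is an assumption of the sketch.

```python
# AC-3: repeatedly revise each arc (x, y), removing values of x that have
# no support in y, until no more pruning is possible.
from collections import deque

def ac3(domains, allowed):
    """allowed maps each ordered pair (x, y) to its set of permitted
    (a, b) value pairs; constraints are stored in both orientations.
    Prunes domains in place; returns False on a domain wipe-out."""
    arcs = deque(allowed)
    while arcs:
        x, y = arcs.popleft()
        pruned = [a for a in domains[x]
                  if not any((a, b) in allowed[(x, y)] for b in domains[y])]
        if pruned:
            domains[x] = [a for a in domains[x] if a not in pruned]
            if not domains[x]:
                return False          # inconsistency detected
            # x changed, so arcs pointing at x must be revisited
            arcs.extend((z, w) for (z, w) in allowed if w == x)
    return True
```

    The thesis's consistency properties are stronger than this, yet are enforced without adding constraints, which is what keeps the network's topology, and hence its treewidth, fixed.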

    Finding regions of local repair in hierarchical constraint satisfaction

    Algorithms for solving constraint satisfaction problems (CSPs) have been successfully applied to several fields, including scheduling, design, and planning. Recent extensions of the standard CSP to constraint optimization problems (COPs) have additionally provided new opportunities for solving several problems of combinatorial optimization more efficiently. Basically, two classes of algorithms have been used for solving CSPs: local search methods, and systematic tree search extended by classical constraint-processing techniques such as forward checking and backmarking. Both classes exhibit characteristic advantages and drawbacks. This report presents a novel approach for solving constraint optimization problems that combines the advantages of local search and of tree search algorithms extended by constraint-processing techniques. The method proved its applicability in a commercial nurse scheduling system as well as on randomly generated problems.
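    The local search half of such a combination is typically a repair-style loop; below is a minimal min-conflicts sketch of the kind of local repair involved. The assignment encoding and the conflict-counting interface are assumed for the example rather than taken from the report.

```python
# Min-conflicts local repair: repeatedly pick a conflicted variable and
# move it to the value that violates the fewest constraints.
import random

def min_conflicts(assignment, domains, conflicts, max_steps=10_000):
    """conflicts(var, value, assignment) -> number of constraints violated
    if var were set to value. Repairs the assignment in place."""
    for _ in range(max_steps):
        violated = [v for v in assignment
                    if conflicts(v, assignment[v], assignment) > 0]
        if not violated:
            return assignment              # consistent: done
        var = random.choice(violated)      # pick a conflicted variable
        assignment[var] = min(domains[var],
                              key=lambda x: conflicts(var, x, assignment))
    return None                            # give up after max_steps
```

    Such a loop repairs quickly but can stall on plateaus, which is exactly where handing a region over to systematic tree search with forward checking pays off.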
