90 research outputs found

    Logic programming in the context of multiparadigm programming: the Oz experience

    Oz is a multiparadigm language that supports logic programming as one of its major paradigms. A multiparadigm language is designed to support different programming paradigms (logic, functional, constraint, object-oriented, sequential, concurrent, etc.) with equal ease. This article has two goals: to give a tutorial of logic programming in Oz and to show how logic programming fits naturally into the wider context of multiparadigm programming. Our experience shows that there are two classes of problems, which we call algorithmic and search problems, for which logic programming can help formulate practical solutions. Algorithmic problems have known efficient algorithms. Search problems do not have known efficient algorithms but can be solved with search. The Oz support for logic programming targets these two problem classes specifically, using the concepts needed for each. This is in contrast to the Prolog approach, which targets both classes with one set of concepts, resulting in less than optimal support for each class. To explain the essential difference between algorithmic and search programs, we define the Oz execution model. This model subsumes both concurrent logic programming (committed-choice style) and search-based logic programming (Prolog style). Instead of Horn clause syntax, Oz has a simple, fully compositional, higher-order syntax that accommodates the abilities of the language. We conclude with lessons learned from this work, a brief history of Oz, and many entry points into the Oz literature. (Comment: 48 pages; to appear in the journal "Theory and Practice of Logic Programming".)
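    The distinction between algorithmic and search problems is easy to make concrete. Below is a minimal illustrative sketch (in Python rather than Oz; the examples are not taken from the article): Euclid's algorithm stands in for a problem with a known efficient algorithm, while n-queens stands in for a problem solved by backtracking search over choice points.

```python
# Algorithmic problem: a known efficient algorithm exists, so the
# program runs deterministically with no search. Euclid's gcd:
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

# Search problem: no known efficient algorithm, so the program
# explores choice points and backtracks on failure (Prolog-style).
def n_queens(n, cols=()):
    row = len(cols)
    if row == n:                      # every row placed: a solution
        return cols
    for col in range(n):              # choice point: try each column
        if all(col != c and abs(col - c) != row - r
               for r, c in enumerate(cols)):
            solution = n_queens(n, cols + (col,))
            if solution is not None:  # commit if the subtree succeeds
                return solution
    return None                       # all choices failed: backtrack

print(gcd(48, 18))   # -> 6, computed directly
print(n_queens(6))   # -> (1, 3, 5, 0, 2, 4), found by search
```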

    Review of Detection and Monitoring Systems for Buried High Pressure Pipelines: Final Report

    The Netherlands has approximately two million kilometers of underground cables and pipelines. One specific type of buried infrastructure is the distribution network for hazardous materials such as gas, oil, and chemicals ('transportleiding gevaarlijke stoffen'). This network comprises 22,000 kilometers of high-pressure transportation pipelines. Because they are located underground, these pipelines are subject to excavation damage. Incidents such as those in Gellingen, Belgium (2004) and Ludwigshafen, Germany (2014) show that the consequences of pipeline damage are significant: they can cause fatalities among excavation workers and also harm the environment. In addition, the direct costs of damage recovery alone are estimated by the pipeline owners' association (VELIN) to range from several hundred thousand to a few million euros, a figure that does not yet include indirect costs. Serious incidents will eventually undermine public acceptance of hazardous-material pipelines, so pipeline excavation incidents should be avoided. Nowadays, third parties appear to cause most of the damage to underground pipelines (Capstick, 2007; CONCAWE, 2013; EGIG, 2015; Muggleton & Rustighi, 2013). Reasons often mentioned by industry are that utility location information (KLIC-melding) is not always available and, when available, is not always accurate or is too difficult for excavator operators to interpret. Timely detection of underground infrastructure is crucial to avoid damage. For this purpose, initiatives are needed that help excavator operators detect pipelines and monitor groundworks taking place close to pipelines. Such initiatives could focus on identifying and developing technologies for pipeline strike avoidance. This study was the first step in that direction; it is in turn related to the Safety Deals prepared by the association of pipeline owners in the Netherlands (VELIN) and the Dutch Ministry of Infrastructure and the Environment. VELIN and I&M requested the University of Twente to systematically review existing technologies for avoiding excavation damage, since no such overview has been available to the Dutch industry to date. The project team therefore identified and described existing systems for the global monitoring and detection of utilities. These systems ultimately help detect clashes between excavator equipment and high-pressure transportation pipelines.

    Knowledge-based Engineering in Product Development Processes - Process, IT and Knowledge Management perspectives

    Product development as a field of practice and research has changed significantly due to the general trends of globalization changing the enterprise landscapes in which products are realized. Access to partners and suppliers with high technological specialization has also led to an increased specialization of original equipment manufacturers (OEMs). Furthermore, products are becoming increasingly complex, with a high functional and technological content and many variants. Combined with shorter lifecycles, which require reuse of technologies and solutions, this has resulted in an overall increased knowledge intensity, which necessitates a more explicit approach to knowledge and knowledge management in product development. In parallel, methods and IT tools for managing knowledge have been developed and are more accessible and usable today. One such approach is knowledge-based engineering (KBE), a term coined in the mid-1980s as a label for applications that automate the design of rule-driven geometries. In this thesis the term KBE embraces the capture and application of engineering knowledge to automate engineering tasks, regardless of the domain of application, and the thesis aims at contributing to a wider utilization of KBE in product development (PD). The thesis focuses on two perspectives of KBE: as a process-improvement IT method and as a knowledge management (KM) method. In the first perspective, the lack of explicit regard for the constraints of the product lifecycle management (PLM) architecture, which governs the interaction of processes and IT in PD, has been identified as negatively affecting the utilization of KBE in PD processes. In the second perspective, KM theories and models can complement existing methods for identifying potential KBE applications. Regarding the first perspective, it is concluded that explicit regard for the PLM architecture decreases the need to develop and maintain software code related to hard-coded redundant data and functions in the KBE application. The concept of service-oriented architecture (SOA) has been found to enable this explicit regard for the PLM architecture. Regarding the second perspective, it is concluded that the potential for KBE applications is indicated by: 1) the application of certain types of knowledge in PD processes, 2) high maturity and formalization of the applied knowledge, 3) a codification strategy for KM, and 4) agreement and transparency regarding how the knowledge is applied, captured and transferred. It is also concluded that the formulation of explicit KM strategies in PD should be guided by knowledge application and its relation to strategic objectives, focusing on types of knowledge, their role in the PD process, and the methods and tools for their application. These, in turn, affect the methods and tools deployed for knowledge capture, so that it integrates with the processes where the knowledge originates. Finally, roles and processes for knowledge transfer have to be transparent to ensure that individuals are motivated to engage in the KM strategy.

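    As a concrete illustration of the kind of rule-driven design automation that the thesis labels KBE, below is a minimal, hypothetical Python sketch in which captured engineering rules automatically size a simple part; the rules, names, and parameter values are invented for illustration and do not come from the thesis.

```python
from dataclasses import dataclass

# Hypothetical KBE-style application: engineering rules captured as
# code derive a rule-driven geometry from a few input parameters.
# All rules and numbers here are invented for illustration.

@dataclass
class BracketSpec:
    load_kN: float              # design load on the bracket
    yield_strength_MPa: float   # material yield strength

def design_bracket(spec: BracketSpec) -> dict:
    """Apply captured sizing rules and return geometry parameters."""
    safety_factor = 2.0                            # rule: fixed safety margin
    # Rule: required cross-section area = load / allowable stress.
    area_mm2 = spec.load_kN * 1e3 * safety_factor / spec.yield_strength_MPa
    thickness_mm = max(4.0, area_mm2 ** 0.5 / 4)   # rule: minimum gauge 4 mm
    width_mm = area_mm2 / thickness_mm
    return {"thickness_mm": round(thickness_mm, 1),
            "width_mm": round(width_mm, 1)}

# The "knowledge" lives in design_bracket; rerunning it automates
# what would otherwise be a repetitive manual sizing task.
print(design_bracket(BracketSpec(load_kN=12.0, yield_strength_MPa=235.0)))
```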

    Milestones in Software Engineering and Knowledge Engineering History: A Comparative Review

    We present a review of the historical evolution of software engineering, intertwined with the history of knowledge engineering, because "those who cannot remember the past are condemned to repeat it." This retrospective is a further step toward understanding the current state of both disciplines; history also holds positive experiences, some of which we would like to remember and repeat. The two disciplines have had parallel and divergent evolutions, but following a similar pattern. We also define a set of milestones that represent points of convergence or divergence in software development methodologies. These milestones do not appear at the same time in software engineering and knowledge engineering, so lessons learned in one discipline can help the evolution of the other.

    Four essays in dynamic macroeconomics

    The dissertation contains essays concerning the linkages between the macroeconomy and financial markets, and the conduct of monetary policy, via DSGE modelling. It contributes to the question of fitting macroeconomic models to data, and thus to our understanding of the driving forces of fluctuations in macroeconomic and financial variables. Chapter one introduces the thesis and outlines the main results and methodologies in detail. In Chapter two I introduce a statistical measure for model evaluation and selection based on the full information in the sample second moments of the data. A model is said to outperform its counterpart if the variance-covariance matrix of its simulated data is closer to that of the actual data. The "distance method" is generally feasible and simple to apply. A flexible-price two-sector open-economy model is studied to match observed puzzles in international finance data. The statistical distance approach favours a model in which a dominant role is played by expectational errors in the foreign exchange market, which break international interest rate parity. Chapter three applies the distance approach to a New Keynesian model augmented with habit formation and a backward-looking component of pricing behaviour. A macro-finance model of the yield curve is developed to showcase the dynamics of implied forward yields. This exercise, with the distance approach, reiterates the inability of macro models to explain yield curve dynamics. The method also reveals a remarkable interconnection between real quantities and the bond yield slope. In Chapter four I study a general equilibrium business cycle model with sticky prices and labour market rigidities. With costly matching in the labour market, output responds in a hump-shaped and persistent manner to monetary shocks, and the resulting Phillips curve seems to radically change the scope for monetary policy because (i) there are speed-limit effects for policy and (ii) there is a cost channel for monetary policy. Labour reforms such as those in the mid-1980s UK can make monetary policy more effective. Research on monetary policy should pay greater attention to output when labour market adjustments are persistent. Chapter five analyzes the link between money and financial spreads, which is often missed in the specification of monetary policy analysis. When liquidity provision by banks dominates the demand for money from the real economy, money may contain information about future output and inflation through its impact on financial spreads. I use a sign-restriction Bayesian VAR estimation to separate the liquidity provision impact from money market equilibrium. The decomposition exercise shows that supply shocks dominate the money-price nexus in the short to medium term. It also uncovers the distinctive policy stances of two central banks. Finally, Chapter six concludes with a brief summary of the research as well as a discussion of potential limitations and possible directions for future research.
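    The "distance method" of Chapter two can be sketched in a few lines. The following Python fragment is an illustrative sketch only, assuming a Frobenius-norm distance between the sample variance-covariance matrices of actual and simulated data; the thesis may weight the second moments differently.

```python
import numpy as np

# Illustrative sketch of the "distance method": a model outperforms a
# counterpart if the variance-covariance matrix of its simulated data
# is closer to that of the actual data. The Frobenius norm is an
# assumed choice of metric, not necessarily the one used in the thesis.

def cov_distance(actual: np.ndarray, simulated: np.ndarray) -> float:
    """Distance between sample covariance matrices of T-by-k data sets."""
    cov_actual = np.cov(actual, rowvar=False)
    cov_simulated = np.cov(simulated, rowvar=False)
    return float(np.linalg.norm(cov_actual - cov_simulated, ord="fro"))

rng = np.random.default_rng(0)
actual = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=500)
model_a = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], size=500)
model_b = rng.multivariate_normal([0, 0], [[2.0, 0.0], [0.0, 2.0]], size=500)

# Model A's second moments are closer to the data, so it is preferred.
print(cov_distance(actual, model_a) < cov_distance(actual, model_b))  # True
```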