
    Validation of highly reliable, real-time knowledge-based systems

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications

    Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques

    This Working Paper Series entry presents a detailed survey of knowledge-based systems. After many years in a relatively dormant state, Artificial Intelligence (AI) - that branch of computer science that attempts to have machines emulate intelligent behavior - has only recently begun to accomplish practical results. Most of these results can be attributed to the design and use of Knowledge-Based Systems, KBSs (or expert systems) - problem-solving computer programs that can reach a level of performance comparable to that of a human expert in some specialized problem domain. These systems can act as consultants for tasks such as medical diagnosis, military threat analysis, and project risk assessment. They possess knowledge that enables them to make intelligent decisions. They are, however, not meant to replace human specialists in any particular domain. A critical survey of recent work in interactive KBSs is reported. A case study (MYCIN) of a KBS, a list of existing KBSs, and an introduction to the Japanese Fifth Generation Computer Project are provided as appendices. Finally, an extensive set of KBS-related references is provided at the end of the report

    Conceptual information processing: A robust approach to KBS-DBMS integration

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and its associated issues are addressed. The significance of integration and the problems associated with accomplishing it are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept-based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing
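The idea of concept-based processing described above can be illustrated with a minimal sketch: concepts linked by named conceptual relations, queried uniformly whether the underlying facts live in a knowledge base or a database. All names here are hypothetical, not the paper's actual experimental model.

```python
from collections import defaultdict

class ConceptGraph:
    """Toy store of concepts linked by named conceptual relations."""

    def __init__(self):
        # (source concept, relation name) -> set of target concepts
        self.relations = defaultdict(set)

    def add(self, source, relation, target):
        """Record that `source` stands in `relation` to `target`."""
        self.relations[(source, relation)].add(target)

    def related(self, source, relation):
        """Return all targets reachable from `source` via `relation`."""
        return sorted(self.relations[(source, relation)])

g = ConceptGraph()
g.add("Engine", "part-of", "Car")
g.add("Crankshaft", "part-of", "Engine")
g.add("Crankshaft", "made-of", "Steel")
print(g.related("Crankshaft", "part-of"))   # ['Engine']
```

A real KBS-DBMS integration would back `related` with database queries as well as in-memory rules, but the uniform concept/relation interface is the point of the paradigm.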

    Integrated knowledge utilization and evolution for the conservation of corporate know-how

    Insufficient consideration of knowledge evolution is a frequent cause of failure of knowledge-based systems (KBSs) in industrial practice. Corporate know-how about the design and manufacturing of a particular product is subject to rather rapid change, and it is hard to specify in advance exactly what information will be requested by various users. Keeping a KBS for the conservation of corporate know-how up to date, or even enhancing its utility, thus requires continuous monitoring of its performance, noting of deficiencies, and suggestions for improvements. In this paper, we discuss different ways in which information collected during knowledge utilization can be exploited for system evolution. We present structure-based rule and concept editors which allow for the immediate integration and formalization of new information, even by rather inexperienced users. A prototypical knowledge conservation system for crankshaft design, developed in cooperation between the DFKI and a German company, is used to illustrate and evaluate our approach
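A structure-based editor of the kind mentioned above can be sketched as a rule store that validates each new rule before integrating it, so that contributions from inexperienced users cannot leave the knowledge base in an inconsistent state. The class and rule names below are illustrative, not the DFKI prototype's actual interface.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    conditions: list   # textual conditions, e.g. "load > 50 kN"
    action: str        # recommendation when all conditions hold

class RuleBase:
    """Minimal structure-based rule store: every new rule is checked
    for completeness before being integrated into the knowledge base."""

    def __init__(self):
        self.rules = {}

    def integrate(self, rule):
        if not rule.conditions or not rule.action:
            raise ValueError(f"rule {rule.name!r} is incomplete")
        if rule.name in self.rules:
            raise ValueError(f"rule {rule.name!r} exists; edit it instead")
        self.rules[rule.name] = rule

kb = RuleBase()
kb.integrate(Rule("web-thickness", ["load > 50 kN"], "increase web thickness"))
```

The validation step is what makes "immediate integration by inexperienced users" safe: malformed knowledge is rejected at edit time rather than discovered at consultation time.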

    A Software Architecture for Knowledge-Based Systems

    The paper introduces a software architecture for the specification and verification of knowledge-based systems, combining conceptual and formal techniques. Our focus is component-based specification, enabling the reuse of components. We identify four elements of the specification of a knowledge-based system: a task definition, a problem-solving method, a domain model, and an adapter. We present algebraic specifications and a variant of dynamic logic as formal means to specify and verify these different elements. As a consequence of our architecture we can decompose the overall specification and verification task of a knowledge-based system into subtasks. We identify different subcomponents for specification and different proof obligations for verification. The use of the architecture in specification and verification improves understandability and reduces the effort for both activities. In addition, its decomposition and modularisation enable the reuse of components and proofs
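The four-element decomposition named in the abstract can be mirrored directly in code. The sketch below is only a structural analogy (field names and the example values are assumptions, not the paper's formal notation), but it shows how the adapter is the one component that binds task and method vocabulary to domain terms.

```python
from dataclasses import dataclass

@dataclass
class TaskDefinition:
    goal: str                 # what the KBS must achieve
    assumptions: list         # requirements placed on domain knowledge

@dataclass
class ProblemSolvingMethod:
    name: str
    competence: str           # what the method can deliver, independent of domain

@dataclass
class DomainModel:
    facts: dict               # domain knowledge, independent of any task

@dataclass
class Adapter:
    mappings: dict            # maps task/method terminology onto domain terms

@dataclass
class KnowledgeBasedSystem:
    task: TaskDefinition
    method: ProblemSolvingMethod
    domain: DomainModel
    adapter: Adapter

kbs = KnowledgeBasedSystem(
    task=TaskDefinition("diagnose faults", ["complete fault model"]),
    method=ProblemSolvingMethod("cover-and-differentiate", "finds explanations"),
    domain=DomainModel({"crankshaft": "rotating engine part"}),
    adapter=Adapter({"hypothesis": "fault"}),
)
```

Because each component stands alone, a task definition or problem-solving method can be verified once and reused against a different domain model by swapping only the adapter.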

    A framework for managing global risk factors affecting construction cost performance

    Poor cost performance of construction projects has been a major concern for both contractors and clients. The effective management of risk is thus critical to the success of any construction project, and the importance of risk management has grown as projects have become more complex and competition has increased. Contractors have traditionally used financial mark-ups to cover the risk associated with construction projects, but as competition increases and margins become tighter they can no longer rely on this strategy and must improve their ability to manage risk. Furthermore, the construction industry has witnessed significant changes, particularly in procurement methods, with clients allocating greater risks to contractors. Evidence shows that there is a gap between existing risk management techniques and tools, mainly built on normative statistical decision theory, and their practical application by construction contractors. The main reason behind this lack of use is that risk decision making within construction organisations is based heavily upon experience, intuition and judgement rather than on mathematical models. This thesis presents a model for managing global risk factors affecting the cost performance of construction projects. The model has been developed using a behavioural decision approach, fuzzy logic technology, and Artificial Intelligence technology. The methodology adopted to conduct the research involved a thorough literature survey on risk management, informal and formal discussions with construction practitioners to assess the extent of the problem, a questionnaire survey to evaluate the importance of global risk factors and, finally, repertory grid interviews aimed at eliciting relevant knowledge. There are several approaches to categorising the risks permeating construction projects. This research groups risks into three main categories, namely organisation-specific, global, and Acts of God.
    It focuses on global risk factors because they are ill-defined, less understood by contractors, and difficult to model, assess and manage, although they have a huge impact on cost performance. Generally, contractors, especially in developing countries, have insufficient experience and knowledge to manage them effectively. The research identified the following groups of global risk factors as having significant impact on cost performance: estimator-related, project-related, fraudulent-practices-related, competition-related, construction-related, economy-related and politics-related factors. The model was tested for validity through a panel of validators (experts) and cross-sectional case studies, and the general conclusion was that it could provide valuable assistance in the management of global risk factors since it is effective, efficient, flexible and user-friendly. The findings stress the need to depart from traditional approaches and to explore new directions in order to equip contractors with effective risk management tools
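The fuzzy-logic component of such a risk model can be illustrated with a minimal sketch: linguistic risk ratings are mapped to membership degrees in a "high risk" set, and factor memberships are aggregated with a fuzzy OR. The rating scale, the particular factors, and the triangular membership parameters below are assumptions for illustration, not the thesis's actual model.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a, rising to 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# hypothetical expert ratings (0-10 scale) for three global risk factors
ratings = {"economy": 7.0, "political": 4.0, "competition": 8.5}

# degree to which each factor is "high risk" (membership peaks at 10)
high = {factor: triangular(r, 5.0, 10.0, 15.0) for factor, r in ratings.items()}

# fuzzy OR (max): overall exposure is driven by the worst factor
overall = max(high.values())
print(round(overall, 2))   # 0.7
```

This matches the thesis's motivation: expert judgement expressed as linguistic ratings is captured directly, without forcing contractors through a normative statistical model.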

    Measuring the Physical Conditions in High-redshift Star-forming Galaxies: Insights from KBSS-MOSFIRE

    We use photoionization models that are designed to reconcile the joint rest-UV-optical spectra of high-z star-forming galaxies to self-consistently infer the gas chemistry and nebular ionization and excitation conditions for ~150 galaxies from the Keck Baryonic Structure Survey (KBSS), using only observations of their rest-optical nebular spectra. We find that the majority of z ~ 2–3 KBSS galaxies are moderately O-rich, with an interquartile range in 12 + log(O/H) = 8.29–8.56, and have significantly sub-solar Fe enrichment, with an interquartile range of [Fe/H] = [−0.79, −0.53], which contributes additional evidence in favor of super-solar O/Fe in high-z galaxies. The model-inferred ionization parameters and N/O are strongly correlated with common strong-line indices (such as O32 and N2O2), with the latter exhibiting similar behavior to local extragalactic H ii regions. In contrast, diagnostics commonly used for measuring gas-phase O/H (such as N2 and O3N2) show relatively large scatter with the overall amount of oxygen present in the gas and behave differently than observed at z ~ 0. We provide a new calibration for using R23 to measure O/H in typical high-z galaxies, although it is most useful for relatively O-rich galaxies; combining O32 and R23 does not yield a more effective calibration. Finally, we consider the implications for the intrinsic correlations between physical conditions across the galaxy sample and find that N/O varies with O/H in high-z galaxies in a manner that is almost identical to local H ii regions. However, we do not find a strong anti-correlation between ionization parameter and metallicity (O/H or Fe/H) in high-z galaxies, which is one of the principal bases for using strong-line ratios to infer oxygen abundance
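For reference, the strong-line indices named above are flux ratios of nebular emission lines; a common set of definitions (exact line combinations vary slightly between authors) is:

```latex
R_{23} = \frac{[\mathrm{O\,II}]\,\lambda 3727 + [\mathrm{O\,III}]\,\lambda\lambda 4959,5007}{\mathrm{H}\beta},
\qquad
O_{32} = \frac{[\mathrm{O\,III}]\,\lambda\lambda 4959,5007}{[\mathrm{O\,II}]\,\lambda 3727}
```

N2 and O3N2 are defined analogously from [N II]λ6584, Hα, [O III]λ5007 and Hβ, and N2O2 from [N II]λ6584 and [O II]λ3727.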

    An architecture for knowledge-based systems [online]
