43 research outputs found

    A Language for Rule-based Systems

    Get PDF
    Expert systems are proliferating in situations in which it is important to capture human expertise in a computer system. Such systems are useful when human expertise is expensive or difficult to obtain, or when the operating environment is too dangerous for a person. Expert systems are used to address the following categories of problems: interpretation, prediction, diagnosis, design, planning, monitoring, debugging, repair, instruction, and control. [Hayes-Roth] Expert systems have now moved out of the laboratory and are being used in production environments. Herein lies the problem addressed by this research. Expert systems have traditionally been built in a research environment in which the software engineering of the product is not particularly important. Production environments are much more demanding. The quality necessary to withstand continual use and abuse is not generally built into research-quality expert systems. The problem is further exacerbated when an expert system is to be embedded in an autonomous system for which human interaction is difficult. (For example, an expert system could be used to drive a robot in a hazardous environment; if the expert system fails, it may not be easy for a human to reach the robot for repair.) Quality in these situations is vital.
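    The rule-based systems this abstract discusses can be illustrated by a minimal forward-chaining engine; the rules and facts below are invented for illustration and do not come from the paper:

```python
# Minimal forward-chaining rule engine: repeatedly fire any rule whose
# premises are all satisfied, until no new facts can be derived.

def forward_chain(facts, rules):
    """Apply rules of the form (premises, conclusion) to a set of facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical rules for an autonomous robot, echoing the abstract's example.
rules = [
    ({"sensor_offline", "no_operator_reachable"}, "enter_safe_mode"),
    ({"enter_safe_mode"}, "log_incident"),
]
derived = forward_chain({"sensor_offline", "no_operator_reachable"}, rules)
# derived now also contains "enter_safe_mode" and "log_incident"
```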

    The Integration of Software Development Tools

    Get PDF
    The effectiveness of software development tools can be increased by their integration (i.e., their cooperation). This paper discusses the problems to be overcome in integrating tools and presents a categorization of the degree of tool integration. The continuum from loose to tight integration is parameterized. An informal method is described for applying these parameters to tools in order to determine some measure of their ability to be integrated.

    Techniques for the Integration of Existing Tools

    Get PDF
    The purpose of this paper is to explain and demonstrate the advantages of tool integration and the reuse of tools. Several techniques for the integration of existing tools are presented and discussed. These techniques include the use of a monitor, simulated incremental operation, syntax padding, view extraction, and output distribution. The advantages of these methods of tool integration are illustrated by their use in integrating an existing compiler and on-line debugger. The commands produced by this synergism have increased both the user-friendliness of the tools and the power of the resultant command.

    Objects and Types: A Tutorial

    Get PDF
    This paper is a tutorial explaining the concepts that surround abstract data types and object-oriented programming, and the relationships between these groups of concepts. These concepts include types (language-defined, user-defined, abstract), instantiations, differences between operations and functions, overloading, objects, state, inheritance, and messages. Some of these terms, e.g. "type", have been well defined. Many others are used in several contexts with multiple meanings. This paper is an attempt to identify consistent and meaningful definitions that are the most widely accepted.

    Using Dissimilarity Metrics to Identify Interesting Designs

    Get PDF
    A computer program helps to blend the power of automated-search software, which is able to generate large numbers of design solutions, with the insight of expert designers, who are able to identify preferred designs but do not have time to examine all the solutions. From among the many automated solutions to a given design problem, the program selects a smaller number of solutions that are worthy of scrutiny by the experts in the sense that they are sufficiently dissimilar from each other. The program makes the selection in an interactive process that involves a sequence of data-mining steps interspersed with visual displays of results of these steps to the experts. At crucial points between steps, the experts provide directives to guide the process. The program uses heuristic search techniques to identify nearly optimal design solutions and uses dissimilarity metrics defined by the experts to characterize the degree to which solutions are interestingly different. The search, data-mining, and visualization features of the program were derived from previously developed risk-management software used to support a risk-centric design methodology.
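    One common way to pick a small subset of mutually dissimilar solutions, as the abstract describes, is a greedy max-min selection over an expert-supplied dissimilarity metric. The sketch below is illustrative, not the paper's actual program; the designs and the Euclidean metric are assumptions:

```python
# Greedy max-min selection: repeatedly pick the candidate whose minimum
# dissimilarity to the already-chosen set is largest, so the chosen
# solutions spread out across the design space.

def select_dissimilar(solutions, dissimilarity, k):
    """Choose k solutions that are pairwise as dissimilar as possible."""
    chosen = [solutions[0]]  # seed with an arbitrary solution
    while len(chosen) < k:
        best = max(
            (s for s in solutions if s not in chosen),
            key=lambda s: min(dissimilarity(s, c) for c in chosen),
        )
        chosen.append(best)
    return chosen

# Example: designs as (mass, cost) pairs with Euclidean dissimilarity.
designs = [(1.0, 2.0), (1.1, 2.1), (5.0, 1.0), (2.0, 8.0), (5.1, 1.2)]
euclid = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
picked = select_dissimilar(designs, euclid, 3)
# Near-duplicate designs are filtered out; picked spans the space.
```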

    Visual Depiction of Decision Statements: What is Best for Programmers and Non-programmers

    Get PDF
    This paper reports the results of two experiments investigating differences in comprehensibility of textual and graphical notations for representing decision statements. The first experiment was a replication of a prior experiment that found textual notations to be better than particular graphical notations. After replicating this study, two other hypotheses were investigated in a second experiment. Our first claim is that graphics may be better for technical non-programmers than they are for programmers because of the great amount of experience that programmers have with textual notations in programming languages. The second is that modifications to graphical forms may improve their usefulness. The results support both of these hypotheses. Keywords: visual programming, decision structures, program comprehension, expert-novice differences.

    Developing Software Requirements for a Knowledge Management System that Coordinates Training Programs with Business Processes and Policies in Large Organizations

    Get PDF
    For large organizations, updating instructional programs presents a challenge to keep abreast of constantly changing business processes and policies. Each time a process or policy changes, significant resources are required to locate and modify the training materials that convey the new content. Moreover, without the ability to track learning objects to processes and policies, training managers cannot conduct an effective training gap analysis in these areas. As a result, the corporate training picture is unclear and instructional needs cannot be accurately determined. The research addressed these problems by recognizing the need for linkages between an organization's business processes, its policies, and the learning objects that package the corresponding training content and deliver it to the workforce. The overall investigation was completed in three parts. In the first study, a thorough examination of the literature was conducted to determine the extent of the research problem and to provide a theoretical foundation for a solution. In the second study, an expert panel was used to elicit user needs for a knowledge management system that addresses training management shortcomings in a large law enforcement agency. Another expert panel from that agency validated and prioritized the user needs during the third study. Through a combination of research-based elicitation and validation techniques, an accurate list of natural language software requirements emerged to represent the collective needs of the law enforcement training experts. The software requirements may now serve to analyze the capabilities of existing information technology systems or to form the basis for a request for proposal (RFP) to build the envisioned knowledge management system.

    Towards a development of a Social Engineering eXposure Index (SEXI) using publicly available personal information

    Get PDF
    Millions of people willingly expose their lives via Internet technologies every day, and even those who stay off the Internet find themselves exposed through data breaches. Trillions of private information records flow through the Internet. Marketers gather personal preferences to coerce shopping behavior, while providers gather personal information to provide enhanced services. Few users have considered where their information is going or who has access to it. Even fewer are aware of how decisions made in their own lives expose significant pieces of information, which can be used by cyber attackers to harm the very organizations with which they are affiliated. While this threat can affect everyone, upper management poses a significantly higher risk due to their level of access to critical data and finances targeted by cybercrime. Thus, the goal of this work-in-progress research is to develop and validate a means to measure exposure to social engineering of 100 executives from Fortune 500 companies. This work-in-progress study will include a mixed methods approach combining an expert panel using the Delphi method, developmental research, and a quantitative data collection. The expert panel will provide a weighted evaluation instrument, subsequently used to develop an algorithm that will form the basis for a Social Engineering eXposure Index (SEXI) using publicly available personal information found on the Internet on these executives, which will help quantify the exposure of each executive. The collected data will be quantitatively evaluated, analyzed, and presented.
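    A weighted evaluation instrument of the kind the abstract proposes typically reduces to a weighted sum of scored indicators. The sketch below is a hypothetical illustration of that idea; the indicator names, weights, and scores are invented assumptions, not values from the SEXI study:

```python
# Hypothetical weighted exposure index: each publicly observable indicator is
# scored 0.0 (not found) to 1.0 (fully exposed), weighted by expert-assigned
# importance, and normalized to a 0-100 scale.

def exposure_index(indicators, weights):
    """Return the weighted-average exposure score on a 0-100 scale."""
    total_weight = sum(weights.values())
    raw = sum(weights[name] * indicators[name] for name in weights)
    return 100.0 * raw / total_weight

# Illustrative expert weights and one executive's indicator scores.
weights = {"social_media_presence": 3, "home_address_found": 5, "family_details": 2}
executive = {"social_media_presence": 1.0, "home_address_found": 0.5, "family_details": 1.0}
sexi = exposure_index(executive, weights)  # → 75.0
```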

    Enhanced firing of locus coeruleus neurons and SK channel dysfunction are conserved in distinct models of prodromal Parkinson's disease

    Get PDF
    Parkinson’s disease (PD) is clinically defined by the presence of the cardinal motor symptoms, which are associated with a loss of dopaminergic nigrostriatal neurons in the substantia nigra pars compacta (SNpc). While SNpc neurons serve as the prototypical cell type to study cellular vulnerability in PD, there is an unmet need to extend our efforts to other neurons at risk. The noradrenergic locus coeruleus (LC) represents one of the first brain structures affected in PD and not only plays a crucial role in the evolving non-motor symptomatology, but is also believed to contribute to disease progression through efferent noradrenergic deficiency. Therefore, we sought to characterize the electrophysiological properties of LC neurons in two distinct PD models: (1) in an in vivo mouse model of focal α-synuclein overexpression; and (2) in an in vitro rotenone-induced PD model. Despite the fundamental differences of these two PD models, α-synuclein overexpression as well as rotenone exposure led to an accelerated autonomous pacemaker frequency of LC neurons, accompanied by severe alterations of the afterhyperpolarization amplitude. On the mechanistic side, we suggest that Ca(2+)-activated K(+) (SK) channels are mediators of the increased LC neuronal excitability, as pharmacological activation of these channels is sufficient to prevent increased LC pacemaking and subsequent neuronal loss in the LC following in vitro rotenone exposure. These findings suggest a role of SK channels in PD by linking α-synuclein- and rotenone-induced changes in LC firing rate to SK channel dysfunction.