53 research outputs found

    Second CLIPS Conference Proceedings, volume 1

    Topics covered at the 2nd CLIPS Conference, held at the Johnson Space Center on September 23-25, 1991, are presented. Topics include rule groupings, fault detection using expert systems, decision making using expert systems, knowledge representation, computer-aided design, and debugging expert systems.

    Business rules based legacy system evolution towards service-oriented architecture.

    Enterprises can be empowered to live up to the potential of becoming dynamic, agile and real-time. Service orientation is emerging from the amalgamation of a number of key business, technology and cultural developments. Three essential trends in particular are coming together to create a new, revolutionary breed of enterprise, the service-oriented enterprise (SOE): (1) the continuous performance management of the enterprise; (2) the emergence of business process management; and (3) advances in standards-based service-oriented infrastructures. This thesis focuses on this emerging three-layered architecture: it builds on a service-oriented architecture framework and adds a process layer that brings technology and business together, plus a corporate performance layer that continually monitors and improves the performance indicators of global enterprises. This architecture provides a novel framework for the business context in which to apply the important technical idea of service orientation, and it moves service orientation from being an interesting tool for engineers to a vehicle for business managers to fundamentally improve their businesses.
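
    The layering summarised above can be pictured with a minimal, hypothetical sketch (the class and metric names below are illustrative assumptions, not taken from the thesis): a service layer exposes reusable operations, a process layer orchestrates them, and a performance layer records indicators for each process run.

```python
# Minimal sketch of a three-layered service-oriented enterprise (illustrative
# names only): service layer, process layer, and performance layer.
import time


class InvoiceService:
    """Service layer: a single, reusable business service."""
    def approve(self, invoice_id: str) -> bool:
        return invoice_id.startswith("INV")


class PerformanceMonitor:
    """Performance layer: records indicators for each process run."""
    def __init__(self) -> None:
        self.durations: list[float] = []

    def record(self, seconds: float) -> None:
        self.durations.append(seconds)

    def average_cycle_time(self) -> float:
        return sum(self.durations) / len(self.durations)


class ApprovalProcess:
    """Process layer: orchestrates services and reports to the monitor."""
    def __init__(self, service: InvoiceService, monitor: PerformanceMonitor) -> None:
        self.service = service
        self.monitor = monitor

    def run(self, invoice_id: str) -> bool:
        start = time.perf_counter()
        approved = self.service.approve(invoice_id)
        self.monitor.record(time.perf_counter() - start)
        return approved


monitor = PerformanceMonitor()
process = ApprovalProcess(InvoiceService(), monitor)
process.run("INV-001")
print(f"average cycle time: {monitor.average_cycle_time():.6f}s")
```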

    Data bases and data base systems related to NASA's aerospace program. A bibliography with indexes

    This bibliography lists 1778 reports, articles, and other documents introduced into the NASA scientific and technical information system from 1975 through 1980.

    An Empirical Investigation of the Causes and Consequences of Card-Not-Present Fraud, Its Impact and Solution

    The boom of electronic commerce technologies in recent years has drastically increased the demand for an effective electronic method to pay or be paid. The currently predominant method is online card payment, in which the cardholder is not present at the point of sale. However, this method is accompanied by serious vulnerabilities and serves as a low-risk avenue for fraudsters to steal card details with the intent to defraud online merchants. It is mostly these merchants who bear the overall risk and consequences, because they cannot produce a document signed by the legitimate cardholder. Several attempts and proposals have been introduced to solve this problem; however, many have failed to be adopted, while those that have been adopted have not adequately solved it. The card payment industry is fully aware of the problem and its consequences, but it has abdicated responsibility for fraud in this type of transaction and declines to guarantee the “card-not-present” fraud solutions that have been proposed during the past ten years. Instead, the industry has chosen only to accept responsibility for fraud arising from the “card-present” environment, which is of low risk because it uses chip-and-pin technology. As a result, many merchants have withdrawn from online business for fear of losses, while consumers sometimes turn back to alternative payment methods and traditional “bricks-and-mortar” shopping for fear of identity theft. In light of these problems and challenges, this research adopted a practice approach to investigate the causes and consequences of card-not-present fraud, the associated infiltration techniques, and the impact on the development of e-commerce, in order to establish an understanding of the background and characteristics of card-not-present fraud, its penetration techniques, and its aftermath. The research examines the results of that investigation and proposes a feasible solution, the 3W-ADA Sentry System, built on the framework of analytic geometry to counter threats of card-not-present fraud and related identity theft by introducing a non-electronic and low-cost dynamic tokenization process for card-not-present authentication. This proposal could eventually help to restore the security of online card payment authentication, restore the trust of participants, and improve the development of electronic commerce. However, due to limitations inherent in this research, it instead provides recommendations concerning the additional work that would be required to turn the 3W-ADA Sentry into an Association or Scheme, to promote its global adoption and its compatibility with the infrastructures and systems of relevant organisations.
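
    As a rough illustration of what dynamic tokenization buys in a card-not-present setting, the sketch below derives a single-use token per transaction from a shared secret, so an intercepted token cannot be replayed for the next purchase. This is a generic, hypothetical scheme for illustration only; it is not the 3W-ADA Sentry System, whose analytic-geometry construction is not detailed in the abstract.

```python
# Generic dynamic tokenization for card-not-present authentication
# (illustrative only, NOT the 3W-ADA Sentry System): every transaction is
# authorised with a one-time token derived from a shared secret and a counter.
import hashlib
import hmac


def derive_token(shared_secret: bytes, transaction_counter: int) -> str:
    """Derive a short, single-use token for one transaction."""
    message = transaction_counter.to_bytes(8, "big")
    digest = hmac.new(shared_secret, message, hashlib.sha256).hexdigest()
    return digest[:8]  # short enough to be read out or typed by the cardholder


def verify(shared_secret: bytes, transaction_counter: int, presented: str) -> bool:
    """Check a presented token against the expected one, in constant time."""
    expected = derive_token(shared_secret, transaction_counter)
    return hmac.compare_digest(expected, presented)


secret = b"issuer-and-cardholder-shared-secret"  # hypothetical enrolment secret
token = derive_token(secret, transaction_counter=42)
print(verify(secret, 42, token))  # True
print(verify(secret, 43, token))  # False: a stolen token cannot be replayed
```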

    Design For Change: Ontology-Driven Knowledgebase Applications For Dynamic Biological Domains

    Post-genomic biology is producing a plethora of rapidly changing, complex data, yet extracting useful information from this data is limited by current knowledge management methodologies. Biological knowledge management is complicated by ambiguous nomenclature, cultural differences between biologists and computer scientists, and conventional database technology that was not designed to support rapidly changing, complex domains. A recent trend in ontology-driven database design has emerged to address this challenge. While ontologies provide effective knowledge models, attempts to transform ontologies into knowledgebases have revealed an impedance mismatch, or ontology transformation gap. A unique methodology called Ultra-Structure Theory (UT) may provide an ontology transformation solution that supports large-scale, dynamic biological domains by expressing the complexity in data rather than in programming code. This thesis aims to survey ontology and database theory and methodologies, and to describe how UT integrates and extends them to provide a flexible, semantically expressive knowledgebase solution using standard relational database technology.
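
    The "complexity in data rather than code" idea can be sketched with an ordinary relational table: domain rules live as rows and are interpreted by one small, generic engine, so the knowledgebase changes by inserting rows instead of rewriting code. The table layout and example terms below are assumptions for illustration, not the actual structures defined by Ultra-Structure Theory or this thesis.

```python
# Rules stored as relational data and read by a generic engine; adding domain
# knowledge means adding rows, not code. Schema and terms are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rules (subject TEXT, relation TEXT, object TEXT)")
conn.executemany(
    "INSERT INTO rules VALUES (?, ?, ?)",
    [
        ("BRCA1", "is_a", "gene"),
        ("gene", "is_a", "nucleic acid feature"),
        ("BRCA1", "participates_in", "DNA repair"),
    ],
)


def ancestors(term: str) -> set:
    """Generic engine: follow 'is_a' rows transitively, whatever the domain."""
    found, frontier = set(), {term}
    while frontier:
        placeholders = ",".join("?" * len(frontier))
        rows = conn.execute(
            "SELECT object FROM rules WHERE relation = 'is_a' AND subject IN "
            f"({placeholders})",
            tuple(frontier),
        ).fetchall()
        frontier = {obj for (obj,) in rows} - found
        found |= frontier
    return found


print(ancestors("BRCA1"))  # {'gene', 'nucleic acid feature'}
```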

    Simplifying the use of event-based systems with context mediation and declarative descriptions

    Current trends like the proliferation of sensors or the Internet of Things lead to Cyber-physical Systems (CPSs). In these systems, many different components communicate by exchanging events. While events provide a convenient abstraction for handling the high load these systems generate, CPSs are very complex and require expert computer scientists to handle them correctly. We realized that one of the primary reasons for this inherent complexity is that events do not carry context. We analyzed the context of events and found that there are two dimensions: context about the data of an event and context about the event itself. Context about the data includes assumptions, such as systems of measurement units or the structure of the encoded information, that are required to correctly understand the event. Context about the event itself is data that adds to the information carried by the event. For example, an event might carry positional data; the additional information could then be the identifier of the room belonging to this position.

    Context about the data helps bridge the heterogeneity that CPSs possess. Event producers and consumers may have different assumptions about the data and thus interpret events in different ways. To overcome this gap, we developed the ACTrESS middleware. ACTrESS provides a model to encode interpretation assumptions in an interpretation context. Clients can thus make their assumptions explicit and send them to the middleware, which is then able to mediate between different contexts by transforming events. By analyzing the provided contexts, ACTrESS can generate transformers, which are dynamically loaded into the system, so it does not need to rely on costly operations like reflection. To prove this, we conducted a performance study, which shows that in a content-based publish/subscribe system the overhead introduced by ACTrESS’ transformations is too small to be measurable.

    Because events do not carry contextual information, expert computer scientists are required to describe situations that are made up of multiple events. The fact that CPSs promise to transform our everyday life (e.g., smart homes) makes this problem even more severe, in that most of the target users cannot use CPSs. In this thesis, we developed a declarative language to easily describe situations and a desired reaction, and we provide a mechanism to translate this high-level description into executable code. The key idea is that events are contextualized, i.e. our middleware enriches an event with the missing contextual information based on the situation description. The enriched events are then correlated and combined automatically, to ultimately decide whether the described situation is fulfilled. By generating small computational units, we achieve good parallelization and are able to elegantly scale up and down, which makes our approach particularly suitable for modern cloud architectures. We conducted a usability analysis and a performance study. The usability analysis shows that our approach significantly simplifies the definition of reactive behavior in CPSs. The performance study shows that the automatic distribution and parallelization incur only a small performance cost compared to highly optimized systems like Esper.
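
    The context-mediation idea described above can be pictured with a toy sketch, assuming a single measurement-unit assumption per context. The Context shape and transformer logic below are illustrative, not the actual ACTrESS model or API: producer and consumer each declare their interpretation context, and the middleware derives a transformer that rewrites events between them.

```python
# Toy sketch of context mediation between event producers and consumers
# (illustrative only, not the ACTrESS API): declared contexts drive the
# generation of an event transformer.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Context:
    temperature_unit: str  # "celsius" or "fahrenheit"


def make_transformer(src: Context, dst: Context) -> Callable[[Dict], Dict]:
    """Derive an event transformer from two declared interpretation contexts."""
    if src.temperature_unit == dst.temperature_unit:
        return lambda event: event
    if (src.temperature_unit, dst.temperature_unit) == ("celsius", "fahrenheit"):
        return lambda event: {**event, "temperature": event["temperature"] * 9 / 5 + 32}
    return lambda event: {**event, "temperature": (event["temperature"] - 32) * 5 / 9}


producer_ctx = Context("celsius")     # producer publishes metric readings
consumer_ctx = Context("fahrenheit")  # consumer expects imperial readings
transform = make_transformer(producer_ctx, consumer_ctx)

event = {"sensor": "room-42", "temperature": 21.0}
print(transform(event))  # {'sensor': 'room-42', 'temperature': 69.8}
```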

    Towards a Research Agenda on Computer-Based Assessment - Challenges and Needs for European Educational Measurement

    In 2006, the European Parliament and the Council of Europe passed recommendations on key competences for lifelong learning and on the use of a common reference tool to observe and promote progress towards the goals formulated in the "Lisbon strategy" in March 2000 (revised in 2006, see http://ec.europa.eu/growthandjobs/) and its follow-up declarations. For those areas not already covered by existing measurements (foreign languages and learning-to-learn skills), indicators for the identification of such skills are now needed, as well as effective instruments for carrying out large-scale assessments in Europe. In this context, it is hoped that electronic testing could improve the effectiveness of the needed assessments, i.e. improve the identification of skills while reducing the costs of the whole operation (financial effort, human resources, etc.). The European Commission is asked to assist Member States in defining the organisational and resource implications of constructing and administering the tests, including looking into the possibility of adopting e-testing as the means to administer them. In addition to traditional testing approaches carried out in paper-and-pencil mode, a variety of aspects need to be taken into account when computer-based testing is deployed, such as software quality, secure delivery, reliable network capacity (if Internet-based), support, maintenance, and software costs for development and test delivery, including licences. Future European surveys are going to introduce new ways of assessing student achievement. Tests can be calibrated to the specific competence level of each student and become more stimulating, going much further than can be achieved with traditional multiple-choice questions. Simulations provide better means of contextualising skills in real-life situations and provide a more complete picture of the actual competence to be assessed. However, a variety of challenges require more research into the barriers posed by the use of technologies, e.g. in terms of computer performance and security. The "Quality of Scientific Information" Action (QSI) and the Centre for Research on Lifelong Learning (CRELL) are carrying out a research project on quality criteria of Open Source skills assessment tools. Two workshops were carried out in previous years, bringing together key European experts from assessment research and practice in order to identify and discuss quality criteria relevant for carrying out large-scale assessments at a European level. This report reflects the contributions made on experiences and key challenges for European skills assessment. (JRC.G.9 - Econometrics and statistical support to antifraud)
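
    The calibration of tests to each student's competence level mentioned above is the core of adaptive testing; a minimal sketch under a simple one-parameter (Rasch) model is given below. The item difficulties, simulated responses, and the crude ability update are made-up illustrations, not part of the report.

```python
# Minimal adaptive-testing sketch under a Rasch (one-parameter) model:
# pick the item whose difficulty is closest to the current ability estimate,
# then nudge the estimate after each answer. All values are illustrative.
import math


def p_correct(ability: float, difficulty: float) -> float:
    """Rasch model: probability of answering an item correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))


def next_item(ability: float, remaining: dict) -> str:
    """Most informative item: difficulty nearest the ability estimate."""
    return min(remaining, key=lambda item: abs(remaining[item] - ability))


ability = 0.0
items = {"q1": -1.0, "q2": 0.2, "q3": 1.5}            # item difficulties
answers = {"q1": True, "q2": True, "q3": False}       # simulated responses

for _ in range(len(answers)):
    item = next_item(ability, items)
    difficulty = items.pop(item)
    correct = answers[item]
    # crude step update: move the estimate toward the surprise in the response
    ability += 0.5 * ((1.0 if correct else 0.0) - p_correct(ability, difficulty))
    print(f"{item}: correct={correct}, ability estimate -> {ability:+.2f}")
```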

    Realizing Business Benefits from Company IT Standardization: Case Study Research into the Organizational Value of IT Standards, Towards a Company IT Standardization Management Framework.

    From a practical point of view, this research provides insight into how company IT standards affect business process performance. Furthermore, it gives recommendations on how to govern and manage such standards successfully with regard to their selection, implementation and usage. After evaluating this research, businesses may wish to reconsider the way they currently view the value of company IT standards and the manner in which they deal with them.