
    Personalization by Partial Evaluation.

    The central contribution of this paper is to model personalization by the programmatic notion of partial evaluation. Partial evaluation is a technique used to automatically specialize programs, given incomplete information about their input. The methodology presented here models a collection of information resources as a program (which abstracts the underlying schema of organization and flow of information), partially evaluates the program with respect to user input, and recreates a personalized site from the specialized program. This enables a customizable methodology called PIPE that supports the automatic specialization of resources without enumerating the interaction sequences beforehand. Issues relating to the scalability of PIPE, information integration, sessionizing scenarios, and case studies are presented.
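    The specialization idea can be illustrated with a minimal Python sketch (not the PIPE implementation; the function, inputs, and paths below are invented for illustration): a site is modeled as a program over user inputs, and fixing the inputs already known yields a smaller, personalized program.

```python
from functools import partial

# A site modeled as a program: a resource is reached once all inputs are known.
def site(topic: str, level: str, fmt: str) -> str:
    return f"/{topic}/{level}/{fmt}"

# Partial evaluation in miniature: fixing the known user inputs yields a
# specialized "site" that only asks for what is still unknown.
personalized = partial(site, topic="databases", level="intro")

print(personalized(fmt="pdf"))  # -> /databases/intro/pdf
```

    A true partial evaluator would also simplify the program body itself; `functools.partial` merely fixes arguments, but it conveys the specialization idea.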

    The applicability of Process Mining to determine and align process model descriptions

    Within the HU University of Applied Sciences (HU), the department HU Services (HUS) does not have enough insight into its IT Service Management processes to align them with the new Information System that has been implemented to support the service management function. The problem that arises from this is that it is not clear to the HU how the actual Incident Management process, as facilitated by the application, is actually executed. Subsequently, it is not clear what adjustments have to be made to the process descriptions to have them resemble the process in the IT Service Management tool. To determine the actual process, the HU wants to use Process Mining. The research question for this study is therefore: ‘How is Process Mining applicable to determine the actual Incident Management process and align this to the existing process model descriptions?’ For this research, a case study was performed using Process Mining to check whether the actual process resembles the predefined process. The findings show that it is not possible to mine the process within the scope of the predefined process: the event data are too limited in granularity. From this we conclude that adjusting the granularity of the given process model to the granularity of the available event data, or vice versa, is important.
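    The granularity mismatch described here can be sketched in a few lines of Python (the activity names and mapping are invented for illustration): coarse-grained event data simply cannot evidence most steps of a fine-grained process model.

```python
# Fine-grained model activities vs. coarse-grained event data (hypothetical names).
model_steps = {"Register", "Classify", "Investigate", "Resolve", "Close"}
log_events = {"TicketCreated", "TicketClosed"}

# Which model steps does the event log actually evidence?
event_to_step = {"TicketCreated": "Register", "TicketClosed": "Close"}
covered = {event_to_step[e] for e in log_events if e in event_to_step}
uncovered = model_steps - covered

print(sorted(uncovered))  # model steps the event data cannot confirm
```

    When `uncovered` is non-empty, either the model must be coarsened or the logging made finer before mining can align the two, which is the adjustment the study concludes is needed.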

    A Systematic Review of Automated Query Reformulations in Source Code Search

    Fixing software bugs and adding new features are two of the major maintenance tasks. Software bugs and features are reported as change requests. Developers consult these requests and often choose a few keywords from them as an ad hoc query. They then execute the query with a search engine to find the exact locations within the software code that need to be changed. Unfortunately, even experienced developers often fail to choose appropriate queries, which leads to costly trial and error during code search. Over the years, many studies have attempted to reformulate developers' ad hoc queries to support them. In this systematic literature review, we carefully select 70 primary studies on query reformulation from 2,970 candidate studies, perform an in-depth qualitative analysis (e.g., Grounded Theory), and then answer seven research questions with major findings. First, to date, eight major methodologies (e.g., term weighting, term co-occurrence analysis, thesaurus lookup) have been adopted to reformulate queries. Second, the existing studies suffer from several major limitations (e.g., lack of generalizability, the vocabulary mismatch problem, subjective bias) that might prevent their wide adoption. Finally, we discuss best practices and future opportunities to advance the state of research in search query reformulation.
    Comment: 81 pages, accepted at TOSE
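    One of the surveyed methodologies, term weighting, can be sketched as a TF-IDF ranking of a change request's terms against a corpus of past requests (the data and function below are a toy illustration; real reformulation tools add stemming, stop-word filtering, and code-aware tokenization):

```python
import math
from collections import Counter

def top_terms(request: str, corpus: list[str], k: int = 3) -> list[str]:
    """Rank the request's terms by TF-IDF against past change requests."""
    docs = [set(d.split()) for d in corpus]
    tf = Counter(request.split())
    n = len(corpus)

    def idf(term: str) -> float:
        df = sum(term in d for d in docs)        # document frequency
        return math.log((n + 1) / (df + 1)) + 1  # smoothed inverse frequency

    score = {t: c * idf(t) for t, c in tf.items()}
    return sorted(score, key=score.get, reverse=True)[:k]

past = ["login button crash", "save file error", "login page slow"]
print(top_terms("app crash login button", past))  # -> ['app', 'crash', 'button']
```

    Terms that are frequent in the request but rare across past requests are promoted, which is the intuition behind weighting-based query reformulation.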

    Generating eScience Workflows from Statistical Analysis of Prior Data

    A number of workflow design tools have been developed specifically to enable easy graphical specification of workflows that ensure systematic scientific data capture and analysis and precise provenance information. We believe that an important component missing from these existing workflow specification and enactment systems is integration with tools that enable prior detailed analysis of the existing data, in particular statistical analysis. By thoroughly analyzing the existing relevant datasets first, it is possible to determine precisely where the existing data are sparse or insufficient and what further experimentation is required. Introducing statistical analysis into experimental design will reduce the duplication and costs associated with fruitless experimentation and maximize opportunities for scientific breakthroughs. In this paper we describe a workflow specification system that we have developed for a particular eScience application (fuel cell optimization). Experimental workflow instances are generated as a result of detailed statistical analysis and interactive exploration of the existing datasets. This is carried out through a graphical data exploration interface that integrates the widely used open-source statistical analysis software package R as a web service.
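    The idea of letting prior-data analysis drive experiment generation can be sketched as follows (the design grid, parameters, and counts are invented for illustration; the paper's actual analysis uses R, not this toy coverage check): cells of the design space with no prior runs become candidate workflow instances.

```python
import itertools
from collections import Counter

# Hypothetical prior experiments over a (temperature, pressure) design grid.
done = [(60, 1), (60, 2), (80, 1), (80, 1), (80, 2)]
grid = list(itertools.product([60, 70, 80], [1, 2]))

counts = Counter(done)
# Propose new experiments wherever prior data is absent.
proposed = [cell for cell in grid if counts[cell] == 0]

print(proposed)  # -> [(70, 1), (70, 2)]
```

    Each proposed cell would then be instantiated as a concrete experimental workflow, avoiding duplication of the runs already on record.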

    Debian Clusters for Education and Research: The Missing Manual


    Automated Knowledge Extraction from Archival Documents, 2019

    Traditional archival media such as paper, film, and photographs contain a vast store of knowledge. Much of this knowledge is applicable to current business and scientific problems and offers solutions; consequently, there is value in extracting this information. While it is possible to extract the content manually, this technique is not feasible for large knowledge repositories due to cost and time. In this thesis, we develop a system that can extract such knowledge automatically from large repositories. A Graphical User Interface that permits users to indicate the location of the knowledge components (indexes) is developed, and software features that permit automatic extraction of indexes from similar documents are presented. The indexes and the documents are stored in a persistent data store. The system is tested on a University Registrar's legacy paper-based transcript repository. The study shows that the system provides a good solution for large-scale extraction of knowledge from archived paper and other media.

    Leveraging Large Language Models (LLMs) for Process Mining (Technical Report)

    This technical report describes the intersection of process mining and large language models (LLMs), specifically focusing on the abstraction of traditional and object-centric process mining artifacts into textual format. We introduce and explore various prompting strategies: direct answering, where the large language model directly addresses user queries; multi-prompt answering, which allows the model to incrementally build on the knowledge obtained through a series of prompts; and the generation of database queries, facilitating the validation of hypotheses against the original event log. Our assessment considers two large language models, GPT-4 and Google's Bard, under various contextual scenarios across all prompting strategies. Results indicate that these models exhibit a robust understanding of key process mining abstractions, with notable proficiency in interpreting both declarative and procedural process models. In addition, we find that both models demonstrate strong performance in the object-centric setting, which could significantly propel the advancement of the object-centric process mining discipline. These models also display a noteworthy capacity to evaluate various concepts of fairness in process mining, opening the door to more rapid and efficient assessments of the fairness of process mining event logs, with significant implications for the field. The integration of these large language models into process mining applications may open new avenues for exploration, innovation, and insight generation.
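    The textual-abstraction step that underlies all of the report's prompting strategies can be sketched as follows (the traces, wording, and prompt template are invented; the report's actual abstractions and LLM calls are not reproduced here):

```python
# Hypothetical event log: (case id, ordered activities).
event_log = [
    ("case-1", ["register", "check", "approve"]),
    ("case-2", ["register", "check", "reject"]),
]

def abstract_log(log) -> str:
    """Abstract traces into the textual format an LLM prompt can consume."""
    return "\n".join(f"{cid}: " + " -> ".join(acts) for cid, acts in log)

# Direct answering: a single prompt embedding the abstraction plus the question.
prompt = ("Given the following process traces:\n"
          + abstract_log(event_log)
          + "\nDescribe the underlying process model.")
print(prompt)
```

    Multi-prompt answering would issue follow-up prompts that build on this context, and the query-generation strategy would instead ask the model to emit a database query to be validated against the original event log.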

    Integrated stability mapping system for mines

    The Integrated Stability Mapping System (ISMS) was developed as an engineering tool to quantify the geologic and geo-mechanical information of mines and to integrate the critical stability influence factors into an overall stability index for use in mine planning and support design. It is generally understood that inherent underground roof stability is determined by the interaction of both the given geologic characteristics and the local stress influences. From this perspective, this dissertation establishes the need for an integrated stability mapping system by investigating traditional and current hazard mapping practices. To fulfill this need, computer-aided hazard mapping techniques and popular numerical methods for geo-mechanical analysis are reviewed. An integrated stability mapping system incorporating geology hazard mapping, geologic structural feature impacts, and advanced numerical stress analysis techniques into one solution has then been developed.

    The stability mapping system is implemented inside the de facto standard drawing environment, AutoCAD, and is compatible with the widely used geology modeling software SurvCADD. This allows one to access numerous existing geologic data and mining information from present mine maps easily and directly. The LaModel stress calculation, a boundary element method integrated within the mapping system, can produce realistic and accurate stress and displacement analysis through its distinguishing features, such as the laminated overburden model, true topography consideration, and actual irregular pillar matching.

    After the stability mapping system was developed, two case studies were performed to check for coding errors and calculation accuracy, and to demonstrate the functionality and usefulness of the system. In the case studies, the composite stability index was compared with field observations. A good correlation was found, although only a few influence factors were considered.

    In the conclusion of this dissertation, it is suggested that the stability mapping system provides mining engineers with the ability to perform comprehensive, rapid, and accurate multiple-factor stability mapping analysis. The resultant stability map can then be a valuable guide to safer support design and better mine planning, ultimately increasing the safety of mine design and reducing the injuries and fatalities associated with ground falls in underground mines.
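    The notion of a composite stability index can be sketched as a weighted combination of normalized influence factors (the factor names, weights, and ratings below are invented for illustration; the dissertation's actual factors and weighting scheme are not reproduced here):

```python
# Hypothetical influence-factor weights (must sum to 1.0).
weights = {"roof_geology": 0.4, "vertical_stress": 0.35, "structural_features": 0.25}

def stability_index(factors: dict) -> float:
    """Combine normalized factor ratings (0 = worst, 1 = best) into one index."""
    return sum(weights[name] * rating for name, rating in factors.items())

# Ratings for one map cell, e.g. from geology data and a LaModel-style stress run.
cell = {"roof_geology": 0.8, "vertical_stress": 0.5, "structural_features": 0.9}
print(round(stability_index(cell), 3))  # -> 0.72
```

    Evaluating such an index over every cell of a mine map is what turns the individual hazard layers into a single stability map for planning and support design.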