
    LCG MCDB -- a Knowledgebase of Monte Carlo Simulated Events

    In this paper we report on the LCG Monte Carlo Data Base (MCDB) and the software developed to operate it. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC collaborations by experts. In many cases, modern Monte Carlo simulation of physical processes requires expert knowledge of Monte Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase dedicated mainly to accumulating simulated events of this type. The main motivation behind LCG MCDB is to make sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.

    BSML: A Binding Schema Markup Language for Data Interchange in Problem Solving Environments (PSEs)

    We describe a binding schema markup language (BSML) for describing data interchange between scientific codes. Such a facility is an important constituent of scientific problem solving environments (PSEs). BSML is designed to integrate with a PSE or application composition system that views model specification and execution as a problem of managing semistructured data. The data interchange problem is addressed by three techniques for processing semistructured data: validation, binding, and conversion. We present BSML and describe its application to a PSE for wireless communications system design.
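
    As a rough, hypothetical illustration of the validate/bind/convert pipeline described above (the actual BSML syntax and API are not shown in the abstract, so all names below are invented for this sketch), a binding step might turn a semistructured record into a typed structure along these lines:

```python
# Hypothetical sketch of the validate/bind steps described for BSML.
# The schema format and function names are illustrative, not the real BSML API.
from dataclasses import dataclass

@dataclass
class AntennaConfig:          # a typed structure a scientific code might consume
    frequency_hz: float
    gain_db: float

SCHEMA = {"frequency_hz": float, "gain_db": float}   # toy "binding schema"

def validate(record: dict) -> None:
    """Check that the semistructured record matches the schema."""
    for field, ftype in SCHEMA.items():
        if field not in record:
            raise ValueError(f"missing field: {field}")
        ftype(record[field])  # conversion check; raises if the value cannot be converted

def bind(record: dict) -> AntennaConfig:
    """Convert validated, semistructured input into the typed binding."""
    validate(record)
    return AntennaConfig(frequency_hz=float(record["frequency_hz"]),
                         gain_db=float(record["gain_db"]))

print(bind({"frequency_hz": "2.4e9", "gain_db": "3.0"}))
```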

    Bridging the Semantic Gap with SQL Query Logs in Natural Language Interfaces to Databases

    A critical challenge in constructing a natural language interface to databases (NLIDB) is bridging the semantic gap between a natural language query (NLQ) and the underlying data. Two specific ways this challenge exhibits itself are keyword mapping and join path inference. Keyword mapping is the task of mapping individual keywords in the original NLQ to database elements (such as relations, attributes, or values). It is challenging due to the ambiguity in mapping the user's mental model and diction to the schema definition and contents of the underlying database. Join path inference is the process of selecting the relations and join conditions in the FROM clause of the final SQL query; it is difficult because NLIDB users lack knowledge of the database schema or SQL and therefore cannot explicitly specify the intermediate tables and joins needed to construct the final SQL query. In this paper, we propose leveraging information from the SQL query log of a database to enhance the performance of existing NLIDBs with respect to these challenges. We present Templar, a system that can be used to augment existing NLIDBs. Our extensive experimental evaluation demonstrates the effectiveness of our approach, yielding up to a 138% improvement in top-1 accuracy for existing NLIDBs by leveraging SQL query log information. Comment: Accepted to IEEE International Conference on Data Engineering (ICDE) 201
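
    The abstract does not spell out Templar's algorithm, but the general idea of log-driven scoring can be sketched as follows; the schema, query log, and scoring function below are purely illustrative:

```python
# Hedged sketch: rank candidate keyword-to-schema mappings by how often each
# database element appears in the SQL query log. This mirrors the general idea
# of log-driven scoring; it is not the actual Templar algorithm.
from collections import Counter
import re

query_log = [
    "SELECT name FROM author WHERE affiliation = 'MIT'",
    "SELECT title FROM paper JOIN writes ON paper.id = writes.paper_id",
    "SELECT name FROM author JOIN writes ON author.id = writes.author_id",
]

# Count how often each relation/attribute token shows up in the log.
log_freq = Counter(tok.lower() for q in query_log for tok in re.findall(r"\w+", q))

def score(candidates):
    """Order candidate schema elements for a keyword by log popularity."""
    return sorted(((c, log_freq[c.lower()]) for c in candidates),
                  key=lambda pair: pair[1], reverse=True)

# An ambiguous NLQ keyword could map to 'author' or 'reviewer';
# the log favours 'author' because it actually appears in past queries.
print(score(["author", "reviewer"]))
```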

    Generating SQL queries from visual specifications

    Abstract: Structured Query Language (SQL) is the most widely used declarative language for accessing relational databases, and an essential topic in introductory database courses at higher learning institutions. Despite the intuitiveness of SQL, formulating and comprehending written queries can be confusing, especially for undergraduate students. One major reason for this is that the seemingly simple syntax of SQL is often misleading and hard to comprehend. A number of tools have been developed to aid the comprehension of queries and improve students' mental models of the underlying logic of SQL. Some of these tools employ visualisation and animation to aid the comprehension of SQL. This paper presents an interactive comprehension aid based on visualisation, specifically designed to support the SQL SELECT statement, an area identified in the literature as problematic for students. The visualisation tool uses visual specifications depicting SQL operations to build queries, which is expected to reduce the cognitive load of a student learning SQL. We have shown with an online survey that adopting visual specifications in teaching systems assists students in attaining a richer learning experience in introductory database courses.
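
    To make the "visual specification to SQL" idea concrete, here is a minimal sketch assuming the tool emits some structured description of the query; the dictionary format and field names are invented for illustration and are not the tool's actual representation:

```python
# Minimal sketch of generating a SELECT statement from a structured
# specification, assuming the visual tool emits something dict-like.
def spec_to_sql(spec: dict) -> str:
    cols = ", ".join(spec.get("columns", ["*"]))
    sql = f"SELECT {cols} FROM {spec['table']}"
    if spec.get("where"):
        sql += " WHERE " + " AND ".join(spec["where"])
    if spec.get("order_by"):
        sql += " ORDER BY " + ", ".join(spec["order_by"])
    return sql

spec = {"table": "students",
        "columns": ["name", "grade"],
        "where": ["grade >= 60"],
        "order_by": ["grade DESC"]}
print(spec_to_sql(spec))
# SELECT name, grade FROM students WHERE grade >= 60 ORDER BY grade DESC
```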

    XWeB: the XML Warehouse Benchmark

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure the feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, together with its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.
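
    A benchmark of this kind typically pairs a data set with a timed query workload. The sketch below shows one plausible shape of such a driver; the XQuery text and the `execute_xquery` callable are stand-ins, not XWeB's actual workload or harness:

```python
# Hedged sketch of a benchmark driver in the spirit of XWeB: time a workload of
# decision-support queries against an XML DBMS. `execute_xquery` is a stand-in
# for whatever client API the system under test provides.
import time

workload = {
    # Illustrative XQuery text, not the actual XWeB workload.
    "Q1_sales_by_month": 'for $s in doc("warehouse.xml")//sale '
                         'group by $m := $s/@month '
                         'return <total month="{$m}">{sum($s/@amount)}</total>',
}

def run_benchmark(execute_xquery, repetitions: int = 3) -> dict:
    """Return the average wall-clock time per query over several runs."""
    timings = {}
    for name, query in workload.items():
        start = time.perf_counter()
        for _ in range(repetitions):
            execute_xquery(query)
        timings[name] = (time.perf_counter() - start) / repetitions
    return timings
```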

    Home-grown CASE tools with XML and XSLT

    This paper demonstrates an approach to software generation in which XML representations of models are transformed into implementations by XSLT style sheets. Although XSLT was not primarily intended for this use, it serves quite well. There are only a few problems with this approach, and we identify them based on our examples.
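
    As a toy illustration of this model-to-code idea (the model, stylesheet, and generated stub below are invented for the sketch and are not the paper's artifacts), an XSLT transformation applied with lxml might look like this:

```python
# Toy illustration: an XSLT stylesheet turns an XML class model into a Python
# stub. The model and stylesheet are made up for this sketch. Requires lxml.
from lxml import etree

model = etree.XML(
    '<model><class name="Customer"><attr name="email"/><attr name="name"/></class></model>')

stylesheet = etree.XML('''\
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="class">
    <xsl:text>class </xsl:text><xsl:value-of select="@name"/><xsl:text>:&#10;</xsl:text>
    <xsl:for-each select="attr">
      <xsl:text>    </xsl:text><xsl:value-of select="@name"/><xsl:text> = None&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>''')

transform = etree.XSLT(stylesheet)
print(str(transform(model)))   # emits a tiny Python class stub generated from the model
```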

    A DSL for EER Data Model Specification

    In this paper we present a domain-specific language (DSL) for the Extended Entity-Relationship (EER) data modeling approach, named EERDSL. EERDSL is part of our Multi-Paradigm Information System Modeling Tool (MIST), which provides EER database schema specification at the conceptual level and its transformation into a relational data model or a class model. The EERDSL modeling concepts are specified with Ecore, one of the commonly used approaches to creating meta-models. In the paper we present both the textual and the graphical notation of EERDSL. Since only a few modeling constraints can be described at the level of the abstract syntax, we use the Object Constraint Language (OCL) to specify complex validation rules for EER models.
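
    The following sketch mirrors, in plain Python rather than Ecore/OCL, the kind of model-plus-invariant the abstract describes; the entity structure and the validation rule are illustrative assumptions, not EERDSL's actual metamodel:

```python
# Hedged sketch: an EER-like entity description plus a validation rule of the
# kind the abstract says is expressed in OCL. Plain Python stand-in for the
# Ecore/OCL machinery; names and the invariant itself are illustrative.
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    is_key: bool = False

@dataclass
class Entity:
    name: str
    attributes: list = field(default_factory=list)

def validate_entity(entity: Entity) -> list:
    """Toy OCL-style invariant: a strong entity has exactly one key attribute."""
    errors = []
    keys = [a for a in entity.attributes if a.is_key]
    if len(keys) != 1:
        errors.append(f"{entity.name}: expected exactly one key attribute, found {len(keys)}")
    return errors

student = Entity("Student", [Attribute("id", is_key=True), Attribute("name")])
print(validate_entity(student))   # [] -> the invariant holds
```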

    Reverse Proxy Framework using Sanitization Technique for Intrusion Prevention in Database

    With the increasing importance of the internet in our day-to-day lives, data security in web applications has become very crucial. Ever-increasing online and real-time transaction services have led to a manifold rise in the problems associated with database security. Attackers use illegal and unauthorized approaches to hijack confidential information such as usernames, passwords, and other vital details; hence, real-time transactions require protection against web-based attacks. SQL injection and cross-site scripting are the most common application-layer attacks. An SQL injection attacker passes SQL statements through a web application's input fields, URL, or hidden parameters to gain access to the database or update it. The attacker takes advantage of user-provided data in such a way that the user's input is handled as SQL code; using this vulnerability, an attacker can execute SQL commands directly on the database. SQL injection attacks are among the most serious threats because they take user input and integrate it into the SQL query. A reverse proxy is a technique used to sanitize user inputs that may turn into a database attack. In this technique, a data redirector program redirects the user's input to the proxy server before it is sent to the application server. At the proxy server, a data-cleaning algorithm is triggered by a sanitizing application. In this framework we include detection and sanitization of the tainted information being sent to the database and propose a new prototype. Comment: 9 pages, 6 figures, 3 tables; CIIT 2013 International Conference, Mumba
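
    The code below is a generic illustration of the vulnerability and of a proxy-side sanitization step; it is not the paper's data-cleaning algorithm, and the regular expression used for filtering is a deliberately simple stand-in:

```python
# Generic illustration of SQL injection and of input sanitization at a proxy
# layer; this is not the paper's specific data-cleaning algorithm.
import re
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def vulnerable_login(username, password):
    # Unsafe: user input is concatenated into the SQL text, so an input such as
    # "' OR '1'='1" changes the meaning of the query (SQL injection).
    query = f"SELECT * FROM users WHERE name = '{username}' AND pw = '{password}'"
    return conn.execute(query).fetchall()

SUSPICIOUS = re.compile(r"('|--|;|\bOR\b|\bUNION\b)", re.IGNORECASE)

def sanitize(value):
    """Proxy-side cleaning step: reject inputs containing tokens typical of injection."""
    if SUSPICIOUS.search(value):
        raise ValueError("potential SQL injection detected")
    return value

def safe_login(username, password):
    # Defence in depth: sanitized input *and* a parameterized query.
    username, password = sanitize(username), sanitize(password)
    return conn.execute("SELECT * FROM users WHERE name = ? AND pw = ?",
                        (username, password)).fetchall()

print(safe_login("alice", "s3cret"))              # [('alice', 's3cret')]
print(vulnerable_login("alice", "' OR '1'='1"))   # returns the row despite a wrong password
```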

    On the Automated Verification of Web Applications with Embedded SQL

    A large number of web applications are based on a relational database together with a program, typically a script, that enables the user to interact with the database through embedded SQL queries and commands. In this paper, we introduce a method for the formal automated verification of such systems that connects database theory to mainstream program analysis. We identify a fragment of SQL which captures the behavior of the queries in our case studies, is algorithmically decidable, and facilitates the construction of weakest preconditions. Thus, we can integrate the analysis of SQL queries into a program analysis tool chain. To this end, we implement a new decision procedure for the SQL fragment that we introduce. We demonstrate the practical applicability of our results with three case studies: a web administrator, a simple firewall, and a conference management system.
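
    To give a flavour of the weakest-precondition construction mentioned above, here is a toy sketch phrased over a single abstract row; the paper's actual SQL fragment and decision procedure are not reproduced:

```python
# Toy illustration of the weakest-precondition idea for an embedded SQL update,
# phrased over a single abstract row. Only the shape of the idea is shown.

def wp_update(where, assign, post):
    """WP of `UPDATE t SET ... WHERE where` w.r.t. a row-wise postcondition:
    rows matched by WHERE must satisfy `post` after the assignment,
    unmatched rows must satisfy it as they are."""
    def pre(row):
        return post(assign(row)) if where(row) else post(row)
    return pre

# UPDATE accounts SET balance = balance - 50 WHERE id = 7
where  = lambda r: r["id"] == 7
assign = lambda r: {**r, "balance": r["balance"] - 50}
post   = lambda r: r["balance"] >= 0          # invariant: no negative balances

precondition = wp_update(where, assign, post)
print(precondition({"id": 7, "balance": 40}))   # False: the update would violate the invariant
print(precondition({"id": 7, "balance": 80}))   # True
```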