
    Local Type Checking for Linked Data Consumers

    The Web of Linked Data is the culmination of over a decade of work by the Web standards community in their effort to make data more Web-like. We provide an introduction to the Web of Linked Data from the perspective of a Web developer who would like to build an application using Linked Data. We identify a weakness in the development stack: the lack of domain-specific scripting languages for designing background processes that consume Linked Data. To address this weakness, we design a scripting language with a simple but appropriate type system. In our proposed architecture some data is consumed from sources outside the control of the system and some data is held locally. Stronger type assumptions can be made about the local data than about external data, hence our type system mixes static and dynamic typing. Throughout, we relate our work to the W3C recommendations that drive Linked Data, so our syntax is accessible to Web developers. Comment: In Proceedings WWV 2013, arXiv:1308.026
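    The abstract's central architectural idea, stronger (static-style) assumptions about local data versus runtime checks on external Linked Data, can be illustrated with a minimal Python sketch. The Person class, the FOAF property keys and the shape of the remote record are assumptions made for illustration only; the paper defines its own scripting language and type system, not the code below.

```python
# A minimal sketch, not the paper's scripting language: locally held data is
# given a fixed, trusted shape, while external Linked Data is checked
# dynamically at the point where it is consumed.
from dataclasses import dataclass

@dataclass
class Person:                 # local data: fields and types are known up front
    name: str
    homepage: str

LOCAL = [Person("Alice", "http://example.org/alice")]

def consume_external(record: dict) -> Person:
    """External data is outside our control, so verify its shape at runtime
    before admitting it into the locally typed store."""
    name = record.get("http://xmlns.com/foaf/0.1/name")
    homepage = record.get("http://xmlns.com/foaf/0.1/homepage")
    if not isinstance(name, str) or not isinstance(homepage, str):
        raise TypeError("remote resource does not match the expected shape")
    return Person(name, homepage)

# A remote JSON-LD-like record (in a real consumer this would be fetched over HTTP).
remote = {
    "http://xmlns.com/foaf/0.1/name": "Bob",
    "http://xmlns.com/foaf/0.1/homepage": "http://example.org/bob",
}
LOCAL.append(consume_external(remote))
print(LOCAL)
```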

    Blocking SQL Injection in Database Stored Procedures

    This thesis summarises the work done for our B.Tech project during the 2009-2010 academic session. The area chosen for the project was SQL injection attacks and methods to prevent them; the thesis describes four proposed models to block SQL injection, all of them drawn from published research papers. It then details the implementation of the model “SQL Injection prevention in database stored procedures” proposed by K. Muthuprasanna et al., a technique to prevent injection attacks arising from dynamic SQL statements in database stored procedures, which are often used in e-commerce applications. The thesis also contains the algorithms used, data-flow diagrams for the system, user-interface samples and performance reports. The particulars of the modifications made to the proposed model during implementation are also documented, and a concluding section discusses possible updates to the tool and future work.
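    As a generic illustration of the vulnerability the thesis targets and of the standard mitigation (bound parameters instead of string-built dynamic SQL), the sqlite3 sketch below may help. It is not the Muthuprasanna et al. model the thesis implements; the table, payload and query are invented for the example.

```python
# Why dynamic SQL built by string concatenation is injectable, and how bound
# parameters avoid the problem. Uses sqlite3 so it runs without a server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "anything' OR '1'='1"   # classic tautology payload

# Vulnerable: the payload becomes part of the SQL text, the WHERE clause is
# always true, and the query returns every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE password = '" + user_input + "'"
).fetchall()

# Safer: the payload is bound as data, never parsed as SQL, so nothing matches.
safe = conn.execute(
    "SELECT name FROM users WHERE password = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('alice',)] -- injection succeeded
print(safe)        # []           -- payload treated as a literal string
```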

    A Discriminative Survey on SQL Injection Methods to Detect Vulnerabilities in Web applications

    SQL injection attacks are extremely serious intrusions against web-based applications, since such attacks can expose confidential information and compromise its safety. In practice, unauthorised users break into the web application's database and consequently gain access to its information. Various researchers have recommended methods to avoid this kind of attack, but these are not adequate, since most of the implemented methods do not prevent every type of attack. In this paper we survey the various kinds of SQL injection attack and the prevention methods currently available. Our analysis shows that existing prevention methods require client-side inputs to be validated one by one on the server, which complicates the developer's job: separate validation code must be written for every web page that receives input on the server side. Keywords: SQL Injection, Attacks, Vulnerability, WWW, XSS
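    A small sketch of the per-field, server-side validation pattern the survey critiques follows; the field names and the whitelist regex are illustrative assumptions, not taken from the paper.

```python
# Each page handler must repeat a call like validate() for its own inputs,
# which is the maintenance burden the survey points out.
import re

SAFE_TOKEN = re.compile(r"^[A-Za-z0-9_@.\-]{1,64}$")   # assumed whitelist

def validate(fields):
    """Reject any field containing characters outside the whitelist."""
    bad = [name for name, value in fields.items() if not SAFE_TOKEN.match(value)]
    if bad:
        raise ValueError(f"rejected fields: {bad}")
    return fields

validate({"username": "alice", "email": "alice@example.org"})   # passes
try:
    validate({"username": "x' OR '1'='1"})                      # rejected
except ValueError as e:
    print(e)
```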

    A Symbolic Execution Algorithm for Constraint-Based Testing of Database Programs

    In so-called constraint-based testing, symbolic execution is a common technique used as a part of the process to generate test data for imperative programs. Databases are ubiquitous in software, and testing of programs manipulating databases is thus essential to enhance the reliability of software. This work proposes and evaluates experimentally a symbolic execution algorithm for constraint-based testing of database programs. First, we describe SimpleDB, a formal language which offers a minimal and well-defined syntax and semantics, to model common interaction scenarios between programs and databases. Secondly, we detail the proposed algorithm for symbolic execution of SimpleDB models. This algorithm considers a SimpleDB program as a sequence of operations over a set of relational variables, modeling both the database tables and the program variables. By integrating this relational model of the program with classical static symbolic execution, the algorithm can generate a set of path constraints for any finite path to test in the control-flow graph of the program. Solutions of these constraints are test inputs for the program, including an initial content for the database. When the program is executed with respect to these inputs, it is guaranteed to follow the path with respect to which the constraints were generated. Finally, the algorithm is evaluated experimentally using representative SimpleDB models. Comment: 12 pages - preliminary work
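    The overall idea — collect path constraints over both program variables and an initial table content, then solve them to obtain test inputs that force a chosen path — can be sketched in a few lines of Python. This toy uses brute-force search in place of a real constraint solver and reflects none of SimpleDB's formal semantics; all names and the example program are invented.

```python
# Toy sketch: constraints for one control-flow path of a tiny "database
# program", and a search for inputs (including initial DB content) that
# satisfy them.
from itertools import product

# Hypothetical program under test:
#   rows = SELECT bal FROM accounts WHERE owner = :who
#   if rows and rows[0] >= amount:   # path to cover: the "true" branch
#       withdraw(...)
def path_constraints(table, who, amount):
    """Constraints that must hold for the 'true' branch to be taken."""
    rows = [bal for (owner, bal) in table if owner == who]
    return [
        lambda: len(rows) > 0,       # the query returned at least one row
        lambda: rows[0] >= amount,   # the guard on the returned balance holds
    ]

def solve():
    # Brute-force "constraint solving" over a tiny search space; a real tool
    # would hand relational + arithmetic constraints to a constraint solver.
    owners = ["alice", "bob"]
    for owner, bal, who, amount in product(owners, range(3), owners, range(3)):
        table = [(owner, bal)]                    # candidate initial DB content
        if all(c() for c in path_constraints(table, who, amount)):
            return {"table": table, "who": who, "amount": amount}

print(solve())   # e.g. {'table': [('alice', 0)], 'who': 'alice', 'amount': 0}
```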

    Evolving database systems: a persistent view

    Submitted to POS7. This work was supported in St Andrews by EPSRC Grant GR/J67611 "Delivering the Benefits of Persistence". Orthogonal persistence ensures that information will exist for as long as it is useful, for which it must have the ability to evolve with the growing needs of the application systems that use it. This may involve evolution of the data, meta-data, programs and applications, as well as the users' perception of what the information models. The need for evolution has been well recognised in the traditional (data processing) database community and the cost of failing to evolve can be gauged by the resources being invested in interfacing with legacy systems. Zdonik has identified new classes of application, such as scientific, financial and hypermedia, that require new approaches to evolution. These applications are characterised by their need to store large amounts of data whose structure must evolve as it is discovered by the applications that use it. This requires that the data be mapped dynamically to an evolving schema. Here, we discuss the problems of evolution in these new classes of application within an orthogonally persistent environment and outline some approaches to these problems. Postprint

    Automatically Leveraging MapReduce Frameworks for Data-Intensive Applications

    MapReduce is a popular programming paradigm for developing large-scale, data-intensive computation. Many frameworks that implement this paradigm have recently been developed. To leverage these frameworks, however, developers must become familiar with their APIs and rewrite existing code. Casper is a new tool that automatically translates sequential Java programs into the MapReduce paradigm. Casper identifies potential code fragments to rewrite and translates them in two steps: (1) Casper uses program synthesis to search for a program summary (i.e., a functional specification) of each code fragment. The summary is expressed using a high-level intermediate language resembling the MapReduce paradigm and verified to be semantically equivalent to the original using a theorem prover. (2) Casper generates executable code from the summary, using either the Hadoop, Spark, or Flink API. We evaluated Casper by automatically converting real-world, sequential Java benchmarks to MapReduce. The resulting benchmarks perform up to 48.2x faster compared to the original. Comment: 12 pages, additional 4 pages of references and appendix
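    A hand-written Python analogy (not Casper output, and not using the Hadoop, Spark, or Flink APIs) of the kind of rewrite the tool automates: the same word count expressed as a sequential accumulation loop and as a map phase followed by a reduce phase.

```python
# Sequential loop vs. an equivalent map/reduce-style formulation.
from functools import reduce

words = "to be or not to be".split()

# Sequential original: accumulate counts in a loop.
def wordcount_sequential(ws):
    counts = {}
    for w in ws:
        counts[w] = counts.get(w, 0) + 1
    return counts

# MapReduce-style summary: map each word to a (key, 1) pair, then reduce by key.
def wordcount_mapreduce(ws):
    pairs = map(lambda w: (w, 1), ws)       # "map" phase
    def combine(acc, kv):                    # "reduce" phase
        k, v = kv
        acc[k] = acc.get(k, 0) + v
        return acc
    return reduce(combine, pairs, {})

assert wordcount_sequential(words) == wordcount_mapreduce(words)
```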