
    Constraint-based generation of database states for testing database applications

    Testing is essential for quality assurance of database applications. Testing a database application typically requires test inputs consisting of both program input values and corresponding database states, and producing these tests manually is tedious and labor-intensive; automatic test generation is therefore needed to reduce human effort. This research focuses on automatically generating both program input values and corresponding database states for testing database applications. We develop our approaches on top of the Dynamic Symbolic Execution (DSE) technique to satisfy various testing requirements. We formalize the problem of generating program inputs for an existing database state to achieve high code coverage, and propose an approach that generates such inputs through auxiliary query construction based on the intermediate information accumulated during DSE's exploration. We develop a technique that generates database states to achieve advanced coverage criteria such as Boundary Value Coverage and Logical Coverage. We develop an approach that constructs synthesized database interactions to guide DSE's exploration and collect constraints on both program inputs and associated database states; in this way, we bridge the various constraints within a database application: query-construction constraints, query constraints, database schema constraints, and query-result-manipulation constraints. We also develop an approach that generates tests for mutation testing of database applications. We use Pex, a state-of-the-art white-box testing tool for .NET from Microsoft Research, as the DSE engine. Empirical evaluation results show that our approaches generate effective program input values and sufficient database states to satisfy these testing requirements.
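
    To make the auxiliary-query idea concrete, here is a minimal, hypothetical sketch (Python with sqlite3, not the Pex-based tool described above): the path condition that symbolic execution collects over the query result is rewritten into an auxiliary query against the existing database state, and any returned row yields a concrete program input. The accounts table, balance column, and branch condition are illustrative assumptions.

```python
# Hypothetical sketch of program-input generation via an auxiliary query.
# Assumed branch in the application under test:
#   rows = exec("SELECT id, balance FROM accounts WHERE balance > ?", amount)
#   if rows and rows[0].balance > 2 * amount:   # branch we want to cover
import sqlite3

def find_input_for_branch(conn):
    # Auxiliary query: ask the *existing* data for the largest stored balance.
    row = conn.execute(
        "SELECT balance FROM accounts ORDER BY balance DESC LIMIT 1"
    ).fetchone()
    if row is None:
        return None                       # existing state cannot cover the branch
    balance = row[0]
    return {"amount": balance // 2 - 1}   # then balance > 2 * amount holds

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER, balance INTEGER)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 50), (2, 900)])
    print(find_input_for_branch(conn))    # -> {'amount': 449}, since 900 > 2 * 449
```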

    Lom: discovering logic flaws within MongoDB-based web applications

    Logic flaws within web applications allow malicious operations to be triggered against the back-end database. Existing approaches to identifying logic flaws in database accesses are strongly tied to the construction of structured query language (SQL) statements and cannot be applied to the new generation of web applications that use NoSQL (not only SQL) databases as the storage tier. In this paper, we present Lom, a black-box approach for discovering many categories of logic flaws within MongoDB-based web applications. Our approach introduces a MongoDB operation model to support the new features of MongoDB and models the application logic as a Mealy finite state machine. During the testing phase, test inputs that emulate state-violation attacks are constructed to identify logic flaws at each application state. We apply Lom to several MongoDB-based web applications and demonstrate its effectiveness.
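
    As a rough illustration of the state-machine view used above (not Lom itself), the toy Python sketch below records which MongoDB operations each application state is allowed to issue and flags an operation observed outside its state's allowed set as a potential logic flaw; all state and operation names are hypothetical.

```python
# Toy sketch (not Lom itself): application logic as states with allowed MongoDB
# operations; replaying a test input that triggers an operation outside the allowed
# set of its state reveals a potential logic flaw.

# Allowed MongoDB operations per (hypothetical) application state.
ALLOWED = {
    "anonymous":     {"find:products"},
    "authenticated": {"find:products", "insert:orders"},
    "admin":         {"find:products", "insert:orders", "update:products"},
}

def check_trace(trace):
    """trace: list of (state, operation) pairs observed while replaying a test input."""
    return [(state, op) for state, op in trace if op not in ALLOWED.get(state, set())]

if __name__ == "__main__":
    # A state-violation attack: an anonymous session manages to insert into 'orders'.
    trace = [("anonymous", "find:products"), ("anonymous", "insert:orders")]
    print(check_trace(trace))   # -> [('anonymous', 'insert:orders')]
```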

    Ontology-based data semantic management and application in IoT- and cloud-enabled smart homes

    The application of the emerging technologies of the Internet of Things (IoT) and cloud computing has increased the popularity of smart homes, and home entities are consequently generating large volumes of heterogeneous data. The representation, management, and application of these continuously growing amounts of heterogeneous data in the smart-home data space pose critical challenges to the further development of the smart-home industry. To this end, this paper proposes a scheme for ontology-based data semantic management and application. Based on a smart-home system model abstracted from the perspective of implementing users’ household operations, a general domain ontology model is designed by defining the relevant concepts, and a logical data semantic fusion model is designed accordingly. Subsequently, to achieve efficient ontology data query and update in the implementation of the data semantic fusion model, a relational-database-based ontology data decomposition storage method is developed by thoroughly investigating existing storage modes, and its performance is demonstrated using a group of elaborated ontology data query and update operations. Building on these results, ontology-based semantic reasoning with a specially designed semantic matching rule is also studied, in an attempt to provide accurate and personalized home services; its efficiency is demonstrated through experiments conducted on the developed testing system for user-behavior reasoning.
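
    Relational storage of ontology data can follow several decomposition schemes. The sketch below (Python with sqlite3) illustrates one common option, vertical partitioning with a two-column table per predicate, purely as an assumption-laden example rather than the exact schema developed in the paper.

```python
# Hypothetical sketch of vertically partitioned ontology storage: one two-column
# relational table per predicate, used for both updates and queries.
import sqlite3

def table_for(predicate):
    # Map a predicate QName to a table name (no escaping; illustration only).
    return "prop_" + predicate.replace(":", "_")

def store(conn, triples):
    for s, p, o in triples:
        t = table_for(p)
        conn.execute(f"CREATE TABLE IF NOT EXISTS {t} (subject TEXT, object TEXT)")
        conn.execute(f"INSERT INTO {t} VALUES (?, ?)", (s, o))

def objects_of(conn, subject, predicate):
    t = table_for(predicate)
    return [r[0] for r in
            conn.execute(f"SELECT object FROM {t} WHERE subject = ?", (subject,))]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    store(conn, [("home:lamp1", "sh:locatedIn", "home:livingRoom"),
                 ("home:lamp1", "sh:hasState", "on")])
    print(objects_of(conn, "home:lamp1", "sh:locatedIn"))   # -> ['home:livingRoom']
```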

    Relational Constraint Driven Test Case Synthesis for Web Applications

    This paper proposes a relational-constraint-driven technique that synthesizes test cases automatically for web applications. Using static analysis, servlets are modeled as relational transducers that manipulate back-end databases. We present a synthesis algorithm that generates a sequence of HTTP requests for simulating a user session. The algorithm relies on backward symbolic image computation to reach a given database state under a code-coverage objective. With a slight adaptation, the technique can be used for discovering workflow attacks on web applications.
    Comment: In Proceedings TAV-WEB 2010, arXiv:1009.330
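
    The following toy Python sketch conveys the flavor of synthesizing an HTTP request sequence that drives the database into a target state. It is not the paper's relational-transducer formalism, and it uses a simple forward breadth-first search over abstract states as a stand-in for backward symbolic image computation; all actions, preconditions, and request names are hypothetical.

```python
# Toy sketch: synthesize a request sequence that reaches a target abstract database
# state. Each modeled action has a precondition on the abstract state and an effect.
# A forward breadth-first search stands in for the paper's backward image computation.
from collections import deque

ACTIONS = [  # (HTTP request, precondition, effect) -- all hypothetical
    ("POST /register", lambda s: True,           lambda s: s | {"user"}),
    ("POST /login",    lambda s: "user" in s,    lambda s: s | {"session"}),
    ("POST /checkout", lambda s: "session" in s, lambda s: s | {"order"}),
]

def synthesize(target, initial=frozenset(), max_len=5):
    queue, seen = deque([(initial, [])]), {initial}
    while queue:
        state, seq = queue.popleft()
        if target(state):
            return seq                      # request sequence reaching the target state
        if len(seq) >= max_len:
            continue
        for name, pre, eff in ACTIONS:
            if pre(state):
                nxt = frozenset(eff(set(state)))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, seq + [name]))
    return None

if __name__ == "__main__":
    # Coverage objective: a database state that contains an 'order' row.
    print(synthesize(lambda s: "order" in s))
    # -> ['POST /register', 'POST /login', 'POST /checkout']
```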

    A Symbolic Execution Algorithm for Constraint-Based Testing of Database Programs

    In so-called constraint-based testing, symbolic execution is a common technique used as a part of the process to generate test data for imperative programs. Databases are ubiquitous in software, and testing of programs manipulating databases is thus essential to enhance the reliability of software. This work proposes and evaluates experimentally a symbolic execution algorithm for constraint-based testing of database programs. First, we describe SimpleDB, a formal language which offers a minimal and well-defined syntax and semantics, to model common interaction scenarios between programs and databases. Secondly, we detail the proposed algorithm for symbolic execution of SimpleDB models. This algorithm considers a SimpleDB program as a sequence of operations over a set of relational variables, modeling both the database tables and the program variables. By integrating this relational model of the program with classical static symbolic execution, the algorithm can generate a set of path constraints for any finite path to test in the control-flow graph of the program. Solutions of these constraints are test inputs for the program, including an initial content for the database. When the program is executed with respect to these inputs, it is guaranteed to follow the path with respect to which the constraints were generated. Finally, the algorithm is evaluated experimentally using representative SimpleDB models.
    Comment: 12 pages - preliminary work
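
    A minimal sketch of the underlying idea, assuming a bounded (one-row) table and using the z3-solver Python package rather than SimpleDB: table content and program variables are both treated as symbolic values, the path constraints of one control-flow path are conjoined, and the solver returns a program input together with an initial database content. The persons table, age column, and path are invented for illustration.

```python
# Hypothetical sketch (z3-solver): both the initial table content and the program
# input are symbolic; the conjunction of schema, query, and branch constraints for
# one path is solved to obtain a test input plus an initial database content.
# Modeled path: x := input(); r := SELECT age FROM persons WHERE age > x;
#               if r is non-empty and x >= 18: ...
from z3 import Int, Solver, sat

def solve_path_constraints():
    x = Int("x")                   # symbolic program variable (the input)
    age0 = Int("persons_age_0")    # symbolic 'age' of the single assumed initial row

    s = Solver()
    s.add(age0 >= 0, age0 <= 120)  # schema/domain constraint on the age column
    s.add(age0 > x)                # the WHERE clause must select the row (r non-empty)
    s.add(x >= 18)                 # the branch condition on this path

    if s.check() != sat:
        return None                # this path is infeasible
    m = s.model()
    return {"input x": m[x].as_long(),
            "initial persons table": [{"age": m[age0].as_long()}]}

if __name__ == "__main__":
    print(solve_path_constraints())
```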

    Semantic processing of EHR data for clinical research

    There is a growing need to semantically process and integrate clinical data from different sources for clinical research. This paper presents an approach to integrate EHRs from heterogeneous sources and to generate integrated data in different formats or semantics to support various clinical research applications. The proposed approach builds semantic data virtualization layers on top of the data sources, which generate data in the requested semantics or formats on demand; this avoids upfront dumping and synchronization of the data across various representations. Data from different EHR systems are first mapped to RDF data with source semantics and then converted to representations with harmonized domain semantics, where domain ontologies and terminologies are used to improve reusability. It is also possible to further convert the data to application semantics and store the converted results in clinical research databases, e.g., i2b2 or OMOP, to support different clinical research settings. Semantic conversions between different representations are explicitly expressed as N3 rules and executed by an N3 reasoner (EYE), which can also generate proofs of the conversion processes. The solution presented in this paper has been applied to real-world applications that process large-scale EHR data.
    Comment: Accepted for publication in the Journal of Biomedical Informatics, 2015, preprint version
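
    A minimal sketch of one such conversion step, using rdflib in Python: a triple expressed with a source-specific predicate is rewritten to a harmonized domain-level predicate. The paper expresses these conversions as N3 rules executed by the EYE reasoner; here a single mapping rule is applied directly in Python as a stand-in, and all namespaces and predicates are hypothetical.

```python
# Hypothetical sketch with rdflib: map source-specific EHR predicates to harmonized
# domain predicates. The paper uses N3 rules run by the EYE reasoner; the single
# rule below is applied directly in Python as a stand-in.
from rdflib import Graph, Namespace, Literal

SRC = Namespace("http://example.org/sourceEHR#")   # hypothetical source semantics
DOM = Namespace("http://example.org/domain#")      # hypothetical domain semantics

# Rule, informally:  { ?p SRC:bloodPressureSys ?v } => { ?p DOM:hasSystolicBP ?v }
MAPPING = {SRC.bloodPressureSys: DOM.hasSystolicBP}

def harmonize(source_graph):
    target = Graph()
    for s, p, o in source_graph:
        target.add((s, MAPPING.get(p, p), o))      # rewrite mapped predicates, keep others
    return target

if __name__ == "__main__":
    g = Graph()
    g.add((SRC.patient42, SRC.bloodPressureSys, Literal(128)))
    print(harmonize(g).serialize(format="turtle"))
```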