
    A Brief History of Web Crawlers

    Web crawlers visit internet applications, collect data, and learn about new web pages from visited pages. Web crawlers have a long and interesting history. Early web crawlers collected statistics about the web. In addition to collecting statistics about the web and indexing applications for search engines, modern crawlers can be used to perform accessibility and vulnerability checks on an application. The rapid expansion of the web and the growing complexity of web applications have made crawling a very challenging process. Throughout the history of web crawling, many researchers and industrial groups have addressed the different issues and challenges that web crawlers face, and different solutions have been proposed to reduce the time and cost of crawling. Performing an exhaustive crawl remains a challenging problem, and capturing the model of a modern web application and extracting data from it automatically is another open question. What follows is a brief history of the different techniques and algorithms used from the early days of crawling up to the present day. We introduce criteria to evaluate the relative performance of web crawlers; based on these criteria we plot the evolution of web crawlers and compare their performance.

    Proposing a secure component-based-application logic and system’s integration testing approach

    Software engineering has moved from traditional methods of building enterprise applications to component-based development for distributed system applications. This new era has grown over the last few years, with component-based methods used for the design and rapid development of systems, but the fact is that, despite the deployment of all the secure software features the technology offers, practical e-commerce distributed systems remain a highly rated target for intruders. Although most research has been conducted on web application services, which make up a large share of present software, Component Based Software in the middle tier, which rapidly develops application logic, also opens security-breaching opportunities. This research paper focuses on a burning issue for researchers and scientists: the weakest link in component-based distributed systems, logical attacks, which cannot be detected with any intrusion detection system within middle-tier e-commerce distributed applications. We propose an approach for securely designing application logic for distributed systems while dealing with the logical vulnerability issue.

    Verifying Web Applications: From Business Level Specifications to Automated Model-Based Testing

    One of the reasons preventing a wider uptake of model-based testing in industry is the difficulty developers encounter when trying to think in terms of properties rather than linear specifications. A disparity has traditionally been perceived between the language spoken by the customers who specify the system and the language required to construct models of that system. The dynamic nature of the specifications for commercial systems further aggravates this problem, in that models would need to be rechecked after every specification change. In this paper, we propose an approach for converting specifications written in the commonly-used quasi-natural language Gherkin into models for use with a model-based testing tool. We have instantiated this approach using QuickCheck and demonstrate its applicability via a case study on the eHealth system, the national health portal for Maltese residents. Comment: In Proceedings MBT 2014, arXiv:1403.704
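
    As a rough illustration of the first step only (the paper's QuickCheck-based pipeline is not reproduced here, and all names below are invented), a Gherkin scenario can be parsed into a list of labelled steps that a model-based testing tool could then turn into a model:

```typescript
// Minimal sketch: turn a Gherkin scenario into a list of labelled steps.
// Illustrative only; this is not the tool described in the paper.

interface Step {
  kind: "Given" | "When" | "Then";
  text: string;
}

function parseScenario(gherkin: string): Step[] {
  const steps: Step[] = [];
  let lastKind: Step["kind"] | null = null;
  for (const raw of gherkin.split("\n")) {
    const match = raw.trim().match(/^(Given|When|Then|And)\s+(.*)$/);
    if (!match) continue;
    // "And" inherits the kind of the preceding step, as in standard Gherkin.
    const kind = match[1] === "And" ? lastKind : (match[1] as Step["kind"]);
    if (!kind) continue;
    steps.push({ kind, text: match[2] });
    lastKind = kind;
  }
  return steps;
}

// Hypothetical scenario in the spirit of the eHealth case study.
const scenario = `
  Scenario: Patient views a prescription
    Given a logged-in patient
    When the patient opens the prescriptions page
    Then the list of active prescriptions is shown
`;
console.log(parseScenario(scenario));
```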

    Statically Checking Web API Requests in JavaScript

    Many JavaScript applications perform HTTP requests to web APIs, relying on the request URL, HTTP method, and request data to be constructed correctly by string operations. Traditional compile-time error checking, such as detecting a call to a non-existent method in Java, is not available for checking whether such requests comply with the requirements of a web API. In this paper, we propose an approach to statically check web API requests in JavaScript. Our approach first extracts a request's URL string, HTTP method, and the corresponding request data using an inter-procedural string analysis, and then checks whether the request conforms to given web API specifications. We evaluated our approach by checking whether web API requests in JavaScript files mined from GitHub are consistent or inconsistent with publicly available API specifications. From the 6575 requests in scope, our approach determined whether a request's URL and HTTP method were consistent or inconsistent with the web API specifications with a precision of 96.0%. Our approach also correctly determined whether extracted request data was consistent or inconsistent with the data requirements, with a precision of 87.9% for payload data and 99.9% for query data. In a systematic analysis of the inconsistent cases, we found that many of them were due to errors in the client code. The proposed checker can be integrated with code editors or with continuous integration tools to warn programmers about code containing potentially erroneous requests. Comment: International Conference on Software Engineering, 201
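
    As a hedged sketch of the checking step alone (not the paper's implementation; the specification format and URLs below are invented for brevity), a request whose URL and HTTP method have already been extracted can be matched against a simplified API specification like this:

```typescript
// Sketch of the conformance check only: given a request whose URL and HTTP
// method were already extracted by string analysis, verify them against a
// simplified API specification. The spec format here is invented for brevity.

interface ApiEndpoint {
  method: string;                 // e.g. "GET"
  pathPattern: RegExp;            // e.g. /^\/users\/[^/]+$/
  requiredQueryParams: string[];  // query parameters that must be present
}

function checkRequest(urlString: string, method: string, spec: ApiEndpoint[]): string[] {
  const errors: string[] = [];
  const url = new URL(urlString, "https://api.example.com");
  const endpoint = spec.find(
    (e) => e.pathPattern.test(url.pathname) && e.method === method.toUpperCase()
  );
  if (!endpoint) {
    errors.push(`no endpoint accepts ${method} ${url.pathname}`);
    return errors;
  }
  for (const param of endpoint.requiredQueryParams) {
    if (!url.searchParams.has(param)) {
      errors.push(`missing required query parameter "${param}"`);
    }
  }
  return errors;
}

const spec: ApiEndpoint[] = [
  { method: "GET", pathPattern: /^\/search$/, requiredQueryParams: ["q"] },
];
console.log(checkRequest("https://api.example.com/search?page=2", "get", spec));
// -> [ 'missing required query parameter "q"' ]
```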

    Testing Mobile Web Applications for W3C Best Practice Compliance

    Adherence to best practices and standards when developing mobile web applications is important to achieving a quality outcome. As smartphones and tablet PCs continue to proliferate in the consumer electronics market, businesses and individuals are increasingly turning from the native application paradigm to HTML5-based web applications as a means of software development and distribution. With an ever-increasing reliance by users on the correct functioning of such applications, the requirement for stringent and comprehensive quality assurance measures is also brought sharply into focus. This research investigates the increasing trend towards mobile web application development in the mobile software domain, and assesses the requirement for an automated approach to best-practice validation testing for mobile web applications. Contemporary approaches to automated web application testing are examined, with particular emphasis on issues relating to mobile web application tests. The individual guidelines proposed by the W3C Mobile Web Application Best Practices are analysed and, where applicable, automated conformance tests are implemented in a customised testing tool. A range of mobile web applications are tested using this tool in order to examine the extent to which implementation of the tested-for guidelines is detected. Automated tests were successfully implemented for nearly 60% of the best practices.
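
    To give a flavour of what such an automated conformance test might look like (a sketch only, not the customised tool described above; the guideline chosen and the check are simplified), a single best-practice check over a page's HTML could be written as:

```typescript
// Sketch of one automated conformance test: checking whether a page declares
// a viewport meta element, a common mobile best practice. The HTML is passed
// in as a string; a real tool would fetch and parse the live page.

interface CheckResult {
  guideline: string;
  passed: boolean;
  detail: string;
}

function checkViewportMeta(html: string): CheckResult {
  const hasViewport = /<meta[^>]+name=["']viewport["'][^>]*>/i.test(html);
  return {
    guideline: "Use a viewport meta element",
    passed: hasViewport,
    detail: hasViewport
      ? "viewport meta element found"
      : "no viewport meta element declared",
  };
}

const page = `<html><head><title>Demo</title></head><body></body></html>`;
console.log(checkViewportMeta(page));
// -> { guideline: 'Use a viewport meta element', passed: false, ... }
```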

    Automock: Automated Mock Backend Generation for JavaScript-based Applications

    Modern web development is an intensely collaborative process. Frontend developers, backend developers, and quality assurance engineers are integral cogs of a development machine. Frontend developers constantly juggle developing new features, fixing bugs, and writing good unit test cases. Achieving this is sometimes difficult, as frontend developers are not able to utilize their time completely: they have to wait for the backend to be ready and wait for pages to load during iterations. This paper proposes an approach that enables frontend developers to quickly generate a mock backend that behaves exactly like their actual backend. This generated mock backend minimizes the dependency between frontend developers and backend developers, since both teams can now utilize the entire sprint duration efficiently. The approach also aids frontend developers in performing quicker iterations and modifications to their code.
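
    As a minimal sketch of the idea of a generated mock backend (not the Automock tool itself; the recorded routes and data below are invented), a replay server for previously captured responses might look like this:

```typescript
// Minimal sketch of a mock backend that replays recorded responses, so that
// frontend code can be exercised without the real backend. This is not the
// Automock tool; the recorded data below is invented for illustration.

import { createServer } from "node:http";

// Responses keyed by "METHOD path", as they might be captured from a real backend.
const recorded: Record<string, { status: number; body: unknown }> = {
  "GET /api/users": { status: 200, body: [{ id: 1, name: "Ada" }] },
  "POST /api/users": { status: 201, body: { id: 2, name: "Grace" } },
};

const server = createServer((req, res) => {
  const key = `${req.method} ${req.url}`;
  const entry = recorded[key];
  if (!entry) {
    res.writeHead(404, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ error: `no recorded response for ${key}` }));
    return;
  }
  res.writeHead(entry.status, { "Content-Type": "application/json" });
  res.end(JSON.stringify(entry.body));
});

server.listen(3000, () => console.log("mock backend listening on :3000"));
```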

    Integrating DGSs and GATPs in an Adaptative and Collaborative Blended-Learning Web-Environment

    The area of geometry, with its very strong and appealing visual content and its equally strong and appealing connection between that visual content and its formal specification, is an area where computational tools can enhance learning environments in a significant way. Dynamic geometry software systems (DGSs) can be used to explore the visual content of geometry. These already mature tools allow an easy construction of geometric figures built from free objects and elementary constructions. Geometric automated theorem provers (GATPs) allow formal deductive reasoning about geometric constructions, extending the reasoning via concrete instances in a given model to formal deductive reasoning in a geometric theory. An adaptative and collaborative blended-learning environment where the DGS and GATP features could be fully explored would be, in our opinion, a very rich and challenging learning environment for teachers and students. In this text we describe the Web Geometry Laboratory, a Web environment incorporating a DGS and a repository of geometric problems that can be used in a synchronous and asynchronous fashion and with some adaptative and collaborative features. As future work we want to enhance the adaptative and collaborative aspects of the environment and also to incorporate a GATP, constructing a dynamic and individualised learning environment for geometry. Comment: In Proceedings THedu'11, arXiv:1202.453

    Abmash: Mashing Up Legacy Web Applications by Automated Imitation of Human Actions

    Many business web-based applications do not offer application programming interfaces (APIs) to enable other applications to access their data and functions in a programmatic manner. This makes their composition difficult (for instance, synchronizing data between two applications). To address this challenge, this paper presents Abmash, an approach to facilitate the integration of such legacy web applications by automatically imitating human interactions with them. By automatically interacting with the graphical user interface (GUI) of web applications, the system supports all forms of integration, including bi-directional interactions, and is able to interact with AJAX-based applications. Furthermore, the integration programs are easy to write since they deal with end-user, visual user-interface elements. The integration code is simple enough to be called a "mashup". Comment: Software: Practice and Experience (2013)
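
    Abmash itself is a Java library and its API is not reproduced here; the following Puppeteer-based sketch (with a placeholder URL and selectors) only illustrates the general idea of integrating a legacy web application by driving its GUI rather than calling an API:

```typescript
// Illustrative only: this is not Abmash's API. The sketch imitates a human
// session against a legacy web application and extracts data from the rendered
// page instead of from a programmatic interface. URL and selectors are placeholders.

import puppeteer from "puppeteer";

async function copyOrderCount(): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Imitate a human session: open the legacy app and navigate its GUI.
  await page.goto("https://legacy-app.example.com/login");
  await page.type("#username", "integration-bot");
  await page.type("#password", "secret");
  await page.click("button[type=submit]");
  await page.waitForSelector("#order-count");

  // Extract data from the rendered page rather than from an API response.
  const orderCount = await page.$eval("#order-count", (el) => el.textContent);
  console.log(`orders visible in the legacy app: ${orderCount}`);

  await browser.close();
}

copyOrderCount().catch((err) => {
  console.error(err);
  process.exit(1);
});
```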