
    Agile Testing for Blockchain Development QA

    Agile testing has evolved into a commonly employed practice across most development disciplines. It has existed for as long as the agile manifesto and has developed all the hallmarks of a mature set of practices, i.e., tools, metrics, techniques, etc. But its overlap with blockchain has yet to reach the maturity of either field. QA for blockchain development has not been standardized in the same manner as QA for web development and other areas of software development, even newer ones such as cloud-native development. Agile testing leans heavily on automation, Artificial Intelligence (AI), and Machine Learning (ML), and can benefit from collective or separate advances in these three technologies. But these technologies, regardless of their influence on blockchain development and its QA, cannot by themselves become the bridge connecting the two. Blockchain development QA suffers from a significant lack of standardization and of a unified set of good practices, which hinders its ability to adapt agile testing practices into the existing paradigm. However, as blockchain development is adopted by agile teams and its QA becomes more standardized, we may see greater overlap between agile testing and blockchain development QA.

    Web service testing techniques: A systematic literature review

    Today's continual demand for loosely coupled systems has made web services a basic necessity for delivering solutions that are adaptable and robust enough to work at runtime while maintaining high system quality. One of the basic techniques for evaluating the quality of such systems is testing. Owing to the rapid and continually increasing popularity of web services, testing has become essential to maintaining their quality, and the performance testing of web-service-based applications is attracting extensive attention. In order to evaluate the performance of web services, it is essential to assess QoS (Quality of Service) attributes such as interoperability, reusability, auditability, maintainability, accuracy, and performance. The purpose of this study is to present a systematic literature review of web service testing techniques that evaluate these QoS attributes, with the intention of improving testing quality. The review examines which QoS parameters are necessary to provide better quality assurance, and also aims to ensure that testing quality can be sustained now and in the future. Accordingly, the main focus and motivation of the study is to provide an overview of recent research efforts on web service testing techniques from the research community. For each testing technique in web services, the review identifies apparent standards, benefits, and restrictions. It thereby gives industry a basis for deciding which testing technique is the most efficient and effective for a given testing assignment with the available resources. As for significance, it can be said that web service testing techniques are still broadly open to improvement.

    Automated visual classification of DOM-based presentation failure reports for responsive web pages

    Since it is common for the users of a web page to access it through a wide variety of devices—including desktops, laptops, tablets and phones—web developers rely on responsive web design (RWD) principles and frameworks to create sites that are useful on all devices. A correctly implemented responsive web page adjusts its layout according to the viewport width of the device in use, thereby ensuring that its design suitably features the content. Since the use of complex RWD frameworks often leads to web pages with hard‐to‐detect responsive layout failures (RLFs), developers employ testing tools that generate reports of potential RLFs. Since testing tools for responsive web pages, like ReDeCheck, analyse a web page representation called the Document Object Model (DOM), they may inadvertently flag concerns that are not human visible, thereby requiring developers to manually confirm and classify each potential RLF as a true positive (TP), false positive (FP), or non‐observable issue (NOI)—a process that is time consuming and error prone. The conference version of this paper presented Viser, a tool that automatically classified three types of RLFs reported by ReDeCheck. Since Viser was not designed to automatically confirm and classify two types of RLFs that ReDeCheck's DOM‐based analysis could surface, this paper introduces Verve, a tool that automatically classifies all RLF types reported by ReDeCheck. Along with manipulating the opacity of HTML elements in a web page, as does Viser, the Verve tool also uses histogram‐based image comparison to classify RLFs in web pages. Incorporating both the 25 web pages used in prior experiments and 20 new pages not previously considered, this paper's empirical study reveals that Verve's classification of all five types of RLFs frequently agrees with classifications produced manually by humans. The experiments also reveal that Verve took on average about 4 s to classify any of the RLFs among the 469 reported by ReDeCheck. 
Since this paper demonstrates that classifying an RLF as a TP, FP, or NOI with Verve, a publicly available tool, is less subjective and error prone than the same manual process done by a human web developer, we argue that it is well‐suited for supporting the testing of complex responsive web pages.
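The histogram-based image comparison that Verve relies on can be illustrated with a small sketch. The function names, the use of grayscale intensity histograms, and the histogram-intersection distance below are illustrative assumptions, not Verve's actual implementation:

```python
import numpy as np

def histogram_distance(img_a, img_b, bins=64):
    """Compare two grayscale images (2-D uint8 arrays) by the
    normalized intersection of their intensity histograms.
    Returns a value in [0, 1]; 0 means identical histograms."""
    h_a, _ = np.histogram(img_a, bins=bins, range=(0, 256))
    h_b, _ = np.histogram(img_b, bins=bins, range=(0, 256))
    h_a = h_a / h_a.sum()
    h_b = h_b / h_b.sum()
    # Histogram intersection equals 1.0 for identical distributions.
    return 1.0 - np.minimum(h_a, h_b).sum()

def classify_rlf(before, after, threshold=0.05):
    """Toy TP/NOI decision: if toggling the suspect element (e.g. via
    its opacity) changes the rendered page's histogram noticeably,
    the reported failure is treated as human-visible (TP)."""
    return "TP" if histogram_distance(before, after) > threshold else "NOI"
```

The appeal of a histogram comparison in this setting is that it is insensitive to small pixel shifts, so it flags only changes large enough to be plausibly visible to a human.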

    Recent advances in low-cost particulate matter sensor: calibration and application

    Particulate matter (PM) has been monitored routinely due to its negative effects on human health and atmospheric visibility. Standard gravimetric measurements and current commercial instruments for field measurements are still expensive and laborious. The high cost of conventional instruments typically limits the number of monitoring sites, which in turn undermines the accuracy of real-time mapping of sources and hotspots of air pollutants due to insufficient spatial resolution. The new trends in PM concentration measurement are personalized portable devices for individual customers and the networking of large numbers of sensors to meet the demands of Big Data. Therefore, low-cost PM sensors have been studied extensively due to their price advantage and compact size. These sensors are considered a good supplement to current monitoring sites for high spatial-temporal PM mapping. However, a large concern is their accuracy. Multiple types of low-cost PM sensors and monitors were calibrated against reference instruments. All of these units demonstrated high linearity against the reference instruments, with high R2 values for different types of aerosols over a wide range of concentration levels. The question of whether low-cost PM monitors can be considered a substitute for conventional instruments was discussed, together with how to qualitatively describe the improvement in data quality due to calibration. A limitation of these sensors and monitors is that their outputs depend strongly on particle composition and size, resulting in differences of up to a factor of ten in sensor output. Optical characterization of low-cost PM sensors (ensemble measurement) was conducted by combining experimental results with Mie scattering theory, and the reasons for their dependence on PM composition and size distribution were studied.
    To improve the accuracy of mass concentration estimation, an expression for K as a function of the geometric mean diameter, geometric standard deviation, and refractive index is proposed. To eliminate the influence of the refractive index, we propose a new design of a multi-wavelength sensor with a robust data-inversion routine that estimates the PM size distribution and refractive index simultaneously. The utility of the networked system with improved sensitivity was demonstrated by deploying it in a woodworking shop. Data collected by the networked system were utilized to construct spatiotemporal PM concentration distributions using an ordinary Kriging method and an Artificial Neural Network model to elucidate particle generation and ventilation processes. Furthermore, for the outdoor environment, data reported by low-cost sensors were compared against satellite data. The remote sensing data could provide a daily calibration of these low-cost sensors; in turn, low-cost PM sensors could provide better accuracy in characterizing the microenvironment.
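The calibration step the abstract describes, fitting low-cost sensor readings against a co-located reference instrument and reporting R2, can be sketched minimally. The paired readings below are synthetic, and the simple linear model is an assumption, since the abstract does not specify the regression form:

```python
import numpy as np

# Synthetic example of paired readings (units: ug/m^3) from a
# reference instrument and a low-cost PM sensor; real calibration
# data would come from co-located field measurements.
reference = np.array([10.0, 25.0, 40.0, 80.0, 120.0])
sensor    = np.array([14.0, 31.0, 52.0, 98.0, 151.0])

# Least-squares linear calibration: reference ~ slope * sensor + intercept
slope, intercept = np.polyfit(sensor, reference, deg=1)
calibrated = slope * sensor + intercept

# Coefficient of determination (R^2) of the calibrated output
ss_res = np.sum((reference - calibrated) ** 2)
ss_tot = np.sum((reference - reference.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

A high R2 here indicates only linearity against the reference for one aerosol type; as the abstract notes, the fitted slope can shift by up to a factor of ten when particle composition or size distribution changes, which is why per-aerosol calibration matters.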

    20. ASIM Fachtagung Simulation in Produktion und Logistik 2023


    計算力学研究センター年次報告書 (Annual Report of the Research Center of Computational Mechanics)


    Shortest Route at Dynamic Location with Node Combination-Dijkstra Algorithm

    Abstract— Online transportation has become a basic requirement of the general public, supporting everyday activities such as traveling to work, to school, or to tourist destinations. Public transportation services compete to provide the best service so that consumers feel comfortable using what is offered; one aspect that receives attention is finding the shortest route for picking up a customer or delivering them to their destination. The Node Combination method can minimize memory usage and is more optimal than A* and Ant Colony for shortest-route search in the manner of Dijkstra's algorithm, but it cannot store the history of nodes that have been passed. Consequently, the plain node combination algorithm is very good at finding the shortest distance, but not the shortest route. This paper modifies the node combination algorithm to solve the problem of finding the shortest route to a dynamic location obtained from the transport fleet, displaying the nodes along the shortest-distance path, and implements it in a geographic information system in the form of a map to facilitate use of the system. Keywords— Shortest Path, Dijkstra Algorithm, Node Combination, Dynamic Location
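As a point of reference for the modification the abstract describes, classic Dijkstra with predecessor tracking (the route history that a distance-only node-combination method cannot recover) can be sketched as follows; the graph structure and function names are illustrative assumptions:

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest path by Dijkstra's algorithm with a binary heap.
    graph: {node: {neighbor: edge_weight}}. Returns (distance, path).
    The 'prev' map stores each node's predecessor, which is exactly
    the route history needed to report the path, not just its length."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:
            break
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # Reconstruct the route by walking the predecessor map backwards.
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return dist[target], path[::-1]
```

For example, with `graph = {"A": {"B": 2, "C": 5}, "B": {"C": 1, "D": 4}, "C": {"D": 1}, "D": {}}`, calling `dijkstra(graph, "A", "D")` returns `(4.0, ["A", "B", "C", "D"])`. For a dynamic pickup location, the graph weights or the target node would be updated and the search re-run.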