8 research outputs found

    Web service testing techniques: A systematic literature review

    Continual demand for loosely coupled systems has made web services a basic building block for delivering solutions that are adaptable and able to work at runtime while maintaining high system quality. One of the basic techniques for evaluating the quality of such systems is testing. Because the popularity of web services is progressing and continuously increasing, testing has become a basic necessity for maintaining their quality, and the performance testing of web-service-based applications is attracting extensive attention. In order to evaluate the performance of web services, it is essential to assess QoS (Quality of Service) attributes such as interoperability, reusability, auditability, maintainability, accuracy, and performance. The purpose of this study is to present a systematic literature review of web service testing techniques that evaluate these QoS attributes, with the aim of improving the testing techniques themselves. With better testing quality in mind, the review examines which QoS parameters are necessary to provide better quality assurance, and it also aims to ensure that the quality of testing can be sustained now and in the future. Consequently, the main focus and motivation of the study is to provide an overview of recent research efforts on web service testing techniques from the research community. For each testing technique, the review identifies its applicable standards, benefits, and restrictions. This systematic literature review thus offers industry a basis for deciding which testing technique is the most efficient and effective for a given testing assignment and the available resources. As for significance, web service testing techniques remain broadly open to improvement.
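    As a minimal illustration of testing one such QoS attribute, the sketch below measures response time (the performance attribute) of a web service endpoint. It is not taken from any of the reviewed studies; the endpoint URL, sample count, and latency budget are hypothetical placeholders.

    ```python
    # Minimal sketch: measuring the "performance" QoS attribute (response time)
    # of a web service endpoint. URL, sample count, and budget are hypothetical.
    import statistics
    import time
    import urllib.request

    ENDPOINT = "https://example.com/"  # placeholder for a real service endpoint
    SAMPLES = 20
    LATENCY_BUDGET_MS = 500  # hypothetical service-level target

    def measure_latency_ms(url: str) -> float:
        """Issue one request and return the wall-clock latency in milliseconds."""
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=5) as response:
            response.read()
        return (time.perf_counter() - start) * 1000.0

    latencies = sorted(measure_latency_ms(ENDPOINT) for _ in range(SAMPLES))
    mean_ms = statistics.mean(latencies)
    p95_ms = latencies[int(0.95 * (SAMPLES - 1))]

    print(f"mean latency: {mean_ms:.1f} ms, p95: {p95_ms:.1f} ms")
    print("within budget" if p95_ms <= LATENCY_BUDGET_MS else "budget exceeded")
    ```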

    Designing, Aligning, and Visualizing Service Systems

    Service is a concept that separates the concerns of an organization into (1) the value created for users and (2) the way the organization manages its resources to provide this value. The discipline of information technology (IT) management uses services to coordinate and to optimize the use of IT resources (servers, applications, databases, etc.) in a way that brings value to users. The concrete application of the service concept is challenging due to its abstract, interdependent, and recursive nature. We experienced this challenge while collaborating with the IT department of our university (École Polytechnique Fédérale de Lausanne, EPFL) when it adopted the IT Infrastructure Library (ITIL) best-practices framework for IT service management. As researchers, our goal is to improve the understanding of services as a means of structuring what people and organizations do. In the context of the IT department, we studied how to apply the service concept internally within the IT department and externally (as business services) in the overall organization. In this thesis, we model services by using systems thinking principles. In particular, we use and improve SEAM, the systemic service-modeling method developed in our laboratory. Our main result is an ontology for SEAM service modeling. Our contributions are the heuristics that define how the ontology relates to a perceived reality: for example, the heuristics focus on behavior rather than organization, and they put an emphasis on service instances rather than service types. We also define alignment between service systems, based on the properties of the systems' behavior. We show how to model an organization by implementing the concept of service as defined by our ontology. This ontology supports the design of service systems that align across both IT and business services. During our work with over one hundred IT services, we developed several visualization prototypes of a service cartography; we use these prototypes to describe and to relate the different views required for managing services. Our results offer a concrete way to implement the abstract concept of services, which could be of interest to any organization willing to embark on a large-scale service project.

    Recent Advances in Social Data and Artificial Intelligence 2019

    The importance and usefulness of subjects and topics involving social data and artificial intelligence are becoming widely recognized. This book contains invited review, expository, and original research articles dealing with, and presenting state-of-the-art accounts of, the recent advances in the subjects of social data and artificial intelligence, and potentially their links to Cyberspace.

    A consensus-based approach for structural resilience to earthquakes using machine learning techniques

    Seismic hazards represent a constant threat to the built environment and, above all, to human lives. Past approaches to seismic engineering considered building deformability as limited to elastic behaviour. Following the introduction of performance-based approaches, a whole new methodology for seismic design and assessment was proposed, relying on the ability of a building to extend its deformability into the plastic domain. This refers to the ability of the building to undergo large deformations while still withstanding them, thereby safeguarding human lives, and it allows transient and permanent deformations to be distinguished under dynamic (e.g., seismic) stresses. In parallel, a whole new discipline is flourishing in which traditional structural analysis methods are coupled with Artificial Intelligence (AI) strategies. At the same time, the emerging discipline of resilience has been widely implemented in the domain of disaster management as well as in structural engineering. However, based on an extensive literature review, current approaches to disaster management at the building and district levels exhibit significant fragmentation in terms of strategies and objectives, highlighting the need for a more holistic conceptualization. The proposed methodology therefore aims to address both the building and district levels by adopting methodologies suited to each scale of analysis. At the building level, an analytical three-stage methodology is proposed to enhance traditional investigation and structural optimization strategies through object-oriented programming, evolutionary computing, and deep learning techniques. This is validated through the application of the proposed methodology to a real building in Old Beichuan, which suffered seismically triggered damage as a result of the 2008 Wenchuan Earthquake. At the district scale, a so-called qualitative methodology is proposed to attain a resilience evaluation in the face of geo-environmental hazards, specifically targeting the built environment; a Delphi expert consultation is adopted and a framework is presented. To combine the two scales, a high-level strategy is ultimately proposed in order to interlace the building- and district-scale simulations so that they are organically interlinked. In this respect, a multi-dimensional mapping of the area of Old Beichuan is presented to aid the identification of some key indicators of the district-level framework. The research has been conducted in the context of the REACH project, investigating the built environment's resilience in the face of seismically triggered geo-environmental hazards in the context of the 2008 Wenchuan Earthquake in China. Results show that an optimized performance-based approach would significantly enhance traditional analysis and investigation strategies, providing an approximate damage reduction of 25% with a cost increase of 20%. In addition, the use of deep learning techniques to replace a traditional simulation engine attained a result precision of up to 98%, making it reliable for conducting investigation campaigns on specific building features when traditional methods fail because the building cannot be accessed or pertinent documentation cannot be traced. It is therefore demonstrated that challenging regulatory frameworks is sometimes a necessary step towards enhancing the resilience of buildings in the face of seismic hazards.
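    To give a flavour of the surrogate-modelling idea mentioned above (a deep learning model standing in for an expensive structural simulation engine), the sketch below trains a small regressor on synthetic feature/response pairs. The building features, the synthetic "simulation" output, and the network size are hypothetical and are not drawn from the thesis.

    ```python
    # Minimal sketch of a deep-learning surrogate for a structural simulation engine.
    # Features, the synthetic response, and hyperparameters are hypothetical.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Hypothetical building features: storeys, column side length (m), concrete strength (MPa)
    X = np.column_stack([
        rng.integers(1, 8, size=2000),
        rng.uniform(0.3, 0.8, size=2000),
        rng.uniform(20.0, 50.0, size=2000),
    ])

    # Stand-in for the expensive simulator's output (e.g., a drift-based damage index).
    y = 0.05 * X[:, 0] - 0.8 * X[:, 1] - 0.01 * X[:, 2] + rng.normal(0, 0.02, size=2000)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
    surrogate.fit(X_train, y_train)

    # R^2 on held-out samples indicates how faithfully the surrogate mimics the simulator.
    print(f"held-out R^2: {surrogate.score(X_test, y_test):.3f}")
    ```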

    Doing privacy right or doing privacy rights. Examining the influence of privacy activities in the nonmarket environment on consumer attitudes and intentions

    Data breaches are rising in magnitude and cost, with technology and privacy threats advancing at a faster pace than privacy regulation. A more sustainable approach to privacy beyond regulation is required. Whilst privacy studies suggest that exceeding regulatory minimums reduces privacy incidents, there is a dearth of scholarship researching privacy activities beyond regulatory minimums. Organisations typically conduct activities beyond regulatory minimums in the nonmarket environment, e.g., political and socially responsible activity. Thus, the nonmarket environment provides a starting point for insight into privacy activities beyond regulation. This research utilises a three-stage sequential mixed-methods approach, with each stage underpinned by theories of control and justice. In the first stage, an Online Delphi Survey is conducted to develop a taxonomy of control-based and justice-based nonmarket privacy activities. A theoretical framework of four primary approaches to privacy in the nonmarket environment is then developed. In the second stage, a number of corporate social responsibility (CSR) reports (n=90) are reviewed using thematic analysis (leveraging the taxonomy previously developed). Control and justice totals are then calculated for the privacy activities reported in these publications, enabling each organisation's approach to nonmarket privacy to be positioned in one of the four primary nonmarket privacy orientations. In the third stage, a theoretical framework is developed, based on Power Responsibility Equilibrium (PRE) theory. Using this framework, a number of hypotheses are formulated regarding the relationships between nonmarket privacy activities and consumer trust, privacy concern, and purchase intention/continuance intention. These hypotheses are explored quantitatively using an experimental vignette methodology (n=396 for the first experiment and n=503 for the second). Control is found to be associated with increased privacy concern and reduced consumer trust and purchase/continuance intention, whereas justice is found to be associated with reduced privacy concern and increased consumer trust and purchase/continuance intention. This research describes a typology of nonmarket privacy for the first time and examines a previously unexplored phenomenon. It extends PRE theory to the context of nonmarket privacy activities, extends CSR posture theory with an additional posture called the Warrior posture, and extends the three Generations of CSR with a Fourth Generation of CSR. The findings provide insights which can assist organisations to address consumers' privacy concerns and enhance their corporate reputation and bottom-line results.
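    The positioning step described above (control and justice totals mapped to one of four orientations) can be sketched as a simple scoring routine. The coded activities, the threshold, and the quadrant labels below are hypothetical placeholders; the thesis's own orientation names are not reproduced here.

    ```python
    # Minimal sketch: score coded privacy activities on control and justice and
    # place the result in a 2x2 space. Data, threshold, and labels are hypothetical.
    from collections import Counter

    # Hypothetical coding of privacy activities extracted from one CSR report.
    coded_activities = ["control", "justice", "justice", "control", "justice"]

    totals = Counter(coded_activities)
    control_total = totals["control"]
    justice_total = totals["justice"]

    THRESHOLD = 2  # hypothetical cut-off separating "high" from "low"

    quadrant = {
        (True, True): "high control / high justice",
        (True, False): "high control / low justice",
        (False, True): "low control / high justice",
        (False, False): "low control / low justice",
    }[(control_total >= THRESHOLD, justice_total >= THRESHOLD)]

    print(f"control={control_total}, justice={justice_total} -> {quadrant}")
    ```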