
    A look at cloud architecture interoperability through standards

    Enabling cloud infrastructures to evolve into a transparent platform while preserving integrity raises interoperability issues: how components are connected needs to be addressed. Interoperability requires standard data models and communication encoding technologies compatible with the existing Internet infrastructure. To reduce vendor lock-in, cloud computing must implement universal strategies regarding standards, interoperability and portability. Open standards are of critical importance and need to be embedded into interoperability solutions. Interoperability is determined at the data level as well as at the service level. The corresponding modelling standards and integration solutions are analysed.

    Interoperability standards for cloud architecture

    Enabling cloud infrastructures to evolve into a transparent platform raises interoperability issues. Interoperability requires standard data models and communication technologies compatible with the existing Internet infrastructure. To reduce vendor lock-in, cloud computing must implement common strategies regarding standards, interoperability and portability. Open standards are of critical importance and need to be embedded into interoperability solutions. Interoperability is determined at the data level as well as at the service level. The relevant modelling standards and integration solutions are analysed in the context of clouds.
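
    Both abstracts above argue for provider-neutral data models and Internet-compatible encodings as the basis for cloud interoperability. The Python sketch below illustrates that idea under stated assumptions: the neutral field names and the two provider schemas are hypothetical, not taken from any real standard or vendor API. Describing a resource once in a neutral model and translating it per provider is the kind of indirection that reduces vendor lock-in.

        # Minimal sketch: a provider-neutral compute description serialized as JSON,
        # the kind of standard data model the abstracts above call for. The field
        # names and both provider schemas are hypothetical, not taken from any real
        # standard or vendor API.
        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class ComputeResource:
            name: str
            cores: int
            memory_gb: int
            image: str  # portable image identifier

        def to_provider_payload(resource: ComputeResource, provider: str) -> str:
            """Translate the neutral model into a provider-specific JSON payload."""
            neutral = asdict(resource)
            if provider == "provider_a":  # hypothetical vendor A schema
                payload = {"vmName": neutral["name"], "vcpus": neutral["cores"],
                           "ramGiB": neutral["memory_gb"], "imageRef": neutral["image"]}
            elif provider == "provider_b":  # hypothetical vendor B schema
                payload = {"instance": neutral["name"], "cpu": neutral["cores"],
                           "memory": neutral["memory_gb"] * 1024, "template": neutral["image"]}
            else:
                raise ValueError(f"unknown provider: {provider}")
            return json.dumps(payload, indent=2)

        if __name__ == "__main__":
            vm = ComputeResource(name="web-01", cores=2, memory_gb=4, image="ubuntu-22.04")
            print(to_provider_payload(vm, "provider_a"))
            print(to_provider_payload(vm, "provider_b"))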

    Developing front-end Web 2.0 technologies to access services, content and things in the future Internet

    The future Internet is expected to be composed of a mesh of interoperable web services accessible from all over the web. This approach has not yet caught on, since global user-service interaction is still an open issue. This paper presents one vision of next-generation front-end Web 2.0 technology that will enable integrated access to services, content and things in the future Internet. We illustrate how front-ends that wrap traditional services and resources can be tailored to the needs of end users, converting them into prosumers (creators as well as consumers of service-based applications). To this end, we propose an architecture that end users without programming skills can use to create front-ends, consult catalogues of resources tailored to their needs, easily integrate and coordinate front-ends, and create composite applications that orchestrate services in their back-end. The paper includes a case study illustrating that current user-centred web development tools are at a very early stage of evolution, and provides statistical data on how the proposed architecture improves these tools. This paper is based on research conducted by the Service Front End (SFE) Open Alliance initiative.
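
    As a small illustration of the front-end composition idea summarised above, the following sketch wraps stub services behind a uniform widget interface and chains them into a composite application. The Widget class and both stub services are invented for this example; they are not taken from the SFE architecture.

        # Sketch of front-ends wrapping services behind a uniform interface so that
        # they can be chained without programming. The Widget class and both stub
        # services are invented for illustration; they are not part of the SFE work.
        from typing import Any, Callable

        class Widget:
            def __init__(self, name: str, service: Callable[[Any], Any], description: str):
                self.name = name
                self.service = service          # wrapped back-end service
                self.description = description  # shown to end users in a catalogue

            def run(self, data: Any) -> Any:
                return self.service(data)

        def compose(widgets: list["Widget"], data: Any) -> Any:
            """Pipe the output of each widget into the next (a simple composite app)."""
            for widget in widgets:
                data = widget.run(data)
            return data

        if __name__ == "__main__":
            # Two stub services: a fake weather lookup and a text-card formatter.
            weather = Widget("weather", lambda city: {"city": city, "temp_c": 21},
                             "Look up current weather for a city")
            card = Widget("card", lambda w: f"{w['city']}: {w['temp_c']} C",
                          "Render a result as a text card")
            print(compose([weather, card], "Madrid"))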

    A Service based Development Environment on Web 2.0 Platforms

    Governments are investing in IT adoption and promoting so-called e-economies as a way to improve competitive advantage. One of the main government actions is to provide Internet access to most of the population, both people and organisations. The Internet provides the support required for connecting organisations, people and geographically distributed development teams. Software development is tightly related to the availability of the tools and platforms needed to build products, and the Internet is becoming the most widely used platform. Software forges such as SourceForge provide an integrated environment that gathers, at low cost, a set of tools suited to each development. In this paper we propose an innovative approach to software development based on Web 2.0, services and method engineering. This approach represents one of the possible usages of the Internet of the future.

    Web Data Extraction, Applications and Techniques: A Survey

    Web Data Extraction is an important problem that has been studied by means of different scientific tools and in a broad range of applications. Many approaches to extracting data from the Web have been designed to solve specific problems and operate in ad-hoc domains; others heavily reuse techniques and algorithms developed in the field of Information Extraction. This survey aims at providing a structured and comprehensive overview of the literature on Web Data Extraction. We provide a simple classification framework in which existing Web Data Extraction applications are grouped into two main classes, namely applications at the Enterprise level and at the Social Web level. At the Enterprise level, Web Data Extraction techniques emerge as a key tool for performing data analysis in Business and Competitive Intelligence systems as well as for business process re-engineering. At the Social Web level, Web Data Extraction techniques make it possible to gather the large amount of structured data continuously generated and disseminated by Web 2.0, Social Media and Online Social Network users, which offers unprecedented opportunities to analyse human behaviour at a very large scale. We also discuss the potential for cross-fertilization, i.e., the possibility of re-using Web Data Extraction techniques originally designed for a given domain in other domains.
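
    The sketch below illustrates the wrapper style of Web Data Extraction discussed in this survey, using only Python's standard library; the HTML fragment and the class names it targets are invented for the example.

        # Wrapper-style extraction sketch: pull structured records out of HTML using
        # only the standard library. The HTML fragment and the class names it targets
        # are invented for the example; real extractors target site-specific markup.
        from html.parser import HTMLParser

        SAMPLE_HTML = """
        <div class="item"><span class="name">Alpha</span><span class="price">9.99</span></div>
        <div class="item"><span class="name">Beta</span><span class="price">14.50</span></div>
        """

        class ItemExtractor(HTMLParser):
            def __init__(self):
                super().__init__()
                self.records = []
                self._field = None  # field the next text chunk belongs to

            def handle_starttag(self, tag, attrs):
                cls = dict(attrs).get("class")
                if tag == "div" and cls == "item":
                    self.records.append({})        # start a new record
                elif tag == "span" and cls in ("name", "price"):
                    self._field = cls              # remember which field comes next

            def handle_data(self, data):
                if self._field and self.records:
                    self.records[-1][self._field] = data.strip()
                    self._field = None

        if __name__ == "__main__":
            extractor = ItemExtractor()
            extractor.feed(SAMPLE_HTML)
            print(extractor.records)  # [{'name': 'Alpha', 'price': '9.99'}, ...]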

    The snowflake effect: the future of mashups and learning

    Emerging technologies for learning report - Article exploring web mashups and their potential for education.

    Situational Enterprise Services

    The ability to rapidly find potential business partners and to rapidly set up a collaborative business process is desirable in the face of market turbulence. Collaborative business processes are increasingly dependent on the integration of business information systems, and traditional linking of business processes has a largely ad hoc character. Implementing situational enterprise services in an appropriate way gives the business more flexibility, adaptability and agility. Service-oriented architecture (SOA) is rapidly becoming the dominant computing paradigm and is now being embraced by organisations everywhere as the key to business agility. Web 2.0 technologies such as AJAX, on the other hand, provide good user interaction for service discovery, selection, adaptation, invocation and construction; they also balance automatic integration of services with human interaction, disconnecting content from presentation in the delivery of the service. Semantic Web technology makes automatic service discovery, mediation and composition possible. Integrating SOA, Web 2.0 technologies and the Semantic Web into a service-oriented virtual enterprise connects business processes in a much more horizontal fashion. To run these services consistently across the enterprise, an infrastructure that provides an enterprise architecture and security foundation is necessary. The world is constantly changing, and so is the business environment: an agile enterprise needs to be able to change quickly and cost-effectively how it does business and who it does business with. Recognising and adapting to different situations is therefore an important aspect of today's business environment. Changes in an operating environment can happen implicitly or explicitly; they can be caused by different factors in the application domain, can be made to organise information in a better way, or can follow users' needs, such as incorporating additional functionality. Handling and managing the different situations of service-oriented enterprises is an important aspect of the business environment. In this chapter, we investigate how to apply new Web technologies to develop, deploy and execute enterprise services.
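
    A minimal sketch of the situational binding described above follows. All names are illustrative: a real deployment would rely on a proper service registry and semantic matching rather than an in-memory dictionary, but the late discovery and binding step is the same in spirit.

        # Sketch of situational service binding: services register capabilities in a
        # registry and the caller discovers and binds a partner at run time, so the
        # partner can be swapped when the business situation changes. All names are
        # illustrative; a real setting would use a proper registry and richer matching.
        from typing import Callable

        registry: dict[str, list[Callable[[dict], dict]]] = {}

        def register(capability: str, service: Callable[[dict], dict]) -> None:
            registry.setdefault(capability, []).append(service)

        def discover(capability: str) -> Callable[[dict], dict]:
            candidates = registry.get(capability, [])
            if not candidates:
                raise LookupError(f"no service offers capability: {capability}")
            return candidates[0]  # naive selection; could rank by QoS, price, trust, ...

        # Two interchangeable shipping partners exposing the same capability.
        register("shipping.quote", lambda order: {"carrier": "FastShip", "price": 12.0})
        register("shipping.quote", lambda order: {"carrier": "EcoShip", "price": 8.5})

        if __name__ == "__main__":
            quote = discover("shipping.quote")  # late, situational binding
            print(quote({"weight_kg": 3}))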

    Introducing "2.0" functionalities in an ERP

    Companies and ERP vendors show an increasing interest in Web 2.0 technologies, which aim at involving the users of a web site in the creation of its content. In this communication we summarise what these tools are and give an overview of recent examples of their use in companies. Using the example of SAP's most recent ERP, Business ByDesign, we show that while "2.0 tools" are now available in some ERPs, their integration into business processes is not yet complete. To that end, we suggest a first draft of a methodology aimed at developing "2.0 business processes" using an ERP 2.0.
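
    As a small, purely illustrative example of what a "2.0" functionality inside an ERP could look like (the object model below is invented and is not Business ByDesign's API), the sketch attaches user-generated tags and comments to a business object so they can be carried along the business process.

        # Illustrative sketch (not SAP Business ByDesign's API): attaching Web 2.0-style
        # tags and comments to an ERP business object so that user-generated content
        # can travel with the business process that handles the object.
        from dataclasses import dataclass, field

        @dataclass
        class BusinessObject:
            object_id: str
            kind: str                                                      # e.g. "purchase_order"
            tags: set[str] = field(default_factory=set)
            comments: list[tuple[str, str]] = field(default_factory=list)  # (user, text)

            def tag(self, label: str) -> None:
                self.tags.add(label)

            def comment(self, user: str, text: str) -> None:
                self.comments.append((user, text))

        if __name__ == "__main__":
            po = BusinessObject(object_id="PO-4711", kind="purchase_order")
            po.tag("urgent")
            po.comment("alice", "Supplier confirmed delivery for Friday.")
            print(po.tags, po.comments)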

    Quality of Web Mashups: A Systematic Mapping Study

    The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-04244-2_8
    Web mashups are a new generation of applications based on the composition of ready-to-use, heterogeneous components. They are gaining momentum thanks to their lightweight composition approach, which represents a new opportunity for companies to leverage past investments in SOA, Web services and public APIs. Although several studies are emerging to address mashup development, no systematic mapping studies have been reported on how quality issues are being addressed. This paper reports a systematic mapping study on which and how the quality of Web mashups has been addressed, and on how product quality-aware approaches have been defined and validated. The aim of this study is to provide a background against which future research activities can be appropriately developed. A total of 38 research papers were included from an initial set of 187 papers. Our results provide findings on how the most relevant product quality characteristics have been addressed in different artifacts and stages of the development process. They have also been useful for detecting research gaps, such as the need for more controlled experiments and for more quality-aware mashup development proposals for other characteristics which, despite being important for the Web domain, have been neglected, such as Usability and Reliability.
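
    As an illustration of the tallying step of such a mapping study, the snippet below counts included papers per quality characteristic. The characteristic names follow ISO/IEC 25010, and the paper identifiers and classifications are placeholders rather than the study's actual data.

        # Illustrative tally step of a mapping study: count included papers per quality
        # characteristic. Characteristic names follow ISO/IEC 25010; the paper
        # identifiers and classifications are placeholders, not the study's data set.
        from collections import Counter

        classification = {
            "P01": ["Functional Suitability", "Usability"],
            "P02": ["Performance Efficiency"],
            "P03": ["Usability"],
        }

        counts = Counter(c for chars in classification.values() for c in chars)
        for characteristic, n in counts.most_common():
            print(f"{characteristic}: {n} paper(s)")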