396 research outputs found

    Web service composition: A survey of techniques and tools

    Web services are a consolidated reality of the modern Web, with a tremendous and growing impact on everyday computing tasks. They have turned the Web into the largest, most widely accepted, and most vibrant distributed computing platform ever. Yet the use and integration of Web services into composite services or applications, a delicate and conceptually non-trivial task, has still not unleashed its full potential. A consolidated analysis framework that advances the fundamental understanding of Web service composition building blocks, in terms of concepts, models, languages, productivity support techniques, and tools, is required. Such a framework is necessary to enable the effective exploration, understanding, assessment, comparison, and selection of service composition models, languages, techniques, platforms, and tools. This article establishes such a framework and reviews the state of the art in service composition from an unprecedented, holistic perspective.

    What do they really mean by “design”? A textual analysis of the reports from design-led labs that strive for better service provision and policymaking

    This master’s thesis investigates the notion of design depicted by design-led labs (DLLs) in the emergent field of design for the public sector, through the viewpoints of two eminent design scholars, Bryan Lawson and Nigel Cross. The DLLs in this thesis refer to various agencies with design as their core competence that are involved in attempts to improve public service provision and policymaking in collaboration with their respective governments. These DLLs largely owe their origins to governments’ demands for creating better services and policies with decreased financial resources. As a result of budgetary constraints and a variety of complex societal problems, the public sector has been obliged to seek ways to innovate its solutions. In the effort to mitigate these difficulties, design seems to have arisen as one alternative approach to addressing these challenges, on account of its prevalent emergence and achievements around the world. However, the usage of the term “design”, constantly mentioned by the DLLs, appears confusing and remains undefined. Because of this lack of clarity about the most essential concept, this thesis attempts to disclose the actual meaning of “design” as seen by the DLLs. The thesis analyses the notion of design adopted by DLLs as explicated in their reports. The notion of design is operationalised on the basis of four landmark design research publications. These serve as investigative lenses for examining the reports from DLLs in four different countries: the Design Council (UK), the Public Policy Lab (US), the Strategic Design Unit at SITRA (Finland), also known as Helsinki Design Lab, and MindLab (Denmark). Through the analysis of the four books, the notions of design are identified and categorised under three themes: design tendencies, design capabilities, and design skills.
The analysis based on these themes shows that the notions of design expressed in the DLLs’ reports extend or contradict those established by Lawson and Cross. In conclusion, this study presents four extended design capabilities and design skills, as well as a new set of design capabilities that may contribute to expanding the notions of design adapted for the public sector in order to improve service provision and policymaking. Additionally, the thesis summarises the arguments presented by the DLLs in support of fostering design as a viable tool for the public sector and governments to achieve more effective service provision and policymaking.

    Creating architecture for a digital information system leveraging virtual environments

    Abstract. The topic of this thesis was the creation of a proof-of-concept digital information system that utilizes virtual environments. The focus was on finding a working design that can then be expanded upon. The research was conducted using design science research, with the information system created as the artifact. The research was conducted for Nokia Networks in Oulu, Finland, referred to in this document as “the target organization”. An information system is a collection of distributed computing components that come together to create value for an organization. Information system architecture is generally derived from enterprise architecture and consists of data, technical, and application architectures. Data architecture outlines the data that the system uses and the policies related to its usage, manipulation, and storage. Technical architecture relates to various technological areas, such as networking and protocols, as well as any environmental factors. Application architecture consists of deconstructing the applications that are used in the operation of the information system. Virtual reality is an experience in which the concepts of presence, autonomy, and interaction come together to create an immersive alternative to a regular display-based computer environment. The most typical form of virtual reality consists of a head-mounted device, controllers, and movement-tracking base stations. The user’s head and body movements can be tracked, which changes their position in the virtual environment. The proof-of-concept information system architecture used a multi-server solution in which one central physical server hosted multiple virtual servers. The system consisted of a website, which served as the knowledge center and from which a client software could be downloaded. The client software was the authorization portal, which determined the virtual environments available to the user.
The virtual reality application included functionalities that enable cooperative, virtualized use of various Nokia products in immersive environments. The system was tested in working situations, such as during exhibitions with customers. The proof-of-concept system fulfilled many of the functional requirements set for it, allowing for cooperation in virtual reality. Additionally, a rudimentary model for access control was available in the designed system. The shortcomings of the system were related to areas such as security and scaling, which can be further developed by introducing a cloud-hosted environment to the architecture.

    Model driven validation approach for enterprise architecture and motivation extensions

    As the adoption of Enterprise Architecture (EA) modelling continues to grow in diversity and complexity, management of its schema, artefacts, semantics, and relationships has become an important business concern. To maintain agility and flexibility within competitive markets, organizations have also been compelled to explore ways of adjusting proactively to innovations, changes, and complex events, including by using EA concepts to model business processes and strategies. The need to ensure appropriate validation of EA taxonomies has thus repeatedly been considered an essential requirement for these processes, in order to express business motivation and relate information systems to technological infrastructure. However, since many taxonomies deployed today use widespread and disparate modelling methodologies, adopting a generic validation approach remains a challenge. The proliferation of EA methodologies and perspectives has also led to intricacies in the formalization and validation of EA constructs, as models often have variant schematic interpretations. Thus, disparate implementations and inconsistent simulation of alignment between business architectures and heterogeneous application systems are common within the EA domain (Jonkers et al., 2003). In this research, the Model Driven Validation Approach (MDVA) is introduced. MDVA allows modelling of EA with validation attributes, formalization of the validation concepts, and transformation of model artefacts to ontologies. The transformation simplifies querying based on motivation and constraints. As the extended methodology is grounded in the semiotics of existing tools, validation is executed using a ubiquitous query language. The major contributions of this work are the extension of a metamodel of the Business Layer of an EA Framework (EAF) with a Validation Element and the development of an EAF-model-to-ontology transformation approach.
With this innovation, domain-driven design and object-oriented analysis concepts are applied to achieve validation of EAF models using an ontology-querying methodology. Additionally, the MDVA facilitates the traceability of EA artefacts using ontology graph patterns.
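To make the idea of validation via ontology graph patterns concrete, the sketch below matches a simple triple pattern against a toy EA ontology to find business-layer elements lacking a validation element. All vocabulary (class names, predicates, instances) is hypothetical and not the actual MDVA schema; a real implementation would use an RDF store and a query language such as SPARQL rather than plain Python.

```python
# Toy triple store: an EA model transformed to (subject, predicate, object) facts.
# The vocabulary below is illustrative only, not the MDVA metamodel.
triples = {
    ("OrderProcess", "type", "BusinessProcess"),
    ("OrderProcess", "hasValidationElement", "OrderCheck"),
    ("OrderCheck", "constrains", "Invoice"),
    ("Invoicing", "type", "BusinessProcess"),
    # "Invoicing" deliberately lacks a validation element.
}

def match(pattern, store):
    """Return variable bindings (names starting with '?') for a triple pattern."""
    results = []
    for triple in store:
        binding = {}
        ok = True
        for part, value in zip(pattern, triple):
            if part.startswith("?"):
                binding[part] = value
            elif part != value:
                ok = False
                break
        if ok:
            results.append(binding)
    return results

# Which business processes carry a validation element, and which do not?
validated = {b["?proc"] for b in match(("?proc", "hasValidationElement", "?v"), triples)}
processes = {b["?proc"] for b in match(("?proc", "type", "BusinessProcess"), triples)}
unvalidated = processes - validated
print(sorted(unvalidated))  # processes still missing a validation element
```

The same pattern-matching step is what an ontology query engine performs at scale; the point here is only that, once the EA model is reduced to triples, validation becomes a declarative query rather than a manual model review.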

    Occupying Time: Design, technology, and the form of interaction

    As technology pervades our everyday life and material culture, new possibilities and problematics are raised for design. Attention in contemporary design discourse is shifting ‘beyond the object’, to the qualities of processes and experiences. The boxes and screens typically the ‘object’ of interaction and interface design are miniaturizing, even disappearing, as computation is integrated into familiar materials and ordinary objects. This opens possibilities, for example as computer and materials science converge with fashion and architecture in smart textiles and intelligent environments, even as it turns us back, in new ways, to traditional design disciplines and practices. In this context, design is not only about the spatial or physical form of objects, but the form of interactions that take place, and occupy time, in people’s relations with and through computational and interactive objects. As argued in this thesis, a central, and particular, concern of interaction design must therefore be the ‘temporal form’ of such objects and the ‘form of interaction’ as they are used over time. Furthermore, increasingly pervasive technology means that the temporality of form and interaction is implicated in more widespread changes to the material conditions of design and of society. Challenging conventions of ‘formalism’ and ‘functionalism’, ‘good’ and ‘total’ design, temporal concerns and implications require new ways of thinking about and working with the materiality, users, and effects of design. Located at an intersection between emerging technologies and design traditions, interaction design is approached in ‘Occupying Time’ through diverse disciplinary frames and scales of consideration. If focus in interaction design is typically on proximate ‘Use’, here a discussion of ‘Materials’ scales down to reconsider the more basic spatial and temporal composition of form, and ‘Change’ scales up to large-scale and long-term design effects.
To anchor these themes in existing discourse and practice, architecture is a primary frame of reference throughout, used to explore certain problematics. Accounts of ‘event’, ‘vernacular’, and ‘non-design’, and concepts of ‘becoming’, ‘in the making’, and ‘futurity’, thus extend a theoretical and practical basis for treating time in (interaction) design discourse. Implications for practice also emerge and are discussed. Basic to the materiality of interaction design, technology puts time central to ‘Material practice’. ‘Participatory practice’ moves beyond user involvement in design processes to participation in ongoing formation. Since temporal form extends design more deeply and further into future use, ‘Critical practice’ examines effects and responsibility. More specific and concrete reflections are situated in relation to my experience in the design research programs ‘IT+Textiles’, ‘Public Play Spaces’, and ‘Static!’. Drawing from architectural discourse and from my own practice, this thesis maps out and builds up a territory of ideas, relations, and examples as an inquiry into issues of time in interaction design.

    Efficiency and Automation in Threat Analysis of Software Systems

    Context: Security is a growing concern in many organizations. Industries developing software systems plan for security early on to minimize expensive code refactorings after deployment. In the design phase, teams of experts routinely analyze the system architecture and design to find potential security threats and flaws. After the system is implemented, the source code is often inspected to determine its compliance with the intended functionalities. Objective: The goal of this thesis is to improve the performance of security design analysis techniques (in the design and implementation phases) and to support practitioners with automation and tool support. Method: We conducted empirical studies to build an in-depth understanding of existing threat analysis techniques (a systematic literature review and controlled experiments). We also conducted empirical case studies with industrial participants to validate our attempt at improving the performance of one technique. Further, we validated our proposal for automating the inspection of security design flaws by organizing workshops with participants (under controlled conditions) and by subsequent performance analysis. Finally, we relied on a series of experimental evaluations to assess the quality of the proposed approach for automating security compliance checks. Findings: We found that the eSTRIDE approach can help focus the analysis and produce twice as many high-priority threats in the same time frame. We also found that reasoning about security in an automated fashion requires extending the existing notations with more precise security information. In a formal setting, minimal model extensions for doing so include security contracts for system nodes handling sensitive information. The formally based analysis can, to some extent, provide completeness guarantees. For a graph-based detection of flaws, the minimal required model extensions include data types and security solutions.
In such a setting, the automated analysis can help reduce the number of overlooked security flaws. Finally, we suggested defining a correspondence mapping between design model elements and implemented constructs. We found that such a mapping is a key enabler for automatically checking the security compliance of the implemented system with the intended design. The key to achieving this is two-fold. First, a heuristics-based search is paramount to limit the manual effort required to define the mapping. Second, it is important to analyze implemented data flows and compare them to the data flows stipulated by the design.
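A minimal sketch of the design-versus-implementation data-flow comparison described above, assuming flows can be reduced to (source, target) pairs once the correspondence mapping is in place. The component names and the helper function are hypothetical illustrations, not the thesis's actual tooling.

```python
def compare_flows(design_flows, implemented_flows):
    """Compare data flows stipulated by the design model with those found
    in the implementation; both are sets of (source, target) pairs."""
    missing = design_flows - implemented_flows        # designed but never implemented
    undocumented = implemented_flows - design_flows   # implemented but absent from the design
    return missing, undocumented

# Hypothetical result of mapping design elements to implemented constructs.
design = {("WebUI", "AuthService"), ("AuthService", "UserDB"), ("WebUI", "AuditLog")}
implemented = {("WebUI", "AuthService"), ("AuthService", "UserDB"),
               ("AuthService", "Analytics")}

missing, undocumented = compare_flows(design, implemented)
print(missing)        # a compliance gap: the designed audit flow is absent
print(undocumented)   # a potential security concern: an unplanned outbound flow
```

Set difference is of course the trivial part; as the abstract notes, the hard problem is establishing the element-to-construct mapping in the first place, which is why a heuristics-based search over it matters.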

    Designing Malleable Cyberinfrastructure to Breach the Golden Barrier

    Design research perspectives may have a great deal of insight to offer emergency response researchers. We consider man-made and natural disasters as events that often require rapid change to existing institutionalized technical, social, and cultural support structures, a fundamental problem for static systems. Built infrastructure such as electric power and telecommunications, and emergency response systems such as fire, police, and the National Guard, all have static information systems tailored to their specific needs. These specialized systems are typical of those developed by applying traditional information systems design theory. They are designed to control domain-specific variables and mitigate a specific class of constraints derived from a well-articulated environment with firm application boundaries. Therefore, typical mission-critical Information and Communication Technology Infrastructure (ICTI) technologies empower knowledge workers with the ability to change current environmental events to ensure safety and security. Disasters create situations that are challenging for typical designs because a disaster erodes control and raises unexpected constraints during an emerging set of circumstances. The unpredictable circumstances of disasters demonstrate that current emergency response ICTI systems are ill equipped to evolve rapidly in concert to address the full scale and scope of such complex problems. A phenomenon found in the treatment of trauma victims, the Golden Trauma Time Interval, is generalized in this paper to all emergencies in order to inform designers of the next-generation ICTI. This future ICTI, or “Cyberinfrastructure”, can provide the essential foundation necessary to dynamically adapt conventional ICTI into a configuration suitable for use during disasters.
However, Cyberinfrastructure will suffice only if it can be sufficiently evolved as an Integrated Information Infrastructure (I3) that addresses the common sociotechnical factors in these domains. This paper describes fundamental design concepts, derived from interdisciplinary theoretical constructs, used to inform the creation of a framework for modeling “complex adaptive systems” (CAS), of which emergency response infrastructural systems and I3 are instances. In previous work, CAS was synthesized with software architecture concepts to arrive at a design approach for the electric power grid’s I3. We present some of the foundational concepts of CAS that are useful for the future design and development of a Cyberinfrastructure. The ICTI may exist today in a raw form capable of accomplishing the task, but further ICTI design research is required to pinpoint critical inhibitors to its evolution. Social, organizational, and institutional issues pertaining to this research are also highlighted as emergency response system design factors needing further consideration. For example, this discussion suggests a resolution to the basic tradeoff between personal privacy rights and public safety.

    ICS Materials. Towards a re-Interpretation of material qualities through interactive, connected, and smart materials.

    The domain of materials for design is changing under the influence of increased technological advancement, miniaturization, and democratization. Materials are becoming connected, augmented, computational, interactive, active, responsive, and dynamic. These are ICS Materials, an acronym that stands for Interactive, Connected, and Smart. While labs around the world are experimenting with these new materials, there is a need to reflect on their potential and impact on design. This paper is a first step in this direction: to interpret and describe the qualities of ICS Materials, considering their experiential patterns, their expressive sensorial dimension, and their aesthetics of interaction. Through case studies, we analyse and classify these emerging ICS Materials and identify common characteristics and challenges, e.g. their ability to change over time or their programmability by designers and users. On that basis, we argue that there is a need to reframe and redesign existing models to describe ICS Materials, letting their qualities emerge.

    Survey of the State of the Art in Natural Language Generation: Core tasks, applications and evaluation

    This paper surveys the current state of the art in Natural Language Generation (NLG), defined as the task of generating text or speech from non-linguistic input. A survey of NLG is timely in view of the changes that the field has undergone over the past decade or so, especially in relation to new (usually data-driven) methods, as well as new applications of NLG technology. This survey therefore aims to (a) give an up-to-date synthesis of research on the core tasks in NLG and the architectures in which such tasks are organised; (b) highlight a number of relatively recent research topics that have arisen partly as a result of growing synergies between NLG and other areas of artificial intelligence; and (c) draw attention to the challenges in NLG evaluation, relating them to similar challenges faced in other areas of Natural Language Processing, with an emphasis on different evaluation methods and the relationships between them. Comment: Published in the Journal of AI Research (JAIR), volume 61, pp. 75-170. 118 pages, 8 figures, 1 table.
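To make the opening definition concrete, here is a minimal template-based realiser that turns non-linguistic input into a sentence, the kind of rule-based baseline the survey contrasts with data-driven methods. The weather-record schema and field names are invented for illustration.

```python
def realise(record):
    """Turn a structured weather record (non-linguistic input) into text.
    A deliberately simple template-based generator; full NLG systems
    pipeline content selection, document planning, and surface realisation."""
    trend = "rising" if record["high"] > record["prev_high"] else "falling"
    return (f"In {record['city']}, temperatures are {trend}, "
            f"with a high of {record['high']} degrees and a "
            f"{record['rain_chance']}% chance of rain.")

data = {"city": "Galway", "high": 17, "prev_high": 14, "rain_chance": 60}
print(realise(data))
```

Even this toy example exhibits two of the survey's core tasks: lexicalisation (mapping the numeric comparison to "rising" or "falling") and surface realisation (slotting content into a grammatical template).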
