Cloud service localisation
The essence of cloud computing is the provision of software and hardware services to a range of users in different locations. The aim of cloud service localisation is to facilitate the internationalisation and localisation of cloud services by allowing their adaptation to different locales. We address lingual localisation by providing service-level language translation techniques to adapt services to different languages, and regulatory localisation by providing standards-based mappings to achieve regulatory compliance with regionally varying laws, standards and regulations. The aim is to support and enforce the explicit modelling of aspects particularly relevant to localisation, together with runtime support consisting of tools and middleware services to automate deployment based on models of locales, driven by the two localisation dimensions. We focus here on an ontology-based conceptual information model that integrates locale specification in a coherent way.
Knowledge-Intensive Processes: Characteristics, Requirements and Analysis of Contemporary Approaches
Engineering of knowledge-intensive processes (KiPs) is far from being mastered, since they are genuinely knowledge- and data-centric, and require substantial flexibility, at both design- and run-time. In this work, starting from a scientific literature analysis in the area of KiPs and from three real-world domains and application scenarios, we provide a precise characterization of KiPs. Furthermore, we devise some general requirements related to KiPs management and execution. Such requirements contribute to the definition of an evaluation framework to assess current system support for KiPs. To this end, we present a critical analysis of a number of existing process-oriented approaches by discussing their efficacy against the requirements.
Software service adaptation based on interface localisation
The aim of Web services is the provision of software services to a range of different users in different locations. Service localisation in this context can facilitate the internationalisation and localisation of services by allowing their adaptation to different locales. The authors investigate three dimensions: (i) lingual localisation by providing service-level language translation techniques to adapt services to different languages, (ii) regulatory localisation by providing standards-based mappings to achieve regulatory compliance with regionally varying laws, standards and regulations, and (iii) social localisation by taking into account preferences and customs for individuals and the groups or communities in which they participate. The objective is to support and implement an explicit modelling of aspects that are relevant to localisation, together with runtime support consisting of tools and middleware services to automate deployment based on models of locales, driven by these localisation dimensions. The authors focus here on an ontology-based conceptual information model that integrates locale specification into service architectures in a coherent way.
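The interplay of the lingual and regulatory localisation dimensions can be illustrated with a small sketch. The locale fields, translation table and regulatory mappings below are illustrative assumptions, not the authors' actual ontology-based model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Locale:
    """A locale specification combining the lingual and regulatory dimensions."""
    language: str      # e.g. "de" for German (lingual dimension)
    jurisdiction: str  # e.g. "EU" (regulatory dimension)

# Hypothetical service-level translation table (lingual dimension).
TRANSLATIONS = {
    ("getQuote", "de"): "angebotAbrufen",
}

# Hypothetical standards-based regulatory mapping (regulatory dimension).
REGULATORY_MAPPINGS = {
    ("VAT", "EU"): {"rate_field": "vat_rate", "standard": "EU VAT Directive"},
}

def localise_operation(op_name: str, locale: Locale) -> str:
    """Adapt a service operation name to the target locale's language,
    falling back to the original name when no translation exists."""
    return TRANSLATIONS.get((op_name, locale.language), op_name)

locale = Locale(language="de", jurisdiction="EU")
print(localise_operation("getQuote", locale))  # angebotAbrufen
print(localise_operation("cancel", locale))    # cancel (no translation available)
```

In an ontology-based model, such tables would be replaced by reasoning over locale concepts, but the lookup captures the basic idea of adapting a service interface per locale.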
A Requirements-Led Approach for Specifying QoS-Aware Service Choreographies: An Experience Report.
[Context and motivation] Choreographies are a form of service composition in which partner services interact in a global scenario without a single point of control. The absence of an explicitly specified orchestration requires changes to requirements practices to recognize the need to optimize the service choreography and to monitor it for the satisfaction of system requirements.
[Question/problem] We developed a requirements-led approach that aims to provide tools and processes to transform requirements expressed on service-based systems to QoS-aware choreography specifications.
[Principal ideas/results] The approach is used by domain experts to specify natural language requirements on a service-based system, and by choreography designers to adapt their models to satisfy requirements more effectively. Non-functional requirements are mapped to BPMN choreography diagrams as quality properties, using the Q4BPMN notation, which supports analysis and monitoring facilities. [Contribution] We report the new integrated approach and provide lessons learned from applying it to a real-world example of dynamic taxi management.
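The mapping from natural-language requirements to monitorable quality properties can be sketched as follows. The keyword rules and property fields below are illustrative assumptions, not the actual Q4BPMN syntax or the authors' transformation process:

```python
def to_quality_property(requirement: str) -> dict:
    """Map a natural-language non-functional requirement to a monitorable
    quality property for a choreography task (a sketch using simple keyword
    matching, not a real requirements-analysis pipeline)."""
    req = requirement.lower()
    if "respond" in req or "latency" in req:
        return {"dimension": "performance", "metric": "response_time",
                "operator": "<=", "threshold_ms": 2000}
    if "available" in req:
        return {"dimension": "availability", "metric": "uptime",
                "operator": ">=", "threshold_pct": 99.0}
    return {"dimension": "unspecified"}

prop = to_quality_property("The taxi dispatch service must respond within 2 seconds")
print(prop["metric"])  # response_time
```

Such a property, once attached to a choreography task, gives the runtime monitor a concrete condition to check against observed QoS values.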
Privacy Conscious Web Apps Composition
So-called "apps" are widespread today on the Internet. Most of them allow users to extend the range of functionalities their websites offer. However, they potentially jeopardize the privacy of users. Indeed, they collect, store and process personal pieces of information. Recent studies show that users feel they lack control over their information. They also show that users distrust app providers and would rather turn to their friends or family when they choose apps. In this paper we propose a model-driven approach to empower end-users with extended control over their information. Our work is implemented as a web-based tool to compose apps and manage end-users' privacy requirements. Our work showcases the unexploited possibilities of current web protocols and technologies in terms of privacy management.
Design Ltd.: Renovated Myths for the Development of Socially Embedded Technologies
This paper argues that traditional and mainstream mythologies, which have been continually told within the Information Technology domain among designers and advocates of conceptual modelling since the 1960s in different fields of the computing sciences, could now be renovated or substituted in the mould of more recent discourses about performativity, complexity and end-user creativity that have been constructed across different fields in the meanwhile. In the paper, it is submitted that these discourses could motivate IT professionals to undertake alternative approaches toward the co-construction of socio-technical systems, i.e., social settings where humans cooperate to reach common goals by means of mediating computational tools. The authors advocate further discussion about, and consolidation of, some concepts in design research, design practice and, more generally, Information Technology (IT) development, such as: task-artifact entanglement, universatility (sic) of End-User Development (EUD) environments, the bricolant/bricoleur end-user, the logic of bricolage, maieuta-designers (sic), and the laissez-faire method of socio-technical construction. Points backing these and similar concepts are made to promote further discussion on the need to rethink the main assumptions underlying IT design and development some fifty years after the coming of age of software and modern IT in the organizational domain.
Comment: This is the peer-unreviewed version of a manuscript that is to appear in D. Randall, K. Schmidt, & V. Wulf (Eds.), Designing Socially Embedded Technologies: A European Challenge (2013, forthcoming) with the title "Building Socially Embedded Technologies: Implications on Design" within an EUSSET editorial initiative (www.eusset.eu/
Adaptive Process Management in Cyber-Physical Domains
The increasing application of process-oriented approaches in new challenging cyber-physical domains beyond business computing (e.g., personalized healthcare, emergency management, factories of the future, home automation, etc.) has led to a reconsideration of the level of flexibility and support required to manage complex processes in such domains. A cyber-physical domain is characterized by the presence of a cyber-physical system coordinating heterogeneous ICT components (PCs, smartphones, sensors, actuators) and involving real-world entities (humans, machines, agents, robots, etc.) that perform complex tasks in the "physical" real world to achieve a common goal. The physical world, however, is not entirely predictable, and processes enacted in cyber-physical domains must be robust to unexpected conditions and adaptable to unanticipated exceptions. This demands a more flexible approach to process design and enactment, recognizing that in real-world environments it is not adequate to assume that all possible recovery activities can be predefined for dealing with the exceptions that can ensue. In this chapter, we tackle the above issue and propose a general approach, a concrete framework and a process management system implementation, called SmartPM, for automatically adapting processes enacted in cyber-physical domains in case of unanticipated exceptions and exogenous events. The adaptation mechanism provided by SmartPM is based on declarative task specifications, execution monitoring for detecting failures and context changes at run-time, and automated planning techniques to self-repair the running process, without requiring any specific adaptation policy or exception handler to be predefined at design-time.
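The adaptation cycle described for SmartPM can be sketched as a monitor-compare-repair loop. The state representation and the stub planner below are assumptions for illustration, standing in for SmartPM's declarative task specifications and automated planning techniques:

```python
def execute_with_adaptation(process, initial_state, sense, plan_repair):
    """Execute tasks, comparing the expected state after each task with the
    sensed physical state; when they diverge, prepend a planner-synthesized
    recovery sequence (self-repair) and continue."""
    state = dict(initial_state)
    tasks = list(process)
    while tasks:
        task = tasks.pop(0)
        expected = task(state)      # declarative task spec: state -> expected state
        actual = sense(expected)    # monitor the physical world
        if actual != expected:      # failure or exogenous event detected
            tasks = plan_repair(actual, expected) + tasks
            state = actual
        else:
            state = expected
    return state

# Toy domain: a single "move to A" task, with one simulated sensing failure.
def move_to_a(state):
    return {**state, "pos": "A"}

failures = iter([True])  # fail exactly once, then behave
def flaky_sense(expected):
    if next(failures, False):
        return {**expected, "pos": "lost"}  # physical world diverged
    return expected

result = execute_with_adaptation(
    [move_to_a], {"pos": "start"}, flaky_sense,
    plan_repair=lambda actual, expected: [move_to_a])  # stub planner: retry the move
print(result["pos"])  # A
```

In SmartPM the recovery sequence is synthesized by a planner from the declarative task specifications rather than hard-coded, which is what removes the need for design-time exception handlers.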
Identifying and Consolidating Knowledge Engineering Requirements
Knowledge engineering is the process of creating and maintaining
knowledge-producing systems. Throughout the history of computer science and AI,
knowledge engineering workflows have been widely used because high-quality
knowledge is assumed to be crucial for reliable intelligent agents. However,
the landscape of knowledge engineering has changed, presenting four challenges:
unaddressed stakeholder requirements, mismatched technologies, adoption
barriers for new organizations, and misalignment with software engineering
practices. In this paper, we propose to address these challenges by developing
a reference architecture using a mainstream software methodology. By studying
the requirements of different stakeholders and eras, we identify 23 essential
quality attributes for evaluating reference architectures. We assess three
candidate architectures from recent literature based on these attributes.
Finally, we discuss the next steps towards a comprehensive reference
architecture, including prioritizing quality attributes, integrating components
with complementary strengths, and supporting missing socio-technical
requirements. As this endeavor requires a collaborative effort, we invite all knowledge engineering researchers and practitioners to join us.
Investigation of Geobase Implementation Issues: Case Study of Information Resource Management
Billions of dollars have been wasted on failed information system (IS) projects over the last decade in the private and public sectors. More specifically, the tri-service environment of the U.S. military has not implemented a single successful geospatial IS (GIS). The lack of a service-wide insertion process for GIS was cited as the most significant cause for military GIS failures. GeoBase represents the USAF's most recent GIS implementation. The GeoBase program focuses on Information Resource Management (IRM) and cultural issues. The GeoBase Sustainment Model (GSM), anecdotally developed by GeoBase leadership to reflect implementation issues and the IRM practices of the program, presents a prime research opportunity to examine the legitimacy of the initiative. Within the Federal Government, stricter control on IS has been established in an effort to increase the rate of IS project success. IRM has been offered as the solution. This researcher conducted a case study investigation of GeoBase implementation issues as perceived at the USAF-MAJCOM level to qualitatively assess the validity of the anecdotally constructed GSM. The researcher also assessed the model against key IRM dimensions. Based on a content analysis of the reported implementation issues, IRM documentation, and the GSM itself, the model adequately represented the reported implementation issues and the key IRM dimensions. However, the model was underspecified. Inclusion of communication, a category of reported implementation issues, and advisory committees, a major IRM dimension, would more fully specify the model. A fully specified model may act as the service-wide GIS insertion model, which is currently lacking. (12 tables, 14 figures, 75 refs.)