A tentative model for IT outsourcing
Abstract: Outsourcing projects involve many stakeholders, including clients, who transfer the operational responsibility for their business processes, and a vendor, who accepts that responsibility. IT outsourcing applies when the transferred responsibilities are limited to IT-related processes. Outsourcing clients seek to leverage the potential efficiency provided by the vendor, whereby the vendor delivers the same intended business results at lower cost. Vendors potentially achieve these efficiencies through better subject-matter expertise and economies of scale. Clients outsource processes considered non-strategic, while strategic processes are managed in-house. Once a client decides to outsource IT-related processes, the outsourcing strategy takes the form of one or more of five delivery models. This article presents a proposed model for successful IT outsourcing. The model assists the outsourcing professional in navigating the various steps of the outsourcing life span with a clearer awareness of likely causes and potential remedies applicable to the outsourcing strategy on any given outsourcing project
Stakeholder Relationships within Educational Ecosystems – a Literature Review
Ongoing societal and technological changes make it necessary for universities to modify their teaching and learning programs. Cooperation with the stakeholders that have influence on how these programs could be designed can serve as a basis for this. With this article we examine the recent contributions in the field of cooperation of higher education (HE) institutions by conducting a structured literature review. We look in particular at the interdependences between stakeholders in order to build a basis for future curricula developments. For our conceptualization, we use the quadruple helix model to analyse educational ecosystems; in particular, the term “educational ecosystem” is taken into account. The results show that even though the term is used, there is a lack of suitable definitions in this context. Based on the analysis of the recent literature, a definition of educational ecosystem is therefore introduced, and the quintuple helix model, which was constructed for the conceptualization of the topic, is extended by further important aspects – knowledge transfer and adaptivity
An Analysis of the Literature on Reference Modelling in Business Process Management Using Quantitative Methods
In business process management, reference modelling plays an important role in the design of business processes, since existing models can be reused. This saves time in process development and allows established knowledge to be leveraged. This master's thesis analyses the literature on reference modelling in business process management using quantitative methods. In particular, it identifies the research directions and topic areas, developments, and the current state of the literature in this field. First, German- and English-language articles are selected according to defined criteria. A quantitatively oriented analysis of the literature follows, employing Latent Semantic Analysis to identify topic areas and to assign the individual contributions to them. Furthermore, the development of the number of articles per topic area over time is examined, and differences between the German- and English-language literature are discussed. In the subsequent qualitatively oriented analysis, the articles of the individual topic areas are analysed in terms of content and the current state of research is presented. Finally, the results of the qualitative analysis are related to the results of the quantitative analysis
Three Studies on Model Transformations - Parsing, Generation and Ease of Use
ABSTRACT: Transformations play an important part in both software development and the automatic processing of natural languages. We present three publications rooted in the multi-disciplinary research of Language Technology and Software Engineering and relate their contribution to the literature on syntactical transformations.
Parsing Linear Context-Free Rewriting Systems: The first publication describes four different parsing algorithms for the mildly context-sensitive grammar formalism Linear Context-Free Rewriting Systems. The algorithms automatically transform a text into a chart. As a result the parse chart contains the (possibly partial) analysis of the text according to a grammar with a lower level of abstraction than the original text. The uni-directional and endogenous transformations are described within the framework of parsing as deduction.
Natural Language Generation from Class Diagrams: Using the framework of Model-Driven Architecture we generate natural language from class diagrams. The transformation is done in two steps. In the first step we transform the class diagram, defined by Executable and Translatable UML, to grammars specified by the Grammatical Framework. The grammars are then used to generate the desired text. Overall, the transformation is uni-directional, automatic and an example of a reverse engineering translation.
Executable and Translatable UML - How Difficult Can it Be? Within Model-Driven Architecture there has been substantial research on the transformation from Platform-Independent Models (PIM) into Platform-Specific Models, less so on the transformation from Computationally Independent Models (CIM) into PIMs.
This publication reflects on the outcomes of letting novice software developers transform CIMs specified by UML into PIMs defined in Executable and Translatable UML.
Conclusion: The three publications show how model transformations can be used within both Language Technology and Software Engineering to tackle the challenges of natural language processing and software development
Enterprise Architecture adoption method for Higher Education Institutions
During the last few years Enterprise Architecture has received increasing attention among industry and academia. Enterprise Architecture (EA) can be defined as (i) a formal description of the current and future state(s) of an organisation, and (ii) a managed change
between these states to meet organisation’s stakeholders’ goals and to create value to the organisation. By adopting EA, organisations may gain a number of benefits such as better decision making, increased revenues and cost reductions, and alignment of business and IT. To increase the performance of public sector operations, and to improve public services and their availability, the Finnish Parliament has ratified the Act on Information Management Governance in Public Administration in 2011. The Act mandates public sector organisations to start adopting EA by 2014, including Higher Education Institutions (HEIs). Despite the benefits of EA and the Act, EA adoption level and maturity in Finnish HEIs are low. This is partly caused by the fact that EA adoption has been found to be difficult. Thus there is a need
for a solution to help organisations adopt EA successfully. This thesis follows a Design Science (DS) approach to improve the traditional EA adoption method in order to increase the likelihood of successful adoption. First, a model is developed to explain the change resistance during EA adoption. To find out the problems associated with EA adoption, an EA pilot conducted in 2010 among 12 Finnish HEIs was analysed using the model. It was found that most of the problems were caused by misunderstood EA concepts, attitudes, and lack of skills. The traditional EA adoption method does not pay attention to these. To overcome the limitations of the traditional EA adoption method, an improved EA Adoption Method (EAAM) is introduced. By following EAAM, organisations may increase the likelihood of successful EA adoption. EAAM helps in acquiring the mandate for EA adoption from top management, which has been found to be crucial to success. It also helps in supporting individual and organisational learning, which has also been found to be essential in successful adoption
A Scholarship Approach to Model-Driven Engineering
Model-Driven Engineering is a paradigm for software engineering where software models are the primary artefacts throughout the software life-cycle. The aim is to define suitable representations and processes that enable precise and efficient specification, development and analysis of software. Our contributions to Model-Driven Engineering are structured according to Boyer's four functions of academic activity - the scholarships of teaching, discovery, application and integration. The scholarships share a systematic approach towards seeking new insights and promoting progressive change. Even if the scholarships have their differences they are compatible so that theory, practice and teaching can strengthen each other.
Scholarship of Teaching: While teaching Model-Driven Engineering to undergraduate students we introduced two changes to our course. The first change was to introduce a new modelling tool that enabled the execution of software models while the second change was to adopt pair lecturing to encourage the students to actively participate in developing models during lectures.
Scholarship of Discovery: By using an existing technology for transforming models into source code we translated class diagrams and high-level action languages into natural language texts. The benefit of our approach is that the translations are applicable to a family of models while the texts are reusable across different low-level representations of the same model.
Scholarship of Application: Raising the level of abstraction through models might seem a technical issue but our collaboration with industry details how the success of adopting Model-Driven Engineering depends on organisational and social factors as well as technical ones.
Scholarship of Integration: Building on our insights from the scholarships above and a study at three large companies we show how Model-Driven Engineering empowers new user groups to become software developers but also how engineers can feel isolated due to poor tool support. Our contributions also detail how modelling enables a more agile development process as well as how the validation of models can be facilitated through text generation.
The four scholarships allow for different possibilities for insights and explore Model-Driven Engineering from diverse perspectives. As a consequence, we investigate the social, organisational and technological factors of Model-Driven Engineering but also examine the possibilities and challenges of Model-Driven Engineering across disciplines and scholarships
Software development in the post-PC era : towards software development as a service
PhD Thesis
Engineering software systems is a complex task which involves various stakeholders
and requires planning and management to succeed. As the role of software in our daily
life is increasing, the complexity of software systems is increasing. Throughout the
short history of software engineering as a discipline, the development practises and
methods have rapidly evolved to seize opportunities enabled by new technologies
(e.g., the Internet) and to overcome economical challenges (e.g., the need for cheaper
and faster development).
Today, we are witnessing the Post-PC era. An era which is characterised by mobility and
services. An era which removes organisational and geographical boundaries. An era
which changes the functionality of software systems and requires alternative methods
for conceiving them.
In this thesis, we envision executing software development processes in the cloud.
Software processes have a software production aspect and a management aspect. To
the best of our knowledge, there are neither academic nor industrial solutions supporting the
entire software development process life-cycle (from both production and management
aspects) and its tool-chain execution in the cloud.
Our vision is to use the cloud economies of scale and leverage Model-Driven Engineering
(MDE) to integrate production and management aspects into the development
process. Since software processes are seen as workflows, we investigate using existing
Workflow Management Systems to execute software processes and we find that these
systems are not suitable. Therefore, we propose a reference architecture for Software
Development as a Service (SDaaS). The SDaaS reference architecture is the first proposal
which fully supports development of complex software systems in the cloud.
In addition to the reference architecture, we investigate three specific related challenges
and propose novel solutions addressing them. These challenges are:
Modelling & enacting cloud-based executable software processes. Executing
software processes in the cloud can bring several benefits to software development.
In this thesis, we discuss the benefits and considerations of cloud-based
software processes and introduce a modelling language for modelling such processes.
We refer to this language as EXE-SPEM. It extends the Software and Systems
Process Engineering (SPEM2.0) OMG standard to support creating cloud-based
executable software process models. Since EXE-SPEM is a visual modelling
language, we introduce an XML notation to represent EXE-SPEM models
in a machine-readable format and provide mapping rules from EXE-SPEM to
this notation. We demonstrate this approach by modelling an example software
process using EXE-SPEM and mapping it to the XML notation. Software process
models expressed in this XML format can then be enacted in the proposed SDaaS
architecture.
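The concrete XML notation of EXE-SPEM is not given in this abstract; purely as an illustration of the idea of mapping a visual process model to a machine-readable XML format, the following Python sketch serialises a toy process description. All element and attribute names here are invented, not the actual EXE-SPEM notation.

```python
# Hypothetical sketch: serialising a cloud-based process model to XML,
# in the spirit of the EXE-SPEM -> XML mapping described above.
# Element and attribute names are invented for illustration.
import xml.etree.ElementTree as ET

def process_to_xml(name, activities):
    """Serialise a process (a list of activity dicts) to an XML string."""
    root = ET.Element("process", {"name": name})
    for act in activities:
        ET.SubElement(root, "activity", {
            "name": act["name"],
            # hypothetical per-activity deployment constraint
            "cloud": act.get("cloud", "public"),
        })
    return ET.tostring(root, encoding="unicode")

activities = [
    {"name": "DesignReview", "cloud": "private"},
    {"name": "UnitTest"},
]
xml_text = process_to_xml("ExampleProcess", activities)
print(xml_text)
```

A model expressed this way could then be parsed by an enactment engine, which is the role the SDaaS architecture plays for actual EXE-SPEM models.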
Cost-efficient scheduling of software process execution in the cloud. Software
process models are enacted in the SDaaS architecture as workflows. We
refer to them sometimes as Software Workflows. Once we have executable software
process models, we need to schedule them for execution. In a setting where
multiple software workflows (and their activities) compete for shared computational
resources (workflow engines), scheduling workflow execution becomes
important. Workflow scheduling is an NP-hard problem which refers to the allocation
of sufficient resources (human or computational) to workflow activities.
The schedule impacts the workflow makespan (execution time) and cost as well as
the computational resources utilisation. The target of the scheduling is to reduce
the process execution cost in the cloud without significantly affecting the process
makespan while satisfying the special requirements of each process activity (e.g.,
executing on a private cloud). We adapt three workflow scheduling algorithms
to fit SDaaS and propose a fourth: the Proportional Adaptive Task Schedule.
The algorithms are then evaluated through simulation. The simulation results
show that our proposed algorithm saves between 19.74% and 45.78% of the
execution cost, provides the best resource (VM) utilisation, and achieves the
second-best makespan compared to the other presented algorithms.
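The abstract does not specify the Proportional Adaptive Task Schedule itself; purely as an illustration of cost-aware scheduling under per-activity placement requirements, the following sketch greedily assigns each activity to the cheapest VM type that satisfies its requirement. All VM types, prices and activity names are hypothetical, and real schedulers must also trade cost against makespan.

```python
# Illustrative greedy cost-aware scheduler (NOT the thesis's actual
# Proportional Adaptive Task Schedule; all names and costs are invented).

# VM types: (name, cost per hour, cloud kind)
VM_TYPES = [
    ("small-public", 0.05, "public"),
    ("large-public", 0.20, "public"),
    ("small-private", 0.30, "private"),
]

def schedule(activities):
    """Assign each (name, hours, required_cloud) activity the cheapest
    VM type satisfying its placement requirement; return the assignment
    and the total execution cost."""
    assignment, total = {}, 0.0
    for name, hours, required_cloud in activities:
        # keep only VM types compatible with the placement requirement
        candidates = [(cost, vm) for vm, cost, kind in VM_TYPES
                      if required_cloud in (None, kind)]
        cost, vm = min(candidates)  # cheapest feasible VM type
        assignment[name] = vm
        total += cost * hours
    return assignment, total

acts = [("Compile", 2, None), ("SafetyAnalysis", 1, "private")]
plan, cost = schedule(acts)
print(plan, round(cost, 2))
```

Here the safety activity is forced onto the (more expensive) private cloud while unconstrained work goes to the cheapest public VM, which is the kind of trade-off the scheduling challenge above describes.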
Evaluating the SDaaS architecture using a case study from the safety-critical
systems domain. To evaluate the proposed SDaaS reference architecture, we
instantiate a proof-of-concept implementation of the architecture. This
implementation is then used to enact safety-critical processes as a case study.
Engineering safety-critical systems is a complex task which involves multiple
stakeholders. It requires shared and scalable computation to systematically involve
geographically distributed teams. In this case study, we use EXE-SPEM to
model a portion of a process (namely; the Preliminary System Safety Assessment
- PSSA) adapted from the ARP4761 [2] aerospace standard. Then, we enact this
process model in the proof-of-concept SDaaS implementation.
By using the SDaaS architecture, we demonstrate the feasibility of our approach
and its applicability to different domains and to customised processes. We also
demonstrate the capability of EXE-SPEM to model cloud-based executable processes.
Furthermore, we demonstrate the added value of the process models and
the process execution provenance data recorded by the SDaaS architecture. This
data is used to automate the generation of safety case argument fragments, thus
reducing development cost and time. Finally, the case study shows that we
can integrate some existing tools and create new ones as activities used in process
models.
The proposed SDaaS reference architecture (combined with its modelling, scheduling
and enactment capabilities) brings the benefits of the cloud to software development. It
can potentially save software production cost and provide an accessible platform that
supports collaborating teams (potentially across different locations). The executable
process models support unified interpretation and execution of processes across team
members. In addition, the use of models provides managers with global awareness and
can be utilised for quality assurance and process metrics analysis and improvement.
We see the contributions provided in this thesis as a first step towards an alternative
development method that uses the benefits of cloud and Model-Driven Engineering to
overcome existing challenges and open new opportunities. However, there are several
challenges that are outside the scope of this study which need to be addressed to allow
full support of the SDaaS vision (e.g., supporting interactive workflows). The solutions
provided in this thesis address only part of a bigger vision. There is also a need for
empirical and usability studies to study the impact of the SDaaS architecture on both
the produced products (in terms of quality, cost, time, etc.) and the participating
stakeholders
Mapping and the Citizen Sensor
Maps are a fundamental resource in a diverse array of applications ranging from everyday activities, such as route planning, through the legal demarcation of space, to scientific studies, such as those seeking to understand biodiversity and inform the design of nature reserves for species conservation. For a map to have value, it should provide an accurate and timely representation of the phenomenon depicted, and this can be a challenge in a dynamic world. Fortunately, mapping activities have benefitted greatly from recent advances in geoinformation technologies. Satellite remote sensing, for example, now offers unparalleled data acquisition, and authoritative mapping agencies have developed systems for the routine production of maps in accordance with strict standards. Until recently, much mapping activity was in the exclusive realm of authoritative agencies, but technological development has also allowed the rise of the amateur mapping community. The proliferation of inexpensive, highly mobile and location-aware devices, together with Web 2.0 technology, has fostered the emergence of the citizen as a source of data. Mapping presently benefits from vast amounts of spatial data as well as people able to provide observations of geographic phenomena, which can inform map production, revision and evaluation. The great potential of these developments is, however, often limited by concerns. These span issues from the nature of the citizens, through the way data are collected and shared, to the quality and trustworthiness of the data. This book reports on some of the key issues connected with the use of citizen sensors in mapping. It arises from a European Co-operation in Science and Technology (COST) Action, which explored issues linked to topics ranging from citizen motivation, data acquisition and data quality to the use of citizen-derived data in the production of maps that rival, and sometimes surpass, maps arising from authoritative agencies
A method for developing Reference Enterprise Architectures
Industrial change forces enterprises to constantly adjust their organizational structures in order to stay competitive. In this regard, research acknowledges the potential of Reference Enterprise Architectures (REA). This thesis proposes REAM - a method for developing REAs. After contrasting organizations' needs with approaches available in the current knowledge base, this work identifies the absence of method support for REA development. Proposing REAM, the author aims to close this research gap and evaluates the method's utility by applying REAM in different naturalistic settings
Hierarchical Graphs as Organisational Principle and Spatial Model Applied to Pedestrian Indoor Navigation
In this thesis, hierarchical graphs are investigated from two different angles – as a general modelling principle for (geo)spatial networks and as a practical means to enhance navigation in buildings. The topics addressed are of interest from a multi-disciplinary point of view, ranging from Computer Science in general, through Artificial Intelligence
and Computational Geometry in particular, to other fields such as Geographic Information Science.
Some hierarchical graph models have been previously proposed by the research community, e.g. to cope with the massive size of road networks, or as a conceptual model for human wayfinding. However, there has not yet been a comprehensive, systematic approach for modelling spatial networks with hierarchical graphs. One particular
problem is the gap between conceptual models and models which can be readily used in practice. Geospatial data is commonly modelled - if at all - only as a flat graph. Therefore, from a practical point of view, it is important to address the automatic construction of a graph hierarchy based on the predominant data models. The work presented deals with this problem: an automated method for construction is introduced and explained. A particular contribution of my thesis is the proposition to use hierarchical graphs as the basis for an extensible, flexible architecture for modelling various (geo)spatial networks. The proposed approach complements classical graph models very well in the sense that their expressiveness is extended: various graphs originating from different
sources can be integrated into a comprehensive, multi-level model.
This more sophisticated kind of architecture allows for extending navigation services beyond the borders of one single spatial network to a collection of heterogeneous networks, thus establishing a meta-navigation service. Another point of discussion is the impact of the hierarchy and distribution on graph algorithms. They have to be
adapted to properly operate on multi-level hierarchies.
By investigating indoor navigation problems in particular, the guiding principles are demonstrated for modelling networks at multiple levels of detail. Complex environments like large public buildings are ideally suited to demonstrate the versatile use of hierarchical
graphs and thus to highlight the benefits of the hierarchical approach.
Starting from a collection of floor plans, I have developed a systematic method for constructing a multi-level graph hierarchy. The nature of indoor environments, especially their inherent diversity, poses an additional challenge: among others, one must deal with complex, irregular, and/or three-dimensional features. The proposed method is also motivated by practical considerations, such as not only finding shortest/fastest paths across rooms and floors, but also by providing descriptions for these paths which are easily understood by people.
Beyond this, two novel aspects of using a hierarchy are discussed. One is an informed heuristic exploiting the specific characteristics of indoor environments in order to enhance classical, general-purpose graph search techniques; as a convenient by-product of this method, clusters such as sections and wings can be detected. The other is to better deal with irregular, complex-shaped regions in a way that instructions can also be provided for these spaces. Previous approaches have not considered this problem.
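The coarse-to-fine routing idea behind such hierarchies can be illustrated with a minimal sketch: plan a route over the coarse cluster graph (e.g. wings or floors) first, and only then refine the search within the clusters along that route. The graph and names below are invented for illustration and do not reproduce the thesis's actual data structures.

```python
# Minimal sketch of the coarse level of two-level hierarchical routing:
# find a route over clusters (wings/floors); a full implementation would
# then refine the path within each cluster. Names are hypothetical.
from collections import deque

def bfs_path(graph, start, goal):
    """Shortest path by edge count in an undirected adjacency dict."""
    prev, frontier = {start: None}, deque([start])
    while frontier:
        node = frontier.popleft()
        if node == goal:
            path = []
            while node is not None:  # walk predecessors back to start
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in graph[node]:
            if nxt not in prev:
                prev[nxt] = node
                frontier.append(nxt)
    return None

# Coarse level: wings connected by corridors/staircases.
clusters = {"WingA": ["WingB"], "WingB": ["WingA", "WingC"], "WingC": ["WingB"]}
route = bfs_path(clusters, "WingA", "WingC")
print(route)
```

Restricting the subsequent fine-grained search to rooms inside the clusters on this coarse route is what makes the hierarchy act as an informed heuristic.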
In summary, the main results of this work are:
• hierarchical graphs are introduced as a general spatial data infrastructure. In particular, this architecture allows us to integrate different spatial networks originating from different sources. A small but useful set of operations is proposed for integrating these networks. In order to work in a hierarchical model, classical graph algorithms are generalised. This finding also has implications for the possible integration of separate
navigation services and systems;
• a novel set of core data structures and algorithms has been devised for modelling indoor environments. They cater to the unique characteristics of these environments and can be specifically used to provide enhanced navigation in buildings. Tested on models of several real buildings from our university, a prototypical implementation yielded preliminary but promising results