
    Semantic discovery and reuse of business process patterns

    Patterns currently play an important role in modern information systems (IS) development, but their use has mainly been restricted to the design and implementation phases of the development lifecycle. Given the increasing significance of business modelling in IS development, patterns have the potential to provide a viable solution for promoting the reusability of recurrent generalized models in the very early stages of development. As a statement of research in progress, this paper focuses on business process patterns and proposes an initial methodological framework for the discovery and reuse of business process patterns within the IS development lifecycle. The framework borrows ideas from the domain engineering literature and proposes the use of semantics to drive both the discovery of patterns and their reuse.

    Stimulating Personal Development and Knowledge Sharing

    Koper, R., Stefanov, K., & Dicheva, D. (Eds.) (2009). Proceedings of the 5th International TENCompetence Open Workshop "Stimulating Personal Development and Knowledge Sharing", October 30-31, 2008, Sofia, Bulgaria: TENCompetence Workshop. The fifth open workshop of the TENCompetence project took place in Sofia, Bulgaria, from 30th to 31st October 2008. These proceedings contain the papers that were accepted for publication by the Program Committee. The work on this publication has been sponsored by the TENCompetence Integrated Project, which is funded by the European Commission's 6th Framework Programme, priority IST/Technology Enhanced Learning, Contract 027087 [http://www.tencompetence.org].

    Processing Structured Data Streams

    A large amount of data is generated daily from sources such as social networks, recommendation systems or geolocation systems, and this information tends to grow exponentially every year. Companies have discovered that processing these data can yield useful conclusions for decision-making or for detecting and solving problems more efficiently, for instance through the study of trends, habits or customs of the population. The information provided by these sources typically consists of a non-structured, continuous data flow in which the relations among data elements form graph structures. Inevitably, processing performance progressively decreases as the size of the data increases. For this reason, non-structured information is usually handled by taking into account only the most recent data and discarding the rest, since older data are considered irrelevant when drawing conclusions. However, this approach is not enough for sources that provide graph-structured data, since spatial features must be considered as well as temporal ones. Spatial features refer to the relationships among data elements; they matter, for example, in marketing techniques, which require information on the location of users and their possible needs, or in the detection of diseases, which uses data about genetic relationships among subjects or the geographic scope. It is worth highlighting three main contributions of this dissertation. First, we provide a comparative study of seven of the most common processing platforms for working with huge graphs and of the languages used to query them. This study measures query performance in terms of execution time, and the syntactic complexity of the languages according to three parameters: number of characters, number of operators and number of internal variables. We use this study to choose the most suitable technology for developing our proposal. Second, we propose three methods to reduce the set of data processed by a query when working with large graphs, namely spatial, temporal and random approximations. These methods are based on Approximate Query Processing techniques and consist of discarding the information that is considered irrelevant for the query. The reduction of the data is performed online with the processing and considers both spatial and temporal aspects of the data. Since discarding information from the source data may decrease the validity of the results, we also define the transformation error obtained with these methods in terms of accuracy, precision and recall. Finally, we present a preprocessing algorithm, called the SDR algorithm, which is also used to reduce the set of data to be processed, but without compromising the accuracy of the results. It calculates a subgraph of the source graph that contains only the information relevant to a given query. Since this technique is a preprocessing algorithm, it runs offline before the actual processing begins. In addition, an incremental version of the algorithm updates the subgraph as new information arrives in the system.
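
    The trade-off between data reduction and result validity can be made concrete with a small sketch. The Python code below is illustrative only and is not the dissertation's SDR algorithm: it restricts a query to the k-hop neighbourhood of its seed nodes (a simple spatial approximation over an assumed adjacency-list graph) and reports the resulting transformation error as precision and recall against the exact answer. All names and the toy query are hypothetical.

    from collections import deque

    def k_hop_subgraph(adj, seeds, k):
        """Spatial approximation (illustrative): keep only the nodes
        within k hops of the query's seed nodes and discard the rest."""
        kept = set(seeds)
        frontier = deque((s, 0) for s in seeds)
        while frontier:
            node, depth = frontier.popleft()
            if depth == k:
                continue
            for nbr in adj.get(node, ()):
                if nbr not in kept:
                    kept.add(nbr)
                    frontier.append((nbr, depth + 1))
        # Return the induced subgraph over the kept nodes.
        return {n: [m for m in adj.get(n, ()) if m in kept] for n in kept}

    def transformation_error(exact, approx):
        """Precision/recall of an approximate answer set vs. the exact one."""
        exact, approx = set(exact), set(approx)
        tp = len(exact & approx)
        precision = tp / len(approx) if approx else 1.0
        recall = tp / len(exact) if exact else 1.0
        return precision, recall

    # Toy query: "all nodes reachable from 'a'", exact vs. on the reduced graph.
    graph = {"a": ["b", "c"], "b": ["d"], "c": [], "d": ["e"], "e": []}
    reduced = k_hop_subgraph(graph, seeds=["a"], k=2)
    print(transformation_error({"b", "c", "d", "e"}, set(reduced) - {"a"}))
    # -> (1.0, 0.75): everything kept is correct, but node "e" was discarded.

    In this toy run the approximation keeps precision at 1.0 but loses recall (0.75), mirroring the point above that discarding source data trades result validity for processing cost.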

    Enabling Model-Driven Live Analytics For Cyber-Physical Systems: The Case of Smart Grids

    Advances in software, embedded computing, sensors, and networking technologies will lead to a new generation of smart cyber-physical systems that will far exceed the capabilities of today's embedded systems. They will be entrusted with increasingly complex tasks like controlling electric grids or autonomously driving cars. These systems have the potential to lay the foundations for tomorrow's critical infrastructures, to form the basis of emerging and future smart services, and to improve the quality of our everyday lives in many areas. In order to solve their tasks, they have to continuously monitor and collect data from physical processes, analyse this data, and make decisions based on it. Making smart decisions requires a deep understanding of the environment, the internal state, and the impacts of actions. Such deep understanding relies on efficient data models to organise the sensed data and on advanced analytics. Considering that cyber-physical systems control physical processes, decisions need to be taken very fast, which makes it necessary to analyse data live, as opposed to conventional batch analytics. However, the complexity and the massive amount of data generated by such systems pose fundamental challenges. While data in the context of cyber-physical systems shares some characteristics with big data, it holds a particular complexity. This complexity results from the complicated physical phenomena described by this data, which makes it difficult to extract a model able to explain such data and its various multi-layered relationships. Existing solutions fail to provide sustainable mechanisms to analyse such data live. This dissertation presents a novel approach, named model-driven live analytics. The main contribution of this thesis is a multi-dimensional graph data model that brings raw data, domain knowledge, and machine learning together in a single model, which can drive live analytic processes. This model is continuously updated with the sensed data and can be leveraged by live analytic processes to support the decision-making of cyber-physical systems. The presented approach has been developed in collaboration with an industrial partner and, in the form of a prototype, applied to the domain of smart grids. The addressed challenges are derived from this collaboration as a response to shortcomings in the current state of the art. More specifically, this dissertation provides solutions for the following challenges. First, data handled by cyber-physical systems is usually dynamic (data in motion, as opposed to traditional data at rest) and changes frequently and at different paces. Analysing such data is challenging, since data models usually represent only a snapshot of a system at one specific point in time. A common approach is discretisation, which regularly samples and stores such snapshots at specific timestamps to keep track of the history; continuously changing data is then represented as a finite sequence of snapshots. Such representations are very inefficient to analyse, since they require mining the snapshots, extracting a relevant dataset, and finally analysing it. For this problem, this thesis presents a temporal graph data model and storage system that treat time as a first-class property. A time-relative navigation concept enables very efficient analysis of frequently changing data. Secondly, making sustainable decisions requires anticipating the impacts that certain actions would have. In complex cyber-physical systems, situations can arise where hundreds or thousands of such hypothetical actions must be explored before a solid decision can be made. Every action leads to an independent alternative from which a set of further actions can be applied, and so forth. Finding the sequence of actions that leads to the desired alternative requires efficiently creating, representing, and analysing many different alternatives. Given that every alternative has its own history, this creates a very high combinatorial complexity of alternatives and histories, which is hard to analyse. To tackle this problem, this dissertation introduces a multi-dimensional graph data model (as an extension of the temporal graph data model) that makes it possible to efficiently represent, store, and analyse many different alternatives live. Thirdly, complex cyber-physical systems are often distributed, but to fulfil their tasks they typically need to share context information between computational entities. This requires analytic algorithms to reason over distributed data, which is a complex task since it relies on the aggregation and processing of various distributed and constantly changing data. To address this challenge, this dissertation proposes an approach to transparently distribute the presented multi-dimensional graph data model in a peer-to-peer manner and defines a stream processing concept to efficiently handle frequent changes. Fourthly, to meet future needs, cyber-physical systems need to become increasingly intelligent. To make smart decisions, these systems have to continuously refine the behavioural models that are known at design time with what can only be learned from live data. Machine learning algorithms can help to capture this unknown behaviour by extracting commonalities over massive datasets. Nevertheless, a single coarse-grained common behaviour model can be very inaccurate for cyber-physical systems, which are composed of completely different entities with very different behaviour; for such systems, fine-grained learning can be significantly more accurate. However, modelling, structuring, and synchronising many fine-grained learning units is challenging. To tackle this, this thesis presents an approach to define reusable, chainable, and independently computable fine-grained learning units, which can be modelled together with, and on the same level as, domain data. This allows machine learning to be woven directly into the presented multi-dimensional graph data model. In summary, this thesis provides an efficient multi-dimensional graph data model that enables live analytics of the complex, frequently changing, and distributed data of cyber-physical systems. This model can significantly improve data analytics for such systems and empower cyber-physical systems to make smart decisions live. The presented solutions combine and extend methods from model-driven engineering, models@run.time, data analytics, database systems, and machine learning.
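
    To illustrate the first challenge's solution, here is a minimal, much-simplified sketch of what treating time as a first-class property can look like: instead of storing full snapshots, each attribute keeps its own sorted history, and time-relative navigation resolves the value in effect at any timestamp with a binary search. The class and attribute names are hypothetical and this is not the thesis prototype.

    import bisect

    class TemporalNode:
        """Illustrative temporal node: every attribute keeps its full
        history as a sorted list of (timestamp, value) pairs, so time is
        a first-class property rather than a series of snapshots."""
        def __init__(self):
            self._history = {}  # attribute -> [(t0, v0), (t1, v1), ...]

        def set(self, attr, t, value):
            # Insert the write at its sorted position in the history.
            bisect.insort(self._history.setdefault(attr, []), (t, value))

        def get(self, attr, t):
            """Time-relative navigation: return the value in effect at
            time t, i.e. the latest write with a timestamp <= t."""
            hist = self._history.get(attr, [])
            i = bisect.bisect_right(hist, (t, float("inf"))) - 1
            return hist[i][1] if i >= 0 else None

    meter = TemporalNode()                # e.g. a smart-grid meter reading
    meter.set("load_kw", 10, 3.2)
    meter.set("load_kw", 20, 4.8)
    print(meter.get("load_kw", 15))       # -> 3.2, the value in effect at t=15

    A sequence-of-snapshots representation would copy the whole node state at every change; here a read at an arbitrary timestamp costs one binary search per attribute, which is what makes navigation over frequently changing data cheap.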

    Ontology Validation for Managers

    Ontology-driven conceptual modeling focuses on accurately representing a domain of interest, instead of making information fit an arbitrary set of constructs. It may be used for different purposes, such as achieving semantic interoperability (Nardi, Falbo and Almeida, 2013), developing knowledge representation models (Guizzardi and Zamborlini, 2012) and evaluating languages (Santos, Almeida and Guizzardi, 2010). Regardless of its final application, a model must be accurately defined in order for it to be a successful solution. This new branch of conceptual modeling improves traditional techniques by taking into consideration ontological properties, such as rigidity, identity and dependence, which are derived from a foundational ontology. The increasing interest in more expressive languages for conceptual modeling is shown by OMG's request for language proposals for the Semantic Information Model Federation (SIMF) (OMG, 2011). OntoUML (Guizzardi, 2005) is an example of a language designed for that purpose. Its metamodel (Carraretto, 2010) is designed to comply with the Unified Foundational Ontology (UFO) and focuses on the structural aspects of individuals and universals. Grounded in human cognition and linguistics, it aims to provide the most basic categories in which humans understand and classify the things around them. In (Guizzardi, 2010), Guizzardi quotes Dijkstra's famous lecture on the humble programmer and makes an analogy entitled "the humble ontologist". He argues that the task of ontology-driven conceptual modeling is extremely complex and that modelers should therefore surround themselves with as many tools as possible to aid in the development of the ontology. These complexities arise from different sources. Some come from the foundational ontology itself: its modal nature, which requires modelers to deal with possibilities, and the many different restrictions of each ontological category. Others come from the need to accurately define instance-level constraints, which require additional rules outside the language's graphical notation. To help modelers develop high-quality OntoUML models, a number of tools have been proposed to aid in different phases of conceptual modeling, from the construction of the models themselves using design-pattern questions (Guizzardi et al., 2011), to automatic syntax verification (Benevides, 2010) and model validation through simulation (Benevides et al., 2010). The importance of domain specifications that accurately capture the intended conceptualization has been recognized by both the traditional conceptual modeling community (Moody et al., 2003) and the ontology community (Vrandečić, 2009). In this research we want to improve on the initiative of (Benevides et al., 2010), focusing exclusively on the validation of ontology-driven conceptual models, and not on verification. With the complexity of the modeling activity in mind, we want to help modelers systematically produce high-quality ontologies, improving the precision and coverage (Gangemi et al., 2005) of the models. We intend to make the simulation-based approach available to users who are not experts in the formal method, relieving them of the need to learn yet another language solely for the purpose of validating their models.
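
    To make the verification/validation distinction concrete, below is a minimal, hypothetical sketch of the kind of syntactic rule an automatic OntoUML verifier (in the spirit of Benevides, 2010) can check: a rigid type must not specialize an anti-rigid one, e.g. a Kind may not specialize a Role. Validation, the focus of this research, goes further by simulating instances to reveal whether the model admits only intended ones. The function and model names are illustrative, not part of any cited tool.

    # Simplified rigidity rule from UFO: rigid universals (e.g. Kinds)
    # must not specialize anti-rigid universals (e.g. Roles or Phases).
    RIGID = {"kind", "subkind", "category"}
    ANTI_RIGID = {"role", "phase", "roleMixin"}

    def rigidity_violations(stereotypes, specializations):
        """stereotypes: type name -> OntoUML stereotype;
        specializations: (child, parent) pairs. Returns violating pairs."""
        return [(c, p) for c, p in specializations
                if stereotypes[c] in RIGID and stereotypes[p] in ANTI_RIGID]

    model = {"Person": "kind", "Student": "role"}
    print(rigidity_violations(model, [("Person", "Student")]))
    # -> [('Person', 'Student')]: a rigid Kind cannot specialize an
    #    anti-rigid Role; the valid direction is Student -> Person.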

    Smart Universities

    Institutions of learning at all levels are challenged by a fast and accelerating pace of change in the development of communications technology. Conferences around the world address the issue, and research journals in a wide range of scholarly fields are placing the challenge of understanding "Education's Digital Future" on their agenda. The World Learning Summit and LINQ Conference 2017 proceedings take this as a point of origin, noting that the future also has a past: emergent uses of communications technologies in learning are neither new nor unfamiliar. What may be less familiar is the notion of "disruption", found in many current conferences and journal entries. Is the disruption of education and learning as transformative as in the case of the film industry, the music industry, journalism, and health? If so, the challenge of understanding future learning and education clearly goes to the core of institutions and organizations as much as to pedagogy and practice in the classroom. One approach to the pursuit of a critical debate is the concept of Smart Universities: educational institutions that adapt to the realities of digital online media in an encompassing manner. How can we, as smarter universities and societies, build sustainable learning ecosystems for coming generations, where technologies serve learning and not the other way around? Perhaps that is the key question of our time, reflecting concerns and challenges in a variety of scholarly fields and disciplines. These proceedings present the results from an engaging event that took place from 7th to 9th June 2017 in Kristiansand, Norway.

    Proceedings of the 7th Sound and Music Computing Conference

    Proceedings of SMC2010, the 7th Sound and Music Computing Conference, July 21st to 24th, 2010.

    Beyond Participatory Design for Service Robotics

    The spread of technologies such as Cloud and Distributed Computing, the Internet of Things (IoT) and Machine Learning techniques comes with highly disruptive innovation potential and consequent design imperatives. The high connectivity of devices and machines is shaping not only sensing and monitoring capabilities but also ever more ubiquitous and diffuse computing capabilities, affecting decision-making through a wide range of assisting tools and methods. With the potential of scaling beyond contemporary applications such as industrial facility monitoring, precision farming and agriculture, healthcare and risk management, Robotics as a Service (RaaS) is bound to involve an increasingly fluid and diverse range of users, shaping new socio-technical systems where practices, habits and relationships will evolve with its adoption. On these premises, applied research at the Polytechnic Interdepartmental Centre for Service Robotics in Turin, Italy, focuses on the development of a service robotics platform able to operate at the local scale and capable of adapting to evolving scenarios.