Towards an ontology for process monitoring and mining
Business Process Analysis (BPA) aims at monitoring, diagnosing, simulating and mining enacted processes in order to support the analysis and enhancement of process models. An effective BPA solution must provide the means for analysing existing e-businesses at three levels of abstraction: the Business Level, the Process Level and the IT Level. BPA requires semantic information that spans these layers of abstraction and that should be easily retrievable from audit trails. To cater for this, we describe the Process Mining Ontology and the Events Ontology, which aim to support the analysis of enacted processes at different levels of abstraction, spanning from fine-grained technical details to coarse-grained aspects at the Business Level.
Moving from Data-Constrained to Data-Enabled Research: Experiences and Challenges in Collecting, Validating and Analyzing Large-Scale e-Commerce Data
Widespread e-commerce activity on the Internet has led to new opportunities to collect vast amounts of micro-level market and nonmarket data. In this paper we share our experiences in collecting, validating, storing and analyzing large Internet-based data sets in the areas of online auctions, music file sharing and online retailer pricing. We demonstrate how such data can advance knowledge by facilitating sharper and more extensive tests of existing theories and by offering observational underpinnings for the development of new theories. Just as experimental economics pushed the frontiers of economic thought by enabling the testing of numerous theories of economic behavior in a controlled laboratory, we believe that observing real-world agents participating in market and nonmarket activity on the Internet, often over extended periods of time, can lead us to develop and test a variety of new theories. Internet data gathering is not controlled experimentation: we cannot randomly assign participants to treatments or determine event orderings. It does, however, offer potentially large data sets with repeated observation of individual choices and actions, and automated data collection holds promise for greatly reduced cost per observation. Our methods rely on technological advances in automated data collection agents. Significant challenges remain in developing appropriate sampling techniques, integrating data from heterogeneous sources in a variety of formats, constructing generalizable processes and understanding legal constraints. Despite these challenges, the early evidence from those who have harvested and analyzed large amounts of e-commerce data points toward a significant leap in our ability to understand the functioning of electronic commerce.
Comment: Published at http://dx.doi.org/10.1214/088342306000000231 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
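The abstract does not spell out how its automated data collection agents work; a minimal sketch of the extraction step, parsing hypothetical listing-page markup for repeated price observations (the `price` class and sample HTML are illustrative assumptions, not from the paper), might look like:

```python
from html.parser import HTMLParser

class AuctionPriceParser(HTMLParser):
    """Collects the text of elements tagged with class='price' (hypothetical markup)."""
    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # Flag that the next text node is a price value.
        if dict(attrs).get("class") == "price":
            self._in_price = True

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(float(data.strip().lstrip("$")))
            self._in_price = False

# In a real agent this HTML would come from a scheduled HTTP fetch of a listing page,
# appending one observation per poll to build the repeated-observation data set.
sample_html = '<div class="price">$12.50</div><div class="price">$7.99</div>'
parser = AuctionPriceParser()
parser.feed(sample_html)
print(parser.prices)  # → [12.5, 7.99]
```

A production agent would wrap this in a fetch loop with rate limiting and provenance logging, which is where the sampling and legal-constraint challenges the authors mention arise.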
Innovation from user experience in Living Labs: revisiting the ‘innovation factory’-concept with a panel-based and user-centered approach
This paper focuses on the problem of facilitating sustainable innovation practices with a user-centered approach. We do so by revisiting the knowledge-brokering cycle and Hargadon and Sutton’s ideas on building an ‘innovation factory’ in the light of current Living Lab practices. Based on theoretical as well as practical evidence from a case study analysis of the LeYLab Living Lab, it is argued that Living Labs with a panel-based approach can act as innovation intermediaries where innovation takes shape through actual user experience in real-life environments, facilitating all four stages of the knowledge-brokering cycle. This finding is also in line with the recently emerging Quadruple Helix model of innovation, which stresses the crucial role of the end user as a stakeholder throughout the whole innovation process.
Run-time risk management in adaptive ICT systems
We present results of the SERSCIS project related to risk management and mitigation strategies in adaptive multi-stakeholder ICT systems. The SERSCIS approach involves using semantic threat models to support automated design-time threat identification and mitigation analysis. The focus of this paper is the use of these models at run-time for automated threat detection and diagnosis, based on a combination of semantic reasoning and Bayesian inference applied to run-time system monitoring data. The resulting dynamic risk management approach is compared to a conventional ISO 27000-style approach, and validation test results are presented from an Airport Collaborative Decision Making (A-CDM) scenario involving data exchange between multiple airport service providers.
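The abstract does not detail how Bayesian inference is applied to the monitoring data; the core step is presumably a posterior update over threat hypotheses given observed symptoms. A minimal sketch of that update, with entirely hypothetical probabilities (the paper's actual models and numbers are not given here), is:

```python
def posterior_threat(prior, p_symptom_given_threat, p_symptom_given_ok):
    """Bayes' rule: P(threat | symptom) from a prior and two likelihoods."""
    numerator = p_symptom_given_threat * prior
    denominator = numerator + p_symptom_given_ok * (1.0 - prior)
    return numerator / denominator

# Hypothetical numbers: a data-exchange failure is observed at run-time;
# active threats are rare (1%), but the failure is far more likely under one.
p = posterior_threat(prior=0.01, p_symptom_given_threat=0.9, p_symptom_given_ok=0.05)
print(round(p, 3))  # → 0.154
```

Even with a rare threat, a symptom that is 18 times likelier under attack raises the posterior well above the prior, which is what makes run-time monitoring data useful for automated diagnosis.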