Goal-oriented design of value and process models from patterns
This thesis defines a design framework and a method for modelling networked businesses. The intended application domain is electronic businesses that extensively use information and communication technology to coordinate work. The key property of the proposed approach is the reuse of design knowledge in the form of design patterns. Design patterns are extracted from models of existing electronic intermediaries considered successful. These businesses have been reverse-engineered into two types of models: economic value exchange models and business process models. The identified patterns comprise two libraries, of value exchange and business process patterns, respectively. Patterns are catalogued with, among other things, their context, the problem solved, and the proposed solution. Most importantly, they are annotated with a machine-readable capability model used as a search key in the library. Capability models are part of the goal-modelling technique for business requirements proposed here. Our goal-modelling technique operationalizes each business goal with a variable and an evaluation function: the evaluation function determines when a measured variable value satisfies the goal. A goal model represents requirements if goals are assigned evaluation functions but the variable values are unknown. In such a case, the goal model specifies what is desired to happen. If, on the other hand, variable values are known, the goal model documents the capabilities of a pattern. The proposed design framework structures the development process into: (1) available design knowledge in libraries of value and process patterns, (2) business requirements captured in a goal model, and (3) economic value and business process perspectives from which to look at a business system. The design method prescribes steps to transform patterns and requirements into a system specification. These include: (i) identification of relevant patterns based on matching capability and requirements goal models; (ii) synthesis of value and process patterns into value and process models, respectively; and (iii) a consistency-check procedure for value and process models.
The usefulness of the approach is demonstrated in a real-life example, which shows that the framework and method exhibit a predefined set of desired properties.
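The goal-operationalization scheme described above can be sketched minimally in Python. The names (`Goal`, `matches`) and the delivery-time example are illustrative assumptions, not the thesis's notation: a requirement is a goal with an unknown variable value, a capability is the same goal with a measured value, and matching applies the requirement's evaluation function to the capability's value.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Goal:
    # Hypothetical structure: each goal carries a measured variable and an
    # evaluation function that decides when a value satisfies the goal.
    name: str
    evaluate: Callable[[float], bool]   # when does a value satisfy the goal?
    value: Optional[float] = None       # None -> requirement; known -> capability

def matches(requirement: Goal, capability: Goal) -> bool:
    """A pattern's capability satisfies a requirement when the capability's
    known variable value passes the requirement's evaluation function."""
    if capability.value is None:
        return False
    return requirement.evaluate(capability.value)

# Requirement: delivery time must be under 48 hours (value not yet known).
req = Goal("fast_delivery", evaluate=lambda v: v < 48.0)
# Capability documented for a pattern: it achieves 24-hour delivery.
cap = Goal("fast_delivery", evaluate=lambda v: v < 48.0, value=24.0)
print(matches(req, cap))  # True
```

A library search would then reduce to filtering the pattern library for capabilities that match every goal in the requirements model.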
A tracker solution to the cold dark matter cosmic coincidence problem
Recently, we introduced the notion of "tracker fields," a form of quintessence which has an attractor-like solution. Using this concept, we showed how to construct models in which the ratio of quintessence to matter densities today is independent of initial conditions. Here we apply the same idea to the standard cold dark matter component in cases where it is composed of oscillating fields. Combining these ideas, we can construct a model in which quintessence, cold dark matter, and ordinary matter all contribute comparable amounts to the total energy density today, irrespective of initial conditions.
Comment: 8 pages, 2 eps figures, uses epsfig.sty, accepted for publication in Physics Letters
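The abstract does not reproduce the potential itself, but a standard example from the tracker-field literature (stated here as background, not quoted from this abstract) is the inverse power law

```latex
V(\phi) \;=\; M^{4+\alpha}\,\phi^{-\alpha}, \qquad \alpha > 0 ,
```

for which the tracker solution's equation of state follows the dominant background fluid,

```latex
w_\phi \;\approx\; \frac{\alpha\, w_B - 2}{\alpha + 2} ,
```

so that for large \(\alpha\) the field tracks the background (\(w_\phi \to w_B\)), while for small \(\alpha\) it approaches a cosmological-constant-like state (\(w_\phi \to -1\)) regardless of initial conditions.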
Applied business analytics approach to IT projects – Methodological framework
The design and implementation of a big data project differs from that of a typical business intelligence project that might be run concurrently within the same organization. A big data initiative typically triggers a large-scale IT project that is expected to deliver the desired outcomes. The industry has identified two major methodologies for running a data-centric project, in particular SEMMA (Sample, Explore, Modify, Model and Assess) and CRISP-DM (Cross Industry Standard Process for Data Mining). More generally, the professional organizations PMI (Project Management Institute) and IIBA (International Institute of Business Analysis) have defined their methods for project management and business analysis based on the best current industry practices. However, big data projects pose new challenges that are not considered by the existing methodologies. Building an end-to-end big data analytical solution for optimization of the supply chain, pricing and promotion, product launch, shop potential and customer value faces both business and technical challenges. The most common business challenges are unclear and/or poorly defined business cases; irrelevant data; poor data quality; overlooked data granularity; improper contextualization of data; unprepared or badly prepared data; non-meaningful results; and a lack of skills. Some of the technical challenges are related to lack of resources and technology limitations; availability of data sources; storage difficulties; security issues; performance problems; little flexibility; and ineffective DevOps. This paper discusses an applied business analytics approach to IT projects and addresses the above-described aspects. The authors present their work on research and development of a new methodological framework and analytical instruments applicable in both business endeavors and educational initiatives targeting big data. The proposed framework is based on proprietary methodology and advanced analytics tools.
It is focused on the development and the implementation of practical solutions for project managers, business analysts, IT practitioners and Business/Data Analytics students. Also under discussion are the necessary skills and knowledge for the successful big data business analyst, and some of the main organizational and operational aspects of big data projects, including continuous model deployment.
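The SEMMA phases named above can be illustrated as a plain-Python pipeline skeleton. This is a generic sketch with placeholder bodies, not the authors' proprietary framework; every function here is an illustrative assumption.

```python
# Placeholder implementations of the five SEMMA phases, chained in order.
def sample(data):   return data[: max(1, len(data) // 2)]    # draw a working sample
def explore(data):  return {"n": len(data), "mean": sum(data) / len(data)}
def modify(data):   return [x / max(data) for x in data]     # normalize to [0, 1]
def model(data):    return sum(data) / len(data)             # trivial stand-in model
def assess(m):      return 0.0 <= m <= 1.0                   # sanity-check the result

raw = [3.0, 9.0, 6.0, 12.0]
s = sample(raw)              # Sample
stats = explore(s)           # Explore
prepared = modify(s)         # Modify
fitted = model(prepared)     # Model
print(assess(fitted))        # Assess -> True
```

In a real big data project each phase would of course be a substantial workflow; the point of the sketch is only the fixed phase ordering that the methodology prescribes.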
Mathematical treatment of environmental models
Large-scale environmental models can successfully be used in various studies important for modern society, for example in investigating the influence of future climatic changes on pollution levels in different countries. Such models are normally described mathematically by non-linear systems of partial differential equations, which are defined on very large spatial domains and have to be solved numerically on very long time intervals. Moreover, very often many different scenarios also have to be developed and used in the investigations. Therefore, both the storage requirements and the computational work are enormous. These great difficulties can be overcome only if the following four tasks are successfully resolved: (a) fast and sufficiently accurate numerical methods are to be selected, (b) reliable and efficient splitting procedures are to be applied, (c) the cache memories of the available computers are to be efficiently exploited, and (d) the codes are to be parallelized.
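The splitting procedures mentioned in task (b) can be illustrated on a toy problem. The sketch below (an assumption for illustration, not one of the environmental models discussed) applies first-order Lie splitting to a 1D advection-diffusion equation u_t + a u_x = d u_xx: each time step solves an advection sub-step and a diffusion sub-step separately, which is exactly how splitting decouples the physical processes in large models.

```python
import numpy as np

# Toy 1D advection-diffusion on a periodic domain, solved by Lie splitting.
n, a, d = 100, 1.0, 0.01
dx = 1.0 / n
dt = 0.4 * min(dx / a, dx * dx / (2 * d))    # respect both stability limits

x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.exp(-100.0 * (x - 0.3) ** 2)          # Gaussian initial profile

def advect(u):
    # first-order upwind step for u_t + a u_x = 0 (a > 0), periodic boundaries
    return u - a * dt / dx * (u - np.roll(u, 1))

def diffuse(u):
    # explicit central-difference step for u_t = d u_xx, periodic boundaries
    return u + d * dt / dx ** 2 * (np.roll(u, -1) - 2 * u + np.roll(u, 1))

for _ in range(200):
    u = diffuse(advect(u))                   # Lie splitting: advection, then diffusion

print(round(float(u.sum() * dx), 6))         # total mass, conserved by both sub-steps
```

The splitting error is first order in the time step here; in production codes higher-order (e.g. symmetric Strang) splittings, implicit sub-steps, and cache-aware, parallel implementations address tasks (a), (c) and (d).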
An Investigation of the Negotiation Domain for Electronic Commerce Information Systems
To support fully automatic business cycles, information systems for electronic commerce need to be able to conduct negotiation automatically. In recent years, a number of general frameworks for automated negotiation have been proposed. Application of such a framework in a specific negotiation situation entails selecting the proper framework and adapting it to this situation. This selection and adaptation process is driven by the specific characteristics of the situation. This paper presents a systematic investigation of these characteristics and surveys a number of frameworks for automated negotiation.
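One common mechanism that such frameworks accommodate is bilateral alternating offers with time-based concession. The sketch below is a generic illustration of that mechanism under assumed opening and reservation prices; it is not any specific framework surveyed in the paper.

```python
# Generic bilateral negotiation: both parties concede linearly from an
# opening price toward a private reservation price over a fixed deadline.
def concession_offer(start, reserve, step, rounds):
    """Linear concession from an opening price toward a reservation price."""
    return start + (reserve - start) * min(1.0, step / rounds)

def negotiate(buyer_reserve=100.0, seller_reserve=80.0, rounds=10):
    for step in range(rounds + 1):
        buyer_bid = concession_offer(60.0, buyer_reserve, step, rounds)
        seller_ask = concession_offer(120.0, seller_reserve, step, rounds)
        if buyer_bid >= seller_ask:              # zone of agreement reached
            return (buyer_bid + seller_ask) / 2  # split the difference
    return None                                  # no deal within the deadline

print(negotiate())  # agreement near 90.0, midway between the reservations
```

Situation characteristics of the kind the paper investigates (number of parties, number of issues, deadlines, information available about opponents) determine whether such a simple protocol suffices or a richer framework must be selected.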
Ordinary kriging for on-demand average wind interpolation of in-situ wind sensor data
We have developed a domain-agnostic ordinary kriging algorithm accessible via a standards-based service-oriented architecture for sensor networks. We exploit the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) standards. We need on-demand interpolation maps, so runtime performance is a major priority.
Our sensor data comes from wind in-situ observation stations in an area approximately 200 km by 125 km. We provide on-demand average wind interpolation maps. These spatial estimates can then be compared with the results of other estimation models in order to identify spurious results that sometimes occur in wind estimation.
Our processing is based on ordinary kriging with automated variogram model selection (AVMS). This procedure can smooth time-point wind measurements to obtain average wind by using a variogram model that reflects the characteristics of the wind phenomenon. Kriging is enabled for wind direction estimation by a simple but effective solution to the problem of estimating periodic variables, based on vector rotation and stochastic simulation.
In cases where all wind directions in the region of interest span 180 degrees, we rotate them so they lie between 90 and 270 degrees and apply ordinary kriging with AVMS directly to the meteorological angle. Otherwise, we transform the meteorological angle to Cartesian space, apply ordinary kriging with AVMS, and use simulation to transform the kriging estimates back to the meteorological angle.
Tests run on a 50 by 50 grid using standard hardware take about 5 minutes to execute the backward transformation with a sample size of 100,000. This is acceptable for our on-demand processing service requirements.
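The rotation trick for the 180-degree case can be sketched as follows. The function names are illustrative assumptions, and the kriging estimator is replaced by a plain mean as a placeholder; the point is only the transformation that moves all angles away from the 0/360 wrap-around before estimating, then moves the estimate back.

```python
# Rotate wind directions into the 90-270 band so a linear estimator
# (here a placeholder mean; ordinary kriging in the paper) can be applied
# to the meteorological angle directly, then rotate the estimate back.
def rotate_to_midband(angles):
    shift = 0.0
    if max(angles) - min(angles) > 180.0:
        # raw angles straddle the 0/360 wrap; a half-turn makes them contiguous
        shift += 180.0
        angles = [(a + 180.0) % 360.0 for a in angles]
    assert max(angles) - min(angles) <= 180.0, \
        "true span exceeds 180 degrees: use the Cartesian transform instead"
    # centre the band on 180 degrees so every angle lies within (90, 270)
    centre_shift = 180.0 - (max(angles) + min(angles)) / 2.0
    shift += centre_shift
    return [(a + centre_shift) % 360.0 for a in angles], shift

def unrotate(angle, shift):
    return (angle - shift) % 360.0

# Directions clustered around north, straddling the 0/360 wrap-around.
obs = [350.0, 355.0, 5.0, 10.0]
rotated, shift = rotate_to_midband(obs)
estimate = sum(rotated) / len(rotated)   # stand-in for the kriging estimate
print(unrotate(estimate, shift))         # 0.0, i.e. due north
```

A naive mean of the raw observations would give 180 degrees (due south), which is exactly the wrap-around failure the rotation avoids; when directions genuinely span more than 180 degrees, the Cartesian (sine/cosine component) route described in the abstract is required instead.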
Multi-perspective requirements engineering for networked business systems: a framework for pattern composition
How business and software analysts explore, document, and negotiate requirements for enterprise systems is critical to the benefits their organizations will eventually derive. In this paper, we present a framework for the analysis and redesign of networked business systems. It is based on libraries of patterns which are derived from existing Internet businesses. The framework includes three perspectives: economic value, business processes, and application communication, each of which applies a goal-oriented method to compose patterns. By means of consistency relationships between perspectives, we demonstrate the usefulness of the patterns as a lightweight approach to the exploration of business ideas.
Photon emission in a constant magnetic field in 2+1 dimensional space-time
We calculate by the proper-time method the amplitude of two-photon emission by a charged fermion in a constant magnetic field in (2+1)-dimensional space-time. The relevant dynamics reduces to that of a supersymmetric quantum-mechanical system with one bosonic and one fermionic degree of freedom.
Comment: 18 pages. v2: references added, some significant changes in the introduction
The co-evolution of human intersubjectivity, morality and language
The chapter argues that language, which rests on the sharing of linguistic norms, honest information, and moral norms, evolved through a co-evolutionary process with a pivotal role for intersubjectivity. Mainstream evolutionary models, based only on individual-level and gene-level selection, are argued to be incapable of accounting for such sharing of care, values and information, thus implying the need to invoke multi-level selection, including (cultural) group selection. Four of the most influential current theories of the evolution of human-scale sociality, those of Dunbar, Deacon, Tomasello and Hrdy, are compared and evaluated on the basis of their answers to five questions: (1) Why us and not others? (2) How: by what mechanisms? (3) When? (4) In what kind of social settings? (5) What are the implications for ontogeny? The conclusions are that the theories are to a large degree complementary, and that they all assume, explicitly or not, a role for group selection. Hrdy's theory, focusing on the evolution of alloparenting, is argued to provide the best explanation for the onset of the evolution of human intersubjectivity, and can furthermore offer a Darwinian framework for Tomasello's theory of shared intentionality. Deacon's theory deals rather with the evolution of morality and its co-evolution with "symbolic reference", but these are necessarily antecedent to the primary evolution of human intersubjectivity. Dunbar's theory on the transition from "musical" vocal grooming to vocal "gossip" can be seen as providing a partial explanation for the evolution of spoken language, most likely with Homo heidelbergensis 0.5 MYA, but presupposes the capacities accounted for by the other models.