Towards Analytics Aware Ontology Based Access to Static and Streaming Data (Extended Version)
Real-time analytics that requires the integration and aggregation of
heterogeneous, distributed streaming and static data is a typical task in
many industrial scenarios, such as the diagnostics of turbines at Siemens.
The OBDA approach has great potential to facilitate such tasks; however, it
has a number of limitations in dealing with analytics that restrict its use
in important industrial applications. Based on our experience with Siemens,
we argue that in order to overcome these limitations OBDA should be extended
to become analytics, source, and cost aware. In this work we propose such an
extension. In particular, we propose an ontology, mapping, and query language
for OBDA in which aggregate and other analytical functions are first-class
citizens. Moreover, we develop query optimisation techniques that allow
analytical tasks over static and streaming data to be processed efficiently.
We implement our approach in a system and evaluate it on Siemens turbine
data.
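As a minimal illustration of the kind of analytical task involved, the sketch below combines a streaming aggregate (a sliding-window mean over sensor readings) with a static reference value; all names and data are hypothetical, and this is not the paper's query language, only the style of computation it must support.

```python
from collections import deque

class WindowAverage:
    """Sliding-window average over a numeric stream: a toy stand-in for
    an aggregate (e.g. mean turbine temperature over the last N readings)
    that an analytics-aware OBDA system would push to the stream source."""

    def __init__(self, size):
        self.size = size
        self.window = deque()
        self.total = 0.0

    def push(self, value):
        """Add a reading and return the current windowed mean."""
        self.window.append(value)
        self.total += value
        if len(self.window) > self.size:
            self.total -= self.window.popleft()
        return self.total / len(self.window)

# Static reference value (hypothetical rated temperature from static data).
RATED = 100.0

stream = [98.0, 101.0, 104.0, 107.0]
agg = WindowAverage(size=3)
# Flag readings whose 3-sample windowed mean exceeds the rated value.
alerts = [v for v in stream if agg.push(v) > RATED]
```

The point of the sketch is the join of streaming aggregation with static reference data; in the paper's setting this combination is expressed declaratively and optimised, not coded by hand.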
Minimizing water and energy consumptions in water and heat exchange networks.
This study presents a mathematical programming formulation for the design of water and heat exchanger networks based on a two-step methodology. First, an MILP (mixed-integer linear programming) procedure is used to solve the water and energy allocation problem with respect to several objectives. The first step of the design method involves four criteria, i.e. fresh water consumption (F1), energy consumption (F2), number of interconnections (F3) and number of heat exchangers (F4). The multiobjective optimization Min [F1, F2] is solved by the so-called ɛ-constraint method and leads to several Pareto fronts for fixed numbers of connections and heat exchangers. The second step consists of improving the best results of the first phase through energy integration into the water network. This stage is solved by an MINLP procedure in order to minimize an objective cost function. Two examples reported in the dedicated literature serve as test-bench cases for the proposed two-step approach. The results show that the simultaneous consideration of the abovementioned objectives is more realistic than minimizing fresh water consumption alone. Indeed, the optimal network does not necessarily correspond to the structure that reaches the fresh water target. For a real paper mill plant, energy consumption decreases by almost 20% compared with previous studies.
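The ɛ-constraint method mentioned above can be sketched on a toy bi-objective problem: minimise F1 subject to F2 ≤ ɛ for a sweep of ɛ values, collecting the optima, which trace out a Pareto front. The candidate designs below are entirely invented for illustration (the actual study solves MILP/MINLP models, not a finite enumeration).

```python
# Hypothetical candidate designs: (fresh_water_use, energy_use) = (F1, F2).
designs = [(10, 50), (12, 40), (15, 30), (20, 25), (25, 24), (30, 23)]

def eps_constraint(points, eps_values):
    """Minimise F1 subject to F2 <= eps for each eps in the sweep;
    the distinct optima approximate the Pareto front of Min [F1, F2]."""
    front = []
    for eps in eps_values:
        feasible = [p for p in points if p[1] <= eps]
        if feasible:
            best = min(feasible, key=lambda p: p[0])
            if best not in front:
                front.append(best)
    return front

pareto = eps_constraint(designs, eps_values=[50, 40, 30, 25])
```

In the study proper, each ɛ-constrained subproblem is itself an MILP solved by an optimiser; the brute-force enumeration here only shows the structure of the sweep.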
Performance modelling and the representation of large scale distributed system functions
This thesis presents a resource-based approach to model generation for performance characterization and correctness checking of large-scale telecommunications networks. A notion called the timed automaton is proposed and then developed to encapsulate the behaviours of networking equipment, system control policies and non-deterministic user behaviours. The states of pooled network resources and the behaviours of resource consumers are represented as continually varying geometric patterns; these patterns form part of the data operated upon by the timed automata. Such a representation technique allows great flexibility in the level of abstraction that can be chosen when modelling telecommunications systems. Nonetheless, the notion of system functions is proposed to serve as a constraining framework for specifying bounded behaviours and features of telecommunications systems. Operational concepts are developed for the timed automata; these concepts are based on limit-preserving relations. Relations over system states represent the evolution of system properties observable at various locations within the network under study. The declarative nature of such permutative state relations provides a direct framework for generating highly expressive models suitable for carrying out optimization experiments. The usefulness of the developed procedure is demonstrated on a large-scale case study, in particular the problem of congestion avoidance in networks; it is shown that there can be global coupling among local behaviours within a telecommunications network. The uncovering of such a phenomenon through a function-oriented simulation is a contribution to the area of network modelling. The direct and faithful way of deriving performance metrics for loss in networks from resource utilization patterns is also a new contribution to the area.
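In broad outline, a timed automaton pairs discrete states with a real-valued clock, and transitions fire only when a clock guard holds. The sketch below is a generic minimal version of that idea, with an invented telephony example; the thesis's automata additionally operate on geometric resource patterns, which are not modelled here.

```python
class TimedAutomaton:
    """Minimal timed-automaton sketch: each transition carries a clock
    guard (lo <= clock <= hi) and may reset the clock on firing."""

    def __init__(self, initial):
        self.state = initial
        self.clock = 0.0
        # (state, event) -> (guard_lo, guard_hi, reset_clock, next_state)
        self.transitions = {}

    def add(self, state, event, lo, hi, reset, nxt):
        self.transitions[(state, event)] = (lo, hi, reset, nxt)

    def elapse(self, dt):
        """Let time pass; the clock advances uniformly."""
        self.clock += dt

    def fire(self, event):
        """Attempt a transition; fails if undefined or guard violated."""
        key = (self.state, event)
        if key not in self.transitions:
            return False
        lo, hi, reset, nxt = self.transitions[key]
        if not (lo <= self.clock <= hi):
            return False
        self.state = nxt
        if reset:
            self.clock = 0.0
        return True

# Hypothetical policy: a call must be released within 30 time units.
ta = TimedAutomaton("idle")
ta.add("idle", "dial", 0, float("inf"), True, "connected")
ta.add("connected", "hangup", 0, 30, True, "idle")

ta.fire("dial")
ta.elapse(40)
ok = ta.fire("hangup")  # guard 0 <= 40 <= 30 fails; call stays connected
```

The guard failure above is the kind of bounded-behaviour constraint that the thesis's system functions are meant to express at much larger scale.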
The Environment as an Argument
Context-awareness as defined in the setting of Ubiquitous Computing [3] is all about expressing the dependency of a specific computation upon some implicit piece of information. The manipulation and expression of such dependencies may thus be neatly encapsulated in a language where computations are first-class values. Perhaps surprisingly, however, context-aware programming has not been explored in a functional setting, where first-class computations and higher-order functions are commonplace. In this paper we present an embedded domain-specific language (EDSL) for constructing context-aware applications in the functional programming language Haskell. © 2012 Springer-Verlag
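The core idea, computations as first-class values that implicitly depend on an environment, can be sketched outside Haskell as well. Below is a reader-style encoding in Python, where a context-aware computation is simply a function from the context to a result; the combinator names (`pure`, `ask`, `bind`) follow functional-programming convention and are not the paper's actual EDSL.

```python
# A context-aware computation is a function: context -> result.

def pure(x):
    """A computation that ignores the context and returns x."""
    return lambda ctx: x

def ask(key):
    """A computation that reads one piece of the ambient context."""
    return lambda ctx: ctx[key]

def bind(comp, f):
    """Sequence two computations, threading the same context through both."""
    return lambda ctx: f(comp(ctx))(ctx)

# A computation whose result depends on the implicit location context.
greeting = bind(ask("location"),
                lambda loc: pure(f"Hello from {loc}"))

# The environment is supplied once, as an argument, at the very end.
result = greeting({"location": "Lancaster"})
```

Because `greeting` is an ordinary value until it is applied to a context, context dependencies compose with higher-order functions for free, which is precisely the leverage the paper seeks in a functional setting.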
The VEX-93 environment as a hybrid tool for developing knowledge systems with different problem solving techniques
The paper describes VEX-93, a hybrid environment for developing
knowledge-based and problem-solver systems. It integrates methods and
techniques from artificial intelligence, image and signal processing and
data analysis, which can be freely combined. Reasoning is organised in two
hierarchical levels: an intelligent toolbox with an upper strategic
inference engine and four lower engines implementing specific reasoning
models: truth-functional (rule-based), probabilistic (causal networks),
fuzzy (rule-based) and case-based (frames). Image/signal processing and
analysis capabilities are provided in the form of programming languages
with more than one hundred primitive functions.
User-made programs can be embedded within knowledge bases, allowing the
combination of perception and reasoning. The data-analyser toolbox contains
a collection of numerical classification, pattern recognition and ordination
methods, together with neural network tools and a database query language at
the inference engines' disposal.
VEX-93 is an open system able to communicate with external computer programs
relevant to a particular application. Metaknowledge can be used to
elaborate conclusions, and man-machine interaction includes, besides windows
and graphical interfaces, acceptance of voice commands and production of
speech output.
The system was conceived for real-world applications in general domains; as
an example, a concrete medical diagnostic support system, at present under
completion as a Cuban-Spanish project, is mentioned.
The present version of VEX-93 is a large system comprising about one and a
half million lines of C code and runs on microcomputers under Windows 3.1.
Blending the physical and the digital through conceptual spaces
The rise of the Internet facilitates an ever-increasing growth of virtual, i.e. digital, spaces which co-exist with the physical environment, i.e. the physical space. This raises the question of how physical and digital spaces can interact synchronously. While sensors provide a means to continuously observe the physical space, several issues arise in mapping sensor data streams to digital spaces, for instance structured linked data formally represented through symbolic Semantic Web (SW) standards such as OWL or RDF. The challenge is to bridge between symbolic knowledge representations and the measured data collected by sensors. In particular, one needs to map a given set of arbitrary sensor data to a particular set of symbolic knowledge representations, e.g. ontology instances. This task is particularly challenging due to the vast variety of possible sensor measurements. Conceptual Spaces (CS) provide a means to represent knowledge in geometrical vector spaces, enabling the computation of similarities between knowledge entities by means of distance metrics. We propose an approach which refines symbolic concepts as CS and grounds ontology instances to so-called prototypical members, which are vectors in the CS. By computing similarities, in terms of spatial distances, between a given set of sensor measurements and a finite set of CS members, the most similar instance can be identified. In this way, we provide a means to bridge between the physical space, as observed by sensors, and the digital space made up of symbolic representations.
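The grounding step can be illustrated very simply: represent each concept by a prototype vector in the conceptual space and map a raw sensor measurement to the concept whose prototype is nearest. The concepts, quality dimensions and values below are invented for illustration, and plain Euclidean distance stands in for whatever metric the approach actually uses.

```python
import math

# Prototypical members of two hypothetical concepts in a 2-D conceptual
# space with quality dimensions (temperature degC, relative humidity %).
prototypes = {
    "SummerDay": (28.0, 40.0),
    "WinterDay": (2.0, 80.0),
}

def nearest_concept(measurement, protos):
    """Ground a raw sensor measurement to the symbolic concept whose
    prototype vector lies closest in Euclidean distance."""
    return min(protos, key=lambda c: math.dist(measurement, protos[c]))

# A sensor reading of 25 degC at 50% humidity grounds to "SummerDay".
label = nearest_concept((25.0, 50.0), prototypes)
```

The geometric representation is what makes "most similar instance" computable at all: symbolic OWL/RDF individuals have no native distance metric, whereas vectors in the CS do.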
The hydrosalinity module of ACRU agrohydrological modelling system (ACRUsalinity) : module development and evaluation.
Thesis (M.Sc.)-University of Natal, Pietermaritzburg, 2003. Water is characterised by both its quantity (availability) and its quality. Salinity, which is one of the major water quality parameters limiting use of a wide range of land and water resources, refers to the total dissolved solutes in water. It is influenced by a combination of several soil-water-salt-plant related processes. In order to develop optimum management schemes for environmental control through relevant hydrological modelling techniques, it is important to identify and understand these processes affecting salinity. Therefore, the various sources and processes controlling salt release and transport from the soil surface through the root zone to groundwater and streams as well as reservoirs are extensively reviewed in this project, with subsequent exploration of some hydrosalinity modelling approaches. The simulation of large and complex hydrological systems, such as those at a catchment scale, requires a flexible and efficient modelling tool to assist in the assessment of the impact of land and water use alternatives on the salt balance. The currently available catchment models offer varying degrees of suitability with respect to modelling hydrological problems, depending on the model structure and the type of approach used. The ACRU agrohydrological modelling system, with its physically-conceptually based characteristics, and being a multi-purpose model able to operate both as a lumped and as a distributed model, was found to be suitable for hydrosalinity modelling at a catchment scale through the incorporation of an appropriate hydrosalinity module. The main aim of this project was to develop, validate and verify a hydrosalinity module for the ACRU model. This module is developed in the object-oriented version of ACRU, viz. ACRU2000, and it inherits the basic structure and objects of the model.
The module involves the interaction of the hydrological processes represented in ACRU and salinity-related processes. Hence, it is designated ACRUSalinity. In general, the module is developed through extensive review of ACRU and hydrosalinity models, followed by conceptualisation and design of the objects in the module. It is then written in the Java object-oriented programming language. The development of ACRUSalinity is based mainly on the interaction between three kinds of objects, viz. Components, Data and Processes. Component objects in ACRU2000 represent the physical features in the hydrological system being modelled. Data objects are mainly used to store data or information. The Process objects describe processes that can take place in a conceptual or real-world hydrological system. The Process objects in ACRUSalinity are grouped into six packages that conduct:
• the initial salt load determination in subsurface components and a reservoir
• determination of wet atmospheric deposition and salt input from irrigation water
• subsurface salt balance, salt generation and salt movement
• surface flow salt balance and salt movement
• reservoir salt budgeting and salt routing and
• channel-reach salt balancing and, in the case of distributed hydrosalinity modelling, salt transfer between sub-catchments.
The second aim of the project was the validation and verification of the module. Code validation was undertaken through mass balance computations, while verification of the module was through comparison of simulated streamflow salinity against observed values as recorded at gauging weir U1H005, which drains the Upper Mkomazi Catchment in KwaZulu-Natal, South Africa. Results from a graphical and statistical analysis of observed and simulated values have shown that the simulated streamflow salinity values mimic the observed values remarkably well. As part of the module development and validation, a sensitivity analysis of the major input parameters of ACRUSalinity was also conducted.
This is then followed by a case study that demonstrates some potential applications of the module. In general, results from the module evaluation have indicated that ACRUSalinity can be used to provide a reasonable first-order approximation in various hydrosalinity studies. Most of the major sources and controlling factors of salinity are accommodated in the ACRUSalinity module developed in this project. However, for more accurate and better performance of the module in diverse catchments, further research needs to be conducted to account for the impact of salt loading from certain sources and to derive the values of some input parameters to the new module. The research needs include incorporating into the module the impact of salt loading from fertilizer applications as well as from urban and industrial effluents. Similarly, further research needs to be undertaken to enable the module to conduct salt routing at a sub-daily time step and to account for the impact of bypass flows in heavy soils on the surface and subsurface salt balances.
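The reservoir salt budgeting mentioned among the six packages can be illustrated with a one-step, fully-mixed mass balance: salt enters with the inflow at the inflow concentration and leaves at the reservoir's updated mixed concentration. This is an illustrative simplification of the kind of budgeting ACRUSalinity performs, not its actual equations, and all numbers below are hypothetical.

```python
def reservoir_salt_step(volume, salt_mass, inflow, inflow_conc, outflow):
    """One daily step of a fully-mixed reservoir salt budget.
    Units: volumes in m^3, salt mass in kg, concentrations in kg/m^3.
    Returns (new_volume, new_salt_mass, new_concentration)."""
    salt_mass += inflow * inflow_conc     # salt load carried in
    volume += inflow
    conc = salt_mass / volume             # fully-mixed concentration
    salt_mass -= outflow * conc           # salt load carried out
    volume -= outflow
    return volume, salt_mass, salt_mass / volume

# Hypothetical day: 1000 m^3 at 0.2 kg/m^3 receives 100 m^3 of saltier
# inflow (0.5 kg/m^3) and releases 100 m^3.
vol, mass, conc = reservoir_salt_step(
    volume=1000.0, salt_mass=200.0,
    inflow=100.0, inflow_conc=0.5,
    outflow=100.0)
```

Because outflow leaves at the mixed concentration, the reservoir retains some of the incoming salt load; code validation by mass balance, as described in the thesis, would check that salt in minus salt out equals the change in stored salt at every step.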