A systematic literature review on the use of artificial intelligence in energy self-management in smart buildings
Buildings are among the main consumers of energy in cities, which is why a great deal of research has addressed this problem. In particular, building energy management systems must improve in the coming years, and artificial intelligence techniques are playing, and will continue to play, a fundamental role in these improvements. This work presents a systematic review of the literature on recent research aimed at improving energy management systems for smart buildings using artificial intelligence techniques. An original aspect of the work is that the studies are grouped according to the concept of "Autonomous Cycles of Data Analysis Tasks", which holds that an autonomous management system requires specialized tasks, such as monitoring, analysis, and decision-making, to reach objectives in its environment, such as improving energy efficiency. This organization allows us to establish not only the positioning of the studies but also a view of the current challenges and opportunities in each domain. We have identified that much of the research falls in the domain of decision-making (a large majority on optimization and control tasks), and we define potential projects related to the development of autonomous cycles of data analysis tasks, feature engineering, or multi-agent systems, among others. European Commission
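The "autonomous cycle" the abstract describes (monitoring, analysis, and decision-making tasks cooperating toward an objective such as energy efficiency) can be sketched in a few lines. This is a minimal illustration only: the task names, sliding-window size, and consumption threshold below are assumptions for the example, not details of any reviewed system.

```python
def monitor(readings):
    """Monitoring task: keep a sliding window of recent consumption (kWh)."""
    return readings[-3:]

def analyse(window):
    """Analysis task: summarise the window into a simple indicator."""
    return sum(window) / len(window)

def decide(avg_kwh, threshold=5.0):
    """Decision-making task: choose a control action against a target."""
    return "reduce_hvac_load" if avg_kwh > threshold else "hold"

# One pass of the cycle: monitoring feeds analysis, analysis feeds a decision.
readings = [4.2, 4.8, 6.1, 6.5, 6.9]
action = decide(analyse(monitor(readings)))
print(action)
```

In a real building energy management system each task would typically be a separate component (or agent) running continuously, with the decision actuating the environment and the cycle repeating.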
Architecture value mapping: using fuzzy cognitive maps as a reasoning mechanism for multi-criteria conceptual design evaluation
The conceptual design phase is the most critical phase in the systems engineering life cycle. The design concept chosen during this phase determines the structure and behavior of the system and, consequently, its ability to fulfill its intended function. A good conceptual design is the first step in the development of a successful artifact. However, decision-making during conceptual design is inherently challenging and often unreliable. The conceptual design phase is marked by an ambiguous and imprecise set of requirements and ill-defined system boundaries. A lack of usable data for design evaluation makes the problem worse. In order to assess a system accurately, it is necessary to capture the relationships between its physical attributes and the stakeholders' value objectives. This research presents a novel conceptual architecture evaluation approach that utilizes attribute-value networks, designated 'Architecture Value Maps', to replicate the decision makers' cognitive processes. Ambiguity in the system's overall objectives is reduced hierarchically to reveal a network of criteria that range from abstract value measures to design-specific performance measures. A symbolic representation scheme, the 2-Tuple Linguistic Representation, is used to integrate different types of information into a common computational format, and Fuzzy Cognitive Maps are utilized as the reasoning engine to quantitatively evaluate potential design concepts. A Linguistic Ordered Weighted Average aggregation operator is used to rank the final alternatives based on the decision makers' risk preferences. The proposed methodology provides systems architects with the capability to exploit the interrelationships between a system's design attributes and the value that stakeholders associate with these attributes, in order to design robust, flexible, and affordable systems. --Abstract, page iii
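Since Fuzzy Cognitive Maps serve as the reasoning engine in this approach, a minimal FCM inference sketch may help. The weight matrix, sigmoid transfer function, and three-concept chain (design attribute → performance measure → stakeholder value) below are illustrative assumptions, not the thesis's actual model:

```python
import numpy as np

def sigmoid(x):
    # Squashing function keeps concept activations in (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def fcm_infer(weights, state, steps=50, eps=1e-4):
    """Iterate a Fuzzy Cognitive Map until activations stabilise.

    weights[i, j] is the causal influence of concept i on concept j,
    in [-1, 1]; `state` holds the initial concept activations."""
    for _ in range(steps):
        # Self-memory variant of the FCM update rule.
        new_state = sigmoid(state @ weights + state)
        if np.max(np.abs(new_state - state)) < eps:
            return new_state
        state = new_state
    return state

# Illustrative chain: design attribute -> performance measure -> stakeholder value.
W = np.array([[0.0, 0.7, 0.0],
              [0.0, 0.0, 0.8],
              [0.0, 0.0, 0.0]])
print(fcm_infer(W, np.array([0.9, 0.1, 0.1])))
```

After convergence, the activation of the value-level concept quantifies how strongly the chosen design attribute propagates into stakeholder value; in the thesis's method these activations would then feed the Linguistic Ordered Weighted Average ranking step.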
Design and quality criteria for archetype analysis
A key challenge in addressing the global degradation of natural resources and the environment is to effectively transfer successful strategies across heterogeneous contexts. Archetype analysis is a particularly salient approach in this regard that helps researchers to understand and compare patterns of (un)sustainability in heterogeneous cases. Archetype analysis avoids the traps of overgeneralization and ideography by identifying reappearing but nonuniversal patterns that hold for well-defined subsets of cases. It can be applied by researchers working in inter- or transdisciplinary settings to study sustainability issues from a broad range of theoretical and methodological standpoints. However, there is still an urgent need for quality standards to guide the design of theoretically rigorous and practically useful archetype analyses. To this end, we propose four quality criteria and corresponding research strategies to address them: (1) specify the domain of validity for each archetype, (2) ensure that archetypes can be combined to characterize single cases, (3) explicitly navigate levels of abstraction, and (4) obtain a fit between attribute configurations, theories, and empirical domains of validity. These criteria are based on a stocktaking of current methodological challenges in archetypes research, including the need to demonstrate the validity of the analysis, delineate the boundaries of archetypes, and select appropriate attributes to define them. We thus contribute to a better common understanding of the approach and to the improvement of the research design of future archetype analyses.
Context Aware Computing for The Internet of Things: A Survey
As we are moving towards the Internet of Things (IoT), the number of sensors
deployed around the world is growing at a rapid pace. Market research has shown
a significant growth in sensor deployments over the past decade and has
predicted that this growth rate will increase further in the future. These
sensors continuously generate enormous amounts of data. However, in order to
add value to raw sensor data, we need to understand it. The collection, modelling,
reasoning, and distribution of context in relation to sensor data play a
critical role in this challenge. Context-aware computing has proven to be
successful in understanding sensor data. In this paper, we survey context
awareness from an IoT perspective. We present the necessary background by
introducing the IoT paradigm and context-aware fundamentals at the beginning.
Then we provide an in-depth analysis of the context life cycle. We evaluate a
subset of projects (50) which represent the majority of research and commercial
solutions proposed in the field of context-aware computing conducted over the
last decade (2001-2011) based on our own taxonomy. Finally, based on our
evaluation, we highlight the lessons to be learnt from the past and some
possible directions for future research. The survey addresses a broad range of
techniques, methods, models, functionalities, systems, applications, and
middleware solutions related to context awareness and IoT. Our goal is not only
to analyse, compare and consolidate past research work but also to appreciate
their findings and discuss their applicability towards the IoT.
Comment: IEEE Communications Surveys & Tutorials Journal, 201
Hybrid approaches based on computational intelligence and semantic web for distributed situation and context awareness
2011 - 2012
The research work focuses on Situation Awareness and Context Awareness topics.
Specifically, Situation Awareness involves being aware of what is happening in the vicinity
to understand how information, events, and one’s own actions will impact goals and objectives,
both immediately and in the near future. Thus, Situation Awareness is especially
important in application domains where the information flow can be quite high and poor
decision-making may lead to serious consequences.
On the other hand, Context Awareness is considered a process that supports user applications
to adapt interfaces, tailor the set of application-relevant data, increase the precision of
information retrieval, discover services, make the user interaction implicit, or build smart
environments.
Despite being slightly different, Situation and Context Awareness involve common
problems, such as: the lack of support for the acquisition and aggregation of dynamic
environmental information from the field (i.e. sensors, cameras, etc.); the lack of formal
approaches to knowledge representation (i.e. contexts, concepts, relations, situations, etc.)
and processing (reasoning, classification, retrieval, discovery, etc.); and the lack of automated
and distributed systems, with considerable computing power, to support reasoning over the
huge quantity of knowledge extracted from sensor data.
Thus, the thesis investigates new approaches for distributed Context and Situation
Awareness and proposes to apply them to achieve related research objectives such
as knowledge representation, semantic reasoning, pattern recognition, and information retrieval.
The research work starts from the study and analysis of state of art in terms of
techniques, technologies, tools and systems to support Context/Situation Awareness. The
main aim is to develop a new contribution in this field by integrating techniques deriving
from the fields of the Semantic Web, Soft Computing, and Computational Intelligence. From
an architectural point of view, several frameworks are defined according to the
multi-agent paradigm.
Furthermore, preliminary experimental results have been obtained in application
domains such as Airport Security, Traffic Management, Smart Grids, and Healthcare.
Finally, future work will proceed in the following directions: Semantic Modeling of
Fuzzy Control, Temporal Issues, Automatic Ontology Elicitation, Extension to Other
Application Domains, and More Experiments. [edited by author]
Computing resources sensitive parallelization of neural networks for large scale diabetes data modelling, diagnosis and prediction
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Diabetes has become one of the most severe diseases due to an increasing number of diabetes patients globally. A large amount of digital data on diabetes has been collected through various channels. How to utilize these data sets to help doctors make decisions on the diagnosis, treatment and prediction of diabetic patients poses many challenges to the research community. The thesis investigates mathematical models, with a focus on neural networks, for large scale diabetes data modelling and analysis by utilizing modern computing technologies such as grid computing and cloud computing. These computing technologies provide users with an inexpensive way to access extensive computing resources over the Internet for solving data- and computationally-intensive problems. This thesis evaluates the performance of seven representative machine learning techniques in the classification of diabetes data, and the results show that the neural network produces the best classification accuracy but incurs a high overhead in data training. As a result, the thesis develops MRNN, a parallel neural network model based on the MapReduce programming model, which has become an enabling technology in support of data-intensive applications in the clouds.
By partitioning the diabetic data set into a number of equally sized data blocks, the workload in training is distributed among a number of computing nodes for speedup in data training. MRNN is first evaluated in small scale experimental environments using 12 mappers and subsequently is evaluated in large scale simulated environments using up to 1000 mappers. Both the experimental and simulation results have shown the effectiveness of MRNN in classification, and its high scalability in data training.
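The data-parallel training scheme described above (equal-sized blocks, per-mapper computation, aggregation in a reduce step) can be approximated in a few lines. The sketch below is an assumption-laden stand-in, not MRNN itself: it uses a linear model and averages per-block gradients, since the neural network internals are not given in this abstract.

```python
import numpy as np

def mapper_gradient(w, X_block, y_block):
    """Map task: squared-error gradient computed on one data block."""
    err = X_block @ w - y_block
    return X_block.T @ err / len(y_block)

def reduce_gradients(grads):
    """Reduce task: average the per-block (per-mapper) gradients."""
    return np.mean(grads, axis=0)

# Synthetic stand-in data set, partitioned into 12 equal blocks ("mappers").
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
blocks = np.array_split(np.arange(120), 12)

w = np.zeros(3)
for _ in range(200):  # gradient descent over the aggregated map outputs
    grads = [mapper_gradient(w, X[b], y[b]) for b in blocks]
    w -= 0.1 * reduce_gradients(grads)
print(np.round(w, 2))
```

Because the blocks are equal-sized and the gradients are averaged, the result matches full-batch training while each map task touches only its own partition, which is the source of the speedup the thesis reports.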
MapReduce does not have a sophisticated job scheduling scheme for heterogeneous computing environments in which the computing nodes may have varied computing capabilities. For this purpose, this thesis develops a load balancing scheme based on genetic algorithms with the aim of balancing the training workload among heterogeneous computing nodes. The nodes with more computing capacity receive more MapReduce jobs for execution. Divisible load theory is employed to guide the evolutionary process of the genetic algorithm with the aim of achieving fast convergence. The proposed load balancing scheme is evaluated in large scale simulated MapReduce environments with varied levels of heterogeneity using different sizes of data sets. All the results show that the genetic algorithm based load balancing scheme significantly reduces the makespan of job execution in comparison with the time consumed without load balancing. This work is funded by the EPSRC and the China Market Association.
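The genetic-algorithm load balancer can be illustrated with a toy version: each chromosome encodes the fraction of the training workload assigned to each node, and fitness is the makespan, i.e. the finishing time of the slowest node. The capacities, operators, and parameters below are illustrative assumptions, not the thesis's actual scheme.

```python
import random

def makespan(shares, capacities, total_work=1.0):
    """Completion time of the slowest node: its work share / its capacity."""
    return max(total_work * s / c for s, c in zip(shares, capacities))

def normalise(v):
    s = sum(v)
    return [x / s for x in v]

def ga_balance(capacities, pop_size=30, gens=200, seed=1):
    """Toy GA: evolve workload shares so the makespan is minimised."""
    rng = random.Random(seed)
    pop = [normalise([rng.random() for _ in capacities]) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: makespan(ind, capacities))
        parents = pop[: pop_size // 2]  # truncation selection with elitism
        children = []
        for p in parents:
            # Gaussian mutation, then renormalise so shares sum to 1.
            child = [max(1e-6, g + rng.gauss(0, 0.05)) for g in p]
            children.append(normalise(child))
        pop = parents + children
    return min(pop, key=lambda ind: makespan(ind, capacities))

caps = [4.0, 2.0, 1.0]  # illustrative relative computing capacities
best = ga_balance(caps)
# Divisible load theory predicts shares proportional to capacity (~[0.57, 0.29, 0.14]),
# which is how it can guide (and speed up) the GA's convergence in the thesis.
print([round(s, 2) for s in best])
```

The faster nodes end up with proportionally larger shares, mirroring the thesis's claim that nodes with more computing capacity should receive more MapReduce jobs.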