A Review on Energy Consumption Optimization Techniques in IoT Based Smart Building Environments
In recent years, owing to the unnecessary waste of electrical energy in
residential buildings, energy optimization and user comfort have gained
vital importance. In the literature, various techniques have been proposed
to address the energy optimization problem. The goal of each technique is
to maintain a balance between user comfort and energy requirements, so
that the user can achieve the desired comfort level with the minimum
amount of energy consumption. Researchers have addressed the issue with
different optimization algorithms and variations in the parameters to
reduce energy consumption. To the best of our knowledge, this problem has
not yet been fully solved, owing to its challenging nature. The gap in the
literature stems from advances in technology, the drawbacks of existing
optimization algorithms, and the continual introduction of new ones.
Further, many newly proposed optimization algorithms have produced better
accuracy on benchmark instances but have not yet been applied to the
optimization of energy consumption in smart homes. In this paper, we carry
out a detailed literature review of the techniques used for the
optimization and scheduling of energy consumption in smart homes. We
discuss in detail the factors contributing to thermal comfort, visual
comfort, and air-quality comfort. We also review the fog and edge
computing techniques used in smart homes.
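The comfort-versus-energy balance that the surveyed techniques optimize can be sketched as a weighted objective. The comfort and energy models and all constants below are hypothetical illustrations, not taken from any reviewed paper, and an exhaustive grid search stands in for the metaheuristics the literature applies:

```python
# Hypothetical comfort/energy trade-off; every model and constant here
# is an illustrative assumption, not from any surveyed paper.

def comfort(temp_c, setpoint=22.0, tolerance=3.0):
    """Thermal comfort in [0, 1]: 1 at the setpoint, falling off linearly."""
    return max(0.0, 1.0 - abs(temp_c - setpoint) / tolerance)

def energy(temp_c, outdoor_c=10.0, kwh_per_deg=0.5):
    """Energy (kWh) needed to hold an indoor temperature against outdoors."""
    return abs(temp_c - outdoor_c) * kwh_per_deg

def objective(temp_c, alpha=0.7):
    """Weighted balance: reward comfort, penalize (scaled) energy use."""
    return alpha * comfort(temp_c) - (1 - alpha) * energy(temp_c) / 10.0

# Exhaustive search over candidate setpoints (15.0 .. 27.9 degC), standing
# in for the metaheuristics (GA, PSO, ...) applied in the literature.
best = max((t / 10.0 for t in range(150, 280)), key=objective)
```

With these made-up weights, the search settles on the comfort setpoint itself; shifting `alpha` toward 0 trades comfort away for lower consumption, which is exactly the tension the reviewed algorithms explore.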
A hybrid neuro-wavelet predictor for QoS control and stability
For distributed systems to react properly to peaks of requests, their
adaptation activities would benefit from an estimate of the number of
incoming requests. This paper proposes a solution for producing a
short-term forecast based on data characterising user behaviour of online
services. We use wavelet analysis, which provides compression and
denoising of the observed time series of past user requests, and a
recurrent neural network, trained on the observed data and designed to
provide well-timed estimates of future requests. This ensemble is able to
predict the number of future user requests with a root mean squared error
below 0.06%. Thanks to the prediction, resources can be provisioned in
advance for the duration of a request peak, and in just the right amount,
avoiding over-provisioning and its associated costs. Moreover, reliable
provisioning lets users enjoy a level of service availability that is
unaffected by load variations.
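The wavelet stage of such a pipeline can be illustrated with a one-level Haar transform plus soft thresholding. This is only a sketch of the general idea; the abstract does not specify the wavelet family, decomposition depth, or threshold, so all of those are assumptions here:

```python
def haar_denoise(series, threshold=0.5):
    """One-level Haar wavelet denoising (assumes an even-length series):
    keep the smooth approximation, soft-threshold the detail coefficients."""
    approx = [(a + b) / 2.0 for a, b in zip(series[::2], series[1::2])]
    detail = [(a - b) / 2.0 for a, b in zip(series[::2], series[1::2])]
    # Soft thresholding: shrink each detail coefficient toward zero.
    shrunk = [max(abs(d) - threshold, 0.0) * (1.0 if d >= 0 else -1.0)
              for d in detail]
    # Inverse transform from the thresholded coefficients.
    out = []
    for a, d in zip(approx, shrunk):
        out.extend([a + d, a - d])
    return out

# Noisy ramp of "request counts": denoising keeps the upward trend
# while damping sample-to-sample jitter.
requests = [10, 12, 14, 13, 18, 17, 22, 21]
smooth = haar_denoise(requests)
```

A recurrent network would then be trained on the denoised series to emit the actual forecast; that stage is omitted from this sketch.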
Multi-method-modeling of interacting galaxies. I. A unique scenario for NGC 4449?
(abridged) We combined several N-body methods in order to investigate the
interaction scenario between NGC 4449 and DDO 125, a close companion in
projected space. In a first step, fast restricted N-body models are used
to confine a region in parameter space that reproduces the main
observational features. In a second step, a genetic algorithm is applied
as a uniqueness test of our preferred parameter set. We show that our
genetic algorithm reliably recovers orbital parameters, provided that the
data are sufficiently accurate, i.e. all the key features are included.
In the third step, the results of the restricted N-body models are
compared with self-consistent N-body simulations. In the case of NGC 4449,
the applicability of the simple restricted N-body calculations is
demonstrated. Additionally, it is shown that the HI gas can be modeled
here by a purely stellar dynamical approach.
In a series of simulations, we demonstrate that the observed features of
the extended HI disc can be explained by a gravitational interaction
between NGC 4449 and DDO 125. According to these calculations, the closest
approach between both galaxies happened yr ago at a minimum distance of
kpc on a parabolic or slightly elliptic orbit. In the case of an encounter
scenario, the dynamical mass of DDO 125 should not be smaller than 10% of
NGC 4449's mass. Before the encounter, the observed HI gas was arranged in
a disc with a radius of 35-40 kpc around the center of NGC 4449. It had
the same orientation as the central ellipsoidal HI structure. The origin
of this disc is still unclear, but it might have been caused by a previous
interaction.
Comment: 19 pages with 19 figures, accepted for publication in Astron. &
Astrophys.; a full PostScript version is available at
http://www.astrophysik.uni-kiel.de/pershome/theis/pub.htm
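The uniqueness test rests on a genetic algorithm recovering known parameters from the data. A minimal sketch of that idea, with a toy quadratic fitness in place of the actual N-body comparison and entirely hypothetical parameter values:

```python
import random

def fitness(candidate, target):
    """Negative squared error between a candidate and 'observed' parameters.
    (Stands in for comparing a restricted N-body run against the data.)"""
    return -sum((c - t) ** 2 for c, t in zip(candidate, target))

def genetic_search(target, pop_size=40, generations=120, sigma=0.3, seed=1):
    """Toy GA: tournament selection, blend crossover, annealed mutation."""
    rng = random.Random(seed)
    dim = len(target)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)  # binary tournament
            return a if fitness(a, target) > fitness(b, target) else b
        nxt = []
        for _ in range(pop_size):
            p, q = pick(), pick()
            w = rng.random()
            child = [w * x + (1 - w) * y + rng.gauss(0.0, sigma)
                     for x, y in zip(p, q)]
            nxt.append(child)
        pop = nxt
        sigma *= 0.97  # anneal the mutation strength
    return max(pop, key=lambda c: fitness(c, target))

# Hypothetical "true" orbital parameters (e.g. pericentre distance,
# inclination, time since closest approach) the GA should recover.
true_params = [1.5, -2.0, 3.2]
best = genetic_search(true_params)
```

If the fitness data are informative enough, the GA converges to a single tight region around the true parameters, which is the sense in which it serves as a uniqueness test.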
International Colloquium on Applications of Computer Science and Mathematics in Architecture and Civil Engineering: 20-22 July 2015, Bauhaus-Universität Weimar
The 20th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus-Universität Weimar from 20th to 22nd July 2015. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences, to report on their results in research, development and practice, and to discuss them. The conference covers a broad range of research areas: numerical analysis, function-theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer science, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference.
We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science and research to participate in the conference.
Decentralized UAV guidance using modified boid algorithms
Decentralized guidance of Unoccupied Air Vehicles (UAVs) is a very challenging problem. Such technology can lead to improved safety, reduced cost, and improved mission efficiency. Only a few ideas for achieving decentralized guidance exist, the most effective being the boid algorithm. Boid algorithms are rule-based guidance methods derived from observations of animal swarms. In this paper, boid rules are used to autonomously control a group of UAVs in high-level transit simulations. This paper differs from previous work in that, as an alternative to using exponentially scaled behavior weightings, the weightings are computed off-line and scheduled according to a contingency management system. The motivation for this technique is to reduce the amount of on-line computation required by the flight system. Many modifications to the basic boid algorithm are required in order to achieve a flightworthy design. These modifications include the ability to define flight areas, limit turning maneuvers in accordance with the aircraft dynamics, and produce intelligent waypoint paths. The use of a contingency management system is also a major modification to the boid algorithm. A Simple Genetic Algorithm is used to partially optimize the behavior weightings of the boid algorithm. While a full optimization of all contingencies is not performed due to computation requirements, the framework for such a process is developed. Matlab software is used to develop and simulate the boid guidance algorithm. The algorithm is interfaced with Cloud Cap Technology's Piccolo autopilot system for hardware-in-the-loop simulations. These high-fidelity simulations show that this technology is both feasible and practical. They also show that the boid guidance system developed herein is suitable for comprehensive flight testing.
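The underlying boid rules (cohesion, separation, alignment) with externally scheduled weightings can be sketched as follows. The weight values, separation radius, and time step are assumptions for illustration, not the thesis's optimized schedule, and aircraft dynamics limits are omitted:

```python
def boid_step(positions, velocities, weights, sep_radius=1.0, dt=0.1):
    """One update of the classic boid rules for every vehicle (2-D).
    weights = (cohesion, separation, alignment), looked up off-line
    from a schedule rather than computed on-line."""
    w_coh, w_sep, w_ali = weights
    n = len(positions)
    new_vel = []
    for i, (px, py) in enumerate(positions):
        others = [j for j in range(n) if j != i]
        # Cohesion: steer toward the centroid of the other boids.
        cx = sum(positions[j][0] for j in others) / len(others) - px
        cy = sum(positions[j][1] for j in others) / len(others) - py
        # Separation: steer away from neighbours closer than sep_radius.
        sx = sy = 0.0
        for j in others:
            dx, dy = px - positions[j][0], py - positions[j][1]
            if (dx * dx + dy * dy) ** 0.5 < sep_radius:
                sx, sy = sx + dx, sy + dy
        # Alignment: match the average velocity of the others.
        ax = sum(velocities[j][0] for j in others) / len(others) - velocities[i][0]
        ay = sum(velocities[j][1] for j in others) / len(others) - velocities[i][1]
        vx = velocities[i][0] + dt * (w_coh * cx + w_sep * sx + w_ali * ax)
        vy = velocities[i][1] + dt * (w_coh * cy + w_sep * sy + w_ali * ay)
        new_vel.append((vx, vy))
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(positions, new_vel)]
    return new_pos, new_vel

# Three UAVs with assumed (not optimized) behaviour weightings.
pos = [(0.0, 0.0), (2.0, 0.0), (1.0, 2.0)]
vel = [(1.0, 0.0), (0.0, 1.0), (0.5, 0.5)]
pos, vel = boid_step(pos, vel, weights=(0.5, 1.0, 0.3))
```

A contingency management system, as described above, would swap the `weights` tuple per flight phase instead of recomputing it on-line.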
Evaluation of dimensionality reduction methods applied to numerical weather models for solar radiation forecasting
The interest in solar radiation prediction has increased greatly in recent times among the scientific community. In this context, Machine Learning techniques have shown their ability to learn accurate prediction models. The aim of this paper is to go one step further and automatically achieve interpretability during the learning process by performing dimensionality reduction on the input variables. To this end, three non-standard multivariate feature selection approaches are applied, based on the adaptation of strong learning algorithms to the feature selection task, as well as a battery of classic dimensionality reduction models. The goal is to obtain robust sets of features that not only improve prediction accuracy but also provide more interpretable and consistent results. Real data from the Weather Research and Forecasting model, which produces a very large number of variables, is used as the input. As is to be expected, the results show that dimensionality reduction is in general a useful tool for improving performance, as well as for easing the interpretability of the results. In fact, the proposed non-standard methods offer important accuracy improvements, and one of them provides an intuitive and reduced selection of features and mesoscale nodes (around 10% of the initial variables, centered on three specific nodes).
This work has been partially supported by the projects TIN2014-54583-C2-2-R, TEC2014-52289-R and TEC2016-81900-REDT of the Spanish Interministerial Commission of Science and Technology (MICYT), and by the Comunidad Autónoma de Madrid under project PRICAM P2013ICE-2933.
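The feature selection idea, keeping variables that are relevant to the target but not redundant with one another, can be sketched with a greedy forward-selection rule that rewards relevance and penalizes redundancy (an mRMR-style criterion). This is a generic illustration with made-up variable names and data, not one of the paper's three proposed methods:

```python
def corr(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def forward_select(features, target, k=2):
    """Greedy forward selection: at each step add the feature with the
    highest relevance to the target minus its mean redundancy with the
    features already chosen."""
    chosen = []
    while len(chosen) < k:
        def score(name):
            rel = abs(corr(features[name], target))
            red = (sum(abs(corr(features[name], features[c])) for c in chosen)
                   / len(chosen)) if chosen else 0.0
            return rel - red
        remaining = [f for f in features if f not in chosen]
        chosen.append(max(remaining, key=score))
    return chosen

# Toy NWP-style variables (hypothetical names): "ghi" tracks the target,
# "ghi_dup" is an exact duplicate (redundant), "cloud" is weakly
# informative but independent of "ghi".
target = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
features = {
    "ghi":     [1.2, 1.9, 3.1, 3.9, 5.2, 5.8],
    "ghi_dup": [1.2, 1.9, 3.1, 3.9, 5.2, 5.8],
    "cloud":   [1.0, -1.0, 1.0, -1.0, 1.0, -1.0],
}
selected = forward_select(features, target, k=2)
```

The duplicate variable is skipped in favour of the weaker but non-redundant one, which is the behaviour that lets such criteria shrink the WRF variable set to a handful of informative nodes.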