Efficiency and Optimization of Buildings Energy Consumption: Volume II
This reprint, a continuation of the previous Special Issue entitled "Efficiency and Optimization of Buildings Energy Consumption", gives an up-to-date overview of new technologies based on Machine Learning (ML) and Internet of Things (IoT) procedures that improve the mathematical basis of the algorithms behind control systems, with the aim of reducing energy consumption in the housing sector.
A novel heuristic algorithm for the modeling and risk assessment of the covid-19 pandemic phenomenon
The modeling and risk assessment of a pandemic phenomenon such as COVID-19 are important and complicated issues in epidemiology, and such an attempt is of great interest for public health decision-making. To this end, in the present study, based on a recent heuristic algorithm proposed by the authors, the time evolution of COVID-19 is investigated for six different countries/states, namely New York, California, the USA, Iran, Sweden, and the UK. The number of COVID-19-related deaths is used to develop the proposed heuristic model, as it is believed that the predicted number of daily deaths in each country/state includes information about the quality of the health system in each area, the age distribution of the population, geographical and environmental factors, as well as other conditions. Based on the derived predicted epidemic curves, a new 3D-epidemic surface is proposed to assess the epidemic phenomenon at any time of its evolution. This research highlights the potential of the proposed model as a tool that can assist in the risk assessment of COVID-19. Mapping its development through the 3D-epidemic surface can help reveal its dynamic nature as well as differences and similarities among different districts.
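The paper's heuristic algorithm itself is not reproduced in this abstract, so the following is only a hypothetical sketch of the general idea it builds on: fitting a parametric (here, logistic) epidemic curve to cumulative death counts and extrapolating it forward. The grid-search fit, the parameter ranges, and the synthetic data are all our own illustration, not the authors' method.

```python
import math

def logistic(t, K, r, t0):
    """Cumulative deaths: carrying capacity K, growth rate r, inflection day t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

# Synthetic "observed" cumulative deaths for days 0..29 (illustration only).
true_K, true_r, true_t0 = 1000.0, 0.3, 15.0
observed = [logistic(t, true_K, true_r, true_t0) for t in range(30)]

def sse(params):
    """Sum of squared errors between the candidate curve and observations."""
    K, r, t0 = params
    return sum((logistic(t, K, r, t0) - y) ** 2 for t, y in enumerate(observed))

# Coarse grid search over plausible parameter ranges -- a simple stand-in
# for the paper's heuristic optimisation.
best = min(
    ((K, r, t0) for K in range(500, 2001, 100)
                for r in (0.1, 0.2, 0.3, 0.4)
                for t0 in range(5, 26)),
    key=sse,
)
K, r, t0 = best
forecast = logistic(40, K, r, t0)   # extrapolated cumulative deaths at day 40
```

Repeating such a fit per country/state over time yields a family of predicted curves; the paper's 3D-epidemic surface stacks those curves along a second axis to visualise the phenomenon's evolution.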
Applications of Artificial Intelligence in Battling Against Covid-19: A Literature Review
Colloquially known as coronavirus, Severe Acute Respiratory Syndrome CoronaVirus 2 (SARS-CoV-2), which causes CoronaVirus Disease 2019 (COVID-19), has become a matter of grave concern for every country around the world. The rapid growth of the pandemic has wreaked havoc and prompted the need for immediate reactions to curb its effects. To manage these problems, much research across a variety of scientific areas has begun studying the issue. Artificial Intelligence (AI) is among the fields that have found great application in tackling the problem in many aspects. Here, we perform an overview of the applications of AI in a variety of areas, including diagnosis of the disease via different types of tests and symptoms, monitoring patients, identifying the severity of a patient's condition, processing COVID-19-related imaging tests, epidemiology, pharmaceutical studies, and more. The aim of this paper is to perform a comprehensive survey of the applications of AI in battling the difficulties the outbreak has caused; thus, we cover every way AI approaches have been employed and all the research up to the writing of this paper. We organize the works so that the overall picture is comprehensible. Such a picture, although full of details, is very helpful in understanding where AI sits in the current pandemonium. We conclude the paper with ideas on how the problems can be tackled in a better way and provide some suggestions for future work.
Deep learning applied to computational mechanics: A comprehensive review, state of the art, and the classics
Three recent breakthroughs due to AI in arts and science serve as motivation:
an award-winning digital image, protein folding, and fast matrix multiplication.
Many recent developments in artificial neural networks, particularly deep
learning (DL), applied and relevant to computational mechanics (solid, fluids,
finite-element technology) are reviewed in detail. Both hybrid and pure machine
learning (ML) methods are discussed. Hybrid methods combine traditional PDE
discretizations with ML methods either (1) to help model complex nonlinear
constitutive relations, (2) to nonlinearly reduce the model order for efficient
simulation (turbulence), or (3) to accelerate the simulation by predicting
certain components in the traditional integration methods. Here, methods (1)
and (2) relied on the Long Short-Term Memory (LSTM) architecture, with method (3)
relying on convolutional neural networks. Pure ML methods to solve (nonlinear)
PDEs are represented by Physics-Informed Neural Network (PINN) methods, which
can be combined with an attention mechanism to address discontinuous solutions.
Both LSTM and attention architectures, together with modern optimizers and
classic optimizers generalized to include stochasticity for DL networks, are extensively
reviewed. Kernel machines, including Gaussian processes, are covered in
sufficient depth for more advanced works such as shallow networks with infinite
width. The review is not addressed only to experts: readers are assumed to be
familiar with computational mechanics, but not with DL, whose concepts and
applications are built up from the basics, aiming to bring first-time learners
quickly to the forefront of research. History and limitations of AI are recounted and
discussed, with particular attention to pointing out misstatements or
misconceptions of the classics, even in well-known references. Positioning and
pointing control of a large-deformable beam is given as an example. Comment: 275 pages, 158 figures. Appeared online on 2023.03.01 at
CMES-Computer Modeling in Engineering & Science
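To make the "pure ML" idea concrete, here is a deliberately minimal sketch of the physics-informed principle behind PINN methods, with the neural network replaced by a polynomial ansatz so the example stays self-contained: we write u(t) = 1 + Σ_k c_k t^(k+1) (so the initial condition u(0) = 1 is built in) and minimise the squared residual of the toy ODE u'(t) + u(t) = 0 at collocation points. The toy problem, names, and the least-squares solver are our own illustration, not an implementation from the review.

```python
import math

DEG = 5
ts = [i / 20 for i in range(21)]           # collocation points on [0, 1]

def feat(t, k):
    """Contribution of coefficient c_k to the ODE residual u'(t) + u(t)."""
    return (k + 1) * t ** k + t ** (k + 1)

# Residual r(t) = 1 + sum_k c_k * feat(t, k); minimising sum_t r(t)^2 over the
# coefficients gives the normal equations A c = b.
A = [[sum(feat(t, i) * feat(t, j) for t in ts) for j in range(DEG)]
     for i in range(DEG)]
b = [-sum(feat(t, i) for t in ts) for i in range(DEG)]

# Solve A c = b by Gaussian elimination with partial pivoting.
for col in range(DEG):
    piv = max(range(col, DEG), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    b[col], b[piv] = b[piv], b[col]
    for r in range(col + 1, DEG):
        f = A[r][col] / A[col][col]
        A[r] = [a - f * p for a, p in zip(A[r], A[col])]
        b[r] -= f * b[col]
c = [0.0] * DEG
for r in reversed(range(DEG)):
    c[r] = (b[r] - sum(A[r][j] * c[j] for j in range(r + 1, DEG))) / A[r][r]

def u(t):
    """The trained ansatz; it should track the exact solution exp(-t)."""
    return 1.0 + sum(ck * t ** (k + 1) for k, ck in enumerate(c))

err = max(abs(u(t) - math.exp(-t)) for t in ts)
```

A real PINN replaces the polynomial with a neural network and the normal equations with stochastic-gradient training on the same residual loss, which is what allows the approach to scale to nonlinear PDEs.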
Advanced Optimization Methods and Big Data Applications in Energy Demand Forecast
The use of data collectors in energy systems is growing steadily. For example, smart sensors are now widely used in both energy production and energy consumption systems. This means that huge amounts of data are generated and need to be analyzed in order to extract useful insights. Such big data give rise to a number of opportunities and challenges for informed decision making. In recent years, researchers have been working very actively to develop effective and powerful techniques for dealing with the huge amount of data available. Such approaches can be used in the context of energy production and consumption, considering the amount of data produced by all samples and measurements, as well as many additional features. With them, automated machine learning methods for extracting relevant patterns, high-performance computing, and data visualization are being successfully applied to energy demand forecasting. In light of the above, this Special Issue collects the latest research on relevant topics, in particular energy demand forecasts and the use of advanced optimization methods and big data techniques. Here, by energy, we mean any kind of energy, e.g., electrical, solar, microwave, or wind.
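As a minimal, hypothetical illustration of the simplest forecasting baseline used in this area, here is a seasonal-naive demand forecast: each hour of the next day is predicted as the mean of the same hour on previous days. The hourly load series is synthetic; real work would use smart-meter data and the learned models the Special Issue discusses.

```python
import math

HOURS = 24
# Synthetic hourly demand (MW) for 7 days: a daily sinusoidal pattern
# plus a small day-to-day upward trend (0.1 MW per day).
load = [100 + 20 * math.sin(2 * math.pi * h / HOURS) + 0.1 * d
        for d in range(7) for h in range(HOURS)]

def seasonal_naive(series, period, horizon):
    """Forecast `horizon` steps ahead by averaging same-phase past values."""
    out = []
    for step in range(horizon):
        phase = (len(series) + step) % period
        hist = [series[i] for i in range(phase, len(series), period)]
        out.append(sum(hist) / len(hist))
    return out

forecast = seasonal_naive(load, HOURS, HOURS)      # next day's 24 hours
# Mean absolute error against the true day-8 pattern (trend term 0.1 * 7).
truth = [100 + 20 * math.sin(2 * math.pi * h / HOURS) + 0.7 for h in range(HOURS)]
mae = sum(abs(f - t) for f, t in zip(forecast, truth)) / HOURS
```

Averaging over the history lags behind the trend (here by 0.4 MW), which is exactly the kind of systematic error that motivates the optimization-based and ML forecasting methods collected in the Issue.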
Data Science, Data Visualization, and Digital Twins
Real-time, web-based, and interactive visualisations have proven to be outstanding methodologies and tools in numerous fields when knowledge of sophisticated data science and visualisation techniques is available. The rationale is that modern data science analytical approaches, such as machine/deep learning and artificial intelligence, as well as digital twinning, promise to give data insights, enable informed decision-making, and facilitate rich interactions among stakeholders. The benefits of data visualisation, data science, and digital twinning technologies motivate this book, which exhibits and presents numerous developed and advanced data science and visualisation approaches. Chapters cover topics such as deep learning techniques, web- and dashboard-based visualisations during the COVID pandemic, 3D modelling of trees for mobile communications, digital twinning in the mining industry, data science libraries, and potential areas of future data science development.
Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing
agile, safe, and stable navigation. The model needs to capture the system
behavior in multiple flight regimes and operating conditions, including those
producing highly nonlinear effects such as aerodynamic forces and torques,
rotor interactions, or possible system configuration modifications. Classical
approaches rely on handcrafted models and struggle to generalize and scale to
capture these effects. In this paper, we present a novel Physics-Inspired
Temporal Convolutional Network (PI-TCN) approach to learning quadrotor's system
dynamics purely from robot experience. Our approach combines the expressive
power of sparse temporal convolutions and dense feed-forward connections to
make accurate system predictions. In addition, physics constraints are embedded
in the training process to facilitate the network's generalization capabilities
to data outside the training distribution. Finally, we design a model
predictive control approach that incorporates the learned dynamics for accurate
closed-loop trajectory tracking, fully exploiting the learned model predictions
in a receding horizon fashion. Experimental results demonstrate that our
approach accurately extracts the structure of the quadrotor's dynamics from
data, capturing effects that would remain hidden to classical approaches. To
the best of our knowledge, this is the first time physics-inspired deep
learning is successfully applied to temporal convolutional networks and to the
system identification task, while concurrently enabling predictive control. Comment: Video: https://youtu.be/dsOtKfuRjE
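The two ingredients the abstract combines can be sketched in miniature: a causal (temporal) 1-D convolution over a state history, and a physics-style penalty added to the training loss. The shapes, function names, and toy signals below are our own illustration, not the PI-TCN architecture itself.

```python
def causal_conv1d(x, kernel):
    """y[t] = sum_k kernel[k] * x[t - k], with zero padding on the left,
    so each output depends only on past and current inputs (causality)."""
    K = len(kernel)
    return [sum(kernel[k] * (x[t - k] if t - k >= 0 else 0.0) for k in range(K))
            for t in range(len(x))]

def physics_informed_loss(pred, target, pred_rate, dt, lam=0.1):
    """Data-fitting term plus a penalty tying finite-difference rates of the
    prediction to a supplied rate signal -- a stand-in for the physics
    constraints embedded in training."""
    data = sum((p - y) ** 2 for p, y in zip(pred, target)) / len(pred)
    fd = [(pred[t + 1] - pred[t]) / dt for t in range(len(pred) - 1)]
    phys = sum((f - r) ** 2 for f, r in zip(fd, pred_rate)) / len(fd)
    return data + lam * phys

x = [0.0, 1.0, 2.0, 3.0]
y = causal_conv1d(x, [0.5, 0.5])    # causal moving average of the input
```

Stacking such causal convolutions with growing dilation (and dense feed-forward layers, per the abstract) yields a network whose predictions can feed a receding-horizon model predictive controller.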