Deep Learning Techniques for Geospatial Data Analysis
Consumer electronic devices such as mobile handsets, goods tagged with RFID
labels, and location and position sensors continuously generate vast amounts
of location-enriched data called geospatial data. Conventionally, such
geospatial data was used for military applications. In recent times, many useful
civilian applications have been designed and deployed around such geospatial
data; for example, recommendation systems that suggest restaurants or places of
attraction to tourists visiting a particular locality. At the same time, civic
bodies are harnessing geospatial data generated through remote sensing devices
to provide better services to citizens, such as traffic monitoring, pothole
identification, and weather reporting. Typically, such applications leverage
non-hierarchical machine learning techniques such as Naive Bayes classifiers,
Support Vector Machines, and decision trees. Recent advances in the field of
deep learning show that neural-network-based techniques outperform conventional
techniques and provide effective solutions for many geospatial data analysis
tasks such as object recognition, image classification, and scene
understanding. The chapter presents a survey of the current state of the
applications of deep learning techniques for analyzing geospatial data.
The chapter is organized as follows: (i) a brief overview of deep learning
algorithms; (ii) geospatial analysis: a data science perspective; (iii)
deep-learning techniques for remote sensing data analytics tasks; (iv)
deep-learning techniques for GPS data analytics; (v) deep-learning techniques
for RFID data analytics.
Comment: This is a pre-print of the following chapter: Arvind W. Kiwelekar,
Geetanjali S. Mahamunkar, Laxman D. Netak, Valmik B. Nikam, {\em Deep Learning
Techniques for Geospatial Data Analysis}, published in {\bf Machine Learning
Paradigms}, edited by George A. Tsihrintzis and Lakhmi C. Jain, 2020, publisher
Springer, Cham, reproduced with permission of publisher Springer, Cham
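The contrast the abstract draws between conventional classifiers (e.g. decision trees) and neural networks on image classification can be illustrated with a minimal sketch. This is not from the chapter: it uses scikit-learn's bundled 8x8 digits dataset as a stand-in for geospatial imagery, and the model settings are illustrative only.

```python
# Illustrative sketch (not from the chapter): a "conventional" decision tree
# versus a simple neural network on a small image-classification task.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)   # 8x8 grayscale digit images, flattened
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)

print(f"decision tree accuracy: {tree.score(X_te, y_te):.2f}")
print(f"neural network accuracy: {mlp.score(X_te, y_te):.2f}")
```

On this toy task the neural network typically outperforms the tree, mirroring the abstract's claim for geospatial tasks at a much smaller scale.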
Research and Education in Computational Science and Engineering
Over the past two decades the field of computational science and engineering
(CSE) has penetrated both basic and applied research in academia, industry, and
laboratories to advance discovery, optimize systems, support decision-makers,
and educate the scientific and engineering workforce. Informed by centuries of
theory and experiment, CSE performs computational experiments to answer
questions that neither theory nor experiment alone is equipped to answer. CSE
provides scientists and engineers of all persuasions with algorithmic
inventions and software systems that transcend disciplines and scales. Carried
on a wave of digital technology, CSE brings the power of parallelism to bear on
troves of data. Mathematics-based advanced computing has become a prevalent
means of discovery and innovation in essentially all areas of science,
engineering, technology, and society; and the CSE community is at the core of
this transformation. However, a combination of disruptive
developments---including the architectural complexity of extreme-scale
computing, the data revolution that engulfs the planet, and the specialization
required to follow the applications to new frontiers---is redefining the scope
and reach of the CSE endeavor. This report describes the rapid expansion of CSE
and the challenges to sustaining its bold advances. The report also presents
strategies and directions for CSE research and education for the next decade.
Comment: Major revision, to appear in SIAM Review
Astrophysical Data Analytics based on Neural Gas Models, using the Classification of Globular Clusters as Playground
In astrophysics, the identification of candidate Globular Clusters through
deep, wide-field, single-band HST images is a typical data analytics problem,
where methods based on Machine Learning have shown high efficiency and
reliability, demonstrating the capability to improve on traditional
approaches. Here we experimented with some variants of the known Neural Gas
model, exploring both supervised and unsupervised paradigms of Machine
Learning, on the classification of Globular Clusters extracted from the
NGC1399 HST data. The main focus of this work was to use a well-tested
playground to scientifically validate such kinds of models for further
extended experiments in astrophysics, and to use other standard Machine
Learning methods (for instance, Random Forest and Multi Layer Perceptron
neural networks) for a comparison of performance in terms of purity and
completeness.
Comment: Proceedings of the XIX International Conference "Data Analytics and
Management in Data Intensive Domains" (DAMDID/RCDL 2017), Moscow, Russia,
October 10-13, 2017, 8 pages, 4 figures
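The comparison the abstract describes, scoring classifiers on purity and completeness, maps directly onto precision and recall for the "is a Globular Cluster" class. A hedged sketch, using synthetic features rather than the paper's NGC1399 photometric data, and scikit-learn stand-ins for the baseline models:

```python
# Illustrative sketch: purity (precision) and completeness (recall) for a
# binary GC / non-GC classification, as compared in the abstract.
# Synthetic data; the paper uses features extracted from NGC1399 HST images.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=7, n_informative=5,
                           weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

results = {}
for name, clf in [("RandomForest", RandomForestClassifier(random_state=0)),
                  ("MLP", MLPClassifier(max_iter=1000, random_state=0))]:
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    results[name] = (precision_score(y_te, pred),   # purity
                     recall_score(y_te, pred))      # completeness

for name, (purity, completeness) in results.items():
    print(f"{name}: purity={purity:.2f} completeness={completeness:.2f}")
```

Purity penalizes contaminating the candidate list with non-clusters; completeness penalizes missing true clusters — the two axes on which the paper compares its Neural Gas variants against these baselines.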
Autonomic computing architecture for SCADA cyber security
Cognitive computing relates to intelligent computing platforms that are based on the disciplines of artificial intelligence, machine learning, and other innovative technologies. These technologies can be used to design systems that mimic the human brain to learn about their environment and can autonomously predict an impending anomalous situation. IBM first used the term ‘Autonomic Computing’ in 2001 to combat the looming complexity crisis (Ganek and Corbi, 2003). The concept has been inspired by the human biological autonomic system. An autonomic system is self-healing, self-regulating, self-optimising and self-protecting (Ganek and Corbi, 2003). Therefore, the system should be able to protect itself against both malicious attacks and unintended mistakes by the operator.
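The self-protecting behaviour described above is often structured as a monitor–analyse–plan–execute loop. A minimal sketch of that idea, with all names, thresholds, and the statistical anomaly test being illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of an autonomic, self-protecting control loop:
# monitor a sensor reading, analyse it against bounds learned from normal
# behaviour, and plan/execute a protective response on anomaly.
from statistics import mean, stdev

class AutonomicMonitor:
    """Flags readings that fall far outside learned normal behaviour."""

    def __init__(self, history, k=3.0):
        # "knowledge" learned from a window of normal operation
        self.mu = mean(history)
        self.sigma = stdev(history)
        self.k = k

    def analyse(self, reading):
        # anomalous if more than k standard deviations from the mean
        return abs(reading - self.mu) > self.k * self.sigma

    def step(self, reading):
        if self.analyse(reading):
            # plan/execute: here just an alert; a real system might
            # isolate the component or fail over
            return f"ANOMALY at {reading}: isolating component"
        return "normal"

mon = AutonomicMonitor(history=[50.0, 51.2, 49.8, 50.5, 50.1])
print(mon.step(50.3))   # within learned bounds
print(mon.step(95.0))   # anomalous spike triggers protection
```

A production SCADA deployment would replace the toy statistical test with learned models and add the self-healing and self-optimising loops the abstract lists.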
Big-Data-Driven Materials Science and its FAIR Data Infrastructure
This chapter addresses the fourth paradigm of materials research -- big-data-
driven materials science. Its concepts and state of the art are described, and
its challenges and opportunities are discussed. For furthering the field, Open
Data and all-embracing sharing, an efficient data infrastructure, and the rich
ecosystem of computer codes used in the community are of critical importance.
For shaping this fourth paradigm and contributing to the development or
discovery of improved and novel materials, data must be what is now called FAIR
-- Findable, Accessible, Interoperable, and Re-purposable/Re-usable. This sets
the stage for advances of methods from artificial intelligence that operate on
large data sets to find trends and patterns that cannot be obtained from
individual calculations, and not even directly from high-throughput studies.
Recent progress is reviewed and demonstrated, and the chapter concludes with a
forward-looking perspective addressing important, not yet solved challenges.
Comment: submitted to the Handbook of Materials Modeling (eds. S. Yip and W.
Andreoni), Springer 2018/201
Autonomic computing meets SCADA security
© 2017 IEEE. National assets such as transportation networks, large manufacturing, business and health facilities, and power generation and distribution networks are critical infrastructures. The cyber threats to these infrastructures have become increasingly sophisticated, extensive and numerous. Conventional cyber security measures have proved useful in the past, but the increasing sophistication of attacks dictates the need for newer measures. The autonomic computing paradigm mimics the autonomic nervous system and is promising for meeting the latest challenges in the cyber threat landscape. This paper provides a brief review of autonomic computing applications for SCADA systems and proposes an architecture for cyber security.