748 research outputs found
Machine Learning and Integrative Analysis of Biomedical Big Data.
Recent developments in high-throughput technologies have accelerated the accumulation of massive amounts of omics data from multiple sources: genome, epigenome, transcriptome, proteome, metabolome, etc. Traditionally, data from each source (e.g., genome) are analyzed in isolation using statistical and machine learning (ML) methods. Integrative analysis of multi-omics and clinical data is key to new biomedical discoveries and advancements in precision medicine. However, data integration poses new computational challenges as well as exacerbates the ones associated with single-omics studies. Specialized computational approaches are required to effectively and efficiently perform integrative analysis of biomedical data acquired from diverse modalities. In this review, we discuss state-of-the-art ML-based approaches for tackling five specific computational challenges associated with integrative analysis: the curse of dimensionality, data heterogeneity, missing data, class imbalance, and scalability issues.
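As a hedged illustration of the first challenge named above — the curse of dimensionality in early integration of multi-omics blocks — the following sketch concatenates two hypothetical feature blocks and projects onto the top principal components via SVD. The block names, sample counts, and feature counts are invented for illustration; they are not from the reviewed work.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 20
genome = rng.normal(size=(n_samples, 100))        # hypothetical genomic features
transcriptome = rng.normal(size=(n_samples, 80))  # hypothetical expression features

# Naive early integration: feature count (180) far exceeds sample count (20).
X = np.hstack([genome, transcriptome])
Xc = X - X.mean(axis=0)                           # center before PCA

# Project onto the top-k principal components from the SVD of the centered matrix.
k = 5
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                                 # low-dimensional embedding

print(X.shape, Z.shape)
```

Downstream classifiers would then be fit on `Z` rather than on the 180-dimensional concatenation, which is one common way to mitigate the small-n, large-p regime the abstract describes.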
Machine Learning in IoT Security: Current Solutions and Future Challenges
The future Internet of Things (IoT) will have a deep economic, commercial
and social impact on our lives. The participating nodes in IoT networks are
usually resource-constrained, which makes them alluring targets for cyber
attacks. In this regard, extensive efforts have been made to address the
security and privacy issues in IoT networks primarily through traditional
cryptographic approaches. However, the unique characteristics of IoT nodes
render the existing solutions insufficient to encompass the entire security
spectrum of the IoT networks. This is, at least in part, because of the
resource constraints, heterogeneity, massive real-time data generated by the
IoT devices, and the extensively dynamic behavior of the networks. Therefore,
Machine Learning (ML) and Deep Learning (DL) techniques, which are able to
provide embedded intelligence in the IoT devices and networks, are leveraged to
cope with different security problems. In this paper, we systematically review
the security requirements, attack vectors, and the current security solutions
for the IoT networks. We then shed light on the gaps in these security
solutions that call for ML and DL approaches. We also discuss in detail the
existing ML and DL solutions for addressing different security problems in IoT
networks. Finally, based on a detailed investigation of the existing solutions in the literature, we discuss future research directions for ML- and DL-based IoT security.
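One class of ML approach the survey covers is anomaly-based detection on resource-constrained nodes. A minimal sketch, assuming a z-score detector over a learned packet-rate baseline (the readings and threshold are fabricated for illustration, not taken from the paper):

```python
from statistics import mean, stdev

# Packets/sec observed during a normal-operation training window (invented values).
baseline = [52, 48, 50, 51, 49, 50, 53, 47]
mu, sigma = mean(baseline), stdev(baseline)

def is_anomalous(rate, z_threshold=3.0):
    """Flag a reading whose z-score exceeds the threshold.

    Cheap enough to embed on an IoT gateway, unlike heavier DL models.
    """
    return abs(rate - mu) / sigma > z_threshold

print(is_anomalous(50))    # → False (ordinary traffic)
print(is_anomalous(400))   # → True (flood-like burst)
```

Such lightweight statistical detectors trade accuracy for the low memory and compute budgets the abstract highlights; the DL methods it surveys occupy the other end of that trade-off.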
Big Data - Supply Chain Management Framework for Forecasting: Data Preprocessing and Machine Learning Techniques
This article intends to systematically identify and comparatively analyze
state-of-the-art supply chain (SC) forecasting strategies and technologies. A
novel framework has been proposed incorporating Big Data Analytics in SC
Management (problem identification, data sources, exploratory data analysis,
machine-learning model training, hyperparameter tuning, performance evaluation,
and optimization), forecasting effects on human-workforce, inventory, and
overall SC. Initially, the need to collect data according to SC strategy and
how to collect them has been discussed. The article discusses the need for
different types of forecasting according to the period or SC objective. The SC
KPIs and the error-measurement systems have been recommended to optimize the
top-performing model. The adverse effects of phantom inventory on forecasting
and the dependence of managerial decisions on the SC KPIs for determining model
performance parameters and improving operations management, transparency, and
planning efficiency have been illustrated. The cyclic connection within the
framework introduces preprocessing optimization based on the post-process KPIs,
optimizing the overall control process (inventory management, workforce
determination, cost, production and capacity planning). The contribution of this research lies in the proposed standard SC process framework, the recommended forecasting data analysis, the analysis of forecasting effects on SC performance, the machine-learning optimization procedure followed, and in shedding light on future research directions.
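The performance-evaluation step of such a framework can be sketched with a toy example: scoring a naive moving-average demand forecast with MAPE, one error measure commonly used to select a top-performing model. The demand series, window size, and metric choice here are illustrative assumptions, not the article's recommendations.

```python
def moving_average_forecast(history, window=3):
    """Forecast the next period as the mean of the last `window` observations."""
    return sum(history[-window:]) / window

def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

demand = [100, 110, 105, 120, 115, 125]  # invented monthly demand

# Walk forward: forecast each period from the history before it.
forecasts, actuals = [], []
for t in range(3, len(demand)):
    forecasts.append(moving_average_forecast(demand[:t]))
    actuals.append(demand[t])

print(round(mape(actuals, forecasts), 2))
```

In the framework's cyclic loop, this post-process error score would feed back into preprocessing and hyperparameter choices for the next training round.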
Classification techniques for arrhythmia patterns using convolutional neural networks and Internet of Things (IoT) devices
The rise of Telemedicine has revolutionized how patients are treated, leading to several advantages such as enhanced health-analysis tools, accessible remote healthcare, and basic diagnostics of health parameters. The advent of the Internet of Things (IoT) and Artificial Intelligence (AI), and their incorporation into Telemedicine, extends its potential health benefits even further. The synergy between AI, IoT, and Telemedicine therefore creates diverse innovative scenarios for integrating cyber-physical systems into medical care to provide remote monitoring and interactive assistance to patients. World Health Organization data report that 7.4 million people have died because of Atrial Fibrillation (AF), recognized as the most common arrhythmia associated with the human heart rate. Contributing causes include an unhealthy diet, smoking, and poor access to medical care, and research studies estimate that about 12 million and 17.9 million people will be suffering from AF in the USA and Europe by 2050 and 2060, respectively. AF as a cardiovascular disease is thus becoming an important public health issue to tackle. Using a systematic approach, this paper reviews recent contributions related to the acquisition of heart beats, arrhythmia detection, IoT, and visualization. In particular, by analysing the most closely related papers on Convolutional Neural Networks (CNNs) and IoT devices in heart disease diagnostics, we present a summary of the main research gaps with suggested directions for future research.
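The core operation the reviewed CNN classifiers apply to an ECG trace is the 1-D convolution. A minimal sketch, assuming a toy heartbeat-like signal and a hand-picked peak-sensitive kernel (neither is from the reviewed models, where kernels are learned):

```python
def conv1d(signal, kernel):
    """Valid-mode 1-D convolution (really cross-correlation, as in CNN layers)."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

ecg = [0, 0, 1, 5, 1, 0, 0, 1, 5, 1, 0]   # crude R-peak-like waveform (invented)
kernel = [-1, 2, -1]                       # responds strongly to sharp peaks

feature_map = conv1d(ecg, kernel)
print(feature_map)                         # strong responses align with the two peaks
```

A real arrhythmia classifier stacks many such learned filters with nonlinearities and pooling; this sketch only shows why convolution localizes the R-peak morphology those models learn from.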
Artificial Intelligence based Anomaly Detection of Energy Consumption in Buildings: A Review, Current Trends and New Perspectives
Enormous amounts of data are being produced everyday by sub-meters and smart
sensors installed in residential buildings. If leveraged properly, that data
could assist end-users, energy producers and utility companies in detecting
anomalous power consumption and understanding the causes of each anomaly.
Therefore, anomaly detection could stop a minor problem from becoming overwhelming.
Moreover, it will aid in better decision-making to reduce wasted energy and
promote sustainable and energy efficient behavior. In this regard, this paper
is an in-depth review of existing anomaly detection frameworks for building
energy consumption based on artificial intelligence. Specifically, an extensive
survey is presented, in which a comprehensive taxonomy is introduced to
classify existing algorithms based on different modules and parameters adopted,
such as machine learning algorithms, feature extraction approaches, anomaly
detection levels, computing platforms and application scenarios. To the best of
the authors' knowledge, this is the first review article that discusses anomaly
detection in building energy consumption. Moving forward, important findings
along with domain-specific problems, difficulties and challenges that remain
unresolved are thoroughly discussed, including the absence of: (i) precise
definitions of anomalous power consumption, (ii) annotated datasets, (iii)
unified metrics to assess the performance of existing solutions, (iv) platforms
for reproducibility, and (v) privacy preservation. Subsequently, insights about
current research trends are discussed to widen the applications and
effectiveness of the anomaly detection technology before deriving future
directions attracting significant attention. This article serves as a
comprehensive reference to understand the current technological progress in
anomaly detection of energy consumption based on artificial intelligence.
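One of the simplest anomaly-detection modules such a taxonomy would classify is a statistical outlier test on consumption readings. A hedged sketch using Tukey's IQR fences — the hourly readings and the 1.5×IQR rule are illustrative assumptions, not a framework from the review:

```python
from statistics import quantiles

# Hourly consumption in kWh from a hypothetical sub-meter; 6.0 is an injected anomaly.
readings = [1.2, 1.1, 1.3, 1.2, 1.4, 1.1, 6.0, 1.2]

q1, _, q3 = quantiles(readings, n=4)     # quartiles (default exclusive method)
iqr = q3 - q1
lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr  # Tukey fences

anomalies = [r for r in readings if r < lo or r > hi]
print(anomalies)
```

This kind of unsupervised rule needs no annotated data, which is relevant to the review's finding that labeled anomaly datasets are largely absent; its weakness is that it cannot distinguish wasteful consumption from legitimate but unusual usage.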
Theoretical Interpretations and Applications of Radial Basis Function Networks
Medical applications have usually used Radial Basis Function Networks (RBFNs) simply as Artificial Neural Networks. However, RBFNs are knowledge-based networks that can be interpreted in several ways: as Artificial Neural Networks, Regularization Networks, Support Vector Machines, Wavelet Networks, Fuzzy Controllers, Kernel Estimators, or Instance-Based Learners. A survey of these interpretations and of their corresponding learning algorithms is provided, as well as a brief survey of dynamic learning algorithms. RBFNs' interpretations can suggest applications that are particularly interesting in medical domains.
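The kernel-estimator reading of an RBFN can be sketched as exact interpolation of 1-D samples with Gaussian basis functions centred on the training points. The data, width parameter, and centre placement below are illustrative assumptions, not any particular interpretation from the survey.

```python
import numpy as np

def gaussian_rbf(r, width=1.0):
    """Gaussian radial basis function of the distance r."""
    return np.exp(-(r / width) ** 2)

centers = np.array([0.0, 1.0, 2.0, 3.0])   # one basis function per training point
targets = np.sin(centers)                  # toy regression targets

# Interpolation matrix Phi[i, j] = phi(|x_i - c_j|); solve Phi @ w = y for the weights.
Phi = gaussian_rbf(np.abs(centers[:, None] - centers[None, :]))
weights = np.linalg.solve(Phi, targets)

def rbfn(x):
    """Evaluate the fitted network at a scalar input x."""
    return gaussian_rbf(np.abs(x - centers)) @ weights

print(float(rbfn(1.0)))   # reproduces sin(1.0) at a training point
```

Reading the same equations as a Regularization Network or a kernel estimator changes how `width` and the number of centres are chosen, which is precisely the kind of reinterpretation the survey argues is useful.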
Reduction of False Positives in Intrusion Detection Based on Extreme Learning Machine with Situation Awareness
Protecting computer networks from intrusions is more important than ever for our privacy, economy, and national security. Hardly a month passes without news of a major data breach involving sensitive personal identity, financial, medical, trade secret, or national security data. Democratic processes can now be potentially compromised through breaches of electronic voting systems. As ever more devices, including medical machines, automobiles, and control systems for critical infrastructure, are networked, human life is also more at risk from cyber-attacks. Research into Intrusion Detection Systems (IDSs) began several decades ago, and IDSs remain a mainstay of computer and network protection and continue to evolve. However, detecting previously unseen, or zero-day, threats is still an elusive goal. Many commercial IDS deployments still use misuse detection based on known threat signatures. Systems utilizing anomaly detection have shown great promise in academic research for detecting previously unseen threats, but their success has been limited in large part by the excessive number of false positives that they produce.
This research demonstrates that false positives can be better minimized, while maintaining detection accuracy, by combining Extreme Learning Machine (ELM) and Hidden Markov Models (HMM) as classifiers within the context of a situation awareness framework. This research was performed using the University of New South Wales - Network Based 2015 (UNSW-NB15) data set, which is more representative of contemporary cyber-attacks and normal network traffic than older data sets typically used in IDS research. It is shown that this approach provides better results than either HMM or ELM alone, and with a lower False Positive Rate (FPR) than other comparable approaches that also used the UNSW-NB15 data set.
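The ELM component's defining trick is that the hidden layer is random and never trained; only the output weights are fit by least squares. A minimal sketch on a toy task (the XOR labels, layer sizes, and seed are illustrative assumptions, not the paper's pipeline or the UNSW-NB15 features):

```python
import numpy as np

rng = np.random.default_rng(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])            # XOR labels as a stand-in task

W = rng.normal(size=(2, 20))                  # random input weights, never trained
b = rng.normal(size=20)                       # random biases, never trained
H = np.tanh(X @ W + b)                        # hidden-layer activations

# Only the output weights are learned, via a single least-squares solve.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

preds = (H @ beta > 0.5).astype(int)
print(preds)
```

The one-shot solve is what makes ELM training fast enough to pair with an HMM inside a real-time situation-awareness loop, at the cost of a hidden representation that is fixed by chance rather than optimized.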
Deep Learning Methods for Malware and Intrusion Detection: A Systematic Literature Review
Android and Windows are the predominant operating systems used in mobile environments and on personal computers, and it is expected that their use will rise during the next decade. Malware is one of the main threats faced by these platforms, as well as by the Internet of Things (IoT) environment and the web. With time, these threats are becoming more and more sophisticated, and detecting them using traditional machine learning techniques is a hard task. Several research studies have shown that deep learning methods achieve comparatively better accuracy and can learn to efficiently detect and classify new malware samples. In this paper, we present a systematic literature review of recent studies that focused on intrusion and malware detection and their classification in various environments using deep learning techniques. We searched five well-known digital libraries and collected a total of 107 papers that were published in scholarly journals or as preprints. We carefully read the selected literature and critically analyzed it to find out which types of threats and which platforms the researchers are targeting, and how accurately deep learning-based systems can detect new security threats. This survey will have a positive impact on the learning of beginners who are interested in starting their research in the area of malware detection using deep learning methods. From the detailed critical analysis, it is identified that CNNs, LSTMs, DBNs, and autoencoders are the deep learning methods most frequently and effectively used in various application scenarios.