DCM+: Robust Congestion Control Protocol for Mobile Networks
This paper presents a new robust congestion control protocol for mobile networks, which can also be used in mixed networks and mobile ad hoc networks (MANETs). The proposed protocol, the Dynamic Congestion Control Protocol for Mobile Networks (DCM+), makes use of the bandwidth estimation algorithm of Westwood+. We evaluate DCM+ using well-known metrics such as throughput, average delay, packet loss, and Packet Delivery Ratio (PDR). New metrics, the Normalized Advancing Index (NAI) and Complete Transmission Time (CTT), are introduced for a comprehensive comparison with other congestion control variants such as NewReno, Hybla, LEDBAT, and BIC. The simulations use a one-way single-hop topology (sender -> router -> receiver). The findings clearly show the excellent properties of the proposed technique, such as robustness and stability: it avoids congestion, increases performance, minimizes end-to-end delay, and reduces transmission time. DCM+ combines the advantages of NewReno and Westwood+, and the simulation results show significant improvements that make this approach well suited to different types of networks.
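The Westwood+ mechanism that DCM+ borrows estimates the available bandwidth from the ACK stream and uses it, together with the minimum observed RTT, to size the congestion window after a loss. A minimal sketch in Python, with illustrative names and constants (the `alpha` smoothing factor and the MSS value are assumptions, not taken from the paper):

```python
class WestwoodEstimator:
    """EWMA bandwidth estimator in the style of TCP Westwood+ (illustrative)."""

    def __init__(self, alpha=0.9):
        self.alpha = alpha  # EWMA smoothing factor (assumed value)
        self.bwe = 0.0      # smoothed bandwidth estimate, bytes/s

    def on_ack(self, acked_bytes, interval_s):
        """Update the estimate from bytes ACKed over a sampling interval."""
        sample = acked_bytes / interval_s
        self.bwe = self.alpha * self.bwe + (1 - self.alpha) * sample
        return self.bwe

def ssthresh_on_loss(bwe, rtt_min_s, mss=1460):
    """On packet loss, set ssthresh to the estimated bandwidth-delay product
    (in segments) instead of blindly halving the window."""
    return max(2, int(bwe * rtt_min_s / mss))
```

The key design point, which DCM+ inherits, is that the post-loss window tracks the path's measured capacity rather than a fixed multiplicative decrease.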
Window size and round-trip-time in a network transmission session
A transmission session in a network constitutes the period during which data are transported from one communicating node to another. A transmission session is always established for an end-to-end connection and involves many network resources. Previous research on smooth data flow across a network reveals that the maximum amount of data in an optimal transmission session is tied to the window size. Problems remain concerning the rate at which data move in a transmission session and the required window size, both of which should be controlled dynamically and automatically. This research investigates the effect of window size and Round-Trip Time (RTT) on a transmission session. Packet data were collected for many network transmission sessions; the raw data were normalized, and the Naïve Bayes technique was used for the analytical evaluation. Examining the effect of window size and RTT on a transmission session reveals that the rate at which data move can be controlled dynamically with a considerably high degree of accuracy, so that no network node is overwhelmed when the window size is adjusted to the required size.
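The window-size/RTT relationship the study examines is commonly captured by the bandwidth-delay product: a sender needs roughly bandwidth × RTT bytes in flight to keep the path full. A brief sketch (function names are illustrative, not from the paper):

```python
def optimal_window_bytes(bandwidth_bps, rtt_s):
    """Bandwidth-delay product: bytes in flight needed to keep the pipe full."""
    return bandwidth_bps / 8 * rtt_s

def utilization(window_bytes, bandwidth_bps, rtt_s):
    """Fraction of link capacity achieved when the sender is window-limited."""
    bdp = optimal_window_bytes(bandwidth_bps, rtt_s)
    return min(1.0, window_bytes / bdp)
```

For a 100 Mbit/s path with 40 ms RTT the BDP is 500,000 bytes; a window half that size caps throughput at 50% of capacity, which is why the abstract argues the window must track RTT dynamically.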
Cyber Security
This open access book constitutes the refereed proceedings of the 18th China Annual Conference on Cyber Security, CNCERT 2022, held in Beijing, China, in August 2022. The 17 papers presented were carefully reviewed and selected from 64 submissions. The papers are organized according to the following topical sections: data security; anomaly detection; cryptocurrency; information security; vulnerabilities; mobile internet; threat intelligence; text recognition.
Internet of Underwater Things and Big Marine Data Analytics: A Comprehensive Survey
The Internet of Underwater Things (IoUT) is an emerging communication
ecosystem developed for connecting underwater objects in maritime and
underwater environments. The IoUT technology is intricately linked with
intelligent boats and ships, smart shores and oceans, automatic marine
transportations, positioning and navigation, underwater exploration, disaster
prediction and prevention, as well as with intelligent monitoring and security.
The IoUT has an influence at various scales ranging from a small scientific
observatory, to a midsized harbor, and to covering global oceanic trade. The
network architecture of IoUT is intrinsically heterogeneous and should be
sufficiently resilient to operate in harsh environments. This creates major
challenges in terms of underwater communications, whilst relying on limited
energy resources. Additionally, the volume, velocity, and variety of data
produced by sensors, hydrophones, and cameras in IoUT is enormous, giving rise
to the concept of Big Marine Data (BMD), which has its own processing
challenges. Hence, conventional data processing techniques will falter, and
bespoke Machine Learning (ML) solutions have to be employed for automatically
learning the specific BMD behavior and features facilitating knowledge
extraction and decision support. The motivation of this paper is to
comprehensively survey the IoUT, BMD, and their synthesis. It also aims for
exploring the nexus of BMD with ML. We set out from underwater data collection
and then discuss the family of IoUT data communication techniques with an
emphasis on the state-of-the-art research challenges. We then review the suite
of ML solutions suitable for BMD handling and analytics. We treat the subject
deductively from an educational perspective, critically appraising the material
surveyed.

Comment: 54 pages, 11 figures, 19 tables; IEEE Communications Surveys & Tutorials, peer-reviewed academic journal.
Performance Evaluation And Anomaly detection in Mobile BroadBand Across Europe
With the rapidly growing smartphone market and users' expectation of immediate access to high-quality multimedia content, delivering video over wireless networks has become a major challenge, making it difficult to provide end users with flawless quality of service. The growth of the smartphone market goes hand in hand with the development of the Internet, in which current transport protocols are being re-evaluated to deal with traffic growth. QUIC and WebRTC are new and evolving standards: WebRTC was explicitly developed to enable a high-quality experience for mobile users of real-time communication services, while QUIC was designed to reduce Web latency, integrate security features, and likewise allow a high-quality experience for mobile users. Evaluating the performance of these rising protocols outside controlled settings is therefore essential to understand the behavior of the network and provide the end user with a better multimedia delivery service. Since most work in the research community is conducted in controlled environments, we leverage the MONROE platform to investigate the performance of QUIC and WebRTC in real cellular networks using static and mobile nodes. In this Thesis, we conduct measurements of WebRTC and QUIC while making their datasets public to interested experimenters. Building such datasets is very welcome in the research community, as it opens the door to applying data science to network data. The development part of the experiments involves building Docker containers that act as QUIC and WebRTC clients. These containers are publicly available and can be used directly or within the MONROE platform. These key contributions span Chapters 4 and 5, presented in Part II of the Thesis.
We exploit the data collected from MONROE to apply data science to network datasets, which helps identify networking problems and shifts the Thesis focus from performance evaluation to a data science problem.
Indeed, the second part of the Thesis focuses on interpretable data science. Identifying
network problems leveraging Machine Learning (ML) has gained much visibility in the
past few years, resulting in dramatically improved cellular network services. However,
critical tasks like troubleshooting cellular networks are still performed manually by experts
who monitor the network around the clock. In this context, this Thesis contributes by proposing the use of simple interpretable
ML algorithms, moving away from the current trend of high-accuracy ML algorithms
(e.g., deep learning) that do not allow interpretation (and hence understanding) of their
outcome. We accept lower accuracy because the scenarios misclassified by the ML algorithms are precisely the interesting (anomalous) ones, and we do not want to miss them by overfitting. To this aim, we present CIAN (Causality Inference of Anomalies in Networks), a practical and interpretable ML methodology, which we implement in the form of a software tool named TTrees (Troubleshooting Trees) and compare to a supervised counterpart named STrees (Supervised Trees). Both methodologies require small volumes of data and are quick to train. Our experiments using real data from operational commercial mobile networks, e.g., sampled with MONROE probes, show that STrees and CIAN can automatically identify and accurately classify network anomalies, e.g., cases for which low network performance is not justified by operational conditions, training with just a few hundred data samples, hence enabling precise troubleshooting actions. Most importantly, our experiments show that a fully automated unsupervised approach is viable and efficient. These contributions are presented in Part III of the Thesis, which includes Chapters 6 and 7.
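TTrees and STrees are not reproduced here; as a generic illustration of why shallow, threshold-based models are interpretable and trainable on small data volumes, a one-node decision stump can be fitted exhaustively. Feature meanings, sample values, and the function name below are hypothetical:

```python
def best_stump(samples, labels):
    """Exhaustively pick the (feature, threshold) split with best training
    accuracy. A one-node 'tree': the learned rule is a single human-readable
    threshold, e.g. 'anomalous if loss_rate > 0.01'."""
    n_features = len(samples[0])
    best = (0, 0.0, 0.0)  # (feature index, threshold, accuracy)
    for f in range(n_features):
        for t in sorted({s[f] for s in samples}):
            preds = [1 if s[f] > t else 0 for s in samples]
            acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
            if acc > best[2]:
                best = (f, t, acc)
    return best
```

On four hypothetical sessions described by (RTT in ms, loss rate), labeled anomalous (1) when the loss rate is high, the stump recovers the loss-rate threshold rather than an opaque weight vector, which is the interpretability argument the Thesis makes for shallow trees.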
In conclusion, in this Thesis, we go through a data-driven networking roller coaster,
from performance evaluating upcoming network protocols in real mobile networks to
building methodologies that help identify and classify the root cause of networking
problems, emphasizing the fact that these methodologies are easy to implement and can
be deployed in production environments.

This work has been supported by IMDEA Networks Institute. PhD Programme in Multimedia and Communications, Universidad Carlos III de Madrid and Universidad Rey Juan Carlos. Committee: President: Matteo Sereno; Secretary: Antonio de la Oliva Delgado; Member: Raquel Barco Moreno.
Instant Messaging Spam Detection in Long Term Evolution Networks
The lack of efficient spam detection modules for packet data communication results in increased threat exposure for telecommunication network users and service providers. In this thesis, we propose a novel approach to classify spam at the server side by intercepting packet-data communication among instant messaging applications. Spam detection is performed using machine learning techniques on packet headers and contents (if unencrypted) in two phases: offline training and online classification. The contribution of this study is threefold. First, it identifies the scope of deploying a spam detection module in a state-of-the-art telecommunication architecture. Second, it compares the usefulness of various existing machine learning algorithms for intercepting and classifying data packets in near real-time instant-messenger communication. Finally, it evaluates the accuracy and classification time of spam detection using our approach in a simulated environment of continuous packet data communication. Our research results are mainly generated by executing instances of a peer-to-peer instant messaging application prototype within a simulated Long Term Evolution (LTE) telecommunication network environment. This prototype is modeled and executed using OPNET network modeling and simulation tools. The research produces considerable knowledge on addressing unsolicited packet monitoring in instant messaging and similar applications.
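The two-phase design (offline training, online classification) can be sketched with a tiny Bernoulli Naive Bayes over message tokens. This is illustrative only: the thesis operates on packet headers and contents inside a simulated LTE/OPNET environment, and the `TinySpamFilter` class and its sample messages below are assumptions:

```python
import math
from collections import defaultdict

class TinySpamFilter:
    """Bernoulli Naive Bayes on token presence: train offline, classify online."""

    def train(self, messages, labels):
        # Offline phase: estimate class priors and per-token presence counts.
        self.classes = set(labels)
        self.prior = {c: labels.count(c) / len(labels) for c in self.classes}
        self.totals = {c: labels.count(c) for c in self.classes}
        self.counts = {c: defaultdict(int) for c in self.classes}
        self.vocab = set()
        for msg, c in zip(messages, labels):
            for tok in set(msg.lower().split()):
                self.counts[c][tok] += 1
                self.vocab.add(tok)

    def classify(self, message):
        # Online phase: log-posterior over all vocabulary tokens (Laplace-smoothed).
        toks = set(message.lower().split())
        best, best_lp = None, -math.inf
        for c in self.classes:
            lp = math.log(self.prior[c])
            for tok in self.vocab:
                p = (self.counts[c][tok] + 1) / (self.totals[c] + 2)
                lp += math.log(p if tok in toks else 1 - p)
            if lp > best_lp:
                best, best_lp = c, lp
        return best
```

A real deployment would extract features from packet headers as well as payload tokens, but the train-once/classify-per-packet split is the same.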
Reduction of False Positives in Intrusion Detection Based on Extreme Learning Machine with Situation Awareness
Protecting computer networks from intrusions is more important than ever for our privacy, economy, and national security. Hardly a month passes without news of a major data breach involving sensitive personal identity, financial, medical, trade secret, or national security data. Democratic processes can now be potentially compromised through breaches of electronic voting systems. As more devices, including medical machines, automobiles, and control systems for critical infrastructure, become networked, human life is also increasingly at risk from cyber-attacks. Research into Intrusion Detection Systems (IDSs) began several decades ago, and IDSs remain a mainstay of computer and network protection and continue to evolve. However, detecting previously unseen, or zero-day, threats is still an elusive goal. Many commercial IDS deployments still use misuse detection based on known threat signatures. Systems utilizing anomaly detection have shown great promise in academic research for detecting previously unseen threats, but their success has been limited in large part by the excessive number of false positives they produce.
This research demonstrates that false positives can be minimized more effectively, while maintaining detection accuracy, by combining Extreme Learning Machine (ELM) and Hidden Markov Models (HMM) as classifiers within the context of a situation awareness framework. The research was performed using the University of New South Wales - Network Based 2015 (UNSW-NB15) data set, which is more representative of contemporary cyber-attacks and normal network traffic than the older data sets typically used in IDS research. It is shown that this approach provides better results than either HMM or ELM alone, with a lower False Positive Rate (FPR) than other comparable approaches that also used the UNSW-NB15 data set.
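The ELM half of the proposed classifier admits a compact sketch: the input-to-hidden weights are random and fixed, and only the output weights are solved in closed form via the Moore-Penrose pseudoinverse, which is what makes ELM training fast. Sizes and data below are illustrative; the HMM stage and the situation awareness framework are not shown:

```python
import numpy as np

def train_elm(X, y_onehot, n_hidden=40, seed=0):
    """Extreme Learning Machine: random fixed hidden layer, output weights
    fitted in one least-squares step (no gradient descent)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights, never trained
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.pinv(H) @ y_onehot          # closed-form output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    """Class = argmax of the linear readout over hidden activations."""
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
```

Because training reduces to one pseudoinverse, ELM scales to the large flow-record sets typical of IDS data, while the HMM layer in the thesis adds temporal context that the feed-forward ELM lacks.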