2D Proactive Uplink Resource Allocation Algorithm for Event Based MTC Applications
We propose a two-dimensional (2D) proactive uplink resource allocation
(2D-PURA) algorithm that aims to reduce latency in event-based
machine-type communications (MTC) applications. Specifically, when an event of
interest occurs at a device, it tends to spread to the neighboring devices.
Consequently, when a device has data to send to the base station (BS), its
neighbors are highly likely to transmit shortly afterward. Thus, we propose to
cluster
devices in the neighborhood around the event, also referred to as the
disturbance region, into rings based on the distance from the original event.
To reduce the uplink latency, we then proactively allocate resources for these
rings. To evaluate the proposed algorithm, we analytically derive the mean
uplink delay, the proportion of resource conservation due to successful
allocations, and the proportion of uplink resource wastage due to unsuccessful
allocations for the 2D-PURA algorithm. Numerical results demonstrate that the
proposed method reduces the mean uplink delay by over 16.5 and 27 percent,
compared with the 1D algorithm and the standard method, respectively.

Comment: 6 pages, 6 figures; published in the 2018 IEEE Wireless Communications
and Networking Conference (WCNC).
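The ring-based clustering step described above can be sketched in a few lines. This is an illustrative sketch, not the paper's implementation; device coordinates, the event location, and the ring width are assumed inputs:

```python
import math

def cluster_into_rings(devices, event, ring_width):
    """Group devices into concentric rings around an event location.

    devices: list of (x, y) coordinates; event: (x, y) of the original event;
    ring_width: radial width of each ring.
    Returns a dict mapping ring index (0 = closest) to the devices in it.
    """
    rings = {}
    ex, ey = event
    for dev in devices:
        # Distance from the device to the original event
        dist = math.hypot(dev[0] - ex, dev[1] - ey)
        ring = int(dist // ring_width)
        rings.setdefault(ring, []).append(dev)
    return rings
```

Resources would then be proactively allocated ring by ring, inner rings first, since the disturbance spreads outward from the event.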
Patent Analytics Based on Feature Vector Space Model: A Case of IoT
The number of approved patents worldwide increases rapidly each year, which
requires new patent analytics to efficiently mine the valuable information
attached to these patents. Vector space model (VSM) represents documents as
high-dimensional vectors, where each dimension corresponds to a unique term.
While originally proposed for information retrieval systems, VSM has also seen
wide applications in patent analytics, and used as a fundamental tool to map
patent documents to structured data. However, the VSM method suffers from
several limitations when applied to patent analysis tasks, such as the loss of
sentence-level semantics and the curse of dimensionality. To address these
limitations, we propose a patent analytics framework based on a feature
vector space model (FVSM), where the FVSM is constructed by mapping patent
documents to feature vectors extracted by a convolutional neural network (CNN).
The applications of FVSM to three typical patent analysis tasks, namely
patent similarity comparison, patent clustering, and patent map generation,
are discussed. A case study using patents related to Internet of Things (IoT)
technology is presented to demonstrate the performance and effectiveness of
FVSM. The proposed FVSM can be adopted by other patent analysis studies in
place of VSM, serving as a basis for various big data learning tasks.
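Once patent documents are mapped to feature vectors, similarity comparison reduces to comparing vectors, for example via cosine similarity. The sketch below uses plain Python lists as stand-in feature vectors; it illustrates the comparison step only, not the CNN feature extraction:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two feature vectors (lists of floats)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    # Two zero vectors have undefined similarity; report 0.0 by convention
    return dot / norm if norm else 0.0
```

The same pairwise similarities can feed the clustering and map-generation tasks mentioned in the abstract.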
Machine Learning DDoS Detection for Consumer Internet of Things Devices
An increasing number of Internet of Things (IoT) devices are connecting to
the Internet, yet many of these devices are fundamentally insecure, exposing
the Internet to a variety of attacks. Botnets such as Mirai have used insecure
consumer IoT devices to conduct distributed denial of service (DDoS) attacks on
critical Internet infrastructure. This motivates the development of new
techniques to automatically detect consumer IoT attack traffic. In this paper,
we demonstrate that using IoT-specific network behaviors (e.g., a limited
number of endpoints and regular time intervals between packets) to inform
feature selection can result in highly accurate DDoS detection in IoT network
traffic
with a variety of machine learning algorithms, including neural networks. These
results indicate that home gateway routers or other network middleboxes could
automatically detect local IoT device sources of DDoS attacks using low-cost
machine learning algorithms and traffic data that is flow-based and
protocol-agnostic.

Comment: 7 pages, 3 figures, 3 tables; appears in the 2018 Workshop on Deep
Learning and Security (DLS '18).
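As a rough illustration of how such IoT-specific behaviors can be turned into flow-level features, the sketch below computes an endpoint count and inter-packet-interval statistics from a packet trace. The function and feature names are assumptions for illustration, not the paper's exact feature set:

```python
def extract_flow_features(packet_times, endpoints):
    """Protocol-agnostic flow features in the spirit of the abstract:
    distinct-endpoint count and regularity of inter-packet intervals.
    packet_times: sorted packet timestamps; endpoints: destination per packet.
    """
    if len(packet_times) < 2:
        raise ValueError("need at least two packets to compute intervals")
    intervals = [t2 - t1 for t1, t2 in zip(packet_times, packet_times[1:])]
    mean = sum(intervals) / len(intervals)
    # Low variance signals the highly regular timing typical of IoT traffic
    variance = sum((i - mean) ** 2 for i in intervals) / len(intervals)
    return {
        "n_endpoints": len(set(endpoints)),
        "mean_interval": mean,
        "interval_variance": variance,
    }
```

Feature vectors of this kind could then be fed to any of the classifiers the paper evaluates, from simple linear models to neural networks.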
FiFo: Fishbone Forwarding in Massive IoT Networks
Massive Internet of Things (IoT) networks have a wide range of applications,
including but not limited to the rapid delivery of emergency and disaster
messages. Although various benchmark algorithms have been developed to date for
message delivery in such applications, they pose several practical challenges
such as insufficient network coverage and/or highly redundant transmissions to
expand the coverage area, resulting in considerable energy consumption for each
IoT device. To overcome this problem, we first characterize a new performance
metric, forwarding efficiency, which is defined as the ratio of the coverage
probability to the average number of transmissions per device, to evaluate the
data dissemination performance more appropriately. Then, we propose a novel and
effective forwarding method, fishbone forwarding (FiFo), which aims to improve
the forwarding efficiency with acceptable computational complexity. Our FiFo
method completes two tasks: 1) it clusters devices using the unweighted
pair-group method with arithmetic mean (UPGMA); and 2) it creates the main
axis and sub-axes of each cluster using both the expectation-maximization
algorithm for the
the Gaussian mixture model and principal component analysis. We demonstrate the
superiority of FiFo by using a real-world dataset. Through intensive and
comprehensive simulations, we show that the proposed FiFo method outperforms
benchmark algorithms in terms of forwarding efficiency.

Comment: 13 pages, 16 figures, 5 tables; to appear in the IEEE Internet of
Things Journal (please cite the journal version that will appear in an
upcoming issue).
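The forwarding-efficiency metric defined in the abstract is straightforward to compute once the two quantities are measured. A minimal sketch (function and parameter names assumed):

```python
def forwarding_efficiency(coverage_probability, avg_transmissions_per_device):
    """Forwarding efficiency as defined in the abstract:
    the ratio of the coverage probability to the average number of
    transmissions per device."""
    if avg_transmissions_per_device <= 0:
        raise ValueError("average transmissions per device must be positive")
    return coverage_probability / avg_transmissions_per_device
```

The metric rewards schemes that reach many devices (high coverage) with few redundant transmissions (low energy cost), which is exactly the trade-off FiFo targets.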