3,508 research outputs found
How Well Sensing Integrates with Communications in MmWave Wi-Fi?
The development of integrated sensing and communication (ISAC) systems has recently gained interest for its ability to offer a variety of services, including resource sharing and new applications such as localization, tracking, and health care. While sensing capabilities are offered by many technologies, their accessibility through the IEEE 802.11ad and 802.11ay Wi-Fi networks has attracted the interest of research and industry, owing to the wide deployment of these networks, the high-frequency spectrum they use, and the high range resolution they provide. Even though a dedicated standardization body, the 802.11bf task group, is working on enhancing Wi-Fi sensing performance, investigations are needed to evaluate the effectiveness of various sensing techniques. In this project, in addition to surveying the related literature, we evaluate the sensing performance of millimeter wave (mmWave) Wi-Fi systems by simulating a human-target scenario using MATLAB simulation tools. In this analysis, we processed channel estimation data using the short-time Fourier transform (STFT). Furthermore, using a channel variation threshold method, we evaluated the performance while reducing feedback. Our findings indicate that using STFT window overlap can provide good tracking results, and that threshold levels of 0.05 and 0.1 reduce feedback measurements by 48% and 77%, respectively, without significantly degrading performance.
Comment: arXiv admin note: substantial text overlap with arXiv:2207.04859 by other authors
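The STFT-with-overlap processing and threshold-based feedback reduction described above can be sketched as follows. This is a minimal NumPy illustration; the window length, overlap, toy Doppler-like signal, and relative-change criterion are assumptions for illustration, not the paper's MATLAB setup.

```python
import numpy as np

def stft_mag(x, win_len=64, overlap=0.5):
    """Magnitude STFT of a 1-D signal using a Hann window with overlap."""
    hop = int(win_len * (1 - overlap))
    win = np.hanning(win_len)
    frames = [x[i:i + win_len] * win
              for i in range(0, len(x) - win_len + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))  # (n_frames, n_bins)

def select_feedback(csi_mag, threshold):
    """Keep a measurement only when the channel varies by more than
    `threshold` relative to the last reported one (feedback reduction)."""
    kept = [0]
    last = csi_mag[0]
    for i in range(1, len(csi_mag)):
        if np.linalg.norm(csi_mag[i] - last) / np.linalg.norm(last) > threshold:
            kept.append(i)
            last = csi_mag[i]
    return kept

# toy channel-variation signal: a slow Doppler-like oscillation plus noise
rng = np.random.default_rng(0)
t = np.arange(2048) / 1000.0
x = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size)
spec = stft_mag(x, win_len=128, overlap=0.75)
kept = select_feedback(spec, threshold=0.05)
print(spec.shape, len(kept) / len(spec))  # fraction of frames actually fed back
```

A higher threshold keeps fewer frames, trading tracking fidelity for feedback overhead, which is the trade-off the abstract quantifies at 48% and 77%.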
Automating embedded analysis capabilities and managing software complexity in multiphysics simulation part I: template-based generic programming
An approach for incorporating embedded simulation and analysis capabilities
in complex simulation codes through template-based generic programming is
presented. This approach relies on templating and operator overloading within
the C++ language to transform a given calculation into one that can compute a
variety of additional quantities that are necessary for many state-of-the-art
simulation and analysis algorithms. An approach for incorporating these ideas
into complex simulation codes through general graph-based assembly is also
presented. These ideas have been implemented within a set of packages in the
Trilinos framework and are demonstrated on a simple problem from chemical engineering.
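The core idea, computing derivative quantities by evaluating an unchanged calculation on an overloaded numeric type, can be illustrated in miniature. The paper does this in C++ with templates; the sketch below uses Python operator overloading for forward-mode differentiation and is only an analogy, not the Trilinos implementation:

```python
# A minimal forward-mode AD type: writing the residual once generically and
# evaluating it on Dual numbers yields derivatives with no extra code.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)  # product rule
    __rmul__ = __mul__

def residual(x):
    # a generic calculation written once, for any numeric type
    return x * x * x + 2 * x + 1

x = Dual(3.0, 1.0)   # seed dx/dx = 1
r = residual(x)
print(r.val, r.der)  # f(3) = 34, f'(3) = 3*9 + 2 = 29
```

In the C++ setting the same effect is obtained by templating `residual` on its scalar type, so one code path serves evaluation, sensitivities, and other embedded analyses.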
Traffic measurement and analysis
Measurement and analysis of real traffic are important to gain knowledge about its characteristics. Without measurement, it is impossible to build realistic traffic models. Only recently was data traffic found to have self-similar properties. In this thesis work, traffic captured on the network at SICS and on the Supernet is shown to have this fractal-like behaviour. The traffic is also examined with respect to which protocols and packet sizes are present and in what proportions. In the SICS trace most packets are small, TCP is shown to be the predominant transport protocol, and NNTP the most common application. In contrast, large UDP packets sent between ports outside the well-known range dominate the Supernet traffic. Finally, characteristics of the client side of WWW traffic are examined more closely. In order to extract useful information from the packet trace, web browsers' use of TCP and HTTP is investigated, including new features in HTTP/1.1 such as persistent connections and pipelining. Empirical probability distributions are derived describing session lengths, the time between user clicks, and the amount of data transferred due to a single user click. These probability distributions make up a simple model of WWW sessions.
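The self-similarity claim is usually checked by estimating the Hurst parameter H. A common aggregated-variance estimator, shown below as a generic sketch rather than the method used in the thesis, exploits the fact that for a self-similar process the variance of block-averaged series decays as m^(2H-2):

```python
import numpy as np

def hurst_aggvar(x, block_sizes):
    """Aggregated-variance estimate of the Hurst parameter H.
    For self-similar traffic, Var(X^(m)) ~ m^(2H-2), so the slope of
    log Var versus log m gives 2H - 2."""
    logs_m, logs_v = [], []
    for m in block_sizes:
        n = len(x) // m
        agg = x[:n * m].reshape(n, m).mean(axis=1)  # average over blocks of m
        logs_m.append(np.log(m))
        logs_v.append(np.log(agg.var()))
    slope = np.polyfit(logs_m, logs_v, 1)[0]
    return 1 + slope / 2

# i.i.d. packet counts (no long-range dependence) should give H close to 0.5;
# self-similar traffic like the SICS/Supernet traces would give H > 0.5
rng = np.random.default_rng(1)
x = rng.poisson(10, 100_000).astype(float)
h = hurst_aggvar(x, [1, 2, 4, 8, 16, 32, 64])
print(h)
```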
The Dynamics of Internet Traffic: Self-Similarity, Self-Organization, and Complex Phenomena
The Internet is the most complex system ever created in human history. Its dynamics and traffic therefore unsurprisingly exhibit a rich variety of complex dynamics, self-organization, and other phenomena that have been researched for years. This paper is a review of the complex dynamics of Internet traffic. Departing from the usual treatises, we take a view from both the network-engineering and physics perspectives, showing the strengths, weaknesses, and insights of each. In addition, we cover many less-discussed phenomena such as traffic oscillations, large-scale effects of worm traffic, and comparisons between the Internet and biological models.
Comment: 63 pages, 7 figures, 7 tables, submitted to Advances in Complex Systems
AoA-aware Probabilistic Indoor Location Fingerprinting using Channel State Information
With expeditious development of wireless communications, location
fingerprinting (LF) has nurtured considerable indoor location based services
(ILBSs) in the field of Internet of Things (IoT). For most pattern-matching
based LF solutions, previous works either appeal to the simple received signal
strength (RSS), which suffers from dramatic performance degradation due to
sophisticated environmental dynamics, or rely on the fine-grained physical
layer channel state information (CSI), whose intricate structure leads to an
increased computational complexity. Meanwhile, the harsh indoor environment can
also breed similar radio signatures among certain predefined reference points
(RPs), which may be randomly distributed in the area of interest, thus severely hampering location-mapping accuracy. To address these dilemmas, during the
offline site survey, we first adopt autoregressive (AR) modeling entropy of CSI
amplitude as location fingerprint, which shares the structural simplicity of
RSS while reserving the most location-specific statistical channel information.
Moreover, an additional angle of arrival (AoA) fingerprint can be accurately
retrieved from CSI phase through an enhanced subspace based algorithm, which
serves to further eliminate the error-prone RP candidates. In the online phase,
by exploiting both CSI amplitude and phase information, a novel bivariate
kernel regression scheme is proposed to precisely infer the target's location.
Results from extensive indoor experiments validate the superior localization
performance of our proposed system over previous approaches.
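A plain Nadaraya-Watson kernel regression conveys the flavor of kernel-based location inference: the query fingerprint's location is a kernel-weighted average of the reference points' known positions. The Gaussian kernel, toy fingerprints, and bandwidth below are illustrative assumptions, not the paper's bivariate scheme:

```python
import numpy as np

def kernel_regression(train_X, train_y, query, bandwidth=1.0):
    """Nadaraya-Watson estimator with a Gaussian kernel: the predicted
    location is a kernel-weighted average of the reference-point labels."""
    d2 = ((train_X - query) ** 2).sum(axis=1)    # squared feature distances
    w = np.exp(-d2 / (2 * bandwidth ** 2))       # Gaussian kernel weights
    return (w[:, None] * train_y).sum(axis=0) / w.sum()

# toy fingerprints: 2-D features (e.g. amplitude entropy, AoA) -> 2-D positions
train_X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
train_y = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
est = kernel_regression(train_X, train_y, np.array([0.5, 0.5]), bandwidth=0.5)
print(est)  # symmetric query -> centre of the grid, [2.0, 2.0]
```

A smaller bandwidth sharpens the estimate toward the nearest reference points, which is why pruning error-prone RP candidates (as the AoA fingerprint does here) matters before the regression step.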
Rate-adaptive H.264 for TCP/IP networks
While there has always been tremendous demand for streaming video over TCP/IP networks, the nature of the application still presents some challenging issues. Applications that transmit multimedia data over best-effort networks like the Internet must cope with changing network behavior; specifically, the source encoder rate should be controlled based on feedback from a channel estimator that probes the network periodically. First, one such Multimedia Streaming TCP-Friendly Protocol (MSTFP) is considered, which iteratively integrates forward estimation of network status with feedback control to closely track the varying network characteristics. Second, a network-adaptive embedded bit stream is generated using a ρ-domain rate controller. The conceptual elegance of the ρ-domain framework stems from the fact that the coding bit rate R(ρ) is approximately linear in ρ, the percentage of zeros among the quantized spatial transform coefficients, as opposed to the more traditional, complex and highly nonlinear R(Q) characterization. Though the ρ-domain model has been successfully implemented in a few other video codecs, here its application to the emerging video coding standard H.264 is considered. Extensive experimental results show robust rate control, similar or improved Peak Signal-to-Noise Ratio (PSNR), and a faster implementation.
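The ρ-domain relationship, rate approximately linear in the fraction of non-zero quantized coefficients, can be sketched as follows. The Laplacian coefficient model and the slope θ are assumptions for illustration, not values from the paper:

```python
import numpy as np

def rho(coeffs, q):
    """Fraction of quantized transform coefficients that round to zero."""
    return float(np.mean(np.round(coeffs / q) == 0))

def rate_rho_model(r, theta):
    """rho-domain model: R(rho) = theta * (1 - rho), i.e. the bit rate is
    (approximately) linear in the fraction of NON-zero coefficients."""
    return theta * (1 - r)

rng = np.random.default_rng(2)
coeffs = rng.laplace(scale=4.0, size=10_000)  # DCT-like residual coefficients
for q in (2.0, 8.0, 32.0):
    r = rho(coeffs, q)
    print(q, round(r, 3), round(rate_rho_model(r, theta=6.0), 3))
```

The rate controller's job then reduces to inverting this near-linear map: pick the ρ (and hence the quantizer) that hits the bit budget reported by the channel estimator, which is far simpler than inverting a nonlinear R(Q) curve.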
Cognitive Interference Management in Retransmission-Based Wireless Networks
Cognitive radio methodologies have the potential to dramatically increase the
throughput of wireless systems. Herein, control strategies which enable the
superposition in time and frequency of primary and secondary user transmissions
are explored in contrast to more traditional sensing approaches which only
allow the secondary user to transmit when the primary user is idle. In this
work, the optimal transmission policy for the secondary user when the primary
user adopts a retransmission based error control scheme is investigated. The
policy aims to maximize the secondary user's throughput, with a constraint on
the throughput loss and failure probability of the primary user. Due to the
constraint, the optimal policy is randomized, and determines how often the
secondary user transmits according to the retransmission state of the packet
being served by the primary user. The resulting optimal strategy of the
secondary user is proven to have a unique structure. In particular, the optimal
throughput is achieved by the secondary user by concentrating its transmission,
and thus its interference to the primary user, in the first transmissions of a
primary user packet. The rather simple framework considered in this paper
highlights two fundamental aspects of cognitive networks that have not been
covered so far: (i) the networking mechanisms implemented by the primary users
(error control by means of retransmissions in the considered model) react to
secondary users' activity; (ii) if networking mechanisms are considered, then
their state must be taken into account when optimizing secondary users'
strategy, i.e., a strategy based on a binary active/idle perception of the
primary users' state is suboptimal.
Comment: accepted for publication in Transactions on Information Theory
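The structural result, that the secondary user does best by concentrating its transmissions in the first transmission of a primary packet, can be probed with a toy Monte-Carlo simulation. All error probabilities, the retry limit, and the two policies below are illustrative assumptions, not the paper's model parameters:

```python
import numpy as np

def simulate(p, e0=0.1, e1=0.5, T=4, n=50_000, seed=3):
    """Primary user retransmits a packet up to T times; the secondary user
    transmits with per-retransmission-state probability p[t], raising the
    primary failure probability from e0 to e1 when it does."""
    rng = np.random.default_rng(seed)
    prim_ok = sec_tx = 0
    for _ in range(n):
        for t in range(T):                  # retransmission states 0..T-1
            sec = rng.random() < p[t]       # does the secondary transmit?
            sec_tx += sec
            fail = rng.random() < (e1 if sec else e0)
            if not fail:
                prim_ok += 1
                break                       # primary packet delivered
    return prim_ok / n, sec_tx / n          # primary success, secondary usage

# front-loading interference into the FIRST transmission vs. spreading it
front = simulate([0.8, 0.0, 0.0, 0.0])
unif = simulate([0.2, 0.2, 0.2, 0.2])
print(front, unif)
```

In this toy setting the front-loaded policy gets the secondary user noticeably more channel uses per primary packet while the primary's success probability stays essentially unchanged, matching the intuition behind the paper's structural claim.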
A passive available bandwidth estimation methodology
The Available Bandwidth (AB) of an end-to-end path is its remaining capacity, and it is an important metric for several applications such as overlay routing and P2P networking. That is why many AB estimation tools have been published recently. Most of these tools use the Probe Rate Model, which requires sending packet trains at a rate matching the AB; its main issue is that it congests the path under measurement. We present a different approach: a novel passive methodology to estimate the AB that does not introduce probe traffic. Our methodology, intended to be applied between two separate nodes, estimates the path's AB by analyzing specific parameters of the traffic exchanged. The main challenge is that we cannot rely on any given rate of this traffic; therefore we rely on a different model, the Utilization Model. In this paper we present our passive methodology and a tool (PKBest) based on it. We evaluate its applicability and accuracy using public NLANR data traces. Our results, covering more than 300 Gb of traces, show that our tool is more accurate than pathChirp, a state-of-the-art active PRM-based tool. To the best of the authors' knowledge, this is the first passive AB estimation methodology.
Preprint
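Under a utilization model, a passive AB estimate reduces to capacity times the idle fraction of the link. The minimal sketch below is a simplification for illustration; the assumed capacity and toy trace are not taken from the paper and this is not PKBest itself:

```python
# Passive AB sketch under a utilization model: AB = C * (1 - u), with the
# utilization u inferred from the sizes of traffic already flowing through
# the path during a measurement interval (no probe traffic is injected).
def available_bw(pkt_sizes_bits, capacity_bps, interval_s):
    used_bps = sum(pkt_sizes_bits) / interval_s   # observed throughput
    utilization = min(used_bps / capacity_bps, 1.0)
    return capacity_bps * (1.0 - utilization)

# 1 s of observed cross-traffic: 500 packets of 12,000 bits = 6 Mb/s
sizes = [12_000] * 500
ab = available_bw(sizes, capacity_bps=10_000_000, interval_s=1.0)
print(ab / 1e6)  # 4 Mb/s of the 10 Mb/s link remains available
```

The hard part, which the paper addresses, is doing this without knowing the capacity or controlling the traffic rate, using only parameters of the exchanged packets.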
- …