Performance impact of web services on Internet servers
While traditional Internet servers mainly served static and
later also dynamic content, the popularity of Web services
is increasing rapidly. Web services incur additional
overhead compared to traditional web interaction. This
overhead increases the demand on Internet servers, which
is of particular importance when the request rate to the
server is high. Our experiments show that the overhead
imposed by Web services is non-negligible during server
overload: the response time for Web services is more than
30% higher, and the server throughput more than 25%
lower, than for traditional web interaction using
dynamically created HTML pages.
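The extra cost described above comes largely from parsing and serializing an XML envelope on every request. The following micro-benchmark is a hypothetical sketch (not the authors' experimental setup): it contrasts a SOAP-style handler, which must parse the request envelope and build an XML reply, with a handler that simply formats a dynamic HTML string. All names and the request payload are illustrative assumptions.

```python
# Hypothetical micro-benchmark: per-request XML processing in a
# SOAP-style Web service vs. formatting a dynamic HTML page directly.
import timeit
import xml.etree.ElementTree as ET

SOAP_REQUEST = (
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    '<soap:Body><getItem><id>42</id></getItem></soap:Body></soap:Envelope>'
)

def handle_web_service() -> bytes:
    # Parse the XML envelope, extract the parameter, build an XML reply.
    tree = ET.fromstring(SOAP_REQUEST)
    item_id = tree.find('.//id').text
    reply = ET.Element('getItemResponse')
    ET.SubElement(reply, 'value').text = f'item-{item_id}'
    return ET.tostring(reply)

def handle_dynamic_html() -> bytes:
    # Plain dynamic content: format an HTML string directly.
    item_id = '42'
    return f'<html><body>item-{item_id}</body></html>'.encode()

n = 10_000
t_ws = timeit.timeit(handle_web_service, number=n)
t_html = timeit.timeit(handle_dynamic_html, number=n)
print(f'web service: {t_ws:.3f}s, dynamic HTML: {t_html:.3f}s')
```

The absolute numbers depend on the machine, but the gap per request illustrates why the overhead becomes decisive only when the server is driven near overload.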
X-TCP: A Cross Layer Approach for TCP Uplink Flows in mmWave Networks
Millimeter wave frequencies will likely be part of the fifth generation of
mobile networks and of the 3GPP New Radio (NR) standard. MmWave communication
indeed provides a very large bandwidth, thus an increased cell throughput, but
how to exploit these resources at the higher layers is still an open research
question. A very relevant issue is the high variability of the channel, caused
by the blockage from obstacles and the human body. This affects the design of
congestion control mechanisms at the transport layer, and state-of-the-art TCP
schemes such as TCP CUBIC present suboptimal performance. In this paper, we
present a cross layer approach for uplink flows that adjusts the congestion
window of TCP at the mobile equipment side using an estimation of the available
data rate at the mmWave physical layer, based on the actual resource allocation
and on the Signal to Interference plus Noise Ratio. We show that this approach
reduces latency by avoiding buffer buildup in the cellular stack, and recovers
more quickly after RTO events than several other TCP congestion control
algorithms.

Comment: 6 pages, 5 figures, accepted for presentation at the 2017 16th Annual
Mediterranean Ad Hoc Networking Workshop (MED-HOC-NET)
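The core idea of the cross-layer update can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the available rate is approximated by the Shannon bound over the allocated bandwidth at the measured SINR, and that the congestion window is then sized to the bandwidth-delay product so the sender neither starves the link nor fills the cellular buffers. The segment size and all parameter values are assumptions.

```python
# Sketch of a cross-layer congestion-window update for an uplink TCP
# flow: estimate the available mmWave rate from PHY-layer information
# (allocated bandwidth, SINR), then match cwnd to the BDP.
import math

MSS = 1448  # bytes per TCP segment (assumed)

def estimate_rate(alloc_bandwidth_hz: float, sinr_db: float) -> float:
    """Rate estimate in bits/s from resource allocation and SINR
    via the Shannon capacity bound."""
    sinr_linear = 10 ** (sinr_db / 10)
    return alloc_bandwidth_hz * math.log2(1 + sinr_linear)

def cross_layer_cwnd(alloc_bandwidth_hz: float, sinr_db: float,
                     rtt_s: float) -> int:
    """Congestion window (in segments) matching the
    bandwidth-delay product of the estimated rate."""
    bdp_bits = estimate_rate(alloc_bandwidth_hz, sinr_db) * rtt_s
    return max(1, int(bdp_bits / 8 / MSS))

# Example: 100 MHz allocation, 15 dB SINR, 20 ms RTT.
print(cross_layer_cwnd(100e6, 15.0, 0.02))
```

Because the window tracks the instantaneous PHY-layer estimate rather than loss events, it can shrink immediately during blockage and re-expand as soon as the channel recovers, which is what gives the quicker post-RTO recovery described above.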
Chance and Necessity in Evolution: Lessons from RNA
The relationship between sequences and secondary structures or shapes in RNA
exhibits robust statistical properties summarized by three notions: (1) the
notion of a typical shape (that among all sequences of fixed length certain
shapes are realized much more frequently than others), (2) the notion of shape
space covering (that all typical shapes are realized in a small neighborhood of
any random sequence), and (3) the notion of a neutral network (that sequences
folding into the same typical shape form networks that percolate through
sequence space). Neutral networks loosen the requirements on the mutation rate
for selection to remain effective. The original (genotypic) error threshold has
to be reformulated in terms of a phenotypic error threshold. With regard to
adaptation, neutrality has two seemingly contradictory effects: It acts as a
buffer against mutations, ensuring that a phenotype is preserved. Yet it is
deeply enabling, because it permits evolutionary change to occur by allowing
the sequence context to vary silently until a single point mutation can become
phenotypically consequential. Neutrality also influences predictability of
adaptive trajectories in seemingly contradictory ways. On the one hand it
increases the uncertainty of their genotypic trace. At the same time neutrality
structures the access from one shape to another, thereby inducing a topology
among RNA shapes which permits a distinction between continuous and
discontinuous shape transformations. To the extent that adaptive trajectories
must undergo such transformations, their phenotypic trace becomes more
predictable.

Comment: 37 pages, 14 figures; 1998 CNLS conference; high-quality figures at
http://www.santafe.edu/~walte
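The notions of typical shapes and neutral networks can be made concrete with a toy genotype-phenotype map. The sketch below is an illustration only, not real RNA folding: binary strings stand in for sequences, and the hypothetical "shape" is the run-length pattern of the string. Sequences folding to the same shape, connected by single point mutations, form the neutral network; shapes realized by many more sequences than average are the "typical" ones.

```python
# Toy genotype-phenotype map illustrating typical shapes and neutral
# mutations (binary strings instead of RNA, run patterns instead of
# secondary structures).
from itertools import product, groupby
from collections import defaultdict

def shape(seq: str) -> str:
    """Coarse phenotype: the run pattern, e.g. '0011' -> '01'."""
    return ''.join(k for k, _ in groupby(seq))

def neutral_neighbors(seq: str) -> list[str]:
    """One-point mutants that fold to the same shape (neutral mutations)."""
    out = []
    for i, c in enumerate(seq):
        mutant = seq[:i] + ('1' if c == '0' else '0') + seq[i + 1:]
        if shape(mutant) == shape(seq):
            out.append(mutant)
    return out

# Group all sequences of length 6 by shape: the most frequent shapes
# play the role of "typical" shapes in the abstract above.
networks = defaultdict(list)
for bits in product('01', repeat=6):
    seq = ''.join(bits)
    networks[shape(seq)].append(seq)

for shp, seqs in sorted(networks.items(), key=lambda kv: -len(kv[1]))[:3]:
    print(f'shape {shp!r}: {len(seqs)} sequences')
```

Even in this toy map, a mutation can be neutral (it preserves the run pattern) while silently changing the sequence context, so that a later single point mutation becomes phenotypically consequential, which is the buffering-yet-enabling role of neutrality described in the abstract.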