An atomic force microscope operating at hypergravity for in situ measurement of cellular mechano-response
We present a novel atomic force microscope (AFM) system, operational in liquid at variable gravity, dedicated to imaging shape changes of cells in vitro under hypergravity conditions. The hypergravity AFM is realized by mounting a stand-alone AFM into a large-diameter centrifuge. The balance between mechanical forces, both intra- and extracellular, determines both cell shape and integrity. Gravity seems to be an insignificant force at the level of a single cell, in contrast to its effect on a complete (multicellular) organism, where, for instance, bones and muscles are highly unloaded under near-weightless (microgravity) conditions. However, past space flights and ground-based cell biological studies, under both hypogravity and hypergravity conditions, have shown changes in cell behaviour (signal transduction), cell architecture (cytoskeleton), and proliferation. Thus the role of direct or indirect gravity effects at the level of cells has remained unclear. Here we aim to address the role of gravity in cell shape. We concentrate on the validation of the novel AFM for use under hypergravity conditions. We find indications that a single cell exposed to 2 to 3 × g is reduced in average height by some 30–50%, as monitored with AFM. Indeed, in situ measurements of the effects of changing gravitational load on cell shape are feasible by means of AFM in liquid. The combination provides a promising technique to measure, online, the temporal characteristics of the cellular mechano-response during exposure to inertial forces.
Time-Varying Graphs and Dynamic Networks
The past few years have seen intensive research efforts carried out in some
apparently unrelated areas of dynamic systems -- delay-tolerant networks,
opportunistic-mobility networks, social networks -- obtaining closely related
insights. Indeed, the concepts discovered in these investigations can be viewed
as parts of the same conceptual universe; and the formal models proposed so far
to express some specific concepts are components of a larger formal description
of this universe. The main contribution of this paper is to integrate the vast
collection of concepts, formalisms, and results found in the literature into a
unified framework, which we call TVG (for time-varying graphs). Using this
framework, it is possible to express directly in the same formalism not only
the concepts common to all those different areas, but also those specific to
each. Based on this definitional work, employing both existing results and
original observations, we present a hierarchical classification of TVGs; each
class corresponds to a significant property examined in the distributed
computing literature. We then examine how TVGs can be used to study the
evolution of network properties, and propose different techniques, depending on
whether the indicators for these properties are a-temporal (as in the majority
of existing studies) or temporal. Finally, we briefly discuss the introduction
of randomness in TVGs.
Comment: A short version appeared in ADHOC-NOW'11. This version is to be
published in the International Journal of Parallel, Emergent and Distributed
Systems.
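As a concrete illustration of the TVG formalism, here is a minimal Python sketch of a time-varying graph with an edge presence function ρ(e, t) and a discrete-time journey (temporal reachability) check. The class and method names are illustrative assumptions, not notation taken from the paper.

```python
# A minimal sketch of a time-varying graph (TVG), following the common
# (V, E, T, rho) formulation: rho(e, t) says whether edge e is present
# at time t. All names here are illustrative, not from the paper.
from collections import defaultdict

class TVG:
    def __init__(self):
        # edge (u, v) -> list of (start, end) presence intervals
        self.presence = defaultdict(list)

    def add_presence(self, u, v, start, end):
        self.presence[(u, v)].append((start, end))

    def rho(self, u, v, t):
        """Presence function: is edge (u, v) available at time t?"""
        return any(s <= t < e for s, e in self.presence[(u, v)])

    def reachable(self, src, dst, t0, t_max, step=1):
        """Can dst be reached from src by a journey starting at t0?
        Simple discrete-time search; each hop takes one time step.
        Nodes keep messages over time (store-carry-forward), as in
        delay-tolerant networks."""
        frontier = {src}
        for t in range(t0, t_max, step):
            nxt = set(frontier)
            for (u, v) in self.presence:
                if u in frontier and self.rho(u, v, t):
                    nxt.add(v)
            if dst in nxt:
                return True
            frontier = nxt
        return False

g = TVG()
g.add_presence("a", "b", 0, 2)
g.add_presence("b", "c", 3, 5)   # only available after (a, b) disappears
print(g.reachable("a", "c", 0, 6))  # journey a->b at t=0, then b->c at t=3
```

Note that no static snapshot of this graph contains a path from a to c; reachability exists only as a temporal journey, which is exactly what distinguishes TVGs from a-temporal graph models.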
Space Division Multiplexing in Optical Fibres
Optical communications technology has made enormous and steady progress for
several decades, providing the key resource in our increasingly
information-driven society and economy. Much of this progress has been in
finding innovative ways to increase the data carrying capacity of a single
optical fibre. In this search, researchers have explored (and close to
maximally exploited) every available degree of freedom, and even commercial
systems now utilize multiplexing in time, wavelength, polarization, and phase
to speed more information through the fibre infrastructure. Conspicuously, one
potentially enormous source of improvement has however been left untapped in
these systems: fibres can easily support hundreds of spatial modes, but today's
commercial systems (single-mode or multi-mode) make no attempt to use these as
parallel channels for independent signals.
Comment: to appear in Nature Photonics.
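To see why unused spatial modes are such a large untapped resource, here is a rough sketch treating each mode as an independent Shannon channel, an idealization that ignores inter-modal crosstalk; the bandwidth, SNR, and mode counts are illustrative assumptions, not figures from the paper.

```python
# Idealized aggregate capacity of a space-division-multiplexed link:
# each spatial mode is modeled as an independent Shannon channel.
# Assumptions (illustrative): no inter-modal crosstalk, flat SNR.
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    # Shannon-Hartley capacity of a single channel.
    return bandwidth_hz * math.log2(1 + snr_linear)

def sdm_capacity_bps(n_modes, bandwidth_hz, snr_db):
    snr = 10 ** (snr_db / 10)
    return n_modes * shannon_capacity_bps(bandwidth_hz, snr)

# Single mode vs. 10 modes, at a hypothetical 4 THz usable bandwidth
# and 20 dB SNR:
single = sdm_capacity_bps(1, 4e12, 20)
ten = sdm_capacity_bps(10, 4e12, 20)
print(f"{single/1e12:.1f} Tb/s -> {ten/1e12:.1f} Tb/s")
```

Under this idealization capacity scales linearly with mode count, which is the "potentially enormous source of improvement" the abstract refers to; real systems must additionally undo inter-modal coupling, e.g. with MIMO processing.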
A Scalable Correlator Architecture Based on Modular FPGA Hardware, Reuseable Gateware, and Data Packetization
A new generation of radio telescopes is achieving unprecedented levels of
sensitivity and resolution, as well as increased agility and field-of-view, by
employing high-performance digital signal processing hardware to phase and
correlate large numbers of antennas. The computational demands of these imaging
systems scale in proportion to BMN^2, where B is the signal bandwidth, M is the
number of independent beams, and N is the number of antennas. The
specifications of many new arrays lead to demands in excess of tens of PetaOps
per second.
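The BMN^2 scaling can be checked with a back-of-envelope estimate; the per-sample operation count and the example array parameters below are illustrative assumptions, not specifications from the paper.

```python
# Back-of-envelope estimate of correlator compute demand, which scales
# as B * M * N^2 (B: bandwidth in Hz, M: independent beams, N: antennas).
# The constant factor and the example numbers are illustrative assumptions.
def correlator_ops_per_second(bandwidth_hz, n_beams, n_antennas,
                              ops_per_sample=8):
    # Cross-correlation needs a complex multiply-accumulate for every
    # antenna pair (~N^2/2 baselines, plus autocorrelations) per sample
    # per beam; ops_per_sample folds in that constant factor.
    n_baselines = n_antennas * (n_antennas + 1) // 2
    return bandwidth_hz * n_beams * n_baselines * ops_per_sample

# A hypothetical 1000-antenna, 1-GHz-bandwidth, single-beam array:
ops = correlator_ops_per_second(1e9, 1, 1000)
print(f"{ops / 1e15:.1f} PetaOps/s")
```

Even with modest single-dish parameters, the N^2 term dominates quickly, which is why large-N arrays reach the PetaOps-per-second regime the text describes.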
To meet this challenge, we have developed a general purpose correlator
architecture using standard 10-Gbit Ethernet switches to pass data between
flexible hardware modules containing Field Programmable Gate Array (FPGA)
chips. These chips are programmed using open-source signal processing libraries
we have developed to be flexible, scalable, and chip-independent. This work
reduces the time and cost of implementing a wide range of signal processing
systems, with correlators foremost among them, and facilitates upgrading to new
generations of processing technology. We present several correlator
deployments, including a 16-antenna, 200-MHz bandwidth, 4-bit, full Stokes
parameter application deployed on the Precision Array for Probing the Epoch of
Reionization.
Comment: Accepted to Publications of the Astronomical Society of the Pacific.
31 pages. v2: corrected typo, v3: corrected Fig. 1
An Overview on IEEE 802.11bf: WLAN Sensing
With recent advancements, the wireless local area network (WLAN) or wireless
fidelity (Wi-Fi) technology has been successfully utilized to realize sensing
functionalities such as detection, localization, and recognition. However, WLAN
standards are developed mainly for the purpose of communication, and thus
may not be able to meet the stringent requirements for emerging sensing
applications. To resolve this issue, a new Task Group (TG), namely IEEE
802.11bf, has been established by the IEEE 802.11 working group, with the
objective of creating a new amendment to the WLAN standard to meet advanced
sensing requirements while minimizing the effect on communications. This paper
provides a comprehensive overview on the up-to-date efforts in the IEEE
802.11bf TG. First, we introduce the definition of the 802.11bf amendment and
its formation and standardization timeline. Next, we discuss the WLAN sensing
use cases with the corresponding key performance indicator (KPI) requirements.
After reviewing previous WLAN sensing research based on communication-oriented
WLAN standards, we identify their limitations and underscore the practical need
for the new sensing-oriented amendment in 802.11bf. Furthermore, we discuss the
WLAN sensing framework and procedure used for measurement acquisition, by
considering both sensing at sub-7 GHz and directional multi-gigabit (DMG)
sensing at 60 GHz, and address their similarities and differences. In
addition, we present various candidate
technical features for IEEE 802.11bf, including waveform/sequence design,
feedback types, as well as quantization and compression techniques. We also
describe the methodologies and the channel modeling used by the IEEE 802.11bf
TG for evaluation. Finally, we discuss the challenges and future research
directions to motivate more research endeavors in this field in detail.
Comment: 31 pages, 25 figures; this is a significantly updated version of
arXiv:2207.0485
Molecular Communications at the Macroscale: A Novel Framework for Modeling Epidemic Spreading and Mitigation
Using the notion of effective distance proposed by Brockmann and Helbing,
complex spatiotemporal processes of epidemic spreading can be reduced to
circular wave propagation patterns with well-defined wavefronts. This hidden
homogeneity of contagion phenomena enables the mapping of virtual mobility
networks to physical propagation channels. Subsequently, we propose a novel
communications-inspired model of epidemic spreading and mitigation by
establishing the one-to-one correspondence between the essential components
comprising information and disease transmissions. The epidemic processes can be
regarded as macroscale molecular communications, in which individuals are
macroscale information molecules carrying messages (epidemiological states). We
then present the notions of normalized ensemble-average prevalence (NEAP) and
prevalence delay profile (PDP) to characterize the relative impact and time
difference of all the spreading paths, which are analogous to the classical
description methods of path loss and power delay profile in communications.
Furthermore, we introduce the metric of root mean square (RMS) delay spread to
measure the distortion of early contagion dynamics caused by multiple infection
transmission routes. In addition, we show how social and medical interventions
can be understood from the perspectives of various communication modules. The
proposed framework provides an intuitive, coherent, and efficient approach for
characterization of disease outbreaks by applying deep-rooted
communications theories as the analytical lens.
Comment: 16 pages, 5 figures
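The RMS delay spread borrowed here from channel characterization has a standard second-moment definition over the delay profile; a minimal sketch follows, with an illustrative two-path profile (the delays and weights are made-up numbers, not data from the paper).

```python
# RMS delay spread of a power/prevalence delay profile (PDP), using the
# standard second-central-moment definition from channel modeling.
# The two-path profile below is an illustrative assumption.
import math

def rms_delay_spread(delays, powers):
    """sqrt(E[d^2] - E[d]^2), with expectations weighted by the profile."""
    total = sum(powers)
    mean = sum(d * p for d, p in zip(delays, powers)) / total
    second = sum(d * d * p for d, p in zip(delays, powers)) / total
    return math.sqrt(second - mean * mean)

# Two spreading paths arriving at delays 1.0 and 3.0 (arbitrary units)
# with relative impacts 0.7 and 0.3:
print(rms_delay_spread([1.0, 3.0], [0.7, 0.3]))  # ~0.917
```

In the paper's analogy, a larger spread means infection arrivals over the different transmission routes are more dispersed in time, distorting the early contagion dynamics just as multipath delay spread distorts a communication signal.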