The Model of the Low Rate Telemetry Communication System for Matlab-Simulink
This article is dedicated to a model of a low rate telemetry system developed for the Matlab-Simulink environment. The purpose of the model is to study the reliability of low rate telemetry transmission in cases where a carrier-subcarrier modulation scheme is used; this scheme is widely used for interplanetary spacecraft. The main focus is on the effects of AWGN and phase noise, especially at very low values of Eb/N0. These effects can be evaluated for the whole transmission system or for its component parts. The described model is very versatile and can be easily modified or extended.
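The abstract does not reproduce the model itself; as a rough illustration of the kind of experiment it targets (bit error rate of a BPSK-modulated stream over AWGN at low Eb/N0), a minimal Monte-Carlo sketch might look as follows. All parameter values and the plain-Python form are illustrative assumptions, not material from the paper, whose model is a Matlab-Simulink carrier/subcarrier chain.

```python
# Minimal BER-vs-Eb/N0 sketch for BPSK over AWGN (illustrative only; the paper's
# model is a Matlab-Simulink carrier/subcarrier chain, not reproduced here).
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
n_bits = 200_000                      # number of simulated bits (illustrative)

for ebn0_db in (0, 2, 4, 6):          # Eb/N0 values in dB (illustrative)
    ebn0 = 10 ** (ebn0_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2 * bits - 1            # BPSK mapping: 0 -> -1, 1 -> +1
    noise = rng.normal(0.0, sqrt(1 / (2 * ebn0)), n_bits)  # AWGN with Eb = 1
    decisions = (symbols + noise) > 0
    ber_sim = np.mean(decisions != bits)
    ber_theory = 0.5 * erfc(sqrt(ebn0))   # Q(sqrt(2 Eb/N0))
    print(f"Eb/N0 = {ebn0_db} dB: simulated BER = {ber_sim:.4f}, theory = {ber_theory:.4f}")
```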
The Low Rate Telemetry Transmission Simulator
The presented paper is dedicated to a low rate telemetry transmission simulator. The basic concept of the system uses a carrier (DSB) and a subcarrier (BPSK). The research is focused on the influence of AWGN and carrier phase noise. The presented system can be extended with the described carrier phase noise model. The paper also discusses some issues related to the described model, for example the relation between the bit error rate of an uncoded bit stream and that of a bit stream with differential coding, which is used in the model. The authors justify the use of Costas loops at very low ratios of energy per bit to noise power spectral density. The influence of additive white Gaussian noise and phase noise is also investigated.
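The relation between the uncoded and differentially coded bit error rates mentioned above is not spelled out in the abstract. The standard textbook relation for differential encoding with coherent detection, given here only as a plausible reference point and not as a quotation from the paper, is:

```latex
% For a channel bit error probability p, a differentially encoded stream that is
% decoded by comparing consecutive bits has error probability
P_{e,\mathrm{diff}} \;=\; 2p\,(1-p) \;\approx\; 2p \quad \text{for } p \ll 1,
% because a decoded bit is wrong exactly when one, but not both, of the two
% channel bits it depends on is in error.
```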
CSM-178 - Learning to Recognise Human Faces
Recognition of human faces is an ambitious problem, currently being attacked by psychologists, cognitive scientists, and, to a limited extent, also by the AI community. Nevertheless, computer programs solving this task are still rare. Most of them rely on artificial neural nets, which are not used in our approach to the problem.
The presented paper reports a successful attempt to extract a reliable set of stable intrinsic features from the images by using edge-detection, boundary grouping, and boundary characterisation. Particular attention is paid to the local properties of the boundaries at junction points. No attempt to attach high-level meaning to the individual features is made.
The resulting symbolic descriptions are processed by a simple machine-learning program constructing a recognition scheme in the form of a decision tree. In spite of some constraints (frontal head-on view, limited training set), the results, as measured by predictive accuracy, are promising for dealing with larger numbers of individuals.
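The paper's symbolic boundary and junction descriptors are not listed in the abstract, so the sketch below uses invented placeholder features and identities; it only illustrates the general pipeline of growing a decision tree over symbolic (categorical) descriptions, for instance with scikit-learn:

```python
# Hypothetical sketch: the paper's symbolic boundary/junction descriptors are not
# published here, so the feature names and values below are invented placeholders.
from sklearn.preprocessing import OneHotEncoder
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Each face image -> a small vector of symbolic (categorical) features.
X = [
    ["convex", "T-junction", "long"],
    ["concave", "Y-junction", "short"],
    ["convex", "Y-junction", "long"],
    ["concave", "T-junction", "short"],
]
y = ["person_A", "person_B", "person_A", "person_B"]   # identities (placeholders)

# One-hot encode the symbolic features, then grow a decision tree over them,
# mirroring a simple machine-learning program that builds a decision tree.
model = make_pipeline(
    OneHotEncoder(handle_unknown="ignore"),
    DecisionTreeClassifier(criterion="entropy", random_state=0),
)
model.fit(X, y)
print(model.predict([["convex", "T-junction", "long"]]))   # -> ['person_A']
```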
Dendritic Spine Shape Analysis: A Clustering Perspective
Functional properties of neurons are strongly coupled with their morphology. Changes in neuronal activity alter the morphological characteristics of dendritic spines. The first step towards understanding the structure-function relationship is to group spines into the main spine classes reported in the literature. Shape analysis of dendritic spines can help neuroscientists understand the underlying relationships. Due to the unavailability of reliable automated tools, this analysis is currently performed manually, which is a time-intensive and subjective task. Several studies on spine shape classification have been reported in the literature; however, there is an ongoing debate on whether distinct spine shape classes exist or whether spines should be modeled through a continuum of shape variations. Another challenge is the subjectivity and bias introduced by the supervised nature of classification approaches. In this paper, we aim to address these issues by presenting a clustering perspective. In this context, clustering may serve both the confirmation of known patterns and the discovery of new ones. We perform cluster analysis on two-photon microscopic images of spines using morphological, shape, and appearance-based features and gain insights into the spine shape analysis problem. We use histogram of oriented gradients (HOG), disjunctive normal shape models (DNSM), morphological features, and intensity profile based features for cluster analysis. We use x-means to perform cluster analysis, which selects the number of clusters automatically using the Bayesian information criterion (BIC). For all features, this analysis produces 4 clusters, and we observe the formation of at least one cluster consisting of spines that are difficult to assign to a known class. This observation supports the argument for intermediate shape types.
Comment: Accepted for the BioImageComputing workshop at ECCV 201
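The abstract names x-means with BIC-based model selection. x-means itself is not part of scikit-learn, so the following sketch approximates the idea by scoring Gaussian mixtures with BIC over a range of cluster counts; the feature vectors are random placeholders standing in for the HOG/DNSM/morphological features:

```python
# Sketch of BIC-driven selection of the number of clusters. The paper uses
# x-means; scikit-learn has no x-means, so this approximates the idea by
# scoring Gaussian mixtures with BIC. The feature vectors are placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 16))        # placeholder spine feature vectors

best_k, best_bic, best_model = None, np.inf, None
for k in range(2, 9):                        # candidate numbers of clusters
    gmm = GaussianMixture(n_components=k, random_state=0).fit(features)
    bic = gmm.bic(features)                  # lower BIC = better fit/complexity trade-off
    if bic < best_bic:
        best_k, best_bic, best_model = k, bic, gmm

labels = best_model.predict(features)        # cluster assignment per spine
print("selected number of clusters:", best_k)
```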
(Quantum) Space-Time as a Statistical Geometry of Fuzzy Lumps and the Connection with Random Metric Spaces
We develop a kind of pregeometry consisting of a web of overlapping fuzzy lumps which interact with each other. The individual lumps are understood as certain closely entangled subgraphs (cliques) in a dynamically evolving network which, in a certain approximation, can be visualized as a time-dependent random graph. This strand of ideas is merged with another one, deriving from ideas developed some time ago by Menger et al., that is, the concept of probabilistic or random metric spaces, representing a natural extension of the metrical continuum into a more microscopic regime. It is our general goal to find a better-adapted geometric environment for the description of microphysics. In this sense one may also view it as a dynamical randomisation of the causal-set framework developed by, e.g., Sorkin et al. In doing this we incorporate, as a perhaps new aspect, various concepts from fuzzy set theory.
Comment: 25 pages, LaTeX, no figures, some references added, some minor changes added relating to previous work
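As a purely toy illustration of the "fuzzy lumps as cliques in a time-dependent random graph" picture (the paper's dynamical network law is of course much richer and is not modelled here), one can resample a random graph over a few time steps and enumerate its maximal cliques, e.g. with networkx:

```python
# Toy illustration only: resample an Erdos-Renyi random graph at a few "time
# steps" and list its maximal cliques, mimicking the picture of lumps as
# cliques in a time-dependent random graph. Parameters are illustrative.
import networkx as nx

n_nodes, edge_prob = 30, 0.25              # illustrative network size and density
for t in range(3):                          # three snapshots of the evolving network
    G = nx.erdos_renyi_graph(n_nodes, edge_prob, seed=t)
    cliques = [c for c in nx.find_cliques(G) if len(c) >= 3]
    print(f"t={t}: {len(cliques)} maximal cliques of size >= 3")
```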
Modeling sea-salt aerosols in the atmosphere: 2. Atmospheric concentrations and fluxes
Atmospheric sea-salt aerosol concentrations are studied using both long-term observations and model simulations of Na+ at seven stations around the globe. Good agreement is achieved between observations and model predictions in the northern hemisphere. A stronger seasonal variation occurs in the high-latitude North Atlantic than in regions close to the equator and in the high-latitude southern hemisphere. Generally, concentrations are higher in both boreal and austral winters. With the model, the production flux and removal flux at the atmosphere-ocean interface were calculated and used to estimate the global sea-salt budget. The flux also shows seasonal variation similar to that of the sea-salt concentration. Depending on the geographic location, the model predicts that dry deposition accounts for 60–70% of the total sea-salt removed from the atmosphere, while in-cloud and below-cloud precipitation scavenging account for about 1% and 28–39% of the remainder, respectively. The total amount of sea-salt aerosols emitted from the world oceans to the atmosphere is estimated to be in the vicinity of 1.17×10¹⁶ g yr⁻¹. Approximately 99% of the sea-salt aerosol mass generated by wind falls back to the sea, with about 1–2% remaining in the atmosphere to be exported from the original grid square (300×300 km). Only a small portion of that exported (∼4%) is associated with submicron particles that are likely to undergo long-range transport.
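A quick arithmetic re-expression of the quoted budget numbers can make the partition more tangible; only the total emission and the percentage ranges below come from the abstract, and the script merely converts them to absolute fluxes:

```python
# Back-of-the-envelope re-expression of the quoted global sea-salt budget.
# Only the total emission and the percentage ranges come from the abstract;
# this simply converts them to absolute mass fluxes.
total_emission_g_per_yr = 1.17e16           # global sea-salt emission (from the abstract)

dry_fraction = (0.60, 0.70)                 # dry-deposition share of removal (abstract)
exported_fraction = (0.01, 0.02)            # mass leaving the source grid square (abstract)

for lo, hi, label in [(*dry_fraction, "dry deposition"),
                      (*exported_fraction, "exported from grid square")]:
    print(f"{label}: {lo * total_emission_g_per_yr:.2e} - "
          f"{hi * total_emission_g_per_yr:.2e} g/yr")
```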
Multiplicity dependence of jet-like two-particle correlations in p-Pb collisions at √sNN = 5.02 TeV
Two-particle angular correlations between unidentified charged trigger and
associated particles are measured by the ALICE detector in p-Pb collisions at a
nucleon-nucleon centre-of-mass energy of 5.02 TeV. The transverse-momentum
range 0.7 < pT < 5.0 GeV/c is examined, to include correlations induced by jets originating from low momentum-transfer scatterings (minijets). The correlations expressed as
associated yield per trigger particle are obtained in the pseudorapidity range |η| < 0.9. The near-side long-range pseudorapidity correlations observed in
high-multiplicity p-Pb collisions are subtracted from both near-side
short-range and away-side correlations in order to remove the non-jet-like
components. The yields in the jet-like peaks are found to be invariant with
event multiplicity with the exception of events with low multiplicity. This
invariance is consistent with the particles being produced via the incoherent
fragmentation of multiple parton-parton scatterings, while the yield related
to the previously observed ridge structures is not jet-related. The number of
uncorrelated sources of particle production is found to increase linearly with
multiplicity, suggesting no saturation of the number of multi-parton
interactions even in the highest multiplicity p-Pb collisions. Further, the
number scales in the intermediate multiplicity region with the number of binary
nucleon-nucleon collisions estimated with a Glauber Monte-Carlo simulation.
Comment: 23 pages, 6 captioned figures, 1 table, authors from page 17, published version, figures at http://aliceinfo.cern.ch/ArtSubmission/node/161
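The per-trigger associated yield and the ridge subtraction described above are not written out in the abstract; in the conventional notation of two-particle correlation analyses (an assumed notation, not a quotation from the paper), they take roughly this form:

```latex
% Associated yield per trigger particle as a function of the pair angle differences:
Y(\Delta\varphi,\Delta\eta) \;=\; \frac{1}{N_{\mathrm{trig}}}
    \frac{\mathrm{d}^{2}N_{\mathrm{assoc}}}{\mathrm{d}\Delta\varphi\,\mathrm{d}\Delta\eta}
% Jet-like yield after removing the long-range ("ridge") component measured at
% large pseudorapidity separation from the short-range and away-side correlations:
Y_{\mathrm{jet}}(\Delta\varphi) \;=\; Y_{\text{short-range}}(\Delta\varphi)
    \;-\; Y_{\text{long-range}}(\Delta\varphi)
```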
Prospects for measuring the gravitational free-fall of antihydrogen with emulsion detectors
The main goal of the AEgIS experiment at CERN is to test the weak equivalence
principle for antimatter. AEgIS will measure the free-fall of an antihydrogen
beam traversing a moiré deflectometer. The goal is to determine the
gravitational acceleration g for antihydrogen with an initial relative accuracy
of 1% by using an emulsion detector combined with a silicon micro-strip
detector to measure the time of flight. Nuclear emulsions can measure the
annihilation vertex of antihydrogen atoms with a precision of about 1 - 2
microns r.m.s. We present here results for emulsion detectors operated in
vacuum using low energy antiprotons from the CERN antiproton decelerator. We
compare with Monte Carlo simulations, and discuss the impact on the AEgIS
project.
Comment: 20 pages, 16 figures, 3 tables
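To see why micron-level vertex resolution matters for a 1% measurement of g, a back-of-the-envelope estimate of the free fall accumulated during the flight is useful; the beam velocity and path length below are hypothetical placeholders, not AEgIS design values:

```python
# Order-of-magnitude sketch: how far does a horizontally launched antihydrogen
# atom fall under gravity during its flight? Beam velocity and path length are
# hypothetical placeholders, not AEgIS design values.
g = 9.81                      # m/s^2, assuming antimatter falls like matter
beam_velocity = 500.0         # m/s (placeholder)
path_length = 0.80            # m (placeholder)

time_of_flight = path_length / beam_velocity
fall = 0.5 * g * time_of_flight ** 2      # vertical drop over the flight path
print(f"time of flight: {time_of_flight * 1e3:.2f} ms, "
      f"free fall: {fall * 1e6:.1f} micrometres")
```

For these placeholder numbers the drop is of order ten micrometres, which is why annihilation-vertex resolution of about 1–2 microns r.m.s. is relevant to the stated accuracy goal.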
Enumeration of CD4+ T-Cells Using a Portable Microchip Count Platform in Tanzanian HIV-Infected Patients
Background
CD4+ T-lymphocyte count (CD4 count) is a standard method used to monitor HIV-infected patients during anti-retroviral therapy (ART). The World Health Organization (WHO) has pointed out that a handheld, point-of-care, reliable, and affordable CD4 count platform is urgently needed in resource-scarce settings.
Methods
HIV-infected patient blood samples were tested at the point of care using a portable, label-free microchip CD4 count platform that we have developed. A total of 130 HIV-infected patient samples were collected, including 16 de-identified leftover blood samples from Brigham and Women's Hospital (BWH) and 114 leftover samples from Muhimbili University of Health and Allied Sciences (MUHAS) patients enrolled in the HIV and AIDS care and treatment centers in the City of Dar es Salaam, Tanzania. The two data groups from BWH and MUHAS were analyzed and compared to the commonly accepted CD4 count reference method (the FACSCalibur system).
Results
The portable, battery-operated, and microscope-free microchip platform developed in our laboratory (BWH) showed significant correlation in CD4 counts compared with the FACSCalibur system both at BWH (r = 0.94, p<0.01) and at MUHAS (r = 0.49, p<0.01), which was supported by Bland-Altman methods comparison analysis. The device rapidly produced CD4 counts within 10 minutes using an in-house developed automated cell counting program.
Conclusions
We obtained CD4 counts of HIV-infected patients using a portable platform based on an inexpensive (<$1 material cost), disposable microchip that uses a whole blood sample (<10 µl) without any pre-processing. The system operates without the need for antibody-based fluorescent labeling or an expensive fluorescent illumination and microscope setup. This portable CD4 count platform displays agreement with the FACSCalibur results and has the potential to expand access to HIV and AIDS monitoring using a fingerprick volume of whole blood, helping people who suffer from HIV and AIDS in resource-limited settings.
Funding: Wallace H. Coulter Foundation (Young Investigator Award in Bioengineering); National Institutes of Health (U.S.) (NIH R01AI081534); National Institutes of Health (U.S.) (NIH R21AI087107); National Institutes of Health (U.S.) (NIH grant RR016482); National Institutes of Health (U.S.) (grant AI060354); National Institutes of Health (U.S.) (NIH Fogarty Fellowship)
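The two comparisons reported above (Pearson correlation and Bland-Altman agreement between microchip and FACSCalibur counts) follow standard recipes; a minimal sketch with synthetic placeholder counts, not patient data, is:

```python
# Sketch of the two comparisons reported above: Pearson correlation and a
# Bland-Altman analysis between microchip and reference CD4 counts.
# The count arrays are synthetic placeholders, not patient data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
facs = rng.uniform(50, 1000, size=60)                  # reference (FACSCalibur-like) counts
chip = facs + rng.normal(0, 60, size=60)               # microchip counts with noise

r, p_value = pearsonr(chip, facs)                      # correlation between the two methods

diff = chip - facs
bias = diff.mean()                                     # Bland-Altman mean difference
loa = 1.96 * diff.std(ddof=1)                          # 95% limits of agreement

print(f"Pearson r = {r:.2f} (p = {p_value:.3g})")
print(f"Bland-Altman bias = {bias:.1f} cells/ul, "
      f"limits of agreement = [{bias - loa:.1f}, {bias + loa:.1f}]")
```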
Multi-particle azimuthal correlations in p-Pb and Pb-Pb collisions at the CERN Large Hadron Collider
Measurements of multi-particle azimuthal correlations (cumulants) for charged
particles in p-Pb and Pb-Pb collisions are presented. They help address the
question of whether there is evidence for global, flow-like, azimuthal
correlations in the p-Pb system. Comparisons are made to measurements from the
larger Pb-Pb system, where such evidence is established. In particular, the
second harmonic two-particle cumulants are found to decrease with multiplicity,
characteristic of a dominance of few-particle correlations in p-Pb collisions.
However, when a |Δη| gap is placed to suppress such correlations,
the two-particle cumulants begin to rise at high-multiplicity, indicating the
presence of global azimuthal correlations. The Pb-Pb values are higher than the
p-Pb values at similar multiplicities. In both systems, the second harmonic
four-particle cumulants exhibit a transition from positive to negative values
when the multiplicity increases. The negative values allow for a measurement of v2{4} to be made, which is found to be higher in Pb-Pb collisions at
similar multiplicities. The second harmonic six-particle cumulants are also
found to be higher in Pb-Pb collisions. In Pb-Pb collisions, we generally find v2{4} ≃ v2{6} ≠ 0, which is indicative of a Bessel-Gaussian function for the v2 distribution. For very high-multiplicity Pb-Pb
collisions, we observe that the four- and six-particle cumulants become
consistent with 0. Finally, third harmonic two-particle cumulants in p-Pb and
Pb-Pb are measured. These are found to be similar for overlapping
multiplicities, when a |Δη| gap is placed.
Comment: 25 pages, 11 captioned figures, 3 tables, authors from page 20, published version, figures at http://aliceinfo.cern.ch/ArtSubmission/node/87
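The symbols restored above refer to flow coefficients estimated from multi-particle cumulants; for orientation, the standard second-harmonic two- and four-particle relations from the cumulant literature (not quoted from this abstract) are:

```latex
% Second-harmonic two- and four-particle cumulants and the corresponding
% flow estimates (standard definitions from the cumulant method):
c_{2}\{2\} = \langle\langle 2 \rangle\rangle, \qquad
c_{2}\{4\} = \langle\langle 4 \rangle\rangle - 2\,\langle\langle 2 \rangle\rangle^{2},
v_{2}\{2\} = \sqrt{c_{2}\{2\}}, \qquad
v_{2}\{4\} = \left(-\,c_{2}\{4\}\right)^{1/4}.
```

These relations are consistent with the abstract's statement that negative four-particle cumulants allow a measurement of v2{4} to be made.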
