Propagation of whistler-mode chorus to low altitudes: divergent ray trajectories and ground accessibility
We investigate the ray trajectories of nonducted lower-band chorus waves with respect to their initial wave normal angle θ₀ between the wave vector and the ambient magnetic field. Although we consider a wide range of initial angles θ₀, in order to be consistent with recent satellite observations we pay special attention to the intervals of initial angles for which the waves propagate along the field lines in the source region, i.e. we mainly focus on waves generated with θ₀ within an interval close to 0° and on waves generated within an interval close to the Gendrin angle. We demonstrate that the ray trajectories of waves generated within an interval close to the Gendrin angle, with the wave vector directed towards lower L-shells (towards the Earth), diverge significantly at frequencies typical of lower-band chorus. Some of these diverging trajectories reach the topside ionosphere with θ close to 0°; thus, a part of the wave energy may leak to the ground at higher latitudes, where the field lines are nearly vertical. Waves generated with other initial angles are reflected. A small variation of the initial wave normal angle therefore changes the behaviour of the resulting ray dramatically. Although our approach is theoretical, based on ray tracing simulations, we show that the initial angles θ₀ of the waves reaching the ionosphere (and possibly the ground) differ by only a few degrees from the initial angles that fit the observations of magnetospherically reflected chorus by the CLUSTER satellites. We also mention observations of diverging trajectories by low-altitude satellites.
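For reference (a standard cold-plasma relation, not taken from the abstract above): for whistler-mode waves at frequency ω well below the electron gyrofrequency Ω_e, the Gendrin angle θ_G, at which the ray direction stays aligned with the magnetic field despite an oblique wave vector, satisfies approximately

    \cos\theta_G \simeq \frac{2\omega}{\Omega_e}

so for lower-band chorus frequencies (roughly 0.1 to 0.5 Ω_e) the Gendrin angle ranges from strongly oblique values down towards 0°.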
Sampled Weighted Min-Hashing for Large-Scale Topic Mining
We present Sampled Weighted Min-Hashing (SWMH), a randomized approach to
automatically mine topics from large-scale corpora. SWMH generates multiple
random partitions of the corpus vocabulary based on term co-occurrence and
agglomerates highly overlapping inter-partition cells to produce the mined
topics. While other approaches define a topic as a probabilistic distribution
over a vocabulary, SWMH topics are ordered subsets of such vocabulary.
Interestingly, the topics mined by SWMH underlie themes from the corpus at
different levels of granularity. We extensively evaluate the meaningfulness of
the mined topics both qualitatively and quantitatively on the NIPS (1.7 K
documents), 20 Newsgroups (20 K), Reuters (800 K) and Wikipedia (4 M) corpora.
Additionally, we compare the quality of SWMH with Online LDA topics for
document representation in classification.
Comment: 10 pages, Proceedings of the Mexican Conference on Pattern Recognition 201
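As a rough illustration of the partitioning step described above, the sketch below hashes each vocabulary term by the set of documents it occurs in and groups terms whose signature tuples collide into the cells of one random partition. It is a minimal sketch based only on the abstract: the names are my own, and plain unweighted MinHash stands in for the weighted sampling actually used by SWMH.

    import random
    from collections import defaultdict

    def minhash_signature(doc_ids, seeds, prime=2**61 - 1):
        # one MinHash value per (a, b) seed pair over a term's set of document ids
        return tuple(min((a * d + b) % prime for d in doc_ids) for a, b in seeds)

    def random_partition(term_to_docs, r=2, seed=0):
        # group vocabulary terms whose r-tuple signatures collide into cells
        rng = random.Random(seed)
        seeds = [(rng.randrange(1, 2**61 - 1), rng.randrange(2**61 - 1)) for _ in range(r)]
        cells = defaultdict(set)
        for term, docs in term_to_docs.items():
            cells[minhash_signature(docs, seeds)].add(term)
        return list(cells.values())

    # terms occurring in similar document sets tend to fall into the same cell
    vocab = {"neural": {1, 2, 3}, "network": {1, 2, 3, 4}, "kidney": {7, 8}}
    print(random_partition(vocab))

Repeating this with different seeds yields multiple random partitions, and the highly overlapping inter-partition cells can then be agglomerated into the final topics, as the abstract describes.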
Engineering a Simplified 0-Bit Consistent Weighted Sampling
The Min-Hashing approach to sketching has become an important tool in data analysis, information retrieval, and classification. To apply it to real-valued datasets, the ICWS algorithm has become a seminal approach that is widely used and provides state-of-the-art performance for this problem space. However, ICWS suffers from a computational burden as the sketch size K increases. We develop a new Simplified approach to the ICWS algorithm that enables us to obtain over 20x speedups compared to the standard algorithm. The veracity of our approach is demonstrated empirically on multiple datasets and scenarios, showing that our new Simplified CWS obtains the same quality of results while being an order of magnitude faster.
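For context, the ICWS scheme referred to above (Ioffe's Consistent Weighted Sampling) draws, for each hash, one element index whose collision probability between two weighted sets equals their generalized Jaccard similarity. The sketch below is a minimal, unoptimized rendering of that published scheme, not the simplified variant proposed in this work; the per-element seeding through Python's built-in hash is my own shortcut and is only stable within a single process.

    import math
    import numpy as np

    def icws_hash(weights, hash_index):
        # one consistent weighted sample (k*, t_k*) for a sparse weight vector
        # weights: dict mapping element index -> positive weight
        best_key, best_a, best_t = None, math.inf, None
        for k, w in weights.items():
            # per-(hash, element) randomness; a stable hash would be needed in practice
            rng = np.random.default_rng(hash((hash_index, k)) & 0xFFFFFFFF)
            r = rng.gamma(2.0, 1.0)      # r_k ~ Gamma(2, 1)
            c = rng.gamma(2.0, 1.0)      # c_k ~ Gamma(2, 1)
            b = rng.uniform(0.0, 1.0)    # beta_k ~ Uniform(0, 1)
            t = math.floor(math.log(w) / r + b)
            y = math.exp(r * (t - b))
            a = c / (y * math.exp(r))
            if a < best_a:
                best_key, best_a, best_t = k, a, t
        return best_key, best_t          # a 0-bit variant keeps only best_key

    def sketch(weights, K=64):
        return [icws_hash(weights, j) for j in range(K)]

Every element is visited for every one of the K hashes, which is the K-dependent cost the abstract refers to; the fraction of positions at which two sketches agree estimates the weighted Jaccard similarity of the underlying sets.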
Assigning the causative lightning to the whistlers observed on satellites
We study the penetration of lightning-induced whistler waves through the ionosphere by investigating the correspondence between the whistlers observed on the DEMETER and MAGION-5 satellites and the lightning discharges detected by the European lightning detection network EUCLID. We compute all the possible differences between the times when the whistlers were observed on the satellite and the times when the lightning discharges were detected. We show that the occurrence histogram of these time differences exhibits a distinct peak at a particular characteristic time, corresponding to the sum of the propagation time and a possible small time shift between the absolute time assigned to the wave record and the clock of the lightning detection network. Knowing this characteristic time, we can search the EUCLID database for the locations, currents, and polarities of the causative lightning discharges corresponding to the individual whistlers. We demonstrate that the area in the ionosphere through which the electromagnetic energy induced by a lightning discharge enters the magnetosphere as whistler-mode waves can be up to several thousand kilometres wide.
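The matching procedure described above essentially histograms all whistler-minus-lightning time differences and locates the dominant peak. The sketch below is a minimal reading of that procedure; the bin width, search window, and variable names are my own assumptions, not values from the study.

    import numpy as np

    def characteristic_delay(whistler_times, lightning_times, window_s=2.0, bin_s=0.002):
        # histogram all pairwise time differences and return the centre of the peak bin
        # whistler_times, lightning_times: 1-D arrays of timestamps in seconds (same clock)
        dt = (np.asarray(whistler_times)[:, None] - np.asarray(lightning_times)[None, :]).ravel()
        dt = dt[(dt > 0.0) & (dt < window_s)]            # keep causally plausible delays only
        counts, edges = np.histogram(dt, bins=np.arange(0.0, window_s + bin_s, bin_s))
        peak = int(np.argmax(counts))
        return 0.5 * (edges[peak] + edges[peak + 1])     # propagation time plus any clock offset

Once this characteristic time is known, each whistler can be matched back to the EUCLID stroke whose detection time precedes it by approximately that amount, giving the location, current, and polarity of its causative discharge.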
Fast Outlier Rejection by Using Parallax-Based Rigidity Constraint for Epipolar Geometry Estimation
A novel approach is presented to reject correspondence outliers between frames using the parallax-based rigidity constraint for epipolar geometry estimation. In this approach, the invariance of the 3-D relative projective structure of a stationary scene across different views is exploited to eliminate outliers, which are mostly due to independently moving objects in a typical scene. The proposed approach is compared against a well-known RANSAC-based algorithm using a test bed. The results show that using the proposed technique as a preprocessing step before the RANSAC-based approach significantly decreases the overall execution time of outlier rejection.
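The reported speed-up comes from pruning candidate correspondences with a cheap test before the expensive robust estimation. The sketch below shows only that generic pre-filter-then-RANSAC flow in OpenCV; the simple median-displacement filter is a stand-in of my own and does not implement the paper's parallax-based rigidity constraint.

    import numpy as np
    import cv2

    def cheap_prefilter(pts1, pts2, factor=3.0):
        # stand-in test: discard matches whose displacement deviates strongly
        # from the dominant motion between the two frames
        disp = pts2 - pts1
        dev = np.linalg.norm(disp - np.median(disp, axis=0), axis=1)
        return dev < factor * (np.median(dev) + 1e-9)

    def estimate_epipolar_geometry(pts1, pts2):
        # pts1, pts2: N x 2 float32 arrays of matched image coordinates
        keep = cheap_prefilter(pts1, pts2)
        # RANSAC now runs on a smaller, cleaner candidate set, which is where the time saving comes from
        F, inliers = cv2.findFundamentalMat(pts1[keep], pts2[keep], cv2.FM_RANSAC, 1.0, 0.99)
        return F, keep, inliers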
BagMinHash - Minwise Hashing Algorithm for Weighted Sets
Minwise hashing has become a standard tool to calculate signatures which
allow direct estimation of Jaccard similarities. While very efficient
algorithms already exist for the unweighted case, the calculation of signatures
for weighted sets is still a time consuming task. BagMinHash is a new algorithm
that can be orders of magnitude faster than current state of the art without
any particular restrictions or assumptions on weights or data dimensionality.
Applied to the special case of unweighted sets, it represents the first
efficient algorithm producing independent signature components. A series of
tests finally verifies the new algorithm and also reveals limitations of other approaches published in the recent past.
Comment: 10 pages, KDD 201
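For context on what such signatures estimate (a minimal sketch, not the BagMinHash algorithm itself): the generalized, or weighted, Jaccard similarity of two non-negative weight vectors is the sum of element-wise minima divided by the sum of element-wise maxima, and a weighted minwise signature is built so that two independent components collide with exactly that probability, which makes the fraction of matching components an unbiased estimate.

    import numpy as np

    def weighted_jaccard(a, b):
        # exact generalized Jaccard similarity of two non-negative weight vectors
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        return np.minimum(a, b).sum() / np.maximum(a, b).sum()

    def estimate_from_signatures(sig_a, sig_b):
        # fraction of equal signature components; its expectation equals the
        # weighted Jaccard similarity for a consistent weighted sampler
        # (e.g. ICWS or BagMinHash)
        return float(np.mean(np.asarray(sig_a) == np.asarray(sig_b)))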
Follow-up of phase I trial of adalimumab and rosiglitazone in FSGS: III. Report of the FONT study group
Background: Patients with resistant primary focal segmental glomerulosclerosis (FSGS) are at high risk of progression to chronic kidney disease stage V. Antifibrotic agents may slow or halt this process. We present outcomes of follow-up after a Phase I trial of adalimumab and rosiglitazone, antifibrotic drugs tested in the Novel Therapies in Resistant FSGS (FONT) study. Methods: 21 patients (12 males and 9 females; age 16.0 ± 7.5 yr; estimated GFR (GFRe) 121 ± 56 mL/min/1.73 m2) received adalimumab (n = 10), 24 mg/m2 every 14 days, or rosiglitazone (n = 11), 3 mg/m2 per day, for 16 weeks. The change in GFRe per month prior to entry and after completion of the Phase I trial was compared. Results: 19 patients completed the 16-week FONT treatment phase. The observation period was 18.3 ± 10.2 months before FONT and 16.1 ± 5.7 months after the study. A similar percentage of patients, 71% and 56% in the rosiglitazone and adalimumab cohorts, respectively, had stabilization in GFRe, defined as a reduced negative slope of the line plotting GFRe versus time without requiring renal replacement therapy after completion of the FONT treatment period (P = 0.63). Conclusion: Nearly 50% of patients with resistant FSGS who receive novel antifibrotic agents may have a legacy effect with delayed deterioration in kidney function after completion of therapy. Based on this proof-of-concept preliminary study, we recommend long-term follow-up of patients enrolled in clinical trials to obtain a more comprehensive assessment of the efficacy of experimental treatments.
Loosely distinctive features for robust surface alignment
Many successful feature detectors and descriptors exist for 2D intensity images. However, obtaining the same effectiveness in the domain of 3D objects has proven to be a more elusive goal. In fact, the smoothness often found in surfaces and the lack of texture information in the range images produced by conventional 3D scanners hinder both the localization of interesting points and the distinctiveness of their characterization in terms of descriptors. To overcome these limitations, several approaches have been suggested, ranging from simply enlarging the area over which the descriptors are computed to relying on external texture information. In this paper we offer a change in perspective, in which a game-theoretic matching technique that exploits global geometric consistency makes it possible to obtain extremely robust surface registration even when coupled with simple surface features of very low distinctiveness. To assess the performance of the whole approach, we compare it with state-of-the-art alignment pipelines. Furthermore, we show that using the novel feature points with well-known alternative non-global matching techniques leads to poorer results.
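As a rough illustration of how a game-theoretic matching scheme of this kind typically enforces global geometric consistency, the sketch below scores pairs of candidate correspondences by how well they preserve pairwise distances and lets replicator dynamics select a mutually consistent subset. It is a generic sketch of the replicator-dynamics formulation with my own payoff choice, not a reproduction of the paper's exact method.

    import numpy as np

    def rigidity_payoff(src_pts, dst_pts, candidates, sigma=0.05):
        # payoff A[i, j]: how well candidate correspondences i and j preserve the
        # pairwise Euclidean distance between their source and destination points
        n = len(candidates)
        A = np.zeros((n, n))
        for i, (si, di) in enumerate(candidates):
            for j, (sj, dj) in enumerate(candidates):
                if i != j:
                    ds = np.linalg.norm(src_pts[si] - src_pts[sj])
                    dd = np.linalg.norm(dst_pts[di] - dst_pts[dj])
                    A[i, j] = np.exp(-abs(ds - dd) / sigma)
        return A

    def replicator_dynamics(A, iters=200):
        # evolve a population over candidate matches (assumes at least two candidates);
        # the surviving high-mass entries form a mutually consistent, rigid subset
        x = np.full(A.shape[0], 1.0 / A.shape[0])
        for _ in range(iters):
            x = x * (A @ x)
            x /= x.sum()
        return x

Candidates that are geometrically inconsistent with the emerging group receive low payoff and are driven towards zero, which is why even weakly distinctive features can still yield a robust alignment.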