The localization of single pulse in VLBI observation
In our previous work, we proposed a cross-spectrum-based method, originating from geodetic VLBI post-processing, to extract single pulse signals from RFI-contaminated data. That method fully utilizes the fringe phase information of the cross spectrum and hence maximizes the signal power; however, localization was not yet addressed. As a continuation of that work, in this paper we study how to localize single pulses with an astrometric solving method. Assuming the burst is a point source, we derive its position by solving a set of linear equations given the relation between the residual delay and the offset from the a priori position. We find that the single pulse localization results given by astrometric solving and by radio imaging are consistent at the 3 sigma level. Therefore we claim that it is possible to derive the position of a single pulse with reasonable precision from only 3, or even 2, baselines with 4 milliseconds of integration. The combination of the cross-spectrum-based detection and the localization proposed in this work thus provides a complete solution for single pulse searches in VLBI observations. According to our calculations, our pipeline achieves accuracy comparable to a radio imaging pipeline, while its computational cost is much smaller, which makes it more practical for FRB searches in regular VLBI observations. The pipeline is now publicly available; we name it "VOLKS", the acronym of "VLBI Observation for frb Localization Keen Searcher".
Comment: 11 pages, 4 figures, 3 tables, accepted for publication in A
Short Text Classification Research Based on TW-CNN
Short texts are characterized by short length and sparse features, which makes their classification less effective. Motivated by this, this paper extracts features at the "topic" and "word" levels by proposing a convolutional neural network (CNN) based on topics and words, named TW-CNN. It uses the Latent Dirichlet Allocation (LDA) topic model and word2vec to obtain two distinct word vector matrices, which are then fed as inputs to two CNNs. After convolution and pooling in each CNN, two different vector representations of the text are obtained. These vector representations are concatenated with the text-topic vector obtained by LDA, forming the final representation vector of the text, on which softmax classification is performed. Experiments on short news texts show that the TW-CNN model improves over traditional CNNs.
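As a rough illustration of the two-branch architecture described in this abstract, the PyTorch sketch below feeds a word2vec matrix and an LDA-based word matrix into two CNNs, pools each branch, and concatenates the pooled features with the document-topic vector before softmax classification. All dimensions, layer sizes, and the choice of max-over-time pooling are assumptions, not the paper's settings.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TWCNN(nn.Module):
    """Illustrative two-branch CNN: one branch over word2vec word vectors,
    one over LDA topic-based word vectors; pooled outputs are concatenated
    with the document-topic vector before the softmax classifier."""

    def __init__(self, word_dim=300, topic_word_dim=100, n_topics=50,
                 n_filters=100, kernel_size=3, n_classes=10):
        super().__init__()
        self.word_conv = nn.Conv1d(word_dim, n_filters, kernel_size)
        self.topic_conv = nn.Conv1d(topic_word_dim, n_filters, kernel_size)
        self.fc = nn.Linear(2 * n_filters + n_topics, n_classes)

    def forward(self, word_mat, topic_word_mat, doc_topic_vec):
        # word_mat:       (batch, seq_len, word_dim)        from word2vec
        # topic_word_mat: (batch, seq_len, topic_word_dim)  from LDA word-topic weights
        # doc_topic_vec:  (batch, n_topics)                 document-topic distribution
        w = F.relu(self.word_conv(word_mat.transpose(1, 2)))
        t = F.relu(self.topic_conv(topic_word_mat.transpose(1, 2)))
        w = F.max_pool1d(w, w.size(2)).squeeze(2)   # max-over-time pooling
        t = F.max_pool1d(t, t.size(2)).squeeze(2)
        feats = torch.cat([w, t, doc_topic_vec], dim=1)
        return F.log_softmax(self.fc(feats), dim=1)
```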
Parameter Selection and Uncertainty Measurement for Variable Precision Probabilistic Rough Set
In this paper, we consider the problems of parameter selection and uncertainty measurement for the variable precision probabilistic rough set. Firstly, within the framework of the variable precision probabilistic rough set model, the relative discernibility of a variable precision rough set in probabilistic approximation space is discussed, and the conditions that make the precision parameter α discernible in a variable precision probabilistic rough set are put forward. At the same time, considering that precision parameters in a variable precision probabilistic rough set cannot easily be predicted, we propose a systematic threshold selection method based on the relative discernibility of sets, using the concept of relative discernibility in probabilistic approximation space; a numerical example is used to test the validity of the proposed method. Secondly, we discuss the problem of uncertainty measurement for the variable precision probabilistic rough set. The concept of classical fuzzy entropy is introduced into probabilistic approximation space, and the uncertain information coming from both the approximation space and the approximated objects is fully considered. An axiomatic approach to uncertainty measurement in a variable precision probabilistic rough set is then established, and several related properties are discussed. Thirdly, we study attribute reduction for the variable precision probabilistic rough set, giving the definition of a reduct and its characteristic theorems. The main contribution of this paper is twofold: one is to propose a method of parameter selection for the variable precision probabilistic rough set; the other is to present a new approach to uncertainty measurement and a method of attribute reduction for the variable precision probabilistic rough set.
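For readers unfamiliar with the underlying construction, the sketch below computes variable precision lower and upper approximations from equivalence classes using the inclusion degree P(X | [x]) and a precision parameter alpha. It is a generic illustration under a common symmetric threshold convention; the paper's probabilistic variant, its parameter selection procedure, and its entropy-based uncertainty measure are not reproduced here.

```python
from collections import defaultdict

def equivalence_classes(universe, attributes, info):
    """Group objects by their values on the given attributes.
    info[x][a] is the value of attribute a for object x (hypothetical table)."""
    classes = defaultdict(set)
    for x in universe:
        key = tuple(info[x][a] for a in attributes)
        classes[key].add(x)
    return list(classes.values())

def vp_approximations(classes, target, alpha):
    """Variable precision lower/upper approximations of `target` at precision
    parameter alpha (0.5 < alpha <= 1), using inclusion degrees per class."""
    lower, upper = set(), set()
    for block in classes:
        degree = len(block & target) / len(block)   # conditional probability P(X | [x])
        if degree >= alpha:
            lower |= block
        if degree > 1 - alpha:
            upper |= block
    return lower, upper

# Tiny illustrative example with a made-up information table.
universe = {1, 2, 3, 4, 5, 6}
info = {1: {'a': 0}, 2: {'a': 0}, 3: {'a': 0},
        4: {'a': 1}, 5: {'a': 1}, 6: {'a': 2}}
classes = equivalence_classes(universe, ['a'], info)
low, up = vp_approximations(classes, target={1, 2, 4}, alpha=0.6)
print(low, up)   # {1, 2, 3} and {1, 2, 3, 4, 5}
```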
Ethanol Vapor Sensing Properties of Triangular Silver Nanostructures Based on Localized Surface Plasmon Resonance
A sensitive volatile organic vapor sensor based on the LSPR properties of silver triangular nanoprisms is proposed in this paper. The triangular nanoprisms were fabricated by a nanosphere lithography (NSL) method. They have sharp vertices and edges, and are arranged in an ideal hexagonal array. These characteristics ensure that they exhibit an excellent LSPR spectrum and a high sensitivity to changes in the exterior environment. The LSPR spectra responding to ethanol vapor and four other volatile organic vapors (acetone, benzene, hexane and propanol) were measured with a UV-vis spectrometer in real time. Compared with the other four vapors, ethanol exhibits the highest sensitivity (~0.1 nm per mg/L) and the lowest detection limit (~10 mg/L) in the spectral tests. The ethanol vapor test process is also fast (~4 s) and reversible. These insights demonstrate that the triangular nanoprism based nano-sensor can be used in ethanol vapor detection applications.
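As a back-of-the-envelope illustration of how the quoted sensitivity translates into a concentration readout, the snippet below converts an LSPR peak shift into an estimated ethanol concentration. The linear-response assumption and the example shift value are ours, not the paper's; only the ~0.1 nm per mg/L sensitivity and ~10 mg/L detection limit come from the abstract.

```python
# Illustrative conversion, not the paper's analysis code.
SENSITIVITY_NM_PER_MG_L = 0.1   # ~0.1 nm of peak shift per mg/L (quoted above)
DETECTION_LIMIT_MG_L = 10.0     # ~10 mg/L detection limit (quoted above)

def estimate_concentration(peak_shift_nm):
    """Estimate ethanol vapor concentration (mg/L) from an LSPR peak shift,
    assuming a linear response; returns None below the detection limit."""
    concentration = peak_shift_nm / SENSITIVITY_NM_PER_MG_L
    return concentration if concentration >= DETECTION_LIMIT_MG_L else None

print(estimate_concentration(2.5))   # 2.5 nm shift -> ~25 mg/L
```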
- …