FEC decoder design optimization for mobile satellite communications
A new telecommunications service for location determination via satellite is being proposed for the continental USA and Europe. It provides users with the capability to find the location of, and communicate from, a moving vehicle to a central hub and vice versa. This communications system is expected to operate in an extremely noisy channel in the presence of fading. In order to achieve high levels of data integrity, it is essential to employ forward error correcting (FEC) encoding and decoding techniques in such mobile satellite systems. A constraint length k = 7 FEC decoder has been implemented in a single chip for such systems. The single-chip implementation of the maximum likelihood decoder helps to minimize the cost, size, and power consumption, and improves the bit error rate (BER) performance of the mobile earth terminal (MET).
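As a point of reference for the decoder described above, the sketch below implements the encoder side of a constraint-length K = 7 convolutional code, whose output a maximum-likelihood (Viterbi) decoder would reverse. The code rate (1/2) and generator polynomials (171 and 133 octal, the widely used industry pair) are assumptions; the abstract does not specify them.

```python
# Minimal sketch of a rate-1/2, constraint-length K = 7 convolutional encoder.
# The generator polynomials 0o171 / 0o133 are assumed (standard industry pair);
# the paper itself does not state the code rate or the taps.

K = 7
G1, G2 = 0o171, 0o133  # assumed generator polynomials

def parity(x: int) -> int:
    """XOR of all bits of x (0 or 1)."""
    return bin(x).count("1") & 1

def conv_encode(bits):
    """Encode a sequence of data bits; emits two coded bits per input bit."""
    state = 0          # shift register holding the last K-1 = 6 input bits
    out = []
    for b in bits:
        reg = (b << (K - 1)) | state   # current bit followed by register contents
        out.append(parity(reg & G1))   # first coded bit
        out.append(parity(reg & G2))   # second coded bit
        state = reg >> 1               # shift the new bit into the register
    return out

# Example: a short message produces twice as many coded bits.
print(conv_encode([1, 0, 1, 1, 0, 0, 1]))
```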
Detecting Threats of Violence in Online Discussions Using Bigrams of Important Words
Making violent threats towards minorities such as immigrants or homosexuals is increasingly common on the Internet. We present a method to automatically detect threats of violence using machine learning. A corpus of 24,840 sentences from YouTube was manually annotated as violent threats or not, and was used to train and test the machine learning model. Detecting threats of violence works quite well, with an error of about 10% in classifying a violent sentence as non-violent when the error of classifying a non-violent sentence as violent is adjusted to 5%. The best classification performance is achieved by including features that combine specially chosen important words and the distance between them in the sentence.
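As a rough illustration of the feature scheme described above, the sketch below pairs co-occurring "important" words in a sentence with their token distance and feeds the resulting features to a linear classifier. The word list, distance capping, toy data, and classifier choice (scikit-learn's LogisticRegression) are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch: bigrams of "important" words combined with their in-sentence distance.
from itertools import combinations
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

IMPORTANT = {"kill", "shoot", "die", "you", "them", "should"}  # hypothetical word list

def features(sentence: str) -> dict:
    """For each pair of important words, emit a feature of the pair plus distance."""
    tokens = sentence.lower().split()
    hits = [(i, t) for i, t in enumerate(tokens) if t in IMPORTANT]
    feats = {}
    for (i, w1), (j, w2) in combinations(hits, 2):
        dist = min(j - i, 5)                  # cap the distance bucket
        feats[f"{w1}|{w2}|d{dist}"] = 1.0
    return feats

# Toy data standing in for the annotated YouTube sentences (1 = violent threat).
sentences = ["they should all die", "i will kill you", "have a nice day", "see you tomorrow"]
labels = [1, 1, 0, 0]

vec = DictVectorizer()
X = vec.fit_transform([features(s) for s in sentences])
clf = LogisticRegression().fit(X, labels)

# In practice the probability threshold would be tuned on held-out data so that
# roughly 5% of non-violent sentences are flagged as violent.
print(clf.predict_proba(vec.transform([features("i will kill you")]))[:, 1])
```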
Distributional equivalence and subcompositional coherence in the analysis of contingency tables, ratio-scale measurements and compositional data
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to “spectral mapping”, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
Keywords: association models, biplot, compositional data, contingency tables, correspondence analysis, distributional equivalence, log-ratio transformation, ratio-scale data, singular value decomposition
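A minimal sketch of the weighted log-ratio idea described above, assuming the row and column weights are the table's margins (as in correspondence analysis): log-transform the positive data, double-centre with those weights, and take a weighted SVD to obtain the low-dimensional display. The weight choice and coordinate scaling are assumptions made for illustration.

```python
# Sketch of weighted log-ratio analysis (cf. spectral mapping) on a positive table.
import numpy as np

def weighted_logratio(N):
    P = N / N.sum()
    r = P.sum(axis=1)              # row weights (masses), assumed to be the margins
    c = P.sum(axis=0)              # column weights (masses)
    L = np.log(N)                  # log of positive data
    # Weighted double-centring: the result depends only on ratios of the data,
    # which is what gives subcompositional coherence.
    Y = L - (L @ c)[:, None] - (r @ L)[None, :] + (r @ L @ c)
    # Weighted SVD for the low-dimensional (biplot) coordinates.
    S = np.sqrt(r)[:, None] * Y * np.sqrt(c)[None, :]
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U / np.sqrt(r)[:, None] * sv      # principal row coordinates
    cols = Vt.T / np.sqrt(c)[:, None] * sv   # principal column coordinates
    return rows, cols, sv

# Toy contingency table standing in for the linguistic or archaeological data.
N = np.array([[10., 20., 5.], [4., 8., 12.], [25., 5., 15.]])
rows, cols, sv = weighted_logratio(N)
print(sv)
```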
- …
