
    Fake News Identification and Classification Using DSSM and Improved Recurrent Neural Network Classifier

    The widespread use of social media has enormous consequences for society, culture and business, with potentially positive and negative effects. As online social networks are increasingly used for the dissemination of information, they are at the same time becoming a medium for the spread of fake news for various commercial and political purposes. Technologies such as Artificial Intelligence (AI) and Natural Language Processing (NLP) tools offer great promise for researchers to build systems that could automatically detect fake news. However, detecting fake news is a challenging task, as it requires models to summarise a news item and compare it to the actual news in order to classify it as fake. This project proposes a framework that detects and classifies fake news messages using an improved Recurrent Neural Network and a Deep Structured Semantic Model. The proposed approach identifies important features associated with fake news without prior domain knowledge while achieving 99% accuracy. The performance of the proposed system is analysed in terms of accuracy, specificity and sensitivity.
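    The three evaluation measures named in the abstract (accuracy, specificity and sensitivity) can all be derived from a binary confusion matrix. A minimal sketch, using made-up labels rather than the paper's data or model, with 1 standing for the "fake" (positive) class:

```python
def binary_metrics(y_true, y_pred):
    """Return (accuracy, specificity, sensitivity) for binary labels,
    treating 1 as the positive ("fake") class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0  # true positive rate
    specificity = tn / (tn + fp) if (tn + fp) else 0.0  # true negative rate
    return accuracy, specificity, sensitivity

# Toy labels for illustration only (not results from the paper):
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
acc, spec, sens = binary_metrics(y_true, y_pred)
```

    Reporting specificity and sensitivity alongside accuracy matters here because fake-news datasets are often class-imbalanced, where accuracy alone can be misleading.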

    Lossy image compression based on prediction error and vector quantisation

    Lossy image compression has been gaining importance in recent years due to the enormous increase in the volume of image data used for the Internet and other applications. In lossy compression, it is essential to ensure that the compression process does not adversely affect the quality of the image. The performance of a lossy compression algorithm is evaluated on two conflicting parameters: the compression ratio and the image quality, which is usually measured by the PSNR value. This paper proposes a new lossy compression method, denoted PE-VQ, which combines prediction error and vector quantisation (VQ) concepts. An optimum codebook is generated using a combination of two algorithms, namely artificial bee colony and genetic algorithms. The performance of the proposed PE-VQ method is evaluated in terms of compression ratio (CR) and PSNR using three different databases: CLEF med 2009, Corel 1k and standard images (Lena, Barbara, etc.). Experiments are conducted for different codebook sizes and different CR values. The results show that for a given CR, the proposed PE-VQ technique yields a higher PSNR value than existing algorithms. It is also shown that higher PSNR values can be obtained by applying VQ to prediction errors rather than to the original image pixels.
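    The abstract's key observation is that quantising prediction errors loses less quality than quantising pixels directly, because prediction errors are small and peaked around zero. A minimal sketch of that idea on a 1-D row of pixels, using a previous-pixel predictor and a uniform scalar quantiser as stand-ins for the paper's actual predictor and VQ codebook (both are assumptions for illustration):

```python
import math

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length sequences."""
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)
    return float('inf') if mse == 0 else 10 * math.log10(max_val ** 2 / mse)

def quantize(values, step):
    """Uniform scalar quantiser standing in for a VQ codebook lookup."""
    return [round(v / step) * step for v in values]

# Hypothetical pixel row (smooth, like natural image data), coarse step size:
row = [100, 102, 104, 103, 101, 99, 98, 100, 105, 110]
step = 8

# Scheme A: quantise the pixel values directly.
direct = quantize(row, step)

# Scheme B: quantise the prediction errors, then reconstruct.
recon = [row[0]]                    # first pixel transmitted as-is
for x in row[1:]:
    err = x - recon[-1]             # prediction error w.r.t. reconstructed value
    q_err = round(err / step) * step
    recon.append(recon[-1] + q_err)

psnr_direct = psnr(row, direct)     # distortion from quantising pixels
psnr_pred = psnr(row, recon)        # distortion from quantising errors
```

    On this toy row the prediction-error scheme reconstructs with a higher PSNR than direct quantisation, mirroring the paper's conclusion; the real method replaces the scalar quantiser with a VQ codebook optimised by artificial bee colony and genetic algorithms.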