
    Deep Web Data Source Classification Based on Text Feature Extension and Extraction

    With the growing volume of high-quality information in the Deep Web, classifying Deep Web data sources, the key to exploiting this information, has become a topic of great research value. In this paper, we propose a Deep Web data source classification method based on text feature extension and extraction. First, because data sources contain little text (some fewer than 10 words), the original text must be extended before it can be classified by content; in the text feature extension stage, we use an N-gram model to select extension words. Second, we propose a feature extraction and classification method based on an Attention-based Bi-LSTM: by combining LSTM with an attention mechanism, we obtain a contextual semantic representation that focuses on the words closest to the theme of the text, yielding a more accurate text vector representation. To evaluate the classification model, experiments were executed on the UIUC TEL-8 dataset. The experimental results show that the proposed method outperforms several existing methods.
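The extension step described above can be illustrated with a minimal sketch. The paper's actual N-gram model is not specified in the abstract, so this uses simple bigram co-occurrence counts over a hypothetical background corpus as a stand-in: words that frequently follow the short description's own terms are appended as extension words.

```python
from collections import Counter

def extend_text(short_text, corpus_sentences, top_k=3):
    """Expand a short data-source description with words that
    frequently co-occur with its terms in a background corpus,
    using bigram counts as a stand-in for an N-gram model."""
    seed = set(short_text.lower().split())
    bigrams = Counter()
    for sent in corpus_sentences:
        toks = sent.lower().split()
        for a, b in zip(toks, toks[1:]):
            # count words that directly follow a seed word
            if a in seed and b not in seed:
                bigrams[b] += 1
    extension = [w for w, _ in bigrams.most_common(top_k)]
    return short_text.split() + extension

# toy background corpus (illustrative only)
corpus = [
    "book store with cheap used book titles",
    "search book by author and title",
    "airline ticket search by destination",
]
print(extend_text("book search", corpus))
```

The extended token list, rather than the original two words, would then be fed to the downstream classifier.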

    Efficiency and Reliability in Bringing AI into Transport and Smart Cities Solutions

    capacity and the low cost of the Cloud have facilitated the development of new, powerful algorithms. The efficiency of these algorithms in Big Data processing, Deep Learning and Convolutional Networks is transforming the way we work and is opening new horizons. Thanks to them, we can now analyse data and obtain previously unimaginable solutions to today's problems. Nevertheless, our success is not based entirely on algorithms; it also comes from our ability to follow our "gut" when choosing the best combination of algorithms for an intelligent artefact. Their development involves the use of both connectionist and symbolic systems, that is to say, data and knowledge. Moreover, it is necessary to work with both historical and real-time data. It is also important to consider development time, costs, and the ability to create systems that will interact with their environment, connect with the objects that surround them, and manage the data they obtain in a reliable manner. In this keynote, the evolution of intelligent computer systems will be examined, especially that of convolutional networks. The need for human capital will be discussed, as well as the need to follow one's "gut instinct" in problem-solving. Furthermore, the importance of IoT and Blockchain in the development of intelligent systems will be analysed, and it will be shown how tools like "Deep Intelligence" make it possible to create computer systems efficiently and effectively. "Smart" infrastructures need to incorporate all added-value resources so they can offer useful services to society, while reducing costs, ensuring reliability and improving the quality of life of citizens. The combination of AI with IoT and with blockchain offers a world of possibilities and opportunities. The development of transport, smart cities, urban developments and leisure areas can be improved through the use of distributed intelligent computer systems.
In this regard, edge platforms and fog computing help increase efficiency, reduce network latency, improve security and bring intelligence to the edge of the network: to the sensors, the users and the environment. Several use cases of intelligent systems will be presented, and it will be analysed how their implementation and use have been optimized by means of different tools.

    Automatic Topic-Based Web Page Classification Using Deep Learning

    People frequently surf the internet with smartphones, laptops, or desktop computers to search for information on the web. The growth of online information has made the number of web pages increase day by day. Automatic topic-based web page classification manages this excessive number of web pages by assigning them to categories based on their content. Different machine learning algorithms have been employed as web page classifiers. However, there is a lack of studies reviewing the classification of web pages with deep learning. In this study, the automatic topic-based classification of web pages using deep learning, as proposed by many key researchers, is reviewed. The relevant research papers were selected from reputable research databases. The review examined the dataset, features, algorithm, pre-processing steps, document representation technique, and performance of each web page classification model. The document representation technique used to represent web page features is an important aspect of classification, as it affects the performance of the classification model; the integral web page feature is the textual content. Based on the review, image-based web page classification showed higher performance than text-based classification. Owing to the lack of a matrix representation that can effectively handle long web page text, a new document representation technique, the word cloud image, can be used to visualize the words extracted from the web page's text content.
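The word-cloud representation proposed above starts from word frequencies: a renderer scales each word's glyph by how often it occurs in the page text. The abstract does not give the extraction procedure, so the following is an illustrative sketch of the frequency-weighting step only (the stopword list and function name are assumptions, not the authors' pipeline):

```python
import re
from collections import Counter

# minimal illustrative stopword list; real pipelines use larger ones
STOPWORDS = {"the", "a", "of", "to", "and", "in", "is", "for", "on", "every"}

def word_cloud_weights(page_text, top_k=5):
    """Compute the (word, frequency) pairs a word-cloud renderer
    would scale glyph sizes by, after lowercasing, tokenizing,
    and removing stopwords."""
    tokens = re.findall(r"[a-z]+", page_text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return counts.most_common(top_k)

page = ("Football news and football scores. Live football results, "
        "league tables and match reports for every league.")
print(word_cloud_weights(page))
```

The resulting image (here, just the weights behind it) can then be fed to an image classifier, which is how the word-cloud technique sidesteps long-text matrix representations.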

    Rapid Deployment of Deep AI Models in Engineering Solutions

    The blockchain system, which appeared in 2009 together with the virtual currency bitcoin, is a record of digital transactions based on a large database in which all financial operations carried out with electronic currency are registered. The blockchain (or chain of blocks) is a shared database that works as a ledger for recording purchase-sale operations or any other transaction; it is the technological basis of bitcoin's operation, for example. It consists of a set of records held in a shared online database in which operations, amounts, dates and participants are registered by means of codes. Because it uses cryptographic keys and is distributed across many computers (people), it offers security advantages against manipulation and fraud: modifying a single copy would be useless, since the change would have to be made in all copies of the open, public database.
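The tamper-resistance described above can be sketched in a few lines: each block stores the hash of its predecessor, so altering one copy of an earlier block breaks the link that every honest copy can verify. This is a minimal illustration of the hash-chain idea, not Bitcoin's actual block format.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 digest of a block's contents."""
    return hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash, transactions):
    return {"prev": prev_hash, "tx": transactions}

def chain_is_valid(chain):
    """Each block must reference the hash of its predecessor."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

genesis = make_block("0" * 64, ["alice pays bob 5"])
second = make_block(block_hash(genesis), ["bob pays carol 2"])
chain = [genesis, second]
print(chain_is_valid(chain))   # True

# Tampering with one copy breaks the link that all copies check:
genesis["tx"] = ["alice pays bob 500"]
print(chain_is_valid(chain))   # False
```

Because rewriting history would require regenerating every subsequent hash on every distributed copy, a lone altered copy is simply rejected by the rest of the network.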