135 research outputs found

    MPP-MLO: Multilevel Parallel Partitioning for Efficiently Matching Large Ontologies

    The growing usage of the Semantic Web has resulted in an increasing number, size and heterogeneity of ontologies on the web, making ontology matching techniques that can cope with this growth highly necessary. Due to high computational requirements, scalability is a major concern in ontology matching systems. In this work, a partition-based ontology matching system is proposed that performs parallel partitioning of the ontologies at multiple levels. At the first level, root-based ontology partitioning is proposed: matchable sub-ontology pairs are generated using an efficient linguistic matcher (IEI-Sub) to uncover anchors, and pairs are then formed based on maximum similarity values. Because anchor discovery is highly time-consuming, a distributed and parallel MapReduce-based IEI-Sub process is proposed to handle it efficiently. In the second-level partitioning, an efficient approach is proposed to form non-overlapping clusters. An extensive experimental evaluation comparing existing approaches with the proposed approach shows that MPP-MLO is an efficient and scalable ontology matching system, achieving a 58.7% reduction in overall execution time.
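
    A minimal sketch of the anchor-discovery step described above, assuming a map phase that scores candidate concept-label pairs in parallel and a reduce phase that keeps the best-scoring match per source label. The IEI-Sub matcher is not detailed in the abstract, so a plain string similarity stands in for it, and the names below (label_similarity, find_anchors) are illustrative, not the authors' implementation.

    # Hypothetical sketch: difflib similarity stands in for the IEI-Sub
    # linguistic matcher, and multiprocessing stands in for MapReduce.
    from difflib import SequenceMatcher
    from multiprocessing import Pool

    def label_similarity(pair):
        """Map step: score one (source, target) concept-label pair."""
        a, b = pair
        return a, b, SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def find_anchors(labels_a, labels_b, threshold=0.6, workers=4):
        """Reduce step: keep the best-scoring target per source label."""
        pairs = [(a, b) for a in labels_a for b in labels_b]
        with Pool(workers) as pool:
            scored = pool.map(label_similarity, pairs)
        best = {}
        for a, b, score in scored:
            if score >= threshold and score > best.get(a, (None, 0.0))[1]:
                best[a] = (b, score)
        return {a: b for a, (b, _) in best.items()}

    if __name__ == "__main__":
        print(find_anchors(["Conference", "Author", "PaperTitle"],
                           ["conference", "writer", "paper_title"]))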

    Emergent relational schemas for RDF

    Learning AI with deepint.net

    This keynote will examine the evolution of intelligent computer systems in recent years, underscoring the need for human capital in this field so that further progress can be made. In this regard, learning about AI through experience is a big challenge, but it is possible thanks to tools such as deepint.net, which enables anyone to develop AI systems; knowledge of programming is no longer necessary.

    Intelligent Models in Complex Problem Solving

    Artificial Intelligence has experienced a revival in the last decade. The need for progress, growing processing capacity and the low cost of the Cloud have facilitated the development of new, powerful algorithms. The efficiency of these algorithms in Big Data processing, Deep Learning and Convolutional Networks is transforming the way we work and opening new horizons.

    Managing smart cities with deepint.net

    In this keynote, the evolution of intelligent computer systems will be examined. The need for human capital will be emphasised, as well as the need to follow one’s “gut instinct” in problem-solving. We will look at the benefits of combining information and knowledge to solve complex problems and examine how knowledge engineering facilitates the integration of different algorithms. Furthermore, we will analyse the importance of complementary technologies such as IoT and Blockchain in the development of intelligent systems. It will be shown how tools like "Deep Intelligence" make it possible to create computer systems efficiently and effectively. "Smart" infrastructures need to incorporate all added-value resources so they can offer useful services to society, while reducing costs, ensuring reliability and improving the quality of life of citizens. The combination of AI with IoT and blockchain offers a world of possibilities and opportunities.

    Smart territories

    The concept of smart cities is relatively new in research. Thanks to the colossal advances in Artificial Intelligence made over the last decade, we are now able to do what we once thought impossible: build cities driven by information and technology. In this keynote, we are going to look at the success stories of smart city-related projects and analyse the factors that led them to success. The development of interactive, reliable and secure systems, both connectionist and symbolic, is often a time-consuming process in which numerous experts are involved. However, intuitive and automated tools like “Deep Intelligence”, developed by DCSc and BISITE, facilitate this process. Furthermore, in this talk we will analyse the importance of complementary technologies such as IoT and Blockchain in the development of intelligent systems, as well as the use of edge platforms and fog computing.

    Smart Buildings

    This talk presents an efficient cyber-physical platform for the smart management of smart buildings: http://www.deepint.net. It is efficient because it facilitates the implementation of data acquisition and data management methods, as well as data representation and dashboard configuration. The platform allows for the use of any type of data source, ranging from the measurements of multi-functional IoT sensing devices to relational and non-relational databases. It is also smart because it incorporates a complete artificial intelligence suite for data analysis, with techniques for data classification, clustering, forecasting, optimization, visualization, etc. It is also compatible with the edge computing concept, allowing for the distribution of intelligence and the use of intelligent sensors. The concept of the smart building is evolving and adapting to new applications; the trend to create intelligent neighbourhoods, districts or territories is becoming increasingly popular, as opposed to the previous approach of managing an entire megacity. In this paper, the platform is presented, and its architecture and functionalities are described. Moreover, its operation has been validated in a case study at Salamanca - Ecocasa. This platform could enable smart buildings to develop adapted knowledge management systems, adapt them to new requirements, use multiple types of data, and execute efficient computational and artificial intelligence algorithms. The platform optimizes the decisions taken by human experts through explainable artificial intelligence models that obtain data from IoT sensors, databases, the Internet, etc. The global intelligence of the platform could potentially coordinate its decision-making processes with intelligent nodes installed at the edge, which would use the most advanced data processing techniques.
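
    The abstract above describes the platform's architecture (arbitrary data sources, an analysis suite, dashboards) but not its API, so the following hedged Python sketch only illustrates that kind of pipeline under stated assumptions; the names Reading, ingest and moving_average_forecast are hypothetical and do not correspond to deepint.net calls.

    # Hypothetical pipeline sketch: acquire IoT readings, normalise them,
    # and run a simple forecast of the kind a dashboard widget might show.
    from dataclasses import dataclass
    from statistics import mean
    from typing import List

    @dataclass
    class Reading:
        sensor_id: str   # e.g. an energy meter in one building zone
        timestamp: int   # Unix epoch seconds
        value: float     # measured consumption in kWh

    def ingest(rows: List[dict]) -> List[Reading]:
        """Data acquisition: normalise rows coming from any source
        (IoT device, relational or non-relational database)."""
        return [Reading(r["id"], int(r["ts"]), float(r["val"])) for r in rows]

    def moving_average_forecast(values: List[float], window: int = 3) -> float:
        """Minimal stand-in for the platform's forecasting suite: predict
        the next value as the mean of the last `window` readings."""
        return mean(values[-window:])

    if __name__ == "__main__":
        rows = [{"id": "meter-1", "ts": 1700000000 + i * 3600, "val": 2.0 + 0.1 * i}
                for i in range(6)]
        history = [r.value for r in ingest(rows)]
        print(f"next-hour forecast: {moving_average_forecast(history):.2f} kWh")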

    AIoT for Smart territories

    Artificial Intelligence has experienced a revival in the last decade. The need for progress, growing processing capacity and the low cost of the Cloud have facilitated the development of new, powerful algorithms. The efficiency of these algorithms in Big Data processing, Deep Learning and Convolutional Networks is transforming the way we work and opening new horizons. Thanks to them, we can now analyse data and obtain solutions to today’s problems that were previously unimaginable. Nevertheless, our success is not based entirely on algorithms; it also comes from our ability to follow our “gut” when choosing the best combination of algorithms for an intelligent artefact. It is about approaching engineering with a great deal of knowledge and tact. This involves the use of both connectionist and symbolic systems, and a full understanding of the algorithms used. Moreover, to address today’s problems we must work with both historical and real-time data.