22 research outputs found

    A Parallel Mining Algorithm for Maximum Erasable Itemset Based on Multi-core Processor

    Mining erasable itemsets is an active research area with applications to the problem of using limited funds efficiently to optimise production during an economic crisis. Since the problem was posed, researchers have proposed many algorithms to solve it, among which mining maximum erasable itemsets is a significant research direction. Because every subset of a maximum erasable itemset is itself erasable, all erasable itemsets can be obtained by mining only the maximum ones, which reduces the number of both candidate and resultant itemsets generated during mining. However, computing the values of many itemsets still consumes considerable CPU time on large datasets, and sequential algorithms struggle to solve the problem quickly. This study therefore presents a parallel algorithm for mining maximum erasable itemsets, called PAMMEI, based on a multi-core processor platform. The algorithm divides the entire mining task into multiple subtasks and assigns them to multiple processor cores for parallel execution, while an efficient pruning strategy shrinks the search space and increases mining speed. To verify the efficiency of PAMMEI, the paper compares it with state-of-the-art algorithms; the experimental results show that PAMMEI outperforms the comparable algorithms in runtime, memory usage, and scalability.
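
    The two ideas above, the downward-closure property (every subset of a maximum erasable itemset is erasable) and dividing the candidate space across cores, can be illustrated with a minimal sketch. The toy product database, gain threshold, and naive chunking below are illustrative assumptions, not PAMMEI's actual data structures or pruning strategy.

```python
from itertools import combinations
from multiprocessing import Pool

# Toy product database: each product is (set of component items, profit).
# Illustrative data only, not from the paper.
PRODUCTS = [({"a", "b"}, 100), ({"b", "c"}, 200), ({"c", "d"}, 50), ({"a", "d"}, 150)]
TOTAL_PROFIT = sum(p for _, p in PRODUCTS)
THRESHOLD = 0.5  # erasable if removing the itemset loses <= 50% of total profit

def gain(itemset):
    """Profit lost if every product containing any item of `itemset` is dropped."""
    return sum(p for items, p in PRODUCTS if items & itemset)

def is_erasable(itemset):
    return gain(itemset) <= THRESHOLD * TOTAL_PROFIT

def erasable_in(candidates):
    """Subtask run by one worker: filter its share of the candidate space."""
    return [c for c in candidates if is_erasable(frozenset(c))]

if __name__ == "__main__":
    items = sorted({i for items, _ in PRODUCTS for i in items})
    candidates = [c for r in range(1, len(items) + 1) for c in combinations(items, r)]
    # Split candidates into chunks and check them on separate cores,
    # loosely mirroring the divide-and-distribute idea described above.
    chunks = [candidates[i::4] for i in range(4)]
    with Pool(4) as pool:
        erasable = [e for part in pool.map(erasable_in, chunks) for e in part]
    # Maximum erasable itemsets: erasable itemsets with no erasable proper superset.
    esets = {frozenset(e) for e in erasable}
    maximal = [sorted(e) for e in esets if not any(e < f for f in esets)]
    print(maximal)
```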

    Semantic models in Web based Educational System integration

    Web-based e-Education systems are an important kind of information system that has benefited from Web standards for implementation, deployment, and integration. In this paper we propose and evaluate a semantic Web approach to support the features and interoperability of a real industrial e-Education system in production. We show how ontology-based knowledge representation supports the required features, their extension to new ones, the integration of external resources (e.g. official standards), and interoperability with other systems. We designed and implemented a proof of concept in an industrial context, evaluated it qualitatively and quantitatively, and benchmarked different alternatives on real data and real queries. We present a complete evaluation of quality of service and response time in this industrial context, and we show that, on a real-world testbed, semantic Web based solutions can meet industrial requirements, both in functionality and in efficiency, compared with existing operational solutions. We also show that ontology-oriented modelling opens up new opportunities for advanced functionality supporting resource recommendation and adaptive learning.
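
    As a rough illustration of how an ontology-based model supports resource recommendation, the sketch below builds a tiny RDF graph with rdflib and runs a SPARQL query for resources that teach a topic or one of its prerequisites. The edu: vocabulary is hypothetical, not the ontology of the system evaluated in the paper.

```python
from rdflib import Graph, Namespace, RDF

# Hypothetical mini-ontology for learning resources; names are illustrative.
EDU = Namespace("http://example.org/edu#")
g = Graph()
g.add((EDU.res1, RDF.type, EDU.LearningResource))
g.add((EDU.res1, EDU.teaches, EDU.Fractions))
g.add((EDU.res2, RDF.type, EDU.LearningResource))
g.add((EDU.res2, EDU.teaches, EDU.Decimals))
g.add((EDU.Fractions, EDU.prerequisiteOf, EDU.Decimals))

# Recommend resources teaching a topic or one of its prerequisites --
# the kind of query an ontology-oriented model makes straightforward.
q = """
PREFIX edu: <http://example.org/edu#>
SELECT ?res WHERE {
  ?res a edu:LearningResource .
  { ?res edu:teaches edu:Decimals . }
  UNION
  { ?res edu:teaches ?t . ?t edu:prerequisiteOf edu:Decimals . }
}
"""
for row in g.query(q):
    print(row.res)
```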

    Classification of n-th order limit language in formal language classes

    The study of splicing systems and their languages has grown rapidly since Paun developed a splicing system, known as a regular splicing scheme, that produces a regular language. Since then, researchers have sought to classify splicing languages into classes of the Chomsky hierarchy, such as the context-free, context-sensitive, and recursively enumerable languages. Previous work on the n-th order limit language approached it from a biological perspective; no research had yet examined it from the language-generation point of view. This research presents a generalisation of the formal language classes of the n-th order limit language: the cases that yield an n-th order limit language are revisited and used to determine which classes of the Chomsky hierarchy the languages produced by the n-th order limit language belong to. A sketch of the underlying splicing operation follows.
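
    For readers unfamiliar with splicing systems, the sketch below implements one application of a basic Paun splicing rule r = (u1, u2; u3, u4): from words x = x1 u1 u2 x2 and y = y1 u3 u4 y2 it produces x1 u1 u4 y2. This is the elementary operation only, not the n-th order limit construction studied here; the rule and words are illustrative.

```python
def splice(x, y, rule):
    """One application of a splicing rule r = (u1, u2; u3, u4):
    if x = x1 u1 u2 x2 and y = y1 u3 u4 y2, produce x1 u1 u4 y2.
    Returns every word obtainable over all cut positions."""
    u1, u2, u3, u4 = rule
    out = set()
    for i in range(len(x) + 1):
        if x[:i].endswith(u1) and x[i:].startswith(u2):
            for j in range(len(y) + 1):
                if y[:j].endswith(u3) and y[j:].startswith(u4):
                    out.add(x[:i] + u4 + y[j + len(u4):])
    return out

# Toy example over the alphabet {a, b}.
print(splice("aabb", "babb", ("a", "b", "b", "a")))  # {'aaabb'}
```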

    An ambient agent model for reading companion robot

    Reading is essentially a problem-solving task: like problem solving, it requires effort, planning, self-monitoring, strategy selection, and reflection based on what is read. As readers attempt more difficult problems, reading materials become more complex, demanding more effort and challenging cognition. To address this, companion robots can be deployed to assist readers with difficult reading tasks by making the reading process more enjoyable and meaningful. Such robots require an ambient agent model that monitors a reader's cognitive demand, since reading can involve complex tasks and dynamic interactions between the human and the environment. Current cognitive load models are not formulated with reasoning qualities and are not integrated into companion robots. This study was therefore conducted to develop an ambient agent model of cognitive load and reading performance for integration into a reading companion robot. The research activities followed the Design Science Research Process, Agent-Based Modelling, and the Ambient Agent Framework. The proposed model was evaluated through a series of verification and validation steps. Verification included equilibria evaluation and automated trace analysis to ensure the model exhibits realistic behaviour in accordance with related empirical data and literature. Validation, through a human experiment, showed that a reading companion robot was able to reduce cognitive load during demanding reading tasks. Moreover, the experimental results indicated that integrating the ambient agent model enabled the robot to be perceived as a social, intelligent, useful, and motivational digital sidekick. The study's contribution, an ambient agent model of cognitive load and reading performance, makes it feasible to design ambient applications grounded in human physical and cognitive processes, and it helps in designing more realistic reading companion robots in the future.
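
    As a loose illustration of the kind of temporal relation an ambient agent model might use to track cognitive load, the sketch below lets load drift toward task demand and triggers robot support above a threshold. The update rule, parameters, and threshold are hypothetical assumptions for illustration, not the model developed in this study.

```python
# Hypothetical first-order load dynamics; not the thesis's actual model.
def simulate(demand, load0=0.0, eta=0.3, dt=1.0, threshold=0.8):
    """Load drifts toward task demand; the agent intervenes above a threshold."""
    load, trace = load0, []
    for d in demand:
        load += eta * (d - load) * dt  # gradual adaptation of load to demand
        intervene = load > threshold   # robot offers support when overloaded
        trace.append((round(load, 3), intervene))
    return trace

# Reading difficulty ramps up, then (hypothetical) robot support kicks in.
print(simulate([0.4, 0.6, 0.9, 1.0, 1.0, 0.5]))
```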

    Computational Methods for Medical and Cyber Security

    Over the past decade, computational methods, including machine learning (ML) and deep learning (DL), have grown exponentially, delivering solutions in various domains, especially medicine, cybersecurity, finance, and education. While these applications of machine learning algorithms have proven beneficial in various fields, many shortcomings have also been highlighted, such as the lack of benchmark datasets, the inability to learn from small datasets, the cost of architectures, adversarial attacks, and imbalanced datasets. On the other hand, new and emerging approaches, such as deep learning, one-shot learning, continuous learning, and generative adversarial networks, have successfully solved various tasks in these fields. It is therefore crucial both to apply these new methods to life-critical missions and to measure the success of these less-traditional algorithms when used in such fields.
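
    As one concrete example of the imbalanced-dataset shortcoming noted above, the sketch below contrasts an unweighted classifier with scikit-learn's class_weight="balanced" reweighting on a synthetic 95/5 task. The data and model choice are illustrative, not drawn from the book.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic 95/5 imbalanced task, standing in for rare-positive settings
# such as disease or intrusion detection.
X, y = make_classification(n_samples=2000, weights=[0.95], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" reweights the loss inversely to class frequency,
# one standard mitigation for imbalance.
for cw in (None, "balanced"):
    clf = LogisticRegression(class_weight=cw, max_iter=1000).fit(Xtr, ytr)
    print(cw, balanced_accuracy_score(yte, clf.predict(Xte)))
```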

    Advances on Smart Cities and Smart Buildings

    Modern cities face the challenge of combining competitiveness on the global city scale with sustainable urban development to become smart cities. A smart city is a high-tech, intensive, and advanced city that connects people, information, and city elements using new technologies in order to create a sustainable, greener city; competitive and innovative commerce; and an increased quality of life. This Special Issue collects recent advancements in smart cities and smart buildings, covering a range of topics and aspects.

    Tuberculosis diagnosis from pulmonary chest x-ray using deep learning.

    Doctoral Degree, University of KwaZulu-Natal, Durban. Tuberculosis (TB) remains a life-threatening disease and one of the leading causes of mortality in developing countries, owing to poverty and inadequate medical resources. While treatment for TB is possible, it requires an accurate diagnosis first. Several screening tools are available, the most reliable being the chest X-ray (CXR), but the radiological expertise needed to interpret CXR images accurately is often lacking. CXR has traditionally been examined manually, a process that delays diagnosis, is time-consuming and expensive, and is prone to misdiagnosis, which can further spread the disease among individuals. An algorithm could therefore increase diagnostic efficiency, improve performance, reduce the cost of manual screening, and ultimately enable early, timely diagnosis. Several algorithms have been implemented to diagnose TB automatically, but they are characterised by low accuracy and sensitivity, leading to misdiagnosis. In recent years, Convolutional Neural Networks (CNNs), a class of deep learning models, have demonstrated tremendous success in object detection and image classification tasks. Hence, this thesis proposes an efficient Computer-Aided Diagnosis (CAD) system with high accuracy and sensitivity for TB detection and classification. The proposed model is based, first, on a novel end-to-end CNN architecture, then on a pre-trained deep CNN model that is fine-tuned and employed as a feature extractor from CXR images. Finally, ensemble learning is explored to develop an ensemble model for TB classification. The ensemble model achieved a new state-of-the-art diagnostic accuracy of 97.44%, with 99.18% sensitivity, 96.21% specificity, and an AUC of 0.96. These results are comparable with state-of-the-art techniques and outperform existing TB classification models.
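
    A minimal sketch of the transfer-learning stage described above, assuming a Keras/TensorFlow setup: a pre-trained backbone is frozen as a feature extractor and a small head is trained for binary TB/normal CXR classification. The ResNet50 backbone, input size, and hyperparameters are assumptions, not the thesis's exact configuration.

```python
import tensorflow as tf

# Pre-trained CNN used as a feature extractor from CXR images.
base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False,
                                      input_shape=(224, 224, 3))
base.trainable = False  # freeze pre-trained features; unfreeze later to fine-tune

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(TB) for one CXR image
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc"),
                       tf.keras.metrics.Recall(name="sensitivity")])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets not shown
```

    In a full pipeline along the lines described, several such fine-tuned backbones would be combined (e.g. by averaging predicted probabilities) to form the ensemble classifier.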

    Cyber Security and Critical Infrastructures

    This book contains the manuscripts accepted for publication in the MDPI Special Topic "Cyber Security and Critical Infrastructure" after a rigorous peer-review process. Authors from academia, government, and industry contributed innovative solutions, consistent with the interdisciplinary nature of cybersecurity. The book contains 16 articles: an editorial explaining current challenges, innovative solutions, and real-world experiences involving critical infrastructure, and 15 original papers presenting state-of-the-art solutions to attacks on critical systems, including a review of security and privacy issues in cloud, edge, and fog computing.