28 research outputs found

    Classifying Mental-Disorders through Clinicians Subjective Approach based on Three-way Decision

    In psychiatric diagnosis, the contemporary data-driven, manual-based method of classifying mental disorders is the most widely used technique; however, it has several unavoidable flaws. Using the three-way decision as a framework, we propose a unified model of clinicians' subjective approach (CSA) analysis consisting of three parts: qualitative analysis, quantitative analysis, and evaluation-based analysis. The qualitative and quantitative analyses yield a ranking list and a set of numerical weights based on illness magnitude levels according to the clinician's strongest assumptions. We further derive a comparative classification of illnesses into three groups of differing importance; a three-way evaluation-based model is used to interpret and present these results more clearly. The proposed method could be integrated with the manual-based process as a complementary tool to improve precision when diagnosing mental disorders.
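    The grouping into three regions that this abstract describes can be illustrated with a minimal sketch of an evaluation-based three-way split. The thresholds, weights, and disorder names below are illustrative assumptions, not values from the paper.

        # Minimal sketch of an evaluation-based three-way split: scores at or above
        # alpha go to the high-importance (acceptance) group, scores at or below
        # beta go to the low-importance (rejection) group, and everything in
        # between stays in the boundary (medium-importance) group.
        def three_way_region(score, alpha=0.7, beta=0.3):
            if score >= alpha:
                return "high importance"
            if score <= beta:
                return "low importance"
            return "medium importance"

        # Hypothetical clinician-assigned weights for a handful of disorders.
        weights = {"disorder A": 0.85, "disorder B": 0.55, "disorder C": 0.20}
        print({name: three_way_region(w) for name, w in weights.items()})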

    A Review of Radio Frequency Based Localization for Aerial and Ground Robots with 5G Future Perspectives

    Efficient localization plays a vital role in many modern applications of Unmanned Ground Vehicles (UGVs) and Unmanned Aerial Vehicles (UAVs), contributing to improved control, safety, and power economy. The ubiquitous 5G NR (New Radio) cellular network will provide new opportunities for enhancing the localization of UAVs and UGVs. In this paper, we review radio frequency (RF)-based approaches to localization. We survey the RF features that can be utilized for localization and investigate the current methods suitable for unmanned vehicles under two general categories: range-based and fingerprinting. The existing state-of-the-art literature on RF-based localization for both UAVs and UGVs is examined, and the envisioned role of 5G NR in enhancing localization, together with future research directions, is explored.
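    As a concrete illustration of the range-based category mentioned in this abstract, the following sketch estimates a 2-D position from anchor ranges by linearizing the squared-range equations and solving a least-squares problem. The anchor layout and simulated target are illustrative assumptions, not data from the review.

        import numpy as np

        def trilaterate(anchors, distances):
            # Estimate a 2-D position from known anchor coordinates and measured
            # ranges: subtract the last anchor's squared-range equation to obtain
            # a linear system, then solve it in the least-squares sense.
            anchors = np.asarray(anchors, dtype=float)
            d = np.asarray(distances, dtype=float)
            ref = anchors[-1]
            A = 2.0 * (ref - anchors[:-1])
            b = (d[:-1] ** 2 - d[-1] ** 2
                 - np.sum(anchors[:-1] ** 2, axis=1) + np.sum(ref ** 2))
            pos, *_ = np.linalg.lstsq(A, b, rcond=None)
            return pos

        anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]   # e.g. four base stations
        target = np.array([3.0, 4.0])
        ranges = [np.linalg.norm(target - np.array(a)) for a in anchors]
        print(trilaterate(anchors, ranges))              # approximately [3. 4.]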

    Efficient External-Memory Algorithms for Graph Mining

    The explosion of big data in areas such as the web and social networks has posed major challenges to research activities, including data mining, information retrieval, and security. This dissertation focuses on a particular area, graph mining, and specifically proposes several novel algorithms to solve the problems of triangle listing and computation of the neighborhood function in large-scale graphs. We first study the classic problem of triangle listing. We generalize the existing in-memory algorithms into a single framework of 18 triangle-search techniques. We then develop a novel external-memory approach, which we call Pruned Companion Files (PCF), that supports disk operation of all 18 algorithms. Compared to the state-of-the-art available implementations MGT and PDTL, PCF runs 5-10 times faster and exhibits orders of magnitude less I/O. We next focus on the I/O complexity of triangle listing. Recent work by Pagh et al. provides an appealing theoretical I/O complexity for triangle listing via graph partitioning by random coloring of nodes. Since no implementation of Pagh is available and little is known about how Pagh compares with PCF, we carefully implement Pagh, investigate the properties of these algorithms, model their I/O cost, identify their shortcomings, and shed light on the conditions under which each method outperforms the other. This insight leads us to develop a novel framework, which we call Trigon, that surpasses the I/O performance of both techniques in all graphs and under all RAM conditions. We finally turn our attention to the neighborhood function. Exact computation of the neighborhood function is expensive in terms of CPU and I/O cost, and previous work has mostly focused on approximations. We show that our novel techniques developed for triangle listing can also be applied to this problem. We then study an application of the neighborhood function to ranking of Internet hosts. Our method computes the neighborhood function of each host as an indication of its reputation. The evaluation shows that our method is robust to ranking manipulation and brings less spam into its top-ranked list than PageRank and TrustRank.
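    As background for the triangle-listing problem discussed in this abstract, the sketch below shows one classic in-memory technique of the kind the 18-algorithm framework generalizes: orient each edge from its lower-ranked to its higher-ranked endpoint and intersect forward neighbor lists. The example graph is an illustrative assumption; PCF's external-memory machinery is not reproduced here.

        from collections import defaultdict

        def list_triangles(edges):
            # Yield each triangle exactly once as a sorted node triple.
            adj = defaultdict(set)
            for u, v in edges:
                adj[u].add(v)
                adj[v].add(u)
            # Rank nodes by degree (ties broken by id) and keep only the edges
            # oriented from lower-ranked to higher-ranked endpoints.
            rank = {u: (len(adj[u]), u) for u in adj}
            fwd = {u: {v for v in adj[u] if rank[v] > rank[u]} for u in adj}
            for u in fwd:
                for v in fwd[u]:
                    # Common forward neighbors of u and v close a triangle.
                    for w in fwd[u] & fwd[v]:
                        yield tuple(sorted((u, v, w)))

        edges = [(1, 2), (2, 3), (1, 3), (3, 4), (2, 4)]
        print(sorted(list_triangles(edges)))   # [(1, 2, 3), (2, 3, 4)]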

    Robust and Scalable Data Representation and Analysis Leveraging Isometric Transformations and Sparsity

    The main focus of this doctoral thesis is the problem of robust and scalable data representation and analysis. The success of any machine learning and signal processing framework relies on how the data is represented and analyzed. Thus, in this work, we focus on three closely related problems: (i) supervised representation learning, (ii) unsupervised representation learning, and (iii) fault-tolerant data analysis. For the first task, we put forward new theoretical results on why a certain family of neural networks can become extremely deep and how this scalability property can be improved in a mathematically sound manner. We further investigate how such networks can be employed to generate data representations that are robust to outliers and to retrieve representative subsets of huge datasets. For the second task, we discuss two different methods, namely compressive sensing (CS) and nonnegative matrix factorization (NMF). We show that prior knowledge, such as slow variation in time, can be employed to introduce an unsupervised learning component into the traditional CS framework and to learn better compressed representations. Furthermore, we show that prior knowledge and sparsity constraints can be used in the context of NMF, not to find sparse hidden factors, but to enforce other structures, such as piecewise continuity. Finally, for the third task, we investigate how a data analysis framework can be made robust to faulty data and faulty data processors. We employ Bayesian inference and propose a scheme that solves the CS recovery problem in an asynchronous parallel manner. Furthermore, we show how sparsity can be used to make an optimization problem robust to faulty data measurements. The methods investigated in this work have applications in practical problems such as resource allocation in wireless networks, source localization, image/video classification, and search engines. A detailed discussion of these practical applications is presented for each method.
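    To make the compressive-sensing component mentioned in this abstract concrete, the following sketch recovers a sparse signal from a small number of random linear measurements using iterative soft-thresholding (ISTA). The problem sizes, step size, and regularization weight are illustrative assumptions; the thesis's Bayesian, asynchronous-parallel scheme is not reproduced here.

        import numpy as np

        def ista(A, y, lam=0.05, steps=500):
            # Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by iterative soft-thresholding.
            step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
            x = np.zeros(A.shape[1])
            for _ in range(steps):
                z = x - step * (A.T @ (A @ x - y))                          # gradient step
                x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)    # soft threshold
            return x

        rng = np.random.default_rng(0)
        n, m, k = 200, 80, 5                        # signal length, measurements, nonzeros
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
        A = rng.normal(size=(m, n)) / np.sqrt(m)    # random sensing matrix
        x_hat = ista(A, A @ x_true)
        print(np.linalg.norm(x_hat - x_true))       # small if recovery succeeds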

    A Review of Radio Frequency Based Localisation for Aerial and Ground Robots with 5G Future Perspectives

    Efficient localisation plays a vital role in many modern applications of Unmanned Ground Vehicles (UGVs) and Unmanned Aerial Vehicles (UAVs), contributing to improved control, safety, and power economy. The ubiquitous 5G NR (New Radio) cellular network will provide new opportunities to enhance the localisation of UAVs and UGVs. In this paper, we review radio frequency (RF)-based approaches to localisation. We survey the RF features that can be utilised for localisation and investigate the current methods suitable for unmanned vehicles under two general categories: range-based and fingerprinting. The existing state-of-the-art literature on RF-based localisation for both UAVs and UGVs is examined, and the envisioned role of 5G NR in enhancing localisation, together with future research directions, is explored.

    Semi-Annual Report to Congress for the Period of October 1, 1985 to March 31, 1986

    [Excerpt] This semiannual report covers the activities of the Department of Labor's Office of the Inspector General for the period October 1, 1985 through March 31, 1986. During this reporting period, we have continued our efforts to improve program management and operations within the Department of Labor. Audit initiatives resulted in numerous economy and efficiency findings and recommendations regarding the operations of the Department's agencies. The increasing use of the clustering strategy by Investigations produced a marked statistical increase in successful prosecutions. Labor Racketeering continued its efforts to curtail significant embezzlements from employee pension and health funds.

    Concept learning consistency under three-way decision paradigm

    Concept Mining is one of the main challenges both in Cognitive Computing and in Machine Learning. The ongoing improvement of solutions to address this issue raises the need to analyze whether the consistency of the learning process is preserved. This paper addresses a particular problem, namely, how the concept mining capability changes when the hypothesis class is reconsidered. The issue is raised from the point of view of the so-called Three-Way Decision (3WD) paradigm, which provides a sound framework for reconsidering decision-making processes, including those assisted by Machine Learning. Thus, the paper aims to analyze the influence of 3WD techniques on the Concept Learning Process itself. For this purpose, we introduce new versions of the Vapnik-Chervonenkis dimension. Likewise, to illustrate how the formal approach can be instantiated in a particular model, the case of concept learning in (Fuzzy) Formal Concept Analysis is considered. This work is supported by the State Investigation Agency (Agencia Estatal de Investigación), project PID2019-109152GB-100/AEI/10.13039/501100011033. We acknowledge the reviewers for their suggestions and guidance on additional references that have enriched our paper. Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
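    The notion behind the Vapnik-Chervonenkis dimension mentioned in this abstract rests on a shattering check that can be sketched in a few lines: a point set is shattered if every binary labelling of it is realized by some hypothesis. The hypothesis class used below (1-D threshold classifiers over a coarse grid) is an illustrative assumption, not the (fuzzy) formal-concept setting of the paper.

        def shatters(points, hypotheses):
            # True if the hypothesis class realizes every labelling of `points`.
            realized = {tuple(h(p) for p in points) for h in hypotheses}
            return len(realized) == 2 ** len(points)

        # Threshold classifiers h_t(x) = [x >= t] over a coarse grid of thresholds.
        hypotheses = [lambda x, t=t: x >= t for t in [i / 10 for i in range(-10, 21)]]

        print(shatters([0.5], hypotheses))        # True: a single point can get both labels
        print(shatters([0.2, 0.8], hypotheses))   # False: the labelling (1, 0) is unrealizable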

    International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book

    The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions. This book comprises the full conference program. In particular, it contains the scientific program both in survey style and in full detail, together with information on the social program, the venue, special meetings, and more.

    Trends and Future of Sustainable Development: Proceedings of the Conference "Trends and Future of Sustainable Development", 9–10 June 2011, Tampere, Finland
