Rough sets, their extensions and applications
Rough set theory provides a useful mathematical foundation for developing automated computational systems that can help understand and make use of imperfect knowledge. Despite its recency, the theory and its extensions have been widely applied to many problems, including decision analysis, data mining, intelligent control and pattern recognition. This paper presents an outline of the basic concepts of rough sets and their major extensions, covering variable precision, tolerance and fuzzy rough sets. It also shows the diversity of successful applications these theories have entailed, ranging from finance and business, through biology and medicine, to physics, art, and meteorology.
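The basic concepts the paper outlines can be illustrated with the standard lower and upper approximations of rough set theory; the universe, partition, and target set below are illustrative assumptions, not data from the paper.

```python
def approximations(partition, target):
    """Given a partition of the universe into equivalence classes,
    return the (lower, upper) approximations of `target`."""
    target = set(target)
    lower, upper = set(), set()
    for block in partition:
        block = set(block)
        if block <= target:
            lower |= block   # block certainly inside the target set
        if block & target:
            upper |= block   # block possibly inside the target set
    return lower, upper

# Universe {1..6} partitioned by an assumed indiscernibility relation:
partition = [{1, 2}, {3, 4}, {5, 6}]
X = {1, 2, 3}
low, up = approximations(partition, X)
# low == {1, 2} (certain members); up == {1, 2, 3, 4} (possible members)
```

The gap between the two approximations (the boundary region) is what the variable precision and fuzzy extensions mentioned above relax in different ways.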
LearnFCA: A Fuzzy FCA and Probability Based Approach for Learning and Classification
Formal concept analysis (FCA) is a mathematical theory based on lattice and order theory, used for data analysis and knowledge representation. Over the past several years, many of its extensions have been proposed and applied in several domains, including data mining, machine learning, knowledge management, the semantic web, software development, chemistry, biology, medicine, data analytics, and ontology engineering.
This thesis reviews the state of the art of Formal Concept Analysis (FCA) theory and its various extensions that have been developed and well studied over the past several years. We discuss their historical roots and reproduce the original definitions and derivations with illustrative examples. Further, we provide a literature review of its applications and of the various approaches adopted by researchers in the areas of data analysis and knowledge management, with emphasis on data-learning and classification problems.
We propose LearnFCA, a novel approach based on FuzzyFCA and probability theory for learning and classification problems. LearnFCA uses an enhanced version of FuzzyLattice, which has been developed to store class labels and probability vectors and can be used to classify instances with encoded and unlabelled features. We evaluate LearnFCA on encodings from three datasets (MNIST, Omniglot, and cancer images) with interesting results and varying degrees of success.
Adviser: Dr Jitender Deogu
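The FCA machinery underlying LearnFCA rests on the standard derivation operators over a formal context; a minimal sketch follows, with a toy context that is an illustrative assumption (not the thesis's datasets or its FuzzyLattice structure).

```python
# A formal context maps each object to the set of attributes it has.
context = {
    "duck": {"flies", "swims"},
    "swan": {"flies", "swims"},
    "frog": {"swims"},
}

def intent(objects):
    """Attributes shared by every object in `objects` (the ' operator)."""
    sets = [context[g] for g in objects]
    return set.intersection(*sets) if sets else {a for s in context.values() for a in s}

def extent(attributes):
    """Objects possessing every attribute in `attributes` (the ' operator)."""
    return {g for g, attrs in context.items() if set(attributes) <= attrs}

# (A, B) is a formal concept when extent(B) == A and intent(A) == B:
A = extent({"flies"})   # {"duck", "swan"}
B = intent(A)           # {"flies", "swims"}
```

Fuzzy FCA replaces the crisp incidence relation with graded membership values, but the closure structure sketched here is the same.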
Faculty of Sciences
A comprehensive study of fuzzy rough sets and their application in data reduction
From Quantum Metalanguage to the Logic of Qubits
The main aim of this thesis is to look for a logical deductive calculus (we will adopt sequent calculus, originally introduced in Gentzen, 1935) which could describe quantum information and its properties. More precisely, we intended to describe in logical terms the formation of the qubit (the unit of quantum information), which is a particular linear superposition of the two classical bits 0 and 1. To do so, we had to introduce the new connective "quantum superposition" in the logic of one qubit, Lq, as the classical conjunction cannot describe this quantum link.
Comment: 138 pages, PhD thesis in Mathematics
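The superposition the thesis seeks to capture logically can be stated concretely in the standard state-vector convention; the following sketch (not the thesis's sequent calculus) shows a qubit as a normalized linear combination of the classical bits, with illustrative amplitudes.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])   # classical bit 0 as a basis state |0>
ket1 = np.array([0.0, 1.0])   # classical bit 1 as a basis state |1>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

q = H @ ket0                  # equal superposition (|0> + |1>) / sqrt(2)
alpha, beta = q
assert np.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1.0)   # normalization
```

It is exactly this weighted combination of both basis states at once that classical conjunction cannot express, motivating the new connective.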
Biological Nanowires: Integration of the silver(I) base pair into DNA with nanotechnological and synthetic biological applications
Modern computing and mobile device technologies are now based on semiconductor technology with nanoscale components, i.e., nanoelectronics, and are used in an increasing variety of consumer, scientific, and space-based applications. This rise to global prevalence has been accompanied by a similarly precipitous rise in fabrication cost, toxicity, and technicality; and the vast majority of modern nanotechnology cannot be repaired in whole or in part. In combination with looming scaling limits, it is clear that there is a critical need for fabrication technologies that rely upon clean, inexpensive, and portable means; and the ideal nanoelectronics manufacturing facility would harness micro- and nanoscale fabrication and self-assembly techniques.
The field of molecular electronics has promised for the past two decades to fill fundamental gaps in modern, silicon-based, micro- and nanoelectronics; yet molecular electronic devices, in turn, have suffered from problems of size, dispersion and reproducibility. In parallel, advances in DNA nanotechnology over the past several decades have allowed for the design and assembly of nanoscale architectures with single-molecule precision, and indeed have been used as a basis for heteromaterial scaffolds, mechanically-active delivery mechanisms, and network assembly. The field has, however, suffered for lack of meaningful modularity in function: few designs to date interact with their surroundings in more than a mechanical manner.
As a material, DNA offers the promise of nanometer resolution, self-assembly, linear shape, and connectivity into branched architectures; while its biological origin offers information storage, enzyme-compatibility and the promise of biologically-inspired fabrication through synthetic biological means. Recent advances in DNA chemistry have isolated and characterized an orthogonal DNA base pair using standard nucleobases: by bridging the gap between mismatched cytosine nucleotides, silver(I) ions can be selectively incorporated into the DNA helix with atomic resolution. The goal of this thesis is to explore how this approach to “metallize” DNA can be combined with structural DNA nanotechnology as a step toward creating electronically-functional DNA networks.
This work begins with a survey of applications for such a transformative technology, including nanoelectronic component fabrication for low-resource and space-based applications. We then investigate the assembly of linear Ag+-functionalized DNA species using biochemical and structural analyses to gain an understanding of the kinetics, yield, morphology, and behavior of this orthogonal DNA base pair. After establishing a protocol for high-yield assembly in the presence of varying Ag+ functionalization, we investigate these linear DNA species using electrical means. First, a method of coupling orthogonal DNA to single-walled carbon nanotubes (SWCNTs) is explored for self-assembly into nanopatterned transistor devices. Then, we carry out scanning tunneling microscope (STM) break junction experiments on short polycytosine, polycationic DNA duplexes and find increased molecular conductance of at least an order of magnitude relative to the most conductive DNA analog.
With an understanding of linear species from both a biochemical and nanoelectronic perspective, we investigate the assembly of nonlinear Ag+-functionalized DNA species. Using rational design principles gathered from the analysis of linear species, a de novo mathematical framework for understanding generalized DNA networks is developed. This provides the basis for a computational model built in Matlab that can design DNA networks and nanostructures using arbitrary base parity. In this way, DNA nanostructures can be designed using the dC:Ag+:dC base pair, as well as any similar nucleobase or DNA-inspired system (dT:Hg2+:dT, rA:rU, G4, XNA, LNA, PNA, etc.). With this foundation, three general classes of DNA tiles are designed with embedded nanowire elements: single crossover Holliday junction (HJ) tiles, T-junction (TJ) units, and double crossover (DX) tile pairs and structures. A library of orthogonal chemistry DNA nanotechnology is described, and future applications to nanomaterials and circuit architectures are discussed.
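The idea of "arbitrary base parity" can be sketched as a pluggable pairing table: the helper below is an illustrative assumption (not the thesis's Matlab model), showing how the silver-mediated dC:Ag+:dC pair extends Watson-Crick parity with a cytosine-cytosine pairing.

```python
WATSON_CRICK = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
SILVER_MEDIATED = {("C", "C")}   # dC:Ag+:dC orthogonal base pair

def duplex_valid(strand_a, strand_b, parity=WATSON_CRICK | SILVER_MEDIATED):
    """True if the two antiparallel strands pair at every position
    under the given parity table."""
    if len(strand_a) != len(strand_b):
        return False
    return all((x, y) in parity for x, y in zip(strand_a, reversed(strand_b)))

# A poly-dC tract pairs with itself only when Ag+-mediated pairing is allowed:
duplex_valid("CCCC", "CCCC")                       # True
duplex_valid("CCCC", "CCCC", parity=WATSON_CRICK)  # False
```

Swapping in a different table (e.g. adding ("T", "T") for dT:Hg2+:dT) is how a design tool of this kind could accommodate the other DNA-inspired systems listed above.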
Performance modelling and the representation of large scale distributed system functions
This thesis presents a resource-based approach to model generation for performance characterization and correctness checking of large-scale telecommunications networks. A notion called the timed automaton is proposed and then developed to encapsulate the behaviours of networking equipment, system control policies and non-deterministic user behaviours. The states of pooled network resources and the behaviours of resource consumers are represented as continually varying geometric patterns; these patterns form part of the data operated upon by the timed automata. Such a representation technique allows for great flexibility regarding the level of abstraction that can be chosen in the modelling of telecommunications systems. Nonetheless, the notion of system functions is proposed to serve as a constraining framework for specifying bounded behaviours and features of telecommunications systems. Operational concepts, based on limit-preserving relations, are developed for the timed automata. Relations over system states represent the evolution of system properties observable at various locations within the network under study. The declarative nature of such permutative state relations provides a direct framework for generating highly expressive models suitable for carrying out optimization experiments. The usefulness of the developed procedure is demonstrated by tackling a large-scale case study, in particular the problem of congestion avoidance in networks; it is shown that there can be global coupling among local behaviours within a telecommunications network. The uncovering of such a phenomenon through a function-oriented simulation is a contribution to the area of network modelling. The direct and faithful derivation of performance metrics for loss in networks from resource utilization patterns is also a new contribution to the work area.
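A timed automaton, in its generic textbook form, pairs discrete states with real-valued clocks and guards transitions on clock values; the states, guards, and resets below are illustrative assumptions, not the thesis's formalism.

```python
# (state, action) -> (guard_lo, guard_hi, reset_clock, next_state)
transitions = {
    ("idle", "request"): (0.0, float("inf"), True, "busy"),
    ("busy", "release"): (1.0, 10.0,         True, "idle"),
}

def step(state, clock, action, elapsed):
    """Advance the clock by `elapsed`, then take `action` if its guard holds."""
    clock += elapsed
    lo, hi, reset, nxt = transitions[(state, action)]
    if not (lo <= clock <= hi):
        raise ValueError("guard violated")
    return nxt, 0.0 if reset else clock

state, clock = step("idle", 0.0, "request", 0.5)   # -> ("busy", 0.0)
state, clock = step(state, clock, "release", 2.0)  # -> ("idle", 0.0)
```

In the thesis's setting, the data operated on at each step would additionally include the geometric patterns encoding pooled-resource states, rather than a single scalar clock.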