
    Induced spawning and larval rearing of the sea cucumber Holothuria nobilis

    The sea cucumber Holothuria nobilis is an economically important species that supports livelihoods in many countries. However, increasing demand has led to the depletion of wild stocks, and introducing the species into aquaculture is necessary to reduce fishing pressure. This study was conducted to establish breeding and larval rearing techniques for the development of H. nobilis aquaculture. Broodstock collected from the wild were induced to spawn using thermal stimulation and a combination of thermal and algal stimulation. The larvae obtained from induced spawning were reared on different diets (mixed microalgae, and mixed microalgae with artificial feeds) and at different stocking densities (300, 600, and 1,000 larvae/l). Thermal stimulation was the most effective method for inducing spawning in H. nobilis, yielding up to 1,300,000 fertilized eggs. In the diet experiment, the highest survival rate of doliolaria larvae (27.5%) was achieved with the mixed microalgae diet; in the stocking-density experiment, the highest survival rate (41.5%), growth, and development were obtained at 600 larvae/l.

    Learning Models over Relational Data using Sparse Tensors and Functional Dependencies

    Integrated solutions for analytics over relational databases are of great practical importance, as they avoid the costly loop that data scientists repeat on a daily basis: select features from data residing in relational databases using feature extraction queries involving joins, projections, and aggregations; export the training dataset defined by such queries; convert this dataset into the format of an external learning tool; and train the desired model using this tool. Such integrated solutions are also fertile ground for theoretically fundamental and challenging problems at the intersection of relational and statistical data models. This article introduces a unified framework for training and evaluating a class of statistical learning models over relational databases, including ridge linear regression, polynomial regression, factorization machines, and principal component analysis. We show that, by combining key tools from database theory (schema information, query structure, functional dependencies, and recent advances in query evaluation algorithms) with tools from linear algebra (tensor and matrix operations), one can formulate relational analytics problems and design efficient (query and data) structure-aware algorithms to solve them. This theoretical development informed the design and implementation of the AC/DC system for structure-aware learning. We benchmark the performance of AC/DC against R, MADlib, libFM, and TensorFlow. For typical retail forecasting and advertisement planning applications, AC/DC learns polynomial regression models and factorization machines with at least the same accuracy as its competitors and up to three orders of magnitude faster, whenever the competitors do not run out of memory, exceed a 24-hour timeout, or hit internal design limitations. Comment: 61 pages, 9 figures, 2 tables
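    The aggregate-based view of model training that this line of work exploits can be illustrated with a minimal sketch (assumptions: NumPy, toy rows standing in for the output of a feature extraction query; this is not the AC/DC algorithm itself). Ridge regression only needs the aggregates Sigma = X^T X and c = X^T y, which are sums of per-tuple contributions and can therefore be accumulated by aggregate queries instead of materializing the training matrix:

    import numpy as np

    def accumulate_aggregates(rows, num_features):
        """Accumulate Sigma = sum of x x^T and c = sum of y * x over a stream of (x, y) rows."""
        sigma = np.zeros((num_features, num_features))
        c = np.zeros(num_features)
        for x, y in rows:
            x = np.asarray(x, dtype=float)
            sigma += np.outer(x, x)   # per-tuple contribution to X^T X
            c += y * x                # per-tuple contribution to X^T y
        return sigma, c

    def ridge_from_aggregates(sigma, c, lam=1.0):
        """Solve (Sigma + lam * I) theta = c; the full training matrix is never built."""
        d = sigma.shape[0]
        return np.linalg.solve(sigma + lam * np.eye(d), c)

    # Hypothetical rows standing in for the result of a feature extraction query.
    rows = [([1.0, 2.0], 5.0), ([2.0, 1.0], 4.0), ([3.0, 3.0], 9.0)]
    sigma, c = accumulate_aggregates(rows, num_features=2)
    print(ridge_from_aggregates(sigma, c, lam=0.1))

    A structure-aware engine such as AC/DC computes such aggregates directly over the join structure rather than over a materialized result, which is the source of the reported speed-ups.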

    Qsun: an open-source platform towards practical quantum machine learning applications

    Quantum hardware is currently limited by noise and small qubit counts. A quantum virtual machine that simulates the operations of a quantum computer on classical computers is therefore a vital tool for developing and testing quantum algorithms before deploying them on real quantum computers. Various variational quantum algorithms have been proposed and tested on quantum virtual machines to work around the limitations of quantum hardware. Our goal is to further exploit variational quantum algorithms towards practical applications of quantum machine learning on state-of-the-art quantum computers. This paper first introduces our quantum virtual machine, named Qsun, whose operation is based on quantum state wave-functions. The platform provides native tools supporting variational quantum algorithms. In particular, using the parameter-shift rule, we implement quantum differentiable programming, which is essential for gradient-based optimization. We then report two tests representative of quantum machine learning: quantum linear regression and a quantum neural network. Comment: 18 pages, 7 figures
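    The parameter-shift rule mentioned in the abstract can be sketched without Qsun's own API (assumptions: NumPy, a single-qubit RY rotation measured in the Z basis; illustrative only). For a gate generated by a Pauli operator, the exact gradient of an expectation value is obtained from two circuit evaluations shifted by plus and minus pi/2:

    import numpy as np

    def expectation_z(theta):
        """<Z> after applying RY(theta) to |0>; the state is [cos(theta/2), sin(theta/2)]."""
        state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
        z = np.array([[1.0, 0.0], [0.0, -1.0]])
        return float(state @ z @ state)

    def parameter_shift_gradient(f, theta, shift=np.pi / 2):
        """Exact gradient for Pauli-generated gates: (f(theta + s) - f(theta - s)) / 2."""
        return 0.5 * (f(theta + shift) - f(theta - shift))

    theta = 0.3
    grad = parameter_shift_gradient(expectation_z, theta)
    print(grad, -np.sin(theta))   # both equal -sin(theta), the analytic gradient

    Because the rule needs only two extra circuit evaluations per parameter, it applies on real hardware as well as on a simulator, which is what makes it suitable for gradient-based optimization of variational circuits.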