Coin.AI: A Proof-of-Useful-Work Scheme for Blockchain-based Distributed Deep Learning
One decade ago, Bitcoin was introduced, becoming the first cryptocurrency and
establishing the concept of "blockchain" as a distributed ledger. As of today,
there are many different implementations of cryptocurrencies working over a
blockchain, with different approaches and philosophies. However, many of them
share one common feature: they require proof-of-work to support the generation
of blocks (mining) and, eventually, the generation of money. This proof-of-work
scheme usually consists of solving a cryptographic puzzle, most commonly
finding an input whose hash falls below a target value, which can only be
achieved through brute force. The main drawback of proof-of-work is that it
consumes enormous amounts of energy that yield no useful outcome beyond supporting the
currency. In this paper, we present a theoretical proposal that introduces a
proof-of-useful-work scheme to support a cryptocurrency running over a
blockchain, which we named Coin.AI. In this system, the mining scheme requires
training deep learning models, and a block is only mined when the performance
of such a model exceeds a threshold. The distributed system allows nodes to
verify the models delivered by miners cheaply (far more efficiently than the
mining process itself), determining when a block is to be
generated. Additionally, this paper presents a proof-of-storage scheme for
rewarding users that provide storage for the deep learning models, as well as a
theoretical dissertation on how the mechanics of the system could be
articulated with the ultimate goal of democratizing access to artificial
intelligence.
Comment: 17 pages, 5 figures
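The brute-force puzzle and cheap verification that the abstract contrasts can be sketched as follows. This is a minimal illustration of conventional hash-based proof-of-work, not Coin.AI's deep-learning scheme; the leading-zeros target and the difficulty parameter are simplifying assumptions:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce so the block hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int) -> bool:
    """Checking a claimed nonce costs a single hash, far cheaper than mining."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = mine("block-42", difficulty=4)
assert verify("block-42", nonce, 4)
```

The asymmetry between `mine` (many hashes) and `verify` (one hash) is the property Coin.AI aims to preserve: training a model is expensive, but checking that its reported performance exceeds the threshold is comparatively cheap.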
The Parallelism Motifs of Genomic Data Analysis
Genomic data sets are growing dramatically as the cost of sequencing
continues to decline and small sequencing devices become available. Enormous
community databases store and share this data with researchers, but some of
these genomic data analysis problems require large-scale computational
platforms to meet both their memory and computational requirements. These
applications differ from scientific simulations that dominate the workload on
high end parallel systems today and place different requirements on programming
support, software libraries, and parallel architectural design. For example,
they involve irregular communication patterns such as asynchronous updates to
shared data structures. We consider several problems in high performance
genomics analysis, including alignment, profiling, clustering, and assembly for
both single genomes and metagenomes. We identify some of the common
computational patterns or motifs that help inform parallelization strategies
and compare our motifs to some of the established lists, arguing that at least
two key patterns, sorting and hashing, are missing.
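As an illustration of the hashing motif the abstract highlights, k-mer counting (a building block of the profiling and assembly tasks mentioned) amounts to hashing fixed-length substrings into a shared table. A minimal single-node sketch, with an arbitrary example sequence and k:

```python
from collections import Counter

def kmer_counts(seq: str, k: int) -> Counter:
    """Count every k-length substring (k-mer) of a DNA sequence via hashing."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = kmer_counts("ACGTACGTAC", 3)
# "ACG" appears at positions 0 and 4, so counts["ACG"] == 2
```

In the distributed setting the abstract describes, this hash table becomes a shared data structure receiving the irregular, asynchronous updates that distinguish these workloads from regular scientific simulations.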
Automatic Graphics And Game Content Generation Through Evolutionary Computation
Simulation and game content includes the levels, models, textures, items, and other objects encountered and possessed by players during the game. In most modern video games and simulation software, the set of content shipped with the product is static and unchanging, or at best, randomized within a narrow set of parameters. However, ideally, if game content could be constantly and automatically renewed, players would remain engaged longer in the evolving stream of content. This dissertation introduces three novel technologies that together realize this ambition.

(1) The first, NEAT Particles, is an evolutionary method that enables users to quickly and easily create complex particle effects through a simple interactive evolutionary computation (IEC) interface. That way, particle effects become an evolvable class of content, which is exploited in the remainder of the dissertation.

(2) In particular, a new algorithm called content-generating NeuroEvolution of Augmenting Topologies (cgNEAT) is introduced that automatically generates graphical and game content while the game is played, based on the past preferences of the players. Through cgNEAT, the game platform on its own can generate novel content that is designed to satisfy its players.

(3) Finally, the Galactic Arms Race (GAR) multiplayer online video game is constructed to demonstrate these techniques working on a real online gaming platform. In GAR, which was made available to the public and playable online, players pilot space ships and fight enemies to acquire unique particle system weapons that are automatically evolved by the cgNEAT algorithm. The resulting study shows that cgNEAT indeed enables players to discover a wide variety of appealing content that is not only novel, but also based on and extended from content that they preferred in the past.
The implication is that with cgNEAT it is now possible to create applications that generate their own content to satisfy users, potentially significantly reducing the cost of content creation and considerably increasing entertainment value with a constant stream of evolving content.
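The interactive evolutionary loop behind IEC tools such as NEAT Particles can be sketched in a few lines. The flat-vector genome and Gaussian mutation here are simplifying assumptions for illustration, not the actual NEAT encoding, which evolves network topologies as well as weights:

```python
import random

def mutate(params: list[float], rate: float = 0.3) -> list[float]:
    """Perturb each parameter to produce a variant effect (hypothetical genome)."""
    return [p + random.gauss(0, rate) for p in params]

def iec_generation(population: list[list[float]], chosen: int) -> list[list[float]]:
    """One IEC step: the user picks a favorite; the next generation mutates it."""
    parent = population[chosen]
    return [mutate(parent) for _ in population]

# Six candidate effects, each described by four parameters.
population = [[random.random() for _ in range(4)] for _ in range(6)]
# In a real interface the user clicks a favorite; index 0 stands in for that choice.
population = iec_generation(population, 0)
```

cgNEAT's contribution, as described above, is to replace the explicit user click with preferences inferred from gameplay, so the selection step runs automatically while the game is played.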