504 research outputs found

    A Deep Hierarchical Approach to Lifelong Learning in Minecraft

    We propose a lifelong learning system that has the ability to reuse and transfer knowledge from one task to another while efficiently retaining the previously learned knowledge base. Knowledge is transferred by learning reusable skills to solve tasks in Minecraft, a popular video game which is an unsolved and high-dimensional lifelong learning problem. These reusable skills, which we refer to as Deep Skill Networks, are then incorporated into our novel Hierarchical Deep Reinforcement Learning Network (H-DRLN) architecture using two techniques: (1) a deep skill array and (2) skill distillation, our novel variation of policy distillation (Rusu et al., 2015) for learning skills. Skill distillation enables the H-DRLN to efficiently retain knowledge and therefore scale in lifelong learning, by accumulating knowledge and encapsulating multiple reusable skills into a single distilled network. The H-DRLN exhibits superior performance and lower learning sample complexity compared to the regular Deep Q-Network (Mnih et al., 2015) in sub-domains of Minecraft.
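    A minimal sketch of the policy-distillation objective this abstract builds on, not the paper's implementation: teacher Q-values are softened into a policy with a low temperature and the student is trained to match it under a KL divergence. The function names and the temperature value are illustrative assumptions.

    ```python
    import numpy as np

    def soft_policy(q_values, tau):
        """Turn Q-values into a softened policy via a temperature-scaled softmax."""
        z = q_values / tau
        z = z - np.max(z, axis=-1, keepdims=True)  # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def distillation_loss(teacher_q, student_q, tau=0.01):
        """KL(teacher || student) between softened policies, in the style of
        policy distillation (Rusu et al., 2015). Low tau sharpens the teacher."""
        p = soft_policy(teacher_q, tau)
        q = soft_policy(student_q, tau)
        return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))
    ```

    Minimizing this loss over the outputs of several teacher skill networks is what lets a single distilled network encapsulate multiple reusable skills.
    
    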

    New Era, New Opportunity, Is GES DISC Ready for Big Data Challenge?

    The new era of Big Data has opened doors to many new opportunities, as well as new challenges, for both the Earth science research/application and data communities. As one of the twelve NASA data centers, the Goddard Earth Sciences Data and Information Services Center (GES DISC) faces the great challenge of helping the research/application community efficiently (quickly and properly) access, visualize and analyze massive and diverse data for natural hazard research, management, and even prediction. GES DISC has archived over 2,000 TB of data on premises and distributed over 23,000 TB of data since 2010. Our data have been widely used in every phase of natural hazard management and research, i.e. long-term risk assessment and reduction, forecasting and prediction, monitoring and detection, early warning, and damage assessment and response. The big data challenge is not just about data storage, but also about data discoverability and accessibility, and even more, about data migration/mirroring in the cloud. This paper demonstrates GES DISC's efforts and approaches to evolving our overall Web services and the powerful Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) tool to further improve data discoverability and accessibility. Prototype work will also be presented.

    One- and Two-Dimensional Analysis of Earth Dams

    Earth dams may experience a reduction in shear strength due to seismically induced pore pressures. Such a reduction may be large enough to result in large deformations and eventual loss of the reservoir. While the analysis of embankment dams subject to earthquake loading is a complicated process, it is required for evaluation of seismic stability. In particular, the possibility of liquefaction in older, hydraulically filled or otherwise poorly compacted dams during earthquakes presents a threat that must be addressed. This paper compares two methods of calculating the peak dynamic shear stress (critical to liquefaction evaluation) that occurs in an embankment during an earthquake. The first method is a one-dimensional analysis, which is simple, rapid and inexpensive. The second method is a two-dimensional finite element analysis, which is complicated, time-consuming and expensive. Because it is more desirable to use the simpler one-dimensional analysis, the results from the two analyses were compared and indicated that for slopes up to 35° the stresses were comparable.
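    A minimal sketch of a simplified one-dimensional peak shear stress estimate of the kind this abstract refers to (a Seed-Idriss-style formula); the specific parameter values and function name are illustrative assumptions, not taken from the paper.

    ```python
    def peak_shear_stress_1d(a_max_g, unit_weight, depth, r_d):
        """Simplified 1-D estimate of peak seismic shear stress at a given depth:
        tau_max = (a_max / g) * sigma_v * r_d, where sigma_v is the total
        vertical stress and r_d is a depth-dependent stress-reduction factor.

        a_max_g     -- peak ground acceleration as a fraction of g
        unit_weight -- total unit weight of the soil (e.g. kN/m^3)
        depth       -- depth below the surface (m)
        r_d         -- stress-reduction coefficient (dimensionless, <= 1)
        """
        sigma_v = unit_weight * depth  # total vertical stress (kPa)
        return a_max_g * sigma_v * r_d
    ```

    This is the inexpensive calculation whose results the paper compares against a full two-dimensional finite-element analysis.
    
    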

    Relaxed Half-Stochastic Belief Propagation

    Low-density parity-check codes are attractive for high-throughput applications because of their low decoding complexity per bit, but also because all the codeword bits can be decoded in parallel. However, achieving this in a circuit implementation is complicated by the number of wires required to exchange messages between processing nodes. Decoding algorithms that exchange binary messages are interesting for fully-parallel implementations because they can reduce the number and the length of the wires, and increase logic density. This paper introduces the Relaxed Half-Stochastic (RHS) decoding algorithm, a binary message belief propagation (BP) algorithm that achieves a coding gain comparable to the best known BP algorithms that use real-valued messages. We derive the RHS algorithm by starting from the well-known Sum-Product algorithm, and then derive a low-complexity version suitable for circuit implementation. We present extensive simulation results on two standardized codes having different rates and constructions, including low bit error rate results. These simulations show that RHS can be an advantageous replacement for the existing state-of-the-art decoding algorithms when targeting fully-parallel implementations.
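    A minimal sketch of the Sum-Product check-node update from which the abstract says the RHS algorithm is derived; this shows the standard real-valued tanh rule, not the binary-message RHS variant itself, and the function name is an illustrative assumption.

    ```python
    import math

    def check_node_update(llrs):
        """Sum-Product check-node update: the outgoing LLR on each edge
        combines the incoming LLRs of all *other* edges via the tanh rule:
        out_i = 2 * atanh( prod_{j != i} tanh(llr_j / 2) )."""
        out = []
        for i in range(len(llrs)):
            prod = 1.0
            for j, llr in enumerate(llrs):
                if j != i:
                    prod *= math.tanh(llr / 2.0)
            prod = max(min(prod, 0.999999), -0.999999)  # clamp for atanh
            out.append(2.0 * math.atanh(prod))
        return out
    ```

    Exchanging these real-valued messages is what drives the wiring cost in a fully-parallel decoder; RHS replaces them with binary messages while keeping comparable coding gain.
    
    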