
    Release and Team Utilization Planning Application

    This project involves the development of a web-based application to manage release planning and team utilization. The goal of the project is to improve mid- to longer-term planning for agile software development projects and to manage the capacity of the teams working on them. While the tool and process can be used with any software development methodology that completes a project through a series of releases, the primary focus of this project is the experiences and needs of an organization using an Agile/iterative software development process. The system was developed using Oracle Application Express 4.2, Oracle's RAD tool for developing web applications with SQL and PL/SQL.

    Purdue Contribution of Fusion Simulation Program

    The overall science goal of the FSP is to develop predictive simulation capability for magnetically confined fusion plasmas at an unprecedented level of integration and fidelity. This will directly support and enable effective U.S. participation in research related to the International Thermonuclear Experimental Reactor (ITER) and the overall mission of delivering practical fusion energy. The FSP will address a rich set of scientific issues together with experimental programs, producing validated integrated physics results. This is very well aligned with the mission of the ITER Organization to coordinate with its members the integrated modeling and control of fusion plasmas, including benchmarking and validation activities [1]. Initial FSP research will focus on two critical areas: 1) the plasma edge and 2) whole device modeling, including disruption avoidance. The first of these problems involves the narrow plasma boundary layer and its complex interactions with the plasma core and the surrounding material wall. The second requires development of a computationally tractable but comprehensive model that describes all equilibrium and dynamic processes at a sufficient level of detail to provide useful prediction of the temporal evolution of fusion plasma experiments. The initial driver for the whole device model (WDM) will be prediction and avoidance of discharge-terminating disruptions, especially at high performance, which are a critical impediment to successful operation of machines like ITER. If disruptions cannot be avoided entirely, their associated dynamics and effects will be addressed in the next phase of the FSP. The FSP plan targets the needed modeling capabilities by developing Integrated Science Applications (ISAs) specific to their needs.
The Pedestal-Boundary model will include boundary magnetic topology, cross-field transport of multi-species plasmas, parallel plasma transport, neutral transport, atomic physics, and interactions with the plasma wall. It will address the origins and structure of the plasma electric field, rotation, the L-H transition, and the wide variety of pedestal relaxation mechanisms. The Whole Device Model will predict the entire discharge evolution given external actuators (i.e., magnets, power supplies, heating, current drive, and fueling systems) and control strategies. Based on components operating over a range of physics fidelity, the WDM will model the plasma equilibrium, plasma sources, profile evolution, linear stability, and nonlinear evolution toward a disruption (but not the full disruption dynamics). The plan assumes that, as the FSP matures and demonstrates success, the program will evolve and grow, enabling additional science problems to be addressed. The next set of integration opportunities could include: 1) simulation of disruption dynamics and their effects; 2) prediction of core profiles, including 3D effects, mesoscale dynamics, and integration with the edge plasma; and 3) computation of non-thermal particle distributions, self-consistent with fusion, radio frequency (RF) and neutral beam injection (NBI) sources, magnetohydrodynamics (MHD), and short-wavelength turbulence.

    Frame of Mind

    The creative process offers me an escape and tranquility worthy of sharing. When creating art, I reflect on the people, places, and things that move me. I convey my frame of mind through the brush to the canvas. The result is intense strokes of color that deliver clarity of emotion for others to experience. The intention of my work is to share a world that satisfies the eyes, mind, and soul. There is something about each piece that I hope will keep the viewer returning to absorb, contemplate, and enjoy.

    Bedrock geologic maps of the Griffin Creek and Bailey Mountain 7.5-minute quadrangles, Powell County, Montana


    A hierarchical approach to the prediction of the quaternary structure of GCN4 and its mutants

    First published in DIMACS Series in Discrete Mathematics and Theoretical Computer Science, 23 (1996), published by the American Mathematical Society. Presented at the DIMACS Workshop on Global Minimization of Nonconvex Energy Functions: Molecular Conformation and Protein Folding, March 20-21, 1995. A hierarchical approach to protein folding is employed to examine the folding pathway and predict the quaternary structure of the GCN4 leucine zipper. Structures comparable in quality to experiment have been predicted. In addition, the equilibrium between dimers, trimers, and tetramers of a number of GCN4 mutants has been examined. In five out of eight cases, the simulation results are in accordance with the experimental studies of Harbury et al.

    The use of P3b as an indicator of neurophysiologic change from subconcussive impacts in football players

    There is a growing appreciation in research that subconcussive impacts may affect cognitive functioning. Canadian university football players (n=45) were separated into three groups based on their position/skill (small skilled, big skilled, and big unskilled). An impact-measuring device (GForceTracker) was used to record the number of impacts each player experienced in a season, and player groups were separated into two levels of impact exposure: low and high. Players completed baseline, midseason, postseason, and follow-up (four months later) neurophysiological tests to measure P3b amplitude in response to a visual oddball paradigm, and high- versus low-impact subgroups for each player group were compared. Small skilled and big skilled players showed significant decreases in P3b amplitude at midseason and postseason, reflecting decreased attentional resources allocated to the task. No skill group exhibited a significant change from baseline at follow-up, indicating that in-season cognitive function deficits appear to recover in the offseason.

    Model Selection in an Information Economy: Choosing What to Learn

    As online markets for the exchange of goods and services become more common, the study of markets composed at least in part of autonomous agents has taken on increasing importance. In contrast to traditional complete-information economic scenarios, agents operating in an electronic marketplace often do so under considerable uncertainty. To reduce their uncertainty, these agents must learn about the world around them. When an agent producer is engaged in a learning task in which data collection is costly, such as learning the preferences of a consumer population, it faces a classic decision problem: when to explore and when to exploit. If the agent has a limited number of chances to experiment, it must explicitly weigh the cost of learning (in terms of foregone profit) against the value of the information acquired. Information goods add a further dimension to this problem: because of their flexibility, they can be bundled and priced according to a number of different price schedules. An optimizing producer should consider the profit each price schedule can extract, as well as the difficulty of learning that schedule. In this paper, we demonstrate the tradeoff between complexity and profitability for a number of common price schedules. We begin with a one-shot decision as to which schedule to learn. Schedules of moderate complexity are preferred in the short and medium term, as they are learned quickly yet extract a significant fraction of the available profit. We then turn to the repeated version of this one-shot decision and show that moderate-complexity schedules, in particular the two-part tariff, perform well when the producer must adapt to nonstationarity in the consumer population. When a producer can dynamically change schedules as it learns, it can use an explicit decision-theoretic formulation to greedily select the schedule that appears to yield the greatest profit in the next period.
By explicitly considering both the learnability of different price schedules and the profit they extract, a producer can extract more profit as it learns than if it naively chose models that are accurate once learned.
Keywords: online learning; information economics; model selection; direct search
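The abstract's central example is the two-part tariff: a fixed entry fee plus a per-unit price. The paper's consumer model is not reproduced in this listing, so the sketch below is purely illustrative and its assumptions are mine: consumers with linear demand q(p) = a − b·p who participate only when their surplus at the unit price covers the entry fee. The function name and parameters are hypothetical, not taken from the paper.

```python
# Illustrative sketch (not the paper's model): producer profit from a
# two-part tariff (fee + per-unit price) over a population of consumers
# with assumed linear demand q(p) = a - b*p.

def two_part_tariff_profit(consumers, fee, price, unit_cost=0.0):
    """consumers: list of (a, b) linear-demand parameters (assumed form)."""
    profit = 0.0
    for a, b in consumers:
        q = max(a - b * price, 0.0)      # quantity demanded at the unit price
        surplus = q * q / (2.0 * b)      # consumer surplus at that price
        if surplus >= fee:               # consumer participates only if fee is covered
            profit += fee + (price - unit_cost) * q
    return profit

# Example: homogeneous population, fee set just below each consumer's surplus
consumers = [(10.0, 1.0)] * 100
print(two_part_tariff_profit(consumers, fee=49.9, price=0.0))
```

A producer learning in the sense of the abstract would evaluate candidate (fee, price) pairs like this against observed purchase behavior, trading the profit foregone during experimentation against the value of the information gained.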

    The Genetic Flock Algorithm

    The purpose of this paper is to describe and evaluate a new algorithm for optimization, named the Genetic Flock Algorithm. This algorithm is a hybrid of a Genetic Algorithm and a Particle Swarm Optimization algorithm. The paper discusses the strengths and weaknesses of these two algorithms, then explains how the Genetic Flock Algorithm combines features of both and gives details of the algorithm. All three algorithms are compared using eight standard optimization problems from the literature. It is shown that the Genetic Flock Algorithm provides superior performance on 75% of the tested cases; in the remaining 25% of the cases it outperforms either the Genetic Algorithm or the Particle Swarm Optimization algorithm, and it is never worse than both. Possible future improvements to the Genetic Flock Algorithm are briefly described.
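The abstract does not specify the Genetic Flock Algorithm's operators, so the following is a hypothetical sketch of one way a GA and PSO can be hybridized in the spirit described, not the paper's method: particles follow standard PSO velocity updates, and each generation the worst half of the population is replaced by GA-style crossover and mutation of the survivors. All names and parameter values are illustrative.

```python
# Hypothetical GA/PSO hybrid sketch (the paper's exact operators are not
# given in the abstract). Minimizes the sphere function, a standard benchmark.
import random

def sphere(x):                          # global minimum 0 at the origin
    return sum(v * v for v in x)

def hybrid_optimize(dim=5, pop=30, iters=200, seed=0):
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    vs = [[0.0] * dim for _ in range(pop)]
    pbest = [list(x) for x in xs]       # each particle's personal best
    gbest = list(min(xs, key=sphere))   # global best found so far
    for _ in range(iters):
        # PSO step: pull every particle toward its own and the global best
        for i in range(pop):
            for d in range(dim):
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.4 * rng.random() * (pbest[i][d] - xs[i][d])
                            + 1.4 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if sphere(xs[i]) < sphere(pbest[i]):
                pbest[i] = list(xs[i])
        # GA step: replace the worst half with crossover + mutation of survivors
        order = sorted(range(pop), key=lambda i: sphere(xs[i]))
        for i in order[pop // 2:]:
            a, b = rng.sample(order[:pop // 2], 2)
            xs[i] = [xs[a][d] if rng.random() < 0.5 else xs[b][d]
                     for d in range(dim)]
            if rng.random() < 0.1:                     # occasional mutation
                xs[i][rng.randrange(dim)] += rng.gauss(0, 0.5)
            vs[i] = [0.0] * dim                        # offspring start at rest
        gbest = list(min(pbest + xs, key=sphere))
    return gbest

best = hybrid_optimize()
print(sphere(best))   # should be close to 0 for the sphere function
```

The PSO step exploits the swarm's shared knowledge, while the GA step reintroduces diversity by recombining the fitter individuals, which is one plausible reading of "combining features of both" algorithms.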