40 research outputs found

    High-Dimensional Stochastic Design Optimization by Adaptive-Sparse Polynomial Dimensional Decomposition

    Full text link
    This paper presents a novel adaptive-sparse polynomial dimensional decomposition (PDD) method for stochastic design optimization of complex systems. The method entails an adaptive-sparse PDD approximation of a high-dimensional stochastic response for statistical moment and reliability analyses; a novel integration of the adaptive-sparse PDD approximation and score functions for estimating the first-order design sensitivities of the statistical moments and failure probability; and standard gradient-based optimization algorithms. New analytical formulae are presented for the design sensitivities, which are determined simultaneously with the moments or the failure probability. Numerical results for mathematical test functions indicate that the new method provides more computationally efficient design solutions than existing methods. Finally, stochastic shape optimization of a jet engine bracket with 79 variables was performed, demonstrating the power of the new method to tackle practical engineering problems.
    Comment: 18 pages, 2 figures, to appear in Sparse Grids and Applications--Stuttgart 2014, Lecture Notes in Computational Science and Engineering 109, edited by J. Garcke and D. Pflüger, Springer International Publishing, 201
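    The score-function idea the abstract refers to can be illustrated in miniature: when a design parameter is a distribution parameter of a random input (here the mean of a Gaussian), one Monte Carlo sample set yields both a statistical moment and its design sensitivity, since d/dμ E[h(X)] = E[h(X)·(X − μ)/σ²]. A minimal sketch with plain Monte Carlo, not the paper's adaptive-sparse PDD; the response function h and the parameter values are assumptions chosen for illustration:

    ```python
    # Illustrative score-function sensitivity estimate (NOT the paper's PDD method).
    # For X ~ N(mu, sigma^2): d/dmu E[h(X)] = E[h(X) * (X - mu) / sigma^2],
    # so the moment and its sensitivity come from the same sample set.
    import random

    def moment_and_sensitivity(h, mu, sigma, n=200_000, seed=1):
        rng = random.Random(seed)
        m = s = 0.0
        for _ in range(n):
            x = rng.gauss(mu, sigma)
            hx = h(x)
            m += hx
            s += hx * (x - mu) / sigma**2   # score function for the mean parameter
        return m / n, s / n

    # Example response h(x) = x^2: E[h] = mu^2 + sigma^2, d/dmu E[h] = 2*mu.
    est_m, est_s = moment_and_sensitivity(lambda x: x * x, mu=1.0, sigma=0.5)
    ```

    For mu = 1.0 and sigma = 0.5 the estimates should approach 1.25 and 2.0; the paper's contribution is obtaining such sensitivities far more cheaply via the adaptive-sparse PDD surrogate rather than brute-force sampling.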

    Performance confirmation of the Belle II imaging Time Of Propagation (iTOP) prototype counter

    Full text link
    The Belle detector at the KEKB asymmetric-energy e⁺e⁻ collider performed extremely well, logging an integrated luminosity an order of magnitude higher than the design baseline. With this inverse attobarn of integrated luminosity, time-dependent CP violation in third-generation beauty quarks was firmly established and is now a precision measurement. Going beyond this to explore whether the Kobayashi-Maskawa mechanism is the only contributor to quark mixing, and to interrogate the flavor sector for non-Standard-Model enhancements, requires a detector and accelerator capable of topping this world-record luminosity by more than an order of magnitude. The Belle II detector at the upgraded SuperKEKB accelerator has been designed to meet this highly ambitious goal of operating at a luminosity approaching 10³⁶ cm⁻² s⁻¹. Such higher event rates and backgrounds require an upgrade of essentially all detector subsystems, as well as their readout. Comparing the Belle composite (threshold aerogel + time-of-flight) particle identification (PID) system with the DIRC employed by BaBar, quartz-radiator internal Cherenkov photon detection proved to have higher kaon efficiency and lower pion fake rates. However, because the detector structure and CsI calorimeter will be retained, an improved barrel PID must fit within a very narrow envelope, as indicated in Figure 1. To effectively utilize this space, a more compact detector concept based on the same quartz radiators, but primarily using photon arrival time, was proposed. This Time Of Propagation (TOP) counter was studied in a number of earlier prototype tests. Key to the necessary tens-of-picoseconds single-photon timing has been the development of the so-called SL-10 micro-channel-plate photomultiplier tube (MCP-PMT), which has demonstrated sub-40 ps single-photon transit time spread (TTS).
    Further simulation study of this detector concept indicated that a focusing mirror in the forward direction, as well as a modest image-expansion volume and a more highly pixelated image plane, improve the theoretical detector performance, since timing alone is limited by chromatic dispersion of the Cherenkov photons. This imaging TOP (iTOP) counter is the basis of the Belle II barrel PID upgrade. However, a number of critical performance parameters must be demonstrated prior to releasing this prototype design for production manufacture.
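    The chromatic-dispersion limit mentioned above can be estimated with a short calculation: the group index of fused silica varies with wavelength, so Cherenkov photons of different colors traverse the same quartz path in different times. A rough sketch using the standard Sellmeier dispersion relation for fused silica; the 400–600 nm band and the 2 m path length are illustrative assumptions, not prototype specifications:

    ```python
    # Rough estimate of chromatic time spread in a quartz (fused silica) bar.
    # Sellmeier coefficients for fused silica (Malitson); wavelength lam in microns.
    import math

    def n_fused_silica(lam):
        l2 = lam * lam
        return math.sqrt(1
            + 0.6961663 * l2 / (l2 - 0.0684043**2)
            + 0.4079426 * l2 / (l2 - 0.1162414**2)
            + 0.8974794 * l2 / (l2 - 9.896161**2))

    def group_index(lam, h=1e-4):
        # n_g = n - lam * dn/dlam (central finite difference for the derivative)
        dndl = (n_fused_silica(lam + h) - n_fused_silica(lam - h)) / (2 * h)
        return n_fused_silica(lam) - lam * dndl

    C_M_PER_NS = 0.299792458  # speed of light in m/ns

    def chromatic_spread_ps(lam_blue, lam_red, path_m):
        # Arrival-time difference between two wavelengths over the same path.
        dt_ns = path_m * (group_index(lam_blue) - group_index(lam_red)) / C_M_PER_NS
        return dt_ns * 1e3  # ps

    spread = chromatic_spread_ps(0.40, 0.60, path_m=2.0)  # blue photons lag red
    ```

    The spread comes out at a couple hundred picoseconds, orders of magnitude above the sub-40 ps single-photon TTS, which is why the iTOP design adds imaging (pixelated image plane, focusing mirror) to constrain the photon wavelength path rather than relying on timing alone.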

    B-factory Physics from Effective Supersymmetry

    Full text link
    We discuss how to extract non-Standard Model effects from B-factory phenomenology. We then analyze the prospects for uncovering evidence for Effective Supersymmetry, a class of supersymmetric models which naturally suppress flavor changing neutral currents and electric dipole moments without squark universality or small CP violating phases, in experiments at BaBar, BELLE, HERA-B, CDF/D0 and LHC-B.
    Comment: 5 pages, 2 figures, revtex, eps

    Mathematical analysis: an introduction

    No full text

    Introduction to function algebras

    No full text

    Quantitated Effects of Nutritional Supplementation on Exercise Induced Sweat

    Get PDF
    Discovery studies have identified many metabolites contained in human sweat. However, the quantitative content of the sweat metabolome remains mostly unknown. Furthermore, several attributes, including sweat rate, have been shown to affect sweat metabolite content, while other effectors, like diet, remain unexplored. This study quantitatively characterizes the impact of nutritional supplementation on sweat metabolites. To better understand the effect diet plays, an LC-MS method was developed focusing on improving resolution and peak width. While the literature provides examples of how diet affects sweat metabolite concentrations, the long-term effects of diet have not been explored; the experiment described here attempts to fill that gap. Partial data separation was found between groups ingesting high and low nutritional supplementation. Several subjects given the high nutritional supplementation had decreased sweat metabolite concentrations after twelve weeks. These results suggest nutritional supplementation can impact the sweat metabolome, and diet should be considered in biomarker discovery experimentation.

    Building a Broadband Community With a Baldrige Based Approach

    No full text
    This article contributes a conceptual framework for transforming the innovative use of information technology into business growth by simultaneously solving the combined technology and business problem. A total systems approach is facilitated by deploying the national Baldrige Criteria for Performance Excellence to evaluate business model improvements while embedding the disruptive use of information technology; see Clayton M. Christensen's pioneering work¹. A key finding of this applied research is that by concurrently solving the business and technology problems, far greater financial success can be realized than when the engineering and engineering management functions are treated independently or in series. Leadership and technical leaders from all areas look for innovative technology that can enhance both business units. The business problem was solved using a nonlinear approach without disrupting the company's day-to-day operation. The result became two stand-alone nonlinear businesses operating under a joint linear process.