
    Picosecond excitation transport in disordered systems

    Time-resolved fluorescence decay profiles are used to study excitation transport in 2- and 3-dimensional disordered systems. Time-correlated single photon counting detection is used to collect the fluorescence depolarization data. The high signal-to-noise ratios afforded by this technique make it possible to critically examine current theories of excitation transport. Care has been taken to eliminate or account for the experimental artifacts common to this type of study. Solutions of 3,3′-diethyloxadicarbocyanine iodide (DODCI) in glycerol serve as a randomly distributed array of energy donors in 3 dimensions. A very thin sample cell (~2 μm) is used to minimize the effects of fluorescence self-absorption on the decay kinetics. Evidence of a dynamic shift of the fluorescence spectrum of DODCI in glycerol due to solvent reorganization is presented. The effects of excitation trapping on the decay profiles are minimized in the data analysis procedure. The 3-body theory of Gochanour, Andersen, and Fayer (GAF) and the far less complex 2-particle analytic theory of Huber, Hamilton, and Barnett yield indistinguishable fits to the data over the wide dynamic range of concentrations and decay times studied.

    Octadecylrhodamine B (ODRB) dispersed in the lipid dioleoylphosphatidylcholine (DOL) in a Langmuir-Blodgett (LB) film at an air-water interface provides a 2-dimensional array of donors with which to study excitation transport. Characteristics of the LB film make it possible to study a much wider dynamic range of donor concentrations than is possible in a system of dye molecules adsorbed onto an insulating surface. The data fitting model includes fluorescence depolarization due to the restricted rotational motion of the ODRB in the LB film. The excitation transport dynamics for reduced chromophore concentrations up to ~5.0 were described well by a 2-dimensional 2-particle theory developed by Baumann and Fayer. (DOE Report IS-T-1290. This work was performed under contract No. W-7405-Eng-82 with the U.S. Department of Energy.)
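    Fluorescence depolarization data of the kind described above are commonly reduced to a time-resolved anisotropy before transport theories are fit to them. The sketch below is only a generic illustration of that reduction, not the analysis code used in this work; the array names, the constant-background subtraction, and the synthetic decays are assumptions.

```python
import numpy as np

def anisotropy(i_par, i_perp, g_factor=1.0, background=0.0):
    """Time-resolved fluorescence anisotropy r(t) = (I_par - G*I_perp) / (I_par + 2*G*I_perp).

    i_par, i_perp : photon-count histograms (e.g. from TCSPC) for emission polarized
                    parallel / perpendicular to the excitation.
    g_factor      : detection-sensitivity correction between the two polarizations.
    background    : constant dark/background count level to subtract (assumed known).
    """
    ipar = np.asarray(i_par, dtype=float) - background
    iperp = g_factor * (np.asarray(i_perp, dtype=float) - background)
    total = ipar + 2.0 * iperp                     # total-intensity (magic-angle-equivalent) decay
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(total > 0, (ipar - iperp) / total, np.nan)

# Synthetic example: an anisotropy that relaxes from 0.4 toward 0 as excitation hops
t = np.linspace(0, 5, 500)                         # time in units of the fluorescence lifetime
ipar = np.exp(-t) * (1 + 0.8 * np.exp(-3 * t))     # illustrative parallel component
iperp = np.exp(-t) * (1 - 0.4 * np.exp(-3 * t))    # illustrative perpendicular component
print(anisotropy(ipar, iperp)[:5])                 # starts near r(0) = 0.4
```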

    The MM Alternative to EM

    The EM algorithm is a special case of a more general algorithm called the MM algorithm. Specific MM algorithms often have nothing to do with missing data. The first M step of an MM algorithm creates a surrogate function that is optimized in the second M step. In minimization, MM stands for majorize--minimize; in maximization, it stands for minorize--maximize. This two-step process always drives the objective function in the right direction. Construction of MM algorithms relies on recognizing and manipulating inequalities rather than calculating conditional expectations. This survey walks the reader through the construction of several specific MM algorithms. The potential of the MM algorithm in solving high-dimensional optimization and estimation problems is its most attractive feature. Our applications to random graph models, discriminant analysis and image restoration showcase this ability. Comment: Published at http://dx.doi.org/10.1214/08-STS264 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
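    As a concrete, textbook-style illustration of the majorize--minimize recipe described above (not one of the survey's applications), consider computing the sample median by minimizing f(x) = Σ_i |x − y_i|. Each absolute value can be majorized at the current iterate by a quadratic that touches it there, and minimizing the surrogate yields a weighted-mean update, so the objective never increases.

```python
import numpy as np

def mm_median(y, iters=200, eps=1e-12):
    """Approximate the median of y by MM on f(x) = sum_i |x - y_i|.

    At iterate x_k, |x - y_i| <= (x - y_i)**2 / (2*|x_k - y_i|) + |x_k - y_i| / 2,
    with equality at x = x_k (quadratic majorization). Minimizing the surrogate
    gives a weighted mean, so each step drives f downhill.
    """
    y = np.asarray(y, dtype=float)
    x = y.mean()                                   # any starting point works
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(x - y), eps)   # eps guards against division by zero
        x_new = np.sum(w * y) / np.sum(w)
        if abs(x_new - x) < 1e-10:
            break
        x = x_new
    return x

rng = np.random.default_rng(0)
data = rng.standard_cauchy(1001)                   # heavy-tailed, so median and mean differ
print(mm_median(data), np.median(data))            # the two agree closely
```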

    Rose Effect and the Euro: The Magic is Gone

    This paper presents an updated meta-analysis of the effect of currency unions on trade, focusing on the Euro area. Using meta-regression methods such as the funnel asymmetry test, strong evidence of publication bias is found. The estimated underlying effect for non-Euro studies reaches about 50%. However, the Euro's trade-promoting effect, corrected for publication bias, is insignificant. The Rose effect literature shows signs of the economics research cycle: the reported t-statistic is a quadratic function of the publication year. Explanatory meta-regression (robust fixed effects and random effects) suggests that some authors produce predictable results. Interestingly, proxies for authors' IT skills were also found to be significant. Keywords: Rose effect; Trade; Currency union; Euro; Meta-analysis; Publication bias.
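    The funnel asymmetry test mentioned above is, in its simplest form, a precision-weighted regression of reported estimates on their standard errors (FAT-PET): a nonzero slope signals funnel asymmetry (publication bias), while the intercept estimates the bias-corrected underlying effect. The sketch below is a generic illustration of that regression on synthetic data, not the paper's specification; the simulated effect sizes and the plain WLS estimator are assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic meta-analysis data: each row is one reported estimate of the
# currency-union ("Rose") effect on trade together with its standard error.
rng = np.random.default_rng(1)
n = 200
se = rng.uniform(0.05, 0.5, n)                        # reported standard errors
true_effect = 0.1                                     # assumed underlying effect
estimate = true_effect + 0.8 * se + rng.normal(0, se) # the 0.8*se term mimics publication bias

# FAT-PET: estimate_i = b0 + b1 * se_i + error_i, weighted by precision (1/se^2).
X = sm.add_constant(se)
fit = sm.WLS(estimate, X, weights=1.0 / se**2).fit()
print(fit.params)    # [b0 ~ bias-corrected effect, b1 ~ funnel-asymmetry term]
print(fit.pvalues)   # a significant b1 indicates publication bias
```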

    IM3D: A parallel Monte Carlo code for efficient simulations of primary radiation displacements and damage in 3D geometry

    SRIM-like codes have limitations in describing general 3D geometries when modeling radiation displacements and damage in nanostructured materials. A universal, computationally efficient and massively parallel 3D Monte Carlo code, IM3D, has been developed with excellent parallel scaling performance. IM3D is based on fast indexing of scattering integrals and the SRIM stopping power database, and allows the user a choice of the Constructive Solid Geometry (CSG) or Finite Element Triangle Mesh (FETM) method for constructing 3D shapes and microstructures. For 2D films and multilayers, IM3D perfectly reproduces SRIM results, and can be ∼10² times faster in serial execution and >10⁴ times faster using parallel computation. For 3D problems, it provides a fast approach for analyzing the spatial distributions of primary displacements and defect generation under ion irradiation. Herein we also provide a detailed discussion of our open-source collision cascade physics engine, revealing the true meaning and limitations of the “Quick Kinchin-Pease” and “Full Cascades” options. The issues of femtosecond to picosecond timescales in defining displacement versus damage, and the limitations of the displacements per atom (DPA) unit in quantifying radiation damage (such as its inadequacy in quantifying the degree of chemical mixing), are discussed.

    Funding: National Natural Science Foundation (China) (Grants 11275229, 11475215, NSAF U1230202, and 11534012); National Basic Research Program of China (973 Program, Grant 2012CB933702); Hefei Center for Physical Science and Technology (Grant 2012FXZY004); Chinese Academy of Sciences, Hefei Institutes of Physical Science (CASHIPS) Director Grant; National Science Foundation (U.S.) (DMR-1410636 and DMR-1120901).
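    A "Quick Kinchin-Pease" style calculation estimates the number of displaced atoms from the damage energy of each primary knock-on atom rather than following every recoil explicitly. As a hedged illustration (a generic NRT-style estimate, not IM3D's collision cascade engine), the sketch below implements the standard modified Kinchin-Pease (NRT) displacement count; the 40 eV threshold is an assumed example value. Summing such counts over all recoils and dividing by the number of atoms in the irradiated volume gives the DPA figure whose limitations the abstract discusses.

```python
def nrt_displacements(damage_energy_eV, e_d_eV=40.0):
    """Modified Kinchin-Pease (NRT) estimate of displaced atoms per recoil.

    damage_energy_eV : energy deposited in elastic (nuclear) collisions by a
                       primary knock-on atom, after electronic losses.
    e_d_eV           : displacement threshold energy (40 eV is a common value
                       for many metals; material-specific in practice).
    """
    t = damage_energy_eV
    if t < e_d_eV:
        return 0.0                      # not enough energy to displace any atom
    if t < 2.0 * e_d_eV / 0.8:
        return 1.0                      # a single Frenkel pair
    return 0.8 * t / (2.0 * e_d_eV)     # NRT formula in the cascade regime

# Example: a 5 keV damage-energy recoil with E_d = 40 eV
print(nrt_displacements(5000.0))        # -> 50.0 displacements
```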

    Flat-plate solar array project. Volume 5: Process development

    The goal of the Process Development Area, as part of the Flat-Plate Solar Array (FSA) Project, was to develop and demonstrate solar cell fabrication and module assembly process technologies required to meet the cost, lifetime, production capacity, and performance goals of the FSA Project. R&D efforts expended by government, industry, and universities in developing processes capable of meeting the project's goals under volume production conditions are summarized. The cost goals allocated for processing were demonstrated in small-volume quantities and extrapolated by cost analysis to large-volume production. To provide proper focus and coverage of the process development effort, four separate technology sections are discussed: surface preparation, junction formation, metallization, and module assembly.

    A new simplified daylight evaluation tool, description and validation against the standard method of EN 17037

    A new daylight evaluation tool using a simplified assessment method to determine the daylight quantity provided to a typical room was developed. Its calculation method is based on a set of formulas integrating the main factors characterizing the indoor space and the outdoor context. The results are expressed as a corrected Glass-to-Floor ratio (GFR*), which is used as a proxy for daylight provision. This value can then be used to attribute a rating, or “Daylight score”, to each space. The main finding is that the simplified method provides an easy and relatively reliable estimate of daylight provision. A comparison of the tool with detailed daylight simulations according to the daylight factor method of EN 17037:2018 shows a high correlation. The tool is applicable to any case whose conditions closely match the models and situations defined. Due to its easy implementation and the limited number of input parameters, this evaluation method could be well suited for building passport schemes.
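    At its core, a corrected Glass-to-Floor ratio is the glazing area divided by the floor area, scaled by correction factors for the glazing and the outdoor context, and then mapped onto a rating. The sketch below is only a schematic illustration of that idea; the correction factors, thresholds, and score bands are assumptions and do not reproduce the tool's formulas or the EN 17037 criteria.

```python
def corrected_gfr(glazing_area_m2, floor_area_m2,
                  light_transmittance=0.7, obstruction_factor=0.8):
    """Schematic corrected Glass-to-Floor ratio (GFR*).

    Raw GFR = glazing area / floor area; the two correction factors stand in for
    glazing transmittance and external obstruction. Both default values are
    illustrative assumptions, not values from the tool or from EN 17037.
    """
    raw_gfr = glazing_area_m2 / floor_area_m2
    return raw_gfr * light_transmittance * obstruction_factor

def daylight_score(gfr_star, bands=(0.05, 0.10, 0.15)):
    """Map GFR* onto a coarse 0-3 'daylight score' using assumed thresholds."""
    return sum(gfr_star >= b for b in bands)

gfr_star = corrected_gfr(glazing_area_m2=4.0, floor_area_m2=20.0)
print(round(gfr_star, 3), daylight_score(gfr_star))   # 0.112 -> score 2 under these assumptions
```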