
    Quantum-optical influences in optoelectronics - an introduction

    This focused review discusses the increasing importance of quantum optics in the physics and engineering of optoelectronic components. Two influences relating to cavity quantum electrodynamics are presented. One involves the development of low-threshold lasers, where the channeling of spontaneous emission into the lasing mode becomes so efficient that the concept of lasing needs revisiting. The second involves the quieting of photon statistics to produce single-photon sources for applications such as quantum information processing. An experimental platform, consisting of quantum-dot gain media inside micro- and nanocavities, is used to illustrate these influences of the quantum mechanical aspect of radiation. An overview is also given of cavity quantum electrodynamics models that may be applied to analyze experiments or design devices. Funding: EC/FP7/615613/EU/External Quantum Control of Photonic Semiconductor Nanostructures/EXQUISIT
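
    The low-threshold behaviour mentioned above is commonly illustrated with the textbook single-mode laser rate equations, in which the spontaneous-emission factor beta is the fraction of spontaneous emission channeled into the lasing mode. The following is a minimal numerical sketch under that standard model, not the review's own cavity quantum electrodynamics treatment, and the lifetimes are illustrative values:

        import numpy as np

        # Steady state of the standard single-mode rate equations:
        #   dn/dt = -n/tau_p + beta*N*(1 + n)/tau_sp   (photon number n)
        #   dN/dt =  P - N*(1 + beta*n)/tau_sp         (carrier number N)
        # Setting dn/dt = 0 gives N = n*tau_sp/(beta*tau_p*(1 + n));
        # setting dN/dt = 0 then yields the pump rate P required for each n.
        tau_p, tau_sp = 1e-12, 1e-9   # photon / spontaneous lifetimes (s), illustrative
        n = np.logspace(-4, 4, 9)     # intracavity photon number
        for beta in (1e-3, 1e-1, 1.0):
            P = n * (1 + beta * n) / (beta * tau_p * (1 + n))
            print(f"beta={beta:g}: log10(P) =", np.round(np.log10(P), 1))

    For beta well below one, the curve of n against P shows the familiar threshold kink; as beta approaches one, the kink vanishes and the output grows smoothly with pumping, which is why the concept of a lasing threshold needs revisiting.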

    Developing a National Design Scoreboard

    Recognising the growing importance of design, this paper reports on the development of an approach to measuring design at a national level. A series of measures is proposed, based on a simplified model of design as a system at the national level. This model was developed through insights from the literature and a workshop with government, industry and design sector representatives. Detailed data on design in the UK are presented to highlight the difficulties in collecting reliable and robust data. Evidence is compared across four countries (Spain, Canada, Korea and Sweden). This comparison highlights the inherent difficulties in comparing performance, and a revised set of measures is proposed. Finally, an approach to capturing design spend at a firm level is proposed, based on insights from the literature and case studies. Keywords: National Design System, Design Performance

    Transforming rehabilitation: a summary of evidence on reducing reoffending


    Multimedia interactive eBooks in laboratory science education

    Bioscience students in the UK higher education system are making increasing use of technology to support their learning within taught classes and during private study. This experimental study was designed to assess the role of multimedia interactive eBooks in bioscience laboratory classes delivered using a blended learning approach. Thirty-nine second-year students on a Biomedical Science undergraduate course at a UK university were assigned, using an experimental design, to alternating trial and control groups and provided with pre-configured iPad tablet devices containing multimedia interactive eBooks. Data collection involved weekly surveys with quantitative and qualitative responses, and analysis of summative assessment marks. Analysis of the results using descriptive statistics showed that students made extensive use of the eBooks in practical classes, and over 70% of students agreed that the eBooks were beneficial for learning. However, fewer than 40% of students indicated a preference for eBooks over traditional paper protocols for practical-based classes. Although the eBooks were well used, they had no statistically significant effect on assessment marks. Overall, the study highlights the positive feedback from students on multimedia interactive eBooks for supporting learning, but illustrates that other factors affect the adoption of new technologies.

    Where do statistical models come from? Revisiting the problem of specification

    R. A. Fisher founded modern statistical inference in 1922 and identified its fundamental problems to be: specification, estimation and distribution. Since then the problem of statistical model specification has received scant attention in the statistics literature. The paper traces the history of statistical model specification, focusing primarily on pioneers like Fisher, Neyman, and more recently Lehmann and Cox, and attempts a synthesis of their views in the context of the Probabilistic Reduction (PR) approach. As argued by Lehmann [11], a major stumbling block for a general approach to statistical model specification has been the delineation of the appropriate role for substantive subject matter information. The PR approach demarcates the interrelated but complementary roles of substantive and statistical information, summarized ab initio in the form of a structural and a statistical model, respectively. In an attempt to preserve the integrity of both sources of information, as well as to ensure the reliability of their fusing, a purely probabilistic construal of statistical models is advocated. This probabilistic construal is then used to shed light on a number of issues relating to specification, including the role of preliminary data analysis, structural vs. statistical models, model specification vs. model selection, statistical vs. substantive adequacy, and model validation. (Comment: published at http://dx.doi.org/10.1214/074921706000000419 in the IMS Lecture Notes--Monograph Series, http://www.imstat.org/publications/lecnotes.htm, by the Institute of Mathematical Statistics, http://www.imstat.org)

    Standardized or simple effect size: what should be reported?

    It is regarded as best practice for psychologists to report effect size when disseminating quantitative research findings. Reporting of effect size in the psychological literature is patchy – though this may be changing – and when effect sizes are reported it is far from clear that appropriate statistics are employed. This paper considers the practice of reporting point estimates of standardized effect size and explores factors such as reliability, range restriction and differences in design that distort standardized effect size unless suitable corrections are employed. For most purposes simple (unstandardized) effect size is more robust and versatile than standardized effect size. Guidelines for deciding which effect size metric to use and how to report it are outlined. Foremost among these are: i) a preference for simple effect size over standardized effect size, and ii) the use of confidence intervals to indicate a plausible range of values the effect might take. Deciding on the appropriate effect size statistic to report always requires careful thought and should be influenced by the goals of the researcher, the context of the research and the potential needs of readers.
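
    As an illustration of the two headline recommendations, the following sketch reports a simple (unstandardized) effect size, the raw mean difference between two groups, together with a 95% confidence interval; the data are invented for the example:

        import numpy as np
        from scipy import stats

        # invented scores for two independent groups (purely illustrative)
        a = np.array([12.1, 9.8, 11.4, 10.9, 13.0, 10.2])
        b = np.array([9.0, 8.7, 10.1, 9.5, 8.2, 9.9])

        diff = a.mean() - b.mean()           # simple effect size: raw mean difference
        va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
        se = np.sqrt(va + vb)                # Welch standard error of the difference
        df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
        half = stats.t.ppf(0.975, df) * se   # half-width of the 95% CI
        print(f"mean difference = {diff:.2f}, 95% CI [{diff - half:.2f}, {diff + half:.2f}]")

    Reported this way, the effect stays in the original units of measurement and the interval conveys a plausible range of values, in line with the guidelines above.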

    Experimental Design Modulates Variance in BOLD Activation: The Variance Design General Linear Model

    Typical fMRI studies have focused on either the mean trend in the blood-oxygen-level-dependent (BOLD) time course or on functional connectivity (FC). However, other statistics of the neuroimaging data may contain important information. Despite studies showing links between the variance in the BOLD time series (BV) and age and cognitive performance, a formal framework for testing these effects has not yet been developed. We introduce the Variance Design General Linear Model (VDGLM), a novel framework that facilitates the detection of variance effects. We designed the framework for general use in any fMRI study by modeling both the mean and the variance of BOLD activation as functions of the experimental design. The flexibility of this approach allows the VDGLM to i) make inferences about a mean or variance effect while controlling for the other, and ii) test for variance effects that could be associated with multiple conditions and/or noise regressors. We demonstrate the use of the VDGLM in a working memory application and show that engagement in a working memory task is associated with whole-brain decreases in BOLD variance. (Comment: 18 pages, 7 figures)
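
    The abstract describes the VDGLM only at a high level; the sketch below shows the general idea, jointly fitting a design-dependent mean and a design-dependent (log) variance by maximum likelihood, on simulated data with invented regressor names rather than the authors' code:

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        T = 200
        task = (np.arange(T) % 40 < 20).astype(float)   # hypothetical on/off task regressor
        X = np.column_stack([np.ones(T), task])         # design matrix: intercept + task

        # simulate one voxel whose variance drops during the task (invented ground truth)
        true_beta, true_gamma = np.array([1.0, 0.5]), np.array([0.0, -0.6])
        y = X @ true_beta + rng.normal(0.0, np.exp(0.5 * (X @ true_gamma)))

        def negloglik(theta):
            beta, gamma = theta[:2], theta[2:]
            mu, logvar = X @ beta, X @ gamma   # mean and log-variance both follow the design
            return 0.5 * np.sum(logvar + (y - mu) ** 2 / np.exp(logvar))

        fit = minimize(negloglik, np.zeros(4), method="BFGS")
        print("mean effects:", fit.x[:2].round(2), "variance effects:", fit.x[2:].round(2))

    A task-related variance effect can then be tested while controlling for the mean effect, for example by a likelihood-ratio comparison against a refit with the task's variance coefficient fixed at zero.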

    Research-teaching linkages: enhancing graduate attributes. Life sciences


    Knowledge discovery for friction stir welding via data driven approaches: Part 1 – correlation analyses of internal process variables and weld quality

    For a comprehensive understanding of Friction Stir Welding (FSW) that would lead to a unified approach embracing materials other than aluminium, such as titanium and steel, it is crucial to identify the intricate correlations between the controllable process conditions, the observable internal process variables, and the characterisations of the post-weld materials. In Part 1 of this paper, multiple correlation-analysis techniques are developed to detect new and previously unknown correlations between the internal process variables and the weld quality of aluminium alloy AA5083. Furthermore, a new exploitable weld quality indicator has, for the first time, been successfully extracted, which can provide an accurate and reliable indication of ‘as-welded’ defects. All results relating to this work have been validated using real data obtained from a series of welding trials that utilised a revolutionary new sensory platform called ARTEMIS, developed by TWI Ltd., the original inventors of the FSW process.
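
    The trial data and the actual correlation analyses are not reproduced in the abstract; the sketch below only illustrates the generic first step, screening internal process variables against a weld-quality score with rank correlations, using invented data and placeholder names rather than the ARTEMIS channels:

        import numpy as np
        from scipy import stats

        # hypothetical FSW trial data: one row per weld; names are placeholders
        rng = np.random.default_rng(1)
        n_welds = 30
        signals = {
            "spindle_torque":   rng.normal(55.0, 5.0, n_welds),
            "traverse_force":   rng.normal(4.2, 0.4, n_welds),
            "tool_temperature": rng.normal(480.0, 20.0, n_welds),
        }
        quality = 0.04 * signals["spindle_torque"] + rng.normal(0.0, 0.5, n_welds)  # invented score

        for name, values in signals.items():
            rho, p = stats.spearmanr(values, quality)   # rank correlation tolerates nonlinearity
            print(f"{name:16s} Spearman rho = {rho:+.2f} (p = {p:.3f})")

    Variables that survive such a screen could then feed a weld-quality indicator of the kind described above.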