
    South Pole Telescope Software Systems: Control, Monitoring, and Data Acquisition

    We present the software system used to control and operate the South Pole Telescope. The South Pole Telescope is a 10-meter millimeter-wavelength telescope designed to measure anisotropies in the cosmic microwave background (CMB) at arcminute angular resolution. In the austral summer of 2011/12, the SPT was equipped with a new polarization-sensitive camera, which consists of 1536 transition-edge sensor bolometers. The bolometers are read out using 36 independent digital frequency multiplexing (DfMux) readout boards, each with its own embedded processors. These autonomous boards control and read out data from the focal plane with on-board software and firmware. An overall control software system, running on a separate control computer, controls the DfMux boards, the cryostat, and all other aspects of telescope operation. This control software collects and monitors data in real time and stores the data to disk for transfer to the United States for analysis.
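    The abstract describes a layered acquisition architecture: autonomous readout boards, a central control computer, real-time monitoring, and archival to disk. Below is a minimal Python sketch of that general pattern; the board addresses, the poll_board() call, and the frame layout are illustrative assumptions, not the actual SPT software interfaces.

```python
import json
import pathlib
import time

# Hypothetical sketch of a central acquisition loop: poll each autonomous
# readout board for its latest samples, merge them into a timestamped frame,
# and write the frame to an on-disk archive for later transfer and analysis.
# Addresses, poll_board(), and the frame format are placeholders.

BOARD_ADDRESSES = [f"10.0.0.{i}" for i in range(1, 37)]  # 36 readout boards
ARCHIVE = pathlib.Path("archive")


def poll_board(address: str) -> dict:
    """Placeholder: fetch the most recent bolometer samples from one board."""
    return {"address": address, "samples": []}


def acquisition_loop(n_frames: int, interval_s: float = 1.0) -> None:
    ARCHIVE.mkdir(exist_ok=True)
    for _ in range(n_frames):
        frame = {
            "timestamp": time.time(),
            "boards": [poll_board(addr) for addr in BOARD_ADDRESSES],
        }
        path = ARCHIVE / f"frame_{int(frame['timestamp'])}.json"
        path.write_text(json.dumps(frame))  # archived for later transfer
        time.sleep(interval_s)


if __name__ == "__main__":
    acquisition_loop(n_frames=3)
```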

    Enhancing the Efficiency of Organic Photovoltaics by a Photoactive Molecular Mediator

    High boiling-point solvent additives, such as 1,8-diiodooctane, have been widely used to tune nanoscale phase morphology for increased efficiency in bulk heterojunction organic solar cells. However, liquid-state solvent additives remain in the active films for extended times and later migrate or evaporate from the films, leading to unstable device performance. Here, a solid-state photoactive molecular mediator, N(BAI)3, is reported that can replace the commonly used solvent additives to tune the morphology of bulk heterojunction films for improved device performance. The N(BAI)3 mediator not only resides locally in the active films to fine-tune the phase morphology, but also contributes additional absorption to the active films, leading to an ∼11% enhancement in the power conversion efficiency of P3HT:PC60BM devices. Comparative studies are carried out to probe the nanoscale morphologies using grazing-incidence wide-angle X-ray scattering and complementary neutron reflectometry. The use of 1 wt% N(BAI)3 is found to effectively tune the packing of P3HT, presumably through balanced π-interactions endowed by its large conjugated π surface, and to promote the formation of a PC60BM-rich top interfacial layer. These findings open up a new way to effectively tailor the phase morphology of organic photovoltaics with photoactive molecular mediators.

    Constructing a Virtual Training Laboratory Using Intelligent Agents

    This paper reports on the results and experiences of the Trilogy project, a collaborative project concerned with the development of a virtual research laboratory using intelligent agents. This laboratory is designed to support the training of research students in telecommunications traffic engineering. Training research students involves a number of basic activities. They may seek guidance from, or exchange ideas with, more experienced colleagues. High-quality academic papers, books, and research reports provide a sound basis for developing and maintaining a good understanding of an area of research. Experimental tools enable new ideas to be evaluated and hypotheses tested. These three components (collaboration, information, and experimentation) are central to any research activity, and a good training environment for research should integrate them in a seamless fashion. To this end, we describe the design and implementation of an agent-based virtual laboratory.
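    As a rough illustration of the agent-based design described above, here is a minimal Python sketch in which one agent handles each of the three activities (collaboration, information, experimentation) behind a single laboratory front end. The class and method names are assumptions made for illustration, not the Trilogy project's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Base agent; concrete agents override handle()."""
    name: str

    def handle(self, request: str) -> str:
        return f"{self.name}: no handler for '{request}'"

class CollaborationAgent(Agent):
    def handle(self, request: str) -> str:
        return f"{self.name}: forwarding '{request}' to experienced colleagues"

class InformationAgent(Agent):
    def handle(self, request: str) -> str:
        return f"{self.name}: searching papers and reports about '{request}'"

class ExperimentAgent(Agent):
    def handle(self, request: str) -> str:
        return f"{self.name}: scheduling a traffic simulation of '{request}'"

@dataclass
class VirtualLab:
    """Front end that routes a student's request to the matching agent."""
    agents: dict = field(default_factory=dict)

    def register(self, topic: str, agent: Agent) -> None:
        self.agents[topic] = agent

    def ask(self, topic: str, request: str) -> str:
        return self.agents[topic].handle(request)

lab = VirtualLab()
lab.register("collaborate", CollaborationAgent("collab-agent"))
lab.register("search", InformationAgent("info-agent"))
lab.register("experiment", ExperimentAgent("exp-agent"))
print(lab.ask("search", "queueing models for telephone traffic"))
```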

    Mediator-assisted multi-source routing in information-centric networks

    Among the new communication paradigms recently proposed, information-centric networking (ICN) natively supports content awareness at the network layer, shifting the focus from hosts (as in traditional IP networks) to information objects. In this paper, we exploit the intrinsic content-awareness features of ICN to design a novel multi-source routing mechanism. It involves a new network entity, the ICN mediator, responsible for locating and delivering the requested information objects, which are chunked and stored at different locations. Our approach imposes very limited signalling overhead, especially for large chunk sizes (megabytes). Simulations show a significant latency reduction compared to traditional routing approaches.
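    A minimal sketch of the mediator idea follows, under assumed data structures: the mediator keeps a registry mapping each (object, chunk) pair to the nodes holding it, and answers a single resolution request with one source per chunk so the consumer can fetch chunks from several locations. The names and messages are illustrative, not the paper's actual protocol.

```python
from collections import defaultdict

class Mediator:
    """Toy ICN mediator: tracks which nodes hold which chunks of an object."""

    def __init__(self) -> None:
        # (object_name, chunk_id) -> set of nodes advertising that chunk
        self.registry = defaultdict(set)

    def register(self, object_name: str, chunk_id: int, node: str) -> None:
        self.registry[(object_name, chunk_id)].add(node)

    def resolve(self, object_name: str, num_chunks: int) -> dict:
        """One signalling exchange: return a source node for every chunk."""
        plan = {}
        for chunk_id in range(num_chunks):
            sources = self.registry.get((object_name, chunk_id))
            if not sources:
                raise KeyError(f"chunk {chunk_id} of {object_name} is not registered")
            # Naive load spreading: rotate over the nodes holding this chunk.
            plan[chunk_id] = sorted(sources)[chunk_id % len(sources)]
        return plan

mediator = Mediator()
for chunk in range(4):
    mediator.register("video/clip1", chunk, f"cache-{chunk % 2}")
    mediator.register("video/clip1", chunk, "origin-server")
print(mediator.resolve("video/clip1", 4))
```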

    Claiming expertise from betwixt and between: Digital humanities librarians, emotional labor, and genre theory

    Librarians' liminal (intermediate) position within academia situates us to make unique contributions to digital humanities (DH). In this article, we use genre theory, feminist theory, and theories of emotional labor to explore the importance of discourse mediation and affective labor to DH and the interplay between these areas and academic structural inequality. By claiming our expertise and making explicit work that is often not visible, we can advocate for new and varied roles for librarians in digital humanities. Our analysis is informed by both theory and practice, and it takes a dialogic approach that depends upon the interactions between the two.

    Towards structured sharing of raw and derived neuroimaging data across existing resources

    Data sharing efforts increasingly contribute to the acceleration of scientific discovery. Neuroimaging data are accumulating in distributed, domain-specific databases, yet there is currently no integrated access mechanism nor an accepted format for the critically important meta-data needed to make use of the combined, available neuroimaging data. In this manuscript, we present work from the Derived Data Working Group, an open-access group sponsored by the Biomedical Informatics Research Network (BIRN) and the International Neuroinformatics Coordinating Facility (INCF) focused on practical tools for distributed access to neuroimaging data. The working group develops models and tools facilitating the structured interchange of neuroimaging meta-data and is making progress towards a unified set of tools for such data and meta-data exchange. We report on the key components required for integrated access to raw and derived neuroimaging data, as well as associated meta-data and provenance, across neuroimaging resources. The components include (1) a structured terminology that provides semantic context to data, (2) a formal data model for neuroimaging with robust tracking of data provenance, (3) a web service-based application programming interface (API) that provides a consistent mechanism to access and query the data model, and (4) a provenance library that can be used for the extraction of provenance data by image analysts and imaging software developers. We believe that the framework and set of tools outlined in this manuscript have great potential for solving many of the issues the neuroimaging community faces when sharing raw and derived neuroimaging data across the various existing database systems for the purpose of accelerating scientific discovery.
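    To make components (1)-(3) concrete, here is a minimal Python sketch of a provenance-tracked record: each derived image carries a controlled-vocabulary term, the identifiers of its inputs, and the software step that produced it, so a query service could trace any result back to its raw sources. The field names are assumptions made for illustration, not the working group's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class ProvenanceStep:
    """One processing step that produced a derived image."""
    software: str
    version: str
    parameters: dict
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

@dataclass
class ImageRecord:
    """A raw or derived image with its terminology label and provenance."""
    identifier: str
    term: str                                    # controlled-vocabulary label
    inputs: List[str] = field(default_factory=list)        # input identifiers
    provenance: List[ProvenanceStep] = field(default_factory=list)

raw = ImageRecord("sub01_T1w", term="T1-weighted structural image")
mask = ImageRecord(
    "sub01_brainmask",
    term="brain mask",
    inputs=[raw.identifier],
    provenance=[ProvenanceStep("skullstrip-tool", "1.0", {"threshold": 0.5})],
)
print(mask.inputs, mask.provenance[0].software)
```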