
    Green Design Studio: A modular-based approach for high-performance building design

    Building energy use and indoor air quality (IAQ) are of great importance to climate change mitigation and to people’s health and wellbeing. They also play a key role in mitigating the risk of transmission of infectious diseases such as COVID-19. Building designs that perform well in energy efficiency and IAQ can save energy, reduce carbon emissions, and improve human health. High-performance building (HPB) design at the early design stage is critical to a building’s real performance during operation. Fast and reliable prediction of building performance is therefore required for HPB design during the early design iterations. A modular method to analyze building performance in terms of energy efficiency, thermal comfort, IAQ, health impacts, and infection risk was developed, implemented, and demonstrated in this study. The modular approach groups building technologies and systems into modules that can be analyzed across multiple scales, from the urban scale down to the building, room, and personal scales. The proposed approach was implemented as a plugin for Rhino Grasshopper, a 3D architectural geometry modeling tool, and the resulting design and simulation platform was named Green Design Studio. Reduced-order physics-based models were used to simulate thermal, air, and mass transfer and storage in buildings. Three cases were used as case studies to demonstrate the module-based approach and develop the simulation platform. Optimization algorithms were applied to optimize the design and settings of the building modules beyond the reference case. The case study shows that the optimal small-office design determined by the developed platform can save up to 27.8% of energy use while mitigating more than 99% of infection risk compared to the reference case. This reveals that optimizing green building design using the proposed approach has high potential for energy saving and IAQ improvement.
In support of the Green Design Studio platform, a database of green building technology modules for energy efficiency and IAQ improvement was created. Two selected emerging IAQ strategies were studied using the proposed approach and the developed tool: an in-duct needlepoint bipolar ionizer, and the combination of displacement ventilation and partitions. The in-duct ionization system can provide an equivalent single-pass removal efficiency (SPRE) of 3.8-13.6% for particle removal, without significant removal or generation of ozone and volatile organic compounds (VOCs), and with minimal energy use. The combined application of displacement ventilation and desk partitions can also effectively mitigate potential virus transmission through coughing or talking. The abundant performance data from experiments and detailed simulations of the studied technologies will be incorporated into the database of green building technologies and systems, allowing these two technologies to be applied through the Green Design Studio approach during the early design stage of a high-performance building. This can potentially help address IAQ issues, particularly the airborne transmission of respiratory diseases, while maintaining high energy efficiency.
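The abstract above mentions optimization algorithms applied to module designs beyond a reference case. As a rough illustration of that idea only, here is a minimal random-search sketch over a few hypothetical module parameters (air-change rate, window-to-wall ratio, cooling setpoint); the objective function is an invented stand-in, not the platform's reduced-order models.

```python
import random

# Hypothetical sketch: random search over building-module settings,
# minimizing a weighted score of energy use plus infection risk.
# The objective below is an illustrative stand-in for the platform's
# physics-based models, not the actual Green Design Studio code.

def building_score(ach, window_ratio, setpoint):
    """Toy surrogate: lower is better (weighted energy + infection risk)."""
    energy = 0.6 * ach + 2.0 * window_ratio + abs(setpoint - 22.0)
    infection_risk = 1.0 / (1.0 + ach)  # more ventilation, lower risk
    return energy + 10.0 * infection_risk

def random_search(n_trials=500, seed=0):
    rng = random.Random(seed)
    best_score, best_design = None, None
    for _ in range(n_trials):
        design = (rng.uniform(0.5, 6.0),    # air changes per hour
                  rng.uniform(0.1, 0.6),    # window-to-wall ratio
                  rng.uniform(20.0, 26.0))  # cooling setpoint (degC)
        score = building_score(*design)
        if best_score is None or score < best_score:
            best_score, best_design = score, design
    return best_score, best_design

score, design = random_search()
```

Real platforms of this kind typically use genetic or surrogate-based optimizers rather than pure random search, but the loop structure (sample a design, evaluate, keep the best) is the same.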

    High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend line, it is important for the HEP community to develop an effective response to a series of expected challenges. To help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers: 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document and presented along with introductory material. (72 pages)

    Implications of non-volatile memory as primary storage for database management systems

    Traditional Database Management System (DBMS) software relies on hard disks for storing relational data. Hard disks are cheap, persistent, and offer huge storage capacities. However, data retrieval latency for hard disks is extremely high. To hide this latency, DRAM is used as an intermediate storage layer. DRAM is significantly faster than disk, but is deployed in smaller capacities due to cost and power constraints, and lacks the persistency that disks provide. Non-Volatile Memory (NVM) is an emerging storage-class technology which promises the best of both worlds. It can offer large storage capacities, due to better scaling and cost metrics than DRAM, and is non-volatile (persistent) like hard disks. At the same time, its data retrieval time is much lower than that of hard disks, and it is byte-addressable like DRAM. In this paper, we explore the implications of employing NVM as primary storage for a DBMS. In other words, we investigate the modifications that must be applied to a traditional relational DBMS to take advantage of NVM features. As a case study, we have modified the storage engine (SE) of PostgreSQL to enable efficient use of NVM hardware. We detail the necessary changes and challenges such modifications entail and evaluate them using a comprehensive emulation platform. Results indicate that our modified SE reduces query execution time by up to 40% and 14.4% when compared to disk and NVM storage, with average reductions of 20.5% and 4.5%, respectively. The research leading to these results has received funding from the European Union’s 7th Framework Programme under grant agreement number 318633, the Ministry of Science and Technology of Spain under contract TIN2015-65316-P, and a HiPEAC collaboration grant awarded to Naveed Ul Mustafa.
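The key contrast in the abstract above, block-oriented disk I/O versus byte-addressable NVM access, can be illustrated with a small sketch. This is not PostgreSQL's storage engine; it uses a memory-mapped file to stand in for NVM-style byte addressability, next to a conventional read-modify-write of a whole 8 KiB page. The file layout and record contents are invented for illustration.

```python
import mmap
import os
import tempfile

BLOCK = 8192  # 8 KiB, the default PostgreSQL page size

def block_update(path, page_no, offset, payload):
    """Disk-style: read the whole page, modify it, write the whole page back."""
    with open(path, "r+b") as f:
        f.seek(page_no * BLOCK)
        page = bytearray(f.read(BLOCK))
        page[offset:offset + len(payload)] = payload
        f.seek(page_no * BLOCK)
        f.write(page)

def byte_update(path, page_no, offset, payload):
    """NVM-style: touch only the bytes that changed, via a memory mapping."""
    with open(path, "r+b") as f:
        with mmap.mmap(f.fileno(), 0) as m:
            pos = page_no * BLOCK + offset
            m[pos:pos + len(payload)] = payload
            m.flush()  # loosely analogous to a persist barrier on real NVM

# Create a two-page "table" file and update one record each way.
path = os.path.join(tempfile.mkdtemp(), "table.dat")
with open(path, "wb") as f:
    f.write(b"\x00" * (2 * BLOCK))
block_update(path, 0, 100, b"row-v1")
byte_update(path, 1, 200, b"row-v2")
```

On real NVM hardware the persist step involves cache-line flushes and fences rather than `mmap.flush`, which is one of the modifications such a storage engine must handle.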

    Gaining insight from large data volumes with ease

    Efficient handling of large data volumes has become a necessity in today's world. It is driven by the desire to gain more insight from the data and a better understanding of user trends, which can be transformed into economic incentives (profits, cost reduction, and various optimizations of data workflows and pipelines). In this paper, we discuss how modern technologies are transforming well-established patterns in HEP communities. New data insight can be achieved by embracing Big Data tools for a variety of use cases, from analytics and monitoring to training Machine Learning models at terabyte scale. We provide concrete examples within the context of the CMS experiment, where Big Data tools are already playing, or will soon play, a significant role in daily operations.

    Scalable Database Access Technologies for ATLAS Distributed Computing

    ATLAS event data processing requires access to non-event data (detector conditions, calibrations, etc.) stored in relational databases. The database-resident data are crucial for the event data reconstruction processing steps and often required for user analysis. A main focus of ATLAS database operations is the worldwide distribution of the Conditions DB data, which are necessary for every ATLAS data processing job. Since Conditions DB access is critical for operations with real data, we have developed a system in which a different technology can be used as a redundant backup. This redundant database operations infrastructure fully satisfies the requirements of ATLAS reprocessing, which has been proven at the scale of one billion database queries during two reprocessing campaigns over 0.5 PB of single-beam and cosmics data on the Grid. To collect experience and provide input for the best choice of technologies, several promising options for efficient database access in user analysis were successfully evaluated. We present ATLAS experience with scalable database access technologies and describe our approach for preventing database access bottlenecks in a Grid computing environment. (6 pages, 7 figures; to be published in the proceedings of DPF-2009, Detroit, MI, July 2009, eConf C09072)
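The redundancy idea described above, falling back to a different access technology when the primary conditions database is unavailable, can be sketched in a few lines. The backend names and return values below are illustrative placeholders, not ATLAS software interfaces.

```python
# Hedged sketch of redundant database access: try backends in order and
# return the first successful result. Backends here are plain callables
# with invented names; real systems would wrap actual DB client libraries.

def query_with_fallback(query, backends):
    """Return (backend_name, result) from the first backend that succeeds."""
    errors = []
    for name, backend in backends:
        try:
            return name, backend(query)
        except Exception as exc:
            errors.append((name, repr(exc)))
    raise RuntimeError(f"all backends failed: {errors}")

def primary_backend(query):
    # Simulate the primary relational service being unreachable.
    raise ConnectionError("primary conditions service unreachable")

def cache_backend(query):
    # Simulate a redundant, cache-based access technology.
    return {"query": query, "payload": "cached conditions data"}

used, result = query_with_fallback(
    "calib/run1234",
    [("primary", primary_backend), ("cache", cache_backend)],
)
```

Collecting the per-backend errors before raising makes the all-failed case diagnosable, which matters when a billion queries run unattended on the Grid.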

    Molecular Dynamics Simulations of Lead and Lithium in Liquid Phase

    Pb17Li is today a reference breeder material in diverse fusion R&D programs worldwide. Extracting dynamic and structural properties of liquid LiPb mixtures via molecular dynamics simulations represents a crucial step in multiscale modeling efforts to understand the suitability of this compound for future nuclear fusion technologies. At present, a Li-Pb cross potential is not available in the literature. Here we present our first results on the validation of two semi-empirical potentials for Li and Pb in the liquid phase. Our results establish a solid base as a necessary prior step towards implementing a Li-Pb cross potential. Structural and thermodynamic analyses confirm that the implemented potentials for Li and Pb are realistic for simulating both elements in the liquid phase.
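Validating a pair potential, as described above, usually starts with sanity checks against known analytic properties. As a generic illustration only (the study uses semi-empirical liquid-metal potentials, not Lennard-Jones), here is a minimal evaluation of a Lennard-Jones pair potential and its known minimum location, the kind of check one runs before trusting a potential in a full simulation.

```python
# Illustrative sketch, not the semi-empirical Li/Pb potentials of the study:
# a Lennard-Jones pair potential in reduced units (epsilon = sigma = 1).
# Known analytic facts: V(sigma) = 0, and the minimum sits at
# r_min = 2^(1/6) * sigma with well depth -epsilon.

def lj_energy(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair energy V(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

r_min = 2.0 ** (1.0 / 6.0)   # location of the potential minimum
well_depth = lj_energy(r_min)  # should equal -epsilon = -1.0
```

For real liquid-metal potentials the analogous checks compare computed structure factors, pair correlation functions, and densities against experimental data, as the abstract's "structural and thermodynamic analyses" describe.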

    Status Report of the DPHEP Study Group: Towards a Global Effort for Sustainable Data Preservation in High Energy Physics

    Data from high-energy physics (HEP) experiments are collected with significant financial and human effort and are mostly unique. An inter-experimental study group on HEP data preservation and long-term analysis was convened as a panel of the International Committee for Future Accelerators (ICFA). The group was formed by large collider-based experiments and investigated the technical and organisational aspects of HEP data preservation. An intermediate report was released in November 2009 addressing the general issues of data preservation in HEP. This paper includes and extends the intermediate report. It provides an analysis of the research case for data preservation and a detailed description of the various projects at experiment, laboratory, and international levels. In addition, the paper provides a concrete proposal for an international organisation in charge of data management and policies in high-energy physics.