
    Results and recommendations from an intercomparison of six Hygroscopicity-TDMA systems

    The performance of six custom-built Hygroscopicity-Tandem Differential Mobility Analyser (H-TDMA) systems was investigated during an international calibration and intercomparison workshop held in Leipzig in February 2006. The goal of the workshop was to harmonise H-TDMA measurements and to develop recommendations for atmospheric measurements and their data evaluation. The H-TDMA systems were compared in terms of the sizing of dry particles, relative humidity (RH) uncertainty, and consistency in the determination of number fractions of different hygroscopic particle groups. The experiments were performed in an air-conditioned laboratory using ammonium sulphate particles or an external mixture of ammonium sulphate and soot particles. The sizing of dry particles by the six H-TDMA systems was within 0.2 to 4.2% of the selected particle diameter, depending on the investigated size and the individual system. Measurements of ammonium sulphate aerosol showed deviations equivalent to 4.5% RH from the 90% RH set point when compared with results from previous experiments in the literature. The number fractions of particles within the clearly separated growth-factor modes of a laboratory-generated, externally mixed aerosol were then evaluated. The data from all H-TDMAs were analysed with a single fitting routine to isolate differences caused by the separate data evaluation procedures otherwise used for each H-TDMA; applying the same analysis routine reduced the spread between the H-TDMAs from +12/-13% to +8/-6%. We conclude that a common data evaluation procedure for determining number fractions of externally mixed aerosols will improve the comparability of H-TDMA measurements. It is recommended to properly calibrate all flow, temperature and RH sensors in the systems and, most importantly, to thermally insulate the aerosol humidification unit and the second DMA and to monitor their temperatures to an accuracy of 0.2 °C. For the correct determination of external mixtures, size-dependent diffusion losses in the plumbing between the DMAs and in the aerosol humidification unit must be taken into account.
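    The workshop's common fitting routine is not detailed in the abstract; the sketch below shows one plausible approach to the underlying task, fitting two Gaussian growth-factor modes (a soot-like mode near GF 1.0 and a sulphate-like mode near GF 1.7) and deriving number fractions from the fitted mode areas. The data, initial guesses, and mode positions are illustrative assumptions, not the authors' procedure.

    ```python
    # Sketch of number-fraction determination from an H-TDMA growth-factor
    # distribution: fit two Gaussian modes, then take each mode's share of
    # the total fitted area as its number fraction.
    import numpy as np
    from scipy.optimize import curve_fit

    def two_modes(gf, a1, mu1, s1, a2, mu2, s2):
        """Sum of two Gaussian growth-factor modes."""
        g = lambda a, mu, s: a * np.exp(-0.5 * ((gf - mu) / s) ** 2)
        return g(a1, mu1, s1) + g(a2, mu2, s2)

    def number_fractions(gf, counts):
        """Fit both modes and return the number fraction of each."""
        # Initial guess: modes near GF 1.0 (soot) and GF 1.7 (sulphate at 90% RH)
        p0 = [counts.max(), 1.0, 0.05, counts.max(), 1.7, 0.05]
        popt, _ = curve_fit(two_modes, gf, counts, p0=p0)
        a1, _, s1, a2, _, s2 = popt
        # Gaussian area = amplitude * width * sqrt(2*pi)
        n1 = abs(a1 * s1) * np.sqrt(2 * np.pi)
        n2 = abs(a2 * s2) * np.sqrt(2 * np.pi)
        return n1 / (n1 + n2), n2 / (n1 + n2)

    # Synthetic externally mixed aerosol: a narrow soot-like mode and a
    # broader sulphate-like mode.
    gf = np.linspace(0.8, 2.2, 200)
    counts = (0.3 * np.exp(-0.5 * ((gf - 1.0) / 0.05) ** 2)
              + 0.7 * np.exp(-0.5 * ((gf - 1.7) / 0.07) ** 2))
    print(number_fractions(gf, counts))  # ~ (0.23, 0.77): area weights amplitude by width
    ```

    Using one shared routine like this across instruments is exactly the kind of harmonisation the abstract argues for: the fractions then differ only because of the measurements, not the analysis.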

    Toward community standards and software for whole-cell modeling

    Whole-cell (WC) modeling is a promising tool for biological research, bioengineering, and medicine. However, substantial work remains to create accurate, comprehensive models of complex cells. Methods: We organized the 2015 Whole-Cell Modeling Summer School to teach WC modeling and to evaluate the need for new WC modeling standards and software by recoding a recently published WC model in SBML. Results: Our analysis revealed several challenges in representing WC models using the current standards. Conclusion: We therefore propose several new WC modeling standards, software tools, and databases. Significance: We anticipate that these new standards and software will enable more comprehensive models.
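    The recoded model itself is not reproduced in the abstract; as a minimal sketch of the representation it refers to, the snippet below builds a one-species, one-reaction SBML Level 3 model with the python-libsbml package. The species and reaction names are hypothetical placeholders.

    ```python
    # Sketch: encode a tiny model in SBML with python-libsbml
    # (pip install python-libsbml). Species/reaction ids are illustrative.
    import libsbml

    doc = libsbml.SBMLDocument(3, 1)          # SBML Level 3 Version 1
    model = doc.createModel()

    comp = model.createCompartment()          # every species needs a compartment
    comp.setId("cell")
    comp.setSize(1.0)
    comp.setConstant(True)

    sp = model.createSpecies()
    sp.setId("ATP")                           # hypothetical metabolite
    sp.setCompartment("cell")
    sp.setInitialAmount(100.0)
    sp.setConstant(False)
    sp.setBoundaryCondition(False)
    sp.setHasOnlySubstanceUnits(False)

    rxn = model.createReaction()
    rxn.setId("ATP_hydrolysis")               # hypothetical reaction
    rxn.setReversible(False)
    rxn.setFast(False)
    reactant = rxn.createReactant()
    reactant.setSpecies("ATP")
    reactant.setStoichiometry(1.0)
    reactant.setConstant(True)

    print(libsbml.writeSBMLToString(doc))     # serialized SBML XML
    ```

    Scaling this flat species/reaction structure to thousands of interdependent submodels is the kind of challenge the abstract identifies with current standards.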

    A study of the link between cosmic rays and clouds with a cloud chamber at the CERN PS

    Recent satellite data have revealed a surprising correlation between galactic cosmic ray (GCR) intensity and the fraction of the Earth covered by clouds. If this correlation were to be established by a causal mechanism, it could provide a crucial step in understanding the long-sought mechanism connecting solar and climate variability. The Earth's climate seems to be remarkably sensitive to solar activity, but variations of the Sun's electromagnetic radiation appear to be too small to account for the observed climate variability. However, since the GCR intensity is strongly modulated by the solar wind, a GCR-cloud link may provide a sufficient amplifying mechanism. Moreover, if this connection were to be confirmed, it could have profound consequences for our understanding of the solar contributions to the current global warming. The CLOUD (Cosmics Leaving OUtdoor Droplets) project proposes to test experimentally the existence of a link between cosmic rays and cloud formation, and to understand the microphysical mechanism. CLOUD plans to perform detailed laboratory measurements in a particle beam at CERN, where all the parameters can be precisely controlled and measured. The beam will pass through an expansion cloud chamber and a reactor chamber in which the atmosphere is to be duplicated by moist air charged with selected aerosols and trace condensable vapours. An array of external detectors and mass spectrometers will be used to analyse the physical and chemical characteristics of the aerosols and trace gases during beam exposure. Where beam effects are found, the experiment will seek to evaluate their significance in the atmosphere by incorporating them into aerosol and cloud models.

    CLOUD: an atmospheric research facility at CERN

    This report is the second of two addenda to the CLOUD proposal at CERN (physics/0104048), which aims to test experimentally the existence of a link between cosmic rays and cloud formation, and to understand the microphysical mechanism. The document places CLOUD in the framework of a CERN facility for atmospheric research, and provides further details on the particle beam requirements.

    Captive outsourcing - a way to move complex products to emerging markets

    Purpose – The purpose of this paper is to examine how companies can off-shore complex product-related tasks to low-cost countries without jeopardizing their competitive advantage and intellectual property, while building a solid and sustainable business in the sourcing country. Design/methodology/approach – The underlying case concerns a multinational, globally operating engineering company delivering complex system products used as part of industrial and social infrastructure, its entry into off-shoring, and how that operation has evolved from a greenfield site into a sizeable value centre over the past six years. Findings – The case supports the proposition that companies that build a permanent, knowledge-based and proprietary presence with full product-management responsibility in lower-labour-cost countries will be more responsive in serving customers and more cost-efficient both in maintaining old infrastructure products and in delivering new ones. Furthermore, complex-product companies that focus on long-term, knowledge-based legacy building in emerging economies will not only develop a more robust global business platform for themselves but will also contribute to the sustainable development of the global economy. Originality/value – The paper presents unique descriptive data on the overall outsourcing strategy of a global engineering company and on how one of its off-shoring units has evolved since its inception.

    Memory-based scheduling of scientific computing clusters

    This study looks at how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption can decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, it is not possible to find one optimum number of parallel jobs. A better solution is based on memory utilisation, but finding an optimum memory threshold is not straightforward. Therefore, a fuzzy-logic-based algorithm was developed that dynamically adapts the memory threshold based on the overall load. In this way, it is possible to keep memory consumption stable under different workloads while achieving significantly higher throughput and energy efficiency than with traditional fixed-number-of-jobs or fixed-memory-threshold approaches.
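    The paper's actual fuzzy rules are not given in the abstract; the sketch below illustrates the general idea under assumed membership functions and step sizes: utilisation is fuzzified into "low", "ok" and "high" degrees, and the defuzzified output nudges the admission threshold up when memory is underused and down under memory pressure.

    ```python
    # Sketch (not the published algorithm) of a fuzzy controller that adapts
    # a memory threshold: low utilisation -> raise the threshold to admit
    # more parallel jobs; high utilisation -> lower it to avoid swapping.

    def membership(util, low_edge=0.6, high_edge=0.9):
        """Degrees of membership in 'low', 'ok', 'high' utilisation sets."""
        low = max(0.0, min(1.0, (low_edge - util) / low_edge))
        high = max(0.0, min(1.0, (util - high_edge) / (1.0 - high_edge)))
        ok = max(0.0, 1.0 - low - high)
        return low, ok, high

    def adapt_threshold(threshold_gb, util, step_gb=2.0):
        """Weighted-average defuzzification: rules output +step, 0, -step."""
        low, ok, high = membership(util)
        delta = (low * step_gb - high * step_gb) / (low + ok + high)
        return max(0.0, threshold_gb + delta)

    # Example scheduler loop: a new job is admitted only while committed
    # memory stays below the adaptive threshold (values are illustrative).
    threshold = 64.0                      # GB the scheduler may commit
    for util in (0.45, 0.70, 0.95):       # observed utilisation samples
        threshold = adapt_threshold(threshold, util)
        print(f"util={util:.2f} -> threshold={threshold:.1f} GB")
    ```

    Because the rule weights vary smoothly with utilisation, the threshold drifts rather than oscillating, which is what lets such a controller keep memory consumption stable across heterogeneous workloads.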