
    An open democracy

    Sovereign power is retained and shared by the citizens of a country. Using electoral tools, governing structures are formed to ensure protection of national interests. As with any institution, proper control of the government guarantees its adherence to the tasks delegated to it by its citizens. In turn, citizens have to be provided with, and are encouraged to access and evaluate, information generated by the government. On the other hand, governments generate sensitive information (e.g., intelligence, internal reports) that is required for self-evaluation and defense against threats to the nation. Governments are granted the privilege to collect, store and use such information to perform necessary tasks. How far does governmental privilege go relative to the intrinsic right of citizens to access and evaluate information?

    Open access and beyond

    Uncensored exchange of scientific results hastens progress. Open Access does not stop at the removal of price and permission barriers; censorship and reading disabilities, among other obstacles, still hamper access to information. Here, we invite the scientific community and the public to discuss new methods to distribute, store and manage literature in order to achieve unfettered access to it.

    Another challenge for scientists

    Scientists contribute, by vocation, to our understanding of nature and of ourselves. As communities undergo significant changes, new challenges arise. Here, we offer alternative views on recent changes in society.

    The problem of choice

    Convictions are a driving force for actions. Considering that every individual has a different set of convictions and that larger groups act once a consensus decision is reached, one can see that debate is an inherent part of decision-making. This requires a sustainably generated surplus to allow time for intellectual exchange, gathering of information and dissemination of findings. It is essential that the full spectrum of options remain treated equally. At the end of this process, a choice has to be made. Looking back at a later time point, a retrospective analysis sometimes reveals that the choice was neither completely free nor a truly conscious one. Leaving aside the consequences of a decision once made, we wish to contribute to the debate on the problem of choice.

    To know or not to know: archiving and the under-appreciated historical value of data

    Surplus goods, produced by a community, allow individuals to dedicate their efforts to abstract problems while enjoying the benefits of the community's support. In return, the community benefits from the intellectual work, say, through more efficient production of goods or improved medical care. To further elevate quality of life, we need to understand nature and biology at the most detailed level. Inevitably, research costs are increasing along with the need for more scientists to specialize their efforts. As a result, a vast amount of data and information is generated that needs to be archived and made openly accessible with permission to re-use and re-distribute. With economies undergoing crises and prosperity in an almost cyclic manner, funding for science and technology seems to follow a similar pattern. Another aspect of the problem of data loss is the human propensity, at the level of each individual researcher, to passively discard data in the course of daily life and over a career. In a typical laboratory, significant amounts of information are still stored on disks in file cabinets or on isolated computers, and are lost when a research group disbands. Being conscientious about one's data, and seeing that it reaches a place where it can persist beyond the lifespan of any one individual, requires responsibility on the part of its creator.

    Thermal Analysis of a 3D Stacked High-Performance Commercial Microprocessor using Face-to-Face Wafer Bonding Technology

    3D integration technologies are seeing widespread adoption in the semiconductor industry to offset the limitations and slowdown of two-dimensional scaling. High-density 3D integration techniques such as face-to-face wafer bonding with sub-10 μm pitch can enable new ways of designing SoCs using all three dimensions, such as folding a microprocessor design across multiple 3D tiers. However, overlapping thermal hotspots can be a challenge in such 3D stacked designs due to a general increase in power density. In this work, we perform a thorough thermal simulation study on a sign-off-quality physical design implementation of a state-of-the-art, high-performance, out-of-order microprocessor on a 7nm process technology. The physical design of the microprocessor is partitioned and implemented in a 2-tier, 3D stacked configuration with logic blocks and memory instances in separate tiers (logic-over-memory 3D). The thermal simulation model was calibrated to temperature measurement data from a high-performance, CPU-based 2D SoC chip fabricated on the same 7nm process technology. Thermal profiles of different 3D configurations under various workload conditions are simulated and compared. We find that stacking microprocessor designs in 3D without considering thermal implications can result in a maximum die temperature up to 12 °C higher than in their 2D counterparts under the worst-case power-indicative workload. This increase in temperature would reduce the amount of time for which a power-intensive workload can be run before throttling is required. However, a logic-over-memory partitioned 3D CPU implementation can cut this temperature increase in half, leaving the 3D design only 6 °C hotter than the 2D baseline. We conclude that thermal-aware design partitioning and improved cooling techniques can overcome the thermal challenges associated with 3D stacking.
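
    The reported temperature deltas follow directly from how folding a design into two tiers changes areal power density. Below is a minimal back-of-the-envelope sketch of that reasoning; the power, area, and thermal-resistance numbers are placeholders for illustration, not values from the study.

```python
# Illustrative comparison of peak temperature rise for a 2D die versus a
# naively folded 2-tier 3D stack. All numbers are placeholders.

def temp_rise(power_w, area_mm2, r_th_mm2_k_per_w):
    """Temperature rise = areal power density * area-normalized thermal resistance."""
    power_density = power_w / area_mm2          # W/mm^2
    return power_density * r_th_mm2_k_per_w     # K

POWER_W = 10.0        # total microprocessor power (placeholder)
AREA_2D_MM2 = 8.0     # 2D footprint (placeholder)
R_TH = 25.0           # junction-to-ambient resistance, K*mm^2/W (placeholder)

# Folding the design across two tiers roughly halves the footprint, so the
# areal power density (and hence the hotspot temperature rise) roughly doubles
# unless the partitioning places low-power memory on its own tier.
dt_2d = temp_rise(POWER_W, AREA_2D_MM2, R_TH)
dt_3d_naive = temp_rise(POWER_W, AREA_2D_MM2 / 2, R_TH)

print(f"2D temperature rise:   {dt_2d:.1f} K")
print(f"naive 2-tier 3D rise:  {dt_3d_naive:.1f} K")
```

    Logic-over-memory partitioning helps precisely because the memory tier contributes comparatively little power, so the effective power density seen by the hotspot grows by much less than a factor of two.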

    Trace gas/aerosol boundary concentrations and their impacts on continental-scale AQMEII modeling domains

    Over twenty modeling groups are participating in the Air Quality Model Evaluation International Initiative (AQMEII), in which a variety of mesoscale photochemical and aerosol air quality modeling systems are being applied to continental-scale domains in North America and Europe for full-year 2006 simulations intended for model inter-comparison and evaluation. To better understand the reasons for differences in model results among these participating groups, each group was asked to use the same source of emissions and boundary concentration data for their simulations. This paper describes the development and application of the boundary concentration data for this AQMEII modeling exercise. The European project known as GEMS (Global and regional Earth-system Monitoring using Satellite and in-situ data) has produced global-scale re-analyses of air quality for several years, including 2006 (http://gems.ecmwf.int). The GEMS trace gas and aerosol data were made available at 3-hourly intervals on a regular latitude/longitude grid of approximately 1.9° resolution within two "cut-outs" from the global model domain. One cut-out was centered over North America and the other over Europe, each covering a sufficient spatial domain for the modeling groups to extract the necessary time- and space-varying (horizontal and vertical) concentrations for their mesoscale model boundaries. Examples of the impact of these boundary concentrations on the AQMEII continental simulations are presented to quantify the sensitivity of the simulations to boundary concentrations. In addition, some participating groups were not able to use the GEMS data and instead relied upon other sources for their boundary concentration specifications. These are noted, and the contrasting impacts of the other data sources for boundary data are presented. How one specifies four-dimensional boundary concentrations for mesoscale air quality simulations can have a profound impact on the model results; hence, this aspect of data preparation must be performed with considerable care.
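
    As a rough illustration of the kind of preprocessing described here, the sketch below interpolates a coarse global field (such as a 1.9° cut-out) onto the lateral boundary points of a regional grid for one species, one level, and one time step. The file name, variable and dimension names, and the use of xarray/SciPy are assumptions for illustration, not the actual AQMEII or GEMS tooling; a real workflow repeats this across time steps, vertical levels, and species.

```python
# Hypothetical extraction of boundary concentrations for a regional air
# quality model from a coarse global re-analysis cut-out.
import numpy as np
import xarray as xr
from scipy.interpolate import RegularGridInterpolator

# Global cut-out on a regular lat/lon grid (file and variable names are illustrative).
ds = xr.open_dataset("gems_cutout_na_2006.nc")
o3 = ds["o3"].isel(time=0, level=0)              # one species, one time, one level

# Bilinear interpolation on the regular lat/lon grid of the cut-out.
interp = RegularGridInterpolator(
    (ds["lat"].values, ds["lon"].values), o3.values,
    bounds_error=False, fill_value=None)

# Lateral boundary points of the mesoscale domain (placeholder coordinates).
bdy_lat = np.linspace(25.0, 55.0, 200)
bdy_lon = np.full_like(bdy_lat, -130.0)          # western boundary column

bdy_o3 = interp(np.column_stack([bdy_lat, bdy_lon]))
print(bdy_o3.shape)  # concentrations to feed the regional model's western boundary
```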

    Planetary Candidates Observed by Kepler. VIII. A Fully Automated Catalog with Measured Completeness and Reliability Based on Data Release 25

    We present the Kepler Object of Interest (KOI) catalog of transiting exoplanets based on searching 4 yr of Kepler time series photometry (Data Release 25, Q1–Q17). The catalog contains 8054 KOIs, of which 4034 are planet candidates with periods between 0.25 and 632 days. Of these candidates, 219 are new, including two in multiplanet systems (KOI-82.06 and KOI-2926.05) and 10 high-reliability, terrestrial-size, habitable zone candidates. This catalog was created using a tool called the Robovetter, which automatically vets the DR25 threshold crossing events (TCEs). The Robovetter also vetted simulated data sets and measured how well it was able to separate TCEs caused by noise from those caused by low signal-to-noise transits. We discuss the Robovetter and the metrics it uses to sort TCEs. For orbital periods less than 100 days the Robovetter completeness (the fraction of simulated transits that are determined to be planet candidates) across all observed stars is greater than 85%. For the same period range, the catalog reliability (the fraction of candidates that are not due to instrumental or stellar noise) is greater than 98%. However, for low signal-to-noise candidates between 200 and 500 days around FGK-dwarf stars, the Robovetter is 76.7% complete and the catalog is 50.5% reliable. The KOI catalog, the transit fits, and all of the simulated data used to characterize this catalog are available at the NASA Exoplanet Archive.
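
    Both quoted metrics are simple fractions over the simulated data sets described in the abstract. The sketch below shows the bookkeeping with hypothetical counts rather than DR25 numbers, and the reliability estimate is a simplified stand-in for the catalog's actual formulas.

```python
# Illustrative completeness/reliability bookkeeping for a vetting catalog.
# All counts are placeholders, not DR25 values.

# Completeness: fraction of injected (simulated) transits the vetter passes
# as planet candidates.
injected_transits = 1000
injected_passed_as_pc = 870
completeness = injected_passed_as_pc / injected_transits

# Effectiveness: fraction of known false alarms (e.g., from inverted or
# scrambled light curves) the vetter correctly rejects.
simulated_false_alarms = 500
false_alarms_rejected = 490
effectiveness = false_alarms_rejected / simulated_false_alarms

# Reliability: fraction of observed candidates not caused by noise. Here the
# observed false-positive count is scaled by the measured effectiveness to
# estimate how many noise events leaked into the candidate list.
observed_pcs = 400
observed_fps = 600
est_noise_pcs = observed_fps * (1 - effectiveness) / effectiveness
reliability = 1 - est_noise_pcs / observed_pcs

print(f"completeness  = {completeness:.3f}")
print(f"effectiveness = {effectiveness:.3f}")
print(f"reliability   = {reliability:.3f}")
```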

    A Spitzer-MIPS search for dust in compact high-velocity HI clouds

    We employ three-band Spitzer-MIPS observations to search for cold dust emission in three neutral hydrogen compact high-velocity clouds (CHVCs) in the vicinity of the Milky Way. Far-infrared emission correlated with HI column density was previously reported in HVC Complex C, indicating that this object contains dust heated by the Galactic radiation field at its distance of ~10 kpc. Assuming published Spitzer, IRAS, and Planck IR-HI correlations for Complex C, our Spitzer observations are of sufficient depth to directly detect 160 μm dust emission in the CHVCs if it is present at the same level as in Complex C, but no emission is detected in any of the targets. For one of the targets (CHVC289), which has well-localized HI clumps, we therefore conclude that it is fundamentally different from Complex C, with either a lower dust-to-gas ratio or a greater distance from the Galactic disk (and consequently a cooler dust temperature). Firm conclusions cannot be drawn for the other two Spitzer-observed CHVCs, since their small-scale HI structures are not sufficiently well known; nonetheless, no extended dust emission is apparent despite their relatively high HI column densities. The lack of dust emission in CHVC289 suggests that at least some compact high-velocity clouds may exhibit very low dust-to-gas ratios and/or greater Galactocentric distances than large HVC complexes.
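
    The detectability argument scales an IR-HI correlation slope by the cloud's HI column density and compares the predicted surface brightness to the survey depth. A minimal sketch of that arithmetic follows; the slope, column density, and sensitivity are invented placeholders, not the published Complex C or MIPS values.

```python
# Illustrative detectability estimate: predicted 160 um emission from a
# hypothetical IR-HI correlation slope versus the survey sensitivity.
# All numbers are placeholders.

ir_hi_slope = 0.5e-20   # MJy/sr per (H atom cm^-2), hypothetical correlation slope
n_hi = 1.0e20           # peak HI column density of the CHVC, cm^-2 (placeholder)
survey_depth = 0.3      # 1-sigma 160 um sensitivity, MJy/sr (placeholder)

expected_160um = ir_hi_slope * n_hi          # predicted surface brightness, MJy/sr
snr = expected_160um / survey_depth

print(f"expected 160 um emission: {expected_160um:.2f} MJy/sr (~{snr:.1f} sigma)")
# A non-detection at such a depth would then point to a lower dust-to-gas ratio
# or cooler dust (i.e., a larger distance from the Galactic disk) than Complex C.
```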