
    The Tunisian Constitution between Democratic Aspiration and Constitutional Reality

    The early euphoria over a democratisation of the states of the "Arab Spring" has been dampened by recent developments in Libya and Egypt. Tunisia alone is still regarded as a hopeful candidate for successful democratic consolidation. This optimism was reinforced by the adoption of the new constitution in January 2014, which, for the first time and uniquely in the Arab context, guarantees human rights, civil liberties, and fundamental rights, and secures gender equality. It is questionable, however, whether the ratification of a constitution that is democratic, at least in formal terms, also leads to the development of a democratic political society, which is necessary for dismantling authoritarian and hybrid structures. To assess the democratic potential of the Tunisian constitution, therefore, both the constitutional reality and its social and political conditions must be examined.

    Observational Characterization of the Downward Atmospheric Longwave Radiation at the Surface in the City of São Paulo

    This work describes the seasonal and diurnal variations of downward longwave atmospheric irradiance (LW) at the surface in São Paulo, Brazil, using 5-min-averaged values of LW, air temperature, relative humidity, and solar radiation observed continuously and simultaneously from 1997 to 2006 on a micrometeorological platform located at the top of a 4-story building. An objective procedure, including 2-step filtering and dome emission effect correction, was used to evaluate the quality of the 9-yr-long LW dataset. The comparison between LW values observed and those yielded by the Surface Radiation Budget project shows spatial and temporal agreement, indicating that monthly and annual average values of LW observed at one point in São Paulo can be used as representative of the entire metropolitan region of São Paulo. The maximum monthly averaged value of the LW is observed during summer (389 ± 14 W m⁻²; January), and the minimum is observed during winter (332 ± 12 W m⁻²; July). The effective emissivity follows the LW and shows a maximum in summer (0.907 ± 0.032; January) and a minimum in winter (0.818 ± 0.029; June). The mean cloud effect, identified objectively by comparing the monthly averaged values of the LW during clear-sky days and all-sky conditions, intensified the monthly average LW by about 32.0 ± 3.5 W m⁻² and the atmospheric effective emissivity by about 0.088 ± 0.024. In August, the driest month of the year in São Paulo, the diurnal evolution of the LW shows a minimum (325 ± 11 W m⁻²) at 0900 LT and a maximum (345 ± 12 W m⁻²) at 1800 LT, which lags behind (by 4 h) the maximum diurnal variation of the screen temperature. The diurnal evolution of effective emissivity shows a minimum (0.781 ± 0.027) during daytime and a maximum (0.842 ± 0.030) during nighttime. The difference in effective emissivity between all-sky conditions and clear-sky days remains relatively constant through the day (7% ± 1%), indicating that clouds do not change the emissivity diurnal pattern. The relationship between effective emissivity and screen air temperature, and between effective emissivity and water vapor, is complex. During the night, when the planetary boundary layer is shallower, the effective emissivity can be estimated from screen parameters. During the day, the relationship between effective emissivity and screen parameters varies from place to place and depends on planetary boundary layer processes. Because the empirical expressions do not contain enough information about the diurnal variation of the vertical stratification of air temperature and moisture in São Paulo, they are likely to fail in reproducing the diurnal variation of the surface emissivity. The most accurate way to estimate the LW for clear-sky conditions in São Paulo is to use an expression derived from a purely empirical approach.
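The effective emissivity quoted in the abstract is, by its usual definition, the ratio of observed LW to the blackbody emission at screen temperature. A minimal sketch of that calculation, where the 25 °C screen temperature is an assumed value chosen for illustration (only the 389 W m⁻² January LW comes from the abstract):

```python
# Effective atmospheric emissivity via the Stefan-Boltzmann law:
# eps_eff = LW / (sigma * T^4), with T the screen air temperature in kelvin.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_emissivity(lw_wm2: float, t_screen_c: float) -> float:
    """Ratio of observed downward longwave irradiance to blackbody
    emission at the screen temperature (given in degrees Celsius)."""
    t_k = t_screen_c + 273.15
    return lw_wm2 / (SIGMA * t_k ** 4)

# January maximum from the abstract (389 W m^-2) with an assumed
# summer screen temperature of 25 C:
print(round(effective_emissivity(389.0, 25.0), 3))  # roughly 0.87
```

The result lands near the summer emissivity range reported above, though the exact value depends on the assumed temperature.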

    Virtual Data in CMS Analysis

    The use of virtual data for enhancing the collaboration between large groups of scientists is explored in several ways: by defining "virtual" parameter spaces which can be searched and shared in an organized way by a collaboration of scientists in the course of their analysis; by providing a mechanism to log the provenance of results and the ability to trace them back to the various stages in the analysis of real or simulated data; by creating "check points" in the course of an analysis to permit collaborators to explore their own analysis branches by refining selections, improving the signal-to-background ratio, varying the estimation of parameters, etc.; and by facilitating the audit of an analysis and the reproduction of its results by a different group, or in a peer-review context. We describe a prototype for the analysis of data from the CMS experiment based on the virtual data system Chimera and the object-oriented data analysis framework ROOT. The Chimera system is used to chain together several steps in the analysis process, including the Monte Carlo generation of data, the simulation of detector response, the reconstruction of physics objects, and their subsequent analysis, histogramming, and visualization using the ROOT framework. (Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003; 9 pages, LaTeX, 7 eps figures; PSN TUAT010; V2: references added.)
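The provenance-tracing idea described above can be sketched generically: each derived dataset records the step that produced it and its input datasets, so any result can be walked back to its origin. This is an illustrative sketch only, not the Chimera API; the step names mirror the analysis chain in the abstract:

```python
# Toy provenance chain: each Derivation knows its step name and inputs,
# so the full history of a result can be recovered by walking backwards.
from dataclasses import dataclass, field

@dataclass
class Derivation:
    step: str                           # e.g. "generate", "simulate"
    inputs: list = field(default_factory=list)

def provenance(d: Derivation) -> list:
    """Flatten the chain of steps that produced a result, earliest first."""
    chain = []
    for parent in d.inputs:
        chain.extend(provenance(parent))
    chain.append(d.step)
    return chain

gen  = Derivation("generate")           # Monte Carlo generation
sim  = Derivation("simulate", [gen])    # detector response
reco = Derivation("reconstruct", [sim]) # physics objects
hist = Derivation("histogram", [reco])  # final analysis output
print(provenance(hist))  # ['generate', 'simulate', 'reconstruct', 'histogram']
```

In a real virtual-data system the same record also allows a step to be re-executed on demand rather than merely traced.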

    Addressing the clumsiness loophole in a Leggett-Garg test of macrorealism

    The rise of quantum information theory has lent new relevance to experimental tests for non-classicality, particularly in controversial cases such as adiabatic quantum computing with superconducting circuits. The Leggett-Garg inequality is a "Bell inequality in time" designed to indicate whether a single quantum system behaves in a macrorealistic fashion. Unfortunately, a violation of the inequality can only show that the system is either (i) non-macrorealistic or (ii) macrorealistic but subjected to a measurement technique that happens to disturb the system. The "clumsiness" loophole (ii) provides reliable refuge for the stubborn macrorealist, who can invoke it to brand recent experimental and theoretical work on the Leggett-Garg test inconclusive. Here, we present a revised Leggett-Garg protocol that permits one to conclude that a system is either (i) non-macrorealistic or (ii) macrorealistic but with the property that two seemingly non-invasive measurements can somehow collude and strongly disturb the system. By providing an explicit check of the invasiveness of the measurements, the protocol replaces the clumsiness loophole with a significantly smaller "collusion" loophole. (Comment: 7 pages, 3 figures.)
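The violation the test is built on can be checked numerically with the standard textbook example (this is the ordinary Leggett-Garg setup, not the revised protocol of the paper): for a qubit precessing at frequency ω and measured at three equally spaced times, the two-time correlators are C(t) = cos(ωt), and macrorealism bounds K = C₁₂ + C₂₃ − C₁₃ by 1.

```python
# Leggett-Garg combination K = C12 + C23 - C13 for a precessing qubit,
# with equally spaced measurements separated by phase = omega * tau.
import math

def lg_parameter(phase: float) -> float:
    c12 = math.cos(phase)        # correlator between measurements 1 and 2
    c23 = math.cos(phase)        # between 2 and 3 (same spacing)
    c13 = math.cos(2 * phase)    # between 1 and 3
    return c12 + c23 - c13

k = lg_parameter(math.pi / 3)    # spacing giving the maximal violation
print(round(k, 3))               # 1.5 > 1: the macrorealist bound is broken
```

Any macrorealistic theory with non-invasive measurements predicts K ≤ 1, so the quantum value 1.5 is what forces the macrorealist to fall back on the clumsiness loophole the abstract describes.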

    Locating bugs without looking back

    Bug localisation is a core program comprehension task in software maintenance: given the observation of a bug, e.g. via a bug report, where is it located in the source code? Information retrieval (IR) approaches see the bug report as the query, and the source code files as the documents to be retrieved, ranked by relevance. Such approaches have the advantage of not requiring expensive static or dynamic analysis of the code. However, current state-of-the-art IR approaches rely on project history, in particular previously fixed bugs or previous versions of the source code. We present a novel approach that directly scores each current file against the given report, thus not requiring past code and reports. The scoring method is based on heuristics identified through manual inspection of a small sample of bug reports. We compare our approach to eight others, using their own five metrics on their own six open source projects. Out of 30 performance indicators, we improve 27 and equal 2. Over the projects analysed, on average we find one or more affected files in the top 10 ranked files for 76% of the bug reports.
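The report-as-query idea can be made concrete with a deliberately simple toy scorer (an assumed illustration, not the authors' heuristics): rank each current source file by the lexical overlap between the bug-report text and the file's contents, with no project history involved.

```python
# History-free IR-style bug localisation: score files against a report
# by counting shared word tokens, then rank best-first.
import re

def tokens(text: str) -> set:
    """Lowercased alphabetic tokens of length >= 3."""
    return set(re.findall(r"[a-zA-Z]{3,}", text.lower()))

def rank_files(report: str, files: dict) -> list:
    """Return file names sorted by token overlap with the report."""
    query = tokens(report)
    scores = {name: len(query & tokens(src)) for name, src in files.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical two-file project and report:
files = {
    "parser.py": "def parse_header(line): raise HeaderError",
    "render.py": "def draw_widget(canvas): pass",
}
report = "Crash with HeaderError when parsing a malformed header line"
print(rank_files(report, files))  # ['parser.py', 'render.py']
```

The paper's actual scoring uses manually identified heuristics rather than raw overlap, but the retrieval framing — report as query, files as documents, top-ranked files inspected first — is the same.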

    Behavioural compensation by drivers of a simulator when using a vision enhancement system

    Technological progress is suggesting dramatic changes to the tasks of the driver, with the general aim of making the driving environment safer. Before any of these technologies are implemented, empirical research is required to establish whether these devices do, in fact, bring about the anticipated improvements. Initially, at least, simulated driving environments offer a means of conducting this research. The study reported here concentrates on the application of a vision enhancement (VE) system within the risk homeostasis paradigm. It was anticipated, in line with risk homeostasis theory, that drivers would compensate for the reduction in risk by increasing speed. The results support the hypothesis, although, after a simulated failure of the VE system, drivers did reduce their speed due to reduced confidence in the reliability of the system.

    Adsorption and temperature-dependent decomposition of SO<sub>2</sub> on Cu(100) and Cu(111): A fast and high-resolution core-level spectroscopy study

    The adsorption and temperature-dependent decomposition of SO2 on Cu(100) and Cu(111) have been studied by fast and high-resolution core-level photoemission. The analysis of the S 2p and O 1s data shows that molecular SO2 adsorption dominates at 170 K. On heating the SO2-covered surfaces to about room temperature, SO2 decomposes into SO + O + S. On further heating, SO and O recombine to form SO2, which is the only species detected in corresponding temperature-programmed desorption (TPD) experiments. From the temperature- (time-) dependent S and O coverages, a "TPD curve" can be constructed.
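The construction hinted at in the last sentence amounts to differentiating coverage with respect to temperature: the desorption signal at each temperature is proportional to the coverage lost per temperature step, −dθ/dT. A minimal finite-difference sketch, with purely illustrative coverage numbers (not the paper's data):

```python
# Build a "TPD curve" from temperature-dependent coverages: the
# desorption rate is approximated as -d(theta)/dT at interval midpoints.
def tpd_curve(temps, coverages):
    """Finite-difference desorption rate -d(theta)/dT per interval."""
    rates = []
    for i in range(len(temps) - 1):
        dT = temps[i + 1] - temps[i]
        rates.append(-(coverages[i + 1] - coverages[i]) / dT)
    return rates

temps = [300, 320, 340, 360]   # K (assumed grid)
theta = [1.0, 0.9, 0.5, 0.1]   # assumed coverage in monolayers
print([round(r, 3) for r in tpd_curve(temps, theta)])  # [0.005, 0.02, 0.02]
```

A peak in this curve marks the temperature range of fastest desorption, which is what a conventional TPD experiment measures directly.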