
    Genuine Counterfactual Communication with a Nanophotonic Processor

    In standard communication, information is carried by particles or waves. Counterintuitively, in counterfactual communication, particles and information can travel in opposite directions. The quantum Zeno effect allows Bob to transmit a message to Alice by encoding information in particles he never interacts with. The first suggested protocol not only required thousands of ideal optical components, but also resulted in a so-called "weak trace" of the particles having travelled from Bob to Alice, calling the scalability and counterfactuality of previous proposals and experiments into question. Here we overcome these challenges, implementing a new protocol in a programmable nanophotonic processor, based on reconfigurable silicon-on-insulator waveguides that operate at telecom wavelengths. This, together with our telecom single-photon source and highly efficient superconducting nanowire single-photon detectors, provides a versatile and stable platform for a high-fidelity implementation of genuinely trace-free counterfactual communication, allowing us to actively tune the number of steps in the Zeno measurement and achieve a bit error probability below 1%, with neither post-selection nor a weak trace. Our demonstration shows how our programmable nanophotonic processor could be applied to more complex counterfactual tasks and quantum information protocols. Comment: 6 pages, 4 figures.

    New Seismic Source Zone Model for Portugal and Azores

    The development of seismogenic source models is one of the first steps in seismic hazard assessment. In seismic hazard terminology, seismic source zones (SSZ) are polygons (or volumes) that delineate areas with homogeneous characteristics of seismicity. The importance of using knowledge on geology, seismicity and tectonics in the definition of source zones has been recognized for a long time [1]. However, the definition of SSZ tends to be subjective and controversial. Using SSZ based on broad geology, by spreading the seismicity clusters throughout the areal extent of a zone, provides a way to account for possible long-term non-stationary seismicity behavior [2,3]. This approach effectively increases seismicity rates in regions with no significant historical or instrumental seismicity, while decreasing seismicity rates in regions that display higher rates of seismicity. In contrast, the use of SSZ based on concentrations of seismicity or spatial smoothing results in stationary behavior [4]. In the FP7 Project SHARE (Seismic Hazard Harmonization in Europe), seismic hazard will be assessed with a logic tree approach that allows for three types of branches for seismicity models: a) smoothed seismicity, b) SSZ, c) SSZ and faults. In this context, a large-scale zonation model for use in the smoothed seismicity branch, and a new consensus SSZ model for Portugal and Azores have been developed. The new models were achieved with the participation of regional experts by combining and adapting existing models and incorporating new regional knowledge of the earthquake potential. The main criteria used for delineating the SSZ include distribution of seismicity, broad geological architecture, crustal characteristics (oceanic versus continental, tectonically active versus stable, etc.), historical catalogue completeness, and the characteristics of active or potentially-active faults. 
This model will be integrated into an Iberian model of SSZ to be used in the Project SHARE seismic hazard assessment.
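The "smoothed seismicity" branch mentioned above spreads observed epicentres into continuous rate densities rather than confining them to zone polygons. A minimal sketch of the idea, using a Gaussian kernel on projected coordinates (an illustrative toy, not the SHARE implementation; the bandwidth and grid are assumptions):

```python
import numpy as np

def smoothed_rates(epicenters, grid, bandwidth_km=50.0):
    """Gaussian-kernel smoothing of epicentre locations into rate densities.

    epicenters: (N, 2) array of (x, y) positions in km (projected coordinates).
    grid:       (M, 2) array of grid-cell centre positions in km.
    Returns an (M,) array of relative activity rates; each event contributes
    a total weight of 1 across the grid, so the rates sum to N.
    """
    diff = grid[:, None, :] - epicenters[None, :, :]   # (M, N, 2) offsets
    d2 = (diff ** 2).sum(axis=-1)                      # squared distances
    k = np.exp(-d2 / (2.0 * bandwidth_km ** 2))        # Gaussian kernel
    k /= k.sum(axis=0, keepdims=True)                  # normalise per event
    return k.sum(axis=1)                               # sum over events

# Toy example: two clustered events and one isolated event on a line of cells
events = np.array([[0.0, 0.0], [10.0, 0.0], [200.0, 0.0]])
cells = np.array([[x, 0.0] for x in range(0, 201, 50)], dtype=float)
rates = smoothed_rates(events, cells)
```

With a 50 km bandwidth, the cell nearest the two clustered events receives the highest rate; a broader bandwidth would spread rates further, mimicking the long-term non-stationary behaviour the broad-geology zones are meant to capture.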

    On instantons as Kaluza-Klein modes of M5-branes

    Instantons and W-bosons in 5d maximally supersymmetric Yang-Mills theory arise from a circle compactification of the 6d (2,0) theory as Kaluza-Klein modes and winding self-dual strings, respectively. We study an index which counts BPS instantons with electric charges in the Coulomb and symmetric phases. We first prove the existence of a unique threshold bound state of (noncommutative) U(1) instantons for any instanton number, and also show that charged instantons in the Coulomb phase correctly give the degeneracy of SU(2) self-dual strings. By studying SU(N) self-dual strings in the Coulomb phase, we find novel momentum-carrying degrees on the worldsheet. The total number of these degrees equals the anomaly coefficient of the SU(N) (2,0) theory. We finally show that our index can be used to study the symmetric phase of this theory, and provide an interpretation as the superconformal index of the sigma model on instanton moduli space. Comment: 54 pages, 2 figures. v2: references added, figure improved, added comments on self-dual string anomaly, added new material on the symmetric phase index, other minor corrections.

    The New ‘Hidden Abode’: Reflections on Value and Labour in the New Economy

    In a pivotal section of Capital, volume 1, Marx (1976: 279) notes that, in order to understand the capitalist production of value, we must descend into the ‘hidden abode of production’: the site of the labour process conducted within an employment relationship. In this paper we argue that by remaining wedded to an analysis of labour that is confined to the employment relationship, Labour Process Theory (LPT) has missed a fundamental shift in the location of value production in contemporary capitalism. We examine this shift through the work of Autonomist Marxists like Hardt and Negri, Lazzarato and Arvidsson, who offer theoretical leverage to prize open a new ‘hidden abode’ outside employment, for example in the ‘production of organization’ and in consumption. Although they can open up this new ‘hidden abode’, without LPT's fine-grained analysis of control/resistance, indeterminacy and structured antagonism, these theorists risk succumbing to empirically naive claims about the ‘new economy’. Through developing an expanded conception of a ‘new hidden abode’ of production, the paper demarcates an analytical space in which both LPT and Autonomist Marxism can expand and develop their understanding of labour and value production in today's economy.

    A Sample of Intermediate-Mass Star-Forming Regions: Making Stars at Mass Column Densities <1 g/cm^2

    In an effort to understand the factors that govern the transition from low- to high-mass star formation, we identify for the first time a sample of intermediate-mass star-forming regions (IM SFRs) where stars up to - but not exceeding - 8 solar masses are being produced. We use IRAS colors and Spitzer Space Telescope mid-IR images, in conjunction with millimeter continuum and CO maps, to compile a sample of 50 IM SFRs in the inner Galaxy. These are likely to be precursors to Herbig AeBe stars and their associated clusters of low-mass stars. IM SFRs constitute embedded clusters at an early evolutionary stage akin to compact HII regions, but they lack the massive ionizing central star(s). The photodissociation regions that demarcate IM SFRs have typical diameters of ~1 pc and luminosities of ~10^4 solar luminosities, making them an order of magnitude less luminous than (ultra)compact HII regions. IM SFRs coincide with molecular clumps of mass ~10^3 solar masses which, in turn, lie within larger molecular clouds spanning the lower end of the giant molecular cloud mass range, 10^4-10^5 solar masses. The IR luminosity and associated molecular mass of IM SFRs are correlated, consistent with the known luminosity-mass relationship of compact HII regions. Peak mass column densities within IM SFRs are ~0.1-0.5 g/cm^2, a factor of several lower than ultra-compact HII regions, supporting the proposition that there is a threshold for massive star formation at ~1 g/cm^2. Comment: 61 pages, 6 tables, 20 figures. Accepted for publication in the Astronomical Journal.
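The quoted column densities follow from the clump properties given in the abstract via Sigma = M / (pi R^2). A quick sketch of the arithmetic (mean column density over the clump, which is cruder than the peak values the survey measures):

```python
import math

M_SUN_G = 1.989e33   # solar mass in grams
PC_CM = 3.086e18     # parsec in centimetres

def mass_column_density(mass_msun, diameter_pc):
    """Mean mass column density Sigma = M / (pi R^2), returned in g/cm^2."""
    radius_cm = 0.5 * diameter_pc * PC_CM
    return (mass_msun * M_SUN_G) / (math.pi * radius_cm ** 2)

# Typical IM SFR from the abstract: a ~10^3 solar-mass clump within ~1 pc
sigma = mass_column_density(1.0e3, 1.0)   # ~0.27 g/cm^2
```

The result lands inside the quoted ~0.1-0.5 g/cm^2 range and a factor of a few below the proposed ~1 g/cm^2 threshold for massive star formation.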

    Effect of sampling rate on acceleration and counts of hip- and wrist-worn ActiGraph accelerometers in children

    The sampling rate (Hz) of ActiGraph accelerometers may affect the processing of acceleration into activity counts when using a hip-worn monitor, but research is needed to quantify whether sampling rate affects actual acceleration (mg) when using wrist-worn accelerometers and during non-locomotive activities. Objective: To assess the effect of ActiGraph sampling rate on total counts/15-sec and mean acceleration, and to compare differences due to sampling rate between accelerometer wear locations and across different types of activities. Approach: Children (n=29) wore a hip- and wrist-worn accelerometer (sampled at 100 Hz, downsampled in MATLAB to 30 Hz) during rest/transition periods, active video games, and a treadmill test to volitional exhaustion. Mean acceleration and counts/15-sec were computed for each axis and as vector magnitude. Main Results: There were mostly no significant differences in mean acceleration. However, 100 Hz data resulted in significantly more total counts/15-sec (mean bias 4-43 counts/15-sec across axes) for both the hip- and wrist-worn monitor when compared to 30 Hz data. Absolute differences increased with activity intensity (hip: r=0.46-0.63; wrist: r=0.26-0.55) and were greater for hip- versus wrist-worn monitors. Percent agreement between 100 and 30 Hz data was high (97.4-99.7%) when cut-points or machine learning algorithms were used to classify activity intensity. Significance: Our findings confirm that sampling rate affects the generation of counts, and add that differences increase with intensity and when using hip-worn monitors. We recommend that researchers be consistent and vigilantly report the sampling rate used, but note that classifying data into activity intensities resulted in agreement despite differences in sampling rate.
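The finding that raw mean acceleration is largely insensitive to downsampling can be illustrated with a small sketch. This uses simple linear-interpolation resampling on a synthetic signal; the study itself used MATLAB, and ActiGraph's count algorithm is proprietary, so the pipeline below is an assumption for illustration only:

```python
import numpy as np

def resample(signal, fs_in, fs_out):
    """Resample a 1-D signal from fs_in to fs_out Hz by linear interpolation
    (illustrative; not the ActiGraph/MATLAB downsampling pipeline)."""
    t_in = np.arange(len(signal)) / fs_in
    t_out = np.arange(0.0, t_in[-1], 1.0 / fs_out)
    return np.interp(t_out, t_in, signal)

rng = np.random.default_rng(0)
# Synthetic 10 s of "acceleration" at 100 Hz: a 1 Hz movement component
# plus sensor noise (amplitudes are arbitrary choices)
t = np.arange(0.0, 10.0, 0.01)
acc100 = 0.5 * np.sin(2 * np.pi * 1.0 * t) + 0.05 * rng.standard_normal(t.size)
acc30 = resample(acc100, 100, 30)

mean100 = np.abs(acc100).mean()
mean30 = np.abs(acc30).mean()
```

The two mean accelerations come out nearly identical, mirroring the "mostly no significant differences in mean acceleration" result; count algorithms, by contrast, involve filtering and thresholding steps that are more sensitive to the input rate.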

    The economics of free: freemium games, branding and the impatience economy

    The gaming industry has seen dramatic change and expansion with the emergence of ‘casual’ games that promote shorter periods of game play. Free to download, but structured around micro-payments, these games raise the complex relationship between game design and commercial strategies. Although offering a free gameplay experience in line with open access philosophies, these games also create systems that offer control over the temporal dynamics of that experience to monetise player attention and inattention. This article will examine three ‘freemium’ games, Snoopy Street Fair, The Simpsons’ Tapped Out and Dragonvale, to explore how they combine established branding strategies with gameplay methods that monetise player impatience. In examining these games, this article will ultimately indicate the need for game studies to interrogate the intersection between commercial motivations and game design, and a broader need for media and cultural studies to consider the social, cultural, economic and political implications of impatience.

    Superconformal symmetry and maximal supergravity in various dimensions

    In this paper we explore the relation between conformal superalgebras with 64 supercharges and maximal supergravity theories in three, four and six dimensions using twistorial oscillator techniques. The massless fields of N=8 supergravity in four dimensions were shown to fit into a CPT-self-conjugate doubleton supermultiplet of the conformal superalgebra SU(2,2|8) a long time ago. We show that the fields of maximal supergravity in three dimensions can similarly be fitted into the super singleton multiplet of the conformal superalgebra OSp(16|4,R), which is related to the doubleton supermultiplet of SU(2,2|8) by dimensional reduction. Moreover, we construct the ultra-short supermultiplet of the six-dimensional conformal superalgebra OSp(8*|8) and show that its component fields can be organized in an on-shell superfield. The ultra-short OSp(8*|8) multiplet reduces to the doubleton supermultiplet of SU(2,2|8) upon dimensional reduction. We discuss the possibility of a chiral maximal (4,0) six-dimensional supergravity theory with USp(8) R-symmetry that reduces to maximal supergravity in four dimensions and is different from six-dimensional (2,2) maximal supergravity, whose fields cannot be fitted into a unitary supermultiplet of a simple conformal superalgebra. Such an interacting theory would be the gravitational analog of the (2,0) theory. Comment: 54 pages, PDFLaTeX, Section 5 and several references added. Version accepted for publication in JHEP.

    The influence of calcium and magnesium in drinking water and diet on cardiovascular risk factors in individuals living in hard and soft water areas with differences in cardiovascular mortality

    BACKGROUND: The role of water hardness as a risk factor for cardiovascular disease has been widely investigated and evaluated as regards regional differences in cardiovascular disease. This study was performed to evaluate the relation between calcium and magnesium in drinking water and diet and risk factors for cardiovascular disease in individuals living in hard and soft water areas with considerable differences in cardiovascular mortality. METHODS: A random sample of 207 individuals living in two municipalities characterised by differences in cardiovascular mortality and water hardness was invited for an examination including a questionnaire about health, social and living conditions and diet. Intake of magnesium and calcium was calculated from the diet questionnaire with special consideration to the use of local water. Household water samples were delivered by each individual and were analysed for magnesium and calcium. RESULTS: In the total sample, there were positive correlations between the calcium content in household water and systolic blood pressure (SBP) and negative correlations with s-cholesterol and s-LDL-cholesterol. No correlation was seen between the magnesium content in household water and any of the risk factors. Calcium content in diet showed no correlation to cardiovascular risk factors. Magnesium in diet was positively correlated to diastolic blood pressure (DBP). In regression analyses controlled for age and sex, 18.5% of the variation in SBP was explained by the variation in BMI, HbA1c and calcium content in water. Some 27.9% of the variation in s-cholesterol could be explained by the variation in s-triglycerides (TG) and calcium content in water. CONCLUSIONS: This study of individuals living in soft and hard water areas showed significant correlations between the content of calcium in water and major cardiovascular risk factors. This was not found for magnesium in water or calcium or magnesium in diet.
Regression analyses indicated that calcium content in water could be a factor in the complexity of relationships and importance of cardiovascular risk factors. From these results it is not possible to conclude any definite causal relation, and further research is needed.
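The "18.5% of the variation in SBP was explained by" statements above are coefficients of determination (R^2) from multiple linear regression. A minimal sketch of the computation on synthetic data (the predictors, coefficients and noise level below are invented for illustration, not the study's data):

```python
import numpy as np

def r_squared(X, y):
    """Fraction of variance in y explained by a least-squares linear fit on X."""
    X1 = np.column_stack([np.ones(len(y)), X])        # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)     # ordinary least squares
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
n = 207  # same sample size as the study
# Synthetic predictors and a weakly dependent synthetic SBP outcome
bmi = rng.normal(25.0, 4.0, n)
hba1c = rng.normal(5.0, 0.5, n)
water_ca = rng.normal(30.0, 10.0, n)
sbp = 110 + 0.8 * bmi + 3.0 * hba1c + 0.1 * water_ca + rng.normal(0.0, 12.0, n)

X = np.column_stack([bmi, hba1c, water_ca])
r2 = r_squared(X, sbp)
```

With weak effects and large residual noise, R^2 lands well below 1, which is the typical situation for population risk-factor models like the 18.5% and 27.9% figures reported here.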

    The 2013 European Seismic Hazard Model: key components and results

    The 2013 European Seismic Hazard Model (ESHM13) results from a community-based probabilistic seismic hazard assessment supported by the EU-FP7 project “Seismic Hazard Harmonization in Europe” (SHARE, 2009–2013). The ESHM13 is a consistent seismic hazard model for Europe and Turkey which overcomes the limitation of national borders and includes a thorough quantification of the uncertainties. It is the first completed regional effort contributing to the “Global Earthquake Model” initiative. It might serve as a reference model for various applications, from earthquake preparedness to earthquake risk mitigation strategies, including the update of the European seismic regulations for building design (Eurocode 8), and thus it is useful for future safety assessment and improvement of private and public buildings. Although its results constitute a reference for Europe, they do not replace the existing national design regulations that are in place for seismic design and construction of buildings. The ESHM13 represents a significant improvement compared to previous efforts as it is based on (1) the compilation of updated and harmonised versions of the databases required for probabilistic seismic hazard assessment, (2) the adoption of standard procedures and robust methods, especially for expert elicitation and consensus building among hundreds of European experts, (3) the multi-disciplinary input from all branches of earthquake science and engineering, (4) the direct involvement of the CEN/TC250/SC8 committee in defining output specifications relevant for Eurocode 8 and (5) the accounting for epistemic uncertainties of model components and hazard results. Furthermore, enormous effort was devoted to transparently document and ensure open availability of all data, results and methods through the European Facility for Earthquake Hazard and Risk (www.efehr.org).