601 research outputs found

    Re-Politicising Regulation: Politics, Regulatory Variation and Fuzzy Liberalisation in the Single European Energy Market

    Get PDF
    [From the introduction] The idea that we are living in the age of the regulatory state has dominated the study of public policy in the European Union and its member states in general, and the study of the utilities sectors in particular [1]. The European Commission’s continuous drive to expand the Single Market has therefore been a free-market and rule-oriented project, driven by regulatory politics rather than policies that involve direct public expenditure. The dynamics of European integration are rooted in three central concepts: free trade, multilateral rules, and supranational cooperation. During the 1990s EU competition policy took a ‘public turn’ and set its sights on the public sector [2]. EU legislation broke up national monopolies in telecommunications, electricity and gas, and set the scene for further extension of the single market into hitherto protected sectors. Both the integration theory literature (intergovernmentalist and institutionalist alike) and the literature on the emergence of the EU as a ‘regulatory state’ assumed that this was primarily a matter of policy making: once agreement had been reached to liberalise the utilities markets, a relatively homogeneous process would follow. The regulatory state model fitted the original common market blueprint better than the old industrial policy approaches. Sector-specific studies, however, continue to reveal a less than fully homogeneous internal market. The EU has undergone momentous changes in the last two decades, which have rendered the notion of a homogeneous single market somewhat unrealistic.

    A terrestrial search for dark contents of the vacuum, such as dark energy, using atom interferometry

    Full text link
    We describe the theory and first experimental work on our concept for searching on Earth for the presence of dark content of the vacuum (DCV) using atom interferometry. Specifically, we have in mind any DCV that has not yet been detected on a laboratory scale, but might manifest itself as dark energy on the cosmological scale. The experimental method uses two atom interferometers to cancel the effect of Earth's gravity and diverse noise sources. It depends upon two assumptions: first, that the DCV possesses some spatial inhomogeneity in density, and second, that it exerts a sufficiently strong non-gravitational force on matter. The motion of the apparatus through the DCV should then lead to an irregular variation in the detected matter-wave phase shift. We discuss the nature of this signal and note the problem of distinguishing it from instrumental noise. We also discuss the relation of our experiment to what might be learned by studying the noise in gravitational wave detectors such as LIGO. The paper concludes with a projection that a future search of this nature might be carried out using an atom interferometer in an orbiting satellite. The apparatus is now being constructed.
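    A minimal numerical sketch of the common-mode cancellation idea described above, under stated assumptions: every number here (cycle count, noise level, signal scale) is hypothetical, and the DCV term is modeled as an arbitrary irregular phase simply to show that subtracting the two interferometer readouts removes gravity and shared drifts while leaving any differential signal plus uncorrelated noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                          # measurement cycles (hypothetical)
t = np.arange(n, dtype=float)       # cycle timestamps in seconds

# Common-mode phase: Earth's gravity plus a slow shared drift,
# identical in both interferometers by construction
common = 1.0e6 + 50.0 * np.sin(2 * np.pi * t / 3600.0)

# Independent instrument noise in each device (rad, hypothetical scale)
noise_a = rng.normal(0.0, 1e-3, n)
noise_b = rng.normal(0.0, 1e-3, n)

# Hypothetical DCV-induced phase: an irregular random-walk variation
# as the apparatus moves through an inhomogeneous background
dcv = 1e-4 * np.cumsum(rng.standard_normal(n)) / np.sqrt(n)

phase_a = common + dcv + noise_a    # one interferometer samples the DCV
phase_b = common + noise_b          # the other acts as a reference

diff = phase_a - phase_b            # gravity and shared drift cancel exactly
print(f"residual rms: {diff.std():.2e} rad")   # DCV + uncorrelated noise remain
```

    As the abstract notes, the hard part is then distinguishing the surviving irregular signal from the uncorrelated instrument noise, for instance via its correlation with the apparatus's motion.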

    A Testable Solution of the Cosmological Constant and Coincidence Problems

    Full text link
    We present a new solution to the cosmological constant (CC) and coincidence problems in which the observed value of the CC, Lambda, is linked to other observable properties of the universe. This is achieved by promoting the CC from a parameter which must be specified to a field which can take many possible values. The observed value of Lambda ~ 1/(9.3 Gyr)^2 (approximately 10^(-120) in Planck units) is determined by a new constraint equation which follows from the application of a causally restricted variation principle. When applied to our visible universe, the model makes a testable prediction for the dimensionless spatial curvature of Omega_k0 = -0.0056 s_b/0.5, where s_b ~ 1/2 is a QCD parameter. Requiring that a classical history exist, our model determines the probability of observing a given Lambda. The observed CC value, which we successfully predict, is typical within our model even before the effects of anthropic selection are included. When anthropic selection effects are accounted for, we find that the observed coincidence between t_Lambda = Lambda^(-1/2) and the age of the universe, t_U, is a typical occurrence in our model. In contrast to multiverse explanations of the CC problems, our solution is independent of the choice of a prior weighting of different Lambda-values and does not rely on anthropic selection effects. Our model includes no unnatural small parameters and does not require the introduction of new dynamical scalar fields or modifications to general relativity, and it can be tested by astronomical observations in the near future. Comment: 31 pages, 4 figures; v2: version accepted by Phys. Rev.
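    A quick arithmetic check of the quoted magnitude (our own back-of-envelope, not from the paper): converting Lambda ~ 1/(9.3 Gyr)^2 to Planck units gives roughly 10^(-122) to 10^(-120), depending on whether the conventional factor of 8*pi is absorbed into the definition, consistent with the abstract's "approximately 10^(-120)".

```python
import math

GYR_S = 3.156e16            # seconds per gigayear
T_PLANCK = 5.391e-44        # Planck time in seconds

t_lam = 9.3 * GYR_S                 # timescale quoted in the abstract
lam = 1.0 / t_lam**2                # Lambda in s^-2
lam_planck = lam * T_PLANCK**2      # dimensionless (c = G = hbar = 1)

print(f"Lambda      ~ {lam_planck:.1e} in Planck units")    # ~3.4e-122
print(f"8*pi*Lambda ~ {8 * math.pi * lam_planck:.1e}")      # ~8.5e-121
```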

    Is Cosmology Solved?

    Get PDF
    We have fossil evidence from the thermal background radiation that our universe expanded from a considerably hotter denser state. We have a well defined and testable description of the expansion, the relativistic Friedmann-Lemaître model. Its observational successes are impressive but I think hardly enough for a convincing scientific case. The lists of observational constraints and free hypotheses within the model have similar lengths. The scorecard on the search for concordant measures of the mass density parameter and the cosmological constant shows that the high density Einstein-de Sitter model is challenged, but that we cannot choose between low density models with and without a cosmological constant. That is, the relativistic model is not strongly overconstrained, the usual test of a mature theory. Work in progress will greatly improve the situation and may at last yield a compelling test. If so, and the relativistic model survives, it will close one line of research in cosmology: we will know the outlines of what happened as our universe expanded and cooled from high density. It will not end research: some of us will occupy ourselves with the details of how galaxies and other large-scale structures came to be the way they are, others with the issue of what our universe was doing before it was expanding. The former is being driven by rapid observational advances. The latter is being driven mainly by theory, but there are hints of observational guidance. Comment: 13 pages, 3 figures. To be published in PASP as part of the proceedings of the Smithsonian debate, Is Cosmology Solved?
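    To make the model comparison in the abstract concrete, here is a small sketch (our illustration using textbook Friedmann-Lemaître formulas, not material from the paper) computing the expansion age t_0 = (1/H_0) * Integral[0,1] da / sqrt(Omega_m/a + Omega_k + Omega_Lambda * a^2) for the Einstein-de Sitter model and the two low-density alternatives mentioned; H_0 = 70 km/s/Mpc is an assumed illustrative value.

```python
import numpy as np
from scipy.integrate import quad

H0 = 70.0                   # Hubble constant, km/s/Mpc (illustrative)
H0_INV_GYR = 977.8 / H0     # 1/H0 expressed in Gyr

def age_gyr(om, ol):
    """Expansion age of a Friedmann-Lemaitre model with matter om, Lambda ol."""
    ok = 1.0 - om - ol      # spatial curvature term
    integrand = lambda a: 1.0 / np.sqrt(om / a + ok + ol * a**2)
    t, _ = quad(integrand, 0.0, 1.0)
    return t * H0_INV_GYR

print(f"Einstein-de Sitter (1.0, 0.0): {age_gyr(1.0, 0.0):4.1f} Gyr")  # ~9.3
print(f"open, low density  (0.3, 0.0): {age_gyr(0.3, 0.0):4.1f} Gyr")  # ~11.3
print(f"flat, with Lambda  (0.3, 0.7): {age_gyr(0.3, 0.7):4.1f} Gyr")  # ~13.5
```

    The expansion age is one of the classical discriminants: with this H_0 the Einstein-de Sitter model is uncomfortably young compared with the oldest stars, while the two low-density variants are not.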

    On Gravitational Waves in Spacetimes with a Nonvanishing Cosmological Constant

    Full text link
    We study the effect of a cosmological constant Λ on the propagation and detection of gravitational waves. To this purpose we investigate the linearised Einstein equations with terms up to linear order in Λ in a de Sitter and an anti-de Sitter background spacetime. In this framework the cosmological term does not induce changes in the polarization states of the waves, whereas the amplitude gets modified with terms depending on Λ. Moreover, if a source emits a periodic waveform, its periodicity as measured by a distant observer gets modified. These effects are, however, extremely tiny, some twenty orders of magnitude below the detectability of present gravitational wave detectors such as LIGO or future planned ones such as LISA. Comment: 8 pages, 4 figures, accepted for publication in Physical Review
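    A rough scale estimate of why the effect is so small (our own back-of-envelope; the paper's detailed result may differ): the natural dimensionless parameter comparing the cosmological term to a wave of angular frequency omega is Lambda*c^2/omega^2.

```python
import math

LAMBDA = 1.1e-52    # observed cosmological constant, m^-2
C = 2.998e8         # speed of light, m/s

for f_hz, band in [(100.0, "LIGO band (~100 Hz)"), (1e-2, "LISA band (~10 mHz)")]:
    omega = 2 * math.pi * f_hz
    eps = LAMBDA * C**2 / omega**2   # dimensionless size of Lambda corrections
    print(f"{band}: Lambda*c^2/omega^2 ~ {eps:.1e}")
```

    For the LIGO band this gives ~10^(-41); set against typical detectable strains of ~10^(-21), the correction sits roughly twenty orders of magnitude out of reach, matching the abstract's statement.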

    Experimental Designs for Binary Data in Switching Measurements on Superconducting Josephson Junctions

    Full text link
    We study the optimal design of switching measurements of small Josephson junction circuits which operate in the macroscopic quantum tunnelling regime. Starting from the D-optimality criterion, we derive the optimal design for the estimation of the unknown parameters of the underlying Gumbel-type distribution. As a practical method for the measurements, we propose a sequential design that combines heuristic search for initial estimates with maximum likelihood estimation. The presented design has immediate applications in superconducting electronics, enabling faster data acquisition. The presented experimental results confirm the usefulness of the method. KEY WORDS: optimal design, D-optimality, logistic regression, complementary log-log link, quantum physics, escape measurement
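    A minimal sketch of the maximum-likelihood step for binary switching data under a complementary log-log link (the Gumbel-type model named in the abstract). The pulse heights, sample size, and "true" parameters below are hypothetical, and the fit is written out by hand rather than with a GLM package to keep it self-contained.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Hypothetical switching experiment: at normalized bias x the junction
# switches with probability p(x) = 1 - exp(-exp(b0 + b1*x)), i.e. a
# complementary log-log (Gumbel-type) binary regression model.
b0_true, b1_true = -30.0, 100.0
x = rng.uniform(0.28, 0.32, size=2000)     # applied bias pulse heights
p = 1.0 - np.exp(-np.exp(b0_true + b1_true * x))
y = rng.binomial(1, p)                     # observed switch / no-switch events

def nll(beta):
    """Negative log-likelihood of the cloglog binary regression."""
    eta = beta[0] + beta[1] * x
    log_p = np.log1p(-np.exp(-np.exp(eta)))   # log p(x)
    log_q = -np.exp(eta)                      # log(1 - p(x)), exact
    return -(y * log_p + (1 - y) * log_q).sum()

fit = minimize(nll, x0=np.array([-10.0, 30.0]), method="Nelder-Mead")
print("ML estimates (b0, b1):", fit.x)        # should land near (-30, 100)
```

    A D-optimal sequential design would then choose the next bias levels to maximize the determinant of the Fisher information matrix of (b0, b1), which is the refinement the paper develops.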

    De-Sitter-spacetime instability from a nonstandard vector field

    Full text link
    It is found that de Sitter spacetime, the constant-curvature matter-free solution of the Einstein equations with a positive cosmological constant, becomes classically unstable due to the dynamic effects of a certain type of vector field (fundamentally different from a gauge field). The perturbed de Sitter universe evolves towards a final exotic singularity. The relevant vector-field configurations violate the strong and dominant energy conditions. Comment: 10 pages, v7: published version

    Particle decays and stability on the de Sitter universe

    Full text link
    We study particle decay in de Sitter space-time as given by first-order perturbation theory in a Lagrangian interacting quantum field theory. We study in detail the adiabatic limit of the perturbative amplitude and compute the "phase space" coefficient exactly in the case of two equal particles produced in the disintegration. We show that for fields with masses above a critical mass m_c there is no such thing as particle stability, so that decays forbidden in flat space-time do occur here. The lifetime of such a particle also turns out to be independent of its velocity when that lifetime is comparable with the de Sitter radius. Particles with mass lower than critical have a completely different behavior: the masses of their decay products must obey quantization rules, and their lifetime is zero. Comment: LaTeX, 38 pages, 1 PostScript figure; added references, minor corrections and remarks

    A Note on the Integral Formulation of Einstein's Equations Induced on a Braneworld

    Full text link
    We revisit the integral formulation (or Green's function approach) of Einstein's equations in the context of braneworlds. The integral formulation has been proposed independently by several authors in the past, based on the assumption that it is possible to reinterpret the local metric field in curved spacetimes as an integral expression involving sources and boundary conditions. This allows one to separate source-generated and source-free contributions to the metric field. As a consequence, an exact meaning can be given to Mach's Principle, in the sense that only source-generated (matter-field) contributions to the metric are allowed; universes which do not obey this condition would be non-Machian. In this paper we revisit this idea, concentrating on a Randall-Sundrum-type model with a non-trivial cosmology on the brane. We argue that the role of the surface term (the source-free contribution) in the braneworld scenario may be considerably subtler than in the 4D formulation. This may, for instance, raise an interesting issue for the cosmological constant problem. Comment: 10 pages, no figures, accepted for publication in the General Relativity and Gravitation Journal
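    The source/surface split at the heart of the integral formulation can be written schematically as follows (our schematic, shown in linearised form; the papers in question work with the full curved-spacetime Green's function):

```latex
% Schematic Green's-function split of the metric perturbation into a
% source-generated integral plus a source-free boundary contribution.
\bar{h}_{\mu\nu}(x) \;=\; 16\pi G \int_{V}
    G_{\mu\nu}{}^{\alpha\beta}(x,x')\, T_{\alpha\beta}(x')\,
    \sqrt{-g'}\;\mathrm{d}^{4}x'
  \;+\; \text{(surface term on } \partial V\text{)}.
% A "Machian" universe is then one in which the surface term vanishes,
% so that the metric is generated entirely by the matter sources.
```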

    Merging transcriptomics and metabolomics - advances in breast cancer profiling

    Get PDF
    Background: Combining gene expression microarrays and high resolution magic angle spinning magnetic resonance spectroscopy (HR MAS MRS) of the same tissue samples enables comparison of the transcriptional and metabolic profiles of breast cancer. The aim of this study was to explore the potential of combining these two different types of information.
    Methods: Breast cancer tissue from 46 patients was analyzed by HR MAS MRS followed by gene expression microarrays. Two strategies were used to combine the gene expression and metabolic data: first, using multivariate analyses to identify different groups based on gene expression and metabolic data; second, correlating levels of specific metabolites to transcripts to suggest new hypotheses of connections between metabolite levels and the underlying biological processes. A parallel study was designed to address experimental issues of combining microarrays and HR MAS MRS.
    Results: In the first strategy, using the microarray data and previously reported molecular classification methods, the majority of samples were classified as luminal A. Three subgroups of luminal A tumors were identified based on hierarchical clustering of the HR MAS MR spectra. The samples in one of the subgroups, designated A2, showed significantly lower glucose and higher alanine levels than the other luminal A samples, suggesting a higher glycolytic activity in these tumors. This group was also enriched for genes annotated with Gene Ontology (GO) terms related to cell cycle and DNA repair. In the second strategy, the correlations between concentrations of myo-inositol, glycine, taurine, glycerophosphocholine, phosphocholine, choline and creatine and all transcripts in the filtered microarray data were investigated. GO terms related to the extracellular matrix were enriched among the genes that correlated the most to myo-inositol and taurine, while cell-cycle-related GO terms were enriched for the genes that correlated the most to choline. Additionally, a subset of transcripts was identified to have slightly altered expression after HR MAS MRS and was therefore removed from all other analyses.
    Conclusions: Combining transcriptional and metabolic data from the same breast carcinoma sample is feasible and may contribute to a more refined subclassification of breast cancers as well as reveal relations between metabolic and transcriptional levels. See Commentary: http://www.biomedcentral.com/1741-7015/8/7
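    A minimal sketch of the second strategy described above: correlating one metabolite's concentration against every transcript and flagging the strongest associations for GO enrichment. All array shapes and variable names here are hypothetical placeholders, not the study's actual data or pipeline.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
n_samples, n_genes = 46, 5000   # 46 patients; the filtered transcript count is made up

expression = rng.normal(size=(n_samples, n_genes))   # microarray matrix (placeholder)
choline = rng.normal(size=n_samples)                 # metabolite levels (placeholder)

# Correlate the metabolite concentration against every transcript
r = np.empty(n_genes)
pvals = np.empty(n_genes)
for g in range(n_genes):
    r[g], pvals[g] = pearsonr(choline, expression[:, g])

# Rank genes by |r|; the top set would be passed to GO-term enrichment
top = np.argsort(-np.abs(r))[:100]
print(f"strongest correlation: r = {r[top[0]]:.3f}, p = {pvals[top[0]]:.3g}")
```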