
    A Solution to Concerns over Public Access to Scientific Data

    To balance the public need for accountability and better policy decision making with concerns about burdens on scientists and scientific progress, the authors propose that increased access be limited to data relevant to analyzing regulations that would have an annual economic impact of at least $100 million. They also recommend establishing an agency to replicate key findings used to support regulations before those regulations are finalized.

    Should Researchers Be Required to Share Data Used in Supporting Regulatory Decisions?

    The scientific establishment is deeply concerned over a proposed regulation that would require data to be shared on projects that are federally funded. Specifically, the proposed amendment to OMB Circular A-110 would require data collected by researchers at universities, hospitals, and non-profit institutions to be shared with interested parties if (1) the data are produced as part of a grant or agreement funded by the federal government; (2) the data are used in a published study; and (3) the data or study is used in formulating a policy or rule. Parties could request the data under the Freedom of Information Act. The proposed rule responds to a provision by Senator Richard Shelby in the 1999 Omnibus Spending Bill that requires data generated under federal awards at universities and non-profit institutions to be available to the public.

    This regulatory analysis develops an economic framework for evaluating proposals to provide greater access to research data. Our analysis also offers specific recommendations for improving OMB Circular A-110 as well as the broader regulatory process. We argue that the economic analysis of sharing research findings can be separated into three parts: the impact of requiring public access on incentives to produce data, research, and innovation; the impact of that requirement on the quality of research; and the impact of required access on the efficiency and transparency of policy.

    The economic analysis demonstrates that the standard property-rights framework used to justify time-limited property rights for the use of data is not sufficient for addressing broader problems in which research and data could be used to help inform public policy decisions. The value of sharing data for public policy must also be considered. A second conclusion is that traditional peer review done by scientific journals is not adequate for purposes of relying on research for major public policy decisions. A third conclusion is that scientists who are reluctant to share their findings are more likely to have errors in their analysis than the average researcher. A fourth conclusion is that requiring the release of data could slow the development of data and delay the publication of results. Although substantial costs and uncertainty may be associated with greater public access to data, our analysis suggests that academic norms alone provide very limited access to scientific data.

    We recommend improving Circular A-110 by narrowing and clarifying the scope of the proposed regulation. The proposed regulation should apply to economically significant regulations that have an annual economic impact of at least $100 million. In addition, we recommend that Congress create an agency that would be charged with replicating the findings of regulatory agencies before such regulations could be implemented. The recommendations concerning replication would require additional legal authority. Taken together, our recommendations would help lay the foundation for a regulatory system that is more accountable and has more scientific integrity.

    The Effect of EDTA in Attachment Gain and Root Coverage

    Root surface biomodification using low-pH agents such as citric acid and tetracycline has been proposed to enhance root coverage following connective tissue grafting. The authors hypothesized that root conditioning with neutral-pH edetic acid (EDTA) would improve vertical recession depth, root surface coverage, pocket depth, and clinical attachment levels. Twenty teeth in 10 patients with Miller class I and II recession were treated with connective tissue grafting. The experimental sites received 24% edetic acid in sterile distilled water applied to the root surface for 2 minutes before grafting; control sites were pretreated with sterile distilled water only. Measurements were recorded before surgery and 6 months after surgery, and analysis of variance was used to determine differences between the experimental and control groups. Both the test and control groups showed significant postoperative improvements in vertical recession depth, root surface coverage, and clinical attachment levels compared with preoperative measurements. Pocket depth differences were not significant (P < .01).
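
    As a rough illustration of the statistical comparison described above, the sketch below runs a one-way analysis of variance on pre-to-post change scores for test and control sites. It is a minimal sketch only: the variable names and all numbers are invented placeholders, not data from the study.

        import numpy as np
        from scipy import stats

        # Hypothetical 6-month gains in clinical attachment level (mm) for the
        # EDTA-conditioned (test) and water-only (control) sites.  These values
        # are illustrative placeholders, not the study's measurements.
        test_change = np.array([2.5, 3.0, 2.0, 2.8, 3.2, 2.4, 2.9, 2.6, 3.1, 2.7])
        control_change = np.array([2.4, 2.9, 2.1, 2.7, 3.0, 2.3, 2.8, 2.5, 3.0, 2.6])

        # One-way ANOVA between the two groups; with only two groups this is
        # equivalent to an unpaired t-test on the change scores.
        f_stat, p_value = stats.f_oneway(test_change, control_change)
        print(f"F = {f_stat:.2f}, p = {p_value:.3f}")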

    "GiGa": the Billion Galaxy HI Survey -- Tracing Galaxy Assembly from Reionization to the Present

    In this paper, we review the Billion Galaxy Survey that will be carried out at radio-optical wavelengths to micro- to nanoJansky levels with the telescopes of the next decades. These are the Low-Frequency Array, the Square Kilometer Array (SKA), and the Large Synoptic Survey Telescope as survey telescopes; the Thirty Meter class telescopes for high spectral resolution with adaptive optics; and the James Webb Space Telescope (JWST) for high-spatial-resolution near- to mid-IR follow-up. With these facilities, we will address fundamental questions such as how galaxies assemble with supermassive black holes inside from the epoch of First Light until the present, how these objects started and finished the reionization of the universe, and how the processes of star formation, stellar evolution, and metal enrichment of the IGM proceeded over cosmic time. We also summarize the high-resolution science that has been done thus far on high-redshift galaxies with the Hubble Space Telescope (HST). Faint galaxies have steadily decreasing sizes at fainter fluxes and higher redshifts, reflecting the hierarchical formation of galaxies over cosmic time, and HST has imaged this process in great structural detail to z <~ 6. We show that ultradeep radio-optical surveys may slowly approach the natural confusion limit, where objects start to unavoidably overlap because of their own sizes, which only the SKA can remedy with HI redshifts for individual sub-clumps. Finally, we summarize how the 6.5 meter JWST will measure first light, reionization, and galaxy assembly in the near- to mid-IR. Comment: 8 pages, LaTeX2e, requires 'aip' style (included), 8 postscript figures. To appear in the proceedings of 'The Evolution of Galaxies through the Neutral Hydrogen Window' conference, Arecibo Observatory, Feb 1-3, 2008; Eds. R. Minchin & E. Momjian, AIP Conf Pro
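
    As a back-of-the-envelope illustration of the natural confusion limit mentioned above, the sketch below estimates the fraction of sky covered by galaxy isophotes for an assumed surface density and mean angular size; both input numbers are invented placeholders, not predictions from the paper. When the covering fraction approaches order unity, neighbouring galaxies blend into one another regardless of instrumental resolution.

        import numpy as np

        # Hypothetical inputs (placeholders only):
        n_gal = 2.0    # cumulative surface density of detected galaxies per arcsec^2
        theta = 0.3    # mean isophotal galaxy diameter in arcsec

        # Fraction of sky covered by galaxy isophotes; values near ~1 mark the
        # "natural" confusion regime, where sources overlap because of their own
        # sizes rather than because of the telescope beam.
        covering_fraction = n_gal * np.pi * (theta / 2.0) ** 2
        print(f"Isophotal covering fraction ~ {covering_fraction:.2f}")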

    Atemporal diagrams for quantum circuits

    A system of diagrams is introduced that allows the representation of various elements of a quantum circuit, including measurements, in a form which makes no reference to time (hence "atemporal"). It can be used to relate quantum dynamical properties to those of entangled states (map-state duality), and suggests useful analogies, such as the inverse of an entangled ket. Diagrams clarify the role of channel kets, transition operators, dynamical operators (matrices), and Kraus rank for noisy quantum channels. Positive (semidefinite) operators are represented by diagrams with a symmetry that aids in understanding their connection with completely positive maps. The diagrams are used to analyze standard teleportation and dense coding, and for a careful study of unambiguous (conclusive) teleportation. A simple diagrammatic argument shows that a Kraus rank of 3 is impossible for a one-qubit channel modeled using a one-qubit environment in a mixed state. Comment: Minor changes in references. LaTeX, 32 pages, 13 figures in text using PSTrick
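
    The Kraus rank mentioned above can be computed as the rank of a channel's Choi matrix. As a minimal numerical sketch (not taken from the paper), the snippet below builds the Choi matrix of a one-qubit amplitude-damping channel, chosen purely as an example, and reads off its Kraus rank.

        import numpy as np

        # One-qubit amplitude-damping channel; gamma is an arbitrary example value.
        gamma = 0.3
        K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]])
        K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])
        kraus_ops = [K0, K1]

        def apply_channel(rho, ops):
            # Phi(rho) = sum_k K_k rho K_k^dagger
            return sum(K @ rho @ K.conj().T for K in ops)

        def choi_matrix(ops, dim=2):
            # C = sum_{ij} |i><j| (tensor) Phi(|i><j|); rank(C) equals the Kraus rank.
            C = np.zeros((dim * dim, dim * dim), dtype=complex)
            for i in range(dim):
                for j in range(dim):
                    E = np.zeros((dim, dim), dtype=complex)
                    E[i, j] = 1.0
                    C += np.kron(E, apply_channel(E, ops))
            return C

        print("Kraus rank:", np.linalg.matrix_rank(choi_matrix(kraus_ops)))  # prints 2

    The same check, applied to channels built from a one-qubit mixed-state environment, is one way to probe the paper's claim that a Kraus rank of 3 cannot occur in that setting.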

    Comment on "Consistent Sets Yield Contrary Inferences in Quantum Theory"

    In a recent paper, Kent has pointed out that in consistent histories quantum theory it is possible, given initial and final states, to construct two different consistent families of histories, in each of which there is a proposition that can be inferred with probability one, and such that the projectors representing these two propositions are mutually orthogonal. In this note we stress that, according to the rules of consistent-histories reasoning, two such propositions are not contrary in the usual logical sense, namely, that one can infer that if one is true then the other is false, and both could be false. No single consistent family contains both propositions, together with the initial and final states, and hence the propositions cannot be logically compared. Consistent histories quantum theory is logically consistent, consistent with experiment as far as is known, consistent with the usual quantum predictions for measurements, and applicable to the most general physical systems. It may not be the only theory with these properties, but in our opinion, it is the most promising among present possibilities. Comment: 2 pages, uses REVTEX 3.
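
    For readers unfamiliar with the logical term, "contrary" propositions are ones that can never both be true but can both be false, which is exactly the sense spelled out above. The toy snippet below is unrelated to quantum theory and uses two made-up propositions about an integer purely to illustrate that definition.

        # Toy propositions chosen only to illustrate "contrary": P and Q are never
        # both true, yet both can be false (for example at x = 3).
        def p(x):
            return x > 5    # proposition P

        def q(x):
            return x < 2    # proposition Q

        values = range(10)
        never_both_true = all(not (p(x) and q(x)) for x in values)
        both_can_be_false = any(not p(x) and not q(x) for x in values)
        print(never_both_true, both_can_be_false)  # True True -> P and Q are contraries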

    First Lattice Study of the N-P_{11}(1440) Transition Form Factors

    Experiments at Jefferson Laboratory, MIT-Bates, LEGS, Mainz, Bonn, GRAAL, and SPring-8 offer new opportunities to understand in detail how nucleon resonance (N^*) properties emerge from the nonperturbative aspects of QCD. Preliminary data from the CLAS collaboration, which cover a large range of photon virtuality Q^2, show interesting behavior with respect to the Q^2 dependence: in the region Q^2 ≤ 1.5 GeV^2, both the transverse amplitude, A_{1/2}(Q^2), and the longitudinal amplitude, S_{1/2}(Q^2), decrease rapidly. In this work, we attempt to use first-principles lattice QCD (for the first time) to provide a model-independent study of the Roper-nucleon transition form factor. Comment: 4 pages, 2 figures, double column

    On stable homotopy equivalences
