
    Towards durable multistakeholder-generated solutions: The pilot application of a problem-oriented policy learning protocol to legality verification and community rights in Peru

    This paper reports and reflects on the pilot application of an 11-step policy learning protocol that was developed by Cashore and Lupberger (2015) based on several years of Cashore’s multi-author collaborations. The protocol was applied for the first time in Peru in 2015 and 2016 by the IUFRO Working Party on Forest Policy Learning Architectures (hereinafter referred to as the project team). The protocol integrates insights from policy learning scholarship (Hall 1993, Sabatier 1999) with Bernstein and Cashore’s (2000, 2012) four pathways of influence framework. The pilot implementation in Peru focused on how global timber legality verification interventions might be harnessed to promote local land rights. Legality verification focuses attention on the checking and auditing of forest management units in order to verify that timber is harvested and traded in compliance with the law. We specifically asked: How can community legal ownership of, and access to, forestland and forest resources be enhanced? The protocol was designed as a dynamic tool, the implementation of which fosters iterative rather than linear processes. It directly integrated two objectives: 1) identifying the causal processes through which global governance initiatives might be harnessed to produce durable results ‘on the ground’; 2) generating insights and strategies in collaboration with relevant stakeholders. This paper reviews and critically evaluates our work in designing and piloting the protocol. We assess what seemed to work well and suggest modifications, including an original diagnostic framework for nurturing durable change. We also assess the implications of the pilot application of the protocol for policy implementation that works to enhance the influence of existing international policy instruments, rather than contributing to fragmentation and incoherence by creating new ones.

    W2: ESTIMATING DRUG EFFECTS: FROM CLINICAL TRIAL RESULTS TO ACTUAL PRACTICE


    Theory of Room Temperature Ferromagnet V(TCNE)_x (1.5 < x < 2): Role of Hidden Flat Bands

    Theoretical studies on the possible origin of room-temperature ferromagnetism (ferromagnetic once crystallized) in the molecular transition-metal complex V(TCNE)_x (1.5 < x < 2) have been carried out. For this family, there has been no definitive understanding of the crystal structure so far because of sample quality, though the effective valence of V is known to be close to +2. Proposing a new crystal structure for the stoichiometric case of x = 2, where the valence of each TCNE molecule is -1 and the resistivity shows insulating behavior, the exchange interaction among d-electrons on adjacent V atoms has been estimated based on a cluster with three vanadium atoms and one TCNE molecule. It turns out that Hund's coupling among d orbitals within the same V atom and antiferromagnetic coupling between d orbitals and the LUMO of TCNE (bridging V atoms) due to hybridization result in overall ferromagnetism (to be precise, ferrimagnetism). This view based on localized electrons is supplemented by the band picture, which indicates the existence of a flat band expected to lead to ferromagnetism, consistent with the localized view. The off-stoichiometric cases (x < 2), which still show ferromagnetism but semiconducting transport properties, have been analyzed as due to Anderson localization.
    Comment: Accepted for publication in J. Phys. Soc. Jpn. Vol. 79 (2010), No. 3 (March issue), in press; 6 pages, 8 figures

    Impediments to mixing classical and quantum dynamics

    The dynamics of systems composed of a classical sector plus a quantum sector is studied. We show that, even in the simplest cases, (i) the existence of a consistent canonical description for such mixed systems is incompatible with very basic requirements related to the time evolution of the two sectors when they are decoupled; (ii) the classical sector cannot inherit quantum fluctuations from the quantum sector; and (iii) a coupling between the two sectors is incompatible with the requirement of physical positivity of the theory, i.e., there would be positive observables with a non-positive expectation value.
    Comment: RevTeX, 21 pages. Title slightly modified and summary section added

    Bell State Preparation using Pulsed Non-Degenerate Two-Photon Entanglement

    We report a novel Bell state preparation experiment. High-purity Bell states are prepared by using femtosecond-pulse-pumped nondegenerate collinear spontaneous parametric down-conversion. The use of a femtosecond pump pulse does not result in a reduction of quantum interference visibility in our scheme, in which post-selection of amplitudes and other traditional mechanisms, such as using thin nonlinear crystals or narrow-band spectral filters, are not used. Another distinct feature of this scheme is that the pump, signal, and idler wavelengths are all distinguishable, which is very useful for quantum communications.
    Comment: 4 pages, submitted to PR

    Violation of Bell inequalities by photons more than 10 km apart

    A Franson-type test of Bell inequalities by photons 10.9 km apart is presented. Energy-time entangled photon pairs are measured using two-channel analyzers, leading to a violation of the inequalities by 16 standard deviations without subtracting accidental coincidences. Subtracting them, a two-photon interference visibility of 95.5% is observed, demonstrating that distances up to 10 km have no significant effect on entanglement. This sets quantum cryptography with photon pairs as a practical competitor to schemes based on weak pulses.
    Comment: 4 pages, REVTeX, 2 postscript figures included
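    What a "violation of a Bell inequality" means numerically can be made concrete with the CHSH form of the inequality. The sketch below is an illustration only, not the Franson-type analysis used in the paper: it evaluates the CHSH combination for the singlet-state correlation E(a, b) = -cos(a - b), which reaches |S| = 2√2 > 2 (the classical bound) at the standard analyzer angles.

```python
import math

def chsh_value(a, a_prime, b, b_prime):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
    for the quantum singlet-state correlation E(x, y) = -cos(x - y)."""
    E = lambda x, y: -math.cos(x - y)
    return E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

# Standard optimal analyzer angles: a = 0, a' = pi/2, b = pi/4, b' = 3*pi/4
S = chsh_value(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
# |S| = 2*sqrt(2) ≈ 2.828, exceeding the local-realistic bound of 2
```

    Any local hidden-variable theory constrains |S| ≤ 2, so measuring |S| above 2 (here by many standard deviations, as in the experiment) rules out such models.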

    Statistical mechanics of the vertex-cover problem

    We review recent progress in the study of the vertex-cover problem (VC). VC belongs to the class of NP-complete graph-theoretical problems, which plays a central role in theoretical computer science. On ensembles of random graphs, VC exhibits a coverable-uncoverable phase transition. Very close to this transition, depending on the solution algorithm, easy-hard transitions in the typical running time of the algorithms occur. We explain a statistical mechanics approach, which works by mapping VC to a hard-core lattice gas and then applying techniques like the replica trick or the cavity approach. Using these methods, the phase diagram of VC could be obtained exactly for connectivities c < e, where VC is replica symmetric. Recently, this result could be confirmed using traditional mathematical techniques. For c > e, the solution of VC exhibits full replica symmetry breaking. The statistical mechanics approach can also be used to study analytically the typical running time of simple complete and incomplete algorithms for VC. Finally, we describe recent results for VC when studied on other ensembles of finite- and infinite-dimensional graphs.
    Comment: review article, 26 pages, 9 figures, to appear in J. Phys. A: Math. Gen.
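    For readers unfamiliar with the underlying combinatorial problem: a vertex cover is a set of vertices touching every edge of a graph, and deciding whether a cover of a given size exists is NP-complete. The minimal sketch below (function names are illustrative, not from the review) checks a candidate cover and finds a minimum one by brute force on a tiny graph.

```python
from itertools import combinations

def is_vertex_cover(edges, cover):
    """A vertex set is a cover iff every edge has at least one endpoint in it."""
    return all(u in cover or v in cover for u, v in edges)

def min_vertex_cover(vertices, edges):
    """Brute-force minimum vertex cover: try subsets in order of size.
    Exponential time, which is fine only for tiny graphs."""
    for k in range(len(vertices) + 1):
        for subset in combinations(vertices, k):
            if is_vertex_cover(edges, set(subset)):
                return set(subset)

# Path graph 1-2-3-4: any minimum cover has exactly two vertices
edges = [(1, 2), (2, 3), (3, 4)]
cover = min_vertex_cover([1, 2, 3, 4], edges)
```

    The statistical-mechanics results in the review concern the typical size of such minimum covers on random graphs, where the exhaustive search above becomes infeasible and replica/cavity methods take over.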

    Optimal static pricing for a tree network

    We study the static pricing problem for a network service provider in a loss system with a tree structure. In the network, multiple classes share a common inbound link and then have dedicated outbound links. The motivation is from a company that sells phone cards and needs to price calls to different destinations. We characterize the optimal static prices in order to maximize the steady-state revenue. We report new structural findings as well as alternative proofs for some known results. We compare the optimal static prices versus prices that are asymptotically optimal, and through a set of illustrative numerical examples we show that in certain cases the loss in revenue can be significant. Finally, we show that static prices obtained using the reduced load approximation of the blocking probabilities can be easily obtained and have near-optimal performance, which makes them more attractive for applications.
    Funding: Massachusetts Institute of Technology, Center for Digital Business; United States Office of Naval Research (Contracts N00014-95-1-0232 and N00014-01-1-0146); National Science Foundation (U.S.) (Contracts DMI-9732795, DMI-0085683, and DMI-0245352)
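    The reduced load approximation referred to in the last sentence is built on the Erlang B formula, which gives the blocking probability of a single loss link in isolation; in a reduced-load scheme each link's offered load is thinned by the blocking on the other links a call traverses, and the equations are iterated to a fixed point. A minimal sketch of the single-link building block, using the standard numerically stable recursion (the function name is mine, not from the paper):

```python
def erlang_b(servers, load):
    """Erlang B blocking probability for a loss link with `servers` circuits
    and offered load `load` (in Erlangs), via the stable recursion
    B(0, a) = 1;  B(n, a) = a*B(n-1, a) / (n + a*B(n-1, a))."""
    b = 1.0
    for n in range(1, servers + 1):
        b = load * b / (n + load * b)
    return b

# One circuit offered one Erlang of traffic is blocked half the time
blocking = erlang_b(1, 1.0)  # 0.5
```

    In the tree network of the paper, an approximation of this kind would be evaluated once for the shared inbound link and once per outbound link, with each class's load thinned by the blocking it sees on the other link of its route; the recursion avoids the factorials of the closed-form Erlang B expression, which overflow for large systems.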

    Multiplicity Studies and Effective Energy in ALICE at the LHC

    In this work we explore the possibility to perform "effective energy" studies in very high energy collisions at the CERN Large Hadron Collider (LHC). In particular, we focus on the possibility to measure in pp collisions the average charged multiplicity as a function of the effective energy with the ALICE experiment, using its capability to measure the energy of the leading baryons with the Zero Degree Calorimeters. Analyses of this kind have been done at lower centre-of-mass energies and have shown that, once the appropriate kinematic variables are chosen, particle production is characterized by universal properties: no matter the nature of the interacting particles, the final states have identical features. Assuming that this universality picture can be extended to ion-ion collisions, as suggested by recent results from RHIC experiments, a novel approach based on the scaling hypothesis for limiting fragmentation has been used to derive the expected charged event multiplicity in AA interactions at the LHC. This leads to scenarios where the multiplicity is significantly lower compared to most of the predictions from the models currently used to describe high energy AA collisions. A mean charged multiplicity of about 1000-2000 per rapidity unit (at η ∼ 0) is expected for the most central Pb-Pb collisions at √s_NN = 5.5 TeV.
    Comment: 12 pages, 19 figures. In memory of A. Smirnitski