
    Block designs for experiments with non-normal response

    Many experiments measure a response that cannot be adequately described by a linear model with normally distributed errors and are often run in blocks of homogeneous experimental units. We develop the first methods of obtaining efficient block designs for experiments with an exponential family response described by a marginal model fitted via Generalized Estimating Equations. This methodology is appropriate when the blocking factor is a nuisance variable as, for example, occurs in industrial experiments. A D-optimality criterion is developed for finding designs robust to the values of the marginal model parameters and applied using three strategies: unrestricted algorithmic search, use of minimum-support designs, and blocking of an optimal design for the corresponding Generalized Linear Model. Designs obtained from each strategy are critically compared and shown to be much more efficient than designs that ignore the blocking structure. The designs are compared for a range of values of the intra-block working correlation and for exchangeable, autoregressive and nearest-neighbor structures. An analysis strategy is developed for a binomial response that allows estimation from experiments with sparse data, and its effectiveness is demonstrated. The design strategies are motivated and demonstrated through the planning of an experiment from the aeronautics industry.
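
    A D-optimality criterion of this kind maximizes the determinant of an information matrix over candidate designs. As a rough illustration of the idea only (not the paper's GEE-based, blocked criterion), the sketch below finds a locally D-optimal design for a plain logistic regression by greedy point exchange; the parameter values, candidate grid and run size are assumed for illustration.

```python
# A minimal sketch, assuming a logistic-regression model and made-up settings:
# local D-optimality found by a simple point-exchange search over a candidate grid.
import itertools
import numpy as np

def info_matrix(X, beta):
    """GLM information matrix X' W X with logistic weights mu(1 - mu)."""
    eta = X @ beta
    mu = 1.0 / (1.0 + np.exp(-eta))
    W = mu * (1.0 - mu)
    return (X * W[:, None]).T @ X

def d_criterion(X, beta):
    """log det of the information matrix; larger is better (D-optimality)."""
    sign, logdet = np.linalg.slogdet(info_matrix(X, beta))
    return logdet if sign > 0 else -np.inf

def point_exchange(candidates, beta, n_runs, n_iter=50, seed=0):
    """Greedy point exchange: swap one run at a time whenever log det improves."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(candidates), size=n_runs, replace=True)
    best = d_criterion(candidates[idx], beta)
    for _ in range(n_iter):
        improved = False
        for i, j in itertools.product(range(n_runs), range(len(candidates))):
            trial = idx.copy()
            trial[i] = j
            val = d_criterion(candidates[trial], beta)
            if val > best:
                idx, best, improved = trial, val, True
        if not improved:
            break
    return candidates[idx], best

# Two-factor design with intercept; candidate points on a 3x3 grid in [-1, 1].
grid = np.array(list(itertools.product([-1, 0, 1], repeat=2)))
candidates = np.column_stack([np.ones(len(grid)), grid])
beta = np.array([0.5, 1.0, -1.0])   # assumed (hypothetical) parameter values
design, logdet = point_exchange(candidates, beta, n_runs=8)
print(design, logdet)
```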

    Applying tradable permits to biodiversity conservation: A conceptual analysis of trading rules

    Tradable permits have already been applied in many areas of environmental policy and may be a possible response to increasing calls for flexible conservation instruments which are able to successfully conserve biodiversity while allowing for economic development. The idea behind applying tradable permits to conservation is that developers wishing to turn land to economic purposes, thereby destroying valuable habitat, may only do so if they submit a permit to the conservation agency showing that habitat of at least the equivalent ecological value is restored elsewhere. The developer himself does not need to carry out the restoration, but may buy a permit from a third party, thus allowing a market to emerge. However, applying tradable permits to biodiversity conservation is a complex issue, because destroyed and restored habitats are likely to differ. The purpose of this essay is to discuss on a conceptual level the consequences of these differences along the dimensions of type, space and time for the design of trading rules. We consider the resulting effects on trading activity in the permit market and the cost-effectiveness as well as the ecological effectiveness of the scheme. We find various trade-offs with regard to market activity, cost-effectiveness, ecological effectiveness and transaction costs. Keywords: tradable permits, conservation policy, cost-effectiveness, habitat banking, economic development, land use, market-based instruments
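
    To make the type, space and time dimensions concrete, here is a purely hypothetical sketch of how a trading rule might discount a restored site's credit before comparing it with the ecological value destroyed; the factor names and numbers are illustrative and are not taken from the essay.

```python
# A minimal, illustrative sketch of a trading rule of the kind discussed:
# the credit from a restored site is discounted along the type, space and time
# dimensions before being compared with the debit from the destroyed site.
# All factors and numbers below are hypothetical.
from dataclasses import dataclass

@dataclass
class Habitat:
    ecological_value: float    # assessed value of the patch
    habitat_type: str          # e.g. "wet_meadow"
    region: str                # trading region / spatial unit
    years_until_mature: float  # time lag before restored habitat is functional

def credit(restored: Habitat, destroyed: Habitat,
           cross_type_factor=0.5, cross_region_factor=0.7,
           discount_rate=0.03) -> float:
    """Discounted credit a restored patch earns against a given debit."""
    value = restored.ecological_value
    if restored.habitat_type != destroyed.habitat_type:
        value *= cross_type_factor                     # type dimension
    if restored.region != destroyed.region:
        value *= cross_region_factor                   # space dimension
    value /= (1 + discount_rate) ** restored.years_until_mature  # time dimension
    return value

def trade_allowed(restored: Habitat, destroyed: Habitat) -> bool:
    """Permit is valid only if the discounted credit covers the debit."""
    return credit(restored, destroyed) >= destroyed.ecological_value

destroyed = Habitat(10.0, "wet_meadow", "north", 0.0)
restored = Habitat(18.0, "wet_meadow", "south", 5.0)
print(trade_allowed(restored, destroyed))
```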

    Re-Reforming the Bostonian System: A Novel Approach to the Schooling Problem

    This paper proposes the notion of E-stability to reconcile Pareto efficiency and fairness. We propose the use of a centralized procedure, the Exchanging Places Mechanism. It assigns each student a position according to the Gale and Shapley student-optimal stable matching as a tentative allocation and then allows students to trade their positions. We show that the final allocation is E-stable, i.e. efficient, fair and immune to any justifiable objection that students can formulate. Keywords: school allocation problem, Pareto-efficient matching
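
    For reference, the tentative allocation named above is the outcome of student-proposing deferred acceptance. The sketch below implements that standard algorithm on made-up preferences; the subsequent trading stage of the Exchanging Places Mechanism is not reproduced here.

```python
# A minimal sketch of student-proposing deferred acceptance, which yields the
# Gale-Shapley student-optimal stable matching. Preference data are made up.
def student_optimal_matching(student_prefs, school_prefs, capacities):
    """student_prefs: {student: [schools best-first]};
       school_prefs:  {school: [students best-first]};
       capacities:    {school: seats}."""
    rank = {c: {s: i for i, s in enumerate(p)} for c, p in school_prefs.items()}
    next_choice = {s: 0 for s in student_prefs}      # next school to propose to
    held = {c: [] for c in school_prefs}             # tentatively held students
    free = list(student_prefs)
    while free:
        s = free.pop()
        if next_choice[s] >= len(student_prefs[s]):
            continue                                 # student exhausted their list
        c = student_prefs[s][next_choice[s]]
        next_choice[s] += 1
        held[c].append(s)
        held[c].sort(key=lambda x: rank[c][x])       # keep best applicants first
        if len(held[c]) > capacities[c]:
            free.append(held[c].pop())               # worst applicant is bumped
    return held

students = {"s1": ["A", "B"], "s2": ["A", "B"], "s3": ["B", "A"]}
schools = {"A": ["s1", "s3", "s2"], "B": ["s2", "s1", "s3"]}
print(student_optimal_matching(students, schools, {"A": 1, "B": 2}))
```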

    Managing Permit Markets to Stabilize Prices

    The political economy of environmental policy favors the use of quantity-based instruments over price-based instruments (e.g., tradable permits over green taxes), at least in the United States. With cost uncertainty, however, there are clear efficiency advantages to prices in many cases, especially for stock pollutants such as greenhouse gases. The question arises, therefore, of whether one can design flexible quantity policies that mimic the behavior of price policies, namely stable permit prices and abatement costs. We explore a number of “quantity-plus” policies that replicate the behavior of a price policy through rules that adjust the effective permit cap for unexpectedly low or high costs. They do so without necessitating any monetary exchanges between the government and the regulated firms, which can be a significant political barrier to the use of price instruments. Keywords: permit market, prices, quantities, banking, borrowing, uncertainty
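
    As a stylized illustration of such a rule (not any specific policy analyzed in the paper), the sketch below loosens next period's effective cap when the observed permit price is unexpectedly high and tightens it when the price is unexpectedly low; the target price, band and responsiveness parameter are assumed for illustration.

```python
# A hypothetical "quantity-plus" adjustment rule: the effective cap for the
# next period responds to the observed permit price, keeping the price near a
# target without any money changing hands between government and firms.
def adjust_cap(base_cap: float, observed_price: float,
               target_price: float = 20.0, band: float = 0.25,
               responsiveness: float = 0.1) -> float:
    """Return next period's effective cap given the observed permit price."""
    upper = target_price * (1 + band)
    lower = target_price * (1 - band)
    if observed_price > upper:                      # costs higher than expected
        return base_cap * (1 + responsiveness)      # effectively allow borrowing
    if observed_price < lower:                      # costs lower than expected
        return base_cap * (1 - responsiveness)      # effectively require banking
    return base_cap                                 # price within band: no change

for price in (12.0, 20.0, 31.0):
    print(price, adjust_cap(base_cap=100.0, observed_price=price))
```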

    Complementarities and Collusion in an FCC Spectrum Auction

    We empirically study bidding in the C Block of the US mobile phone spectrum auctions. Spectrum auctions are conducted using a simultaneous ascending auction design that allows bidders to assemble packages of licenses with geographic complementarities. While this auction design allows the market to find complementarities, the auction might also result in an inefficient equilibrium. In addition, these auctions have equilibria where implicit collusion is sustained through threats of bidding wars. We estimate a structural model in order to test for the presence of complementarities and implicit collusion. The estimation strategy is valid under a wide variety of alternative assumptions about equilibrium in these auctions and is robust to potentially important forms of unobserved heterogeneity. We make suggestions about the design of future spectrum auctions. Keywords: Technology and Industry
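
    To fix ideas about the auction format, the sketch below simulates a simultaneous ascending auction with a naive myopic bidding rule and a made-up complementarity bonus for holding adjacent licenses; it is only a toy illustration of the mechanism, not the authors' structural model or an equilibrium strategy.

```python
# A toy simultaneous ascending auction: each round, bidders may raise the
# standing high bid on any license by a fixed increment; the auction closes
# when no new bids arrive. Valuations and the complementarity bonus are made up.
def run_saa(values, adjacency, bonus=2.0, increment=1.0, rounds=100):
    """values[b][l]: bidder b's stand-alone value for license l."""
    high_bid = {l: 0.0 for l in adjacency}          # standing high bids
    high_bidder = {l: None for l in adjacency}      # provisional winners
    for _ in range(rounds):
        new_bid = False
        for b in values:
            held = {l for l, w in high_bidder.items() if w == b}
            for l in adjacency:
                if high_bidder[l] == b:
                    continue
                ask = high_bid[l] + increment
                # complementarity: a license is worth more next to ones held
                v = values[b][l] + bonus * len(adjacency[l] & held)
                if v >= ask:                        # myopic rule: bid if still worth it
                    high_bid[l], high_bidder[l] = ask, b
                    new_bid = True
        if not new_bid:                             # no new bids: auction closes
            break
    return high_bid, high_bidder

values = {"b1": {"L1": 10, "L2": 4}, "b2": {"L1": 6, "L2": 9}}
adjacency = {"L1": {"L2"}, "L2": {"L1"}}
print(run_saa(values, adjacency))
```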

    Asymptotically Optimal Load Balancing Topologies

    We consider a system of N servers inter-connected by some underlying graph topology G_N. Tasks arrive at the various servers as independent Poisson processes of rate λ. Each incoming task is irrevocably assigned to whichever server has the smallest number of tasks among the one where it appears and its neighbors in G_N. Tasks have unit-mean exponential service times and leave the system upon service completion. The above model has been extensively investigated in the case where G_N is a clique. Since the servers are exchangeable in that case, the queue length process is quite tractable, and it has been proved that for any λ < 1, the fraction of servers with two or more tasks vanishes in the limit as N → ∞. For an arbitrary graph G_N, the lack of exchangeability severely complicates the analysis, and the queue length process tends to be worse than for a clique. Accordingly, a graph G_N is said to be N-optimal or √N-optimal when the occupancy process on G_N is equivalent to that on a clique on an N-scale or √N-scale, respectively. We prove that if G_N is an Erdős–Rényi random graph with average degree d(N), then it is with high probability N-optimal and √N-optimal if d(N) → ∞ and d(N)/(√N log(N)) → ∞ as N → ∞, respectively. This demonstrates that optimality can be maintained at N-scale and √N-scale while reducing the number of connections by nearly a factor N and √N/log(N) compared to a clique, provided the topology is suitably random. It is further shown that if G_N contains Θ(N) bounded-degree nodes, then it cannot be N-optimal. In addition, we establish that an arbitrary graph G_N is N-optimal when its minimum degree is N − o(N), and may not be N-optimal even when its minimum degree is cN + o(N) for any 0 < c < 1/2. Comment: A few relevant results from arXiv:1612.00723 are included for convenience.
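
    The assignment rule is easy to simulate directly. Below is a minimal discrete-event sketch of the model on an Erdős–Rényi graph: each task arrives in a Poisson stream, is routed to the least-loaded server in the closed neighborhood of the server where it appears, and receives unit-mean exponential service; the graph size, arrival rate and horizon are assumed for illustration.

```python
# A minimal simulation sketch of the model, under assumed parameters: an
# Erdos-Renyi graph, per-server arrival rate lam, and a fixed time horizon.
import heapq
import itertools
import random
import networkx as nx

def simulate(G, lam=0.8, horizon=5_000.0, seed=1):
    rng = random.Random(seed)
    N = G.number_of_nodes()
    nodes = list(G)
    queue = {v: 0 for v in G}                 # current number of tasks per server
    tie = itertools.count()                   # breaks ties in the event heap
    events = [(rng.expovariate(lam * N), next(tie), "arrival", None)]
    while events:
        t, _, kind, server = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            u = rng.choice(nodes)             # superposed Poisson stream of rate lam*N
            # route to the least-loaded server in the closed neighborhood of u
            target = min([u, *G.neighbors(u)], key=lambda v: queue[v])
            queue[target] += 1
            if queue[target] == 1:            # server was idle: start service
                heapq.heappush(events, (t + rng.expovariate(1.0), next(tie),
                                        "departure", target))
            heapq.heappush(events, (t + rng.expovariate(lam * N), next(tie),
                                    "arrival", None))
        else:                                 # departure
            queue[server] -= 1
            if queue[server] > 0:             # start service on the next task
                heapq.heappush(events, (t + rng.expovariate(1.0), next(tie),
                                        "departure", server))
    return sum(q >= 2 for q in queue.values()) / N   # fraction with >= 2 tasks

G = nx.erdos_renyi_graph(n=200, p=0.1, seed=0)       # average degree about 20
print(simulate(G))
```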

    Randomization Inference and Sensitivity Analysis for Composite Null Hypotheses With Binary Outcomes in Matched Observational Studies

    We present methods for conducting hypothesis testing and sensitivity analyses for composite null hypotheses in matched observational studies when outcomes are binary. Causal estimands discussed include the causal risk difference, causal risk ratio, and the effect ratio. We show that inference under the assumption of no unmeasured confounding can be performed by solving an integer linear program, while inference allowing for unmeasured confounding of a given strength requires solving an integer quadratic program. Through simulation studies and data examples, we demonstrate that our formulation allows these problems to be solved in an expedient manner even for large datasets and for large strata. We further exhibit that through our formulation, one can assess the impact of various assumptions about the potential outcomes on the performed inference. R scripts are provided that implement our methods. Supplementary materials for this article are available online. Keywords: Causal inference; Causal risk; Effect ratio; Integer programming; Sensitivity analysis
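
    As a much simpler illustration of the underlying randomization-inference logic (not the composite-null or sensitivity-analysis machinery, which requires the integer programs described above), the sketch below runs a Monte Carlo test of Fisher's sharp null of no effect in matched pairs with binary outcomes; the data are simulated.

```python
# A Monte Carlo randomization test of Fisher's sharp null of no effect for
# matched pairs with binary outcomes, using a McNemar-type statistic. This is
# only a simplified illustration; the paper's composite nulls and sensitivity
# analyses require the integer linear and quadratic programs it describes.
import numpy as np

def randomization_test(treated, control, n_draws=20_000, seed=0):
    """treated[i], control[i]: binary outcomes of the two units in pair i."""
    rng = np.random.default_rng(seed)
    diff = treated - control                  # within-pair differences in {-1, 0, 1}
    observed = diff.sum()                     # McNemar-type test statistic
    # Under the sharp null, which unit in a pair is labeled "treated" is a fair
    # coin flip, so each nonzero difference has its sign flipped with prob. 1/2.
    signs = rng.choice([-1, 1], size=(n_draws, len(diff)))
    null_stats = (signs * diff).sum(axis=1)
    return float(np.mean(np.abs(null_stats) >= abs(observed)))  # two-sided p-value

rng = np.random.default_rng(1)
control = rng.binomial(1, 0.30, size=200)     # simulated data for illustration
treated = rng.binomial(1, 0.45, size=200)
print(randomization_test(treated, control))
```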