Geology for planning in St. Clair County, Illinois.
Cover title. At head of title: State of Illinois, Department of Registration and Education. Includes bibliographical references (pages 32-35).
Antitrust Error
Fueled by economics, antitrust has evolved into a highly sophisticated body of law. Its malleable doctrine enables courts to tailor optimal standards to a wide variety of economic phenomena. Indeed, economic theory has been so revolutionary that modern U.S. competition law bears little resemblance to that which prevailed fifty years ago. Yet, for all the contributions of economics, its explanatory powers are subject to important limitations. Profound questions remain at the borders of contemporary antitrust enforcement, and answers remain elusive. It is because of the epistemological limitations of economic analysis that antitrust remains unusually vulnerable to error. The fear of mistakenly ascribing anticompetitive labels to innocuous conduct is now pervasive. The Supreme Court has repeatedly framed its rulings in a manner that shows sensitivity to the unavoidability of error. In doing so, it has adopted the decision-theoretic principle that Type II errors (false negatives) are generally to be preferred over Type I errors (false positives), and it has crafted a pro-defendant body of jurisprudence accordingly. In 2008, the Justice Department took up the gauntlet and published the first definitive attempt at formulating optimal error rules. Yet, in 2009, the new administration promptly withdrew the report, opining that it could “separate the wheat from the chaff” and thus marginalizing the issue of error. Notwithstanding this confident proclamation, error remains as visible as ever. Intel’s conduct in offering rebates has received wildly divergent treatment from the U.S. and E.U. enforcement agencies. In a marked departure from precedent, the DOJ is again viewing vertical mergers with concern. And the agency has reversed course on the legality of exclusionary payments in the pharmaceutical industry. Antitrust divergence, both within and outside the United States, remains painfully apparent: demonstrable proof that vulnerability to error is systemic. For this reason, error analysis may be the single most important unresolved issue facing modern competition policy. This Article challenges the contemporary mode of error analysis in antitrust law. We explain the causes and consequences of antitrust error and articulate a variety of suggested cures. In doing so, we debunk the presumption that false positives are necessarily more harmful than false negatives, and we highlight a variety of settings in which the contemporary bias in favor of underenforcement should be revisited.
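The decision-theoretic trade-off described here can be stated as a standard error-cost comparison. The inequality below is a generic sketch in our own notation, not a formula taken from the Article:

```latex
% Generic error-cost comparison (notation ours): condemn conduct only when
% the expected cost of a false negative exceeds that of a false positive.
\[
  \underbrace{p \, C_{\mathrm{II}}}_{\text{expected harm of acquitting}}
  \;\gtrless\;
  \underbrace{(1-p) \, C_{\mathrm{I}}}_{\text{expected harm of condemning}}
\]
% Here $p$ is the probability that the challenged conduct is in fact
% anticompetitive, $C_{\mathrm{I}}$ is the social cost of a false positive
% (condemning benign conduct), and $C_{\mathrm{II}}$ the cost of a false
% negative (acquitting harmful conduct). The pro-defendant presumption
% amounts to assuming $C_{\mathrm{I}} > C_{\mathrm{II}}$ across the board;
% the Article questions whether that inequality holds in all settings.
```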
The Riddle Underlying Refusal-to-Deal Theory
May a dominant firm refuse to share its intellectual property (IP) with its rivals? This question lies at the heart of a highly divisive, international debate over the proper application of the antitrust laws. In this short Essay, we consider a profound, yet previously unaddressed, incongruity underlying the controversy: why do monopolists refuse to license their IP even when they could do so at monopoly prices? To cure such refusals, some have recommended compulsory licensing, which would require monopolists to license their IP in certain circumstances. That proposal, however, entails an inescapable contradiction, one rooted in monopolists’ seemingly inexplicable refusal to share their IP in the first place.
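The incongruity can be made concrete with a textbook licensing argument. The sketch below is our own stylized illustration, assuming equal marginal costs and unchanged total output; it is not a model drawn from the Essay:

```latex
% Stylized illustration (ours): the monopolist earns \pi_m = (p_m - c) Q_m
% exploiting the IP alone. Licensing a rival at per-unit royalty r = p_m - c
% leaves the market price at p_m; every unit q shifted to the rival still
% pays the monopolist its full margin, optionally plus a fixed fee F:
\[
  \pi_{\text{license}}
  = (p_m - c)(Q_m - q) + r\,q + F
  = \pi_m + F \;\ge\; \pi_m ,
  \qquad r = p_m - c .
\]
% On this static account licensing weakly dominates refusal, which is why
% observed refusals to deal appear "inexplicable" -- the riddle the Essay
% takes up.
```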
Method and apparatus for shadow aperture backscatter radiography (SABR) system and protocol
A shadow aperture backscatter radiography (SABR) system includes at least one penetrating radiation source for providing a penetrating radiation field and at least one partially transmissive radiation detector, the detector being interposed between the radiation source and the object region to be interrogated. The partially transmissive detector passes a portion of the illumination radiation field. A shadow aperture, comprising a plurality of radiation-attenuating regions with apertures between them, is disposed between the radiation source and the detector. The apertures provide illumination regions through which the illumination radiation field reaches the object region; backscattered radiation from the object is then detected by the detector and forms an image in the detector regions shadowed by the radiation-attenuating regions.
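The geometry described above lends itself to a simple simulation. The following is a minimal one-dimensional sketch under assumptions of ours (an ideal binary mask, a Gaussian backscatter spread, invented dimensions and transmission values); it is not an implementation of the patented protocol:

```python
"""Minimal 1-D sketch of the SABR geometry: source -> shadow aperture ->
partially transmissive detector -> object, with backscatter read out only
in the detector regions shadowed by the mask. All numbers are illustrative."""
import numpy as np

n = 200                                   # detector pixels across the 1-D scene
x = np.arange(n)

# Shadow aperture: attenuating regions (0) alternating with open apertures (1).
mask = np.zeros(n)
mask[(x // 20) % 2 == 1] = 1.0            # 20-pixel apertures every 40 pixels

detector_transmission = 0.5               # detector passes half the illumination
illumination = mask * detector_transmission  # field reaching the object region

# Hypothetical object reflectivity: a dense inclusion in a uniform slab.
reflectivity = np.full(n, 0.2)
reflectivity[90:110] = 0.8

# Backscatter spreads laterally (Gaussian kernel), so scattered photons also
# land under the mask's attenuating regions.
kernel = np.exp(-0.5 * (np.arange(-25, 26) / 8.0) ** 2)
kernel /= kernel.sum()
backscatter = np.convolve(illumination * reflectivity, kernel, mode="same")

# Read out only the shadowed pixels, where the direct beam cannot swamp
# the backscatter signal.
shadowed = mask == 0
image = np.where(shadowed, backscatter, np.nan)

print("mean backscatter signal in shadowed pixels:", np.nanmean(image))
```

The design point the sketch illustrates is that the attenuating regions double as anti-scatter shielding for the direct beam, so the shadowed detector pixels see backscatter almost exclusively.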
Pure phase-encoded MRI and classification of solids
Here, the authors combine a pure phase-encoded magnetic resonance imaging (MRI) method with a new tissue-classification technique to make geometric models of a human tooth. They demonstrate the feasibility of three-dimensional imaging of solids using a conventional 11.7-T NMR spectrometer. In solid-state imaging, confounding line-broadening effects are typically eliminated using coherent averaging methods; the authors instead circumvent them by detecting the proton signal at a fixed phase-encode time following the radio-frequency excitation. By a judicious choice of the phase-encode time in the MRI protocol, they differentiate enamel and dentine sufficiently to apply a new classification algorithm. This tissue-classification algorithm identifies the distribution of different material types, such as enamel and dentine, in volumetric data. It treats a voxel as a volume, not as a single point, and assumes that each voxel may contain more than one material, using the distribution of MR image intensities within each voxel-sized volume to estimate the relative proportion of each material through a probabilistic approach. This combined approach, involving MRI and data classification, is directly applicable to bone imaging and to hard-tissue contrast-based modeling of biological solids.
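The voxel-as-volume idea can be illustrated with a toy estimator. The sketch below assumes a deliberately simplified model of ours (sub-voxel intensities drawn from a Gaussian whose mean blends linearly between two pure-material means, with invented intensity values); it is not the authors' algorithm:

```python
"""Toy partial-volume classification for one voxel: estimate the enamel
fraction from the distribution of intensities inside the voxel, by maximum
likelihood under a linear two-material mixture model (values made up)."""
import numpy as np

MU_ENAMEL, MU_DENTINE, SIGMA = 180.0, 90.0, 12.0  # hypothetical MR intensities

def enamel_fraction(samples: np.ndarray) -> float:
    """Grid-search MLE of the enamel volume fraction in one voxel."""
    alphas = np.linspace(0.0, 1.0, 101)
    means = alphas * MU_ENAMEL + (1 - alphas) * MU_DENTINE
    # Gaussian log-likelihood of every sub-sample against each candidate mean.
    ll = -0.5 * ((samples[:, None] - means[None, :]) / SIGMA) ** 2
    return float(alphas[ll.sum(axis=0).argmax()])

rng = np.random.default_rng(0)
true_alpha = 0.7                          # a boundary voxel, ~70% enamel
voxel_samples = rng.normal(
    true_alpha * MU_ENAMEL + (1 - true_alpha) * MU_DENTINE, SIGMA, size=64)
print(f"estimated enamel fraction: {enamel_fraction(voxel_samples):.2f}")
```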
Why citizens don’t like paying for public goods with their taxes – and how institutions can change that
Why are Americans so opposed to paying taxes to fund basic government functions such as roads and education? In new research, Alan M. Jacobs and J. Scott Matthews find that many citizens object to paying for public investment because they do not trust politicians to spend new revenues as promised. Using online experiments with voting-age U.S. citizens, they find that support for using taxation to pay for investment depended on how much voters trusted the institution charged with carrying out the work: local governments and the military were trusted to a much greater degree than Congress, especially among conservatives. Citizens were also more willing to pay for public goods when told that the new taxes would be set aside in a dedicated trust-fund account.