
    Privately Legislated Intellectual Property Rights: Reconciling Freedom of Contract with Public Good Uses of Information

    In an age of omnipresent clickwrap licenses, we acknowledge the need for a uniform set of default rules that would validate non-negotiable licenses as a mechanism for minimizing transaction costs likely to hinder economic development in a networked environment. However, we contend that any model of contract formation not driven by the traditional norms of mutual assent requires specially formulated doctrinal tools to avoid undermining long-established public good uses of information for such purposes as education and research, technical innovation, free speech, and the preservation of free competition. With the convergence of digital and telecommunications technologies, creators and innovators who distribute computerized information goods online can increasingly combat the causes of market failure directly, even in the absence of statutory intellectual property rights, by recourse to standard form contractual agreements that allow access to electronically stored information only on the licensor's terms and conditions. In the networked environment, however, routine validation of mass-market access contracts and of non-negotiable constraints on users would tend to convert standard form licenses of digitized information goods into functional equivalents of privately legislated intellectual property rights. Firms possessing any degree of market power could thereby control access to, and use of, digitized information by means of adhesion contracts that alter or ignore the balance between incentives to create and free competition that the Framers recognized in the Constitution and that Congress has progressively codified in statutory intellectual property laws. Because existing legal doctrines appear insufficient to control the likely costs of such a radical social experiment, the main thrust of this Article is to formulate and develop minimalist doctrinal tools to limit the misuse of adhesion contracts that might otherwise adversely affect the preexisting balance of public and private interests. We believe such tools ought to figure prominently in any set of uniform state laws governing computerized information transactions, whether or not they emerge from the current debate surrounding a proposed Article 2B of the Uniform Commercial Code (the "U.C.C." or the "Code"). In Part I of this Article, we begin by identifying key misconceptions concerning the interface between federal intellectual property rights and state contract laws that have marred the drafters' own notes and comments in the various iterations of Article 2B. We then explain how digital technologies, when combined with mass-market contracts, enable information providers to alter the existing legislative balance between public and private interests in unexpected and socially harmful ways. We further demonstrate that the uniform state laws proposed to validate these private rights have been crafted without balancing the social costs of legal incentives to innovate against the benefits of free competition, and without regard for the constitutional mandate to promote "the [p]rogress of [s]cience and useful [a]rts." On the contrary, the drafters of Article 2B empower purveyors of digitized information goods to undermine, by contract, long-standing policies and practices that directly promote cumulative and sequential innovation as well as the public interest in education, science, research, competition, and freedom of expression.
    In Part II, we discuss the new doctrinal tools with which we would empower courts to apply public-interest checks on standardized access contracts and on non-negotiable terms and conditions affecting users of computerized information goods. In so doing, we take pains to preserve the maximum degree of freedom of contract, not just with respect to negotiated terms generally, but even with respect to non-negotiable terms lacking any socially harmful or demonstrably anticompetitive impact over time. We also compare the costs and benefits of Article 2B, as refined by the addition of our proposed safeguards, with those likely to ensue if Article 2B were adopted in its present form. Here, we focus particularly on issues affecting the legal protection of computer software, on the role that the fair use exception of copyright law might play in information transactions generally, and on issues affecting bundles of factual information that cannot be copyrighted under existing laws. In Part III, we explore the deeper implications of a shift from the traditional, assent-driven model of contract formation to a model that validates non-negotiable contracts of adhesion containing socially acceptable terms and conditions. We show that a minimalist regulatory tool along the lines of our proposed public-interest unconscionability doctrine yields positive social benefits, despite the transaction costs and enforcement problems it logically engenders. We also explore the connection between the kind of non-negotiable middle ground we deem indispensable to a paradigm shift in contract formation and the need for a broader information policy. We conclude with a prediction that if Article 2B were to incorporate the safeguards we propose, it might better yield sound empirical data for devising the long-term information policies that elude us in our present state of ignorance and uncertainty.

    Second order parameter-uniform convergence for a finite difference method for a partially singularly perturbed linear parabolic system

    A linear system of $n$ second order differential equations of parabolic reaction-diffusion type with initial and boundary conditions is considered. The first $k$ equations are singularly perturbed. Each of the leading terms of the first $m$ equations, $m \leq k$, is multiplied by a small positive parameter and these parameters are assumed to be distinct. The leading terms of the next $k-m$ equations are multiplied by the same perturbation parameter $\varepsilon_m$. Since the components of the solution exhibit overlapping layers, Shishkin piecewise-uniform meshes are introduced, which are used in conjunction with a classical finite difference discretisation, to construct a numerical method for solving this problem. It is proved that in the maximum norm the numerical approximations obtained with this method are first order convergent in time and essentially second order convergent in the space variable, uniformly with respect to all of the parameters.
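
    The abstract does not spell out the mesh construction; as a rough illustration, the Python sketch below builds a standard piecewise-uniform Shishkin mesh in one space variable for a single perturbation parameter, with boundary layers at both endpoints. The function name, the mesh constant sigma, and the two-layer placement are assumptions for illustration only; the meshes in the paper handle several distinct parameters and overlapping layers and are correspondingly more elaborate.

        import numpy as np

        def shishkin_mesh(N, eps, sigma=2.0):
            # Piecewise-uniform Shishkin mesh on [0, 1] for a reaction-diffusion
            # problem with boundary layers at x = 0 and x = 1.
            #   N     : number of mesh intervals (assumed divisible by 4)
            #   eps   : singular perturbation parameter multiplying the leading term
            #   sigma : mesh constant tied to the formal order of the scheme (assumed)
            tau = min(0.25, sigma * np.sqrt(eps) * np.log(N))   # transition point
            left   = np.linspace(0.0, tau, N // 4 + 1)          # fine mesh in the left layer
            middle = np.linspace(tau, 1.0 - tau, N // 2 + 1)    # coarse mesh outside the layers
            right  = np.linspace(1.0 - tau, 1.0, N // 4 + 1)    # fine mesh in the right layer
            # Drop the duplicated transition points when concatenating.
            return np.concatenate([left, middle[1:], right[1:]])

        x = shishkin_mesh(64, 1e-6)   # 65 mesh points clustered in the layers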

    Can forest management based on natural disturbances maintain ecological resilience?

    Given the increasingly global stresses on forests, many ecologists argue that managers must maintain ecological resilience: the capacity of ecosystems to absorb disturbances without undergoing fundamental change. In this review we ask: Can the emerging paradigm of natural-disturbance-based management (NDBM) maintain ecological resilience in managed forests? Applying resilience theory requires careful articulation of the ecosystem state under consideration, the disturbances and stresses that affect the persistence of possible alternative states, and the spatial and temporal scales of management relevance. Implementing NDBM while maintaining resilience means recognizing that (i) biodiversity is important for long-term ecosystem persistence, (ii) natural disturbances play a critical role as a generator of structural and compositional heterogeneity at multiple scales, and (iii) traditional management tends to produce forests more homogeneous than those disturbed naturally and increases the likelihood of unexpected catastrophic change by constraining variation of key environmental processes. NDBM may maintain resilience if silvicultural strategies retain the structures and processes that perpetuate desired states while reducing those that enhance resilience of undesirable states. Such strategies require an understanding of harvesting impacts on slow ecosystem processes, such as seed-bank or nutrient dynamics, which in the long term can lead to ecological surprises by altering the forest's capacity to reorganize after disturbance.

    Metagenes Associated with Survival in Non-Small Cell Lung Cancer

    NSCLC (non-small cell lung cancer) comprises about 80% of all lung cancer cases worldwide. Surgery is the most effective treatment for patients with early-stage disease. However, 30%–55% of these patients develop recurrence within 5 years. Therefore, markers that can be used to accurately classify early-stage NSCLC patients into different prognostic groups may be helpful in selecting patients who should receive specific therapies.

    Plasma Wakefield Acceleration with a Modulated Proton Bunch

    The plasma wakefield amplitudes which could be achieved via the modulation of a long proton bunch are investigated. We find that in the limit of long bunches compared to the plasma wavelength, the strength of the accelerating fields is directly proportional to the number of particles in the drive bunch and inversely proportional to the square of the transverse bunch size. The scaling laws were tested and verified in detailed simulations using parameters of existing proton accelerators, and large electric fields were achieved, reaching 1 GV/m for LHC bunches. Energy gains for test electrons beyond 6 TeV were found in this case.
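
    The abstract states that the accelerating field scales linearly with the number of drive-bunch particles and inversely with the square of the transverse bunch size. A minimal Python sketch of that proportionality, scaling a reference field to other bunch parameters, follows; all reference values here are arbitrary placeholders, not numbers from the paper.

        def wakefield_scaling(E_ref, N_ref, sigma_ref, N, sigma_r):
            # Scale a known accelerating-field amplitude E_ref, obtained for a
            # reference drive bunch (N_ref particles, transverse size sigma_ref),
            # to a bunch with N particles and transverse size sigma_r, using the
            # scaling quoted in the abstract: E proportional to N / sigma_r**2.
            return E_ref * (N / N_ref) * (sigma_ref / sigma_r) ** 2

        # Example: halving the transverse bunch size quadruples the field amplitude.
        print(wakefield_scaling(E_ref=1.0, N_ref=1.0, sigma_ref=1.0, N=1.0, sigma_r=0.5))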

    All-optical switching and strong coupling using tunable whispering-gallery-mode microresonators

    We review our recent work on tunable, ultra-high quality factor whispering-gallery-mode bottle microresonators and highlight their applications in nonlinear optics and in quantum optics experiments. Our resonators combine ultra-high quality factors of up to $Q = 3.6 \times 10^8$, a small mode volume, and near-lossless fiber coupling, with a simple and customizable mode structure enabling full tunability. We study, theoretically and experimentally, nonlinear all-optical switching via the Kerr effect when the resonator is operated in an add-drop configuration. This allows us to optically route a single-wavelength cw optical signal between two fiber ports with high efficiency. Finally, we report on progress towards strong coupling of single rubidium atoms to an ultra-high Q mode of an actively stabilized bottle microresonator.
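
    As a toy illustration of signal routing in the add-drop configuration, the Python sketch below evaluates the textbook steady-state input-output expressions for a resonator coupled to two fibres; the coupling rates and the size of the Kerr-induced resonance shift are illustrative assumptions, not values taken from the paper.

        def add_drop_transmission(delta, kappa_1, kappa_2, kappa_0=0.0):
            # Steady-state through- and drop-port power transmission of a resonator
            # coupled to two fibres (standard coupled-mode / input-output theory).
            #   delta   : detuning of the signal from the (possibly Kerr-shifted) resonance
            #   kappa_1 : coupling rate to the input/through fibre
            #   kappa_2 : coupling rate to the drop fibre
            #   kappa_0 : intrinsic loss rate (small at ultra-high Q)
            kappa = kappa_0 + kappa_1 + kappa_2
            denom = delta ** 2 + (kappa / 2.0) ** 2
            t_drop = kappa_1 * kappa_2 / denom
            t_through = ((kappa / 2.0 - kappa_1) ** 2 + delta ** 2) / denom
            return t_through, t_drop

        kappa_1 = kappa_2 = 0.5   # symmetric, near-lossless coupling (arbitrary units)
        # Signal on resonance: nearly all power exits at the drop port.
        print(add_drop_transmission(0.0, kappa_1, kappa_2))   # -> (0.0, 1.0)
        # Kerr-shifted resonance (several linewidths): signal stays in the through port.
        print(add_drop_transmission(5.0, kappa_1, kappa_2))   # -> (~0.99, ~0.01)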

    Wind modelling of very massive stars up to 300 solar masses

    Some studies have claimed a universal stellar upper-mass limit of 150 Msun. A factor that is often overlooked is that there might be a difference between the current and initial masses of the most massive stars, as a result of mass loss. We present Monte Carlo mass-loss predictions for very massive stars in the range 40-300 Msun, with large luminosities and Eddington factors Gamma. Using our new dynamical approach, we find an upturn in the mass-loss vs. Gamma dependence, at the point where the winds become optically thick. This coincides with the location where wind efficiency numbers surpass the single-scattering limit of Eta = 1, reaching values up to Eta = 2.5. Our modelling suggests a transition from common O-type winds to Wolf-Rayet characteristics at the point where the winds become optically thick. This transitional behaviour is also revealed with respect to the wind acceleration parameter beta, which starts at values below 1 for the optically thin O-stars, and naturally reaches values as high as 1.5-2 for the optically thick Wolf-Rayet models. An additional finding concerns the transition in spectral morphology of the Of and WN characteristic He II line at 4686 Angstrom. When we express our mass-loss predictions as a function of the electron scattering Gamma_e (=L/M) only, we obtain a mass-loss Gamma dependence that is consistent with a previously reported power-law Mdot propto Gamma^5 (Vink 2006) that was based on our semi-empirical modelling approach. When we express Mdot in terms of both Gamma and stellar mass, we find Mdot propto M^0.8 Gamma^4.8 for our high Gamma models. Finally, we confirm that the Gamma-effect on the mass-loss predictions is much stronger than that of an increased helium abundance, calling for a fundamental revision in the way mass loss is incorporated in evolutionary models of the most massive stars.
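
    The quoted scaling for the high-Gamma models, Mdot propto M^0.8 Gamma^4.8, can be turned into a quick relative estimate; the Python helper below is purely illustrative and carries no absolute normalisation (none is given in the abstract).

        def mdot_ratio(mass_ratio, gamma_ratio):
            # Relative change in mass-loss rate implied by Mdot ~ M**0.8 * Gamma**4.8
            # (pure proportionality; no absolute calibration is implied here).
            return mass_ratio ** 0.8 * gamma_ratio ** 4.8

        # Example: doubling Gamma at fixed mass raises Mdot by roughly a factor of 28.
        print(mdot_ratio(1.0, 2.0))   # ~27.9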