    Lenz v. Universal Music Corp. And the Potential Effect of Fair Use Analysis Under the Takedown Procedures of §512 of the DMCA

    The notice and takedown/putback procedures in §512 of the Digital Millennium Copyright Act fail to adequately protect the rights of individuals who post content on the Internet. This iBrief examines the notice and takedown/putback procedures and Judge Fogel's decision in Lenz v. Universal Music Corp., which requires a copyright owner to conduct a fair use evaluation prior to issuing a takedown notice. This iBrief concludes that such a requirement is an appropriate first step towards creating adequate protection for user-generated content on the Internet.

    An extension of democratic principles to our economic and social institutions would go a long way to reducing inequality

    Inequality has grown substantially in Britain over the last few decades, and Mike O’Donnell argues that radical institutional reform is needed if the underlying issues are to be adequately addressed.

    Generics, Race, and Social Perspectives

    The project of this paper is to deliver a semantics for a broad subset of bare plural generics about racial kinds, a class which I will dub 'Type C generics.' Examples include 'Blacks are criminal' and 'Muslims are terrorists.' Type C generics have two interesting features. First, they link racial kinds with socially perspectival predicates (SPPs). SPPs lead interpreters to treat the relationship between kinds and predicates in generic constructions as nomic or non-accidental. Moreover, in computing their content, interpreters must make implicit reference to socially privileged perspectives which are treated as authoritative about whether a given object fits into the extension of the predicate. Such deference grants these authorities influence over both the conventional meaning of these terms and over the nature of the objects in the social ontology that these terms purport to describe, much the way a baseball umpire is authoritative over the meaning and metaphysics of 'strike'/strike. Second, terms like 'criminal' and 'terrorist' receive default racialized interpretations in which these terms conventionally token racial or ethnic identities. I show that neither of these features can be explained by Sarah-Jane Leslie's influential 'weak semantics' for generics, and show how my own 'socially perspectival semantics' fares better on both counts. Finally, I give an analysis of 'Blacks are criminal' which explores the semantic mechanisms that underlie default racialized interpretations.

    Econometric Estimation of Distance Functions and Associated Measures of Productivity and Efficiency Change

    The economically-relevant characteristics of multi-input multi-output production technologies can be represented using distance functions. The econometric approach to estimating these functions typically involves factoring out one of the outputs or inputs and estimating the resulting equation using maximum likelihood methods. A problem with this approach is that the outputs or inputs that are not factored out may be correlated with the composite error term. Fernandez, Koop and Steel (2000, p. 58) have developed a Bayesian solution to this so-called ‘endogeneity’ problem. O'Donnell (2007) has adapted the approach to the estimation of directional distance functions. This paper shows how the approach can be used to estimate Shephard (1953) distance functions and an associated index of total factor productivity (TFP) change. The TFP index is a new multiplicatively-complete index that satisfies most, if not all, economically-relevant tests and axioms from index number theory. The fact that it is multiplicatively-complete means it can be exhaustively decomposed into a measure of technical change and various measures of efficiency change. The decomposition can be implemented without the use of price data and without making any assumptions concerning either the optimising behaviour of firms or the degree of competition in product markets. The methodology is illustrated using state-level quantity data on U.S. agricultural inputs and outputs over the period 1960-2004. Results are summarised in terms of the characteristics (e.g., means) of estimated probability densities for measures of TFP change, technical change and output-oriented measures of efficiency change.
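
    To make the construction concrete, the display below sketches the general shape of these objects in standard notation; the aggregator functions and the two-way decomposition shown are generic assumptions for illustration, not the paper's estimated forms.

```latex
% Shephard output distance function: the smallest factor delta by which
% the output vector q must be deflated to be producible from inputs x.
\[
  D_O(x, q) = \min\{\, \delta > 0 : q/\delta \in P(x) \,\}
\]

% A multiplicatively-complete TFP index compares an output aggregate Q(.)
% with an input aggregate X(.) between periods s and t:
\[
  \mathrm{TFP}_{s,t} = \frac{Q(q_t)/Q(q_s)}{X(x_t)/X(x_s)}
\]

% Completeness is what licenses an exhaustive multiplicative decomposition,
% schematically into technical change and efficiency change:
\[
  \mathrm{TFP}_{s,t} = \Delta\mathrm{Tech}_{s,t} \times \Delta\mathrm{Eff}_{s,t}
\]
```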

    Radiofrequency Ablation for Post Infarction Ventricular Tachycardia

    Radiofrequency ablation has an important role in the management of post-infarction ventricular tachycardia (VT). The mapping and ablation of VT is complex and technically challenging. In the era of implantable cardioverter defibrillators, radiofrequency ablation is most commonly reserved as an adjunctive treatment for patients with frequent, symptomatic episodes of VT. In this setting the procedure has a success rate of around 70-80% and a low complication rate. With improved ability to predict recurrent VT and improvements in mapping and ablation techniques and technologies, the role of radiofrequency ablation should expand further.

    Estimating State-allocable Production Technologies When there are two States of Nature and State Allocations of Inputs are Unobserved.

    Chambers and Quiggin (2000) have used state-contingent production theory to establish important results concerning economic behaviour in the presence of uncertainty, including problems of consumer choice, the theory of the firm, and principal-agent relationships. Empirical application of the state-contingent approach has proved difficult, not least because most of the data needed for applying standard econometric methods are lost in unrealized states of the world. O’Donnell and Griffiths (2006) show how a restrictive type of state-contingent technology can be estimated in a finite mixtures framework. This paper shows how Bayesian methodology can be used to estimate more flexible types of state-contingent technologies.
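
    As a rough picture of the estimation problem (not the Bayesian estimator developed in the paper), the sketch below fits a two-state mixture of linear production relationships by EM, treating the realized state of nature as a latent class; the simulated data, the Gaussian specification, and all names here are assumptions for illustration only.

```python
# Illustrative sketch: a two-state finite-mixture regression fitted by EM,
# treating the realized state of nature as an unobserved class label.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: output y depends on input x, but the coefficients differ
# across two unobserved states of nature (e.g. "good" vs "bad" season).
n = 500
x = rng.uniform(1.0, 10.0, n)
state = rng.integers(0, 2, n)                    # latent, never observed
beta_true = np.array([[1.0, 0.8], [3.0, 0.3]])   # intercept, slope per state
y = beta_true[state, 0] + beta_true[state, 1] * x + rng.normal(0.0, 0.5, n)

X = np.column_stack([np.ones(n), x])

# EM for a 2-component mixture of Gaussian linear regressions.
beta = np.array([[0.0, 1.0], [2.0, 0.5]])        # initial guesses
sigma = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])

for _ in range(200):
    # E-step: posterior probability that each observation came from each state
    # (the 1/sqrt(2*pi) normalizing constant cancels in the weights).
    dens = np.empty((n, 2))
    for k in range(2):
        resid = y - X @ beta[k]
        dens[:, k] = pi[k] * np.exp(-0.5 * (resid / sigma[k]) ** 2) / sigma[k]
    w = dens / dens.sum(axis=1, keepdims=True)

    # M-step: weighted least squares per state, plus mixing weights.
    for k in range(2):
        XtW = X.T * w[:, k]
        beta[k] = np.linalg.solve(XtW @ X, XtW @ y)
        resid = y - X @ beta[k]
        sigma[k] = np.sqrt((w[:, k] * resid ** 2).sum() / w[:, k].sum())
    pi = w.mean(axis=0)

print("estimated betas:\n", beta)
print("estimated sigmas:", sigma, "mixing weights:", pi)
```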

    Excited multiplets of Eu in GaN

    A method to calculate the multiplet states of lanthanide impurities in solids is presented. This approach is based on a semi-empirical density functional method which includes corrections to account for the correlation and spin-orbit coupling of the 4f electrons. Specific multiplet states of the rare earth are produced by constraining the system. This approach is then used to investigate some of the properties of substitutional europium impurities in gallium nitride, reproducing the relative energy of two multiplets, and discussing a potential excitation mechanism for these centers.
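
    The constraining step can be pictured with the generic constrained-DFT construction (a textbook form assumed here, not the paper's exact implementation): the energy of a target multiplet is obtained by minimizing the density functional subject to prescribed occupations of the 4f orbitals.

```latex
% Generic constrained-minimization form, assuming prescribed occupations
% n_i for the lanthanide 4f orbitals \phi_i:
\[
  E_{\text{multiplet}}
    = \min_{\rho}\, \Big\{\, E_{\text{DFT}}[\rho] \;:\;
        \langle \phi_i | \hat{\rho} | \phi_i \rangle = n_i \ \ \forall i \,\Big\}
\]
```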

    A composition theorem for the Fourier Entropy-Influence conjecture

    The Fourier Entropy-Influence (FEI) conjecture of Friedgut and Kalai [FK96] seeks to relate two fundamental measures of Boolean function complexity: it states that $H[f] \leq C \cdot Inf[f]$ holds for every Boolean function $f$, where $H[f]$ denotes the spectral entropy of $f$, $Inf[f]$ is its total influence, and $C > 0$ is a universal constant. Despite significant interest in the conjecture, it has only been shown to hold for a few classes of Boolean functions. Our main result is a composition theorem for the FEI conjecture. We show that if $g_1, \ldots, g_k$ are functions over disjoint sets of variables satisfying the conjecture, and if the Fourier transform of $F$ taken with respect to the product distribution with biases $\mathbb{E}[g_1], \ldots, \mathbb{E}[g_k]$ satisfies the conjecture, then their composition $F(g_1(x^1), \ldots, g_k(x^k))$ satisfies the conjecture. As an application, we show that the FEI conjecture holds for read-once formulas over arbitrary gates of bounded arity, extending a recent result [OWZ11] which proved it for read-once decision trees. Our techniques also yield an explicit function with the largest known ratio of $C \geq 6.278$ between $H[f]$ and $Inf[f]$, improving on the previous lower bound of $4.615$.
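
    For a concrete sense of the two quantities being compared, the script below (my own illustration, not code from the paper) brute-forces the Fourier expansion of the 3-bit majority function and evaluates its spectral entropy and total influence; the choice of function is arbitrary.

```python
# Brute-force the Fourier expansion of a Boolean function f: {-1,1}^n -> {-1,1}
# and compute the two sides of the FEI inequality. Illustrative sketch only.
import itertools
import math

def majority3(x):
    """3-bit majority in the +/-1 convention: the sign of the sum."""
    return 1 if sum(x) > 0 else -1

n = 3
points = list(itertools.product([-1, 1], repeat=n))

# Fourier coefficient for each subset S: f_hat(S) = E_x[f(x) * prod_{i in S} x_i].
coeffs = {}
for r in range(n + 1):
    for S in itertools.combinations(range(n), r):
        coeffs[S] = sum(
            majority3(x) * math.prod(x[i] for i in S) for x in points
        ) / len(points)

# Spectral entropy H[f] = sum_S f_hat(S)^2 * log2(1 / f_hat(S)^2).
H = sum(c * c * math.log2(1.0 / (c * c)) for c in coeffs.values() if c != 0.0)

# Total influence Inf[f] = sum_S |S| * f_hat(S)^2.
Inf = sum(len(S) * c * c for S, c in coeffs.items())

print(f"H[f] = {H:.4f}, Inf[f] = {Inf:.4f}, ratio = {H / Inf:.4f}")
```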

    An Econometric Approach To Estimating Support Prices And Measures Of Productivity Change In Public Hospitals

    In industry sectors where market prices are unavailable it is common to represent multiple-input multiple-output production technologies using distance functions. Econometric estimation of such functions is complicated by the fact that more than one variable in the function may be endogenous. In such cases, maximum likelihood estimation can lead to biased and inconsistent estimates of the model parameters and associated measures of firm performance. We solve the problem by using linear programming to construct a quantity index. The distance function is then written in the form of a conventional stochastic frontier model where the explanatory variables are unambiguously exogenous. We use this approach to estimate productivity indexes and support (or shadow) prices for a sample of Australian public hospitals. We decompose the productivity index into several measures of environmental change and efficiency change. We find that the productivity effects of improvements in input-oriented technical efficiency have been largely offset by the effects of deteriorations in the production environment over time.
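
    To illustrate the linear-programming step, the sketch below solves a standard output-oriented DEA program on toy data: for each hospital it finds the largest radial expansion of observed outputs supportable by the peers' input-output data. The data, the constant-returns formulation, and the function name are assumptions for illustration; the paper's exact LP and quantity index may differ.

```python
# Output-oriented DEA linear program, a generic sketch. For hospital o:
#   max phi  s.t.  sum_j lam_j * q_j >= phi * q_o,
#                  sum_j lam_j * x_j <= x_o,   lam >= 0.
import numpy as np
from scipy.optimize import linprog

# Toy data: rows = hospitals, columns = outputs (Q) and inputs (X).
Q = np.array([[10.0, 5.0],
              [ 8.0, 9.0],
              [ 6.0, 4.0]])
X = np.array([[4.0, 2.0],
              [5.0, 3.0],
              [3.0, 2.0]])

def max_output_expansion(o):
    J, M = Q.shape
    I = X.shape[1]
    # Decision vector: [phi, lam_1, ..., lam_J]; linprog minimizes, so use -phi.
    c = np.concatenate([[-1.0], np.zeros(J)])
    # Output rows: phi * Q[o, m] - sum_j lam_j * Q[j, m] <= 0.
    A_out = np.column_stack([Q[o], -Q.T])
    # Input rows: sum_j lam_j * X[j, i] <= X[o, i].
    A_in = np.column_stack([np.zeros(I), X.T])
    res = linprog(c,
                  A_ub=np.vstack([A_out, A_in]),
                  b_ub=np.concatenate([np.zeros(M), X[o]]),
                  bounds=[(0, None)] * (1 + J),
                  method="highs")
    return res.x[0]

for o in range(Q.shape[0]):
    print(f"hospital {o}: maximal radial output expansion = "
          f"{max_output_expansion(o):.3f}")
```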