
    Selecting the number of principal components: estimation of the true rank of a noisy matrix

    Principal component analysis (PCA) is a well-known tool in multivariate statistics. One significant challenge in using PCA is the choice of the number of components. In order to address this challenge, we propose an exact distribution-based method for hypothesis testing and construction of confidence intervals for signals in a noisy matrix. Assuming Gaussian noise, we use the conditional distribution of the singular values of a Wishart matrix and derive exact hypothesis tests and confidence intervals for the true signals. Our paper is based on the approach of Taylor, Loftus and Tibshirani (2013) for testing the global null: we generalize it to test for any number of principal components, and derive an integrated version with greater power. In simulation studies we find that our proposed methods compare well to existing approaches.
    Comment: 29 pages, 9 figures, 4 tables
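    The exact conditional Wishart test is not reproduced here; the sketch below illustrates only the general idea of comparing observed singular values against a Gaussian-noise baseline, using a permutation-based (parallel-analysis-style) threshold, which is an assumption for illustration rather than the authors' method.

```python
# Hypothetical sketch: pick the number of principal components by comparing
# observed singular values against a permutation-based noise baseline.
# This is NOT the exact conditional Wishart test proposed in the paper.
import numpy as np

def estimate_rank(X, n_perm=100, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)                      # center columns
    obs = np.linalg.svd(Xc, compute_uv=False)    # observed singular values

    # Null distribution: permute each column independently, destroying
    # cross-column correlation while keeping each column's marginal scale.
    null = np.empty((n_perm, len(obs)))
    for b in range(n_perm):
        Xp = np.column_stack([rng.permutation(Xc[:, j]) for j in range(Xc.shape[1])])
        null[b] = np.linalg.svd(Xp, compute_uv=False)

    # Keep components whose singular value exceeds the (1 - alpha) null quantile.
    thresh = np.quantile(null, 1 - alpha, axis=0)
    exceeds = obs > thresh
    return len(obs) if exceeds.all() else int(np.argmin(exceeds))

# Toy usage: rank-2 signal plus Gaussian noise
rng = np.random.default_rng(1)
signal = 3.0 * rng.normal(size=(200, 2)) @ rng.normal(size=(2, 30))
X = signal + rng.normal(size=(200, 30))
print(estimate_rank(X))   # typically prints 2
```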

    Comparison of dynamic fatigue behavior between SiC whisker-reinforced composite and monolithic silicon nitrides

    The dynamic fatigue behavior of a 30 vol% SiC whisker-reinforced silicon nitride composite and of monolithic silicon nitrides was determined as a function of temperature from 1100 to 1300 °C in ambient air. The fatigue susceptibility parameter, n, decreased from 88.1 to 20.1 for the composite material, and from 50.8 to 40.4 for the monolithic material, as temperature increased from 1100 to 1300 °C. A transition in the dynamic fatigue curve occurred for the composite material at a low stressing rate of 2 MPa/min at 1300 °C, resulting in a very low value of n = 5.8. Fractographic analysis showed that glassy phases in the slow crack growth region were more pronounced in the composite than in the monolithic material, implying that SiC whisker addition promotes the formation of glass-rich phases at the grain boundaries, thereby enhancing fatigue susceptibility. These results indicate that, for this material system, SiC whisker addition to the Si3N4 matrix substantially degrades the fatigue resistance inherent to the matrix base material
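    For context, the fatigue susceptibility parameter n reported above is conventionally extracted from the power-law dependence of fracture strength on stressing rate in constant stressing-rate (dynamic fatigue) testing; the relation below is that standard background form, not a formula taken from this abstract.

```latex
% Conventional dynamic-fatigue (constant stressing-rate) relation for ceramics,
% stated as background; the paper's own fitting procedure is not reproduced here.
\[
  \sigma_f \;\propto\; \dot{\sigma}^{\,1/(n+1)}
  \qquad\Longleftrightarrow\qquad
  \log \sigma_f \;=\; \frac{1}{n+1}\,\log \dot{\sigma} \;+\; \text{const},
\]
% where $\sigma_f$ is the fracture strength and $\dot{\sigma}$ the applied stressing rate.
% A large $n$ means strength is nearly rate-independent (low fatigue susceptibility);
% a small $n$, such as the $n = 5.8$ reported above, indicates pronounced
% slow-crack-growth-controlled fatigue.
```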

    The role of ARE2 cis-acting elements in Hro-twist messenger RNA localization


    Beyond Purposivism in Tax Law


    Early Release in International Criminal Law


    Legal Analysis, Policy Analysis, and the Price of Deference: An Empirical Study of Mayo and Chevron

    A huge literature contemplates the theoretical relationship between judicial deference and agency rulemaking. But relatively little empirical work has studied the actual effect of deference on how agencies draft regulations. As a result, some of the most important questions surrounding deference—whether it encourages agencies to focus on policy analysis instead of legal analysis, its relationship to procedures like notice and comment—have so far been dominated by conjecture and anecdote. Because Chevron, U.S.A., Inc. v. Natural Resources Defense Council, Inc. applied simultaneously across agencies, it has been difficult to separate its specific causal effect from other contemporaneous events in the 1980s, like the rise of cost-benefit analysis and the new textualism. This Article contends with this problem by exploiting a unique event in administrative law: the Supreme Court’s 2011 decision in Mayo Foundation v. United States, which required that courts apply Chevron deference to interpretative tax regulations. By altering the deference regime applicable to one specific category of regulation, Mayo created a natural experiment with a treatment group (interpretative tax regulations) and a control group (all other regulations)
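    The treatment/control structure described above lends itself to a difference-in-differences comparison; the sketch below is purely illustrative, with a hypothetical per-regulation outcome measure and made-up column names and toy values, and is not the Article's actual data or specification.

```python
# Hypothetical difference-in-differences sketch for the Mayo natural experiment.
# The outcome ("policy_score"), column names, and values are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "policy_score": [0.31, 0.35, 0.28, 0.44, 0.30, 0.33, 0.29, 0.36],  # toy values
    "tax_reg":      [1, 1, 0, 1, 0, 0, 1, 0],   # 1 = interpretative tax regulation (treated)
    "post_mayo":    [0, 0, 0, 1, 1, 0, 1, 1],   # 1 = issued after the 2011 Mayo decision
})

# The coefficient on the tax_reg:post_mayo interaction is the
# difference-in-differences estimate of Mayo's effect on the outcome.
model = smf.ols("policy_score ~ tax_reg * post_mayo", data=df).fit()
print(model.summary())
```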

    Measuring Clarity in Legal Text

    Legal cases often turn on judgments of textual clarity: when the text is unclear, judges allow extrinsic evidence in contract disputes, consult legislative history in statutory interpretation, and more. Despite this, almost no empirical work considers the nature or prevalence of legal clarity. Scholars and judges who study real-world documents to inform the interpretation of legal text primarily treat unclear text as a research problem to be solved with more data rather than a fundamental feature of language. This Article makes both theoretical and empirical contributions to the legal concept of textual clarity. It first advances a theory of clarity that distinguishes between information and determinacy. A judge might find text unclear because she personally lacks sufficient information to decide which interpretation is best; alternatively, she might find it unclear because the text itself is fundamentally indeterminate. Fundamental linguistic indeterminacy explains ongoing interpretive debates and limits the potential for text-focused methods (including corpus linguistics) to decide cases. With this theoretical background, the Article then proposes a new method to algorithmically evaluate textual clarity. Applying techniques from natural language processing and artificial intelligence that measure the semantic similarity between words, we can shed valuable new light on questions of legal interpretation. This Article finds that text is frequently indeterminate in real-world legal cases. Moreover, estimates of similarity vary substantially from corpus to corpus, even for large and reputable corpora. This suggests that word use is highly corpus-specific and that meaning can vary even between general-purpose corpora that theoretically capture ordinary meaning. These empirical findings have important implications for ongoing doctrinal debates, suggesting that text is less clear and objective than many textualists believe. Ultimately, the Article offers new insights both to theorists considering the role of legal text and to empiricists seeking to understand how text is used in the real world
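    The algorithmic approach described above rests on measuring semantic similarity between words and comparing it across corpora; a minimal sketch of that building block, using cosine similarity over pre-computed word vectors (the embeddings and corpora here are stand-ins, not the Article's data), is shown below.

```python
# Minimal sketch of the building block behind corpus-based clarity measures:
# cosine similarity between word vectors, compared across two corpora.
# The embeddings are placeholders; the Article's corpora are not reproduced.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Pretend these were trained on two different corpora (e.g., news vs. legal opinions).
embeddings_corpus_a = {"vehicle": np.array([0.9, 0.1, 0.3]), "bicycle": np.array([0.7, 0.2, 0.4])}
embeddings_corpus_b = {"vehicle": np.array([0.5, 0.6, 0.2]), "bicycle": np.array([0.1, 0.8, 0.3])}

for name, emb in [("corpus A", embeddings_corpus_a), ("corpus B", embeddings_corpus_b)]:
    sim = cosine(emb["vehicle"], emb["bicycle"])
    print(f"{name}: similarity(vehicle, bicycle) = {sim:.3f}")

# If the two corpora yield materially different similarities, the "ordinary meaning"
# of the word pair is itself corpus-dependent, which is the indeterminacy the
# Article describes.
```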

    A Solution to k-Exclusion with O(log k) RMR Complexity

    We specify and prove an algorithm solving k-Exclusion, a generalization of the Mutual Exclusion problem. k-Exclusion requires that at most k processes be in the Critical Section (CS) at once; in addition, we require bounded exit, starvation freedom and fairness properties. The goal within this framework is to minimize the number of Remote Memory References (RMRs) made. Previous algorithms have required Ω(k) RMRs in the worst case. Our algorithm requires O(log k) RMRs in the worst case under the Cache-Coherent (CC) model, a considerable improvement in time complexity
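    The paper's O(log k) RMR algorithm is not reproduced here; as a point of reference, the sketch below demonstrates only the k-exclusion property itself (at most k threads in the critical section at once) using a counting semaphore, and says nothing about RMR complexity.

```python
# Baseline illustration of the k-exclusion property only: at most K threads
# may occupy the critical section simultaneously. This is NOT the paper's
# O(log k) RMR algorithm; it is a toy using a counting semaphore.
import threading
import time

K = 3                                   # at most K threads in the CS at once
slots = threading.BoundedSemaphore(K)
in_cs = 0
count_lock = threading.Lock()

def worker(i):
    global in_cs
    with slots:                         # entry section: acquire one of K slots
        with count_lock:
            in_cs += 1
            assert in_cs <= K           # k-exclusion invariant
        time.sleep(0.01)                # critical section work
        with count_lock:
            in_cs -= 1                  # leaving; slot released by 'with slots'

threads = [threading.Thread(target=worker, args=(i,)) for i in range(10)]
for t in threads: t.start()
for t in threads: t.join()
print("done: k-exclusion invariant held")
```

    A semaphore-based baseline like this can incur an unbounded number of remote memory references under contention, which is exactly the cost the paper's local-spin, tournament-style approach is designed to avoid.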