
    Automation and robotics technology for intelligent mining systems

    The U.S. Bureau of Mines is approaching the problems of accidents and efficiency in the mining industry through the application of automation and robotics to mining systems. This technology can increase safety by removing workers from hazardous areas of the mines or from hazardous tasks. The short-term goal of the Automation and Robotics program is to develop technology that can be implemented as an autonomous mining machine built on current continuous mining machine equipment. In the longer term, the goal is to conduct research leading to new intelligent mining systems that capitalize on the capabilities of robotics. The Bureau of Mines Automation and Robotics program has been structured to produce the technology required for both goals. The short-term goal of applying automation and robotics to an existing mining machine, resulting in autonomous operation, is expected to be accomplished within five years. Work on the key technology elements required for an autonomous continuous mining machine is well underway; these elements include machine navigation systems, coal-rock interface detectors, machine condition monitoring, and intelligent computer systems. The Bureau of Mines program is described, including the status of key technology elements for an autonomous continuous mining machine, the program schedule, and future work. Although the program is directed toward underground mining, much of the technology being developed may have applications for space systems or mining on the Moon or other planets.

    The Performance of the Turek-Fletcher Model Averaged Confidence Interval

    We consider the model averaged tail area (MATA) confidence interval proposed by Turek and Fletcher (CSDA, 2012) in the simple situation in which we average over two nested linear regression models. We prove that the MATA for any reasonable weight function belongs to the class of confidence intervals defined by Kabaila and Giri (JSPI, 2009). Each confidence interval in this class is specified by two functions b and s. Kabaila and Giri show how to compute these functions so as to optimize these intervals in terms of satisfying the coverage constraint and minimizing the expected length for the simpler model, while ensuring that the expected length has desirable properties for the full model. These Kabaila and Giri "optimized" intervals provide an upper bound on the performance of the MATA for an arbitrary weight function. We use this fact to evaluate the MATA for a broad class of weights based on exponentiating a criterion related to Mallows' C_P. Our results show that, while far from ideal, this MATA performs surprisingly well, provided that we choose a member of this class that does not put too much weight on the simpler model.
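As a rough illustration of the averaging idea (not the paper's actual implementation), the sketch below solves for MATA-style limits over two hypothetical nested models, using a normal approximation in place of the t-based tail areas of the original proposal; the criterion values, estimates, and standard errors are invented for the example.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mata_interval(estimates, ses, weights, alpha=0.05):
    """Find limits t where the model-averaged tail area equals alpha/2
    (normal approximation; illustrative only)."""
    def g(t):
        # Weighted tail-area function; decreasing in t.
        return sum(w * phi((est - t) / se)
                   for est, se, w in zip(estimates, ses, weights))
    def solve(target):
        lo = min(estimates) - 10 * max(ses)
        hi = max(estimates) + 10 * max(ses)
        for _ in range(200):  # bisection on the monotone function g
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if g(mid) > target else (lo, mid)
        return 0.5 * (lo + hi)
    return solve(1 - alpha / 2), solve(alpha / 2)

# Hypothetical per-model estimates and standard errors, with weights
# formed by exponentiating a Cp-like criterion (smaller = better fit):
crit = [3.2, 2.0]
raw = [math.exp(-c / 2) for c in crit]
w = [r / sum(raw) for r in raw]
low, high = mata_interval([1.4, 1.1], [0.3, 0.25], w)
```

Putting more weight on the simpler model pulls both limits toward its estimate, which is the lever the evaluation above turns on.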

    Fletcher-Turek Model Averaged Profile Likelihood Confidence Intervals

    We evaluate the model averaged profile likelihood confidence intervals proposed by Fletcher and Turek (2011) in a simple situation in which there are two linear regression models over which we average. We obtain exact expressions for the coverage and the scaled expected length of these intervals and use them to compute these quantities in particular situations. We show that the Fletcher-Turek confidence intervals can have coverage well below the nominal level and expected length greater than that of a standard confidence interval whose coverage equals this minimum coverage. In these situations, the Fletcher-Turek confidence intervals are unfortunately no better than the standard confidence interval that is used after model selection but ignores the model selection process.
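The general phenomenon of undercoverage after select-then-estimate procedures can be illustrated with a small Monte Carlo sketch (an entirely hypothetical setup, not the models or parameter values of the paper): a nominal 95% interval reported from whichever of two nested models a t-test selects can cover the true slope far less often than 95%.

```python
import math
import random

def coverage_after_selection(beta1=0.4, n=30, reps=2000, z=1.96, seed=1):
    """Monte Carlo coverage, for the slope beta1, of the standard
    interval reported after choosing between a zero-slope model and a
    full model by a z-test (error sd assumed known and equal to 1)."""
    rng = random.Random(seed)
    xs = [-1 + 2 * i / (n - 1) for i in range(n)]   # fixed design on [-1, 1]
    sxx = sum(x * x for x in xs)
    se = 1.0 / math.sqrt(sxx)                       # sd of the slope estimate
    hits = 0
    for _ in range(reps):
        ys = [beta1 * x + rng.gauss(0.0, 1.0) for x in xs]
        b1 = sum(x * y for x, y in zip(xs, ys)) / sxx
        if abs(b1) > z * se:          # full model selected
            lo, hi = b1 - z * se, b1 + z * se
        else:                         # simpler model selected: slope fixed at 0
            lo = hi = 0.0
        hits += lo <= beta1 <= hi
    return hits / reps

cov = coverage_after_selection()
```

With the slope near the detection boundary, the estimated coverage falls well below the nominal 0.95, which is the benchmark failure mode the evaluation above compares against.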

    Sam Clemens' Hannibal, 1836-1838


    Racially Restrictive Covenants in the United States: A Call to Action

    This paper examines the history and structure of racially restrictive covenants in the United States to better comprehend their continued existence, despite their illegality. While unenforceable, racially restrictive covenants signal tone and intent, may be psychologically damaging, and perpetuate segregation. Racially restrictive covenants were widespread tools of discrimination used by white homeowners to prevent the migration of people of color into their neighborhoods during the first half of the 20th century. In its 1948 decision, Shelley v. Kraemer, the U.S. Supreme Court held that racially restrictive covenants could not be enforced, but the practice of inserting such covenants into title documents remained common. Finally, in 1968, the Federal Fair Housing Act made the practice of writing racial covenants into deeds illegal. However, nearly seventy years after Shelley and fifty years after the Fair Housing Act, racially restrictive covenants remain common features of deeds. This may be for several reasons. First, since covenants run with the land, they become part of the land title in perpetuity. Second, the process to remove covenants is expensive and time-consuming. Third, the majority of owners may not be aware that their properties are subject to racially restrictive covenants. Despite these challenges, it may be possible to adopt policies to improve removal rates. This paper calls lawyers, urban planners, and real estate professionals to action in light of their active role in the proliferation of racially restrictive covenants in the 20th century.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/143831/1/A_12 Racially Restrictive Covenants in the US.pd

    Repatriation and Cultural Preservation: Potent Objects, Potent Pasts

    Parts I and II discuss the preservation idea itself and the history of museums' participation in cultural preservation efforts. Parts III and IV then look specifically at the repatriation issue, providing some background on initiatives that have influenced peoples' thoughts and actions. Finally, Part V outlines and discusses some of the issues that have made resolution of the repatriation issue particularly complex.

    Designing experiments for an application in laser and surface chemistry

    We consider the design used to collect data for a Second Harmonic Generation (SHG) experiment, in which the behaviour of the interface between two phases, for example the surface of a liquid, is investigated. Such studies have implications for surfactants, catalysis, membranes and electrochemistry. We describe ongoing work on designing experiments to investigate the nonlinear models used to represent the data, which relate the intensity of the SHG signal to the polarisation angles of the polarised light beam. The choice of design points and their effect on parameter estimates is investigated. Various designs, including the current practice of using equally spaced levels, are compared on the basis of the overall aim of the chemical study.
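One common way to compare candidate designs for a nonlinear model is the D-criterion, the determinant of the information matrix J^T J built from the model's parameter sensitivities at the design points. The sketch below uses a stand-in two-parameter response in the polarisation angle, not the actual SHG intensity model from the study; the parameter values and angle grid are invented for the example.

```python
import math

def d_criterion(angles, a=1.0, b=0.5):
    """det(J^T J) for an illustrative two-parameter nonlinear response
    y = (a + b*cos(2*gamma))^2 at the design angles (radians).
    This is a stand-in model, not the actual SHG intensity function."""
    rows = []
    for g in angles:
        m = a + b * math.cos(2.0 * g)
        rows.append((2.0 * m, 2.0 * m * math.cos(2.0 * g)))  # dy/da, dy/db
    # Accumulate the 2x2 information matrix and return its determinant.
    s00 = sum(r[0] * r[0] for r in rows)
    s01 = sum(r[0] * r[1] for r in rows)
    s11 = sum(r[1] * r[1] for r in rows)
    return s00 * s11 - s01 * s01

equal_spaced = [math.pi * k / 8.0 for k in range(8)]  # 8 equally spaced levels
score = d_criterion(equal_spaced)
```

A larger determinant corresponds to a tighter joint confidence region for the parameters; replicating a single angle makes the information matrix singular (determinant zero), so spreading the levels is essential even before optimising their positions.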