
    Bayesian evaluation of the Southern Hemisphere radiocarbon offset during the Holocene

    While an interhemispheric offset in atmospheric radiocarbon levels from AD 1950–950 is now well established, its existence earlier in the Holocene is less clear, with some studies reporting globally uniform 14C levels and others finding Southern Hemisphere samples older by a few decades. In this paper, we present a method for wiggle-matching Southern Hemisphere data sets against Northern Hemisphere curves, using the Bayesian calibration program OxCal 4.1 with the Reservoir Offset function accommodating a potential interhemispheric offset. The accuracy and robustness of this approach are confirmed by wiggle-matching known-calendar age sequences of the Southern Hemisphere calibration curve SHCal04 against the Northern Hemisphere curve IntCal04. We also show that 5 of 9 Holocene Southern Hemisphere data sets are capable of yielding reliable offset information. Those data sets that are accurate and precise show that interhemispheric offset levels in the Early Holocene are similar to modern levels, confirming SHCal04 as the curve of choice for calibrating Southern Hemisphere samples.
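
    As a rough illustration of the offset-estimation idea (not the paper's Bayesian procedure), a constant interhemispheric offset can be recovered from a known-calendar-age sequence by a chi-square grid search against a reference curve; the data, "curve," and numbers below are entirely synthetic:

        # Toy chi-square fit of a constant reservoir offset between a
        # Southern Hemisphere 14C series and a Northern Hemisphere curve.
        # Synthetic stand-in data; OxCal's Reservoir Offset function
        # performs the full Bayesian version of this comparison.
        import numpy as np

        cal_age = np.arange(9000, 9100, 10)        # decadal calendar ages (cal BP)
        nh_curve = 8000 + 0.8 * (cal_age - 9000)   # invented NH curve (14C yr BP)
        sh_meas = nh_curve + 40 + np.random.normal(0, 15, cal_age.size)
        sh_err = np.full(cal_age.size, 15.0)       # quoted 1-sigma errors (14C yr)

        offsets = np.arange(-100, 101)             # candidate offsets (14C yr)
        chi2 = [np.sum(((sh_meas - nh_curve - d) / sh_err) ** 2) for d in offsets]
        print("best-fit SH offset:", offsets[int(np.argmin(chi2))], "14C yr")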

    The New Zealand Kauri (Agathis australis) Research Project: A Radiocarbon Dating Intercomparison of Younger Dryas Wood and Implications for IntCal13

    We describe here the New Zealand kauri (Agathis australis) Younger Dryas (YD) research project, which aims to undertake Δ14C analysis of ~140 decadal floating wood samples spanning the time interval ~13.1–11.7 kyr cal BP. We report 14C intercomparison measurements undertaken by the radiocarbon dating laboratories at the University of Waikato (Wk), the University of California at Irvine (UCI), and the University of Oxford (OxA). The Wk, UCI, and OxA laboratories show very good agreement in an interlaboratory comparison of 12 successive decadal kauri samples (average offsets from consensus values of –7 to +4 14C yr). A University of Waikato/University of Heidelberg (HD) intercomparison involving measurement of the YD-age Swiss larch tree Ollon505 shows a HD/Wk offset of ~10–20 14C yr (HD younger) and strong evidence that the positioning of the Ollon505 series is incorrect; we recommend that these 14C analyses be removed from the IntCal calibration database.

    Using Markov Models and Statistics to Learn, Extract, Fuse, and Detect Patterns in Raw Data

    Many systems are partially stochastic in nature. We have derived data-driven approaches for extracting stochastic state machines (Markov models) directly from observed data. This chapter provides an overview of our approach with numerous practical applications. We have used this approach for inferring shipping patterns, exploiting computer system side-channel information, and detecting botnet activities. For contrast, we include a related data-driven statistical inferencing approach that detects and localizes radiation sources. Comment: Accepted by the 2017 International Symposium on Sensor Networks, Systems and Security.
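
    A minimal sketch of the core idea, assuming a first-order chain (the chapter's actual extraction algorithm is more general): the transition matrix of a Markov model can be estimated from raw observations by counting state-to-state transitions and row-normalizing.

        # Estimate a first-order Markov transition matrix from an
        # observed state sequence: count transitions, normalize rows.
        from collections import Counter, defaultdict

        def fit_markov(seq):
            counts = defaultdict(Counter)
            for a, b in zip(seq, seq[1:]):
                counts[a][b] += 1
            states = sorted(set(seq))
            return {a: {b: counts[a][b] / sum(counts[a].values()) if counts[a] else 0.0
                        for b in states}
                    for a in states}

        # Toy observation stream (invented states for illustration)
        obs = ["idle", "send", "send", "idle", "recv", "send", "idle", "idle"]
        model = fit_markov(obs)
        print(model["idle"])   # empirical P(next state | current = "idle")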

    User-generated censorship: manipulating the maps of social media

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Comparative Media Studies, 2013. Cataloged from PDF version of thesis. Vita. Includes bibliographical references. The last decade has seen the rise of new technologies for making information more broadly available and accessible. Variously called 'user-generated content,' 'social media,' 'social news,' 'crowd-curation,' and so on, these design conventions, algorithmic arrangements, and user practices have been widely praised for 'democratizing' media by lowering the barriers to publishing, accrediting, and aggregating information. Intermediary platforms like Facebook, reddit, and Twitter, among others, are generally expected to elicit valuable knowledge through the algorithmic filtering mechanisms broadly distributed among their users. This thesis investigates user-generated censorship: an emergent mode of intervention by which users strategically manipulate social media to suppress speech. It shows that the tools designed to help make information more available have been repurposed and reversed to make it less available. Case studies reveal that these platforms, far from being neutral pipes through which information merely travels, are in fact contingent sociotechnical systems upon and through which users effect their politics through the power of algorithms. By strategically pulling the levers which make links to sites more or less visible, users recompose the representations of the world produced by social media, altering pathways of access and availability and changing the flow of information. This thesis incorporates insights from media studies, sociology, law and policy, information science, and science-technology studies to study user-generated censorship. It contributes to a broader conversation now emerging across fields which seeks to explore and understand the politics of our developing social media systems. by Christopher E. Peterson. S.M.

    Minimum pressure envelope cavitation analysis using two-dimensional panel method

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2008. Includes bibliographical references (leaf 42). An analysis tool for calculating minimum pressure envelopes was developed using XFOIL. This thesis presents MATLAB® executables that interface with a modified version of XFOIL for determining the minimum pressure of a foil operating in an inviscid fluid. The code creates minimum pressure envelopes similar to those published by Brockett (1965). XFOIL, developed by Mark Drela in 1986, is a design system for low Reynolds number airfoils that combines the speed and accuracy of high-order panel methods with fully coupled viscous/inviscid interaction. XFOIL was altered so that it reads command-line arguments that provide operating instructions, rather than relying on operator interaction via menu options; in addition, all screen output and plotting functions were removed. These modifications stripped XFOIL's user interface and created a "black box" version that performs the desired calculations and writes the output to a file, allowing rapid execution by an external program such as MATLAB®. XFOIL's algorithms also provide a significant improvement in the accuracy of minimum pressure prediction over the method published by Brockett. The modified XFOIL and MATLAB® interface developed in this thesis are intended for future integration with the Open-source Propeller Design and Analysis Program (OpenProp), an open-source MATLAB®-based suite of propeller design tools. Currently, OpenProp performs parametric analysis and single propeller design, but does not perform cavitation analysis. Minimum pressure envelopes give the propeller designer information about the operating conditions a propeller will encounter. The code developed in this thesis allows the designer to rapidly assess cavitation conditions during the design phase and to modify the propeller blade design to optimize cavitation performance. A design methodology outlining future integration with OpenProp is discussed. by Christopher J. Peterson. S.M.
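
    The same "black box" pattern can be sketched generically (shown here in Python rather than MATLAB®; the executable name, argument order, and output format are hypothetical stand-ins, not the thesis's actual interface):

        # Driving a command-line "black box" solver, in the spirit of the
        # modified XFOIL described above. The binary name, its argument
        # order, and the output file format are invented placeholders.
        import subprocess

        def min_pressure(foil_file, alpha, out_file="cp_min.txt"):
            # The modified solver takes its instructions from argv
            # instead of interactive menus, and writes results to a file.
            subprocess.run(["./xfoil_mod", foil_file, str(alpha), out_file],
                           check=True)
            with open(out_file) as f:
                return float(f.read().strip())   # minimum pressure coefficient

        # e.g. sweep angle of attack to build a minimum pressure envelope:
        # cps = [min_pressure("naca66.dat", a) for a in range(-4, 9)]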

    Financing green buildings

    Thesis (S.M. in Real Estate Development)--Massachusetts Institute of Technology, Program in Real Estate Development in Conjunction with the Center for Real Estate, 2013. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Appendices are printed in landscape orientation. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (pages 50-51). An emerging trend in real estate is the development of sustainable buildings, due in part to the large environmental impact of the design, construction and operation of commercial buildings. This thesis provides a brief history of the green building movement and of the two programs that encourage the development of energy-efficient and sustainable buildings in the United States: the U.S. Green Building Council's Leadership in Energy and Environmental Design (LEED) program and the Energy Star program, jointly sponsored by the Department of Energy and the Environmental Protection Agency. This thesis also summarizes a study by Piet Eichholtz, Nils Kok and John Quigley titled "Doing Well by Doing Good? Green Office Buildings," published in December 2010 in the American Economic Review. That study found that a commercial building with an Energy Star rating rents for 3% more per square foot, that the premium on effective rent was approximately 7%, and that the premium on sale price of a green building was as much as 16%. Then, using the same data as Eichholtz, Kok and Quigley, this thesis reports on the location and ownership of these green buildings and calculates loan-to-value (LTV) ratios using the most recent sales prices and financing amounts from the CoStar Group. In addition, each property's current LEED certification status is provided, along with a review of federal and state incentives for sustainable buildings. The results indicate that green buildings are concentrated in California, Texas and Colorado. Investment management firms, national developer/owners and real estate investment trusts own the majority of green properties. The LTV ratio for green buildings is no higher than for conventional office buildings. Not enough information is available to compare mortgage interest rates between green and conventional properties. The number of LEED buildings and the level of certification have increased since 2008. The states with the largest numbers of LEED buildings are California, Texas, Colorado and Virginia, correlating with the top states for green buildings overall. Although financing sustainable buildings is a worthy goal, federal and state assistance for it remains limited. by Christopher John Pierce. S.M. in Real Estate Development
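
    For reference, the LTV calculation reduces to a simple ratio of financing amount to sale price; the figures below are invented for illustration:

        # Loan-to-value ratio from the most recent sale price and the
        # most recent financing amount. Placeholder numbers only.
        loan_amount = 21_000_000      # most recent financing (US$)
        sale_price = 30_000_000       # most recent sales price (US$)
        ltv = loan_amount / sale_price
        print(f"LTV = {ltv:.0%}")     # -> LTV = 70%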

    Prospects of antineutrino detection as an IAEA verification metric for the disposition of weapons-grade plutonium in the United States

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Nuclear Science and Engineering; and, (S.M. in Technology and Policy)--Massachusetts Institute of Technology, Engineering Systems Division, Technology and Policy Program, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (p. 104-111). After the end of World War II, the world entered an even more turbulent period as it faced the beginnings of the Cold War, during which the prospect of mutually assured destruction between the world's largest nuclear weapon states was ever-present and often provoked tense confrontations. Although fears of a nuclear holocaust subsided significantly after the dissolution of the Soviet Union in 1991, the world faced a potentially more dangerous prospect: the proliferation risks associated with the insecurity and unauthorized acquisition of Soviet-era nuclear warheads. Although all Soviet-era weapons were eventually acquired by Russia, concerns about the excessively large weapons stockpiles of the United States and Russia, combined with the goal of nuclear disarmament, led to the Plutonium Management and Disposition Agreement (PMDA). During the Cold War, the US and the Soviet Union produced approximately 100 and 150 metric tons of weapons-grade plutonium (WGPu), respectively. Under the terms of the PMDA, each nation formally agreed to irradiate 34 MT of excess military plutonium in the form of mixed oxide (MOX) fuel in nuclear power reactors. One of the major concerns associated with this agreement relates to the verification measures that will be implemented to ensure actual WGPu disposition. Additionally, despite a commitment (Article VII.3 of the PMDA) to engage and consult with the International Atomic Energy Agency (IAEA) to establish arrangements to monitor the plutonium disposition process, a formalized IAEA role within a potential multilateral verification regime has yet to be determined. In this work, the ability of the US to achieve the goals of its plutonium disposition campaign by 2018 is assessed, and the suitability of the IAEA as an objective party to a multilateral verification regime under the auspices of the PMDA is analyzed. To aid the IAEA with such expected verification procedures, the applicability of antineutrino detection as a monitoring technology that could significantly enhance current procedures is considered. Although this technology has not yet been formally demonstrated under the auspices of the PMDA, it has been successfully fielded and nonintrusively operated at US and Russian reactors for years at a time, with the explicit aim of demonstrating potential relevance to a range of safeguards and verification tasks. The sensitivity of an antineutrino detector to count rate measurements was analyzed through a hypothesis testing procedure that sought to identify statistically significant differences between the count rate evolutions of a designated baseline and potential diversion scenarios. With a specified set of parameters, the test demonstrated that the detector was capable of identifying the replacement of 7 WGPu MOX fuel assemblies with conventional LEU fuel assemblies within 360 days of fuel cycle operation at a >95% true positive rate and a 5% false positive rate limit. These results were essentially maintained even with a non-reactor-based antineutrino event background signal as high as 25%. Although pitfalls with regard to systematic uncertainty and operator malfeasance were revealed, potential solutions to such issues are also presented and discussed. All in all, the results obtained in this work confirm the potential efficacy and viability of antineutrino rate-based measurements for a range of reactor safeguards and verification tasks. by Christopher Michael Copeland. S.M. in Technology and Policy; S.M.
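
    A much-simplified sketch of such a count-rate hypothesis test, assuming Poisson counts, a normal approximation, and invented numbers (the thesis's actual procedure is more detailed):

        # Flag a statistically significant deviation of an observed
        # antineutrino count from the baseline (no-diversion) expectation,
        # at roughly a 5% false positive rate. Placeholder numbers only.
        import math

        def diversion_flagged(observed, expected_baseline, z_crit=1.96):
            # Normal approximation to Poisson counts: sigma ~ sqrt(mean).
            z = abs(observed - expected_baseline) / math.sqrt(expected_baseline)
            return z > z_crit    # two-sided test, ~5% false positive rate

        # A fuel substitution shifts the fission fractions and hence the
        # count rate; a large enough daily deviation exceeds the threshold.
        print(diversion_flagged(observed=9_600, expected_baseline=9_800))  # True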

    Design and implementation of small satellite inspection

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (p. 179-181). For a variety of missions, vision-based navigation and similar architectures provide the advantage of detailed measurements for a fraction of the size and complexity of ground-based imagers. This thesis presents a simple navigation algorithm that uses no more than a visual centroid measurement to enable in-situ inspection of space objects. This work evaluates those inspection maneuvers using the Synchronized Position Hold Engage Reorient Experimental Satellites (SPHERES). Hardware performance was evaluated using data from the International Space Station together with ground-based simulations. Ultimately, this work prepares for future experimentation using the VERTIGO vision-navigation payload for SPHERES. The first step presented is an analysis of the measurement capabilities of the SPHERES system and the predicted performance of the VERTIGO system; this analysis shows that tests run using the former system are applicable to the latter in terms of accuracy, precision, and observability. The second step is an analysis of the tests run on the Space Station, a comparison with the results predicted by simulation, and an extension of those results to simulations of more complex maneuvers. The robustness of the control to disturbances is also determined. Finally, this thesis reflects on the technical and programmatic challenges of developing the VERTIGO payload. From these challenges, lessons are drawn that may guide future developers and program managers, particularly in the university engineering environment. by Michael Christopher O'Connor. S.M.
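
    A generic version of the centroid measurement such an algorithm consumes can be sketched as follows (not the SPHERES/VERTIGO code; the threshold segmentation and any numbers are illustrative assumptions):

        # Intensity centroid of a bright target in an image: the single
        # visual observable the navigation approach above relies on.
        import numpy as np

        def target_centroid(image, threshold=0.5):
            mask = image > threshold * image.max()    # segment bright target
            ys, xs = np.nonzero(mask)
            w = image[ys, xs]                         # pixel intensities as weights
            return (np.average(xs, weights=w),        # centroid column (px)
                    np.average(ys, weights=w))        # centroid row (px)

        # The pixel centroid maps to a bearing (line-of-sight) measurement
        # through the camera model, which the inspection controller consumes.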

    Cost, affordability and cost-effectiveness of strategies to control tuberculosis in countries with high HIV prevalence

    Background: The HIV epidemic has caused a dramatic increase in tuberculosis (TB) in East and southern Africa. Several strategies have the potential to reduce the burden of TB in high HIV prevalence settings, and cost and cost-effectiveness analyses can help to prioritize them when budget constraints exist. However, published cost and cost-effectiveness studies are limited. Methods: Our objective was to compare the cost, affordability and cost-effectiveness of seven strategies for reducing the burden of TB in countries with high HIV prevalence. A compartmental difference equation model of TB and HIV and recent cost data were used to assess the costs (year 2003 US$ prices) and effects (TB cases averted, deaths averted, DALYs gained) of these strategies in Kenya during the period 2004-2023. Results: The three lowest cost and most cost-effective strategies were improving TB cure rates, improving TB case detection rates, and improving both together. The incremental cost of combined improvements to case detection and cure was below US$15 million per year (7.5% of year 2000 government health expenditure); the mean cost per DALY gained of these three strategies ranged from US$18 to US$34. Antiretroviral therapy (ART) had the highest incremental costs, which by 2007 could be as large as total government health expenditures in year 2000. ART could also gain more DALYs than the other strategies, at a cost per DALY gained of around US$260 to US$530. Both the costs and effects of treatment for latent tuberculosis infection (TLTI) for HIV+ individuals were low; the cost per DALY gained ranged from about US$85 to US$370. Averting one HIV infection for less than US$250 would be as cost-effective as improving TB case detection and cure rates to WHO target levels. Conclusions: To reduce the burden of TB in high HIV prevalence settings, the immediate goal should be to increase TB case detection rates and, to the extent possible, improve TB cure rates, preferably in combination. Realising the full potential of ART will require substantial new funding and strengthening of health system capacity so that increased funding can be used effectively.
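
    The "cost per DALY gained" figures quoted above are incremental cost-effectiveness ratios; a minimal sketch with placeholder numbers (not values from the study):

        # Incremental cost-effectiveness ratio (cost per DALY gained)
        # relative to a baseline scenario. Figures are invented.
        def cost_per_daly(cost_strategy, cost_baseline,
                          dalys_strategy, dalys_baseline):
            return (cost_strategy - cost_baseline) / (dalys_strategy - dalys_baseline)

        print(cost_per_daly(cost_strategy=12_000_000, cost_baseline=9_000_000,
                            dalys_strategy=520_000, dalys_baseline=400_000))
        # -> 25.0 (US$ per DALY gained)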