
    A Holocene paleoenvironmental record based on ungulate stable isotopes from Lukenya Hill, Kenya

    Investigating the development of Holocene behavioral adaptations requires knowing how and why different human groups are distributed on the landscape. An expanded dataset of site-specific environmental and habitat reconstructions from eastern Africa is a crucial contextual component for pushing this line of inquiry forward. This paper provides localized paleoenvironmental data from Holocene deposits at the multi-site Lukenya Hill archaeological complex on the Athi-Kapiti Plains of Kenya. Lukenya Hill preserves two temporal units, an early-mid Holocene (~9.0–4.6 ka) and a late Holocene (~2.3–1.2 ka), which span the end of the African Humid Period and the onset of late Holocene aridification. Carbon isotope analysis of herbivore tooth enamel (n = 22) indicates an increase in open grasslands over time, with the early-mid Holocene having a woodier signal than the late Holocene and Recent populations in the Athi ecosystem. This pattern deviates from local environmental sequences in the Lake Victoria and Lake Turkana basins, providing additional evidence of heterogeneous habitat conditions during the Holocene of eastern Africa. The expansion of locally specific paleoecological datasets in eastern Africa allows for an examination of the role climate and ecology played in human economic and behavioral development during the Holocene.
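    The open-versus-wooded inference above rests on the contrast between C3 (trees and shrubs) and C4 (tropical grass) vegetation recorded in enamel carbon isotopes. A minimal two-endmember mixing sketch in Python, using illustrative endmember values that are assumptions rather than figures from the paper, shows how an enamel δ13C value maps onto an approximate fraction of C4 grass in the diet:

        # Hedged sketch: two-endmember mixing model for herbivore diet from enamel d13C.
        # The endmember values are ballpark assumptions, not values from this study.
        def fraction_c4(d13c_enamel, c3_end=-12.0, c4_end=2.0):
            """Approximate fraction of C4 (grass) diet from enamel d13C (permil, VPDB)."""
            frac = (d13c_enamel - c3_end) / (c4_end - c3_end)
            return min(max(frac, 0.0), 1.0)  # clamp to the physically meaningful range

        # Example with a hypothetical enamel value of -2 permil: a grass-dominated mixed diet.
        print(f"{fraction_c4(-2.0):.2f}")  # ~0.71

    Under such a model, a shift toward higher d13C values in younger samples reads as an expansion of C4 grasslands, which is the pattern described above.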

    Accountable Algorithms

    Many important decisions historically made by people are now made by computers. Algorithms count votes, approve loan and credit card applications, target citizens or neighborhoods for police scrutiny, select taxpayers for IRS audit, grant or deny immigration visas, and more. The accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology. The tools currently available to policymakers, legislators, and courts were developed to oversee human decisionmakers and often fail when applied to computers instead. For example, how do you judge the intent of a piece of software? Because automated decision systems can return potentially incorrect, unjustified, or unfair results, additional approaches are needed to make such systems accountable and governable. This Article reveals a new technological toolkit to verify that automated decisions comply with key standards of legal fairness. We challenge the dominant position in the legal literature that transparency will solve these problems. Disclosure of source code is often neither necessary (because of alternative techniques from computer science) nor sufficient (because of the issues analyzing code) to demonstrate the fairness of a process. Furthermore, transparency may be undesirable, such as when it discloses private information or permits tax cheats or terrorists to game the systems determining audits or security screening. The central issue is how to assure the interests of citizens, and society as a whole, in making these processes more accountable. This Article argues that technology is creating new opportunities—subtler and more flexible than total transparency—to design decisionmaking algorithms so that they better align with legal and policy objectives. Doing so will improve not only the current governance of automated decisions, but also—in certain cases—the governance of decisionmaking in general. The implicit (or explicit) biases of human decisionmakers can be difficult to find and root out, but we can peer into the “brain” of an algorithm: computational processes and purpose specifications can be declared prior to use and verified afterward. The technological tools introduced in this Article apply widely. They can be used in designing decisionmaking processes from both the private and public sectors, and they can be tailored to verify different characteristics as desired by decisionmakers, regulators, or the public. By forcing a more careful consideration of the effects of decision rules, they also engender policy discussions and closer looks at legal standards. As such, these tools have far-reaching implications throughout law and society. Part I of this Article provides an accessible and concise introduction to foundational computer science techniques that can be used to verify and demonstrate compliance with key standards of legal fairness for automated decisions without revealing key attributes of the decisions or the processes by which the decisions were reached. Part II then describes how these techniques can assure that decisions are made with the key governance attribute of procedural regularity, meaning that decisions are made under an announced set of rules consistently applied in each case. We demonstrate how this approach could be used to redesign and resolve issues with the State Department’s diversity visa lottery. 
    In Part III, we go further and explore how other computational techniques can assure that automated decisions preserve fidelity to substantive legal and policy choices. We show how these tools may be used to assure that certain kinds of unjust discrimination are avoided and that automated decision processes behave in ways that comport with the social or legal standards that govern the decision. We also show how automated decisionmaking may even complicate existing doctrines of disparate treatment and disparate impact, and we discuss some recent computer science work on detecting and removing discrimination in algorithms, especially in the context of big data and machine learning. And lastly, in Part IV, we propose an agenda to further synergistic collaboration between computer science, law, and policy to advance the design of automated decision processes for accountability.
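    The procedural-regularity argument turns on committing to a decision rule before it is applied and verifying afterward that the announced rule was the one actually used. A minimal commit-and-reveal sketch in Python illustrates that idea with a hash commitment only; the fuller toolkit discussed in the Article (e.g., zero-knowledge proofs) is not implemented here, and the policy, salt, and function names are illustrative assumptions:

        import hashlib, json

        def commit(policy: dict, salt: str) -> str:
            """Digest published before any decisions are made."""
            payload = json.dumps(policy, sort_keys=True) + salt
            return hashlib.sha256(payload.encode()).hexdigest()

        def verify(policy: dict, salt: str, published_digest: str) -> bool:
            """At audit time the policy and salt are revealed; anyone can re-check the digest."""
            return commit(policy, salt) == published_digest

        # Illustrative decision rule (e.g., a lottery selection rule), committed in advance.
        policy = {"rule": "select if hash(applicant_id + seed) % 100 < 5", "seed": "draw-2016"}
        digest = commit(policy, salt="nonce kept secret until audit")

        # Later, during an audit:
        assert verify(policy, "nonce kept secret until audit", digest)

    A commitment of this kind shows that the rule did not change after the fact; it does not by itself show the rule is fair, which is where the further techniques discussed in the Article come in.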

    Integrating resource selection into spatial capture-recapture models for large carnivores

    Wildlife managers need reliable methods to estimate large carnivore densities and population trends; yet large carnivores are elusive, difficult to detect, and occur at low densities, making traditional approaches intractable. Recent advances in spatial capture-recapture (SCR) models have provided new approaches for monitoring trends in wildlife abundance, and these methods are particularly applicable to large carnivores. We applied SCR models in a Bayesian framework to estimate mountain lion densities in the Bitterroot Mountains of west-central Montana. We incorporated an existing resource selection function (RSF) as a density covariate to account for heterogeneity in habitat use across the study area and included data collected from harvested lions. We identified individuals through DNA samples collected by (1) biopsy darting mountain lions detected in systematic surveys of a study area, (2) opportunistically collecting hair and scat samples, and (3) sampling all harvested mountain lions. We included 80 DNA samples collected from 62 individuals in the analysis. Including information on predicted habitat use as a covariate on the distribution of activity centers reduced the median estimated density by 44%, the standard deviation by 7%, and the width of the 95% credible intervals by 10% as compared to standard SCR models. Within the two management units of interest, we estimated median mountain lion densities of 4.5 mountain lions/100 km² (95% CI = 2.9–7.7) and 5.2 mountain lions/100 km² (95% CI = 3.4–9.1). Including harvested individuals (dead recovery) did not create a significant bias in the detection process by introducing individuals that could not be detected after removal. However, the dead recovery component of the model did have a substantial effect on results by increasing sample size. The ability to account for heterogeneity in habitat use provides a useful extension to SCR models and will enhance the ability of wildlife managers to reliably and economically estimate density of wildlife populations, particularly large carnivores.
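    Two pieces of structure carry the result above: a detection function that decays with distance from a latent activity center, and an inhomogeneous density surface whose intensity is log-linear in the RSF value. The NumPy sketch below illustrates that structure with made-up grids and coefficients; it is not the authors' Bayesian implementation, and every parameter value is an assumption:

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical state-space: pixel centers with one RSF value per pixel.
        pixels = np.array([[x, y] for x in range(10) for y in range(10)], dtype=float)
        rsf = rng.uniform(0, 1, size=len(pixels))   # stand-in for the fitted RSF surface

        # Density covariate: expected activity-center intensity is log-linear in the RSF.
        beta0, beta1 = -2.0, 1.5                    # illustrative coefficients
        mu = np.exp(beta0 + beta1 * rsf)            # expected animals per pixel
        pr_center = mu / mu.sum()                   # probability an activity center falls in each pixel

        # Half-normal detection: probability a device at `trap` detects an individual
        # whose activity center is at `center`.
        def p_detect(center, trap, p0=0.3, sigma=1.2):
            d2 = np.sum((center - trap) ** 2)
            return p0 * np.exp(-d2 / (2 * sigma**2))

        # Example: draw one activity center from the RSF-weighted surface and
        # evaluate its detection probability at a hypothetical trap location.
        center = pixels[rng.choice(len(pixels), p=pr_center)]
        print(round(p_detect(center, trap=np.array([5.0, 5.0])), 3))

    In a full SCR fit the density and detection parameters are estimated from the capture histories rather than fixed, which is how the RSF covariate ends up shrinking the density estimate and its uncertainty.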

    Rapid radiocarbon (14C) analysis of coral and carbonate samples using a continuous-flow accelerator mass spectrometry (CFAMS) system

    Author Posting. © American Geophysical Union, 2011. This article is posted here by permission of American Geophysical Union for personal use, not for redistribution. The definitive version was published in Paleoceanography 26 (2011): PA4212, doi:10.1029/2011PA002174. Radiocarbon analyses of carbonate materials provide critical information for understanding the last glacial cycle, recent climate history, and paleoceanography. Methods that reduce the time and cost of radiocarbon (14C) analysis are highly desirable for large sample sets and reconnaissance-type studies. We have developed a method for rapid radiocarbon analysis of carbonates using a novel continuous-flow accelerator mass spectrometry (CFAMS) system. We analyzed a suite of deep-sea coral samples and compared the results with those obtained using a conventional AMS system. Measurement uncertainty is <0.02 Fm or 160 Ryr for a modern sample, and the mean background was 37,800 Ryr. Radiocarbon values were repeatable and in good agreement with those from the conventional AMS system. Sample handling and preparation are relatively simple, and the method offers a significant increase in speed and cost-effectiveness. We applied the method to coral samples from the Eastern Pacific Ocean to obtain an age distribution and identify samples for further analysis. This paper is intended to update the paleoceanographic community on the status of this new method and demonstrate its feasibility as a choice for rapid and affordable radiocarbon analysis. This work was performed under NSF Cooperative Agreement OCE‐0753487, and also NSF‐OPP awards 0636787 and 0944474.
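    The uncertainty figures quoted above follow from the standard conversion between fraction modern (Fm) and conventional radiocarbon age, age = -8033 ln(Fm), where 8033 yr is the Libby mean life. The short sketch below (a worked check, not code from the paper) confirms that a 0.02 change in Fm near a modern value corresponds to roughly 160 radiocarbon years, and that the quoted background age implies an Fm of about 0.009:

        import math

        LIBBY_MEAN_LIFE = 8033.0  # years; defines conventional radiocarbon age

        def rc_age(fm: float) -> float:
            """Conventional radiocarbon age (yr BP) from fraction modern."""
            return -LIBBY_MEAN_LIFE * math.log(fm)

        print(round(rc_age(0.98)))                           # ~162 yr for Fm = 0.98
        print(round(math.exp(-37800 / LIBBY_MEAN_LIFE), 4))  # Fm ~ 0.0091 at 37,800 yr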

    Ultrafast Spectroelectrochemistry of the Catechol/o‐Quinone Redox Couple in Aqueous Buffer Solution

    Eumelanin is a natural pigment found in many organisms that provides photoprotection from harmful UV radiation. As a redox-active biopolymer, the structure of eumelanin is thought to contain different redox states of quinone, including catechol subunits. To further explore the excited-state properties of eumelanin, we have investigated the catechol/o-quinone redox couple by spectroelectrochemical means, in a pH 7.4 aqueous buffered solution, using a boron-doped diamond mesh electrode. At pH 7.4, the two-proton, two-electron oxidation of catechol is promoted, which facilitates continuous formation of the unstable o-quinone product in solution. Ultrafast transient absorption (femtosecond to nanosecond) measurements of the o-quinone species reveal initial formation of an excited singlet state followed by triplet state formation within 24 ps. In contrast, catechol in aqueous buffer leads to formation of the semiquinone radical (Δt > 500 ps). Our results demonstrate the rich photochemistry of the catechol/o-quinone redox couple and provide further insight into the excited-state processes of these key building blocks of eumelanin.
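    The reported timescales can be read as a simple sequential scheme, S1 → T1, with an intersystem-crossing time constant of about 24 ps. The sketch below turns that statement into populations versus probe delay; the two-state rate model and all numbers other than the 24 ps figure are illustrative assumptions, not the authors' kinetic analysis:

        import numpy as np

        tau_isc = 24.0                   # ps, triplet formation time from the abstract
        t = np.linspace(0, 200, 401)     # ps, probe delays

        s1 = np.exp(-t / tau_isc)        # singlet population decays...
        t1 = 1.0 - np.exp(-t / tau_isc)  # ...as the triplet population rises

        # Triplet fraction after one time constant (~24 ps) is about 1 - 1/e = 0.63.
        print(f"{t1[np.searchsorted(t, 24.0)]:.2f}")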