13,686 research outputs found
The potential release of phosphorus in floodplains
In the Illinois River Watershed, there has been growing concern over elevated phosphorus concentrations in the water column. This study evaluated how much phosphorus floodplain soils contribute to surface waters, examining the relationship between the flux of phosphorus released and the amount of phosphorus stored in the soil. This was investigated by artificially inundating soil cores from four sites and determining the soluble reactive phosphorus concentrations of the overlying water and the levels of water-extractable and Mehlich-3-extractable phosphorus in the soil. The flux of phosphorus to the overlying water ranged from 0.43 to 6.61 mg m⁻² hr⁻¹ over the short term (16.5-hr incubation) and 0.06 to 1.26 mg m⁻² hr⁻¹ over the long term (282.5-hr incubation). Phosphorus flux to the overlying water was significantly correlated with the amount of phosphorus stored in the soil. This study showed that riparian soils with elevated phosphorus content have the potential to release phosphorus when flooded.
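Flux values of this kind are typically derived from the rise in soluble reactive phosphorus in the overlying water, scaled by water volume, core surface area, and incubation time. A minimal sketch of that calculation (the core dimensions and concentrations below are hypothetical, not values from the study):

```python
def p_flux(delta_srp_mg_per_l, water_volume_l, core_area_m2, hours):
    """Phosphorus flux to the overlying water, in mg m^-2 hr^-1.

    delta_srp_mg_per_l: rise in soluble reactive P concentration (mg/L)
    water_volume_l:     volume of overlying water (L)
    core_area_m2:       surface area of the soil core (m^2)
    hours:              incubation time (hr)
    """
    mass_released_mg = delta_srp_mg_per_l * water_volume_l
    return mass_released_mg / (core_area_m2 * hours)

# Hypothetical core: 0.05 mg/L SRP rise, 2 L of water,
# 0.008 m^2 core surface, 16.5-hr incubation
flux = p_flux(0.05, 2.0, 0.008, 16.5)  # ~0.76 mg m^-2 hr^-1
```

This illustrative value falls inside the short-term range (0.43 to 6.61 mg m⁻² hr⁻¹) reported above.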
Leptophilic Dark Matter with interactions
We consider a scenario where dark matter (DM) interacts exclusively with
Standard Model (SM) leptons at tree level. Due to the absence of tree-level
couplings to quarks, the constraints on leptophilic dark matter arising from
direct detection and hadron collider experiments are weaker than those for a
generic WIMP. We study a simple model in which interactions of DM with SM
leptons are mediated by a leptophilic boson, and determine constraints on
this scenario arising from relic density, direct detection, and other
experiments. We then determine current LHC limits and project the future
discovery reach. We show that, despite the absence of direct interactions with
quarks, this scenario can be strongly constrained.
Comment: 12 pages, 15 figures
Bayesian analysis of 210Pb dating
In many studies of environmental change of the past few centuries, 210Pb
dating is used to obtain chronologies for sedimentary sequences. One of the
most commonly used approaches to estimate the ages of depths in a sequence is
to assume a constant rate of supply (CRS) or influx of `unsupported' 210Pb from
the atmosphere, together with a constant or varying amount of `supported'
210Pb. Current 210Pb dating models do not use a proper statistical framework
and thus provide poor estimates of errors. Here we develop a new model for
210Pb dating, where both ages and values of supported and unsupported 210Pb
form part of the parameters. We apply our model to a case study from Canada as
well as to some simulated examples. Our model can extend beyond the current CRS
approach, deal with asymmetric errors and mix 210Pb with other types of dating,
thus obtaining more robust, realistic and statistically better defined
estimates.
Comment: 22 pages, 4 figures
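Under the CRS assumption, the age of a depth follows from the ratio of the total unsupported 210Pb inventory to the inventory remaining below that depth, t(x) = (1/λ) ln(A(0)/A(x)), where λ is the 210Pb decay constant. A minimal sketch of that classical point-estimate calculation (hypothetical inventories; the Bayesian model described above replaces these point estimates with full posterior distributions and error estimates):

```python
import math

# 210Pb decay constant, from a half-life of ~22.3 years
LAMBDA_PB210 = math.log(2) / 22.3

def crs_age(total_inventory, inventory_below):
    """CRS age (years) at a depth: t = (1/lambda) * ln(A(0) / A(x)).

    total_inventory:  unsupported 210Pb inventory of the whole core, A(0)
    inventory_below:  unsupported 210Pb inventory below the depth, A(x)
    (both in the same units, e.g. Bq m^-2)
    """
    return math.log(total_inventory / inventory_below) / LAMBDA_PB210

# Hypothetical inventories: surface, one half-life down, deeper still
ages = [crs_age(5000.0, ax) for ax in (5000.0, 2500.0, 1000.0)]
```

When half the inventory remains below a depth, the CRS age is exactly one half-life (22.3 yr), which is a quick sanity check on the formula.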
Nietzsche and the Analytics: A Reexamination of His Critique of Truth
The paper examines how two analytic scholars, Maudemarie Clark and Robert Lanier Anderson, overcame the major self-referential inconsistency in Nietzsche's epistemology, which analytic scholars have long avoided. The author outlines four key interpretive issues for analytics in Nietzsche scholarship, which stem from Nietzsche's writing style and from his critique of the practice of philosophy in general. The author illustrates how it is possible to overcome these particular interpretive issues by taking his concerns into consideration, and by highlighting how his concerns are compatible with analytic practices and values.
Discrimination on Wheels: How Big Data Uses License Plate Surveillance to Put the Brakes on Disadvantaged Drivers
As scholarly discourse increasingly raises concerns about the negative societal effects of “fintech,” “dirty data,” and “technochauvinism,” a growing technology provides an instructive illustration of all three of these problems. Surveillance software companies are using automated license plate reader (ALPR) technology to develop predictive analytical tools. In turn, software companies market those tools to auto financers and insurers as a risk assessment input to evaluate consumers seeking to buy a car. Proponents of this technology might argue that more information about consumer travel habits will result in more accurate and individualized risk predictions, potentially increasing vehicle ownership among marginalized groups. Expanding access to cars would go a long way toward undoing the economic suppression of many people who are low-income or of color.
However, discrimination in the consumer scoring cycle shows that ALPR-based data analytics will only exacerbate the economic and racial disparities in car ownership. Competing incentives and biased assumptions steer the choices of the humans who collect ALPR data, creating a conflict that irredeemably poisons the data and any consumer access decisions that spring from it. Moreover, using location data to assess risk means that automobile costs may be based on value judgments about the neighborhoods that consumers visit. Thus, rather than creating an equal path to economic mobility, the tainted ALPR data collection methodology reinforces discrimination. Not only that, but using the data to score consumers risks resuscitating and repackaging the practice of redlining.
This article analyzes the fintech model as represented by the use of ALPR technology in auto financing and insurance. Existing commentary surrounding ALPR has focused on ALPR's privacy and Fourth Amendment implications. While scholars and commentators have been busy examining law enforcement's engagement with this high-tech surveillance technology, powerful private actors have flown under the radar while subjecting vulnerable consumers to ALPR's exploitative commercial applications. This article deviates from prior commentary by contemplating ALPR through a consumer law lens. It exposes the ways in which consumer laws have left disadvantaged drivers unprotected. Finally, it advances a number of proposals, including removing geographic inputs from auto access decision making, developing a central base of technological expertise to audit algorithms, and banning commercial use of ALPR.
Mind the Gap: Towards the Integration of Critical Gerontology in Public Library Praxis
Aging populations challenge public libraries to adapt their materials, services and programming to maximize the wellbeing and functional capacity of older adults and enhance their social participation and security. For older adult patrons using public library spaces and services, the extent to which the public library has been able to deliver on these qualities remains unclear. In the past, libraries and library staff have been critiqued for narrowly interpreting the needs of older adults, concentrating on aging as a loss or deficit. To understand the current state of Canadian urban public library services for older adults, publicly accessible texts, documents and reports made available on five public library systems' websites were analyzed. This analysis uncovered certain gaps in adherence to key guidelines in the Canadian Library Association's Canadian Guidelines on Library and Information Services for Older Adults and revealed a lack of integration of older adults' own ideas and feedback for their programs and events. The incorporation of a critical gerontology approach throughout the analysis begins to elucidate this study's findings and calls for the questioning of current conceptualizations of older adults and the library services created for them. Public libraries are uniquely poised to engage with older adults, and the addition of a critical gerontology lens in library practice and research will aid in the refocusing of resources and policies to more responsively support older adults' evolving needs.
- …