The origins of chemical warfare in the French Army
This is an accepted manuscript of an article published by SAGE in War in History on 1/11/2013, available online: https://doi.org/10.1177/0968344513494659
The accepted version of the publication may differ from the final published version.

Following the Germans' first use of chlorine gas during the Second Battle of Ypres, the Entente had to develop means of protection from future poison gas attacks as well as systems for retaliation. This article, through the analysis of heretofore unexamined archival sources, considers early French attempts at engaging in chemical warfare. Contrary to the existing historiography, the French army aggressively adapted to, and engaged in, chemical warfare. Indeed, the French army would be the first to fire asphyxiating gas shells from field guns and, by June 1915, would pioneer the use of gas as a neutralization weapon for counter-battery fire, as opposed to unleashing gas via canisters to engage enemy infantry. Such innovation invites a rethinking not only of French gas efforts but also of the role and evolution of the French army as a whole on the Western Front, a topic which the Anglophone world is in great need of examining further.
Truncated Variance Reduction: A Unified Approach to Bayesian Optimization and Level-Set Estimation
We present a new algorithm, truncated variance reduction (TruVaR), that
treats Bayesian optimization (BO) and level-set estimation (LSE) with Gaussian
processes in a unified fashion. The algorithm greedily shrinks a sum of
truncated variances within a set of potential maximizers (BO) or unclassified
points (LSE), which is updated based on confidence bounds. TruVaR is effective
in several important settings that are typically non-trivial to incorporate
into myopic algorithms, including pointwise costs and heteroscedastic noise. We
provide a general theoretical guarantee for TruVaR covering these aspects, and
use it to recover and strengthen existing results on BO and LSE. Moreover, we
provide a new result for a setting where one can select from a number of noise
levels having associated costs. We demonstrate the effectiveness of the
algorithm on both synthetic and real-world data sets.
Comment: Accepted to NIPS 2016
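As a rough illustration of the selection rule described above, the sketch below implements one greedy truncated-variance-reduction step for the Bayesian optimization case, using a plain RBF-kernel Gaussian process. The kernel, its hyperparameters, the noise level, and the threshold parameters beta and eta are illustrative assumptions rather than the paper's settings, and the confidence-bound update of the potential-maximizer set is omitted.

```python
# Sketch of one greedy TruVaR-style selection step (BO case), assuming an
# RBF-kernel GP; hyperparameters and the (beta, eta) values are placeholders.
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2, variance=1.0):
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def posterior_variance(X_obs, X_query, noise=1e-2):
    """GP posterior variance at X_query given noisy observations at X_obs.
    The posterior variance does not depend on the observed values themselves."""
    K = rbf_kernel(X_obs, X_obs) + noise * np.eye(len(X_obs))
    k_q = rbf_kernel(X_obs, X_query)
    return np.diag(rbf_kernel(X_query, X_query)) - np.sum(k_q * np.linalg.solve(K, k_q), axis=0)

def truncated_sum(var, beta, eta):
    # Variances are floored at eta^2/beta: points already resolved to within
    # eta contribute no further reduction to the objective being shrunk.
    return np.sum(np.maximum(beta * var, eta**2))

def truvar_step(X_obs, M, candidates, beta=4.0, eta=0.05, noise=1e-2):
    """Return the candidate whose hypothetical observation most reduces the
    truncated-variance sum over the set of potential maximizers M."""
    base = truncated_sum(posterior_variance(X_obs, M, noise), beta, eta)
    gains = []
    for x in candidates:
        X_hyp = np.vstack([X_obs, x[None, :]])
        gains.append(base - truncated_sum(posterior_variance(X_hyp, M, noise), beta, eta))
    return candidates[int(np.argmax(gains))]  # divide gains by per-point costs if they differ

# Toy usage on a 1-D domain discretised to 50 points.
rng = np.random.default_rng(0)
domain = np.linspace(0.0, 1.0, 50)[:, None]
X_obs = domain[rng.choice(50, size=3, replace=False)]  # a few initial observations
x_next = truvar_step(X_obs, M=domain, candidates=domain)
```

In a full implementation the set M would then be pruned using upper and lower confidence bounds, and the pointwise costs and noise levels mentioned in the abstract would enter the selection as a cost-normalized gain.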
Evaluating Singleplayer and Multiplayer in Human Computation Games
Human computation games (HCGs) can provide novel solutions to intractable
computational problems, help enable scientific breakthroughs, and provide
datasets for artificial intelligence. However, our knowledge about how to
design and deploy HCGs that appeal to players and solve problems effectively is
incomplete. We present an investigatory HCG based on Super Mario Bros. We used
this game in a human subjects study to investigate how different social
conditions---singleplayer and multiplayer---and scoring
mechanics---collaborative and competitive---affect players' subjective
experiences, accuracy at the task, and the completion rate. In doing so, we
demonstrate a novel design approach for HCGs, and discuss the benefits and
tradeoffs of these mechanics in HCG design.
Comment: 10 pages, 4 figures, 2 tables
Time evolution of intrinsic alignments of galaxies
Intrinsic alignments (IA), correlations between the intrinsic shapes and
orientations of galaxies on the sky, are both a significant systematic in weak
lensing and a probe of the effect of large-scale structure on galactic
structure and angular momentum. In the era of precision cosmology, it is thus
especially important to model IA with high accuracy. Efforts to use
cosmological perturbation theory to model the dependence of IA on the
large-scale structure have thus far been relatively successful; however, extant
models do not consistently account for time evolution. In particular, advection
of galaxies due to peculiar velocities alters the impact of IA, because galaxy
positions when observed are generally different from their positions at the
epoch when IA is believed to be set. In this work, we evolve galaxy IA from
the time of galaxy formation to the time at which the galaxies are observed, including
the effects of this advection, and show how this process naturally leads to a
dependence of IA on the velocity shear. We calculate the galaxy-galaxy-IA
bispectrum to tree level (in the linear matter density) in terms of the evolved
IA coefficients. We then discuss the implications for weak lensing systematics
as well as for studies of galaxy formation and evolution. We find that
considering advection introduces nonlocality into the bispectrum, and that the
degree of nonlocality represents the memory of a galaxy's path from the time of
its formation to the time of observation. We discuss how this result can be
used to constrain the redshift at which IA is determined and provide Fisher
estimation for the relevant measurements using the example of SDSS-BOSS.
Comment: 30 pages, 5 figures, 2 tables
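To make the advection step concrete, a minimal sketch under the standard linear (tidal) alignment ansatz is given below; the proportionality coefficient, the formation redshift z_*, and the notation are illustrative and need not match the paper's conventions.

```latex
% Shape set at a formation redshift z_* by the tidal field (linear alignment ansatz):
\gamma^{I}_{ij}(\mathbf{q}, z_*) \propto
  \left(\partial_i \partial_j \nabla^{-2} - \tfrac{1}{3}\,\delta^{K}_{ij}\right)
  \delta_m(\mathbf{q}, z_*)
% Galaxies are then advected from the initial position q to the observed
% position x by the displacement \psi sourced by peculiar velocities:
\mathbf{x} = \mathbf{q} + \boldsymbol{\psi}(\mathbf{q}, z_{\rm obs}),
\qquad
\gamma^{I}_{ij}(\mathbf{x}, z_{\rm obs})
  \simeq \gamma^{I}_{ij}(\mathbf{x}, z_*)
  - \psi^{k}(\mathbf{x})\,\partial_k \gamma^{I}_{ij}(\mathbf{x}, z_*) + \dots
```

In this sketch it is the displacement term that couples the observed IA field to the velocity shear, which is how the advection described in the abstract introduces nonlocal, path-dependent contributions to the tree-level galaxy-galaxy-IA bispectrum.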
Fine-Grained Car Detection for Visual Census Estimation
Targeted socioeconomic policies require an accurate understanding of a
country's demographic makeup. To that end, the United States spends more than 1
billion dollars a year gathering census data such as race, gender, education,
occupation and unemployment rates. Compared to the traditional method of
collecting surveys across many years, which is costly and labor intensive,
data-driven, machine-learning approaches are cheaper and faster, with
the potential to detect trends in close to real time. In this work, we
leverage the ubiquity of Google Street View images and develop a computer
vision pipeline to predict income, per capita carbon emissions, crime rates and
other city attributes from a single source of publicly available visual data.
We first detect cars in 50 million images across 200 of the largest US cities
and train a model to predict demographic attributes using the detected cars. To
facilitate our work, we have collected the largest and most challenging
fine-grained dataset reported to date, consisting of over 2,600 classes of cars
and comprising images from Google Street View and other web sources, classified
by car experts to account for even the most subtle visual differences. We
use this data to construct the largest scale fine-grained detection system
reported to date. Our prediction results correlate well with ground-truth
income data (r = 0.82), Massachusetts vehicle registration records, and
sources investigating crime rates, income segregation, per capita carbon
emissions, and other market research. Finally, we learn interesting
relationships between cars and neighborhoods allowing us to perform the first
large-scale sociological analysis of cities using computer vision techniques.
Comment: AAAI 2017
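A minimal sketch of how per-city car detections might be turned into a demographic prediction is shown below. The feature construction (a histogram over the roughly 2,600 fine-grained classes plus a mean price), the ridge regressor, and all field names are illustrative assumptions, not the paper's actual pipeline, which relies on a deep fine-grained detection model.

```python
# Illustrative second stage: aggregate fine-grained car detections per city
# and regress a demographic attribute (e.g. median income) on them.
# All field names and the Ridge model are assumptions, not the paper's code.
import numpy as np
from sklearn.linear_model import Ridge

N_CLASSES = 2600  # roughly the number of fine-grained car classes in the dataset

def city_feature_vector(detections):
    """detections: list of dicts with hypothetical keys 'class_id' and 'price'."""
    counts = np.bincount([d["class_id"] for d in detections], minlength=N_CLASSES)
    mean_price = np.mean([d["price"] for d in detections]) if detections else 0.0
    return np.concatenate([counts / max(len(detections), 1), [mean_price]])

def fit_attribute_model(detections_per_city, attribute_values):
    """Fit a linear model from per-city car features to a demographic attribute."""
    X = np.stack([city_feature_vector(d) for d in detections_per_city])
    y = np.asarray(attribute_values, dtype=float)
    model = Ridge(alpha=1.0).fit(X, y)
    r = np.corrcoef(model.predict(X), y)[0, 1]  # in-sample correlation with ground truth
    return model, r
```

Held-out evaluation (e.g. leave-one-city-out) would of course be needed to report a correlation comparable to the r = 0.82 quoted above.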
Using Deep Learning and Google Street View to Estimate the Demographic Makeup of the US
The United States spends more than $1B each year on initiatives such as the
American Community Survey (ACS), a labor-intensive door-to-door study that
measures statistics relating to race, gender, education, occupation,
unemployment, and other demographic factors. Although a comprehensive source of
data, the lag between demographic changes and their appearance in the ACS can
exceed half a decade. As digital imagery becomes ubiquitous and machine vision
techniques improve, automated data analysis may provide a cheaper and faster
alternative. Here, we present a method that determines socioeconomic trends
from 50 million images of street scenes, gathered in 200 American cities by
Google Street View cars. Using deep learning-based computer vision techniques,
we determined the make, model, and year of all motor vehicles encountered in
particular neighborhoods. Data from this census of motor vehicles, which
enumerated 22M automobiles in total (8% of all automobiles in the US), was used
to accurately estimate income, race, education, and voting patterns, with
single-precinct resolution. (The average US precinct contains approximately
1000 people.) The resulting associations are surprisingly simple and powerful.
For instance, if the number of sedans encountered during a 15-minute drive
through a city is higher than the number of pickup trucks, the city is likely
to vote for a Democrat during the next Presidential election (88% chance);
otherwise, it is likely to vote Republican (82%). Our results suggest that
automated systems for monitoring demographic trends may effectively complement
labor-intensive approaches, with the potential to detect trends with fine
spatial resolution, in close to real time.
Comment: 41 pages including supplementary material. Under review at PNAS
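The sedan-versus-pickup association quoted above can be written as a one-line decision rule; the helper below is a toy restatement of the two reported percentages, not part of the study's method.

```python
# Toy restatement of the reported association: the 0.88 and 0.82 figures are
# the probabilities quoted in the abstract; the function itself is hypothetical.
def p_democratic(n_sedans: int, n_pickups: int) -> float:
    """Estimated probability that a city votes Democratic in the next
    presidential election, given car counts from a 15-minute drive."""
    if n_sedans > n_pickups:
        return 0.88           # more sedans than pickups
    return 1.0 - 0.82         # otherwise, 82% chance of voting Republican

print(p_democratic(30, 12))   # -> 0.88
```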
Rebellion and resistance in French Indochina in the First World War
The First World War was not merely a clash of empires; it was also a clash within empires. This fact remains largely ignored despite the dozens of anticolonial uprisings around the world which erupted during, and as a result of, the war. In 1916 alone there were uprisings across French North, West and Equatorial Africa, in Portuguese Angola and Mozambique, the Middle East, Central Asia, Southeast Asia and Ireland. Most of these uprisings responded to European efforts to extract resources (especially manpower) from the colonies to support the war effort, while also taking advantage of the reduced presence of European troops in Asia and Africa as men were recalled from the colonies to take part in the war in Europe. This article examines anticolonial rebellions in French Indochina, especially the attack on Saigon Central Prison in 1916, as a case study in the wider global history of anticolonial rebellion during the First World War. Examination of this rebellion shows how the First World War not only generated the opportunities and challenges which led to a surge of anticolonial uprisings around the world, but also changed the political, social and religious character of anticolonial struggle in Indochina. This article offers a reappraisal of the global and imperial consequences of the First World War, and argues that anticolonialism should be more central in our discussion and memory of the conflict.