Near-infrared identification of the counterpart to X1908+075: a new OB-supergiant X-ray binary
We report the near-infrared (IR) identification of the likely counterpart to
X1908+075, a highly-absorbed Galactic X-ray source recently suspected to belong
to the rare class of OB supergiant-neutron star binary systems. Our JHKs-band
imaging of the field reveals the existence within the X-ray error boxes of a
near-IR source consistent with an early-type star lying at d=7 kpc and
suffering A(V)=16 mag of extinction, the latter value being in good agreement
with the hydrogen column density derived from a modelling of the X-ray
spectrum. Our follow-up, near-IR spectroscopic observations confirm the nature
of this candidate and lead to a late O-type supergiant classification, thereby
supporting the identification of a new Galactic OB-supergiant X-ray binary.
Comment: Accepted for publication in MNRAS, 7 pages, 3 figures.
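As a rough consistency check (not computed in the abstract itself), a standard Galactic gas-to-dust relation, N_H ≈ 1.8 × 10²¹ cm⁻² mag⁻¹ × A(V), converts the quoted extinction into an equivalent hydrogen column density. The conversion factor is an assumed literature value, not a number from this paper:

```python
# Sketch: convert the quoted extinction A(V) = 16 mag into a hydrogen
# column density via the commonly used Galactic gas-to-dust relation
# N_H ~ 1.8e21 cm^-2 mag^-1 * A_V (assumed standard value, not from
# the abstract).
A_V = 16.0           # visual extinction, magnitudes
N_H_PER_AV = 1.8e21  # cm^-2 per magnitude of A(V)

N_H = N_H_PER_AV * A_V
print(f"N_H ~ {N_H:.2e} cm^-2")  # → N_H ~ 2.88e+22 cm^-2
```

A column of a few 10²² cm⁻² is consistent with the "highly-absorbed" description of the X-ray source.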
Asynchrony in image analysis: using the luminance-to-response-latency relationship to improve segmentation
We deal with the problem of segmenting static images, a procedure known to be difficult in the case of very
noisy patterns. The proposed approach rests on the transformation of a static image into a data flow in which
the first image points to be processed are the brighter ones. This solution, inspired by human perception, in
which strong luminances elicit reactions from the visual system before weaker ones, has led to the notion of
asynchronous processing. The asynchronous processing of image points has required the design of a specific
architecture that exploits time differences in the processing of information. The results obtained when very
noisy images are segmented demonstrate the strengths of this architecture; they also suggest extensions of
the approach to other computer vision problems.
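The core transformation described above, a static image turned into a data flow in which brighter points are emitted first, can be sketched minimally as follows. This is an illustrative re-implementation of the ordering idea only, not the authors' architecture:

```python
# Minimal sketch (not the paper's architecture): emit pixels of a static
# image as a data flow ordered by luminance, so that strong luminances
# "respond" before weak ones, mimicking the luminance-to-latency
# relationship of human vision described in the abstract.
def luminance_ordered_flow(image):
    """Yield (row, col, luminance) triples, brightest pixels first.

    `image` is a list of lists of luminance values (0-255).
    """
    pixels = [(lum, r, c)
              for r, row in enumerate(image)
              for c, lum in enumerate(row)]
    # Brighter pixels get shorter "latency", so they are emitted first.
    for lum, r, c in sorted(pixels, reverse=True):
        yield r, c, lum

image = [[10, 200],
         [255, 40]]
print(list(luminance_ordered_flow(image)))
# → [(1, 0, 255), (0, 1, 200), (1, 1, 40), (0, 0, 10)]
```

A segmentation stage consuming this flow would then process the most reliable (brightest) evidence before the noise-dominated dark pixels.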
Racial Beliefs, Location and the Causes of Crime
This paper provides a unified explanation for why blacks commit more crime, are located in poorer neighborhoods and receive lower wages than whites. If everybody believes that blacks are more criminal than whites - even if there is no basis for this - then blacks are offered lower wages and, as a result, locate further away from jobs. Distant residence further widens the black-white wage gap because of greater tiredness and higher commuting costs. Blacks thus have a lower opportunity cost of committing crime and indeed become more criminal than whites. Beliefs are therefore self-fulfilling.
Keywords: Self-Fulfilling Prejudices; Urban Black Ghettos; Crime
Organized Crime, Corruption and Punishment
We analyze an oligopoly model in which differentiated criminal organizations globally compete on criminal activities and engage in local corruption to avoid punishment. When law enforcers are sufficiently well-paid, difficult to bribe, and corruption detection is highly probable, we show that increasing policing or sanctions effectively deters crime. However, when bribing costs are low, that is, badly-paid and dishonest law enforcers work in a weak governance environment, and the rents from criminal activity relative to legal activity are sufficiently high, we find that increasing policing and sanctions can generate higher crime rates. In particular, the relationship between the traditional instruments of deterrence, namely intensification of policing and increment of sanctions, and crime is nonmonotonic. Beyond a threshold, increases in expected punishment induce organized crime to resort to corruption, and the ensuing impunity leads to higher rather than lower crime.
Keywords: Deterrence; Organized Crime; Corruption; Oligopoly; Free Entry
Graphic Symbol Recognition using Graph Based Signature and Bayesian Network Classifier
We present a new approach for recognition of complex graphic symbols in
technical documents. Graphic symbol recognition is a well known challenge in
the field of document image analysis and is at heart of most graphic
recognition systems. Our method uses a structural approach for symbol
representation and a statistical classifier for symbol recognition. In our system
we represent symbols by their graph based signatures: a graphic symbol is
vectorized and is converted to an attributed relational graph, which is used
for computing a feature vector for the symbol. This signature corresponds to
geometry and topology of the symbol. We learn a Bayesian network to encode
joint probability distribution of symbol signatures and use it in a supervised
learning scenario for graphic symbol recognition. We have evaluated our method
on synthetically deformed and degraded images of pre-segmented 2D architectural
and electronic symbols from GREC databases and have obtained encouraging
recognition rates.
Comment: 5 pages, 8 figures, Tenth International Conference on Document Analysis and Recognition (ICDAR), IEEE Computer Society, 2009, volume 10, 1325-132
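The pipeline described above, vectorized symbol → attributed relational graph → fixed-length signature vector, can be sketched with toy statistics. The particular node attributes and signature components below are illustrative choices, not the paper's exact signature definition:

```python
# Hedged sketch of the signature step described above: a symbol is an
# attributed relational graph (nodes = vectorized primitives, edges =
# spatial relations between them), reduced to a fixed-length feature
# vector capturing coarse geometry and topology. The attributes and
# statistics here are illustrative, not the paper's exact definition.
def graph_signature(nodes, edges, max_degree=4):
    """nodes: list of (length, angle) primitive attributes.
    edges: list of (i, j) index pairs of related primitives."""
    degree = [0] * len(nodes)
    for i, j in edges:
        degree[i] += 1
        degree[j] += 1
    # Degree histogram encodes topology (how primitives interconnect).
    hist = [0] * (max_degree + 1)
    for d in degree:
        hist[min(d, max_degree)] += 1
    # Mean primitive length/angle encode coarse geometry.
    mean_len = sum(n[0] for n in nodes) / len(nodes)
    mean_ang = sum(n[1] for n in nodes) / len(nodes)
    return [len(nodes), len(edges), mean_len, mean_ang] + hist

# A toy "square" symbol: four unit segments joined at four corners.
nodes = [(1.0, 0), (1.0, 90), (1.0, 0), (1.0, 90)]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(graph_signature(nodes, edges))
# → [4, 4, 1.0, 45.0, 0, 0, 4, 0, 0]
```

Such vectors would then serve as the observed variables whose joint distribution the Bayesian network learns for classification.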
Organised crime, corruption and punishment
We analyse an oligopoly model in which differentiated criminal organisations globally compete on criminal activities and engage in local corruption to avoid punishment. When law enforcers are sufficiently well-paid, difficult to bribe, and corruption detection is highly probable, we show that increasing policing or sanctions effectively deters crime. However, when bribing costs are low, that is, badly-paid and dishonest law enforcers work in a weak governance environment, and the rents from criminal activity relative to legal activity are sufficiently high, we find that increasing policing and sanctions can generate higher crime rates. In particular, the relationship between the traditional instruments of deterrence, namely intensification of policing and sanctions, and the crime rate is nonmonotonic. Beyond a threshold, further increases in intended expected punishment create incentives for organised crime to extend corruption rings, and the ensuing impunity results in a fall in actual expected punishment that yields more rather than less crime.
Keywords: intended deterrence, organised crime, weak governance, corruption
Analysis and Diversion of Duqu's Driver
The propagation techniques and the payload of Duqu have been closely studied
over the past year and it has been said that Duqu shared functionalities with
Stuxnet. We focused on the driver used by Duqu during the infection; our
contribution consists in reverse-engineering the driver: we rebuilt its source
code and analyzed the mechanisms it uses to execute the payload while avoiding
detection. Then we diverted the driver into a defensive version capable of
detecting injections in Windows binaries, thus preventing further attacks. We
specifically show how Duqu's modified driver would have detected Duqu.
Comment: Malware 2013 - 8th International Conference on Malicious and Unwanted Software (2013).
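The injection-detection idea above can be illustrated, very loosely, in user space: a minimal sketch (assuming a precomputed allowlist of trusted hashes; this is not Duqu's actual kernel-mode mechanism, just the underlying integrity-checking concept) that flags a binary whose on-disk bytes no longer match their known-good hash:

```python
# Conceptual sketch only: the diverted driver described above detects
# code injection in Windows binaries from kernel space. Here, a crude
# user-space analogue compares a binary's hash against a known-good
# allowlist; paths and hashes are hypothetical.
import hashlib

def file_sha256(path):
    """Return the SHA-256 hex digest of the file at `path`."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_tampered(path, known_good):
    """True if the file's hash differs from its allowlisted value."""
    return file_sha256(path) != known_good.get(path)
```

A real in-memory injection detector must compare loaded code pages against the on-disk image rather than hash files, which is why the authors work at the driver level.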
Experiments in Clustering Homogeneous XML Documents to Validate an Existing Typology
This paper presents some experiments in clustering homogeneous XML documents
to validate an existing classification or, more generally, an organisational
structure. Our approach integrates techniques for extracting knowledge from
documents with unsupervised classification (clustering) of documents. We focus
on the feature selection used for representing documents and its impact on the
emerging classification. We mix the selection of structured features with fine
textual selection based on syntactic characteristics. We illustrate and evaluate
this approach with a collection of Inria activity reports for the year 2003.
The objective is to cluster projects into larger groups (Themes), based on the
keywords or different chapters of these activity reports. We then compare the
results of clustering using different feature selections, with the official
theme structure used by Inria.
Comment: (postprint); This version corrects a couple of errors in authors' names in the bibliography.
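The feature-selection step emphasised above, mixing structured features (selected XML elements) with textual features from chosen chapters, can be sketched as follows. The element names are hypothetical, not Inria's actual activity-report schema:

```python
# Hedged sketch of the feature-extraction step described above: only
# selected XML elements contribute words to a document's bag-of-words
# representation, mixing structured selection (which elements) with
# textual content. Element names below are hypothetical, not the real
# Inria report schema.
import xml.etree.ElementTree as ET

def extract_features(xml_text, text_elements=("keywords", "overview")):
    """Return a bag-of-words dict built from the selected XML elements."""
    root = ET.fromstring(xml_text)
    counts = {}
    for tag in text_elements:
        for elem in root.iter(tag):
            for word in (elem.text or "").lower().split():
                counts[word] = counts.get(word, 0) + 1
    return counts

report = """<report>
  <keywords>clustering xml clustering</keywords>
  <overview>We cluster documents.</overview>
  <budget>123456</budget>
</report>"""
print(extract_features(report))
# → {'clustering': 2, 'xml': 1, 'we': 1, 'cluster': 1, 'documents.': 1}
```

Note that the `<budget>` element contributes nothing: varying `text_elements` is exactly the kind of feature-selection choice whose impact on the emerging clusters the paper evaluates.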