
    On-road traffic emissions of polycyclic aromatic hydrocarbons and their oxy- and nitro- derivative compounds measured in a road tunnel environment

    Vehicular emissions are a key source of polycyclic aromatic compounds (PACs), including polycyclic aromatic hydrocarbons (PAHs) and their oxygenated (OPAH) and nitrated (NPAH) derivatives, in the urban environment. Road tunnels are a useful environment for the characterisation of on-road vehicular emissions, providing a realistic traffic fleet together with an absence of direct sunlight, chemical reactivity and non-traffic sources. In the present investigation the concentrations of selected PAHs, OPAHs and NPAHs have been measured in the Parc des Princes Tunnel in Paris (PdPT, France), and at the Queensway Road Tunnel (QT) and an urban background site in Birmingham (U.K.). A higher proportion of semi-volatile (3–4 ring) PAH, OPAH and NPAH compounds is associated with the particulate phase compared with samples from the ambient environment. A large (~85%) decline in total PAH concentrations is observed between the 1992 and 2012 measurements in QT. This is attributed primarily to the introduction of catalytic converters in the U.K. as well as to increasingly stringent EU vehicle emissions legislation. In contrast, NPAH concentrations measured in 2012 are similar to those measured in 1996. This observation, together with an increased proportion of (Phe+Flt+Pyr) in the observed PAH burden in the tunnel, is attributed to the increased number of diesel passenger vehicles in the U.K. over this period. Except for the OPAHs, comparable PAH and NPAH concentrations are observed in both investigated tunnels (QT and PdPT). Significant differences are found for specific substances in the PAC chemical profiles, in relation to differences in the national traffic fleets (33% diesel passenger cars in the U.K. vs. 69% in France, and up to 80% when all vehicle categories are taken into account). The dominant, essentially sole contribution of 1-Nitropyrene observed in the PdPT NPAH profile strengthens the case for using this compound as a diesel exhaust marker in PM source apportionment studies.
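    The quantities compared above are simple ratios of summed concentrations. As an illustration only (the concentration values below are hypothetical placeholders, not data from the study), the ~85% decline and the (Phe+Flt+Pyr) share of the PAH burden would be computed along these lines:

```python
# Illustration only: how the reported quantities are defined.
# The numbers are hypothetical placeholders, not measurements from the study.
pah_total_1992 = 1000.0   # hypothetical total PAH concentration in QT, ng/m^3
pah_total_2012 = 150.0    # hypothetical total PAH concentration in QT, ng/m^3

decline = (pah_total_1992 - pah_total_2012) / pah_total_1992
print(f"Decline in total PAH: {decline:.0%}")          # ~85% reported in the study

# Hypothetical 2012 speciation (ng/m^3): phenanthrene, fluoranthene, pyrene, rest.
pah_2012 = {"Phe": 40.0, "Flt": 25.0, "Pyr": 30.0, "other PAHs": 55.0}
share = (pah_2012["Phe"] + pah_2012["Flt"] + pah_2012["Pyr"]) / sum(pah_2012.values())
print(f"(Phe+Flt+Pyr) share of the PAH burden: {share:.0%}")
```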

    Adversarially Learned Abnormal Trajectory Classifier

    We address the problem of abnormal event detection from trajectory data. In this paper, a new adversarial approach is proposed for building a deep neural network binary classifier, trained in an unsupervised fashion, that can distinguish normal from abnormal trajectory-based events without the need to set a manual detection threshold. Inspired by the generative adversarial network (GAN) framework, our GAN variant is a discriminative one in which the discriminator is trained to distinguish the normal and abnormal trajectory reconstruction errors given by a deep autoencoder. On urban traffic videos and their associated trajectories, our proposed method gives the best accuracy for abnormal trajectory detection. In addition, our model can easily be generalized to abnormal trajectory-based event detection and still yields the best behavioural detection results, as demonstrated on the CAVIAR dataset. Comment: Accepted for the 16th Conference on Computer and Robot Vision (CRV) 2019.
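    As a rough illustration of the idea described above (an autoencoder trained on normal trajectories, with a discriminator operating on its reconstruction errors rather than a manual threshold), here is a minimal PyTorch sketch; it is not the authors' architecture, it omits the adversarial training procedure, and all sizes, names and the training loop are assumptions.

```python
# Minimal sketch, not the paper's implementation: an autoencoder learns to
# reconstruct normal trajectories; a small discriminator then classifies a
# trajectory from its per-point reconstruction-error profile, replacing a
# manually set threshold. The adversarial training of the discriminator is omitted.
import torch
import torch.nn as nn

TRAJ_LEN = 32                 # assumed: trajectories resampled to 32 (x, y) points
FEAT = TRAJ_LEN * 2

class AutoEncoder(nn.Module):
    def __init__(self, latent=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(FEAT, 64), nn.ReLU(), nn.Linear(64, latent))
        self.dec = nn.Sequential(nn.Linear(latent, 64), nn.ReLU(), nn.Linear(64, FEAT))
    def forward(self, x):
        return self.dec(self.enc(x))

class ErrorDiscriminator(nn.Module):
    """Maps a per-point reconstruction-error vector to an abnormality logit."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(TRAJ_LEN, 32), nn.ReLU(), nn.Linear(32, 1))
    def forward(self, err):
        return self.net(err)

def per_point_error(ae, traj):
    """Squared reconstruction error per trajectory point, shape (batch, TRAJ_LEN)."""
    rec = ae(traj)
    return ((rec - traj) ** 2).view(-1, TRAJ_LEN, 2).sum(dim=2)

# Usage with random stand-in data (real inputs: normalised object trajectories).
ae, disc = AutoEncoder(), ErrorDiscriminator()
normal = torch.randn(128, FEAT)
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
for _ in range(100):                          # train the autoencoder on normal data only
    opt.zero_grad()
    nn.functional.mse_loss(ae(normal), normal).backward()
    opt.step()
abnormality = torch.sigmoid(disc(per_point_error(ae, normal).detach()))
```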

    Theoretical and observational constraints on the H i intensity power spectrum

    Mapping of the neutral hydrogen (H i) 21-cm intensity fluctuations across redshifts promises a novel and powerful probe of cosmology. The neutral hydrogen gas mass density Ω_HI and bias parameter b_HI are key astrophysical inputs to the H i intensity fluctuation power spectrum. We compile the latest theoretical and observational constraints on Ω_HI and b_HI at various redshifts in the post-reionization universe. Constraints are incorporated from galaxy surveys, H i intensity mapping experiments, damped Lyman-α system observations, theoretical prescriptions for assigning H i to dark matter haloes, and the results of numerical simulations. Using a minimum variance interpolation scheme, we obtain the predicted uncertainties on the H i intensity fluctuation power spectrum across redshifts 0–3.5 for three different confidence scenarios. We provide a convenient tabular form for the interpolated values of Ω_HI, b_HI and the H i power spectrum amplitude, together with their uncertainties. We discuss the consequences for the measurement of the power spectrum by current and future intensity mapping experiments.
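    For context, the role of these two parameters can be written schematically (assuming linear theory in real space and neglecting shot noise; the exact conventions of the paper may differ):

```latex
% Schematic amplitude of the HI intensity power spectrum; not necessarily the
% paper's exact convention.
\[
  P_{\rm HI}(k, z) \;=\; \bar{T}_b^{\,2}(z)\, b_{\rm HI}^{2}(z)\, P_m(k, z),
  \qquad
  \bar{T}_b(z) \;\propto\; \Omega_{\rm HI}(z)\,\frac{(1+z)^{2}}{E(z)},
\]
% so the overall amplitude is controlled by the combination Omega_HI * b_HI,
% which is why constraints on both quantities are compiled across redshift.
```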

    Deep sclerectomy with the Ex-PRESS X-200 implant for the surgical treatment of glaucoma

    The efficacy and safety of a newly designed Ex-PRESS X-200 drainage device for the surgical treatment of glaucoma were evaluated. A clinical, prospective, monocentric, non-randomised, unmasked study of patients with medically uncontrolled glaucoma was performed. A superficial scleral flap was created, a posterior deep sclerectomy (DS) was dissected without opening Schlemm's canal, and an Ex-PRESS X-200 device was inserted under the scleral flap into the anterior chamber to drain aqueous humour into the intrascleral space. Biomicroscopy, best-corrected visual acuity (BCVA), applanation intra-ocular pressure (IOP) measurements and fundus examination were performed before surgery, on the first day, at the first week, and 1, 2, 3, 6, 12 and 18 months after surgery. The mean follow-up was 18.6 ± 2.4 months (mean ± SD) for the 26 eyes treated with the Ex-PRESS X-200 device. Pre-operatively, the mean BCVA was 0.6 ± 0.3, the mean IOP was 22.0 ± 5.1 mmHg, and the mean number of medications per patient was 2.8 ± 0.8. Eighteen months after surgery, the mean BCVA was 0.5 ± 0.4, the mean IOP was reduced to 12.0 ± 3.9 mmHg, and the mean number of medications per patient was 0.6 ± 1.2. Eighty-five percent of patients achieved an IOP < 18 mmHg with or without medication, and 69% without medication. Post-operative complications were hyphaema (15%), Seidel (15%), encysted blebs (54%) and bleb fibrosis (8%). Mitomycin C (MMC) was administered to 15 patients (58%), with needling performed on 10 (38%) of these patients. Mid-term results of DS with the Ex-PRESS X-200 implant demonstrate its efficacy in controlling IOP, with few post-operative complications, in difficult eyes at increased risk of surgical failure.

    The development of economic science in Japan in the second half of the nineteenth century and the first Japanese works in economic history

    In French historiography, research on the development of Japan's economic power is gradually growing, yet the emergence of economics as a scientific field in modern Japan has received even less attention. The present work addresses this latter question, with particular attention to the development of the German-inspired historical method, which was the dominant approach in economics worldwide during the second half of the 19th century. Against the simplistic theory of a blind, unquestioning Japanese conformity to Western models, we show that Japanese works quickly, if not immediately, displayed a quality fully comparable to what was being produced in the "West" at the same time, placing themselves at the cutting edge rather than lagging behind. We first stress the intellectual maturity that allowed the assimilation in Japan of the problematic of Western "political economy": as the wealth of translations and teaching on the subject shows, Japanese scholars embraced the full diversity of Western approaches, from the classical liberalism inspired by France and Great Britain to the historical method inspired by Germany. We then present the first three Japanese works applying the historical method, written between 1889 and 1900, and show their rapid improvement in quality, culminating in the remarkable work of Fukuda Tokuzō, which placed itself at the forefront of a modern science in constant renewal.

    The contribution of remote sensing to a snow model applied to a dam management support system in southern Québec

    The Centre d'expertise hydrique du Québec (CEHQ) operates a distributed hydrological model (MOHYSE), which integrates a snow model (SPH-AV), for the management of dams in southern Québec. The estimation of the quantity of water released by snowmelt in spring remains a variable with large uncertainty. This research aims to evaluate the potential of remote sensing data for characterising the snowpack and, ultimately, to develop methods for integrating satellite data into the snow model to improve the simulation of spring floods. Remote sensing snow cover area (SCA) products (MODIS_SCN and IMS) are compared with snow depth surveys at Environment Canada stations and with the initial simulations of the models. Through these comparisons, an effective method for integrating the remote sensing SCA products (seuil_ÉEN), based on the hypothesis that satellites cannot identify small amounts of snow because the snowpack becomes "dirty" and discontinuous, was developed. Relative to streamflow simulated without remote sensing, the approach improves the Nash coefficient by 0.11 and the root mean square error by 21% for springs 2004 to 2007 on the optimisation watershed (du Nord), and by 0.13 and 22% on the verification watershed (aux Écorces). The method also improves the identification of flood peaks by as much as 36% on the du Nord watershed and 19% on the aux Écorces watershed. The study also shows the potential of QSCAT data for characterising the snow cover. Overall accuracies of around 90% are obtained for the detection of melt during the month of April from 2001 to 2007 on both studied watersheds. The relation between the rise of the backscatter coefficient and the snow depth surveys shows good correlation for the years 2004 to 2006 at the Lachute and St-Jérôme stations (0.64 to 0.93), but weaker results at the St-Hippolyte station (0.29 to 0.73). QSCAT products considering only the descending orbit give the best results. The integration of a remote sensing albedo product did not improve the simulations because of gaps in the time series caused by cloud cover, and the relation between fractional snow cover and snow depth did not give useful results in an operational context. The study shows the interest of creating new, more precise remote sensing SCA products for the studied region. Future work should also evaluate the possibility of adapting the seuil_ÉEN method to a Kalman filter approach. A more spatially extensive study and a better understanding of the microwave backscatter response of the different surface elements might eventually yield useful results with QSCAT data.
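    For reference, the Nash coefficient quoted above is commonly computed as the Nash–Sutcliffe efficiency; a minimal sketch (assuming the standard definition, not taken from the thesis) is:

```python
# Minimal sketch of the Nash-Sutcliffe efficiency (NSE) used to score streamflow
# simulations; the exact formulation used in the study is assumed to be the standard one.
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Example with illustrative (made-up) daily spring streamflows, in m^3/s.
obs = [12.0, 18.5, 40.2, 55.0, 38.7, 22.1]
sim = [10.5, 20.0, 35.8, 50.3, 41.0, 25.0]
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```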

    Predicting Next Local Appearance for Video Anomaly Detection

    We present a local anomaly detection method for videos. As opposed to most existing methods, which are computationally expensive and do not generalize well across different video scenes, we propose an adversarial framework that learns temporal local appearance variations by predicting the appearance of a normally behaving object in the next frame of a scene, relying only on its current and past appearances. In the presence of an abnormally behaving object, the reconstruction error between the real and the predicted next appearance of that object indicates the likelihood of an anomaly. Our method is competitive with the existing state of the art while being significantly faster for both training and inference and better at generalizing to unseen video scenes. Comment: Accepted as an oral presentation at MVA 2021.
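    A minimal, hypothetical sketch of the scoring idea described above (predict the next appearance from recent ones and use the prediction error as the anomaly score); the architecture, feature sizes and names are assumptions, not the authors' model:

```python
# Minimal sketch, not the paper's model: predict an object's next-frame appearance
# features from its recent appearances; the prediction error is the anomaly score.
import torch
import torch.nn as nn

HIST, FEAT = 4, 128   # assumed: 4 past appearance descriptors of size 128 per object

class NextAppearancePredictor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(HIST * FEAT, 256), nn.ReLU(),
            nn.Linear(256, FEAT),
        )
    def forward(self, past):                 # past: (batch, HIST, FEAT)
        return self.net(past.flatten(1))

def anomaly_score(model, past, actual_next):
    """Per-object squared prediction error; larger values suggest abnormal behaviour."""
    with torch.no_grad():
        predicted = model(past)
    return ((predicted - actual_next) ** 2).mean(dim=1)

# Usage with random stand-in features (real inputs: per-object appearance encodings).
model = NextAppearancePredictor()
scores = anomaly_score(model, torch.randn(8, HIST, FEAT), torch.randn(8, FEAT))
```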