
    Single molecule localization by ℓ2-ℓ0 constrained optimization

    Single Molecule Localization Microscopy (SMLM) enables the acquisition of high-resolution images by alternating between the activation of a sparse subset of the fluorescent molecules present in a sample and their localization. In this work, the localization problem is formulated as a constrained sparse approximation problem, which is solved by rewriting the ℓ0 pseudo-norm using an auxiliary term. In preliminary experiments on the simulated ISBI datasets, the algorithm yields results as good as the state-of-the-art high-density molecule localization algorithms. Comment: In Proceedings of iTWIST'18, Paper-ID: 13, Marseille, France, November 21-23, 2018.
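
    To make the problem concrete, the constrained sparse approximation can be written as below, with one standard auxiliary-variable rewriting of the ℓ0 pseudo-norm. This is a sketch of the general idea: A is the point-spread-function operator, y the acquired frame, and k the sparsity budget; the paper's exact reformulation may differ.

```latex
\min_{x \in \mathbb{R}^n} \; \frac{1}{2}\,\|Ax - y\|_2^2
\quad \text{subject to} \quad \|x\|_0 \le k,
\qquad \text{where} \quad
\|x\|_0 \;=\; \min_{u \in [0,1]^n} \; \sum_{i=1}^{n} u_i
\quad \text{s.t.} \quad (1 - u_i)\, x_i = 0 \;\; \forall i.
```

    The rewriting is exact: whenever x_i is nonzero the constraint forces u_i = 1, and minimizing the sum sets u_i = 0 elsewhere, so the sum counts the nonzero entries.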

    Dramatic Response of Nail Psoriasis to Infliximab

    Nail psoriasis, affecting up to 50% of psoriatic patients, is an important cause of serious psychological and physical distress. Traditional treatments for nail psoriasis, which include topical or intralesional corticosteroids, topical vitamin D analogues, photochemotherapy, oral retinoids, methotrexate, and cyclosporin, can be time-consuming, painful, or limited by significant toxicities. Biological agents may have the potential to revolutionize the management of patients with disabling nail psoriasis. We present a further case of disabling nail psoriasis that responded dramatically to infliximab.

    Meteorological time series forecasting based on MLP modelling using heterogeneous transfer functions

    In this paper, we study four meteorological and seasonal time series with multi-layer perceptron (MLP) modelling. We combine two transfer functions for the nodes of the hidden layer, and use a temporal indicator (a time index given as input) to take into account the seasonal aspect of the studied time series. The predictions cover two years of measurements, while the learning step uses eight independent years. We show that this methodology can improve the accuracy of meteorological data estimation compared with classical MLP modelling using a homogeneous transfer function.
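
    The core idea above, a hidden layer that mixes two transfer functions and receives a time index alongside the lagged measurements, can be sketched as a NumPy forward pass. This is an illustration under assumed dimensions, not the paper's exact architecture; the split of hidden units between tanh and sigmoid is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def heterogeneous_mlp_forward(x, t_index, W1, b1, W2, b2):
    """Forward pass of an MLP whose hidden layer mixes two transfer
    functions: the first half of the units use tanh, the second half a
    sigmoid. The time index is appended to the inputs so the model can
    capture seasonality."""
    z = np.concatenate([x, [t_index]])           # add temporal indicator
    a = W1 @ z + b1                              # hidden pre-activations
    h = np.empty_like(a)
    half = a.size // 2
    h[:half] = np.tanh(a[:half])                 # tanh units
    h[half:] = 1.0 / (1.0 + np.exp(-a[half:]))   # sigmoid units
    return W2 @ h + b2                           # linear output

# toy dimensions: 4 lagged measurements + 1 time index, 6 hidden units
W1 = rng.normal(size=(6, 5)); b1 = np.zeros(6)
W2 = rng.normal(size=(1, 6)); b2 = np.zeros(1)
y = heterogeneous_mlp_forward(np.array([0.2, 0.1, 0.3, 0.25]), 0.5,
                              W1, b1, W2, b2)
```

    Training (e.g. by backpropagation over the eight learning years) is omitted; only the heterogeneous hidden layer is shown.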

    Urban ozone concentration forecasting with artificial neural network in Corsica

    Forecasting atmospheric pollutant concentrations is an important issue in air quality monitoring. Qualitair Corse, the organization responsible for monitoring air quality in the Corsica (France) region, needs to develop a short-term prediction model to carry out its mission of informing the public. Various deterministic models exist for meso-scale or local forecasting, but they require large variable sets and a good knowledge of atmospheric processes, and they can be inaccurate because of local climatic or geographical particularities, as observed in Corsica, a mountainous island in the Mediterranean Sea. We therefore focus in this study on statistical models, and particularly on Artificial Neural Networks (ANNs), which have shown good results in predicting ozone concentration at horizon h+1 from locally measured data. The purpose of this study is to build a predictor of ozone and PM10 at horizon d+1 in Corsica, in order to anticipate the formation of pollution peaks and to take appropriate prevention measures. Specific meteorological conditions are known to lead to particular pollution events in Corsica (e.g. Saharan dust events). Therefore, several ANN models will be used, both for clustering meteorological conditions and for operational forecasting. Comment: Sustainable Solutions for Energy and Environment, EENVIRO 2013, Bucharest, Romania (2013).
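
    The two-stage scheme described above, first cluster days into meteorological regimes, then train one predictor per regime, can be sketched as follows. The abstract's models are ANNs; here plain k-means and per-cluster linear regressions stand in to keep the sketch short, and all variables and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, n_iter=50):
    """Plain k-means, used here as a stand-in for the clustering model
    that groups days by meteorological regime."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# toy data: rows = days, columns = (temperature, wind speed, humidity);
# next-day ozone is a noisy linear function of today's conditions
X = rng.normal(size=(200, 3))
ozone_next = X @ np.array([0.5, -0.3, 0.1]) + rng.normal(0, 0.05, size=200)

centers, labels = kmeans(X, k=3)

# one linear predictor per regime (the study uses ANNs), with a
# global fallback in case a cluster has too few days
w_global = np.linalg.lstsq(X, ozone_next, rcond=None)[0]
models = {}
for j in range(3):
    m = labels == j
    models[j] = (np.linalg.lstsq(X[m], ozone_next[m], rcond=None)[0]
                 if m.sum() >= 3 else w_global)

def predict(x):
    """Route today's conditions to the model of the nearest regime."""
    j = int(np.argmin(((centers - x) ** 2).sum(-1)))
    return float(x @ models[j])
```

    In the operational setting each per-regime model would be an ANN trained on that regime's historical days, with Saharan-dust days ideally isolated into their own cluster.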

    Trimming a consistent OWL knowledge base, relying on linguistic evidence

    Intuitively absurd but logically consistent sets of statements are common in publicly available OWL datasets. This article proposes an original and fully automated method to point at erroneous axioms in a consistent OWL knowledge base, by weakening it in order to improve its compliance with linguistic evidence gathered from natural language texts. A score for evaluating the compliance of subbases of the input knowledge base is proposed, as well as a trimming algorithm to discard potentially erroneous axioms. The whole approach is evaluated on two real datasets, with automatically retrieved web pages as linguistic input.
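
    A minimal sketch of the trimming idea: repeatedly drop the axiom whose removal most improves the compliance score, stopping when no removal helps. The compliance function below is a toy placeholder (a sum of per-axiom linguistic-support scores), not the score actually defined in the paper.

```python
def trim(base, compliance):
    """Greedy trimming: drop the axiom whose removal most improves the
    compliance score of the remaining base, as long as it keeps
    improving. `compliance` scores a set of axioms against linguistic
    evidence."""
    base = set(base)
    while len(base) > 1:
        best = max(base, key=lambda a: compliance(base - {a}))
        if compliance(base - {best}) <= compliance(base):
            break
        base = base - {best}
    return base

# toy evidence: axiom "bad" has negative linguistic support
support = {"a": 1.0, "b": 0.9, "bad": -1.0}
compliance = lambda s: sum(support[x] for x in s)
kept = trim({"a", "b", "bad"}, compliance)   # "bad" is discarded
```

    The paper's score is computed over subbases rather than per axiom, so this greedy loop only conveys the general shape of the algorithm.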

    Ontological Analysis For Description Logics Knowledge Base Debugging

    Formal ontology provides axiomatizations of domain-independent principles which, among other applications, can be used to identify modeling errors within a knowledge base. The OntoClean methodology is probably the best-known illustration of this strategy, but its cost in terms of manual work is often considered dissuasive. This article investigates the applicability of such debugging strategies to Description Logics knowledge bases, showing that even a partial and shallow analysis rapidly performed with a top-level ontology can reveal the presence of violations of common sense, and that the bottleneck, if there is one, may instead reside in the resolution of the resulting inconsistency or incoherence.
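
    One of the OntoClean constraints mentioned above is that a rigid class (one all of whose instances are necessarily instances) must not be declared a subclass of an anti-rigid one. A toy check, with illustrative class names and meta-property labels:

```python
def rigidity_violations(subclass_of, rigidity):
    """Flag subsumption axioms in which a rigid class is declared a
    subclass of an anti-rigid one -- one of the OntoClean constraints.
    `subclass_of` is a list of (subclass, superclass) pairs; `rigidity`
    maps class names to their assigned meta-property."""
    return [(c, d) for (c, d) in subclass_of
            if rigidity.get(c) == "rigid" and rigidity.get(d) == "anti-rigid"]

# toy knowledge base: Person ⊑ Agent is fine, Person ⊑ Student is not,
# since every person remains a person but not every person remains a student
axioms = [("Person", "Agent"), ("Person", "Student")]
rigidity = {"Person": "rigid", "Agent": "rigid", "Student": "anti-rigid"}
violations = rigidity_violations(axioms, rigidity)
```

    The manual cost the abstract refers to lies in assigning the meta-properties (the `rigidity` map), not in running the check itself.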

    Prioritized Base Debugging in Description Logics

    The problem investigated is the identification, within an input knowledge base, of axioms which should preferably be discarded (or amended) in order to restore consistency or coherence, or to get rid of undesired consequences. Most existing strategies for this task in Description Logics rely on conflicts, either computing all minimal conflicts beforehand, or generating conflicts on demand using diagnosis. This article studies how prioritized base revision can be effectively applied in the former case. The first main contribution is the observation that, for each axiom appearing in a minimal conflict, two bases can be obtained at negligible cost, representing the part of the input knowledge which must be preserved if this axiom is discarded or retained respectively; these may serve as a basis for a semantically motivated preference relation over the axioms. The second main contribution is an algorithm which, assuming this preference relation is known, selects some of the maximal consistent/coherent subsets of the input knowledge base accordingly, without the need to compute all of them.
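
    Assuming the preference relation is already known, the repair step can be sketched as a greedy loop over precomputed minimal conflicts. This is a simplification: the paper's algorithm selects maximal consistent subsets with respect to the preference relation, whereas the sketch simply discards the least-preferred axiom of each surviving conflict.

```python
def repair(axioms, min_conflicts, preference):
    """Greedy sketch of prioritized base repair: while some minimal
    conflict is still fully contained in the base, discard its least
    preferred axiom. `preference` maps each axiom to a score (higher
    means keep); the paper derives this relation from the two bases
    associated with each axiom in a minimal conflict."""
    base = set(axioms)
    while True:
        live = [c for c in min_conflicts if set(c) <= base]
        if not live:
            return base
        victim = min(live[0], key=lambda a: preference[a])
        base.discard(victim)

# toy base: axioms "a" and "b" form a minimal conflict; "b" is the
# least preferred, so it is the one discarded
kept = repair(["a", "b", "c"], [{"a", "b"}], {"a": 2, "b": 1, "c": 3})
```

    The result is conflict-free and keeps every axiom outside the conflicts untouched.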

    Distributional semantics for ontology verification

    As they grow in size, OWL ontologies tend to comprise intuitively incompatible statements, even when they remain logically consistent. This is true in particular of lightweight ontologies, especially those which aggregate knowledge from different sources. This article investigates how distributional semantics can help detect and repair violations of common sense in consistent ontologies, based on the identification of consequences which are unlikely to hold if the rest of the ontology does. A score evaluating the plausibility of a consequence with regard to distributional evidence is defined, as well as several methods to decide which statements should preferably be amended or discarded. A conclusive evaluation is also provided: an input ontology is extended with randomly generated statements, which the methods then try to discard automatically.
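
    The simplest distributional plausibility signal is cosine similarity between the word vectors of a consequence's class labels; an implausible subsumption should pair dissimilar vectors. The vectors and the score below are illustrative stand-ins, the paper's actual score is more elaborate.

```python
import numpy as np

def plausibility(sub, sup, vectors):
    """Cosine similarity between the distributional vectors of the
    subclass and superclass labels, used as a rough plausibility score
    for the subsumption sub ⊑ sup."""
    u, v = vectors[sub], vectors[sup]
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# toy 3-dimensional vectors (real ones would come from a corpus)
vectors = {"cat":      np.array([0.9, 0.1, 0.2]),
           "animal":   np.array([0.8, 0.2, 0.3]),
           "building": np.array([0.0, 0.9, 0.1])}

# cat ⊑ animal should score higher than cat ⊑ building
assert plausibility("cat", "animal", vectors) > plausibility("cat", "building", vectors)
```

    Consequences whose score falls below a threshold would then be traced back to the axioms entailing them, which become candidates for amendment or removal.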

    Mapping roadside nitrogen dioxide concentrations using non-stationary kriging

    Atmospheric nitrogen dioxide (NO2) concentrations around a major road in Alsace (France) are estimated on a fine grid using measurements from passive samplers and a geostatistical approach. Data are referenced to a local coordinate system where (x, y) are respectively the distance from and the distance along the road. They show a strong non-stationarity which does not allow ordinary kriging to be used in the estimation. A trend is therefore modelled by a combination of exponential and polynomial functions. Experimental residuals are then computed as the differences between measurements and the trend. The idea is to interpolate the residuals at the nodes of the grid, applying kriging methods, and to add them to the trend estimate. Since the variance of the residuals is not stationary either, an intermediary step is required: the standard deviation of the residuals is modelled as a function of the drift, and the residuals are normalized by this model. This defines a new regionalized variable which can be estimated within the framework of stationary geostatistics. Two possible kriging systems are tested, depending on the fitted variogram model: in the first, a pure nugget effect (white noise) is used, in which case the best linear estimator of the NO2 concentration is the trend model itself; in the second, a structured exponential variogram is adjusted. This case study shows that non-stationarity may not only characterize the raw variable but can also affect the variance of a phenomenon. It illustrates the interest of modelling it so as to improve the experimental variogram, fit an acceptable variogram model and compute the variance of the estimation error, even when the estimator reduces to a simple regression function.
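
    The normalization step at the heart of the method, modelling the residual standard deviation as a function of the drift and dividing it out to obtain a stationary variable, can be illustrated on synthetic data. Everything below is assumed for illustration: an exponential-only trend (the study combines exponential and polynomial terms), a residual standard deviation proportional to the trend, and arbitrary parameter values.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic roadside NO2: concentration decays exponentially with the
# distance x (metres) from the road; parameters are illustrative
x = np.linspace(0, 200, 40)
trend = 20 + 60 * np.exp(-x / 50)          # fitted trend model m(x)
sd_model = 0.1 * trend                     # residual std modelled on the drift
obs = trend + rng.normal(0, sd_model)      # observations with drift-dependent noise

# normalize the residuals by the modelled standard deviation: the
# result is (approximately) stationary, so ordinary kriging applies
residuals = obs - trend
z = residuals / sd_model

# under a pure-nugget variogram the kriged residual is zero everywhere,
# so the NO2 estimate collapses to the trend model itself
estimate_nugget = trend
```

    With a structured exponential variogram instead of the nugget, the normalized residuals would be kriged onto the grid, rescaled by `sd_model`, and added back to the trend.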
