Modeling, classification, and propagation in influence networks
The cognitive map model provides users with a way to visualize the influences between different notions and to compute the propagation of influences on a target. Like cognitive maps, our model offers a graphical representation of influences between notions. The distinctive feature of our model is that, on a single support, each notion is precisely defined by conceptual graphs. Combining the operations of cognitive maps with the operations of conceptual graphs yields a powerful method for decision making. First, the definition of a notion together with the projection operation makes it possible to compute semantically linked notions. Second, original propagations can be computed from such semantically linked notions.
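To make the propagation concrete, here is a minimal sketch in Python, assuming influences are signed weights in [-1, 1] and that the effect of a source on a target aggregates per-path products of weights; the example map, its weights, and the aggregation rule are illustrative assumptions, and the paper's conceptual-graph projection step is not modelled here.

    # Hypothetical cognitive map: (source, target) -> influence weight in [-1, 1].
    influences = {
        ("taxes", "consumption"): -0.6,
        ("consumption", "growth"): 0.8,
        ("taxes", "public_spending"): 0.7,
        ("public_spending", "growth"): 0.5,
    }

    def paths(graph, src, dst, seen=()):
        """Enumerate all simple paths from src to dst."""
        if src == dst:
            yield (dst,)
            return
        for (a, b) in graph:
            if a == src and b not in seen:
                for rest in paths(graph, b, dst, seen + (src,)):
                    yield (src,) + rest

    def propagate(graph, src, dst):
        """One common convention: sum over paths of the product of weights."""
        total = 0.0
        for p in paths(graph, src, dst):
            w = 1.0
            for a, b in zip(p, p[1:]):
                w *= graph[(a, b)]
            total += w
        return total

    print(propagate(influences, "taxes", "growth"))  # ~ -0.13 (= -0.48 + 0.35)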
The contextual cognitive map model
A cognitive map offers a graphical representation of a network of influences between different notions. A cognitive map can contain a great number of influence links, which makes it difficult to exploit. Moreover, these influences are not always relevant for different uses of the map. We propose an extension of this model that specifies the validity context of an influence using conceptual graphs, and we provide a mechanism for filtering influences according to a context of use.
Hierarchical cognitive maps
A cognitive map provides a graphical representation of an influence network between concepts. Large cognitive maps have the drawback of being difficult to grasp, interpret, and exploit. This article presents a hierarchical cognitive map model that lets the designer group concepts; these groupings are then used in a mechanism that gives the user partial, synthetic views of a map.
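As a rough illustration of the grouping mechanism, here is a minimal sketch, assuming influences are signed weights, that influences touching a grouped concept are redirected to the group node, and that parallel influences are summed; the aggregation rule and the example map are assumptions, not the article's exact definitions.

    def collapse(influences, group, name):
        """Synthetic view where every concept in `group` becomes the node `name`."""
        view = {}
        for (a, b), w in influences.items():
            a2 = name if a in group else a
            b2 = name if b in group else b
            if a2 == b2:      # influence internal to the group is hidden
                continue
            view[(a2, b2)] = view.get((a2, b2), 0.0) + w
        return view

    # Hypothetical map with two fiscal concepts grouped into a single node.
    m = {("vat", "consumption"): -0.4, ("income_tax", "consumption"): -0.3,
         ("consumption", "growth"): 0.8}
    print(collapse(m, {"vat", "income_tax"}, "taxes"))
    # ~ {('taxes', 'consumption'): -0.7, ('consumption', 'growth'): 0.8}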
Using UML Class Diagram as a Knowledge Engineering Tool
Conference date: 05/2000. International audience.
A useful logical semantics of UML for querying and checking UML class diagram
Conference date: 01/2009. International audience.
Contextual Cognitive Map
The model of cognitive maps introduced by Tolman [1] provides a representation of an influence network between notions. A cognitive map can contain so many influences that it becomes difficult to exploit. Moreover, these influences are not always relevant for different uses of a map. This paper extends the cognitive map model by describing the validity context of each influence with a conceptual graph. A mechanism for filtering influences according to a use context is provided, so as to give the user a simpler and better-adjusted map. A prototype implementing this contextual cognitive map model has been developed.
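Here is a minimal sketch of the filtering idea, simplifying the paper's conceptual-graph validity contexts to plain sets of labels: an influence is kept when its validity context "projects" into the use context, approximated here by set inclusion; the example map is hypothetical.

    def filter_map(cmap, use_context):
        """Keep influences whose validity context is subsumed by the use context."""
        return {edge: w for edge, (w, ctx) in cmap.items() if ctx <= use_context}

    # Hypothetical contextual map: edge -> (weight, validity context).
    cmap = {
        ("rain", "traffic"): (0.6, {"city"}),
        ("wind", "delay"):   (0.8, {"airport"}),
        ("fog",  "delay"):   (0.9, set()),     # empty context: always valid
    }
    print(filter_map(cmap, {"city"}))
    # {('rain', 'traffic'): 0.6, ('fog', 'delay'): 0.9}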
Extracting constraints from direct detection searches of supersymmetric dark matter in the light of null results from the LHC in the squark sector
The comparison of the results of direct detection of Dark Matter, obtained with various target nuclei, requires model-dependent, or even arbitrary, assumptions. Indeed, to draw conclusions, either the spin-dependent (SD) or the spin-independent (SI) interaction has to be neglected. In the light of the null results from supersymmetry searches at the LHC, the squark sector is pushed to high masses. We show that for a squark sector at the TeV scale, the framework used to extract constraints from direct detection searches can be redefined, as the number of free parameters is reduced. Moreover, the correlation observed between SI and SD proton cross sections constitutes a key issue for the development of the next generation of Dark Matter detectors.
Comment: Figure 3 has been updated. Conclusions unchanged.
Modelling stochastic bivariate mortality
Stochastic mortality, i.e. modelling death arrival via a jump process with stochastic intensity, is gaining increasing recognition as a way to represent mortality risk. This paper represents a first attempt to model the mortality risk of couples of individuals, according to the stochastic intensity approach.
On the theoretical side, we extend the Cox process setup to couples, i.e. the idea that mortality is driven by a jump process whose intensity is itself a stochastic process, specific to a particular generation within each gender. Dependence between the survival times of the members of a couple is captured by an Archimedean copula.
On the calibration side, we fit the joint survival function by calibrating separately the (analytical) copula and the (analytical) margins. First, we select the best-fit copula according to the methodology of Wang and Wells (2000) for censored data. Then, we provide a sample-based calibration for the intensity, using a time-homogeneous, non-mean-reverting, affine process: this gives the analytical marginal survival functions. Coupling the best-fit copula with the calibrated margins, we obtain, for a sample generation, a joint survival function which incorporates the stochastic nature of mortality improvements and is far from representing independence. On the contrary, since the best-fit copula turns out to be a Nelsen one, dependence increases with age and long-term dependence exists.
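To make the construction concrete, here is a minimal sketch of coupling analytical margins with an Archimedean copula, using illustrative Gompertz margins and a Clayton copula in place of the paper's calibrated affine-intensity margins and best-fit Nelsen copula; all parameter values are made up.

    import math

    def gompertz_survival(t, b=0.0003, c=0.09):
        """Marginal survival S(t) = exp(-(b/c)(e^(c t) - 1)); b, c illustrative."""
        return math.exp(-(b / c) * (math.exp(c * t) - 1.0))

    def clayton(u, v, theta=1.5):
        """Archimedean (Clayton) copula C(u, v) = (u^-t + v^-t - 1)^(-1/t)."""
        return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

    def joint_survival(t1, t2, theta=1.5):
        """P(T1 > t1, T2 > t2) obtained by coupling the marginal survivals."""
        return clayton(gompertz_survival(t1), gompertz_survival(t2), theta)

    s = gompertz_survival(20.0)
    print(joint_survival(20.0, 20.0))  # exceeds s * s: positive dependence
    print(s * s)                       # the independence benchmark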
Overviews: efficiently querying a set of RDF databases
In the context of information retrieval in the Web of Data, we propose a kind of compact version of an RDF triplestore that acts as an overview of this base of RDF triples. An overview is not only more compact than the initial triplestore; SPARQL can also be used on it. An overview is built in such a way that if a SPARQL query has no result on the overview, then it has no result on the initial triplestore either. Querying overviews is therefore more efficient than querying the whole triplestores when queries often have no result on these RDF databases, which is usually the case when a user queries triplestores on the Web of Data. Our solution has been evaluated using RDF bases extracted from DBPedia and queries selected either because they are among the most commonly used on DBPedia or because of their resolution complexity.
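A minimal sketch of overview-first querying with rdflib, assuming an overview graph has already been built so that an empty result on the overview guarantees an empty result on the full triplestore; the construction of the overview itself is the paper's contribution and is not reproduced here, and the file names are hypothetical.

    from rdflib import Graph

    def ask(overview: Graph, full_store: Graph, sparql: str):
        """Evaluate on the compact overview first; touch the full store only
        when the overview cannot rule the query out."""
        if not list(overview.query(sparql)):
            return []                  # guaranteed empty on the full store
        return list(full_store.query(sparql))

    overview, full = Graph(), Graph()
    # overview.parse("overview.ttl"); full.parse("dump.ttl")  # hypothetical files
    rows = ask(overview, full,
               "SELECT ?s WHERE { ?s a <http://dbpedia.org/ontology/Film> }")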
