Semantic distillation: a method for clustering objects by their contextual specificity
Techniques for data mining, latent semantic analysis, contextual search of
databases, etc. were developed long ago by computer scientists working on
information retrieval (IR). Experimental scientists, from all disciplines,
having to analyse large collections of raw experimental data (astronomical,
physical, biological, etc.) have developed powerful methods for their
statistical analysis and for clustering, categorising, and classifying objects.
Finally, physicists have developed a theory of quantum measurement, unifying
the logical, algebraic, and probabilistic aspects of queries into a single
formalism. The purpose of this paper is twofold: first to show that when
formulated at an abstract level, problems from IR, from statistical data
analysis, and from physical measurement theories are very similar and hence can
profitably be cross-fertilised, and, secondly, to propose a novel method of
fuzzy hierarchical clustering, termed \textit{semantic distillation}, strongly
inspired by the theory of quantum measurement, which we developed to analyse
raw data coming from various types of experiments on DNA arrays. We illustrate
the method by analysing DNA array experiments and clustering the genes of the
array according to their specificity.

Comment: Accepted for publication in Studies in Computational Intelligence,
Springer-Verlag
Behavior patterns in hormonal treatments using fuzzy logic models
Assisted reproductive technologies are a combination of medical strategies designed to treat infertility. Ideal stimulation treatment has to be individualized, but one of the main challenges clinicians face in everyday practice is how to select the best medical protocol for a patient. This work aims to look for behavior patterns in this kind of treatment, using fuzzy logic models, with the objective of helping gynecologists and embryologists make decisions that could improve the process of in vitro fertilization. For this purpose, a real-world dataset provided by an assisted reproduction clinic, composed of 123 patients and 559 treatments applied to those patients, has been used to obtain the fuzzy models. In conclusion, this work corroborates some known clinical experience, provides some new findings, and proposes a set of questions to be addressed in future experiments.

Funding: Ministerio de Economía y Competitividad TIN2013-46928-C3-3-R, TIN2016-76956-C3-2-R, TIN2015-71938-RED
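As a minimal sketch of the kind of fuzzy-logic building block such models rest on (this is illustrative, not the paper's actual model), a triangular membership function maps a crisp input, e.g. a patient attribute, to a degree of membership in a fuzzy set:

```python
def triangular(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set:
    zero outside [a, c], rising linearly to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy set "young" over ages 20..40, peaking at 25
print(triangular(25, 20, 25, 40))  # 1.0
print(triangular(30, 20, 25, 40))  # ~0.667
```

A fuzzy rule base then combines such degrees across several input variables to produce graded, interpretable recommendations.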
Artificial neural networks in geospatial analysis
Artificial neural networks are computational models widely used in geospatial analysis for data classification, change detection, clustering, function approximation, and forecasting or prediction. There are many types of neural networks, distinguished by learning paradigm and network architecture. Their use is expected to grow with the increasing availability of massive data from remote sensing and mobile platforms.
A Context-theoretic Framework for Compositionality in Distributional Semantics
Techniques in which words are represented as vectors have proved useful in
many applications in computational linguistics; however, there is currently no
general semantic formalism for representing meaning in terms of vectors. We
present a framework for natural language semantics in which words, phrases and
sentences are all represented as vectors, based on a theoretical analysis which
assumes that meaning is determined by context.
In the theoretical analysis, we define a corpus model as a mathematical
abstraction of a text corpus. The meaning of a string of words is assumed to be
a vector representing the contexts in which it occurs in the corpus model.
Based on this assumption, we can show that the vector representations of words
can be considered as elements of an algebra over a field. We note that in
applications of vector spaces to representing meanings of words there is an
underlying lattice structure; we interpret the partial ordering of the lattice
as describing entailment between meanings. We also define the context-theoretic
probability of a string, and, based on this and the lattice structure, a degree
of entailment between strings.
We relate the framework to existing methods of composing vector-based
representations of meaning, and show that our approach generalises many of
these, including vector addition, component-wise multiplication, and the tensor
product.

Comment: Submitted to Computational Linguistics on 20th January 2010 for
review
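The three composition operations the framework generalises can be sketched concretely with NumPy; the vectors here are toy context vectors chosen for illustration, not derived from a real corpus model:

```python
import numpy as np

u = np.array([1.0, 0.0, 2.0])  # hypothetical context vector for one word
v = np.array([0.0, 3.0, 1.0])  # hypothetical context vector for another

addition = u + v        # additive composition
pointwise = u * v       # component-wise multiplication
tensor = np.outer(u, v) # tensor product: a 3x3 matrix, not a 3-vector

print(addition)       # [1. 3. 3.]
print(pointwise)      # [0. 0. 2.]
print(tensor.shape)   # (3, 3)
```

Note the dimensional difference: addition and component-wise multiplication stay in the original vector space, while the tensor product moves to a higher-dimensional space, which is what lets it retain word-order information.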
Historical collaborative geocoding
The latest developments in digital technology have provided large data sets that can
increasingly easily be accessed and used. These data sets often contain
indirect localisation information, such as historical addresses. Historical
geocoding is the process of transforming the indirect localisation information
to direct localisation that can be placed on a map, which enables spatial
analysis and cross-referencing. Many efficient geocoders exist for current
addresses, but they do not deal with the temporal aspect and are based on a
strict hierarchy (..., city, street, house number) that is hard or impossible
to use with historical data. Indeed, historical data are full of uncertainties
(temporal aspect, semantic aspect, spatial precision, confidence in the
historical source, ...) that cannot be resolved, as there is no way to go back in time to
check. We propose an open source, open data, extensible solution for geocoding
that is based on the building of gazetteers composed of geohistorical objects
extracted from historical topographical maps. Once the gazetteers are
available, geocoding an historical address is a matter of finding the
geohistorical object in the gazetteers that is the best match to the historical
address. The matching criteria are customisable and include several dimensions
(fuzzy semantic, fuzzy temporal, scale, spatial precision ...). As the goal is
to facilitate historical work, we also propose web-based user interfaces that
help geocode (one address or batch mode) and display over current or historical
topographical maps, so that they can be checked and collaboratively edited. The
system is tested on the city of Paris for the 19th-20th centuries, shows a high
return rate and is fast enough to be used interactively.

Comment: WORKING PAPER
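A hypothetical sketch of multi-dimensional match scoring between a historical address and a geohistorical object follows; the function names, weights, and decay rule are illustrative assumptions, not the paper's implementation:

```python
from difflib import SequenceMatcher

def semantic_score(query, candidate_name):
    """Fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, query.lower(), candidate_name.lower()).ratio()

def temporal_score(query_year, valid_from, valid_to):
    """1.0 inside the object's validity interval, decaying linearly outside."""
    if valid_from <= query_year <= valid_to:
        return 1.0
    gap = min(abs(query_year - valid_from), abs(query_year - valid_to))
    return max(0.0, 1.0 - gap / 50.0)  # assumed decay over 50 years

def match_score(query, query_year, obj, w_sem=0.6, w_tmp=0.4):
    """Weighted combination of the fuzzy semantic and temporal dimensions."""
    return (w_sem * semantic_score(query, obj["name"])
            + w_tmp * temporal_score(query_year, obj["from"], obj["to"]))

obj = {"name": "rue de la Paix", "from": 1806, "to": 1900}
print(match_score("Rue de la Paix", 1850, obj))  # 1.0: exact name, valid year
```

Geocoding then amounts to ranking all gazetteer objects by this score and returning the best match, with the score itself serving as a confidence measure.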
Geometric and form feature recognition tools applied to a design for assembly methodology
The paper presents geometric tools for an automated Design for Assembly (DFA) assessment system. For each component in an assembly a two-step feature search is performed: firstly (using the minimal bounding box) mass, dimensions and symmetries are identified, allowing the part to be classified, according to DFA convention, as either rotational or prismatic; secondly, form features are extracted, allowing an effective method of mechanised orientation to be determined. Together these algorithms support the fuzzy decision support system of an assembly-orientated CAD system known as FuzzyDFA.
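The first classification step can be sketched as a simple rule on the minimal bounding box dimensions; the tolerance and decision rule below are assumptions for illustration, not the paper's exact DFA criterion:

```python
def classify_part(length, width, height, tol=0.05):
    """Label a part from its minimal bounding box: a near-square
    cross-section (width close to height) suggests a rotational part;
    otherwise the part is classified as prismatic."""
    if abs(width - height) <= tol * max(width, height):
        return "rotational"
    return "prismatic"

print(classify_part(100.0, 20.0, 20.0))  # rotational (shaft-like part)
print(classify_part(80.0, 40.0, 10.0))   # prismatic (plate-like part)
```

In the full system this crisp label would feed, together with symmetry and mass data, into the fuzzy decision support stage.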
Early aspects: aspect-oriented requirements engineering and architecture design
This paper reports on the third Early Aspects: Aspect-Oriented Requirements Engineering and Architecture Design Workshop, which was held in Lancaster, UK, on March 21, 2004. The workshop included a presentation session and working sessions in which particular topics on early aspects were discussed. The primary goal of the workshop was to focus on challenges to defining methodical software development processes for aspects from early on in the software life cycle, and to explore the potential of proposed methods and techniques to scale up to industrial applications.
Requirements modelling and formal analysis using graph operations
The increasing complexity of enterprise systems requires a more advanced
analysis of the representation of services expected than is currently possible.
Consequently, the specification stage, which could be facilitated by formal
verification, becomes very important to the system life-cycle. This paper presents
a formal modelling approach which may be used to better represent the reality
of the system and to verify the expected or existing system's properties,
taking into account the environmental characteristics. For that, we firstly
propose a formalization process based upon properties specification, and
secondly we use Conceptual Graphs operations to develop reasoning mechanisms
for verifying requirements statements. The graphic visualization of this
reasoning enables us to correctly capture the system specifications by making
it easier to determine whether desired properties hold. The approach is
applied to the field of Enterprise modelling.
Beyond Dualisms in Methodology: An Integrative Design Research Medium "MAPS" and some Reflections
Design research is an academic issue and increasingly an essential success factor for industrial, organizational and social innovation. The fierce rejection of 1st generation design methods in the early 1970s resulted in the postmodernist attitude of "no methods", and subsequently, after more than a decade, in the strong adoption of scientific methods, or "the" scientific method, for design research. The current situation regarding methodology is characterized by unproductive dualisms such as scientific methods vs. designerly methods, normative methods vs. descriptive methods, research vs. design. The potential of the early (1st generation) methods is neglected and the practical usefulness of design research is impeded. The suggestion for 2nd generation methods as discussed by Rittel and others has hardly been taken up in design. The development of a methodological tool / medium for research through design – MAPS – (which is the central part of the paper) presents the cause and catalyst for some reflections about the usability / desirability / usefulness of methodical support for the design (research) process.
Keywords: Integrative Design Research Medium, Research Through Design, MAPS, Methodology