On SAT representations of XOR constraints
We study the representation of systems S of linear equations over the
two-element field (aka xor- or parity-constraints) via conjunctive normal forms
F (boolean clause-sets). First we consider the problem of finding an
"arc-consistent" representation ("AC"), meaning that unit-clause propagation
will fix all forced assignments for all possible instantiations of the
xor-variables. Our main negative result is that there is no polysize
AC-representation in general. On the positive side we show that finding such an
AC-representation is fixed-parameter tractable (fpt) in the number of
equations. Then we turn to a stronger criterion of representation, namely
propagation completeness ("PC"): while AC only covers the variables of S, PC
considers all the variables in F (the variables of S plus the auxiliary
variables). We show that the standard translation yields a PC representation
for one equation, but fails to do so for two equations (in fact, arbitrarily
badly). We show that with a more intelligent translation we can also easily
compute a PC representation for two equations. We conjecture that computing a
representation in PC is fpt in the number of equations.
Comment: 39 pages; 2nd v. improved handling of acyclic systems, free-standing
proof of the transformation from AC-representations to monotone circuits,
improved wording and literature review; 3rd v. updated literature,
strengthened treatment of monotonisation, improved discussions; 4th v. update
of literature, discussions and formulations, more details and examples;
conference v. to appear LATA 201
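
As a point of reference for the "standard translation" mentioned in this abstract, the sketch below (in Python; the helper name xor_to_cnf and the three-variable example are ours, for illustration only, not the authors' construction) writes out the direct CNF encoding of a single parity constraint, with one clause forbidding each assignment of the wrong parity.

from itertools import product

def xor_to_cnf(variables, parity=0):
    # Direct CNF encoding of x1 XOR ... XOR xn = parity: one clause forbidding
    # each assignment of the wrong parity. Illustrative only and exponential in
    # n, which is why practical translations split long constraints using
    # auxiliary variables.
    clauses = []
    for assignment in product([0, 1], repeat=len(variables)):
        if sum(assignment) % 2 != parity:
            # The clause is falsified exactly by this forbidden assignment:
            # a variable is negated precisely when the assignment sets it to 1.
            clauses.append([v if a == 0 else -v
                            for v, a in zip(variables, assignment)])
    return clauses

# x1 XOR x2 XOR x3 = 0 over DIMACS-style variables 1, 2, 3:
print(xor_to_cnf([1, 2, 3]))
# [[1, 2, -3], [1, -2, 3], [-1, 2, 3], [-1, -2, -3]]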
Certainty Closure: Reliable Constraint Reasoning with Incomplete or Erroneous Data
Constraint Programming (CP) has proved an effective paradigm to model and
solve difficult combinatorial satisfaction and optimisation problems from
disparate domains. Many such problems arising from the commercial world are
permeated by data uncertainty. Existing CP approaches that accommodate
uncertainty are less suited to uncertainty arising due to incomplete and
erroneous data, because they do not build reliable models and solutions
guaranteed to address the user's genuine problem as she perceives it. Other
fields such as reliable computation offer combinations of models and associated
methods to handle these types of uncertain data, but lack an expressive
framework characterising the resolution methodology independently of the model.
We present a unifying framework that extends the CP formalism in both model
and solutions, to tackle ill-defined combinatorial problems with incomplete or
erroneous data. The certainty closure framework brings together modelling and
solving methodologies from different fields into the CP paradigm to provide
reliable and efficient approaches for uncertain constraint problems. We
demonstrate the applicability of the framework on a case study in network
diagnosis. We define resolution forms that give generic templates, and their
associated operational semantics, to derive practical solution methods for
reliable solutions.
Comment: Revised versio
Tau Be or not Tau Be? - A Perspective on Service Compatibility and Substitutability
One of the main open research issues in Service Oriented Computing is to
propose automated techniques to analyse service interfaces. A first problem,
called compatibility, aims at determining whether a set of services (two in
this paper) can be composed together and interact with each other as expected.
Another related problem is to check the substitutability of one service with
another. These problems are especially difficult when behavioural descriptions
(i.e., message calls and their ordering) are taken into account in service
interfaces. Interfaces should capture as faithfully as possible the service
behaviour to make their automated analysis possible while not exhibiting
implementation details. In this position paper, we choose Labelled Transition
Systems to specify the behavioural part of service interfaces. In particular,
we show that internal behaviours (tau transitions) are necessary in these
transition systems in order to detect subtle errors that may occur when
composing a set of services together. We also show that tau transitions should
be handled differently in the compatibility and substitutability problems: the
former requires checking whether compatibility is preserved every time a tau
transition is traversed in one interface, whereas the latter requires a precise
analysis of tau branchings in order to make the substitution preserve the
properties (e.g., a compatibility notion) that were ensured before replacement.
Comment: In Proceedings WCSI 2010, arXiv:1010.233
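
To make the formalism concrete, the sketch below (in Python; the toy interface and helper names are invented here for illustration) encodes a small Labelled Transition System in which tau marks internal moves, and computes the observable actions offered from a state by first closing it under tau transitions, the basic step behind the compatibility checks discussed above.

# Minimal Labelled Transition System sketch: states map to (label, successor)
# pairs, and "tau" marks internal moves. The toy interface and helper names
# are invented here for illustration.
CLIENT = {
    "c0": [("login!", "c1")],
    "c1": [("tau", "c2")],          # internal step before sending the request
    "c2": [("request!", "c3")],
    "c3": [],
}

def tau_closure(lts, state):
    # States reachable from `state` using tau transitions only.
    seen, stack = {state}, [state]
    while stack:
        s = stack.pop()
        for label, target in lts.get(s, []):
            if label == "tau" and target not in seen:
                seen.add(target)
                stack.append(target)
    return seen

def observable_actions(lts, state):
    # Observable labels offered from `state`, looking through tau moves.
    return {label
            for s in tau_closure(lts, state)
            for label, _ in lts.get(s, [])
            if label != "tau"}

print(observable_actions(CLIENT, "c1"))   # {'request!'}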
Analysis and Verification of Service Interaction Protocols - A Brief Survey
Modeling and analysis of interactions among services is a crucial issue in
Service-Oriented Computing. Composing Web services is a complicated task which
requires techniques and tools to verify that the new system will behave
correctly. In this paper, we first overview some formal models proposed in the
literature to describe services. Second, we give a brief survey of verification
techniques that can be used to analyse services and their interaction. Last, we
focus on the realizability and conformance of choreographies.
Comment: In Proceedings TAV-WEB 2010, arXiv:1009.330
Diagnosing and measuring incompatibilities between pairs of services
This text presents a tool, from its design to its implementation, which detects all behavioural incompatibilities between two service interfaces. Unlike prior work, the proposed solution does not simply check whether two services are incompatible; it provides a detailed diagnosis, including the incompatibilities and, for each one, the location in the service interfaces where it occurs. A measure of similarity between interfaces, computed from the output of the detection algorithm, is also proposed. Finally, a visual report of the comparison analysis pinpoints the set of incompatibilities that cause one behavioural interface not to simulate another.
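
The abstract refers to one behavioural interface simulating another; purely as an illustration of that notion, the sketch below (in Python, with toy interfaces invented here) computes the largest simulation relation between two small transition systems using a generic greatest-fixpoint construction, not the diagnosis algorithm of the tool itself.

# Generic greatest-fixpoint computation of the largest simulation relation
# between two behavioural interfaces (states map to (label, successor) pairs).
# This is a textbook construction with toy interfaces invented here; it is not
# the diagnosis algorithm of the tool described above.
SPEC = {"s0": [("request?", "s1")], "s1": [("reply!", "s0")]}
IMPL = {"t0": [("request?", "t1")], "t1": []}   # receives requests, never replies

def largest_simulation(lts_a, lts_b):
    # Pairs (a, b) such that state b of lts_b simulates state a of lts_a.
    pairs = {(a, b) for a in lts_a for b in lts_b}
    changed = True
    while changed:
        changed = False
        for a, b in list(pairs):
            matched = all(any(lab_b == lab_a and (a2, b2) in pairs
                              for lab_b, b2 in lts_b.get(b, []))
                          for lab_a, a2 in lts_a.get(a, []))
            if not matched:            # b cannot mimic some step of a: drop pair
                pairs.discard((a, b))
                changed = True
    return pairs

# IMPL does not simulate SPEC because the reply! step cannot be matched.
print(("s0", "t0") in largest_simulation(SPEC, IMPL))   # False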
A PCR-mutagenesis strategy for rapid detection of mutations in codon 634 of the ret proto-oncogene related to MEN 2A.
BACKGROUND: Multiple endocrine neoplasia type 2A (MEN 2A) is a dominantly inherited cancer syndrome. Missense mutations in the codon encoding cysteine 634 of the ret proto-oncogene have been found in 85% of MEN 2A families. The main tumour type always present in MEN 2A is medullary thyroid carcinoma (MTC). Only 25% of all MTCs are hereditary, and generally they are identified by a careful family history. However, some familial MTCs are not easily detected by this means and underdiagnosis of MEN 2A is suspected. METHODS: DNA samples from MEN 2A patients were amplified by PCR. The products were incubated with the restriction enzyme BstAPI or BglI. The samples were loaded onto non-denaturing 10% polyacrylamide gels and run at 120 V for 40 min. The gels were stained with 10 µg/ml ethidium bromide, and the bands were visualized under a UV lamp. RESULTS: We developed a PCR-mutagenesis method to check the integrity of the three bases of the cysteine 634 codon. CONCLUSION: The method can be used to detect inherited mutations in MTC patients without a clear family history. The method is relatively simple to use as a routine test in these patients to decrease the underdiagnosis of MEN 2A. In addition, the assay can be used to screen affected families with any mutation in cysteine 634.
Constraint solving in uncertain and dynamic environments - a survey
This article follows a tutorial on dynamic constraint solving given by the authors at CP 2003 (Ninth International Conference on Principles and Practice of Constraint Programming) in Kinsale, Ireland. It aims at offering an overview of the main approaches and techniques that have been proposed in the domain of constraint satisfaction to deal with uncertain and dynamic environments.
Distribution of Alarin Immunoreactivity in the Mouse Brain
Alarin is a 25 amino acid peptide that belongs to the galanin peptide family. It is derived from the galanin-like peptide gene by a splice variant that excludes exon 3. Alarin was first identified in gangliocytes of neuroblastic tumors and later shown to have a vasoactive function in the skin. Recently, alarin was demonstrated to stimulate food intake as well as the hypothalamic–pituitary–gonadal axis in rodents, suggesting that it might be a neuromodulatory peptide in the brain. However, the individual neurons in the central nervous system that express alarin have not been identified. Here, we determined the distribution of alarin-like immunoreactivity (alarin-LI) in the adult murine brain. The specificity of the antibody against alarin was demonstrated by the absence of labeling after pre-absorption of the antiserum with synthetic alarin peptide and in transgenic mouse brains lacking neurons expressing the GALP gene. Alarin-LI was observed in different areas of the murine brain. A high intensity of alarin-LI was detected in the accessory olfactory bulb, the medial preoptic area, the amygdala, different nuclei of the hypothalamus such as the arcuate nucleus and the ventromedial hypothalamic nucleus, the trigeminal complex, the locus coeruleus, the ventral cochlear nucleus, the facial nucleus, and the epithelial layer of the plexus choroideus. The distinct expression pattern of alarin in the adult mouse brain suggests potential functions in reproduction and metabolism.
Impact of Immunization Technology and Assay Application on Antibody Performance – A Systematic Comparative Evaluation
Antibodies are quintessential affinity reagents for the investigation and determination of a protein's expression patterns, localization, quantitation, modifications, purification, and functional understanding. Antibodies are typically used in techniques such as Western blot, immunohistochemistry (IHC), and enzyme-linked immunosorbent assays (ELISA), among others. The methods employed to generate antibodies can have a profound impact on their success in any of these applications. We raised antibodies against 10 serum proteins using 3 immunization methods: peptide antigens (3 per protein), DNA prime/protein fragment-boost (“DNA immunization”; 3 per protein), and full-length protein. Antibodies thus generated were systematically evaluated using several different assay technologies (ELISA, IHC, and Western blot). Antibodies raised against peptides worked predominantly in applications where the target protein was denatured (57% success in Western blot, 66% success in immunohistochemistry), although 37% of the antibodies thus generated did not work in any of these applications. In contrast, antibodies produced by DNA immunization performed well against both denatured and native targets with a high level of success: 93% success in Western blots, 100% success in immunohistochemistry, and 79% success in ELISA. Importantly, success in one assay method was not predictive of success in another. Immunization with full-length protein consistently yielded the best results; however, this method is not typically available for new targets, due to the difficulty of generating full-length protein. We conclude that DNA immunization strategies, which are not encumbered by the limitations of efficacy (peptides) or requirements for full-length proteins, can be quite successful, particularly when multiple constructs for each protein are used.