
    A lower bound on CNF encodings of the at-most-one constraint

    Constraint "at most one" is a basic cardinality constraint which requires that at most one of its nn boolean inputs is set to 11. This constraint is widely used when translating a problem into a conjunctive normal form (CNF) and we investigate its CNF encodings suitable for this purpose. An encoding differs from a CNF representation of a function in that it can use auxiliary variables. We are especially interested in propagation complete encodings which have the property that unit propagation is strong enough to enforce consistency on input variables. We show a lower bound on the number of clauses in any propagation complete encoding of the "at most one" constraint. The lower bound almost matches the size of the best known encodings. We also study an important case of 2-CNF encodings where we show a slightly better lower bound. The lower bound holds also for a related "exactly one" constraint.Comment: 38 pages, version 3 is significantly reorganized in order to improve readabilit

    Children's reported investment of mental effort when viewing child and adult television programs

    The present study was designed to examine developmental differences in children's active, conscious cognitive processing of television program information during home viewing. Specifically, the study focused on examining (1) relationships among children's ages, the types of television programs viewed, and reported amounts of invested mental effort when televiewing, and (2) contributions of factors in children's home viewing environments to reported levels of mental effort investment. Subjects were a convenience sample of 40 second-grade, 42 fourth-grade, and 34 sixth-grade children and their parents. Parental questionnaires provided data on parental participation during the children's televiewing. Individual interviews with the children provided information on the child's performance of other activities, such as playing while viewing, the child's familiarity with different television programs, and reported amounts of mental effort invested when viewing child and adult types of programs.

    SNP fingerprinting and farmer criteria for selection, multiplication, and traceability of cacao in Nicaragua

    Introduction. Genetic diversity, registration, and traceability of cacao planting material form the essential tripod supporting its sustainable cultivation. In Nicaragua, cacao planting material is mostly obtained from seeds, which leads to great variability in yield, bean quality, and tolerance to pests and diseases. Farmers, technical staff, development projects, and investors depend on a limited supply of elite trees as a reliable source of genetic material to support new cacao-growing areas and meet market quality standards. Therefore, a national genetic improvement program and a planting material traceability system are needed in Nicaragua to improve the sustainability of cacao cultivation. Objective. To evaluate the genetic resources on farms and provide information for future breeding programs, as well as to lay the foundation for a national traceability and certification system. Materials and methods. Forty-nine elite trees selected by farmers in the main producing regions of Nicaragua between 2018 and 2020 were sampled. Ninety-three SNP markers were used to characterize them and resolve their genetic origins. Results. The evaluated cacao trees had a narrow genetic background, composed mainly of hybrids with Amelonado (36 %), Criollo (17 %), and Iquitos (15 %) origins. A set of trees with high genetic diversity that could be incorporated into a selection and breeding program was identified. Conclusion. Fingerprinting with SNP markers was a useful tool for evaluating the genetic links of cultivated cacao and can be used for varietal identity testing at the farm level. The study provided the basis for developing both a breeding program and a traceability system for cacao planting material in Nicaragua.
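    Purely as an illustration of the kind of varietal identity test mentioned in the conclusion (the data layout, marker names, and 0.95 threshold below are assumptions, not values or methods from the study), one can compare the genotype calls of two trees at shared SNP loci and accept clonal identity when the match rate exceeds a threshold:

    # Minimal sketch of SNP-based identity verification between two cacao trees.
    # Hypothetical data layout: a profile maps marker IDs to genotype calls such
    # as "AA", "AT", or None for a missing call.

    def snp_match_rate(profile_a, profile_b):
        """Fraction of markers with identical calls, ignoring missing calls."""
        shared = [m for m in profile_a
                  if profile_a[m] is not None and profile_b.get(m) is not None]
        if not shared:
            return 0.0
        matches = sum(profile_a[m] == profile_b[m] for m in shared)
        return matches / len(shared)

    def same_variety(profile_a, profile_b, threshold=0.95):
        """Declare two trees clonally identical above the chosen match-rate threshold."""
        return snp_match_rate(profile_a, profile_b) >= threshold

    if __name__ == "__main__":
        tree_1 = {"SNP_001": "AA", "SNP_002": "AT", "SNP_003": None}
        tree_2 = {"SNP_001": "AA", "SNP_002": "AT", "SNP_003": "TT"}
        print(snp_match_rate(tree_1, tree_2), same_variety(tree_1, tree_2))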

    Certainty Closure: Reliable Constraint Reasoning with Incomplete or Erroneous Data

    Constraint Programming (CP) has proved an effective paradigm to model and solve difficult combinatorial satisfaction and optimisation problems from disparate domains. Many such problems arising from the commercial world are permeated by data uncertainty. Existing CP approaches that accommodate uncertainty are less suited to uncertainty arising from incomplete and erroneous data, because they do not build reliable models and solutions guaranteed to address the user's genuine problem as she perceives it. Other fields, such as reliable computation, offer combinations of models and associated methods to handle these types of uncertain data, but lack an expressive framework characterising the resolution methodology independently of the model. We present a unifying framework that extends the CP formalism in both model and solutions to tackle ill-defined combinatorial problems with incomplete or erroneous data. The certainty closure framework brings together modelling and solving methodologies from different fields into the CP paradigm to provide reliable and efficient approaches for uncertain constraint problems. We demonstrate the applicability of the framework on a case study in network diagnosis. We define resolution forms that give generic templates, with associated operational semantics, from which practical methods for computing reliable solutions are derived. Comment: Revised version.
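    As a minimal sketch of the underlying idea only (a toy finite-domain problem, not the paper's resolution forms or operational semantics), the snippet below treats an uncertain coefficient as a set of possible realisations and gathers every assignment that solves the problem under at least one realisation, so the user's true problem is covered whichever realisation holds; the intersection gives the assignments that are robust under all of them.

    # Toy illustration of reasoning over uncertain constraint data, in the spirit of
    # a "closure" of solutions. The domains, constraint, and realisation set are
    # illustrative assumptions, not taken from the paper.
    from itertools import product

    DOMAIN = range(6)          # x, y in {0,...,5}
    POSSIBLE_C = {4, 5, 6}     # uncertain right-hand side: only its possible values are known

    def solutions(c):
        """Classical solutions of x + y == c over the finite domain."""
        return {(x, y) for x, y in product(DOMAIN, DOMAIN) if x + y == c}

    # Closure: assignments supported by some realisation (no true solution is lost).
    closure = set().union(*(solutions(c) for c in POSSIBLE_C))

    # Robust core: assignments valid under every realisation (empty here, since no
    # single pair sums to 4, 5, and 6 at once).
    robust = set.intersection(*(solutions(c) for c in POSSIBLE_C))

    print(len(closure), robust)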

    Tau Be or not Tau Be? - A Perspective on Service Compatibility and Substitutability

    One of the main open research issues in Service Oriented Computing is to propose automated techniques to analyse service interfaces. A first problem, called compatibility, aims at determining whether a set of services (two in this paper) can be composed together and interact with each other as expected. A related problem is to check the substitutability of one service with another. These problems are especially difficult when behavioural descriptions (i.e., message calls and their ordering) are taken into account in service interfaces. Interfaces should capture the service behaviour as faithfully as possible to make their automated analysis possible, while not exhibiting implementation details. In this position paper, we choose Labelled Transition Systems to specify the behavioural part of service interfaces. In particular, we show that internal behaviours (tau transitions) are necessary in these transition systems in order to detect subtle errors that may occur when composing a set of services together. We also show that tau transitions should be handled differently in the compatibility and substitutability problems: the former requires checking whether compatibility is preserved every time a tau transition is traversed in one interface, whereas the latter requires a precise analysis of tau branchings in order to make the substitution preserve the properties (e.g., a compatibility notion) that were ensured before replacement. Comment: In Proceedings WCSI 2010, arXiv:1010.233
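    To make the role of tau transitions concrete, here is a small, didactic Python sketch (not the compatibility notion formalised in the paper): services are LTSs whose labels "m!"/"m?" denote emission/reception of message m and "tau" an internal move; the product of the two interfaces is explored with tau moves interleaved, and a reachable non-final pair with no possible move is reported as an incompatibility. In the example, the server's internal decision to fail is only detected because the tau branch is explored.

    # Didactic sketch: behavioural compatibility of two services described as
    # Labelled Transition Systems. Transitions: state -> list of (label, target).

    def compatible(lts1, init1, finals1, lts2, init2, finals2):
        """Naive deadlock-freedom check of the product LTS, with tau moves interleaved."""
        seen, stack = set(), [(init1, init2)]
        while stack:
            s1, s2 = stack.pop()
            if (s1, s2) in seen:
                continue
            seen.add((s1, s2))
            succs = []
            # Either side may take its internal tau moves on its own.
            succs += [(t1, s2) for l1, t1 in lts1.get(s1, []) if l1 == "tau"]
            succs += [(s1, t2) for l2, t2 in lts2.get(s2, []) if l2 == "tau"]
            # Visible actions must synchronise on complementary send/receive labels.
            for l1, t1 in lts1.get(s1, []):
                for l2, t2 in lts2.get(s2, []):
                    if l1 != "tau" and l2 != "tau" and l1[:-1] == l2[:-1] and l1[-1] != l2[-1]:
                        succs.append((t1, t2))
            if not succs and not (s1 in finals1 and s2 in finals2):
                return False        # reachable deadlock: the services are incompatible
            stack.extend(succs)
        return True

    # The client sends "req" and then waits for "ack"; the server may internally (tau)
    # move to a failure state and never send "ack".
    client = {"c0": [("req!", "c1")], "c1": [("ack?", "c2")]}
    server = {"s0": [("req?", "s1")], "s1": [("tau", "s_fail"), ("ack!", "s2")]}
    print(compatible(client, "c0", {"c2"}, server, "s0", {"s2"}))   # False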

    Analysis and Verification of Service Interaction Protocols - A Brief Survey

    Modeling and analysis of interactions among services is a crucial issue in Service-Oriented Computing. Composing Web services is a complicated task which requires techniques and tools to verify that the new system will behave correctly. In this paper, we first overview some formal models proposed in the literature to describe services. Second, we give a brief survey of verification techniques that can be used to analyse services and their interaction. Last, we focus on the realizability and conformance of choreographies. Comment: In Proceedings TAV-WEB 2010, arXiv:1009.330

    Diagnosing and measuring incompatibilities between pairs of services

    This text presents a tool, from its design to its implementation, which detects all behavioural incompatibilities between two service interfaces. Unlike prior work, the proposed solution does not simply check whether two services are incompatible or not; it provides a detailed diagnosis, listing the incompatibilities and, for each one, the location in the service interfaces where it occurs. A measure of similarity between interfaces, computed from the output of the detection algorithm, is also proposed. Finally, a visual report of the comparison analysis is provided, which pinpoints the set of incompatibilities that cause one behavioural interface not to simulate another.
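    For illustration only (not the detection algorithm or similarity measure implemented in the tool described above), the sketch below walks two interfaces given as LTSs with the same "m!"/"m?"/"tau" label convention as the earlier sketch, records every reachable state pair in which one side emits a message the other cannot receive at that point, and derives a crude similarity score from the proportion of trouble-free pairs.

    # Didactic sketch: incompatibility diagnosis between two service interfaces.
    # Rather than a yes/no verdict, it reports where the mismatches occur.

    def complementary(l1, l2):
        """True when l1 and l2 are an emission and a reception of the same message."""
        return l1 != "tau" and l2 != "tau" and l1[:-1] == l2[:-1] and l1[-1] != l2[-1]

    def diagnose(lts1, init1, lts2, init2):
        """Return (incompatibilities, similarity): each incompatibility pinpoints a
        state pair and the emissions left without a matching reception there."""
        seen, stack, issues = set(), [(init1, init2)], []
        while stack:
            s1, s2 = stack.pop()
            if (s1, s2) in seen:
                continue
            seen.add((s1, s2))
            out1, out2 = lts1.get(s1, []), lts2.get(s2, [])
            # Follow internal moves and matched visible actions.
            stack += [(t1, s2) for l1, t1 in out1 if l1 == "tau"]
            stack += [(s1, t2) for l2, t2 in out2 if l2 == "tau"]
            stack += [(t1, t2) for l1, t1 in out1 for l2, t2 in out2 if complementary(l1, l2)]
            # Record emissions that the partner cannot receive in this state pair.
            unmatched = [l1 for l1, _ in out1
                         if l1.endswith("!") and not any(complementary(l1, l2) for l2, _ in out2)]
            unmatched += [l2 for l2, _ in out2
                          if l2.endswith("!") and not any(complementary(l1, l2) for l1, _ in out1)]
            if unmatched:
                issues.append(((s1, s2), unmatched))
        similarity = (1.0 - len(issues) / len(seen)) if seen else 1.0
        return issues, similarity

    # The client emits "log" and the server emits "ack", neither of which the partner
    # can receive; the report pinpoints the state pair where the mismatch occurs.
    client = {"c0": [("req!", "c1")], "c1": [("log!", "c2")]}
    server = {"s0": [("req?", "s1")], "s1": [("ack!", "s2")]}
    print(diagnose(client, "c0", server, "s0"))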

    DOSCATs: Double standards for protein quantification

    The two most common techniques for absolute protein quantification are based either on mass spectrometry (MS) or on immunochemical techniques such as western blotting (WB). Western blotting is most often used for protein identification or relative quantification, but it can also be deployed for absolute quantification if appropriate calibration standards are used. MS-based techniques offer superior data quality and reproducibility, but WB offers greater sensitivity and accessibility to most researchers. It would be advantageous to apply both techniques for orthogonal quantification, but their workflows rarely overlap. We describe DOSCATs (DOuble Standard conCATamers), novel calibration standards based on QconCAT technology, to unite these platforms. DOSCATs combine a series of epitope sequences concatenated with tryptic peptides in a single artificial protein, creating internal tryptic peptide standards for MS as well as an intact protein bearing multiple linear epitopes. A DOSCAT protein was designed and constructed to quantify five proteins of the NF-κB pathway. For three target proteins, fold-change and absolute copies-per-cell values measured by MS and WB were in excellent agreement. This demonstrates that DOSCATs can be used as multiplexed, dual-purpose standards, readily deployed in a single workflow, supporting a seamless quantitative transition from MS to WB.

    Constraint solving in uncertain and dynamic environments - a survey

    This article follows a tutorial on dynamic constraint solving given by the authors at CP 2003 (Ninth International Conference on Principles and Practice of Constraint Programming) in Kinsale, Ireland. It offers an overview of the main approaches and techniques that have been proposed in the field of constraint satisfaction to deal with uncertain and dynamic environments.

    Distribution of Alarin Immunoreactivity in the Mouse Brain

    Alarin is a 25 amino acid peptide that belongs to the galanin peptide family. It is derived from the galanin-like peptide (GALP) gene by a splice variant that excludes exon 3. Alarin was first identified in gangliocytes of neuroblastic tumors and later shown to have a vasoactive function in the skin. Recently, alarin was demonstrated to stimulate food intake as well as the hypothalamic–pituitary–gonadal axis in rodents, suggesting that it might be a neuromodulatory peptide in the brain. However, the individual neurons in the central nervous system that express alarin have not been identified. Here, we determined the distribution of alarin-like immunoreactivity (alarin-LI) in the adult murine brain. The specificity of the antibody against alarin was demonstrated by the absence of labeling after pre-absorption of the antiserum with synthetic alarin peptide and in transgenic mouse brains lacking neurons expressing the GALP gene. Alarin-LI was observed in different areas of the murine brain. A high intensity of alarin-LI was detected in the accessory olfactory bulb, the medial preoptic area, the amygdala, different nuclei of the hypothalamus such as the arcuate nucleus and the ventromedial hypothalamic nucleus, the trigeminal complex, the locus coeruleus, the ventral cochlear nucleus, the facial nucleus, and the epithelial layer of the plexus choroideus. The distinct expression pattern of alarin in the adult mouse brain suggests potential functions in reproduction and metabolism.