    A comparison of SNPs and microsatellites as linkage mapping markers: lessons from the zebra finch (Taeniopygia guttata)

    Background: Genetic linkage maps are essential tools when searching for quantitative trait loci (QTL). To maximize genome coverage and provide an evenly spaced marker distribution, a combination of different types of genetic marker is sometimes used. In this study we created linkage maps of four zebra finch (Taeniopygia guttata) chromosomes (1, 1A, 2 and 9) using two types of marker, single nucleotide polymorphisms (SNPs) and microsatellites. To assess the effectiveness and accuracy of each kind of marker we compared maps built with each marker type separately and with both types of marker combined. Linkage map marker order was validated by making comparisons to the assembled zebra finch genome sequence. Results: We showed that marker order was less reliable and linkage map lengths were inflated for microsatellite maps relative to SNP maps, apparently due to differing error rates between the two types of marker. Guidelines on how to minimise the effects of error are provided. In particular, we show that when combining both types of marker, the conventional process of building linkage maps, whereby the most informative markers are added to the map first, has to be modified in order to improve map accuracy. Conclusions: When using multiple types and large numbers of markers to create dense linkage maps, the least error-prone loci (SNPs), rather than the most informative, should be used to create framework maps before the addition of other, potentially more error-prone markers (microsatellites). This raises questions about the accuracy of marker order and predicted recombination rates in previous microsatellite linkage maps that were created using the conventional building process; however, provided suitable error-detection strategies are followed, microsatellite-based maps can continue to be regarded as reasonably reliable.
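
    The ordering guideline in the conclusions can be pictured as a small algorithm. The Haskell fragment below is an illustrative sketch only; the marker records, error rates and ordering function are my own assumptions, not the authors' mapping pipeline, and real map construction would place each added marker at its best-supported position using linkage-mapping software.

        import Data.List (partition, sortOn)

        data MarkerType = SNP | Microsatellite deriving (Eq, Show)

        data Marker = Marker
          { markerName :: String
          , markerType :: MarkerType
          , errorRate  :: Double   -- assumed per-locus genotyping error rate
          } deriving Show

        -- Fix the least error-prone loci (SNPs) as a framework map first,
        -- then add the more error-prone microsatellites afterwards.
        buildMapOrder :: [Marker] -> [Marker]
        buildMapOrder markers =
          let (snps, usats) = partition ((== SNP) . markerType) markers
          in  sortOn errorRate snps ++ sortOn errorRate usats

        main :: IO ()
        main = mapM_ print (buildMapOrder
          [ Marker "snp_chr1_a"  SNP            0.002
          , Marker "usat_chr1_b" Microsatellite 0.020
          , Marker "snp_chr1_c"  SNP            0.001
          ])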

    The C++0x "Concepts" Effort

    C++0x is the working title for the revision of the ISO standard of the C++ programming language that was originally planned for release in 2009 but that was delayed to 2011. The largest language extension in C++0x was "concepts", that is, a collection of features for constraining template parameters. In September of 2008, the C++ standards committee voted the concepts extension into C++0x, but then in July of 2009, the committee voted the concepts extension back out of C++0x. This article is my account of the technical challenges and debates within the "concepts" effort in the years 2003 to 2009. To provide some background, the article also describes the design space for constrained parametric polymorphism, or what is colloquially known as constrained generics. While this article is meant to be generally accessible, the writing is aimed toward readers with a background in functional programming and programming language theory. This article grew out of a lecture at the Spring School on Generic and Indexed Programming at the University of Oxford, March 2010.
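
    As a rough illustration of what "constraining template parameters" means, here is a sketch in Haskell rather than C++ (my example, not the article's): a type-class constraint, a mechanism closely related to concepts, states which operations a generic parameter must support, and the compiler checks the constraint at every use.

        -- A constrained generic function: the `Ord a` constraint plays the role
        -- a concept would play for a C++ template parameter, requiring that the
        -- element type supports comparison. Calling it at a type without an
        -- Ord instance is rejected at compile time.
        smallest :: Ord a => [a] -> Maybe a
        smallest []       = Nothing
        smallest (x : xs) = Just (foldr min x xs)

        main :: IO ()
        main = do
          print (smallest [3, 1, 2 :: Int])   -- Just 1
          print (smallest "concepts")         -- Just 'c'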

    Covariance and Controvariance: a fresh look at an old issue (a primer in advanced type systems for learning functional programmers)

    Twenty years ago, in an article titled "Covariance and contravariance: conflict without a cause", I argued that covariant and contravariant specialization of method parameters in object-oriented programming had different purposes, and I deduced that not only could they both coexist in the same language, but they actually should. In this work I reexamine the result of that article in the light of recent advances in (sub-)typing theory and programming languages, taking a fresh look at this old issue. Actually, the revamping of this problem is just an excuse for writing an essay that aims at explaining sophisticated type-theoretic concepts, in simple terms and by examples, to undergraduate computer science students and/or willing functional programmers. Finally, I took advantage of this opportunity to describe some undocumented advanced techniques of type-system implementation that are known only to a few insiders who have dug into the code of some compilers: therefore, even expert language designers and implementers may find this work worth reading.
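
    The core distinction can be rendered in a functional setting as follows (my sketch, in Haskell rather than the object-oriented setting the article discusses): a function type is covariant in its result and contravariant in its argument, which is exactly the tension behind covariant versus contravariant specialization of method parameters.

        import Data.Functor.Contravariant (Predicate (..), contramap)

        -- Covariance: a producer of `a` composed with `a -> b` yields a producer of `b`.
        mapResult :: (a -> b) -> (r -> a) -> (r -> b)
        mapResult f g = f . g

        -- Contravariance: a consumer of `a` pre-composed with `b -> a` yields a consumer of `b`.
        mapArgument :: (b -> a) -> (a -> r) -> (b -> r)
        mapArgument f g = g . f

        -- Example: a predicate on Double reused on Int by supplying the conversion Int -> Double.
        isLarge :: Predicate Double
        isLarge = Predicate (> 10)

        isLargeInt :: Predicate Int
        isLargeInt = contramap fromIntegral isLarge

        main :: IO ()
        main = print (getPredicate isLargeInt 12)   -- True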

    Higher-Order Process Modeling: Product-Lining, Variability Modeling and Beyond

    We present a graphical and dynamic framework for binding and execution of (business) process models. It is tailored to integrate 1) ad hoc processes modeled graphically, 2) third-party services discovered on the (Inter)net, and 3) (dynamically) synthesized process chains that solve situation-specific tasks, with the synthesis taking place not only at design time but also at runtime. Key to our approach is the introduction of type-safe stacked second-order execution contexts that allow for higher-order process modeling. Tamed by our underlying strict service-oriented notion of abstraction, this approach is also tailored to be used by application experts with little technical knowledge: users can select, modify, construct and then pass (component) processes during process execution as if they were data. We illustrate the impact and essence of our framework along a concrete, realistic (business) process modeling scenario: the development of Springer's browser-based Online Conference Service (OCS). The most advanced feature of our new framework allows one to combine online synthesis with the integration of the synthesized process into the running application. This ability leads to a particularly flexible way of implementing self-adaptation, and to a particularly concise and powerful way of achieving variability not only at design time but also at runtime. Comment: In Proceedings Festschrift for Dave Schmidt, arXiv:1309.455
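
    "Passing processes as if they were data" is, at its core, higher-order programming. The sketch below is my own toy model in Haskell, not the framework's or the OCS's API: a process maps a context to a new context, and a second-order process receives another process as a parameter, so the concrete step can be chosen, or even synthesized, at runtime.

        -- Toy model: a process transforms an execution context in IO.
        type Context = [(String, String)]
        type Process = Context -> IO Context

        -- A first-order, ad hoc process.
        notifyAuthors :: Process
        notifyAuthors ctx = do
          putStrLn "notifying authors"
          pure (("notified", "yes") : ctx)

        -- A second-order process: the review step to run is passed in as data,
        -- so it can be selected or replaced while the surrounding process runs.
        reviewPhase :: Process -> Process
        reviewPhase reviewStep ctx = do
          ctx' <- reviewStep ctx
          notifyAuthors ctx'

        main :: IO ()
        main = do
          let assignReviewers ctx = do
                putStrLn "assigning reviewers"
                pure (("reviewers", "assigned") : ctx)
          finalCtx <- reviewPhase assignReviewers []
          print finalCtx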

    Aspect-Oriented Programming with Type Classes

    We consider the problem of adding aspects to a strongly typed language which supports type classes. We show that type classes as supported by the Glasgow Haskell Compiler can model an AOP style of programming via a simple syntax-directed transformation scheme in which AOP programming idioms are mapped to type classes. The drawback of this approach is that we cannot easily advise functions in programs which carry type annotations. We sketch a more principled approach which is free of such problems by combining ideas from intensional type analysis with advanced overloading resolution strategies. Our results show that type-directed static weaving is closely related to type class resolution -- the process of typing and translating type class programs.
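
    A minimal sketch of the idea in Haskell (my simplification, not the paper's actual transformation scheme): advice is attached to calls at particular argument types by type-class instance selection, so "weaving" an aspect coincides with type-class resolution.

        -- The class supplies a hook around a function call; the default method
        -- is "no advice, just proceed".
        class Advised a where
          advise :: (a -> String) -> a -> String
          advise proceed x = proceed x

        -- Before-advice that fires whenever the advised function is applied to an Int.
        instance Advised Int where
          advise proceed x = "[log: Int argument " ++ show x ++ "] " ++ proceed x

        -- No advice at Bool: the default method is used.
        instance Advised Bool

        -- The advised base function: which advice (if any) runs is decided by
        -- resolving the Advised constraint at each call site.
        describe :: (Show a, Advised a) => a -> String
        describe = advise (\x -> "describe called with " ++ show x)

        main :: IO ()
        main = do
          putStrLn (describe (42 :: Int))
          putStrLn (describe True)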

    Trust, but Verify: Two-Phase Typing for Dynamic Languages

    A key challenge when statically typing so-called dynamic languages is the ubiquity of value-based overloading, where a given function can dynamically reflect upon and behave according to the types of its arguments. Thus, to establish basic types, the analysis must reason precisely about values, but in the presence of higher-order functions and polymorphism, this reasoning itself can require basic types. In this paper we address this chicken-and-egg problem by introducing the framework of two-phase typing. The first "trust" phase performs classical, i.e. flow-, path- and value-insensitive, type checking to assign basic types to various program expressions. When the check inevitably runs into "errors" due to value-insensitivity, it wraps problematic expressions with DEAD-casts, which explicate the trust obligations that must be discharged by the second phase. The second phase uses refinement typing, a flow- and path-sensitive analysis that decorates the first phase's types with logical predicates to track value relationships and thereby verify the casts and establish other correctness properties for dynamically typed languages.
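
    Value-based overloading can be pictured with a tiny model (my sketch in Haskell, not the paper's system, which targets dynamically typed languages): a dynamic value is a tagged sum, and the function's behaviour depends on the runtime tag of its argument, just as a typeof-test would in a dynamic language.

        -- A dynamic value as a tagged sum.
        data Value = VNum Double | VBool Bool
          deriving Show

        -- "negate" in a dynamic language: numeric negation on numbers,
        -- logical negation on booleans.
        dynNegate :: Value -> Value
        dynNegate (VNum n)  = VNum (negate n)
        dynNegate (VBool b) = VBool (not b)

        -- In the paper's terms, a first, value-insensitive phase would assign
        -- dynNegate a basic type and wrap the branches it cannot justify in
        -- DEAD-casts; the second, path-sensitive refinement phase would then
        -- check that each branch is only reached with the matching tag.
        main :: IO ()
        main = do
          print (dynNegate (VNum 3.5))
          print (dynNegate (VBool False))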