
    Cholesteric Liquid Crystal Displays as Optical Sensors of Barbiturate Binding

    The influence of neutral-molecule binding by mesogen/receptors in the mesomorphic phase on the optical properties of cholesteric liquid crystal displays (LCDs) was examined. The motivation was to prepare neutral-molecule sensors that use a colour change to signal analyte binding. A receptor that binds barbiturate analytes was modified with two or one cholesteryl groups to yield compounds 2 and 3, respectively. LCDs were prepared by incorporating one of the receptor/mesogen compounds into a cholesteric LC blend along with a potential H-bonding guest. The optical properties of the LCDs were then determined by measuring the absorbance of the displays. For the various LCDs, the colour of the display depended on several factors: the amount of guest molecule used, the number of cholesteryl side chains on the receptor, and the mole concentration of receptor/mesogen in the blend. In particular, complementary host/guest binding of H-bonding analytes by the bis(cholesteryl) receptor 2 in a cholesteric LCD caused a shift of up to +70 nm, observed by the naked eye as a blue-to-orange colour change. Control experiments confirmed that the colour of an LCD is a consequence of molecular recognition in the mesomorphic phase.
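
    As standard background (not stated in the abstract itself): the colour of a cholesteric LC arises from selective reflection, whose central wavelength is set by the helical pitch, so any binding event that perturbs the pitch shifts the visible colour. In LaTeX form,

        \lambda_0 = \bar{n}\, p, \qquad \Delta\lambda_0 \approx \bar{n}\, \Delta p,

    where \lambda_0 is the centre of the selective-reflection band, \bar{n} the average refractive index of the blend, and p the helical pitch. Under this relation, the reported +70 nm shift would correspond to a pitch change of roughly \Delta p \approx 70/1.55 \approx 45 nm for a typical \bar{n} \approx 1.55 (an assumed, illustrative value).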

    Proof Explanation in the DR-DEVICE System

    Trust is a vital feature of the Semantic Web: if users (humans and agents) are to use and integrate system answers, they must trust them. Systems should therefore be able to explain their actions, sources, and beliefs; this issue is the topic of the proof layer in the design of the Semantic Web. This paper presents the design and implementation of a system for proof explanation on the Semantic Web, based on defeasible reasoning. The basis of this work is the DR-DEVICE system, which is extended to handle proofs. A critical aspect is the representation of proofs in an XML language, which is achieved through an extension of the RuleML language.
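
    To convey the flavour of defeasible reasoning with a recorded proof trace, here is a minimal sketch in Python. It is emphatically not DR-DEVICE's machinery (DR-DEVICE is built on CLIPS over RDF, and its proofs are serialised in a RuleML extension); all names and structures below are invented for illustration.

        # Minimal illustrative defeasible inference with a proof trace.
        # Not DR-DEVICE's actual implementation; names are invented.

        def neg(lit):
            """Classical negation of a literal, e.g. 'flies' <-> '~flies'."""
            return lit[1:] if lit.startswith("~") else "~" + lit

        def derive(facts, strict, defeasible, superior):
            """Forward-chain to a fixpoint, logging why each literal holds."""
            known = set(facts)
            trace = [("fact", f, None) for f in facts]
            changed = True
            while changed:
                changed = False
                # Strict rules fire unconditionally once their body holds.
                for name, body, head in strict:
                    if head not in known and all(b in known for b in body):
                        known.add(head)
                        trace.append(("strict", head, name))
                        changed = True
                # A defeasible rule fires only if every applicable counter-rule
                # (one concluding the negated head) is inferior to it.
                for name, body, head in defeasible:
                    if head in known or not all(b in known for b in body):
                        continue
                    attackers = [n for n, b2, h2 in defeasible
                                 if h2 == neg(head) and all(x in known for x in b2)]
                    if neg(head) not in known and all((name, a) in superior for a in attackers):
                        known.add(head)
                        trace.append(("defeasible", head, name))
                        changed = True
            return known, trace

        # Classic example: penguins are birds (strict); birds fly, penguins
        # don't (both defeasible); the penguin rule overrides the bird rule.
        strict = [("r1", ["penguin"], "bird")]
        defeasible = [("r2", ["bird"], "flies"), ("r3", ["penguin"], "~flies")]
        superior = {("r3", "r2")}          # (stronger, weaker) pairs
        known, trace = derive({"penguin"}, strict, defeasible, superior)
        for kind, lit, rule in trace:
            print(f"{lit:8} via {kind} {rule or ''}")   # the trace is the 'proof'

    The printed trace (penguin as fact, bird via r1, ~flies via r3 because r3 defeats r2) is exactly the kind of justification that a proof layer would serialise in XML for a user or agent to inspect.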

    Ada and Grace: Direct Interaction with Museum Visitors

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. The idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is therefore ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic but scientifically better-founded approach to mixture risk assessment.
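
    The distribution-dependence point lends itself to a small numerical illustration. The sketch below is our own, not from the paper, and the lognormal parameters are arbitrary assumptions. It contrasts two probabilistic readings of "multiply four sub-factors": the product of the four sub-factors' individual 95th percentiles versus the 95th percentile of the product distribution. The two differ substantially, so the protection actually achieved hinges on the assumed distributions.

        # Monte Carlo sketch: the 'conservatism' of a multiplied uncertainty
        # factor depends on the assumed sub-factor distributions. Parameters
        # are illustrative assumptions, not values from Martin et al. (2013).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000

        # Four sub-factors (interspecies/intraspecies x toxicokinetic/
        # toxicodynamic), each lognormal with median ~3.16, so the four
        # medians multiply to the default factor of 100.
        median, gsd = 3.16, 1.8      # geometric standard deviation: assumed
        draws = rng.lognormal(mean=np.log(median), sigma=np.log(gsd), size=(4, n))

        product = draws.prod(axis=0)

        p95_each = np.percentile(draws, 95, axis=1)  # 95th pct per sub-factor
        naive = p95_each.prod()                      # multiply the percentiles
        true_p95 = np.percentile(product, 95)        # 95th pct of the product

        print(f"product of per-factor 95th percentiles: {naive:8.0f}")
        print(f"95th percentile of the product:         {true_p95:8.0f}")
        print(f"default factor:                              100")

    With these assumed parameters, multiplying per-factor 95th percentiles gives a factor of several thousand while the 95th percentile of the product is only a few hundred: stacking "conservative" percentiles is not the same as a conservative product, and both answers move with the chosen distributions.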

    Integration of DFDs into a UML-based model-driven engineering approach

    The main aim of this article is to discuss how the functional and the object-oriented views can be interplayed to represent the various modeling perspectives of embedded systems. We discuss whether the object-oriented modeling paradigm, the predominant one for developing software at the present time, is also adequate for modeling embedded software, and how it can be used together with the functional paradigm. More specifically, we present how the main modeling tool of the traditional structured methods, data flow diagrams (DFDs), can be integrated into an object-oriented development strategy based on the Unified Modeling Language (UML). The rationale behind the approach is that both views are important for modeling purposes in embedded systems environments, and thus a combined and integrated model is not only useful but also fundamental for developing complex systems. The approach was integrated into a model-driven engineering process, where tool support for the models used was provided. In addition, model transformations have been specified and implemented to automate the process. We exemplify the approach with an IPv6 router case study. FEDER - Fundação para a Ciência e a Tecnologia (HH-02-383)
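
    As a rough illustration of the kind of mapping rule such a model transformation encodes, here is a toy sketch. It is our own construction, not the article's transformation (which is specified and implemented within an MDE toolchain); the particular rule shown (DFD process becomes a UML class whose operations come from incoming flows, data store becomes an <<entity>> class) is an assumed simplification, and the router fragment is likewise invented.

        # Illustrative DFD -> UML mapping sketch (assumed rule, see above).
        # External entities (e.g. 'Network') are left as plain names here.
        from dataclasses import dataclass

        @dataclass
        class Flow:
            name: str
            source: str
            target: str

        @dataclass
        class DFD:
            processes: list[str]
            stores: list[str]
            flows: list[Flow]

        def dfd_to_uml(dfd: DFD) -> str:
            """Emit a crude textual UML class model from a DFD."""
            lines = []
            for p in dfd.processes:
                ops = [f.name for f in dfd.flows if f.target == p]
                lines.append(f"class {p} {{ " + "; ".join(f"{o}()" for o in ops) + " }")
            for s in dfd.stores:
                lines.append(f"class {s} <<entity>> {{ }}")
            for f in dfd.flows:
                lines.append(f"{f.source} --> {f.target} : {f.name}")
            return "\n".join(lines)

        # Tiny fragment loosely inspired by the IPv6 router case study.
        dfd = DFD(
            processes=["ForwardPacket", "UpdateRoutingTable"],
            stores=["RoutingTable"],
            flows=[
                Flow("packetIn", "Network", "ForwardPacket"),
                Flow("routeLookup", "ForwardPacket", "RoutingTable"),
                Flow("routeAdvert", "Network", "UpdateRoutingTable"),
            ],
        )
        print(dfd_to_uml(dfd))

    A real MDE pipeline would express such rules declaratively (e.g. in a transformation language) and target a proper UML metamodel rather than text, but the source-to-target structure of the rule is the same.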

    Motion Rail: A Virtual Reality Level Crossing Training Application

    This paper presents the development and usability testing of a Virtual Reality (VR) based system named 'Motion Rail' for training children in railway level crossing safety. The children use a VR head-mounted device and a controller to navigate the VR environment and perform a level crossing task, receiving instant pass-or-fail feedback on a display in the VR environment. Five participants, two males and three females, took part in the usability test. The outcomes of the test were promising: the children were highly engaged and would like to adopt this training approach in future safety training.

    Developing Ontologies within Decentralized Settings

    This chapter addresses two research questions: “How should a well-engineered methodology facilitate the development of ontologies within communities of practice?” and “What methodology should be used?” If ontologies are to be developed by communities, then the ontology development life cycle should be better understood within this context. This chapter presents the Melting Point (MP), a proposed new methodology for developing ontologies within decentralised settings. It describes how MP was developed by taking best practices from other methodologies, provides details on recommended steps and processes, and compares MP with alternatives. The methodology presented here is the product of direct first-hand experience and observation of the biological communities of practice in which some of the authors have been involved. The Melting Point is a methodology engineered for decentralised communities of practice in which the designers of the technology and its users may be the same group. As such, MP provides a potential foundation for the establishment of standard practices for ontology engineering.