
    Plant-like substitutions in the large-subunit carboxy terminus of Chlamydomonas Rubisco increase CO2/O2 Specificity

    Background: Rubisco (ribulose-1,5-bisphosphate carboxylase/oxygenase) is the rate-limiting enzyme in photosynthesis. The catalytic large subunit of the green-algal enzyme from Chlamydomonas reinhardtii is ~90% identical in sequence to those of flowering plants, yet the enzymes confer diverse kinetic properties. To identify the regions that may account for this species variation, directed mutagenesis and chloroplast transformation were used to create four amino-acid substitutions in the carboxy terminus of the Chlamydomonas large subunit, mimicking the sequence of higher-specificity plant enzymes. Results: The quadruple-mutant enzyme has a 10% increase in CO2/O2 specificity and a lower carboxylation catalytic efficiency. The mutations do not appear to affect protein expression, structural stability, or function in vivo. Conclusion: Owing to its decreased carboxylation catalytic efficiency, the quadruple mutant is not a "better" enzyme. Nonetheless, because of its positive influence on specificity, the carboxy terminus, although relatively far from the active site, may serve as a target for enzyme improvement via combinatorial approaches.
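The CO2/O2 specificity discussed above is conventionally quantified by the specificity factor; the following is the standard definition from the Rubisco literature, not a formula taken from this abstract:

```latex
% Rubisco specificity factor: carboxylation efficiency over oxygenation efficiency
\Omega = \frac{V_c / K_c}{V_o / K_o} = \frac{V_c K_o}{V_o K_c}
```

Here V_c and V_o are the maximal velocities of carboxylation and oxygenation, and K_c and K_o are the Michaelis constants for CO2 and O2. An enzyme can thus gain specificity (larger Ω) while losing carboxylation catalytic efficiency (V_c/K_c), which is the trade-off the quadruple mutant exhibits.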

    Conceptualizing a Framework of Institutionalized Appellate Arbitration in International Commercial Arbitration

    The absence of any option for substantive appeal from arbitral adjudication is a conspicuous systemic peculiarity of the arbitral process. While this absence has largely been accepted without question as an axiomatic feature of arbitration, the last two decades have seen growing criticism of it from both scholars and the business community. The criticism has been especially emphatic in relation to international commercial arbitrations, a sizeable proportion of which concern complex, high-stakes disputes. There has also been a concurrent increase in demand from commercial parties for the option to subject arbitral awards to substantive review. In response, some major ADR institutions such as CPR, JAMS and, recently, the AAA have introduced provisions within their respective frameworks for the optional substantive review of arbitral awards. These developments call for a re-evaluation of whether the absence of substantive appeals is truly integral to arbitration and, more importantly, whether that absence serves the interests of parties who wish to arbitrate their disputes. This in turn requires weighing the arguments advanced both for and against the institutionalization of arbitral appeals, i.e. introducing them at the State level and not only at the level of ADR institutions, to determine two things: first, whether the benefits would significantly outweigh the potential drawbacks, and second, whether the effects of such drawbacks can be offset, if not eliminated altogether. The present article answers both questions in the affirmative. Further, it conceptualizes a model of appellate arbitration tailored specifically to address the possible drawbacks of institutionalizing arbitral appeals.
This model, which I have named the "Novel Appellate Arbitration Model" (NAAM), is structured so as to facilitate meaningful error correction while largely preserving the 'classical advantages' of arbitration: finality, speed, neutrality and inexpensiveness.

    Optimal designs for two-stage genome-wide association studies

    Genome-wide association (GWA) studies require genotyping hundreds of thousands of markers on thousands of subjects, and are expensive at current genotyping costs. To conserve resources, many GWA studies adopt a staged design in which a proportion of the available samples are genotyped on all markers in stage 1, and a proportion of these markers are genotyped on the remaining samples in stage 2. We describe a strategy for designing cost-effective two-stage GWA studies. Our strategy preserves much of the power of the corresponding one-stage design and minimizes the genotyping cost of the study while allowing for differences in per-genotype cost between stages 1 and 2. We show that the ratio of stage 2 to stage 1 per-genotype cost can strongly influence both the optimal design and the genotyping cost of the study. Increasing the stage 2 per-genotype cost shifts more of the genotyping and study cost to stage 1 and increases the overall cost of the study. This higher cost can be partially mitigated by adopting a design with reduced power while preserving the false positive rate, or by increasing the false positive rate while preserving power. For example, reducing the power preserved in the two-stage design from 99% to 95% of that of the one-stage design decreases the two-stage study cost by ~15%. Alternatively, the same cost savings can be had by relaxing the false positive rate 2.5-fold, for example from 1/300,000 to 2.5/300,000, while retaining the same power. Genet. Epidemiol. 2007. © 2007 Wiley-Liss, Inc.
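The two-stage cost structure described above can be made concrete with a small sketch. The function and its parameter names below are illustrative assumptions, not the paper's notation: stage 1 genotypes all markers on a fraction of the subjects, and stage 2 genotypes only the markers that pass stage 1 on the remaining subjects, possibly at a different per-genotype cost.

```python
# Hypothetical cost model for a two-stage GWA design (names are illustrative,
# not taken from the paper).

def two_stage_cost(n_subjects, n_markers, p_samples, p_markers, c1, c2):
    """Total genotyping cost.

    p_samples: fraction of subjects genotyped in stage 1
    p_markers: fraction of markers followed up in stage 2
    c1, c2:    per-genotype cost in stage 1 and stage 2
    """
    n1 = p_samples * n_subjects           # subjects genotyped in stage 1
    n2 = (1 - p_samples) * n_subjects     # subjects genotyped in stage 2
    stage1 = n1 * n_markers * c1          # all markers in stage 1
    stage2 = n2 * (p_markers * n_markers) * c2  # follow-up markers only
    return stage1 + stage2

# Raising the stage 2 per-genotype cost relative to stage 1 raises the total
# cost, the effect the abstract describes.
base = two_stage_cost(6000, 300_000, 0.30, 0.01, 1.0, 1.0)
pricier = two_stage_cost(6000, 300_000, 0.30, 0.01, 1.0, 20.0)
```

In practice the design variables (p_samples, p_markers, significance thresholds) would be optimized jointly under a power constraint; the sketch only shows how the cost ratio enters the objective.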

    DFT Techniques and Automation for Asynchronous NULL Conventional Logic Circuits

    Conventional automatic test pattern generation (ATPG) algorithms fail when applied to asynchronous NULL convention logic (NCL) circuits due to the absence of a global clock and the presence of many more state-holding elements, leading to poor fault coverage. This paper presents a design-for-test (DFT) approach aimed at making asynchronous NCL designs testable using conventional ATPG programs. We propose an automatic DFT insertion flow (ADIF) methodology that performs scan and test point insertion on NCL designs to improve test coverage, using a custom ATPG library. Experimental results show a significant increase in fault coverage for both cyclic and acyclic pipelined NCL designs.

    A novel system to obtain addresses of out-patients: assessment in routine clinic practice in Madras

    A novel method of obtaining accurate home addresses from out-patients was introduced as a routine procedure in 6 chest clinics of Madras City, following highly satisfactory results under study conditions. In this method, the patient is given a card (the address card) and asked to get his exact address entered on it by any knowledgeable person of his choice, such as a landlord or neighbour. An assessment of the system was undertaken after it had been in operation for about 8 months. A complete and legible address was available for 82% of 3956 patients, the range in the 6 clinics being 74% to 91%. The main causes of failure were: not giving the address card to the patient (7%), the patient not reattending the clinic (6%), and the patient reattending but not returning the address card (3%). Corrective measures have now been introduced, and a re-assessment will be undertaken in due course.

    Automated energy calculation and estimation for delay-insensitive digital circuits

    With increasingly smaller feature sizes and higher on-chip densities, the power dissipation of VLSI systems has become a primary concern for designers. This paper first describes a procedure to simulate a transistor-level design using a VHDL testbench, and then presents a fast and efficient energy estimation approach for delay-insensitive (DI) systems, based on gate-level switching. Specifically, the VHDL testbench reads the transistor-level design's outputs and supplies the inputs accordingly, also allowing for automatic checking of functional correctness. This type of transistor-level simulation is absolutely necessary for asynchronous circuits because the inputs change relative to handshaking signals, which are not periodic, instead of changing relative to a periodic clock pulse as in synchronous systems. The method further supports automated calculation of power and energy metrics. The energy estimation approach produces results three orders of magnitude faster than transistor-level simulation, has been automated, and works with standard industrial design tool suites, such as Mentor Graphics and Synopsys. Both methods are applied to the NULL Convention Logic (NCL) DI paradigm, first demonstrated using a simple NCL sequencer and then tested on a number of different NCL 4-bit × 4-bit unsigned multiplier architectures. Energy per operation is automatically calculated for both methods, using an exhaustive testbench to simulate all input combinations and to check for functional correctness. The results show that both methods produce the desired output for all circuits, and that the gate-level switching approach developed herein produces results more than 1000 times faster than transistor-level simulation, with energy estimates that fall within the range obtained by two different industry-standard transistor-level simulators. Hence, the developed energy estimation method is extremely useful for quickly determining how architecture changes affect energy usage.
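The core of a gate-level switching energy estimate is simple to sketch: count output transitions per gate type during a gate-level simulation, then weight each count by a characterized per-transition energy. The gate names and energy values below are illustrative assumptions, not the paper's characterized library (TH12, TH22 and TH33 are standard NCL threshold-gate names used here only as example keys).

```python
# Sketch of gate-level switching energy estimation for an NCL design.
# Per-transition energies (picojoules) are made-up illustrative numbers,
# not characterized library data.
ENERGY_PJ = {"TH12": 0.10, "TH22": 0.12, "TH33": 0.18}

def switching_energy(transition_counts):
    """Estimate energy (pJ) for one operation.

    transition_counts: {gate_type: output transitions observed in a
    gate-level simulation of that operation}.
    """
    return sum(ENERGY_PJ[g] * n for g, n in transition_counts.items())

# Counts would come from a VHDL gate-level simulation of one operation.
est = switching_energy({"TH12": 40, "TH22": 25, "TH33": 10})
```

Because this replaces analog transistor-level simulation with counting at the gate level, it is plausible that it runs orders of magnitude faster, at the cost of depending on the accuracy of the per-gate characterization.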

    Genetic screening for gynecological cancer: where are we heading?

    The landscape of cancer genetics in gynecological oncology is rapidly changing. The traditional family history-based approach has limitations and misses >50% of mutation carriers, and it is now being replaced by population-based approaches. The need for changing the clinical paradigm from family history-based to population-based BRCA1/BRCA2 testing in Ashkenazi Jews is supported by data demonstrating that population-based BRCA1/BRCA2 testing does not cause psychological harm and is cost effective. This article covers various genetic testing strategies for gynecological cancers, including population-based approaches, panel and direct-to-consumer testing, as well as the need for innovative approaches to genetic counseling. Advances in genetic testing technology and computational analytics have facilitated an integrated systems medicine approach, providing increasing potential for population-based genetic testing, risk stratification, and cancer prevention. Genomic information, along with biological and computational tools, will be used to deliver predictive, preventive, personalized and participatory (P4) and precision medicine in the future.

    Hands-On Projects and Exercises to Strengthen Understanding of Basic Computer Engineering Concepts

    The Introduction to Computer Engineering course at the University of Missouri-Rolla provides a thorough understanding of basic digital logic analysis and design. The course covers: digital numbering systems, Boolean algebra, function minimization using Karnaugh maps (K-maps), memory elements, and sequential logic design. Students' grades are determined by their performance on homework assignments, quizzes, and in-class examinations. A laboratory course (optional for all but EE and CpE majors) supplements the lecture by providing experiments that include analysis and design using Mentor Graphics and FPGAs. While the laboratory is a very useful supplement to the lecture, almost half the students taking the lecture are not required to take the laboratory, and there is not sufficient time in the laboratory schedule to introduce significant design elements. In Fall 2004, hands-on group projects were introduced to the lecture course for all students. The goal was for students to develop a more practical understanding and appreciation of hardware design and to improve motivation. Two projects were introduced that involve design of simple digital systems (based on practical applications), design optimization, and physical realization of the system using logic gates and/or memory elements. Two surveys, conducted during the semester, show the benefit of hands-on projects in gaining experience with basic digital hardware design.
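The K-map minimization taught in such a course can be checked mechanically by exhaustive truth-table comparison. The function below is an illustrative example chosen here, not an exercise from the course: F(A,B,C) = Σm(1,3,5,7) groups into a single K-map cell covering C = 1, so the minimized form is F = C.

```python
# Exhaustive verification that a K-map minimization preserves the function.

def f_minterms(a, b, c):
    """F as a sum of minterms 1, 3, 5, 7 of (A, B, C)."""
    index = (a << 2) | (b << 1) | c
    return 1 if index in (1, 3, 5, 7) else 0

def f_minimized(a, b, c):
    """K-map result: all four minterms share C = 1, so F = C."""
    return c

# Verify over all 8 input combinations that the two forms agree.
assert all(f_minterms(a, b, c) == f_minimized(a, b, c)
           for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

The same exhaustive-comparison pattern scales to checking any hand-minimized expression against its original truth table, which is exactly the kind of self-check that hands-on digital design projects encourage.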