
    Lipase-catalyzed Reactions at Interfaces of Two-phase Systems and Microemulsions

    This work describes the influence of two polar lipids, Sn-1/3 and Sn-2 monopalmitin, on the activity of lipase in biphasic systems and in microemulsions. In previous communications, we have shown that Sn-2 monoglycerides can displace Sn-1,3-regiospecific lipases from the oil-water interface, drastically reducing the rate of lipolysis. We here demonstrate that even if the lipase is expelled from the interface, it can catalyze esterification of the Sn-2 monoglyceride with fatty acids in both macroscopic oil-water systems and in microemulsions, leading to the formation of di- and triglycerides.

    IMPACT OF A MORE INTENSIVE INSECT PEST INFESTATION LEVEL ON COTTON PRODUCTION: TEXAS HIGH PLAINS

    This study evaluated the implications of increased bollworm problems in a 20-county area of the Texas High Plains for cotton yields and economic impact. Results did not indicate a serious effect of bollworms on lint yield when insecticides were used for control. However, the estimated annual reduction in farmer profit due to the bollworm for 1979-81 was over $30 million. Yields were estimated to decline by about 300,000 bales without insecticide use and about 30,000 bales with insecticide use. This decline suggests potentially serious implications for the comparative economic position of cotton in this region if insecticide resistance were to develop among insect pests.

    Review of SERT 2 power conditioning

    SERT 2 spacecraft power conditioner performance

    Finding the optimal background subtraction algorithm for EuroHockey 2015 video

    Background subtraction is a classic step in a vision-based localization and tracking workflow. Previous studies have compared background subtraction algorithms on publicly available datasets; however, comparisons were made only with manually optimized parameters. The aim of this research was to identify the optimal background subtraction algorithm for a set of field hockey videos captured at EuroHockey 2015. Particle Swarm Optimization was applied to find the optimal background subtraction algorithm. The objective function was the F-score, i.e. the harmonic mean of precision and recall. Precision and recall were calculated by comparing the output of the background subtraction algorithm against gold-standard labeled images. The training dataset consisted of 15 × 13-second field hockey video segments; the test dataset consisted of 5 × 13-second segments. The video segments were chosen to be representative of the teams present at the tournament, the times of day the matches were played, and the weather conditions experienced. Each segment was 960 × 540 pixels and had 10 ground-truth labeled frames. Eight commonly used background subtraction algorithms were considered. Results suggest that a background subtraction algorithm must use optimized parameters for a valid comparison of performance, and that Particle Swarm Optimization is an appropriate method to undertake this optimization. The optimal algorithm, Temporal Median, achieved an F-score of 0.791 on the test dataset, suggesting it generalizes to the rest of the video footage captured at EuroHockey 2015.
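    The Temporal Median algorithm and the F-score objective described above are simple to sketch. The following is a minimal illustration under assumed conventions (grayscale frames, a fixed difference threshold, hypothetical function names), not the study's actual implementation:

    ```python
    import numpy as np

    def temporal_median_subtract(frames, threshold=30):
        """Temporal-median background subtraction.

        frames: array of shape (T, H, W), grayscale uint8.
        The background model is the per-pixel median over time; the
        foreground mask for the last frame is where it differs from
        the background by more than `threshold`.
        """
        background = np.median(frames, axis=0)
        diff = np.abs(frames[-1].astype(np.int16) - background)
        return diff > threshold

    def f_score(pred, truth):
        """Harmonic mean of precision and recall over boolean masks."""
        tp = np.logical_and(pred, truth).sum()
        precision = tp / max(pred.sum(), 1)
        recall = tp / max(truth.sum(), 1)
        if precision + recall == 0:
            return 0.0
        return 2 * precision * recall / (precision + recall)

    # Synthetic example: a static scene where a bright "player" blob
    # appears only in the final frame.
    frames = np.full((10, 8, 8), 50, dtype=np.uint8)
    frames[-1, 2:4, 2:4] = 200
    mask = temporal_median_subtract(frames)
    ```

    In an optimization loop like the one described, Particle Swarm Optimization would propose values such as `threshold` and the F-score against the ground-truth masks would serve as the fitness value.
    
    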

    El sistema constructivo (The construction system)

    A description of the timber platform-frame construction system ("plataforma") used preferentially in multi-story buildings at the Sewell mining camp, Chile.

    Pathophysiology of acute experimental pancreatitis: Lessons from genetically engineered animal models and new molecular approaches

    The incidence of acute pancreatitis is growing, and worldwide population-based studies report a doubling or tripling since the 1970s. About 25% of acute pancreatitis cases are severe and associated with histological changes of necrotizing pancreatitis. There is still no specific medical treatment for acute pancreatitis, and average mortality remains around 10%. To develop new specific medical treatment strategies for acute pancreatitis, a better understanding of the pathophysiology during its onset is necessary. Since it is difficult to study the early acinar events in human pancreatitis, several animal models of acute pancreatitis have been developed in the hope of providing clues to the human pathophysiology. In the last decade, major progress has been made by employing molecular biology techniques, and the genome of the mouse was recently sequenced. Various strategies can establish a causal effect of a single gene or protein, using either gain-of-function studies (i.e., overexpression of the protein of interest) or loss-of-function studies (i.e., genetic deletion of the gene of interest). The availability of transgenic mouse models and gene deletion studies has clearly increased our knowledge of the pathophysiology of acute pancreatitis and enables us to confirm in vitro findings in animal models. In addition, transgenic models with specific genetic deletion or overexpression of genes help in understanding the role of one specific protein in a cascade of inflammatory processes such as pancreatitis, where different proteins interact and co-react. This review summarizes the recent progress in this field. Copyright (c) 2005 S. Karger AG, Basel

    Validity constraints for data analysis workflows

    © 2024. Porting a scientific data analysis workflow (DAW) to a cluster infrastructure, a new software stack, or even just a new dataset with notably different properties is often challenging. Despite the structured definition of the steps (tasks) and their interdependencies during a complex data analysis in the DAW specification, relevant assumptions may remain unspecified and implicit. Such hidden assumptions often lead to crashing tasks without a reasonable error message, poor performance in general, non-terminating executions, or silently wrong results of the DAW, to name only a few possible consequences. Searching for the causes of such errors and drawbacks in a distributed compute cluster managed by a complex infrastructure stack, where DAWs for large datasets are typically executed, can be tedious and time-consuming. We propose validity constraints (VCs) as a new concept for DAW languages to alleviate this situation. A VC is a constraint specifying logical conditions that must be fulfilled at certain times for DAW executions to be valid. When defined together with a DAW, VCs help to improve the portability, adaptability, and reusability of DAWs by making implicit assumptions explicit. Once specified, VCs can be checked automatically by the DAW infrastructure, and violations can lead to meaningful error messages and graceful behavior (e.g., termination or invocation of repair mechanisms). We provide a broad list of possible VCs, classify them along multiple dimensions, and compare them to similar concepts one can find in related fields. We also provide a proof-of-concept implementation for the workflow system Nextflow
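    The idea of a validity constraint can be illustrated with a small sketch: a precondition check on a task's input files that fails fast with a meaningful message instead of letting the task crash later with an obscure error. This is a hypothetical example, not the paper's Nextflow implementation; the input layout and function names are assumptions:

    ```python
    from pathlib import Path

    def check_validity_constraints(inputs, min_bytes=1):
        """Evaluate simple validity constraints before launching a DAW task.

        inputs: mapping of logical input names to file paths (assumed layout).
        Each constraint here is a logical condition that must hold for the
        execution to be valid: the file exists and is non-empty. Violations
        are collected and reported together in a meaningful error message.
        """
        violations = []
        for name, path in inputs.items():
            p = Path(path)
            if not p.exists():
                violations.append(f"{name}: file {path} does not exist")
            elif p.stat().st_size < min_bytes:
                violations.append(f"{name}: file {path} is empty")
        if violations:
            raise ValueError("validity constraints violated: " + "; ".join(violations))
    ```

    In a real DAW system such checks would be declared alongside the workflow specification and evaluated by the infrastructure at defined points (e.g., before task start), rather than hand-coded inside each task.
    
    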