    Farrowing accommodation for organic pigs

    Newborn piglets in organic farrowing pens have a lower survival rate than in conventional farrowing pens. This difference is mainly caused by housing the sow loose rather than crated, and by the climatic effects of the outdoor temperature. Organic lactating sows should have at least a 7.5 m² indoor area with straw and a 2.5 m² outdoor run. The aim of the project was to increase piglet survival in order to improve animal welfare as well as the profitability of organic farms. In the experiment we compared survival and behaviour in three pen types: type 1 with an outdoor run, type 2 with an indoor run and a higher proportion of solid floor, and type 3 without a run. Data were analysed with Analysis of Variance using parity and number of liveborn piglets as covariates. Results of 131 litters in total showed 9.6a, 10.8b and 9.4a (p=0.05) weaned piglets per litter for pen types 1, 2 and 3. Fouling scores indicating dunging behaviour in the indoor lying area were 13a, 21b and 19b (p=0.04) for pen types 1, 2 and 3. We found a tendency for litters with high survival rates to use the separate piglet nest for lying sooner than litters with low survival rates. Climatic conditions seemed to be crucial for the vitality and survival of the newborn piglets. The better climatic conditions combined with the higher proportion of solid floor resulted in a higher piglet survival rate. These results are currently being used in a second experiment focussing on extra heating around farrowing and the solid floor proportion in a new farrowing accommodation.
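    The analysis described above (Analysis of Variance across the three pen types) can be sketched as follows. This is an illustrative one-way ANOVA F-statistic on synthetic litter counts, not the study's data; the real analysis additionally adjusts for parity and number of liveborn piglets as covariates, which this sketch omits.

    ```python
    # Illustrative one-way ANOVA F-statistic across three pen types.
    # The litter counts below are synthetic, chosen only so that the group
    # means match the reported 9.6, 10.8 and 9.4 weaned piglets per litter.
    from statistics import mean

    groups = {
        "type1_outdoor_run": [9, 10, 9, 10, 10],   # mean 9.6
        "type2_indoor_run":  [11, 10, 11, 12, 10], # mean 10.8
        "type3_no_run":      [9, 9, 10, 9, 10],    # mean 9.4
    }

    def one_way_anova_f(groups):
        """Return the F-statistic for a one-way ANOVA over the given groups."""
        data = [x for g in groups.values() for x in g]
        grand = mean(data)
        k = len(groups)                 # number of groups (pen types)
        n = len(data)                   # total observations (litters)
        # Between-group sum of squares: spread of group means around the grand mean
        ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups.values())
        # Within-group sum of squares: spread of observations around their group mean
        ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups.values())
        return (ss_between / (k - 1)) / (ss_within / (n - k))

    print(round(one_way_anova_f(groups), 2))
    ```

    A large F relative to the F-distribution with (k-1, n-k) degrees of freedom indicates that the pen types differ in mean weaned piglets per litter, as the reported p=0.05 suggests.
    
    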

    Control versus Data Flow in Parallel Database Machines

    The execution of a query in a parallel database machine can be controlled in either a control flow way or a data flow way. In the former case a single system node controls the entire query execution. In the latter case the processes that execute the query, although possibly running on different nodes of the system, trigger each other. Lately, many database research projects have focused on data flow control, since it should enhance response times and throughput. The authors study control versus data flow with regard to controlling the execution of database queries. An analytical model is used to compare control and data flow in order to gain insight into which mechanism is better under which circumstances. Also, some systems using data flow techniques are described, and the authors investigate to what degree they are really data flow. The results show that for particular types of queries data flow is very attractive, since it reduces the number of control messages and balances these messages over the nodes.
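    The contrast between the two mechanisms can be made concrete with a toy operator pipeline. This is a minimal sketch, not the paper's analytical model: the message counts (two control messages per operator under central control, two in total under data flow triggering) are a simplifying assumption for illustration.

    ```python
    # Toy scan -> selection -> aggregation pipeline used to count control
    # messages under the two execution-control styles (illustrative only).
    rows = [3, 8, 1, 12, 7]

    pipeline = [
        lambda data: [r for r in data if r > 5],  # selection operator
        lambda data: [sum(data)],                 # aggregation operator
    ]

    def run_control_flow(operators, rows):
        """A single controller node starts each operator and waits for its
        completion reply: assume 2 control messages per operator."""
        messages = 0
        data = rows
        for op in operators:
            messages += 2  # controller -> operator, operator -> controller
            data = op(data)
        return data, messages

    def run_data_flow(operators, rows):
        """Operators trigger their downstream consumer directly; the
        controller only starts the pipeline and receives the final result,
        so control messages stay constant regardless of pipeline depth."""
        messages = 2       # one start message, one final answer
        data = rows
        for op in operators:
            data = op(data)  # inter-operator triggering is data, not control
        return data, messages

    cf_result, cf_msgs = run_control_flow(pipeline, rows)
    df_result, df_msgs = run_data_flow(pipeline, rows)
    print(cf_result, cf_msgs)  # same answer, more control messages
    print(df_result, df_msgs)
    ```

    The point mirrored here is the abstract's conclusion: data flow reduces the number of control messages, and because operators exchange data directly, that traffic is spread over the nodes rather than funnelled through one control node.
    
    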

    On the selection of secondary indices in relational databases

    An important problem in the physical design of databases is the selection of secondary indices. In general, this problem cannot be solved in an optimal way due to the complexity of the selection process. Heuristics such as the well-known ADD and DROP algorithms are therefore often used. In this paper it will be shown that frequently used cost functions can be classified as super- or submodular functions. For these functions several mathematical properties have been derived which reduce the complexity of the index selection problem. These properties will be used to develop a tool for physical database design, and they also give a mathematical foundation for the success of the aforementioned ADD and DROP algorithms.
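    The ADD heuristic mentioned above can be sketched as a greedy loop: starting from no indices, repeatedly add the candidate index that most reduces workload cost, and stop when no addition helps (DROP is the mirror image, starting from all indices and removing). The cost function and index names below are invented for illustration; real cost models estimate query savings and update overhead from the workload.

    ```python
    # Greedy ADD heuristic for secondary index selection (illustrative sketch).
    # Hypothetical per-index query savings and update costs; a submodular cost
    # function would additionally show diminishing returns for overlapping
    # indices, which this additive toy model does not.
    QUERY_SAVINGS = {"idx_name": 40, "idx_city": 25, "idx_age": 10}
    UPDATE_COST = {"idx_name": 15, "idx_city": 5, "idx_age": 12}
    BASE_COST = 100

    def cost(indices):
        """Total workload cost given a set of built secondary indices."""
        return (BASE_COST
                - sum(QUERY_SAVINGS[i] for i in indices)
                + sum(UPDATE_COST[i] for i in indices))

    def add_heuristic(candidates):
        """Repeatedly add the index with the largest cost reduction."""
        chosen = set()
        while True:
            best, best_cost = None, cost(chosen)
            for i in candidates - chosen:
                c = cost(chosen | {i})
                if c < best_cost:
                    best, best_cost = i, c
            if best is None:       # no candidate improves the cost: stop
                return chosen
            chosen.add(best)

    print(sorted(add_heuristic(set(QUERY_SAVINGS))))
    ```

    In this toy instance the heuristic picks idx_name and idx_city but rejects idx_age, whose update overhead exceeds its query savings. The paper's super/submodularity results explain why such greedy steps work well for the cost functions used in practice.
    
    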

    On parity check collections for iterative erasure decoding that correct all correctable erasure patterns of a given size

    Recently there has been interest in the construction of small parity check sets for iterative decoding of the Hamming code with the property that each uncorrectable (or stopping) set of size three is the support of a codeword and hence uncorrectable anyway. Here we reformulate and generalise the problem, and improve on this construction. First we show that a parity check collection that corrects all correctable erasure patterns of size m for the r-th order Hamming code (i.e., the Hamming code with codimension r) provides for all codes of codimension r a corresponding "generic" parity check collection with this property. This leads naturally to a necessary and sufficient condition on such generic parity check collections. We use this condition to construct a generic parity check collection for codes of codimension r correcting all correctable erasure patterns of size at most m, for all r and m <= r, thus generalising the known construction for m=3. Then we discuss optimality of our construction and show that it can be improved for m >= 3 and r large enough. Finally we discuss some directions for further research.
    Comment: 13 pages, no figures. Submitted to IEEE Transactions on Information Theory, July 28, 200
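    The decoding setting of the abstract can be illustrated on the [7,4] Hamming code. Iterative erasure decoding repeatedly looks for a parity check that touches exactly one erased position and fills it in; an erasure pattern where every check meets two or more erasures is a stopping set. The sketch below (an assumption-laden toy, not the paper's construction) also shows the abstract's motivation: a pattern that is correctable in principle can still stall on a small check collection, and adding a redundant check (a sum of two rows) resolves it.

    ```python
    # Iterative erasure decoding of the [7,4] Hamming code (illustrative).
    # Each check is the set of 0-indexed codeword positions it covers;
    # these three sets are the rows of a standard parity check matrix.
    CHECKS = [{0, 2, 4, 6}, {1, 2, 5, 6}, {3, 4, 5, 6}]

    def iterative_erasure_decode(word, checks=CHECKS):
        """word: list of 0/1 bits with None marking erasures. Returns the
        word after peeling; None entries remain if the erasures form a
        stopping set for this check collection."""
        word = list(word)
        progress = True
        while progress:
            progress = False
            for chk in checks:
                erased = [i for i in chk if word[i] is None]
                if len(erased) == 1:
                    # Every check sums to 0 mod 2, so the missing bit is
                    # the XOR of the known bits in the check.
                    i = erased[0]
                    word[i] = sum(word[j] for j in chk if j != i) % 2
                    progress = True
        return word

    # All-zero codeword with positions 2 and 6 erased: peeling succeeds.
    print(iterative_erasure_decode([0, 0, None, 0, 0, 0, None]))

    # Erasing positions {2, 5, 6}: every row check meets >= 2 erasures, so
    # decoding stalls, although the pattern is correctable in principle.
    print(iterative_erasure_decode([0, 0, None, 0, 0, None, None]))

    # Adding the redundant check row0 XOR row1 = {0, 1, 4, 5} breaks the
    # stopping set: it meets only erasure 5, and peeling then completes.
    EXTENDED = CHECKS + [{0, 1, 4, 5}]
    print(iterative_erasure_decode([0, 0, None, 0, 0, None, None], EXTENDED))
    ```

    The paper's question is how small such an enlarged check collection can be while guaranteeing that every remaining stopping set of the given size is the support of a codeword, i.e. uncorrectable by any decoder.
    
    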

    All Abelian Quotient C.I.-Singularities Admit Projective Crepant Resolutions in All Dimensions

    For Gorenstein quotient spaces C^d/G, a direct generalization of the classical McKay correspondence in dimensions d >= 4 would primarily demand the existence of projective, crepant desingularizations. Since this turned out to be not always possible, Reid asked about special classes of such quotient spaces which would satisfy the above property. We prove that the underlying spaces of all Gorenstein abelian quotient singularities, which are embeddable as complete intersections of hypersurfaces in an affine space, have torus-equivariant projective crepant resolutions in all dimensions. We use techniques from toric and discrete geometry.
    Comment: revised version of MPI-preprint 97/4, 35 pages, 13 figures, latex2e-file (preprint.tex), macro packages and eps-file