
    Performance, lean meat proportion and behaviour of fattening pigs given a liquid diet at different animal/feeding-place ratios

    Sensor feeding is a liquid feeding system for fattening pigs that is operated with a restricted animal/feeding-place ratio (AFR). The aim of the present study was to quantify the effect of three different AFRs (4:1, 7:1 and 13:1, calculated with a feeding space of 33 cm per animal) on the performance and behaviour of fattening pigs (mean initial weight 26·3 (s.d. 3·3) kg, live weight at slaughter 102 (s.d. 5) kg). The pigs were housed in groups of 40 and each AFR was tested with seven groups (21 groups in total). The daily weight gain of the individual pigs was calculated from the beginning of the experiments until slaughter. Additionally, the lean meat percentage was recorded (AutoFOM). Feeding behaviour was observed by means of 24-h video recording at the ages of 14 and 17 weeks, with scan sampling every 5 min. The daily weight gain decreased with increasing AFR (P < 0·01) and females had lower weight gains than barrows (P < 0·001). The lean meat proportion was influenced by the AFR (P < 0·01) and the sex of the pigs (P < 0·001); proportions were highest with the AFR of 13:1 and in females. The average number of pigs feeding simultaneously was highest for the AFR of 4:1 (P < 0·01). Moreover, the ingestion rate per day (kg/min) increased with increasing AFR (P < 0·05). The average number of pigs waiting behind other pigs feeding at the trough was highest with the AFR of 13:1 (P < 0·001). In conclusion, growth performance and pig behaviour were negatively affected by an AFR of 13:1, which cannot be recommended for use with this feeding system. With an AFR of 4:1, lean meat values were lower.

    Parallel Batch-Dynamic Graph Connectivity

    In this paper, we study batch parallel algorithms for the dynamic connectivity problem, a fundamental problem that has received considerable attention in the sequential setting. The most well known sequential algorithm for dynamic connectivity is the elegant level-set algorithm of Holm, de Lichtenberg and Thorup (HDT), which achieves $O(\log^2 n)$ amortized time per edge insertion or deletion, and $O(\log n / \log\log n)$ time per query. We design a parallel batch-dynamic connectivity algorithm that is work-efficient with respect to the HDT algorithm for small batch sizes, and is asymptotically faster when the average batch size is sufficiently large. Given a sequence of batched updates, where $\Delta$ is the average batch size of all deletions, our algorithm achieves $O(\log n \log(1 + n / \Delta))$ expected amortized work per edge insertion and deletion and $O(\log^3 n)$ depth w.h.p. Our algorithm answers a batch of $k$ connectivity queries in $O(k \log(1 + n/k))$ expected work and $O(\log n)$ depth w.h.p. To the best of our knowledge, our algorithm is the first parallel batch-dynamic algorithm for connectivity. Comment: This is the full version of the paper appearing in the ACM Symposium on Parallelism in Algorithms and Architectures (SPAA), 201
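
    The paper's parallel level-structure algorithm is not reproduced here. As a rough point of reference for the batched interface it describes (a batch of edge updates followed by a batch of connectivity queries), the sketch below uses a plain sequential union-find; it handles insertions only, whereas deletions are precisely the hard case that HDT-style algorithms address. All names in the sketch are illustrative.

```python
# Minimal sequential sketch of batched edge insertions followed by batched
# connectivity queries, using a union-find (disjoint-set) structure.
# This is NOT the paper's parallel batch-dynamic algorithm; it only
# illustrates the batched interface and does not support edge deletions.

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        # Iterative find with path halving.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

def apply_insertions(uf, edge_batch):
    for u, v in edge_batch:
        uf.union(u, v)

def answer_queries(uf, query_batch):
    return [uf.find(u) == uf.find(v) for u, v in query_batch]

if __name__ == "__main__":
    uf = UnionFind(6)
    apply_insertions(uf, [(0, 1), (1, 2), (3, 4)])
    print(answer_queries(uf, [(0, 2), (0, 3), (3, 4)]))  # [True, False, True]
```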

    Estimation non paramétrique des quantiles de crue par la méthode des noyaux (Nonparametric estimation of flood quantiles by the kernel method)

    Determining the flood discharge for a given return period requires estimating the distribution of annual floods. The use of nonparametric distributions, as an alternative to parametric statistical distributions, is examined in this work. The main challenge in kernel estimation lies in computing the parameter that determines the degree of smoothing of the nonparametric density. We compared several methods and retained the plug-in method and the least-squares cross-validation method as the most promising. Several interesting conclusions were drawn from this study; among others, for the estimation of flood quantiles it appears preferable to use estimators based directly on the distribution function rather than on the density function. A comparison of the plug-in method with the fitting of three statistical distributions led to the conclusion that the kernel method is an attractive alternative to traditional parametric methods.

    Traditional flood frequency analysis involves the fitting of a statistical distribution to observed annual peak flows. The choice of statistical distribution is crucial, since it can have a significant impact on design flow estimates. Unfortunately, it is often difficult to determine in an objective way which distribution is the most appropriate. To avoid the inherent arbitrariness associated with the choice of distribution in parametric frequency analysis, one can employ a method based on nonparametric density estimation. Although potentially subject to a larger standard error of quantile estimates, the use of nonparametric densities eliminates the need for selecting a particular distribution and the potential bias associated with a wrong choice.

    The kernel method is a conceptually simple approach, similar in nature to a smoothed histogram. The critical parameter in kernel estimation is the smoothing parameter that determines the degree of smoothing. Methods for estimating the smoothing parameter have already been compared in a number of statistical papers. The novelty of our work is the particular emphasis on quantile estimation, in particular the estimation of quantiles outside the range of observed data. The flood estimation problem is unique in this sense and has been the motivating factor for this study.

    Seven methods for estimating the smoothing parameter are compared in the paper, all based on some goodness-of-fit measure. More specifically, we considered the least-squares cross-validation method, the maximum likelihood cross-validation method, Adamowski's (1985) method, a plug-in method developed by Altman and Leger (1995) and modified by the authors (Faucher et al., 2001), Breiman's goodness-of-fit criterion method (Breiman, 1977), the variable-kernel maximum likelihood method, and the variable-kernel least-squares cross-validation method. The estimation methods can be classified according to whether they are based on fixed or variable kernels, and whether they are based on the goodness-of-fit of the density function or of the cumulative distribution function.

    The quality of the different estimation methods was explored in a Monte Carlo study. One hundred (100) samples of sizes 10, 20, 50, and 100 were simulated from an LP3 distribution. The nonparametric estimation methods were then applied to each of the simulated samples, and quantiles with return periods of 10, 20, 50, 100, 200, and 1000 years were estimated. Bias and root mean square error of the quantile estimates were the key figures used to compare methods. The results of the study can be summarized as follows:

    1. Comparison of kernels. The literature reports that the kernel choice is relatively unimportant compared to the choice of the smoothing parameter. To determine whether this assertion also holds for the estimation of large quantiles outside the range of data, we compared six different kernel candidates. We found no major differences between the biweight, the Normal, the Epanechnikov, and the EV1 kernels. However, the rectangular and the Cauchy kernels should be avoided.

    2. Comparison of sample sizes. The quality of estimates, whether parametric or nonparametric, deteriorates as sample size decreases. To examine the degree of sensitivity to sample size, we compared estimates of the 200-year event obtained by assuming a GEV distribution and a nonparametric density estimated by maximum likelihood cross-validation. The main conclusion is that the root mean square error for the parametric model (GEV) is more sensitive to sample size than that of the nonparametric model.

    3. Comparison of estimators of the smoothing parameter. Among the methods considered in the study, the plug-in method, developed by Altman and Leger (1995) and modified by the authors (Faucher et al., 2001), performed best, along with the least-squares cross-validation method, which had a similar performance. Adamowski's method had to be excluded because it consistently failed to converge. The methods based on variable kernels generally did not perform as well as the fixed-kernel methods.

    4. Comparison of density-based and cumulative-distribution-based methods. The only cumulative-distribution-based method considered in the comparison study was the plug-in method. Adamowski's method is also based on the cumulative distribution function, but was rejected for the reasons mentioned above. Although the plug-in method did well in the comparison, it is not clear whether this can be attributed to the fact that it is based on estimation of the cumulative distribution function. However, one could hypothesize that when the objective is to estimate quantiles, a method that emphasizes the cumulative distribution function rather than the density should have certain advantages.

    5. Comparison of parametric and nonparametric methods. Nonparametric methods were compared with conventional parametric methods. The LP3, the 2-parameter lognormal, and the GEV distributions were used to fit the simulated samples. It was found that the nonparametric methods perform quite similarly to the parametric methods. This is a significant result, because the data were generated from an LP3 distribution, so one would intuitively expect the LP3 model to be superior, which was not the case. In actual applications, flood distributions are often irregular, and in such cases nonparametric methods would likely be superior to parametric methods.
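
    As a concrete illustration of kernel-based quantile estimation from the cumulative distribution function, the sketch below inverts a Gaussian-kernel CDF estimate to obtain a T-year flood quantile. It uses Silverman's rule of thumb for the smoothing parameter rather than the plug-in or cross-validation selectors compared in the paper; the synthetic data, the bandwidth choice, and all function names are illustrative assumptions rather than the authors' implementation.

```python
# Sketch of flood-quantile estimation from a kernel-smoothed CDF with a
# Gaussian kernel. Bandwidth selection here is Silverman's rule of thumb
# (an assumed simplification, not the paper's plug-in or LSCV selectors).

import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def kernel_cdf(x, sample, h):
    """Kernel estimate of the CDF at x (Gaussian kernel, bandwidth h)."""
    return norm.cdf((x - sample) / h).mean()

def flood_quantile(sample, return_period, h=None):
    """Estimate the quantile of non-exceedance probability 1 - 1/T by inverting the kernel CDF."""
    sample = np.asarray(sample, dtype=float)
    if h is None:
        # Rule-of-thumb bandwidth (assumed here for simplicity).
        h = 1.06 * sample.std(ddof=1) * len(sample) ** (-1 / 5)
    p = 1.0 - 1.0 / return_period
    # Bracket the root outside the data range, then solve F(x) = p.
    lo = sample.min() - 10 * h
    hi = sample.max() + 10 * h
    while kernel_cdf(hi, sample, h) < p:
        hi += 10 * h
    return brentq(lambda x: kernel_cdf(x, sample, h) - p, lo, hi)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    annual_peaks = rng.gumbel(loc=500.0, scale=120.0, size=50)  # synthetic peak flows
    print(flood_quantile(annual_peaks, return_period=100))
```

    Working with the smoothed CDF and inverting it, rather than integrating a kernel density, mirrors the paper's observation that distribution-function-based estimators are the more natural choice when the target is a quantile.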

    Light bullets in quadratic media with normal dispersion at the second harmonic

    Stable two- and three-dimensional spatiotemporal solitons (STSs) in second-harmonic-generating media are found in the case of normal dispersion at the second harmonic (SH). This result, surprising from the theoretical viewpoint, opens a way for the experimental realization of STSs. An analytical estimate for the existence of STSs is derived, and full results, including a complete stability diagram, are obtained in numerical form. STSs withstand not only the normal SH dispersion, but also a finite walk-off between the harmonics, and readily self-trap from a Gaussian pulse launched at the fundamental frequency. Comment: 4 pages, 5 figures, accepted to Phys. Rev. Lett.

    Bayesian optimization of the PC algorithm for learning Gaussian Bayesian networks

    The PC algorithm is a popular method for learning the structure of Gaussian Bayesian networks. It carries out statistical tests to determine absent edges in the network. It is hence governed by two parameters: (i) the type of test, and (ii) its significance level. These parameters are usually set to values recommended by an expert. Nevertheless, such an approach can suffer from human bias, leading to suboptimal reconstruction results. In this paper we consider a more principled approach for choosing these parameters in an automatic way. For this we optimize a reconstruction score evaluated on a set of different Gaussian Bayesian networks. This objective is expensive to evaluate and lacks a closed-form expression, which means that Bayesian optimization (BO) is a natural choice. BO methods use a model to guide the search and are hence able to exploit smoothness properties of the objective surface. We show that the parameters found by a BO method outperform those found by a random search strategy and the expert recommendation. Importantly, we have found that an often overlooked statistical test provides the best overall reconstruction results.
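
    A minimal sketch of the idea, not the authors' implementation: a Gaussian-process surrogate with an expected-improvement acquisition searches over the significance level (on a log scale, roughly 1e-4 to 0.2, an assumed range). The reconstruction_score function below is a hypothetical stand-in for running the PC algorithm at a given significance level on benchmark networks and scoring the learned structure; all names and parameter values are illustrative.

```python
# Minimal Bayesian-optimization sketch for tuning the PC algorithm's
# significance level. reconstruction_score(alpha) is a placeholder for
# "run PC at this alpha and return a score to maximize (e.g. negative
# structural Hamming distance against the true graph)".

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def reconstruction_score(alpha):
    # Toy placeholder objective with an optimum near alpha = 1e-2.
    return -(np.log10(alpha) + 2.0) ** 2

def bayes_opt_alpha(n_init=4, n_iter=20, bounds=(-4.0, -0.7), seed=0):
    rng = np.random.default_rng(seed)
    # Search in log10(alpha) so the space is better scaled for the GP.
    X = rng.uniform(*bounds, size=(n_init, 1))
    y = np.array([reconstruction_score(10 ** x[0]) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    candidates = np.linspace(*bounds, 500).reshape(-1, 1)
    for _ in range(n_iter):
        gp.fit(X, y)
        mu, sigma = gp.predict(candidates, return_std=True)
        sigma = np.maximum(sigma, 1e-9)
        # Expected-improvement acquisition over the candidate grid.
        z = (mu - y.max()) / sigma
        ei = (mu - y.max()) * norm.cdf(z) + sigma * norm.pdf(z)
        x_next = candidates[np.argmax(ei)]
        X = np.vstack([X, x_next])
        y = np.append(y, reconstruction_score(10 ** x_next[0]))
    return 10 ** X[np.argmax(y), 0]

if __name__ == "__main__":
    print("best alpha found:", bayes_opt_alpha())
```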

    Re-evaluation of cosmic ray cutoff terminology

    The study of cosmic ray access to locations inside the geomagnetic field has evolved in a manner that has led to some misunderstanding and misapplication of the terminology originally developed to describe particle access. This paper presents what is believed to be a useful set of definitions for cosmic ray cutoff terminology for use in theoretical and experimental cosmic ray studies.

    Heterobimetallic conducting polymers based on salophen complexes via electrosynthesis

    In this work, we report the first electrochemical synthesis of two copolymeric bimetallic conducting polymers by a simple anodic electropolymerization method. The adopted precursors are electroactive transition-metal (M = Ni, Cu and Fe) salophen complexes, which can be easily obtained by direct chemical synthesis. The resulting films, labeled poly-NiCu and poly-CuFe, were characterized by cyclic voltammetry in both organic and aqueous media, attenuated total reflectance Fourier transform infrared spectroscopy, UV-vis spectroscopy, and scanning electron microscopy coupled with energy-dispersive X-ray spectroscopy. The films are conductive and exhibit great electrochemical stability in both organic and aqueous media (remaining stable over 100 cycles without significant loss in current response or changes in electrochemical behaviour), which makes them good candidates for an array of potential applications. Electrochemical detection of ascorbic acid was performed using both materials.

    Soliton motion in a parametrically ac-driven damped Toda lattice

    We demonstrate that a staggered parametric ac driving term can support stable progressive motion of a soliton in a Toda lattice with friction, while an unstaggered driving force cannot. A physical context of the model is that of a chain of anharmonically coupled particles adsorbed on a solid surface of finite size. The ac driving force models a standing acoustic wave excited on the surface. Simulations demonstrate that the state left behind the moving soliton, with the particles shifted from their equilibrium positions, gradually relaxes back to the equilibrium state that existed before the passage of the soliton. Perturbation theory predicts that the ac-driven soliton exists if the amplitude of the drive exceeds a certain threshold. The analytical prediction for the threshold is in reasonable agreement with that found numerically. Collisions between two counter-propagating solitons were also simulated, demonstrating that the collisions are essentially fully elastic.
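
    A rough numerical sketch of the kind of model described, a damped Toda lattice with a staggered ac drive, integrated with scipy. The exact form of the paper's parametric driving term is not reproduced here; the drive used below, the friction and drive parameters, and the initial data (a localized compression released from rest, which splits into counter-propagating soliton-like pulses) are all illustrative assumptions.

```python
# Sketch: damped Toda lattice with an assumed staggered, parametric ac drive.
# Equations of motion (free ends):  q_n'' = exp(-r_{n-1}) - exp(-r_n) - gamma*q_n' + drive_n(t),
# with bond variables r_n = q_{n+1} - q_n. The drive form below is an assumption.

import numpy as np
from scipy.integrate import solve_ivp

N = 100            # number of particles
gamma = 0.05       # friction coefficient (assumed)
eps = 0.1          # drive amplitude (assumed)
omega = 1.0        # drive frequency (assumed)
stagger = (-1.0) ** np.arange(N)   # staggering factor (-1)^n

def rhs(t, state):
    q, p = state[:N], state[N:]
    r = np.diff(q)                 # bond elongations r_n = q_{n+1} - q_n
    bond = np.exp(-r)              # Toda bond forces
    force = np.zeros(N)
    force[1:] += bond              # pull from the left bond
    force[:-1] -= bond             # pull from the right bond
    # Assumed staggered parametric drive acting on the displacements.
    drive = eps * stagger * np.cos(omega * t) * q
    return np.concatenate([p, force - gamma * p + drive])

# Initial condition: a localized compression at rest near the chain centre.
n = np.arange(N)
kappa = 1.0
r0 = -np.log(1.0 + np.sinh(kappa) ** 2 / np.cosh(kappa * (n[:-1] - N / 2)) ** 2)
q0 = np.concatenate([[0.0], np.cumsum(r0)])
p0 = np.zeros(N)

sol = solve_ivp(rhs, (0.0, 100.0), np.concatenate([q0, p0]),
                t_eval=np.linspace(0.0, 100.0, 201), rtol=1e-6, atol=1e-9)
print(sol.y[:N, -1][:10])   # final displacements of the first few particles
```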