
    Soil Variation and Sampling Intensity Under Red Pine and Aspen in Minnesota


    Adaptive Neural Compilation

    This paper proposes an adaptive neural-compilation framework to address the problem of efficient program learning. Traditional code-optimisation strategies used in compilers apply a pre-specified set of transformations that make the code faster to execute without changing its semantics. In contrast, our work adapts programs to make them more efficient while considering correctness only on a target input distribution. Our approach is inspired by recent work on differentiable representations of programs. We show that it is possible to compile programs written in a low-level language to a differentiable representation. We also show how programs in this representation can be optimised to make them efficient on a target distribution of inputs. Experimental results demonstrate that our approach enables learning specifically-tuned algorithms for given data distributions with a high success rate.
    Comment: Submitted to NIPS 2016; code and supplementary materials will be available on the author's page
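The differentiable representation the abstract refers to can be sketched, very loosely, as a machine whose next register state is a probability-weighted mixture over the states produced by each candidate instruction. The toy instructions and register layout below are our own illustrative assumptions, not the paper's actual encoding:

```python
import numpy as np

def soft_step(registers, instr_probs, instructions):
    """One differentiable machine step: execute every candidate
    instruction, then blend the resulting register states by the
    (learnable) instruction probabilities."""
    candidates = np.stack([f(registers) for f in instructions])
    return instr_probs @ candidates  # convex mix of next states

# Two toy instructions over a 2-register machine (hypothetical).
instructions = [
    lambda r: np.array([r[0] + r[1], r[1]]),  # ADD: r0 <- r0 + r1
    lambda r: np.array([r[0], r[0]]),         # MOV: r1 <- r0
]

regs = np.array([2.0, 3.0])
probs = np.array([0.9, 0.1])  # mostly "ADD"
next_regs = soft_step(regs, probs, instructions)
```

Because the blend is differentiable in `instr_probs`, gradient descent can push the instruction distribution toward programs that behave well on the target input distribution.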

    Efficient Linear Programming for Dense CRFs

    The fully connected conditional random field (CRF) with Gaussian pairwise potentials has proven popular and effective for multi-class semantic segmentation. While the energy of a dense CRF can be minimized accurately using a linear programming (LP) relaxation, the state-of-the-art algorithm is too slow to be useful in practice. To alleviate this deficiency, we introduce an efficient LP minimization algorithm for dense CRFs. To this end, we develop a proximal minimization framework in which the dual of each proximal problem is optimized via block coordinate descent. We show that each block of variables can be optimized efficiently. Specifically, for one block the problem decomposes into significantly smaller subproblems, each defined over a single pixel. For the other block, the problem is optimized via conditional gradient descent. This has two advantages: 1) the conditional gradient can be computed in time linear in the number of pixels and labels; and 2) the optimal step size can be computed analytically. Our experiments on standard datasets provide compelling evidence that our approach outperforms all existing baselines, including the previous LP-based approach for dense CRFs.
    Comment: 24 pages, 10 figures and 4 tables
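The analytically computable step size mentioned for the conditional-gradient block has a simple closed form on quadratic objectives. A minimal sketch (ours, not the paper's implementation) of Frank-Wolfe with exact line search for minimising 1/2 xᵀQx + cᵀx over the probability simplex:

```python
import numpy as np

def frank_wolfe_simplex(Q, c, x0, iters=100):
    """Frank-Wolfe (conditional gradient) over the probability simplex.
    For a quadratic objective the optimal step size along the
    Frank-Wolfe direction has a closed form, so no line search loop
    is needed."""
    x = x0.copy()
    for _ in range(iters):
        g = Q @ x + c                       # gradient of the quadratic
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0               # LMO over the simplex: a vertex
        d = s - x                           # Frank-Wolfe direction
        denom = d @ Q @ d
        if denom <= 1e-12:
            break
        gamma = np.clip(-(g @ d) / denom, 0.0, 1.0)  # exact step size
        x = x + gamma * d
    return x

# Toy problem: minimise x0^2 + x1^2 - x0 - 3*x1 subject to x on the simplex.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-1.0, -3.0])
x = frank_wolfe_simplex(Q, c, np.array([0.5, 0.5]))
```

Each iterate stays feasible because it is a convex combination of simplex points, which is what makes the method attractive for the per-pixel label marginals of a CRF relaxation.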

    Efficient Relaxations for Dense CRFs with Sparse Higher Order Potentials

    Dense conditional random fields (CRFs) have become a popular framework for modelling several problems in computer vision, such as stereo correspondence and multi-class semantic segmentation. By modelling long-range interactions, dense CRFs provide a labelling that captures finer detail than their sparse counterparts. Currently, the state-of-the-art algorithm performs mean-field inference using a filter-based method but fails to provide a strong theoretical guarantee on the quality of the solution. A question naturally arises as to whether it is possible to obtain a maximum a posteriori (MAP) estimate of a dense CRF using a principled method. In this paper, we show that this is indeed possible: using a filter-based method, continuous relaxations of the MAP problem can be optimised efficiently with state-of-the-art algorithms. Specifically, we solve a quadratic programming (QP) relaxation using the Frank-Wolfe algorithm and a linear programming (LP) relaxation by developing a proximal minimisation framework. By exploiting labelling consistency in the higher-order potentials and utilising the filter-based method, we formulate the above algorithms such that each iteration has a complexity linear in the number of classes and random variables. The presented algorithms can be applied to any labelling problem using a dense CRF with sparse higher-order potentials. In this paper, we use semantic segmentation as an example application, as it demonstrates the ability of the algorithm to scale to dense CRFs with large dimensions. Experiments on the Pascal dataset indicate that the presented algorithms attain lower energies than the mean-field inference method.
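For context, the mean-field baseline that the abstract compares against can be sketched in a few lines. This is a naive O(N²) toy with a Potts model and made-up inputs; real dense-CRF implementations replace the dense matrix product with the filter-based message passing the abstract mentions:

```python
import numpy as np

def mean_field(unary, pairwise, iters=10):
    """Naive mean-field for a dense CRF with a Potts model.
    unary[i, l]   : unary energy of pixel i taking label l.
    pairwise[i, j]: coupling weight between pixels i and j (zero diagonal).
    Returns per-pixel label marginals q[i, l]."""
    q = np.exp(-unary)
    q /= q.sum(axis=1, keepdims=True)
    for _ in range(iters):
        # Potts compatibility: reward neighbours' mass on the same label
        # (the per-pixel constant cancels after normalisation).
        logits = -unary + pairwise @ q
        q = np.exp(logits - logits.max(axis=1, keepdims=True))
        q /= q.sum(axis=1, keepdims=True)
    return q

# Three "pixels", two labels: strong coupling pulls the pixel that
# weakly prefers label 1 over to label 0.
unary = np.array([[0.0, 2.0], [0.0, 2.0], [0.5, 0.0]])
pairwise = 2.0 * (np.ones((3, 3)) - np.eye(3))
q = mean_field(unary, pairwise)
```

The lack of an energy bound for these fixed-point updates is precisely the theoretical gap that the QP and LP relaxations above are meant to close.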

    Assessment of human risk for yersiniosis after consumption of Danish produced non-heat-treated ready-to-eat pork products

    The objectives were to evaluate the risk of ingesting an infective dose of Yersinia enterocolitica (Y. enterocolitica) after consuming fermented sausages (made in a controlled process) and smoked filet made of Danish pork. For fermented sausages it was estimated that at most two bacteria would be present in a serving of up to 40 g. However, most likely only one bacterium would be present per serving (4,000 of one million simulated 40 g servings = 0.4 %).
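The 0.4 % figure implies a very small mean dose per serving. A hedged re-creation of that kind of simulation (our own sketch; the abstract does not describe its model in detail) treats the bacterial count per 40 g serving as Poisson-distributed:

```python
import numpy as np

rng = np.random.default_rng(0)

# If ~0.4 % of servings contain at least one bacterium, the implied
# Poisson mean dose per 40 g serving is -ln(1 - 0.004) ≈ 0.004.
mean_dose = -np.log(1 - 0.004)          # bacteria per serving (assumed)
counts = rng.poisson(mean_dose, size=1_000_000)  # one million servings
frac_positive = (counts >= 1).mean()    # fraction with >= 1 bacterium
```

Run forward, the simulation reproduces roughly 4,000 positive servings per million, matching the abstract's order of magnitude.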

    Clogging by sieving in microchannels: Application to the detection of contaminants in colloidal suspensions

    We report on a microfluidic method that allows measurement of a small concentration of large contaminants in suspensions of solid micrometer-scale particles. To perform the measurement, we flow the colloidal suspension through a series of constrictions, i.e. a microchannel of varying cross-section. We show and quantify the role of large contaminants in the formation of clogs at a constriction and the growth of the resulting filter cake. By measuring the time interval between two clogging events in an array of parallel microchannels, we are able to estimate the concentration of contaminants whose size is selected by the geometry of the microfluidic device. This technique for characterizing colloidal suspensions offers a versatile and rapid tool to explore the role of contaminants on the properties of the suspensions
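If each clogging event is caused by a single contaminant arriving as a Poisson process (our reading of the abstract; the function and variable names are ours), the concentration estimate from inter-clog intervals is a one-liner:

```python
import numpy as np

def contaminant_concentration(clog_times, flow_rate_per_channel, n_channels):
    """Estimate contaminant number concentration from the times of
    successive clogging events in an array of parallel channels,
    assuming one contaminant per clog (Poisson arrivals)."""
    intervals = np.diff(np.sort(clog_times))
    mean_interval = intervals.mean()
    # Suspension volume processed between two clogs across all channels:
    volume_between_clogs = flow_rate_per_channel * n_channels * mean_interval
    return 1.0 / volume_between_clogs   # contaminants per unit volume

# Hypothetical numbers: one clog every 10 s on average,
# 20 channels at 1 nL/s (1e-9 L/s) each.
clog_times = np.arange(0.0, 101.0, 10.0)
c = contaminant_concentration(clog_times, 1e-9, 20)  # per litre
```

The size selectivity comes from the constriction geometry, not from this estimator: only contaminants larger than the constriction can trigger a clog, so `c` is the concentration of that size class.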

    The impact of aspen harvesting on site productivity


    Handling of chronic cases of pyaemia/osteomyelitis in finishing pigs in Denmark – is de-boning necessary to maintain food safety?

    Meat inspection is up for debate, and one issue is how to handle chronic cases of pyaemia/osteomyelitis in finishing pigs. In Denmark, such carcasses are required to be de-boned to avoid the presence of osteomyelitis not found in the rework area. Around 40,000 pigs (0.24%) are subjected to de-boning in Denmark per year, and the associated costs amount to approx. €3 million. The questions are: 1) is the meat from such pigs fit for human consumption? 2) Is de-boning necessary, or do the meat inspectors find what they should in the rework area? And 3) which alternative practices could replace de-boning? To address this, data covering one year were extracted from the Danish Slaughterhouse Database, including information from the 7 largest Danish abattoirs. Registration schemes covering findings during de-boning and the result of de-boning (approved/condemned) were provided by the individual abattoirs. Additionally, a questionnaire survey was undertaken regarding the de-boning personnel's experience. Furthermore, samples from 102 pigs sent for de-boning at one slaughterhouse were collected. These samples included abscesses found in pigs at the rework area plus one muscle sample per pig. All samples underwent microbiological investigation. As a control group, microbiological results obtained from a similar study of carcasses unconditionally approved at meat inspection were included. Staphylococcus aureus, which has the potential to cause human illness, was found in 15 abscesses and 1 muscle of the 102 pigs sent for de-boning. S. aureus was also found in 1 of the 60 control samples. The results were included in a risk assessment, which revealed the same very low health risk related to consumption of meat from de-boned pigs as from fully accepted pigs. Abscesses were found at de-boning in a low proportion of the pigs, at different sites of the carcass, varying between abattoirs.
The vast majority of pigs sent for de-boning were accepted after de-boning (99.7%). If routine de-boning is no longer required, a thorough inspection at the rework area will most likely result in a higher probability of finding abscesses at that stage of inspection. Moreover, overlooked abscesses will be found during cutting. Therefore, de-boning is not considered necessary and could be replaced by condemnation of the affected part(s) only.

    Risk-based surveillance for human health hazards: the example of Trichinella

    Increasing demands for cost-effectiveness in surveillance for human health hazards can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of infection compared to the whole population. We demonstrate how historical data from surveillance can be used to assess the risk of infection. The model, called Discounting Historical Evidence, depends mainly on two variables: the annual risk of introduction, PIntro, and the surveillance system sensitivity, SSe (the ability to detect infection if present). The model involves simulations that reiterate over a number of years, and for each year the output is updated with the confidence in absence of infection. Trichinella spiralis infection in pigs is used as an example. In Denmark, pigs at slaughter are tested (currently 23 million per year), and despite >70 years of sampling no pigs have been found positive. Hence, we concluded that PIntro is low. SSe can be estimated from the maximum number of infected carcasses expected under the specified design prevalence and the sensitivity of the test applied. According to the assessment, the prevalence of Trichinella in Danish pigs is negligible (<1 case/million). Based on this, a risk-based surveillance programme for Trichinella is designed that targets all outdoor-reared pigs as well as all sows and boars (currently 610,000 per year). Compared to confined pigs, outdoor-reared pigs have a higher risk of acquiring Trichinella because of their exposure to wildlife, which might harbour Trichinella. Sows and boars are at increased risk because they live longer than finishers. Again, SSe and PIntro are estimated, and the model is used to show how risk-based surveillance can be applied without jeopardizing human health. Finally, we incorporate wildlife surveys and test quality assurance into the programme.
The model results are included in an application to the European Commission concerning Denmark's status as a region with negligible risk of Trichinella
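The yearly update loop of a discounting-historical-evidence model can be sketched as follows. This is our simplified reading of the two-variable structure described above (discount the probability of freedom by PIntro, then apply Bayes' rule to a negative surveillance year with sensitivity SSe); the parameter values are illustrative, not Denmark's:

```python
def confidence_of_freedom(sse, p_intro, years, prior_free=0.5):
    """Yearly confidence in freedom from infection.
    Each year: (1) discount the probability of freedom by the annual
    introduction risk p_intro, then (2) update on a negative
    surveillance result with system sensitivity sse."""
    p_free = prior_free
    trace = []
    for _ in range(years):
        p_free *= (1.0 - p_intro)              # possible new introduction
        p_inf = 1.0 - p_free
        # Bayes: a negative year is certain if free, has prob (1 - sse)
        # of occurring if infection is actually present.
        p_free = p_free / (p_free + p_inf * (1.0 - sse))
        trace.append(p_free)
    return trace

# Illustrative values: high system sensitivity, low introduction risk.
trace = confidence_of_freedom(sse=0.95, p_intro=0.01, years=10)
```

With these numbers the confidence climbs rapidly and then plateaus: PIntro puts a ceiling on how certain the programme can ever become, which is why reducing introduction risk matters as much as testing volume.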

    Aspen ecosystem properties in the Upper Great Lakes
