
    Pattern Reduction in Paper Cutting

    A large part of the paper industry involves supplying customers with reels of specified width in specified quantities. These 'customer reels' must be cut from a set of wider 'jumbo reels' in as economical a way as possible. The first priority is to minimize the waste, i.e. to satisfy the customer demands using as few jumbo reels as possible. This is an example of the one-dimensional cutting stock problem, which has an extensive literature. Greycon have developed cutting stock algorithms which they include in their software packages. Greycon's initial presentation to the Study Group posed several questions, which are listed below, along with (partial) answers arising from the work described in this report.
    (1) Given a minimum-waste solution, what is the minimum number of patterns required? It is shown in Section 2 that even when all the patterns appearing in minimum-waste solutions are known, determining the minimum number of patterns may be hard. It seems unlikely that one can guarantee to find the minimum number of patterns for large classes of realistic problems with only a few seconds on a PC available.
    (2) Given an n → n-1 algorithm, will it find an optimal solution to the minimum-pattern problem? There are problems for which n → n-1 reductions are not possible although a more dramatic reduction is.
    (3) Is there an efficient n → n-1 algorithm? In light of Question 2, Question 3 should perhaps be rephrased as 'Is there an efficient algorithm to reduce n patterns?' However, if an algorithm were guaranteed to find some reduction whenever one existed, it could be applied iteratively to minimize the number of patterns, and we have seen that this cannot be done easily.
    (4) Are there efficient 5 → 4 and 4 → 3 algorithms?
    (5) Is it worthwhile seeking alternatives to greedy heuristics? In response to Questions 4 and 5, we point to the algorithm described in the report, or variants of it. Such approaches seem capable of catching many higher reductions.
    (6) Is there a way to find solutions with the smallest possible number of single patterns? The Study Group did not investigate methods tailored specifically to this task, but the algorithm proposed here seems to do reasonably well. It will not increase the number of singleton patterns under any circumstances, and when the number of singletons is high there will be many possible moves that tend to eliminate them.
    (7) Can a solution be found which reduces the number of knife changes? The algorithm will help to reduce the number of necessary knife changes because it works by bringing patterns closer together, even if this does not proceed fully to a pattern reduction. If two patterns are equal across some of the customer widths, the knives for these reels need not be changed when moving from one to the other.
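    As a concrete illustration of the one-dimensional cutting stock setting described above, the sketch below (not Greycon's algorithm; the jumbo width and order book are invented for illustration) builds cutting patterns with a simple first-fit-decreasing heuristic and then counts the distinct patterns and the trim waste, the two quantities the questions above trade off.

        # Illustrative sketch only (hypothetical widths/demands, not Greycon's method):
        # first-fit-decreasing heuristic for the one-dimensional cutting stock problem,
        # followed by a count of the distinct cutting patterns that result.
        from collections import Counter

        JUMBO_WIDTH = 100                                # width of each jumbo reel (arbitrary units)
        demands = [(45, 7), (36, 5), (31, 4), (14, 6)]   # (customer width, quantity), made up

        # Expand the order book into individual customer reels, widest first.
        reels = sorted((w for w, q in demands for _ in range(q)), reverse=True)

        patterns = []                          # each pattern = the widths cut from one jumbo reel
        for width in reels:
            for pat in patterns:
                if sum(pat) + width <= JUMBO_WIDTH:
                    pat.append(width)          # fits on an already-open jumbo
                    break
            else:
                patterns.append([width])       # open a new jumbo reel

        # A 'pattern' is the multiset of widths on one jumbo; count the distinct patterns.
        distinct = Counter(tuple(sorted(p)) for p in patterns)
        waste = sum(JUMBO_WIDTH - sum(p) for p in patterns)

        print("jumbo reels used:", len(patterns))
        print("distinct patterns:", len(distinct))
        print("total trim waste:", waste)
        for pat, times in distinct.items():
            print("  pattern", pat, "used", times, "times")

    Pattern reduction in the sense of the report then asks whether the distinct patterns produced this way can be replaced by fewer patterns without increasing the number of jumbo reels used.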

    The properties of a stochastic model for two competing species


    Phenotypic and genotypic monitoring of Schistosoma mansoni in Tanzanian schoolchildren five years into a preventative chemotherapy national control programme

    We conducted combined in vitro praziquantel (PZQ) efficacy testing and population genetic analyses of S. mansoni collected from children at two schools in 2010, five years after the introduction of a National Control Programme. Children at one school had received four annual PZQ treatments, while the other school had received two mass treatments in total. We compared genetic differentiation, indices of genetic diversity, and estimated adult worm burden between parasites collected in 2010 and samples collected in 2005 (before the control programme began) and in 2006 (six months after the first PZQ treatment). Using the 2010 larval samples, we also compared the genetic similarity of parasites with high and low in vitro sensitivity to PZQ.

    Chemotherapy versus supportive care in advanced non-small cell lung cancer: improved survival without detriment to quality of life

    BACKGROUND: In 1995 a meta-analysis of randomised trials investigating the value of adding chemotherapy to primary treatment for non-small cell lung cancer (NSCLC) suggested a small survival benefit for cisplatin-based chemotherapy in each of the primary treatment settings. However, the meta-analysis included many small trials and trials with differing eligibility criteria and chemotherapy regimens. METHODS: The aim of the Big Lung Trial was to confirm the survival benefits seen in the meta-analysis and to assess quality of life and cost in the supportive care setting. A total of 725 patients were randomised to receive supportive care alone (n = 361) or supportive care plus cisplatin-based chemotherapy (n = 364). RESULTS: 65% of patients allocated chemotherapy (C) received all three cycles of treatment and a further 27% received one or two cycles. 74% of patients allocated no chemotherapy (NoC) received thoracic radiotherapy compared with 47% of the C group. Patients allocated C had a significantly better survival than those allocated NoC: HR 0.77 (95% CI 0.66 to 0.89, p = 0.0006), median survival 8.0 months for the C group v 5.7 months for the NoC group, a difference of 9 weeks. There were 19 (5%) treatment related deaths in the C group. There was no evidence that any subgroup benefited more or less from chemotherapy. No significant differences were observed between the two groups in terms of the pre-defined primary and secondary quality of life end points, although large negative effects of chemotherapy were ruled out. The regimens used proved to be cost effective, the extra cost of chemotherapy being offset by longer survival. CONCLUSIONS: The survival benefit seen in this trial was entirely consistent with the NSCLC meta-analysis and subsequent similarly designed large trials. The information on quality of life and cost should enable patients and their clinicians to make more informed treatment choices.

    On site : installations by Tom Arthur... [et al.].

    Catalogue of an exhibition held at the Tasmanian School of Art Gallery, Sept. 4-Oct. 27, 1984. Includes bibliographical references. Includes installations by Tom Arthur, Julie Brown, Elizabeth Gower and Hossein Valamanesh.

    A specialized autocode for the analysis of replicated experiments

    The paper describes a general program, written for the Elliott 401, for the analysis of orthogonal or nearly orthogonal data, such as arise from replicated experiments. It is in essence a specialized autocode for performing on tables the types of operations required in such analyses, and is similar to Part 2 of our general survey program. Modifications and extensions planned for the Orion are briefly discussed.
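    For a modern reader, the following sketch (invented yields, standard one-way analysis of variance rather than the Elliott 401 autocode itself) illustrates the kind of table operation such a program performs when analysing a replicated experiment: forming treatment means and splitting the total sum of squares into treatment and residual components.

        # Illustrative sketch only: one-way analysis of variance for a replicated
        # experiment, using invented yields for three treatments with four replicates each.
        treatments = {
            "A": [23.1, 24.0, 22.8, 23.5],
            "B": [26.4, 25.9, 27.1, 26.0],
            "C": [21.7, 22.3, 21.9, 22.5],
        }

        values = [y for ys in treatments.values() for y in ys]
        grand_mean = sum(values) / len(values)

        # Between-treatment and residual sums of squares.
        ss_treat = sum(len(ys) * (sum(ys) / len(ys) - grand_mean) ** 2
                       for ys in treatments.values())
        ss_resid = sum((y - sum(ys) / len(ys)) ** 2
                       for ys in treatments.values() for y in ys)

        df_treat = len(treatments) - 1
        df_resid = len(values) - len(treatments)

        # Print the familiar analysis-of-variance table: source, df, SS, MS.
        print(f"{'source':<12}{'df':>4}{'SS':>10}{'MS':>10}")
        print(f"{'treatments':<12}{df_treat:>4}{ss_treat:>10.3f}{ss_treat / df_treat:>10.3f}")
        print(f"{'residual':<12}{df_resid:>4}{ss_resid:>10.3f}{ss_resid / df_resid:>10.3f}")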