
    Experiments to shed light on the best way to use Iterated Local Search for a complex combinatorial problem

    Iterated Local Search (ILS) is a popular metaheuristic search technique for combinatorial optimisation problems. As with most such techniques, there are many ways in which ILS can be implemented. The aim of this paper is to shed light on the best variants and choices of parameters when using ILS on a complex combinatorial problem with many objectives, by reporting the results of an exhaustive set of experimental computer runs using ILS for a real-life sports scheduling problem. The results confirm the prevailing orthodoxy that a random element is needed for the ILS "kick", but also show that a non-random element can be valuable if it is chosen intelligently. Under these circumstances it is also found that the best ILS acceptance criterion appears to depend upon the length of the run: for short runs, a high-diversification approach works best; for very long runs, a high-intensification approach is best; while between these extremes, a more sophisticated approach using simulated annealing or threshold methods appears to be best.
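The core ILS loop that such variants instantiate can be sketched as follows. This is a generic skeleton with a toy objective, not the paper's sports-scheduling implementation; all component names here are illustrative.

```python
import random

def iterated_local_search(x0, local_search, kick, cost, accept, iters=200, seed=1):
    """Generic ILS skeleton (a sketch): descend to a local optimum, then
    repeatedly 'kick' (perturb), re-optimise, and accept or reject."""
    rng = random.Random(seed)
    current = local_search(x0)
    best = current
    for _ in range(iters):
        candidate = local_search(kick(current, rng))
        if accept(cost(candidate), cost(current)):
            current = candidate          # the acceptance criterion drives the walk
        if cost(candidate) < cost(best):
            best = candidate             # best-so-far is tracked separately
    return best

# Toy illustration: minimise a bumpy 1-D function over the integers.
def cost(x):
    return (x - 37) ** 2 + 10 * (x % 5)  # quadratic bowl with local ripples

def local_search(x):
    # greedy hill-climbing on +/-1 moves until no improvement
    while True:
        better = min((x - 1, x + 1), key=cost)
        if cost(better) >= cost(x):
            return x
        x = better

def kick(x, rng):
    return x + rng.randint(-10, 10)      # the random "kick" (perturbation)

def accept(new, cur):
    return new <= cur                    # a high-intensification acceptance rule

result = iterated_local_search(0, local_search, kick, cost, accept)
```

Swapping the `accept` function (always-accept for diversification, simulated-annealing or threshold acceptance in between) is exactly the design axis the experiments explore.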

    Enriched factorization systems

    In a 1974 paper, Brian Day employed a notion of factorization system in the context of enriched category theory, replacing the usual diagonal lifting property with a corresponding criterion phrased in terms of hom-objects. We set forth the basic theory of such enriched factorization systems. In particular, we establish stability properties for enriched prefactorization systems, we examine the relation of enriched to ordinary factorization systems, and we provide general results for obtaining enriched factorizations by means of wide (co)intersections. As a special case, we prove results on the existence of enriched factorization systems involving enriched strong monomorphisms or strong epimorphisms.
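For orientation, the hom-object criterion in question is the standard enriched orthogonality condition: for morphisms $e : A \to B$ and $m : C \to D$ in a $\mathcal{V}$-category $\mathcal{C}$, the diagonal lifting property is replaced by the requirement that the canonical comparison morphism

```latex
\mathcal{C}(B,C) \;\longrightarrow\; \mathcal{C}(A,C) \times_{\mathcal{C}(A,D)} \mathcal{C}(B,D)
```

(induced by precomposition with $e$ and postcomposition with $m$) be an isomorphism in $\mathcal{V}$; this is the usual formulation, stated here for orientation only.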

    Some conservative stopping rules for the operational testing of safety-critical software

    Operational testing, which aims to generate sequences of test cases with the same statistical properties as those that would be experienced in real operational use, can be used to obtain quantitative measures of the reliability of software. In the case of safety-critical software it is common to demand that all known faults are removed. This means that if there is a failure during the operational testing, the offending fault must be identified and removed. Thus an operational test for safety-critical software takes the form of a specified number of test cases (or a specified period of working) that must be executed failure-free. This paper addresses the problem of specifying the numbers of test cases (or time periods) required for a test, when the previous test has terminated as a result of a failure. It has been proposed that, after the obligatory fix of the offending fault, the software should be treated as if it were completely novel, and be required to pass exactly the same test as originally specified. The reasoning here claims to be conservative, inasmuch as no credit is given for any previous failure-free operation prior to the failure that terminated the test. We show that, in fact, this is not a conservative approach in all cases, and propose instead some new Bayesian stopping rules. We show that the degree of conservatism in stopping rules depends upon the precise way in which the reliability requirement is expressed. We define a particular form of conservatism that seems desirable on intuitive grounds, and show that the stopping rules that exhibit this conservatism are also precisely the ones that seem preferable on other grounds.
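As a baseline for such rules, the length of a single failure-free test can be computed directly: to support a claim that the probability of failure on demand (pfd) is below p with confidence 1 − α, the software must survive n failure-free demands where (1 − p)^n ≤ α. The sketch below shows this classical calculation; it is the naive fixed test length that gets restarted after each fix, not the paper's Bayesian stopping rules.

```python
import math

def failure_free_demands(p_bound, confidence):
    """Smallest n such that surviving n consecutive failure-free demands is
    inconsistent, at the given confidence, with pfd > p_bound:
    (1 - p_bound)**n <= 1 - confidence."""
    alpha = 1.0 - confidence
    return math.ceil(math.log(alpha) / math.log(1.0 - p_bound))

# e.g. supporting a claim of pfd <= 1e-3 at 99% confidence:
n_required = failure_free_demands(1e-3, 0.99)
```

Restarting this same n after every failure-and-fix is the proposal the paper shows is not conservative in all cases.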

    The use of multilegged arguments to increase confidence in safety claims for software-based systems: A study based on a BBN analysis of an idealized example

    The work described here concerns the use of so-called multi-legged arguments to support dependability claims about software-based systems. The informal justification for the use of multi-legged arguments is similar to that used to support the use of multi-version software in pursuit of high reliability or safety. Just as a diverse, 1-out-of-2 system might be expected to be more reliable than each of its two component versions, so a two-legged argument might be expected to give greater confidence in the correctness of a dependability claim (e.g. a safety claim) than would either of the argument legs alone. Our intention here is to treat these argument structures formally, in particular by presenting a formal probabilistic treatment of ‘confidence’, which will be used as a measure of efficacy. This will enable claims for the efficacy of the multi-legged approach to be made quantitatively, answering questions such as ‘How much extra confidence about a system’s safety will I have if I add a verification argument leg to an argument leg based upon statistical testing?’ For this initial study, we concentrate on a simplified and idealized example of a safety system in which interest centres upon a claim about the probability of failure on demand. Our approach is to build a BBN (“Bayesian Belief Network”) model of a two-legged argument, and manipulate this analytically via parameters that define its node probability tables. The aim here is to obtain greater insight than is afforded by the more usual BBN treatment, which involves merely numerical manipulation. We show that the addition of a diverse second argument leg can, indeed, increase confidence in a dependability claim: in a reasonably plausible example the doubt in the claim is reduced to one third of the doubt present in the original single leg. 
However, we also show that there can be some unexpected and counter-intuitive subtleties here; for example, an entirely supportive second leg can sometimes undermine an original argument, resulting in less overall confidence than the single-leg argument provided on its own. Our results are neutral on the issue of whether such difficulties will arise in real life, i.e. when real experts judge real systems.
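To make the notion of "confidence" concrete, a minimal Bayesian calculation illustrates how a second supportive leg can reduce doubt. The numbers are hypothetical, not the paper's BBN parameters, and the legs' outcomes are assumed conditionally independent given the claim, which is a strong assumption.

```python
def posterior(prior, legs):
    """Posterior confidence in claim C after every argument leg 'passes'.
    Each leg is (P(pass | C), P(pass | not C)); conditional independence of
    the legs given C (and given not-C) is assumed -- a strong assumption."""
    p_c, p_not_c = prior, 1.0 - prior
    for p_pass_c, p_pass_not_c in legs:
        p_c *= p_pass_c
        p_not_c *= p_pass_not_c
    return p_c / (p_c + p_not_c)

# Hypothetical parameters:
legs = [(0.95, 0.30),   # leg 1: statistical testing
        (0.90, 0.20)]   # leg 2: verification argument

one_leg = posterior(0.90, legs[:1])   # confidence with leg 1 alone
two_leg = posterior(0.90, legs)       # confidence with both legs
```

In this illustration the doubt 1 − confidence falls from about 0.034 with one leg to about 0.008 with two, broadly mirroring the kind of reduction the abstract reports; the counter-intuitive cases arise precisely when the independence assumed above fails.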

    A summary of AFCRL passive-sphere development efforts and experience

    Falling spheres for meteorological rocket sounding.

    Advancements in the LEWICE Ice Accretion Model

    Recent evidence has shown that the NASA/Lewis Ice Accretion Model, LEWICE, does not predict accurate ice shapes for certain glaze ice conditions. This paper will present the methodology used to make a first attempt at improving the ice accretion prediction in these regimes. Importance is given to the correlations for heat transfer coefficient and ice density, as well as runback flow, selection of the transition point, flow field resolution, and droplet trajectory models. Further improvements and refinement of these modules will be performed once tests in NASA's Icing Research Tunnel, scheduled for 1993, are completed.

    COPYRIGHTS, COMPETITION AND DEVELOPMENT: THE CASE OF THE MUSIC INDUSTRY

    The economic importance of copyright industries in developed market economies has been well documented. Although less important in developing countries, this is likely to change with the growing weight of the service sector in these economies and its importance for their closer integration into the global market economy. This paper analyses the relationship between copyright and income generation in the audio-visual sector, in particular music, and argues that appropriate copyright administration is essential in creating the conditions for a viable music industry in developing countries. However, an effective copyright regime is not, by itself, sufficient to guarantee a flourishing music industry, and other institutional arrangements will be needed in countries looking to better exploit their musical resources.

    A High-Order Kernel Method for Diffusion and Reaction-Diffusion Equations on Surfaces

    In this paper we present a high-order kernel method for numerically solving diffusion and reaction-diffusion partial differential equations (PDEs) on smooth, closed surfaces embedded in $\mathbb{R}^d$. For two-dimensional surfaces embedded in $\mathbb{R}^3$, these types of problems have received growing interest in biology, chemistry, and computer graphics to model such things as diffusion of chemicals on biological cells or membranes, pattern formations in biology, nonlinear chemical oscillators in excitable media, and texture mappings. Our kernel method is based on radial basis functions (RBFs) and uses a semi-discrete approach (or the method-of-lines) in which the surface derivative operators that appear in the PDEs are approximated using collocation. The method only requires nodes at "scattered" locations on the surface and the corresponding normal vectors to the surface. Additionally, it does not rely on any surface-based metrics and avoids any intrinsic coordinate systems, and thus does not suffer from any coordinate distortions or singularities. We provide error estimates for the kernel-based approximate surface derivative operators and numerically study the accuracy and stability of the method. Applications to different non-linear systems of PDEs that arise in biology and chemistry are also presented.
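The flavour of this embedding-space approach can be seen in a toy example: interpolating scattered data on the unit circle (a smooth, closed surface in $\mathbb{R}^2$) with Gaussian RBFs of the Euclidean distance between points, so that no intrinsic coordinates or surface metric are ever used. This is only a sketch of the collocation setup, not the paper's surface-derivative construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
# quasi-uniform "scattered" nodes on the unit circle
theta = np.sort((np.arange(n) + rng.uniform(-0.3, 0.3, n)) * (2 * np.pi / n))
X = np.column_stack([np.cos(theta), np.sin(theta)])

eps = 3.0                                    # shape parameter (a choice, not tuned)
phi = lambda r: np.exp(-(eps * r) ** 2)      # Gaussian kernel

def pairwise_dist(A, B):
    # Euclidean distances in the embedding space R^2 -- no surface metric needed
    return np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)

f = np.cos(3 * theta)                        # data sampled on the surface
coeffs = np.linalg.solve(phi(pairwise_dist(X, X)), f)   # collocation system

# evaluate the kernel interpolant at fresh points on the circle
te = np.linspace(0, 2 * np.pi, 200, endpoint=False)
Xe = np.column_stack([np.cos(te), np.sin(te)])
s = phi(pairwise_dist(Xe, X)) @ coeffs
err = np.max(np.abs(s - np.cos(3 * te)))
```

In the method proper, the same kernel matrices are differentiated to build discrete surface derivative operators for the method-of-lines; here only the interpolation step is shown.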

    Multitransient electromagnetic demonstration survey in France
