
    A bi-objective genetic algorithm approach to risk mitigation in project scheduling

    A problem of risk mitigation in project scheduling is formulated as a bi-objective optimization problem, where the expected makespan and the expected total cost are both to be minimized. The expected total cost is the sum of four cost components: overhead cost, activity execution cost, cost of reducing risks, and penalty cost for tardiness. Risks for activities are predefined. For each risk at an activity, various levels are defined, corresponding to the outcomes of different preventive measures. Only risks with a probable impact on the duration of the related activity are considered here. Impacts of risks are not only accounted for through the expected makespan but are also translated into cost and thus affect the expected total cost. An MIP model and a heuristic solution approach based on genetic algorithms (GAs) are proposed. The experiments conducted indicate that GAs provide a fast and effective solution approach to the problem. For smaller problems, the results obtained by the GA are very good; for larger problems, there is room for improvement.
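    The abstract names the two objectives and the four cost components but not the model itself. As a minimal sketch of how such a bi-objective fitness evaluation could look, the snippet below computes the two objectives for one candidate mitigation plan; the serial activity chain, cost rates and the effect of mitigation levels on durations are all assumed for illustration and are not taken from the paper.

```python
# Minimal sketch (assumed model, not the paper's MIP/GA formulation):
# evaluate the two objectives for one candidate risk-mitigation plan.

def evaluate_plan(durations, mitigation_levels,
                  overhead_rate=50.0,        # overhead cost per unit of makespan (assumed)
                  execution_cost=10.0,       # execution cost per activity (assumed)
                  mitigation_unit_cost=5.0,  # cost per mitigation level (assumed)
                  penalty_rate=200.0,        # tardiness penalty per time unit (assumed)
                  due_date=30.0):            # project due date (assumed)
    """Return (expected_makespan, expected_total_cost) for a serial activity chain."""
    # Higher mitigation levels shorten the risk-driven part of each duration (toy assumption).
    expected_durations = [d / (1.0 + 0.2 * lvl)
                          for d, lvl in zip(durations, mitigation_levels)]
    expected_makespan = sum(expected_durations)

    overhead = overhead_rate * expected_makespan
    execution = execution_cost * len(durations)
    mitigation = mitigation_unit_cost * sum(mitigation_levels)
    tardiness = penalty_rate * max(0.0, expected_makespan - due_date)
    expected_total_cost = overhead + execution + mitigation + tardiness

    return expected_makespan, expected_total_cost


# Example: three activities with mitigation levels 0-2; a GA would search over such plans
# and keep the non-dominated (makespan, cost) trade-offs.
print(evaluate_plan([10.0, 12.0, 8.0], [0, 2, 1]))
```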

    The role of falsification in the development of cognitive architectures: insights from a Lakatosian analysis

    It has been suggested that the enterprise of developing mechanistic theories of the human cognitive architecture is flawed because the theories produced are not directly falsifiable. Newell attempted to sidestep this criticism by arguing for a Lakatosian model of scientific progress in which cognitive architectures should be understood as theories that develop over time. However, Newell’s own candidate cognitive architecture adhered only loosely to Lakatosian principles. This paper reconsiders the role of falsification and the potential utility of Lakatosian principles in the development of cognitive architectures. It is argued that a lack of direct falsifiability need not undermine the scientific development of a cognitive architecture if broadly Lakatosian principles are adopted. Moreover, it is demonstrated that the Lakatosian concepts of positive and negative heuristics for theory development and of general heuristic power offer methods for guiding the development of an architecture and for evaluating the contribution and potential of an architecture’s research program.

    The miracle of the Septuagint and the promise of data mining in economics

    This paper argues that the sometimes-conflicting results of a modern revisionist literature on data mining in econometrics reflect different approaches to solving the central problem of model uncertainty in a science of non-experimental data. The literature has entered an exciting phase, with theoretical development, methodological reflection, considerable technological strides on the computing front, and interesting empirical applications providing momentum for this branch of econometrics. The organising principle for this discussion of data mining is a philosophical spectrum that sorts the various econometric traditions according to their epistemological assumptions about the underlying data-generating process (DGP), starting with nihilism at one end and reaching claims of encompassing the DGP at the other; call it the DGP-spectrum. In the course of exploring this spectrum the reader will encounter various Bayesian, specific-to-general (S-G) as well as general-to-specific (G-S) methods. To set the stage for this exploration, the paper starts with a description of data mining, its potential risks, and a short section on potential institutional safeguards against these problems.
    Keywords: data mining, model selection, automated model selection, general-to-specific modelling, extreme bounds analysis, Bayesian model selection
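    Of the methods named above, the general-to-specific (G-S) idea is the most mechanical to illustrate. The sketch below is a toy backward-elimination loop over a simulated regression, assuming ordinary least squares and a fixed significance threshold; it is not the automated G-S procedures (e.g. PcGets or Autometrics) that the literature discusses.

```python
# Toy general-to-specific (G-S) search: start from a general model and repeatedly
# drop the least significant regressor until every remaining term is significant.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 6))                                   # six candidate regressors
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)  # only the first two matter

def general_to_specific(y, X, alpha=0.05):
    """Backward elimination by largest p-value (intercept always retained)."""
    cols = list(range(X.shape[1]))
    while cols:
        res = sm.OLS(y, sm.add_constant(X[:, cols])).fit()
        pvals = res.pvalues[1:]            # p-values of the regressors, skipping the intercept
        worst = int(np.argmax(pvals))
        if pvals[worst] <= alpha:          # everything left is significant: stop
            return cols, res
        cols.pop(worst)                    # drop the least significant regressor and refit
    return cols, None

kept, final_model = general_to_specific(y, X)
print("retained regressors:", kept)        # typically [0, 1] on this simulated data
```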

    Evaluation of early and late presentation of patients with ocular mucous membrane pemphigoid to two major tertiary referral hospitals in the United Kingdom

    PURPOSE: Ocular mucous membrane pemphigoid (OcMMP) is a sight-threatening autoimmune disease in which referral to specialist units for further management is common practice. This study aims to describe referral patterns, disease phenotype and management strategies in patients who present with either early or established disease to two large tertiary care hospitals in the United Kingdom. PATIENTS AND METHODS: In all, 54 consecutive patients with a documented history of OcMMP were followed for 24 months. Two groups were defined: (i) early-onset disease (EOD: <3 years, n=26, 51 eyes) and (ii) established disease (EstD: >5 years, n=24, 48 eyes). Data were captured at the first clinic visit and at 12 and 24 months of follow-up. Information regarding duration, activity and stage of disease, visual acuity (VA), therapeutic strategies and clinical outcome was analysed. RESULTS: Patients with EOD were younger and had more severe conjunctival inflammation (76% of inflamed eyes) than the EstD group, who had poorer VA (26.7% with VA<3/60, P<0.01) and more advanced disease. Although 40% of patients were on existing immunosuppression, 48% required initiation of, or a switch to, more potent immunotherapy. In all, 28% (14) were referred back to the originating hospitals for continued care. Although inflammation had resolved in 78% (60/77) of eyes at 12 months, persistence of inflammation and progression did not differ between the two phenotypes. Importantly, 42% demonstrated disease progression in the absence of clinically detectable inflammation. CONCLUSIONS: These data highlight that, irrespective of OcMMP phenotype, initiation or escalation of potent immunosuppression is required at tertiary hospitals. Moreover, conjunctival scarring progresses even when the eye remains clinically quiescent. Early referral to tertiary centres is recommended to optimise immunosuppression and limit long-term ocular damage.

    Lost in spatial translation - A novel tool to objectively assess spatial disorientation in Alzheimer's disease and frontotemporal dementia

    Spatial disorientation is a prominent feature of early Alzheimer's disease (AD), attributed to degeneration of medial temporal and parietal brain regions, including the retrosplenial cortex (RSC). By contrast, frontotemporal dementia (FTD) syndromes show generally intact spatial orientation at presentation. However, no clinical tasks are currently administered routinely to assess spatial orientation objectively in these neurodegenerative conditions. In this study we investigated spatial orientation in 58 dementia patients and 23 healthy controls using a novel virtual supermarket task as well as voxel-based morphometry (VBM). We compared performance on this task with visual and verbal memory function, which has traditionally been used to discriminate between AD and FTD. Participants viewed a series of videos from a first-person perspective travelling through a virtual supermarket and were required to maintain orientation to a starting location. Analyses revealed significantly impaired spatial orientation in AD compared with FTD patient groups. Spatial orientation performance discriminated the AD and FTD patient groups to a very high degree at presentation. More importantly, integrity of the RSC was identified as a key neural correlate of orientation performance. These findings confirm that (i) it is feasible to assess spatial orientation objectively via our novel Supermarket task; (ii) impaired orientation is a prominent feature that can be applied clinically to discriminate between AD and FTD; and (iii) the RSC emerges as a critical biomarker for assessing spatial orientation deficits in these neurodegenerative conditions.

    Algorithmic Jim Crow

    This Article contends that current immigration- and security-related vetting protocols risk promulgating an algorithmically driven form of Jim Crow. Under the “separate but equal” discrimination of a historic Jim Crow regime, state laws required mandatory separation and discrimination on the front end, while purportedly establishing equality on the back end. In contrast, an Algorithmic Jim Crow regime allows for “equal but separate” discrimination. Under Algorithmic Jim Crow, equal vetting and database screening of all citizens and noncitizens will make it appear that fairness and equality principles are preserved on the front end. Algorithmic Jim Crow, however, will enable discrimination on the back end in the form of designing, interpreting, and acting upon vetting and screening systems in ways that result in a disparate impact.
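    The Article's argument is legal rather than computational, but the back-end "disparate impact" it describes has a standard quantitative screen: compare each group's clearance rate under a vetting system with the most favoured group's rate (the EEOC "four-fifths" rule of thumb). The sketch below applies that screen to wholly invented screening outcomes; the groups, counts and threshold are illustrative assumptions, not data from the Article.

```python
# Hedged sketch: four-fifths (80%) disparate-impact screen on a hypothetical
# vetting system's clearance outcomes. All groups and counts are invented.

def clearance_rates(outcomes):
    """outcomes: {group: (num_cleared, num_screened)} -> {group: clearance rate}."""
    return {g: cleared / screened for g, (cleared, screened) in outcomes.items()}

def impact_ratios(outcomes, threshold=0.8):
    """Ratio of each group's clearance rate to the best group's, flagged if below threshold."""
    rates = clearance_rates(outcomes)
    best = max(rates.values())
    return {g: (r / best, r / best < threshold) for g, r in rates.items()}

# Hypothetical screening outcomes: (cleared, screened) per group.
outcomes = {"group_a": (900, 1000), "group_b": (620, 1000), "group_c": (880, 1000)}
for group, (ratio, adverse) in impact_ratios(outcomes).items():
    flag = "  <- potential disparate impact" if adverse else ""
    print(f"{group}: impact ratio {ratio:.2f}{flag}")
```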