
    Histogram Monte Carlo study of multicritical behavior in the hexagonal easy-axis Heisenberg antiferromagnet

    The results of a detailed histogram Monte Carlo study of critical-fluctuation effects on the magnetic field-temperature phase diagram of the hexagonal Heisenberg antiferromagnet with weak axial anisotropy are reported. The multiphase point where three lines of continuous transitions merge at the spin-flop boundary exhibits a structure consistent with scaling theory, but without the usual umbilicus found in the case of a bicritical point.
    Comment: 7 pages (RevTex 3.0), 1 figure available upon request, CRPS-93-1
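The histogram technique behind such studies can be illustrated with single-histogram reweighting: samples collected at one inverse temperature are reweighted to estimate thermal averages at a nearby one. A minimal sketch, not the paper's code; the function name and toy data are illustrative:

```python
import numpy as np

def reweight(energies, observables, beta0, beta):
    """Single-histogram reweighting: estimate <O> at inverse temperature
    `beta` from samples drawn at `beta0`, using shifted exponentials for
    numerical stability."""
    w = -(beta - beta0) * np.asarray(energies, dtype=float)
    w -= w.max()          # stabilize exp() against overflow
    w = np.exp(w)
    return np.sum(w * observables) / np.sum(w)
```

At `beta == beta0` the weights are uniform and the estimate reduces to the plain sample mean; moving `beta` upward shifts weight toward low-energy configurations.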

    Thresholded Covering Algorithms for Robust and Max-Min Optimization

    The general problem of robust optimization is this: one of several possible scenarios will appear tomorrow, but things are more expensive tomorrow than they are today. What should you anticipatorily buy today, so that the worst-case cost (summed over both days) is minimized? Feige et al. and Khandekar et al. considered the k-robust model where the possible outcomes tomorrow are given by all demand-subsets of size k, and gave algorithms for the set cover problem, and the Steiner tree and facility location problems in this model, respectively. In this paper, we give the following simple and intuitive template for k-robust problems: "having built some anticipatory solution, if there exists a single demand whose augmentation cost is larger than some threshold, augment the anticipatory solution to cover this demand as well, and repeat". We show that this template gives us improved approximation algorithms for k-robust Steiner tree and set cover, and the first approximation algorithms for k-robust Steiner forest, minimum-cut and multicut. All our approximation ratios (except for multicut) are almost best possible. As a by-product of our techniques, we also get algorithms for max-min problems of the form: "given a covering problem instance, which k of the elements are costliest to cover?".
    Comment: 24 pages
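The quoted template can be sketched generically. The hooks `aug_cost` (cost of augmenting the current solution to cover one demand) and `augment` (perform that augmentation) are problem-specific and the names are hypothetical; the toy instance in the usage below stands in for, e.g., a set-cover-like problem:

```python
def krobust_template(demands, aug_cost, augment, threshold):
    """Generic k-robust template: while some single demand's augmentation
    cost exceeds the threshold, fold that demand into the anticipatory
    solution, then repeat until every demand is cheap to cover."""
    sol = set()
    while True:
        expensive = [d for d in demands if aug_cost(sol, d) > threshold]
        if not expensive:
            return sol
        sol = augment(sol, expensive[0])
```

For example, with per-demand costs `{0: 1, 1: 3, 2: 5, 3: 1, 4: 4}` and threshold 2, the loop absorbs exactly the demands costing more than 2 into the anticipatory solution.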

    Approximability of Connected Factors

    Finding a d-regular spanning subgraph (or d-factor) of a graph is easy by Tutte's reduction to the matching problem. By the same reduction, it is easy to find a minimum or maximum d-factor of a graph. However, if we require that the d-factor be connected, these problems become NP-hard: finding a minimum connected 2-factor is just the traveling salesman problem (TSP). Given a complete graph with edge weights that satisfy the triangle inequality, we consider the problem of finding a minimum connected d-factor. We give a 3-approximation for all d and improve this to an (r+1)-approximation for even d, where r is the approximation ratio of the TSP. This yields a 2.5-approximation for even d. The same algorithm yields an (r+1)-approximation for the directed version of the problem, where r is the approximation ratio of the asymmetric TSP. We also show that none of these minimization problems can be approximated better than the corresponding TSP. Finally, for the decision problem of deciding whether a given graph contains a connected d-factor, we extend known hardness results.
    Comment: To appear in the proceedings of WAOA 201
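For concreteness, verifying that a given edge set is a connected d-factor is easy even though finding a cheap one is hard; for d = 2 the check below is exactly a Hamiltonian-cycle test, matching the TSP remark in the abstract. A minimal sketch (function name illustrative):

```python
from collections import defaultdict

def is_connected_d_factor(n, edges, d):
    """Check whether `edges` form a connected d-factor (spanning,
    d-regular, connected subgraph) of a graph on vertices 0..n-1."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    # Every vertex must have degree exactly d.
    if any(len(adj[v]) != d for v in range(n)):
        return False
    # Depth-first search from vertex 0 must reach all n vertices.
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n
```

Note that a perfect matching is a 1-factor but fails the connectivity check for n > 2, which is exactly the distinction the paper's decision problem is about.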

    The Role of Human-Automation Consensus in Multiple Unmanned Vehicle Scheduling

    Objective: This study examined the impact of increasing automation replanning rates on operator performance and workload when supervising a decentralized network of heterogeneous unmanned vehicles. Background: Futuristic unmanned vehicle systems will invert the operator-to-vehicle ratio so that one operator can control multiple dissimilar vehicles connected through a decentralized network. Significant human-automation collaboration will be needed because of automation brittleness, but such collaboration could cause high workload. Method: Three increasing levels of replanning were tested on an existing multiple unmanned vehicle simulation environment that leverages decentralized algorithms for vehicle routing and task allocation in conjunction with human supervision. Results: Rapid replanning can cause high operator workload, ultimately resulting in poorer overall system performance. Poor performance was associated with a lack of operator consensus on when to accept the automation's suggested prompts for new plan consideration, as well as negative attitudes toward unmanned aerial vehicles in general. Participants with video game experience tended to collaborate more with the automation, which resulted in better performance. Conclusion: In decentralized unmanned vehicle networks, operators who ignore the automation's requests for new plan consideration and impose rapid replans both increase their own workload and reduce the ability of the vehicle network to operate at its maximum capacity. Application: These findings have implications for personnel selection and training for futuristic systems involving human collaboration with decentralized algorithms embedded in networks of autonomous systems.
    Aurora Flight Sciences Corp.; United States Office of Naval Research

    Comment on "Critique of q-entropy for thermal statistics" by M. Nauenberg

    M. Nauenberg [1] recently published a quite long list of objections to the physical validity for thermal statistics of the theory sometimes referred to in the literature as {\it nonextensive statistical mechanics}. This generalization of Boltzmann-Gibbs (BG) statistical mechanics is based on the following expression for the entropy: $S_q = k\,\frac{1-\sum_{i=1}^{W} p_i^{q}}{q-1}$ $\bigl(q \in \mathbb{R};\ S_1 = S_{BG} \equiv -k\sum_{i=1}^{W} p_i \ln p_i\bigr)$. The author of [1] already presented the essence of his arguments orally in 1993, during a scientific meeting in Buenos Aires. I am replying now simultaneously to the just cited paper, as well as to the 1993 objections (essentially, the violation of "fundamental thermodynamic concepts", as stated in the Abstract of [1]).
    Comment: 7 pages including 2 figures. This is a reply to M. Nauenberg, Phys. Rev. E 67, 036114 (2003)
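The entropy in the displayed formula is straightforward to evaluate numerically; a small sketch (function and parameter names are illustrative) showing that $S_q$ recovers the Boltzmann-Gibbs entropy in the limit $q \to 1$:

```python
import numpy as np

def tsallis_entropy(p, q, k=1.0):
    """S_q = k * (1 - sum_i p_i^q) / (q - 1) for a probability vector p;
    at q = 1 this reduces to the BG entropy -k * sum_i p_i ln p_i."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return -k * np.sum(p * np.log(p))
    return k * (1.0 - np.sum(p ** q)) / (q - 1.0)
```

For a uniform distribution over W states the BG branch gives k ln W, and values of q near 1 approach the same number, consistent with the definition above.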

    Cultural and Family Challenges to Managing Type 2 Diabetes in Immigrant Chinese Americans

    OBJECTIVE— Although Asians demonstrate elevated levels of type 2 diabetes, little attention has been directed to their unique cultural beliefs and practices regarding diabetes. We describe cultural and family challenges to illness management in foreign-born Chinese American patients with type 2 diabetes and their spouses.
    RESEARCH DESIGN AND METHODS— This was an interpretive comparative interview study with 20 foreign-born Chinese American couples (n = 40) living with type 2 diabetes. Multiple (six to seven) semistructured interviews with each couple in individual, group, and couple settings elicited beliefs about diabetes and narratives of care within the family and community. Interpretive narrative and thematic analysis were completed. A separate respondent group of 19 patients and spouses who met the
    RESULTS— Cultural and family challenges to diabetes management within foreign-born Chinese American families included how 1) diabetes symptoms challenged family harmony, 2) dietary prescriptions challenged food beliefs and practices, and 3) disease management requirements challenged established family role responsibilities.
    CONCLUSIONS— Culturally nuanced care with immigrant Chinese Americans requires attentiveness to the social context of disease management. Patients' and families' disease management decisions are seldom made independent of their concerns for family well-being, family face, and the reciprocal responsibilities required by varied family roles. Framing disease recommendations to include cultural concerns for balance and significant food rituals is warranted.

    A 7/9-Approximation Algorithm for the Maximum Traveling Salesman Problem

    We give a 7/9-approximation algorithm for the Maximum Traveling Salesman Problem.
    Comment: 6 figures

    Plausibility functions and exact frequentist inference

    In the frequentist program, inferential methods with exact control on error rates are a primary focus. The standard approach, however, is to rely on asymptotic approximations, which may not be suitable. This paper presents a general framework for the construction of exact frequentist procedures based on plausibility functions. It is shown that the plausibility function-based tests and confidence regions have the desired frequentist properties in finite samples; no large-sample justification is needed. An extension of the proposed method is also given for problems involving nuisance parameters. Examples demonstrate that the plausibility function-based method is both exact and efficient in a wide variety of problems.
    Comment: 21 pages, 5 figures, 3 tables
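The idea can be illustrated in the simplest setting, a normal mean with known variance, where a plausibility function is a two-sided p-value in the parameter and thresholding it at alpha yields a region with exact finite-sample coverage 1 - alpha. This is a sketch of the general flavor, not the paper's construction; the names are illustrative:

```python
from math import erf, sqrt

def plausibility(theta, xbar, sigma, n):
    """Plausibility of the mean value `theta`, given sample mean `xbar`
    from n iid N(theta, sigma^2) draws: a two-sided p-value, so
    {theta : plausibility(theta) > alpha} is an exact 1 - alpha region."""
    z = abs(xbar - theta) * sqrt(n) / sigma
    # Standard normal CDF via erf: Phi(z) = (1 + erf(z / sqrt(2))) / 2.
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0))))
```

The plausibility is maximal (equal to 1) at theta = xbar and decreases monotonically as theta moves away, so the thresholded region is an interval centered at the sample mean.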

    Exact Zeros of the Partition Function for a Continuum System with Double Gaussian Peaks

    We calculate the exact zeros of the partition function for a continuum system in which the probability distribution of the order parameter is given by two asymmetric Gaussian peaks. When the positions of the two peaks coincide, the two separate loci of zeros that used to give a first-order transition touch each other, with the density of zeros vanishing at the contact point on the positive real axis. Instead of the second-order transition of the Ehrenfest classification, as one might naively expect, one finds critical behavior in this limit.
    Comment: 13 pages, 6 figures, revtex, minor changes in fig.2, to be published in Physical Review
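Schematically, when the order-parameter distribution is a sum of two Gaussian peaks, the field-dependent partition function becomes a sum of two exponentials, and the zeros lie where the two terms cancel in the complex plane. The symbols below are illustrative, not the paper's notation:

```latex
% Order-parameter distribution: two (possibly asymmetric) Gaussian peaks
% P(m) = a_1 \mathcal{N}(m; m_1, s_1^2) + a_2 \mathcal{N}(m; m_2, s_2^2)
\[
  Z(h) = \int P(m)\, e^{h m}\, dm
       = a_1\, e^{h m_1 + s_1^2 h^2 / 2} + a_2\, e^{h m_2 + s_2^2 h^2 / 2},
\]
% so the zeros are the complex h at which the two terms cancel:
\[
  a_1\, e^{h m_1 + s_1^2 h^2 / 2} = -\,a_2\, e^{h m_2 + s_2^2 h^2 / 2}.
\]
```

When the peak positions coincide ($m_1 = m_2$), the two loci of zeros determined by this cancellation condition can merge, which is the limit the abstract describes.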