
    Sensitivity analysis of the greedy heuristic for binary knapsack problems

    Greedy heuristics are a popular choice when solving a large variety of NP-hard combinatorial problems, and for binary knapsack problems in particular they generate good results. If some uncertainty exists beforehand regarding the value of any one element of the problem data, sensitivity analysis procedures can be used to determine the tolerance limits within which that value may vary without causing a change in the output. In this paper we provide a polynomial-time characterization of such limits for greedy heuristics on two classes of binary knapsack problems, namely the 0-1 knapsack problem and the subset sum problem. We also study the relation between algorithms that solve knapsack problems and algorithms that solve their sensitivity analysis problems, the conditions under which the sensitivity analysis of the heuristic yields bounds on the tolerance limits for the optimal solutions, and the empirical behavior of the greedy output when the problem data change.
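
    The sketch below illustrates the setting of this abstract with the standard profit-to-weight-ratio greedy for the 0-1 knapsack problem, plus a brute-force scan of how far one item's profit can move before the greedy selection changes. The function names, scan parameters, and toy instance are illustrative assumptions; this is not the polynomial-time characterization developed in the paper.

```python
# Minimal sketch: ratio-greedy heuristic for the 0-1 knapsack problem and a
# brute-force scan of how far one item's profit can vary before the greedy
# output changes. Illustrative only; not the paper's procedure.

def greedy_knapsack(profits, weights, capacity):
    """Pick items in decreasing profit/weight ratio while they still fit."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    chosen, load = set(), 0
    for i in order:
        if load + weights[i] <= capacity:
            chosen.add(i)
            load += weights[i]
    return chosen

def profit_tolerance(profits, weights, capacity, item, step=0.1, span=30.0):
    """Scan one item's profit up and down and report how far it can move
    before the greedy selection changes (brute force, for illustration)."""
    base = greedy_knapsack(profits, weights, capacity)

    def limit(direction):
        value = profits[item]
        while abs(value - profits[item]) < span:
            trial = list(profits)
            trial[item] = value + direction * step
            if greedy_knapsack(trial, weights, capacity) != base:
                break
            value += direction * step
        return value

    return limit(-1), limit(+1)

if __name__ == "__main__":
    p, w, c = [60, 100, 120], [10, 20, 30], 50
    print(greedy_knapsack(p, w, c))           # greedy selection, e.g. {0, 1}
    print(profit_tolerance(p, w, c, item=0))  # observed stable range for item 0's profit
```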

    An X-Ray Study of Some Leathers


    Complexity of determining exact tolerances for min-max combinatorial optimization problems

    Suppose that we are given an instance of a combinatorial optimization problem with min-max objective along with an optimal solution for it. Let the cost of a single element be varied. We refer to the range of values of the element's cost for which the given optimal solution remains optimal as its exact tolerance. In this paper we examine the problem of determining the exact tolerance of each element in combinatorial optimization problems with min-max objectives. We show that under very weak assumptions, the exact tolerance of each element can be determined in polynomial time if and only if the original optimization problem can be solved in polynomial time.
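
    As a concrete illustration of the exact-tolerance notion (not the paper's construction), the sketch below takes a tiny bottleneck-path instance, whose min-max objective is to minimize the largest edge cost on an s-t path, and scans one edge's cost to find the interval over which the current optimal path remains optimal. The instance, helper names, and scan granularity are assumptions for illustration only.

```python
# Toy min-max instance: among all s-t paths, minimize the largest edge cost.
EDGES = {("s", "a"): 4, ("a", "t"): 6, ("s", "b"): 5, ("b", "t"): 5}

def simple_paths(u, t, used=()):
    """Enumerate simple u-t paths as tuples of edges (tiny instance only)."""
    if u == t:
        yield used
        return
    visited = {e[0] for e in used} | {u}
    for (x, y) in EDGES:
        if x == u and y not in visited:
            yield from simple_paths(y, t, used + ((x, y),))

def bottleneck(path, costs):
    """Min-max objective: the largest edge cost on the path."""
    return max(costs[e] for e in path)

def bottleneck_optimum(costs):
    """Return the s-t path whose largest edge cost is smallest."""
    return min(simple_paths("s", "t"), key=lambda p: bottleneck(p, costs))

def exact_tolerance(edge, step=0.1, span=5.0):
    """Scan one edge's cost in both directions and report the interval over
    which the current optimal path remains optimal (brute force)."""
    base = bottleneck_optimum(EDGES)
    lo = hi = EDGES[edge]
    for direction in (-1, +1):
        value = EDGES[edge]
        while abs(value - EDGES[edge]) < span:
            trial = dict(EDGES)
            trial[edge] = value + direction * step
            # The given solution stays optimal while it matches the new optimum.
            if bottleneck(base, trial) != bottleneck(bottleneck_optimum(trial), trial):
                break
            value += direction * step
        lo, hi = min(lo, value), max(hi, value)
    return lo, hi

print(bottleneck_optimum(EDGES))    # (('s', 'b'), ('b', 't')), bottleneck 5
print(exact_tolerance(("s", "b")))  # roughly (0.0, 6.0) on this toy instance
```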

    Rheumatoid arthritis-associated interstitial lung disease seen in two generations of females in an Indian family

    Interstitial lung diseases (ILDs), or diffuse parenchymal lung diseases (DPLDs), are a group of lung diseases distinguished by subacute or chronic inflammation and/or fibrosis. Family history is currently considered one of the biggest risk factors for ILD. In rheumatoid arthritis (RA), a systemic autoimmune disease, the lungs are the most commonly involved extra-articular organ, and the associated interstitial lung disease is one of the major causes of mortality, along with severe disability. Lung involvement in RA may appear as ILD, pleural effusion, or pulmonary vasculitis. In this case report, a 42-year-old female presented with complaints of progressive breathlessness, dry cough, chest pain, and joint pain for the past 10 years. HRCT of the thorax suggested ILD of UIP pattern, with raised RF and anti-CCP and a positive ANA profile. The patient had a family history: her mother had been diagnosed with ILD of NSIP pattern. She was suspected of having RA, as she had complained of small-joint pains and swelling and responded well to steroids and HCQ.

    Systematic effects from black hole-neutron star waveform model uncertainties on the neutron star equation of state

    We identify various contributors to systematic effects in the measurement of the neutron star (NS) tidal deformability and quantify their magnitude for several types of neutron star-black hole (NSBH) binaries. Gravitational waves from NSBH mergers contain information about the components' masses and spins as well as the NS equation of state. Extracting this information requires comparison of the signal in noisy detector data with theoretical templates derived from some combination of post-Newtonian (PN) approximants, effective one-body (EOB) models, and analytic fits to numerical relativity (NR) simulations. The accuracy of these templates is limited by errors in the NR simulations, by the approximate nature of the PN/EOB waveforms, and by the hybridization procedure used to combine them. In this paper, we estimate the impact of these errors by constructing and comparing a set of PN-NR hybrid waveforms for such systems, for the first time using NR waveforms from two different codes, namely SpEC and SACRA. We then attempt to recover the parameters of the binary using two non-precessing template approximants. We find that systematic errors are too large for tidal effects to be accurately characterized for any realistic NS equation of state model. We conclude that NSBH waveform models must be significantly improved if they are to be useful for the extraction of NS equation of state information or even for distinguishing NSBH systems from binary black holes.
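
    To make the hybridization step concrete, the sketch below blends a toy analytic inspiral waveform into a toy "numerical" waveform over a smooth matching window, which is the general shape of a PN-NR hybridization. The toy chirps, window placement, and mismatch proxy are assumptions for illustration; they are not the SpEC/SACRA data or the hybridization procedure used in the paper.

```python
# Minimal sketch of PN-NR hybridization: blend an analytic early-inspiral
# waveform into a "numerical" late-inspiral waveform across a smooth window.
import numpy as np

t = np.linspace(-10.0, 0.0, 4000)            # time (arbitrary units)
phase = 2 * np.pi * (5.0 * t + 0.3 * t**2)   # toy chirping phase
h_pn = np.cos(phase)                         # stand-in "PN" strain
h_nr = np.cos(phase + 0.02)                  # stand-in "NR" strain (small offset)

t1, t2 = -6.0, -4.0                          # hybridization window
w = np.clip((t - t1) / (t2 - t1), 0.0, 1.0)
blend = 0.5 - 0.5 * np.cos(np.pi * w)        # 0 before t1, 1 after t2

h_hybrid = (1.0 - blend) * h_pn + blend * h_nr

# Mismatch between hybrid and NR after the window, as a rough error proxy.
late = t > t2
overlap = np.dot(h_hybrid[late], h_nr[late]) / (
    np.linalg.norm(h_hybrid[late]) * np.linalg.norm(h_nr[late]))
print("post-window mismatch:", 1.0 - overlap)
```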

    Quality assessment and refinement of chromatin accessibility data using a sequence-based predictive model

    Chromatin accessibility assays are central to the genome-wide identification of gene regulatory elements associated with transcriptional regulation. However, the data have highly variable quality arising from several biological and technical factors. To surmount this problem, we developed a sequence-based machine learning method to evaluate and refine chromatin accessibility data. Our framework, gapped k-mer SVM quality check (gkmQC), provides quality metrics for a sample based on the prediction accuracy of the trained models. We tested 886 DNase-seq samples from the ENCODE/Roadmap projects to demonstrate that gkmQC can effectively identify high-quality (HQ) samples with low conventional quality scores owing to marginal read depths. Peaks identified in HQ samples are more accurately aligned at functional regulatory elements, show greater enrichment of regulatory elements harboring functional variants, and explain greater heritability of phenotypes from their relevant tissues. Moreover, gkmQC can optimize the peak-calling threshold to identify additional peaks, especially for rare cell types in single-cell chromatin accessibility data.
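
    The sketch below illustrates the core idea of scoring a sample by the cross-validated prediction accuracy of a sequence model that separates peak sequences from background. gkmQC itself trains a gapped k-mer SVM; the k-mer logistic regression, the planted-motif toy sequences, and the mean-AUROC score here are stand-ins assumed for illustration.

```python
# Minimal sketch of the gkmQC idea: score a chromatin-accessibility sample by
# how well a sequence model separates its peak sequences from background
# (cross-validated AUROC as the quality metric). Toy data; illustrative only.
import random

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

random.seed(0)

def random_seq(n):
    return "".join(random.choice("ACGT") for _ in range(n))

# Toy data: "peak" sequences carry a planted motif, background does not.
peaks = [random_seq(95) + "GGGATTTCC" + random_seq(96) for _ in range(200)]
background = [random_seq(200) for _ in range(200)]
seqs = peaks + background
labels = [1] * len(peaks) + [0] * len(background)

model = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(6, 6), lowercase=False),
    LogisticRegression(max_iter=1000),
)

# Higher cross-validated AUROC -> peak calls look more "sequence-driven",
# which the framework interprets as a higher-quality sample.
auroc = cross_val_score(model, seqs, labels, cv=5, scoring="roc_auc").mean()
print(f"quality score (mean AUROC): {auroc:.3f}")
```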

    Can trans-generational experiments be used to enhance species resilience to ocean warming and acidification?

    Human-assisted, trans-generational exposure to ocean warming and acidification has been proposed as a conservation and/or restoration tool to produce resilient offspring. To improve our understanding of the need for and the efficacy of this approach, we characterized life-history and physiological responses in offspring of the marine polychaete Ophryotrocha labronica exposed to predicted ocean warming (OW: +3 °C), ocean acidification (OA: pH -0.5), and their combination (OWA: +3 °C, pH -0.5), following the exposure of their parents to either control conditions (within-generational exposure) or the same conditions (trans-generational exposure). Trans-generational exposure to OW fully alleviated the negative effects of within-generational exposure to OW on fecundity and egg volume and was accompanied by increased metabolic activity. While within-generational exposure to OA reduced juvenile growth rates and egg volume, trans-generational exposure alleviated the former but could not restore the latter. Surprisingly, exposure to OWA had no negative impacts within- or trans-generationally. Our results highlight the potential of trans-generational laboratory experiments for producing offspring that are resilient to OW and OA. However, trans-generational exposure does not always improve traits and therefore may not be a universally useful tool for all species in the face of global change.

    Adapting the HHL algorithm to (non-unitary) quantum many-body theory

    Rapid progress in developing near- and long-term quantum algorithms for quantum chemistry has provided us with an impetus to move beyond traditional approaches and explore new ways to apply quantum computing to electronic structure calculations. In this work, we identify the connection between quantum many-body theory and a quantum linear solver, and implement the Harrow-Hassidim-Lloyd (HHL) algorithm to make precise predictions of correlation energies for light molecular systems via the (non-unitary) linearised coupled cluster theory. We alter the HHL algorithm to incorporate two novel aspects: (a) we prescribe a novel scaling approach that allows one to scale an arbitrary symmetric positive definite matrix A so that Ax = b can be solved for x with reasonable precision, without having to compute the eigenvalues of A, and (b) we devise techniques that reduce the depth of the overall circuit. In this context, we introduce the following variants of HHL for different eras of quantum computing: AdaptHHLite, in its appropriate forms for the noisy intermediate-scale quantum (NISQ), late-NISQ, and early fault-tolerant eras, as well as AdaptHHL for the fault-tolerant quantum computing era. We demonstrate the ability of the NISQ variant of AdaptHHLite to capture correlation energy precisely while remaining resource-lean, using simulation as well as the 11-qubit IonQ quantum hardware.
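
    A classical sketch of the scaling idea in point (a): bound the largest eigenvalue of a symmetric positive definite A without an eigendecomposition (for example, via a Gershgorin row-sum bound), rescale the system so the spectrum fits the window that HHL's phase estimation prefers, and solve the equivalently scaled system. The particular bound, the 0.9 target, and the classical solve standing in for the quantum routine are illustrative assumptions, not the AdaptHHL(ite) prescription from the paper.

```python
# Minimal classical illustration of scaling an SPD matrix without computing
# its eigenvalues, so the scaled spectrum lies in (0, 1) as HHL's phase
# estimation prefers. The quantum circuit itself is omitted.
import numpy as np

def scaled_solve(A, b):
    # Cheap, always-valid upper bound on the largest eigenvalue:
    # maximum absolute row sum (Gershgorin), no eigendecomposition needed.
    lam_max_bound = np.max(np.sum(np.abs(A), axis=1))
    s = 0.9 / lam_max_bound          # keep scaled eigenvalues safely below 1
    A_scaled = s * A                 # spectrum of A_scaled lies in (0, 0.9]

    # Stand-in for the quantum linear solver: solve the scaled system
    # classically; A x = b is equivalent to A_scaled x = s b.
    x = np.linalg.solve(A_scaled, s * b)
    return x, s

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    M = rng.normal(size=(4, 4))
    A = M @ M.T + 4 * np.eye(4)      # symmetric positive definite test matrix
    b = rng.normal(size=4)
    x, s = scaled_solve(A, b)
    print("residual:", np.linalg.norm(A @ x - b))  # ~1e-15
```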