
    Estimating Depth from RGB and Sparse Sensing

    We present a deep model that accurately produces dense depth maps given an RGB image with known depth at a very sparse set of pixels. The model works simultaneously for indoor and outdoor scenes and produces state-of-the-art dense depth maps at near-real-time speeds on both the NYUv2 and KITTI datasets. We surpass the state of the art for monocular depth estimation even with depth values for only 1 out of every ~10,000 image pixels, and we outperform other sparse-to-dense depth methods at all sparsity levels. With depth values for 1/256 of the image pixels, we achieve a mean absolute error of less than 1% of actual depth on indoor scenes, comparable to the performance of consumer-grade depth sensor hardware. Our experiments demonstrate that it is indeed possible to efficiently transform sparse depth measurements, obtained using, e.g., low-power depth sensors or SLAM systems, into high-quality dense depth maps.
    Comment: European Conference on Computer Vision (ECCV) 2018. Updated to camera-ready version with additional experiment
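
    As a rough illustration of the sparsity regime described above (depth at 1/256 of pixels corresponds to a 16-pixel grid stride), the sketch below samples a synthetic depth map on such a grid and fills the remaining pixels by nearest-sample copying, a naive stand-in for the learned model. The array names and the synthetic scene are invented for the example:

```python
import numpy as np

# Hypothetical smooth scene standing in for a real dataset frame.
H, W, stride = 64, 64, 16                 # stride 16 -> 1/256 of pixels kept
yy, xx = np.mgrid[0:H, 0:W]
dense = 3.0 + np.sin(yy / 11.0) + np.cos(xx / 7.0)

# Keep depth only at a sparse regular grid (as a low-power sensor might),
# then fill every pixel with its nearest sampled value (naive baseline).
sy = (np.round(yy / stride) * stride).clip(0, H - 1).astype(int)
sx = (np.round(xx / stride) * stride).clip(0, W - 1).astype(int)
pred = dense[sy, sx]

# Mean absolute relative error, the style of metric quoted in the abstract.
rel_err = np.mean(np.abs(pred - dense) / dense)
print(f"mean absolute relative error: {rel_err:.3%}")
```

    A learned model replaces the nearest-sample fill with a network that also sees the RGB image, which is what closes the gap to sensor-grade accuracy.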

    Prognostic factors in patients with acute mesenteric ischemia-novel tools for determining patient outcomes

    BACKGROUND: Acute mesenteric ischemia (AMI) is a devastating disease with a poor prognosis. Owing to the multitude of underlying factors, prediction of outcomes remains poor. We aimed to identify factors governing diagnosis and survival in AMI and to develop novel prognostic tools. METHODS: This monocentric retrospective study analyzed patients with suspected AMI who underwent imaging between January 2014 and December 2019. Subgroup analyses were performed for patients with confirmed AMI undergoing surgery. Nomograms were calculated based on multivariable logistic regression models. RESULTS: Five hundred and thirty-nine patients underwent imaging for clinically suspected AMI, with 216 examinations showing radiological indication of AMI. Intestinal necrosis (IN) was confirmed in 125 patients undergoing surgery, 58 of whom survived and 67 of whom died (median 9 days after diagnosis, IQR 22). Increasing age, ASA score, pneumatosis intestinalis, and dilated bowel loops were significantly associated with the presence of IN upon radiological suspicion. In contrast, decreased pH, elevated creatinine, radiological atherosclerosis, vascular occlusion (versus non-occlusive AMI), and colonic involvement (compared to small-bowel ischemia only) were associated with impaired survival in patients undergoing surgery. Based on the identified factors, we developed two nomograms to aid in the prediction of IN upon radiological suspicion (C-index = 0.726) and of survival in patients undergoing surgery for IN (C-index = 0.791). CONCLUSION: As AMI remains a condition with high mortality, we identified factors predicting the occurrence of IN with suspected AMI and survival when undergoing surgery for IN. We provide two new tools which combine these parameters and might prove helpful in the treatment of patients with AMI.
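
    The C-index reported for the nomograms is, for a binary outcome, the probability that a randomly chosen case with the outcome is ranked above a randomly chosen case without it (for binary outcomes this equals the area under the ROC curve). A minimal sketch with made-up scores, not study data:

```python
def c_index(scores, events):
    """Concordance: fraction of (event, non-event) pairs in which the event
    case received the higher risk score; ties count as 0.5."""
    pairs = concordant = 0.0
    for si, ei in zip(scores, events):
        for sj, ej in zip(scores, events):
            if ei == 1 and ej == 0:                  # one comparable pair
                pairs += 1
                concordant += 1.0 if si > sj else (0.5 if si == sj else 0.0)
    return concordant / pairs

# Illustrative model risk scores and outcomes (1 = event occurred).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
events = [1,   1,   0,   1,   0,   0]
print(round(c_index(scores, events), 3))             # -> 0.889
```

    A C-index of 0.5 is chance-level ranking; values like 0.726 and 0.791 indicate moderate-to-good discrimination.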

    On Optimization Modulo Theories, MaxSMT and Sorting Networks

    Optimization Modulo Theories (OMT) is an extension of SMT that allows for finding models that optimize given objectives. (Partial weighted) MaxSMT, or equivalently OMT with pseudo-Boolean objective functions (OMT+PB), is a highly relevant strict subcase of OMT. We classify existing approaches for MaxSMT or OMT+PB into two groups: MaxSAT-based approaches exploit the efficiency of state-of-the-art MaxSAT solvers, but they are special-purpose and not always applicable; OMT-based approaches are general-purpose, but they suffer from intrinsic inefficiencies on MaxSMT/OMT+PB problems. We identify a major source of such inefficiencies and address it by enhancing OMT by means of bidirectional sorting networks. We implemented this idea on top of the OptiMathSAT OMT solver. We ran an extensive empirical evaluation on a variety of problems, comparing MaxSAT-based and OMT-based techniques, with and without sorting networks, implemented on top of OptiMathSAT and νZ. The results support the effectiveness of this idea and provide interesting insights into the different approaches.
    Comment: 17 pages, submitted at TACAS 1
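
    For intuition about why sorting networks help with pseudo-Boolean objectives: a sorting network applied to Boolean inputs yields a unary counter, so a bound on how many inputs are true becomes a unit constraint on a single sorted output. The toy sketch below uses a plain (unidirectional, quadratic) bubble network, not the bidirectional construction of the paper:

```python
def bubble_network(n):
    """Fixed comparator schedule of a quadratic bubble sorting network."""
    return [(i, i + 1) for p in range(n - 1) for i in range(n - 1 - p)]

def apply_network(bits, comparators):
    out = list(bits)
    for i, j in comparators:        # each comparator moves the max forward
        out[i], out[j] = max(out[i], out[j]), min(out[i], out[j])
    return out

# Sorted (descending) outputs form a unary counter: output k is 1 iff at
# least k+1 inputs are 1, so "at most w inputs true" is one unit literal.
bits = [0, 1, 1, 0, 1]
print(apply_network(bits, bubble_network(5)))   # -> [1, 1, 1, 0, 0]
```

    In an encoding, each comparator becomes a handful of clauses over fresh variables; the bidirectional variant in the paper propagates bounds in both directions through the network.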

    What makes a phase transition? Analysis of the random satisfiability problem

    Over the last 30 years, it has been found that many combinatorial systems undergo phase transitions. One of the most important examples can be found among the random k-satisfiability problems (often referred to as k-SAT), which ask whether there exists an assignment of Boolean values satisfying a Boolean formula composed of clauses with k random variables each. The random 3-SAT problem is reported to show various phase transitions at different critical values of the ratio of the number of clauses to the number of variables. The most famous of these occurs when the probability of finding a satisfiable instance suddenly drops from 1 to 0. This transition is associated with a rise in the hardness of the problem, but until now the correlation between any of the proposed phase transitions and the hardness has not been fully clarified. In this paper we first show numerically that the number of solutions universally follows a lognormal distribution, thereby resolving the puzzling question of why the number of solutions is still exponential at the critical point. Moreover, we provide evidence that the hardness of the closely related problem of counting the total number of solutions does not show any phase-transition-like behavior. This raises the question of whether the probability of finding a satisfiable instance is really an order parameter of a phase transition or whether it merely exhibits a simple sharp threshold phenomenon. More generally, this paper aims at starting a discussion of when a simple sharp threshold phenomenon turns into a genuine phase transition.
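
    The satisfiability drop described above can be observed even at toy scale: generate random 3-SAT instances at clause-to-variable ratios below and above the reported threshold of roughly 4.27 and brute-force them. A small sketch, with parameters chosen for speed rather than statistical rigor:

```python
import random
from itertools import product

random.seed(0)

def random_3sat(n, m):
    """m random clauses over n variables: 3 distinct vars, random signs."""
    return [tuple(v if random.random() < 0.5 else -v
                  for v in random.sample(range(1, n + 1), 3))
            for _ in range(m)]

def satisfiable(n, clauses):
    # Brute force over all 2^n assignments; fine for illustrative n.
    return any(all(any(bits[abs(l) - 1] == (l > 0) for l in c)
                   for c in clauses)
               for bits in product((False, True), repeat=n))

n, trials, p = 10, 30, {}
for ratio in (3.0, 6.0):       # clause ratios below/above the ~4.27 threshold
    p[ratio] = sum(satisfiable(n, random_3sat(n, int(ratio * n)))
                   for _ in range(trials)) / trials
    print(f"alpha = {ratio}: P(sat) ~ {p[ratio]:.2f}")
```

    At these tiny sizes the transition is smeared out; it sharpens as n grows, which is exactly what distinguishes a sharp threshold from finite-size noise.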

    Verification of Item Usage Rules in Product Configuration

    In the development of complex products, product configuration systems are often used to support the development process. Item Usage Rules (IURs) are conditions for including specific items in a product's bill of materials based on a high-level product description. The large number of items and the significant complexity of IURs make it difficult to maintain and analyze IURs manually. In this paper we present an automated approach for verifying IURs, which guarantees the presence of exactly one item from a predefined set in each product, as well as that an IUR can be reformulated without changing the set of products in which the item is included.
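
    The "exactly one item from a predefined set in each product" property can, for small option spaces, be checked by enumerating all configurations. A toy sketch with invented option and item names, not necessarily the paper's (solver-based) method:

```python
from itertools import product

# Hypothetical IURs over high-level options; each rule is a predicate on
# the product description deciding whether its item enters the BOM.
iurs = {
    "engine_small": lambda o: not o["sport"],
    "engine_large": lambda o: o["sport"],
}

def exactly_one_everywhere(iurs, options):
    """Check that every configuration selects exactly one item of the set;
    return a counterexample configuration if not."""
    for values in product((False, True), repeat=len(options)):
        cfg = dict(zip(options, values))
        picked = [name for name, rule in iurs.items() if rule(cfg)]
        if len(picked) != 1:
            return False, cfg
    return True, None

ok, witness = exactly_one_everywhere(iurs, ["sport", "towbar"])
print(ok)     # -> True: the two rules partition all configurations
```

    A real configurator would encode the IURs symbolically and hand the exactly-one query to a solver instead of enumerating, since option spaces are exponential.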

    Generalized Totalizer Encoding for Pseudo-Boolean Constraints

    Pseudo-Boolean constraints, also known as 0-1 Integer Linear Constraints, are used to model many real-world problems. A common approach to solving these constraints is to encode them into a SAT formula. The runtime of the SAT solver on such a formula is sensitive to the manner in which the given pseudo-Boolean constraints are encoded. In this paper, we propose the Generalized Totalizer Encoding (GTE), an arc-consistency-preserving extension of the Totalizer encoding to pseudo-Boolean constraints. Unlike some other encodings, the number of auxiliary variables required for GTE does not depend on the magnitudes of the coefficients. Instead, it depends on the number of distinct combinations of these coefficients. We show the superiority of GTE with respect to other encodings when large pseudo-Boolean constraints have a low number of distinct coefficients. Our experimental results also show that GTE remains competitive even when the pseudo-Boolean constraints do not have this characteristic.
    Comment: 10 pages, 2 figures, 2 tables. To be published in the 21st International Conference on Principles and Practice of Constraint Programming 201
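
    The claim that GTE's size tracks distinct coefficient combinations rather than magnitudes can be illustrated by counting the distinct partial sums reachable from a coefficient multiset, which is roughly what the encoding's auxiliary variables represent at the root of the totalizer tree:

```python
def distinct_sums(coeffs):
    """All nonzero sums achievable by subsets of the coefficients."""
    sums = {0}
    for c in coeffs:
        sums |= {s + c for s in sums}
    return sums - {0}

print(len(distinct_sums([5] * 6)))               # -> 6   (one distinct coefficient)
print(len(distinct_sums([1, 2, 4, 8, 16, 32])))  # -> 63  (all distinct)
print(len(distinct_sums([5000] * 6)))            # -> 6   (magnitude is irrelevant)
```

    With repeated coefficients the sum set collapses, so GTE stays small; with many distinct coefficients it can grow toward 2^n, matching the regimes compared in the experiments.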

    Efficient Certified Resolution Proof Checking

    We present a novel propositional proof tracing format that eliminates complex processing, thus enabling efficient (formal) proof checking. The benefits of this format are demonstrated by implementing a proof checker in C, which outperforms a state-of-the-art checker by two orders of magnitude. We then formalize the theory underlying propositional proof checking in Coq and extract a correct-by-construction proof checker for our format from the formalization. An empirical evaluation using 280 unsatisfiable instances from the 2015 and 2016 SAT competitions shows that this certified checker usually performs comparably to a state-of-the-art non-certified proof checker. Using this format, we formally verify the recent 200 TB proof of the Boolean Pythagorean Triples conjecture.
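
    Clausal proof checking of this kind commonly reduces each lemma to a reverse unit propagation (RUP) check: a lemma is admitted if asserting its negation lets unit propagation derive a conflict. A minimal, unoptimized sketch of that check, not the paper's specific format:

```python
def unit_propagate(clauses):
    """Return False if unit propagation derives a conflict, True otherwise."""
    assignment, changed = {}, True
    while changed:
        changed = False
        for clause in clauses:
            unassigned, satisfied = [], False
            for lit in clause:
                val = assignment.get(abs(lit))
                if val is None:
                    unassigned.append(lit)
                elif val == (lit > 0):
                    satisfied = True
                    break
            if satisfied:
                continue
            if not unassigned:
                return False                 # all literals false: conflict
            if len(unassigned) == 1:         # unit clause: forced assignment
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return True

def has_rup(clauses, lemma):
    """Lemma has RUP if adding its negated literals as units yields a conflict."""
    return not unit_propagate(clauses + [[-lit] for lit in lemma])

cnf = [[1, 2], [-1, 2], [1, -2]]
print(has_rup(cnf, [1]))     # -> True: [1] follows by unit propagation
```

    Efficient checkers avoid the repeated full scans above with watched literals, which is where most of the engineering (and the two-orders-of-magnitude gap) lives.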

    Alternative splicing substantially diversifies the transcriptome during early photomorphogenesis and correlates with the energy availability in Arabidopsis

    Plants use light as a source of energy and information to detect diurnal rhythms and seasonal changes. Sensing changing light conditions is critical to adjusting plant metabolism and initiating developmental transitions. Here, we analyzed transcriptome-wide alterations in gene expression and alternative splicing (AS) of etiolated seedlings undergoing photomorphogenesis upon exposure to blue, red, or white light. Our analysis revealed massive transcriptome reprogramming, reflected by differential expression of ~20% of all genes and changes in several hundred AS events. For more than 60% of all regulated AS events, light promoted the production of a presumably protein-coding variant at the expense of an mRNA with nonsense-mediated-decay-triggering features. Accordingly, AS of the putative splicing factor REDUCED RED-LIGHT RESPONSES IN CRY1CRY2 BACKGROUND 1 (RRC1), previously identified as a red-light signaling component, was shifted to the functional variant under light. Downstream analyses of candidate AS events pointed to a role for photoreceptor signaling in monochromatic, but not white, light. Furthermore, we demonstrated similar AS changes upon light exposure and exogenous sugar supply, with a critical involvement of kinase signaling. We propose that AS is an integration point of signaling pathways that sense and transmit information regarding the energy availability in plants.

    On Tackling the Limits of Resolution in SAT Solving

    The practical success of Boolean Satisfiability (SAT) solvers stems from the CDCL (Conflict-Driven Clause Learning) approach to SAT solving. However, from a propositional proof complexity perspective, CDCL is no more powerful than the resolution proof system, for which many hard examples exist. This paper proposes a new problem transformation, which enables reducing the decision problem for formulas in conjunctive normal form (CNF) to the problem of solving maximum satisfiability over Horn formulas. Given the new transformation, the paper proves a polynomial bound on the number of MaxSAT resolution steps for pigeonhole formulas. This result stands in clear contrast with earlier results on the length of MaxSAT resolution proofs for pigeonhole formulas. The paper also establishes the same polynomial bound in the case of modern core-guided MaxSAT solvers. Experimental results, obtained on CNF formulas known to be hard for CDCL SAT solvers, show that such formulas can be efficiently solved with modern MaxSAT solvers.
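
    Pigeonhole formulas PHP(m, n), which assert that m pigeons fit into n < m holes, are the classic family of resolution-hard CNFs mentioned above. A small generator plus a brute-force check, usable only at toy sizes:

```python
from itertools import product, combinations

def pigeonhole_cnf(pigeons, holes):
    """PHP(pigeons, holes): variable v(p, h) means pigeon p sits in hole h."""
    v = lambda p, h: p * holes + h + 1
    # Every pigeon sits somewhere...
    cnf = [[v(p, h) for h in range(holes)] for p in range(pigeons)]
    # ...and no two pigeons share a hole.
    for h in range(holes):
        for p, q in combinations(range(pigeons), 2):
            cnf.append([-v(p, h), -v(q, h)])
    return cnf

def satisfiable(n_vars, cnf):
    return any(all(any(b[abs(l) - 1] == (l > 0) for l in c) for c in cnf)
               for b in product((False, True), repeat=n_vars))

print(satisfiable(6, pigeonhole_cnf(3, 2)))   # -> False: 3 pigeons, 2 holes
```

    Resolution needs exponentially long refutations of these formulas, which is what makes the paper's polynomial MaxSAT-resolution bound notable.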