
    Independent predictors of breast malignancy in screen-detected microcalcifications: biopsy results in 2545 cases

    Background: Mammographic microcalcifications are associated with many benign lesions, ductal carcinoma in situ (DCIS) and invasive cancer. Careful assessment criteria are required to minimise benign biopsies while optimising cancer diagnosis. We wished to evaluate the assessment outcomes of microcalcifications biopsied in the setting of population-based breast cancer screening. Methods: Between January 1992 and December 2007, cases biopsied in which microcalcifications were the only imaging abnormality were included. Patient demographics, imaging features and final histology were subjected to statistical analysis to determine independent predictors of malignancy. Results: In all, 2545 lesions, with a mean diameter of 21.8 mm (s.d. 23.8 mm) and observed in patients with a mean age of 57.7 years (s.d. 8.4 years), were included. Using the grading system adopted by the RANZCR, the grade was 3 in 47.7%; 4 in 28.3% and 5 in 24.0%. After assessment, 1220 lesions (47.9%) were malignant (809 DCIS only, 411 DCIS with invasive cancer) and 1325 (52.1%) were non-malignant, including 122 (4.8%) premalignant lesions (lobular carcinoma in situ, atypical lobular hyperplasia and atypical ductal hyperplasia). Only 30.9% of the DCIS was of low grade. Mammographic extent of microcalcifications >15 mm, imaging grade, their pattern of distribution, presence of a palpable mass and detection after the first screening episode showed significant univariate associations with malignancy. On multivariate modeling imaging grade, mammographic extent of microcalcifications >15 mm, palpable mass and screening episode were retained as independent predictors of malignancy. Radiological grade had the largest effect with lesions of grade 4 and 5 being 2.2 and 3.3 times more likely to be malignant, respectively, than grade 3 lesions. 
Conclusion: The radiological grading scheme used throughout Australia and parts of Europe is validated as a useful system for stratifying microcalcifications into groups with significantly different risks of malignancy. Biopsy assessment of appropriately selected microcalcifications is an effective method of detecting invasive breast cancer and DCIS, particularly of non-low-grade subtypes.
    G Farshid, T Sullivan, P Downey, P G Gill, and S Pieters

    The Parameterized Complexity of Domination-type Problems and Application to Linear Codes

    We study the parameterized complexity of domination-type problems. (sigma, rho)-domination is a general and unifying framework introduced by Telle: a set D of vertices of a graph G is (sigma, rho)-dominating if for any v in D, |N(v) ∩ D| is in sigma, and for any v not in D, |N(v) ∩ D| is in rho. We mainly show that for any sigma and rho the problem of (sigma, rho)-domination is in W[2] when parameterized by the size of the dominating set. This general statement is optimal in the sense that several particular instances of (sigma, rho)-domination are W[2]-complete (e.g. Dominating Set). We also prove that (sigma, rho)-domination is in W[2] for the dual parameterization, i.e. when parameterized by the size of the dominated set. We extend this result to a class of domination-type problems that do not fall into the (sigma, rho)-domination framework, including Connected Dominating Set. We also consider problems of coding theory that are related to domination-type problems with parity constraints. In particular, we prove that the problem of the minimal distance of a linear code over F_q is in W[2] for both the standard and dual parameterizations, and W[1]-hard for the dual parameterization. To prove W[2]-membership of the domination-type problems, we extend the Turing way to parameterized complexity by introducing a new kind of non-deterministic Turing machine with the ability to perform 'blind' transitions, i.e. transitions that do not depend on the content of the tapes. We prove that the corresponding problem Short Blind Multi-Tape Non-Deterministic Turing Machine is W[2]-complete. We believe that this new machine can be used to prove W[2]-membership of other problems, not necessarily related to domination.
    Comment: 19 pages, 2 figures
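    The (sigma, rho) condition quoted in the abstract is easy to verify directly. Below is a minimal sketch, not from the paper: `graph` is a hypothetical adjacency-list dict, and sigma and rho are supplied as membership predicates over the non-negative integers.

```python
def is_sigma_rho_dominating(graph, D, sigma, rho):
    """Check Telle's (sigma, rho)-domination condition.

    graph: dict mapping vertex -> set of neighbours (assumed representation).
    D: candidate dominating set.  sigma, rho: predicates int -> bool.
    """
    D = set(D)
    for v in graph:
        deg_in_D = len(graph[v] & D)   # |N(v) ∩ D|
        if v in D:
            if not sigma(deg_in_D):    # constraint on vertices inside D
                return False
        elif not rho(deg_in_D):        # constraint on vertices outside D
            return False
    return True

# Classical Dominating Set is the instance sigma = all counts allowed,
# rho = "at least one neighbour in D".
graph = {1: {2, 3}, 2: {1}, 3: {1}}
print(is_sigma_rho_dominating(graph, {1}, lambda d: True, lambda d: d >= 1))
# → True
```

    Choosing other predicates recovers further named problems, e.g. rho(d) = (d == 1) gives Perfect Domination.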

    The Complexity of Routing with Few Collisions

    We study the computational complexity of routing multiple objects through a network in such a way that only few collisions occur: Given a graph G with two distinct terminal vertices and two positive integers p and k, the question is whether one can connect the terminals by at least p routes (e.g. paths) such that at most k edges are time-wise shared among them. We study three types of routes: traverse each vertex at most once (paths), each edge at most once (trails), or no such restrictions (walks). We prove that for paths and trails the problem is NP-complete on undirected and directed graphs even if k is constant or the maximum vertex degree in the input graph is constant. For walks, however, it is solvable in polynomial time on undirected graphs for arbitrary k and on directed graphs if k is constant. We additionally study for all route types a variant of the problem where the maximum length of a route is restricted by some given upper bound. We prove that this length-restricted variant has the same complexity classification with respect to paths and trails, but for walks it becomes NP-complete on undirected graphs.
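    The objective "at most k edges are shared among the p routes" can be illustrated with a small checker. This is my own reading of the measure, not code from the paper; routes are given as vertex lists in an undirected graph.

```python
from collections import Counter

def shared_edges(routes):
    """Count edges used by more than one route.

    routes: list of routes, each a list of vertices; an undirected edge
    is represented as a frozenset of its two endpoints.
    """
    usage = Counter()
    for route in routes:
        # count each edge once per route, even if a walk repeats it
        for e in {frozenset(pair) for pair in zip(route, route[1:])}:
            usage[e] += 1
    return sum(1 for count in usage.values() if count > 1)

routes = [[0, 1, 2, 3], [0, 1, 4, 3]]   # both routes use edge {0, 1}
print(shared_edges(routes))  # → 1
```

    A routing instance (G, p, k) is then a yes-instance exactly when some family of p terminal-to-terminal routes has `shared_edges(routes) <= k`.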

    Covering Pairs in Directed Acyclic Graphs

    The Minimum Path Cover problem on directed acyclic graphs (DAGs) is a classical problem that provides a clear and simple mathematical formulation for several applications in different areas and that has an efficient algorithmic solution. In this paper, we study the computational complexity of two constrained variants of Minimum Path Cover motivated by the recent introduction of next-generation sequencing technologies in bioinformatics. The first problem (MinPCRP), given a DAG and a set of pairs of vertices, asks for a minimum cardinality set of paths "covering" all the vertices such that both vertices of each pair belong to the same path. For this problem, we show that, while it is NP-hard to compute if there exists a solution consisting of at most three paths, it is possible to decide in polynomial time whether a solution consisting of at most two paths exists. The second problem (MaxRPSP), given a DAG and a set of pairs of vertices, asks for a path containing the maximum number of the given pairs of vertices. We show its NP-hardness and also its W[1]-hardness when parameterized by the number of covered pairs. On the positive side, we give a fixed-parameter algorithm when the parameter is the maximum overlapping degree, a natural parameter in the bioinformatics applications of the problem.
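    The MaxRPSP objective for a single candidate path is simply the number of given pairs whose two vertices both lie on the path. A minimal sketch of that counting step (illustrative only, not the paper's algorithm):

```python
def pairs_covered(path, pairs):
    """Number of vertex pairs entirely contained in `path`.

    path: list of vertices along a path of the DAG.
    pairs: list of (u, v) vertex pairs to be covered.
    """
    on_path = set(path)
    return sum(1 for (u, v) in pairs if u in on_path and v in on_path)

# pairs (1, 3) and (3, 4) lie on the path; (2, 5) does not (5 is absent)
print(pairs_covered([1, 2, 3, 4], [(1, 3), (2, 5), (3, 4)]))  # → 2
```

    MaxRPSP asks for the source-to-sink path maximising this count, which is what the W[1]-hardness result (parameter: number of covered pairs) refers to.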

    Completeness and Accuracy of Emergency Medical Information on the Web: Update 2008

    Introduction: Reliable and accurate Web-based health information is extremely valuable when applied to emergency medical diagnoses. With this update we seek to build upon the 2004 study by determining whether the completeness and accuracy of emergency medical information available online has improved over time.
    Methods: The top 15 healthcare information sites, as determined by internet traffic, were reviewed between February 4, 2008, and February 29, 2008. Standard checklists were created from information provided by the American Stroke Association, American Heart Association, National Institutes of Health, and American College of Emergency Physicians to evaluate medical content on each of the Web sites for 4 common emergency department diagnoses: myocardial infarct, stroke, influenza, and febrile child. Each Web site was evaluated for descriptive information, completeness, and accuracy. Data were sorted for total medical checklist items, certification and credentialing, and medical items by topic.
    Results: Three of the 15 sites were excluded because of a lack of medical information on the selected topics. Completeness of sites ranged from 46% to 80% of total checklist items found. The median percentage of items found was 72%. Two sites, MSN Health and Yahoo!Health, contained the greatest amount of medical information, with 98 of 123 checklist items found for each site. All Web sites but 1, Healthology.com, contained greater than 50% of aggregated checklist items, and the majority (7 of 12) contained greater than 70%. Healthology.com was the least complete Web site, containing 57 of 123 items. No significant correlation was found between credentialing and completeness of site (correlation coefficient = -0.385) or credentialing and site popularity (correlation coefficient = 0.184).
    Conclusion: This study indicates that the completeness and accuracy of online emergency medical information available to the general public has improved over the past 6 years. Overall, the health Web sites studied contained greater than 70% of aggregated medical information on 4 common emergency department diagnoses, and 4 sites examined advanced from 2002 to 2008. [West J Emerg Med. 2011;12(4):448–454.]

    Tree Compression with Top Trees Revisited

    We revisit tree compression with top trees (Bille et al., ICALP'13) and present several improvements to the compressor and its analysis. By significantly reducing the amount of information stored and guiding the compression step using a RePair-inspired heuristic, we obtain a fast compressor achieving good compression ratios, addressing an open problem posed by Bille et al. We show how, with relatively small overhead, the compressed file can be converted into an in-memory representation that supports basic navigation operations in worst-case logarithmic time without decompression. We also show a much improved worst-case bound on the size of the output of top-tree compression (answering an open question posed in a talk on this algorithm by Weimann in 2012).
    Comment: SEA 201

    The parameterized complexity of some geometric problems in unbounded dimension

    We study the parameterized complexity of the following fundamental geometric problems with respect to the dimension d: i) Given n points in R^d, compute their minimum enclosing cylinder. ii) Given two n-point sets in R^d, decide whether they can be separated by two hyperplanes. iii) Given a system of n linear inequalities with d variables, find a maximum-size feasible subsystem. We show that (the decision versions of) all these problems are W[1]-hard when parameterized by the dimension d, and hence not solvable in O(f(d)n^c) time for any computable function f and constant c (unless FPT = W[1]). Our reductions also give an n^{Omega(d)}-time lower bound (under the Exponential Time Hypothesis).
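    Problem iii), the maximum feasible subsystem, is easy to state concretely in the one-variable case d = 1, where each inequality a*x <= b is a half-line. The brute-force sketch below is illustrative only (my own construction, exponential in n, and not the paper's method):

```python
from itertools import combinations

def feasible_1d(rows):
    """rows: list of (a, b) meaning a*x <= b over a single variable x."""
    lo, hi = float("-inf"), float("inf")
    for a, b in rows:
        if a > 0:
            hi = min(hi, b / a)        # x <= b/a
        elif a < 0:
            lo = max(lo, b / a)        # x >= b/a (inequality flips)
        elif b < 0:
            return False               # 0*x <= b with b < 0: contradiction
    return lo <= hi

def max_feasible_subsystem(rows):
    """Largest k such that some k of the inequalities are jointly satisfiable."""
    for k in range(len(rows), 0, -1):
        if any(feasible_1d(sub) for sub in combinations(rows, k)):
            return k
    return 0

# x <= 1, x >= 3, x <= 5: all three conflict, but any consistent pair works
rows = [(1, 1), (-1, -3), (1, 5)]
print(max_feasible_subsystem(rows))  # → 2
```

    The hardness result says that in dimension d there is no f(d)*n^c algorithm for this problem unless FPT = W[1], so the exponential dependence on the input is, in a parameterized sense, unavoidable.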

    A note on the differences of computably enumerable reals

    We show that given any non-computable left-c.e. real α there exists a left-c.e. real β such that α ≠ β + γ for every real γ that is left-c.e. or right-c.e. The proof is non-uniform, the dichotomy being whether the given real α is Martin-Löf random or not. It follows that given any universal machine U, there is another universal machine V such that the halting probability of U is not a translation of the halting probability of V by a left-c.e. real. We do not know whether there is a uniform proof of this fact.

    Counting dependent and independent strings

    The paper gives estimations for the sizes of the following sets: (1) the set of strings that have a given dependency with a fixed string, (2) the set of strings that are pairwise \alpha-independent, (3) the set of strings that are mutually \alpha-independent. The relevant definitions are as follows: C(x) is the Kolmogorov complexity of the string x. A string y has \alpha-dependency with a string x if C(y) - C(y|x) \geq \alpha. A set of strings {x_1, \ldots, x_t} is pairwise \alpha-independent if for all i different from j, C(x_i) - C(x_i | x_j) \leq \alpha. A tuple of strings (x_1, \ldots, x_t) is mutually \alpha-independent if C(x_{\pi(1)} \ldots x_{\pi(t)}) \geq C(x_1) + \ldots + C(x_t) - \alpha, for every permutation \pi of [t].

    Mathematical models for use in planning regional water resources and energy systems

    Existing and projected energy facilities will, in the near future, place major demands on the country's water resources. These demands compete with many other uses of the resources, including municipal and industrial uses, navigation, irrigation, and water quality maintenance. The possible development of coal conversion facilities presents another potential water demand. Complex public sector problems such as: 1) the extent and development of coal conversion capacity, 2) interbasin transfer of water, 3) cooling technologies for large energy facilities, 4) diversion of Lake Michigan water, and 5) allowable withdrawal and consumptive uses of river water, all arise from the interlocking nature of the water resources-energy system. Although mathematical models cannot solve these problems directly, they can be useful in gaining insight into major issues associated with policy alternatives. With the aid of such models, quantitative trends such as costs and water development patterns associated with each decision alternative can be more readily identified. In this report, mathematical models are presented for use in planning a regional allocation of water for energy facilities as well as for other water uses. These models include components for the interrelated water and energy subsystems. The use of these models in conjunction with other existing models in order to provide a better picture of the overall system is discussed. Since the models use widely available computer codes, they are practical and easy to utilize. Example applications are presented, with a discussion of computational results.
    U.S. Geological Survey; U.S. Department of the Interior