
    Mean-square stability analysis of approximations of stochastic differential equations in infinite dimensions

    Mean-square stability analysis treats the asymptotic behaviour of the second moment of solutions to stochastic differential equations. This property is discussed for approximations of infinite-dimensional stochastic differential equations, and necessary and sufficient conditions ensuring mean-square stability are given. They are applied to typical discretization schemes such as combinations of spectral Galerkin, finite element, Euler-Maruyama, Milstein, Crank-Nicolson, and forward and backward Euler methods. Furthermore, results on the relation to the stability properties of the corresponding analytical solutions are provided. Simulations of the stochastic heat equation illustrate the theory.
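The finite-dimensional intuition behind this analysis can be illustrated on the scalar test equation dX = λX dt + σX dW. The following sketch is not taken from the paper; it shows the well-known mean-square stability criterion for the Euler-Maruyama scheme with step size h, namely (1 + hλ)² + hσ² < 1, together with a Monte Carlo check of the second moment.

```python
import random

def euler_maruyama_second_moment(lam, sigma, h, n_steps, n_paths, x0=1.0, seed=0):
    """Estimate E[X_n^2] for the Euler-Maruyama scheme applied to
    the scalar test equation dX = lam*X dt + sigma*X dW."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x = x0
        for _ in range(n_steps):
            dw = rng.gauss(0.0, h ** 0.5)  # Brownian increment ~ N(0, h)
            x = x + lam * x * h + sigma * x * dw
        total += x * x
    return total / n_paths

def ms_stable(lam, sigma, h):
    """Exact one-step growth factor of the second moment is
    (1 + h*lam)^2 + h*sigma^2, so the scheme is mean-square
    stable iff this factor is below 1."""
    return (1.0 + h * lam) ** 2 + h * sigma ** 2 < 1.0
```

For lam = -2.0 and sigma = 0.5, the scheme is mean-square stable at h = 0.1 but not at h = 1.0, and the simulated second moment decays accordingly in the stable case.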

    Linked Data Entity Summarization

    On the Web, the amount of structured and Linked Data about entities is constantly growing. Descriptions of single entities often include thousands of statements, and it becomes difficult to comprehend the data unless a selection of the most relevant facts is provided. This doctoral thesis addresses the problem of Linked Data entity summarization. The contributions comprise two entity summarization approaches, a common API for entity summarization, and an approach for entity data fusion.

    PageRank on Wikipedia: Towards General Importance Scores for Entities

    Link analysis methods are used to estimate importance in graph-structured data. In that realm, the PageRank algorithm has been used to analyze directed graphs, in particular the link structure of the Web. Recent developments in information retrieval focus on entities and their relations (i.e., knowledge graph panels). Many entities are documented in the popular knowledge base Wikipedia. The cross-references within Wikipedia exhibit a directed graph structure that is suitable for computing PageRank scores as importance indicators for entities. In this work, we present different PageRank-based analyses on the link graph of Wikipedia together with corresponding experiments. We focus on the question whether some links, based on their position in the article text, can be deemed more important than others. In our variants, we change the probabilistic impact of links in accordance with their position on the page and measure the effects on the output of the PageRank algorithm. We compare the resulting rankings, as well as those of existing systems, with pageview-based rankings and provide statistics on the pairwise computed Spearman and Kendall rank correlations.
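The idea of varying the probabilistic impact of links can be sketched as a weighted PageRank power iteration. This is a minimal illustration, not the authors' implementation: each out-link carries a weight (e.g., higher for links appearing earlier in the article text), and weights are normalized per source node into transition probabilities.

```python
def pagerank(weighted_links, damping=0.85, iters=50):
    """Power iteration for PageRank on a directed graph.
    weighted_links maps a node to a list of (target, weight) pairs;
    weights are normalized per source node so they form a
    probability distribution over that node's out-links."""
    nodes = set(weighted_links)
    for targets in weighted_links.values():
        nodes.update(t for t, _ in targets)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        # teleportation term (1 - d) / n for every node
        nxt = {v: (1.0 - damping) / n for v in nodes}
        for u, targets in weighted_links.items():
            total = sum(w for _, w in targets)
            if total == 0:
                continue
            for v, w in targets:
                nxt[v] += damping * rank[u] * (w / total)
        # dangling nodes spread their rank uniformly
        dangling = sum(rank[v] for v in nodes if not weighted_links.get(v))
        for v in nodes:
            nxt[v] += damping * dangling / n
        rank = nxt
    return rank
```

With a toy graph where A links to B with three times the weight of its link to C, B ends up ranked above C, while the scores still sum to one.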

    Browsing DBpedia Entities with Summaries

    The term "Linked Data" describes online-retrievable formal descriptions of entities and their links to each other. Machines and humans alike can retrieve these descriptions and discover information about links to other entities. However, for human users it becomes difficult to browse descriptions of single entities because, in many cases, they are referenced in more than a thousand statements. In this demo paper we present summarum, a system that ranks triples and enables entity summaries for improved navigation within Linked Data. In its current implementation, the system focuses on DBpedia, with the summaries being based on the PageRank scores of the involved entities.
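The ranking strategy described here reduces to a simple sort. The following sketch is an illustration under the assumption of precomputed PageRank scores, not the summarum code: triples about an entity are ordered by the score of the linked object, and the top k form the summary.

```python
def summarize(entity_triples, scores, k=5):
    """Rank (subject, predicate, object) triples describing an entity
    by the PageRank score of the linked object entity and keep the
    top k as the entity's summary. Objects without a known score
    default to 0.0 and sink to the bottom."""
    ranked = sorted(entity_triples,
                    key=lambda t: scores.get(t[2], 0.0),
                    reverse=True)
    return ranked[:k]
```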

    GaN heterostructures as innovative x-ray imaging sensors — change of paradigm

    Direct conversion of X-ray irradiation using a semiconductor material is an emerging technology in the medical and material sciences. Existing technologies face problems such as limited sensitivity or resilience. Here, we describe a novel class of X-ray sensors based on GaN thin films and GaN/AlGaN high-electron-mobility transistors (HEMTs), a promising enabling technology in the modern world of GaN devices for high-power, high-temperature, high-frequency, optoelectronic, and military/space applications. The GaN/AlGaN HEMT-based X-ray sensors offer superior performance, as evidenced by their higher sensitivity due to the intensification of electrons in the two-dimensional electron gas (2DEG) by ionizing radiation. This increase in detector sensitivity, by a factor of 10⁴ compared to GaN thin films, offers the opportunity to reduce the health risks associated with the steady increase in CT scans in today's medicine, and the associated exposure to harmful ionizing radiation, by introducing GaN/AlGaN sensors into X-ray imaging devices for the benefit of the patient.

    Results of the 2016 ENtity Summarization Evaluation Campaign (ENSEC 2016)

    Entities and their descriptions are becoming an important part of the datasets and knowledge graphs available on the Web. These descriptions can be used in concise representations (i.e., summaries) to help users understand Web content (e.g., summaries generated from the Google Knowledge Graph in Google Search). In the recent past, several systems have emerged to tackle the problem of automatic summary generation for entity descriptions. Even though these systems continuously push the boundaries, the problem is not yet completely solved. Therefore, there is a need to support and encourage researchers in the community to participate in solving this important problem. ENSEC, the entity summarization evaluation campaign, is a first step towards realizing that goal, and we present the results of the systems that participated in the campaign.

    Circulating cell-free methylated DNA and lactate dehydrogenase release in colorectal cancer

    Background: Hypermethylation of DNA is an epigenetic alteration commonly found in colorectal cancer (CRC) and can also be detected in blood samples of cancer patients. Methylation of the genes helicase-like transcription factor (HLTF) and hyperplastic polyposis 1 (HPP1) has been proposed as a prognostic biomarker, and methylation of neurogenin 1 (NEUROG1) as a diagnostic biomarker. However, the underlying mechanisms leading to the release of these genes are unclear. This study aimed at examining whether the presence of the methylated genes NEUROG1, HLTF, and HPP1 in serum correlates with tissue breakdown as a possible mechanism, using serum lactate dehydrogenase (LDH) as a surrogate marker. Additionally, the prognostic impact of these markers was examined. Methods: Pretherapeutic serum samples from 259 patients across all cancer stages were analyzed. The presence of hypermethylation of the genes HLTF, HPP1, and NEUROG1 was examined using methylation-specific quantitative PCR (MethyLight). LDH was determined using a UV kinetic test. Results: Hypermethylation of HLTF and HPP1 was detected significantly more often in patients with elevated LDH levels (32% vs. 12% [p = 0.0005] and 68% vs. 11% [p < 0.0001], respectively). Higher LDH values also correlated linearly with a higher percentage of a fully methylated reference (Spearman correlation coefficient 0.18 for HLTF [p = 0.004]; 0.49 for HPP1 [p < 0.0001]). No correlation between methylation of NEUROG1 and LDH was found in this study. Concerning the clinical characteristics, high levels of LDH as well as methylation of HLTF and HPP1 were significantly associated with larger and more advanced stages of CRC. Accordingly, these three markers were correlated with significantly shorter survival in the overall population. Moreover, all three markers identified patients with a worse prognosis in the subgroup of stage IV patients.
Conclusions: We provide evidence that methylation of HLTF, and especially of HPP1, detected in serum is strongly correlated with cell death in CRC, using LDH as a surrogate marker. Additionally, we found that both HLTF and HPP1, as well as LDH, carry prognostic information. In sum, determining the methylation of HLTF and HPP1 in serum might be useful for identifying patients with more aggressive tumors.
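The Spearman correlation coefficients reported above measure monotone association between ranked values. As a generic illustration of the statistic itself (not the study's data or analysis pipeline), it can be computed from scratch as the Pearson correlation of the ranks, with ties receiving average ranks:

```python
def ranks(values):
    """Assign 1-based ranks; tied values share their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i..j, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A perfectly monotone increasing relationship yields 1.0 and a monotone decreasing one yields -1.0, regardless of how nonlinear the relationship is.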

    Improving Image Quality of Sparse-view Lung Cancer CT Images with a Convolutional Neural Network

    Purpose: To improve the image quality of sparse-view computed tomography (CT) images with a U-Net for lung cancer detection and to determine the best trade-off between the number of views, image quality, and diagnostic confidence. Methods: CT images from 41 subjects (34 with lung cancer, seven healthy) were retrospectively selected (01.2016-12.2018) and forward projected onto 2048-view sinograms. Six corresponding sparse-view CT data subsets at varying levels of undersampling were reconstructed from the sinograms using filtered backprojection with 16, 32, 64, 128, 256, and 512 views, respectively. A dual-frame U-Net was trained and evaluated for each subsampling level on 8,658 images from 22 diseased subjects. A representative image per scan was selected from 19 subjects (12 diseased, seven healthy) for a single-blinded reader study. The selected slices, at all levels of subsampling, with and without post-processing by the U-Net model, were presented to three readers. Image quality and diagnostic confidence were rated using pre-defined scales. Subjective nodule segmentation was evaluated using sensitivity (Se) and the Dice Similarity Coefficient (DSC) with 95% confidence intervals (CI). Results: The 64-projection sparse-view images yielded Se = 0.89 and DSC = 0.81 [0.75, 0.86], while their counterparts, post-processed with the U-Net, had improved metrics (Se = 0.94, DSC = 0.85 [0.82, 0.87]). Fewer views led to insufficient quality for diagnostic purposes. For higher numbers of views, no substantial discrepancies were noted between the sparse-view and post-processed images. Conclusion: Projection views can be reduced from 2048 to 64 while maintaining image quality and the diagnostic confidence of the radiologists at a satisfactory level.
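The two segmentation metrics used above have standard definitions: Se = TP / (TP + FN) and DSC = 2·TP / (2·TP + FP + FN). A minimal sketch for binary masks (a generic illustration, not the study's evaluation code):

```python
def dice_and_sensitivity(pred, truth):
    """Dice Similarity Coefficient and sensitivity for two binary
    segmentation masks given as flat sequences of 0/1 labels."""
    tp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    fp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 1)
    # Conventionally both metrics are 1.0 when both masks are empty.
    dsc = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 1.0
    se = tp / (tp + fn) if (tp + fn) else 1.0
    return dsc, se
```

For example, a prediction overlapping the ground truth in one of two foreground pixels, with one false positive, gives DSC = 0.5 and Se = 0.5.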