
    Optimization of multi-domain queries on the Web

    Where can I attend an interesting database workshop close to a sunny beach? Who are the strongest experts on service computing based upon their recent publication record and accepted European projects? Can I spend an April weekend in a city served by a low-cost direct flight from Milano offering a Mahler symphony? We regard the above queries as multi-domain queries, i.e., queries that can be answered by combining knowledge from two or more domains (such as: seaside locations, flights, publications, accepted projects, conference offerings, and so on). This information is available on the Web, but no general-purpose software system can accept the above queries or compute the answer. At most, dedicated systems support specific multi-domain compositions (e.g., Google-local locates information such as restaurants and hotels upon geographic maps). This paper presents an overall framework for multi-domain queries on the Web. We address the following problems: (a) expressing multi-domain queries with an abstract formalism; (b) separating the treatment of "search" services within the model, by highlighting their differences from "exact" Web services; (c) explaining how the same query can be mapped to multiple "query plans", i.e., well-defined schedulings of service invocations, possibly in parallel, which comply with their access limitations and preserve the ranking order in which search services return results; (d) introducing cross-domain joins as a first-class operation within plans; (e) evaluating the query plans against several cost metrics so as to choose the most promising one for execution. This framework adapts to a variety of application contexts, ranging from end-user-oriented mash-up scenarios up to complex application integration scenarios.
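
    To make the notion of a query plan concrete, here is a minimal Python sketch (our illustration, not the paper's formalism; all service names, schemas, and data are invented). A ranked "search" service feeds an "exact" service whose access limitation requires the city to be bound, realizing a pipelined cross-domain join that preserves the search ranking:

```python
# Hypothetical sketch of a pipelined query plan with a cross-domain join.
# Service names and data are invented for illustration only.

def cheap_flights(origin):
    """'Search' service: destinations ranked by price (a ranked result list)."""
    return [("Wien", 39), ("Paris", 59), ("London", 89)]  # (city, EUR)

def concerts(city):
    """'Exact' service with an access limitation: `city` must be bound."""
    data = {"Wien": ["2024-04-20"], "Paris": []}
    return data.get(city, [])

def plan(origin):
    # Invoke the search service first (its order fixes the ranking), then feed
    # each destination into the exact service: a join on `city` across domains.
    for city, price in cheap_flights(origin):
        for date in concerts(city):
            yield {"city": city, "price": price, "date": date}

print(list(plan("Milano")))  # the ranking of the flight service is preserved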

    Keyword search in the Deep Web

    The Deep Web is constituted by data that are accessible through Web pages but not readily indexable by search engines, as they are returned in dynamic pages. In this paper we propose a framework for accessing Deep Web sources, represented as relational tables with so-called access limitations, with keyword-based queries. We formalize the notion of optimal answer and investigate methods for query processing. To our knowledge, this problem has never been studied in a systematic way.
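
    As a rough illustration of access limitations (our sketch; relation names and data are invented), each source below can only be queried by binding its input attribute, as through a Web form, and a keyword query is answered by chaining accesses so that constants extracted from one source become bindings for the next:

```python
# Hypothetical sketch: Deep Web sources as relations reachable only through
# bindings of their input attribute; schema and data are invented.

AUTHORS = {"Melville": [("Melville", "Moby Dick")]}      # input attribute: name
BOOKS = {"Moby Dick": [("Moby Dick", 1851, "novel")]}    # input attribute: title

def access(source, binding):
    """Simulate a form-based access: tuples are visible only via a binding."""
    return source.get(binding, [])

# Answer the keyword query {"Melville", "novel"}: values extracted from one
# source (the book title) bind the next source, connecting the two keywords.
for _, title in access(AUTHORS, "Melville"):
    for tup in access(BOOKS, title):
        if "novel" in tup:
            print(("Melville",) + tup)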

    Time-resolved single-photon detection module based on silicon photomultiplier: A novel building block for time-correlated measurement systems

    We present the design and preliminary characterization of the first detection module based on a Silicon Photomultiplier (SiPM) tailored for single-photon timing applications. The aim of this work is to demonstrate, thanks to the design of a suitable module, the possibility to easily exploit SiPMs in many applications as an interesting detector featuring a large active area, similarly to photomultiplier tubes, but keeping the advantages of solid-state detectors (high quantum efficiency, low cost, compactness, robustness, low bias voltage, and insensitivity to magnetic fields). The module integrates a cooled SiPM with a total photosensitive area of 1 mm² together with the suitable avalanche-signal read-out circuit, the signal conditioning, the biasing electronics, and a Peltier cooler driver for thermal stabilization. It is able to extract the single-photon timing information with a resolution better than 100 ps full-width at half maximum. We verified the effective stabilization in response to external thermal perturbations, thus proving the complete insensitivity of the module to environmental temperature variations, which is fundamental to profitably use the instrument in real-field applications. We also characterized the single-photon timing resolution, the background noise due to both primary dark count generation and afterpulsing, the single-photon detection efficiency, and the instrument response function shape. The proposed module can become a reliable and cost-effective building block for time-correlated single-photon counting instruments in applications requiring high collection capability of isotropic light and detection efficiency (e.g., fluorescence decay measurements or time-domain diffuse optics systems).
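
    Timing resolutions like the 100 ps quoted above are conventionally read off the instrument response function as its full width at half maximum (FWHM); the following sketch shows that computation on synthetic data (a Gaussian pulse with invented parameters, not measurements from the module):

```python
# Sketch: estimate the FWHM of an instrument response function from a
# TCSPC-style histogram. Data are synthetic (Gaussian), not measured.
import numpy as np

bin_ps = 4.0                                     # assumed 4 ps time bins
t = np.arange(0.0, 2000.0, bin_ps)               # time axis in ps
irf = np.exp(-0.5 * ((t - 1000.0) / 42.5) ** 2)  # sigma = 42.5 ps

half = irf.max() / 2
above = np.where(irf >= half)[0]                 # bins at or above half maximum
fwhm = (above[-1] - above[0]) * bin_ps           # 2*sqrt(2*ln 2)*sigma ~ 100 ps
print(f"FWHM ~ {fwhm:.0f} ps")                   # ~100 ps, up to binning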

    Spectrally Resolved Single-Photon Timing of Silicon Photomultipliers for Time-Domain Diffuse Spectroscopy

    We characterized the single-photon timing response function of various silicon photomultipliers (SiPMs) over a broad (500-1100 nm) spectral range. We selected two SiPM manufacturers, and we investigated two active areas, i.e., a small (1-1.69 mm²) and a large (9 mm²) one, for each of them. We demonstrate that the selected SiPMs are suitable for time-resolved diffuse optics (DO) applications, where a very large detection area and sensitivity down to single photons are crucial to detecting the very faint return signal from biological tissues, like the brain, thus allowing the replacement of photomultiplier tubes and opening the way to a novel generation of DO multichannel instrumentation. Thanks to our custom front-end electronics, we show the best single-photon timing resolution reported for SiPMs, namely, 57 ps full-width at half maximum for the Hamamatsu 1.69 mm² device and 115 ps for the Excelitas 9 mm² device. Furthermore, we provide a thorough spectral investigation of the full single-photon timing response function, also detailing the time constants of the diffusion tails and the dynamic range. The achieved insight and the reported performance open the way to a widespread diffusion of SiPMs not only in many-photon regimes (e.g., PET) but also in single-photon counting regimes like DO.

    Large area silicon photomultipliers allow extreme depth penetration in time-domain diffuse optics

    We present the design of a novel single-photon timing module based on a Silicon Photomultiplier (SiPM) featuring a collection area of 9 mm². The module achieves a single-photon timing resolution of about 140 ps, thus being suitable for diffuse optics applications. The small size of the instrument (5 cm × 4 cm × 10 cm) allows placing it directly in contact with the sample under investigation, thereby maximizing signal harvesting. Thanks to that, it is possible to increase the source-detector distance up to 6 cm or more, therefore enhancing the penetration depth up to an impressive value of 4 cm and paving the way to the exploration of the deepest human body structures in a completely non-invasive approach.

    Inconsistency-Tolerant Integrity Checking

    All methods for efficient integrity checking require all integrity constraints to be totally satisfied before any update is executed. However, a certain amount of inconsistency is the rule, rather than the exception, in databases. In this paper, we close the gap between theory and practice of integrity checking, i.e., between the unrealistic theoretical requirement of total integrity and the practical need for inconsistency tolerance, which we define for integrity checking methods. We show that most of them can still be used to check whether updates preserve integrity, even if the current state is inconsistent. Inconsistency-tolerant integrity checking proves beneficial both for integrity preservation and query answering. Also, we show that it is useful for view updating, repairs, schema evolution, and other applications.

    Hendrik Decker has been supported by FEDER and the Spanish MEC grant TIN2006-14738-C02-01. Davide Martinenghi has been supported by the Search Computing (SeCo) project, funded by ERC under the 2008 Call for "IDEAS Advanced Grants." The authors also wish to thank Davide Barbieri for his valuable contribution to the experimental evaluation.

    Decker, H.; Martinenghi, D. (2011). Inconsistency-Tolerant Integrity Checking. IEEE Transactions on Knowledge and Data Engineering, 23(2), 218-234. https://doi.org/10.1109/TKDE.2010.87
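
    The core idea (accepting an update as long as it introduces no new violation, rather than requiring a totally consistent state first) can be sketched as follows; this is our simplified illustration, not the paper's formal method, and the constraint and data are invented:

```python
# Sketch: inconsistency-tolerant checking of "no two employees share an id".
# Only the increment caused by the update is checked; legacy violations
# already present in the state are tolerated.

db = [("e1", "Ann"), ("e1", "Bob")]   # legacy inconsistency: duplicate id e1

def adds_violation(db, row):
    """Evaluate only the constraint cases relevant to the inserted row."""
    return any(rid == row[0] for rid, _ in db)

def tolerant_insert(db, row):
    if adds_violation(db, row):
        raise ValueError(f"rejected: {row} would introduce a new violation")
    db.append(row)

tolerant_insert(db, ("e2", "Cal"))    # accepted despite the pre-existing e1 clash
# tolerant_insert(db, ("e2", "Dan"))  # would be rejected: new duplicate id e2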

    Injecting Conceptual Constraints into Data Fabrics

    Unlike traditional sources managed by DBMSs, data lakes do not provide any guarantee about the quality of the data they store, which can severely limit their use for analysis purposes. The recent notion of data fabric, which introduces a semantic layer allowing uniform access to underlying data sources, makes it possible to tackle this problem by specifying conceptual constraints to which data sources must adhere to be considered meaningful. Along these lines, in this discussion paper, we exploit the data fabric approach by proposing a general methodology for data curation in data fabrics based on: (i) the specification of integrity constraints over a conceptual representation of the data lake and (ii) the automatic translation and enforcement of such constraints over the actual data. We discuss the advantages of this idea and the challenges behind its implementation.
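
    To give a feel for the two steps of the methodology, the hypothetical sketch below pairs (i) a conceptual referential constraint with (ii) its translation into a concrete check over raw files in the lake; all file names, paths, and the schema are invented:

```python
# Sketch: enforcing the conceptual constraint "every Order refers to an
# existing Customer" over raw CSV files. Paths and columns are hypothetical.
import csv

def column_values(path, column):
    with open(path, newline="") as f:
        return {row[column] for row in csv.DictReader(f)}

# (i) conceptual level: Order.customer_id must reference Customer.id
# (ii) translated enforcement: a set-inclusion check over the actual data
customers = column_values("lake/customers.csv", "id")
orders = column_values("lake/orders.csv", "customer_id")

dangling = orders - customers
if dangling:
    print(f"violation: orders reference unknown customers {sorted(dangling)}")
else:
    print("source adheres to the conceptual constraint")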

    Towards a Standard for Triggers in Property Graphs

    Graph databases are emerging as the leading data management technology for storing large knowledge graphs; significant efforts are ongoing to produce new standards (such as the Graph Query Language, GQL) and enrich them with properties, types, schemas, and keys. In this article, we present PG-Triggers, a proposal for adding triggers to Property Graphs, along the direction marked by the SQL3 Standard. We define the syntax and semantics of PG-Triggers and briefly discuss how they can be implemented on top of the most popular graph databases. The main objective of this article is to discuss an active database standard for graph databases as a first-class citizen at a time when reactive graph management is in its infancy, so as to minimize the conversion efforts towards a full-fledged standard proposal.
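
    To give a rough feel for reactive behavior on a property graph, here is a minimal Python sketch of the general event-condition-action trigger concept; it is our illustration only and does not reflect the actual PG-Triggers syntax proposed in the article:

```python
# Minimal event-condition-action triggers over a toy in-memory property graph.
# Illustrates the general concept only, not the PG-Triggers proposal.

triggers = []          # registered (event, condition, action) triples
graph = {"nodes": []}  # a toy property graph: labeled nodes with properties

def on(event, condition, action):
    triggers.append((event, condition, action))

def create_node(labels, props):
    node = {"labels": labels, "props": props}
    graph["nodes"].append(node)
    for event, cond, act in triggers:      # fire after-create triggers
        if event == "create" and cond(node):
            act(node)
    return node

# Trigger: after creating a Person node, derive an `initials` property.
on("create",
   lambda n: "Person" in n["labels"],
   lambda n: n["props"].setdefault(
       "initials", "".join(w[0] for w in n["props"]["name"].split())))

p = create_node(["Person"], {"name": "Ada Lovelace"})
print(p["props"])  # {'name': 'Ada Lovelace', 'initials': 'AL'}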