
    An Automatic Data Grabber for Large Web Sites


    K-Space at TRECVID 2008

    In this paper we describe K-Space’s participation in the TRECVid 2008 interactive search task. For 2008 the K-Space group performed one of the largest interactive video information retrieval experiments conducted in a laboratory setting: a multi-site, multi-system experiment across three institutions. In total 36 users participated, 12 each from Dublin City University (DCU, Ireland), the University of Glasgow (GU, Scotland) and Centrum Wiskunde & Informatica (CWI, the Netherlands). Three user interfaces were developed: two from DCU, also used in 2007, and one from GU. All interfaces leveraged the same search service. Using a Latin-square arrangement, each user conducted 12 topics, yielding 6 runs per site and 18 runs overall. We officially submitted 3 of these runs to NIST for evaluation, together with an additional expert run using a fourth system. Our submitted runs performed around the median. In this paper we present an overview of the search system used, the experimental setup and a preliminary analysis of our results.
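    The Latin-square arrangement mentioned above can be sketched as follows. This is an illustrative reconstruction of such a balanced design, not the actual K-Space code; the user and topic indices are hypothetical.

    ```python
    # Illustrative sketch of a cyclic Latin-square assignment for a
    # 12-user x 12-topic interactive search experiment (hypothetical
    # reconstruction, not the actual K-Space experimental code).
    def latin_square(n):
        """Row u gives the order in which user u works through the n topics."""
        return [[(u + t) % n for t in range(n)] for u in range(n)]

    square = latin_square(12)

    # Balance check: every topic appears exactly once per user (row)
    # and exactly once per presentation position (column), so topic
    # order effects are counterbalanced across users.
    assert all(sorted(row) == list(range(12)) for row in square)
    assert all(sorted(col) == list(range(12)) for col in zip(*square))
    ```

    A design like this lets each of the 12 users cover all 12 topics while no topic is systematically seen early or late.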

    Silicon resonant microcantilevers for absolute pressure measurement

    This work focuses on the development of silicon resonant microcantilevers for absolute pressure measurement. The microcantilevers were fabricated with a two-mask bulk micromachining process. The variation in their resonance response was investigated as a function of pressure over the range 10^-1 to 10^5 Pa, in terms of both resonance frequency and quality factor. A theoretical description of the resonating microstructure is given for the different molecular and viscous regimes, together with a brief discussion of the different quality-factor contributions. Theoretical and experimental data show very good agreement. The microstructure exhibits measurable sensitivity over a six-decade pressure range, demonstrating its potential as an absolute pressure sensor operating over the same range.
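    The quality-factor contributions mentioned above are conventionally combined as reciprocals (1/Q_total = Σ 1/Q_i), since independent loss channels add. A minimal sketch of this standard rule follows; the numerical values are purely illustrative, not measurements from the paper.

    ```python
    # Standard combination of independent loss contributions in a resonator:
    #   1/Q_total = 1/Q_gas + 1/Q_support + 1/Q_thermoelastic + ...
    # The Q values below are illustrative placeholders, not data from the paper.
    def combined_q(*qs):
        """Total quality factor from independent loss channels."""
        return 1.0 / sum(1.0 / q for q in qs)

    # At low pressure, gas damping is negligible and intrinsic losses dominate.
    q_low_pressure = combined_q(20000.0, 50000.0)

    # At high pressure, a small Q_gas dominates and pulls the total Q down,
    # which is what makes Q a usable pressure readout.
    q_high_pressure = combined_q(200.0, 20000.0, 50000.0)
    ```

    Because the smallest Q_i dominates the sum, the measured quality factor tracks gas damping (and hence pressure) once the pressure is high enough for gas losses to exceed the intrinsic ones.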

    EURECOM at TRECVID 2016. The Adhoc Video Search and Video Hyperlinking Tasks

    Publisher's version (Open Access). TRECVID 2016, 14 November 201

    Design and Maintenance of Data-Intensive Web Sites

    A methodology for designing and maintaining large Web sites is introduced. It is especially useful when the data to be published in the site are managed using a DBMS. The design process is composed of two intertwined activities: database design and hypertext design. Each of these is further divided into a conceptual phase and a logical phase, based on specific data models proposed in our project. The methodology strongly supports site maintenance: the various models provide a concise description of the site structure, allowing designers to reason about the overall organization of pages in the site and, if needed, to restructure it.

    Araneus in the Era of XML

    A large body of research has recently been motivated by the attempt to extend database manipulation techniques to data on the Web. Most of these research efforts -- which range from the definition of Web query languages and the related optimizations, to systems for Web site development and management, and to integration techniques -- started before XML was introduced, and therefore strived for a long time to handle the highly heterogeneous nature of HTML pages. In the meantime, Web data sources have evolved from small, home-made collections of HTML pages into complex platforms for distributed data access and application development, and XML promises to establish itself as a more appropriate format for this new breed of Web sites. XML brings data on the Web closer to databases since, unlike HTML, it is based on a clean separation between the specification of the data, their logical structure (the DTD), and the chosen presentation (the stylesheet). By virtue of this, most of the early research proposals for data management on the Web are now being reconsidered from this new perspective. In this paper, we discuss the impact of XML on the research work conducted in recent years by our group in the framework of the Araneus project. Araneus started as an attempt to investigate the possibility of re-applying traditional database concepts and abstractions, such as those of data model and query language, to data on the Web. In this spirit, we have developed several tools and techniques to handle both structured and semistructured data, in the Web style, as follows: (i) a data model, called ADM, for modeling Web documents and hypertexts; (ii) languages for wrapping and querying Web sites; (iii) tools and techniques for Web site design and implementation.

    To Weave the Web

    The paper discusses the issue of views in the Web context. We introduce a set of languages for managing and restructuring data coming from the World Wide Web. We present a specific data model, called the ARANEUS Data Model, inspired by the structures typically present in Web sites. The model allows us to describe the scheme of a Web hypertext in the spirit of databases. Based on the data model, we develop two languages to support a sophisticated view definition process: the first, called ULIXES, is used to build database views of the Web, which can then be analyzed and integrated using database techniques; the second, called PENELOPE, allows the definition of derived Web hypertexts from relational views. This can be used to generate hypertextual views over the Web.

    Design and Development of Data-Intensive Web Sites: The Araneus Approach

    Data-intensive Web sites are large sites based on a back-end database, with a fairly complex hypertext structure. The paper develops two main contributions: (a) a specific design methodology for data-intensive Web sites, composed of a set of steps and design transformations that lead from a conceptual specification of the domain of interest to the actual implementation of the site; (b) a tool called Homer, conceived to support the site design and implementation process by allowing the designer to move through the various steps of the methodology and to automate the generation of the code needed to implement the actual site. Our approach to site design is based on a clear separation between several design activities, namely database design, hypertext design, and presentation design. All these activities are carried out using high-level models, all subsumed by an extension of the nested relational model; the mappings between the models can be expressed naturally using an extended relational algebra for nested structures. Based on the design artifacts produced during the design process, and on their representation in the algebraic framework, Homer is able to generate all the code needed for the actual generation of the site in a completely automatic way.