365 research outputs found

    A Bandwidth-Conserving Architecture for Crawling Virtual Worlds

    A virtual world is a computer-based simulated environment that its users inhabit via avatars. Content in virtual worlds such as Second Life or OpenSimulator is increasingly presented using three-dimensional (3D) dynamic presentation technologies that challenge traditional search technologies. As 3D environments become both more prevalent and more fragmented, the need for a data crawler and distributed search service will continue to grow. Increasing the visibility of content across virtual world servers, in order to better collect and integrate the 3D data, also improves crawling and searching efficiency and accuracy: the crawler can avoid revisiting unchanged regions or downloading unmodified objects that already exist in the collection. This saves bandwidth and Internet traffic during content collection and indexing and, for a fixed bandwidth budget, maximizes the freshness of the collection. This work presents a new services paradigm for virtual world crawler interaction that is cooperative and exploits information about 3D objects in the virtual world. Our approach analyzes redundant information crawled from virtual worlds in order to decrease the amount of data collected by crawlers, keep search engine collections up to date, and provide an efficient mechanism for collecting and searching information from multiple virtual worlds. Experimental results with data crawled from Second Life servers demonstrate that our approach reduces crawling bandwidth consumption and uncovers more hidden objects and new regions to crawl, which facilitates the search service in virtual worlds.
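    The skip-unmodified-objects idea above can be sketched as a content-hash check before each download. This is a minimal illustration, not the paper's actual architecture: the `DedupCrawler` class, its hash-set bookkeeping, and the use of SHA-256 over serialized object bytes are all assumptions made for the sketch.

    ```python
    import hashlib

    def content_key(obj_bytes: bytes) -> str:
        """Hash an object's serialized content so unchanged copies can be recognized."""
        return hashlib.sha256(obj_bytes).hexdigest()

    class DedupCrawler:
        """Skips fetching objects whose content hash is already in the collection."""

        def __init__(self):
            self.seen = set()      # hashes of objects already collected
            self.downloaded = 0    # counts actual fetches, i.e. bandwidth spent

        def crawl_region(self, region_objects):
            """Return only the objects not seen before; duplicates cost no fetch."""
            fresh = []
            for obj in region_objects:
                key = content_key(obj)
                if key in self.seen:
                    continue       # unmodified object: no re-download
                self.seen.add(key)
                self.downloaded += 1
                fresh.append(obj)
            return fresh
    ```

    On a second pass over an unchanged region, every object hashes to a known key, so the crawler spends no further bandwidth on it; only new or modified objects are fetched.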

    Bioinformatics tools and database resources for systems genetics analysis in mice—a short review and an evaluation of future needs

    During a meeting of the SYSGENET working group ‘Bioinformatics’, currently available software tools and databases for systems genetics in mice were reviewed and the needs for future developments were discussed. The group evaluated interoperability and performed initial feasibility studies. To aid future compatibility of software and the exchange of already developed software modules, the group strongly recommended integrating the HAPPY and R/qtl analysis toolboxes, the GeneNetwork and XGAP database platforms, and the TIQS and xQTL processing platforms. R should be used as the principal computer language for QTL data analysis in all platforms, and a ‘cloud’ should be used for software dissemination to the community. Furthermore, the working group recommended that all data models and software source code be made visible in public repositories to allow a coordinated effort on the use of common data structures and file formats.

    The Secret Is Out: Patent Law Preempts Mass Market License Terms Barring Reverse Engineering for Interoperability Purposes

    As patent protection has emerged to protect software, courts and commentators have mistakenly focused on copyright law and overlooked the centrality of patent preemption in limiting contract law where a mass market license that prohibits reverse engineering (RE) for purposes of developing interoperable products leads to patent-like protection. Review of copyright fair use cases on RE, together with Congress’s policy favoring RE for interoperability purposes in the Digital Millennium Copyright Act, reinforces the case for patent preemption. Also, the fundamental freedom to RE embodied in state trade secret law, coupled with federal patent and copyright law and policies, should cumulatively override a contractual barrier on RE based upon the public policy exception to contract enforcement. If courts fail to consider patent and public policy limits on contract, the anomalous result is the potential outsourcing of interoperability development to one of the increasing number of foreign jurisdictions where interoperability policy overrides contract law. Ironically, that would harm the U.S. economy and thereby frustrate the purpose of the Intellectual Property Clause of the Constitution. Finally, the patent preemption/public policy invalidation approach to mass market contracts outlined in this article may also provide a new lens whenever a mass market contract results in a de facto monopoly on useful data.

    Government-to-Government E-Government: A Case Study of a Federal Financial Program

    The problem with the study of the concept of electronic government (e-Gov) is that scholars in the field have not adequately explored various dimensions of the concept. Literature on e-Gov is replete with works on the government-to-consumer form of e-Gov; much less work has been done on government-to-government (G2G) e-Gov. This qualitative case study was predicated on the concepts of intergovernmental relations and intergovernmental management, and it sought to fill the gap in the literature by providing a clear understanding of G2G e-Gov through the exploration of a federal program in the United States. The central research question asked how G2G e-Gov enhanced accountability, efficiency, and public service value. Data were collected using face-to-face and email interviews, documents, and archival data, and were analyzed with a modified content analysis technique. Findings from the study indicated that improvements in communication, process, technology, and legislative proposals are linked to programmatic success in G2G e-Gov. The study has implications for social change, as knowledge of G2G e-Gov is useful to governments because of its emphasis on accountability, efficiency, collaboration, and information sharing. It also has the potential to help public policy officials and academics better understand the importance of G2G e-Gov for public service delivery, and to help developing countries in their e-Gov implementations.

    IO-Lite: a unified I/O buffering and caching system

    This article presents the design, implementation, and evaluation of IO-Lite, a unified I/O buffering and caching system for general-purpose operating systems. IO-Lite unifies all buffering and caching in the system, to the extent permitted by the hardware. In particular, it allows applications, the interprocess communication system, the file system, the file cache, and the network subsystem to safely and concurrently share a single physical copy of the data. Protection and security are maintained through a combination of access control and read-only sharing. IO-Lite eliminates all copying and multiple buffering of I/O data, and enables various cross-subsystem optimizations. Experiments with a Web server show performance improvements between 40 and 80% on real workloads as a result of IO-Lite.
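    The single-physical-copy, read-only sharing idea can be illustrated in miniature with Python's `memoryview`. This is only a hedged analogy: IO-Lite itself operates inside the operating system across subsystems, while the `make_views` helper below is an invented, user-level illustration of handing out read-only views over one shared buffer without duplicating any bytes.

    ```python
    def make_views(buffer: bytearray, chunk: int):
        """Hand out read-only slices over one underlying buffer, IO-Lite style:
        each consumer gets its own view, but no bytes are copied."""
        ro = memoryview(buffer).toreadonly()  # read-only sharing preserves safety
        return [ro[i:i + chunk] for i in range(0, len(buffer), chunk)]
    ```

    Because all views share the same physical storage, a change to the underlying buffer is visible through every view, yet no consumer can write through its read-only view; this mirrors, at a small scale, how read-only sharing lets subsystems use one copy safely.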

    Integration of Legacy and Heterogeneous Databases

    • …