2,372 research outputs found

    Plug and Play: Interoperability in Concert

    In order to make database systems interoperate with systems beyond traditional application areas, a new paradigm called "exporting database functionality" has been proposed in research and development as a radical departure from traditional thinking. Traditionally, all data is loaded into and owned by the database, whereas under the new paradigm data may reside outside the database in external repositories or archives. Nevertheless, database functionality, such as query processing and indexing, is provided by exploiting interoperability of the DBMS with the external repositories. Obviously, there is an overhead involved in having the DBMS interoperate with external repositories instead of loading all data into the DBMS a priori. In this paper we discuss alternatives for interoperability at different levels of abstraction, and we report on evaluations performed using the Concert prototype system that make these cost factors explicit.
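    The core idea in this abstract can be illustrated with a minimal sketch: the data stays in an external repository that the database layer does not own, while the database layer maintains only an index over it and serves queries, fetching from the repository on demand. All names and data below are hypothetical, not taken from the Concert system.

```python
# Hypothetical sketch of "exporting database functionality": the data
# stays in an external repository, while the database layer keeps only
# an index over it and answers queries against that index.

# External repository: documents the DBMS does not own or load.
repository = {
    "doc1": "concert interoperability evaluation",
    "doc2": "query processing over external archives",
}

# The DBMS builds an inverted index but leaves the data where it is.
index = {}
for doc_id, text in repository.items():
    for word in text.split():
        index.setdefault(word, set()).add(doc_id)

def query(word):
    """Query processing provided by the database layer; documents are
    fetched from the external repository only on demand (this round
    trip is the interoperability overhead the paper measures)."""
    return {doc_id: repository[doc_id] for doc_id in index.get(word, set())}

hits = query("external")
```

    The trade-off the paper evaluates is visible even here: every query result requires a lookup back into the external store, rather than reading data the DBMS has already loaded.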

    Musical accompaniments in the preparation of marimba concerti: a survey of selective interactive music software programs

    The purpose of this study was to investigate the features of three interactive music software programs and their application in preparing marimba concerti. Specifically, the study evaluated Finale, NOTION, and SmartMusic for their viability in preparing Concerto No. 1 in D Minor for Marimba and Orchestra by Noah Taylor. A review of the literature relating to interactive music software programs revealed a lack of studies examining the use of these types of programs in the preparation of marimba concerti. All three software programs were installed on a 15-inch MacBook Pro computer that met the system requirements for all three programs. Documentation indicated that all three interactive music software programs offered viable alternatives to preparing marimba concerti with piano reductions. Finale and NOTION provided comparable instrument sounds in terms of quantity and quality. Finale improved its instrument sound quality and quantity through its integrated Garritan Instruments sound library. NOTION offered improved sound quality and quantity through the purchase of Sound Expansion Kits. Finale's Tempo Tap feature and NOTION's NTempo function provided real-time tempo adjustment, and the Audio Mixer mechanism in both programs allowed the user to isolate instruments. SmartMusic offered comparable instrument realizations through its SoftSynth device. The program, however, did not offer a tempo control feature that was compatible with marimba. Also, SmartMusic's export options and Practice Loop feature allowed the user to effectively isolate instruments. Further research recommendations included empirical studies examining the benefits of interactive music software programs on the preparation of marimba concerti and applying earlier studies performed on the Vivace interactive music software program to current music software programs. Descriptive study recommendations included investigating the applications of interactive music software in the preparation of orchestral percussion excerpts and marimba concerti with wind ensemble, percussion ensemble, or chamber ensemble accompaniments.

    Geospatial Data Modeling to Support Energy Pipeline Integrity Management

    Several hundred thousand miles of energy pipelines span the whole of North America -- responsible for carrying the natural gas and liquid petroleum that power the continent's homes and economies. These pipelines, so crucial to everyday goings-on, are closely monitored by various operating companies to ensure they perform safely and smoothly. Events like earthquakes, erosion, and extreme weather, however -- and human factors like vehicle traffic and construction -- all pose threats to pipeline integrity. As such, there is a tremendous need to measure and indicate useful, actionable data for each region of interest, and operators often use computer-based decision support systems (DSS) to analyze and allocate resources for active and potential hazards. We designed and implemented a geospatial data service, the REST API for Pipeline Integrity Data (RAPID), to improve the amount and quality of data available to DSS. More specifically, RAPID -- built with a spatial database and the Django web framework -- allows third-party software to manage and query an arbitrary number of geographic data sources through one centralized REST API. Here, we focus on the process and peculiarities of creating RAPID's model and query interface for pipeline integrity management; this contribution describes the design, implementation, and validation of that model, which builds on existing geospatial standards.
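    The central design point here -- one query interface fanned out over an arbitrary number of registered geographic data sources -- can be sketched in a few lines. The class names, record layout, and coordinates below are purely illustrative assumptions, not RAPID's actual model or API.

```python
# Hypothetical sketch of RAPID's core idea: a single registry dispatches
# one spatial query to every registered geographic data source.

class DataSource:
    """A registered source holding (lon, lat, attributes) point records."""
    def __init__(self, name, records):
        self.name = name
        self.records = records  # list of (lon, lat, dict) tuples

    def query_bbox(self, min_lon, min_lat, max_lon, max_lat):
        """Return records whose point falls inside the bounding box."""
        return [r for r in self.records
                if min_lon <= r[0] <= max_lon and min_lat <= r[1] <= max_lat]

class Registry:
    """Central interface: third-party software manages and queries an
    arbitrary number of sources through one entry point."""
    def __init__(self):
        self.sources = {}

    def register(self, source):
        self.sources[source.name] = source

    def query(self, bbox):
        """Fan a bounding-box query out to every registered source."""
        return {name: src.query_bbox(*bbox)
                for name, src in self.sources.items()}

# Example: two sources (fabricated hazard data), one query.
erosion = DataSource("erosion", [(-98.5, 29.4, {"risk": "high"})])
traffic = DataSource("traffic", [(-98.6, 29.5, {"volume": 1200}),
                                 (-120.0, 45.0, {"volume": 300})])
registry = Registry()
registry.register(erosion)
registry.register(traffic)
hits = registry.query((-99.0, 29.0, -98.0, 30.0))
```

    In the real system the per-source query would be pushed down to a spatial database index rather than a Python list scan, but the dispatch shape is the same.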

    Enterprise systems and labor productivity: disentangling combination effects

    This study analyzes the relationship between the three main enterprise systems (Enterprise Resource Planning (ERP), Supply Chain Management (SCM), Customer Relationship Management (CRM)) and labor productivity. It reveals the performance gains due to different combinations of these systems. It also tests for complementarity among the enterprise systems with respect to their interacting nature. Using German firm-level data, the results show that the highest productivity gains due to enterprise system usage are realized through use of the three main enterprise systems together. In addition, SCM and CRM function as complements, especially if ERP is also in use. Keywords: labor productivity, enterprise systems, complementarity, Enterprise Resource Planning, Supply Chain Management, Customer Relationship Management.
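    A complementarity test of the kind described can be sketched as a supermodularity check: two systems are complements if adopting both yields more than the sum of adopting each alone. The firm records below are fabricated toy numbers for illustration only, not the paper's German firm-level data, and the check omits the paper's econometric controls.

```python
# Illustrative supermodularity check for complementarity between two
# enterprise systems: f(1,1) + f(0,0) >= f(1,0) + f(0,1), where f is
# mean labor productivity for each adoption combination.

from statistics import mean

# Toy firm records (scm, crm, labor_productivity) -- fabricated data.
firms = [
    (0, 0, 100), (0, 0, 104),
    (1, 0, 110), (0, 1, 108),
    (1, 1, 125), (1, 1, 121),
]

def avg_productivity(scm, crm):
    """Mean productivity over firms with this adoption combination."""
    return mean(p for s, c, p in firms if s == scm and c == crm)

surplus = (avg_productivity(1, 1) + avg_productivity(0, 0)
           - avg_productivity(1, 0) - avg_productivity(0, 1))
complements = surplus >= 0  # non-negative surplus suggests complementarity
```

    The actual study estimates this within a production-function regression with controls; the inequality above is only the shape of the complementarity condition being tested.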

    Vocaodoru - Rhythm Gaming and Artificial Cinematography in Virtual Reality

    Vocaodoru is a virtual reality rhythm game centered around two novel components. The first is a never-before-seen pose-based gameplay system that uses a player's measurements to adapt gameplay to their needs. Tied to the gameplay is a human-in-the-loop utility AI that controls a cinematographic camera, allowing streamers to broadcast a more interesting, dynamic view of the player. We discuss our efforts to develop and connect these components and how we plan to continue development after the conclusion of the MQP.

    The Design and Implementation of Database-Access Middleware for Live Object-Oriented Programming

    We describe middleware and programming environment tools (JPie/qt) that allow programmers to access relational databases in an object-oriented way. Building on top of the JDBC API and leveraging live dynamic class creation and modification in JPie, the JPie/qt middleware presents the user with a simple interactive mechanism for creating object-oriented applications that access databases. Classes are generated mirroring the database schema, and programmers deal directly with these classes. Objects of these classes can be database-bound, so reads and writes to their fields are reflected in the relational database immediately. Database transactions are supported by connecting commit and rollback to Java exception semantics.
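    The "database-bound object" idea -- field reads and writes reflected in the relational database immediately -- can be sketched outside Java. The real middleware generates Java classes over JDBC; this is a hypothetical Python analogue using attribute hooks and sqlite3, with illustrative table and class names.

```python
import sqlite3

# Hypothetical sketch of a database-bound object: attribute access is
# translated into SQL, so every field write is an immediate UPDATE.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO person (id, name) VALUES (1, 'Ada')")

class BoundRow:
    """Mirrors one row of a table; field writes issue an UPDATE at once."""
    def __init__(self, conn, table, row_id):
        # Bypass our own __setattr__ while wiring up internals.
        object.__setattr__(self, "_conn", conn)
        object.__setattr__(self, "_table", table)
        object.__setattr__(self, "_id", row_id)

    def __getattr__(self, field):
        # Field reads come straight from the database.
        cur = self._conn.execute(
            f"SELECT {field} FROM {self._table} WHERE id = ?", (self._id,))
        return cur.fetchone()[0]

    def __setattr__(self, field, value):
        # Write-through: the relational database sees the change at once.
        self._conn.execute(
            f"UPDATE {self._table} SET {field} = ? WHERE id = ?",
            (value, self._id))
        self._conn.commit()

p = BoundRow(conn, "person", 1)
p.name = "Grace"  # immediately updates the underlying row
```

    The JPie/qt middleware additionally generates these mirror classes from the schema automatically and ties commit/rollback to exception handling, which this sketch does not attempt.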

    Visualising biological data: a semantic approach to tool and database integration

    Motivation: In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customised for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research.
    Methods: To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is architecturally sound from a computing point of view and addresses both user and developer requirements. Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks.
    Results: The toolkit, named Utopia, is freely available from http://utopia.cs.man.ac.uk/.
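    The semantic interoperability idea from this abstract -- components that know what they consume and produce, so the toolkit can chain them without manual format glue -- can be sketched as automatic pipeline assembly. The component names and data types below are hypothetical, not Utopia's actual vocabulary.

```python
# Illustrative sketch: each component declares its own functionality
# (what it consumes and produces), and a planner chains components by
# matching produced data types to consumed data types.

components = [
    {"name": "parser",   "consumes": "fasta",     "produces": "sequence"},
    {"name": "aligner",  "consumes": "sequence",  "produces": "alignment"},
    {"name": "renderer", "consumes": "alignment", "produces": "image"},
]

def build_pipeline(start, goal):
    """Chain components from a starting data type to a goal type."""
    chain, current = [], start
    while current != goal:
        # Pick the component that can consume the current data type.
        step = next(c for c in components if c["consumes"] == current)
        chain.append(step["name"])
        current = step["produces"]
    return chain

pipeline = build_pipeline("fasta", "image")
```

    Because each component carries its own description, adding a new tool to the registry is enough to make it available to every pipeline -- the seamless interoperation the abstract describes.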

    Current advances in systems and integrative biology

    Systems biology has gained a tremendous amount of interest in the last few years. This is partly due to the realization that traditional approaches focusing only on a few molecules at a time cannot describe the impact of aberrant or modulated molecular environments across a whole system. Furthermore, a hypothesis-driven study aims to prove or disprove its postulations, whereas a hypothesis-free systems approach can yield an unbiased and novel testable hypothesis as an end-result. This latter approach foregoes assumptions which predict how a biological system should react to an altered microenvironment within a cellular context, across a tissue or impacting on distant organs. Additionally, re-use of existing data by systematic data mining and re-stratification, one of the cornerstones of integrative systems biology, is also gaining attention. While tremendous efforts using a systems methodology have already yielded excellent results, it is apparent that a lack of suitable analytic tools and purpose-built databases poses a major bottleneck in applying a systematic workflow. This review addresses the current approaches used in systems analysis and the obstacles often encountered in large-scale data analysis and integration, which tend to go unnoticed but have a direct impact on the final outcome of a systems approach. Its wide applicability, ranging from basic research and disease descriptors to pharmacological studies and personalized medicine, makes this emerging approach well suited to addressing biological and medical questions where conventional methods are not ideal.