
    A new approach to onset detection: towards an empirical grounding of theoretical and speculative ideologies of musical performance

    This article assesses the current state of a project which aims, with the help of computers and computer software, to segment soundfiles of vocal melodies into their component notes: identifying precisely when the onset of each note occurs, and then tracking the pitch trajectory of each note, especially in melodies employing a variety of non-standard temperaments in which musical intervals smaller than 100 cents are ubiquitous. From there, we may proceed to describe many other “micro-features” of each note, but for now our focus is on the onset times and pitch trajectories.
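The abstract does not specify the detection algorithm, so the following is purely an illustrative sketch: a minimal energy-based onset detector of the kind such a project might start from. The frame size, hop size and relative threshold are assumptions, not values from the article.

```python
import numpy as np

def onset_times(signal, sr, frame=512, hop=256):
    """Locate note onsets as the first frame of each sharp rise in
    short-time energy (a crude novelty-function approach)."""
    starts = range(0, len(signal) - frame, hop)
    energy = np.array([np.sum(signal[i:i + frame] ** 2) for i in starts])
    # Positive energy differences form the novelty function.
    flux = np.maximum(np.diff(energy, prepend=energy[0]), 0.0)
    thresh = 0.1 * flux.max()  # relative threshold: an assumption
    return [i * hop / sr for i in range(1, len(flux))
            if flux[i] > thresh and flux[i - 1] <= thresh]

# Synthetic test signal: two 440 Hz tones separated by silence.
sr = 8000
t = np.arange(2000) / sr
note = np.sin(2 * np.pi * 440 * t)
sig = np.concatenate([np.zeros(1000), note, np.zeros(2000),
                      note, np.zeros(1000)])
print(onset_times(sig, sr))
```

On this synthetic signal the detector reports one onset per note; real vocal recordings, and especially pitch-trajectory tracking in sub-100-cent temperaments, would need a spectral novelty function and adaptive thresholding.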

    Gyrus Higher Learning Management System

    Our project was to develop a prototype learning management system for use in higher education for our sponsor, Gyrus Systems. This consisted of creating a MySQL relational database to store user and class information, designing and coding a user interface that emphasized user experience, and implementing functionalities for each user role. Early in the design phase we outlined which features were must-haves in order to demonstrate an adequate prototype, and had this list approved by our sponsor. The functionalities were then divided between two roles. The “student” role can submit assignments, get information from their instructor, and receive and review grades for submitted assignments. The “professor” role can post assignments, receive and grade student submissions, send their students information about the class, and post materials for the students to review. The application is hosted on a shared web server and the work was done through a cPanel portal. The UI is constructed with a custom Bootstrap theme and our functional code is in jQuery; transactions between the front end and the MySQL database are handled in PHP. Our final product is a functional web application that accommodates the must-haves outlined in our original design. The main pages are the class information page, the grade-viewing page, the announcements page, and a dashboard containing quick information, each designed to fulfill the different roles’ unique needs.
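The report names the entities (users with student and professor roles, assignments, submissions, grades) but not the actual MySQL table layout. A minimal sketch of such a schema, using Python's sqlite3 as a stand-in for MySQL, with entirely hypothetical table and column names:

```python
import sqlite3

# Illustrative schema only: every table and column name here is an
# assumption, not the project's real MySQL design.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    role TEXT NOT NULL CHECK (role IN ('student', 'professor'))
);
CREATE TABLE assignments (
    id        INTEGER PRIMARY KEY,
    title     TEXT NOT NULL,
    posted_by INTEGER NOT NULL REFERENCES users(id)
);
CREATE TABLE submissions (
    id            INTEGER PRIMARY KEY,
    assignment_id INTEGER NOT NULL REFERENCES assignments(id),
    student_id    INTEGER NOT NULL REFERENCES users(id),
    grade         REAL  -- NULL until the professor grades it
);
""")

conn.execute("INSERT INTO users VALUES (1, 'Prof. Smith', 'professor')")
conn.execute("INSERT INTO users VALUES (2, 'Alice', 'student')")
conn.execute("INSERT INTO assignments VALUES (1, 'Homework 1', 1)")
conn.execute("INSERT INTO submissions VALUES (1, 1, 2, 95.0)")

# A student reviewing grades joins submissions to assignments.
row = conn.execute("""
    SELECT a.title, s.grade FROM submissions s
    JOIN assignments a ON a.id = s.assignment_id
    WHERE s.student_id = 2
""").fetchone()
print(row)  # ('Homework 1', 95.0)
```

The role check constraint mirrors the two-role split described in the report; in the real system the PHP layer would enforce per-role permissions on top of the schema.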

    Simulation of Cu-Mg metallic glass: Thermodynamics and Structure

    We have obtained effective medium theory (EMT) interatomic potential parameters suitable for studying Cu-Mg metallic glasses. We present thermodynamic and structural results from simulations of such glasses over a range of compositions. We have produced low-temperature configurations by cooling from the melt at as slow a rate as practical, using constant temperature and pressure molecular dynamics. During the cooling process we have carried out thermodynamic analyses based on the temperature dependence of the enthalpy and its derivative, the specific heat, from which the glass transition temperature may be determined. We have also carried out structural analyses using the radial distribution function (RDF) and common neighbor analysis (CNA). Our analysis suggests that the splitting of the second peak, commonly associated with metallic glasses, in fact has little to do with the glass transition itself, but is simply a consequence of the narrowing of peaks associated with structural features present in the liquid state. In fact the splitting temperature for the Cu-Cu RDF is well above T_g. The CNA also highlights a strong similarity between the structure of the intermetallic alloys and the amorphous alloys of similar composition. We have also investigated the diffusivity in the supercooled regime. Its temperature dependence indicates fragile-liquid behavior, typical of binary metallic glasses. On the other hand, the relatively low specific heat jump of around 1.5 k_B per atom indicates apparent strong-liquid behavior, but this can be explained by the width of the transition due to the high cooling rates. Comment: 12 pages (revtex, two-column), 12 figures, submitted to Phys. Rev.
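As an illustration of the structural analysis mentioned above, here is a minimal radial distribution function for particles in a cubic periodic box: an O(N²) sketch in plain NumPy, not the EMT molecular dynamics code the paper actually used.

```python
import numpy as np

def rdf(positions, box, n_bins=100, r_max=None):
    """g(r) for N particles in a cubic periodic box of side `box`.
    A brute-force O(N^2) sketch, not production MD analysis code."""
    n = len(positions)
    r_max = r_max or box / 2
    dists = []
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)  # minimum-image convention
        dists.append(np.sqrt((d ** 2).sum(axis=1)))
    dists = np.concatenate(dists)
    hist, edges = np.histogram(dists, bins=n_bins, range=(0, r_max))
    r = 0.5 * (edges[1:] + edges[:-1])  # bin centres
    shell = 4 * np.pi * r ** 2 * (r_max / n_bins)  # shell volumes
    rho = n / box ** 3
    g = hist / (shell * rho * n / 2)  # normalise pair counts
    return r, g

# Sanity check on an ideal gas: g(r) should be ~1 everywhere.
rng = np.random.default_rng(0)
pos = rng.uniform(0, 10.0, (500, 3))
r, g = rdf(pos, box=10.0)
```

For an ideal gas g(r) is flat at 1; in a Cu-Mg glass the second peak of the partial RDFs would show the splitting discussed in the abstract.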

    Extracorporeal membrane oxygenation with multiple-organ failure: Can molecular adsorbent recirculating system therapy improve survival?

    BACKGROUND: Liver dialysis, molecular adsorbent recirculating system (MARS) particularly, has been used in liver failure to bridge to transplantation. We expanded the indication for MARS to patients with acute shock liver failure and cardiopulmonary failure on extracorporeal membrane oxygenation (ECMO), aiming to improve survival to wean from ECMO. METHODS: Retrospective chart analysis of patients on ECMO between 2010 and 2015 found 28 patients who met the criteria for acute liver failure, diagnosed by hyperbilirubinemia (total bilirubin ≥10 mg/dl) or by elevated transaminase (alanine transaminase >1,000 IU/liter). Of these patients, 14 underwent MARS treatment (Group M), and 14 were supported with optimal medical treatment without MARS (Group C). Patient characteristics, liver function, and survival were compared between groups. RESULTS: Demographics, clinical risk factors, and pre-ECMO laboratory data were identical between the groups. MARS was used continuously for 8 ± 9 days in Group M. Total bilirubin, alanine transaminase, and international normalized ratio improved significantly in Group M. There were no MARS-related complications. Survival to wean from ECMO for Group M was 64% (9/14) vs 21% (3/14) for Group C (p = 0.02). Mortality related to worsening liver dysfunction during ECMO was 40% (2/5 deaths) in Group M and 100% (11/11 deaths) in Group C (p = 0.004). The 30-day survival after ECMO was 43% (6/14) in Group M and 14% (2/14) in Group C (p = 0.09). CONCLUSIONS: MARS therapy in patients on ECMO safely accelerated recovery of liver function and improved survival to wean from ECMO, without increasing complications.

    A microtonal wind controller building on Yamaha’s technology to facilitate the performance of music based on the “19-EDO” scale

    We describe a project in which several collaborators adapted an existing instrument to make it capable of playing expressively in music based on the microtonal scale characterised by equal division of the octave into 19 tones (“19-EDO”). Our objective was not just to build this instrument, however, but also to produce a well-formed piece of music which would exploit it idiomatically, in a performance which would provide listeners with a pleasurable and satisfying musical experience. Hence, consideration of the extent and limits of the playing-techniques of the resulting instrument (a “Wind-Controller”) and of appropriate approaches to the composition of music for it was an integral part of the project from the start. Moreover, the intention was also that the piece, though grounded in the musical characteristics of the 19-EDO scale, would nevertheless have a recognisable relationship with what Dimitri Tymoczko (2010) has called the “Extended Common Practice” of the last millennium. So the article goes on to consider these matters, and to present a score of the resulting new piece, annotated with comments documenting some of the performance issues which it raises. Thus, bringing the project to fruition involved elements of composition, performance, engineering and computing, and the article describes how such an inter-disciplinary, multi-disciplinary and cross-disciplinary collaboration was co-ordinated in a unified manner to achieve the envisaged outcome. Finally, we consider why the building of microtonal instruments is such a problematic issue in a contemporary (“high-tech”) society like ours.
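The arithmetic behind the scale is simple: dividing the octave's 1200 cents into 19 equal steps gives steps of about 63.16 cents, and pitches follow from equal division of the 2:1 frequency ratio. A small sketch (the 440 Hz reference pitch is an assumption, not from the article):

```python
# Each 19-EDO step spans 1200/19 cents, well under the 100-cent
# semitone of standard 12-EDO tuning.

def edo19_freq(step, base=440.0):
    """Frequency of the note `step` 19-EDO steps above `base` Hz."""
    return base * 2 ** (step / 19)

step_cents = 1200 / 19
print(round(step_cents, 2))      # 63.16
print(round(edo19_freq(19), 1))  # 880.0  (19 steps = one octave)
```

A wind controller for this scale therefore needs fingerings (or pitch-bend mappings) for 19 distinct pitch classes per octave rather than 12.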

    On the synthesis and processing of high quality audio signals by parallel computers

    This work concerns the application of new computer architectures to the creation and manipulation of high-quality audio bandwidth signals. The configuration of both the hardware and software in such systems falls under consideration in the three major sections, which present increasing levels of algorithmic concurrency. In the first section, the programs described are distributed in identical copies across an array of processing elements; these programs run autonomously, generating data independently, but with control parameters peculiar to each copy: this type of concurrency is referred to as isonomic. The central section presents a structure which distributes tasks across an arbitrary network of processors; the flow of control in such a program is quasi-indeterminate, and controlled on a demand basis by the rate of completion of the slave tasks and their irregular interaction with the master. Whilst that interaction is, in principle, deterministic, it is also data-dependent; the dynamic nature of task allocation demands that no a priori knowledge of the rate of task completion be required. This type of concurrency is called dianomic. Finally, an architecture is described which will support a very high level of algorithmic concurrency. The programs which make efficient use of such a machine are designed not by considering flow of control, but by considering flow of data. Each atomic algorithmic unit is made as simple as possible, which results in the extensive distribution of a program over very many processing elements. Programs designed by considering only the optimum data exchange routes are said to exhibit systolic concurrency. Often neglected in the study of system design are those provisions necessary for practical implementations.
    It was intended to provide users with useful application programs in fulfilment of this study; the target group is electroacoustic composers, who use digital signal processing techniques in the context of musical composition. Some of the algorithms in use in this field are highly complex, often requiring a quantity of processing for each sample which exceeds that currently available even from very powerful computers. Consequently, applications tend to operate not in 'real-time' (where the output of a system responds to its input apparently instantaneously), but by the manipulation of sounds recorded digitally on a mass storage device. The first two sections adopt existing, public-domain software, and seek to increase its speed of execution significantly by parallel techniques, with the minimum compromise of functionality and ease of use. Those chosen are the general-purpose direct synthesis program CSOUND, from M.I.T., and a stand-alone phase vocoder system from the C.D.P. In each case, the desired aim is achieved: to increase the speed of execution by two orders of magnitude over the systems currently in use by composers. This requires substantial restructuring of the programs, and careful consideration of the best computer architectures on which they are to run concurrently. The third section examines the rationale behind the use of computers in music, and begins with the implementation of a sophisticated electronic musical instrument capable of a degree of expression at least equal to its acoustic counterparts. It seems that the flexible control of such an instrument demands a greater computing resource than the sound synthesis part.
    A machine has been constructed with the intention of enabling the 'gestural capture' of performance information in real time; the structure of this computer, which has one hundred and sixty high-performance microprocessors running in parallel, is expounded, and the systolic programming techniques required to take advantage of such an array are illustrated in the Occam programming language.
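The "isonomic" pattern of the first section (identical program copies running autonomously, each with its own control parameters) can be sketched in modern terms with a process pool. This is a present-day Python analogy, not the transputer/Occam implementation the thesis describes, and the kernel below is a trivial stand-in for real synthesis code.

```python
from multiprocessing import Pool

def synth_voice(params):
    """One autonomous copy of the kernel: every worker runs identical
    code, but with control parameters peculiar to its own copy."""
    freq, n_samples = params
    # Stand-in for real DSP: a trivial ramp scaled by the frequency.
    return [freq * i / n_samples for i in range(n_samples)]

if __name__ == "__main__":
    # Each tuple parameterises one independent copy of the same program.
    voices = [(220.0, 4), (440.0, 4), (660.0, 4)]
    with Pool(3) as pool:
        results = pool.map(synth_voice, voices)
    print(results[1])  # [0.0, 110.0, 220.0, 330.0]
```

The dianomic pattern would instead correspond to a demand-driven task queue, and the systolic pattern to a fixed dataflow graph of very simple processing nodes.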

    NEOimpactor: a tool for assessing Earth's vulnerability to the NEO impact hazard

    The Earth’s surface bears the scars of 4.5 billion years of bombardment by asteroids, despite most having been erased by tectonic activity and erosion. Asteroids predominantly orbit the Sun in the asteroid belt between Mars and Jupiter, but a large number occupy orbits close to the Earth’s. These bodies are termed Near Earth Objects (NEOs) and they present a very real impact threat to the Earth. In 1998 NASA inaugurated the ‘Spaceguard Survey’ to catalogue 90% of NEOs greater than 1 km in diameter. The smaller bodies, meanwhile, remain undetected and are far more numerous.

    In order to understand the NEO hazard, the consequences resulting from an asteroid impact require modelling. While the atmospheric entry of asteroids is a critical part of the impact process, it is the surface impact which is most important, both onto land and into the oceans. It is the impact-generated effects (IGEs) that are hazardous to human populations on the Earth and the infrastructure they occupy. By modelling these IGEs and the consequences they present for humans and infrastructure, an understanding of the global vulnerability to the hazard is developed.

    ‘NEOimpactor’ is the software solution built to investigate the global vulnerability to NEO impacts. By combining existing mathematical models which describe the impact and effects, a unified impact simulator tool has been developed with the capacity to model the real consequences of any terrestrial impact.

    By comparing the consequences of multiple impact events, a complete vulnerability assessment of the global NEO hazard is derived. The result maps are designed for ease of dissemination, to explain the impact risk to a non-specialist audience. The system has identified China, the US, India, Japan and Brazil as facing the greatest overall risk, as well as indicating the various factors influencing vulnerability. The results can be used to inform the international decision-making processes regarding the NEO hazard and potential mitigation strategies.
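The scale of the hazard follows from simple kinematics: an impactor's kinetic energy is E = ½mv², with the mass set by diameter and density. A back-of-envelope sketch of the kind of input an impact-effects simulator starts from; the stony density of 3000 kg/m³ and the 20 km/s velocity are typical assumed values, not figures from the thesis.

```python
import math

def impact_energy_mt(diameter_m, density=3000.0, velocity=20000.0):
    """Kinetic energy of a spherical impactor in megatons of TNT.
    density in kg/m^3, velocity in m/s; both are assumed defaults."""
    radius = diameter_m / 2
    mass = density * (4 / 3) * math.pi * radius ** 3
    energy_j = 0.5 * mass * velocity ** 2
    return energy_j / 4.184e15  # 1 Mt TNT = 4.184e15 J

# A 1 km stony body at 20 km/s, roughly the Spaceguard threshold size:
print(f"{impact_energy_mt(1000):.0f} Mt")  # about 7.5e4 Mt
```

Energies of this order are why the impact-generated effects, rather than the atmospheric entry, dominate the hazard modelling described above.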

    Crystallization of the Wahnström Binary Lennard-Jones Liquid

    We report observation of crystallization of the glass-forming binary Lennard-Jones liquid first used by Wahnström [G. Wahnström, Phys. Rev. A 44, 3752 (1991)]. Molecular dynamics simulations of the metastable liquid on a timescale of microseconds were performed. The liquid crystallized spontaneously, and the crystal structure was identified as MgZn_2. Formation of transient crystallites is observed in the liquid. The crystallization is investigated at different temperatures and compositions. At high temperature the rate of crystallite formation is the limiting factor, while at low temperature the limiting factor is the growth rate. The melting temperature of the crystal is estimated to be T_m = 0.93 at rho = 0.82. The maximum crystallization rate of the A_2B composition occurs at T = 0.60 ± 0.02. Comment: 4 pages, 4 figures; corrected typo
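For reference, the pair interaction underlying such simulations is the Lennard-Jones potential U(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6). A minimal sketch in generic reduced units (eps = sigma = 1, not Wahnström's specific A-A, A-B and B-B parameters):

```python
def lj(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair energy U(r) = 4*eps*((s/r)^12 - (s/r)^6)."""
    sr6 = (sigma / r) ** 6
    return 4 * epsilon * (sr6 ** 2 - sr6)

# The potential crosses zero at r = sigma and has its minimum,
# of depth -epsilon, at r = 2^(1/6) * sigma.
r_min = 2 ** (1 / 6)
print(round(lj(r_min), 6))  # -1.0
```

In a binary mixture like Wahnström's, each species pair gets its own epsilon and sigma, which is what frustrates crystallization and makes the mixture a glass-former on short timescales.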