2,760 research outputs found

    Numerical Method for Shock Front Hugoniot States

    Full text link
    We describe a Continuous Hugoniot Method for the efficient simulation of shock wave fronts. This approach achieves significantly improved efficiency when a tightly spaced collection of individual steady-state shock front states is desired, and allows for the study of shocks as a function of a continuous shock strength parameter, v_p. This is, to our knowledge, the first attempt to map the Hugoniot continuously. We apply the method to shock waves in Lennard-Jonesium along the direction. We obtain very good agreement with prior simulations, as well as with our own benchmark comparison runs.
    Comment: 4 pages, 3 figures, from Shock Compression of Condensed Matter 200
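The steady-state shock states being mapped are governed by the Rankine-Hugoniot jump conditions, which can be sketched as follows. This is a minimal illustration, assuming a linear shock-speed fit; the ambient density RHO0, sound speed C0, and slope S are illustrative placeholders, not fitted Lennard-Jones values from the paper.

```python
# Hedged sketch: sweeping the Hugoniot as a function of particle
# velocity u_p via the Rankine-Hugoniot jump conditions.
# RHO0, C0, and S are assumed illustrative values, not fitted parameters.

RHO0 = 1.0   # ambient density (arbitrary units) - assumption
C0 = 1.5     # zero-pressure sound speed - assumption
S = 1.4      # slope of an assumed linear u_s-u_p fit - assumption

def hugoniot_state(u_p):
    """Return (shock speed, pressure, compressed density) for a given
    particle velocity u_p."""
    u_s = C0 + S * u_p              # linear shock-speed relation
    pressure = RHO0 * u_s * u_p     # momentum conservation across the front
    rho = RHO0 * u_s / (u_s - u_p)  # mass conservation across the front
    return u_s, pressure, rho

# Sweep u_p in small steps to trace the Hugoniot as a continuous curve.
states = [hugoniot_state(0.1 * i) for i in range(1, 11)]
```

Stepping u_p on a fine grid is what "mapping the Hugoniot continuously" amounts to here: each value yields one steady-state shock front.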

    La poésie québécoise comme source d’inspiration musicale

    Full text link
    The full version of this thesis is available for individual consultation only at the Music Library of the Université de Montréal (http://www.bib.umontreal.ca/MU).
    This thesis analyses five pieces by Matthew Lane based on various Québécois poems, in order to explain several of the ways that poetry can serve as musical inspiration and material beyond the traditional forms (songs, program symphonies). It includes an analysis of the techniques used to convincingly present instruments and electronics as part of the same musical world. It also discusses the way in which the composer took advantage of Computer Assisted Composition (CAC) techniques.
    The pieces explored are: 1) Le Lézard Vert, based on a poem by Dany Laferrière (for horn and tape); 2) L’Écho bouge beau, based on poems by Nicole Brossard (for chamber ensemble and live electronics); 3) Saisie, based on poems by Pierre Nepveu (for choir, three narrators, and tape); 4) L’Héritage, based on a poem by Marc Vaillancourt (for chamber orchestra and prerecorded sounds); 5) Sans titre à Montréal, based on a poem by Louis Carmel (for string quartet).

    Corrective Action Plan for the New Mexico Landfill

    Get PDF
    Beginning in 1979, the New Mexico Landfill accepted municipal waste from neighboring communities and businesses for 21 years. During that time landfill operators buried as much as 1.5 million cubic yards of refuse along the bottom of a local ephemeral drainage basin. Refuse was deposited in a series of trenches excavated from the center of the basin and mounded up to 15 feet over the preexisting grade. In anticipation of the cessation of operations, the New Mexico Landfill began closure activities in compliance with New Mexico Environment Department regulations. Between 1995 and 2000 a clay liner was constructed over the top of the landfill to prevent surface water infiltration, and a number of piezometers and monitoring wells were installed to sample groundwater. Analyses of water samples from monitoring wells hydraulically upgradient and downgradient of the landfill showed that downgradient groundwater is being impacted by the landfill. Comparison of analyses from upgradient and downgradient monitoring wells during an October 2004 sampling event showed the following maximum contaminant concentration increases in downgradient wells: aluminum 460%, barium 400%, iron 400%, lead 320%, manganese 2,300%, chloride 2,400%, sulfate 1,700%, and total dissolved solids 570%. Total phenols were not detectable in the upgradient monitoring well but exceeded New Mexico Environment Department drinking water standards in a downgradient monitoring well. This document characterizes groundwater contamination associated with the New Mexico Landfill and presents corrective actions to minimize further leachate generation. Corrective actions proposed for the New Mexico Landfill consist of groundwater and surface water diversion from the refuse material, landfill cap improvement, and leachate collection and treatment.
    Water inflow will be diverted from the refuse with a combination of trenching to sever groundwater flow paths and cap improvements to reduce infiltration. Leachate will be collected, treated with a filtration system, and discharged into a constructed wetland. Contaminant concentrations in water discharged from the treatment system will be equal to or below the levels established in the upgradient monitoring well. The total cost of the Corrective Action Plan is estimated at $1.7 million.
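The reported percentage increases reduce to a comparison of paired well concentrations. A minimal sketch, using hypothetical laboratory values rather than the actual October 2004 results:

```python
# Hedged sketch: percent increase of a downgradient concentration over
# its upgradient counterpart. The concentration pairs are hypothetical
# illustrations, not the reported laboratory data.

def percent_increase(upgradient, downgradient):
    """Percent increase of the downgradient value over the upgradient one."""
    return (downgradient - upgradient) / upgradient * 100.0

# A hypothetical manganese pair consistent with a 2,300% increase:
manganese = percent_increase(0.05, 1.2)   # mg/L, assumed values
```

Each reported figure (e.g. aluminum 460%, manganese 2,300%) corresponds to one such upgradient/downgradient pair.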

    Negotiating the frontier between computer-assisted composition and traditional writing : the utility of each and their effective cross-integration

    Full text link
    Funded in part by the FRQSC.
    While computers have had a major influence on music composition, both through electroacoustic music and computer-assisted composition (CAC), there can remain a divide between those pursuing more traditional writing techniques (intuitive composition done by hand) and those incorporating algorithmic elements in their music. The following is an exploration of some of the ways to produce smoother intersections between the worlds of intuitive writing and computer-assisted composition, through the use of a number of different computer-assisted composition software packages and practices. These range from situations where the computer provides little more than a gentle nudge or a pool of information from which to draw, to situations where, through the user’s input, the computer exerts a high degree of control over the final musical information. Works by Matthew Lane will demonstrate the use of some of these technologies, alongside detailed explanations of how they were incorporated. A first section looks at composition and programming techniques for low integration of CAC, reflecting a more intuitive approach. The works Mutations II, “Waves” and “Run” from Short Pieces on Falling, Never a Moment Lost, and (Let Me Hear) What Maria Hears, amongst others, serve to demonstrate the effectiveness of these techniques. A second section focuses on medium integration of CAC, as demonstrated by the author’s modular progression management system.
    This framework, developed in OpenMusic, helps in the generation of progression passages, and is easily adapted and modified for different works. It will be examined primarily through the works Melodious Viscosity and Like a Square Peg. The third and final section looks at high CAC integration through gesture, using the author’s software ScoreScrub. With this software, the user can effectively “scrub” across existing score samples to produce new musical passages. The primary works analysed will be Gift efter Carl Herman Erlandsson and the orchestral work Världen och jag.

    Thinking Inside the Box: Placing Form Over Function in the Application of the Statutory Sentencing Procedure in State of Maine v. Eugene Downs

    Get PDF
    In State v. Hewey, the Maine Supreme Judicial Court found that the sentencing court erred in imposing a sentence that exceeded the maximum applicable period of incarceration for a Class A crime, and accordingly vacated the sentence. Perhaps more importantly, the Law Court used the case as an “opportunity for clarification of [its] review of an appeal from a sentence imposed by the trial court.” A unanimous court sought to clear up some inconsistencies in previous decisions regarding “the terminology used to define each of the three steps” of the sentencing process by better describing the procedure “by which the significant purposes [of criminal sanction] and relevant factors may be articulated by the trial court in an individual case.” Moreover, the court opined that the three steps were “necessary . . . to achieve a greater uniformity in the sentencing process and to enable [the Law Court] to apply the correct standard of review to each of those steps.” The court’s decision in Hewey was an attempt to help sentencing judges more clearly articulate their sentencing rationale, which would allow for more efficient review. The resulting process is commonly referred to as the Hewey analysis. This analysis is critical because of the standards of review clearly articulated by the Law Court in Hewey. In the Law Court’s sentence review, different standards of review are applied depending on which step of the Hewey analysis is at issue. The first step in the process is reviewed for misapplication of principle. This standard is less deferential than the abuse-of-discretion standard applied to the latter two steps. Because the trial court is in a superior position to evaluate the factors “peculiar to the particular offender,” the reviewing court grants greater deference to the weight and effect given these individualized factors by the sentencing court in determining the maximum period of incarceration and the amount, if any, that shall be suspended.
    The difficulty, however, lies in defining the “principle” to be applied.

    Numerical studies of flow in porous media using an unstructured approach

    Get PDF
    Flow and transport in porous media is relevant to many areas of engineering and science, including groundwater hydrology and the recovery of oil and gas. Porous materials are characterized by the unique shape and connectivity of their internal void structures, which give rise to a large range in macroscopic transport properties. Historically, an inability to accurately describe the internal pore structure has prevented detailed study of the role of pore structure on transport. In recent decades, however, the combination of high resolution imaging technologies with computational modeling has seen the development of fundamental pore-scale techniques for studying flow in porous media. Image-based pore-scale modeling of transport phenomena has become an important tool for understanding the complicated relationships between pore structure and measurable macroscopic properties, including permeability and formation factor. This has commonly been achieved by a network-based approach, where the pore space is idealized as a series of pores connected by throats, or by a grid-based approach, where the voxels of a 3D image represent structured quadrilateral elements or nodal locations. In this work, however, image-based unstructured meshing techniques are used to represent voxelised pore spaces by grids composed entirely of tetrahedral elements. These unstructured tetrahedral grids are used in finite element models to calculate permeability and formation factor. Solutions to the Stokes equations governing creeping, or Darcy, flow are used to validate the finite element approach employed in this work, and to assess the impact of different image-based unstructured meshing strategies on predicted permeability.
    Testing shows that solutions to the Stokes equations using a P2P1 tetrahedral element are significantly more accurate than solutions based on a P1P1 element, while permeability is shown to be sensitive to structural changes to the pore space induced by different meshing approaches. The modeling approach is also used to investigate the relationship of electric and hydraulic definitions of tortuosity to the Carman-Kozeny equation. The results of simulations using a number of computer generated porous structures indicate that an electrical tortuosity based on computed formation factor is well correlated with the tortuosity suggested by the Carman-Kozeny equation.
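The Carman-Kozeny relation referenced above can be sketched in its tortuosity form; a minimal illustration, assuming the common form k = phi^3 / (c * tau^2 * S_v^2 * (1 - phi)^2) with an assumed shape constant c, not the thesis's fitted values:

```python
# Hedged sketch of the Carman-Kozeny permeability relation. The shape
# constant c and the sample inputs are illustrative assumptions.

def carman_kozeny_permeability(phi, s_v, tau, c=2.0):
    """Permeability k = phi^3 / (c * tau^2 * s_v^2 * (1 - phi)^2),
    with porosity phi, specific surface s_v (surface area per unit
    solid volume), hydraulic tortuosity tau, and shape constant c."""
    return phi**3 / (c * tau**2 * s_v**2 * (1.0 - phi) ** 2)

# Hypothetical bead pack: phi = 0.4, s_v = 6.0 (i.e. 6/d for unit
# diameter spheres), tau = 1.5.
k = carman_kozeny_permeability(0.4, 6.0, 1.5)
```

Solving for tau given a computed permeability and formation factor is one way the electric and hydraulic tortuosity definitions in the abstract can be compared.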

    Network Exploration of Correlated Multivariate Protein Data for Alzheimer\u27s Disease Association

    Get PDF
    Alzheimer’s disease (AD) is difficult to diagnose using genetic testing or other traditional methods. Unlike diseases with simple genetic risk components, no single marker determines whether someone will develop AD. Furthermore, AD is highly heterogeneous, and different subgroups of individuals develop the disease due to differing factors. Traditional diagnostic methods based on perceivable cognitive deficiencies are often too little, too late, because the brain has already suffered damage from decades of disease progression. To observe AD at early stages, prior to the appearance of cognitive deficiencies, biomarkers with greater accuracy are required. By using the non-scalar, bidirectional correlation measure Duo, we addressed the problem of AD’s heterogeneity by creating a bidirectional network. With this method, we identified key communities of synchronized proteins that are significantly associated with AD. We found that low levels of IP10 and MIG in the cerebrospinal fluid (CSF) may be protective, whereas high levels in the CSF appear to be a risk factor. High levels of Clusterin and Sortilin in the CSF were also found to be risk factors. Additionally, low levels of Testosterone, FSH, and LH in the blood plasma appear protective in men, whereas high levels of GH, LH, and FSH appear to be risk factors in women. With these initial findings from a cohort of individuals, we seek to replicate the process on independent datasets, ultimately facilitating the development of methods for revealing preclinical AD and furthering understanding of AD pathogenesis.
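The general workflow (correlate protein profiles, link strongly correlated pairs, extract communities) can be sketched generically. Note the hedges: Duo is a bespoke bidirectional measure not reproduced here, so plain Pearson correlation stands in for it, connected components stand in for the community detection, and the protein names and threshold are illustrative only.

```python
# Hedged sketch: a generic correlation-network stand-in for the Duo-based
# pipeline. Pearson correlation and connected components are simplifying
# substitutions, not the paper's method.
from itertools import combinations
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two equal-length, non-constant profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def protein_communities(profiles, threshold=0.8):
    """profiles: {protein: [levels per subject]}. Link proteins whose
    |correlation| meets the threshold; return connected components as
    stand-in 'communities' of synchronized proteins."""
    names = list(profiles)
    adj = {p: set() for p in names}
    for a, b in combinations(names, 2):
        if abs(pearson(profiles[a], profiles[b])) >= threshold:
            adj[a].add(b)
            adj[b].add(a)
    seen, comps = set(), []
    for p in names:
        if p in seen:
            continue
        stack, comp = [p], set()
        while stack:                      # depth-first component walk
            q = stack.pop()
            if q in comp:
                continue
            comp.add(q)
            stack.extend(adj[q] - comp)
        seen |= comp
        comps.append(comp)
    return comps
```

With real CSF or plasma panels, each recovered community would then be tested for association with AD status.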

    Flood Damage and Shutdown Times for Industrial Process Facilities

    Get PDF
    The vulnerability of the Gulf Coast to inundation poses a real threat to both national security and the regional economy, due to the concentration of the nation’s energy infrastructure throughout the waterways of the southeastern United States. Mitigation efforts thus far have been qualitative and fail to provide raw, quantitative data to aid in the successful management of flooding liabilities. This paper proposes a novel approach to analyzing infrastructure susceptibility by means of a component-based assessment of the consequences of water-borne incursions. Systems are simplified to collections of components, each with a lowest-member elevation, thereby identifying the benchmark for vulnerability. Further, the maintenance efforts required to return these systems to processing capability are integrated into the component database, identified by available repair and replacement tasks. Site-specific flood simulations are analyzed using National Oceanic and Atmospheric Administration data, which provide the expected inundation levels for the five categories of tropical events on the Saffir-Simpson Hurricane Wind Scale. These levels are applied to the elevations determined in the component analysis, thereby producing a defensible estimate, measured in manhours, of the reconstruction effort following a flood event. These manhours are then used to calculate cost within a labor database composed of technical laborers and supervision, yielding a labor cost. Material costs based on historic pricing, equipment costs based on current market rates, and company overhead costs, composed of site project management, are aggregated to arrive at a total direct cost of inundation at a specified flood depth. From this total direct cost, owner-level decisions concerning acceptable risk can be made with quantitative data to support mitigation and prevention strategies.
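The core tally described above (compare each component's lowest-member elevation to the flood depth, sum repair manhours, price them at a labor rate) can be sketched as follows. The component names, elevations, manhours, and labor rate are hypothetical illustrations, not values from the paper's database.

```python
# Hedged sketch of the component-based flood-damage tally. All entries
# below are assumed placeholders, not the paper's component database.

COMPONENTS = [
    # (name, lowest-member elevation in ft, repair manhours)
    ("pump_motor", 3.0, 120),
    ("control_panel", 5.5, 400),
    ("transformer", 8.0, 250),
]
LABOR_RATE = 85.0  # dollars per manhour - assumed

def flood_repair_cost(flood_depth_ft):
    """Total direct labor cost: a component is counted as damaged when
    the flood depth reaches its lowest-member elevation."""
    hours = sum(mh for _, elev, mh in COMPONENTS if flood_depth_ft >= elev)
    return hours * LABOR_RATE
```

In the full method, the flood depth would come from the NOAA inundation level for each Saffir-Simpson category, and material, equipment, and overhead costs would be added to this labor figure.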