    Explicit correlation and basis set superposition error: The structure and energy of carbon dioxide dimer

    We have investigated the slipped parallel and T-shaped structures of the carbon dioxide dimer [(CO₂)₂] using both conventional and explicitly correlated coupled cluster methods, with and without counterpoise (CP) correction. We determined the geometry of both structures with conventional coupled cluster singles, doubles, and perturbative triples theory [CCSD(T)] and with explicitly correlated coupled cluster singles, doubles, and perturbative triples theory [CCSD(T)-F12b] at the complete basis set (CBS) limit using custom optimization routines. Consistent with previous investigations, we find that the slipped parallel structure corresponds to the global minimum and is 1.09 kJ mol⁻¹ lower in energy. For a given cardinal number, the optimized geometries and interaction energies of (CO₂)₂ obtained with the explicitly correlated CCSD(T)-F12b method are closer to the CBS limit than the corresponding conventional CCSD(T) results. Furthermore, the magnitude of the basis set superposition error (BSSE) in the CCSD(T)-F12b optimized geometries and interaction energies is appreciably smaller than that in the conventional CCSD(T) results. We decompose the CCSD(T) and CCSD(T)-F12b interaction energies into their constituent HF or HF-CABS, CCSD or CCSD-F12b, and (T) contributions. We find that the complementary auxiliary basis set (CABS) singles correction and the F12b approximation significantly reduce the magnitude of BSSE at the HF and CCSD levels of theory, respectively. For a given cardinal number, non-CP-corrected, unscaled-triples CCSD(T)-F12b/VXZ-F12 interaction energies are in best overall agreement with the CBS limit.
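    The counterpoise scheme referred to above can be stated in a few lines. The Python sketch below shows how a CP-corrected interaction energy and a BSSE estimate would be assembled from the underlying electronic energies; the numerical values are hypothetical placeholders, not results from this work, and in practice each energy would come from a CCSD(T) or CCSD(T)-F12b calculation.

```python
# Minimal sketch of the Boys-Bernardi counterpoise (CP) correction for a
# dimer A-B. All energies below are hypothetical placeholders in hartree.

HARTREE_TO_KJ_PER_MOL = 2625.4996  # 1 hartree in kJ/mol

def interaction_energy(e_dimer, e_a, e_b):
    """Uncorrected interaction energy: E(AB) - E(A) - E(B)."""
    return e_dimer - e_a - e_b

def cp_corrected_interaction_energy(e_dimer, e_a_in_ab_basis, e_b_in_ab_basis):
    """CP-corrected interaction energy: each monomer is evaluated in the
    full dimer basis (ghost functions on the partner), which removes most
    of the basis set superposition error (BSSE)."""
    return e_dimer - e_a_in_ab_basis - e_b_in_ab_basis

# Hypothetical numbers purely for illustration:
e_ab, e_a, e_b = -376.12345, -188.05950, -188.05920
e_a_ghost, e_b_ghost = -188.05965, -188.05935

de_raw = interaction_energy(e_ab, e_a, e_b) * HARTREE_TO_KJ_PER_MOL
de_cp = cp_corrected_interaction_energy(e_ab, e_a_ghost, e_b_ghost) * HARTREE_TO_KJ_PER_MOL
print(f"uncorrected: {de_raw:.2f} kJ/mol, CP-corrected: {de_cp:.2f} kJ/mol")
print(f"BSSE estimate: {de_raw - de_cp:.2f} kJ/mol")
```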

    Physical processes in polar stratospheric ice clouds

    A one-dimensional model of cloud microphysics was used to simulate the formation and evolution of polar stratospheric ice clouds. Some of the processes included in the model are outlined. It is found that the clouds must undergo preferential nucleation upon the existing aerosols, just as tropospheric cirrus clouds do. Therefore, there is an energy barrier between stratospheric nitric acid particles and ice particles, implying that nitric acid does not form a continuous set of solutions between the trihydrate and ice. The Kelvin barrier is not significant in controlling the rate of formation of ice particles. It was found that the cloud properties are sensitive to the rate at which the air parcels cool. In wave clouds, with cooling rates of hundreds of degrees per day, most of the existing aerosols nucleate and become ice particles. Such clouds have particles with sizes on the order of a few microns and optical depths on the order of unity, and they are probably not efficient at removing materials from the stratosphere. In clouds that form with cooling rates of a few degrees per day or less, only a small fraction of the aerosols become cloud particles. In such clouds the particle radius is larger than 10 microns, the optical depths are low, and water vapor is efficiently removed. Seasonal simulations show that the lowest water vapor mixing ratio is determined by the lowest temperature reached, and that the time when clouds disappear is controlled by the time when temperatures begin to rise above the minimum values.
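    As a point of reference for the statement that the Kelvin barrier is not significant, the sketch below evaluates the Kelvin equation for the equilibrium saturation ratio over a curved ice surface. The surface energy and molar volume are rough, order-of-magnitude values chosen for illustration, not parameters taken from the model described above.

```python
import math

# Kelvin equation: equilibrium saturation ratio over an ice particle of
# radius r relative to a flat surface, S = exp(2*sigma*v_m / (r*R*T)).
# Parameter values are approximate assumptions for stratospheric conditions.

SIGMA_ICE = 0.105        # J m^-2, ice-vapor surface energy (approximate)
MOLAR_VOLUME = 1.96e-5   # m^3 mol^-1, molar volume of ice (approximate)
R_GAS = 8.314            # J mol^-1 K^-1

def kelvin_saturation_ratio(radius_m, temperature_k):
    """Equilibrium saturation ratio required by a particle of the given radius."""
    return math.exp(2.0 * SIGMA_ICE * MOLAR_VOLUME /
                    (radius_m * R_GAS * temperature_k))

for r in (0.01e-6, 0.1e-6, 1.0e-6):  # 0.01, 0.1, and 1 micron
    print(f"r = {r * 1e6:5.2f} um -> S = {kelvin_saturation_ratio(r, 190.0):.3f}")
```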

    Modeling Ozark Caves with Structure-from-Motion Photogrammetry: An Assessment of Stand-Alone Photogrammetry for 3-Dimensional Cave Survey

    Nearly all aspects of karst science and management begin with a map. Yet despite this fact, cave survey is largely conducted in the same archaic way that it has been for years: with a compass, tape measure, and a sketchpad. Traditional cave survey can establish accurate survey lines quickly. However, passage walls, ledges, profiles, and cross-sections are time intensive and ultimately rely on the sketcher's experience at interpretively hand drawing these features between survey stations. This project experiments with photogrammetry as a method of improving on traditional cave survey while avoiding some of the major pitfalls of terrestrial laser scanning. The proposed method allows for the creation of 3D models that capture cave wall geometry and important cave formations, as well as providing the ability to create cross-sections anywhere desired. The interactive 3D cave models are produced cheaply, with equipment that can be operated in extremely confined, harsh conditions by unpaid volunteers with little to no technical training. While the rapid advancement of photogrammetric software has led to its use in many 3D modeling applications, there is only a sparse body of research examining the use of photogrammetry as a standalone method for surveying caves. The proposed methodology uses a GoPro camera and a 1000-lumen portable floodlight to capture still images down the length of cave passages. The procedure goes against several traditional rules of thumb, both operating in the dark with a moving light source and utilizing a wide-angle fisheye lens to capture scene information that is not perpendicular to the camera's field of view. Images are later processed into 3D models using Agisoft's PhotoScan. Four caves were modeled using the method, with varying levels of success. The best results occurred in dry, confined passages, while passages greater than 9 meters (30 ft) in width, or those with a great deal of standing water on the floor, produced large holes. An additional experiment was conducted in the University of Arkansas utility tunnel.
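    A minimal sketch of the kind of batch processing step described above is given below, using the Python API of Agisoft Metashape (the successor to PhotoScan). It is an illustrative assumption of the workflow only: method defaults and signatures differ between software versions, and the file paths are invented.

```python
# Hypothetical sketch of processing cave passage photos into a 3D model
# with the Agisoft Metashape Python API. Defaults and parameters vary by
# version; the paths below are invented for illustration.
import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()

# Add the GoPro still frames captured along the cave passage.
chunk.addPhotos(glob.glob("cave_passage_photos/*.jpg"))

# Feature matching and sparse reconstruction, then dense geometry for
# wall surfaces and cross-sections.
chunk.matchPhotos()
chunk.alignCameras()
chunk.buildDepthMaps()
chunk.buildModel()

doc.save("cave_model.psx")
```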

    Planning for Continuity of Services: A Comprehensive Strategic Assessment Model for Healthcare Business Continuity Planning

    With the release of the 2016 Centers for Medicare and Medicaid Services (CMS) requirements for healthcare institutions to implement business continuity planning in their organizations by November 15, 2017, the focus of business continuity and disaster recovery planning solely on information services has transitioned into an enterprise-wide requirement. Over the past decade, increasing numbers of naturally occurring and man-made disasters have significantly interrupted or altogether closed healthcare facilities, impacting the health and well-being of entire communities. This study examines the changing regulatory landscape that requires healthcare institutions to develop, maintain, and regularly test their business continuity plans in an effort to enhance their operational resiliency. After a retrospective review of regulations, guidelines, and best practices, this study pilots an addition to the Kaiser Permanente hazard vulnerability assessment (HVA) tool that is intended to enable healthcare organizations to objectively identify, prioritize, and maintain their business continuity and emergency management planning efforts through the identification of potential operational and financial impacts to healthcare facilities during and following disasters. The major benefits of this study are to identify the historical shortcomings of a healthcare facility's hazard and risk identification processes and to facilitate the use of the information collected during that process. Inadequacies identified in past healthcare preparedness efforts will be used to form new, meaningful efforts to improve the recognition of risks to healthcare organizations, enhance their resiliency to interruptions of services, and minimize financial losses during austere events.
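    The sketch below illustrates, in Python, how an HVA-style relative-risk score could be extended with the operational and financial impact columns this study pilots. The category names, 0-3 scales, and weighting are assumptions made for illustration; they are not the Kaiser Permanente instrument or the study's actual scoring scheme.

```python
# Illustrative hazard vulnerability assessment (HVA) scoring in the spirit
# of the Kaiser Permanente tool, extended with hypothetical operational and
# financial impact columns. All categories, scales, and values are assumed.
from dataclasses import dataclass

@dataclass
class Hazard:
    name: str
    probability: int         # 0 (n/a) to 3 (high)
    human_impact: int        # 0-3
    property_impact: int     # 0-3
    operational_impact: int  # 0-3, hypothetical addition: continuity of services
    financial_impact: int    # 0-3, hypothetical addition: revenue/cost exposure
    preparedness: int        # 0 (well prepared) to 3 (poorly prepared)

    def relative_risk(self) -> float:
        """Probability times mean severity, normalized to a 0-100% scale."""
        severity = (self.human_impact + self.property_impact +
                    self.operational_impact + self.financial_impact +
                    self.preparedness) / 5.0
        return (self.probability / 3.0) * (severity / 3.0) * 100.0

hazards = [
    Hazard("Hurricane", 2, 3, 3, 3, 2, 2),
    Hazard("IT outage", 3, 1, 0, 3, 2, 2),
]
for h in sorted(hazards, key=lambda h: h.relative_risk(), reverse=True):
    print(f"{h.name:12s} relative risk = {h.relative_risk():5.1f}%")
```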

    SISTERS

    This paper will examine the process of producing my short thesis film SISTERS. It includes a self-reflective analysis of my filmmaking approach from start to finish, with accompanying documentation to further show the specifics of the production process. My experience as a graduate student in the University of New Orleans film and theatre department will also be discussed at length.

    Painful Virtue, Marginalisation, and Resistance

    This paper argues for a potentially controversial thesis in virtue ethics: in situations of oppression and marginalisation, it is better to be a person of atypical virtue, one who has struggled to resist oppressive circumstances, than to be a traditionally defined virtuous agent. As such, those who have been through a tragic dilemma (or several) are more important for successful resistance movements than their traditionally defined counterparts. This paper does not romanticise oppressive situations or their influence on some individuals' development of virtuous actions and behaviours. Instead, it acknowledges that these are tragic circumstances that permanently affect some individuals for the rest of their lives. However, the argument here is that these individuals can use their experiences as reasons to continue resisting until a time comes when future generations will not need to experience such tragic circumstances. To demonstrate the applicability of this argument, this paper considers the struggles of queer individuals in a Canadian context. This is achieved by demonstrating how the individuals who led the fight for queer rights used their experiences of marginalisation in early resistance movements. It then shifts focus to address current issues in Canadian queer lives.

    Radiation Metrology of Small Animal Molecular Imaging and Molecular Radiotherapy Using micro-PET/CT

    Genetically engineered animal models are increasingly able to recapitulate human diseases. With this, in vivo preclinical imaging of small laboratory animals has emerged as a critical component of biomedical research, because its noninvasive nature allows serial assay of animal models and monitoring of safety and effectiveness over the history of the disease. The concept of quantitative molecular imaging is to go beyond displaying images in digital form and instead to extract quantitative information from the image that allows for a better understanding of disease progression and treatment. The aim of this work is to demonstrate the need for metrology of molecular imaging of animal models using micro-PET/CT devices. System characteristics are determined for each subsystem, micro-PET and micro-CT, independently and as an integrated system. The characterization of tissue composition and density by micro-CT was determined along with the noise level of the unit. Moreover, the nominal superficial and deep absorbed doses were estimated to assess the confounding effect of multiple scans in animal studies. The Q value, used to convert counts per milliliter to activity per milliliter, was estimated to assess the observed activity present in the animal. The resolution of the micro-PET subsystem was also estimated using a modified Derenzo phantom to assess the uncertainty of the activity distribution within tissues. Once both modalities were characterized separately, the coordinate system of each individual system was checked for spatial accuracy using a cross-capillary method. The offset values were then used to establish the same coordinate system for co-registration. Once both micro-PET and micro-CT image data sets had been verified, they were used to generate a voxel image of the subject for use in the Monte Carlo program MCNP6, where an absorbed dose map was generated for the radiolabeled compound. Two basic examples are given to demonstrate the use of the voxelized absorbed dose maps for calculating the absorbed dose to any segmented organ of interest across longitudinal studies. In this way, it was shown that an animal-specific model can be used to accurately calculate the absorbed dose for each time point during a study.
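    As a minimal illustration of the final step described above, the Python sketch below reports the mean absorbed dose to a segmented organ given a voxelized dose map and a binary organ mask. The grid size, dose values, and organ region are hypothetical placeholders, not data from this work.

```python
import numpy as np

# Given a voxelized absorbed-dose map (e.g., produced by a Monte Carlo
# transport code such as MCNP6 on a co-registered PET/CT voxel phantom)
# and a binary organ segmentation, report the mean organ dose.

def mean_organ_dose(dose_map_gy: np.ndarray, organ_mask: np.ndarray) -> float:
    """Mean absorbed dose (Gy) over the voxels belonging to the organ."""
    if dose_map_gy.shape != organ_mask.shape:
        raise ValueError("dose map and mask must share the same voxel grid")
    voxels = organ_mask.astype(bool)
    return float(dose_map_gy[voxels].mean())

# Hypothetical 64^3 voxel grid purely for illustration.
rng = np.random.default_rng(0)
dose = rng.uniform(0.0, 0.05, size=(64, 64, 64))   # Gy at one time point
liver = np.zeros((64, 64, 64), dtype=bool)
liver[20:30, 20:30, 20:30] = True                   # invented organ mask
print(f"mean organ dose: {mean_organ_dose(dose, liver):.4f} Gy")
```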

    Efficient quantum processing of ideals in finite rings

    Suppose we are given black-box access to a finite ring R and a list of generators for an ideal I in R. We show how to find an additive basis representation for I in poly(log |R|) time. This generalizes a recent quantum algorithm of Arvind et al. which finds a basis representation for R itself. We then show that our algorithm is a useful primitive that allows quantum computers to rapidly solve a wide variety of problems regarding finite rings. In particular, we show how to test whether two ideals are identical, find their intersection, find their quotient, prove whether a given ring element belongs to a given ideal, prove whether a given element is a unit and, if so, find its inverse, find the additive and multiplicative identities, compute the order of an ideal, solve linear equations over rings, decide whether an ideal is maximal, find annihilators, and test the injectivity and surjectivity of ring homomorphisms. These problems appear to be hard classically.
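    The following is a purely classical, toy illustration of what an additive basis representation provides, restricted to the special case R = Z/nZ; it is not the paper's quantum algorithm, which handles general black-box finite rings. In Z/nZ, the ideal generated by elements a_1, ..., a_k is the additive cyclic group generated by g = gcd(a_1, ..., a_k, n), so a single additive basis element suffices and ideal membership reduces to a divisibility check.

```python
from functools import reduce
from math import gcd

# Toy classical example in R = Z/nZ (not the paper's quantum algorithm):
# the ideal (a_1, ..., a_k) equals the additive group generated by
# g = gcd(a_1, ..., a_k, n), so one additive basis element suffices.

def additive_basis_zn(n, generators):
    """Single additive generator g of the ideal (a_1, ..., a_k) in Z/nZ.
    The ideal is {0, g, 2g, ...} mod n; g == n corresponds to the zero ideal."""
    return reduce(gcd, generators, n)

def in_ideal(n, generators, x):
    """Test whether x lies in the ideal generated by `generators` in Z/nZ."""
    return x % additive_basis_zn(n, generators) == 0

n = 360
gens = [84, 150]                     # the ideal (84, 150) in Z/360Z
print(additive_basis_zn(n, gens))    # 6 = gcd(84, 150, 360)
print(in_ideal(n, gens, 48))         # True: 48 is a multiple of 6
print(in_ideal(n, gens, 49))         # False
```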