    Overdue Process: Why Denial of Physician-Prescribed Marijuana to Terminally Ill Patients Violates the United States Constitution

    This Comment will begin with a brief history of the medical use of marijuana in Western culture and the United States. It will then examine the existing federal statutory scheme governing the use of marijuana and conclude with a look at current beliefs about the medical value of marijuana. Section III will analyze previous attempts to collaterally attack the scheduling of marijuana through the courts and show why those efforts have generally failed. Section IV will perform a substantive due process analysis of William Cohen's case and submit that Mr. Cohen has a fundamental right to consult with his physician to ease severe pain. It will then balance this right against the relevant state interests, which it will suggest are illusory. Finally, Section V will examine the policies supporting judicial, as opposed to legislative, solutions to the medicinal marijuana problem and will conclude that judicial action is the only method likely to bring about necessary change.

    Pixelated Scapegoat: False Connections Between Video Games and Violent Crime in America's Largest Cities

    For too many years, the public perception of violent crime has been viewed through the warped lens of media representations and reporting of mass killings, the likes of Columbine, Sandy Hook, and Parkland, while ignoring the body counts that rack up year after year in America's own cities. Many pundits and politicians declare cultural decay and the glorification of violence in video games to be the main reason for mass shootings. However, these same voices fail to take the explanation any further to account for the eruptions of violence that claim thousands of lives a year in large metropolitan areas. The question, then, is a simple one: if violent video games are responsible for the recent upticks in mass public violence, should it not also be true that violent video games have some causal connection to everyday violent crime? This study aims to answer this question and then some. Using the 100 most populous cities as a sample, traditional criminological explanations for violent crime, such as poverty, income inequality, population density, segregation, divorce, and the contexts of our racialized past, are compared to simple measures of video game prevalence, such as sales figures and surveys that detail video game preferences, in explaining homicide and aggravated assault rates in those cities over a five-year period. Using OLS regression analyses, the results suggest that video games, when taken by themselves, have a negative relationship with both homicide and aggravated assault rates, meaning that where more video games are sold in a given city, violent crime is lower. However, when taken together with the traditional explanations of violent crime in the same model, video game related sales and public sentiment fall short of significance when compared to variables like poverty and historical racial segregation.
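
    As an illustration only, the sketch below shows how the OLS comparison described in this abstract might be set up with statsmodels; the file name, column names, and variable choices are hypothetical placeholders, not the study's actual data or code.

```python
# Hypothetical sketch of the two OLS models described in the abstract (not the study's code).
# Assumes a city-level CSV with one row per city and the columns named below.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("city_panel.csv")  # hypothetical file of 5-year city averages

outcome = "homicide_rate"
game_measures = ["game_sales_per_capita"]
traditional = ["poverty_rate", "income_inequality", "population_density",
               "segregation_index", "divorce_rate"]

# Model 1: video game measures alone.
X1 = sm.add_constant(df[game_measures])
model1 = sm.OLS(df[outcome], X1).fit()

# Model 2: video game measures alongside traditional criminological predictors.
X2 = sm.add_constant(df[game_measures + traditional])
model2 = sm.OLS(df[outcome], X2).fit()

print(model1.summary())
print(model2.summary())
```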

    An evidence-based health workforce model for primary and community care

    Background: The delivery of best practice care can markedly improve clinical outcomes in patients with chronic disease. While the provision of a skilled, multidisciplinary team is pivotal to the delivery of best practice care, the occupational or skill mix required to deliver this care is unclear; it is also uncertain whether such a team would have the capacity to adequately address the complex needs of the clinic population. This is the role of needs-based health workforce planning. The objective of this article is to describe the development of an evidence-informed, needs-based health workforce model to support the delivery of best-practice interdisciplinary chronic disease management in the primary and community care setting, using diabetes as a case exemplar. Discussion: Development of the workforce model was informed by a strategic review of the literature, critical appraisal of clinical practice guidelines, and a consensus elicitation technique using expert multidisciplinary clinical panels. Twenty-four distinct patient attributes that require unique clinical competencies for the management of diabetes in the primary care setting were identified. Patient attributes were grouped into four major themes and developed into a conceptual model: the Workforce Evidence-Based (WEB) planning model. The four levels of the WEB model are (1) promotion, prevention, and screening of the general or high-risk population; (2) type or stage of disease; (3) complications; and (4) threats to self-care capacity. Given the number of potential combinations of attributes, the model can account for literally millions of individual patient types, each with a distinct clinical team need, which can be used to estimate the total health workforce requirement. Summary: The WEB model was developed in a way that is not only reflective of the diversity in the community and clinic populations but also parsimonious and clear to present and operationalize. A key feature of the model is the classification of subpopulations, which gives attention to the particular care needs of disadvantaged groups by incorporating threats to self-care capacity. The model can be used for clinical, health services, and health workforce planning.
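
    To make the combinatorial point concrete, here is a minimal, purely illustrative sketch of how a needs-based estimate could be assembled from attribute combinations; the attribute names, prevalence, occupations, and hour values are invented placeholders, not figures from the WEB model.

```python
# Illustrative sketch only: invented attributes and hours, not the WEB model's actual data.
from itertools import product

# Hypothetical binary attributes drawn from the four WEB levels.
attributes = ["high_risk", "type_2_diabetes", "retinopathy", "low_health_literacy"]

# Hypothetical annual clinician hours required when an attribute is present, by occupation.
hours_if_present = {
    "high_risk":           {"gp": 1.0, "nurse": 2.0, "dietitian": 0.5},
    "type_2_diabetes":     {"gp": 3.0, "nurse": 4.0, "dietitian": 2.0},
    "retinopathy":         {"gp": 1.0, "nurse": 0.5, "dietitian": 0.0},
    "low_health_literacy": {"gp": 0.5, "nurse": 3.0, "dietitian": 1.0},
}

def team_need(patient_type):
    """Sum occupation-specific hours over the attributes a patient type carries."""
    need = {"gp": 0.0, "nurse": 0.0, "dietitian": 0.0}
    for attr, present in zip(attributes, patient_type):
        if present:
            for occupation, hours in hours_if_present[attr].items():
                need[occupation] += hours
    return need

# Every combination of attributes defines a distinct patient type:
# 2**4 = 16 here; 24 binary attributes would give roughly 16.8 million.
patient_types = list(product([0, 1], repeat=len(attributes)))
print(f"{len(patient_types)} patient types")

# Population counts per patient type would scale each team need into a total
# workforce requirement; here we simply show the need for one example type.
print(team_need((1, 1, 0, 1)))
```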

    Composite QDrift-Product Formulas for Quantum and Classical Simulations in Real and Imaginary Time

    Recent work has shown that it can be advantageous to implement a composite channel that partitions the Hamiltonian $H$ for a given simulation problem into subsets $A$ and $B$ such that $H=A+B$, where the terms in $A$ are simulated with a Trotter-Suzuki channel and the $B$ terms are randomly sampled via the QDrift algorithm. Here we show that this approach holds in imaginary time, making it a candidate classical algorithm for quantum Monte-Carlo calculations. We upper-bound the induced Schatten $1 \to 1$ norm on both imaginary-time QDrift and composite channels. Another recent result demonstrated that simulations of Hamiltonians containing geometrically-local interactions for systems defined on finite lattices can be improved by decomposing $H$ into subsets that contain only terms supported on that subset of the lattice, using a Lieb-Robinson argument. Here, we provide a quantum algorithm by unifying this result with the composite approach into "local composite channels" and we upper bound the diamond distance. We provide exact numerical simulations of algorithmic cost by counting the number of gates of the form $e^{-iH_j t}$ and $e^{-H_j \beta}$ needed to meet a certain error tolerance $\epsilon$. We show constant factor advantages for a variety of interesting Hamiltonians, the maximum of which is an $\approx 20$-fold speedup that occurs for a simulation of Jellium. (Comment: 49 pages, 13 figures)
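
    As a rough illustration of the partitioning idea (not the paper's algorithm or cost model), the sketch below splits a list of Hamiltonian term weights into a deterministically Trotterized set A and a QDrift-sampled set B and tallies the exponential gates each contributes; the term weights, the threshold, and the simplified step-count and sample-count formulas are placeholder assumptions.

```python
# Illustrative sketch: split Hamiltonian terms between Trotter (set A) and QDrift (set B)
# and count exponential gates. Weights, threshold, and cost formulas are placeholder
# assumptions, not the paper's cost model.
import math

h_weights = [2.0, 1.5, 1.2, 0.3, 0.2, 0.1, 0.05]  # hypothetical |h_j| values
t = 1.0          # simulation time
epsilon = 1e-3   # target error budget
threshold = 1.0  # terms with weight >= threshold go to the Trotterized set A

A = [h for h in h_weights if h >= threshold]
B = [h for h in h_weights if h < threshold]

# First-order Trotter on A: r repetitions, one exponential gate per term per repetition.
lambda_A = sum(A)
r = math.ceil((lambda_A * t) ** 2 / epsilon)       # crude first-order step-count estimate
trotter_gates = r * len(A)

# QDrift on B: each sample is one gate; standard sample-count scaling ~ 2*(lambda*t)^2/eps.
lambda_B = sum(B)
qdrift_gates = math.ceil(2 * (lambda_B * t) ** 2 / epsilon)

print(f"Trotter gates for A: {trotter_gates}")
print(f"QDrift gates for B:  {qdrift_gates}")
print(f"Total composite cost: {trotter_gates + qdrift_gates}")
```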

    Challenges in molecular dynamics simulations of heat exchange statistics

    We study heat exchange in temperature-biased metal-molecule-metal molecular junctions by employing the LAMMPS atomic molecular dynamics simulator. Generating the nonequilibrium steady state with Langevin thermostats at the boundaries of the junction, we show that the average heat current across a gold-alkanedithiol-gold nanojunction behaves physically correctly, with the thermal conductance value matching the literature. In contrast, the full probability distribution function for heat exchange, as generated by the simulator, violates the fundamental fluctuation symmetry for entropy production. We trace this failure back to the implementation of the thermostats and the expression used to calculate the heat exchange. To rectify this issue and produce the correct statistics, we introduce single-atom thermostats as an alternative to conventional many-atom thermostats. Once heat exchange is averaged over the hot and cold thermostats, this approach successfully generates the correct probability distribution function, which we use to study the behavior of both the average heat current and its noise. We further examine the thermodynamic uncertainty relation in the molecular junction and show that it holds, albeit demonstrating nontrivial trends. Our study points to the need to carefully implement nonequilibrium molecular dynamics solvers in atomistic simulation software tools for future investigations of noise phenomena in thermal transport.
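
    For readers unfamiliar with the fluctuation symmetry referenced above, the sketch below checks the standard exchange form, ln[P(Q)/P(-Q)] = (1/kT_c - 1/kT_h) Q, on synthetic heat samples; the Gaussian sampling and the temperatures are illustrative assumptions, not output from the LAMMPS setup described in the abstract.

```python
# Illustrative check of the exchange fluctuation symmetry ln[P(Q)/P(-Q)] = (beta_c - beta_h)*Q
# on synthetic data; this is not the LAMMPS analysis from the abstract. Units are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

beta_h, beta_c = 1.0, 2.0          # assumed inverse temperatures of the hot and cold thermostats
delta_beta = beta_c - beta_h

# A Gaussian P(Q) with variance 2*mean/delta_beta obeys the symmetry exactly, so it serves
# as a stand-in for heat-exchange samples accumulated over a fixed time window.
mean_q = 0.5
samples = rng.normal(mean_q, np.sqrt(2 * mean_q / delta_beta), size=200_000)

# Histogram P(Q) on a grid symmetric about Q = 0 and compare ln[P(Q)/P(-Q)] with delta_beta*Q.
counts, edges = np.histogram(samples, bins=121, range=(-3.0, 3.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
pos = centers > 0
p_pos, p_neg = counts[pos], counts[::-1][pos]      # P(Q) and P(-Q) at the positive centers
valid = (p_pos > 0) & (p_neg > 0)

# Least-squares slope through the origin of ln[P(Q)/P(-Q)] vs Q; should approach delta_beta.
x = centers[pos][valid]
y = np.log(p_pos[valid] / p_neg[valid])
slope = np.sum(x * y) / np.sum(x * x)
print(f"fitted slope ~ {slope:.3f}  (expected delta_beta = {delta_beta:.3f})")
```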

    A Compact Microchip-Based Atomic Clock Based on Ultracold Trapped Rb Atoms

    We propose a compact atomic clock based on ultracold Rb atoms that are magnetically trapped near the surface of an atom microchip. An interrogation scheme that combines electromagnetically-induced transparency (EIT) with Ramsey's method of separated oscillatory fields can achieve atomic shot-noise level performance of 10^{-13}/sqrt(tau) for 10^6 atoms. The EIT signal can be detected with a heterodyne technique that provides noiseless gain; with this technique the optical phase shift of a 100 pW probe beam can be detected at the photon shot-noise level. Numerical calculations of the density matrix equations are used to identify realistic operating parameters at which AC Stark shifts are eliminated. By considering fluctuations in these parameters, we estimate that AC Stark shifts can be canceled to a level better than 2*10^{-14}. An overview of the apparatus is presented with estimates of duty cycle and power consumption. (Comment: 15 pages, 11 figures, 5 tables)
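
    As a rough plausibility check on the quoted stability (an illustration, not the paper's own noise budget), the standard quantum-projection-noise limit for Ramsey interrogation can be written as

$$
\sigma_y(\tau) \simeq \frac{1}{2\pi \nu_0 T_R \sqrt{N}}\sqrt{\frac{T_c}{\tau}},
$$

    where $\nu_0 \approx 6.835\,\mathrm{GHz}$ is the $^{87}$Rb ground-state hyperfine frequency, $N$ the atom number, $T_R$ the Ramsey time, and $T_c$ the cycle time. With $N = 10^6$ and assumed values $T_R \approx 0.25\,\mathrm{s}$ and $T_c \approx 1\,\mathrm{s}$, this gives $\sigma_y(\tau) \approx 9\times10^{-14}/\sqrt{\tau/\mathrm{s}}$, i.e. at the $10^{-13}/\sqrt{\tau}$ level quoted above.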

    Development of Composite Sandwich Bonded Longitudinal Joints for Space Launch Vehicle Structures

    The NASA Composite Technology for Exploration (CTE) Project is developing and demonstrating critical composite technologies with a focus on composite bonded joints, incorporating materials, design/analysis, manufacturing, and tests that utilize NASA's expertise and capabilities. The project has goals of advancing composite technologies and providing lightweight structures to support future NASA exploration missions. In particular, the CTE project will demonstrate weight-saving, performance-enhancing composite bonded joint technology for Space Launch System (SLS)-scale composite hardware. Advancements from the CTE project may be incorporated as future block upgrades for SLS structural components. This paper discusses the details of the development of a composite sandwich bonded longitudinal joint for a generic space launch vehicle structure called the CTE Point Design. The paper includes details of the design, analysis, materials, manufacturing, and testing of sub-element joint test articles used to assess the capability of the joint design. The test results show that the composite longitudinal bonded joint design significantly exceeds the design loads with a 2.0 factor of safety. Pre-test analysis failure predictions for all sub-element bonded joint test coupons were within 10% of the average test coupon failure load. This testing and analysis provides confidence in the potential use of composite bonded joints for future launch vehicle structures.

    Effects on the transcriptome upon deletion of a distal element cannot be predicted by the size of the H3K27Ac peak in human cells.

    Genome-wide association studies (GWAS) have identified single nucleotide polymorphisms (SNPs) associated with increased risk for colorectal cancer (CRC). A molecular understanding of the functional consequences of this genetic variation is complicated because most GWAS SNPs are located in non-coding regions. We used epigenomic information to identify H3K27Ac peaks in HCT116 colon cancer cells that harbor SNPs associated with an increased risk for CRC. Employing CRISPR/Cas9 nucleases, we deleted a CRC risk-associated H3K27Ac peak from HCT116 cells and observed large-scale changes in gene expression, resulting in decreased expression of many nearby genes. As a comparison, we showed that deletion of a robust H3K27Ac peak not associated with CRC had minimal effects on the transcriptome. Interestingly, although there is no H3K27Ac peak in the E7 region in HEK293 cells, deletion of this region in HEK293 cells decreased expression of several of the same genes that were downregulated in HCT116 cells, including the MYC oncogene. Accordingly, deletion of E7 causes changes in cell culture assays in both HCT116 and HEK293 cells. In summary, we show that the effects on the transcriptome upon deletion of a distal regulatory element cannot be predicted by the size or presence of an H3K27Ac peak.