5,496 research outputs found

    Transient Response of a Mass Mounted on a Nonlinear, Strain-Rate Sensitive Element

    The object of the present transient response study was not to investigate the response of a nonlinear system alone, but to include the effect of a strain-rate sensitive, nonlinear restoring element.
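
    By way of illustration, a minimal single-degree-of-freedom sketch of the kind of system described: a mass on a restoring element whose stiffness depends on both displacement and strain rate. The force law, coefficients, and loading below are assumptions for illustration, not the element studied.

```python
# Illustrative model: mass on a nonlinear, strain-rate sensitive restoring element.
# The force law and all parameter values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

m = 1.0       # mass
k0 = 100.0    # linear stiffness
alpha = 50.0  # cubic (nonlinear) stiffness coefficient
beta = 0.2    # strain-rate sensitivity coefficient (assumed form)

def restoring_force(x, xdot):
    # Nonlinear spring whose effective stiffness grows with strain rate.
    return (k0 * x + alpha * x**3) * (1.0 + beta * abs(xdot))

def rhs(t, y):
    x, xdot = y
    return [xdot, -restoring_force(x, xdot) / m]

# Transient response to an impulsive load, modeled as an initial velocity.
sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 1.0], max_step=1e-3)
print(sol.y[0].max())  # peak displacement during the transient
```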

    Asclepios, M.D.? The Ancient Greeks and Integrative Medicine

    The healing at the Sanctuaries of Asclepios in antiquity was thought to occur through divine intervention, so it is often assumed today that any healing which took place was a product of ancient spirituality or had no legitimate medical foundation. The practices in the temples remain obscure, with Pausanias, Aristophanes, Aelius Aristides, steles, and votive offerings providing the bulk of the evidence. Given the limited evidence of what occurred in these sanctuaries, healing at the Asclepieia is analyzed through a modern Integrative Medicine lens, specifically showing how techniques similar to optimal healing environments, hypnosis, and imagery were heavily relied upon in antiquity, revealing the medical legitimacy of these practices at the Asclepieia.

    Paradigm shift: how the evolution of two generations of home consoles, arcades, and computers influenced American culture, 1985-1995.

    As of 2016, unlike many other popular media forms in the United States, video games possess a unique influence, one that has gained not only large, widespread appeal but also a distinct cultural identity created by millions of fans both stateside and around the world. Yet despite these significant contributions, the history of gaming after the Atari shock, outside of the arcade golden age of the early 1980s, goes largely unexamined, as many historians simply refuse to discuss the topic for trivial reasons, leaving a noticeable gap in the overall history. One such aspect not covered by the majority of the scholarship, and the primary focus of this thesis, is the argument that the history of early modern video games in the North American market did not originate during the age of Atari in the 1970s and early 1980s. Instead, the real genesis of today's market and popular gaming culture began with the creation and establishment of the third and fourth generations of video games, which firmly solidified gaming as both a multi-billion-dollar industry and an accepted form of entertainment in the United States. This project focuses on the ten-year resurrection of the US video game industry from 1985 to 1995. Written as a case study, it examines the three main popular hardware mediums of the late 1980s and 1990s from a combined business, cultural, and technological standpoint, set against the current events of the time. This evaluation of home consoles, personal computers, and coin-operated arcade machines shows how gaming in America transformed itself from a perceived fad into a serious multi-billion-dollar industry while slowly gaining popular acceptance. Furthermore, the study examines the country's love-hate relationship with gaming by looking at reactions to a Japanese-dominated market, the rise of popular computer gaming, the influence of the bit wars, and the issue of violence that aided in the establishment of the Entertainment Software Rating Board (ESRB). To undertake such a massive endeavor, the project draws on sources that include newspapers, magazine articles, US government documents, scholarly articles, video game manuals, commercials, and popular websites. Another vital source was firsthand experience playing several of these popular video games from the decades in question, on consoles including the Nintendo Entertainment System, Super Nintendo, and Genesis, as well as on home computers and in several notable arcade titles. The project's four main chapters serve as a historical view of the neglected video game industry during the third and fourth generations of gaming and the influence it possesses in the United States... 'Paradigm Shift...' examines the often-overlooked early modern history of video games from 1985 to 1995 and how they went on to become a larger part of American culture. Each chapter explains the growing influence gaming has had through home consoles, computers, and arcades in the US market, and in turn shows the origins of today's modern gaming market... The significance of 'Paradigm Shift...' comes down to one word: acceptance. Despite the controversy it generated before and during the ten critical years of its rebirth, what the gaming industry did right was to break the notion that video games were simply a popular craze. Unlike the second generation, which only fed this belief, the third and fourth generations of gaming proved the assumption wrong. With countless successful launches of influential games across the decade, video games slowly gained the acceptance of gamers and non-gamers alike, allowing gaming to ingrain itself in American culture. By 1995, the foundation of both the modern gaming industry and its culture was in place, and it would only grow as the years progressed, thanks to the efforts of Nintendo, Sega, and countless other developers and licensees that kept video games from falling by the wayside during this period of growth and uncertainty.

    An Autumn Day


    Some Aspects of Non-Darcy Behavior of Gas Flow In Wood

    A study was undertaken to determine whether the Klinkenberg equation for gas flow through porous media could be applied to a wide variety of wood species. When the results proved negative, a second study was initiated to determine whether the cause of the nonconformities could be traced to turbulence. The combined experiments indicated that neither molecular slippage, as described by the Klinkenberg equation, nor turbulence can adequately explain the reduction in apparent permeability with increasing mean pressure.
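
    For reference, the Klinkenberg relation tested here expresses apparent gas permeability as a linear function of reciprocal mean pressure, k_a = k_inf * (1 + b / p_mean). A minimal sketch of how measured data might be checked against it follows; the data values are made up for illustration and are not from the study.

```python
# Fit the Klinkenberg relation k_a = k_inf * (1 + b / p_mean) to apparent
# permeability vs. reciprocal mean pressure. All data values are illustrative.
import numpy as np

p_mean = np.array([0.2, 0.4, 0.8, 1.6, 3.2])   # mean pressure (atm), hypothetical
k_app = np.array([9.1, 6.4, 5.0, 4.3, 3.9])    # apparent permeability, hypothetical units

# Linear form: k_a = k_inf + (k_inf * b) * (1 / p_mean)
slope, k_inf = np.polyfit(1.0 / p_mean, k_app, 1)
b = slope / k_inf                              # Klinkenberg slip factor

residuals = k_app - k_inf * (1.0 + b / p_mean)
print(f"k_inf = {k_inf:.2f}, b = {b:.2f}, max residual = {abs(residuals).max():.2f}")
# Systematic residuals would point to non-Darcy effects (e.g., turbulence)
# that the slip-flow correction alone cannot explain.
```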

    Low speed and angle of attack effects on sonic and near-sonic inlets

    Tests of the Quiet, Clean Short-Haul Experimental Engine (QCSEE) were conducted to determine the effects of forward velocity and angle of attack on sonic and near-sonic inlet aerodynamic performance penalties and acoustic suppression characteristics. The tests demonstrate that translating centerbody and radial vane sonic inlets, and QCSEE high throat Mach number inlets, can be designed to operate effectively at forward speed and moderate angle of attack with good performance and noise suppression capability. The test equipment and procedures used in conducting the evaluation are described. Results of the tests are presented in tabular form.

    Economic Evaluation of Membrane Systems for Large Scale Capture and Storage of CO2 Mixtures

    The capture and storage of CO2 (CCS) as a greenhouse gas mitigation option is becoming an increasingly important priority for Australian industry. Membrane-based CO2 removal systems can provide a cost-effective, low-maintenance approach to removing CO2 from gas streams. This study examines the effect of membrane characteristics and operating parameters on CCS costs using economic models developed by UNSW for any source-sink combination. The total sequestration costs per tonne of CO2 avoided for separation, transport, and storage are compared for the separation of CO2 from coal-fired power plants and from natural gas processing. A cost-benefit analysis indicates that sequestration of high-purity gases is dominated by compression costs, which can be offset by utilising membranes of higher selectivity coupled with higher permeability to reduce the required transmembrane pressure.
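
    For context, the standard "cost per tonne of CO2 avoided" metric used in comparisons of this kind relates the extra levelised cost of the capture case to the emissions it avoids. The sketch below uses placeholder numbers, not figures from the study.

```python
# Standard "cost of CO2 avoided" metric for comparing capture options.
# All input values are placeholders, not results from the study.

def cost_of_co2_avoided(lcoe_capture, lcoe_ref, intensity_ref, intensity_capture):
    """Cost per tonne of CO2 avoided ($/tCO2).

    lcoe_*      levelised cost of electricity ($/MWh)
    intensity_* emission intensity (tCO2/MWh)
    """
    return (lcoe_capture - lcoe_ref) / (intensity_ref - intensity_capture)

# Hypothetical coal-fired plant with and without membrane capture.
print(cost_of_co2_avoided(lcoe_capture=95.0, lcoe_ref=60.0,
                          intensity_ref=0.90, intensity_capture=0.15))
# -> ~46.7 $/tCO2 avoided; in a full source-sink assessment the capture-case
#    cost would also fold in compression, transport, and storage.
```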

    Apollo experience report: Development of guidance targeting techniques for the command module and launch vehicle

    The development of the guidance targeting techniques for the Apollo command module and launch vehicle is discussed for four types of maneuvers: (1) translunar injection, (2) translunar midcourse, (3) lunar orbit insertion, and (4) return to earth. The development of real-time targeting programs for these maneuvers and the targeting procedures they represent are discussed. The material is intended to trace the historical development of the targeting techniques required to meet the defined target objectives and to illustrate the solutions to problems encountered during that development.

    Astronomy in the Cloud: Using MapReduce for Image Coaddition

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection and classification, and moving-object tracking. Since such studies benefit from the highest quality data, methods such as image coaddition (stacking) will be a critical preprocessing step prior to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources or transient objects, these data streams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this paper we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data is partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources, e.g., Amazon's EC2. We report on our experience implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multi-terabyte imaging dataset provides a good testbed for algorithm development since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image coaddition to the MapReduce framework. Then we describe a number of optimizations to our basic approach and report experimental results comparing their performance.
    Comment: 31 pages, 11 figures, 2 tables
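
    A minimal sketch of how image coaddition can be cast in the MapReduce model described above: the map step emits per-pixel contributions keyed by a sky tile, and the reduce step combines all contributions to each tile into a stacked value. The tiling scheme, record format, and weights are assumptions for illustration, not the paper's actual SDSS/Hadoop pipeline, and the shuffle phase is simulated locally rather than run on a cluster.

```python
# Illustrative map/reduce decomposition of image coaddition (stacking).
# Tiling, record format, and weights are assumptions, not the paper's pipeline.
from collections import defaultdict

def map_image(image):
    """Map step: emit (sky_tile, (weighted_flux, weight)) for each pixel."""
    for (ra, dec), flux, weight in image["pixels"]:
        tile = (round(ra, 2), round(dec, 2))   # crude sky tiling for illustration
        yield tile, (flux * weight, weight)

def reduce_tile(tile, values):
    """Reduce step: weighted mean of all contributions that landed on a tile."""
    flux_sum = sum(f for f, _ in values)
    weight_sum = sum(w for _, w in values)
    return tile, flux_sum / weight_sum if weight_sum else 0.0

# Local simulation of the shuffle/group-by-key phase; Hadoop distributes this
# across the nodes that hold the input images.
images = [
    {"pixels": [((150.001, 2.001), 10.0, 1.0), ((150.011, 2.001), 12.0, 0.5)]},
    {"pixels": [((150.002, 2.001), 11.0, 2.0)]},
]
grouped = defaultdict(list)
for img in images:
    for tile, value in map_image(img):
        grouped[tile].append(value)

coadd = dict(reduce_tile(t, v) for t, v in grouped.items())
print(coadd)  # coadded (stacked) flux per sky tile
```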