3,453 research outputs found

    Television Histories: Shaping Collective Memory in the Media Age

    Get PDF
    From Ken Burns’s documentaries to historical dramas such as Roots, from A&E’s Biography series to CNN, television has become the primary source of historical information for tens of millions of Americans today. Why has television become such a respected authority? What falsehoods enter our collective memory as truths? How is one to know what is real and what is imagined—or ignored—by producers, directors, or writers? Gary Edgerton and Peter Rollins have collected a group of essays that answer these and many other questions. The contributors examine the full spectrum of historical genres, but also institutions such as the History Channel and production histories of such series as The Jack Benny Show, which ran for fifteen years. The authors explore the tensions between popular history and professional history, and the tendency of some academics to declare the past “off limits” to nonscholars. Several of them point to the tendency for television histories to embed current concerns and priorities within the past, as in such popular shows as Quantum Leap and Dr. Quinn, Medicine Woman. The result is an insightful portrayal of the power television possesses to influence our culture.
    Winner of the 2001 Ray and Pat Browne Award for Outstanding Textbook given by the Popular Culture Association.
    "Offers much food for thought in this highly visual age." —Alliance (OH) Review
    "As an example of well-reasoned, original research, Television Histories makes an important contribution to the study of the medium." —Anthony Slide, Classic Images
    "This book is even more timely and provocative because much of the material discussed is being rebroadcast now that digital television is opening even more new channels." —Choice
    "An engrossing collection that slides the thorny subject of television, history, and memory under a microscope. . . . Digs deep into a contemporary phenomenon, and its many conclusions are right on target." —Film & History
    "Helps those of us who care about history think more clearly about how television can shape historical thinking among our friends, neighbors, and students." —Florida Historical Quarterly
    "Television Histories, a pioneer work, weaves an inspired and informed interdisciplinary analysis of television and history. The chapters are enlightening, readable, and entertaining; the editors and the authors have produced a work that enriches and strengthens the study of film and history." —Michael Schoenecke
    "The stuff serious thinkers in a media age should read, mark and remember." —Rockland (ME) Courier-Gazette
    "An insightful and important addition to the literature that sheds light on an often controversial subject for professional historians." —Southern Historian
    "Most of the essays are likely to be of considerable value to any attentive student of television." —Television Quarterly
    "Working from the thesis that people learn about history through television more than any other medium, Edgerton and Rollins look at what TV subliminally teaches us by what it shows and does not show." —Variety

    Converting InSAR- and GNSS-derived strain rate maps into earthquake hazard models for Anatolia

    Get PDF
    Geodetic measurements of crustal deformation rates can provide important constraints on a region’s earthquake hazard that purely seismicity-based hazard models may miss. For example, geodesy might show that strain (or a deficit of seismic moment) is accumulating faster than the total rate at which known earthquakes have released it, implying that the long-term hazard may include larger earthquakes with long recurrence intervals (and/or temporal increases in seismicity rates). Conversely, the moment release rate in recent earthquakes might surpass the geodetic moment buildup rate, suggesting that the long-term-average earthquake activity and hazard may in fact be more quiescent than might be estimated using the earthquake history alone. Such geodetic constraints, however, have traditionally been limited by poor spatial and/or temporal sampling, resulting in ambiguities about how the lithosphere accommodates strain in space and time that can bias estimates of the resulting hazard. High-resolution deformation maps address this limitation by imaging (rather than presuming and/or modelling) where and how deformation takes place. These maps are now within reach for the Alpine-Himalayan Belt – one of the most populous and seismically hazardous regions on Earth – thanks to the COMET-LiCSAR InSAR processing system, which performs large-scale automated processing and timeseries analysis of Sentinel-1 data provided by the EU’s Copernicus programme. We are pairing LiCSAR products with GNSS data to generate high-resolution maps of interseismic surface motion (velocity) and strain rate for the Anatolia region. Here we quantitatively investigate what these strain rate distributions imply for seismic hazard in this region, using two approaches in parallel.
    First, building on previous work, we develop a fully probability-based method to pair geodesy and seismic catalogs to estimate the recurrence times of large, moderate and small earthquakes in a given region. We assume that earthquakes 1) obey a power-law magnitude-frequency distribution up to a maximum magnitude and 2) collectively release seismic moment at the same rate that we estimate it is accumulating from the strain rate maps. Iterating over various magnitude-frequency distributions and their governing parameters, and formally incorporating uncertainties in moment buildup rate and the magnitudes of recorded earthquakes, we build a probabilistic long-term-average earthquake model for Anatolia as a whole, including the most likely maximum earthquake magnitude. Second, we estimate how seismic hazard may vary from place to place within Anatolia. Using insights from dislocation models, we identify two key signatures of a locked fault in a strain rate field, allowing us to convert the newly developed strain maps to “effective fault maps.” Additionally, we explore how characteristics of earthquake magnitude-frequency distributions may scale with the rate of strain (or moment) buildup, and what these scaling relations imply for the distribution of hazard in Anatolia, using the seismic catalog to evaluate these hypotheses. We also explore the implications of our findings for seismic hazard and address how to expand these approaches to the Alpine-Himalayan Belt as a whole.
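    The moment-balance step in the first approach can be made concrete with a short sketch. The Python fragment below is illustrative only and is not the authors' code: it assumes a truncated Gutenberg-Richter magnitude-frequency distribution and the Hanks and Kanamori moment-magnitude relation, uses hypothetical values for the geodetic moment buildup rate, b-value and magnitude bounds, and omits the uncertainty propagation described in the abstract.

    import math

    def gr_a_value(moment_rate, b, m_min, m_max):
        # Gutenberg-Richter a-value such that events in [m_min, m_max], with
        # annual rate density n(M) = b ln(10) 10^(a - b M) and seismic moments
        # M0 = 10^(1.5 M + 9.05) N m (Hanks & Kanamori), collectively release
        # moment at `moment_rate` (N m / yr). Closed-form integral, valid for b != 1.5.
        unit = (b / (1.5 - b)) * 10.0 ** 9.05 * (
            10.0 ** ((1.5 - b) * m_max) - 10.0 ** ((1.5 - b) * m_min))
        return math.log10(moment_rate / unit)

    def recurrence_years(m, a, b, m_max):
        # Mean time between events with Mw >= m for a truncated G-R distribution.
        n_ge = 10.0 ** (a - b * m) - 10.0 ** (a - b * m_max)
        return 1.0 / n_ge

    # Hypothetical inputs: geodetic moment buildup rate and G-R parameters.
    moment_rate = 1.0e19            # N m / yr
    b, m_min, m_max = 1.0, 4.5, 7.8
    a = gr_a_value(moment_rate, b, m_min, m_max)
    for m in (6.0, 7.0, 7.5):
        print(f"Mw >= {m}: roughly {recurrence_years(m, a, b, m_max):.0f} yr apart")

    Iterating such a calculation over many candidate b-values and maximum magnitudes, weighted by how well each reproduces the observed catalog, is the essence of the probabilistic model described in the abstract.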

    Virginia

    Full text link

    Parametric Amplification of Nonlinear Response of Single Crystal Niobium

    Full text link
    Giant enhancement of the nonlinear response of a single crystal Nb sample, placed in a pumping ac magnetic field, has been observed experimentally. The observed amplitude of the output signal is about three orders of magnitude higher than that seen without parametric pumping. The theoretical analysis, based on the extended double well potential model, provides a qualitative explanation of the experimental results as well as new predictions of two bifurcations for specific values of the pumping signal. Comment: 6 pages, 10 figures
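    As a generic illustration of the kind of dynamics invoked here (not the authors' specific extended double-well model, whose exact form is not given in the abstract), the response of a coordinate x in a double-well potential to a weak probe and a parametric pump can be written as

    \ddot{x} + \gamma \dot{x} - \alpha \left[ 1 + \epsilon \cos(2\omega t) \right] x + \beta x^{3} = F \cos(\omega t),

    where \gamma is the damping, \alpha, \beta > 0 define the double-well potential V(x) = -\alpha x^2/2 + \beta x^4/4, F \cos(\omega t) is the weak probe signal, and the \epsilon \cos(2\omega t) term represents the pumping ac field modulating the potential at twice the probe frequency. In such models the probe response grows sharply, and bifurcations appear, once \epsilon exceeds threshold values set by the damping.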

    Model Systems of Human Intestinal Flora, to Set Acceptable Daily Intakes of Antimicrobial Residues

    Get PDF
    The veterinary use of antimicrobial drugs in food-producing animals may result in residues in food that might modify the consumer gut flora. This review compares three model systems that maintain a complex flora of human origin: (i) human flora associated (HFA) continuous-flow cultures in chemostats, (ii) HFA mice, and (iii) human volunteers. The "No Microbial Effect Level" of an antibiotic on the human flora, measured in one of these models, is used to set the acceptable daily intake (ADI) for human consumers. Human volunteer trials are the most relevant for setting a microbiological ADI and may be considered the "gold standard"; however, they are very expensive and raise ethical problems. HFA chemostats are controlled systems, but the tetracycline ADI calculated from a chemostat study is far above the result of a human study. HFA mouse studies are less expensive and better controlled than human trials, and the tetracycline ADI derived from HFA mouse studies is close to the ADI obtained directly in human volunteers.
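    The scaling from a no-effect level in one of these models to an ADI is not spelled out in the abstract; the sketch below only illustrates the general shape of the commonly used calculation. The function name and the default values (a nominal 220 g of colonic content, a 60 kg consumer, and the whole oral dose reaching the colon) are placeholders, not numbers taken from this review.

    def microbiological_adi(no_effect_conc, colon_content_g=220.0,
                            fraction_available=1.0, body_weight_kg=60.0):
        # Scale a "no microbial effect" concentration (ug per g of gut contents)
        # measured in an HFA chemostat, HFA mouse or volunteer study to an
        # acceptable daily intake in ug per kg body weight per day.
        return (no_effect_conc * colon_content_g) / (fraction_available * body_weight_kg)

    # Example with a hypothetical no-effect concentration of 2 ug/g:
    print(f"ADI ~ {microbiological_adi(2.0):.1f} ug/kg bw/day")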

    Heat and water transport in soils and across the soil-atmosphere interface: 1. Theory and different model concepts

    Get PDF
    Evaporation is an important component of the soil water balance. It is composed of water flow and transport processes in a porous medium that are coupled with heat fluxes and free air flow. This work provides a comprehensive review of model concepts used in different research fields to describe evaporation. The concepts range from nonisothermal two-phase flow with two-component transport in the porous medium, coupled with one-phase flow and two-component transport in the free air flow, to isothermal liquid water flow in the porous medium with upper boundary conditions defined either by a potential evaporation flux, when available energy and transfer to the free air flow are limiting, or by a critical threshold water pressure, when soil water availability is limiting. The latter approach corresponds to the classical Richards equation with mixed boundary conditions. We compare the different approaches on a theoretical level by identifying the underlying simplifications made for the different compartments of the system (porous medium, free flow, and their interface) and by discussing how processes not explicitly considered are parameterized. The simplifications can be grouped into three sets, depending on whether lateral variations in vertical fluxes are considered, whether flow and transport in the air phase of the porous medium are considered, and how the interaction at the interface between the free flow and the porous medium is represented. The consequences of these simplifications are illustrated by numerical simulations in an accompanying paper.
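    For reference, the isothermal approach mentioned last corresponds, in one vertical dimension, to the Richards equation

    \frac{\partial \theta(h)}{\partial t} = \frac{\partial}{\partial z} \left[ K(h) \left( \frac{\partial h}{\partial z} + 1 \right) \right],

    with z the vertical coordinate (positive upward), \theta the volumetric water content, h the soil water pressure head and K(h) the hydraulic conductivity. The mixed upper boundary condition switches between a flux and a pressure condition: the surface flux is set equal to the potential evaporation rate E_{pot} as long as the surface pressure head stays above a critical value h_{crit}; once h reaches h_{crit}, the condition h(z_{surf}, t) = h_{crit} is imposed instead and evaporation becomes soil-limited. Sign conventions vary between papers, so this is a generic statement rather than the exact form used here.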

    Characterization of the stretched exponential trap-time distributions in one-dimensional coupled map lattices

    Full text link
    Stretched exponential distributions and relaxation responses are encountered in a wide range of physical systems such as glasses, polymers and spin glasses. As found recently, this type of behavior also occurs in the distribution function of certain trap times in a number of coupled dynamical systems. We analyze a one-dimensional mathematical model of coupled chaotic oscillators which reproduces an experimental set-up of coupled diode resonators, and we identify the necessary ingredients for stretched exponential distributions. Comment: 8 pages, 8 figures
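    The setting can be illustrated with a generic sketch that is not necessarily the model or the trap definition used in the paper: a one-dimensional lattice of diffusively coupled logistic maps, with the "trap time" taken as the number of consecutive iterations a site spends below an arbitrary threshold. A stretched-exponential form P(tau) ~ exp[-(tau/tau0)^beta] would then be fitted to the tail of the resulting histogram.

    import numpy as np

    L, steps, eps, r = 256, 20000, 0.3, 4.0   # sites, iterations, coupling, map parameter
    rng = np.random.default_rng(0)
    x = rng.random(L)

    def step(x):
        f = r * x * (1.0 - x)                 # local logistic map
        # diffusive nearest-neighbour coupling with periodic boundaries
        return (1.0 - eps) * f + 0.5 * eps * (np.roll(f, 1) + np.roll(f, -1))

    trap_lengths = []
    run = np.zeros(L, dtype=int)              # current trapping streak per site
    for _ in range(steps):
        x = step(x)
        inside = x < 0.5                      # "trapped" region (arbitrary threshold)
        ended = (~inside) & (run > 0)         # streaks that just ended
        trap_lengths.extend(run[ended].tolist())
        run = np.where(inside, run + 1, 0)

    tau, counts = np.unique(trap_lengths, return_counts=True)
    print(list(zip(tau[:10], counts[:10])))   # raw trap-time histogram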

    Low energy polarization sensitivity of the Gas Pixel Detector

    Full text link
    An X-ray photoelectric polarimeter based on the Gas Pixel Detector has been proposed for inclusion in many upcoming space missions, to fill a gap of about 30 years since the first (and to date only) positive measurement of polarized X-ray emission from an astrophysical source. The estimated sensitivity of the current prototype peaks at an energy of about 3 keV, but the lack of readily available polarized sources in this energy range has so far prevented measurement of the detector's polarimetric performance. In this paper we present the measurement of the Gas Pixel Detector's polarimetric sensitivity at energies of a few keV, together with the new, light, compact and transportable polarized source that was devised and built for this purpose. Polarized photons are produced from unpolarized radiation generated with an X-ray tube by means of Bragg diffraction at nearly 45 degrees. The use of mosaic graphite and flat aluminum crystals allows the production of nearly completely polarized photons at 2.6, 3.7 and 5.2 keV from the diffraction of unpolarized continuum or line emission. The measured modulation factor of the Gas Pixel Detector at these energies is in good agreement with estimates derived from the Monte Carlo software that has so far been used to drive the development of the instrument and to estimate its low-energy sensitivity. The results demonstrate the excellent polarimetric performance of the Gas Pixel Detector at the energies where its peak sensitivity is expected. These measurements not only support our previous claims of high sensitivity but also confirm the feasibility of astrophysical X-ray photoelectric polarimetry. Comment: 15 pages, 12 figures. Accepted for publication in NIM
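    The quoted energies follow directly from Bragg's law at 45 degrees, E_n = n hc / (2 d sin(theta)). The short check below uses textbook lattice spacings (assumed here, not taken from the paper) for the graphite (002) and aluminum (111) planes; the modulation factor mentioned in the abstract is the usual amplitude (N_max - N_min)/(N_max + N_min) of the reconstructed photoelectron emission-angle histogram for fully polarized photons.

    import math

    HC_KEV_ANGSTROM = 12.3984                 # h * c in keV * Angstrom
    THETA = math.radians(45.0)

    def bragg_energy_kev(d_angstrom, order=1):
        # Photon energy diffracted at angle THETA by planes of spacing d.
        return order * HC_KEV_ANGSTROM / (2.0 * d_angstrom * math.sin(THETA))

    d_graphite_002 = 3.354                    # Angstrom
    d_aluminum_111 = 2.338                    # Angstrom

    print(f"graphite, 1st order: {bragg_energy_kev(d_graphite_002):.2f} keV")    # ~2.6
    print(f"aluminum, 1st order: {bragg_energy_kev(d_aluminum_111):.2f} keV")    # ~3.7
    print(f"graphite, 2nd order: {bragg_energy_kev(d_graphite_002, 2):.2f} keV") # ~5.2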

    ixpeobssim: a Simulation and Analysis Framework for the Imaging X-ray Polarimetry Explorer

    Get PDF
    ixpeobssim is a simulation and analysis framework, based on the Python programming language and the associated scientific ecosystem, developed specifically for the Imaging X-ray Polarimetry Explorer (IXPE). Given a source model and the response functions of the telescopes, it is designed to produce realistic simulated observations, in the form of event lists in FITS format, containing a strict superset of the information provided by standard IXPE level-2 files. The core ixpeobssim simulation capabilities are complemented by a full suite of post-processing applications that allow the implementation of complex, polarization-aware analysis pipelines and facilitate interoperation with the standard visualization and analysis tools traditionally used by the X-ray community. We emphasize that, although a significant part of the framework is specific to IXPE, the modular nature of the underlying implementation makes it potentially straightforward to adapt to different missions with polarization capabilities. Comment: 12 pages, 6 figures. Accepted for publication in SoftwareX; source code available at https://github.com/lucabaldini/ixpeobssi
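    Although the abstract does not describe the file layout, the kind of polarization-aware post-processing step it refers to can be sketched generically. The fragment below is a minimal illustration, not part of the ixpeobssim API: the HDU name "EVENTS" and the column name "DETPHI" are assumed placeholders for wherever the photoelectron emission angles are actually stored in a level-2-style event list.

    import numpy as np
    from astropy.io import fits

    def modulation_from_events(path, phi_column="DETPHI"):
        # Estimate the modulation amplitude and polarization angle from the
        # azimuthal distribution of photoelectron emission angles (radians).
        with fits.open(path) as hdul:
            phi = np.asarray(hdul["EVENTS"].data[phi_column], dtype=float)
        c = np.mean(np.cos(2.0 * phi))
        s = np.mean(np.sin(2.0 * phi))
        modulation = 2.0 * np.hypot(c, s)     # amplitude of the cos(2 phi) term
        angle = 0.5 * np.arctan2(s, c)        # polarization angle, radians
        return modulation, angle

    # Dividing the measured modulation by the instrument modulation factor
    # (taken from the response functions) would then give the polarization degree.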