
    Flow and fracture of ice and ice mixtures

    Frozen volatiles make up an important volume fraction of the low-density moons of the outer solar system. Understanding the tectonic history of the surfaces of these moons, as well as the evolution of their interiors, requires knowledge of the mechanical strength of these icy materials under the appropriate planetary conditions (temperature, hydrostatic pressure, strain rate). Ongoing laboratory research is being conducted to measure the mechanical properties of several different ices under conditions that faithfully reproduce conditions both at the moons' surfaces (generally low temperatures, down to about 100 K, and low pressures) and in the deep interiors (warmer temperatures, pressures up to thousands of atmospheres). Recent progress is reported in two different phases of the work: the rheology of ices in the NH3-H2O system at temperatures and strain rates lower than ever before explored, with application to the ammonia-rich moons of Saturn and Uranus; and the water ice I to ice II phase transformation, which not only applies directly to processes deep in the interiors of Ganymede and Callisto, but holds implications for deep terrestrial earthquakes as well.

    Internal Migration and Regional Population Dynamics in Europe: France Case Study

    The paper examines the patterns of internal migration and population change in France over recent decades at the département and commune scales. Regional population change is controlled by both natural increase and internal migration. There are two differing patterns of natural increase: north and east France has higher natural increase, while the south and west have lower. The geographic pattern of internal migration has changed substantially over the last 50 years, most dramatically in the Île-de-France, which showed the highest gains between 1954 and 1962 but the highest losses between 1975 and 1982. Urban growth, which was strong in the 1950s and 1960s, reversed in the 1970s in favour of small towns but recovered slightly in the last 20 years. Migration gains and losses show a quite complicated pattern of depopulation of city centres combined with slow suburbanisation and advanced periurbanisation. Periurbanisation is evident in the Paris region and in nearly all large urban agglomerations. Most other cities show suburbanisation or periurbanisation at various stages of development. Out-migration shows a clear division of the country into a northern part with higher rates, and a central and southern part with lower out-migration. This simple pattern is modified by higher out-migration from some cities such as Lyon or Clermont-Ferrand and from isolated rural communes scattered all over the country. Out-migration also has a regional dimension: there are shifts towards more attractive areas, in particular the Alpine region and the Mediterranean and Atlantic coasts. Analysis of migration between size bands of rural and urban units shows a significant deconcentration process, and a similar pattern characterises migration between population density bands. The general movement is down the urban/density band hierarchy, from higher to lower urban/density bands. Deep rural areas are not attractive and are excluded from the process of counterurbanisation. In addition, unemployment was found to have a strong and significant impact on migration behaviour. Analysis for 1990-1999 leads to a slight modification of this picture: a slow recovery of the central parts of the largest urban agglomerations and less differentiated patterns than in the 1980s. Deconcentration of the French population continues but is less powerful.

    Ground data systems resource allocation process

    The Ground Data Systems Resource Allocation Process at the Jet Propulsion Laboratory provides medium- and long-range planning for the use of Deep Space Network and Mission Control and Computing Center resources in support of NASA's deep space missions and Earth-based science. Resources consist of radio antenna complexes and associated data processing and control computer networks. A semi-automated system was developed that allows operations personnel to interactively generate, edit, and revise allocation plans spanning periods of up to ten years (as opposed to only two or three weeks under the manual system) based on the relative merit of mission events. It also enhances scientific data return. A software system known as the Resource Allocation and Planning Helper (RALPH) merges the conventional methods of operations research, rule-based knowledge engineering, and advanced database structures. RALPH employs a generic, highly modular architecture capable of solving a wide variety of scheduling and resource sequencing problems. The rule-based RALPH system has saved significant labor in resource allocation. Its successful use affirms the importance of establishing and applying event priorities based on scientific merit, and the benefit of continuity in planning provided by knowledge-based engineering. The RALPH system exhibits a strong potential for minimizing development cycles of resource and payload planning systems throughout NASA and the private sector.
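The core idea of allocating shared antenna time by the relative merit of mission events can be illustrated with a minimal sketch. This is not RALPH's actual rule-based algorithm; the mission names, priority scale, and greedy merit-first strategy below are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Request:
    mission: str
    priority: int  # higher value = greater scientific merit (assumed scale)
    start: int     # requested start hour
    end: int       # requested end hour (exclusive)

def allocate(requests, antennas=1):
    """Greedy merit-first allocation: consider requests in descending
    priority order and grant one only if some antenna is free for the
    request's entire time window."""
    schedule = [[] for _ in range(antennas)]  # granted (start, end) per antenna
    granted, denied = [], []
    for req in sorted(requests, key=lambda r: -r.priority):
        for slots in schedule:
            # free if the window overlaps none of the already-granted slots
            if all(req.end <= s or req.start >= e for s, e in slots):
                slots.append((req.start, req.end))
                granted.append(req.mission)
                break
        else:
            denied.append(req.mission)
    return granted, denied

# Hypothetical conflict: two high-merit requests overlap on one antenna.
granted, denied = allocate([
    Request("Voyager", 9, 0, 4),
    Request("Galileo", 7, 2, 6),
    Request("Magellan", 5, 4, 8),
])
# granted: ["Voyager", "Magellan"]; denied: ["Galileo"]
```

A real scheduler layers rule-based constraints (spacecraft visibility, antenna capabilities, maintenance windows) on top of this merit ordering, which is where the knowledge-engineering component of a system like RALPH comes in.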

    Resource Allocation Planning Helper (RALPH): Lessons learned

    The current task of the Resource Allocation Process includes the planning and apportionment of JPL's Ground Data System, composed of the Deep Space Network and Mission Control and Computing Center facilities. The addition of the data-driven, rule-based planning system, RALPH, has expanded the planning horizon from 8 weeks to 10 years and has resulted in large labor savings. Use of the system has also resulted in important improvements in science return through enhanced resource utilization. In addition, RALPH has been instrumental in supporting rapid turnaround for an increased volume of special "what-if" studies. The status of RALPH is briefly reviewed, with a focus on important lessons learned from the creation of a highly functional design team, through an evolutionary design and implementation period in which an AI shell was selected, prototyped, and ultimately abandoned, and through the fundamental changes to the very process that spawned the tool kit. Principal topics include the proper integration of software tools within the planning environment, the transition from prototype to delivered software, changes in planning methodology as a result of evolving software capabilities, and creation of the ability to develop and process generic requirements to allow planning flexibility.

    What Is Grounded Theory Good For?

    Grounded theory (GT) made its appearance in the social sciences in 1967 with publication of Barney Glaser and Anselm Strauss’s The Discovery of Grounded Theory. Glaser and Strauss advocated for systematically discovering and interpreting empirical data to generate theory, in contrast to testing or verifying theory derived from a priori assumptions. In the intervening 50 years, GT has spread into a wide range of fields including journalism and mass communication. Variations of the method have been developed, and debate has ensued about its relation to positivism and constructivism as well as pragmatism and postmodernism, and about its value for critical race theory, feminist theory, and indigenous and other critical methods and theories. When and how is it best used? Is it misunderstood or misused by some? Is it more than a method? We asked senior scholars with expertise in GT to reflect on these issues, beginning with Vivian Martin, coeditor with Astrid Gynnild of Grounded Theory: The Philosophy, Method, and Work of Barney Glaser, published by BrownWalker Press (2012). Martin, professor and chair of the Department of Journalism at Central Connecticut State University, argues the method has been misunderstood even by those who use it, often conflated with qualitative studies, with only two GT studies published in journalism and mass communication. It is practical and subversive, she observes, with the ability to develop new concepts and link ideas across disciplines. She advocates a closer adherence to Glaser’s original intentions for the method. Responding to Martin is Clifton Scott, associate professor in the Department of Communication Studies at the University of North Carolina at Charlotte. Scott is the author of “Grounded Theory” in Encyclopedia of Communication Theory, edited by Steven Littlejohn and Sonja Foss, published by SAGE (2009). While agreeing with Martin that the name often is misapplied, Scott argues for less preoccupation with policing the purity of the method in favor of developing multiple approaches appropriate to it as a methodology. Reacting to both Martin and Scott, Bonnie Brennen critiques the original GT approach as neglecting “methodological self-consciousness,” which would uncover researchers’ “theoretical assumptions, power relations, class positions and personal experiences.” Brennen, the Nieman Professor of Journalism in the Diederich College of Communication at Marquette University, is the author of Qualitative Research Methods for Media Studies, second edition, published by Routledge in 2017. Finally, Meenakshi Gigi Durham, responding to all three, expresses optimism about GT’s potential to spur new inquiry through exploration of social life, while she proposes that, like all theory, it be seen as necessarily dynamic and evolutionary. Durham is a professor in the School of Journalism and Mass Communication at the University of Iowa and associate dean of the College of Liberal Arts and Sciences. She is the editor with Douglas M. Kellner of Media and Cultural Studies: Keyworks, second edition, published by Blackwell (2011). Lana Rakow, Associate Editor; Louisa Ha, Editor.

    Creep of ice: Further studies

    Detailed studies have been done of ice creep as related to the icy satellites Ganymede and Callisto. Included were: (1) the flow of the high-pressure water ices II, III, and V, and (2) frictional sliding of ice Ih. Work was also begun on the study of the effects of impurities on the flow of ice. Test results are summarized.

    Anomalous relaxation kinetics of biological lattice-ligand binding models

    We discuss theoretical models for the cooperative binding dynamics of ligands to substrates, such as dimeric motor proteins to microtubules or more extended macromolecules like tropomyosin to actin filaments. We study the effects of steric constraints, size of ligands, binding rates, and interaction between neighboring proteins on the binding dynamics and binding stoichiometry. Starting from an empty lattice, the binding dynamics goes, quite generally, through several stages. The first stage represents fast initial binding closely resembling the physics of random sequential adsorption processes. Typically this initial process leaves the system in a metastable locked state with many small gaps between blocks of bound molecules. In a second stage the gaps annihilate slowly as the ligands detach and reattach. This results in an algebraic decay of the gap concentration and interesting scaling behavior. Upon identifying the gaps with particles, we show that the dynamics in this regime can be explained by mapping it onto various reaction-diffusion models. The final approach to equilibrium shows some interesting dynamic scaling properties. We also discuss the effect of cooperativity on the equilibrium stoichiometry, and its consequences for the interpretation of biochemical and image reconstruction results.
    Comment: REVTeX, 20 pages, 17 figures; review, to appear in Chemical Physics; v2: minor correction
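The fast initial stage described above, irreversible random sequential adsorption (RSA) with steric exclusion, can be sketched in a few lines. This toy model is an assumption-laden illustration, not the paper's actual model: dimeric ligands (length 2) land at random on a 1D lattice and bind only if both sites are empty, so the system locks into a jammed state with isolated single-site gaps (the classic Flory/Rényi jamming coverage for dimers is 1 - e^(-2) ≈ 0.865).

```python
import random

def simulate_rsa(lattice_size=1000, ligand_size=2, attempts=20000, seed=1):
    """Irreversible random sequential adsorption of rigid ligands of
    length `ligand_size` onto a 1D lattice: each attempt picks a random
    start site and binds only if every covered site is empty (steric
    constraint). Without detachment, the lattice jams with small gaps."""
    random.seed(seed)
    lattice = [False] * lattice_size
    bound = 0
    for _ in range(attempts):
        start = random.randrange(lattice_size - ligand_size + 1)
        if not any(lattice[start:start + ligand_size]):
            for i in range(start, start + ligand_size):
                lattice[i] = True
            bound += 1
    coverage = sum(lattice) / lattice_size
    return coverage, bound

coverage, bound = simulate_rsa()
# coverage ends up near the dimer jamming limit of about 0.865;
# the residual gaps only anneal once detachment/reattachment is allowed,
# which is the slow second stage discussed in the abstract.
```

Adding a small detachment rate and re-running would let the gaps diffuse and annihilate, the stage the abstract maps onto reaction-diffusion models.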

    Magneto-x-ray effects in transition-metal alloys

    We present a theory that combines the relativistic spin-polarized version of the Korringa-Kohn-Rostoker coherent-potential approximation theory with the macroscopic theory of magneto-optical effects, enabling us to calculate magneto-x-ray effects from first principles. The theory is illustrated by calculations of the Faraday and Kerr rotations and ellipticities for transition-metal alloys.