4,725 research outputs found

    Flow and fracture of ice and ice mixtures

    Frozen volatiles make up an important volume fraction of the low-density moons of the outer solar system. Understanding the tectonic history of the surfaces of these moons, as well as the evolution of their interiors, requires knowledge of the mechanical strength of these icy materials under the appropriate planetary conditions (temperature, hydrostatic pressure, strain rate). Ongoing lab research is being conducted to measure the mechanical properties of several different ices under conditions that faithfully reproduce conditions both at the moons' surfaces (generally low temperatures, down to about 100 K, and low pressures) and in the deep interiors (warmer temperatures, pressures to thousands of atmospheres). Recent progress is reported in two different phases of the work: the rheology of ices in the NH3-H2O system at temperatures and strain rates lower than ever before explored, with application to the ammonia-rich moons of Saturn and Uranus; and the water ice I to ice II phase transformation, which not only applies directly to processes deep in the interiors of Ganymede and Callisto, but holds implications for deep terrestrial earthquakes as well.
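
    For background (not stated in the abstract itself), steady-state creep in ice rheology experiments of this kind is conventionally summarized by a Glen-type power law with Arrhenius temperature dependence:

        \dot{\varepsilon} = A \, \sigma^{n} \exp\!\left(-\frac{Q}{RT}\right)

    where \dot{\varepsilon} is strain rate, \sigma is differential stress, Q is an activation energy, R the gas constant, T absolute temperature, and A and n are material parameters fitted from experiments of the kind described.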

    Internal Migration and Regional Population Dynamics in Europe: France Case Study

    The paper examines the patterns of internal migration and population change in France over recent decades at the département and commune scales. Regional population change is controlled by both natural increase and internal migration. There are two differing patterns of natural increase: the north and east of France have higher natural increase, while the south and west have lower. The geographic pattern of internal migration has changed substantially over the last 50 years, most dramatically in the Île-de-France, which showed the highest gains between 1954 and 1962 but the highest losses between 1975 and 1982. Urban growth, which was strong in the 1950s and 1960s, reversed in the 1970s in favour of small towns, but recovered slightly in the last 20 years. Migration gains and losses show a quite complicated pattern of depopulation of city centres combined with slow suburbanisation and advanced periurbanisation. Periurbanisation is evident in the Paris region and in nearly all large urban agglomerations; most other cities show suburbanisation or periurbanisation at various stages of development. Out-migration shows a clear division of the country into a northern part with higher rates and a central and southern part with lower rates. This simple pattern is modified by higher out-migration from some cities, such as Lyon and Clermont-Ferrand, and from isolated rural communes scattered across the country. Out-migration also has a regional dimension: there are shifts towards more attractive areas, in particular the Alpine region and the Mediterranean and Atlantic coasts. Analysis of migration between size bands of rural and urban units shows a significant deconcentration process, and a similar pattern characterises migration between population density bands: the general movement is down the urban/density band hierarchy, from higher to lower bands. Deep rural areas are not attractive and are excluded from the process of counterurbanisation. In addition, unemployment was found to have a strong impact on migration behaviour. Analysis for 1990-1999 modifies this picture slightly: a slow recovery of the central parts of the largest urban agglomerations and less differentiated patterns than in the 1980s. Deconcentration of the French population continues but is less powerful.

    Resource Allocation Planning Helper (RALPH): Lessons learned

    The current task of the Resource Allocation Process includes the planning and apportionment of JPL's Ground Data System, composed of the Deep Space Network and Mission Control and Computing Center facilities. The addition of the data-driven, rule-based planning system RALPH has expanded the planning horizon from 8 weeks to 10 years and has resulted in large labor savings. Use of the system has also resulted in important improvements in science return through enhanced resource utilization. In addition, RALPH has been instrumental in supporting rapid turnaround for an increased volume of special what-if studies. The status of RALPH is briefly reviewed, and the review focuses on important lessons learned from the creation of a highly functional design team, through an evolutionary design and implementation period in which an AI shell was selected, prototyped, and ultimately abandoned, and through the fundamental changes to the very process that spawned the tool kit. Principal topics include proper integration of software tools within the planning environment, the transition from prototype to delivered software, changes in the planning methodology as a result of evolving software capabilities, and creation of the ability to develop and process generic requirements to allow planning flexibility.

    Investigations in geostatistical simulation as an aid to mine planning


    Ground data systems resource allocation process

    The Ground Data Systems Resource Allocation Process at the Jet Propulsion Laboratory provides medium- and long-range planning for the use of Deep Space Network and Mission Control and Computing Center resources in support of NASA's deep space missions and Earth-based science. Resources consist of radio antenna complexes and associated data processing and control computer networks. A semi-automated system was developed that allows operations personnel to interactively generate, edit, and revise allocation plans spanning periods of up to ten years (as opposed to only two or three weeks under the manual system), based on the relative merit of mission events. It also enhances scientific data return. A software system known as the Resource Allocation and Planning Helper (RALPH) merges the conventional methods of operations research, rule-based knowledge engineering, and advanced database structures. RALPH employs a generic, highly modular architecture capable of solving a wide variety of scheduling and resource sequencing problems. The rule-based RALPH system has saved significant labor in resource allocation. Its successful use affirms the importance of establishing and applying event priorities based on scientific merit, and the benefit of continuity in planning provided by knowledge-based engineering. The RALPH system exhibits strong potential for minimizing development cycles of resource and payload planning systems throughout NASA and the private sector.
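
    As a rough illustration of merit-driven allocation in the spirit described above (the names and data structures here are hypothetical, not RALPH's actual design), a greedy scheduler might assign the highest-merit requests to free antennas first:

        # Hypothetical sketch: greedy, merit-ordered allocation of antenna time.
        from dataclasses import dataclass, field

        @dataclass
        class Request:
            mission: str
            merit: float   # relative scientific merit of the supported event
            start: int     # requested window start (hours from epoch)
            end: int       # requested window end

        @dataclass
        class Antenna:
            name: str
            booked: list = field(default_factory=list)  # assigned (start, end) windows

            def free(self, start, end):
                # Free iff the requested window overlaps no already-booked window.
                return all(end <= s or start >= e for s, e in self.booked)

        def allocate(requests, antennas):
            """Assign highest-merit requests first; skip those that cannot fit."""
            plan = []
            for req in sorted(requests, key=lambda r: r.merit, reverse=True):
                for ant in antennas:
                    if ant.free(req.start, req.end):
                        ant.booked.append((req.start, req.end))
                        plan.append((req.mission, ant.name, req.start, req.end))
                        break
            return plan

        antennas = [Antenna("DSS-14"), Antenna("DSS-43")]
        requests = [Request("Voyager", 9.5, 0, 8),
                    Request("Magellan", 7.0, 4, 10),
                    Request("Pioneer", 5.0, 0, 6)]
        print(allocate(requests, antennas))

    A real system would add rule-based constraints (view periods, equipment, maintenance) and interactive revision of the plan, which is where the knowledge-engineering component described above comes in.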

    What Is Grounded Theory Good For?

    Grounded theory (GT) made its appearance in the social sciences in 1967 with the publication of Barney Glaser and Anselm Strauss’s The Discovery of Grounded Theory. Glaser and Strauss advocated systematically discovering and interpreting empirical data to generate theory, in contrast to testing or verifying theory derived from a priori assumptions. In the intervening 50 years, GT has spread into a wide range of fields, including journalism and mass communication. Variations of the method have been developed, and debate has ensued about its relation to positivism and constructivism, as well as pragmatism and postmodernism, and about its value for critical race theory, feminist theory, and indigenous and other critical methods and theories. When and how is it best used? Is it misunderstood or misused by some? Is it more than a method? We asked senior scholars with expertise in GT to reflect on these issues, beginning with Vivian Martin, coeditor with Astrid Gynnild of Grounded Theory: The Philosophy, Method, and Work of Barney Glaser, published by BrownWalker Press (2012). Martin, professor and chair of the Department of Journalism at Central Connecticut State University, argues the method has been misunderstood even by those who use it, often conflated with qualitative studies, with only two GT studies published in journalism and mass communication. It is practical and subversive, she observes, with the ability to develop new concepts and link ideas across disciplines. She advocates closer adherence to Glaser’s original intentions for the method. Responding to Martin is Clifton Scott, associate professor in the Department of Communication Studies at the University of North Carolina at Charlotte. Scott is the author of “Grounded Theory” in the Encyclopedia of Communication Theory, edited by Steven Littlejohn and Sonja Foss, published by SAGE (2009). While agreeing with Martin that the name often is misapplied, Scott argues for less preoccupation with policing the purity of the method in favor of developing multiple approaches appropriate to it as a methodology. Reacting to both Martin and Scott, Bonnie Brennen critiques the original GT approach as neglecting “methodological self-consciousness,” which would uncover researchers’ “theoretical assumptions, power relations, class positions and personal experiences.” Brennen, the Nieman Professor of Journalism in the Diederich College of Communication at Marquette University, is the author of Qualitative Research Methods for Media Studies, second edition, published by Routledge in 2017. Finally, Meenakshi Gigi Durham, responding to all three, expresses optimism about GT’s potential to spur new inquiry through exploration of social life, while she proposes that, like all theory, it be seen as necessarily dynamic and evolutionary. Durham is a professor in the School of Journalism and Mass Communication at the University of Iowa and associate dean of the College of Liberal Arts and Sciences. She is the editor, with Douglas M. Kellner, of Media and Cultural Studies: Keyworks, second edition, published by Blackwell (2011). Lana Rakow, Associate Editor; Louisa Ha, Editor.

    Monte Carlo methods for estimating, smoothing, and filtering one- and two-factor stochastic volatility models

    One- and two-factor stochastic volatility models are assessed over three sets of stock returns data: the S&P 500, DJIA, and Nasdaq. Estimation is done by simulated maximum likelihood using techniques that are computationally efficient, robust, straightforward to implement, and easy to adapt to different models. The models are evaluated using standard, easily interpretable time-series tools. The results are broadly similar across the three data sets. The tests provide no evidence that even the simple single-factor models fail to capture the dynamics of volatility adequately; the problem is getting the shape of the conditional returns distribution right. None of the models comes close to matching the tails of this distribution. Including a second factor provides only a relatively small improvement over the single-factor models. Fitting this aspect of the data is important for option pricing and risk management.
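
    For reference, the canonical discrete-time one-factor SV model evaluated in work of this kind is usually written as follows (a standard formulation, not quoted from the paper):

        r_t = \exp(h_t / 2) \, \varepsilon_t, \qquad \varepsilon_t \sim N(0, 1)
        h_{t+1} = \mu + \phi \, (h_t - \mu) + \sigma_\eta \, \eta_t, \qquad \eta_t \sim N(0, 1)

    where r_t is the return and h_t the latent log-variance. The two-factor variant replaces h_t with the sum of two such autoregressive components, typically one highly persistent and one rapidly mean-reverting.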

    SV mixture models with application to S&P 500 index returns

    Understanding both the dynamics of volatility and the shape of the distribution of returns conditional on the volatility state is important for many financial applications. A simple single-factor stochastic volatility model appears to be sufficient to capture most of the dynamics; it is the shape of the conditional distribution that is the problem. This paper examines the idea of modeling this distribution as a discrete mixture of normals. The flexibility of this class of distributions provides a transparent look into the tails of the returns distribution. Model diagnostics suggest that the model, SV-mix, does a good job of capturing the salient features of the data. In a direct comparison against several affine-jump models, SV-mix is strongly preferred by the Akaike and Schwarz information criteria.
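
    Concretely, a discrete mixture of k normals for the return innovation takes the standard form (notation illustrative, not the paper's):

        \varepsilon_t \sim \sum_{j=1}^{k} p_j \, N(\mu_j, \sigma_j^2), \qquad \sum_{j=1}^{k} p_j = 1, \quad p_j \ge 0

    Even a two-component mixture can produce far fatter tails than a single normal, which is what lets a model of this kind track the shape of the conditional distribution.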

    Controlled Climates and Human Variation: Ubiquitous Air Conditioning and Lowering Heat Thresholds in a Hotter World

    This project assessed how ubiquitous air conditioning is affecting human biological and cultural adaptation to heat. Big data from the “CDC Environmental Health Tracker” on morbidity and air conditioning (AC) usage were used to identify relevant Texan and Floridian populations, whose members were then anonymously interviewed regarding AC use, hot-weather exposure, and heat-related illness. IBM SPSS was used to analyze both quantitative and qualitative variables. A final sample of 13 participants from each state, between the ages of 21 and 28, was selected. In this population, AC usage was strongly linked to increased irritability in the heat, with a resulting correlation with heat-related illness (r = .469, p = .005). Qualitatively, a culture of dependency on air conditioning emerged in Texas, while Floridians took advantage of “beach culture” more often. These findings link air conditioning use to the health risks of inactivity, along with identified trends in biological maladaptation.
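
    For reference, the reported r is presumably Pearson's product-moment correlation (the abstract does not say so explicitly), defined for paired observations (x_i, y_i) as:

        r = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^2} \, \sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^2}}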