143 research outputs found

    Standardization of mine accounting

    This paper presents the history of the international efforts to standardize mine accounting between 1895 and 1915. Extractive industries, such as mining and oil and gas, posed especially difficult problems for the accounting profession. In 1895 there was almost no literature to help in the resolution of these problems. During the following two decades the issues of mine accounting were thoroughly discussed and limited standardization was achieved in some regions. Near the end of this period the Institution of Mining and Metallurgy unanimously adopted a set of accounting standards for the mining industry.

    Herbert C. Hoover: Accounting influences on his life

    Herbert C. Hoover was arguably the most distinguished mining engineer in American history. During the early part of the twentieth century he enjoyed great success as a consultant and mine manager and was regarded as one of the most skilled practitioners of his day. In addition, he was a regular contributor to the mining journals and wrote several books, two of which became classics (Principles of Mining and De Re Metallica, an annotated translation of the works of Agricola). The rationalization of management was an important theme in his life, and he frequently turned to accounting as a means to improve management systems. This paper describes several of the more interesting stories concerning the role accounting played in Hoover's early life.

    The relation between gravity rate of change and vertical displacement in previously glaciated areas

    The rate of change of surface gravity, dg/dt, and the vertical deformation rate of the solid surface, du/dt, are two observables of glacial isostatic adjustment (GIA). They provide different information on the same phenomenon. Their relation contains information about the underlying physics, and a trustworthy relation allows these observations to be combined to strengthen the overall observational accuracy. In this paper we investigate the predicted relation between dg/dt and du/dt in previously glaciated areas. We use the normal-mode approach for one-dimensional earth models and solutions of the sea-level equation with time-dependent coastline geometry. Numerical predictions of dg/dt and du/dt are computed for Laurentia, Fennoscandia and the British Isles using six different earth models. Within each region a linear trend is then fitted using the relation dg/dt = C du/dt + dg_0/dt. The estimated C and dg_0/dt differ more between the regions than between different earth models within each region. For Fennoscandia C ≈ −0.163 µGal/mm and for Laurentia C ≈ −0.152 µGal/mm. Maximum residuals between the linear trend and spatially varying model predictions of dg/dt are 0.04 µGal/yr in Fennoscandia and 0.17 µGal/yr in Laurentia. For the British Isles the results are harder to interpret, mainly because this region is located on the zero-uplift isoline of Fennoscandia. In addition, we show the temporal variation of the relation from the Last Glacial Maximum to the present day. The temporal and spatial variation of the relation between dg/dt and du/dt can be explained by (i) the elastic versus viscous proportion of the total signal and (ii) the spectral composition of the regional signal. Additional local effects, such as the Newtonian attraction and elastic deformation from local sea-level changes, are examined in a case study for six stations in the Nordic absolute gravity network. The influence of these local effects on the relation between dg/dt and du/dt is negligible except for extreme locations close to the sea.
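    The fitted relation above is an ordinary linear regression of the predicted gravity rate on the predicted uplift rate. As a minimal sketch of that step (not the authors' code), the following Python snippet estimates C and dg_0/dt by least squares from paired model predictions; the station values are hypothetical, chosen only to lie near the Fennoscandian slope of about −0.163 µGal/mm.

    ```python
    # Minimal sketch of the linear fit dg/dt = C*du/dt + dg0/dt described above.
    # The station values below are hypothetical, not model output from the paper.
    import numpy as np

    dudt = np.array([2.1, 4.8, 7.3, 9.0, 10.2])           # uplift rates, mm/yr
    dgdt = np.array([-0.36, -0.80, -1.21, -1.48, -1.66])  # gravity rates, µGal/yr

    A = np.column_stack([dudt, np.ones_like(dudt)])       # columns: [du/dt, 1]
    (C, dg0dt), *_ = np.linalg.lstsq(A, dgdt, rcond=None)

    residuals = dgdt - (C * dudt + dg0dt)
    print(f"C = {C:.3f} µGal/mm, dg0/dt = {dg0dt:.3f} µGal/yr")
    print(f"max |residual| = {np.abs(residuals).max():.3f} µGal/yr")
    ```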

    Fennoscandian strain rates from BIFROST GPS: A gravitating, thick-plate approach

    The aim of this investigation is to develop a method for the analysis of crustal strain determined by station networks that continuously record Global Navigation Satellite System (GNSS) measurements. The major new ingredient is that we require a simultaneous minimum of the observation error and of the elastic and potential energy implied by the deformation. The observations that we analyse come from eight years' worth of daily solutions from continuous BIFROST GPS measurements in the permanent networks of the Nordic countries and their neighbours. After reducing the observations with best-fitting predictions for the effects of glacial isostatic adjustment (GIA), we find strain rates of at most 5 nanostrain per year in the interior of the rebound area, predominantly as areal strain. The largest strain rates are found in the Finnmarken area, where, however, the GNSS network density is much lower than in the central and southern parts. The thick-plate adjustment furnishes a simultaneous treatment of 3-D displacements and the ensuing elastic and potential energy due to the deformation. We find that the strain generated by flexure due to GIA is important. The extensional regime seen at the surface turns into a compressive style already at moderate depth, around 50 km.
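    The areal strain mentioned above is the trace of the horizontal strain-rate tensor. As a rough illustration (not the thick-plate, energy-minimizing adjustment used in the paper), the sketch below fits a uniform horizontal velocity gradient to GNSS station velocities by least squares and takes its symmetric part; all coordinates and velocities are hypothetical.

    ```python
    # Rough sketch: horizontal strain rates from GNSS velocities via a uniform
    # velocity-gradient fit. Station data are hypothetical, not BIFROST values.
    import numpy as np

    xy = np.array([[0.0, 0.0], [120.0, 15.0], [60.0, 140.0],
                   [-80.0, 90.0], [-40.0, -110.0]])   # east/north positions, km
    v = np.array([[0.10, 0.05], [0.45, 0.12], [0.28, 0.60],
                  [-0.15, 0.33], [-0.05, -0.40]])     # horizontal velocities, mm/yr

    dx = xy - xy.mean(axis=0)                         # centre the network
    G = np.column_stack([np.ones(len(dx)), dx[:, 0], dx[:, 1]])

    # Fit the east and north velocity components separately as linear functions
    # of position; the linear coefficients form the velocity gradient tensor.
    ce, *_ = np.linalg.lstsq(G, v[:, 0], rcond=None)
    cn, *_ = np.linalg.lstsq(G, v[:, 1], rcond=None)

    L = np.array([[ce[1], ce[2]],
                  [cn[1], cn[2]]])                    # gradient, (mm/yr)/km
    strain_rate = 0.5 * (L + L.T) * 1e3               # symmetric part, nanostrain/yr
    print("strain-rate tensor (nanostrain/yr):\n", strain_rate)
    print("areal strain rate (nanostrain/yr):", np.trace(strain_rate))
    ```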

    Timescales of glacial isostatic adjustment in Greenland: is transient rheology required?

    The possibility of a transient rheological response to ice-age loading, first discussed in the literature of the 1980s, has received renewed attention. Transient behaviour across centennial to millennial timescales has been invoked to reconcile apparently contradictory inferences of steady-state (Maxwell) viscosity based on two distinct data sets from Greenland: Holocene sea-level curves and Global Navigation Satellite System (GNSS) derived modern crustal uplift data. To revisit this issue, we first compute depth-dependent Fréchet kernels using 1-D Maxwell viscoelastic Earth models and demonstrate that the mantle resolving power of the two Greenland data sets is highly distinct, reflecting the differing spatial scales of the associated surface loading: the sea-level records are sensitive to viscosity structure across the entire upper mantle, while uplift rates associated with post-1000 CE fluctuations of the Greenland Ice Sheet have a dominant sensitivity to shallow asthenosphere viscosity. Guided by these results, we present forward models which demonstrate that a moderate low-viscosity zone beneath the lithosphere in Maxwell Earth models provides a simple route to simultaneously reconciling both data sets, by significantly increasing predictions of present-day uplift rates in Greenland whilst having negligible impact on predictions of Holocene relative sea-level curves from the region. Our analysis does not rule out the possibility of transient deformation, but it suggests that such behaviour is not required to simultaneously explain these two data sets. A definitive demonstration of transient behaviour requires that one account for the resolving power of the data sets in modelling the glacial isostatic adjustment process.
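    The contrast drawn here between shallow and deep mantle response can be made intuitive with the Maxwell relaxation time tau = eta/mu (viscosity divided by shear modulus), which sets the timescale separating elastic from viscous behaviour; the actual GIA response also depends on the spatial scale of loading, which is what the Fréchet kernels quantify. The numbers below are rounded, generic assumptions, not the Earth models inferred in the paper.

    ```python
    # Illustrative Maxwell relaxation times tau = eta / mu. The viscosities and
    # shear modulus are rounded, generic values (assumptions), not the paper's
    # preferred Earth models.
    SECONDS_PER_YEAR = 3.156e7
    mu = 7e10  # shear modulus, Pa

    viscosities = {
        "low-viscosity asthenosphere": 1e18,  # Pa s
        "weak upper mantle":           1e19,  # Pa s
        "typical upper mantle":        1e21,  # Pa s
    }

    for name, eta in viscosities.items():
        tau_years = eta / mu / SECONDS_PER_YEAR
        print(f"{name:28s} tau ~ {tau_years:8.1f} yr")
    ```

    With these illustrative values the shallow low-viscosity layer relaxes within years, so it can respond to post-1000 CE ice-load changes and raise present-day uplift rates, while a typical upper-mantle viscosity relaxes over centuries, consistent with the differing sensitivities of the two data sets.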

    Sea-level constraints on the amplitude and source distribution of Meltwater Pulse 1A.

    During the last deglaciation, sea levels rose as ice sheets retreated. This climate transition was punctuated by periods of more intense melting; the largest and most rapid of these, Meltwater Pulse 1A, occurred about 14,500 years ago, with rates of sea-level rise reaching approximately 4 m per century [1-3]. Such rates of rise suggest ice-sheet instability, but the meltwater sources are poorly constrained, thus limiting our understanding of the causes and impacts of the event [4-7]. In particular, geophysical modelling studies constrained by tropical sea-level records [1,8,9] suggest an Antarctic contribution of more than seven metres, whereas most reconstructions [10] from Antarctica indicate no substantial change in ice-sheet volume around the time of Meltwater Pulse 1A. Here we use a glacial isostatic adjustment model to reinterpret tropical sea-level reconstructions from Barbados [2], the Sunda Shelf [3] and Tahiti [1]. According to our results, global mean sea-level rise during Meltwater Pulse 1A was between 8.6 and 14.6 m (95% probability). As for the melt partitioning, we find an allowable contribution from Antarctica of either 4.1 to 10.0 m or 0 to 6.9 m (95% probability), using two recent estimates [11,12] of the contribution from the North American ice sheets. We conclude that with current geologic constraints, the method applied here is unable to support or refute the possibility of a significant Antarctic contribution to Meltwater Pulse 1A.
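    The partitioning quoted above is essentially a sea-level budget: the allowable Antarctic contribution follows from subtracting the other sources from the global mean rise. The sketch below illustrates that budgeting logic with a crude Monte Carlo; it is not the paper's GIA-based inversion, and the North American and 'other' terms are placeholder assumptions.

    ```python
    # Crude melt-budget Monte Carlo (NOT the paper's inversion). GMSL uses the
    # quoted 8.6-14.6 m 95% range treated as +/- 2 sigma; the North American and
    # 'other' contributions are placeholder assumptions for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    gmsl = rng.normal(11.6, 1.5, n)            # global-mean rise, m
    north_america = rng.normal(7.0, 1.5, n)    # placeholder estimate, m
    other = rng.uniform(0.0, 1.0, n)           # placeholder minor sources, m

    antarctica = np.clip(gmsl - north_america - other, 0.0, None)
    lo, hi = np.percentile(antarctica, [2.5, 97.5])
    print(f"implied Antarctic contribution: {lo:.1f} to {hi:.1f} m (95% of samples)")
    ```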

    An investigation of causes of false positive single nucleotide polymorphisms using simulated reads from a small eukaryote genome

    Background: Single Nucleotide Polymorphisms (SNPs) are widely used molecular markers, and their use has increased massively since the inception of Next Generation Sequencing (NGS) technologies, which allow detection of large numbers of SNPs at low cost. However, both NGS data and their analysis are error-prone, which can lead to the generation of false positive (FP) SNPs. We explored the relationship between FP SNPs and seven factors involved in mapping-based variant calling: quality of the reference sequence, read length, choice of mapper and variant caller, mapping stringency, and filtering of SNPs by read mapping quality and read depth. This resulted in 576 possible factor-level combinations. We used error- and variant-free simulated reads to ensure that every SNP found was indeed a false positive.

    Results: The number of FP SNPs generated ranged from 0 to 36,621 for the 120 million base pair (Mbp) genome. All of the experimental factors tested had statistically significant effects on the number of FP SNPs generated, and there was a considerable amount of interaction between the different factors. Using a fragmented reference sequence led to a dramatic increase in the number of FP SNPs generated, as did relaxed read mapping and a lack of SNP filtering. The choice of reference assembler, mapper and variant caller also significantly affected the outcome. The effect of read length was more complex and suggests a possible interaction between mapping specificity and the potential for contributing more false positives as read length increases.

    Conclusions: The choice of tools and parameters involved in variant calling can have a dramatic effect on the number of FP SNPs produced, with particularly poor combinations of software and/or parameter settings yielding tens of thousands in this experiment. Between-factor interactions make simple recommendations difficult for a SNP discovery pipeline, but the quality of the reference sequence is clearly of paramount importance. Our findings are also a stark reminder that it can be unwise to use the relaxed mismatch settings provided as defaults by some read mappers when reads are being mapped to a relatively unfinished reference sequence from, e.g., a non-model organism in its early stages of genomic exploration.
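    As an illustration of the last two factors, filtering called SNPs by read mapping quality and read depth, the sketch below applies simple MQ and DP thresholds to records in a VCF file. The thresholds and file name are arbitrary examples, not the settings evaluated in the study.

    ```python
    # Filter called SNPs by mapping quality (MQ) and read depth (DP) from a VCF.
    # The thresholds and input path are illustrative, not the study's settings.
    MIN_MQ = 30.0
    MIN_DP = 10

    def info_value(info, key):
        """Return the numeric value of an INFO key such as MQ or DP, else None."""
        for item in info.split(";"):
            if item.startswith(key + "="):
                return float(item.split("=", 1)[1])
        return None

    kept = dropped = 0
    with open("calls.vcf") as vcf:
        for line in vcf:
            if line.startswith("#"):
                continue                                # skip header lines
            info = line.rstrip("\n").split("\t")[7]     # INFO is the 8th column
            mq, dp = info_value(info, "MQ"), info_value(info, "DP")
            if mq is not None and dp is not None and mq >= MIN_MQ and dp >= MIN_DP:
                kept += 1
            else:
                dropped += 1

    print(f"kept {kept} SNPs, filtered out {dropped}")
    ```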

    Ocean tides and Heinrich events

    Climate varied enormously over the most recent ice age [1]; for example, large pulses of ice-rafted debris [2], originating mainly from the Labrador Sea [3], were deposited into the North Atlantic at roughly 7,000-year intervals, with global climatic implications [3]. Here we show that ocean tides within the Labrador Sea were exceptionally large over the period spanning these huge, abrupt ice movements, which are known as Heinrich events. We propose that tides played a catalytic role in liberating iceberg armadas during that time.

    Ecosystem Resilience and Threshold Response in the GalĂĄpagos Coastal Zone

    Background: The Intergovernmental Panel on Climate Change (IPCC) provides a conservative estimate of the rate of sea-level rise of 3.8 mm yr⁻¹ at the end of the 21st century, which may have a detrimental effect on ecologically important mangrove ecosystems. The factors influencing the long-term resilience of these communities are critical but poorly understood. We investigate ecological resilience in a coastal mangrove community from the GalĂĄpagos Islands over the last 2,700 years using three research questions: What are the 'fast and slow' processes operating in the coastal zone? Is there evidence for a threshold response? How can the past inform us about the resilience of the modern system?

    Methodology/Principal Findings: Palaeoecological methods (AMS radiocarbon dating, stable carbon isotopes (δ¹³C)) were used to reconstruct sedimentation rates and ecological change over the past 2,700 years at Diablas lagoon, Isabela, GalĂĄpagos. Bulk geochemical analysis was also used to determine local environmental changes, and salinity was reconstructed using a diatom transfer function. Changes in relative sea level (RSL) were estimated using a glacio-isostatic adjustment model. Non-linear behaviour was observed in the Diablas mangrove ecosystem as it responded to increased salinities following exposure to tidal inundations. A negative feedback was observed which enabled the mangrove canopy to accrete vertically, but disturbances may have opened up the canopy and contributed to an erosion of resilience over time. A combination of drier climatic conditions and a slight fall in RSL then resulted in a threshold response, from a mangrove community to a microbial mat.

    Conclusions/Significance: Palaeoecological records can provide important information on the nature of non-linear behaviour by identifying thresholds within ecological systems, and in outlining responses to 'fast and slow' environmental change between alternative stable states. This study highlights the need to incorporate a long-term ecological perspective when designing strategies for maximizing coastal resilience.
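    Sedimentation rates of the kind reconstructed here are typically derived from an age-depth relationship through the dated core. As a minimal sketch, the snippet below fits a straight line through calibrated AMS radiocarbon dates to obtain an average sedimentation rate; the depths and ages are invented for illustration and are not the Diablas lagoon data.

    ```python
    # Illustrative age-depth fit (hypothetical dates, not the Diablas record):
    # average sedimentation rate from a linear regression of depth on age.
    import numpy as np

    depth_cm = np.array([10, 45, 90, 140, 200])            # depth below surface, cm
    age_cal_bp = np.array([150, 600, 1200, 1900, 2700])    # calibrated years BP

    slope, intercept = np.polyfit(age_cal_bp, depth_cm, 1)  # slope in cm per year
    print(f"average sedimentation rate ~ {slope * 10:.2f} mm/yr")
    ```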