23,303 research outputs found

    Stochastic and Reactive Methods for the Determination of Optimal Calibration Intervals

    The length of calibration intervals of measurement instrumentation can be determined by means of several techniques. In this paper, three different methods are compared for establishing optimal calibration intervals of atomic clocks. The first is based on a stochastic model and provides an estimate of the calibration interval also in the transient situation, while the others belong to the class of so-called reactive methods, which determine the optimal interval on the basis of the most recent calibration outcomes. The algorithms have been applied to experimental data and the results compared in order to determine the most effective technique. Since the analyzed reactive methods exhibit a long transient time, a new algorithm is proposed and applied to the available data.
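    The reactive methods mentioned above set the next interval from the latest calibration outcomes. A minimal sketch of one such rule (a simple lengthen/shorten scheme with illustrative factors and bounds, not the specific algorithms compared in the paper):

```python
def next_interval(current_days, in_tolerance, grow=1.2, shrink=0.7,
                  min_days=30, max_days=730):
    """Adjust a calibration interval from the latest calibration outcome.

    Reactive rule: lengthen the interval when the instrument was found in
    tolerance, shorten it when it was found out of tolerance. The factors
    and bounds are illustrative, not taken from the paper.
    """
    factor = grow if in_tolerance else shrink
    return min(max_days, max(min_days, round(current_days * factor)))

# Example: a clock calibrated every 180 days.
print(next_interval(180, in_tolerance=True))   # interval lengthened to 216 days
print(next_interval(180, in_tolerance=False))  # interval shortened to 126 days
```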

    An improved design of a fully automated multiple output micropotentiometer

    This paper describes in detail a new design of a fully automated multiple output micropotentiometer (μpot). A prototype has been built at the National Institute for Standards (NIS), Egypt, to establish this improved AC voltage source in the millivolt range. The new device offers three different outputs covering a wide frequency range from a single outlet, which usefully extends the precise sourcing of low AC voltages at NIS. The design and the operating principle of the prototype are discussed in detail. An automatic calibration technique has been introduced through purpose-built LabVIEW software to streamline the calibration procedure and to reduce the uncertainty contributions. The relatively small AC-DC differences of the prototype in the three output ranges have been verified, and the expanded uncertainties of the calibration results for the three output ranges have been estimated. However, further work is needed to achieve the optimum performance of this new device.
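    The expanded uncertainty mentioned at the end is conventionally obtained, following the GUM, by combining the standard uncertainty contributions in quadrature and multiplying by a coverage factor k (typically k = 2 for roughly 95 % coverage). A minimal sketch with hypothetical contributions, not the prototype's actual uncertainty budget:

```python
import math

def expanded_uncertainty(standard_uncertainties, k=2.0):
    """Combine standard uncertainty contributions in quadrature and expand
    with coverage factor k (k = 2 gives roughly 95 % coverage)."""
    u_c = math.sqrt(sum(u ** 2 for u in standard_uncertainties))
    return k * u_c

# Hypothetical contributions in microvolts (source stability, thermal EMFs, meter).
print(expanded_uncertainty([0.8, 0.5, 0.3]))
```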

    Worker flows and job flows: a quantitative investigation

    Worker flows and job flows behave differently over the business cycle. The authors investigate the sources of these differences by studying the quantitative properties of a multiple-worker version of the search/matching model that features endogenous job separation and intra-firm wage bargaining. Their calibration incorporates micro- and macro-level evidence on worker and job flows. The authors show that the dynamic stochastic equilibrium of the model replicates important cyclical features of worker flows and job flows simultaneously. In particular, the model correctly predicts that hires from unemployment move countercyclically while the job creation rate moves procyclically. The key to this result is to allow for a large hiring flow that does not go through unemployment but is part of job creation, for which the procyclicality of the job finding rate dominates its cyclicality. The authors also show that the model generates large volatilities of unemployment and vacancies when a worker's outside option is set at 83 percent of aggregate labor productivity.
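    Search/matching models of this kind are usually built around a Cobb-Douglas matching function. The sketch below shows only that textbook ingredient (job finding and vacancy filling rates as functions of unemployment and vacancies), with illustrative parameter values, not the authors' multiple-worker model or calibration:

```python
def matching_rates(u, v, mu=0.6, alpha=0.5):
    """Cobb-Douglas matching function m = mu * u**alpha * v**(1 - alpha).

    Returns the job finding rate f = m/u, the vacancy filling rate q = m/v,
    and labor market tightness theta = v/u. Parameter values are illustrative.
    """
    m = mu * u ** alpha * v ** (1 - alpha)
    return m / u, m / v, v / u

f, q, theta = matching_rates(u=0.06, v=0.04)
print(f"job finding {f:.3f}, vacancy filling {q:.3f}, tightness {theta:.2f}")
```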

    Bayesian Estimation Under Informative Sampling

    Bayesian analysis is increasingly popular in the social sciences and other application areas where the data are observations from an informative sample. An informative sampling design leads to inclusion probabilities that are correlated with the response variable of interest. Model inference performed on the observed sample will be biased for the population generative model under informative sampling, since the balance of information in the sample differs from that in the population. Typical approaches that account for an informative sampling design under Bayesian estimation are often difficult to implement because they require re-parameterization of the hypothesized generating model, or they focus on design-based rather than model-based inference. We propose to construct a pseudo-posterior distribution that uses sampling weights based on the marginal inclusion probabilities to exponentiate the likelihood contribution of each sampled unit, which weights the information in the sample back to the population. Our approach provides a nearly automated estimation procedure applicable to any model specified by the data analyst for the population, and it retains the population model parameterization and posterior sampling geometry. We construct conditions on known marginal and pairwise inclusion probabilities that define a class of sampling designs for which $L_1$ consistency of the pseudo-posterior is guaranteed. We demonstrate our method on an application concerning the Bureau of Labor Statistics Job Openings and Labor Turnover Survey.
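    A minimal sketch of the pseudo-posterior idea for a toy normal-mean model: each unit's log-likelihood contribution is multiplied (exponentiated) by a sampling weight proportional to its inverse marginal inclusion probability, with the weights normalized to sum to the sample size. The model, prior and simulated data are illustrative only:

```python
import numpy as np

def pseudo_log_posterior(theta, y, incl_prob, prior_mean=0.0, prior_sd=10.0, sigma=1.0):
    """Pseudo log-posterior for a normal mean under informative sampling.

    Each unit's log-likelihood is multiplied by a weight proportional to its
    inverse marginal inclusion probability; the weights are normalized to sum
    to the sample size so the information content is not overstated.
    """
    w = 1.0 / incl_prob
    w = w * len(y) / w.sum()
    loglik = -0.5 * ((y - theta) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))
    log_prior = -0.5 * ((theta - prior_mean) / prior_sd) ** 2
    return np.sum(w * loglik) + log_prior

# Toy population of 5000; units with larger y are more likely to be sampled.
rng = np.random.default_rng(0)
y_pop = rng.normal(2.0, 1.0, size=5000)
pi_pop = np.clip(0.02 + 0.03 * (y_pop - y_pop.min()), 0.01, 0.9)
sampled = rng.random(5000) < pi_pop
y, pi = y_pop[sampled], pi_pop[sampled]

grid = np.linspace(0.0, 4.0, 401)
theta_hat = grid[np.argmax([pseudo_log_posterior(t, y, pi) for t in grid])]
print(f"pseudo-posterior mode: {theta_hat:.2f} (population mean was set to 2.0)")
```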

    Development, sensitivity and uncertainty analysis of LASH model

    Many hydrologic models have been developed to help manage natural resources all over the world. Nevertheless, most models are highly complex with respect both to their data requirements and to the number of calibration parameters, which makes them difficult to apply to watersheds where data are scarce. The development of the Lavras Simulation of Hydrology (LASH) model in a GIS framework is described in this study, with a focus on its main components, parameters, and capabilities. In addition, sensitivity analysis, parameter range reduction, and uncertainty analysis were performed prior to the calibration effort, using specific techniques (the Morris method, Monte Carlo simulation, and Generalized Likelihood Uncertainty Estimation (GLUE)) with a data set from a Brazilian tropical experimental watershed (32 km²), in order to predict streamflow on a daily basis. LASH is a simple deterministic and spatially distributed model that uses long-term data sets and a few maps to predict streamflow at a watershed outlet. We were able to identify the most sensitive parameters for the reference watershed, which are associated with the base flow and direct surface runoff components. Using a conservative threshold, the ranges of two parameters were reduced, resulting in outputs closer to measured values and facilitating automatic calibration of the model with fewer iterations. GLUE was found to be an efficient method for analyzing the uncertainties related to the prediction of mean daily streamflow in the watershed.
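    The GLUE procedure referred to above can be sketched generically: sample parameter sets by Monte Carlo, score each simulation with a likelihood measure (Nash-Sutcliffe efficiency is a common choice), discard non-behavioural sets below a threshold, and derive likelihood-weighted prediction bounds from the behavioural runs. The toy rainfall-runoff model, parameter ranges and threshold below are illustrative, not LASH itself:

```python
import numpy as np

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def weighted_quantile(x, q, w):
    """Quantile of x under (normalized) likelihood weights w."""
    order = np.argsort(x)
    cdf = np.cumsum(w[order]) / np.sum(w)
    return np.interp(q, cdf, x[order])

def glue(model, obs, lows, highs, n_samples=5000, threshold=0.5, seed=1):
    """Generic GLUE: Monte Carlo parameter sampling, behavioural screening,
    and likelihood-weighted 5-95 % prediction bounds."""
    rng = np.random.default_rng(seed)
    params = rng.uniform(lows, highs, size=(n_samples, len(lows)))
    sims = np.array([model(p) for p in params])
    eff = np.array([nash_sutcliffe(s, obs) for s in sims])
    keep = eff >= threshold                      # behavioural parameter sets
    w = eff[keep] - threshold                    # likelihood weights
    lower = np.array([weighted_quantile(sims[keep, t], 0.05, w) for t in range(obs.size)])
    upper = np.array([weighted_quantile(sims[keep, t], 0.95, w) for t in range(obs.size)])
    return params[keep], lower, upper

# Toy 'rainfall-runoff' model: streamflow = a * rainfall + b, with synthetic data.
rain = np.abs(np.sin(np.linspace(0, 6, 50))) * 10
obs = 0.8 * rain + 0.3 + np.random.default_rng(2).normal(0, 0.2, 50)
behavioural, low, high = glue(lambda p: p[0] * rain + p[1], obs,
                              lows=[0.1, 0.0], highs=[2.0, 1.0])
print(f"{len(behavioural)} behavioural parameter sets retained")
```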

    Weight Adjustment Methods and Their Impact on Sample-based Inference

    Weighting samples is important to reflect not only sample design decisions made at the planning stage, but also practical issues that arise during data collection and cleaning and that necessitate weighting adjustments. Adjustments to base weights are used to account for these planned and unplanned eventualities. Often these adjustments lead to variation in the survey weights relative to the original selection weights (i.e., the weights based solely on the sample units' probabilities of selection). Large variation in survey weights can cause inferential problems for data users. A few extremely large weights in a sample dataset can produce unreasonably large national- and domain-level estimates and estimated variances in particular samples, even when the estimators are unbiased over many samples. Design-based and model-based methods have been developed to adjust such extreme weights; both approaches aim to trim weights such that the overall mean square error (MSE) is lowered by decreasing the variance more than increasing the squared bias. Design-based methods tend to be ad hoc, while Bayesian model-based methods account for population structure but can be computationally demanding. I present three research papers that expand current weight trimming approaches, with the goal of developing a broader framework that bridges gaps in, and improves on, the existing alternatives. The first paper proposes more in-depth investigations of, and extensions to, a newly developed method called generalized design-based inference, in which we condition on the realized sample and model the survey weight as a function of the response variables. This method has potential for reducing the MSE of a finite population total estimator in certain circumstances. However, there may be instances where the approach is inappropriate, so this paper includes an in-depth examination of the related theory. The second paper incorporates Bayesian prior assumptions into model-assisted penalized estimators to produce a more efficient yet robust calibration-type estimator. I also evaluate existing variance estimators for the proposed estimator, and comparisons to other estimators in the literature are included. In the third paper, I develop summary- and unit-level diagnostic tools that measure the impact of weight variation and of extreme individual weights on survey-based inference. I propose design effects to summarize the impact of variable weights produced under calibration weighting adjustments in single-stage and cluster sampling. A new diagnostic for identifying influential individual points is also introduced in the third paper.
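    A common summary of the inferential cost of variable weights is Kish's design effect due to unequal weighting, deff = 1 + CV²(w) = n Σw² / (Σw)². A minimal sketch of that diagnostic together with a naive trim-and-redistribute adjustment (not the estimators developed in the three papers):

```python
import numpy as np

def kish_deff(w):
    """Kish's design effect from unequal weighting: n * sum(w^2) / (sum w)^2."""
    w = np.asarray(w, dtype=float)
    return len(w) * np.sum(w ** 2) / np.sum(w) ** 2

def trim_weights(w, cap):
    """Trim weights at 'cap' and redistribute the excess proportionally so the
    weight total is preserved (single pass; production procedures iterate
    until no redistributed weight exceeds the cap)."""
    w = np.asarray(w, dtype=float)
    trimmed = np.minimum(w, cap)
    excess = w.sum() - trimmed.sum()
    under = trimmed < cap
    trimmed[under] += excess * trimmed[under] / trimmed[under].sum()
    return trimmed

w = np.array([1.0, 1.2, 0.9, 1.1, 25.0])   # one extreme weight
print(kish_deff(w), kish_deff(trim_weights(w, cap=5.0)))
```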

    An Aerial Gamma Ray Survey of Springfields and the Ribble Estuary in September 1992

    A short aerial gamma ray survey was conducted in the vicinity of the Springfields site and the Ribble Estuary from 1st to 5th September 1992, to define existing background radiation levels against which any future changes can be assessed. A twin-engine AS 355 "Squirrel" helicopter chartered from Dollar Helicopters was used for this work. It was loaded with a 16 litre NaI(Tl) gamma ray detector and spectroscopy system on 31st August, and during the following days over 2700 separate spectra were recorded within a survey area of 20 x 12 km. Gamma ray spectra were recorded every 5 seconds at a survey speed and altitude of 120 kph and 75 m respectively. A flight line spacing of 0.3 km was chosen for the main survey area. On 3rd September a low-altitude, high-spatial-resolution survey (flight line spacing 100 m and altitude 30 m) was made over Banks Marsh (an area frequented by local wild fowlers).

    Survey results have been archived and used to map the naturally occurring radionuclides 40K, 214Bi and 208Tl together with 137Cs and the total gamma ray flux. In addition, for the first time, estimates of 234mPa in terms of deconvoluted count rate (normalised to 100 m altitude) were made in the presence of 228Ac interference, probably in disequilibrium with its parent thorium series.

    The maps provide a clear indication of the distribution and sources of environmental radioactivity in the Ribble at the time of the survey. The Ribble estuary is subject to regular and ongoing ground-based studies by BNF, MAFF, HMIP, and university-based groups, as a result of the authorised discharges of low level radioactivity from the Springfields site. The results of this survey complement this ground-based work and add confidence that the estuarine system, its associated sediments, tide-washed pastures, salt marshes and river banks have been thoroughly examined. There is support for earlier conclusions that the Cs on the salt marshes is the dominant source of external gamma exposure, and that the Springfields contribution to these locations is minor in comparison with this Sellafield-derived signal. Upstream the situation is more complex, particularly where the dynamic sources of beta radiation are considered. As far as critical group assessments are concerned, the survey provides clear evidence that the areas affected by 137Cs, where external gamma dose and possible food chain effects are of greatest interest, are in the lower reaches of the Ribble, whereas, at the time of the survey, the 234mPa distribution was in the upper reaches of the river. This not only confirms the findings of ground-based work, but provides some assurance that the different exposure paths (external gamma dose, skin dose) are not entirely synergistic. The discovery of possible transient sources of natural 228Ac in the salt marsh environment, as a consequence of Th series disequilibrium immediately following spring tides, is extremely interesting. If substantiated by further studies using semiconductor detectors, this provides a new insight into the dynamic radiation environment of tide-washed contexts.

    Aerial survey can potentially provide a rapid and cost-effective means of studying environmentally dynamic sources such as 234mPa. In the case of the Ribble it would be necessary to reduce the survey height to below 50 m ground clearance to improve spatial resolution. Possible inconvenience to residents and property owners of such low-altitude flights would have to be considered in addition to the potential value of environmental knowledge of the behaviour of short-lived nuclides in a dynamic system such as the Ribble estuary. There is nonetheless considerable potential for time series studies of this location. Recent flight trials by SURRC incorporating high-efficiency germanium semiconductor detectors have verified the feasibility and potential of a hybrid scintillation/semiconductor spectrometer. Such a device can resolve any ambiguities arising from overlapping gamma ray peaks, which is particularly relevant to the confirmation of 228Ac in salt marshes. Ground-based sampling at the time of measurement would enable concentration calibrations to be made for these dynamic sources. Further ground-based measurements would be desirable to establish the extent to which low-energy photons contribute to external gamma ray dose rates from sources with pronounced subsurface activity maxima.
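    Normalising airborne count rates to a reference altitude, as done here for the 234mPa estimates, conventionally assumes near-exponential attenuation of the gamma flux with ground clearance. A generic sketch; the attenuation coefficient and count rate are illustrative, not values from the survey:

```python
import math

def normalize_count_rate(counts_per_s, altitude_m, ref_altitude_m=100.0, mu_per_m=0.0075):
    """Express an observed count rate at a reference altitude, assuming
    exponential attenuation with height: n(h) ~ n(0) * exp(-mu * h)."""
    return counts_per_s * math.exp(mu_per_m * (altitude_m - ref_altitude_m))

# A rate observed at the 75 m survey height, expressed at the 100 m reference height.
print(normalize_count_rate(420.0, altitude_m=75.0))
```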

    Sugars' quantifications using a potentiometric electronic tongue with cross-selective sensors: Influence of an ionic background

    Glucose, fructose and sucrose are sugars with known physiological effects; their consumption has an impact on human health and also has an important effect on food sensory attributes. The analytical methods routinely used for the identification and quantification of sugars in foods, such as liquid chromatography and visible spectrophotometry, have several disadvantages, including long analysis times, high consumption of chemicals and the need for sample pretreatment. To overcome these drawbacks, in this work a potentiometric electronic tongue built with two identical multi-sensor systems of 20 cross-selectivity polymeric sensors, coupled with multivariate calibration with feature selection (a simulated annealing algorithm), was applied to quantify glucose, fructose and sucrose, as well as the total content of sugars. Standard solutions of ternary mixtures of the three sugars were used for multivariate calibration purposes, according to an orthogonal experimental design (multilevel fractional factorial design) with or without an ionic background (KCl solution). The quantitative models' predictive performance was evaluated by K-fold cross-validation (internal validation) using training data selected with the K-means algorithm, and by external validation using test data. Overall, satisfactory predictive quantifications were achieved for all sugars and for the total sugar content based on subsets comprising 16 or 17 sensors. The test data allowed us to compare the models' predicted values with the respective experimental values, showing slopes varying between 0.95 and 1.03, intercept values statistically equal to zero (p-value 0.05) and determination coefficients equal to or greater than 0.986. No significant differences were found between the predictive performances for the quantification of sugars using synthetic solutions with or without KCl (1 mol L⁻¹), although the adjustment of the ionic background allowed a better homogenization of the solutions' matrix and probably contributed to enhanced confidence in the analytical work across the whole calibration working range. This research work was funded by strategic project CIMO–PEst-OE/AGR/UI0690/2014 and Associate Laboratory LSRE-LCM–UID/EQU/50020/2019, financially supported by FEDER (Fundo Europeu de Desenvolvimento Regional) through COMPETE2020, Programa Operacional Competitividade e Internacionalização (POCI), and by national funds through FCT, Fundação para a Ciência e a Tecnologia, Portugal.
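    The feature-selection step can be sketched generically: simulated annealing proposes random changes to the sensor subset and accepts or rejects them with a temperature-dependent probability, scoring each subset by the cross-validated prediction error of a multivariate calibration model. The plain linear model, annealing schedule and synthetic data below are placeholders, not the actual electronic-tongue pipeline:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def sa_select_sensors(X, y, n_iter=200, t0=1.0, cooling=0.99, seed=0):
    """Simulated annealing over sensor subsets, scored by 5-fold CV RMSE."""
    rng = np.random.default_rng(seed)
    n_sensors = X.shape[1]

    def rmse(mask):
        scores = cross_val_score(LinearRegression(), X[:, mask], y,
                                 scoring="neg_root_mean_squared_error", cv=5)
        return -scores.mean()

    mask = rng.random(n_sensors) < 0.5
    mask[rng.integers(n_sensors)] = True          # ensure at least one sensor
    err = best_err = rmse(mask)
    best_mask, t = mask.copy(), t0
    for _ in range(n_iter):
        cand = mask.copy()
        cand[rng.integers(n_sensors)] ^= True     # flip one sensor in or out
        if not cand.any():
            continue
        cand_err = rmse(cand)
        if cand_err < err or rng.random() < np.exp((err - cand_err) / t):
            mask, err = cand, cand_err
            if err < best_err:
                best_mask, best_err = mask.copy(), err
        t *= cooling
    return best_mask, best_err

# Synthetic demo: 60 standard solutions, 20 sensor signals, 3 informative sensors.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 20))
y = X[:, [2, 7, 11]].sum(axis=1) + rng.normal(scale=0.1, size=60)
subset, err = sa_select_sensors(X, y)
print(subset.nonzero()[0], round(err, 3))
```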

    Recommendations for the Determination of Nutrients in Seawater to High Levels of Precision and Inter-Comparability using Continuous Flow Analysers

    The Global Ocean Ship-based Hydrographic Investigations Program (GO-SHIP) brings together scientists with interests in physical oceanography, the carbon cycle, marine biogeochemistry and ecosystems, and other users and collectors of ocean interior data, to develop a sustained global network of hydrographic sections as part of the Global Ocean Climate Observing System. A series of manuals and guidelines are being produced by GO-SHIP which update those developed by the World Ocean Circulation Experiment (WOCE) in the early 1990s. Analysis of the data collected in WOCE suggests that improvements are needed in the collection of nutrient data if they are to be used for determining change within the ocean interior. Production of this manual is timely, as it coincides with the development of reference materials for nutrients in seawater (RMNS). These RMNS solutions will be produced in sufficient quantities, and be of sufficient quality, that they will provide a basis for improving the consistency of nutrient measurements both within and between cruises. This manual is a guide to suggested best practice in performing nutrient measurements at sea. It provides detailed advice on laboratory practice for all the procedures surrounding the use of gas-segmented continuous flow analysers (CFA) for the determination of dissolved nutrients (usually ammonium, nitrate, nitrite, phosphate and silicate) at sea. It does not prescribe the use of a particular instrument or related chemical method, as these are well described in other publications. The manual provides a brief introduction to the CFA method, the collection and storage of samples, considerations in the preparation of reagents, and the calibration of the system. It discusses how RMNS solutions can be used to "track" the performance of a system during a cruise and between cruises. It provides a format for the metadata that need to be reported alongside the sample data at the end of a cruise so that the quality of the reported data can be evaluated and set in context relative to other data sets. Most importantly, the central manual is accompanied by a set of nutrient standard operating procedures (NSOPs) that provide detailed information on key procedures that are necessary if best-quality data are to be achieved consistently. These cover sample collection and storage, an example NSOP for the use of a CFA system at sea, high-precision preparation of calibration solutions, assessment of the true calibration blank, checking the linearity of a calibration, and the use of internally and externally prepared reference solutions for controlling the precision of data during a cruise and between cruises. An example metadata report and advice on the assembly of the quality control and statistical data that should form part of the metadata report are also given.
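    Two of the NSOP topics, assessing the true calibration blank and checking calibration linearity, amount in practice to fitting the standard series and inspecting the intercept and residuals. A generic sketch with hypothetical standards, not the manual's own procedure:

```python
import numpy as np

def check_calibration(conc, response):
    """Fit a straight-line calibration; report the slope, the blank (intercept)
    and the largest relative residual as a crude linearity check."""
    slope, intercept = np.polyfit(conc, response, 1)
    fitted = slope * np.asarray(conc) + intercept
    resid = np.asarray(response) - fitted
    return slope, intercept, np.max(np.abs(resid)) / np.ptp(response)

# Hypothetical nitrate standards (umol/kg) and detector responses.
conc = [0.0, 5.0, 10.0, 20.0, 30.0, 40.0]
resp = [0.012, 0.100, 0.185, 0.362, 0.540, 0.718]
print(check_calibration(conc, resp))
```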