Three-way error analysis between AATSR, AMSR-E and in situ sea surface temperature observations
Using co-locations of three different observation types of sea surface temperatures (SSTs) gives enough information to enable the standard deviation of error on each observation type to be derived. SSTs derived from the Advanced Along-Track Scanning Radiometer (AATSR) and Advanced Microwave Scanning Radiometer (AMSR-E) instruments are used, along with SST observations from buoys. Various assumptions are made within the error theory, including that the errors are not correlated, which should be the case for three independent data sources. An attempt is made to show that this assumption is valid and also that the covariances between the observations due to representativity error are negligible. Overall, the AATSR observations are shown to have a very small standard deviation of error of 0.16 K, whilst the buoy SSTs have an error of 0.23 K and the AMSR-E SST observations have an error of 0.42 K.
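The three-way estimation described above (often called triple collocation) can be sketched numerically. The sketch below is a minimal illustration, not the paper's actual processing chain: the variable names and simulated error levels are assumptions chosen to mirror the quoted values. It uses the standard identity that, for independent errors, the variance of each pairwise difference is the sum of the two error variances.

```python
# Triple-collocation sketch: recover each observation type's error standard
# deviation from the variances of the pairwise differences, assuming the
# three error sources are mutually independent.
import numpy as np

def triple_collocation_stds(x1, x2, x3):
    """Return (s1, s2, s3): error std estimates for three collocated series."""
    v12 = np.var(x1 - x2)
    v13 = np.var(x1 - x3)
    v23 = np.var(x2 - x3)
    # var(xi - xj) = si**2 + sj**2 when errors are independent, so each
    # error variance is half of a signed sum of the three difference variances.
    s1 = np.sqrt((v12 + v13 - v23) / 2.0)
    s2 = np.sqrt((v12 + v23 - v13) / 2.0)
    s3 = np.sqrt((v13 + v23 - v12) / 2.0)
    return s1, s2, s3

# Synthetic check: a shared "true" SST signal plus independent errors whose
# magnitudes echo the abstract's quoted values (0.16 K, 0.23 K, 0.42 K).
rng = np.random.default_rng(42)
n = 200_000
truth = rng.normal(15.0, 2.0, n)
aatsr = truth + rng.normal(0.0, 0.16, n)   # satellite infrared
buoy  = truth + rng.normal(0.0, 0.23, n)   # in situ
amsre = truth + rng.normal(0.0, 0.42, n)   # satellite microwave

s_aatsr, s_buoy, s_amsre = triple_collocation_stds(aatsr, buoy, amsre)
```

With this sample size the estimates land within a few thousandths of a kelvin of the simulated values, which is why the method can separate error levels as close as those quoted.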
Characterisation of the genomic architecture of human chromosome 17q and evaluation of different methods for haplotype block definition
BACKGROUND: The selection of markers in association studies can be informed through the use of haplotype blocks. Recent reports have determined the genomic architecture of chromosomal segments through different haplotype block definitions based on linkage disequilibrium (LD) measures or haplotype diversity criteria. The relative applicability of distinct block definitions to association studies, however, remains unclear. We compared different block definitions in 6.1 Mb of chromosome 17q in 189 unrelated healthy individuals. Using 137 single nucleotide polymorphisms (SNPs), at a median spacing of 15.5 kb, we constructed haplotype block maps using published methods and additional methods we have developed. Haplotype tagging SNPs (htSNPs) were identified for each map. RESULTS: Blocks were found to be shorter and coverage of the region limited with methods based on LD measures, compared to the method based on haplotype diversity. Although the distribution of blocks was highly variable, the number of SNPs that needed to be typed in order to capture the maximum number of haplotypes was consistent. CONCLUSION: For the marker spacing used in this study, choice of block definition is not important when used as an initial screen of the region to identify htSNPs. However, choice of block definition has consequences for the downstream interpretation of association study results.
A new floating model level scheme for the assimilation of boundary layer top inversions: the univariate assimilation of temperature.
The assimilation of observations with a forecast is often heavily influenced by the description of the error covariances associated with the forecast. When a temperature inversion is present at the top of the boundary layer (BL), a significant part of the forecast error may be described as a vertical positional error (as opposed to the amplitude error normally dealt with in data assimilation). In these cases, failing to account for positional error explicitly is shown to result in an analysis in which the inversion structure is erroneously weakened and degraded.
In this article, a new assimilation scheme is proposed to explicitly include the positional error associated with an inversion. This is done through the introduction of an extra control variable to allow position errors in the a priori to be treated simultaneously with the usual amplitude errors. This new scheme, referred to as the ‘floating BL scheme’, is applied to the one-dimensional (vertical) variational assimilation of temperature. The floating BL scheme is tested with a series of idealised experiments and with real data from radiosondes.
For each idealised experiment, the floating BL scheme gives an analysis whose inversion structure and position agree with the truth, and outperforms the assimilation which accounts only for forecast amplitude error. When the floating BL scheme is used to assimilate a large sample of radiosonde data, its ability to give an analysis with an inversion height in better agreement with that observed is confirmed. However, the use of Gaussian statistics is found to be an inappropriate description of the error statistics of the extra control variable. This problem is alleviated by incorporating a non-Gaussian description of the new control variable in the scheme. Anticipated challenges in implementing the scheme operationally are discussed towards the end of the article.
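As a sketch of the idea (an illustrative form, not necessarily the authors' exact formulation), the usual 1D-Var cost function can be augmented with a scalar position control variable s that shifts the background inversion vertically:

    J(\delta x, s) = \frac{1}{2}\,\delta x^{\mathrm{T}} B^{-1}\,\delta x
                   + \frac{s^{2}}{2\sigma_{s}^{2}}
                   + \frac{1}{2}\,\big[y - H\big(x_{b}(s) + \delta x\big)\big]^{\mathrm{T}} R^{-1}\,\big[y - H\big(x_{b}(s) + \delta x\big)\big],

where x_b(s) denotes the background temperature profile with the inversion displaced vertically by s, B and R are the background- and observation-error covariances, H is the observation operator, and \sigma_s is an assumed (here Gaussian) standard deviation of the position error. On this reading, the non-Gaussian extension mentioned in the abstract would replace the quadratic s-term with the negative log of a more appropriate density.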
Jim Renacci
This is an advertisement for the re-election of Jim Renacci to the U.S. House of Representatives.
Secure, reliable and effective institution-wide e-assessment: paving the way for new technologies
This short paper addresses a number of the key themes of the 12th International CAA Conference, with particular regard to evaluation, innovation and strategic developments. It is based on the current findings and experiences from two interrelated CAA projects underway at the University of Bradford: “Embedded support processes for e-Assessment” and “Integrating thin client systems and smart card technology to provide flexible, accessible and secure e-Assessment”. These two projects, along with specific aims in the University’s Learning, Teaching and Assessment Strategy and other projects conducted as part of the institution’s e-Strategy, aim to establish an effective and efficient system for online summative and formative assessment at the University of Bradford that will meet the needs of a Higher Education Institution in the 21st century. This is very much a work in progress, and it is hoped that this work will be written up as a long paper for a future CAA conference.
Resolution of the Miller-Popper paradox
A longstanding paradox was first reported by David Miller in 1975 and highlighted by Karl Popper in 1979. Miller showed that the ranking of predictions from two theories, in terms of closeness to observation, appears to be reversed when the problem is transformed into a different mathematical space. He concluded that “… no false theory can … be closer to the truth than is another theory”. This flies in the face of normal scientific practice and is thus paradoxical; it is named here the “Miller-Popper paradox”.
This paper proposes a resolution of the paradox, through consideration of the inevitable errors and uncertainties in both observations and predictions. It is proved that, for linear transformations and Gaussian error distributions, the transformation between spaces creates no change in quantitative measures of “closeness-to-observation” when these measures are based in probability theory. The extension of this result to nonlinear transformations and to non-Gaussian error distributions is also discussed.
These results demonstrate that concepts used in comparison of predictions with observations – concepts of “closeness”, “consistency”, “agreement”, “falsification”, etc. – all imply some knowledge of the uncertainty characteristics of both predictions and observations.
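The linear-Gaussian invariance result can be illustrated numerically. The sketch below is a toy example under stated assumptions (the points, covariance and linear map are invented for illustration, not taken from the paper): a naive Euclidean ranking of two predictions flips under an invertible linear transformation, while a probability-based closeness measure – here the Mahalanobis distance, with the error covariance transformed consistently – is unchanged.

```python
# Demonstrate that Euclidean "closeness to observation" is not invariant
# under an invertible linear change of variables, while the Mahalanobis
# distance (a probability-based measure for Gaussian errors) is.
import numpy as np

obs   = np.array([1.0, 2.0])       # illustrative observation
pred1 = np.array([1.5, 2.2])       # prediction from "theory 1"
pred2 = np.array([0.8, 1.4])       # prediction from "theory 2"
cov   = np.array([[0.3, 0.1],
                  [0.1, 0.2]])     # assumed Gaussian error covariance

def mahalanobis2(p, o, S):
    """Squared Mahalanobis distance between prediction p and observation o."""
    d = p - o
    return d @ np.linalg.solve(S, d)

A = np.array([[1.0, 0.0],
              [0.0, 0.1]])         # invertible linear map between "spaces"
b = np.array([5.0, -1.0])
t = lambda x: A @ x + b            # transformed coordinates
cov_t = A @ cov @ A.T              # covariance transforms consistently

# Euclidean distances before and after the transformation.
e1,  e2  = np.linalg.norm(pred1 - obs),     np.linalg.norm(pred2 - obs)
e1t, e2t = np.linalg.norm(t(pred1) - t(obs)), np.linalg.norm(t(pred2) - t(obs))

# Mahalanobis distances before and after the transformation.
d1,  d2  = mahalanobis2(pred1, obs, cov),       mahalanobis2(pred2, obs, cov)
d1t, d2t = mahalanobis2(t(pred1), t(obs), cov_t), mahalanobis2(t(pred2), t(obs), cov_t)
```

Here the Euclidean ranking reverses (theory 1 is "closer" in the original space, theory 2 in the transformed space), reproducing Miller's reversal, whereas the Mahalanobis values are identical in both spaces, in line with the invariance claimed for probability-based measures.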
Some reflections on the development of information science education in Brazil
A tribute to the 25th anniversary of the CNPq/IBICT-UFRJ/ECO postgraduate course in Information Science. It reports the experience of a professor in the Brazilian postgraduate programme and addresses the impact of that programme on library schools. It aims to elucidate what library science courses are, the challenges facing the profession, curricula and the future role of the librarian, as well as topics such as technology, the survival of the present-day library and the training of the librarian of the future.
Keywords
Professional training; Library science; Information science; Postgraduate studies; Curriculum; Library school; IBICT; Brazil
The Benefits of Laser Scanning & 3D Modelling in Accident Investigation: In a Mining Context
Published article. This is the author’s final accepted version of the article: M. L. Eyre, P. J. Foster, J. Jobling-Purser and J. Coggan, "The benefits of laser scanning and 3D modelling in accident investigation: in a mining context", Mining Technology 2015; 124(2), 73-77. DOI: 10.1179/1743286315Y.0000000004
Accurate reconstruction of the facts and causes surrounding accidents is critical if the mining industry is to learn from incidents and prevent future events. Effective accident investigation and training are essential in order to accomplish this, while providing a record of the incident in order to help in explaining the situation to people unconnected to the event itself. Over a number of years there have been considerable innovations in the survey instrumentation and software used to record data. However, the final deliverable data have remained the same, with surveyors tasked to represent a 3D environment using 2D deliverables. This paper explores the benefits that can be obtained using 3D data capture and representation with regard to accident investigation, with discussion of accuracy, time, witness verification and reduction in human error.
The impact of automation on libraries - a review
After reviewing the problems when computers were first used for library housekeeping routines, various aspects concerning the effects on staff are discussed. Particularly relevant are the emergence of the systems librarian, new staff structures, retraining, exposure to scrutiny and monitoring, and health problems associated with the use of certain equipment. Technical developments from mainframes to microcomputers are outlined and the effect on peripherals assessed. Reference is also made to the impetus which automation has given to cooperation and standardisation. References are given.
Keywords
Computer technology. Information systems. Information system automation. Library automation. Application software. Hardware. Computer application. Personnel.
Factors influencing engineering students for choosing techno-entrepreneurship as a career: An implication for better learning
Techno-entrepreneurship is critical to the growth of society as a useful means of overcoming youth unemployment. However, its growth has been limited, with outcomes falling short of expectations. Hence, the purpose of this study was to determine the factors that influence the likelihood of choosing techno-entrepreneurship as a profession among engineering students in the Philippines. Two hundred engineering students were selected by stratified random sampling, and the significance of the factors was determined using Pearson correlation analysis. Based on the findings, students’ likelihood of choosing techno-entrepreneurship as a career was not influenced by equipment availability but by their e-commerce experience, geographical location, and internet ability. This implies the need for academic personnel and instructors teaching techno-entrepreneurship courses to ensure that students acquire relevant technopreneurial knowledge, skills, and competencies that value students’ creativity and innovation, so as to encourage techno-entrepreneurship as a profession.